.htaccess addition to stop bad bots

According to recent discussions on WebmasterWorld, we can block bad bots from scraping and wasting bandwidth using the following code:


.htaccess Code :: BEGIN

# Block bad bots by User-Agent
# (the bot names below are examples only -- substitute your own list)
SetEnvIfNoCase User-Agent "WebCopier" bad_bot
SetEnvIfNoCase User-Agent "SiteSnagger" bad_bot

Order Allow,Deny
Allow from all
Deny from env=bad_bot

.htaccess Code :: END
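If you want to sanity-check the file locally before uploading it, here's a minimal sketch (plain Unix shell; the path and bot names are just examples, not anything DreamHost-specific):

```shell
# Write the block to a local .htaccess and sanity-check it before uploading.
mkdir -p /tmp/htaccess-demo && cd /tmp/htaccess-demo

cat > .htaccess <<'EOF'
# Block bad bots by User-Agent (example patterns only)
SetEnvIfNoCase User-Agent "WebCopier" bad_bot
SetEnvIfNoCase User-Agent "SiteSnagger" bad_bot

Order Allow,Deny
Allow from all
Deny from env=bad_bot
EOF

# Quick check: the Deny rule and both SetEnvIfNoCase lines reference bad_bot
grep -c "bad_bot" .htaccess   # prints 3
```

Once the file is uploaded, you can check it is working by fetching a page with a blocked User-Agent string (e.g. `curl -A "WebCopier" http://yourdomain.example/`) and confirming you get a 403 Forbidden response.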


This has been discussed and updated at:

Is this a good idea?

Do we have access to .htaccess?

Can Dreamhost maintain this for all sites?

Thanks for any info - I’m not a technical person, so please excuse if this is a silly question!

I couldn’t say for sure, but if you are having problems with badly behaved bots wasting your bandwidth then I guess it’s worth a shot.

Yes. Just create a file called .htaccess with the required content and place it in your domain's web directory (by default this will be named the same as your domain).

Note: .htaccess is a 'hidden' file, so it will not show up in your FTP client unless the client is configured to display such files.
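To see why the file seems to disappear, here's a tiny demonstration in a Unix shell (the directory name is made up; most FTP clients have an equivalent "show hidden files" option):

```shell
# Files whose names start with a dot are hidden from a plain listing.
mkdir -p /tmp/hidden-demo && cd /tmp/hidden-demo
touch .htaccess

ls       # .htaccess does not appear in the output
ls -A    # the -A flag includes hidden entries, so .htaccess appears
```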

Nope, it is up to you to create and maintain the .htaccess files for your domains.


Save $50 on DreamHost plans using PRICESLASH promo code (Click for DreamHost promo code details)

Thanks for your help, I’ll do a bit of reading first!!

No problem, I am glad I could be of some assistance. :)

That is probably a wise idea. It is possible to take your site down very quickly with a simple error in the .htaccess file. Thankfully, fixing things is usually just a matter of undoing whatever changes you made.
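One cheap safeguard along those lines, assuming you have shell access (the file contents below are just a stand-in): keep a backup copy before editing, so undoing a bad change is a one-line restore.

```shell
# Back up the working .htaccess before editing it.
mkdir -p /tmp/backup-demo && cd /tmp/backup-demo
printf 'Deny from env=bad_bot\n' > .htaccess    # stand-in for a working file

cp .htaccess .htaccess.bak                      # snapshot before editing

printf 'Dny from env=bad_bot\n' > .htaccess     # oops: a typo would break the site

cp .htaccess.bak .htaccess                      # restore: back to the working version
```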

Usually, I would recommend the DreamHost wiki as a good source of information on the subject, but it appears to be down at the moment. :(

Edit: I just checked and the wiki is back up.
