According to recent discussions on WebmasterWorld, we can block bad bots from scraping our sites and wasting bandwidth using the following code:
.htaccess Code :: BEGIN
# Block bad bots by User-Agent
# (one SetEnvIfNoCase line per bot; the names below are just example
# placeholders - substitute the actual list of bad bots)
SetEnvIfNoCase User-Agent "SomeBadBot" bad_bot
SetEnvIfNoCase User-Agent "AnotherBadBot" bad_bot
<Limit GET POST HEAD>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>
.htaccess Code :: END
Is this a good idea?
Do we have access to .htaccess?
Can Dreamhost maintain this for all sites?
Thanks for any info - I'm not a technical person, so please excuse me if this is a silly question!