Site Unsearchable

Hello all,

I have an odd question. Is there a way to make my website not searchable by Yahoo, Google, or the like?


Yes. Thankfully, most of the robots that crawl websites obey the rules contained in a special file named robots.txt.

To prevent all such robots from indexing your site, just create a file named robots.txt containing the following text:

User-agent: *
Disallow: /

Place this file in the root directory of your site, and the robots that follow the rules (all the major search engines) will not index your site.



edit: beaten by Bob, he must type faster than me :slight_smile:

Awesome, thanks all!

Hi there,

Is there any way to make a particular page unsearchable instead of the whole website?

Thanks in Advance.

The robots.txt file included in the post quoted in your reply directs all rule-abiding robots to disregard the entire site.
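If you only want to keep a single page out of the indexes, one common approach (documented by the major search engines) is a robots meta tag in that page's head section; the page itself is hypothetical here:

```html
<head>
  <!-- Ask compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

Like robots.txt, this only works for crawlers that choose to honor it.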

The robots.txt file can also be customized. Google the term robots.txt generator
and you will find a number of sites that will generate a robots.txt file tailored to your site after you fill out a form. I've used several but have no specific recommendations.
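To give a rough idea of what a customized file can look like, here is a hand-written sketch (the directory names and the bot name are made up; substitute your own):

```text
# Keep all bots out of two directories, but allow the rest of the site
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Keep one particular bot out of the entire site
User-agent: BadBot
Disallow: /
```

Rules are grouped by User-agent line, and each Disallow line applies to the group above it.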

You might also want to be concerned about bots that don't follow the rules, particularly ones from Russia.

In terms of bots that don't abide by robots.txt rules: if you don't want something accessible to everyone in the world, don't put it on the WWW for all to read!
Also, check your server logs for site crawls, and feel free to use .htaccess to block offenders.
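For example, if the logs show a misbehaving crawler hitting you from one address range, you can deny it in .htaccess; this sketch assumes Apache with the classic 2.2-style access directives, and 203.0.113.0/24 is just a documentation range standing in for the real one:

```apache
# Deny a misbehaving crawler by IP range seen in the access logs
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
```

Unlike robots.txt, this is enforced by the server, so it works even on bots that ignore the rules.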

Do you have a good way to block Yandex and have it stay blocked? It's like playing Russian whack-a-mole :stuck_out_tongue:
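One stopgap that doesn't depend on the bot's IP addresses staying put is to match on its User-Agent string instead; this sketch assumes Apache with mod_setenvif available:

```apache
# Tag any request whose User-Agent mentions Yandex, then deny it
SetEnvIfNoCase User-Agent "Yandex" block_bot
Order Allow,Deny
Allow from all
Deny from env=block_bot
```

A bot can of course lie about its User-Agent, but the big search crawlers generally identify themselves honestly.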

Check the instructions on this site: