I want to make sure people can’t access my domain from Google, Yahoo, et al. How do I do this with either the robots.txt file or the .htaccess file?
Well, you don’t block people with robots.txt; it is used to tell well-behaved search engine bots what not to crawl or index. So if you don’t want your site to appear in a search index, that’s the tool to use.
You can block both bots and people using .htaccess: deny known bot user agents, or deny visitors who arrive with a Google or Yahoo referrer header.
Using robots.txt is documented at http://www.robotstxt.org/
There are a couple of different ways of blocking access based on the referrer. One of them is:
SetEnvIfNoCase Referer "^http://www.google.com/" BadReferrer
SetEnvIfNoCase Referer "^http://www.yahoo.com/" BadReferrer
Deny from env=BadReferrer

In English: Apache sets an environment variable if the referrer is either Google or Yahoo. If that environment variable is set, access is denied and the visitor gets a “403 Forbidden” status code.
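Putting those directives together, a minimal sketch of a complete .htaccess is below. This assumes Apache 2.2-style Order/Allow/Deny access control (Apache 2.4 replaced these with Require); the escaped dots in the patterns are just a small regex tidy-up.

```apache
# Flag requests whose Referer header starts with Google or Yahoo
SetEnvIfNoCase Referer "^http://www\.google\.com/" BadReferrer
SetEnvIfNoCase Referer "^http://www\.yahoo\.com/" BadReferrer

# Allow everyone except requests carrying the BadReferrer flag
Order Allow,Deny
Allow from all
Deny from env=BadReferrer
```

Flagged visitors will receive a 403 Forbidden response; everyone else gets through normally.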
With robots.txt, you’d just enter something like this:
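For example, a robots.txt that asks every well-behaved crawler to stay out of the whole site would look something like this (see robotstxt.org for the full syntax):

```
User-agent: *
Disallow: /
```

Note this is only a request: polite crawlers honor it, but it doesn’t actually prevent access.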
In .htaccess it would be a bit trickier: you could either deny every IP address except your own, or set up an .htpasswd file so that accessing your domain requires a password.
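A sketch of both approaches, again assuming Apache 2.2-style directives; the IP address and the AuthUserFile path are placeholders you’d replace with your own values:

```apache
# Option 1: deny everyone except your own IP address
Order Deny,Allow
Deny from all
Allow from 203.0.113.5

# Option 2: require a password
# (create the file first with: htpasswd -c /path/to/.htpasswd username)
AuthType Basic
AuthName "Private"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Use one option or the other; combining them in one file needs Apache’s Satisfy directive to control how the two checks interact.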
The DreamHost wiki has a lot more info on such things: http://wiki.dreamhost.com
EDIT: doh! Atropos7 answered better! xD
Atropos7 gave a VERY good answer. I say you follow that.
Just create the robots.txt file in your site’s root with that text.