Depends on how many users are hitting it/them throughout the day.
We just got http://www.gurusnetwork.com off of an eval server after the logs showed it was mostly search engine robots hammering us. We implemented robots.txt directives and personally contacted the engines that don't honor the crawl-delay directive (Google!) to ask them to take it easy.
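For reference, a minimal robots.txt along those lines might look like this (the delay values are illustrative, and as noted above, Googlebot ignores Crawl-delay, so Google has to be asked directly):

```
# Slow down Yahoo's crawler specifically
User-agent: Slurp
Crawl-delay: 10

# Default delay for everything else that honors the directive
User-agent: *
Crawl-delay: 5
```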
That was the majority of the problem; Google and Yahoo Slurp were the biggest offenders, but there are also a lot of generic site-downloading bots that hit the GN pretty hard at times. It takes a diligent person to keep on top of that stuff and block offending IPs.
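Spotting the offenders usually starts with tallying the access log by user agent. A rough sketch (the log lines and bot names here are made up; adjust the path and format for your server):

```python
# Tally requests per user agent from an Apache combined-format log,
# where the user agent is the last quoted field on each line.
import re
from collections import Counter

UA_RE = re.compile(r'"([^"]*)"\s*$')  # last quoted field = user agent

def top_agents(lines, n=5):
    """Return the n most frequent user agents seen in the log lines."""
    counts = Counter()
    for line in lines:
        m = UA_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts.most_common(n)

# Hypothetical log lines for illustration
sample = [
    '1.2.3.4 - - [01/Jan/2006:00:00:00 +0000] "GET / HTTP/1.0" 200 512 "-" "Googlebot/2.1"',
    '5.6.7.8 - - [01/Jan/2006:00:00:01 +0000] "GET /a HTTP/1.0" 200 512 "-" "Yahoo! Slurp"',
    '1.2.3.4 - - [01/Jan/2006:00:00:02 +0000] "GET /b HTTP/1.0" 200 512 "-" "Googlebot/2.1"',
]
print(top_agents(sample))
```

Once a bot shows up disproportionately, its IPs can be blocked in .htaccess or at the firewall.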
Blocking the search engines also had an immediate effect on our ranking in those same engines, so we've got to either find a happy medium or go dedicated.
Any previous mention of "us" almost certainly pertains to a diligent creature by the name of Emperor thereabouts.