Unwanted robots.txt by Dreamhost?


#1

Hello!

A user on another forum is saying that DreamHost inserted a robots.txt file into all of his domains, without his knowledge, blocking all bots, including Googlebot, Yahoo Slurp, etc.:

# go away
User-agent: *
Disallow: /

Is this true? It’s a serious accusation, and it would be nice to have an official reply.

In my case search engine traffic is very important, and having my sites block it would be terrible.


#2

Well, I don’t have an “official” response for you since I am just another customer, but of the half-dozen or so domains I’ve had at DH, I have never had them put ANY kind of robots.txt file in place.



#3

Obviously I can’t speak for DreamHost, but I have not noticed anything like that happening on any of my domains. The only robots.txt files I see are the ones I put there.

Mark




#4

I have also never had such a problem, so I want to know more about this.
He also quoted the support reply, so if anyone from DreamHost wants to confirm it, please PM me and I'll point you to where it was posted.

This is really important.



#5

If Googlebot is your primary visitor and your website is inefficient enough to be using 250 CPU minutes from Googlebot alone, we will put up a robots.txt. If you put up your own robots.txt to exclude your 16 MB photo gallery that is resizing images on the fly, then we won't create one for you, because we won't notice you pushing the load on your machine high enough to make others complain.

So basically, if you are always watching and optimizing your site and being a good neighbor, you are in the clear. But most people don't know how to take care of their sites, so we do it for them to keep the machines stable. The trade-off is that a couple of robots.txt files are better than disabling the problem users, making them upgrade to a dedicated server, or making them learn how to take care of their site if they don't already know how.

If we did place a robots.txt on your site, you are welcome to update it to better suit your site, since you obviously know what should be skipped better than we do.
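For example, a robots.txt that only keeps crawlers out of a heavy, resize-on-the-fly gallery while leaving the rest of the site crawlable might look something like this (the /gallery/ path is just an illustration, not a path we set for you):

User-agent: *
Disallow: /gallery/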

If we did place a robots.txt on your site the policy is that you would be notified. Since it seems that some are claiming that a robots.txt was placed on their site without notification, I will issue an internal memo reiterating the policy.


#6

Do you have a link to the forum?


#7

http://forums.digitalpoint.com/showthread.php?t=206600

We contacted him about it. Google crawls everyone’s sites; his is one of the 1 in 10,000 that can’t handle it because the code is so crappy.

http://forums.digitalpoint.com/showthread.php?s=52417ca9af34fca9d00abeaf5d84bf93&t=202428

Of course we disable sites that are openly distributing large amounts of copyrighted material that they don’t have the rights to distribute.


#8

Good job, Michael! When I woke up today and went there, you had already solved the problem.
What I didn’t want was to start another exchange of posts without an official reply, like in other forums you know. Flame wars just scare people away.
I knew something had to be wrong with his site regarding heavy usage.

And you solved it. :)



#9

Thanks for pointing it out.


#10

No problem, Michael. Anytime I see something that needs official clarification, I’ll let you know. Your image could have been seriously hurt without it.

Bye and have a great new year!
