I would like to hear DreamHost's side

There is an article here:

It suggests that DreamHost blocked Googlebot from indexing one of its clients’ websites.

The writer didn’t mention asking DH for comment on that, so… what is DH’s side?


You don’t get a dedicated server for $9.95/month. Some people just can’t seem to understand that.

If he was hogging up the resources, they could have just completely shut him down. Then, he’d be complaining about that, instead of just missing some Google traffic until he gets things under control or moves to a dedicated server.

It’s SHARED hosting. Should DH have just let things go the way they were, making every other user on his server suffer?

:stuck_out_tongue: Save up to $96 at Dreamhost with ALMOST97 promo code (I get $1).
Or save $97 with THEFULL97.

I recall a thread about a similar issue a while ago.

If I recall correctly, the reason given by a DreamHost rep was that the site was grossly inefficient and the GoogleBot crawling alone was enough to bring the server to its knees, impacting other users on the server.

I just did a quick search and found the thread I am referring to…



Save [color=#CC0000]$50[/color] on DreamHost plans using [color=#CC0000]PRICESLASH[/color] promo code (Click for DreamHost promo code details)

Fully agree.

We are in a SHARED server.

For $9.95/mo you get 200GB of space and 2TB of bandwidth – what else do you want?

Just be nice and we will be happy – with our neighbors.

Save $97 with promo code: [color=#CC0000]97YES Sign Up NOW[/color]

Well… What is the meaning of 2TB bandwidth if it can’t be used? Just a nice number? Should it be read “we give you 2TB, provided that you never use it”?

If the bandwidth is exhausted, the account should be blocked, but while it’s not - forgive me for being blunt - it’s none of anybody’s business whether the traffic is coming from browsers, from crawlers or from outer space.

2TB bandwidth does not necessarily mean high server usage and vice versa. If someone writes a bad PHP script, everyone on the server loses, regardless of how much data is being pushed through that site.

For 30% more diskspace, 30% more bandwidth, and $40 off use promo code [color=#CC0000]THEFULLMONTY[/color]

Web Design and Development

There is more to it than simple bandwidth usage; other (shared) system resources have to be considered as well.

In the thread I linked above, Michael (a DreamHost honcho) mentions the possibility of a site having a large gallery, where the images are resized on-the-fly, consuming a large amount of CPU/Memory resources. Would you really want your sites sharing a server with such a site when the GoogleBot decided to crawl the entire gallery?

It is things such as this that people really need to consider when creating and optimising their sites for a shared environment. Unfortunately, many do not consider such things until after they are causing problems for others on their shared server.



If the usage is limited by CPU usage or by any other parameter, I should know about it.

Is it mentioned in the TOS? Can I see anywhere that I’m reaching a threshold? And what would this threshold be?

If the shared server can’t handle the thresholds according to which they sell their hosting services, they should set REAL thresholds or put fewer accounts per server.

Now… I’m a VERY happy customer, but this policy might affect me in the future. Apparently they don’t send a notice to the user that his account is reaching a threshold; they change it and inform him post factum.

And indeed, they rightly send no notice that the threshold has been reached - because it hasn’t been.

So please - as long as a subscriber is not deviating from his package constraints, it’s nobody’s business if the code is “crappy” (quoting Michael) or there’s on-the-fly resizing of a large gallery.

DreamHost offers shared hosting. This means it is imperative that web server performance be maintained to avoid impacting many customers.

Read this post from DreamHost: Introduction to Server Administration

“If a site is affecting the performance of a server we reserve the right to even shut it down completely until the site can be fixed. Still, we do work hard to find other solutions and in this case we merely blocked google bot until the problem could be resolved. Overall there is nothing dumb or evil or sneaky about trying to keep a server up and functioning well. If we didn’t stop sites from running out of control there would be ten times the number of customers complaining that our servers were slow and crashy.”

Anyway, it appears that most of the people unhappy with blocking the bot are the people who don’t expect a lot of human visitors in the first place. Which explains why they don’t care if some other customer’s human visitors have to wait 30 seconds per page. It is more important to them that a robot add them to a search engine, because whether or not it does affects their bottom line. These are the same ones who don’t want to spend money to make sure their site never goes down…

:cool: openvein.org -//- One-time [color=#6600CC]$50.00 discount[/color] on [color=#0000CC]DreamHost[/color] plans: Use ATROPOS7

Whose business is it what kind of visitors one expects?

Is my site less important than another site only because it serves green people with antennas?

Any activity on the site (as negligible as it might be) affects the server’s performance. As long as an account is within the limits of the package, it should not be interfered with.

If it’s exceeding the limits, it should be blocked. No administrator should change the account settings just because he has the feeling that the code should be more elegant.

Actually, excessive CPU usage does get a mention in the DreamHost TOS:

[color=#0000CC]“Servers are shared with other customers, and as such IRC-related activities or severely CPU intensive CGI scripts (e.g. chat scripts, proxy scripts, scripts which have bugs causing them to not close properly after being run) are not encouraged.”[/color]

The hard limit on CPU usage was removed a while ago. These days, your site’s CPU usage is not a factor, unless it is great enough to negatively impact others on your shared server.

The way I see it, DreamHost support basically has two options when they discover a site using enough resources to negatively impact others on that particular server. They can simply shut down the site completely, then notify the customer so that they may attempt to rectify the situation; or they can (as they seem to have done in this case) determine the root cause of the resource usage, temporarily remove that cause, then notify the customer so that they can attempt to rectify the situation. The second option at least allows the site to stay up for visitors in the interim.



To respond to the questions about CPU usage, DH is nice enough not to limit how much CPU time you get unless you’re affecting other users. You can turn on CPU monitoring for a user in the Panel > Manage Users area.

DH did set hard limits on CPU usage for a few months, but it wasn’t working out very well for a lot of customers, so they lifted the hard restrictions and began working to put fewer users on each machine.

Also, it is easily possible to use up all your bandwidth and storage without overtaxing the server. Simple HTML files and images will accomplish this. The things that cause trouble are sites that require the server to process each and every page before sending it out to the visitor. WordPress in its default incarnation can be a very bad resource hog for a semi-popular site. Poorly set up and/or coded sites that begin getting a lot of traffic can bring even dedicated servers to a crawl.

It has been my experience in reading this forum that DH does not care what is bringing the traffic to your site, but simply whether you’re affecting other users on your shared server. If it’s a Google bot or Fark users, it doesn’t make much difference. Also, it’s very rare that DH actually shuts anyone down immediately for using up too much CPU time. Generally they will either (a) contact the user and ask them to fix up their site or move to dedicated hosting, or (b) move them to a temporary server where they can sort things out for a month and reduce usage or have time to find other hosting.

However, if things really worked the way you suggest, you wouldn’t like DH at all. What happens the next time you contact support because your server is crawling? The tech would reply: “I’m sorry your server is slow; one of the other users has a runaway script that’s stuck in a loop and using up all of the CPU and RAM, but because he hasn’t exceeded his bandwidth yet, I can’t do anything about it.”

Of course the other way is to set limits on CPU time, and as I said above, DH tried that and it wasn’t really working out. There would be 5 or 6 posts here each week from users being told they were using too much CPU time. There’s a blog post about it IIRC, or maybe that was a newsletter. But because such a small percentage of people were going above the allotted time, it was cheaper and better for DH to move those few to servers with fewer people on them and remove the restrictions for the rest of us.

art.googlies.net - personal website

[quote]If it’s exceeding the limits, it should be blocked. No administrator should change the account settings just because he has the feeling that the code should be more elegant.[/quote]
So you’re implying that the DreamHost employee in question actually disabled the account because he thought the code was ugly?
I hope I read that wrong…

Chips N Cheese - Custom PHP installs and the like!

Well, it’s the host’s business if you can’t match your traffic to a suitable solution on your own.

Then you make the site efficient on your own, before it’s a problem. Or you move to a dedicated server BEFORE you outgrow a shared hosting account.

It’s really not as complicated as the people that get cut off like to pretend it is.

Read and understand this: MOST hosts would have just completely shut him down. ALL hosts will take action, however they see fit, when you start affecting the other users.

Again… it’s called shared hosting for a reason.


On a shared host, if the “kind of visitor” a site is getting includes a bot that is not respecting a robots.txt directive (like slowcrawl) and is hammering the server with rapid-fire and/or repetitive requests, it’s the “business” of everyone on that server! :wink:
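For reference, a crawl-throttling request in robots.txt looks something like this (a sketch only: Crawl-delay is a non-standard directive that some crawlers honor and others ignore – Googlebot in particular ignores it, and its crawl rate has to be adjusted through Google’s own webmaster tools instead):

```
User-agent: *
Crawl-delay: 10
```

A bot that ignores a directive like this, or never fetches robots.txt at all, leaves blocking as about the only option the host has.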

A bot that misbehaves, or is using faulty logic, is effectively a “low-grade” DoS attack, and should be curtailed.

Your arguments about “using your bandwidth” are largely irrelevant; it’s not the bandwidth, it’s the server load - things like concurrent requests, memory, and CPU. When it comes to websites, there is “traffic” and then there is “traffic” - different types of traffic place different loads on the server’s resources. Rarely is the bandwidth usage a problem, or even a consideration, as long as the pipe is “big” enough (and DH’s is!).

If you really want to hear DH’s side, there is a pretty good response from DH in the comments responding to a similar blog post (check out the comments by “John” from DH). He actually explains the situation fairly well (not that many of the other commenters were sufficiently literate to actually read what he had to say or were bright enough to understand what he was saying). :wink:


And bringing a server to its knees is definitely not within limits of any package, thankfully.

There was a bot hammering a website’s CGI script. Which one do you want to block: the site, the script, or the bot? Especially when the bot is known to misbehave and, as far as you know, there is nothing wrong with the script other than that it should not be hammered by that bot. Some people might like having their site disabled in this case, but I don’t know of any.

Inefficient code can cause problems too. And there is code out there that is known to be inefficient or not suitable for the type of environment DreamHost provides. Since DreamHost isn’t responsible for fixing the code its customers upload, its only recourse would be to affect site configuration or hosting service instead.

And besides, blocking the bot in the .htaccess file can be undone by the customer. That way the customer can fix the site, or wait for the hammering to stop and remove the block, without having to wait for support to do it.
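As an illustration, a block like that is only a few lines in .htaccess, and just as easy for the customer to delete again (this is a sketch using standard Apache mod_setenvif and Order/Allow/Deny directives; DreamHost’s actual block may have looked different):

```apache
# Tag any request whose User-Agent claims to be Googlebot...
SetEnvIfNoCase User-Agent "Googlebot" block_this_bot
# ...and deny those requests while letting everyone else through.
Order Allow,Deny
Allow from all
Deny from env=block_this_bot
```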


Now that’s just ignorant! If you read the TOS, you see that one of the “package constraints” is the clear understanding that the primary function of the hosting service is to serve websites - “on-the-fly-resizing-of-large-gallery” is a classic example of someone using the webserver’s CPU and memory resources for something that is not related to the serving of pages, but is rather using it as an image processor.

That stuff should be done “off the server”. The server’s primary job should be to serve pages, not process data. The fact that we can use the servers’ CPUs and resources for a “reasonable degree” of processing, and serve “dynamic” pages, does not mean we have no responsibility to appropriately pre-process the data we ask the server to serve, so that its resources are directed at the actual serving of pages rather than the processing of data.

This is particularly important when using a shared server; if you are on a dedicated box, you can decide what the server’s resources should be used for, and decide for yourself that you are willing to negatively impact the serving of pages in favor of dynamically resizing images (or any other CPU-intensive processes). When sharing a server with others, their needs must also be accommodated, and your processing of “crappy” code and/or using the server as your own personal computer for general processing purposes should be curtailed.

Part of understanding “package constraints” is understanding the whole concept of shared hosting, and understanding the entirety of the TOS, not just the “quotas”. You can be well within published “quotas” and still be in violation of the TOS by misusing the machine.

To suggest otherwise just indicates a lack of knowledge and understanding of how webservers work. :wink:


I’d pick the bot. What did I win?! :stuck_out_tongue:

I think a lot of people would think there are two correct answers, and which two they are, would depend on the one DH thought was the right choice. :wink:


Common sense, without the help of a TOS page, should tell you that you don’t get $500 worth of server for $9.95.

They have already stated that people are using all of their bandwidth & space. It can be done.

Want fewer accounts per server? Pay for it. You can even get your own server (not for $10) and be the only user on it, so you can hog up all of the resources without affecting anyone.

Why waste time learning how not to be a sloppy coder when you can just bring your own dedicated server to a crawl whenever you want? woo hoo!

You keep mentioning how you think things should be handled, but you keep failing to mention how much you’re willing to pay for it.

You know, you don’t have to be a computer genius to get how this works. You just have to understand the word “shared.”

If you need to defend crappy coding to make an argument, it’s a pretty good sign that you’re wrong.

Also, for every teary-eyed customer that gets booted for not getting that the server is shared, there are hundreds of happy customers that are glad to see him gone and their sites being back to normal.


Never a truer word was spoken (though, actually, I suspect there may well be thousands of other customers glad to see that user gone :wink: ).