Site down for the third time in two weeks


Can anyone comment on whether this is normal? I've been with DreamHost for two weeks, and the site is down for the third time.

If any of the staff members read this, please check my issue: 2915313

So far, no one has looked into it.


Unfortunately, this isn’t abnormal for a new account. All the new accounts get piled on the same machine, which really takes a beating as all the new users do wild and crazy things with their accounts.

For now, submit a ticket, as you have, and if you hear nothing back in over 24 hours, submit a new ticket and state that it’s a resubmission of 2915313.

Over time, the machine will calm down and things will seem more stable.



Are you saying it is ok for a new account to be down?

And yes, it was back up for two hours and is down again.

How do you guys keep your customers?


I submitted another ticket.

I have tried many hosts, but have never experienced a site being down 4 times in 2 weeks.

And if it goes down, why does it take hours to come back up?


I don’t see anything in my post that implied that it’s ok for a site to be down. Typically, the site isn’t completely down. The server’s just so bogged down that it can’t respond to requests in a timely manner.

I don’t have customers; I don’t work here. And as a customer, when I have a problem, I submit a Support ticket.



My bad, I thought you worked at DreamHost.

I did submit a ticket (tickets, actually).

I even got a reply saying it's okay when it actually isn't.

It's been six hours and counting. What kind of web host has such problems that a site is down six hours a day?


What’s the URL for your domain?

And, as a test, log into your shell account and type ‘uptime’. This will tell you three things:

  1. If the server is actually up, which it probably is.
  2. How long it's been up (though this doesn't show how long it's been since Apache was restarted).
  3. What kind of load the server is under. It should be less than 5. Mine’s hovering at about 5 right now, which is so-so, but the site’s responding reasonably well.
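For example, here's a quick way to read those numbers from the shell. This is just a sketch; the exact `uptime` output format varies by system, and the sample numbers below are made up:

```shell
# Run this over SSH on the web server. Output is an example only.
uptime
# Sample output:
#  18:42:01 up 87 days, 4:12, 12 users, load average: 4.87, 5.02, 5.10
# The three numbers are the 1-, 5-, and 15-minute load averages.

# On Linux you can also pull the 1-minute average directly:
load=$(cut -d' ' -f1 /proc/loadavg)
echo "1-minute load average: $load"
```

A sustained load well above the number of CPUs means the box is struggling to keep up with requests.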



URL is:

Support is giving the worst answers ever.

The services your sites are on are up and running. Looking further into
this, I see that your processes are being killed by the server's
process-watching service for using up too much virtual memory…
I then looked at why your site was using up so much memory by checking
your access logs, and found a few IPs, including Google and Yahoo, that
were making excessive connections.
I have blocked these IPs; their connections made today are on the left


One of those IPs was mine.

Of course they're making connections. I made connections myself over a 12-hour period to see whether they had fixed it.

And so far the site is still down. Soon it will be 24 hours.

I understand problems occur, but if a web host (which claims to be good) can't get your site up and running over a 15-hour period, they should not be offering the service to anyone.


Heh, the site works just fine, though I’m not nearly as far away from California as you.

Yeah, it's bad news to find out that search engines and other visitors are causing your site to suffer a denial of service. At least it's not a hardware or server problem!

Well, of course, one thing to do in such a situation is block the web server from handling requests from these offenders until they stop hitting your site so often. I'm guessing DreamHost added or modified your .htaccess with "Deny" directives.

So here is what you need to do: keep an eye on your "error.log" file, and when the "denied" messages slow to a trickle, comment out the "Deny" directives in .htaccess. If the site goes down again soon after, just uncomment them.
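In case it helps, the "Deny" block in .htaccess usually looks something like this. The addresses below are placeholders, not the ones DreamHost actually blocked:

```apache
# Hypothetical example: substitute the actual offending addresses.
Order Allow,Deny
Allow from all
# Comment these out once error.log shows the "denied" hits trickling off:
Deny from 203.0.113.45
Deny from 198.51.100.0/24
```

Each "Deny from" line can name a single IP or a whole range.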

I see that you are running WordPress, do you have caching enabled?

:cool: -//-


The site is back now, but I have no idea for how long. I'm very skeptical.

I see no point in blocking Yahoo and Google.

It just shows DreamHost can't handle it.

Yep, running WordPress, and caching is enabled.


Granted, you shouldn't have to block them, but the problem is that, like children, search engine bots misbehave at times. They might only misbehave for a relatively short period. You have a few options available to you:

  1. Block them until they stop visiting your site so often ("ignore them until they shut up")
  2. Add robots.txt rules to slow them down (“ask them to stop”)
  3. Throw money at the problem and move to dedicated hosting (“give in to every demand”)
  4. Find out why they want to visit so often in order to prevent it from happening
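For option 2, the robots.txt rules might look something like the sketch below. Note that Crawl-delay is honored by Yahoo's Slurp and some other bots, but Googlebot ignores it; Google's crawl rate has to be set through their webmaster tools instead. The Disallow path is just an example:

```
# robots.txt (must be plain text, served from the site root)
User-agent: Slurp
Crawl-delay: 10

User-agent: *
Disallow: /wp-admin/
```

Crawl-delay asks the bot to wait that many seconds between requests, which spreads its visits out instead of letting them pile up.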

Exactly! Just like pie (server resources), there’s only so much to go around - and with shared hosting, you are sharing that pie with other people. DreamHost has told you that your site is trying to hog the pie. You can either stop your site from hogging the pie, or get a pie all to yourself by finding a dedicated hosting service.

Of course it's easy to blame somebody else here, but you do have to realize that you are responsible for your web site, not DreamHost. If the problem is indeed caused by high traffic, then they can only suggest that you pay more for a virtual private server or find someone else to host your sites. You can allow DreamHost some time to make sure their hardware and the software they provided is not at fault, but if they are not in the wrong, it is ultimately up to you to fix your site or determine what your site needs in order to run smoothly.

Now, if you don’t want to have to block any visitors, you might want to start learning how much memory your site needs, because DreamHost has told you it is using too much. You might want to hire a PHP developer if you are not familiar with web technologies.

Another problem is that your robots.txt file is incorrectly formatted as rich text (RTF) rather than plain text.

Another problem is that "" is redirected to "", but the links to all your images point to "". This means the browser hits your web site twice for each image (your access.log must be gigantic as a result).
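One common fix, sketched here on the assumption that the www host is the canonical one and that mod_rewrite is available (as it is on DreamHost's standard Apache setup), is to do the redirect once in .htaccess and then fix the image links in your templates so the redirect rarely fires:

```apache
RewriteEngine On
# If the requested host doesn't start with "www.",
# send a single permanent redirect to the www version.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
```

With that in place, a request for an image without the www costs one redirect instead of confusing browsers and caches, but the real win is editing the links so no redirect is needed at all.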

You have 1,140 results in Google and they all seemed to have been cached in the past week.

:cool: -//-


hi Atropos7

The problem has not been caused by high traffic.

If DreamHost can't handle fewer than 10,000 visitors a day on shared hosting, then they should not offer the service.

I fixed the robots.txt; I used the wrong editor at first.

What do you mean, the images are at… instead of

Here, all the images have www.

Thanks for the help, though; you have pointed out some problems that might be causing this.

I still believe that shared hosting should easily manage a site my size (traffic- and bandwidth-wise).


I just checked, and the site’s up, though a little slow.

The banner image is missing the www, which will slow things down a tiny bit. It also has an extra slash - not a deal breaker, but something needs to be edited. I found only one other reference that was missing the www, but that was a link.

10k visits a day shouldn’t be too much to ask. That’s about 7 visitors per minute. But you shouldn’t be getting so hammered by search engines. I do expect that with some efficiency tweaks, you won’t be having any problems being hosted in a shared environment.