How can I find out what's causing memory limits to be reached?

Every month or so, I get an email from Dreamhost telling me that one or more of my users is hitting the memory limits, causing my sites to go down. (I am on a shared server)

I’m very much aware of how poorly my sites run… and how often I get 500 errors and just random downtimes and errors trying to update plugins and themes and Wordpress versions. But I’ve asked Dreamhost on a handful of occasions WHY I’m hitting my limits and I can never get a clear answer.

Since I don’t use FTP very often, there must be some process running under my FTP user, but I don’t know what it is. If I could just see what processes are running, and which one immediately maxes out my memory limit, that would be really helpful. But for some reason, Dreamhost can’t figure it out.
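For what it’s worth, if your DreamHost user has shell access enabled, you can SSH in and list your own processes sorted by memory yourself, rather than waiting on support. A minimal sketch, assuming a Linux server with GNU `ps` (which DreamHost’s shared machines provide); on other systems the flags differ:

```shell
# List this user's processes, biggest resident memory (RSS, in kilobytes) first.
# --sort=-rss is a GNU ps (procps) option; BSD/macOS ps would need a different form.
ps -u "$(id -un)" -o pid,rss,etime,comm --sort=-rss | head -n 10
```

Running this during (or just before) an outage is the useful moment; a stray php-fpm or cron-spawned process sitting at the top of the RSS column is usually the culprit.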

It’s not a small problem. I’ve hit my limit “12482” times. No idea what time frame that is. Per month? The number is typically that high, though.

Is there somewhere I can look? Is there a way I can word it to the helpdesk in a way that lets them know exactly what I’m looking for?

I have multiple pieces of software running. Wordpress (a few instances), Gallery2, Mediawiki, phpBB… If one of them is running wild, I just need to know which one it is so I can tweak it or turn it off.

Thanks for any guidance y’all can provide.

Do you have an exact error message or screenshot? We might be able to troubleshoot the issue, but we need more details to figure out what is happening. :slight_smile:


Other than the White Screen of Death, and just random outage messages from Jetpack, I don’t receive any error messages. When I do use FTP, I have no problems. I only encounter the WSOD if I’m updating plugins/themes/WP. And Dreamhost hasn’t provided anything other than this form email that I get once a month:

Our monitoring systems show that one (or some) of your user accounts may be making your web hosting account operate inefficiently. We noticed you’ve frequently hit the memory limits of your shared hosting plan over the last couple weeks. When this happens, our system automatically stops web processes which could be negatively impacting your server’s performance. This means your visitors may see errors or be unable to access your website at all for brief periods of time.

*These are the FTP/shell users on your account, alongside the number of times they’ve hit their memory limits:*



I just got the same email (with a different number of times the user has hit the memory limits, but similarly “high”; what even counts as high, I have no clue), and I’ve gone through and done everything the email advised that applied to my websites and that I could figure out how to do. I feel like this tends to happen when WordPress updates itself and then I go and update my themes and plugins.

I don’t get WSOD, but I do get the odd 500 error when updating plugins (particularly Jetpack, so for the most part I’ve just ditched Jetpack across the board and installed replacements for the stuff I might have used it for). I guess I’ll have to wait and see whether the last suggestion on their list (the one I hadn’t tried before but figured out how to do) makes a difference.

Ultimately, I don’t care. The visitor to my website that interests me is… me, and I’ve not had issues accessing my own sites. What would be great is if Dreamhost could tell me what the actual problem is, or accept a response of “I don’t care” and stop sending me this email every so often! Maybe it really is just WordPress updating itself and having a funny turn while doing so! Or is it just an upsell thing?

Yes, I have deactivated Jetpack from as many sites as I could. That really does take a toll on the memory and almost never updates without erroring and causing me to have to delete it and reupload it via ftp. Updating WP versions is a nightmare. It’s 50/50 on whether I make it through the whole update before it errors out… and then I have to manually update all of wordpress. What a pain.

But yeah, what is high?
What is causing it?
We need to know before we can fix it. I’m sure it’s partly to convince us to switch to VPS but sorry, that is too expensive.

You might try something like the P3 (Plugin Performance Profiler) plugin to see which plugins are taking up memory.

Shared hosting won’t give you solid numbers on what “high” means; that’s the nature of shared hosting. But if you are running into its limits, know it isn’t an upsell tactic. Your site likely won’t work on any shared host; it is simply demanding too many resources.

Fortunately, reducing plugins has a dramatic effect on most sites. I run several WordPress sites on DH shared hosting, and without more than a few plugins they run fine. :slight_smile:

@Maiki, thanks for the reply. Sorry, it’s been a crazy month for me and I haven’t been able to try this yet. However, I see now that reviews of P3 say it no longer works.

As I’ve let all this stew, I have been thinking that most of the problems I’ve had with WordPress did in fact start with the advent of Jetpack. As much as I loved the idea and enjoy much of the functionality that Jetpack provides, it just makes WP so sluggish and really hard to run successfully on a shared server. Seems weird that installing a separate plugin for each function I want would be a lighter load, but… here we are.

You’ll find lots of conversations debating how it all pans out, but my experience is that Jetpack doesn’t slow down a site with code, it does it with external connections, which shared hosting isn’t optimized for.

Most of the sites I host use only a few plugins, so shared hosting isn’t bad for that. :slight_smile:

Well, I guess this is more common than I thought. Same issues with the 500 error and email from dreamhost about hitting memory limit.
What gets me is that I hardly have any traffic right now going to this site, and my site comes up as well optimized when using tools such as YSlow and/or Google’s tools.

DH points to googlebots hitting my site a lot and tells me to update my robots.txt file to limit them. But my question here is, don’t we want googlebots to crawl my website???

I just switched to DH, so I’m still in the trial period, and I am not liking this. I am spending more time figuring things out like this than actually working on the content… anyways, just frustrating…

Has anyone been faced with googlebots crawling your site so much that it exceeds memory limits???

@vero No, in all my emails from Dreamhost, I never saw that. But I can tell you that after the discussion that took place here, I disabled Jetpack and downloaded just the plugins I wanted… and things have been running pretty darn smooth ever since (on multiple sites!). So… despite what some may think/say, my anecdotal evidence shows that Jetpack is a big problem with Shared Hosting.

I don’t have Jetpack…

@vero I’m sorry, that answer is unacceptable. :smiley: Sorry…

Umm… so yes, you do want googlebots crawling your site. Unless they’re spoof googlebots. So… in that case maybe disallow all except the major search engines?

```
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: Slurp
Allow: /

User-agent: msnbot
Allow: /
```
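If the goal is to slow the legitimate crawlers down rather than shut anyone out, there is also the `Crawl-delay` directive. Two caveats worth knowing: Bing and Yahoo’s Slurp honor it, but Googlebot ignores it (Google’s crawl rate is adjusted in Search Console instead), and each crawler obeys only the single group that best matches its user-agent, so the delay has to sit in the same group as the `Allow`. A sketch along those lines:

```
User-agent: Slurp
Allow: /
Crawl-delay: 10
```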


I have been searching for what to actually include in the robots.txt file to prevent bots from hitting it but still allow the ones I should have, such as googlebot, msnbot, etc… and YOU have been the ONLY one that has actually answered that question for me. Thank you!!! I think this is the right way to go in my case and still have the ‘true’ bots crawl my site…

@vero Glad I could help in some way. Good luck!

Side note: I’ve been with Dreamhost for 12 years. They are a great company. Their shared server services continually degrade, however, as they attempt to push their more profitable products. I can’t blame 'em for that… but I also can’t say it isn’t disappointing to no longer have the level of confidence in uptime I once had. All other aspects of their service are wonderful and, in my opinion, worth sticking around for.

That is a lot of years with one host company!! I have actually been thinking about switching to bluehost… and it seems they don’t have a limit on their shared services… still thinking about it as I don’t really want to go through the hassle of switching over… but not sure… this limitation is ridiculous, especially when I just joined them and have hardly any traffic whatsoever…

Robots.txt is an ineffective method of controlling unwanted traffic.

On a shared server, the best approach is to use your server access logs to identify the problem actors, then block them with one of the various methods available via .htaccess.
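To make that concrete, here is a minimal sketch of the log-analysis step. The sample log lines and the `/tmp` path are made up for illustration; on DreamHost the real logs typically live under `~/logs/yourdomain.com/http/access.log`. The idea is just to count requests per client IP so the heaviest hitters float to the top:

```shell
# Build a tiny sample access log (stand-in for ~/logs/example.com/http/access.log).
cat > /tmp/access.log <<'EOF'
203.0.113.5 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 1234 "-" "BadBot/1.0"
203.0.113.5 - - [01/Jan/2024:00:00:02 +0000] "GET /a HTTP/1.1" 200 1234 "-" "BadBot/1.0"
203.0.113.5 - - [01/Jan/2024:00:00:03 +0000] "GET /b HTTP/1.1" 200 1234 "-" "BadBot/1.0"
198.51.100.7 - - [01/Jan/2024:00:00:04 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0"
EOF

# Count requests per client IP, busiest first. The IP is the first
# field of the common log format, so awk extracts it, then uniq -c counts.
awk '{print $1}' /tmp/access.log | sort | uniq -c | sort -rn | head
```

The same pipeline with `$1` swapped for the quoted user-agent field finds abusive bots instead of IPs. Once an offender stands out, it can be blocked in `.htaccess`, e.g. `Deny from 203.0.113.5` on Apache 2.2, or `Require not ip 203.0.113.5` inside a `<RequireAll>` block on Apache 2.4.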
