Making changes to the memory_limit in my DH account


Hey guys, I just want to make sure I’m doing this right. If I’m making custom PHP changes account-wide, I first:

  1. Log in to FileZilla
  2. Open the .php folder (it’s hidden, but I found it)
  3. Go into 5.3 (I’m going to do 5.4 as well)
  4. On my desktop, make a blank txt file with Notepad and name it:


Notice that there’s no period and no file extension. Is that right?

  5. Then, in that blank text file, create the entry:

memory_limit = 256M

  6. Save the file and upload it to both the 5.3 and 5.4 directories via FileZilla
  7. Wait 30 or 40 minutes for the changes to take effect.
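So locally, what I’m doing amounts to this (the server-side paths are just my reading of what FileZilla shows me; correct me if I’m wrong):

```shell
# The file I create locally before uploading (no extension, as noted):
printf 'memory_limit = 256M\n' > phprc
cat phprc

# After uploading via FileZilla, my understanding is it ends up at:
#   ~/.php/5.3/phprc
#   ~/.php/5.4/phprc
```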

PS: Is there a way to check the info.php file to see if I’ve done it right?

Thank you for all the help in advance!


If by info.php you mean a phpinfo page, then yes: just search the page for ‘memory_limit’ :slight_smile:
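If you don’t already have one, a phpinfo page is a one-liner; something like this (the domain in the comment is a placeholder, not yours):

```shell
# Create a minimal phpinfo page locally:
printf '<?php phpinfo();\n' > info.php

# Upload info.php to your site root, then load it in a browser and
# search for "memory_limit", or from a shell (your-domain is a placeholder):
#   curl -s https://your-domain/info.php | grep -i memory_limit
```

Remember to delete (or password-protect) info.php when you’re done, since it exposes server details.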

If you’re on shared hosting and you need 256M, you may have some more issues. Generally you shouldn’t use more than 128M on a shared server. If you’re on a VPS, that’s different.


Understood, I will try 128M and see if it works. The problem is that I’m trying to import the demo content for a WordPress theme, and the import keeps stalling out while the images and whatnot are being brought onto the site.

Is upping the memory limit a good approach to dealing with this? It’s always worked for me in the past (I used to adjust the memory limit on a site-by-site basis, but that process stopped working recently).

I’m just making sure there aren’t alternate ways of doing this that might be better.

Thanks for the continued help!



Yes, upping the memory limit (temporarily) is a good approach.

Another is to import in chunks. I used to export per-category when I had to move larger sites.

Obviously the BEST approach is to FTP/sync the files and copy the DB over!
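Roughly, that route looks like the sketch below. Every hostname, username, and database name is a placeholder, so adjust for your own setup; I’ve written it to a script first so you can review before running anything:

```shell
# Sketch of the "sync the files + copy the DB" route. All hosts,
# users, and DB names here are placeholders, not real values.
cat > migrate.sh <<'EOF'
#!/bin/sh
# 1. Sync the site files from the old host to the new one:
rsync -avz user@oldhost.example.com:~/example.com/ ~/example.com/
# 2. Dump the database on the old host, then load it on the new one:
mysqldump -h olddb.example.com -u dbuser -p wp_db > wp_db.sql
mysql     -h newdb.example.com -u dbuser -p wp_db < wp_db.sql
EOF

# Syntax-check the script before actually running it:
sh -n migrate.sh && echo "migrate.sh: syntax OK"
```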


Did it work?


When I put lines in phprc to solve this and similar issues, I often put in these three, and may vary the numbers:

memory_limit = 200M
post_max_size = 100M
upload_max_filesize = 100M
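Before uploading, I sanity-check that all three directives actually made it into the file, along these lines:

```shell
# Write the three directives into phprc, then confirm each line landed:
cat > phprc <<'EOF'
memory_limit = 200M
post_max_size = 100M
upload_max_filesize = 100M
EOF

grep -c '=' phprc   # prints 3, one per directive
```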

Continually raising it would eventually max out your RAM (assuming you’re doing something that actually consumes it), either because of memory_limit or the other values set, or because of the total RAM allocation for the user you’re running under. If you hit the 200M in this example but still have user RAM allocation left (for security, the exact default RAM allocation is not revealed on shared servers), you would increase those values in the phprc file to use more of that allocated RAM.

At some point, though, the combined activity of all processing at any given moment may hit the RAM allocation limit for the user. That is what’s called “hitting procwatch”, and in those cases you can occasionally ask tech support to lift the procwatch limit for a period of time so you can run your job.

If it then runs out of control, using so much RAM that it affects other customers, they will put the limit back on, but it is worth asking anyway. You can’t ask routinely, but from time to time when it really warrants it and isn’t going to require so much RAM that, as mentioned, it degrades performance for others on the server.
For ref: