When I add lines to phprc to solve this and similar issues, I usually put in these three, varying the numbers as needed:
memory_limit = 200M
post_max_size = 100M
upload_max_filesize = 100M
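To confirm the phprc values actually took effect, you can check them from a small PHP script loaded in the browser. This is just a sketch; the file name is made up, but ini_get() is standard PHP:

<?php
// check-limits.php (hypothetical name) -- load this in a browser
// to see which values PHP is actually running with.
echo 'memory_limit: '        . ini_get('memory_limit')        . "\n";
echo 'post_max_size: '       . ini_get('post_max_size')       . "\n";
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";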
Continually raising these values would eventually max out the available RAM (assuming the script actually consumes it), either because of memory_limit and the other settings, or because of the total RAM allocation for the ftp/sftp/ssh user the script runs under. If a script hits the 200M limit in this example but you still have user RAM allocation to spare (on shared servers the exact default allocation is not disclosed, for security reasons), you would raise the values in the phprc file so the script can use more of that allocated RAM.
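Before bumping the numbers, it can help to measure how much a given job actually needs. A rough sketch using standard PHP functions; the loop is just a stand-in for whatever import/resize/etc. was hitting the limit:

<?php
// Rough gauge of peak memory use, to decide how far to raise memory_limit.
// memory_get_peak_usage() and ini_get() are standard PHP.
$data = [];
for ($i = 0; $i < 100000; $i++) {
    $data[] = str_repeat('x', 100);   // stand-in workload
}
printf("Peak memory used: %.1f MB\n", memory_get_peak_usage(true) / 1048576);
printf("Current memory_limit: %s\n", ini_get('memory_limit'));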
At some point, though, the combined memory use of everything running under that user at a given moment may hit the user's RAM allocation limit. That is what is called “hitting procwatch”, and in those cases you can occasionally ask tech support to lift the procwatch limit for a period of time so that your job can run.
If the job then runs out of control and uses so much RAM that it affects other customers, they will put the limit back on, but it is worth asking anyway. You can't ask for this routinely, only from time to time when it really warrants it and when the job isn't going to need so much RAM that, as mentioned, it degrades performance for others on the server.