Overcoming the PHP memory limit

software development

#1

I have a script I made in PHP that basically takes a large quantity of images and adds them to a zip for user download (this would run roughly weekly).

Of course, since there are 1500 images, and only about 60 can be processed at a time without PHP throwing the ‘tried to allocate…’ memory error, I was thinking of doing it in batches of 60.

I thought that after processing 60, calling header("Location: script.php?offset=".$num); would work, but it seems that doesn’t clear the memory…
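Here’s roughly what I was trying, stripped down (the paths, zip name, and batch size are placeholders, not my real script):

```php
<?php
// Sketch of the batching idea (not the actual script): the image
// directory, zip path, and batch size below are all placeholders.
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batchSize = 60;

$images = glob('/path/to/images/*.jpg');   // placeholder location
$batch  = array_slice($images, $offset, $batchSize);

$zip = new ZipArchive();
// Create the archive on the first pass, append to it on later ones.
$flags = ($offset === 0) ? ZipArchive::CREATE | ZipArchive::OVERWRITE : 0;
if ($zip->open('/path/to/weekly.zip', $flags) !== true) {
    die('could not open zip');
}
foreach ($batch as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();   // writes this batch out to disk

if ($offset + $batchSize < count($images)) {
    // hand the next batch off to a fresh request
    header('Location: script.php?offset=' . ($offset + $batchSize));
    exit;
}
echo 'done';
```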

Ideas? Suggestions? Or if someone wants the full script for reference, feel free to ask.

-if you’re not afraid… you’re not paying close enough attention -fmkaiba - fuhrer and flame alchemist


#2

Man, I’m sure glad that you are not doing that on the particular shared server that my sites are on! :wink:

Why in the world would you use a shared web server’s CPU and memory resources for that type of processing, when you could just as easily do it on a local machine and upload the zip file (especially if it’s only done “once a week”)?

–rlparker


#3

It actually takes a lot less memory than what a lot of people use running things like IPB. It would be hardly noticeable, and that’s exactly why I want to spread it out and have it only do a little bit every once in a while, so no one gets hurt by the usage.

And besides, it’s because the images are not on my local machine, and downloading them all onto my local machine would use much more memory than running this simple script. lol

-if you’re not afraid… you’re not paying close enough attention -fmkaiba - fuhrer and flame alchemist


#4

In case anyone cares: I have solved the problem by having PHP print a meta refresh that carries a variable with it.
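It ends up looking roughly like this (the script name and counts are placeholders, and the zipping part is the same as before):

```php
<?php
// Sketch of the fix: print a meta refresh instead of calling header(),
// so the browser reloads the page with the next offset and each batch
// runs in a fresh request with fresh memory. Names are placeholders.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

// ... zip the next 60 images here, same as in the earlier sketch ...

$next = $offset + 60;
if ($next < 1500) {
    // the meta refresh reloads this same script after 1 second
    echo '<meta http-equiv="refresh" content="1;url=script.php?offset=' . $next . '">';
    echo 'processed ' . $next . ' of 1500 images…';
} else {
    echo 'all done';
}
```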

A mod or admin can close this topic if they wish.

-if you’re not afraid… you’re not paying close enough attention -fmkaiba - fuhrer and flame alchemist