I've been having a problem serving large files (100 MB+) to my site's users. I am using code similar to the readfile_chunked examples in the comments on the PHP readfile manual page (http://ca2.php.net/readfile).
I have set set_time_limit(0) and am calling flush() and ob_flush() after serving each chunk, yet downloads always fail around the 100 MB mark.
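In case it helps, here is a stripped-down sketch of what my code looks like (the path, chunk size, and headers are placeholders rather than my exact values):

```php
<?php
// Simplified sketch of the chunked download code I'm using, based on
// the readfile_chunked examples in the PHP manual comments.
define('CHUNK_SIZE', 1024 * 1024); // serve the file 1 MB at a time (placeholder size)

function readfile_chunked($path)
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        echo fread($handle, CHUNK_SIZE);
        if (ob_get_level() > 0) {
            ob_flush(); // flush PHP's own output buffer after each chunk
        }
        flush(); // push the chunk out through the web server buffers
    }
    return fclose($handle);
}

set_time_limit(0); // no script execution time limit

$path = '/path/to/large/file.zip'; // placeholder path

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . basename($path) . '"');

readfile_chunked($path);
```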
As far as I can see, the downloads don't fail after a consistent time period, so I don't think the server is timing out the process. I'm more inclined to believe it is something to do with memory on my shared DreamHost server, but apart from flushing the buffer I can't think of what else to try.
Any ideas for a solution would be very much appreciated!