PHP readfile_chunked failing on large files


#1

I’ve been having a problem serving large files (100MB+) to my site’s users. I am using code similar to the readfile_chunked examples in the comments on the PHP readfile manual page (http://ca2.php.net/readfile).

I have set set_time_limit(0) and am calling flush() and ob_flush() after serving each chunk, yet downloads always fail around the 100MB mark.

As far as I can see, the downloads don’t fail after an exact time period, so I don’t think the server is timing out the process. I’m more inclined to believe it is something to do with memory on my shared DreamHost server, but apart from flushing the buffer I can’t really think what else to do.

Any ideas toward a solution would be very much appreciated!


#2

What does phpinfo() say for:

upload_max_filesize = ???M
post_max_size = ???M


Check your php.ini for lines like the above.
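If you don’t want to put up a full phpinfo() page, here is a quick sketch that prints just the relevant values (ini_get() is standard PHP):

<?php
// Print the limits discussed in this thread
echo 'upload_max_filesize = ' . ini_get('upload_max_filesize') . "\n";
echo 'post_max_size       = ' . ini_get('post_max_size') . "\n";
echo 'memory_limit        = ' . ini_get('memory_limit') . "\n";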
You know, this might help me solve my issue with exec not working… Thanks for reminding me.


#3

upload_max_filesize = 7M
post_max_size = 8M

also

memory_limit = 90M

Any ideas?


#4

Yea,

Read this page: http://wiki.dreamhost.com/PHP.ini

It’s not too difficult if you follow the instructions carefully and don’t change anything in the original ini unless you know what it does.

Find and change:
upload_max_filesize = 100M
post_max_size = 100M

You have to change both of them in most cases. Also be aware that DreamHost may have master values set that cap upload sizes, to keep people from hogging an entire 100-megabit connection.

Also, setting set_time_limit(0) and calling flush() and ob_flush() after serving each chunk is unnecessary.


#5

You guys are making my head hurt a little here. Let me try to offer some insight.

Changing upload_max_filesize and post_max_size won’t have any impact on serving files to your users. They specifically configure the maximum size of files uploaded by users, through forms for example.

Modifying the amount of memory used in a shared environment won’t be permitted because any customer could hog resources from everyone else. Even if you could change this setting, downloading files is a very inexpensive operation on system resources.

Your reference to hogging a connection doesn’t apply here at all. The amount of memory operations may use, and the allowed upload size, have nothing to do with bandwidth. I don’t think PHP even has a bandwidth-limiting setting; it’s an application server and doesn’t handle throttling of network traffic.

In order to change any of these things, you would also need to install your own custom version of PHP in your user space. It can be a little challenging, but there is an article about how to do that on the wiki. No offense intended, but I suggest that you consider reading up a little on PHP configuration before performing your own install.


#6

What’s the reason you’re using readfile() for file serving?

http://domain.com/files/abigfile.ext



#7

I suspect it’s so that people can’t see the direct file link and then distribute it to others.

Doesn’t readfile also allow for more sophisticated programmatic functionality, like distributing your own copyrighted content to people who donate or pay for it?
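Something like this hypothetical gate, for example (the session flag and file path are invented for illustration):

<?php
session_start();

// Only stream the file to users flagged as paying members (hypothetical flag)
if (!empty($_SESSION['is_paid_member'])) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="product.zip"');
    readfile('/home/example/protected/product.zip');
} else {
    header('HTTP/1.0 403 Forbidden');
    echo 'This download is for members only.';
}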


#8

You can do a few handy things with it, but there are always other options. readfile() is serious business if you’re not careful (fopen() and friends as well), not to mention it can get heavy in a shared environment.

You’re probably pushing 100MB of file through PHP’s 90MB memory limit.



#9

I’m planning on doing something like that eventually, so do you suggest some better way?

(Other than uploading to a file hosting service lol).


#10

I am indeed trying to offer copyrighted content without exposing the file URL itself (the files sit in a folder protected with an .htaccess file), hence the need to use PHP to serve the file.
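For reference, the .htaccess protection is just the standard deny-all rule, roughly:

Order deny,allow
Deny from all

That way nothing in the folder can be fetched directly over the web.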

I noted your comments, siggma, on changing the max values, but I agree with pangea33 that they shouldn’t affect downloads (I may be wrong).

The reason I am using a custom readfile_chunked function is specifically to avoid using too much memory, as it serves the file to the user in small chunks.

(Here is an example of the code, minus the headers, similar to what I am using.)

function readfile_chunked($filename, $download_rate) {
    set_time_limit(0);                    // no execution time limit
    $chunk_size = 1024 * $download_rate;  // chunk size in bytes ($download_rate KB)
    $handle = fopen($filename, 'rb');

    while (!feof($handle)) {
        $buffer = fread($handle, $chunk_size);
        echo $buffer;
        flush();
        ob_flush();
        sleep(1);                         // throttle: one chunk per second
    }
    fclose($handle);
}
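For completeness, the headers are the usual forced-download set - roughly along these lines (placeholder values, not my exact code):

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));

readfile_chunked($filename, 64);   // e.g. 64 = 64KB chunks, one per second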

The files I am trying to serve range from a few hundred KB to around 500MB. They always seem to fail around the 120MB mark. As noted before, my php.ini settings are:

upload_max_filesize = 7M
post_max_size = 8M
memory_limit = 90M

I am still unsure as to why this is happening.
Any help welcome!


#11

Flushing order: it should be ob_flush() first, then flush().

But if your server uses buffering - and it will - flushing won’t work anyway.
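Inside the loop that looks like this (a minimal sketch):

echo $buffer;
ob_flush();   // empty PHP's own output buffer first
flush();      // then push the web server's buffer out to the client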

You’ll need to disable mod_gzip (ask Support).

Read all about it here



#12

Heya

I’m trying to achieve the exact same thing: let users download 150-700MB files while keeping them unbrowsable. I’m using almost exactly the same PHP approach, and my downloads die after a similar amount of data. If you uncover any solution I would love to know! I’m trying to solve it too, so I’ll post here if I have any success.

Dylan


#13

My current guess is that perhaps DreamHost automatically kills PHP processes that run for more than a few minutes - which would apply to serving up a large download. I’m not sure how to confirm this, though. Do the support staff read this forum?

D


#14

That’s an insightful comment, dylandylan, and it seems extremely likely that you’ve nailed it. DreamHost does indeed have some sort of monitoring mechanism that kills long-running processes. I know you can avoid it by running a “niced” process through the shell, but that obviously doesn’t apply in this case.

If this is correct, it would seem that the culprit is total process time (which depends on download speed) rather than the actual file size. Could you try calling the page with wget from the shell, where download speeds will be extremely high, just to see whether the process completes under those circumstances?
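Another way to confirm it would be to log progress from inside the loop, so you can see exactly how many bytes and seconds in it dies. A rough sketch (the log path is made up):

// Hypothetical instrumentation inside readfile_chunked()
$start = time();
$sent  = 0;
while (!feof($handle)) {
    $buffer = fread($handle, $chunk_size);
    echo $buffer;
    $sent += strlen($buffer);
    flush();
    ob_flush();
    // message_type 3 appends the message to the named file
    error_log($sent . ' bytes sent after ' . (time() - $start) . " seconds\n", 3, '/home/example/download.log');
    sleep(1);
}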

If that turns out to be the problem, the only solution I can think of is exceedingly annoying: use a utility to break the file into smaller pieces, for example a multi-part archive set created with something like rar.


#15

The solution is provided in the post immediately preceding your own, Dylan.

Logic is a far cry from insight.



#16

Thanks, I should remember to use my eyes! Much appreciated, I’ll get on to that right away.


#17

I’ve now had mod_gzip disabled.

Unfortunately it made no difference; the downloads still terminate at about 115MB received.

saddiction, have you had any success?

D


#18

One further piece of info: this method works perfectly well for files under about 115MB. It’s only the larger files that seem to hit a limit.


#19

Thanks pangea33 - your suggestion was a great test. I used wget via SSH (my first time… not so bad) and downloaded two files that I serve via PHP, one ~350MB and one ~700MB. Both downloaded completely and without trouble, each in well under a minute - although I didn’t time them precisely.

As for a solution, I don’t think I can consider breaking up the files - I’m really concerned with keeping this as simple as possible for the user. My next step will probably be to trial a DreamHost private server - they claim not to kill processes, and as long as my memory requirements are low it won’t break the bank.

I’ll report back here on my progress.

Thanks everyone for your help so far.


#20

How long does the script run? Can you provide a phpinfo() link?
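If you don’t have one handy, a temporary file containing just the line below will do (delete it afterwards):

<?php phpinfo();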
