readfile() and PHP as CGI

software development

#1

hi!

i want my visitors to download large files from my site. instead of redirecting them to the actual filename, i want to hide the real path - that's why i use the readfile() function. however, each readfile() request spawns a php.cgi process, which eats memory. with 100 simultaneous downloads the server just dies.

any ideas how to avoid that?

nick


#2

to be more specific, here's the script:

[code]header('Content-Type: video/x-msvideo');
$file_size = filesize('files/full/' . $fn);
header('Content-Length: ' . $file_size);
header('Content-Disposition: attachment; filename="' . $fn . '"');
readfile('files/full/' . $fn);
[/code]

this works, however 20-25 simultaneous users kill the server completely. the load average goes to 100 and the server has to be rebooted.

i also tried readfile_chunked - a custom function that reads the file in parts with fread() and then prints each part in a loop.

same effect.

is this a php config problem @ DH? i see lots of php.cgi processes spawned before the system dies.


#3

From some searches I found this. The general consensus seemed to be that the snippet below was useful for fixing this…

http://us3.php.net/manual/en/function.readfile.php#54295

[quote]When using readfile() with very large files, it’s possible to run into problems due to the memory_limit setting; apparently readfile() pulls the whole file into memory at once.

One solution is to make sure memory_limit is larger than the largest file you’ll use with readfile(). A better solution is to write a chunking readfile. Here’s a simple one that doesn’t exactly conform to the API, but is close enough for most purposes:[/quote]

[code]<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
?>
[/code]
PHP 4 Bugs containing 'readfile'
PHP 5 Bugs containing 'readfile'
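
A hypothetical caller, mirroring the script from post #2 (the basename() call is my addition, to keep $fn from escaping files/full/):

[code]<?php
// hypothetical usage of readfile_chunked(), based on the script in post #2
$fn   = basename($_GET['fn']); // basename() strips any path components from the request
$path = 'files/full/' . $fn;

header('Content-Type: video/x-msvideo');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $fn . '"');
readfile_chunked($path);
?>
[/code]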

http://benconley.net
http://teamshocker.com


#4

thanks, that's exactly the readfile_chunked i was using :) flushing doesn't help - same effect.

one other thing i was thinking about: large files often get downloaded by download managers like flashget. they issue Range requests for partial data (expecting HTTP 206 Partial Content responses) and might flood the server if plain readfile() is used. so i added a routine which handles that as well.
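
a simplified sketch of the idea (not my exact code - the function name and buffer size are just illustrative, and $path is assumed to be sanitized already):

[code]<?php
// simplified sketch of a Range-aware download handler
function send_file_ranged($path, $mime = 'application/octet-stream') {
    $size  = filesize($path);
    $start = 0;
    $end   = $size - 1;

    // parse a "Range: bytes=start-end" header if the client sent one
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d*)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        if ($m[1] === '' && $m[2] !== '') {
            $start = max(0, $size - (int)$m[2]); // suffix range: last N bytes
        } else {
            $start = (int)$m[1];
            if ($m[2] !== '') {
                $end = (int)$m[2];
            }
        }
        if ($start > $end || $end >= $size) {
            header('HTTP/1.1 416 Requested Range Not Satisfiable');
            header('Content-Range: bytes */' . $size);
            return;
        }
        header('HTTP/1.1 206 Partial Content');
        header('Content-Range: bytes ' . $start . '-' . $end . '/' . $size);
    }

    header('Content-Type: ' . $mime);
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($end - $start + 1));

    // stream only the requested byte range, in small chunks
    $handle = fopen($path, 'rb');
    fseek($handle, $start);
    $left = $end - $start + 1;
    while ($left > 0 && !feof($handle)) {
        $buffer = fread($handle, min(8192, $left));
        echo $buffer;
        flush();
        $left -= strlen($buffer);
    }
    fclose($handle);
}
?>
[/code]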

same effect :( i guess the problem is php running as a cgi binary, since that's not a common configuration.


#5

ok, it was all about download managers and http headers.

here’s the solution - use this class

http://www.phpclasses.org/browse/file/9051.html


#6

Thanks a lot for the followup. I was off on a tangent theorizing that you were exceeding MaxClients, because the downloads were tying workers up with keepalives.
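
For reference, the Apache prefork directives I had in mind (illustrative values, not DH's actual config):

[code]# illustrative Apache prefork settings, not DH's actual config
MaxClients        150   # hard cap on simultaneous worker processes
KeepAlive         On
KeepAliveTimeout  15    # idle keepalive connections can pin workers during long downloads
[/code]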

http://benconley.net
http://teamshocker.com


#7

another problem is that DH for some reason starts killing processes once there are more than a certain number of them.

since each http download spawns a separate php.cgi (instead of running inside apache's memory space), they get killed.

any ideas how to fix that??

the processes don't eat any memory/cpu

nick