Connection closed when downloading large files via PHP

software development

#1

Hey,

I am having issues downloading large files via PHP on my shared hosting. Using either my custom script or this one:

http://phpsnips.com/579/PHP-download-script-for-Large-File-Downloads#.Ux0U_fldXn4

… I am getting random disconnects at around 140 MB:

[code]C:\>wget "http://www.xxxxx.com/files/some-large-file.mp3"
--21:33:29--  http://www.xxxxx.com/files/some-large-file.mp3 => `some-large-file.mp3.4'
Resolving www.xxxxx.com... 69.00.00.00
Connecting to www.xxxxx.com|69.00.00.00|:80... connected.

HTTP request sent, awaiting response... 200 OK
Length: 184,120,532 (176M) [audio/mpeg]

75% [==========================>        ] 138,996,649  314.95K/s  ETA 02:50

21:42:15 (258.62 KB/s) - Connection closed at byte 138996649. Retrying.

--21:42:15--  http://www.xxxxx.com/files/some-large-file.mp3
  (try: 2) => `some-large-file.mp3.4'
Connecting to www.xxxxx.com|69.00.00.00|:80... connected.

HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]

11% [===>                               ] 15,800,525   294.39K/s  ETA 06:37[/code]

wget just keeps retrying, and each attempt gets disconnected at around the same point in the download. (Note that on the retry the server responds with Length: unspecified [text/html], so the second request isn't even returning the file.)
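For reference, my custom script follows the usual chunked-output pattern; it's roughly like this (the path and MIME type are hardcoded here just for illustration):

```php
<?php
// Rough sketch of the chunked-download approach: stream the file in
// small pieces and flush each piece, rather than loading it all at once.
$file = __DIR__ . '/files/some-large-file.mp3';

set_time_limit(0);        // ask for no PHP time limit (the host may override this)

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);   // send in 8 KB chunks
    flush();                 // push each chunk out instead of buffering it
    if (connection_status() != CONNECTION_NORMAL) {
        break;               // stop early if the client disconnected
    }
}
fclose($fp);
```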

I cannot use mod_xsendfile as I am on shared hosting, and since this site is not for profit, upgrading to a VPS would be prohibitively expensive.

Does anyone know of a solution to this, or at least a reason why the connections are being closed?

Cheers :slight_smile:
[hr]
After attempting to download the exact same file on a much faster internet connection, I did not experience the same issues and the file downloaded successfully:

[code]D:\>wget "http://www.xxxxx.com/files/some-large-file.mp3"
--2014-03-09 22:43:30--  http://www.xxxxx.com/files/some-large-file.mp3
Resolving www.xxxxx.com... 69.00.00.00
Connecting to www.xxxxx.com|69.00.00.00|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 184120532 (176M) [audio/mpeg]
Saving to: `some-large-file.mp3.4'

100%[========================================================================================>] 184,120,532 1.03M/s   in 2m 58s

2014-03-09 22:46:29 (1010 KB/s) - `some-large-file.mp3.4' saved [184120532/184120532][/code]

The slow download was cut off after roughly nine minutes (21:33:29 to 21:42:15), while the fast one completed in under three, which points to a time limit rather than a size limit. That suggests to me that there must be a script timeout happening that I cannot control by setting set_time_limit(xxxxx)…

Does that seem accurate? And if so, is there any workaround for this on shared hosting (I suspect I already know the answer, but it’s worth a try)?
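One workaround I'm considering, if the timeout theory holds: adding Range support to the script so wget -c (or any download manager) can resume past the cut-off point instead of starting over. A rough sketch, handling single-range requests only, with the path hardcoded for illustration:

```php
<?php
// Sketch: minimal HTTP Range support so interrupted downloads can resume.
// Handles a single "bytes=start-end" range; no multi-range support.
$file = __DIR__ . '/files/some-large-file.mp3';
$size = filesize($file);
$start = 0;
$end = $size - 1;

set_time_limit(0);
header('Accept-Ranges: bytes');

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int)$m[1];
    if ($m[2] !== '') {
        $end = (int)$m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Type: audio/mpeg');
header('Content-Length: ' . ($end - $start + 1));

$fp = fopen($file, 'rb');
fseek($fp, $start);                    // jump to where the client left off
$left = $end - $start + 1;
while ($left > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $left));
    echo $chunk;
    flush();
    $left -= strlen($chunk);
}
fclose($fp);
```

That wouldn't stop the disconnects, but each retry would at least pick up from the last byte received rather than restarting from zero.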