Wget upgrade?


So I wanted to mirror wikipedia on my site. Step one in this process is wget[ting] a 2.8 GB xml.bz2 file onto the web server. The problem is, because the file is so enormous, wget 1.9.1 (the version on my server) reports the file size as negative (the signed 32-bit int overflows). I basically wanted to know if you guys could go ahead and upgrade to wget 1.10, which apparently fixes this problem, or let me in on how to do it myself.

Cheers, Danny

http://www.mail-archive.com/wget@sunsite.dk/msg08260.html for reference
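For the curious, the "negative file size" symptom can be sketched numerically. This is just an illustration of 32-bit signed wraparound with a made-up byte count, not wget's actual code:

```shell
# Illustration only: a ~2.8 GB byte count squeezed into a signed 32-bit int.
size=2800000000                 # hypothetical size of the .xml.bz2 dump
low32=$(( size % 4294967296 ))  # keep only the low 32 bits (2^32)
wrapped=$(( low32 >= 2147483648 ? low32 - 4294967296 : low32 ))
echo "$wrapped"                 # prints -1494967296: a "negative" file size
```

Anything over 2^31 - 1 bytes (2 GiB) goes negative this way, which matches what wget 1.9.1 shows.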


We’re just customers here, but you can always suggest it.

In the meantime, see if ‘curl’ will do it for you.



Use curl
From the Debian ISO FAQ:

Why is my downloaded DVD image smaller than 1 GB when it should be larger than 4 GB?

Most likely, the tool you use for downloading the image does not have large file support, i.e. it has problems downloading files larger than 4 GBytes. The usual symptom for this problem is that when you download the file, the file size reported by your tool (and the amount of data that it downloads) is too small by exactly 4 GB. For example, if the DVD image is 4.4 GB large, your tool will report a size of 0.4 GB.

Some versions of wget also suffer from this problem - either upgrade to a version of wget which does not have this restriction or use the curl command line download tool: “curl -C - URL”
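The FAQ’s symptom is the same wraparound viewed as an unsigned count. A quick sketch with a made-up image size:

```shell
# Illustration: a 32-bit counter keeps only (size mod 2^32), so a 4.7 GB
# image (hypothetical figure) appears about 4.3 GB too small.
size=4700000000
reported=$(( size % 4294967296 ))
echo "$reported"    # prints 405032704, i.e. roughly 0.4 GB
```

That “-C -” flag in the FAQ’s curl command tells curl to resume the transfer from wherever the previous attempt left off, so a partial download isn’t wasted.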



Just to extend the possibilities here: you can, in fact (or at least more than likely), install your own version of wget.
Some time back I wrote a VIM installation guide that details basically the same steps you’d need to go through to install wget.

Maybe I’ll get around to writing a wiki article on that later on…
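As a rough sketch of those steps (the version number and install prefix are assumptions; check ftp.gnu.org/gnu/wget/ for the current release):

```shell
# Build a private copy of wget under $HOME; version and paths are illustrative.
curl -O http://ftp.gnu.org/gnu/wget/wget-1.10.2.tar.gz
tar xzf wget-1.10.2.tar.gz
cd wget-1.10.2
./configure --prefix="$HOME/opt/wget"   # install somewhere you can write to
make
make install
export PATH="$HOME/opt/wget/bin:$PATH"  # put your wget ahead of the system one
wget --version                          # should now report the new version
```

The --prefix trick is the key bit on shared hosting: you never need root, because everything lands in your own home directory.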
