Description of problem:
wget uses an unsigned integer for file size reports, which allows an overflow if users retrieve a very large file (e.g. a FC5 DVD iso)

Version-Release number of selected component (if applicable):
1.10.2

How reproducible:
100%

Steps to Reproduce:
1. wget a large file

Actual results:
tremor:~$ wget ftp://ftp.cse.buffalo.edu/pub/fedora/linux/core/5/i386/iso/FC-5-i386-DVD.iso
--12:37:34--  ftp://ftp.cse.buffalo.edu/pub/fedora/linux/core/5/i386/iso/FC-5-i386-DVD.iso
           => `FC-5-i386-DVD.iso'
Resolving ftp.cse.buffalo.edu... 128.205.32.51
Connecting to ftp.cse.buffalo.edu|128.205.32.51|:21... connected.
Logging in as anonymous ... Logged in!
==> SYST ... done.  ==> PWD ... done.
==> TYPE I ... done.  ==> CWD /pub/fedora/linux/core/5/i386/iso ... done.
==> PASV ... done.  ==> RETR FC-5-i386-DVD.iso ... done.
Length: -1,041,297,408 (unauthoritative)

    [                  <=>        ] 3,253,669,888  1.05M/s

13:18:29 (1.26 MB/s) - `FC-5-i386-DVD.iso' saved [3253669888]

Note particularly the Length: field

Expected results:
A length field indicating the actual length

Additional info:
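For what it's worth, the negative Length in the transcript is exactly what you get if the file's true size (3,253,669,888 bytes, larger than 2^31) ends up interpreted as a 32-bit signed integer somewhere in the size-reporting path. A minimal sketch of the wraparound (this only reproduces the arithmetic; it makes no claim about where in the wget or server code the truncation actually happens):

```python
import struct

# True size of FC-5-i386-DVD.iso from the transcript above.
true_size = 3_253_669_888

# Store the size as an unsigned 32-bit value, then reinterpret the same
# bytes as a signed 32-bit integer (two's-complement wraparound).
(reported,) = struct.unpack("<i", struct.pack("<I", true_size))

print(f"Length: {reported:,}")  # Length: -1,041,297,408
```

The output matches the `Length:` field in the report byte for byte, so the symptom is consistent with a 32-bit size variable rather than with random server garbage.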
Can you reproduce this with a different server? wget-1.10 was rewritten to support files > 2GB, and downloading large files works for me here. But many webservers out there can't handle files that large and report wrong sizes to the client (wget). Maybe you should try downloading from an ftp-server; those tend to behave somewhat better with respect to large files.
Closing due to lack of response from the reporter. This certainly works for me on other servers, so I suspect it is a server issue.