Description of problem:
wget appears to report file sizes with a 32-bit signed integer, which overflows when a
user retrieves a very large file (e.g. an FC5 DVD iso), so the reported Length comes out negative.
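
For illustration only (this is not wget's source), a minimal C sketch of the suspected wrap: a size above 2^31 bytes, 3,253,669,888 in this case, comes out as -1,041,297,408 when truncated to a 32-bit signed int on a typical two's-complement platform:

/* Illustration only; not wget's actual code.  Shows how a file size
 * above 2^31 bytes wraps to a negative number when truncated to a
 * 32-bit signed integer (what typical two's-complement platforms do;
 * the conversion is implementation-defined in C). */
#include <stdio.h>

int main(void)
{
    unsigned long long actual = 3253669888ULL; /* size of the DVD iso */
    int wrapped = (int) actual;                /* truncate to 32 bits */
    printf("Length: %d\n", wrapped);           /* prints -1041297408  */
    return 0;
}

3,253,669,888 - 2^32 = -1,041,297,408, which matches the Length: line in the transcript below.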
Version-Release number of selected component (if applicable):
Steps to Reproduce:
1. wget a large file
tremor:~$ wget ftp://ftp.cse.buffalo.edu/pub/fedora/linux/core/5/i386/iso/FC-5-i
Resolving ftp.cse.buffalo.edu... 184.108.40.206
Connecting to ftp.cse.buffalo.edu|220.127.116.11|:21... connected.
Logging in as anonymous ... Logged in!
==> SYST ... done. ==> PWD ... done.
==> TYPE I ... done. ==> CWD /pub/fedora/linux/core/5/i386/iso ... done.
==> PASV ... done. ==> RETR FC-5-i386-DVD.iso ... done.
Length: -1,041,297,408 (unauthoritative)
[ <=> ] 3,253,669,888 1.05M/s
13:18:29 (1.26 MB/s) - `FC-5-i386-DVD.iso' saved 
Actual results:
Note particularly the Length: field, which is negative.

Expected results:
A Length: field indicating the actual length of the file.
Can you reproduce this with a different server? wget-1.10 was rewritten to
support files > 2GB, and downloading large files works for me here. But many
web servers out there can't handle files that large and report wrong sizes to the
client (wget). Maybe you should try to download from an FTP server; those tend
to behave somewhat better with respect to large files.
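
For reference, a minimal sketch (assuming a POSIX system; not taken from wget's source) of the kind of large-file handling the 1.10 rewrite refers to: define _FILE_OFFSET_BITS=64 so off_t is 64-bit even on 32-bit platforms, and keep lengths in a 64-bit type so sizes above 2 GiB do not wrap:

/* Sketch only -- assumes a POSIX system; not taken from wget's source. */
#define _FILE_OFFSET_BITS 64      /* make off_t 64-bit on 32-bit platforms */
#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <inttypes.h>

int main(void)
{
    struct stat st;
    if (stat("FC-5-i386-DVD.iso", &st) == 0)
        /* st_size is a 64-bit off_t here, so the full size prints correctly */
        printf("Length: %jd\n", (intmax_t) st.st_size);
    return 0;
}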
Closing due to lack of response from reporter.
This certainly works for me on other servers. I suspect it is a problem with
that particular server rather than with wget.