From Bugzilla Helper:
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7a) Gecko/20040209

Description of problem:
wget doesn't open files in a way that allows file sizes of > 2 GB. I'll attach a patch.

Version-Release number of selected component (if applicable):
wget-1.9.1-3

How reproducible:
Always

Steps to Reproduce:
1. Add a file of > 2 GB to an HTTP server.
2. Try to download it.

Actual Results:  writing fails

Expected Results:  writing succeeds

Additional info:
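For context, a minimal sketch of the underlying limitation: on a 32-bit glibc build without large-file support, off_t is a signed 32-bit type, so any file offset past 2 GiB - 1 overflows and writes fail. Defining _FILE_OFFSET_BITS=64 before the first system header transparently switches open()/lseek()/fstat() to their 64-bit variants. The helper function below is only illustrative, not part of wget:

```c
/* Must come before any #include so glibc picks the 64-bit file APIs. */
#define _FILE_OFFSET_BITS 64

#include <stdio.h>
#include <sys/types.h>

/* Returns nonzero when off_t can represent file sizes beyond 2 GiB. */
int off_t_is_large_file_capable(void)
{
    return sizeof(off_t) >= 8;
}
```

With the macro defined, off_t_is_large_file_capable() returns nonzero on glibc, and a program compiled this way can write past the 2 GiB mark.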
Created attachment 97571 [details] Patch to enable large file support Only lightly tested but should work.
This has been discussed on the wget-devel mailing list, and the solution is not that simple. Just have a look at this code snippet from wget:

    /* Sends the SIZE command to the server, and returns the value in 'size'.
     * If an error occurs, size is set to zero. */
    uerr_t ftp_size(struct rbuf *rbuf, const char *file, long int *size)

The whole codebase needs to be checked for these occurrences of long int. Have a look at http://www.mail-archive.com/wget%40sunsite.dk/msg05239.html for a discussion about large files in wget.
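To illustrate what fixing one such occurrence might look like, here is a hypothetical sketch (not wget code) of the same out-parameter widened from long int to off_t, parsing the numeric part of an FTP SIZE reply with strtoll so values above 2^31 - 1 survive on 32-bit platforms. It mirrors the comment's contract of setting size to zero on error:

```c
#define _FILE_OFFSET_BITS 64  /* before any system header */

#include <errno.h>
#include <stdlib.h>
#include <sys/types.h>

/* Hypothetical large-file-safe replacement for a 'long int *size'
 * out-parameter: parse a decimal size string into an off_t.
 * Returns 0 on success, -1 on error (size is set to zero). */
int parse_ftp_size(const char *reply, off_t *size)
{
    char *end;
    errno = 0;
    long long v = strtoll(reply, &end, 10);
    if (errno != 0 || end == reply || v < 0) {
        *size = 0;   /* mirror wget's "size is set to zero" contract */
        return -1;
    }
    *size = (off_t)v;
    return 0;
}
```

For example, parsing "3221225472" (3 GiB) yields the full value rather than a truncated or negative long, which is exactly where the unfixed long int code breaks.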
Consider adding a variant of the following patch: http://software.lpetrov.net/wget-LFS/wget-LFS-20040630.patch

I don't know what this strange number_to_string_64 function is supposed to do; it should be correct to replace it with a call to snprintf.
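Assuming number_to_string_64 just formats a 64-bit size as decimal text, the snprintf replacement is a one-liner on any C99 platform, using PRId64 to stay portable across differing long/long long widths (the function name below is hypothetical, not from the patch):

```c
#include <inttypes.h>
#include <stdio.h>

/* Hypothetical snprintf-based stand-in for number_to_string_64:
 * format a 64-bit file size as decimal text into buf. */
char *format_file_size(char *buf, size_t buflen, int64_t n)
{
    snprintf(buf, buflen, "%" PRId64, n);
    return buf;
}
```

The likely reason for a hand-rolled helper is that some older Windows C runtimes lacked a conforming snprintf or the %lld length modifier, which would explain keeping it for portability.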
I think this function is used to make it work on Windows; it should be safe either way (to keep it or to replace it).
*** This bug has been marked as a duplicate of 123524 ***
Changed to 'CLOSED' state since 'RESOLVED' has been deprecated.