Bug 123524 - (wget-LFS) wget > 2gb files
Status: CLOSED ERRATA
Product: Fedora
Classification: Fedora
Component: wget
Hardware: i386 Linux
Severity: medium
Assigned To: Karsten Hopp
Duplicates: 115348 124652 133193 133646
Blocks: FC2Target FC3Target FC4Target
Reported: 2004-05-18 20:38 EDT by redhat admin
Modified: 2007-11-30 17:10 EST
CC: 9 users

Doc Type: Bug Fix
Last Closed: 2004-11-04 15:36:03 EST
Description redhat admin 2004-05-18 20:38:16 EDT
Description of problem:

wget does not seem able to download >2 GB files from an ftp:// URL.

Version-Release number of selected component (if applicable):

1.9.1-5

How reproducible:

Always

Steps to Reproduce:
1. Try to download a file >2 GB, e.g. FC2-i386-DVD.iso from a mirror site.
Actual results:

wget terminates with a file of 2147483647 bytes

Expected results:

Actual file size should be ~4.3 GB

Additional info:
Comment 1 Péter Soós 2004-05-21 03:56:24 EDT
I experienced the same results.
Comment 2 Karsten Hopp 2004-05-28 05:31:57 EDT
The wget maintainer is aware of this and is already working on a new
version with large file support. I'll update the package as soon as
the new version is available.
Comment 3 Karsten Hopp 2004-05-28 05:44:02 EDT
*** Bug 124652 has been marked as a duplicate of this bug. ***
Comment 4 Leonid Petrov 2004-06-07 09:49:40 EDT
I could not wait for the official wget update, so I've hacked the code.
You can find both the patch and a tarball with Large File Support
enabled at http://software.lpetrov.net/wget-LFS-20040605
Comment 5 Robert Scheck 2004-06-08 02:28:33 EDT
Leonid, your tarball with Large File Support works for me on Red Hat
Linux 9, Fedora Core 1/2 and Red Hat Enterprise Linux 3 (all x86). But
there are still some problems using wget for such large files over an
http/ftp proxy...the request dies with "502 Bad Response", although the
proxy server itself has 100% large file support...

To get LFS support working, I had to append "--enable-LFS" at
build time.

Karsten, maybe you could test that tarball (or the patch), too,
please? If it looks good to you, we would be happy to get a wget
with working LFS support sooner.

FYI: The patches applied by Red Hat to wget-1.9.1-5 don't work with
Leonid's tarball, and Leonid's patch doesn't apply to the Red Hat
patched wget-1.9.1-5...
Comment 6 Leonid Petrov 2004-06-30 23:14:59 EDT
Robert,

  I am using wget constantly, and since my last post I have found
several instances where wget with large file support did not work
correctly. One of them (incorrect behaviour when the connection was
lost after downloading a 2 GB+ portion of a big file and wget
restarted) is critical. I've posted the fresh patch at
http://software.lpetrov.net/wget-LFS/

> But there are still some problems using wget for such large files
> over a http/ftp proxy...the request dies with "502 Bad Response",
> but the proxy server itself has 100% large file support...

  If you can provide log files, URLs and names of proxy servers I will
check. I have not touched the proxy-handling logic yet.

  Wget with LFS was able to download a 4.3 GB file from the Cherokee
http server ( http://freshmeat.net/projects/cherokee/ ), the only http
server with LFS that I know of, but I was unable to test restarting,
since that http server does not yet support restarting itself.

> To have the LFS support working, I had to append "--enable-LFS" at
> build time

  Strange. I made --enable-LFS the default and kept the --disable-LFS
option just in case wget with large file support fails to compile on
some bizarre system. It is still an unresolved problem how to
recognize during configuration whether the system can support large
files and 64-bit arithmetic correctly. I must confess that an elegant
solution suitable for all cases is beyond my competence. I also did
not include any tricks that would let a user learn whether his version
of wget is LFS-capable; I left that to the wget maintainers.
Comment 7 Robert Scheck 2004-08-23 19:44:02 EDT
Fedora Core 2 and 3 urgently (!) need LFS support in wget, because
DVD iso images are provided and lots of users have problems
downloading them...very annoying.

I still give your patched wget, Leonid, to users who ask me directly,
but that can't be the permanent official solution to this problem!!
Comment 8 Karsten Hopp 2004-08-27 19:10:59 EDT
I've released a test update for Fedora Core 2 with LFS support a
moment ago. A similar package will be available for Fedora Core 3
test soon.
 
Comment 9 Rob van Nieuwkerk 2004-08-30 02:11:32 EDT
I tried the FC2 test update wget-1.9.1-7.i386.rpm.
It seems nothing has improved: file transfers are still limited to
2 GB.

I tested with this URL:

ftp://alviss.et.tudelft.nl/pub/fedora/core/2/i386/iso/FC2-i386-DVD.iso

The server handles >2 GB files OK.  For example, ncftp has no
problem transferring >2 GB from this server.
Comment 10 Robert Scheck 2004-08-31 03:42:46 EDT
Yes Rob, it seems that something is broken in your patch, Karsten:

--- snipp "./configure $" ---
checking for gcc option to accept ANSI C... none needed
./configure: line 2684: xyes: command not found
checking how to run the C preprocessor... gcc -E
--- snapp "./configure $" ---

--- snipp "configure" ---
fi;  test
x"${ENABLE_LFS}" = xyes &&
cat >>confdefs.h <<\_ACEOF
#define ENABLE_LFS 1
--- snapp "configure" ---

Another thing that makes me think that *something is broken*:

--- snipp ---
# rpmbuild --rebuild wget-1.9.1-9.src.rpm >>/tmp/wget.log 2>&1
# 
# grep LARGEFILE_SOURCE /tmp/wget.log
# 
# grep FILE_OFFSET_BITS /tmp/wget.log
#
--- snapp ---

Could you please review and fix that, so that LFS works fine this
time? ;-) Maybe, if I've got time this morning, I'll try to have a
look at it and send a patch.
Comment 11 Karsten Hopp 2004-08-31 05:18:57 EDT
Broken patch, sorry.  The 'test' command moved to a different line 
and broke parsing of --enable-LFS. 
 
Leonid, you might want to check your patch, too. It seems to have the 
same problem: 
+AC_ARG_ENABLE(LFS, [  --disable-LFS           disable Large File 
Support], 
+ENABLE_LFS=$enableval, ENABLE_LFS=yes) test 
                                        ^^^^ 
+x"${ENABLE_LFS}" = xyes && AC_DEFINE([ENABLE_LFS], 1, 
+   [Define if you want the Large File Support compiled in.]) 
 
New packages will arrive at your preferred Fedora mirror after the
next sync; FC2 packages are available at
http://people.redhat.com/karsten , too.
Comment 12 Rob van Nieuwkerk 2004-08-31 05:50:28 EDT
It still does not work correctly.
I tried http://people.redhat.com/karsten/wget-1.9.1-10.i386.rpm
I did not wait for the download to finish (it takes very long),
only 30 s or so.
But I concluded from the wget output that nothing had improved:
 - "Length: 75,673,600 (unauthoritative)" --> should be 4 GB
 - the estimated download time: 2.5 minutes --> should be much longer
Comment 13 Robert Scheck 2004-08-31 06:09:08 EDT
*arghl* well, I thought I would have some time to track down the
complete issue, because that two-line configure fix doesn't really
do it.

Maybe we can work together on this: CFLAGS isn't set correctly with
the LFS parameters when CFLAGS is already set (which rpm does) - I
think.
Comment 14 Robert Scheck 2004-08-31 06:30:52 EDT
Okay, I think I tracked it down, but Karsten, please really test it
too:

--- snipp ---
--- wget-1.9.1/configure.in       2004-08-31 12:24:30.000000000 +0200
+++ wget-1.9.1/configure.in.rsc   2004-08-31 12:25:46.000000000 +0200
@@ -120,7 +120,7 @@
 dnl   if compiler is gcc, then use -O2 and some warning flags
 dnl   else use os-specific flags or -O
 dnl
-if test -n "$auto_cflags"; then
+if test -n $auto_cflags; then
   if test -n "$GCC"; then
     CFLAGS="$CFLAGS -O2 -Wall -Wno-implicit"
     if test x"$ENABLE_LFS" = xyes; then
--- snapp ---

then check the rebuilding using:

--- snipp ---
# rpmbuild --rebuild wget-1.9.1-9.src.rpm >>/tmp/wget_rsc.log 2>&1
#
# grep LARGEFILE_SOURCE /tmp/wget_rsc.log | wc -l
42
#
# grep FILE_OFFSET_BITS /tmp/wget_rsc.log | wc -l
42
#
--- snapp ---

wget MUST have "-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE" visible
at compile time, otherwise the wget has no LFS support.

Hehe, and last but not least, have Rob re-test it (I currently
don't have access to such big files and supporting servers).
Comment 15 Robert Scheck 2004-08-31 16:48:43 EDT
wget-1.9.1-12 (from your people directory) seems to be broken,
because the following behaviour is just totally wrong:

--- snipp ---
# wget http://download.fedora.redhat.com/pub/fedora/linux/core/updates/2/SRPMS/krb5-1.3.4-6.src.rpm
--22:39:18--  http://download.fedora.redhat.com/pub/fedora/linux/core/updates/2/SRPMS/krb5-1.3.4-6.src.rpm
           => `krb5-1.3.4-6.src.rpm'
Resolving download.fedora.redhat.com... 66.187.224.20
Connecting to download.fedora.redhat.com[66.187.224.20]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 6,417,461 [application/x-rpm]

    [ <=>                                                                                                ] 2,449         --.--K/s

22:39:18 (23.36 MB/s) - `krb5-1.3.4-6.src.rpm' saved [2,449/2,449])

#
--- snapp ---

If I use wget-1.9.1-9, for example, the download is correct (it
seems) and also has the correct size. Btw, you can continue (-c) this
corrupted (aborted?) download, but it's simply strange...I'm able to
reproduce that behaviour. Personal fallback to -5 until the problem
is sorted out.
Comment 16 Ulrich Drepper 2004-09-01 02:48:21 EDT
*** Bug 115348 has been marked as a duplicate of this bug. ***
Comment 17 Leonid Petrov 2004-09-08 23:51:05 EDT
Karsten,

  I looked at wget-1.9.1-12. Unfortunately, it cannot work correctly,
since the preprocessor variable LFS was not set. It was my fault:
when I tested, I updated configure but forgot to update
configure.in :-(

  To fix it, please replace line 140 in configure.in

   CFLAGS="$CFLAGS -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE"

with

   CFLAGS="$CFLAGS -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -DLFS"

  Alternatively, the updated patch and full tarball are at
http://software.lpetrov.net/wget-LFS/

Leonid
08-SEP-2004 23:49:24
Comment 18 Karsten Hopp 2004-09-09 04:29:08 EDT
Leonid, 
 
Which webserver do you use for your testing? I just found out that
our latest Apache doesn't support LFS even on 64-bit machines due to
int32 variables. I'm still looking for one that works so I can do
some tests.
 
     Karsten 
Comment 19 Rob van Nieuwkerk 2004-09-09 07:07:24 EDT
vsftpd is 64-bit OK out of the box, so that can be used for testing
too.  If you don't want to set it up yourself, you can test with, for
example,
ftp://alviss.et.tudelft.nl/pub/fedora/core/2/i386/iso/FC2-i386-DVD.iso

Comment 20 Robert Scheck 2004-09-09 07:27:35 EDT
Leonid,

if I change it in the way you recommend in comment #17, the
rebuild dies (I used Karsten's patch and modified it):

--- snipp ---
gcc -O2 -g -march=i386 -mcpu=i686 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -DLFS -o wget cmpt.o connect.o convert.o cookies.o ftp.o ftp-basic.o ftp-ls.o ftp-opie.o hash.o headers.o host.o html-parse.o html-url.o http.o init.o log.o main.o gen-md5.o netrc.o progress.o rbuf.o recur.o res.o retr.o safe-ctype.o snprintf.o gen_sslfunc.o url.o utils.o version.o  -L/usr/kerberos/lib -lgssapi_krb5 -lkrb5 -lcom_err -lk5crypto -lresolv -lz -lssl -lcrypto -ldl
ftp.o(.text+0xdfb): In function `getftp':
/usr/src/redhat/BUILD/wget-1.9.1/src/ftp.c:908: undefined reference to `legible_off_t'
ftp.o(.text+0xe86):/usr/src/redhat/BUILD/wget-1.9.1/src/ftp.c:910: undefined reference to `legible_off_t'
ftp.o(.text+0xec9):/usr/src/redhat/BUILD/wget-1.9.1/src/ftp.c:900: undefined reference to `legible_off_t'
ftp.o(.text+0xf48):/usr/src/redhat/BUILD/wget-1.9.1/src/ftp.c:902: undefined reference to `legible_off_t'
ftp.o(.text+0x155e):/usr/src/redhat/BUILD/wget-1.9.1/src/ftp.c:669: undefined reference to `legible_off_t'
ftp.o(.text+0x295c):/usr/src/redhat/BUILD/wget-1.9.1/src/ftp.c:1202: more undefined references to `legible_off_t' follow
--- snapp ---

Any idea?
Comment 21 Karsten Hopp 2004-09-09 07:54:29 EDT
Rob, 
ftp worked for me all the time, but http seems to be broken 
Comment 22 Karsten Hopp 2004-09-09 07:57:51 EDT
Robert, 
 move the line 
#endif /* not USE_SIGNAL_TIMEOUT */    (last line in the file) 
 in src/utils.c up above the #ifdef LFS   (~line 2110) 
Comment 23 Rob van Nieuwkerk 2004-09-09 08:02:39 EDT
Karsten,

Does ftp work for you with a published rpm??
I tried all the new versions you announced (I think) and
NONE of them worked with ftp (as I reported).  Maybe you're
talking about your working version for which you haven't built
public packages yet?
Comment 24 Karsten Hopp 2004-09-09 08:16:44 EDT
I've uploaded new rpms (-14) to my people.redhat.com page. 
wget ftp://neuss/pub/BIG 
... 
Logging in as anonymous ... Logged in! 
==> SYST ... done.    ==> PWD ... done. 
==> TYPE I ... done.  ==> CWD /pub ... done. 
==> PASV ... done.    ==> RETR BIG ... done. 
Length: 2,621,440,000 (unauthoritative) 
 
100%[==================================>] 2,621,440,000   11.09M/s    ETA 00:00 
 
16:16:48 (10.92 MB/s) - `BIG' saved [2,621,440,000] 
 
[tmp] >ls -l BIG 
-rw-rw-r--  1 karsten karsten 2621440000  9. Sep 16:16 BIG 
Comment 25 Leonid Petrov 2004-09-09 09:38:31 EDT
Karsten,

  As I wrote earlier (comment #6), the apache 2.0.x Web server does
not provide LFS. I checked that the updated wget still correctly
downloads files shorter than 2 GB from apache, and that the updated
wget correctly downloaded 4.3 GB files from the only Web server with
LFS that I know of, http://www.alobbs.com/cherokee/ , if it is not
interrupted. Unfortunately, cherokee does not yet support restarting,
so I could not test the restart logic. Without the ability to restart
after a loss of connection, downloading large files is not practical.

  Bottom line: wget still _does_not_provide_ LFS for http. It would
be nice to mark this in bold capital letters in the documentation.

Leonid
09-SEP-2004 09:30:33
Comment 26 Leonid Petrov 2004-09-09 09:53:13 EDT
Robert,

  I suppose you have checked that you do not have stale object files in
your directory tree, right?

  I've just downloaded 
http://software.lpetrov.net/wget-LFS/wget-LFS-20040908.tar.bz2 

ran 

autoconf
configure
make

and it compiled flawlessly. I suggest you do the same. If it works
on your machine, then something is wrong with the patches and/or
packaging, right?
legible_off_t is defined in utils.h and utils.c . Without the
preprocessor variable LFS, legible_off_t is a synonym of legible.

Leonid
09-SEP-2004 09:47:32
Comment 27 Robert Scheck 2004-09-09 10:12:54 EDT
Leonid, I never have stale files, because I always rebuild it as an
rpm, which does the cleanup automatically. My problem seems to be
solved by comment #22, which is also done in the -14 rpm.

Karsten, the -14 rpm also has the same problem already described in 
comment #15:

--16:10:31--  http://download.fedora.redhat.com/pub/fedora/linux/core/development/SRPMS/Canna-3.7p3-4.src.rpm
           => `Canna-3.7p3-4.src.rpm'
Resolving download.fedora.redhat.com... 209.132.176.220, 209.132.176.221, 66.187.224.20, ...
Connecting to download.fedora.redhat.com[209.132.176.220]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: B,446,744,069,420,987,314 [application/x-rpm]

    [ <=>                                                                                                ] 2,450         --.--K/s

16:10:32 (23.37 MB/s) - `Canna-3.7p3-4.src.rpm' saved [2,450/2,450])

For me the problem seems to be especially reproducible at
download.fedora.redhat.com - because it seems to be slow (and
overloaded) and so on...

It seems to be a problem introduced by the LFS patch...
Comment 28 Karsten Hopp 2004-09-09 16:24:45 EDT
Leonid: The bug from comment 20 was in my LFS patch. I couldn't use
yours because that one seems to be for a wget CVS version, whereas we
have plain wget-1.9.1 + some patches from CVS. I had to redo the
complete patch using your patch as a guide and messed up at least one
part (comment 22). There is another bug in my patch which results in
what Robert describes in his last comment.
I've tried a stable version of cherokee to test LFS, but the stable
branch doesn't support large files yet. I then tried to compile one
of the daily snapshots, but got one where several files were missing
and the C code had some obvious syntax errors, so I didn't try any
further.

Robert: -14 is just a snapshot of the current state of the package
which I put together for you, as this seems to enable at least LFS
ftp downloads. It won't get released as it is; there's still some
debugging to be done for the http downloads.
 
 
Comment 29 Leonid Petrov 2004-09-09 23:35:53 EDT
Karsten,

  I've updated wget-LFS once again. The line

> Length: B,446,744,069,420,987,314 [application/x-rpm]

(comment #27) suggests that strtoll did not work correctly. Some
Unixes, like HP-UX, do not have it at all. So I've replaced it with a
home-made wget_strtoull and tested it under HP-UX, Sun and, of
course, Linux on i686. It works.

  You can find both the patch and the full tarball at
http://software.lpetrov.net/wget-LFS

  If you are interested in playing with Cherokee, I've put the
version which I have at
http://lpetrov.net/misc/cherokee-0.4.17b18.tar.bz2

Leonid
09-SEP-2004 23:34:08
Comment 30 Karsten Hopp 2004-09-14 09:55:21 EDT
I'll upload wget-1.9.1-15 later to my people page; it has Leonid's
strtoll fix. I've tested large and small file downloads via ftp and
http. Everything except restarts of large file downloads from apache
seems to work now.
Comment 31 Rob van Nieuwkerk 2004-09-14 15:49:43 EDT
Hi,

I tried wget-1.9.1-15.  I was able to successfully transfer a
4.3 GB file via ftp.  Some details are still wrong though:

* wget reports "Length: 75,673,600 (unauthoritative)".
  This is wrong: the length should be 4370640896. The transfer
  itself goes OK.

* --continue (restart) only works correctly if you initially
  transferred more than those 75673600 bytes, it seems.

* the "to go" value is incorrect when restarting on a partial
  file smaller than 75673600 bytes.  When the partial file is
  bigger it all seems OK.

So it seems there are some 32-bit issues left ...
I tested with this file (the ftp server is OK):
ftp://alviss.et.tudelft.nl/pub/fedora/core/2/i386/iso/FC2-i386-DVD.iso
I haven't tested with http (I don't know of any >2 GB capable
publicly accessible http server)

    Rob van Nieuwkerk
Comment 32 Karsten Hopp 2004-09-14 15:58:39 EDT
looks like I need to test some downloads from a 32-bit ftp server,
too. Downloads from an x86_64 server worked OK.
Comment 33 Rob van Nieuwkerk 2004-09-14 16:15:31 EDT
I don't know if it's related to the ftp server being 32-bit.
alviss.et.tudelft.nl is indeed a 32-bit machine running vsftpd
(maintained by myself).  But I doubt that the problems
are related to the *server*: both "curl" and "ncftpget"
transfer >2 GB files fine from this server, and the info
shown during the transfers looks fine.  But with wget
I see wrong things (though with your latest version the transfer
itself is OK, as I mentioned).

    greetings,
    Rob van Nieuwkerk
Comment 34 Leonid Petrov 2004-09-15 20:51:39 EDT
Karsten,
 
  I've downloaded wget-1.9.1-15.src.rpm and installed it. Rob was
right when he complained: large file support in -15 **was not
compiled in** at all!

  I found three errors in wget-1.9.1-15.src.rpm:

Error #1:
     
   Line 140 of configure.in is wrong. 

You can fix it by altering line 28 of wget-1.9.1-LFS.patch

(wrong)   +   CFLAGS="$CFLAGS -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE"
(correct) +   CFLAGS="$CFLAGS -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -DLFS"

  I already mentioned it earlier (Comment #17).

Error #2:

  A typo in line 1001 of src/ftp-basic.c

You can fix it by altering line 745 of wget-1.9.1-LFS.patch

(wrong)   +  *size = wget_strtoll (respline + 4, NULL, 10);
(correct) +  *size = wget_strtoull (respline + 4, NULL, 10);

Error #3:

#endif /* not USE_SIGNAL_TIMEOUT */

  should be moved to line 2106, just after "#endif /* not WINDOWS */"
and before "#ifdef LFS".

After fixing these bugs and recompiling you will see that

wget ftp://alviss.et.tudelft.nl/pub/fedora/core/2/i386/iso/FC2-i386-DVD.iso

will print at the very beginning 

Length: 4,370,640,896 (unauthoritative)

instead of 

Length: 75,673,600 (unauthoritative)


Hope this helps,
Leonid
15-SEP-2004 20:52:50
Comment 35 Karsten Hopp 2004-09-16 04:27:30 EDT
$#@!  I had 1. and 3. fixed in my tree before we moved our complete
internal CVS; I even found 3. by myself. Sorry.
  Karsten (puzzled why the downloads worked here)
Comment 36 Robert Scheck 2004-09-21 13:31:54 EDT
Huh, am I seeing things?!

1. I thought "Length: B,446,744,069,420,987,314 [application/x-rpm]"
   had already been fixed since comment #29.
2. I thought my download problem with small files from comments #15
   and #27 was fixed by -15 from comment #30.

What's up? I currently get these problems with Fedora Core 
Development's wget-1.9.1-16.
Comment 37 Leonid Petrov 2004-09-21 23:05:00 EDT
Karsten,

  Robert is right, http retrieval in wget-16 is still broken. In
fact, the http implementation in wget-1.9.1, which you used as a
basis, and wget-1.9.1-cvs, which I used as a basis, have diverged so
far that an additional patch is needed.

Please find it at http://software.lpetrov.net/wget-LFS/wget-1.9.1-16.patch

Leonid
21-SEP-2004 23:06:11
Comment 38 Karsten Hopp 2004-09-28 10:29:14 EDT
*** Bug 133646 has been marked as a duplicate of this bug. ***
Comment 39 Karsten Hopp 2004-09-28 10:36:02 EDT
*** Bug 133193 has been marked as a duplicate of this bug. ***
Comment 40 Ralf Ertzinger 2004-09-29 11:11:01 EDT
May I suggest that a wget reliably working for transfers < 2GB be
pushed to rawhide?

While I understand that transfers > 2GB are nice to have, I'd
currently settle for a wget that works for smaller files.
Comment 41 Leonid Petrov 2004-09-29 14:13:12 EDT
Ralf,
  it should work after applying the patch
http://software.lpetrov.net/wget-LFS/wget-1.9.1-16.patch
Can you try it?
Comment 42 Karsten Hopp 2004-10-09 07:58:23 EDT
No new comments/bug reports for a while. Does this mean that
wget-1.9.1-17 works as expected?
Comment 43 Ralf Ertzinger 2004-10-09 08:00:19 EDT
I did not try files > 2GB, but files smaller than that seem to work
just fine again.
Comment 44 Rob van Nieuwkerk 2004-10-09 08:14:27 EDT
You did not announce any wget-1.9.1-17, so that probably explains
why nobody reacted... (at least I did not know it existed).
I will try it.
Comment 45 Rob van Nieuwkerk 2004-10-09 13:53:08 EDT
I don't have access to a FC2 machine at the moment.
But I rebuilt wget-1.9.1-17 on a Red Hat 9 machine from the SRPM.
It all seems to work fine now.  I tested:

- ftp with a >4 GB file
- ftp with a partial >4 GB file (truncated a couple of MBs before the end)

File contents were OK and the on-screen info was also correct.
I assume it will be OK with FC2 too.
Comment 46 Robert Scheck 2004-10-09 20:22:50 EDT
Sorry Karsten, wget-1.9.1-17 was pushed out with too many other
updates, so I didn't notice it...downloading of small http/ftp files
seems to work again now :)
Comment 47 Karsten Hopp 2004-10-20 06:12:13 EDT
Success reports for large files are welcome, too ;-)
I'd like to release an FC2 update before FC3 comes out, for those
who'd like to download the DVD iso. I wonder if it's better to
release an 'it might work' wget version or to stay with the 'it will
break' one from FC2.
Comment 48 Rob van Nieuwkerk 2004-10-21 18:58:53 EDT
FYI:
I found an *http* server doing >2GB:

http://ftp.heanet.ie/pub/fedora/linux/core/2/i386/iso/FC2-i386-DVD.iso

I will do some more wget/FC2 testing with it later.
Comment 49 Rob van Nieuwkerk 2004-10-27 18:03:24 EDT
I've tested the just released wget-1.9.1-16.fc2.i386.rpm from
updates/testing/2/i386/ on a 32-bit machine running FC2.
I tested all combinations of:

  - ftp access
  - http access
  - partial --continue < 2 GB
  - partial --continue > 2 GB
  - partial --continue > 4 GB

In all situations the on-screen info was correct and the
transferred data was also correct (checked with md5sum).

So I think this wget version is OK.

Comment 50 Karsten Hopp 2004-10-28 08:24:36 EDT
That's great news, thanks a lot !
Comment 51 Rob van Nieuwkerk 2004-10-30 20:54:53 EDT
I read in the FC3 release-notes
(http://testing.fedora.redhat.com/tree/i386/os/RELEASE-NOTES-en)
this:

"If you intend to download the Fedora Core 3 DVD ISO image, keep
in mind that not all file downloading tools can accommodate files
larger than 2GB in size. For example, wget will exit with a File
size limit exceeded error."

Maybe it's good to point out that this problem is fixed in
the (upcoming) FC2 wget update !
So maybe change "wget" to "wget before version 1.9.1-16.fc2" ..
Comment 52 Karsten Hopp 2004-11-04 15:36:03 EST
The update has been pushed to our mirrors. 
Thanks to everyone who helped, either with patches or with testing 
and giving feedback ! 
