Red Hat Bugzilla – Bug 79736
wget fails when checking for bad file names
Last modified: 2007-04-18 12:49:04 EDT
From Bugzilla Helper:
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.0.1) Gecko/20020823
Description of problem:
wget-1.8.2-4.73 fails in its check for bad file names.
It worked with the old version of wget, the
one that didn't check for bad file names.
Version-Release number of selected component (if applicable): wget-1.8.2-4.73
Steps to Reproduce:
1. wget -m --accept tar "ftp://ftpeur.nai.com/pub/antivirus/datfiles/4.x/"
Actual Results: ...
Segmentation fault (core dumped)
Expected Results: The .tar files in the directory should have been downloaded.
It looks like the patch wget-1.8.2-filename.patch
should be changed at line 49 from:
+ f = orig;
to:
+ f = start;
If the first loop in ftp_retrieve_glob removes the first
element of the list, orig is no longer valid.
I can confirm this problem (I discovered the same thing using the -r switch).
Program received signal SIGSEGV, Segmentation fault.
0x0804ff00 in ftp_retrieve_glob ()
Problem non-existent in 1.8.1.
I can't reproduce this. I've installed a fresh 7.3 and upgraded wget to
the errata version.
wget -m --passive-ftp --accept tar
finishes without a problem (I had to use --passive-ftp due to firewall restrictions,
but that shouldn't matter here).
I think you might have been lucky that the
"orig" pointer still points to something useful
when you try the example.