Bug 39470 - zcat crashes on files which, uncompressed, are >2GB
Status: CLOSED RAWHIDE
Product: Red Hat Linux
Classification: Retired
Component: gzip
Version: 7.1
Hardware: i386 Linux
Priority: medium
Severity: low
Assigned To: Trond Eivind Glomsrød
QA Contact: David Lawrence
URL: http://www.gzip.org/
Reported: 2001-05-07 17:07 EDT by Jonathan Epstein
Modified: 2007-04-18 12:33 EDT

Last Closed: 2001-05-08 11:29:29 EDT


Description Jonathan Epstein 2001-05-07 17:07:45 EDT
From Bugzilla Helper:
User-Agent: Mozilla/4.7 [en] (WinNT; I)

Description of problem:
Red Hat 7.1 supports file sizes >2GB. However, zcat (and uncompress) crash in these cases.
E.g., I downloaded the file
  ftp://ncbi.nlm.nih.gov/blast/db/nt.Z
which is slightly less than 1GB.  I then typed:
  zcat nt.Z >nt
and got the error:

Filesize limit exceeded (core dumped)
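
(For illustration: "Filesize limit exceeded" is the shell's message for a process killed by SIGXFSZ. Below is a minimal sketch of how a 32-bit program built without large-file support dies this way on a 2.4 kernel; the file name and offsets are made up, and this is not gzip's actual source.)

/* Sketch only, NOT gzip's code: a 32-bit binary whose output fd
 * lacks O_LARGEFILE.  On Linux 2.4 the write that would push the
 * file past 2^31-1 bytes raises SIGXFSZ, whose default action kills
 * the process with a core dump ("Filesize limit exceeded").
 *
 * Build on i386 WITHOUT LFS flags:  gcc -o bigwrite bigwrite.c
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    int fd, i;
    char buf[4096];

    fd = open("big.out", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) { perror("open"); return 1; }

    /* Jump to just below the 2 GB boundary (the file is sparse). */
    if (lseek(fd, 0x7FFFF000L, SEEK_SET) == (off_t)-1) {
        perror("lseek"); return 1;
    }

    memset(buf, 'x', sizeof buf);

    /* Try to write across the boundary; without large-file support
     * the process dies with SIGXFSZ before this loop finishes. */
    for (i = 0; i < 4096; i++) {
        if (write(fd, buf, sizeof buf) < 0) {
            perror("write");            /* EFBIG if SIGXFSZ is ignored */
            break;
        }
    }
    close(fd);
    return 0;
}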


However, the following test of the new filesystem works:
dd if=/dev/zero of=./a_testfile bs=1M seek=2100 count=1
Resulting in:
sh-2.04$ ls -l a_testfile 
-rw-r--r-- 1 epstein users 2203058176 May 7 09:38 a_testfile

Strangely enough, the following works as well: 
gzip -d nt.Z >nt 
sh-2.04$ ls -l nt*
-rw-r--r-- 1 epstein users 3293252235 May 3 22:46 nt

This is pretty strange, since gzip and zcat are the same program, and have the same checksum. Note that the "gzip -d" solution is an effective workaround until the zcat problem is fixed.
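
(As an aside, the "same program" part is the standard multi-call-binary pattern: gzip, gunzip and zcat are hard links to one executable that picks its mode from argv[0], with zcat behaving like gzip -dc. A toy sketch of that pattern, illustrative names only, not gzip's source:)

/* Toy multi-call binary: one executable, several names, behavior
 * chosen from argv[0].  Illustrative, not gzip's actual source. */
#include <libgen.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    const char *name = basename(argv[0]);   /* basename may modify argv[0] */
    int decompress = 0, to_stdout = 0;

    (void)argc;
    if (strcmp(name, "zcat") == 0)          /* zcat = gzip -dc */
        decompress = to_stdout = 1;
    else if (strcmp(name, "gunzip") == 0)   /* gunzip = gzip -d */
        decompress = 1;

    printf("invoked as %s: decompress=%d, to_stdout=%d\n",
           name, decompress, to_stdout);
    return 0;
}

(Identical checksums are therefore expected for hard links; the puzzle is why the same code behaves differently depending on the invocation.)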

How reproducible:
Always

Steps to Reproduce:
1. lynx -dump   ftp://ncbi.nlm.nih.gov/blast/db/nt.Z >nt.Z
2. (go home for the night while waiting for the long download to complete)
3.  zcat nt.Z >nt



Actual Results:  "Filesize limit exceeded (core dumped)"
and a 2GB (incomplete) file

Expected Results:  No error message and a resulting file in the 3GB range.

Additional info:

Note that (sorry) the sample file (nt.Z) changes size over time, and eventually will probably exceed 2GB, making this example less meaningful. It would be nice to provide a more reproducible example, but that would require the integration of the gzip patches discussed at:
  http://www.gzip.org/#faq10

It's not clear why the patches aren't integrated into Red Hat 7.1, or whether the patches would actually fix the problem which I reported above.
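
(For reference, the usual shape of those patches is the transitional LFS recipe: build with _FILE_OFFSET_BITS=64 so off_t becomes 64 bits and open() implies O_LARGEFILE. A minimal sketch, assuming glibc's transitional LFS macros; the file name is made up:)

/* Minimal LFS sketch, illustrative only: with the two macros below
 * (or -D flags on the compile line), off_t is 64-bit and plain
 * open()/lseek()/write() can move past 2 GB.
 *
 * Build: gcc -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 lfs.c
 */
#define _LARGEFILE_SOURCE
#define _FILE_OFFSET_BITS 64

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    int fd = open("big.out", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) { perror("open"); return 1; }

    /* A 3 GB offset: representable in 64-bit off_t, impossible in 32. */
    off_t three_gb = (off_t)3 * 1024 * 1024 * 1024;
    if (lseek(fd, three_gb, SEEK_SET) == (off_t)-1) {
        perror("lseek"); return 1;
    }
    if (write(fd, "x", 1) != 1) { perror("write"); return 1; }

    printf("sizeof(off_t) = %u\n", (unsigned)sizeof(off_t));
    close(fd);
    return 0;
}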
Comment 1 Trond Eivind Glomsrød 2001-05-08 10:56:33 EDT
It's not reproducible with zcat either, on your testcase:

[root@halden teg]# ls -l foobar 
-rw-r--r--    1 root     root     2203058176 May  8 10:51 foobar
[root@halden teg]# gzip -9 foobar 
[root@halden teg]# ls -l
total 2092
-rw-r--r--    1 root     root      2138038 May  8 10:51 foobar.gz
[root@halden teg]# zcat foobar.gz > foobar    
[root@halden teg]# ls -l
total 2155624
-rw-r--r--    1 root     root     2203058176 May  8 10:57 foobar
-rw-r--r--    1 root     root      2138038 May  8 10:51 foobar.gz
[root@halden teg]#

I'm downloading your file now, to see if I can reproduce it with that.
Comment 2 Trond Eivind Glomsrød 2001-05-08 11:01:49 EDT
Hmm... the reason could be the .Z suffix - compress doesn't support files this large, at least.
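
(If that's it, the limit is probably plain 32-bit arithmetic: the old compress code keeps its byte counts in C long, which is 32 bits on i386, so the count wraps past 2 GB regardless of what the filesystem allows. A tiny hypothetical illustration of the wrap, not the actual ncompress source:)

/* Hypothetical illustration of a 32-bit byte counter wrapping past
 * 2 GB; not the actual ncompress source. */
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int64_t true_total = 0;
    const int64_t chunk = 64 * 1024;

    while (true_total < ((int64_t)5 << 29))    /* count up ~2.5 GB */
        true_total += chunk;

    /* What an i386 'long' would hold after the same run: the value
     * has wrapped negative. */
    int32_t counter = (int32_t)true_total;
    printf("true total: %" PRId64 ", 32-bit counter: %" PRId32 "\n",
           true_total, counter);
    return 0;
}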
Comment 3 Jonathan Epstein 2001-05-08 11:29:26 EDT
Sorry ... use anonymous FTP (or perhaps Netscape) rather than lynx to fetch the nt.Z file ... I got an error and a zero-length file when using lynx, probably because I don't remember exactly how to dump such a file.
Comment 4 Trond Eivind Glomsrød 2001-05-08 14:59:12 EDT
I couldn't reproduce your problem with the file specified (it works fine here, on an ext2 filesystem), but I made a new version of compress able to handle files > 2 GB, which should show up in Rawhide someday and, for a limited time, is available from http://people.redhat.com/teg/ - ncompress-4.2.4-23.
