Bug 39470 - zcat crashes on files which, uncompressed, are >2GB
Summary: zcat crashes on files which, uncompressed, are >2GB
Alias: None
Product: Red Hat Linux
Classification: Retired
Component: gzip
Version: 7.1
Hardware: i386
OS: Linux
Target Milestone: ---
Assignee: Trond Eivind Glomsrød
QA Contact: David Lawrence
URL: http://www.gzip.org/
Depends On:
Reported: 2001-05-07 21:07 UTC by Jonathan Epstein
Modified: 2007-04-18 16:33 UTC (History)
0 users

Clone Of:
Last Closed: 2001-05-08 15:29:29 UTC


Description Jonathan Epstein 2001-05-07 21:07:45 UTC
From Bugzilla Helper:
User-Agent: Mozilla/4.7 [en] (WinNT; I)

Description of problem:
Red Hat 7.1 supports file sizes >2GB.  However, zcat (and uncompress) crashes in these cases.
E.g., I downloaded the file
which is slightly less than 1GB.  I then typed:
  zcat nt.Z >nt
and got the error:

Filesize limit exceeded (core dumped)

However, the following test of the new filesystem works:
dd if=/dev/zero of=./a_testfile bs=1M seek=2100 count=1
Resulting in:
sh-2.04$ ls -l a_testfile 
-rw-r--r-- 1 epstein users 2203058176 May 7 09:38 a_testfile
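As a sketch, the filesystem test above can be scripted end to end; the expected size follows from the dd parameters (the seek of 2100 blocks plus the one written block, at 1 MiB each, puts the end of file at 2101 * 1048576 = 2203058176 bytes, just past the 2GB mark):

```shell
# Sparse-file test of large-file support: dd seeks 2100 MiB into a
# new file and writes a single 1 MiB block, so the file ends at
# 2101 * 1048576 = 2203058176 bytes without using 2 GB of disk.
dd if=/dev/zero of=a_testfile bs=1M seek=2100 count=1 2>/dev/null
ls -l a_testfile    # size column should read 2203058176
rm -f a_testfile
```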

Strangely enough, the following works as well: 
gzip -d nt.Z >nt 
sh-2.04$ ls -l nt*
-rw-r--r-- 1 epstein users 3293252235 May 3 22:46 nt

This is pretty strange, since gzip and zcat are the same program, and have the same checksum.  Note that the "gzip -d" solution is an effective workaround until the zcat problem is fixed.

How reproducible:

Steps to Reproduce:
1. lynx -dump   ftp://ncbi.nlm.nih.gov/blast/db/nt.Z >nt.Z
2. (go home for the night while waiting for the long download to complete)
3.  zcat nt.Z >nt


Actual Results:  "Filesize limit exceeded (core dumped)"
and a 2GB (incomplete) file

Expected Results:  No error message and a resulting file in the 3GB range.

Additional info:

Note that (sorry) the sample file (nt.Z) changes size over time, and will probably eventually exceed 2GB, making this example less meaningful.  It would be nice to provide a more reproducible example, but that would require integrating the gzip patches discussed at:

It's not clear why the patches aren't integrated into Red Hat 7.1, or whether they would actually fix the problem reported above.

Comment 1 Trond Eivind Glomsrød 2001-05-08 14:56:33 UTC
It's not reproducible with zcat either, on your testcase:

[root@halden teg]# ls -l foobar 
-rw-r--r--    1 root     root     2203058176 mai  8 10:51 foobar
[root@halden teg]# gzip -9 foobar 
[root@halden teg]# ls -l
totalt 2092
-rw-r--r--    1 root     root      2138038 mai  8 10:51 foobar.gz
[root@halden teg]# zcat foobar.gz > foobar    
[root@halden teg]# ls -l
totalt 2155624
-rw-r--r--    1 root     root     2203058176 mai  8 10:57 foobar
-rw-r--r--    1 root     root      2138038 mai  8 10:51 foobar.gz
[root@halden teg]#

I'm downloading your file now, to see if I can reproduce it with that.

Comment 2 Trond Eivind Glomsrød 2001-05-08 15:01:49 UTC
Hmm...  the reason could be the .Z suffix - compress, at least, doesn't support files this large.

Comment 3 Jonathan Epstein 2001-05-08 15:29:26 UTC
Sorry ... use anonymous FTP (or perhaps Netscape) rather than lynx to fetch the nt.Z file.  I got an error and a zero-length file when using lynx, probably because I don't remember exactly how to dump such a file.

Comment 4 Trond Eivind Glomsrød 2001-05-08 18:59:12 UTC
I couldn't reproduce your problem with the file specified (it works fine here,
on an ext2 filesystem), but I made a new version of compress able to handle
files > 2 GB which should show up in Rawhide someday and, for a limited time, is
available from http://people.redhat.com/teg/ - ncompress-4.2.4-23.
