Red Hat Bugzilla – Bug 126775
[PATCH] compress does not work if the file size is greater than 2GB
Last modified: 2007-11-30 17:07:02 EST
From Issue Tracker (41696):
When the file being compressed is larger than 2GB, compress segfaults.
This problem has been seen at a customer site and reproduced in our lab
on two systems.
Also see https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=66311 --
possibly the same symptom/problem.
The customer reported this against AS2.1, but RHEL3 still seems to have
the problem. We'll need the RHEL3 package moved into AS2.1 once the
fix has been applied; I'll open a separate bugzilla for that.
Created attachment 101437 [details]
A patch which seems to fix the problem
One file-size-related variable ("checkpoint") was still declared as
"long"; changing it to "long long" appears to fix the segfault. I can't
really follow the algorithm, but apparently it got confused once "checkpoint"
became negative, and it tried to write past the end of an array.
Note that reproducing the problem seems to require a file with at least
2GB of actual data; I could not trigger the segfault with a test file
that was only holes below 2GB plus a bit of data above that.
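The holes-versus-data distinction above can be seen directly: a sparse file reports a large apparent size while occupying almost no disk blocks, which may be why holes alone did not push enough real data through the compressor. A small sketch (file name is illustrative):

```shell
# Create a sparse file: one real byte written just past the 2 GiB mark.
dd if=/dev/zero of=sparse.dat bs=1 count=1 seek=2147483648 2>/dev/null

ls -l sparse.dat   # apparent size: just over 2 GiB
du -k sparse.dat   # actual allocated blocks: a few KiB at most

rm -f sparse.dat
```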
This test is in progress; packages will follow immediately after it succeeds:
sudo dd if=/dev/sda1 bs=1M | compress -c | uncompress -c > /dev/null
ncompress-4.2.4-37 in AS2.1-errata-candidate
ncompress-4.2.4-38 in 3.0-U3-HEAD
Apologies for the delay.