unzip fails on files larger than ~4GB. Is it possible to add large file support to it?
I think the old ZIP file spec it implements maxes out at 4GB. Because of the way zip/unzip passes return and error codes around (as negative numbers), we're limited to a few KB under 4GB, and getting it to handle nearly 4GB in the first place (instead of 2GB) took a lot of coercion. What we ultimately need is a Free implementation of the Zip64 specification.

For instance, there's the Linux port of 7zip, called p7zip: http://sourceforge.net/projects/p7zip
Unfortunately, the p7zip code is not 64-bit or endian clean right now, so it's not an option (i.e. it will only work correctly on i386).

There's a Zip64 implementation in the Heirloom Toolchest's cpio program: http://sourceforge.net/projects/heirloom
... but it requires reading entire files into RAM prior to compressing (not good).

According to the Info-Zip web page, version 3.0 (which might support large files) is under development: http://www.info-zip.org/Zip.html
Note, though, that the timeframe given there is 2004, perhaps "early summer".

Probably best to use gzip with tar files for now. Many modern archiver applications running under other operating systems can extract them.
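In case it helps to see where the 4GB ceiling comes from: the classic ZIP local file header stores the compressed and uncompressed sizes in 32-bit fields, and Zip64 signals "the real 64-bit sizes are in an extra field" by writing 0xFFFFFFFF there. Below is a rough Python sketch (function name and usage are just illustrative, not anything from unzip itself) that reads the first local file header of an archive and reports whether it relies on the Zip64 sentinel. Archives written in streaming mode keep their sizes in a trailing data descriptor instead, so treat this only as a quick check.

    import struct
    import sys

    # Classic (pre-Zip64) local file header: 30 bytes, little-endian.
    # The size fields are only 32 bits wide, which is why the original
    # format tops out just under 4GB; Zip64 writes 0xFFFFFFFF here and
    # puts the real 64-bit sizes in an "extra" field.
    LOCAL_HEADER = struct.Struct("<4sHHHHHIIIHH")
    ZIP64_SENTINEL = 0xFFFFFFFF

    def first_entry_sizes(path):
        """Read the first local file header and report its size fields."""
        with open(path, "rb") as f:
            data = f.read(LOCAL_HEADER.size)
        (sig, _ver, _flags, _method, _mtime, _mdate,
         _crc, csize, usize, _nlen, _xlen) = LOCAL_HEADER.unpack(data)
        if sig != b"PK\x03\x04":
            raise ValueError("not a ZIP local file header")
        needs_zip64 = ZIP64_SENTINEL in (csize, usize)
        return csize, usize, needs_zip64

    if __name__ == "__main__":
        csize, usize, needs_zip64 = first_entry_sizes(sys.argv[1])
        print("compressed:", csize, "uncompressed:", usize,
              "Zip64 sentinel present:", needs_zip64)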