From Bugzilla Helper:
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8) Gecko/20051111 Firefox/1.5

Description of problem:
Backing up files larger than 2 GB with cpio fails with the error "Value too large for defined data type", and the files are skipped. I thought it was a problem with cpio itself, and downloaded and recompiled several versions, all with the same result. It turns out that cpio will archive files larger than 2 GB on an ext3 filesystem created with -T largefile, but it fails on filesystems created with the default ext3 creation options. This is an issue because we have a filesystem of web content that is mainly small files, yet contains one or two large files that need to be backed up. Our current cpio-based backup scripts simply skip them. Please note that other utilities, including zcat and dd, work on the same large files that cpio fails to copy. For example, I can run zcat largefile.cpio | cpio -itv with no problem.

Version-Release number of selected component (if applicable):
cpio-2.5-3.2.legacy.i386.rpm

How reproducible:
Always

Steps to Reproduce:
1. Create a file 3 GB in size (largefile.tar)
2. ls *.tar | cpio -ov -O./backup.cpio
3. You get the error.
4. Copy the file to a machine whose filesystems were created with ext3 -T largefile
5. ls *.tar | cpio -ov -O./backup.cpio
6. It works!

Actual Results:
Value too large for defined data type

Expected Results:
The file should be backed up.

Additional info:
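The reproduction steps above can be sketched without actually writing 3 GB of data, assuming a filesystem that supports sparse files (the filenames are just examples from the report):

```shell
# Create a sparse file just over 3 GiB: dd seeks past the end and
# writes a single byte, so almost no disk space is allocated.
dd if=/dev/zero of=largefile.tar bs=1 count=1 seek=3G 2>/dev/null

# The apparent size is well past the 32-bit off_t limit of 2 GiB - 1:
stat -c %s largefile.tar

# On an affected system, the next command prints
# "Value too large for defined data type" and skips the file:
#   ls *.tar | cpio -ov -O./backup.cpio
```

This sketch only prepares the test file; whether the cpio step fails depends on the cpio build and the filesystem, as described above.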
This does not sound like a legacy-specific issue. It is an upstream cpio issue and should be filed upstream.
Does this still happen in Fedora Core Devel (or even FC4 for that matter)?
The latest version I tried was 2.6-2, which I think came from the pre-release FC4. I downloaded and compiled cpio-2.6-2.src.rpm on the target system with no luck.
Try cpio-2.6-9.FC4 or cpio-2.6-10 (devel); the file size limit has been raised to 4 GB there. Because of the cpio header format it is not possible to go beyond the 4 GB limit. It's up to upstream to change that.
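For context on why the ceiling sits at 4 GB: the SVR4 "newc" cpio header stores the file size (c_filesize) as 8 ASCII hexadecimal digits, so the largest representable size is 0xFFFFFFFF bytes. That this specific field is the limit the comment refers to is my reading of the format, not something stated in the report:

```shell
# Assumption: the "newc" header's c_filesize field is 8 hex digits,
# so the maximum storable file size is 0xFFFFFFFF bytes (4 GiB - 1):
printf '%d\n' 0xFFFFFFFF
# prints 4294967295
```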