From Bugzilla Helper:
User-Agent: Mozilla/5.0 (X11; U; Linux 2.4.2-2 i686; en-US; 0.7) Gecko/20010316

Description of problem:
I suspect this issue only arises now because it is possible to create a large file (> 2147483647 bytes). The dump program will not allow a file larger than this to be created, e.g. with "dump -0f /mnt/mnt1/home.dump /home". Where the resulting file would be larger than the value above, dump complains and exits after dumping that amount.

It is possible to use "dump -0f - /home > /mnt/mnt1/home.dump", but this is hard to restore. With this second option, trying to restore with "restore -xf /mnt/mnt1/home.dump" causes an error similar to "File too large". It appears this error is coming from libc. The command "restore -xf - < /mnt/mnt1/home.dump" appears to work (I'm not 100% certain here) but reports an error after restoring files, and I'm not sure if it restores all permissions.

Having looked at the dump source distribution, there appears to be a relevant configure option. Building with "./configure --enable-largefile", the resulting binaries do not have the problem mentioned above. However, I haven't fully checked that using this option is OK. The documentation does say that it causes dump to use a 64-bit interface to glibc, and that a minimum version of glibc is needed; Red Hat 7.1 seems to meet this.

How reproducible:
Always

Steps to Reproduce:
1. See description

Additional info:
I've set the severity to High because of the risk that someone might think they have done a full dump of a file system, only to find that they haven't.
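As a minimal sketch of the underlying limit (assuming the error really is glibc/the kernel enforcing the 31-bit file size boundary), the test program below fails with EFBIG ("File too large") when built normally on a 32-bit system, but succeeds when built with "gcc -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE", which, as I understand it, is roughly what --enable-largefile arranges for dump. The file name lfs-test.out is just an example.

/* Try to write just past the 2147483647-byte mark.  Without large file
 * support the write fails with EFBIG; with it, the write succeeds and
 * leaves behind a sparse ~2 GB file.
 */
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    char buf[16];
    int fd = open("lfs-test.out", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* Position the file pointer exactly at the 2147483647-byte limit. */
    if (lseek(fd, (off_t) 2147483647, SEEK_SET) == (off_t) -1) {
        perror("lseek");
        return 1;
    }

    memset(buf, 0, sizeof(buf));
    if (write(fd, buf, sizeof(buf)) < 0)
        perror("write");   /* EFBIG when built without large file support */
    else
        printf("write past 2 GB succeeded (large file support enabled)\n");

    close(fd);
    return 0;
}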
Make sure you use dump-0.4b22 when building with --enable-largefile; earlier versions had problems with LFS. Stelian.
I have updated the dump package and built it with the --enable-largefile option.