Bug 39303 - dump to file doesn't cope with large filesystems
Status: CLOSED RAWHIDE
Product: Red Hat Linux
Classification: Retired
Component: dump
Version: 7.1
Hardware: i386
OS: Linux
Priority: medium
Severity: high
Assignee: Mike A. Harris
QA Contact: David Lawrence
 
Reported: 2001-05-07 06:11 UTC by Need Real Name
Modified: 2007-04-18 16:33 UTC

Last Closed: 2001-05-22 19:37:16 UTC

Description Need Real Name 2001-05-07 06:11:35 UTC
From Bugzilla Helper:
User-Agent: Mozilla/5.0 (X11; U; Linux 2.4.2-2 i686; en-US; 0.7) Gecko/20010316

Description of problem:
I suspect this issue only arises now because it is possible to create a
large file (> 2147483647 bytes, i.e. the 2^31 - 1 signed 32-bit limit).
The dump program will not allow a file greater than this to be created,
e.g. "dump -0f /mnt/mnt1/home.dump /home". Where the resulting file would
be larger than that limit, dump complains and exits after dumping that
amount. It is possible to use "dump -0f - /home > /mnt/mnt1/home.dump",
but the result is hard to restore.
With this second option, trying to restore with "restore -xf
/mnt/mnt1/home.dump" causes an error similar to "File too large", which
appears to be coming from libc. The command "restore -xf - <
/mnt/mnt1/home.dump" appears to work (I'm not 100% certain here), but it
reports an error after restoring the files, and I'm not sure whether it
restores all permissions.
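
A consolidated transcript of the combinations above (the paths are the
ones from this report; the outcomes assume a dump whose output would
exceed 2147483647 bytes):

    dump -0f /mnt/mnt1/home.dump /home      # dump complains and exits at the 2 GiB limit
    dump -0f - /home > /mnt/mnt1/home.dump  # works: the shell opens the output file
    restore -xf /mnt/mnt1/home.dump         # fails with "File too large" from libc
    restore -xf - < /mnt/mnt1/home.dump     # appears to work, with a trailing error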

Having looked at the dump distribution, there appears to be a relevant
configure option. Building with "./configure --enable-largefile", the
resulting binaries don't have the problem mentioned above. However, I
haven't looked into this fully enough to confirm that using the option
is safe. The documentation does say that it causes dump to use a 64-bit
interface to glibc, and that a minimum version of glibc is needed;
Red Hat 7.1 seems to meet this requirement.
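
As a sketch, the rebuild described above might look like this (the
tarball name is an assumption; see the comment below about which version
to use, and check the dump documentation for the exact glibc minimum):

    rpm -q glibc                     # confirm the glibc version shipped with the system
    tar xzf dump-0.4b22.tar.gz
    cd dump-0.4b22
    ./configure --enable-largefile   # use the 64-bit (large file) interface to glibc
    make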

How reproducible:
Always

Steps to Reproduce:
1. See the description above.

Additional info:

I've set the severity to High because of the risk that someone thinks
they've done a full dump of a file system, only to find that they
haven't.

Comment 1 Stelian Pop 2001-05-22 19:37:11 UTC
Make sure you get dump-0.4b22 when you build with --enable-largefile;
earlier versions had problems with LFS (large file support).

Stelian.
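
A quick check of the installed version before rebuilding, assuming the
stock Red Hat package name:

    rpm -q dump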

Comment 2 Mike A. Harris 2001-06-14 14:32:30 UTC
I have updated the dump package and built it with the --enable-largefile
option.
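
One way to verify that a rebuilt or updated binary really carries large
file support is to look for glibc's 64-bit entry points among its
dynamic symbols; this sketch assumes the binary is dynamically linked
and installed at the usual path:

    objdump -T /sbin/dump | grep -E 'open64|lseek64'

A binary built with --enable-largefile should reference these LFS
symbols; one built without it will not.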

