Red Hat Bugzilla – Bug 66333
can't handle large files
Last modified: 2007-04-18 12:43:02 EDT
From Bugzilla Helper:
User-Agent: Mozilla/4.78 [en] (WinNT; U)
Description of problem:
I created a 16 GB database file. The lsof command doesn't seem to like it:
$ /usr/sbin/lsof data.dat
lsof: status error on data.dat: Value too large for defined data type
lsof 4.51 (latest revision at ftp://vic.cc.purdue.edu/pub/tools/unix/lsof)
usage: [-?abhlnNoOPRstUvV] [-c c] [+|-d s] [+D D] [+|-f]
[-F [f]] [-g [s]] [-i [i]] [+|-L [l]] [+|-M] [-o [o]] [-p s]
[+|-r [t]] [-S [t]] [-T [t]] [-u s] [+|-w] [--] [names]
Use the ``-h'' option to get more help information.
A Deja (Usenet archive) search suggested that this is likely due to the command not supporting large files.
Version-Release number of selected component (if applicable): 4.51-2
Steps to Reproduce:
1. Create a large file (attached lseek.c creates data.dat, a sparse 4 GB file)
2. /usr/sbin/lsof data.dat
Actual Results: The command printed the "Value too large for defined data type" warning.
Expected Results: Unless the file is in use, the command should have no output.
I found the following in the lsof FAQ list: "Large file support is defined dialect by dialect in the lsof source files and Configure script." Version 4.51, included with RH 7.3, does not define large file support for the Linux dialect. More recent versions, including the current 4.63, do.
Created attachment 60144
create a 4 GB file
You can also use dd to create a file that demonstrates this problem:
$ dd if=/dev/null of=foo.dat bs=1 count=1 seek=4000000000
0+0 records in
0+0 records out
$ /usr/sbin/lsof foo.dat
lsof: status error on foo.dat: Value too large for defined data type
Should be fixed in lsof-4.63-1 in rawhide.