From Bugzilla Helper:
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.2.1) Gecko/20030225

Description of problem:
When reading or writing files larger than 2 GB on a RH 9 NFS server from a RH 9 workstation, we eventually get a "File size limit exceeded" error. This does not occur with all large-file access, but once it starts, it happens every time large-file access is attempted, and it affects all programs: system commands such as cat and dd as well as our own in-house code. The one thing that consistently seems to trigger the error is doing a full backup of one of the large (hundreds of GB) workstation disks to a tape system on the server. (I suspect some 32-bit counter may be wrapping around at some point.)

Version-Release number of selected component (if applicable):
kernel-2.4.20-20

How reproducible:
Sometimes

Steps to Reproduce:
1. Create a large file on the server via NFS.
2. Read a large amount of data (hundreds of GB) from the client to the server.
3. Attempt to read the large file that was created (e.g. with cat).

Actual Results: File size limit exceeded

Expected Results: The file should have been read OK.

Additional info:
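The 2 GB boundary reported above matches the range of a signed 32-bit byte offset (the legacy 32-bit off_t). This is only a sketch of the arithmetic behind the wraparound hypothesis, not a claim about where in the kernel the counter lives:

```python
# A signed 32-bit counter can represent at most 2**31 - 1 bytes,
# i.e. one byte short of 2 GiB. Simulate what happens when a byte
# count is stored in such a counter.
INT32_MAX = 2**31 - 1          # 2147483647 bytes, just under 2 GiB
TWO_GIB = 2 * 1024**3          # 2147483648 bytes

def to_int32(n):
    """Truncate n to a signed 32-bit value, as a 32-bit off_t would."""
    n &= 0xFFFFFFFF
    return n - 2**32 if n >= 2**31 else n

print(to_int32(INT32_MAX))     # 2147483647 -- still representable
print(to_int32(TWO_GIB))       # -2147483648 -- goes negative at exactly 2 GiB
```

A file size or offset that wraps negative at 2 GiB would plausibly be rejected as exceeding the file size limit, which is consistent with the error appearing only once files cross that boundary.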
Thanks for the bug report. However, Red Hat no longer maintains this version of the product. Please upgrade to the latest version and open a new bug if the problem persists. The Fedora Legacy project (http://fedoralegacy.org/) maintains some older releases, and if you believe this bug is relevant to them, please report the problem in their bug tracker at: http://bugzilla.fedora.us/