Bug 1071337 - wrong size reported by du and ls
Summary: wrong size reported by du and ls
Keywords:
Status: CLOSED WORKSFORME
Alias: None
Product: Red Hat Gluster Storage
Classification: Red Hat Storage
Component: distribute
Version: rhgs-3.0
Hardware: x86_64
OS: Linux
Priority: unspecified
Severity: high
Target Milestone: ---
Target Release: ---
Assignee: Nithya Balachandran
QA Contact: storage-qa-internal@redhat.com
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2014-02-28 14:23 UTC by Bruno Cornec
Modified: 2016-09-01 13:45 UTC
CC List: 8 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2016-09-01 13:45:28 UTC
Embargoed:



Description Bruno Cornec 2014-02-28 14:23:50 UTC
Description of problem:

Using du and ls to get size information on a glusterfs-mounted file system gives incorrect results.

Version-Release number of selected component (if applicable):
GlusterFS 3.4 
glusterfs-libs-3.4.0.59rhs-1.el6rhs.x86_64
glusterfs-server-3.4.0.59rhs-1.el6rhs.x86_64
glusterfs-fuse-3.4.0.59rhs-1.el6rhs.x86_64
glusterfs-3.4.0.59rhs-1.el6rhs.x86_64

RHEL 6.4
Kernel 2.6.32-358.32.3.el6.x86_64

How reproducible:
each time

Steps to Reproduce:

For the ls case:
[root@hp-worker-1 Input]# ll -h
total 185G
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00000
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00001
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00002
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00003
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00004
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00005
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00006
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00007
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00008
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00009
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00010
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00011
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00012
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00013
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00014
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00015
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00016
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00017
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00018
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00019
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:44 part-m-00020
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00021
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00022
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00023
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00024
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00025
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00026
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00027
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00028
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00029
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00030
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00031
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00032
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00033
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00034
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00035
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00036
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00037
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00038
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00039
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00040
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00041
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00042
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00043
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00044
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00045
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00046
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00047
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00048
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00049
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00050
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00051
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00052
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00053
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00054
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00055
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00056
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00057
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00058
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00059
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00060
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00061
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00062
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00063
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00064
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00065
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00066
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:45 part-m-00067
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00068
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00069
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00070
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00071
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00072
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00073
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00074
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00075
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00076
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00077
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00078
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00079
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00080
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00081
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00082
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00083
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00084
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00085
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00086
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00087
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00088
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00089
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00090
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00091
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00092
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00093
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:46 part-m-00094
-rw-r--r-- 1 yarn yarn 994M Feb 27 15:47 part-m-00095
[root@hp-worker-1 Input]# ll -h | wc -l
97

Actual results:

At the top of the ll -h output we get a total of 185G.

Expected results:

We should get roughly 93G: the glusterfs-mounted directory holds 96 files of 994M each (the 97 from wc -l includes the "total" line), not 185G.

Similarly for du we get:

[root@hp-worker-1 Input]# du -sh part-m-00095
2.0G        part-m-00095

[root@hp-worker-1 Input]# stat part-m-00095
  File: `part-m-00095'
  Size: 1041666600      Blocks: 4036952    IO Block: 131072 regular file
Device: 1bh/27d Inode: 13806205625778023143  Links: 1
Access: (0644/-rw-r--r--)  Uid: (  506/    yarn)   Gid: (  506/    yarn)
Access: 2014-02-27 15:46:43.729596763 +0100
Modify: 2014-02-27 15:47:00.041168894 +0100
Change: 2014-02-27 15:47:18.250807562 +0100

So du reports 2 GB where roughly 1 GB is expected.
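For reference, GNU du sums st_blocks (512-byte units) unless --apparent-size is given, while ls -lh prints st_size; plugging in the two fields from the stat output above reproduces both figures:

$ echo $(( 4036952 * 512 ))              # st_blocks * 512 = what du counts
2066919424                               # ~1.9 GiB; du -h rounds up and prints 2.0G
$ echo $(( 1041666600 / 1024 / 1024 ))   # st_size in MiB = what ls -lh shows
993                                      # ls -lh rounds up and prints 994M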


Using strace, we found that these commands use different system calls, which might explain the differences observed:

execve("/bin/ls", ["ls", "-lh", "part-m-00095"], [/* 26 vars */]) = 0
[...]
lstat("part-m-00095", {st_mode=S_IFREG|0644, st_size=1041666600, ...}) = 0
[...]


execve("/usr/bin/stat", ["stat", "part-m-00095"], [/* 26 vars */]) = 0
[...]
lstat("part-m-00095", {st_mode=S_IFREG|0644, st_size=1041666600, ...}) = 0
fstat(1, {st_mode=S_IFREG|0644, st_size=3653, ...}) = 0
mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f6ed5b59000

Both seem to use stat-based syscalls (lstat), whereas:

execve("/usr/bin/du", ["du", "-sh", "part-m-00095"], [/* 26 vars */]) = 0
brk(0)                                  = 0x254c000
mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f82dcf96000
access("/etc/ld.so.preload", R_OK)      = -1 ENOENT (No such file or directory)
open("/etc/ld.so.cache", O_RDONLY)      = 3
fstat(3, {st_mode=S_IFREG|0644, st_size=44925, ...}) = 0
mmap(NULL, 44925, PROT_READ, MAP_PRIVATE, 3, 0) = 0x7f82dcf8b000
close(3)                                = 0
open("/lib64/libc.so.6", O_RDONLY)      = 3
read(3, "\177ELF\2\1\1\3\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\360\355A\2345\0\0\0"..., 832) = 832
fstat(3, {st_mode=S_IFREG|0755, st_size=1922152, ...}) = 0
mmap(0x359c400000, 3745960, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x359c400000
mprotect(0x359c58a000, 2093056, PROT_NONE) = 0
mmap(0x359c789000, 20480, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x189000) = 0x359c789000
mmap(0x359c78e000, 18600, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x359c78e000
close(3)                                = 0
mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f82dcf8a000
mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f82dcf89000
mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f82dcf88000
arch_prctl(ARCH_SET_FS, 0x7f82dcf89700) = 0
mprotect(0x359c789000, 16384, PROT_READ) = 0
mprotect(0x359be1f000, 4096, PROT_READ) = 0
munmap(0x7f82dcf8b000, 44925)           = 0
brk(0)                                  = 0x254c000
brk(0x256d000)                          = 0x256d000
open("/usr/lib/locale/locale-archive", O_RDONLY) = 3
fstat(3, {st_mode=S_IFREG|0644, st_size=99158576, ...}) = 0
mmap(NULL, 99158576, PROT_READ, MAP_PRIVATE, 3, 0) = 0x7f82d70f7000
close(3)                                = 0
newfstatat(AT_FDCWD, "part-m-00095", {st_mode=S_IFREG|0644, st_size=1041666600, ...}, AT_SYMLINK_NOFOLLOW) = 0
fstat(1, {st_mode=S_IFREG|0644, st_size=2005, ...}) = 0
mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f82dcf95000
write(1, "2.0G\tpart-m-00095\n", 182.0G	part-m-00095
)    = 18
close(1)                                = 0
munmap(0x7f82dcf95000, 4096)            = 0
close(2)                                = 0
exit_group(0)

du uses newfstatat instead. Both calls return the same struct stat, though, so the syscall alone does not explain the gap; the likelier cause is that du reports allocated size (st_blocks) while ls -l and stat print logical size (st_size).
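A quick way to see both fields side by side on any file is GNU stat's format specifiers (%s for st_size, %b and %B for allocated blocks and their unit); with the values already shown above, the output for this file would be:

$ stat -c 'st_size=%s  allocated=%b blocks of %B bytes' part-m-00095
st_size=1041666600  allocated=4036952 blocks of 512 bytes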

This is in the context of Hadoop on GlusterFS, using the GlusterFS Hadoop plugin. However, through that interface we get the correct value:

[root@hp-worker-1 Input]# su -l yarn -c "hadoop fs -dus /tmp/HiBench/Terasort/Input/part-m-00095"
dus: DEPRECATED: Please use 'du -s' instead.
14/02/27 17:37:37 INFO glusterfs.GlusterVolume: Initializing gluster volume..
14/02/27 17:37:37 INFO glusterfs.GlusterFileSystem: Configuring GlusterFS
14/02/27 17:37:37 INFO glusterfs.GlusterFileSystem: Initializing GlusterFS,  CRC disabled.
14/02/27 17:37:37 INFO glusterfs.GlusterFileSystem: GIT INFO={git.commit.id.abbrev=7b04317, git.commit.user.email=jayunit100, git.commit.message.full=Merge pull request #80 from jayunit100/2.1.6_release_fix_sudoers

include the sudoers file in the srpm, git.commit.id=7b04317ff5c13af8de192626fb40c4a0a5c37000, git.commit.message.short=Merge pull request #80 from jayunit100/2.1.6_release_fix_sudoers, git.commit.user.name=jay vyas, git.build.user.name=Unknown, git.commit.id.describe=2.1.6, git.build.user.email=Unknown, git.branch=7b04317ff5c13af8de192626fb40c4a0a5c37000, git.commit.time=07.02.2014 @ 12:06:31 EST, git.build.time=10.02.2014 @ 13:31:20 EST}
14/02/27 17:37:37 INFO glusterfs.GlusterFileSystem: GIT_TAG=2.1.6
14/02/27 17:37:37 INFO glusterfs.GlusterFileSystem: Configuring GlusterFS
14/02/27 17:37:37 INFO glusterfs.GlusterVolume: Initializing gluster volume..
14/02/27 17:37:37 INFO glusterfs.GlusterVolume: Root of Gluster file system is /mnt/hpbigdata
14/02/27 17:37:37 INFO glusterfs.GlusterVolume: mapreduce/superuser daemon : root
14/02/27 17:37:37 INFO glusterfs.GlusterVolume: Working directory is : glusterfs:/user/yarn
14/02/27 17:37:37 INFO glusterfs.GlusterVolume: Write buffer size : 131072
1041666600  /tmp/HiBench/Terasort/Input/part-m-00095
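
The 1041666600 printed here is exactly the st_size from the stat output above, so the plugin reports logical size rather than allocated blocks. A direct check through the FUSE mount would be (full path assembled from the "Root of Gluster file system" line above, so treat it as an assumption):

$ stat -c %s /mnt/hpbigdata/tmp/HiBench/Terasort/Input/part-m-00095
1041666600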

Comment 2 Gaurav Kumar Garg 2015-12-30 08:41:43 UTC
Can someone from the AFR or DHT team look into this? It doesn't belong to the a-team.

Comment 3 Nithya Balachandran 2016-09-01 13:45:28 UTC
I am unable to reproduce this on glusterfs-server-3.7.9-12.el7rhgs.x86_64. 

I tried with smaller files (1MB each) and ll -h reported the correct size.

Volume Name: vol2
Type: Distribute
Volume ID: 0ed28329-f4c7-46ff-9c9c-78f4608ccf13
Status: Started
Number of Bricks: 6
Transport-type: tcp
Bricks:
Brick1: 192.168.122.17:/bricks/brick2/a-1
Brick2: 192.168.122.17:/bricks/brick2/a-2
Brick3: 192.168.122.17:/bricks/brick2/a-3
Brick4: 192.168.122.17:/bricks/brick2/a-4
Brick5: 192.168.122.17:/bricks/brick2/a-5
Brick6: 192.168.122.17:/bricks/brick2/a-6
Options Reconfigured:
performance.readdir-ahead: on


192.168.122.17:vol2 on /mnt/client1 type fuse.glusterfs (rw,relatime,user_id=0,group_id=0,default_permissions,allow_other,max_read=131072)



[root@localhost testdir]# for i in {1..100}; do dd if=/dev/urandom of=file-$i bs=1K count=1024; done


[root@localhost testdir]# ll -h
total 100M
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-1
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-10
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-100
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-11
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-12
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-13
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-14
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-15
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-16
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-17
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-18
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-19
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-2
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-20
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-21
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-22
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-23
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-24
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-25
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-26
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-27
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-28
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-29
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-3
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-30
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-31
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-32
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-33
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-34
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-35
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-36
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-37
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-38
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-39
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-4
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-40
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-41
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-42
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-43
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-44
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-45
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-46
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-47
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-48
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-49
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-5
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-50
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-51
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-52
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-53
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-54
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-55
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-56
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-57
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-58
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-59
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-6
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-60
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-61
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-62
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-63
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-64
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-65
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-66
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-67
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-68
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-69
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-7
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-70
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-71
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-72
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-73
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-74
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-75
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-76
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-77
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-78
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-79
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-8
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-80
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-81
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-82
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-83
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-84
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-85
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-86
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-87
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-88
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-89
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-9
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-90
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-91
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-92
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-93
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-94
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-95
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-96
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-97
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-98
-rw-r--r--. 1 root root 1.0M Sep  1 19:07 file-99
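
A du cross-check along the same lines (a suggested follow-up, not part of the original log; the directory path is assumed from the mount output above) would be:

[root@localhost testdir]# du -sh /mnt/client1/testdir            # should be close to 100M if blocks match logical size
[root@localhost testdir]# stat -c 'st_size=%s  st_blocks=%b' file-1   # compare logical vs allocated size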



I am therefore closing this as WORKSFORME. Please reopen if this is seen again.

