Bug 1376562
| Summary: | last created dmstats file map isn't displayed | | |
|---|---|---|---|
| Product: | Red Hat Enterprise Linux 7 | Reporter: | Corey Marthaler <cmarthal> |
| Component: | lvm2 | Assignee: | Bryn M. Reeves <bmr> |
| lvm2 sub component: | dmsetup | QA Contact: | cluster-qe <cluster-qe> |
| Status: | CLOSED ERRATA | Docs Contact: | |
| Severity: | medium | | |
| Priority: | unspecified | CC: | agk, heinzm, jbrassow, lmiksik, msnitzer, mthacker, prajnoha, prockai, zkabelac |
| Version: | 7.3 | | |
| Target Milestone: | rc | | |
| Target Release: | --- | | |
| Hardware: | x86_64 | | |
| OS: | Linux | | |
| Whiteboard: | | | |
| Fixed In Version: | lvm2-2.02.165-3.el7 | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2016-11-04 04:18:57 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
|
Description
Corey Marthaler
2016-09-15 17:43:46 UTC
Same thing on ext4 and xfs (comment #0 was xfs, this is ext4).

```
[root@host-117 ~]# ls -lrt /mnt/lvol0/
total 1973
drwx------. 2 root root  12288 Sep 15 12:45 lost+found
-rw-rw-rw-. 1 root root 255720 Sep 15 12:45 xjjcmoeftdcathggnism
-rw-rw-rw-. 1 root root 178652 Sep 15 12:45 vntwhckfhujdlh
-rw-rw-rw-. 1 root root  37185 Sep 15 12:45 oxh
-rw-rw-rw-. 1 root root 359128 Sep 15 12:45 jyounkwfobrpvwktguljapbouqhaasi
-rw-rw-rw-. 1 root root  46176 Sep 15 12:45 hywd
-rw-rw-rw-. 1 root root 300144 Sep 15 12:45 huqhghhdvahqxsbqwfaglaqeatj
-rw-rw-rw-. 1 root root  59637 Sep 15 12:45 gtvyy
-rw-rw-rw-. 1 root root 132108 Sep 15 12:45 eyhphlsfseyh
-rw-rw-rw-. 1 root root 156651 Sep 15 12:45 endtuoiwtqsrpf
-rw-rw-rw-. 1 root root 466376 Sep 15 12:45 aamjuraatffafxcjulduflntpdhcbgvdgloedqeyxlygl

[root@host-117 ~]# dmstats create --filemap /mnt/lvol0/xjjcmoeftdcathggnism
/mnt/lvol0/xjjcmoeftdcathggnism: Created new group with 1 region(s) as group ID 0.
[root@host-117 ~]# dmstats create --filemap /mnt/lvol0/vntwhckfhujdlh
/mnt/lvol0/vntwhckfhujdlh: Created new group with 1 region(s) as group ID 1.
[root@host-117 ~]# dmstats create --filemap /mnt/lvol0/oxh
/mnt/lvol0/oxh: Created new group with 1 region(s) as group ID 2.

[root@host-117 ~]# dmstats list --group
Name                 GrpID RgID ObjType RgStart RgSize  #Areas ArSize  ProgID
xjjcmoeftdcathggnism     0    0 group     8.63m 250.00k      1 250.00k dmstats
vntwhckfhujdlh           1    1 group     9.36m 175.00k      1 175.00k dmstats

[root@host-117 ~]# dmstats report --group
Name                 GrpID RgID ObjType ArID ArStart ArSize  RMrg/s WMrg/s R/s  W/s  RSz/s WSz/s AvgRqSz QSize Util% AWait RdAWait WrAWait
xjjcmoeftdcathggnism     0    0 group      0   8.63m 250.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
vntwhckfhujdlh           1    1 group      0   9.36m 175.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
```

I've not managed to reproduce this yet; I'll test again today with the RHEL7 builds.
If you have the test setup still to hand, it'd be useful to see the group tags stored with the regions:

```
# dmsetup message $DM_DEVICE 0 "@stats_list"
```

I can reproduce this now, using exactly the file names and sizes from comment #0:

```
# ls -l /var/tmp/test
total 2404
-rw-r--r--. 1 root root  55900 Sep 16 10:44 dweyv
-rw-r--r--. 1 root root  74664 Sep 16 10:44 fjgsbcm
-rw-r--r--. 1 root root 394230 Sep 16 10:44 fsxmsjjuuybvvuvffucprynnaujtadghxgv
-rw-r--r--. 1 root root  11236 Sep 16 10:44 j
-rw-r--r--. 1 root root  92962 Sep 16 10:44 joxcobxp
-rw-r--r--. 1 root root 319823 Sep 16 10:44 klnsuhitivfmpvcxyjosnxfcsqp
-rw-r--r--. 1 root root 392400 Sep 16 10:44 masrralqmguoowffwuscbvnhutfiehptj
-rw-r--r--. 1 root root 448580 Sep 16 10:44 nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu
-rw-r--r--. 1 root root 167200 Sep 16 10:44 nxaajqdsvqbqnn
-rw-r--r--. 1 root root 484071 Sep 16 10:44 oqcgmbeopavniredqkiswcvrptskyaypqdvdhauw

 1006  dmstats create --filemap fsxmsjjuuybvvuvffucprynnaujtadghxgv
 1007  dmstats create --filemap joxcobxp
 1008  dmstats create --filemap nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu
 1009  dmstats list --group

Name                                GrpID RgID ObjType RgStart RgSize  #Areas ArSize  ProgID
fsxmsjjuuybvvuvffucprynnaujtadghxgv     0    0 group    11.16g 388.00k      1 388.00k dmstats
joxcobxp                                1    1 group    10.53g  92.00k      1  92.00k dmstats
```

Somewhat bizarrely, it does not reproduce with my standard set of test image files:

```
# ls /var/lib/libvirt/images/
rhel5.10-1.qcow2  rhel5.10.qcow2  rhel7.0-1.qcow2  rhel7.0-2.qcow2  rhel7.0.qcow2  vm.img

# dmstats list --group vg_hex/lv_images
Name             GrpID RgID      ObjType RgStart RgSize  #Areas ArSize  ProgID
rhel5.10-1.qcow2     0 0-120     group    80.00k 328.24m      1 328.24m dmstats
rhel5.10.qcow2     121 121-128   group    22.02m   1.13m      1   1.13m dmstats
rhel7.0-1.qcow2    129 129-139   group     8.90g   1.69m      1   1.69m dmstats
rhel7.0-2.qcow2    140 140       group    11.45g 448.00k      1 448.00k dmstats
rhel7.0.qcow2      141 141-1605  group    16.08m   4.35g      1   4.35g dmstats
vm.img            1606 1606-1607 group     6.55g   1.00g      1   1.00g dmstats
```

I think it's a bug affecting only groups where the first group member (leader) is also the last region present on a device:

```
# dmsetup message vg_hex-lv_root 0 "@stats_list"
0: 23396720+776 776 dmstats DMS_GROUP=fsxmsjjuuybvvuvffucprynnaujtadghxgv:0#-
1: 22092536+184 184 dmstats DMS_GROUP=joxcobxp:1#-
2: 23631960+880 880 dmstats DMS_GROUP=nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu:2#-
```

Region 2 is the only member of group 2. If I add a 2nd region (even without adding it to group_id 2's DMS_GROUP tag), the missing group appears:

```
# dmstats list --group vg_hex/lv_root
Name                                   GrpID RgID ObjType RgStart RgSize  #Areas ArSize  ProgID
fsxmsjjuuybvvuvffucprynnaujtadghxgv        0    0 group    11.16g 388.00k      1 388.00k dmstats
joxcobxp                                   1    1 group    10.53g  92.00k      1  92.00k dmstats
nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu     2    2 group    11.27g 440.00k      1 440.00k dmstats
```

Seems to be a bug in the end conditions for the group walk iterator.

Definitely a bug in the group walk: the groups are correctly recognised and populated, and the walk is correctly set up. It just doesn't visit the final group in the case that that group has one member, and that member is the final region on the device:

```
Read alias 'fsxmsjjuuybvvuvffucprynnaujtadghxgv' from aux_data
Found group_id 0: alias="fsxmsjjuuybvvuvffucprynnaujtadghxgv"
Read alias 'joxcobxp' from aux_data
Found group_id 1: alias="joxcobxp"
Read alias 'nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu' from aux_data
Found group_id 2: alias="nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu"
dm_stats_walk_init: initialised flags to f000000000000
starting stats walk with  AREA  REGION  GROUP  SKIP

Name                                   GrpID RgID ObjType RgStart RgSize  #Areas ArID ArStart ArSize  ProgID
fsxmsjjuuybvvuvffucprynnaujtadghxgv        0    0 area    184.00k 388.00k      1    0 184.00k 388.00k dmstats
joxcobxp                                   1    1 area    584.00k  92.00k      1    0 584.00k  92.00k dmstats
nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu     2    2 area      1.34m 440.00k      1    0   1.34m 440.00k dmstats
fsxmsjjuuybvvuvffucprynnaujtadghxgv        0    0 group   184.00k 388.00k      1    0 184.00k 388.00k dmstats
joxcobxp                                   1    1 group   584.00k  92.00k      1    0 584.00k  92.00k dmstats
```

The test for the end of the group table in _stats_walk_end() is wrong:
```c
1526 static int _stats_walk_end(const struct dm_stats *dms, uint64_t *flags,
1527                            uint64_t *cur_r, uint64_t *cur_a, uint64_t *cur_g)
1528 {
[...]
1544
1545         if (*flags & DM_STATS_WALK_GROUP) {
1546                 if (*cur_g < dms->max_region)
                                ^
1547                         goto out;
1548                 *flags &= ~DM_STATS_WALK_GROUP;
1549         }
1550 out:
1551         return !(*flags & ~DM_STATS_WALK_SKIP_SINGLE_AREA);
1552 }
```
The comparison with dms->max_region should be '<=': with '<', the walk is ended before visiting a group whose leader is the final region on the device.
Making this change fixes the --group output:
[ original ]

```
# dmstats list --group
Name                                GrpID RgID ObjType RgStart RgSize  #Areas ArSize  ProgID
fsxmsjjuuybvvuvffucprynnaujtadghxgv     0    0 group    11.16g 388.00k      1 388.00k dmstats
joxcobxp                                1    1 group    10.53g  92.00k      1  92.00k dmstats
```

[ patched ]

```
# dmstats list --group
Name                                   GrpID RgID ObjType RgStart RgSize  #Areas ArSize  ProgID
fsxmsjjuuybvvuvffucprynnaujtadghxgv        0    0 group    11.16g 388.00k      1 388.00k dmstats
joxcobxp                                   1    1 group    10.53g  92.00k      1  92.00k dmstats
nnkgyhnkfnlfrhhbucmybomlhokgkotaeghenu     2    2 group    11.27g 440.00k      1 440.00k dmstats
```
Fixed upstream:

```
commit c26cd4853668f1da94eb98e18d7a52c93fd21197
Author: Bryn M. Reeves <bmr>
Date:   Fri Sep 16 13:08:30 2016 +0100

    libdm: fix end-of-groups test in _stats_walk_end()
```
(In reply to Bryn M. Reeves from comment #6)
(In reply to Bryn M. Reeves from comment #7)

The patch is trivial; I'm adding the blocker flag to get this into the next 7.3 lvm2 build (which we'll do anyway).

Fix verified in the latest rpms.

```
lvm2-2.02.165-3.el7                        BUILT: Wed Sep 21 08:26:18 CDT 2016
lvm2-libs-2.02.165-3.el7                   BUILT: Wed Sep 21 08:26:18 CDT 2016
lvm2-cluster-2.02.165-3.el7                BUILT: Wed Sep 21 08:26:18 CDT 2016
device-mapper-1.02.134-3.el7               BUILT: Wed Sep 21 08:26:18 CDT 2016
device-mapper-libs-1.02.134-3.el7          BUILT: Wed Sep 21 08:26:18 CDT 2016
device-mapper-event-1.02.134-3.el7         BUILT: Wed Sep 21 08:26:18 CDT 2016
device-mapper-event-libs-1.02.134-3.el7    BUILT: Wed Sep 21 08:26:18 CDT 2016
device-mapper-persistent-data-0.6.3-1.el7  BUILT: Fri Jul 22 05:29:13 CDT 2016
```

```
-rw-rw-rw-. 1 root root  14641 Sep 23 11:48 y
-rw-rw-rw-. 1 root root 293510 Sep 23 11:48 bfoagyxadmcimesocrpadmodnmvv
-rw-rw-rw-. 1 root root 224614 Sep 23 11:48 jkjnbdfcggyqnipjdhnh
-rw-rw-rw-. 1 root root 645814 Sep 23 11:48 veblwvgquvqnoxuvvhovfjxpuhtvtitroxdltmfoivewtcurl
-rw-rw-rw-. 1 root root 214170 Sep 23 11:48 yayctulqxvguecgu
-rw-rw-rw-. 1 root root 603600 Sep 23 11:48 xnfnqqbotttupjjhjscvxlmdtgcrtnnrbuirljjhfdfwoo

[root@host-116 checkit]# dmstats create --filemap y
y: Created new group with 1 region(s) as group ID 0.
[root@host-116 checkit]# dmstats create --filemap bfoagyxadmcimesocrpadmodnmvv
bfoagyxadmcimesocrpadmodnmvv: Created new group with 1 region(s) as group ID 1.
[root@host-116 checkit]# dmstats create --filemap jkjnbdfcggyqnipjdhnh
jkjnbdfcggyqnipjdhnh: Created new group with 1 region(s) as group ID 2.

[root@host-116 checkit]# dmstats report --group
Name                         GrpID RgID ObjType ArID ArStart ArSize  RMrg/s WMrg/s R/s  W/s  RSz/s WSz/s AvgRqSz QSize Util% AWait RdAWait WrAWait
y                                0    0 group      0  24.51m  16.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
bfoagyxadmcimesocrpadmodnmvv     1    1 group      0  24.53m 288.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
jkjnbdfcggyqnipjdhnh             2    2 group      0  24.81m 220.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00

[root@host-116 checkit]# dmstats list --group
Name                         GrpID RgID ObjType RgStart RgSize  #Areas ArSize  ProgID
y                                0    0 group    24.51m  16.00k      1  16.00k dmstats
bfoagyxadmcimesocrpadmodnmvv     1    1 group    24.53m 288.00k      1 288.00k dmstats
jkjnbdfcggyqnipjdhnh             2    2 group    24.81m 220.00k      1 220.00k dmstats

[root@host-116 checkit]# dmstats create --filemap veblwvgquvqnoxuvvhovfjxpuhtvtitroxdltmfoivewtcurl
veblwvgquvqnoxuvvhovfjxpuhtvtitroxdltmfoivewtcurl: Created new group with 1 region(s) as group ID 3.
[root@host-116 checkit]# dmstats create --filemap yayctulqxvguecgu
yayctulqxvguecgu: Created new group with 1 region(s) as group ID 4.
[root@host-116 checkit]# dmstats create --filemap xnfnqqbotttupjjhjscvxlmdtgcrtnnrbuirljjhfdfwoo
xnfnqqbotttupjjhjscvxlmdtgcrtnnrbuirljjhfdfwoo: Created new group with 1 region(s) as group ID 5.

[root@host-116 checkit]# dmstats report --group
Name                                              GrpID RgID ObjType ArID ArStart ArSize  RMrg/s WMrg/s R/s  W/s  RSz/s WSz/s AvgRqSz QSize Util% AWait RdAWait WrAWait
y                                                     0    0 group      0  24.51m  16.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
bfoagyxadmcimesocrpadmodnmvv                          1    1 group      0  24.53m 288.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
jkjnbdfcggyqnipjdhnh                                  2    2 group      0  24.81m 220.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
veblwvgquvqnoxuvvhovfjxpuhtvtitroxdltmfoivewtcurl     3    3 group      0  25.02m 632.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
yayctulqxvguecgu                                      4    4 group      0  25.64m 212.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00
xnfnqqbotttupjjhjscvxlmdtgcrtnnrbuirljjhfdfwoo        5    5 group      0  25.85m 592.00k   0.00   0.00 0.00 0.00     0     0       0  0.00  0.00  0.00    0.00    0.00

[root@host-116 checkit]# dmstats list --group
Name                                              GrpID RgID ObjType RgStart RgSize  #Areas ArSize  ProgID
y                                                     0    0 group    24.51m  16.00k      1  16.00k dmstats
bfoagyxadmcimesocrpadmodnmvv                          1    1 group    24.53m 288.00k      1 288.00k dmstats
jkjnbdfcggyqnipjdhnh                                  2    2 group    24.81m 220.00k      1 220.00k dmstats
veblwvgquvqnoxuvvhovfjxpuhtvtitroxdltmfoivewtcurl     3    3 group    25.02m 632.00k      1 632.00k dmstats
yayctulqxvguecgu                                      4    4 group    25.64m 212.00k      1 212.00k dmstats
xnfnqqbotttupjjhjscvxlmdtgcrtnnrbuirljjhfdfwoo        5    5 group    25.85m 592.00k      1 592.00k dmstats
```

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHBA-2016-1445.html