Bug 1747585 - vgextend segfaults if attempting to extend using a device that was previously removed and volumes were deleted
Summary: vgextend segfaults if attempting to extend using a device that was previously removed and volumes were deleted
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: lvm2
Version: 8.1
Hardware: x86_64
OS: Linux
Priority: medium
Severity: medium
Target Milestone: rc
Target Release: 8.0
Assignee: David Teigland
QA Contact: cluster-qe@redhat.com
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-08-30 22:04 UTC by Corey Marthaler
Modified: 2021-09-07 11:53 UTC
CC: 9 users

Fixed In Version: lvm2-2.03.07-1.el8
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2020-04-28 16:58:57 UTC
Type: Bug
Target Upstream Version:
Embargoed:


Attachments: none


Links
Red Hat Issue Tracker RHELPLAN-30672 (last updated 2021-09-07 11:51:50 UTC)
Red Hat Product Errata RHEA-2020:1881 (last updated 2020-04-28 16:59:11 UTC)

Description Corey Marthaler 2019-08-30 22:04:12 UTC
Description of problem:
I was playing around with the regression test case for bug 1434054, which basically removes a primary raid device, runs vgreduce --removemissing, and then, before the next command is run, adds the primary device back while also removing the secondary device. For a moment you can therefore end up with no known devices in a mounted raid (like in bug 1583805). After seeing the "outdated PV" warnings when running vgck, I started adding "vgck --updatemetadata" in certain spots, as mentioned in bug 1741276, and eventually ran into this segfault. I'll attempt to provide a simpler reproducer; a condensed sketch of the sequence is shown below.
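
A condensed sketch of the sequence (assumption: same commands and device names as the full log below; in the test, a device is "removed" by adding it to the lvm.conf reject filter rather than physically pulling it):

  # create the raid1 and keep it mounted (see the full log below)
  lvcreate --nosync --type raid1 -m 1 -n removemissing -L 300M raid_sanity
  # hide the primary leg via the lvm.conf filter, then drop it from the VG
  vgreduce --force --removemissing raid_sanity
  # un-hide the primary leg, hide the secondary leg instead, then repair metadata;
  # this wipes the mda and header on the now-outdated returning PV (/dev/sde1 here)
  vgck --updatemetadata raid_sanity
  # delete the partial raid volume, then try to extend with the wiped device
  lvremove -f raid_sanity
  vgextend raid_sanity /dev/sde1      # segfaults in this state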



============================================================
Iteration 1 of 1 started at Fri Aug 30 16:41:14 CDT 2019
============================================================
SCENARIO (raid1) - [remove_another_image_during_meta_update]
Create a raid, hide a PV, then vgreduce, and reinstate missing PV right afterwards
hayes-01: lvcreate  --nosync --type raid1 -m 1 -n removemissing -L 300M raid_sanity
  WARNING: New raid1 won't be synchronised. Don't read what you didn't write!
WARNING: ext2 signature detected on /dev/raid_sanity/removemissing at offset 1080. Wipe it? [y/n]: [n]
  Aborted wiping of ext2.
  1 existing signature left on the device.
Placing an ext filesystem on linear volume
mke2fs 1.44.6 (5-Mar-2019)
Writing files to /mnt/removemissing
Checking files on /mnt/removemissing

Adding /dev/sde1 to be excluded from lvm.conf filter
  WARNING: Couldn't find device with uuid fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  WARNING: VG raid_sanity is missing PV fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_0 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_0 while checking used and assumed devices.
  removemissing            raid_sanity Rwi-aor-p- 300.00m                                    100.00           removemissing_rimage_0(0),removemissing_rimage_1(0)
   [removemissing_rimage_0] raid_sanity iwi-aor-p- 300.00m                                                     [unknown](1)                                       
   [removemissing_rimage_1] raid_sanity iwi-aor--- 300.00m                                                     /dev/sdf1(1)                                       
   [removemissing_rmeta_0]  raid_sanity ewi-aor-p-   4.00m                                                     [unknown](0)                                       
   [removemissing_rmeta_1]  raid_sanity ewi-aor---   4.00m                                                     /dev/sdf1(0)                                       

Verifying raid has missing device with vgck
  WARNING: Couldn't find device with uuid fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  WARNING: VG raid_sanity is missing PV fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_0 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_0 while checking used and assumed devices.
  The volume group is missing 1 physical volumes.
Run vgreduce --removemissing to remove "missing" PV and then immediately bring that PV back
vgreduce --force --removemissing raid_sanity
  WARNING: Couldn't find device with uuid fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  WARNING: VG raid_sanity is missing PV fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_0 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_0 while checking used and assumed devices.
  WARNING: Couldn't find device with uuid fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  WARNING: Couldn't find device with uuid fCfYPa-IoS8-y090-2vUf-kHvB-TSws-JzsU2e.
  removemissing            raid_sanity Rwi-aor-r- 300.00m                                    100.00           removemissing_rimage_0(0),removemissing_rimage_1(0)
   [removemissing_rimage_0] raid_sanity vwi-aor-r- 300.00m                                                                                                        
   [removemissing_rimage_1] raid_sanity iwi-aor--- 300.00m                                                     /dev/sdf1(1)                                       
   [removemissing_rmeta_0]  raid_sanity ewi-aor-r-   4.00m                                                                                                        
   [removemissing_rmeta_1]  raid_sanity ewi-aor---   4.00m                                                     /dev/sdf1(0)                                       

Removing /dev/sde1 from excluded lvm.conf filter
And now adding /dev/sdf1 to excluded lvm.conf filter

vgck --updatemetadata raid_sanity
  WARNING: ignoring metadata seqno 45 on /dev/sde1 for seqno 47 on /dev/sdb1 for VG raid_sanity.
  WARNING: Inconsistent metadata found for VG raid_sanity
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: outdated PV /dev/sde1 seqno 45 has been removed in current VG raid_sanity seqno 47.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: wiping mda on outdated PV /dev/sde1
  WARNING: wiping header on outdated PV /dev/sde1
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
Check that vgs/pvs doesn't segfault here (Bug 1434054)
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
Checking files on /mnt/removemissing

  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  removemissing            raid_sanity Rwi-aor-p- 300.00m                                    100.00           removemissing_rimage_0(0),removemissing_rimage_1(0)
   [removemissing_rimage_0] raid_sanity vwi-aor-r- 300.00m                                                                                                        
   [removemissing_rimage_1] raid_sanity iwi-aor-p- 300.00m                                                     [unknown](1)                                       
   [removemissing_rmeta_0]  raid_sanity ewi-aor-r-   4.00m                                                                                                        
   [removemissing_rmeta_1]  raid_sanity ewi-aor-p-   4.00m                                                     [unknown](0)                                       



[root@hayes-01 ~]# vgextend raid_sanity /dev/sdk1
  Volume group "raid_sanity" successfully extended
[root@hayes-01 ~]# umount /mnt/*


[root@hayes-01 ~]# lvremove -f raid_sanity
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  Logical volume "removemissing" successfully removed
[root@hayes-01 ~]# vgextend raid_sanity /dev/sdk1
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  Physical volume '/dev/sdk1' is already in volume group 'raid_sanity'
  Unable to add physical volume '/dev/sdk1' to volume group 'raid_sanity'
  /dev/sdk1: physical volume not initialized.
[root@hayes-01 ~]# vgck --updatemetadata raid_sanity
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
[root@hayes-01 ~]# vgs
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  VG          #PV #LV #SN Attr   VSize   VFree  
  raid_sanity   9   0   0 wz-pn- <16.37t <16.37t
[root@hayes-01 ~]# pvscan
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  PV [unknown]   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdc1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdb1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdg1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdi1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdh1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdd1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdj1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdk1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sde1                      lvm2 [<1.82 TiB]
  Total: 10 [<18.19 TiB] / in use: 9 [<16.37 TiB] / in no VG: 1 [<1.82 TiB]
[root@hayes-01 ~]# lvs
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
[root@hayes-01 ~]# vgextend raid_sanity /dev/sde1
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: VG raid_sanity is missing PV oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
  WARNING: Couldn't find device with uuid oagj0B-ExBB-kWeP-bMhN-qPPG-YQm2-Kybfjx.
Segmentation fault (core dumped)



[792752.304057] vgextend[23039]: segfault at 34 ip 0000558bb6b2ab29 sp 00007fffb8ad3800 error 4 in lvm[558bb6a8f000+20f000]
[792752.316226] Code: a4 0e 00 50 52 ba 5e 00 00 00 eb 99 0f 1f 44 00 00 f3 0f 1e fa 41 56 49 89 f6 41 55 49 89 d5 41 54 55 53 48 89 fb 48 83 ec 10 <44> 8b 67 34 64 48 8b 04 25 28 00 00 00 48 89 44 24 08 31 c0 48 83
Aug 30 17:02:18 hayes-01 kernel: vgextend[23039]: segfault at 34 ip 0000558bb6b2ab29 sp 00007fffb8ad3800 error 4 in lvm[558bb6a8f000+20f000]
Aug 30 17:02:18 hayes-01 kernel: Code: a4 0e 00 50 52 ba 5e 00 00 00 eb 99 0f 1f 44 00 00 f3 0f 1e fa 41 56 49 89 f6 41 55 49 89 d5 41 54 55 53 48 89 fb 48 83 ec 10 <44> 8b 67 34 64 48 8b 04 25 28 00 00 00 48 89 44 24 08 31 c0 48 83
Aug 30 17:02:18 hayes-01 systemd[1]: Started Process Core Dump (PID 23045/UID 0).
Aug 30 17:02:18 hayes-01 systemd-coredump[23046]: Process 23039 (vgextend) of user 0 dumped core.

Stack trace of thread 23039:
#0  0x0000558bb6b2ab29 dev_get_direct_block_sizes (lvm)
#1  0x0000558bb6b6b6f2 vg_extend_each_pv (lvm)
#2  0x0000558bb6b0963d _vgextend_single (lvm)
#3  0x0000558bb6afddea _process_vgnameid_list (lvm)
#4  0x0000558bb6b09bb5 vgextend (lvm)
#5  0x0000558bb6ae5435 lvm_run_command (lvm)
#6  0x0000558bb6ae6763 lvm2_main (lvm)
#7  0x00007f3f5901c873 __libc_start_main (libc.so.6)
#8  0x0000558bb6ac2c9e _start (lvm)
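
For a fuller backtrace from such a core, a sketch assuming systemd-coredump is in use (as the log above shows) and debuginfo packages are available:

  dnf debuginfo-install lvm2     # assumption: debuginfo repos are enabled
  coredumpctl list vgextend      # locate the core recorded above (PID 23039 here)
  coredumpctl info 23039         # metadata plus the stored stack trace
  coredumpctl gdb 23039          # open the core in gdb, then run: bt full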


Version-Release number of selected component (if applicable):
kernel-4.18.0-134.el8    BUILT: Thu Aug 15 13:08:19 CDT 2019
lvm2-2.03.05-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
lvm2-libs-2.03.05-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
lvm2-dbusd-2.03.05-4.el8    BUILT: Sun Aug 18 11:46:32 CDT 2019
lvm2-lockd-2.03.05-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
boom-boot-1.0-0.2.20190610git246b116.el8    BUILT: Mon Jun 10 08:22:40 CDT 2019
cmirror-2.03.05-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-libs-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-event-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-event-libs-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-persistent-data-0.8.5-2.el8    BUILT: Wed Jun  5 10:28:04 CDT 2019


How reproducible:
Always (when in this state)

Comment 1 David Teigland 2019-09-03 15:10:52 UTC
It seems pretty likely that it's failing at the spot fixed here, pushed to master:
https://sourceware.org/git/?p=lvm2.git;a=commitdiff;h=98d420200e16b450b6b7e33b83bdf36a59196d6d
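
To inspect that fix locally, a quick sketch (assumption: the standard sourceware clone URL for lvm2):

  git clone https://sourceware.org/git/lvm2.git
  cd lvm2
  git show 98d420200e16b450b6b7e33b83bdf36a59196d6d   # the commit referenced above

The backtrace in the description (dev_get_direct_block_sizes called from vg_extend_each_pv) is consistent with that spot.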

Comment 4 Corey Marthaler 2020-01-06 23:00:01 UTC
Fix verified in the latest rpms.

# BROKEN VERSION:
kernel-4.18.0-167.el8    BUILT: Sat Dec 14 19:43:52 CST 2019
lvm2-2.03.05-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
lvm2-libs-2.03.05-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-libs-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-event-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019
device-mapper-event-libs-1.02.163-4.el8    BUILT: Sun Aug 18 11:44:11 CDT 2019

[root@hayes-02 ~]# lvs -a -o +devices
  WARNING: Couldn't find device with uuid UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: VG raid_sanity is missing PV UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  LV                       VG          Attr       LSize   Pool Origin Data%  Meta%  Move Log Cpy%Sync Convert Devices                                            
  removemissing            raid_sanity Rwi-aor-p- 300.00m                                    100.00           removemissing_rimage_0(0),removemissing_rimage_1(0)
  [removemissing_rimage_0] raid_sanity vwi-aor-r- 300.00m                                                                                                        
  [removemissing_rimage_1] raid_sanity iwi-aor-p- 300.00m                                                     [unknown](1)                                       
  [removemissing_rmeta_0]  raid_sanity ewi-aor-r-   4.00m                                                                                                        
  [removemissing_rmeta_1]  raid_sanity ewi-aor-p-   4.00m                                                     [unknown](0)                                       
[root@hayes-02 ~]# pvscan
  WARNING: Couldn't find device with uuid UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: VG raid_sanity is missing PV UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  PV [unknown]   VG raid_sanity     lvm2 [446.62 GiB / 446.32 GiB free]
  PV /dev/sdn1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sdp1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sdl1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sdi1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdh1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sde1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdd1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdg1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdo1                      lvm2 [446.62 GiB]
  Total: 10 [11.27 TiB] / in use: 9 [<10.84 TiB] / in no VG: 1 [446.62 GiB]
[root@hayes-02 ~]# vgextend raid_sanity /dev/sdk1
  WARNING: Couldn't find device with uuid UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: VG raid_sanity is missing PV UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  Physical volume "/dev/sdk1" successfully created.
  WARNING: Couldn't find device with uuid UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: VG raid_sanity is missing PV UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  WARNING: Couldn't find device with uuid UBl1Gx-FtWs-tLHz-pTh3-fKIu-kGtw-GN8adg.
Segmentation fault (core dumped)




# FIXED VERSION:
kernel-4.18.0-167.el8    BUILT: Sat Dec 14 19:43:52 CST 2019
lvm2-2.03.07-1.el8    BUILT: Mon Dec  2 00:09:32 CST 2019
lvm2-libs-2.03.07-1.el8    BUILT: Mon Dec  2 00:09:32 CST 2019
device-mapper-1.02.167-1.el8    BUILT: Mon Dec  2 00:09:32 CST 2019
device-mapper-libs-1.02.167-1.el8    BUILT: Mon Dec  2 00:09:32 CST 2019
device-mapper-event-1.02.167-1.el8    BUILT: Mon Dec  2 00:09:32 CST 2019
device-mapper-event-libs-1.02.167-1.el8    BUILT: Mon Dec  2 00:09:32 CST 2019

[root@hayes-02 ~]# lvs -a -o +devices
  WARNING: Couldn't find device with uuid IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg.
  WARNING: VG raid_sanity is missing PV IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg (last written to [unknown]).
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  LV                       VG          Attr       LSize   Pool Origin Data%  Meta%  Move Log Cpy%Sync Convert Devices                                            
  removemissing            raid_sanity Rwi-aor-p- 300.00m                                    100.00           removemissing_rimage_0(0),removemissing_rimage_1(0)
  [removemissing_rimage_0] raid_sanity vwi-aor-r- 300.00m                                                                                                        
  [removemissing_rimage_1] raid_sanity iwi-aor-p- 300.00m                                                     [unknown](1)                                       
  [removemissing_rmeta_0]  raid_sanity ewi-aor-r-   4.00m                                                                                                        
  [removemissing_rmeta_1]  raid_sanity ewi-aor-p-   4.00m                                                     [unknown](0)                                       
[root@hayes-02 ~]# pvscan
  WARNING: Couldn't find device with uuid IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg.
  WARNING: VG raid_sanity is missing PV IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg (last written to [unknown]).
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  PV [unknown]   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdm1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sdo1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sdc1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdn1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sde1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdp1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sdi1   VG raid_sanity     lvm2 [<1.82 TiB / <1.82 TiB free]
  PV /dev/sdl1   VG raid_sanity     lvm2 [446.62 GiB / 446.62 GiB free]
  PV /dev/sdb1                      lvm2 [<1.82 TiB]
  Total: 10 [11.27 TiB] / in use: 9 [<9.46 TiB] / in no VG: 1 [<1.82 TiB]
[root@hayes-02 ~]# vgextend raid_sanity /dev/sdk1
  WARNING: Couldn't find device with uuid IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg.
  WARNING: VG raid_sanity is missing PV IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg (last written to [unknown]).
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  Physical volume "/dev/sdk1" successfully created.
  WARNING: Couldn't find device with uuid IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg.
  WARNING: VG raid_sanity is missing PV IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg (last written to [unknown]).
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rimage_1 while checking used and assumed devices.
  WARNING: Couldn't find all devices for LV raid_sanity/removemissing_rmeta_1 while checking used and assumed devices.
  WARNING: Couldn't find device with uuid IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg.
  WARNING: Couldn't find device with uuid IIjf77-AWgE-X6GJ-JCrt-yF9W-hcjP-mx8vQg.
  Volume group "raid_sanity" successfully extended

Comment 6 errata-xmlrpc 2020-04-28 16:58:57 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2020:1881
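
As a quick check against this advisory (sketch; assumes the RHEL 8 repos are configured on the system):

  dnf updateinfo info RHEA-2020:1881   # advisory details, if known to the enabled repos
  rpm -q lvm2                          # the fix shipped in lvm2-2.03.07-1.el8 (see above)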

