Note: This bug is displayed in read-only format because
the product is no longer active in Red Hat Bugzilla.
Description of problem:
[RHEL9][RHEL-system-roles] test tests_lvm_pool_members_scsi_generated.yml failed

Version-Release number of selected component (if applicable):
rhel-system-roles-1.20.0-1.el9.noarch

How reproducible:
100%

Steps to Reproduce:
1. Create a local inventory:
   # cat inventory
   localhost ansible_connection=local
2. Run the test playbook:
   # ansible-playbook -i inventory tests/tests_lvm_pool_members_scsi_generated.yml

Actual results:
The "manage the pools and volumes to match the specified state" task fails (see the log excerpt below).

Expected results:
The test playbook completes without failures.

Additional info:
Here is the disk status before/after the tests.

Before the tests:

[root@storageqe-62 rhel-system-roles.storage]# lsblk
NAME                          MAJ:MIN RM   SIZE RO TYPE MOUNTPOINTS
sda                             8:0    0 279.4G  0 disk
sdb                             8:16   0 279.4G  0 disk
├─sdb1                          8:17   0   600M  0 part /boot/efi
├─sdb2                          8:18   0     1G  0 part /boot
└─sdb3                          8:19   0 277.8G  0 part
  ├─rhel_storageqe--62-root   253:0    0    70G  0 lvm  /
  ├─rhel_storageqe--62-swap   253:1    0   7.8G  0 lvm  [SWAP]
  └─rhel_storageqe--62-home   253:2    0   200G  0 lvm  /home
sdc                             8:32   0 186.3G  0 disk
sdd                             8:48   0 111.8G  0 disk
sde                             8:64   0 111.8G  0 disk
sdf                             8:80   0 931.5G  0 disk
sdg                             8:96   0 931.5G  0 disk
sdh                             8:112  0 931.5G  0 disk
sdi                             8:128  0 279.4G  0 disk
sdj                             8:144  0 279.4G  0 disk
sdk                             8:160  0 931.5G  0 disk
sdl                             8:176  0 931.5G  0 disk
nvme0n1                       259:0    0 894.3G  0 disk

After the tests:

[root@storageqe-62 rhel-system-roles.storage]# cat after.log
NAME                          MAJ:MIN RM   SIZE RO TYPE MOUNTPOINTS
sda                             8:0    0 279.4G  0 disk
└─sda1                          8:1    0 279.4G  0 part
  └─luks-5241e03a-f185-42ac-ad09-49f6f633ee57 253:12 0 279.4G 0 crypt
sdb                             8:16   0 279.4G  0 disk
├─sdb1                          8:17   0   600M  0 part /boot/efi
├─sdb2                          8:18   0     1G  0 part /boot
└─sdb3                          8:19   0 277.8G  0 part
  ├─rhel_storageqe--62-root   253:0    0    70G  0 lvm  /
  ├─rhel_storageqe--62-swap   253:1    0   7.8G  0 lvm  [SWAP]
  └─rhel_storageqe--62-home   253:2    0   200G  0 lvm  /home
sdc                             8:32   0 186.3G  0 disk
└─sdc1                          8:33   0 186.3G  0 part
  └─luks-1723165e-19a2-4779-9481-8b2abcb2b133 253:11 0 186.3G 0 crypt
sdd                             8:48   0 111.8G  0 disk
└─sdd1                          8:49   0 111.8G  0 part
  └─luks-da53b35e-286c-4fbe-8c99-73ddbe5a4a4e 253:10 0 111.8G 0 crypt
sde                             8:64   0 111.8G  0 disk
└─sde1                          8:65   0 111.8G  0 part
  └─luks-29b56314-4c35-4409-9ebe-4401d6ad78bc 253:9 0 111.8G 0 crypt
sdf                             8:80   0 931.5G  0 disk
└─sdf1                          8:81   0 931.5G  0 part
  └─luks-baa394a4-a970-48b0-b2f6-2da30dde0e8f 253:8 0 931.5G 0 crypt
sdg                             8:96   0 931.5G  0 disk
└─sdg1                          8:97   0 931.5G  0 part
  └─luks-152d7223-1f39-4848-9f24-bf3c87165fe3 253:7 0 931.5G 0 crypt
sdh                             8:112  0 931.5G  0 disk
└─sdh1                          8:113  0 931.5G  0 part
  └─luks-c13371e1-0227-420c-955a-1a35b201cadf 253:6 0 931.5G 0 crypt
sdi                             8:128  0 279.4G  0 disk
└─sdi1                          8:129  0 279.4G  0 part
  └─luks-bf2507aa-82a3-4015-ad14-13da5bbd5ad6 253:5 0 279.4G 0 crypt
sdj                             8:144  0 279.4G  0 disk
└─sdj1                          8:145  0 279.4G  0 part
  └─luks-d7d5f582-33be-42ee-98f1-7d5b1f708709 253:4 0 279.4G 0 crypt
sdk                             8:160  0 931.5G  0 disk
└─sdk1                          8:161  0 931.5G  0 part
  └─luks-7ea48c0d-90c5-4d7a-aa55-6b7e7b2ca6fd 253:3 0 931.5G 0 crypt
sdl                             8:176  0 931.5G  0 disk
nvme0n1                       259:0    0 894.3G  0 disk

Part of the failed log:

TASK [rhel-system-roles.storage : Mask the systemd cryptsetup services] ********

TASK [rhel-system-roles.storage : manage the pools and volumes to match the specified state] ***
fatal: [localhost]: FAILED! => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "msg": "Failed to commit changes to disk: Process reported exit code 5: Device not found for /dev/mapper/luks-sdk1.\n Device not found for /dev/mapper/luks-sde1.\n Device not found for /dev/mapper/luks-sdc1.\n Device not found for /dev/mapper/luks-sdf1.\n Device not found for /dev/mapper/luks-sdj1.\n Device not found for /dev/mapper/luks-sdi1.\n Device not found for /dev/mapper/luks-sdh1.\n Device not found for /dev/mapper/luks-sda1.\n Device not found for /dev/mapper/luks-sdd1.\n Device not found for /dev/mapper/luks-sdg1.\n Devices have inconsistent logical block sizes (512 and 4096).\n", "packages": ["xfsprogs", "dosfstools", "lvm2", "cryptsetup"], "pools": [], "volumes": []}

TASK [rhel-system-roles.storage : failed message] ******************************
fatal: [localhost]: FAILED!
=> {"changed": false, "msg": {"actions": [], "changed": false, "crypts": [], "failed": true,
    "invocation": {"module_args": {"disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false,
        "pool_defaults": {"disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": []},
        "pools": [{"disks": ["sda", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj", "sdk"], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": []}],
        "safe_mode": false, "use_partitions": true,
        "volume_defaults": {"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null},
        "volumes": []}},
    "leaves": [], "mounts": [],
    "msg": "Failed to commit changes to disk: Process reported exit code 5: Device not found for /dev/mapper/luks-sdk1.\n Device not found for /dev/mapper/luks-sde1.\n Device not found for /dev/mapper/luks-sdc1.\n Device not found for /dev/mapper/luks-sdf1.\n Device not found for /dev/mapper/luks-sdj1.\n Device not found for /dev/mapper/luks-sdi1.\n Device not found for /dev/mapper/luks-sdh1.\n Device not found for /dev/mapper/luks-sda1.\n Device not found for /dev/mapper/luks-sdd1.\n Device not found for /dev/mapper/luks-sdg1.\n Devices have inconsistent logical block sizes (512 and 4096).\n",
    "packages": ["xfsprogs", "dosfstools", "lvm2", "cryptsetup"], "pools": [], "volumes": []}}

TASK [rhel-system-roles.storage : Unmask the systemd cryptsetup services] ******

PLAY RECAP *********************************************************************
localhost                  : ok=340  changed=9  unreachable=0  failed=1  skipped=261  rescued=1  ignored=0
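Note on the final error line: "Devices have inconsistent logical block sizes (512 and 4096)" suggests the test's automatic disk selection put 512-byte and 4096-byte sector disks into the same LVM pool. As a diagnostic sketch (the `pick_uniform_disks` helper below is hypothetical, not part of rhel-system-roles), the output of `lsblk -dn -o NAME,LOG-SEC` on the test machine could be filtered down to disks that share the majority sector size before handing them to the role:

```shell
#!/bin/sh
# pick_uniform_disks: read "NAME LOG-SEC" pairs (the format printed by
# `lsblk -dn -o NAME,LOG-SEC`) and print only the disks whose logical
# sector size is the most common one, so a test pool never mixes
# 512-byte and 4096-byte devices. Hypothetical helper for illustration.
pick_uniform_disks() {
    awk '{ count[$2]++; name[NR] = $1; size[NR] = $2 }
         END {
             best = 0
             for (s in count) if (count[s] > count[best]) best = s
             for (i = 1; i <= NR; i++) if (size[i] == best) print name[i]
         }'
}

# Example with made-up sizes: sdl reports 4096-byte sectors, so only
# sda and sdc are printed.
pick_uniform_disks <<'EOF'
sda 512
sdc 512
sdl 4096
EOF
```

On the real host the same filter would run as `lsblk -dn -o NAME,LOG-SEC | pick_uniform_disks`.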
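The "after the tests" lsblk output also shows the run left its auto-created `luks-<UUID>` mappings behind on sda and sdc through sdk. A sketch for spotting those leftovers in `dmsetup ls` output (the prefix match is an assumption based on the names above; actually removing them with `cryptsetup close` is destructive and should only be done on the test machine):

```shell
#!/bin/sh
# list_test_luks: from `dmsetup ls`-style output, print only the
# device-mapper names starting with "luks-", i.e. the mappings the
# storage role test created, leaving the rhel_* root/swap/home LVs
# alone. Illustrative helper, not part of rhel-system-roles.
list_test_luks() {
    awk '$1 ~ /^luks-/ { print $1 }'
}

# Sample input mimicking `dmsetup ls` on the host above; each printed
# name could then be passed to `cryptsetup close <name>` by hand.
list_test_luks <<'EOF'
luks-5241e03a-f185-42ac-ad09-49f6f633ee57	(253:12)
rhel_storageqe--62-root	(253:0)
EOF
```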