+++ This bug was initially created as a clone of Bug #2213691 +++

This bug was initially created as a copy of Bug #2213673

I am copying this bug because:

Description of problem:
Storage role regression testing found this error, please have a look.

Version-Release number of selected component (if applicable):
ansible-core-2.15.0-1.el8.x86_64
rhel-system-roles-1.22.0-0.9.el8.noarch
RHEL-8.9.0-20230603.20 BaseOS x86_64

How reproducible:

Steps to Reproduce:
1. ansible-playbook -vv -i host tests_misc.yml
2. ansible-playbook -vv -i host tests_misc_scsi_generated.yml
3.

Actual results:

Expected results:

Additional info:

TASK [rhel-system-roles.storage : Make sure required packages are installed] ***
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:39
skipping: [localhost] => {"changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False"}

TASK [rhel-system-roles.storage : Get service facts] ***************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:46
skipping: [localhost] => {"changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False"}

TASK [rhel-system-roles.storage : Set storage_cryptsetup_services] *************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:53
ok: [localhost] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}

TASK [rhel-system-roles.storage : Mask the systemd cryptsetup services] ********
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:67
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [rhel-system-roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:73
fatal: [localhost]: FAILED! => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "msg": "cannot remove existing formatting (lvmpv) and/or devices on disk 'sdb' (pool 'foo') in safe mode", "packages": [], "pools": [], "volumes": []}

TASK [rhel-system-roles.storage : Failed message] ******************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:95
fatal: [localhost]: FAILED!
=> {"changed": false, "msg": {"actions": [], "changed": false, "crypts": [], "failed": true, "invocation": {"module_args": {"disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": {"disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": []}, "pools": [{"disks": ["sdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [{"cache_devices": [], "cache_mode": null, "cache_size": null, "cached": null, "compression": null, "deduplication": null, "encryption": null, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 512", "fs_label": "", "fs_type": "ext4", "mount_group": null, "mount_mode": null, "mount_options": null, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_disks": [], "raid_level": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": null, "vdo_pool_size": null}]}], "safe_mode": true, "use_partitions": null, "volume_defaults": {"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null}, "volumes": []}}, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting (lvmpv) and/or devices on disk 'sdb' (pool 'foo') in safe mode", "packages": [], "pools": [], "volumes": []}} TASK [rhel-system-roles.storage : Unmask the systemd cryptsetup services] ****** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:99 skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"} TASK [Check that we failed in the role] **************************************** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:20 ok: [localhost] => { "changed": false, "msg": "All assertions passed" } TASK [Verify the blivet output and error message are correct] ****************** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:25 fatal: [localhost]: FAILED! 
=> { "assertion": "blivet_output.msg is search(__storage_failed_regex)", "changed": false, "evaluated_to": false, "msg": "Unexpected behavior when creating ext4 filesystem with invalid parameter" } PLAY RECAP ********************************************************************* localhost : ok=178 changed=4 unreachable=0 failed=1 skipped=150 rescued=2 ignored=0 STDERR: RETURN:2 https://beaker.engineering.redhat.com/recipes/14051127#task161440046 RHEL9 failed job https://beaker.engineering.redhat.com/recipes/14051634#task161443711
Hi,

I cannot open the brew task link [1].

[1] https://download.devel.redhat.com/brewroot/work/tasks/2305/53682305/
[root@storageqe-104 tests]# cat host
localhost ansible_connection=local

[root@storageqe-104 tests]# ansible-playbook -vv -i host tests_disk_errors.yml

TASK [Store global variable value copy] **********************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:4
fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'storage_safe_mode' is undefined\n\nThe error appears to be in '/usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml': line 4, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n block:\n - name: Store global variable value copy\n ^ here\n"}

TASK [Check that we failed in the role] **********************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:25
ok: [localhost] => {
    "changed": false,
    "msg": "All assertions passed"
}

TASK [Verify the blivet output and error message are correct] ************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:30
fatal: [localhost]: FAILED! => {
    "assertion": "blivet_output.failed",
    "changed": false,
    "evaluated_to": false,
    "msg": "Unexpected behavior w/ multiple disk volumes using the same name"
}

PLAY RECAP ***************************************************************************************************************************************
localhost : ok=30 changed=0 unreachable=0 failed=1 skipped=15 rescued=2 ignored=0

[root@storageqe-104 tests]#

[ERROR][23:11:02]ansible-playbook -vv -i host tests_disk_errors.yml
[ERROR][23:11:27]ansible-playbook -vv -i host tests_disk_errors_scsi_generated.yml
[ERROR][23:21:15]ansible-playbook -vv -i host tests_luks.yml
[ERROR][23:21:32]ansible-playbook -vv -i host tests_luks_pool.yml
[ERROR][23:21:54]ansible-playbook -vv -i host tests_luks_pool_scsi_generated.yml
[ERROR][23:22:07]ansible-playbook -vv -i host tests_luks_scsi_generated.yml
[ERROR][23:22:44]ansible-playbook -vv -i host tests_lvm_auto_size_cap.yml
[ERROR][23:23:03]ansible-playbook -vv -i host tests_lvm_auto_size_cap_scsi_generated.yml
[ERROR][23:23:31]ansible-playbook -vv -i host tests_lvm_errors.yml
[ERROR][23:23:56]ansible-playbook -vv -i host tests_lvm_errors_scsi_generated.yml
[ERROR][00:22:20]ansible-playbook -vv -i host tests_resize.yml
[ERROR][00:27:29]ansible-playbook -vv -i host tests_resize_scsi_generated.yml
[ERROR][00:36:29]ansible-playbook -vv -i host tests_volume_relabel.yml

Almost all of the failed cases report the same error: the variable 'storage_safe_mode' is undefined.
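The failing task name and error suggest the test snapshots the global variable with set_fact before running the role, roughly like this hypothetical reconstruction (only storage_safe_mode and the task name come from the log; the copied-variable name is made up for illustration):

    - name: Store global variable value copy
      set_fact:
        # Templating "{{ storage_safe_mode }}" fails at this point if the
        # variable was never defined anywhere (inventory, extra vars, play vars).
        storage_safe_mode_copy: "{{ storage_safe_mode }}"

Defining the variable for the host, either in the inventory (as the next comment does) or on the command line with -e storage_safe_mode=true, lets this snapshot succeed.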
Hi,

Updated the Ansible inventory host var:

[root@storageqe-104 tests]# cat host
localhost ansible_connection=local storage_safe_mode=true
[root@storageqe-104 tests]#

[PASS][02:15:09]ansible-playbook -vv -i host tests_disk_errors.yml
[PASS][02:16:37]ansible-playbook -vv -i host tests_disk_errors_scsi_generated.yml
[PASS][02:22:36]ansible-playbook -vv -i host tests_luks_pool.yml
[PASS][02:25:27]ansible-playbook -vv -i host tests_luks_pool_scsi_generated.yml
[PASS][02:30:25]ansible-playbook -vv -i host tests_lvm_auto_size_cap.yml
[PASS][02:31:44]ansible-playbook -vv -i host tests_lvm_auto_size_cap_scsi_generated.yml
[PASS][02:32:51]ansible-playbook -vv -i host tests_lvm_errors.yml
[PASS][02:34:04]ansible-playbook -vv -i host tests_lvm_errors_scsi_generated.yml
[PASS][02:44:42]ansible-playbook -vv -i host tests_volume_relabel.yml

[ERROR][02:19:34]ansible-playbook -vv -i host tests_luks.yml
[ERROR][02:28:46]ansible-playbook -vv -i host tests_luks_scsi_generated.yml
[ERROR][02:39:14]ansible-playbook -vv -i host tests_resize.yml
[ERROR][02:43:41]ansible-playbook -vv -i host tests_resize_scsi_generated.yml

ansible-playbook -vv -i host tests_luks.yml

TASK [Verify the current mount state by device] **************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:46
fatal: [localhost]: FAILED! => {
    "assertion": "storage_test_mount_device_matches | length == storage_test_mount_expected_match_count | int",
    "changed": false,
    "evaluated_to": false,
    "msg": "Found unexpected mount state for volume 'foo' device"
}

PLAY RECAP ***************************************************************************************************************************************
localhost : ok=90 changed=9 unreachable=0 failed=2 skipped=29 rescued=1 ignored=0

[root@storageqe-104 tests]# cat host
localhost ansible_connection=local storage_safe_mode=true
[root@storageqe-104 tests]# ansible-playbook -vv -i host tests_luks_scsi_generated.yml

TASK [Set some facts] ****************************************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:16
ok: [localhost] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 116330557, "block_size": 4096, "block_total": 117155654, "block_used": 825097, "device": "/dev/sdb", "fstype": "xfs", "inode_available": 234425725, "inode_total": 234425728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 476489961472, "size_total": 479869558784, "uuid": "21001c92-d6c7-488c-b8b9-61d24c949336"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Get information about the mountpoint directory] ********************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:33
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by device] **************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:46
fatal:
[localhost]: FAILED! => { "assertion": "storage_test_mount_device_matches | length == storage_test_mount_expected_match_count | int", "changed": false, "evaluated_to": false, "msg": "Found unexpected mount state for volume 'foo' device" } PLAY RECAP *************************************************************************************************************************************** localhost : ok=88 changed=5 unreachable=0 failed=2 skipped=33 rescued=1 ignored=0 [root@storageqe-104 tests]# ansible-playbook -vv -i host tests_resize.yml skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [Verify the current mount state by device] ************************************************************************************************** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:46 fatal: [localhost]: FAILED! => { "assertion": "storage_test_mount_device_matches | length == storage_test_mount_expected_match_count | int", "changed": false, "evaluated_to": false, "msg": "Found unexpected mount state for volume 'test1' device" } PLAY RECAP *************************************************************************************************************************************** localhost : ok=95 changed=6 unreachable=0 failed=1 skipped=53 rescued=0 ignored=0 ansible-playbook -vv -i host tests_resize_scsi_generated.yml TASK [rhel-system-roles.storage : Retrieve facts for the /etc/crypttab file] ********************************************************************* task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:182 ok: [localhost] => {"changed": false, "stat": {"atime": 1688627729.8890707, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2ea4af70536d96e7f43b4d9b67fb4e20daa3297d", "ctime": 1688627728.1600769, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 203234253, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1688627728.1600769, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 106, "uid": 0, "version": "3943556224", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}} TASK [rhel-system-roles.storage : Manage /etc/crypttab to account for changes we just made] ****************************************************** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:187 TASK [rhel-system-roles.storage : Update facts] ************************************************************************************************** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:209 ok: [localhost] META: role_complete for localhost TASK [Unreachable task] ************************************************************************************************************************** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:20 fatal: [localhost]: FAILED! 
=> {"changed": false, "msg": "UNREACH"} TASK [Check that we failed in the role] ********************************************************************************************************** task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:25 fatal: [localhost]: FAILED! => { "assertion": "ansible_failed_result.msg != 'UNREACH'", "changed": false, "evaluated_to": false, "msg": "Role has not failed when it should have" } PLAY RECAP *************************************************************************************************************************************** localhost : ok=1535 changed=67 unreachable=0 failed=4 skipped=1526 rescued=4 ignored=0 [root@storageqe-104 tests]#
Hi,

Two cases are still failing.

ansible-playbook -vv -i host tests_resize.yml

TASK [Unreachable task] ********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:24
fatal: [localhost]: FAILED! => {"changed": false, "msg": "UNREACH"}

TASK [Check that we failed in the role] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:29
fatal: [localhost]: FAILED! => {
    "assertion": "ansible_failed_result.msg != 'UNREACH'",
    "changed": false,
    "evaluated_to": false,
    "msg": "Role has not failed when it should have"
}

PLAY RECAP *********************************************************************
localhost : ok=1493 changed=27 unreachable=0 failed=4 skipped=1566 rescued=4 ignored=0

STDERR:
[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current version: 3.6.8 (default, Jun 26 2023, 17:47:38) [GCC 8.5.0 20210514 (Red Hat 8.5.0-20)]. This feature will be removed from ansible-core in version 2.12. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
/usr/local/lib/python3.6/site-packages/ansible/parsing/vault/__init__.py:44: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography. The next release of cryptography will remove support for Python 3.6. from cryptography.exceptions import InvalidSignature
[WARNING]: An error occurred while calling ansible.utils.display.initialize_locale (unsupported locale setting). This may result in incorrectly calculated text widths that can cause Display to print incorrect line lengths

RETURN:2

ansible-playbook -vv -i host tests_resize_scsi_generated.yml

TASK [Check that we failed in the role] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:29
fatal: [localhost]: FAILED! => {
    "assertion": "ansible_failed_result.msg != 'UNREACH'",
    "changed": false,
    "evaluated_to": false,
    "msg": "Role has not failed when it should have"
}

PLAY RECAP *********************************************************************
localhost : ok=1495 changed=27 unreachable=0 failed=4 skipped=1566 rescued=4 ignored=0

STDERR:
[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current version: 3.6.8 (default, Jun 26 2023, 17:47:38) [GCC 8.5.0 20210514 (Red Hat 8.5.0-20)]. This feature will be removed from ansible-core in version 2.12. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
/usr/local/lib/python3.6/site-packages/ansible/parsing/vault/__init__.py:44: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography. The next release of cryptography will remove support for Python 3.6. from cryptography.exceptions import InvalidSignature
[WARNING]: An error occurred while calling ansible.utils.display.initialize_locale (unsupported locale setting). This may result in incorrectly calculated text widths that can cause Display to print incorrect line lengths

RETURN:2
Tests pass with rhel-system-roles-1.22.0-0.13.10.el8.noarch.