Bug 2218899 - [RHEL8] Unexpected behavior when creating ext4 filesystem with invalid parameter
Summary: [RHEL8] Unexpected behavior when creating ext4 filesystem with invalid parameter
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: rhel-system-roles
Version: 8.9
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: rc
Target Release: 8.9
Assignee: Rich Megginson
QA Contact: guazhang@redhat.com
URL:
Whiteboard: role:storage
Depends On: 2213691
Blocks:
 
Reported: 2023-06-30 13:43 UTC by Rich Megginson
Modified: 2023-11-14 16:42 UTC
CC List: 5 users

Fixed In Version: rhel-system-roles-1.22.0-0.14.el8
Doc Type: No Doc Update
Doc Text:
Clone Of: 2213691
Environment:
Last Closed: 2023-11-14 15:31:21 UTC
Type: Bug
Target Upstream Version:
Embargoed:
pm-rhel: mirror+




Links
- GitHub linux-system-roles/storage pull 367 (Merged): fix: Test issue when creating fs /w invalid param (last updated 2023-06-30 13:43:13 UTC)
- Red Hat Issue Tracker RHELPLAN-161346 (last updated 2023-06-30 13:44:08 UTC)
- Red Hat Product Errata RHEA-2023:6946 (last updated 2023-11-14 15:31:37 UTC)

Description Rich Megginson 2023-06-30 13:43:13 UTC
+++ This bug was initially created as a clone of Bug #2213691 +++

This bug was initially created as a copy of Bug #2213673

I am copying this bug because: 



Description of problem:
The storage role regression tests found this error; please take a look.

Version-Release number of selected component (if applicable):
ansible-core-2.15.0-1.el8.x86_64
rhel-system-roles-1.22.0-0.9.el8.noarch 
RHEL-8.9.0-20230603.20 BaseOS x86_64

How reproducible:


Steps to Reproduce:
1. ansible-playbook -vv -i host tests_misc.yml
2. ansible-playbook -vv -i host tests_misc_scsi_generated.yml
3.

Actual results:


Expected results:


Additional info:


TASK [rhel-system-roles.storage : Make sure required packages are installed] ***
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:39
skipping: [localhost] => {"changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False"}

TASK [rhel-system-roles.storage : Get service facts] ***************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:46
skipping: [localhost] => {"changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False"}

TASK [rhel-system-roles.storage : Set storage_cryptsetup_services] *************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:53
ok: [localhost] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}

TASK [rhel-system-roles.storage : Mask the systemd cryptsetup services] ********
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:67
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [rhel-system-roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:73
fatal: [localhost]: FAILED! => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "msg": "cannot remove existing formatting (lvmpv) and/or devices on disk 'sdb' (pool 'foo') in safe mode", "packages": [], "pools": [], "volumes": []}

TASK [rhel-system-roles.storage : Failed message] ******************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:95
fatal: [localhost]: FAILED! => {"changed": false, "msg": {"actions": [], "changed": false, "crypts": [], "failed": true, "invocation": {"module_args": {"disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": {"disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": []}, "pools": [{"disks": ["sdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [{"cache_devices": [], "cache_mode": null, "cache_size": null, "cached": null, "compression": null, "deduplication": null, "encryption": null, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 512", "fs_label": "", "fs_type": "ext4", "mount_group": null, "mount_mode": null, "mount_options": null, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_disks": [], "raid_level": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": null, "vdo_pool_size": null}]}], "safe_mode": true, "use_partitions": null, "volume_defaults": {"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null}, "volumes": []}}, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting (lvmpv) and/or devices on disk 'sdb' (pool 'foo') in safe mode", "packages": [], "pools": [], "volumes": []}}

TASK [rhel-system-roles.storage : Unmask the systemd cryptsetup services] ******
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:99
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [Check that we failed in the role] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:20
ok: [localhost] => {
    "changed": false,
    "msg": "All assertions passed"
}

TASK [Verify the blivet output and error message are correct] ******************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:25
fatal: [localhost]: FAILED! => {
    "assertion": "blivet_output.msg is search(__storage_failed_regex)",
    "changed": false,
    "evaluated_to": false,
    "msg": "Unexpected behavior when creating ext4 filesystem with invalid parameter"
}

PLAY RECAP *********************************************************************
localhost                  : ok=178  changed=4    unreachable=0    failed=1    skipped=150  rescued=2    ignored=0   
STDERR:
RETURN:2
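
For reference, the volume definition that triggers this failure can be read out of the module_args in the failure above; expressed as role input it is roughly the following (a simplified sketch, field names taken from the log):

    # Simplified from the module_args in the failure above. The invalid
    # parameter is fs_create_options "-Fb 512": -b 512 is not a valid
    # ext4 block size, so mkfs.ext4 is expected to reject it.
    storage_pools:
      - name: foo
        disks:
          - sdb
        volumes:
          - name: test1
            size: 4g
            fs_type: ext4
            fs_create_options: "-Fb 512"
            mount_point: /opt/test1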

https://beaker.engineering.redhat.com/recipes/14051127#task161440046



RHEL9 failed job
https://beaker.engineering.redhat.com/recipes/14051634#task161443711
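
The check that fails is the assertion in verify-role-failed.yml comparing the role's error message against an expected pattern. A minimal sketch of its shape, reconstructed from the assertion text in the log (__storage_failed_regex is assumed to be set earlier in the test, and blivet_output to be the registered role result):

    # Sketch reconstructed from the failing assertion in the log above.
    - name: Verify the blivet output and error message are correct
      assert:
        that:
          - blivet_output.failed
          - blivet_output.msg is search(__storage_failed_regex)
        msg: Unexpected behavior when creating ext4 filesystem with invalid parameter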

Comment 2 guazhang@redhat.com 2023-07-05 07:07:36 UTC
Hi,

I cannot open the brew task link[1].

[1]https://download.devel.redhat.com/brewroot/work/tasks/2305/53682305/

Comment 6 guazhang@redhat.com 2023-07-06 05:23:38 UTC
[root@storageqe-104 tests]# cat host 
localhost  ansible_connection=local
[root@storageqe-104 tests]# ansible-playbook -vv -i host tests_disk_errors.yml

TASK [Store global variable value copy] **********************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:4
fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'storage_safe_mode' is undefined\n\nThe error appears to be in '/usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml': line 4, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n  block:\n    - name: Store global variable value copy\n      ^ here\n"}

TASK [Check that we failed in the role] **********************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:25
ok: [localhost] => {
    "changed": false,
    "msg": "All assertions passed"
}

TASK [Verify the blivet output and error message are correct] ************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:30
fatal: [localhost]: FAILED! => {
    "assertion": "blivet_output.failed",
    "changed": false,
    "evaluated_to": false,
    "msg": "Unexpected behavior w/ multiple disk volumes using the same name"
}

PLAY RECAP ***************************************************************************************************************************************
localhost                  : ok=30   changed=0    unreachable=0    failed=1    skipped=15   rescued=2    ignored=0   

[root@storageqe-104 tests]# 


[ERROR][23:11:02]ansible-playbook -vv -i host tests_disk_errors.yml
[ERROR][23:11:27]ansible-playbook -vv -i host tests_disk_errors_scsi_generated.yml
[ERROR][23:21:15]ansible-playbook -vv -i host tests_luks.yml
[ERROR][23:21:32]ansible-playbook -vv -i host tests_luks_pool.yml
[ERROR][23:21:54]ansible-playbook -vv -i host tests_luks_pool_scsi_generated.yml
[ERROR][23:22:07]ansible-playbook -vv -i host tests_luks_scsi_generated.yml
[ERROR][23:22:44]ansible-playbook -vv -i host tests_lvm_auto_size_cap.yml
[ERROR][23:23:03]ansible-playbook -vv -i host tests_lvm_auto_size_cap_scsi_generated.yml
[ERROR][23:23:31]ansible-playbook -vv -i host tests_lvm_errors.yml
[ERROR][23:23:56]ansible-playbook -vv -i host tests_lvm_errors_scsi_generated.yml
[ERROR][00:22:20]ansible-playbook -vv -i host tests_resize.yml
[ERROR][00:27:29]ansible-playbook -vv -i host tests_resize_scsi_generated.yml
[ERROR][00:36:29]ansible-playbook -vv -i host tests_volume_relabel.yml


Almost all of the failed cases report the same error: the variable 'storage_safe_mode' is undefined.
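
One way to make the test tolerate a missing inventory variable would be to apply a default when copying it; a hypothetical sketch only, not necessarily the shipped fix:

    # Hypothetical guard for the undefined-variable failure above: fall
    # back to the role's safe-mode default (true) when the inventory
    # does not define storage_safe_mode.
    - name: Store global variable value copy
      set_fact:
        storage_safe_mode_global: "{{ storage_safe_mode | default(true) }}"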

Comment 7 guazhang@redhat.com 2023-07-06 07:37:43 UTC
Hi,

I updated the Ansible inventory host variable:

[root@storageqe-104 tests]# cat host 
localhost  ansible_connection=local storage_safe_mode=true
[root@storageqe-104 tests]# 

[PASS][02:15:09]ansible-playbook -vv -i host tests_disk_errors.yml
[PASS][02:16:37]ansible-playbook -vv -i host tests_disk_errors_scsi_generated.yml
[PASS][02:22:36]ansible-playbook -vv -i host tests_luks_pool.yml
[PASS][02:25:27]ansible-playbook -vv -i host tests_luks_pool_scsi_generated.yml
[PASS][02:30:25]ansible-playbook -vv -i host tests_lvm_auto_size_cap.yml
[PASS][02:31:44]ansible-playbook -vv -i host tests_lvm_auto_size_cap_scsi_generated.yml
[PASS][02:32:51]ansible-playbook -vv -i host tests_lvm_errors.yml
[PASS][02:34:04]ansible-playbook -vv -i host tests_lvm_errors_scsi_generated.yml
[PASS][02:44:42]ansible-playbook -vv -i host tests_volume_relabel.yml
[ERROR][02:19:34]ansible-playbook -vv -i host tests_luks.yml
[ERROR][02:28:46]ansible-playbook -vv -i host tests_luks_scsi_generated.yml
[ERROR][02:39:14]ansible-playbook -vv -i host tests_resize.yml
[ERROR][02:43:41]ansible-playbook -vv -i host tests_resize_scsi_generated.yml


ansible-playbook -vv -i host tests_luks.yml


TASK [Verify the current mount state by device] **************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:46
fatal: [localhost]: FAILED! => {
    "assertion": "storage_test_mount_device_matches | length == storage_test_mount_expected_match_count | int",
    "changed": false,
    "evaluated_to": false,
    "msg": "Found unexpected mount state for volume 'foo' device"
}

PLAY RECAP ***************************************************************************************************************************************
localhost                  : ok=90   changed=9    unreachable=0    failed=2    skipped=29   rescued=1    ignored=0   

[root@storageqe-104 tests]# cat host 
localhost  ansible_connection=local storage_safe_mode=true
[root@storageqe-104 tests]# 




ansible-playbook -vv -i host tests_luks_scsi_generated.yml

TASK [Set some facts] ****************************************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:16
ok: [localhost] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 116330557, "block_size": 4096, "block_total": 117155654, "block_used": 825097, "device": "/dev/sdb", "fstype": "xfs", "inode_available": 234425725, "inode_total": 234425728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 476489961472, "size_total": 479869558784, "uuid": "21001c92-d6c7-488c-b8b9-61d24c949336"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Get information about the mountpoint directory] ********************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:33
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by device] **************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:46
fatal: [localhost]: FAILED! => {
    "assertion": "storage_test_mount_device_matches | length == storage_test_mount_expected_match_count | int",
    "changed": false,
    "evaluated_to": false,
    "msg": "Found unexpected mount state for volume 'foo' device"
}

PLAY RECAP ***************************************************************************************************************************************
localhost                  : ok=88   changed=5    unreachable=0    failed=2    skipped=33   rescued=1    ignored=0   

[root@storageqe-104 tests]# 
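
The mount check in test-verify-volume-mount.yml builds a list of mount-fact entries matching the expected device and asserts on its length. Roughly (a sketch; storage_test_device_path is an assumed name for the expected device node):

    # Sketch of the mount-state check; selectattr filters the mount
    # facts down to entries whose device matches the expected node.
    - name: Set some facts
      set_fact:
        storage_test_mount_device_matches: "{{ ansible_facts.mounts |
          selectattr('device', 'equalto', storage_test_device_path) | list }}"
        storage_test_mount_expected_match_count: "1"

    - name: Verify the current mount state by device
      assert:
        that:
          - storage_test_mount_device_matches | length == storage_test_mount_expected_match_count | int
        msg: Found unexpected mount state for volume device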


 ansible-playbook -vv -i host tests_resize.yml

skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by device] **************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-volume-mount.yml:46
fatal: [localhost]: FAILED! => {
    "assertion": "storage_test_mount_device_matches | length == storage_test_mount_expected_match_count | int",
    "changed": false,
    "evaluated_to": false,
    "msg": "Found unexpected mount state for volume 'test1' device"
}

PLAY RECAP ***************************************************************************************************************************************
localhost                  : ok=95   changed=6    unreachable=0    failed=1    skipped=53   rescued=0    ignored=0   



ansible-playbook -vv -i host tests_resize_scsi_generated.yml

TASK [rhel-system-roles.storage : Retrieve facts for the /etc/crypttab file] *********************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:182
ok: [localhost] => {"changed": false, "stat": {"atime": 1688627729.8890707, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2ea4af70536d96e7f43b4d9b67fb4e20daa3297d", "ctime": 1688627728.1600769, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 203234253, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1688627728.1600769, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 106, "uid": 0, "version": "3943556224", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [rhel-system-roles.storage : Manage /etc/crypttab to account for changes we just made] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:187

TASK [rhel-system-roles.storage : Update facts] **************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:209
ok: [localhost]
META: role_complete for localhost

TASK [Unreachable task] **************************************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:20
fatal: [localhost]: FAILED! => {"changed": false, "msg": "UNREACH"}

TASK [Check that we failed in the role] **********************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:25
fatal: [localhost]: FAILED! => {
    "assertion": "ansible_failed_result.msg != 'UNREACH'",
    "changed": false,
    "evaluated_to": false,
    "msg": "Role has not failed when it should have"
}

PLAY RECAP ***************************************************************************************************************************************
localhost                  : ok=1535 changed=67   unreachable=0    failed=4    skipped=1526 rescued=4    ignored=0   

[root@storageqe-104 tests]#
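
The "Unreachable task" / "Check that we failed in the role" pair above is a block/rescue pattern: the role call is expected to fail before the sentinel task runs, and the rescue then asserts that the failure did not come from the sentinel itself. A minimal sketch, with the structure inferred from the task names and assertions in the log:

    # Sketch of the verify-role-failed.yml pattern, inferred from the log.
    - name: Verify role raises correct error
      block:
        - name: Run the storage role with input expected to fail
          include_role:
            name: rhel-system-roles.storage
        # Sentinel: only reached if the role did NOT fail as expected.
        - name: Unreachable task
          fail:
            msg: UNREACH
      rescue:
        - name: Check that we failed in the role
          assert:
            that:
              - ansible_failed_result.msg != 'UNREACH'
            msg: Role has not failed when it should have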

Comment 10 guazhang@redhat.com 2023-07-07 00:55:27 UTC
Hi, 

Two cases still fail.


ansible-playbook -vv -i host tests_resize.yml 

TASK [Unreachable task] ********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:24
fatal: [localhost]: FAILED! => {"changed": false, "msg": "UNREACH"}

TASK [Check that we failed in the role] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:29
fatal: [localhost]: FAILED! => {
    "assertion": "ansible_failed_result.msg != 'UNREACH'",
    "changed": false,
    "evaluated_to": false,
    "msg": "Role has not failed when it should have"
}

PLAY RECAP *********************************************************************
localhost                  : ok=1493 changed=27   unreachable=0    failed=4    skipped=1566 rescued=4    ignored=0
STDERR:[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the
controller starting with Ansible 2.12. Current version: 3.6.8 (default, Jun 26
2023, 17:47:38) [GCC 8.5.0 20210514 (Red Hat 8.5.0-20)]. This feature will be
removed from ansible-core in version 2.12. Deprecation warnings can be disabled
 by setting deprecation_warnings=False in ansible.cfg.
/usr/local/lib/python3.6/site-packages/ansible/parsing/vault/__init__.py:44: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography. The next release of cryptography will remove support for Python 3.6.
  from cryptography.exceptions import InvalidSignature
[WARNING]: An error occurred while calling
ansible.utils.display.initialize_locale (unsupported locale setting). This may
result in incorrectly calculated text widths that can cause Display to print
incorrect line lengths
RETURN:2



ansible-playbook -vv -i host tests_resize_scsi_generated.yml

TASK [Check that we failed in the role] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-role-failed.yml:29
fatal: [localhost]: FAILED! => {
    "assertion": "ansible_failed_result.msg != 'UNREACH'",
    "changed": false,
    "evaluated_to": false,
    "msg": "Role has not failed when it should have"
}

PLAY RECAP *********************************************************************
localhost                  : ok=1495 changed=27   unreachable=0    failed=4    skipped=1566 rescued=4    ignored=0
STDERR:[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the
controller starting with Ansible 2.12. Current version: 3.6.8 (default, Jun 26
2023, 17:47:38) [GCC 8.5.0 20210514 (Red Hat 8.5.0-20)]. This feature will be
removed from ansible-core in version 2.12. Deprecation warnings can be disabled
 by setting deprecation_warnings=False in ansible.cfg.
/usr/local/lib/python3.6/site-packages/ansible/parsing/vault/__init__.py:44: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography. The next release of cryptography will remove support for Python 3.6.
  from cryptography.exceptions import InvalidSignature
[WARNING]: An error occurred while calling
ansible.utils.display.initialize_locale (unsupported locale setting). This may
result in incorrectly calculated text widths that can cause Display to print
incorrect line lengths
RETURN:2

Comment 12 guazhang@redhat.com 2023-07-08 06:01:14 UTC
The tests pass with package rhel-system-roles-1.22.0-0.13.10.el8.noarch.

Comment 16 errata-xmlrpc 2023-11-14 15:31:21 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (rhel-system-roles bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2023:6946

