Bug 2177934 - [RHEL9] AttributeError: 'DiskLabel' object has no attribute 'cipher' [NEEDINFO]
Keywords:
Status: CLOSED WORKSFORME
Alias: None
Product: Red Hat Enterprise Linux 9
Classification: Red Hat
Component: rhel-system-roles
Version: 9.2
Hardware: Unspecified
OS: Unspecified
Priority: low
Severity: low
Target Milestone: rc
Assignee: Rich Megginson
QA Contact: CS System Management SST QE
URL:
Whiteboard: role:storage
Depends On:
Blocks:
 
Reported: 2023-03-14 00:55 UTC by guazhang@redhat.com
Modified: 2023-07-19 18:39 UTC
CC List: 4 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2023-07-19 18:39:16 UTC
Type: Bug
Target Upstream Version:
Embargoed:
rmeggins: needinfo? (kzak)




Links
System ID Private Priority Status Summary Last Updated
Red Hat Issue Tracker RHELPLAN-151660 0 None None None 2023-03-14 00:55:58 UTC

Description guazhang@redhat.com 2023-03-14 00:55:18 UTC
Description of problem:
rhel-system-roles regression testing found this error; it looks the same as RHEL 8 bug https://bugzilla.redhat.com/show_bug.cgi?id=2059426.

Version-Release number of selected component (if applicable):
ansible-core-2.14.2-4.el9.x86_64 
rhel-system-roles-1.21.0-2.el9.noarch
util-linux-2.37.4-10.el9.x86_64


How reproducible:


Steps to Reproduce:
1. ansible-playbook -vv -i host tests_luks_pool_nvme_generated.yml

Actual results:
regression failed 

Expected results:
pass

Additional info:

TASK [rhel-system-roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:73
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'DiskLabel' object has no attribute 'cipher'
fatal: [localhost]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n  File \"/root/.ansible/tmp/ansible-tmp-1678724124.463746-227413-163344926299701/AnsiballZ_blivet.py\", line 107, in <module>\n    _ansiballz_main()\n  File \"/root/.ansible/tmp/ansible-tmp-1678724124.463746-227413-163344926299701/AnsiballZ_blivet.py\", line 99, in _ansiballz_main\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n  File \"/root/.ansible/tmp/ansible-tmp-1678724124.463746-227413-163344926299701/AnsiballZ_blivet.py\", line 47, in invoke_module\n    runpy.run_module(mod_name='ansible.modules.blivet', init_globals=dict(_module_fqn='ansible.modules.blivet', _modlib_path=modlib_path),\n  File \"/usr/lib64/python3.9/runpy.py\", line 225, in run_module\n    return _run_module_code(code, init_globals, run_name, mod_spec)\n  File \"/usr/lib64/python3.9/runpy.py\", line 97, in _run_module_code\n    _run_code(code, mod_globals, init_globals,\n  File \"/usr/lib64/python3.9/runpy.py\", line 87, in _run_code\n    exec(code, run_globals)\n  File \"/tmp/ansible_blivet_payload_fc5hu4cb/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1930, in <module>\n  File \"/tmp/ansible_blivet_payload_fc5hu4cb/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1926, in main\n  File \"/tmp/ansible_blivet_payload_fc5hu4cb/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1878, in run_module\n  File \"/tmp/ansible_blivet_payload_fc5hu4cb/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1529, in manage_pool\n  File \"/tmp/ansible_blivet_payload_fc5hu4cb/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1257, in manage\n  File \"/tmp/ansible_blivet_payload_fc5hu4cb/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1181, in _apply_defaults\n  File \"/tmp/ansible_blivet_payload_fc5hu4cb/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1155, in _update_from_device\nAttributeError: 'DiskLabel' 
object has no attribute 'cipher'\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
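For context, the traceback boils down to code that assumes the device's format is LUKS (which carries a `cipher` attribute) when blivet has actually built a `DiskLabel` (a partition table) because blkid reported an Atari partition table on the device. A minimal sketch of that failure mode, using simplified stand-in classes rather than the real blivet API:

```python
# Hypothetical sketch of the failure mode; class and attribute names
# mirror the traceback but are simplified stand-ins, NOT the actual
# blivet / rhel-system-roles code.

class DiskLabel:
    """Stand-in for blivet's DiskLabel format (a partition table)."""
    type = "disklabel"

class LUKS:
    """Stand-in for blivet's LUKS format, which carries crypto settings."""
    type = "luks"
    cipher = "aes-xts-plain64"

def read_cipher_unsafe(fmt):
    # Mirrors the failing access: assumes fmt is LUKS.
    return fmt.cipher

def read_cipher_safe(fmt):
    # Defensive variant: return None when the format was misdetected
    # (e.g. blkid reported "atari" on a LUKS device, so blivet built
    # a DiskLabel instead of a LUKS format).
    return getattr(fmt, "cipher", None)

print(read_cipher_safe(LUKS()))       # the cipher string
print(read_cipher_safe(DiskLabel()))  # None instead of AttributeError
```

The safe variant only papers over the symptom; the underlying issue is the misdetection in libblkid, as the following comments discuss.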

https://beaker.engineering.redhat.com/recipes/13534049#task157397050,task157397052,task157397054,task157397055
http://lab-04.rhts.eng.pek2.redhat.com/beaker/logs/tasks/157397+/157397052/taskout.log

Comment 1 Rich Megginson 2023-03-14 14:09:29 UTC
@vtrefny did something change in rhel9?

Comment 2 Vojtech Trefny 2023-03-22 10:47:03 UTC
(In reply to Rich Megginson from comment #1)
> @vtrefny did something change in rhel9?

Not in Blivet. I thought all the Atari-related fixes for libblkid were present in RHEL 9, but we still see a random Atari partition table detected on LUKS:

 'DEVNAME': '/dev/nvme2n1',
 'DEVPATH': '/devices/pci0000:ae/0000:ae:01.0/0000:b0:00.0/nvme/nvme2/nvme2n1',
 'DEVTYPE': 'disk',
 'DISKSEQ': '1',
 'DM_MULTIPATH_DEVICE_PATH': '0',
 'ID_FS_TYPE': 'crypto_LUKS',
 'ID_FS_USAGE': 'crypto',
 'ID_FS_UUID': '73c1535a-31a9-4ed2-a8f1-60f0895466bd',
 'ID_FS_UUID_ENC': '73c1535a-31a9-4ed2-a8f1-60f0895466bd',
 'ID_FS_VERSION': '2',
 'ID_MODEL': 'Samsung SSD 983 DCT 960GB',
 'ID_PART_TABLE_TYPE': 'atari',

@kzak this fix[1] should be available in 9.2, right (util-linux 2.37.4)? Are there some additional fixes needed in RHEL 9 or is this a new issue?

[1] https://github.com/util-linux/util-linux/commit/282ceadc3a72fc07dd0388b8880fd751490bb87f
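The misdetection can be checked directly with blkid's low-level probe, independently of Ansible. A sketch, assuming util-linux's blkid is installed; a scratch image file stands in here for the real device (`/dev/nvme2n1` on the affected machine), so the commands run without special hardware:

```shell
# Probe what libblkid itself detects, bypassing the udev cache.
# On the affected machine, run this against the real device instead
# (e.g. blkid -p -o export /dev/nvme2n1); the bad case would show
# PTTYPE=atari alongside or instead of TYPE=crypto_LUKS.
DEV=./scratch.img
truncate -s 16M "$DEV"

blkid -p -o export "$DEV" || true   # empty image: nothing is detected

# On a real block device, compare with udev's cached view:
#   udevadm info --query=property /dev/nvme2n1 | grep -E 'ID_FS_TYPE|ID_PART_TABLE_TYPE'

rm -f "$DEV"
```

Comparing the low-level probe against the udev properties shows whether the stray `ID_PART_TABLE_TYPE=atari` comes from the current prober or from a stale udev database entry.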

Comment 4 Karel Zak 2023-04-14 08:25:43 UTC
(In reply to Vojtech Trefny from comment #2)
> @kzak this fix[1] should be available in 9.2, right (util-linux
> 2.37.4)? Are there some additional fixes needed in RHEL 9 or is this a new
> issue?

The fix is in RHEL 9; there is no difference between the upstream and RHEL 9 atari prober code.

It would be nice to have an image (or some other simple way) to reproduce the issue locally.

Comment 5 Rich Megginson 2023-04-14 17:03:34 UTC
(In reply to Karel Zak from comment #4)
> (In reply to Vojtech Trefny from comment #2)
> > @kzak this fix[1] should be available in 9.2, right (util-linux
> > 2.37.4)? Are there some additional fixes needed in RHEL 9 or is this a new
> > issue?
> 
> The fix is in RHEL-9, there is no difference between upstream and rhel-9
> atari prober code.
> 
> It would be nice to have the image (or any other simple way) to reproduce
> the issue locally.

@guazhang - can you help Karel provision a machine which demonstrates this problem?  We don't see this issue in our usual baseos ci, or using qemu/kvm as in https://source.redhat.com/communities/communities_of_practice/infrastructure/rhel-ecosystem/rhelsystemroles/rhel_system_roles_wiki/rhel_system_roles_onboarding_and_development_process#running-ci-tests-locally

