Bug 1879785 - [storage] main-blivet.yml:90 test fails when a Stratis pool persists in the system
Summary: [storage] main-blivet.yml:90 test fails when a Stratis pool persists in the system
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: rhel-system-roles
Version: 8.3
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: rc
Target Release: 8.5
Assignee: David Lehman
QA Contact: guazhang@redhat.com
URL:
Whiteboard: role:storage
Depends On:
Blocks:
 
Reported: 2020-09-17 01:54 UTC by guazhang@redhat.com
Modified: 2021-11-09 20:27 UTC
CC List: 6 users

Fixed In Version: rhel-system-roles-1.7.1-1.el8
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-11-09 17:44:38 UTC
Type: Bug
Target Upstream Version:
Embargoed:


Attachments: none


Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHBA-2021:4159 0 None None None 2021-11-09 17:44:53 UTC

Description guazhang@redhat.com 2020-09-17 01:54:28 UTC
Description of problem:
blivet cannot resolve the existing Stratis devices correctly and fails with an error

Version-Release number of selected component (if applicable):


How reproducible:


Steps to Reproduce:
1. stratis pool create pool1 /dev/sde
2. run the storage system role test (a command-line sketch follows below)
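
For reference, a minimal command-line sketch of this reproduction; the test playbook path and name are illustrative assumptions, not taken from this report:
# stratis pool create pool1 /dev/sde
# ansible-playbook -i localhost, -c local \
      /usr/share/ansible/roles/rhel-system-roles.storage/tests/tests_default.yml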

Actual results:
The storage role task at main-blivet.yml:90 fails with blivet.errors.DeviceTreeError (full traceback below).

Expected results:
The storage role test completes successfully even with an existing Stratis pool on the system.

Additional info:
# lsblk
NAME                                                                    MAJ:MIN RM   SIZE RO TYPE    MOUNTPOINT
sde                                                                       8:64   0 223.6G  0 disk    
└─stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-physical-originsub 253:4    0 223.6G  0 stratis 
  ├─stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-flex-thinmeta    253:5    0   224M  0 stratis 
  │ └─stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-thinpool-pool  253:8    0 223.3G  0 stratis 
  ├─stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-flex-thindata    253:6    0 223.3G  0 stratis 
  │ └─stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-thinpool-pool  253:8    0 223.3G  0 stratis 
  └─stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-flex-mdv         253:7    0    16M  0 stratis 


task path: /usr/share/ansible/roles/rhel-system-roles.storage/tasks/main-blivet.yml:90
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: blivet.errors.DeviceTreeError: failed to add slave stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-physical-originsub of device stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-flex-thinmeta
fatal: [localhost]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n  File \"/root/.ansible/tmp/ansible-tmp-1600307019.0284355-10997-59984381510545/AnsiballZ_blivet.py\", line 102, in <module>\n    _ansiballz_main()\n  File \"/root/.ansible/tmp/ansible-tmp-1600307019.0284355-10997-59984381510545/AnsiballZ_blivet.py\", line 94, in _ansiballz_main\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n  File \"/root/.ansible/tmp/ansible-tmp-1600307019.0284355-10997-59984381510545/AnsiballZ_blivet.py\", line 40, in invoke_module\n    runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True)\n  File \"/usr/lib64/python3.6/runpy.py\", line 205, in run_module\n    return _run_module_code(code, init_globals, run_name, mod_spec)\n  File \"/usr/lib64/python3.6/runpy.py\", line 96, in _run_module_code\n    mod_name, mod_spec, pkg_name, script_name)\n  File \"/usr/lib64/python3.6/runpy.py\", line 85, in _run_code\n    exec(code, run_globals)\n  File \"/tmp/ansible_blivet_payload_il96ws01/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1252, in <module>\n  File \"/tmp/ansible_blivet_payload_il96ws01/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1249, in main\n  File \"/tmp/ansible_blivet_payload_il96ws01/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 1172, in run_module\n  File \"/usr/lib/python3.6/site-packages/blivet/threads.py\", line 53, in run_with_lock\n    return m(*args, **kwargs)\n  File \"/usr/lib/python3.6/site-packages/blivet/blivet.py\", line 141, in reset\n    self.devicetree.populate(cleanup_only=cleanup_only)\n  File \"/usr/lib/python3.6/site-packages/blivet/threads.py\", line 53, in run_with_lock\n    return m(*args, **kwargs)\n  File \"/usr/lib/python3.6/site-packages/blivet/populator/populator.py\", line 414, in populate\n    self._populate()\n  File \"/usr/lib/python3.6/site-packages/blivet/threads.py\", line 53, in run_with_lock\n    return m(*args, **kwargs)\n  File \"/usr/lib/python3.6/site-packages/blivet/populator/populator.py\", line 459, in _populate\n    self.handle_device(dev)\n  File \"/usr/lib/python3.6/site-packages/blivet/threads.py\", line 53, in run_with_lock\n    return m(*args, **kwargs)\n  File \"/usr/lib/python3.6/site-packages/blivet/populator/populator.py\", line 265, in handle_device\n    device = helper_class(self, info).run()\n  File \"/usr/lib/python3.6/site-packages/blivet/populator/helpers/dm.py\", line 50, in run\n    slave_devices = self._devicetree._add_slave_devices(self.data)\n  File \"/usr/lib/python3.6/site-packages/blivet/threads.py\", line 53, in run_with_lock\n    return m(*args, **kwargs)\n  File \"/usr/lib/python3.6/site-packages/blivet/populator/populator.py\", line 137, in _add_slave_devices\n    raise DeviceTreeError(msg)\nblivet.errors.DeviceTreeError: failed to add slave stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-physical-originsub of device stratis-1-private-d6c840aeccb0452e8d8dc3c1242a500a-flex-thinmeta\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
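
The failure occurs while blivet scans the existing device stack: the traceback goes through blivet.Blivet().reset() into DeviceTree.populate(), which raises DeviceTreeError on the Stratis DM devices. A quick way to trigger the same scan outside Ansible (a sketch, assuming python3-blivet is installed):
# python3 -c 'import blivet; b = blivet.Blivet(); b.reset()'
On an affected system this one-liner should raise the same DeviceTreeError; with the fix in place it should exit cleanly.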

Comment 3 guazhang@redhat.com 2021-06-01 23:50:46 UTC
Hi

the bug has ITM set to 13 but is missing the DTM, so please update the DTM first.

Comment 8 guazhang@redhat.com 2021-07-26 01:48:56 UTC
Hi,

Any updates here? Today is the last day for ITM 21, so I am changing it to ITM 23.

Comment 10 David Lehman 2021-08-05 13:19:50 UTC
This should have been resolved by the version of python-blivet in the 8.5 composes. Please verify.
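
Before re-running the test, it may help to confirm which builds are installed; a sketch (python3-blivet is the assumed RHEL 8 package name for python-blivet):
# rpm -q python3-blivet rhel-system-roles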

Comment 12 guazhang@redhat.com 2021-08-06 11:28:56 UTC
Hi,


I tested the package and found some errors, please help check them. The original bug does not reproduce, though.

https://bugzilla.redhat.com/show_bug.cgi?id=1990749
https://bugzilla.redhat.com/show_bug.cgi?id=1990757
https://bugzilla.redhat.com/show_bug.cgi?id=1990776
https://bugzilla.redhat.com/show_bug.cgi?id=1990793

Comment 16 guazhang@redhat.com 2021-08-16 09:39:24 UTC
Hi,

The test passes with rhel-system-roles-1.8.0-0.1.el8.noarch.rpm even when a Stratis pool persists in the system.

# lsblk
NAME                                                                    MAJ:MIN RM   SIZE RO TYPE    MOUNTPOINT
sda                                                                       8:0    0 223.6G  0 disk    
└─stratis-1-private-a4a240b6c579437b98df7a79e5c4d589-physical-originsub 253:3    0 223.6G  0 stratis 
  ├─stratis-1-private-a4a240b6c579437b98df7a79e5c4d589-flex-thinmeta    253:4    0   224M  0 stratis 
  │ └─stratis-1-private-a4a240b6c579437b98df7a79e5c4d589-thinpool-pool  253:7    0 223.3G  0 stratis 
  ├─stratis-1-private-a4a240b6c579437b98df7a79e5c4d589-flex-thindata    253:5    0 223.3G  0 stratis 
  │ └─stratis-1-private-a4a240b6c579437b98df7a79e5c4d589-thinpool-pool  253:7    0 223.3G  0 stratis 
  └─stratis-1-private-a4a240b6c579437b98df7a79e5c4d589-flex-mdv         253:6    0    16M  0 stratis 
sdb                                                                       8:16   0 223.6G  0 disk    
sdc                                                                       8:32   0 223.6G  0 disk    
sdd                                                                       8:48   0 223.6G  0 disk    
sde                                                                       8:64   0 223.6G  0 disk    
sdf                                                                       8:80   0 223.6G  0 disk    
sdg                                                                       8:96   0 223.6G  0 disk    
sdh                                                                       8:112  0 223.6G  0 disk

Comment 18 guazhang@redhat.com 2021-08-19 14:25:56 UTC
Hi,

The fixed package rhel-system-roles-1.7.1-1.el8 resolves the issue.

Comment 21 guazhang@redhat.com 2021-08-20 03:05:17 UTC
Moving to VERIFIED per comment 18.

Comment 23 errata-xmlrpc 2021-11-09 17:44:38 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (rhel-system-roles bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2021:4159

