Description of problem:

The restructuring of the container image build process (TCIB) revealed a missing dependency in os-brick. In some cases (the FC connector, and the iSCSI connector with multipath during online volume extend) os-brick calls '/lib/udev/scsi_id', which is provided by systemd-udev, but that package is not installed in the affected containers (thanks to Alan Bishop for the analysis). Those cases do not work until the dependency is added. The dependency should be added to all containers that use os-brick (for example, some of the cinder and glance containers). It already appears to be present in the nova_compute containers, probably pulled in by other dependencies.

Version-Release number of selected component (if applicable):
python3-os-brick-2.10.7-2.20210528134947.el8ost.2.noarch
Testing container: rhosp16-openstack-cinder-volume 16.2_20210811.1

How reproducible:
Always, in the cases explained above.

Steps to Reproduce:
Reproduced while testing glance over cinder over FC, but any of the cases described above may break:

# glance image-create --name cirros --disk-format qcow2 --container-format bare --file cirros-0.4.0-x86_64-disk.img

Error in the glance-api log:

2021-08-23 17:26:00.666 15 ERROR glance.api.v2.image_data [req-960fade6-6038-4f10-94b8-10fe5764a030 0cdf71e61fba485a94eae6d483c7c6e1 be8df2871618429e88014ca2f9de4a06 - default default] Failed to upload image data due to internal error: oslo_concurrency.processutils.ProcessExecutionError: [Errno 2] No such file or directory: '/lib/udev/scsi_id'
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi [req-960fade6-6038-4f10-94b8-10fe5764a030 0cdf71e61fba485a94eae6d483c7c6e1 be8df2871618429e88014ca2f9de4a06 - default default] Caught error: [Errno 2] No such file or directory: '/lib/udev/scsi_id'
Command: /lib/udev/scsi_id --page 0x83 --whitelisted /dev/disk/by-path/pci-0000:5e:00.1-fc-0x20010002ac021f6b-lun-0
Exit code: -
Stdout: None
Stderr: None: oslo_concurrency.processutils.ProcessExecutionError: [Errno 2] No such file or directory: '/lib/udev/scsi_id'
Command: /lib/udev/scsi_id --page 0x83 --whitelisted /dev/disk/by-path/pci-0000:5e:00.1-fc-0x20010002ac021f6b-lun-0
Exit code: -
Stdout: None
Stderr: None
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi Traceback (most recent call last):
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 169, in execute
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return execute_root(*cmd, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 245, in _wrap
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return self.channel.remote_call(name, args, kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 224, in remote_call
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     raise exc_type(*result[2])
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi FileNotFoundError: [Errno 2] No such file or directory: '/lib/udev/scsi_id'
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi 
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi During handling of the above exception, another exception occurred:
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi 
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi Traceback (most recent call last):
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/common/wsgi.py", line 1476, in __call__
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     request, **action_args)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/common/wsgi.py", line 1519, in dispatch
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return method(*args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/common/utils.py", line 419, in wrapped
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return func(self, req, *args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/api/v2/image_data.py", line 299, in upload
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     self._restore(image_repo, image)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     self.force_reraise()
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     six.reraise(self.type_, self.value, self.tb)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     raise value
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/api/v2/image_data.py", line 164, in upload
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     image.set_data(data, size, backend=backend)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/domain/proxy.py", line 208, in set_data
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     self.base.set_data(data, size, backend=backend, set_active=set_active)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/notifier.py", line 499, in set_data
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     _send_notification(notify_error, 'image.upload', msg)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     self.force_reraise()
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     six.reraise(self.type_, self.value, self.tb)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     raise value
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/notifier.py", line 446, in set_data
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     set_active=set_active)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/api/policy.py", line 208, in set_data
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return self.image.set_data(*args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/quota/__init__.py", line 319, in set_data
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     set_active=set_active)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/location.py", line 559, in set_data
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     self._upload_to_store(data, verifier, backend, size)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance/location.py", line 472, in _upload_to_store
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     verifier=verifier)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance_store/multi_backend.py", line 399, in add_with_multihash
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     image_id, data, size, hashing_algo, store, context, verifier)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance_store/multi_backend.py", line 481, in store_add_to_backend_with_multihash
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     image_id, data, size, hashing_algo, context=context, verifier=verifier)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance_store/driver.py", line 279, in add_adapter
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     metadata_dict) = store_add_fun(*args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance_store/capabilities.py", line 176, in op_checker
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return store_op_fun(store, *args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance_store/_drivers/cinder.py", line 803, in add
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     with self._open_cinder_volume(client, volume, 'wb') as f:
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib64/python3.6/contextlib.py", line 81, in __enter__
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return next(self.gen)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/glance_store/_drivers/cinder.py", line 602, in _open_cinder_volume
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     device = conn.connect_volume(connection_info['data'])
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 150, in trace_logging_wrapper
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     result = f(*args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 328, in inner
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     return f(*args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/fibre_channel.py", line 251, in connect_volume
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     device_wwn = self._linuxscsi.get_scsi_wwn(self.host_device)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py", line 201, in get_scsi_wwn
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     root_helper=self._root_helper)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/os_brick/executor.py", line 52, in _execute
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     result = self.__execute(*args, **kwargs)
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi   File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 187, in execute
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi     cmd=sanitized_cmd, description=six.text_type(e))
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi oslo_concurrency.processutils.ProcessExecutionError: [Errno 2] No such file or directory: '/lib/udev/scsi_id'
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi Command: /lib/udev/scsi_id --page 0x83 --whitelisted /dev/disk/by-path/pci-0000:5e:00.1-fc-0x20010002ac021f6b-lun-0
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi Exit code: -
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi Stdout: None
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi Stderr: None
2021-08-23 17:26:00.684 15 ERROR glance.common.wsgi 
2021-08-23 17:26:00.728 15 INFO eventlet.wsgi.server [req-960fade6-6038-4f10-94b8-10fe5764a030 0cdf71e61fba485a94eae6d483c7c6e1 be8df2871618429e88014ca2f9de4a06 - default default] 172.17.1.66 - - [23/Aug/2021 17:26:00] "PUT /v2/images/6347ebef-a6e2-416d-b6f4-132eabe12a6a/file HTTP/1.1" 500 430 7.007719
2021-08-23 17:26:02.637 26 DEBUG eventlet.wsgi.server [-] (26) accepted ('172.17.1.66', 55940) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2021-08-23 17:26:02.642 26 INFO eventlet.wsgi.server [-] 172.17.1.66 - - [23/Aug/2021 17:26:02] "GET /healthcheck HTTP/1.0" 200 137 0.003559
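As a quick way to confirm the root cause on an affected system, the checks below can be run from the controller host. This is a hedged sketch: the container names (cinder_volume, nova_compute) and the podman invocation are assumptions that may vary per role and deployment.

$ sudo podman exec cinder_volume ls -l /lib/udev/scsi_id   # expected to fail with "No such file or directory"
$ sudo podman exec cinder_volume rpm -q systemd-udev       # expected to report the package is not installed
$ sudo podman exec nova_compute rpm -q systemd-udev        # expected to succeed, matching the observation above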
Verified on:
python3-os-brick-2.10.7-2.20210528134947.el8ost.4.noarch

Deployed the same environment: Glance over Cinder over 3PAR FC, with multipath enabled.

Here we can see the backend is up:

(overcloud) [stack@seal41 ~]$ cinder service-list
+------------------+-------------------------+------+---------+-------+----------------------------+-----------------+
| Binary           | Host                    | Zone | Status  | State | Updated_at                 | Disabled Reason |
+------------------+-------------------------+------+---------+-------+----------------------------+-----------------+
| cinder-backup    | controller-0            | nova | enabled | up    | 2021-09-05T10:19:00.000000 | -               |
| cinder-scheduler | controller-0            | nova | enabled | up    | 2021-09-05T10:18:59.000000 | -               |
| cinder-volume    | controller-0@3parfc     | nova | enabled | up    | 2021-09-05T10:18:59.000000 | -               |
| cinder-volume    | hostgroup@tripleo_iscsi | nova | enabled | down  | 2021-09-05T10:17:12.000000 | -               |
+------------------+-------------------------+------+---------+-------+----------------------------+-----------------+

Let's create a test volume to confirm the 3PAR backend is working:

(overcloud) [stack@seal41 ~]$ cinder create 1 --name Test
+--------------------------------+--------------------------------------+
| Property                       | Value |
+--------------------------------+--------------------------------------+
| attachments                    | [] |
| availability_zone              | nova |
| bootable                       | false |
| consistencygroup_id            | None |
| created_at                     | 2021-09-05T10:19:37.000000 |
| description                    | None |
| encrypted                      | False |
| id                             | 37c5f6ef-d819-4be5-891a-0c6c4fd39e45 |
| metadata                       | {} |
| migration_status               | None |
| multiattach                    | False |
| name                           | Test |
| os-vol-host-attr:host          | controller-0@3parfc#SSD_r5 |
| os-vol-mig-status-attr:migstat | None |
| os-vol-mig-status-attr:name_id | None |
| os-vol-tenant-attr:tenant_id   | 9b8f8ddd1e764f1cb1741aba102eff17 |
| replication_status             | None |
| size                           | 1 |
| snapshot_id                    | None |
| source_volid                   | None |
| status                         | creating |
| updated_at                     | 2021-09-05T10:19:37.000000 |
| user_id                        | a65d13a8bb72475ab6fcbc29f95f99cc |
| volume_type                    | tripleo |
+--------------------------------+--------------------------------------+

Now let's upload an image to Glance. I don't have the exact upload command, as I used an ansible script that also did other things.
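For reference, the equivalent manual upload is roughly the glance image-create command from the reproduction steps above (the exact invocation used by the ansible script may differ):

$ glance image-create --name cirros --disk-format qcow2 --container-format bare --file cirros-0.4.0-x86_64-disk.img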
But below we can see the end result: an image was uploaded.

(overcloud) [stack@seal41 ~]$ glance image-list
+--------------------------------------+--------+
| ID                                   | Name   |
+--------------------------------------+--------+
| b7b21945-ce68-4294-b38f-395099f29b7b | cirros |
+--------------------------------------+--------+

(overcloud) [stack@seal41 ~]$ glance image-show b7b21945-ce68-4294-b38f-395099f29b7b
+----------------------------------+----------------------------------------------------------------------------------+
| Property                         | Value |
+----------------------------------+----------------------------------------------------------------------------------+
| checksum                         | 443b7623e27ecf03dc9e01ee93f67afe |
| container_format                 | bare |
| created_at                       | 2021-09-05T10:21:29Z |
| direct_url                       | cinder://b89a5617-794b-4c9f-b739-a61d35390493 |
| disk_format                      | qcow2 |
| id                               | b7b21945-ce68-4294-b38f-395099f29b7b |
| min_disk                         | 0 |
| min_ram                          | 0 |
| name                             | cirros |
| os_hash_algo                     | sha512 |
| os_hash_value                    | 6513f21e44aa3da349f248188a44bc304a3653a04122d8fb4535423c8e1d14cd6a153f735bb0982e |
|                                  | 2161b5b5186106570c17a9e58b64dd39390617cd5a350f78 |
| os_hidden                        | False |
| owner                            | 9b8f8ddd1e764f1cb1741aba102eff17 |
| owner_specified.openstack.md5    | |
| owner_specified.openstack.object | images/cirros |
| owner_specified.openstack.sha256 | |
| protected                        | False |
| size                             | 12716032 |
| status                           | active |
| stores                           | default_backend |
| tags                             | [] |
| updated_at                       | 2021-09-05T10:21:42Z |
| virtual_size                     | Not available |
| visibility                       | public |
+----------------------------------+----------------------------------------------------------------------------------+

We're good: the uploaded image is available, and it is stored on Cinder. We also managed to boot an instance from this image:

(overcloud) [stack@seal41 ~]$ nova show inst1
+--------------------------------------+----------------------------------------------------------------------------------+
| Property                             | Value |
+--------------------------------------+----------------------------------------------------------------------------------+
| OS-DCF:diskConfig                    | MANUAL |
| OS-EXT-AZ:availability_zone          | nova |
| OS-EXT-SRV-ATTR:host                 | compute-1.localdomain |
| OS-EXT-SRV-ATTR:hostname             | inst1 |
| OS-EXT-SRV-ATTR:hypervisor_hostname  | compute-1.localdomain |
| OS-EXT-SRV-ATTR:instance_name        | instance-00000001 |
| OS-EXT-SRV-ATTR:kernel_id            | |
| OS-EXT-SRV-ATTR:launch_index         | 0 |
| OS-EXT-SRV-ATTR:ramdisk_id           | |
| OS-EXT-SRV-ATTR:reservation_id       | r-la0th188 |
| OS-EXT-SRV-ATTR:root_device_name     | /dev/vda |
| OS-EXT-SRV-ATTR:user_data            | - |
| OS-EXT-STS:power_state               | 1 |
| OS-EXT-STS:task_state                | - |
| OS-EXT-STS:vm_state                  | active |
| OS-SRV-USG:launched_at               | 2021-09-05T10:22:00.000000 |
| OS-SRV-USG:terminated_at             | - |
| accessIPv4                           | |
| accessIPv6                           | |
| config_drive                         | |
| created                              | 2021-09-05T10:21:46Z |
| description                          | inst1 |
| flavor:disk                          | 1 |
| flavor:ephemeral                     | 0 |
| flavor:extra_specs                   | {} |
| flavor:original_name                 | tiny |
| flavor:ram                           | 512 |
| flavor:swap                          | 0 |
| flavor:vcpus                         | 1 |
| hostId                               | a699cb3fa7c20426d8f7e4120548f646c4289a9dd8ded8ae60bbbcda |
| host_status                          | UP |
| id                                   | 00ad2a3d-6fac-4f61-b456-8369d901c350 |
| image                                | cirros (b7b21945-ce68-4294-b38f-395099f29b7b) |
| internal network                     | 192.168.0.20, 10.35.21.24 |
| key_name                             | key1 |
| locked                               | False |
| locked_reason                        | - |
| metadata                             | {} |
| name                                 | inst1 |
| os-extended-volumes:volumes_attached | [{"id": "fcc9188c-8ae5-4b9f-b91a-8991b9561e1d", "delete_on_termination": false}] |
| progress                             | 0 |
| security_groups                      | inst1-sg |
| server_groups                        | [] |
| status                               | ACTIVE |
| tags                                 | [] |
| tenant_id                            | 9b8f8ddd1e764f1cb1741aba102eff17 |
| trusted_image_certificates           | - |
| updated                              | 2021-09-05T10:22:01Z |
| user_id                              | a65d13a8bb72475ab6fcbc29f95f99cc |
+--------------------------------------+----------------------------------------------------------------------------------+

Good to verify!
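As an additional spot check after the update (a sketch; the container names glance_api and cinder_volume are assumptions and vary per role), the previously missing binary and its owning package should now resolve inside the containers that use os-brick:

$ sudo podman exec glance_api ls -l /lib/udev/scsi_id     # should now list the binary
$ sudo podman exec cinder_volume rpm -q systemd-udev      # should now report the package as installed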
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (Red Hat OpenStack Platform (RHOSP) 16.2 enhancement advisory), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2021:3483