Bug 1885487

Summary: Unable to retype an in-use volume after extending it while in-use
Product: Red Hat OpenStack    Reporter: bkopilov <bkopilov>
Component: openstack-nova    Assignee: OSP DFG:Compute <osp-dfg-compute>
Status: CLOSED EOL    QA Contact: OSP DFG:Compute <osp-dfg-compute>
Severity: high    Docs Contact:
Priority: high
Version: 16.1 (Train)    CC: alifshit, dasmith, eglynn, jhakimra, kchamart, lyarwood, sbauza, sgordon, vromanso
Target Milestone: ---    Keywords: Triaged
Target Release: ---
Hardware: x86_64
OS: Linux
Whiteboard:
Fixed In Version:    Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:    Environment:
Last Closed: 2025-01-15 22:56:09 UTC    Type: Bug
Regression: ---    Mount Type: ---
Documentation: ---    CRM:
Verified Versions:    Category: ---
oVirt Team: ---    RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---    Target Upstream Version:
Embargoed:
Attachments:
Description Flags
compute-1-log directory none

Description bkopilov 2020-10-06 06:21:57 UTC
# Attach a volume to an instance (in-use)

(overcloud) [stack@undercloud-0 ~]$ cinder show 163a91f1-23b9-41f2-b134-fa0f6855b9e9
+--------------------------------+------------------------------------------+
| Property                       | Value                                    |
+--------------------------------+------------------------------------------+
| attached_servers               | ['3b01b6db-4027-4089-9294-e3c4667ee5c0'] |
| attachment_ids                 | ['b30dc003-5f05-4552-ae80-378c21f65173'] |
| availability_zone              | nova                                     |
| bootable                       | false                                    |
| consistencygroup_id            | None                                     |
| created_at                     | 2020-10-04T13:28:47.000000               |
| description                    | None                                     |
| encrypted                      | False                                    |
| id                             | 163a91f1-23b9-41f2-b134-fa0f6855b9e9     |
| metadata                       |                                          |
| migration_status               | success                                  |
| multiattach                    | False                                    |
| name                           | tripleo_ceph                             |
| os-vol-host-attr:host          | hostgroup@tripleo_ceph#tripleo_ceph      |
| os-vol-mig-status-attr:migstat | success                                  |
| os-vol-mig-status-attr:name_id | e89b0c5d-6ea2-436f-b7d7-096f9cff0167     |
| os-vol-tenant-attr:tenant_id   | d213f5e7c68e4d609bb5212c8c40963b         |
| replication_status             | None                                     |
| size                           | 2                                        |
| snapshot_id                    | None                                     |
| source_volid                   | None                                     |
| status                         | in-use                                   |
| updated_at                     | 2020-10-06T06:11:45.000000               |
| user_id                        | 708dc8c91fbd467196b518f9535ad130         |
| volume_type                    | tripleo_ceph                             |
+--------------------------------+------------------------------------------+


# Extend the volume size by 1 GB while attached ->
cinder --os-volume-api-version 3.59 extend 163a91f1-23b9-41f2-b134-fa0f6855b9e9 3

# Try to retype the volume after the size increase -> Failed (retype from ceph to netapp backend)

(overcloud) [stack@undercloud-0 ~]$ cinder show 163a91f1-23b9-41f2-b134-fa0f6855b9e9
+--------------------------------+------------------------------------------+
| Property                       | Value                                    |
+--------------------------------+------------------------------------------+
| attached_servers               | ['3b01b6db-4027-4089-9294-e3c4667ee5c0'] |
| attachment_ids                 | ['b30dc003-5f05-4552-ae80-378c21f65173'] |
| availability_zone              | nova                                     |
| bootable                       | false                                    |
| consistencygroup_id            | None                                     |
| created_at                     | 2020-10-04T13:28:47.000000               |
| description                    | None                                     |
| encrypted                      | False                                    |
| id                             | 163a91f1-23b9-41f2-b134-fa0f6855b9e9     |
| metadata                       |                                          |
| migration_status               | error                                    |
| multiattach                    | False                                    |
| name                           | tripleo_ceph                             |
| os-vol-host-attr:host          | hostgroup@tripleo_ceph#tripleo_ceph      |
| os-vol-mig-status-attr:migstat | error                                    |
| os-vol-mig-status-attr:name_id | e89b0c5d-6ea2-436f-b7d7-096f9cff0167     |
| os-vol-tenant-attr:tenant_id   | d213f5e7c68e4d609bb5212c8c40963b         |
| replication_status             | None                                     |
| size                           | 3                                        |
| snapshot_id                    | None                                     |
| source_volid                   | None                                     |
| status                         | in-use                                   |
| updated_at                     | 2020-10-06T06:15:47.000000               |
| user_id                        | 708dc8c91fbd467196b518f9535ad130         |
| volume_type                    | tripleo_ceph                             |
+--------------------------------+------------------------------------------+




==== compute-1 computes ====
/var/log/containers/nova/nova-compute.log:87:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server [req-60bce194-f0d9-4a3f-857b-d8875982fb96 b7fe7757bbfb4ae1b971d378904e63c2 a1bda7588731428a903f082eedcf6fb4 - default default] Exception during message handling: nova.exception.VolumeNotFound: Volume e89b0c5d-6ea2-436f-b7d7-096f9cff0167 could not be found.
/var/log/containers/nova/nova-compute.log:88:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
/var/log/containers/nova/nova-compute.log:89:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
/var/log/containers/nova/nova-compute.log:90:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
/var/log/containers/nova/nova-compute.log:91:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 274, in dispatch
/var/log/containers/nova/nova-compute.log:92:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
/var/log/containers/nova/nova-compute.log:93:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
/var/log/containers/nova/nova-compute.log:94:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
/var/log/containers/nova/nova-compute.log:95:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 79, in wrapped
/var/log/containers/nova/nova-compute.log:96:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     function_name, call_dict, binary, tb)
/var/log/containers/nova/nova-compute.log:97:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
/var/log/containers/nova/nova-compute.log:98:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
/var/log/containers/nova/nova-compute.log:99:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
/var/log/containers/nova/nova-compute.log:100:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
/var/log/containers/nova/nova-compute.log:101:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
/var/log/containers/nova/nova-compute.log:102:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise value
/var/log/containers/nova/nova-compute.log:103:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 69, in wrapped
/var/log/containers/nova/nova-compute.log:104:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
/var/log/containers/nova/nova-compute.log:105:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 9327, in external_instance_event
/var/log/containers/nova/nova-compute.log:106:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     self.extend_volume(context, instance, event.tag)
/var/log/containers/nova/nova-compute.log:107:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/utils.py", line 1372, in decorated_function
/var/log/containers/nova/nova-compute.log:108:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
/var/log/containers/nova/nova-compute.log:109:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 219, in decorated_function
/var/log/containers/nova/nova-compute.log:110:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     kwargs['instance'], e, sys.exc_info())
/var/log/containers/nova/nova-compute.log:111:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
/var/log/containers/nova/nova-compute.log:112:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
/var/log/containers/nova/nova-compute.log:113:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
/var/log/containers/nova/nova-compute.log:114:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
/var/log/containers/nova/nova-compute.log:115:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
/var/log/containers/nova/nova-compute.log:116:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise value
/var/log/containers/nova/nova-compute.log:117:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 207, in decorated_function
/var/log/containers/nova/nova-compute.log:118:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
/var/log/containers/nova/nova-compute.log:119:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 9185, in extend_volume
/var/log/containers/nova/nova-compute.log:120:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     bdm.volume_size * units.Gi)
/var/log/containers/nova/nova-compute.log:121:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 2198, in extend_volume
/var/log/containers/nova/nova-compute.log:122:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise exception.VolumeNotFound(volume_id=volume_id)
/var/log/containers/nova/nova-compute.log:123:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server nova.exception.VolumeNotFound: Volume e89b0c5d-6ea2-436f-b7d7-096f9cff0167 could not be found.
/var/log/containers/nova/nova-compute.log:124:2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server 
/var/log/containers/nova/nova-compute.log:239:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver [req-4e47de47-0d33-4e1f-94cd-6e39fc684b87 b7fe7757bbfb4ae1b971d378904e63c2 a1bda7588731428a903f082eedcf6fb4 - default default] Failure rebasing volume /dev/dm-0 on vdb.: libvirt.libvirtError: invalid argument: disk vdb does not have an active block job
/var/log/containers/nova/nova-compute.log:240:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
/var/log/containers/nova/nova-compute.log:241:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1966, in _swap_volume
/var/log/containers/nova/nova-compute.log:242:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     dev.abort_job(pivot=True)
/var/log/containers/nova/nova-compute.log:243:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 766, in abort_job
/var/log/containers/nova/nova-compute.log:244:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     self._guest._domain.blockJobAbort(self._disk, flags=flags)
/var/log/containers/nova/nova-compute.log:245:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 190, in doit
/var/log/containers/nova/nova-compute.log:246:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     result = proxy_call(self._autowrap, f, *args, **kwargs)
/var/log/containers/nova/nova-compute.log:247:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 148, in proxy_call
/var/log/containers/nova/nova-compute.log:248:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     rv = execute(f, *args, **kwargs)
/var/log/containers/nova/nova-compute.log:249:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 129, in execute
/var/log/containers/nova/nova-compute.log:250:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     six.reraise(c, e, tb)
/var/log/containers/nova/nova-compute.log:251:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
/var/log/containers/nova/nova-compute.log:252:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     raise value
/var/log/containers/nova/nova-compute.log:253:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 83, in tworker
/var/log/containers/nova/nova-compute.log:254:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     rv = meth(*args, **kwargs)
/var/log/containers/nova/nova-compute.log:255:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver   File "/usr/lib64/python3.6/site-packages/libvirt.py", line 888, in blockJobAbort
/var/log/containers/nova/nova-compute.log:256:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver     if ret == -1: raise libvirtError ('virDomainBlockJobAbort() failed', dom=self)
/var/log/containers/nova/nova-compute.log:257:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver libvirt.libvirtError: invalid argument: disk vdb does not have an active block job
/var/log/containers/nova/nova-compute.log:258:2020-10-06 06:15:46.419 7 ERROR nova.virt.libvirt.driver 
/var/log/containers/nova/nova-compute.log:291:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [req-4e47de47-0d33-4e1f-94cd-6e39fc684b87 b7fe7757bbfb4ae1b971d378904e63c2 a1bda7588731428a903f082eedcf6fb4 - default default] [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] Failed to swap volume 163a91f1-23b9-41f2-b134-fa0f6855b9e9 for 6ac700af-c79f-4f20-91a3-4c89efc48354: nova.exception.VolumeRebaseFailed: Volume rebase failed: invalid argument: disk vdb does not have an active block job
/var/log/containers/nova/nova-compute.log:292:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] Traceback (most recent call last):
/var/log/containers/nova/nova-compute.log:293:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1966, in _swap_volume
/var/log/containers/nova/nova-compute.log:294:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     dev.abort_job(pivot=True)
/var/log/containers/nova/nova-compute.log:295:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 766, in abort_job
/var/log/containers/nova/nova-compute.log:296:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     self._guest._domain.blockJobAbort(self._disk, flags=flags)
/var/log/containers/nova/nova-compute.log:297:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 190, in doit
/var/log/containers/nova/nova-compute.log:298:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     result = proxy_call(self._autowrap, f, *args, **kwargs)
/var/log/containers/nova/nova-compute.log:299:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 148, in proxy_call
/var/log/containers/nova/nova-compute.log:300:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     rv = execute(f, *args, **kwargs)
/var/log/containers/nova/nova-compute.log:301:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 129, in execute
/var/log/containers/nova/nova-compute.log:302:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     six.reraise(c, e, tb)
/var/log/containers/nova/nova-compute.log:303:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
/var/log/containers/nova/nova-compute.log:304:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     raise value
/var/log/containers/nova/nova-compute.log:305:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 83, in tworker
/var/log/containers/nova/nova-compute.log:306:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     rv = meth(*args, **kwargs)
/var/log/containers/nova/nova-compute.log:307:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib64/python3.6/site-packages/libvirt.py", line 888, in blockJobAbort
/var/log/containers/nova/nova-compute.log:308:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     if ret == -1: raise libvirtError ('virDomainBlockJobAbort() failed', dom=self)
/var/log/containers/nova/nova-compute.log:309:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] libvirt.libvirtError: invalid argument: disk vdb does not have an active block job
/var/log/containers/nova/nova-compute.log:310:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] 
/var/log/containers/nova/nova-compute.log:311:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] During handling of the above exception, another exception occurred:
/var/log/containers/nova/nova-compute.log:312:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] 
/var/log/containers/nova/nova-compute.log:313:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] Traceback (most recent call last):
/var/log/containers/nova/nova-compute.log:314:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 6346, in _swap_volume
/var/log/containers/nova/nova-compute.log:315:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     mountpoint, resize_to)
/var/log/containers/nova/nova-compute.log:316:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 2038, in swap_volume
/var/log/containers/nova/nova-compute.log:317:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     self._disconnect_volume(context, new_connection_info, instance)
/var/log/containers/nova/nova-compute.log:318:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
/var/log/containers/nova/nova-compute.log:319:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     self.force_reraise()
/var/log/containers/nova/nova-compute.log:320:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
/var/log/containers/nova/nova-compute.log:321:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     six.reraise(self.type_, self.value, self.tb)
/var/log/containers/nova/nova-compute.log:322:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
/var/log/containers/nova/nova-compute.log:323:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     raise value
/var/log/containers/nova/nova-compute.log:324:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 2035, in swap_volume
/var/log/containers/nova/nova-compute.log:325:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     resize_to, hw_firmware_type)
/var/log/containers/nova/nova-compute.log:326:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1976, in _swap_volume
/var/log/containers/nova/nova-compute.log:327:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0]     raise exception.VolumeRebaseFailed(reason=six.text_type(exc))
/var/log/containers/nova/nova-compute.log:328:2020-10-06 06:15:46.758 7 ERROR nova.compute.manager [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] nova.exception.VolumeRebaseFailed: Volume rebase failed: invalid argument: disk vdb does not have an active block job

Comment 1 bkopilov 2020-10-06 06:48:02 UTC
Created attachment 1719286 [details]
compute-1-log directory

Comment 2 Lee Yarwood 2020-10-09 09:01:40 UTC
The issue here is with the initial attempt to extend the rbd volume under req-60bce194-f0d9-4a3f-857b-d8875982fb96:

  87 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server [req-60bce194-f0d9-4a3f-857b-d8875982fb96 b7fe7757bbfb4ae1b971d378904e63c2 a1bda7588731428a903f082eedcf6fb4 - default default] Exception during message handling: nova.exception.VolumeNotFound: Volume e89b0c5d-6ea2-436f-b7d7-096f9cff0167 could not be found.
  88 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  89 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
  90 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  91 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 274, in dispatch
  92 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
  93 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
  94 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  95 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 79, in wrapped
  96 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     function_name, call_dict, binary, tb)
  97 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
  98 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
  99 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
 100 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
 101 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
 102 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise value       
 103 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 69, in wrapped
 104 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
 105 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 9327, in external_instance_event
 106 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     self.extend_volume(context, instance, event.tag)
 107 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/utils.py", line 1372, in decorated_function
 108 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
 109 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 219, in decorated_function
 110 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     kwargs['instance'], e, sys.exc_info())
 111 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
 112 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
 113 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
 114 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
 115 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
 116 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise value       
 117 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 207, in decorated_function
 118 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
 119 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 9185, in extend_volume
 120 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     bdm.volume_size * units.Gi)
 121 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 2198, in extend_volume
 122 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise exception.VolumeNotFound(volume_id=volume_id)
 123 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server nova.exception.VolumeNotFound: Volume e89b0c5d-6ea2-436f-b7d7-096f9cff0167 could not be found.
 124 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server  

We use the following code to look up the device in this case, as we don't have a host block device to reference when calling libvirt with rbd volumes:

https://github.com/openstack/nova/blob/4cf72ea6bfc58d33da894f248184c08c36055884/nova/virt/libvirt/driver.py#L2091-L2106
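The lookup described above can be sketched roughly as follows. This is not nova's actual code (the function name `find_target_dev` and the sample domain XML are illustrative only): with a network-backed rbd disk there is no host block device, so the guest disk has to be matched by the `<serial>` element that nova records in the libvirt domain XML at attach time.

```python
# Illustrative sketch of serial-based disk lookup in a libvirt domain XML.
# Not nova's actual implementation; names and XML are hypothetical.
import xml.etree.ElementTree as ET

DOMAIN_XML = """
<domain>
  <devices>
    <disk type='network' device='disk'>
      <source protocol='rbd' name='volumes/volume-163a91f1-23b9-41f2-b134-fa0f6855b9e9'/>
      <target dev='vdb' bus='virtio'/>
      <serial>163a91f1-23b9-41f2-b134-fa0f6855b9e9</serial>
    </disk>
  </devices>
</domain>
"""

def find_target_dev(domain_xml: str, volume_id: str):
    """Return the target device (e.g. 'vdb') of the disk whose <serial>
    matches volume_id, or None when no disk carries that serial."""
    root = ET.fromstring(domain_xml)
    for disk in root.findall('./devices/disk'):
        if disk.findtext('serial') == volume_id:
            return disk.find('target').get('dev')
    return None

# Lookup by the serial actually recorded in the guest XML succeeds ...
print(find_target_dev(DOMAIN_XML, '163a91f1-23b9-41f2-b134-fa0f6855b9e9'))  # vdb
# ... lookup by any other id returns None, and nova raises VolumeNotFound.
print(find_target_dev(DOMAIN_XML, 'e89b0c5d-6ea2-436f-b7d7-096f9cff0167'))  # None
```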

Had you already retyped this volume prior to this attempt?

Looking further down I see the following connection_info listed with a serial that differs from the volume_id:

 175 2020-10-06 06:15:42.306 7 DEBUG nova.compute.manager [req-4e47de47-0d33-4e1f-94cd-6e39fc684b87 b7fe7757bbfb4ae1b971d378904e63c2 a1bda7588731428a903f082eedcf6fb4 - default default] [instance: 3b01b6db-4027-4089-9294-e3c4667ee5c0] swap     _volume: Calling driver volume swap with connection infos: new: {[..]} , old: {'driver_volume_type': 'rbd', [..], 'volume_id': 'e89b0c5d-6ea2-436f-b7d7-096f9cff0167', 'serial': '163a91f1-23b9-41f2-b134-fa0f6855b9e9'} _swap_volume /usr/lib/python3.6/site-packages/nova/compute/manager.py:6343

^ That would cause the lookup during the extend to fail here and likely also caused the retype to then fail.
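The failure mode can be sketched in a few lines. This is a hypothetical minimal model (the `GuestDisk` type and `find_disk_by_volume_id` helper are illustrative, not nova's API): the guest disk was attached with the original Cinder volume id as its serial, but after the earlier retype the connection_info carries the backing volume's id (Cinder's name_id), so a lookup keyed on that id finds nothing.

```python
# Hypothetical sketch of why the serial-based lookup comes back empty
# after a retype; types and names are illustrative, not nova's code.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GuestDisk:
    target_dev: str   # e.g. 'vdb'
    serial: str       # serial recorded in the guest XML at attach time

def find_disk_by_volume_id(disks: List[GuestDisk],
                           volume_id: str) -> Optional[GuestDisk]:
    """Match the requested volume id against each disk's serial."""
    for disk in disks:
        if disk.serial == volume_id:
            return disk
    return None

# The disk was attached under the original Cinder volume id ...
disks = [GuestDisk(target_dev='vdb',
                   serial='163a91f1-23b9-41f2-b134-fa0f6855b9e9')]

# ... but connection_info now carries the backing volume's id (name_id),
# so the lookup finds no matching disk and the extend (and the
# subsequent retype) fails.
assert find_disk_by_volume_id(
    disks, 'e89b0c5d-6ea2-436f-b7d7-096f9cff0167') is None
```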

Setting a needinfo to confirm and then I'll reword the subject of the bug.

Comment 5 bkopilov 2020-10-09 10:48:22 UTC
(In reply to Lee Yarwood from comment #2)
> The issue here is with the initial attempt to extend the rbd volume under
> req-60bce194-f0d9-4a3f-857b-d8875982fb96:
> 
>   87 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server
> [req-60bce194-f0d9-4a3f-857b-d8875982fb96 b7fe7757bbfb4ae1b971d378904e63c2
> a1bda7588731428a903f082eedcf6fb4 - default default] Exception during message
> handling: nova.exception.VolumeNotFound: Volume
> e89b0c5d-6ea2-436f-b7d7-096f9cff0167 could not be found.
>   88 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server Traceback
> (most recent call last):
>   89 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165,
> in _process_incoming
>   90 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     res =
> self.dispatcher.dispatch(message)
>   91 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line
> 274, in dispatch
>   92 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return
> self._do_dispatch(endpoint, method, ctxt, args)
>   93 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line
> 194, in _do_dispatch
>   94 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     result =
> func(ctxt, **new_args)
>   95 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 79, in
> wrapped
>   96 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> function_name, call_dict, binary, tb)
>   97 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in
> __exit__
>   98 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> self.force_reraise()
>   99 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in
> force_reraise
>  100 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> six.reraise(self.type_, self.value, self.tb)
>  101 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
>  102 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise
> value       
>  103 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 69, in
> wrapped
>  104 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return
> f(self, context, *args, **kw)
>  105 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 9327, in
> external_instance_event
>  106 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> self.extend_volume(context, instance, event.tag)
>  107 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/compute/utils.py", line 1372, in
> decorated_function
>  108 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return
> function(self, context, *args, **kwargs)
>  109 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 219, in
> decorated_function
>  110 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> kwargs['instance'], e, sys.exc_info())
>  111 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in
> __exit__
>  112 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> self.force_reraise()
>  113 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in
> force_reraise
>  114 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> six.reraise(self.type_, self.value, self.tb)
>  115 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
>  116 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise
> value       
>  117 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 207, in
> decorated_function
>  118 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     return
> function(self, context, *args, **kwargs)
>  119 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 9185, in
> extend_volume
>  120 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server    
> bdm.volume_size * units.Gi)
>  121 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server   File
> "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 2198,
> in extend_volume
>  122 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server     raise
> exception.VolumeNotFound(volume_id=volume_id)
>  123 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server
> nova.exception.VolumeNotFound: Volume e89b0c5d-6ea2-436f-b7d7-096f9cff0167
> could not be found.
>  124 2020-10-06 06:14:33.801 7 ERROR oslo_messaging.rpc.server  
> 
> We use the following code to lookup the device in this case as we don't have
> a host block device to reference when calling libvirt with rbd volumes:
> 
> https://github.com/openstack/nova/blob/
> 4cf72ea6bfc58d33da894f248184c08c36055884/nova/virt/libvirt/driver.py#L2091-
> L2106
> 
> Had you already retyped this volume prior to this attempt?
> 
> Looking further down I see the following connection_info listed that has a
> different serial than the volume_id:
> 
>  175 2020-10-06 06:15:42.306 7 DEBUG nova.compute.manager
> [req-4e47de47-0d33-4e1f-94cd-6e39fc684b87 b7fe7757bbfb4ae1b971d378904e63c2
> a1bda7588731428a903f082eedcf6fb4 - default default] [instance:
> 3b01b6db-4027-4089-9294-e3c4667ee5c0] swap     _volume: Calling driver
> volume swap with connection infos: new: {[..]} , old: {'driver_volume_type':
> 'rbd', [..], 'volume_id': 'e89b0c5d-6ea2-436f-b7d7-096f9cff0167', 'serial':
> '163a91f1-23b9-41f2-b134-fa0f6855b9e9'} _swap_volume
> /usr/lib/python3.6/site-packages/nova/compute/manager.py:6343
> 
> ^ That would cause the lookup during the extend to fail here and likely also
> caused the retype to then fail.
> 
> Setting a needinfo to confirm and then I'll reword the subject of the bug.


Hi,
Yes, I retyped the volume before extending it:
nova volume-attach
cinder retype to netapp
cinder extend
cinder retype back to ceph --> fails (back to the

Comment 7 Artom Lifshitz 2025-01-15 22:56:09 UTC
This bug has received no update in almost 5 years, so it is most likely safe to close. If anyone else reports this problem, we can re-investigate with the newest code in a new bug report.