Bug 2009278 - When adding a qcow2 image to a cinder backed glance_store, the incremental volume extend is broken
Status: CLOSED DUPLICATE of bug 2004316
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: python-glance-store
Version: 16.1 (Train)
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: z2
Target Release: 16.2 (Train on RHEL 8.4)
Assignee: Rajat Dhasmana
QA Contact: Tzach Shefi
Reported: 2021-09-30 09:37 UTC by Giulio Fidente
Modified: 2022-01-12 16:29 UTC
CC List: 4 users

Last Closed: 2022-01-12 16:29:12 UTC

Links:
Red Hat Issue Tracker OSP-10047 (last updated 2021-11-15 12:42:17 UTC)

Description Giulio Fidente 2021-09-30 09:37:42 UTC
This bug was initially created as a copy of Bug #1838653

[1] The Glance API calls Cinder to create a volume. The volume is created, but because image_size is 0 (presumably), glance_store extends it incrementally, starting at 2 GB. In this case, the qcow2 image has a virtual_size of 8 GB.

[2] Cinder fails to resize the volume because the requested size would shrink the image.

This is not reproducible with raw images, only qcow2.
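
The flow in [1] can be summarized with a short sketch (illustrative only, not the actual glance_store cinder driver code; the function and parameter names below are made up for clarity):

~~~
# Hedged sketch of the resize-before-write behaviour described above.
# This is NOT the real glance_store cinder driver; the names
# (stream_image_of_unknown_size, extend_volume_gb, ...) are illustrative.

GB = 1024 ** 3


def stream_image_of_unknown_size(image_chunks, volume_file, extend_volume_gb):
    """Stream image data whose size is reported as 0 into a Cinder volume,
    starting at 1 GB and growing the volume 1 GB at a time (the
    "resize-before-write for each GB" path mentioned in the log below)."""
    size_gb = 1
    bytes_written = 0
    for chunk in image_chunks:
        # Extend the volume before a write would overflow its current size.
        while bytes_written + len(chunk) > size_gb * GB:
            size_gb += 1
            # On an NFS-backed Cinder driver this extend ends up running
            # "qemu-img resize <volume file> <size_gb>G". Because the qcow2
            # data already written declares virtual_size=8G, qemu-img treats
            # the resize to 2G as a shrink and refuses -- see log [2].
            extend_volume_gb(size_gb)
        volume_file.write(chunk)
        bytes_written += len(chunk)
    return bytes_written
~~~

With a raw image the data carries no embedded virtual size, so each 1 GB extend succeeds; with a qcow2 image the header written into the volume reports the larger virtual_size, which is why only qcow2 uploads fail.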

Version-Release number of selected component (if applicable):
containers_ga-cinder-api                  16.0-90
containers_ga-cinder-volume               16.0-90
containers_ga-glance-api                  16.0-91
containers_ga-cinder-scheduler            16.0-92

How reproducible:
All the time

Steps to Reproduce:
1. Configure glance_store with a Cinder backend (see the configuration sketch below)
2. Upload a qcow2 image
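
A minimal reproduction sketch (the option names come from the glance_store cinder driver, but all values below are placeholders and real deployments may use different auth settings):

~~~
# /etc/glance/glance-api.conf (placeholder values)
[glance_store]
stores = cinder
default_store = cinder
cinder_catalog_info = volumev3::publicURL
cinder_store_auth_address = http://192.0.2.10:5000/v3
cinder_store_user_name = glance
cinder_store_password = <secret>
cinder_store_project_name = service

# Then upload a qcow2 image, e.g. one with an 8 GB virtual_size as in this report:
# openstack image create --disk-format qcow2 --container-format bare \
#     --file rhel-guest-image.qcow2 test-qcow2-on-cinder
~~~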

Actual results:
The upload fails because the incremental extend effectively asks Cinder to shrink the volume.

Expected results:
The qcow2 image should upload successfully.

Additional info:

[1]
~~~
2020-05-18 13:45:08.589 34 DEBUG eventlet.wsgi.server [-] (34) accepted ('192.168.66.224', 52042) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:08.591 34 DEBUG glance.api.middleware.version_negotiation [-] Determining version of request: PUT /v2/images/674edd04-4602-4ee2-b9e8-afaf691cba16/file Accept: */* process_request /usr/lib/python3.6/site-packages/glance/api/middleware/version_negotiation.py:45
2020-05-18 13:45:08.592 34 DEBUG glance.api.middleware.version_negotiation [-] Using url versioning process_request /usr/lib/python3.6/site-packages/glance/api/middleware/version_negotiation.py:57
2020-05-18 13:45:08.592 34 DEBUG glance.api.middleware.version_negotiation [-] Matched version: v2 process_request /usr/lib/python3.6/site-packages/glance/api/middleware/version_negotiation.py:69
2020-05-18 13:45:08.593 34 DEBUG glance.api.middleware.version_negotiation [-] new path /v2/images/674edd04-4602-4ee2-b9e8-afaf691cba16/file process_request /usr/lib/python3.6/site-packages/glance/api/middleware/version_negotiation.py:70
2020-05-18 13:45:08.685 34 DEBUG glance_store._drivers.cinder [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Cinderclient connection created for user glance using URL: http://192.168.66.207:5000/v3. get_cinderclient /usr/lib/python3.6/site-packages/glance_store/_drivers/cinder.py:375
2020-05-18 13:45:08.686 34 DEBUG glance_store._drivers.cinder [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Creating a new volume: image_size=0 size_gb=1 type=None add /usr/lib/python3.6/site-packages/glance_store/_drivers/cinder.py:695
2020-05-18 13:45:08.686 34 INFO glance_store._drivers.cinder [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Since image size is zero, we will be doing resize-before-write for each GB which will be considerably slower than normal.
2020-05-18 13:45:10.332 25 DEBUG eventlet.wsgi.server [-] (25) accepted ('192.168.66.148', 34534) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:10.336 29 DEBUG eventlet.wsgi.server [-] (29) accepted ('192.168.66.224', 52126) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:10.337 34 DEBUG eventlet.wsgi.server [-] (34) accepted ('192.168.66.201', 60758) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:10.338 25 INFO eventlet.wsgi.server [-] 192.168.66.148 - - [18/May/2020 13:45:10] "GET /healthcheck HTTP/1.0" 200 137 0.004603
2020-05-18 13:45:10.342 29 INFO eventlet.wsgi.server [-] 192.168.66.224 - - [18/May/2020 13:45:10] "GET /healthcheck HTTP/1.0" 200 137 0.004269
2020-05-18 13:45:10.344 34 INFO eventlet.wsgi.server [-] 192.168.66.201 - - [18/May/2020 13:45:10] "GET /healthcheck HTTP/1.0" 200 137 0.004926
2020-05-18 13:45:12.252 34 DEBUG os_brick.utils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] ==> get_connector_properties: call "{'root_helper': 'sudo glance-rootwrap /etc/glance/rootwrap.conf', 'my_ip': 'txslst02nce-controller-0', 'multipath': False, 'enforce_multipath': False, 'host': None, 'execute': None}" trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:146
2020-05-18 13:45:12.255 29557 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:372
2020-05-18 13:45:12.270 29557 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:409
2020-05-18 13:45:12.271 29557 DEBUG oslo.privsep.daemon [-] privsep: reply[139807552648912]: (4, ('InitiatorName=iqn.1994-05.com.redhat:913cc589542f\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:475
2020-05-18 13:45:12.272 34 DEBUG os_brick.initiator.linuxfc [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] No Fibre Channel support detected on system. get_fc_hbas /usr/lib/python3.6/site-packages/os_brick/initiator/linuxfc.py:134
2020-05-18 13:45:12.272 34 DEBUG os_brick.initiator.linuxfc [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] No Fibre Channel support detected on system. get_fc_hbas /usr/lib/python3.6/site-packages/os_brick/initiator/linuxfc.py:134
2020-05-18 13:45:12.275 29557 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /sys/class/dmi/id/product_uuid execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:372
2020-05-18 13:45:12.286 29557 DEBUG oslo_concurrency.processutils [-] CMD "cat /sys/class/dmi/id/product_uuid" returned: 0 in 0.011s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:409
2020-05-18 13:45:12.286 29557 DEBUG oslo.privsep.daemon [-] privsep: reply[139807552648912]: (4, ('2f772d41-0a46-4678-b173-76ddab9fe358\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:475
2020-05-18 13:45:12.288 34 DEBUG os_brick.utils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] <== get_connector_properties: return (35ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': 'txslst02nce-controller-0', 'host': 'txslst02nce-controller-0', 'multipath': False, 'initiator': 'iqn.1994-05.com.redhat:913cc589542f', 'do_local_attach': False, 'system uuid': '2f772d41-0a46-4678-b173-76ddab9fe358'} trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:170
2020-05-18 13:45:12.338 34 DEBUG eventlet.wsgi.server [-] (34) accepted ('192.168.66.148', 34640) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:12.344 34 INFO eventlet.wsgi.server [-] 192.168.66.148 - - [18/May/2020 13:45:12] "GET /healthcheck HTTP/1.0" 200 137 0.004465
2020-05-18 13:45:12.344 31 DEBUG eventlet.wsgi.server [-] (31) accepted ('192.168.66.224', 52218) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:12.345 26 DEBUG eventlet.wsgi.server [-] (26) accepted ('192.168.66.201', 60844) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:12.350 31 INFO eventlet.wsgi.server [-] 192.168.66.224 - - [18/May/2020 13:45:12] "GET /healthcheck HTTP/1.0" 200 137 0.004284
2020-05-18 13:45:12.351 26 INFO eventlet.wsgi.server [-] 192.168.66.201 - - [18/May/2020 13:45:12] "GET /healthcheck HTTP/1.0" 200 137 0.004348
2020-05-18 13:45:12.742 34 DEBUG os_brick.initiator.connector [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Factory for nfs on None factory /usr/lib/python3.6/site-packages/os_brick/initiator/connector.py:279
2020-05-18 13:45:12.744 34 DEBUG os_brick.initiator.connectors.remotefs [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] ==> connect_volume: call "{'self': <os_brick.initiator.connectors.remotefs.RemoteFsConnector object at 0x7f277b5f2a90>, 'connection_properties': {'export': '192.168.76.99:/stack2_nfs_2', 'name': 'volume-9e718bf8-8fc4-42e6-be6e-7507425937d2', 'options': None, 'format': 'raw', 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:146
2020-05-18 13:45:12.744 34 DEBUG oslo_concurrency.processutils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Running cmd (subprocess): mount execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:372
2020-05-18 13:45:12.764 34 DEBUG oslo_concurrency.processutils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] CMD "mount" returned: 0 in 0.020s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:409
2020-05-18 13:45:12.766 34 DEBUG os_brick.remotefs.remotefs [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Already mounted: /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6 mount /usr/lib/python3.6/site-packages/os_brick/remotefs/remotefs.py:100
2020-05-18 13:45:12.767 34 DEBUG os_brick.initiator.connectors.remotefs [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] <== connect_volume: return (22ms) {'path': '/var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2'} trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:170
2020-05-18 13:45:13.366 34 DEBUG oslo_concurrency.processutils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Running cmd (subprocess): sudo glance-rootwrap /etc/glance/rootwrap.conf chown 42415 /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:372
2020-05-18 13:45:13.764 34 DEBUG oslo_concurrency.processutils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] CMD "sudo glance-rootwrap /etc/glance/rootwrap.conf chown 42415 /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2" returned: 0 in 0.398s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:409
[...]
2020-05-18 13:45:30.763 34 DEBUG oslo_concurrency.processutils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Running cmd (subprocess): sudo glance-rootwrap /etc/glance/rootwrap.conf chown 65534 /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:372
2020-05-18 13:45:31.212 34 DEBUG oslo_concurrency.processutils [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] CMD "sudo glance-rootwrap /etc/glance/rootwrap.conf chown 65534 /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2" returned: 0 in 0.449s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:409
2020-05-18 13:45:31.366 34 DEBUG os_brick.initiator.connectors.remotefs [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] ==> disconnect_volume: call "{'self': <os_brick.initiator.connectors.remotefs.RemoteFsConnector object at 0x7f277b5f2a90>, 'connection_properties': {'export': '192.168.76.99:/stack2_nfs_2', 'name': 'volume-9e718bf8-8fc4-42e6-be6e-7507425937d2', 'options': None, 'format': 'raw', 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}, 'device_info': {'path': '/var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2'}, 'force': False, 'ignore_errors': False}" trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:146
2020-05-18 13:45:31.366 34 DEBUG os_brick.initiator.connectors.remotefs [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] <== disconnect_volume: return (0ms) None trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:170
2020-05-18 13:45:31.873 34 DEBUG glance_store._drivers.cinder [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Extending volume 9e718bf8-8fc4-42e6-be6e-7507425937d2 to 2 GB. add /usr/lib/python3.6/site-packages/glance_store/_drivers/cinder.py:733
2020-05-18 13:45:32.416 24 DEBUG eventlet.wsgi.server [-] (24) accepted ('192.168.66.148', 35566) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:32.417 28 DEBUG eventlet.wsgi.server [-] (28) accepted ('192.168.66.224', 53116) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:32.421 26 DEBUG eventlet.wsgi.server [-] (26) accepted ('192.168.66.201', 33612) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-05-18 13:45:32.422 24 INFO eventlet.wsgi.server [-] 192.168.66.148 - - [18/May/2020 13:45:32] "GET /healthcheck HTTP/1.0" 200 137 0.004758
2020-05-18 13:45:32.423 28 INFO eventlet.wsgi.server [-] 192.168.66.224 - - [18/May/2020 13:45:32] "GET /healthcheck HTTP/1.0" 200 137 0.004617
2020-05-18 13:45:32.427 26 INFO eventlet.wsgi.server [-] 192.168.66.201 - - [18/May/2020 13:45:32] "GET /healthcheck HTTP/1.0" 200 137 0.004280
2020-05-18 13:45:32.947 34 ERROR glance_store._drivers.cinder [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] The status of volume 9e718bf8-8fc4-42e6-be6e-7507425937d2 is unexpected: status = error_extending, expected = available.
2020-05-18 13:45:32.947 34 ERROR glance_store._drivers.cinder [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Failed to write to volume 9e718bf8-8fc4-42e6-be6e-7507425937d2.: glance_store.exceptions.StorageFull: There is not enough disk space on the image storage media.
2020-05-18 13:45:33.103 34 ERROR glance.api.v2.image_data [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] Failed to upload image data due to HTTP error: webob.exc.HTTPRequestEntityTooLarge: Image storage media is full: There is not enough disk space on the image storage media.
2020-05-18 13:45:34.026 34 INFO eventlet.wsgi.server [req-e7298df7-ba7e-41ca-9e00-30c74ca489b3 2119753b12fb46ef96dfe741de358ebf 32acd4356aee41cf93f7038e588f714f - default default] 192.168.66.224 - - [18/May/2020 13:45:34] "PUT /v2/images/674edd04-4602-4ee2-b9e8-afaf691cba16/file HTTP/1.1" 413 444 25.434945
~~~

[2]
~~~
2020-05-18 13:45:32.196 81 INFO cinder.volume.drivers.netapp.dataontap.nfs_base [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] Extending volume volume-9e718bf8-8fc4-42e6-be6e-7507425937d2.
2020-05-18 13:45:32.197 81 DEBUG cinder.volume.drivers.netapp.dataontap.nfs_base [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] Checking file for resize _resize_image_file /usr/lib/python3.6/site-packages/cinder/volume/drivers/netapp/dataontap/nfs_base.py:649
2020-05-18 13:45:32.198 81 DEBUG oslo_concurrency.processutils [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C qemu-img info /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:372
2020-05-18 13:45:32.353 81 DEBUG oslo_concurrency.processutils [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C qemu-img info /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2" returned: 0 in 0.155s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:409
2020-05-18 13:45:32.356 81 INFO cinder.volume.drivers.netapp.dataontap.nfs_base [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] Resizing file to 2G
2020-05-18 13:45:32.357 81 DEBUG oslo_concurrency.processutils [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] Running cmd (subprocess): qemu-img resize /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 2G execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:372
2020-05-18 13:45:32.399 81 DEBUG oslo_concurrency.processutils [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] CMD "qemu-img resize /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 2G" returned: 1 in 0.042s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:409
2020-05-18 13:45:32.401 81 DEBUG oslo_concurrency.processutils [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] 'qemu-img resize /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 2G' failed. Not Retrying. execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:457
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager [req-9f08f128-80db-4dcf-a5f2-e034244c154a fe01d20dfa1d49cd9334f378806db021 73dbb175f8314acaa69a33236ba68e0f - default default] Extend volume failed.: cinder.exception.VolumeBackendAPIException: Bad or unexpected response from the storage volume backend API: Failed to extend volume volume-9e718bf8-8fc4-42e6-be6e-7507425937d2, Error msg: Unexpected error while running command.
Command: qemu-img resize /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 2G
Exit code: 1
Stdout: ''
Stderr: "qemu-img: warning: Shrinking an image will delete all data beyond the shrunken image's end. Before performing such an operation, make sure there is no important data there.\nqemu-img: Use the --shrink option to perform a shrink operation.\n".
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Traceback (most recent call last):
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/volume/drivers/netapp/dataontap/nfs_base.py", line 801, in extend_volume
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     self._resize_image_file(path, new_size)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/utils.py", line 727, in trace_method_logging_wrapper
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     return f(*args, **kwargs)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/volume/drivers/netapp/dataontap/nfs_base.py", line 655, in _resize_image_file
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     run_as_root=self._execute_as_root)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/image/image_utils.py", line 334, in resize_image
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     utils.execute(*cmd, run_as_root=run_as_root)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/utils.py", line 126, in execute
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     return processutils.execute(*cmd, **kwargs)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py", line 424, in execute
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     cmd=sanitized_cmd)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Command: qemu-img resize /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 2G
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Exit code: 1
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Stdout: ''
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Stderr: "qemu-img: warning: Shrinking an image will delete all data beyond the shrunken image's end. Before performing such an operation, make sure there is no important data there.\nqemu-img: Use the --shrink option to perform a shrink operation.\n"
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager During handling of the above exception, another exception occurred:
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Traceback (most recent call last):
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/volume/manager.py", line 2733, in extend_volume
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     self.driver.extend_volume(volume, new_size)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/utils.py", line 727, in trace_method_logging_wrapper
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     return f(*args, **kwargs)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager   File "/usr/lib/python3.6/site-packages/cinder/volume/drivers/netapp/dataontap/nfs_base.py", line 807, in extend_volume
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager     raise exception.VolumeBackendAPIException(data=exception_msg)
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager cinder.exception.VolumeBackendAPIException: Bad or unexpected response from the storage volume backend API: Failed to extend volume volume-9e718bf8-8fc4-42e6-be6e-7507425937d2, Error msg: Unexpected error while running command.
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Command: qemu-img resize /var/lib/cinder/mnt/04e1bdb760091e51531be717730c8ab6/volume-9e718bf8-8fc4-42e6-be6e-7507425937d2 2G
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Exit code: 1
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Stdout: ''
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager Stderr: "qemu-img: warning: Shrinking an image will delete all data beyond the shrunken image's end. Before performing such an operation, make sure there is no important data there.\nqemu-img: Use the --shrink option to perform a shrink operation.\n".
2020-05-18 13:45:32.404 81 ERROR cinder.volume.manager
~~~
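
For reference, the refusal in [2] can be demonstrated with qemu-img alone, under the assumption that the Cinder volume file already contains the qcow2 data (paths below are placeholders):

~~~
$ qemu-img create -f qcow2 /tmp/demo-volume 8G   # qcow2 header declares an 8 GiB virtual size
$ qemu-img info /tmp/demo-volume                 # reports a virtual size of 8 GiB
$ qemu-img resize /tmp/demo-volume 2G            # refused as a shrink; qemu-img asks for --shrink, as in the traceback above
~~~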

Comment 5 Cyril Roelandt 2022-01-12 16:29:12 UTC

*** This bug has been marked as a duplicate of bug 2004316 ***

