Bug 1360424 - Cinder backup raises DeviceUnavailable for NFS volumes
Summary: Cinder backup raises DeviceUnavailable for NFS volumes
Keywords:
Status: CLOSED WORKSFORME
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-cinder
Version: 7.0 (Kilo)
Hardware: x86_64
OS: Linux
Priority: high
Severity: high
Target Milestone: async
Target Release: 7.0 (Kilo)
Assignee: Gorka Eguileor
QA Contact: nlevinki
URL:
Whiteboard:
Depends On: 1349005 1371695 1371696
Blocks:
 
Reported: 2016-07-26 17:12 UTC by Gorka Eguileor
Modified: 2019-11-14 08:48 UTC
CC: 6 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1349005
Environment:
Last Closed: 2017-05-17 02:43:17 UTC
Target Upstream Version:
Embargoed:
tshefi: automate_bug-


Attachments: None


Links:
Launchpad 1603537 (Last Updated: 2016-07-26 17:12:35 UTC)

Internal Links: 1371911

Description Gorka Eguileor 2016-07-26 17:12:36 UTC
+++ This bug was initially created as a clone of Bug #1349005 +++

Description of problem:

Backups to Swift fail for NFS volumes, raising DeviceUnavailable.

==> /var/log/cinder/backup.log <==
2016-07-19 13:19:52.530 4037 DEBUG oslo_concurrency.processutils [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] CMD "dd if=/var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b of=/dev/null count=1" returned: 1 in 0.724s execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:254
2016-07-19 13:19:52.533 4037 DEBUG oslo_concurrency.processutils [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] u'dd if=/var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b of=/dev/null count=1' failed. Not Retrying. execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:291
2016-07-19 13:19:52.534 4037 ERROR cinder.brick.initiator.connector [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] Failed to access the device on the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b: dd: failed to open /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b: Permission denied None.
2016-07-19 13:19:52.707 4037 ERROR oslo_messaging.rpc.dispatcher [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] Exception during message handling: The device in the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b is unavailable: Unable to access the backend storage via the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b.
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher Traceback (most recent call last):
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     executor_callback))
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     executor_callback)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 130, in _do_dispatch
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     result = func(ctxt, **new_args)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/backup/manager.py", line 301, in create_backup
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     'fail_reason': six.text_type(err)})
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 85, in __exit__
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/backup/manager.py", line 294, in create_backup
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     backup_service)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 743, in backup_volume
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     attach_info, volume = self._attach_volume(context, volume, properties)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 700, in _attach_volume
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     return (self._connect_device(conn), volume)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 724, in _connect_device
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     {'path': host_device}))
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher DeviceUnavailable: The device in the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b is unavailable: Unable to access the backend storage via the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b.
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher
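
The DeviceUnavailable above is raised when a simple read probe of the attached path fails. A minimal sketch of that check, for illustration only (this is not the actual code in cinder/volume/driver.py or the brick connector; names and signatures here are simplified):

import subprocess


class DeviceUnavailable(Exception):
    pass


def check_valid_device(path):
    # Probe the path the same way the log shows:
    # "dd if=<path> of=/dev/null count=1". A non-zero dd exit status
    # (raised here as CalledProcessError) corresponds to the
    # "Failed to access the device on the path ..." error line.
    try:
        subprocess.check_call(
            ['dd', 'if=%s' % path, 'of=/dev/null', 'count=1'])
        return True
    except subprocess.CalledProcessError:
        return False


def connect_device(path):
    # Mirrors the branch that raised in the traceback: the NFS volume file
    # exists, but the dd probe fails with "Permission denied", so the
    # driver-side check raises DeviceUnavailable with the path in the message.
    if not check_valid_device(path):
        raise DeviceUnavailable(
            'The device in the path %(path)s is unavailable: Unable to '
            'access the backend storage via the path %(path)s.'
            % {'path': path})

So the failure is not a missing device at all: the file is there on the NFS mount, but the backup process cannot read it.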

Comment 1 Gorka Eguileor 2016-07-26 17:15:44 UTC
The problem is caused by inconsistent behavior of nas_secure_file_operations when it runs in auto mode.

A workaround is to disable it in the specific driver's configuration section:

[mydriversection]
nas_secure_file_operations=false
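
To confirm the permission mismatch before or after applying the workaround, the volume file's ownership and readability can be checked as the cinder user. A minimal diagnostic sketch (not part of Cinder), assuming the path from the log above; replace it with your own mount point and volume file:

import os
import pwd

# Path taken from this bug's backup.log; adjust for your environment.
VOLUME_PATH = ('/var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/'
               'volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b')

st = os.stat(VOLUME_PATH)
print('owner: %s (uid %d), mode: %o'
      % (pwd.getpwuid(st.st_uid).pw_name, st.st_uid, st.st_mode & 0o777))
# If this prints False when run as the cinder user, the dd probe in the
# backup service will fail with Permission denied, as seen in the log.
print('readable by current user: %s' % os.access(VOLUME_PATH, os.R_OK))

A root-owned file that is not readable by the cinder user would explain the Permission denied hit by the dd probe. Note that the cinder volume and backup services need to pick up the configuration change (typically via a service restart) before retrying the backup.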

Comment 2 Paul Grist 2017-05-17 02:43:17 UTC
Closing this clone of the BZ; the associated case was also closed. There are additional BZs tagged with cinder_nas_secure to track the remaining issues.

Comment 3 Tzach Shefi 2019-04-29 14:44:32 UTC
Nothing to test/automate per the close loop, as this was closed WORKSFORME.

