Bug 1360424

Summary: Cinder backup raises DeviceUnavailable for NFS volumes
Product: Red Hat OpenStack
Reporter: Gorka Eguileor <geguileo>
Component: openstack-cinder
Assignee: Gorka Eguileor <geguileo>
Status: CLOSED WORKSFORME
QA Contact: nlevinki <nlevinki>
Severity: high
Docs Contact:
Priority: high
Version: 7.0 (Kilo)
CC: abishop, cschwede, eharney, geguileo, pgrist, tshefi
Target Milestone: async
Keywords: Triaged, ZStream
Target Release: 7.0 (Kilo)
Flags: tshefi: automate_bug-
Hardware: x86_64   
OS: Linux   
Clone Of: 1349005
Last Closed: 2017-05-17 02:43:17 UTC
Type: Bug
Bug Depends On: 1349005, 1371695, 1371696    

Description Gorka Eguileor 2016-07-26 17:12:36 UTC
+++ This bug was initially created as a clone of Bug #1349005 +++

Description of problem:

Backups to Swift fail, raising DeviceUnavailable for NFS volumes.

==> /var/log/cinder/backup.log <==
2016-07-19 13:19:52.530 4037 DEBUG oslo_concurrency.processutils [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] CMD "dd if=/var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b of=/dev/null count=1" returned: 1 in 0.724s execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:254
2016-07-19 13:19:52.533 4037 DEBUG oslo_concurrency.processutils [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] u'dd if=/var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b of=/dev/null count=1' failed. Not Retrying. execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:291
2016-07-19 13:19:52.534 4037 ERROR cinder.brick.initiator.connector [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] Failed to access the device on the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b: dd: failed to open /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b: Permission denied None.
2016-07-19 13:19:52.707 4037 ERROR oslo_messaging.rpc.dispatcher [req-dda7d934-4344-476a-a07c-05fd2462938d 95520872a30443688f73eb48af516de3 a3a014b0248a4c6e86207427674269db - - -] Exception during message handling: The device in the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b is unavailable: Unable to access the backend storage via the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b.
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher Traceback (most recent call last):
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     executor_callback))
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     executor_callback)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 130, in _do_dispatch
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     result = func(ctxt, **new_args)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/backup/manager.py", line 301, in create_backup
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     'fail_reason': six.text_type(err)})
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 85, in __exit__
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/backup/manager.py", line 294, in create_backup
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     backup_service)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 743, in backup_volume
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     attach_info, volume = self._attach_volume(context, volume, properties)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 700, in _attach_volume
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     return (self._connect_device(conn), volume)
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher   File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 724, in _connect_device
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher     {'path': host_device}))
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher DeviceUnavailable: The device in the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b is unavailable: Unable to access the backend storage via the path /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b.
2016-07-19 13:19:52.707 4037 TRACE oslo_messaging.rpc.dispatcher
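
The failure comes from the accessibility probe the attach path runs before handing the device to the backup driver (the dd command visible in the log above). It can be reproduced by hand to confirm it is a plain permission problem on the NFS mount rather than a missing file; this is only a diagnostic sketch, and the assumption that the service runs as the cinder user is mine, not taken from the log:

# Inspect ownership and mode of the volume file the probe failed on
ls -l /var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b

# Re-run the same probe as root and as the (assumed) cinder service user;
# whichever one prints "Permission denied" matches the failure above
dd if=/var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b of=/dev/null count=1
sudo -u cinder dd if=/var/lib/cinder/mnt/b069a4f25c0d852979dcb962468c1829/volume-e5e8dff8-ab7c-4ef5-83ac-6dd8376d472b of=/dev/null count=1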

Comment 1 Gorka Eguileor 2016-07-26 17:15:44 UTC
The problem is caused by inconsistent behavior of nas_secure_file_operations when it is left in auto mode.
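
nas_secure_file_operations defaults to auto when it is not set explicitly, so a backend can end up in secure mode without any entry in cinder.conf. A quick way to check whether the option is being set anywhere for the affected backend (standard config path assumed):

# Show every explicit nas_secure setting; no output means the default 'auto' is in effect
grep -n nas_secure /etc/cinder/cinder.conf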

A workaround is to explicitly disable it in the specific driver's configuration section:

[mydriversection]
nas_secure_file_operations=false
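
For completeness, one way to apply that workaround on a deployed node; the backend section name below is just the placeholder from the snippet above, and crudini is optional (editing cinder.conf directly works just as well):

# Set the option in the backend section and restart the affected services
crudini --set /etc/cinder/cinder.conf mydriversection nas_secure_file_operations false
systemctl restart openstack-cinder-volume openstack-cinder-backup

Note that volume files already created while secure mode was active keep their existing ownership, so permissions on the share may still need to be checked after the change.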

Comment 2 Paul Grist 2017-05-17 02:43:17 UTC
Closing this clone of the BZ; the associated case has also been closed. There are additional BZs tagged with cinder_nas_secure to work the remaining issues.

Comment 3 Tzach Shefi 2019-04-29 14:44:32 UTC
Nothing to test or automate per the close loop, as this was closed WORKSFORME.