Bug 2081319 - Share stuck in "error_deleting" status after running "test_access_rules_metadata" tests
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-manila
Version: 17.0 (Wallaby)
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: ---
Assignee: Goutham Pacha Ravi
QA Contact: vhariria
Docs Contact: Erin Peterson
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2022-05-03 12:08 UTC by lkuchlan
Modified: 2022-09-21 12:21 UTC
CC List: 3 users

Fixed In Version: openstack-manila-12.1.3-0.20220517120843.3d844d6.el9ost
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2022-09-21 12:20:53 UTC
Target Upstream Version:
Embargoed:


Links
System ID Private Priority Status Summary Last Updated
Launchpad 1971530 0 None None None 2022-05-04 06:20:53 UTC
OpenStack gerrit 841481 0 None MERGED [Native CephFS] Don't fail to deny missing rules 2022-05-24 14:27:59 UTC
Red Hat Issue Tracker OSP-14994 0 None None None 2022-05-03 12:08:51 UTC
Red Hat Product Errata RHEA-2022:6543 0 None None None 2022-09-21 12:21:05 UTC

Description lkuchlan 2022-05-03 12:08:18 UTC
Description of problem:
Test "test_set_get_delete_access_metadata" is failed with timeout error 
because the waiter method is waiting for the share resource to be deleted, 
however, the share stuck in "error_deleting" status.

I suspect what is happening is that the access rules of the share instance are blocked
by the locked_access_rules_operation [1] function.
That becomes a problem because a single share is reused for all test cases,
so a test may try to create an access rule on the share
while the share instance's access rules are still locked.
As a result, the access rule ends up in "error" status, and after the cleanup stage
the share is stuck in "error_deleting" state.


[1] https://github.com/openstack/manila/blob/master/manila/share/access.py#L28
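
For context, locked_access_rules_operation serializes access-rule operations per share
instance. The following is a paraphrased, illustrative sketch (not the exact manila code)
of how such a per-instance lock decorator can be built with oslo.concurrency:

    # Illustrative sketch of a per-share-instance lock decorator in the spirit of
    # manila's locked_access_rules_operation; names and details are paraphrased.
    from oslo_concurrency import lockutils

    def locked_access_rules_operation(operation):
        """Serialize access-rule operations on the same share instance."""
        def wrapped(*args, **kwargs):
            instance_id = kwargs.get('share_instance_id')

            # One external lock per share instance: concurrent allow/deny calls
            # against the same instance queue up behind each other.
            @lockutils.synchronized(
                "locked_access_rules_operation_by_share_instance_%s" % instance_id,
                external=True)
            def locked_operation(*_args, **_kwargs):
                return operation(*_args, **_kwargs)

            return locked_operation(*args, **kwargs)
        return wrapped

Since all test cases in AccessRulesMetadataTest reuse one share, rule operations on that
share instance serialize behind this lock, which is consistent with the blocking behavior
described above.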

Version-Release number of selected component (if applicable):
openstack-manila-12.1.2-0.20220404161416.f5b2cdc.el9ost.noarch

How reproducible:
100%


Steps to Reproduce:
1. stestr run manila_tempest_tests.tests.api.test_access_rules_metadata.AccessRulesMetadataTest

Actual results:
Share is stuck in "error_deleting" status

Expected results:
Share should be deleted successfully

Additional info:

From manila-share.log
======================
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server [req-b2d6f069-58ab-4950-ba97-0655c5e19825 7db6eb0754d543fd96372845c5d4758b 5e8574ca66f44c16b521dac0185f1091 - - -] Exception during message handling: manila.exception.ShareBackendException: json_command failed - prefix=fs subvolume authorize, argdict={'vol_name': 'cephfs', 'sub_name': '859aa3a1-e5af-4eb4-8440-29e263689b55', 'auth_id': 'Joe', 'tenant_id': '5e8574ca66f44c16b521dac0185f1091', 'access_level': 'rw', 'format': 'json'} - exception message: [errno -1] auth ID: Joe is already in use.
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/drivers/cephfs/driver.py", line 193, in rados_command
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     raise rados.Error(outs, ret)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server rados.Error: [errno -1] auth ID: Joe is already in use
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server 
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server 
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/manager.py", line 219, in wrapped
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     return f(self, *args, **kwargs)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/utils.py", line 578, in wrapper
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     return func(self, *args, **kwargs)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/manager.py", line 3920, in update_access
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     self.update_access_for_instances(context, [share_instance_id],
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/manager.py", line 3934, in update_access_for_instances
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     self.access_helper.update_access_rules(
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/access.py", line 301, in update_access_rules
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     self._update_access_rules(context, share_instance_id,
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/access.py", line 338, in _update_access_rules
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     driver_rule_updates = self._update_rules_through_share_driver(
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/access.py", line 403, in _update_rules_through_share_driver
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     driver_rule_updates = self.driver.update_access(
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/drivers/cephfs/driver.py", line 511, in update_access
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     return self.protocol_helper.update_access(
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/drivers/cephfs/driver.py", line 896, in update_access
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     access_key = self._allow_access(context, share, rule)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/drivers/cephfs/driver.py", line 822, in _allow_access
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     auth_result = rados_command(
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/manila/share/drivers/cephfs/driver.py", line 205, in rados_command
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server     raise exception.ShareBackendException(msg)
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server manila.exception.ShareBackendException: json_command failed - prefix=fs subvolume authorize, argdict={'vol_name': 'cephfs', 'sub_name': '859aa3a1-e5af-4eb4-8440-29e263689b55', 'auth_id': 'Joe', 'tenant_id': '5e8574ca66f44c16b521dac0185f1091', 'access_level': 'rw', 'format': 'json'} - exception message: [errno -1] auth ID: Joe is already in use.
2022-05-02 11:42:40.973 9 ERROR oslo_messaging.rpc.server
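
The merged fix linked above, "[Native CephFS] Don't fail to deny missing rules"
(gerrit 841481), addresses the stuck deletion by making the deny path tolerant of rules
that were never applied on the backend. Below is a rough, self-contained sketch of that
idea; deny_access, BackendError, and run_backend_command are illustrative stand-ins,
not the actual driver code:

    # Hypothetical sketch of "don't fail to deny missing rules"; the real change is
    # in manila's CephFS driver and uses the driver's own rados_command plumbing.
    import logging

    LOG = logging.getLogger(__name__)


    class BackendError(Exception):
        """Stand-in for the driver's backend exception type."""


    def deny_access(run_backend_command, auth_id, subvolume):
        """Revoke auth_id on subvolume, tolerating rules that were never applied.

        run_backend_command is a stand-in for the driver's backend command helper.
        """
        try:
            run_backend_command("fs subvolume deauthorize",
                                {"sub_name": subvolume, "auth_id": auth_id})
        except BackendError as exc:
            # If the backend never created this auth ID (the earlier allow call
            # failed), denying it is a no-op rather than an error; failing here
            # is what left the share stuck in "error_deleting".
            if "doesn't exist" in str(exc).lower():
                LOG.warning("Auth ID %s not present on %s; nothing to deny.",
                            auth_id, subvolume)
                return
            raise

In the failure above, the allow call for auth ID "Joe" errored out, so, going by the fix
title, there is nothing to deauthorize at cleanup time; treating that case as a no-op
lets the share be deleted instead of ending up in "error_deleting".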

Comment 8 errata-xmlrpc 2022-09-21 12:20:53 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Release of components for Red Hat OpenStack Platform 17.0 (Wallaby)), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2022:6543

