+++ This bug was initially created as a clone of Bug #1787461 +++

Description of problem:
The tempest cleanup utility does not always delete the OpenStack resources that a test run leaves behind.

**Reproduce Steps:
$ tempest cleanup --init-saved-state
$ execute selected manila upstream api tests
$ tempest cleanup

**User impact:
Subsequent test runs may fail because the leftover resources exhaust quotas or available IP addresses.

**Failure example:
setUpClass (manila_tempest_tests.tests.api.admin.test_user_messages.UserMessageTest)
------------------------------------------------------------------------------------

Captured traceback:
~~~~~~~~~~~~~~~~~~~
    b'Traceback (most recent call last):'
    b'  File "/usr/lib/python3.6/site-packages/tempest/test.py", line 173, in setUpClass'
    b'    six.reraise(etype, value, trace)'
    b'  File "/usr/lib/python3.6/site-packages/six.py", line 675, in reraise'
    b'    raise value'
    b'  File "/usr/lib/python3.6/site-packages/tempest/test.py", line 158, in setUpClass'
    b'    cls.setup_credentials()'
    b'  File "/usr/lib/python3.6/site-packages/tempest/test.py", line 377, in setup_credentials'
    b'    credential_type=credentials_type)'
    b'  File "/usr/lib/python3.6/site-packages/tempest/test.py", line 682, in get_client_manager'
    b'    creds = getattr(cred_provider, credentials_method)()'
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/common/dynamic_creds.py", line 356, in get_admin_creds'
    b"    return self.get_credentials('admin')"
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/common/dynamic_creds.py", line 345, in get_credentials'
    b'    credentials.tenant_id)'
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/common/dynamic_creds.py", line 262, in _create_network_resources'
    b'    router = self._create_router(router_name, tenant_id)'
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/common/dynamic_creds.py", line 323, in _create_router'
    b'    resp_body = self.routers_admin_client.create_router(**kwargs)'
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/services/network/routers_client.py", line 27, in create_router'
    b'    return self.create_resource(uri, post_body)'
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/services/network/base.py", line 61, in create_resource'
    b'    resp, body = self.post(req_uri, req_post_data)'
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/common/rest_client.py", line 283, in post'
    b"    return self.request('POST', url, extra_headers, headers, body, chunked)"
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/common/rest_client.py", line 679, in request'
    b'    self._error_checker(resp, resp_body)'
    b'  File "/usr/lib/python3.6/site-packages/tempest/lib/common/rest_client.py", line 800, in _error_checker'
    b'    raise exceptions.Conflict(resp_body, resp=resp)'
    b'tempest.lib.exceptions.Conflict: Conflict with state of target resource'
    b"Details: {'type': 'IpAddressGenerationFailure', 'message': 'No more IP addresses available on network 166b710f-cb4a-4eaa-af74-269964078dbe.', 'detail': ''}"
    b''
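The Conflict above is an IP-exhaustion symptom: the error names the public network (166b710f-cb4a-4eaa-af74-269964078dbe), which is consistent with ports left behind by earlier, uncleaned test runs still holding addresses there. As a quick diagnostic, Neutron's IP-availability extension can report how full that network is and which ports consume the addresses. This is a minimal sketch only, not part of tempest; it assumes openstacksdk is installed, that clouds.yaml has an admin entry named "overcloud" (a hypothetical name), and that the network-ip-availability extension is enabled:

# Diagnostic sketch, not part of tempest. Assumes openstacksdk and an admin
# clouds.yaml entry named "overcloud" (hypothetical name).
import openstack

conn = openstack.connect(cloud="overcloud")

# Network id copied from the IpAddressGenerationFailure message above.
PUBLIC_NET_ID = "166b710f-cb4a-4eaa-af74-269964078dbe"

# Requires the network-ip-availability API extension.
avail = conn.network.get_network_ip_availability(PUBLIC_NET_ID)
print(f"{avail.used_ips} of {avail.total_ips} IPs used on {PUBLIC_NET_ID}")

# List the ports that still hold addresses on the network.
for port in conn.network.ports(network_id=PUBLIC_NET_ID):
    ips = [ip["ip_address"] for ip in port.fixed_ips]
    print(port.id, port.device_owner, ips)

The screen captures below show the same leftover resources through the CLI.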
**Screen captures:
(overcloud) [stack@undercloud-0 testing_dir]$ ll saved_state.json
-rw-rw-r--. 1 stack stack 2675 May 23 17:56 saved_state.json

(overcloud) [stack@undercloud-0 testing_dir]$ openstack network list
+--------------------------------------+------------------------------------------+--------------------------------------+
| ID                                   | Name                                     | Subnets                              |
+--------------------------------------+------------------------------------------+--------------------------------------+
| 03e522a1-c3bd-421f-84ce-30bbf164dbd9 | tempest-share-service-1723054483-network | a0b7670c-b605-4504-9b4a-7e7f1af2032e |
| 0b53019a-0bcd-4351-8df0-6eb0fed3793e | tempest-share-service-1820279606-network | c63bf07c-8a27-4516-aa07-da24377912cd |
| 0fab7ada-920f-4847-87bf-40fd7b2d7882 | tempest-share-service-1322603070-network | a2529d4f-c6f1-46af-b73c-39d9d6dfd971 |
| 110d202f-e49a-4a81-87a8-92e88698c75e | tempest-share-service-1051501383-network | ebcf2d0f-30c4-4a84-8463-dace40089e20 |
| 166b710f-cb4a-4eaa-af74-269964078dbe | public                                   | d4e8879e-a11e-4a42-83b3-1fdccad633ae |
| 249d773b-f555-492d-a88c-8b59a5db0ac5 | tempest-share-service-1237827927-network | d40f3765-5fdd-4313-b991-0a9fbfdba00b |
...

(overcloud) [stack@undercloud-0 testing_dir]$ tempest cleanup
Begin cleanup
Process 0 projects

(overcloud) [stack@undercloud-0 testing_dir]$ openstack network list
+--------------------------------------+------------------------------------------+--------------------------------------+
| ID                                   | Name                                     | Subnets                              |
+--------------------------------------+------------------------------------------+--------------------------------------+
| 03e522a1-c3bd-421f-84ce-30bbf164dbd9 | tempest-share-service-1723054483-network | a0b7670c-b605-4504-9b4a-7e7f1af2032e |
| 0b53019a-0bcd-4351-8df0-6eb0fed3793e | tempest-share-service-1820279606-network | c63bf07c-8a27-4516-aa07-da24377912cd |
| 0fab7ada-920f-4847-87bf-40fd7b2d7882 | tempest-share-service-1322603070-network | a2529d4f-c6f1-46af-b73c-39d9d6dfd971 |
| 110d202f-e49a-4a81-87a8-92e88698c75e | tempest-share-service-1051501383-network | ebcf2d0f-30c4-4a84-8463-dace40089e20 |
| 166b710f-cb4a-4eaa-af74-269964078dbe | public                                   | d4e8879e-a11e-4a42-83b3-1fdccad633ae |
| 249d773b-f555-492d-a88c-8b59a5db0ac5 | tempest-share-service-1237827927-network | d40f3765-5fdd-4313-b991-0a9fbfdba00b |
...

Version-Release number of selected component (if applicable):
The current version for rhos-15: openstack-tempest-21.0.0-0.20191209200452.702b21c.el8ost.noarch.rpm

Additional info:
If a resource belongs to a project that has already been removed and the resource is in the project services category, it will not be cleaned up. However, some service resources can be deleted all at once, without iterating over the projects. More info is in the attached Launchpad bug.
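To illustrate the last point, service resources that survive their project, such as the tempest-share-service-* networks above, can be swept globally instead of per project. The sketch below is a hedged illustration, not the change shipped in tempest cleanup; it assumes openstacksdk, an admin clouds.yaml entry named "overcloud" (hypothetical name), and that the name prefix matches only test leftovers:

# Illustrative global sweep only; the real fix belongs in tempest cleanup.
# Assumes admin credentials in clouds.yaml under "overcloud" (hypothetical
# name) and that every network matching the prefix is test residue.
import openstack

conn = openstack.connect(cloud="overcloud")
PREFIX = "tempest-share-service-"

for net in conn.network.networks():
    if not (net.name or "").startswith(PREFIX):
        continue
    # Remove the network's non-router ports first; router interfaces would
    # have to be detached from their router, which this sketch skips.
    for port in list(conn.network.ports(network_id=net.id)):
        if (port.device_owner or "").startswith("network:router"):
            continue
        conn.network.delete_port(port, ignore_missing=True)
    conn.network.delete_network(net, ignore_missing=True)
    print(f"removed leftover network {net.name} ({net.id})")

Sweeping by name prefix is only reasonable on a throwaway test cloud; the proper fix is for tempest cleanup itself to handle orphaned service resources, as tracked in the Launchpad bug.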
Created attachment 1662400 [details]
verification output

The build listed in 'Fixed in version' contains the fix for this issue; it was verified on a 13 environment using the 2020-02-06.2 puddle. The verification output is attached.
Created attachment 1662401 [details]
verification output

The build listed in 'Fixed in version' contains the fix for this issue; it was verified on a 13 environment using the 2020-02-06.2 puddle. The verification output is attached.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2020:0769