Bug 1477160 - Tempest does not properly clean up resources after a failed run
Summary: Tempest does not properly clean up resources after a failed run
Keywords:
Status: CLOSED WORKSFORME
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-tempest
Version: unspecified
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: ---
Assignee: Chandan Kumar
QA Contact: Martin Kopec
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2017-08-01 11:32 UTC by Ganesh Kadam
Modified: 2020-12-14 09:17 UTC
CC List: 10 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2017-09-12 11:01:36 UTC
Target Upstream Version:
Embargoed:


Attachments
tempest-ports o/p (15.51 KB, text/plain), 2017-08-01 11:32 UTC, Ganesh Kadam
tempest-networks o/p (3.67 KB, text/plain), 2017-08-01 11:33 UTC, Ganesh Kadam
tempest-routers o/p (2.59 KB, text/plain), 2017-08-01 11:33 UTC, Ganesh Kadam
saved_state o/p (2.09 KB, text/plain), 2017-08-01 11:37 UTC, Ganesh Kadam
tempest.log for the reproduced steps (135.62 KB, text/plain), 2017-08-10 04:18 UTC, Ganesh Kadam

Description Ganesh Kadam 2017-08-01 11:32:36 UTC
Created attachment 1307532 [details]
tempest-ports o/p

Description of problem:

Running 'tempest cleanup --init-saved-state' does not create a JSON file that records all the necessary resources, and running 'tempest cleanup' afterwards does not remove disk/network/etc. resources. The only resources cleaned up are users, groups, and flavors.

Version-Release number of selected component (if applicable):
RHOSP 10 

How reproducible/Steps to Reproduce:

Run 'tempest cleanup --init-saved-state', then run any set of tempest tests that includes a failure, then run 'tempest cleanup' with the given saved state. Tempest is then unable to clean up resources it created outside of users/projects.
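
For example (the regex below is just a placeholder; any test selection that includes a failure will do):

$ tempest cleanup --init-saved-state        # writes saved_state.json in the current directory
$ tempest run --regex tempest.api.network   # placeholder regex; any run with failures reproduces it
$ tempest cleanup                           # expected to remove everything the run created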


Actual results:

Tempest does not clean up all the resources after a failed test run.

Expected results:

Tempest should clean up the resources.

Comment 1 Ganesh Kadam 2017-08-01 11:33:10 UTC
Created attachment 1307533 [details]
tempest-networks o/p

Comment 2 Ganesh Kadam 2017-08-01 11:33:36 UTC
Created attachment 1307534 [details]
tempest-routers o/p

Comment 3 Ganesh Kadam 2017-08-01 11:37:58 UTC
Created attachment 1307536 [details]
saved_state o/p

Comment 4 Chandan Kumar 2017-08-01 11:56:11 UTC
Hello Ganesh,

(In reply to Ganesh Kadam from comment #0)
> Created attachment 1307532 [details]
> tempest-ports o/p
> 

Below are my inline comments:

> Description of problem:
> 
> Tempest cleanup --init-saved-state does not create a JSON that contains all
> the resources necessary, and running tempest cleanup subsequently does not
> remove disk/network/etc resources. The only resources cleaned up are users,
> groups, and flavors.
> 

[1.] Do we have a traceback for the same?

As per the given doc: https://access.redhat.com/documentation/en-us/red_hat_openstack_platform/10/html-single/openstack_integration_test_suite_guide/#Cleaning_Tempest_Resources

Running the 'tempest cleanup --init-saved-state' command saves the state of the OpenStack deployment by recording the UUIDs of all the resources and tenants in a saved_state.json file in the directory where the command is invoked.

When you run the 'tempest cleanup' command again on its own, it will delete only the tempest resources created after that point.

Please check the above doc before running it.
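
For reference, saved_state.json is plain JSON mapping the UUIDs of pre-existing resources to their names; a quick look shows what cleanup will treat as preserved (illustrative output only, the exact keys vary across tempest versions):

$ cat saved_state.json
{"projects": {"<uuid>": "admin", ...}, "users": {...}, "roles": {...}, "flavors": {...}, "images": {...}}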


Thanks,

Chandan Kumar

Comment 6 Ganesh Kadam 2017-08-10 04:18:53 UTC
Created attachment 1311520 [details]
tempest.log for the reproduced steps

Comment 8 Chandan Kumar 2017-08-29 09:36:09 UTC
Hello Chris,

You can run the 'tempest cleanup --delete-tempest-conf-objects' command to delete all leftovers.

Please check this doc: https://docs.openstack.org/tempest/latest/cleanup.html
before running.
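
As a precaution, cleanup also has a dry-run mode that only reports what it would delete (it writes the report to dry_run.json instead of removing anything):

$ tempest cleanup --dry-run                      # writes dry_run.json, deletes nothing
$ tempest cleanup --delete-tempest-conf-objects  # performs the actual deletion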

Thanks,

Chandan Kumar

Comment 9 Uemit Seren 2018-03-26 09:36:01 UTC
This does not work. We have the same issue. 

Running tempest cleanup --delete-tempest-conf-objects will output: 
Begin cleanup
Process 0 tenants

although lots of tempest tenants and resources still exist:
openstack project list:

+----------------------------------+------------------------------------------------------------------+
| ID                               | Name                                                             |
+----------------------------------+------------------------------------------------------------------+
| 0031e25949fe42d5a5a07cda181a635f | df72e1d5305e4ed69f027b1773db5abe-2389bb85-269b-4ea8-b2cc-983914c |
| 0199bb6e42134586906b6fbda1243667 | df72e1d5305e4ed69f027b1773db5abe-ac9c0f33-5bf9-4b5b-99e3-addb223 |
| 01f18a832dc04089ac6689efbb9aa7f9 | alt_demo                                                         |
| 03077dff0f1d471abe534f24d629b761 | df72e1d5305e4ed69f027b1773db5abe-72833425-ff2e-4838-8ad6-062f16f |
| 086edfd16321477ba7fe0725b1b40785 | df72e1d5305e4ed69f027b1773db5abe-4cd8d718-86bd-42c0-8207-ed8439a |
| 0a3c755c8a514dc98e5377c83db90f83 | df72e1d5305e4ed69f027b1773db5abe-e42e69f6-13f1-40eb-9849-52f532f |
| 0a5152338b0c4077a3573c97b0cd4b84 | df72e1d5305e4ed69f027b1773db5abe-8e457d17-0b04-4454-87ef-adad794 |
| 0b89ec131d6345339cb01fec4d3e4bb7 | df72e1d5305e4ed69f027b1773db5abe-b9e51833-07ea-425e-ab1a-8ed152e |
| 0f88d4e5c1554826b44579c595a980b7 | df72e1d5305e4ed69f027b1773db5abe-e9e2d318-434a-4b09-93ca-376cf40 |
| 18e5f748889f4196b03130c44f2671d9 | df72e1d5305e4ed69f027b1773db5abe-3f686c81-b0f1-46ce-85ab-d5a6e56 |
| 1c4d0245e8d4446eab430abad00054f3 | df72e1d5305e4ed69f027b1773db5abe-6ac8f39f-afba-4eeb-a440-c2ea92f |
| 1e2e57613ace46b9bed9934cbcaaae47 | df72e1d5305e4ed69f027b1773db5abe-d2d26c51-75c4-401d-b5b3-a57f7d8 |
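
A possible manual workaround until cleanup behaves, assuming the df72e1d5305e4ed69f027b1773db5abe prefix reliably identifies the leftover tempest projects (destructive; review the matched IDs before piping them to delete):

$ openstack project list -f value -c ID -c Name \
    | awk '/df72e1d5305e4ed69f027b1773db5abe/ {print $1}' \
    | xargs -r -n 1 openstack project delete

Note that deleting a project does not cascade to its networks, routers, or ports; those still have to be removed separately.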

Comment 10 Uemit Seren 2018-03-26 10:07:38 UTC
After additional investigation, it seems that our tempest version (OSP 11, Pike) is still using the v2 keystone endpoint and thus does not get all tenants/projects.

This seems to be fixed in upstream tempest: https://github.com/openstack/tempest/commit/05fe4bcb35c4c7b3933b016a710fb3a0627e9b43
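
If that is the cause, it is worth checking which identity endpoint tempest is configured with, since cleanup needs a v3 endpoint to enumerate all projects. An illustrative check (hostname and file location are placeholders):

$ grep -A3 '^\[identity\]' tempest.conf
[identity]
uri = http://controller:5000/v2.0    # v2 endpoint: the code path that misses projects
uri_v3 = http://controller:5000/v3   # v3 endpoint: used by the upstream fix
auth_version = v3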

