Bug 1477160 - Tempest does not properly clean up resources after a failed run
Status: CLOSED WORKSFORME
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-tempest
Assigned To: Chandan Kumar
QA Contact: Martin Kopec
Reported: 2017-08-01 07:32 EDT by Ganesh Kadam
Modified: 2017-09-12 07:01 EDT
CC: 8 users

Last Closed: 2017-09-12 07:01:36 EDT
Type: Bug

Attachments
tempest-ports o/p (15.51 KB, text/plain), 2017-08-01 07:32 EDT, Ganesh Kadam
tempest-networks o/p (3.67 KB, text/plain), 2017-08-01 07:33 EDT, Ganesh Kadam
tempest-routers o/p (2.59 KB, text/plain), 2017-08-01 07:33 EDT, Ganesh Kadam
saved_state o/p (2.09 KB, text/plain), 2017-08-01 07:37 EDT, Ganesh Kadam
tempest.log for the reproduced steps (135.62 KB, text/plain), 2017-08-10 00:18 EDT, Ganesh Kadam

Description Ganesh Kadam 2017-08-01 07:32:36 EDT
Created attachment 1307532 [details]
tempest-ports o/p

Description of problem:

Running 'tempest cleanup --init-saved-state' does not create a JSON file that contains all the necessary resources, and subsequently running 'tempest cleanup' does not remove disk, network, and other resources. The only resources cleaned up are users, groups, and flavors.
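
For context, the saved_state.json written by --init-saved-state is keyed by resource type, mapping UUIDs to names. An illustrative shape only (not the contents of the attached file; the exact keys vary by tempest version):

  $ cat saved_state.json
  {
    "flavors":  {"<uuid>": "m1.tiny"},
    "images":   {"<uuid>": "cirros"},
    "projects": {"<uuid>": "admin"},
    "roles":    {"<uuid>": "admin"},
    "users":    {"<uuid>": "admin"}
  }

Network resources (networks, routers, ports) and volumes do not appear here, which is consistent with only identity resources and flavors being cleaned up.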

Version-Release number of selected component (if applicable):
RHOSP 10 

How reproducible/Steps to Reproduce:

Run 'tempest cleanup --init-saved-state', then run any set of tempest tests that include a failure, then run 'tempest cleanup' with the given saved state. Tempest is then unable to clean up the resources it created, other than users and projects. A sketch of the sequence is below.
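
A minimal sketch of the commands involved (the test regex is illustrative; any run that includes a failure will do):

  $ tempest cleanup --init-saved-state        # writes saved_state.json in the current directory
  $ tempest run --regex tempest.api.network   # a run that includes at least one failure
  $ tempest cleanup                           # should remove everything created after init, but does not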


Actual results:

Tempest does not clean up all the resources after a failed test run.

Expected results:

Tempest should clean up the resources.
Comment 1 Ganesh Kadam 2017-08-01 07:33 EDT
Created attachment 1307533 [details]
tempest-networks o/p
Comment 2 Ganesh Kadam 2017-08-01 07:33 EDT
Created attachment 1307534 [details]
tempest-routers o/p
Comment 3 Ganesh Kadam 2017-08-01 07:37 EDT
Created attachment 1307536 [details]
saved_state o/p
Comment 4 Chandan Kumar 2017-08-01 07:56:11 EDT
Hello Ganesh,

(In reply to Ganesh Kadam from comment #0)
> Created attachment 1307532 [details]
> tempest-ports o/p
> 

Below are my inline comments:

> Description of problem:
> 
> Running 'tempest cleanup --init-saved-state' does not create a JSON file
> that contains all the necessary resources, and subsequently running
> 'tempest cleanup' does not remove disk, network, and other resources. The
> only resources cleaned up are users, groups, and flavors.
> 

[1.] Do we have a traceback for the same?

As per the given doc: https://access.redhat.com/documentation/en-us/red_hat_openstack_platform/10/html-single/openstack_integration_test_suite_guide/#Cleaning_Tempest_Resources

Running the 'tempest cleanup --init-saved-state' command saves the state of the OpenStack deployment by recording the UUIDs of all resources and tenants in a saved_state.json file in the directory where the command is invoked.

When you then run the 'tempest cleanup' command, it will delete only the Tempest resources created after that point.

Please check the above doc before running it.


Thanks,

Chandan Kumar
Comment 6 Ganesh Kadam 2017-08-10 00:18 EDT
Created attachment 1311520 [details]
tempest.log for the reproduced steps
Comment 8 Chandan Kumar 2017-08-29 05:36:09 EDT
Hello Chris,

You can run the 'tempest cleanup --delete-tempest-conf-objects' command to delete all leftovers. A sketch is below.

Please check this doc before running: https://docs.openstack.org/tempest/latest/cleanup.html
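
A minimal sequence, assuming this workaround applies to your deployment (--dry-run only reports what would be deleted, so it is a safe first step):

  $ tempest cleanup --dry-run                       # writes dry_run.json; deletes nothing
  $ tempest cleanup --delete-tempest-conf-objects   # also removes resources listed in tempest.conf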

Thanks,

Chandan Kumar
