Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 1722027

Summary: Tempest test cases and scenarios failed with multiple Availability Zones
Product: Red Hat OpenStack
Reporter: Martin Kopec <mkopec>
Component: openstack-tempest
Assignee: Martin Kopec <mkopec>
Status: CLOSED ERRATA
QA Contact: Lukas Piwowarski <lpiwowar>
Severity: medium
Priority: medium
Version: 14.0 (Rocky)
CC: apevec, chkumar, knoha, lhh, mkopec, slinaber, udesale
Target Milestone: z4
Keywords: Triaged, ZStream
Target Release: 14.0 (Rocky)
Hardware: x86_64
OS: Linux
Fixed In Version: openstack-tempest-19.0.0-5.el7ost
Clone Of: 1693488
Last Closed: 2019-11-06 16:53:25 UTC
Bug Depends On: 1693488
Bug Blocks: 1722034
Attachments: bug_1722027.log

Description Martin Kopec 2019-06-19 11:59:40 UTC
+++ This bug was initially created as a clone of Bug #1693488 +++

Description of problem:
Tempest test cases and scenarios failed with multiple Availability Zones
For example, cinder has an availability zone called az1 while nova has no availability zone configured (i.e. it is None).
The following test cases failed:
~~~
{0} tempest.api.compute.admin.test_live_migration.LiveAutoBlockMigrationV225Test.test_volume_backed_live_migration ... FAILED
{0} tempest.api.compute.admin.test_live_migration.LiveMigrationRemoteConsolesV26Test.test_volume_backed_live_migration ... FAILED
{0} tempest.api.compute.admin.test_live_migration.LiveMigrationTest.test_volume_backed_live_migration ... FAILED
{0} setUpClass (tempest.api.compute.servers.test_create_server.ServersTestBootFromVolume) ... FAILED
{0} tempest.api.compute.servers.test_device_tagging.DeviceTaggingTest.test_device_tagging ... FAILED
{0} tempest.api.compute.servers.test_device_tagging.DeviceTaggingTestV2_42.test_device_tagging ... FAILED
{0} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_resize_volume_backed_server_confirm ... FAILED
{0} tempest.scenario.test_shelve_instance.TestShelveInstance.test_shelve_volume_backed_instance ... FAILED
{0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_create_ebs_image_and_check_boot ... FAILED
{0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern ... FAILED
~~~

To avoid this situation, we should introduce an option that assigns a common availability zone to the nova instance and the cinder volume.
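As a sketch of what such an option could look like: the upstream tempest work tracked in launchpad bug 1647999 introduced a `compute_volume_common_az` setting in the `[compute]` group of tempest.conf. The fragment below assumes that option is available in the installed tempest package and uses the illustrative zone name az1 from this report; verify the option name against your tempest version.

~~~
# tempest.conf — hedged sketch, not taken verbatim from this bug's fix
[compute]
# Availability zone used for both the nova instance and the cinder
# volume in volume-backed tests, so the two services agree on a zone.
compute_volume_common_az = az1
~~~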

Version-Release number of selected component (if applicable):
RHOSP13.

How reproducible:
Always.

Steps to Reproduce:
1. Deploy overcloud
2. Set an availability zone in cinder; leave nova at its default.
3. Run the listed tests in tempest.
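The reproduction steps above can be sketched as commands. This is an illustrative outline only: it assumes an already-deployed overcloud with admin credentials loaded, and the zone name az1 and the test regex are examples, not values taken from this report.

~~~
# Step 2: put the cinder backend into its own availability zone while
# nova hosts stay in the default "nova" zone. In cinder.conf:
#   [DEFAULT]
#   storage_availability_zone = az1
# then restart the cinder-volume service and confirm the zones differ:
openstack availability zone list --volume
openstack availability zone list --compute

# Step 3: run (a subset of) the listed tests with the tempest CLI:
tempest run --regex 'test_volume_boot_pattern|test_shelve_volume_backed_instance'
~~~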

Actual results:
All test cases failed.

Expected results:
Those test cases should pass.

Additional info:
This issue is being discussed upstream in Launchpad: https://bugs.launchpad.net/tempest/+bug/1647999
However, the Gerrit change under discussion covers scenario test cases only.

Comment 2 Lukas Piwowarski 2019-10-09 15:38:14 UTC
Created attachment 1623845 [details]
bug_1722027.log

Comment 4 Lukas Piwowarski 2019-10-09 15:43:46 UTC
Tested with:
  - package with the bug (openstack-tempest-19.0.0-4.el7ost)
  - package with the fix (openstack-tempest-19.0.0-5.el7ost)

When tested with the package with the fix the bug was not reproduced as shown in the attached log file (bug_1722027.log).

(The tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_resize_volume_backed_server_confirm failure is unrelated to this bug.)

Comment 6 errata-xmlrpc 2019-11-06 16:53:25 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:3747