Bug 1466448 - Tempest failures with cinder
Status: CLOSED ERRATA
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-cinder
Version: 12.0 (Pike)
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: Upstream M3
Target Release: 12.0 (Pike)
Assigned To: Eric Harney
QA Contact: Avi Avraham
Keywords: Automation, Triaged
Depends On:
Blocks:
Reported: 2017-06-29 11:57 EDT by Jon Schlueter
Modified: 2018-02-05 14:10 EST
CC: 5 users

See Also:
Fixed In Version: openstack-cinder-11.0.0-0.20170821192443.120fdb0.el7ost
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2017-12-13 16:35:30 EST
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---


Attachments
Cinder Nova conf and logs plus tempest output (206.08 KB, application/x-gzip)
2017-07-16 08:58 EDT, Tzach Shefi


External Trackers
Tracker ID Priority Status Summary Last Updated
Launchpad 1701547 None None None 2017-06-30 09:02 EDT
OpenStack gerrit 479295 None None None 2017-06-30 09:03 EDT
OpenStack gerrit 485256 None None None 2017-07-19 14:54 EDT

Description Jon Schlueter 2017-06-29 11:57:48 EDT
Description of problem:

Standard deployment is having tempest test failures around cinder

Version-Release number of selected component (if applicable):

OSP 12 puddle 2017-06-28.2 (RDO import from 2017-06-25)

How reproducible:

always

Steps to Reproduce:
1. packstack or ospd install with cinder LVM
2. run tempest tests

Actual results:

tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete_as_clone[id-3f591b4a-7dc6-444c-bd51-77469506b3a1]
tempest.api.volume.test_volumes_clone.VolumesCloneTest.test_create_from_bootable_volume[id-cbbcd7c6-5a6c-481a-97ac-ca55ab715d16,image]
tempest.api.volume.test_volumes_clone.VolumesCloneTest.test_create_from_volume[id-9adae371-a257-43a5-9555-dc7c88e66e0e]
tempest.api.volume.test_volumes_backup.VolumesBackupsTest.test_backup_create_attached_volume[compute,id-07af8f6d-80af-44c9-a5dc-c8427b1b62e6]



Expected results:

These test cases pass

Additional info:
Comment 1 Jon Schlueter 2017-06-29 11:58:18 EDT
<eharney> jschlueter: LVM commands are failing with "Command does not accept option: --ignoreactivationskip"  -- maybe an updated lvm package changed the params... looking

<eharney> jschlueter: most likely this was introduced by this Cinder change that enabled thin LVM by default: https://review.openstack.org/#/c/474811/
<eharney> jschlueter: i'll have to dig around to figure out what to do about it, probably a bug in cinder thin lvm support
Comment 3 Eric Harney 2017-06-30 09:02:34 EDT
Setting lvm_type='default' in the backend's section in cinder.conf should work around these issues for now.
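For reference, a minimal cinder.conf sketch of that workaround. The backend section name `lvm` is an assumption here; use whatever name appears in your deployment's `enabled_backends` list:

```ini
# Hypothetical backend section name; match your enabled_backends entry.
[lvm]
volume_driver = cinder.volume.drivers.lvm.LVMVolumeDriver
volume_group = cinder-volumes
# Fall back to thick LVM provisioning until the thin-LVM activation
# issue is fixed; 'default' overrides the new thin-by-default behavior.
lvm_type = default
```

Restart the cinder-volume service after changing the backend section for the setting to take effect.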
Comment 4 Eric Harney 2017-07-06 18:32:24 EDT
I believe this is fixed by:
   https://review.openstack.org/#/c/479295/

The other patches were not correct and aren't needed.
Comment 7 Tzach Shefi 2017-07-16 08:58 EDT
Created attachment 1299339 [details]
Cinder Nova conf and logs plus tempest output

Installed with Cinder LVM backend, enabled Cinder backup service. 
Ran tempest (not sure what I did wrong or missed), filtered to the Cinder tests.
Result:  
Ran 92 tests in 329.496s
FAILED (id=0, failures=7, skips=3)

Out of the 7 failed tests, 4 are the ones mentioned on this bug. 

Attaching cinder/nova conf and logs as well as tempest.conf and 0/failing.
Comment 9 Tzach Shefi 2017-07-16 09:14:29 EDT
Per first failing test:

tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete_as_clone[id-3f591b4a-7dc6-444c-bd51-77469506b3a1]


Same issue as Jon reported:

rootwrap /etc/cinder/rootwrap.conf lvchange -a y --yes -K -k n cinder-volumes/volume-90482ae1-3045-4727-9820-d81d3ec4cd74' failed. Not Retrying. execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:433
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm [req-e92badbd-5c14-4f0c-9c99-bceefbac568c 45a82da10ccc4b779d3cac00d336b134 c3ceac09af574099b78f80fca6857d6a - default default] Error activating LV: ProcessExecutionError: Unexpected error while running command.
Command: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvchange -a y --yes -K -k n cinder-volumes/volume-90482ae1-3045-4727-9820-d81d3ec4cd74
Exit code: 3
Stdout: u''
Stderr: u'File descriptor 12 (/dev/urandom) leaked on lvchange invocation. Parent PID 272032: /usr/bin/python2\n  Command does not accept option: --ignoreactivationskip.\n'


2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm Traceback (most recent call last):
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm   File "/usr/lib/python2.7/site-packages/cinder/brick/local_dev/lvm.py", line 690, in activate_lv
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm     run_as_root=True)
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm   File "/usr/lib/python2.7/site-packages/os_brick/executor.py", line 49, in _execute
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm     result = self.__execute(*args, **kwargs)
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm   File "/usr/lib/python2.7/site-packages/cinder/utils.py", line 123, in execute
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm     return processutils.execute(*cmd, **kwargs)
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm   File "/usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py", line 400, in execute
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm     cmd=sanitized_cmd)
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm ProcessExecutionError: Unexpected error while running command.
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm Command: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvchange -a y --yes -K -k n cinder-volumes/volume-90482ae1-3045-4727-9820-d81d3ec4cd74
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm Exit code: 3
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm Stdout: u''
2017-07-16 12:18:24.606 92861 ERROR cinder.brick.local_dev.lvm Stderr: u'File descriptor 12 (/dev/urandom) leaked on lvchange invocation. Parent PID 272032: /usr/bin/python2\n  Command does not accept option: --ignoreactivationskip.\n'
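The stderr points at a flag-support mismatch: Cinder passed the activation-skip option (`-K`, the short form of `--ignoreactivationskip`) to an lvchange build that rejects it. As an illustrative sketch only (not Cinder's actual code), a driver can guard such flags behind a capability check on the reported LVM version; the threshold used below assumes activation skip arrived in LVM 2.02.99:

```python
def supports_activation_skip(lvm_version):
    """Whether this LVM release understands -K/--ignoreactivationskip.

    Assumption for illustration: activation skip was introduced in
    LVM 2.02.99, so anything older must not be passed the flag.
    """
    return tuple(lvm_version) >= (2, 2, 99)


def activate_cmd(lv_path, lvm_version):
    """Build an lvchange activation command, adding -K only when supported."""
    cmd = ['lvchange', '-a', 'y', '--yes']
    if supports_activation_skip(lvm_version):
        # Ignore the activation-skip flag set on thin snapshots.
        cmd.append('-K')
    cmd.append(lv_path)
    return cmd
```

With a check like this, an older lvchange simply never sees the unsupported option instead of failing with exit code 3 as in the log above.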
Comment 12 Tzach Shefi 2017-10-02 11:10:39 EDT
Verified on:
openstack-cinder-11.0.1-0.20170921120341.ca8a2b3.el7ost.noarch

On a Cinder LVM deployment.
Tempest output: 

test: tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete_as_clone[id-3f591b4a-7dc6-444c-bd51-77469506b3a1]
successful: tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete_as_clone[id-3f591b4a-7dc6-444c-bd51-77469506b3a1] [ multipart

test: tempest.api.volume.test_volumes_clone.VolumesCloneTest.test_create_from_bootable_volume[id-cbbcd7c6-5a6c-481a-97ac-ca55ab715d16,image]
successful: tempest.api.volume.test_volumes_clone.VolumesCloneTest.test_create_from_bootable_volume[id-cbbcd7c6-5a6c-481a-97ac-ca55ab715d16,image] [ multipart

test: tempest.api.volume.test_volumes_clone.VolumesCloneTest.test_create_from_volume[id-9adae371-a257-43a5-9555-dc7c88e66e0e]
successful: tempest.api.volume.test_volumes_clone.VolumesCloneTest.test_create_from_volume[id-9adae371-a257-43a5-9555-dc7c88e66e0e] [ multipart


Verifying despite the backup test below failing, as it is covered by another open bz:
https://bugzilla.redhat.com/show_bug.cgi?id=1484467


test: tempest.api.volume.test_volumes_backup.VolumesBackupsTest.test_backup_create_attached_volume[compute,id-07af8f6d-80af-44c9-a5dc-c8427b1b62e6]
failure: tempest.api.volume.test_volumes_backup.VolumesBackupsTest.test_backup_create_attached_volume[compute,id-07af8f6d-80af-44c9-a5dc-c8427b1b62e6] [ multipart
  File "/usr/lib/python2.7/site-packages/tempest/api/volume/test_volumes_backup.py", line 121, in test_backup_create_attached_volume
Comment 15 errata-xmlrpc 2017-12-13 16:35:30 EST
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2017:3462
