Bug 1438137 - heat stack snapshot fails when a cinder volume is in the IN_USE state
Summary: heat stack snapshot fails when a cinder volume is in the IN_USE state
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-heat
Version: 10.0 (Newton)
Hardware: x86_64
OS: Linux
Priority: medium
Severity: medium
Target Milestone: z4
Target Release: 10.0 (Newton)
Assignee: Thomas Hervé
QA Contact: Amit Ugol
URL:
Whiteboard:
Depends On:
Blocks: 1461810
 
Reported: 2017-04-01 06:16 UTC by VIKRANT
Modified: 2017-09-06 17:13 UTC
CC List: 7 users

Fixed In Version: openstack-heat-7.0.3-3.el7ost
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Cloned to: 1461810
Environment:
Last Closed: 2017-09-06 17:13:53 UTC
Target Upstream Version:
Embargoed:




Links:
  Launchpad bug 1687006 (last updated 2017-04-28 12:02:03 UTC)
  OpenStack gerrit change 465592 (last updated 2017-05-18 14:21:37 UTC)
  OpenStack gerrit change 465593 (last updated 2017-05-18 14:21:59 UTC)
  Red Hat Product Errata RHBA-2017:2655 (normal, SHIPPED_LIVE): openstack-heat bug fix advisory (last updated 2017-09-06 20:56:09 UTC)

Description VIKRANT 2017-04-01 06:16:52 UTC
Description of problem:

I tried to create a heat stack snapshot for a stack whose instance is booted
from a cinder volume. The snapshot failed because the cinder volume is in the
IN_USE state, and I don't see any option in heat to force the snapshot
creation. It looks like a bug.
~~~
# heat stack-snapshot f03159ff-c301-4e38-adb0-df69b9cd0fd8 -n teststack1-snap1
WARNING (shell) "heat stack-snapshot" is deprecated, please use "openstack stack snapshot create" instead
{
  "status": "IN_PROGRESS",
  "name": "teststack1-snap1",
  "data": null,
  "creation_time": "2017-03-16T12:51:12Z",
  "status_reason": null,
  "id": "aa0d8b13-ce1f-4b35-a62a-08db30a79f25"
}


# heat snapshot-list f03159ff-c301-4e38-adb0-df69b9cd0fd8
WARNING (shell) "heat snapshot-list" is deprecated, please use "openstack stack snapshot list" instead
+--------------------------------------+------------------+--------+----------------------------------------------------------------------+----------------------+
| id                                   | name             | status | status_reason                                                        | creation_time        |
+--------------------------------------+------------------+--------+----------------------------------------------------------------------+----------------------+
| aa0d8b13-ce1f-4b35-a62a-08db30a79f25 | teststack1-snap1 | FAILED | Resource SNAPSHOT failed: BadRequest: resources.volume: Invalid volume: Backing up an in-use volume must use the force flag. (HTTP 400) (Request-ID: req-1a2ae978-c656-4880-8ae6-d3c38afb2821) | 2017-03-16T12:51:12Z |
+--------------------------------------+------------------+--------+----------------------------------------------------------------------+----------------------+
~~~
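
For context, the rejection comes from Cinder's backup API rather than from heat itself: backing up an IN_USE volume is refused unless the caller sets the force flag. Below is a minimal python-cinderclient sketch of that behaviour; the auth values and the volume ID are placeholders, not values from this environment.
~~~
# Sketch only: demonstrates the Cinder behaviour heat runs into during a
# stack snapshot. Auth parameters and the volume ID are placeholders.
from cinderclient import client as cinder_client
from keystoneauth1 import identity, session

auth = identity.Password(auth_url='http://controller:5000/v3',
                         username='admin', password='secret',
                         project_name='admin',
                         user_domain_id='default',
                         project_domain_id='default')
cinder = cinder_client.Client('2', session=session.Session(auth=auth))

volume_id = 'ID_OF_THE_BOOT_VOLUME'

# While the volume is attached (IN_USE), creating a backup without force
# fails with "Invalid volume: Backing up an in-use volume must use the
# force flag. (HTTP 400)", the same error shown in the snapshot-list
# output above:
#   cinder.backups.create(volume_id, name='manual-backup')

# Cinder accepts a force flag for exactly this case, but heat stack
# snapshot currently has no way to request it:
backup = cinder.backups.create(volume_id, name='manual-backup', force=True)
print(backup.id, backup.status)
~~~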


Version-Release number of selected component (if applicable):
RHEL OSP 10

How reproducible:
Every time

Steps to Reproduce:
1. Spawn an instance via a heat stack that boots the instance from a cinder volume.
2. Try to take a snapshot of the heat stack. It fails because the volume is IN_USE.
3. Cinder provides a force option for backing up an IN_USE volume, but heat stack snapshot exposes no equivalent (see the sketch below).
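
For illustration, a conceptual sketch of what the heat-side change needs to do when a stack snapshot reaches an attached cinder volume; the helper name and arguments below are hypothetical and not the actual openstack-heat code.
~~~
# Hypothetical helper, for illustration only (not the merged openstack-heat
# patch): back up a volume during a stack snapshot even while it is attached.
def snapshot_in_use_volume(cinder, volume_id, backup_name):
    # force=True mirrors the cinder CLI's force option and lets the backup
    # proceed while the volume is in the IN_USE state.
    backup = cinder.backups.create(volume_id, name=backup_name, force=True)
    return backup.id
~~~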

Actual results:
Unable to take a snapshot of the heat stack.

Expected results:
We should be able to take a snapshot of the heat stack even when its cinder volume is IN_USE.

Additional info:

Comment 4 errata-xmlrpc 2017-09-06 17:13:53 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2017:2655

