Bug 1420438

Summary: [RFE] Migration between Cinder AZs
Product: Red Hat OpenStack Reporter: Mikel Olasagasti <molasaga>
Component: openstack-cinder    Assignee: Gorka Eguileor <geguileo>
Status: CLOSED ERRATA QA Contact: Avi Avraham <aavraham>
Severity: high Docs Contact: James Smith <jamsmith>
Priority: medium    
Version: 13.0 (Queens)    CC: ealcaniz, Egarciad, eharney, geguileo, jamsmith, mariel, molasaga, scohen, srevivo, tshefi
Target Milestone: Upstream M1    Keywords: FutureFeature, Reopened, Triaged
Target Release: 14.0 (Rocky)   
Hardware: All   
OS: Linux   
Whiteboard:
Fixed In Version: openstack-cinder-12.0.1-0.20180418194613.c476898.el7ost.noarch    Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
: 1554768 1640757 (view as bug list)    Environment:
Last Closed: 2019-01-11 11:47:00 UTC    Type: Bug
Bug Depends On: 1554768, 1554772, 1554784    
Bug Blocks: 1381612, 1640757    

Description Mikel Olasagasti 2017-02-08 16:20:09 UTC
Description of problem:

I have a customer (IHAC) who would like the ability to migrate volumes between Cinder availability zones (AZs).

Version-Release number of selected component (if applicable):

RHOSP8

Additional info:

Blueprint https://blueprints.launchpad.net/cinder/+spec/volume-migration-between-availability-zones
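
For illustration only, a rough sketch of what the requested operation could look like, assuming it behaves like the existing volume migration but with the destination backend in another AZ (volume ID and destination host are placeholders):

cinder migrate <volume-id> <destination-host>@lvm#lvm
cinder show <volume-id> | grep availability_zone    # should report the destination AZ once migration completes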

Comment 1 Red Hat Bugzilla Rules Engine 2017-03-01 14:24:23 UTC
This bugzilla has been removed from the release and needs to be reviewed and Triaged for another Target Release.

Comment 7 Gorka Eguileor 2018-04-19 07:29:43 UTC
This is now in stable/queens [1], so it will be included in OSP13 on the next rebase.


[1]: https://review.openstack.org/#/c/550775/

Comment 8 Tzach Shefi 2018-05-03 09:28:58 UTC
I'll test and then move to Verified.

For verification steps, see:
https://bugzilla.redhat.com/show_bug.cgi?id=1554784#c6

Comment 9 Tzach Shefi 2018-05-03 10:53:20 UTC
Verified on:
openstack-cinder-12.0.1-0.20180418194613.c476898.el7ost.noarch
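
The backend/AZ layout isn't shown in this comment; as an assumption, the environment presumably had two backends placed in different AZs and a volume type per backend, roughly:

# /etc/cinder/cinder.conf (sketch, option values assumed)
[DEFAULT]
enabled_backends = nfs,lvm

[nfs]
volume_driver = cinder.volume.drivers.nfs.NfsDriver
volume_backend_name = nfs
nfs_shares_config = /etc/cinder/nfs_shares
backend_availability_zone = dc2

[lvm]
volume_driver = cinder.volume.drivers.lvm.LVMVolumeDriver
volume_backend_name = lvm
backend_availability_zone = dc1

# Volume types mapped to the backends
cinder type-create nfs
cinder type-key nfs set volume_backend_name=nfs
cinder type-create lvm
cinder type-key lvm set volume_backend_name=lvm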


Create a volume in AZ dc2:
cinder create --display-name nfs2 --volume-type nfs 1 --availability-zone dc2
+--------------------------------+--------------------------------------+
| Property                       | Value                                |
+--------------------------------+--------------------------------------+
| attachments                    | []                                   |
| availability_zone              | dc2                                  |
| ...                            | ...                                  |
+--------------------------------+--------------------------------------+

Retype/migrate to the other backend (LVM, zone dc1):

cinder retype 74f5d314-3e76-4d42-8e16-c05388344fee lvm --migration-policy on-demand

Verify that the volume successfully migrated to the other availability zone (dc1):

cinder show 74f5d314-3e76-4d42-8e16-c05388344fee
+--------------------------------+---------------------------------------+
| Property                       | Value                                 |
+--------------------------------+---------------------------------------+
| attached_servers               | []                                    |
| attachment_ids                 | []                                    |
| availability_zone              | dc1                                   |
| ...                            | ...                                   |
| status                         | available                             |
+--------------------------------+---------------------------------------+
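
An additional check one could do here (not part of the original steps): the backing host should have moved to the lvm backend as well; os-vol-host-attr:host is an admin-only field:

cinder show 74f5d314-3e76-4d42-8e16-c05388344fee | grep os-vol-host-attr:host
# expected to point at the lvm backend serving dc1, e.g. <hostname>@lvm#lvm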

Looking great, verified.

Comment 12 Tzach Shefi 2018-11-12 13:38:10 UTC
Hit a new bug: migration of an attached volume works, but the source volume remains.
This doesn't happen when migrating a non-attached volume.

https://bugzilla.redhat.com/show_bug.cgi?id=1648931
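
Roughly how the attached case is exercised (instance and volume IDs are placeholders):

openstack server add volume <instance> <volume-id>            # attach the volume to an instance first
cinder retype <volume-id> lvm --migration-policy on-demand    # migrate to the other AZ while attached
cinder list                                                   # per bug 1648931, the old source volume is still listed afterwards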

Comment 13 Tzach Shefi 2018-11-12 14:02:04 UTC
Low-severity usability bug.
The user isn't notified that migration (to another AZ) of a volume with snapshots failed. Failing because of the snapshots is the correct/expected result,
but the user isn't notified that no migration occurred.

https://bugzilla.redhat.com/show_bug.cgi?id=1648941
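
For context, the silent failure looks roughly like this (volume ID is a placeholder):

cinder snapshot-create <volume-id> --name snap1
cinder retype <volume-id> lvm --migration-policy on-demand    # correctly refused because the volume has a snapshot
cinder show <volume-id> | grep availability_zone              # still the original AZ, but no error is surfaced to the user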

Comment 17 errata-xmlrpc 2019-01-11 11:47:00 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2019:0045