Bug 1523627 - [RFE] Ensure geo-rep based data sync works with RHV site-site failover/failback process
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Gluster Storage
Classification: Red Hat Storage
Component: rhhi
Version: rhgs-3.2
Hardware: x86_64
OS: Linux
Priority: unspecified
Severity: high
Target Milestone: ---
Target Release: RHHI-V 1.5
Assignee: Sahina Bose
QA Contact: SATHEESARAN
URL:
Whiteboard:
Depends On:
Blocks: 1520833 1548985
 
Reported: 2017-12-08 12:26 UTC by Sahina Bose
Modified: 2018-11-08 05:38 UTC (History)
3 users

Fixed In Version:
Doc Type: Enhancement
Doc Text:
Red Hat Hyperconverged Infrastructure for Virtualization now supports backup, failover, and failback to a remote secondary site.
Clone Of:
Environment:
Last Closed: 2018-11-08 05:37:25 UTC
Embargoed:


Attachments


Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHEA-2018:3523 0 None None None 2018-11-08 05:38:50 UTC

Description Sahina Bose 2017-12-08 12:26:51 UTC
Description of problem:

For Gluster storage domains, the storage domain data is synced to a remote site using a disaster recovery (DR) setup based on Gluster geo-replication. In case of a disaster, it should be possible to restore operations at the secondary site using the site-to-site failover mechanism available in RHV 4.2.
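As a rough illustration of the geo-replication underpinning this sync, the commands below sketch setting up a session from a primary volume to a secondary site. The volume name "datavol" and host "backup.example.com" are illustrative assumptions, not values from this bug; verify the exact syntax against your RHGS version.

```shell
# Hypothetical example: geo-replicate volume "datavol" to the secondary site.
# Run on a node of the primary cluster.

# One-time step: generate and distribute the pem keys for the session
gluster system:: execute gsec_create

# Create the geo-rep session to the secondary-site volume of the same name
gluster volume geo-replication datavol backup.example.com::datavol create push-pem

# Start the session and check that workers reach Active status
gluster volume geo-replication datavol backup.example.com::datavol start
gluster volume geo-replication datavol backup.example.com::datavol status
```

The DR sync schedule then triggers syncs over this session so the secondary site holds a consistent copy of the storage domain.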


Additional info:
This should work in theory - it will need qualification and documentation updates.

Comment 3 Sahina Bose 2018-04-04 03:10:22 UTC
Bug 1531916 documents the steps to achieve this. Moving this to ON_QA, as the feature can be tested per the steps outlined there.

Comment 4 SATHEESARAN 2018-05-16 11:27:47 UTC
Tested this feature with RHV 4.2.3-6 and glusterfs-3.12

1. Created a geo-rep session manually
2. Set up the DR sync schedule
3. Configured the failover target in the remote POD
4. With the mapping file in place, tested the failover
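The failover portion of the steps above can be sketched with the helper shipped in the ovirt-ansible-disaster-recovery package for RHV 4.2. This is a minimal sketch under the assumption that the `ovirt-dr` helper script is used; subcommand names and any file paths should be checked against the documented procedure in bug 1531916 and your installed version.

```shell
# Hypothetical failover flow using the RHV 4.2 DR ansible tooling.

# Generate the site-to-site entity mapping file from the primary engine,
# then edit it to fill in the secondary-site (failover target) values
ovirt-dr generate

# Sanity-check the completed mapping file
ovirt-dr validate

# On disaster, fail over to the secondary site using the mapping file;
# fail back to the primary once it is restored
ovirt-dr failover
ovirt-dr failback
```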

Comment 10 errata-xmlrpc 2018-11-08 05:37:25 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2018:3523

