Bug 1439708
Summary: | [geo-rep]: Geo-replication goes to faulty after upgrade from 3.2.0 to 3.3.0 | |
---|---|---|---
Product: | [Red Hat Storage] Red Hat Gluster Storage | Reporter: | Rochelle <rallan>
Component: | geo-replication | Assignee: | Kotresh HR <khiremat>
Status: | CLOSED ERRATA | QA Contact: | Rahul Hinduja <rhinduja>
Severity: | urgent | Docs Contact: |
Priority: | unspecified | |
Version: | rhgs-3.3 | CC: | amukherj, csaba, rhs-bugs, storage-qa-internal
Target Milestone: | --- | Keywords: | Regression
Target Release: | RHGS 3.3.0 | |
Hardware: | Unspecified | |
OS: | Unspecified | |
Whiteboard: | | |
Fixed In Version: | glusterfs-3.8.4-22 | Doc Type: | If docs needed, set a value
Doc Text: | | Story Points: | ---
Clone Of: | | Environment: |
Last Closed: | 2017-09-21 04:37:54 UTC | Type: | Bug
Regression: | --- | Mount Type: | ---
Documentation: | --- | CRM: |
Verified Versions: | | Category: | ---
oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: |
Cloudforms Team: | --- | Target Upstream Version: |
Embargoed: | | |
Bug Depends On: | | |
Bug Blocks: | 1417151 | |
Description
Rochelle
2017-04-06 12:16:59 UTC
The following code in master.py was causing this issue; a diff between the 3.2.0 and 3.3.0 versions of master.py reveals these additional lines:

```python
if not data_stime or data_stime == URXTIME:
    raise NoStimeAvailable()
```

After commenting these lines out and restarting geo-replication, it works.

Downstream patch: https://code.engineering.redhat.com/gerrit/#/c/102726/

Verified with build: glusterfs-geo-replication-3.8.4-22.el6rhs.x86_64

After upgrading the Master/Slave cluster from 3.2.0 to the latest 3.3.0 version, geo-replication can be started: it goes into History Crawl and then moves on to Changelog Crawl. It is working as expected. Moving the bug to the verified state.

```
[root@localhost ~]# gluster volume geo-replication firstvol 10.70.43.185::secvol start
Starting geo-replication session between firstvol & 10.70.43.185::secvol has been successful

[root@localhost ~]# gluster volume geo-replication firstvol 10.70.43.185::secvol status

MASTER NODE     MASTER VOL    MASTER BRICK           SLAVE USER    SLAVE                   SLAVE NODE      STATUS             CRAWL STATUS    LAST_SYNCED
---------------------------------------------------------------------------------------------------------------------------------------------------------
10.70.43.30     firstvol      /rochelle/brick1/b2    root          10.70.43.185::secvol    N/A             Initializing...    N/A             N/A
10.70.43.30     firstvol      /rochelle/brick5/b3    root          10.70.43.185::secvol    N/A             Initializing...    N/A             N/A
10.70.43.148    firstvol      /rochelle/brick2/b2    root          10.70.43.185::secvol    N/A             Initializing...    N/A             N/A
10.70.43.148    firstvol      /rochelle/brick6/b3    root          10.70.43.185::secvol    10.70.43.158    Passive            N/A             N/A

[root@localhost ~]# gluster volume geo-replication firstvol 10.70.43.185::secvol status

MASTER NODE     MASTER VOL    MASTER BRICK           SLAVE USER    SLAVE                   SLAVE NODE      STATUS     CRAWL STATUS     LAST_SYNCED
---------------------------------------------------------------------------------------------------------------------------------------------------------
10.70.43.30     firstvol      /rochelle/brick1/b2    root          10.70.43.185::secvol    10.70.43.185    Active     History Crawl    2017-04-10 22:53:07
10.70.43.30     firstvol      /rochelle/brick5/b3    root          10.70.43.185::secvol    10.70.43.185    Active     History Crawl    2017-04-10 22:53:08
10.70.43.148    firstvol      /rochelle/brick2/b2    root          10.70.43.185::secvol    10.70.43.158    Passive    N/A              N/A
10.70.43.148    firstvol      /rochelle/brick6/b3    root          10.70.43.185::secvol    10.70.43.158    Passive    N/A              N/A

[root@localhost ~]# gluster volume geo-replication firstvol 10.70.43.185::secvol status

MASTER NODE     MASTER VOL    MASTER BRICK           SLAVE USER    SLAVE                   SLAVE NODE      STATUS     CRAWL STATUS       LAST_SYNCED
-----------------------------------------------------------------------------------------------------------------------------------------------------------
10.70.43.30     firstvol      /rochelle/brick1/b2    root          10.70.43.185::secvol    10.70.43.185    Active     Changelog Crawl    2017-04-10 22:53:07
10.70.43.30     firstvol      /rochelle/brick5/b3    root          10.70.43.185::secvol    10.70.43.185    Active     Changelog Crawl    2017-04-10 22:53:08
10.70.43.148    firstvol      /rochelle/brick2/b2    root          10.70.43.185::secvol    10.70.43.158    Passive    N/A                N/A
10.70.43.148    firstvol      /rochelle/brick6/b3    root          10.70.43.185::secvol    10.70.43.158    Passive    N/A                N/A
```

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2017:2774
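For readers unfamiliar with the failing check, the following is a minimal, hypothetical Python sketch of the regression, not the actual glusterfs master.py. Only the two-line `if`/`raise` mirrors the snippet quoted in the report; the `URXTIME` value, the exception body, and the `check_data_stime` wrapper are assumptions made for illustration.

```python
# Hypothetical sketch of the stime check added to master.py in 3.3.0.
# Only the if/raise pair comes from the bug report; everything else
# here (URXTIME's value, the wrapper function) is an illustrative guess.

URXTIME = (-1, 0)  # assumed sentinel meaning "no stime recorded yet"

class NoStimeAvailable(Exception):
    """Worker cannot determine a usable data stime for the brick."""

def check_data_stime(data_stime):
    # On a volume freshly upgraded from 3.2.0 the stime xattr may be
    # absent, so data_stime is falsy and this raise sends the worker
    # into the Faulty state -- the regression this bug tracks.
    if not data_stime or data_stime == URXTIME:
        raise NoStimeAvailable()
    return data_stime
```

Commenting out the `raise`, as the reporter did, restores replication for the upgraded volume; the downstream patch linked above provides the proper fix that shipped in glusterfs-3.8.4-22.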