Bug 1642144 - RGW multisite stuck on a shard
Summary: RGW multisite stuck on a shard
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat
Component: RGW-Multisite
Version: 3.1
Hardware: x86_64
OS: Linux
Priority: low
Severity: medium
Target Milestone: rc
Target Release: 3.2
Assignee: Casey Bodley
QA Contact: ceph-qe-bugs
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2018-10-23 18:30 UTC by Vikhyat Umrao
Modified: 2019-01-04 21:52 UTC
CC: 12 users

Fixed In Version: RHEL: ceph-12.2.8-45.el7cp Ubuntu: ceph_12.2.8-43redhat1
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-01-03 19:02:13 UTC
Target Upstream Version:




Links
Ceph Project Bug Tracker 37448 (last updated 2018-11-28 18:59:23 UTC)
GitHub ceph/ceph pull 25310, closed: "rgw: data sync accepts ERR_PRECONDITION_FAILED on remove_object()" (last updated 2019-12-06 20:12:58 UTC)
Red Hat Product Errata RHBA-2019:0020 (last updated 2019-01-03 19:02:22 UTC)

Description Vikhyat Umrao 2018-10-23 18:30:18 UTC
Description of problem:
RGW multisite stuck on a shard

After the upgrade to RHCS 3.1 we are still facing some replication issues. Right after the upgrade, "radosgw-admin sync status" reported a few shards recovering, and new objects were being created in the dc_zone.rgw.buckets.data pool. Data sync then got stuck on shard 64 and is still stuck on that shard:

          realm <removed name>
      zonegroup <removed name>
           zone <removed name>
  metadata sync syncing
                full sync: 0/64 shards
                incremental sync: 64/64 shards
                metadata is caught up with master
      data sync source: <removed name>
                        syncing
                        full sync: 0/128 shards
                        incremental sync: 128/128 shards
                        1 shards are recovering
                        recovering shards: [64]
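
When a shard stays in the recovering state like this, the usual next diagnostic steps are along the lines of the commands below. The zone and bucket names are placeholders (the real names were removed above), and exact option names may vary slightly between releases:

  # Per-shard detail for the stuck shard (pending buckets, sync markers)
  radosgw-admin data sync status --source-zone=<source-zone> --shard-id=64

  # Errors recorded by data sync (bucket, object, error code)
  radosgw-admin sync error list

  # Per-bucket view once a lagging bucket has been identified
  radosgw-admin bucket sync status --bucket=<bucket>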



Version-Release number of selected component (if applicable):
RHCS 3.1

Comment 43 errata-xmlrpc 2019-01-03 19:02:13 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:0020
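
After updating, one way to confirm the fixed build from the "Fixed In Version" field is installed and that the shard recovers is, for example (assuming the RGW package is ceph-radosgw on RHEL):

  # Installed build should match or exceed the Fixed In Version above
  rpm -q ceph-radosgw            # expect 12.2.8-45.el7cp or later on RHEL

  # Re-check multisite sync; shard 64 should eventually drop out of the
  # recovering list once the backlog drains
  radosgw-admin sync status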

