Bug 1662353 - data_log shards causing a lot of big omap objects
Summary: data_log shards causing a lot of big omap objects
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat Storage
Component: RGW-Multisite
Version: 3.1
Hardware: x86_64
OS: Linux
Priority: high
Severity: high
Target Milestone: z1
Target Release: 3.2
Assignee: Casey Bodley
QA Contact: Tejas
Docs Contact: Bara Ancincova
URL:
Whiteboard:
Depends On:
Blocks: 1629656
 
Reported: 2018-12-27 23:55 UTC by Vikhyat Umrao
Modified: 2019-11-12 13:08 UTC
CC: 13 users

Fixed In Version: RHEL: ceph-12.2.8-84.el7cp; Ubuntu: ceph_12.2.8-70redhat1xenial
Doc Type: Bug Fix
Doc Text:
.Datalogs are now trimmed regularly as expected
Due to a regression in decoding of the JSON format of data sync status objects, the automated datalog trimming logic was unable to query the sync status of its peer zones. Consequently, the datalog trimming process did not progress. This update fixes the JSON decoding and adds more regression test coverage for log trimming. As a result, datalogs are now trimmed regularly as expected. (A conceptual sketch of the trimming logic follows the field list below.)
Clone Of:
Environment:
Last Closed: 2019-03-07 15:51:27 UTC
Embargoed:

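The trimming logic referenced in the Doc Text works roughly as follows: for each data_log shard, RGW asks every peer zone how far it has synced, and it may only trim entries that all peers have already consumed. The sketch below is a minimal Python illustration of that decision (function names and marker strings are illustrative only, not RGW APIs); it shows why a peer whose sync status cannot be decoded blocks trimming entirely.

    # Conceptual sketch only: not the actual RGW C++ implementation. Illustrates
    # why datalog trimming stalls when a peer zone's sync status cannot be read.

    def choose_trim_marker(peer_markers):
        """Return the marker that is safe to trim up to for one data_log shard:
        the minimum position acknowledged by every peer zone, or None if any
        peer's position is unknown."""
        if not peer_markers or any(m is None for m in peer_markers.values()):
            # A peer whose sync status failed to decode (the regression fixed in
            # this erratum) must be treated as having consumed nothing, so the
            # shard cannot be trimmed at all.
            return None
        return min(peer_markers.values())

    # Illustrative per-shard positions reported by two peer zones (marker
    # strings are made up; real RGW markers use a different format).
    healthy = {"zone-b": "00000000012", "zone-c": "00000000010"}
    broken = {"zone-b": "00000000012", "zone-c": None}  # JSON decode failed

    print(choose_trim_marker(healthy))  # "00000000010": trim up to the slowest peer
    print(choose_trim_marker(broken))   # None: nothing trimmed, omap keeps growing

Because the decode regression made every peer look "unknown", the last_trim marker never advanced, and the data_log shard objects kept accumulating omap entries, which is what surfaced as the large omap object warnings described in this bug.
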

Attachments


Links:
* Ceph Project Bug Tracker 38075 (last updated 2019-01-29 16:18:38 UTC)
* Ceph Project Bug Tracker 38373 (last updated 2019-02-18 23:20:55 UTC)
* GitHub ceph/ceph pull 26190 (closed): rgw multisite: only update last_trim marker on ENODATA (last updated 2020-12-07 14:36:47 UTC)
* GitHub ceph/ceph pull 26494 (closed): rgw: fix rgw_data_sync_info::json_decode() (last updated 2020-12-07 14:36:46 UTC)
* Red Hat Knowledge Base Solution 4079681 (Troubleshoot): Ceph - RGW Multisite - replication data logs are not getting trimmed, why? (last updated 2019-04-22 22:54:46 UTC)
* Red Hat Product Errata RHBA-2019:0475 (last updated 2019-03-07 15:51:36 UTC)

Description Vikhyat Umrao 2018-12-27 23:55:27 UTC
Description of problem:
data_log shards are accumulating many large omap objects, driving OSDs to hit the full ratio (a diagnostic sketch follows at the end of this comment).

Version-Release number of selected component (if applicable):
RHCS 3.1 - 12.2.5-59.el7cp.x86_64

How reproducible:
Always, at the customer site.
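
One way to confirm which data_log shards are oversized is to count the omap keys on each shard object in the zone's log pool. The following is a rough diagnostic sketch, assuming the default pool name default.rgw.log and the default of 128 data_log shards (rgw_data_log_num_shards); adjust both for the site.

    #!/usr/bin/env python3
    # Rough diagnostic sketch: count omap keys on each data_log shard object to
    # see which shards are oversized. Assumes the default log pool name
    # ("default.rgw.log") and 128 data_log shards; both may differ per site.
    import subprocess

    POOL = "default.rgw.log"
    NUM_SHARDS = 128

    def omap_key_count(pool, obj):
        # `rados listomapkeys` prints one omap key per line.
        result = subprocess.run(["rados", "-p", pool, "listomapkeys", obj],
                                capture_output=True, text=True)
        return len(result.stdout.splitlines()) if result.returncode == 0 else 0

    counts = {f"data_log.{i}": omap_key_count(POOL, f"data_log.{i}")
              for i in range(NUM_SHARDS)}
    # Print the ten largest shards, biggest first.
    for obj, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:10]:
        print(f"{n:>10}  {obj}")

Shards whose key counts exceed the OSD large-omap warning threshold (osd_deep_scrub_large_omap_object_key_threshold) are the ones reported as big omap objects in ceph health detail.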

Comment 55 errata-xmlrpc 2019-03-07 15:51:27 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:0475

