Bug 1725163 - large omap object warning threshold is too high
Summary: large omap object warning threshold is too high
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat Storage
Component: RADOS
Version: 3.2
Hardware: x86_64
OS: Linux
Priority: high
Severity: high
Target Milestone: z1
Target Release: 3.3
Assignee: Neha Ojha
QA Contact: Manohar Murthy
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-06-28 14:45 UTC by Josh Durgin
Modified: 2019-10-22 13:29 UTC
CC List: 9 users

Fixed In Version: RHEL: ceph-12.2.12-57.el7cp Ubuntu: ceph_12.2.12-51redhat1xenial
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-10-22 13:29:03 UTC
Embargoed:


Links
Ceph Project Bug Tracker 40583 (last updated 2019-06-28 15:42:45 UTC)
GitHub ceph/ceph pull 29175 (closed) - luminous: common/options.cc: Lower the default value of osd_deep_scrub_large_omap_object_key_threshold (last updated 2021-01-26 08:19:58 UTC)
Red Hat Product Errata RHBA-2019:3173 (last updated 2019-10-22 13:29:26 UTC)

Description Josh Durgin 2019-06-28 14:45:19 UTC
The current default of 2 million key/value pairs is too high. Recovery takes too long for bucket index objects with this much omap data in particular, which blocks access to client buckets until it completes.

Lower the default for osd_deep_scrub_large_omap_object_key_threshold so such objects can be detected before they become a problem.
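
For reference, a minimal sketch of how an operator could inspect and lower this threshold ahead of the packaged fix on a Luminous-era (RHCS 3.x) cluster; the 200000 value below is illustrative only, not necessarily the shipped default:

  # Show the threshold currently in effect (run on the host carrying osd.0)
  ceph daemon osd.0 config get osd_deep_scrub_large_omap_object_key_threshold

  # Lower it at runtime on all OSDs; the warning is raised on the next deep scrub
  ceph tell osd.* injectargs '--osd_deep_scrub_large_omap_object_key_threshold 200000'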

Comment 1 Giridhar Ramaraju 2019-08-05 13:11:09 UTC
Updating the QA Contact to Hemant. Hemant will reroute them to the appropriate QE Associate.

Regards,
Giri

Comment 3 Josh Durgin 2019-08-21 22:28:29 UTC
Marking this as a CodeChange since it is only updating a config value.
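
For clusters that cannot move to the fixed build right away, a sketch of the equivalent persistent override in ceph.conf on the OSD hosts (value illustrative, as in the description); it takes effect after an OSD restart or a matching injectargs:

  [osd]
  osd deep scrub large omap object key threshold = 200000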

Comment 9 errata-xmlrpc 2019-10-22 13:29:03 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:3173

