Bug 1657392 - RGW memory leak OOM in a multisite environment
Summary: RGW memory leak OOM in a multisite environment
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat Storage
Component: RGW-Multisite
Version: 3.1
Hardware: x86_64
OS: Linux
Priority: high
Severity: high
Target Milestone: z1
Target Release: 3.2
Assignee: Tejas
QA Contact: Tejas
URL:
Whiteboard:
Depends On:
Blocks: 1690922
 
Reported: 2018-12-07 21:22 UTC by Vikhyat Umrao
Modified: 2019-11-12 13:16 UTC
CC: 9 users

Fixed In Version: RHEL: ceph-12.2.8-87.el7cp Ubuntu: ceph_12.2.8-73redhat1
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Clones: 1690922
Environment:
Last Closed: 2019-03-07 15:51:12 UTC
Embargoed:




Links:
- Ceph Project Bug Tracker 38479 (last updated 2019-02-25 20:58:05 UTC)
- GitHub ceph/ceph pull 26639, closed: "rgw: data sync drains lease stack on lease failure" (last updated 2019-11-12 13:15:40 UTC)
- Red Hat Product Errata RHBA-2019:0475 (last updated 2019-03-07 15:51:24 UTC)
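
The upstream fix is the linked PR, ceph/ceph#26639 ("rgw: data sync drains lease stack on lease failure"): when the data sync lease cannot be renewed, the outstanding sync stacks are drained rather than abandoned, which is what allowed them to accumulate and leak. The following is only a conceptual sketch of that pattern in Python asyncio, not Ceph's actual C++ coroutine code; every name in it is hypothetical:

    import asyncio

    async def shard_worker(shard: int) -> None:
        # Stand-in for long-running per-shard data sync work.
        await asyncio.sleep(3600)

    async def renew_lease_forever() -> None:
        # Stand-in for the lease renewal loop; here it simply fails after a while.
        await asyncio.sleep(0.3)
        raise RuntimeError("lease renewal failed")

    async def sync_with_lease(num_shards: int = 4) -> None:
        workers = [asyncio.create_task(shard_worker(s)) for s in range(num_shards)]
        try:
            await renew_lease_forever()
        finally:
            # The point of the fix, conceptually: on lease failure, cancel and
            # await (drain) the outstanding stacks instead of abandoning them.
            for w in workers:
                w.cancel()
            await asyncio.gather(*workers, return_exceptions=True)

    if __name__ == "__main__":
        try:
            asyncio.run(sync_with_lease())
        except RuntimeError as exc:
            print(f"sync aborted cleanly: {exc}")

Without the drain step, each lease failure would leave its worker stacks alive, so a gateway that repeatedly loses its lease grows without bound, which is consistent with the OOM pattern reported here.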

Description Vikhyat Umrao 2018-12-07 21:22:30 UTC
Description of problem:
RGW memory leak OOM in a multisite environment

PRD cluster:
3 bare-metal RGWs for client IO.
3 VM RGWs for replication IO.

DR cluster:
3 bare-metal RGWs for both client IO and replication IO; the secondary handles very minimal client IO.

Of the 3 VM RGWs dedicated to replication IO in the PRD cluster, the radosgw daemons on 2 of them are hitting OOM. These VM gateways have 64 GB of RAM.
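
No reproduction steps were recorded below; one practical way to confirm this kind of leak is to sample the resident set size of each radosgw process over time and watch for monotonic growth. A minimal sketch, assuming a Linux host running the gateways (this script is illustrative and not part of the original report):

    #!/usr/bin/env python3
    # Sample RSS of all radosgw processes once a minute (assumes Linux /proc).
    import os
    import time

    def radosgw_rss_mib():
        """Yield (pid, resident set size in MiB) for every radosgw process."""
        page_mib = os.sysconf("SC_PAGE_SIZE") / 2**20
        for pid in (p for p in os.listdir("/proc") if p.isdigit()):
            try:
                with open(f"/proc/{pid}/comm") as f:
                    if f.read().strip() != "radosgw":
                        continue
                with open(f"/proc/{pid}/statm") as f:
                    rss_pages = int(f.read().split()[1])  # field 2 = resident pages
                yield int(pid), rss_pages * page_mib
            except OSError:
                continue  # process exited between listing and reading

    if __name__ == "__main__":
        while True:
            now = int(time.time())
            for pid, rss in radosgw_rss_mib():
                print(f"{now} radosgw pid={pid} rss={rss:.1f} MiB", flush=True)
            time.sleep(60)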


Version-Release number of selected component (if applicable):
RHCS 3.1
ceph-radosgw-12.2.5-42.el7cp.x86_64

How reproducible:


Steps to Reproduce:
1.
2.
3.

Actual results:


Expected results:


Additional info:

Comment 54 errata-xmlrpc 2019-03-07 15:51:12 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:0475

