Bug 1670415 - Incremental memory leak of glusterd process in 'gluster volume status all detail'
Summary: Incremental memory leak of glusterd process in 'gluster volume status all detail'
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Gluster Storage
Classification: Red Hat Storage
Component: glusterd
Version: rhgs-3.3
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: RHGS 3.5.0
Assignee: Sanju
QA Contact: Kshithij Iyer
URL:
Whiteboard:
Depends On:
Blocks: 1696806
 
Reported: 2019-01-29 13:55 UTC by Andrew Robinson
Modified: 2022-03-13 16:52 UTC (History)
CC: 11 users

Fixed In Version: glusterfs-6.0-1
Doc Type: Bug Fix
Doc Text:
A small memory leak that occurred when viewing the status of all volumes has been fixed.
Clone Of:
Environment:
Last Closed: 2019-10-30 12:20:20 UTC
Embargoed:




Links:
Red Hat Product Errata RHEA-2019:3249 (last updated 2019-10-30 12:20:42 UTC)

Description Andrew Robinson 2019-01-29 13:55:04 UTC
Description of problem:

For all three nodes in the customer's cluster, the memory usage of the glusterd service grows monotonically. 


Version-Release number of selected component (if applicable):


glusterfs-3.8.4-52.el7rhgs.x86_64


How reproducible:


Steps to Reproduce:
1. Reboot
2. Watch memory usage over days
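The report's reproducer relies on waiting days for natural growth. A minimal sketch of an accelerated reproducer, assuming a cluster node with the gluster CLI on PATH, would repeatedly issue the status command from the summary so the leak becomes visible in hours (the iteration count and pause are illustrative, not from the report):

```shell
#!/bin/sh
# Sketch only: run a command many times with a pause between runs, to
# accelerate a slow per-invocation leak. Values below are illustrative.
hammer() {
    n=$1
    pause=$2
    shift 2
    while [ "$n" -gt 0 ]; do
        "$@" > /dev/null 2>&1   # discard output; only glusterd's RSS matters
        sleep "$pause"
        n=$((n - 1))
    done
}

# On a cluster node (assumes the gluster CLI named in the summary):
# hammer 720 5 gluster volume status all detail
```

Watching glusterd's resident-set size while this runs should show the monotonic growth described in this report.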

Actual results:

Memory usage of glusterd climbs until it consumes over 90% of the available memory.


Expected results:

Memory usage of the glusterd service remains low and steady.


Additional info:

When the customer opened the case, the memory usage by the glusterd service was over 90% of the available memory for each of the three nodes. The customer rebooted each node over the course of several days, which returned memory usage of the glusterd service to less than 3%. The customer has been collecting 'top' outputs every few hours. They show the memory usage steadily climbing since the reboots.
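The customer's periodic 'top' captures can be approximated with a small sampler. This is a hypothetical helper (the function name and log path are illustrative, not from the report), assuming Linux procps `ps` and `pgrep`:

```shell
#!/bin/sh
# Hypothetical sampler sketch: print the resident-set size (KiB) of the
# oldest process whose name matches the argument. Assumes procps ps/pgrep.
sample_rss() {
    pid=$(pgrep -o "$1") || return 1   # oldest PID matching the name
    ps -o rss= -p "$pid" | tr -d ' '   # RSS in KiB, whitespace stripped
}

# Example: append one timestamped sample per run (e.g. hourly from cron):
# echo "$(date -u +%FT%TZ) $(sample_rss glusterd)" >> /var/log/glusterd-rss.log
```

Run hourly, this yields the same kind of time series as the customer's 'top' outputs, but in a form that is easy to graph.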

Comment 31 errata-xmlrpc 2019-10-30 12:20:20 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2019:3249

