Bug 1758784 - [Tracker #1757420] memory leak in glusterfsd with error from iot_workers_scale function
Summary: [Tracker #1757420] memory leak in glusterfsd with error from iot_workers_scale function
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Gluster Storage
Classification: Red Hat Storage
Component: rhgs-server-container
Version: ocs-3.11
Hardware: x86_64
OS: Linux
Priority: urgent
Severity: urgent
Target Milestone: ---
Target Release: OCS 3.11.z Batch Update 4
Assignee: Raghavendra Talur
QA Contact: Rachael
URL:
Whiteboard:
Depends On: 1757420
Blocks:
 
Reported: 2019-10-05 15:40 UTC by Raghavendra Talur
Modified: 2023-03-24 15:36 UTC
CC List: 9 users

Fixed In Version: rhgs-server-container-3.11.4-14
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-10-30 12:32:53 UTC
Embargoed:




Links
Red Hat Product Errata RHBA-2019:3257 (Last Updated: 2019-10-30 12:33:10 UTC)

Description Raghavendra Talur 2019-10-05 15:40:25 UTC
Description of problem:

Creation of a new PVC causes all existing bricks to go offline.

We are seeing multiple messages like the following in the brick logs:

~~~

Message from the brick logs showing that the iot_workers_scale function is called regularly:
        ./bricks/var-lib-heketi-mounts-vg_a297e8e6c7ee27ef50ecdb8d275b5b1e-brick_59a437f1dda29b1f864cef14bb79969a-brick.log:[2019-09-30 10:33:37.520821] D [MSGID: 0] [io-threads.c:822:__iot_workers_scale] 12-vol_b0008eb67e38f0353db3de8d7ac8d696-io-threads: scaled threads to 3 (queue_size=3/3)

~~~
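
For context, the log line above comes from the io-threads translator's worker-scaling path, which spawns additional worker threads when the request queue grows. Below is a minimal, standalone sketch of that kind of queue-driven scaling, using pthreads. This is not the GlusterFS implementation: the names (pool_t, __pool_workers_scale, worker_fn, MAX_WORKERS) and the scaling policy are illustrative assumptions only. It is included to show where a per-thread resource leak can creep in when scaled-up threads are never detached or joined.

~~~
/*
 * Illustrative sketch only (assumed names and policy), not GlusterFS code.
 * Build with: cc -pthread sketch.c
 */
#include <pthread.h>
#include <stdio.h>

#define MAX_WORKERS 16

typedef struct {
    pthread_mutex_t lock;
    int curr_workers;  /* worker threads currently running        */
    int queue_size;    /* pending requests waiting for a worker   */
} pool_t;

static void *worker_fn(void *arg)
{
    pool_t *pool = arg;

    /* A real worker would drain the request queue; this sketch just
     * updates the counters and exits. */
    pthread_mutex_lock(&pool->lock);
    if (pool->queue_size > 0)
        pool->queue_size--;
    pool->curr_workers--;
    pthread_mutex_unlock(&pool->lock);
    return NULL;
}

/* Scale the worker count up to the number of queued requests, capped at
 * MAX_WORKERS. Called with pool->lock held, mirroring the "__" convention
 * seen in the __iot_workers_scale log message. */
static int __pool_workers_scale(pool_t *pool)
{
    while (pool->curr_workers < pool->queue_size &&
           pool->curr_workers < MAX_WORKERS) {
        pthread_t tid;

        if (pthread_create(&tid, NULL, worker_fn, pool) != 0)
            return -1;
        /* Detaching (or later joining) releases the thread's resources
         * when it exits; forgetting to do so is one classic way a
         * regularly invoked scaling path turns into a slow leak. */
        pthread_detach(tid);
        pool->curr_workers++;
        printf("scaled threads to %d (queue_size=%d)\n",
               pool->curr_workers, pool->queue_size);
    }
    return 0;
}

int main(void)
{
    pool_t pool = { .lock = PTHREAD_MUTEX_INITIALIZER,
                    .curr_workers = 0,
                    .queue_size = 3 };

    pthread_mutex_lock(&pool.lock);
    __pool_workers_scale(&pool);   /* prints "scaled threads to 1..3" */
    pthread_mutex_unlock(&pool.lock);

    /* Let the detached workers finish before the process exits. */
    pthread_exit(NULL);
}
~~~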

Version-Release number of selected component (if applicable):
OCS 3.11

How reproducible:
Frequently, in the customer's environment.


Actual results:
All brick processes go offline, causing all volumes to go down.

Expected results:
Creation of a new PVC should not cause all existing bricks to go offline.

Additional info: In further comments

Comment 12 errata-xmlrpc 2019-10-30 12:32:53 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:3257

