Red Hat Bugzilla – Bug 109531
Memory leak in Radeon DRI or GLX driver
Last modified: 2007-11-30 17:10:33 EST
From Bugzilla Helper:
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.4.1)
Description of problem:
When using the radeon driver, X itself will leak memory when rendering
to an offscreen buffer. Essentially, it leaks the offscreen buffer itself.
The same code does not cause a leak when using an NVidia card (with
the XFree86-supplied driver).
Version-Release number of selected component (if applicable):
Can reproduce in RH >= 7.3
Steps to Reproduce:
1. Start X on a machine with a Radeon card. It will typically take 12M
of resident memory.
2. Compile and run the attached program.
3. Note that X's resident memory grows at a rate of about 1 MB per
second. Note that even when you quit the program, X does not free the
memory.
Actual Results: X grows indefinitely until the program is exited, and
then does not free the memory until X itself is restarted.
Expected Results: X should allocate and de-allocate the offscreen
framebuffer as required (there is never more than 1 being used at a
time in this code).
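As a convenience for step 3 above, the server's resident size can be watched with a small shell loop. This is a hypothetical sketch, not part of the attached program; the process name "X" (or "XFree86") is an assumption and may differ on your system.

```shell
# Sample the X server's resident set size (RSS, in KB) a few times.
# Process name "X" / "XFree86" is an assumption; adjust for your setup.
pid=$(pgrep -o -x X || pgrep -o -x XFree86 || true)
if [ -n "$pid" ]; then
    for i in 1 2 3; do
        ps -o rss= -p "$pid"   # one RSS sample per line
        sleep 1
    done
else
    echo "no X server process found"
fi
```

Run the attached test program alongside this loop; the reported RSS should climb steadily while the program renders offscreen.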
I've discussed this code with Trolltech support, and they confirm that
this problem only manifests with the Radeon driver, and not with other
hardware or drivers.
Created attachment 95839 [details]
one of the xfree86 config files used (to show the driver used)
Created attachment 95840 [details]
program to demonstrate the problem
This is basically a default Qt app (because it's easier to render to an
offscreen buffer in Qt), with the memory leak demonstrated. The code is
commented, including the exact line where the leak occurs. Note that the
application itself doesn't grow -- X does. Bad (tm).
Added to bugs.xfree86.com as #859
Since pixmaps are stored inside the X server, any application that
leaks pixmaps will cause the X server to grow in size. That is
100% expected, and it's not an X server bug. Most cases in which
the X server appears to have a memory leak trace back to either
1) an application or 2) a toolkit.
kill -9 the application and the problem should go away.
Additionally, you can use the application 'xrestop' to monitor X
resource usage. Sometimes a window manager is at fault as well.
This problem may end up being any of the above, or it may turn
out to be a legitimate X server bug or Radeon driver bug. I'm not
a C++ programmer, so your C++ example isn't something I can easily
make use of personally.
Since the bug has now been reported upstream, however, I will track
it there and monitor its progress and resolution.
Changing status to 'UPSTREAM'
Comment #2 in the upstream xfree86 bugzilla report indicates the
problem is fixed in a newer release. I've closed the upstream report
as "FIXED", and am setting the status of this tracker to "RAWHIDE"
as well, since we're about 2 or 3 Mesa/DRI implementations further
along now.
The new upstream bugzilla for the DRI project is the X.org bugzilla
on freedesktop.org. If the problem recurs, please file a new bug
report at http://bugs.freedesktop.org in the "xorg" component, and
we'll continue to track the issue there.
Thanks for testing.