Bug 8277

Summary: Red Hat Linux 6.0 consumes free memory too fast
Product: [Retired] Red Hat Linux
Reporter: strokop
Component: apache
Assignee: Preston Brown <pbrown>
Status: CLOSED NOTABUG
Severity: high
Priority: medium
Version: 6.0
Hardware: All
OS: Linux
URL: http://www.aecom.yu.edu/xray
Last Closed: 2000-01-13 02:34:49 UTC

Description strokop 2000-01-07 20:44:25 UTC
I am running Apache 1.3.6 on Red Hat 6.0. Over the last day I have lost
a lot of free memory, around 90 MB, just from pages being retrieved from
my website, which is powered by Apache. The Apache people suggest that
everything is normal and that the Linux kernel caches memory this way.

 I actually don't know whether this is a bug or not, but this behaviour
of Linux, where you simply lose accessible/free memory, is not entirely
satisfactory.
(I basically used the 'free' command to monitor all this, and I know for
sure that memory marked 'used' cannot be accessed from my programs.)
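
Roughly what 'free' shows me - the numbers here are from memory, for
illustration, not an exact transcript:

    $ free
                 total    used    free  shared  buffers  cached
    Mem:        963532  870120   93412   41200    61000   710300
    -/+ buffers/cache:  98820  864712
    Swap:       530104       0  530104

The 'used' column keeps growing and 'free' keeps shrinking after every click.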
 Obviously Red Hat 6.0 eats my memory even when it doesn't need it. This
actually ruins all my plans. I planned to run really large, long-running
programs, 300-400K of memory each (I have two processors). But such a
quick disappearance of memory makes me wonder whether I can do this at
all. I had a long hard look at this problem. It may not be a bug. I
checked the amount of memory after each browser click, both remotely and
on the server itself, and after each click 4-300 KB of free memory
disappears completely. I couldn't believe my own eyes! During one night
of clicking (it was only me on this machine), after 50-60 clicks I had
lost around 30-40 MB of memory. I have around 960 MB of RAM, but I still
don't understand why Linux ate 80 MB of memory in 2-3 days on an almost
empty machine that only served HTML pages. We can do a joint experiment
if someone wishes.

  In practice this means that a week or so after rebooting I will have
lost at least 300 MB or more, the system will start swapping when I run
my two BIG programs, then it will assign even more memory to cache and
buffers, and I will never be able to do what I want to do... As you
understand, I want to avoid swapping for as long as possible...

 I showed all this (as the webmaster) to our system administrator -
he had no reasonable explanation for this "memory leaking".

 All this looks like a complete disaster to me, since I have neither the
time nor the intention to go into the tiny details of how kernel
2.2.5-15smp really works... I haven't the faintest idea why memory is
cached and buffered, or why I never get it back when the browser exits.
The same thing happens when you run a large program directly on the
server (not remotely). I understand that the operating system needs
cache, but I don't understand why it never releases it. At the very
least, I don't understand why the system constantly consumes memory even
when I run the same program or retrieve the same web page (not
immediately, of course, but a few minutes later). The more I run the
same program, the more memory I lose. It just shrinks...

  This means anybody could crash my server just by clicking buttons
on my web site... Unbelievable...

  I considered buying more Linux servers, but after all this I am not
quite sure what to do. I never experienced such a problem on SGI
machines, for example...

  Having said this, I only want to add that otherwise the system operates
very smoothly. But this "memory thing" - it's a complete showstopper...
Is the Red Hat system really supposed to work like this? ...we all have
to know...



 Please help. I really need your comments on that.

 Regards, Boris

 Albert Einstein College of Medicine. Jan 7 2000

Comment 1 Jeff Johnson 2000-01-08 20:44:59 UTC
The free command only displays total system usage. Try using ps and/or
top to identify which process is using the memory. If it is the apache
server, you may want to restart the process daily/weekly in order to
free the memory.
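
For example (a sketch; the crontab line assumes a stock Red Hat 6.0
Apache install with apachectl in /usr/sbin):

    # find the processes holding the most memory (RSS is column 6, in KB)
    ps aux | sort -k6 -n | tail

    # /etc/crontab entry: gracefully restart Apache every Sunday at 4am
    0 4 * * 0 root /usr/sbin/apachectl graceful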

There's also a known memory leak in gethostbyname when running nscd on Red
Hat 6.0. Either upgrade glibc to the latest 6.0 errata, or don't start nscd.
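
To check for and disable nscd, something along these lines (the
init-script path assumes a stock Red Hat 6.0 install):

    # which glibc build is installed, and is nscd running?
    rpm -q glibc
    ps ax | grep nscd

    # stop nscd now and keep it from starting at boot
    /etc/rc.d/init.d/nscd stop
    /sbin/chkconfig nscd off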

Comment 2 Preston Brown 2000-01-13 02:34:59 UTC
As Jeff said, Linux keeps an efficient file and program cache.  When 'free'
approaches zero, you really aren't at zero.  If a new program needs memory,
it will be allocated from the cache.
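
You can watch this happen with a small experiment (the file path and
size below are arbitrary examples):

    # the "-/+ buffers/cache" row of free's output is the memory actually
    # available to programs, cache included; watch it across these steps
    free
    # push ~200 MB of file data into the page cache
    dd if=/dev/zero of=/tmp/bigfile bs=1024k count=200
    free            # "free" drops and "cached" grows by about the same amount
    rm /tmp/bigfile
    # when any program later allocates memory, the kernel reclaims cache
    # pages transparently; nothing needs to be restarted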

Not a bug, trust us.