Red Hat Bugzilla – Bug 73433
apache/httpd.h HARD_SERVER_LIMIT too small
Last modified: 2007-04-18 12:46:20 EDT
From Bugzilla Helper:
User-Agent: Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0; H010818;
T312461; Hewlett-Packard IE5.5-SP2)
Description of problem:
The Apache config directive MaxClients is limited by a hard-compiled
constant in httpd.h called HARD_SERVER_LIMIT. For Linux, this limit
is set to 256, which is woefully inadequate for today's systems that
may need to have as many as 2048 concurrent clients.
Please raise the hard limit in your base distro to 2048. If customers
do not want that many servers running (say because they're on an older
system), then they can manually lower the MaxClients config variable
to whatever they want, but there's nothing short of a recompile that
we can do to go higher.
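For context, the compile-time constant in question in Apache 1.3's httpd.h looks roughly like this (a sketch; the exact per-platform defaults and preprocessor guards vary by release):

```c
/* httpd.h (Apache 1.3), sketch: platform-specific values first,
 * then the generic fallback. On Linux this resolves to 256. */
#ifndef HARD_SERVER_LIMIT
#define HARD_SERVER_LIMIT 256   /* ceiling that MaxClients is clamped to */
#endif
```

Because MaxClients is checked against this constant at startup, no httpd.conf setting can exceed it without a recompile.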
Version-Release number of selected component (if applicable):
Steps to Reproduce:
Hand-edit the apache httpd.conf file to set MaxClients higher than 256. Start
the server. Observe that the server reports that it is automatically resetting
the value back to the hard-compiled limit.
Actual Results: The server reported that it was automatically resetting the
limit back down to the hard-compiled limit of 256.
Expected Results: It would be nice to see the server able to set the
MaxClients limit as high as 2048.
This can be worked around by recompiling the apache httpd binary, but this is
not an option for everyone (strict software appropriation standards, no
development platform, etc.)
I'd really like this for 7.2 and later versions of Red Hat, also.
Refiling against Apache 2.0 package.
In Apache 2.0 this limit can be changed without recompiling, via the ServerLimit directive in httpd.conf.
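For reference, a minimal httpd.conf fragment for Apache 2.0's prefork MPM might look like this (the directive names are real; the values are illustrative):

```apacheconf
# Apache 2.0 prefork MPM: ServerLimit sets the process ceiling that
# MaxClients may not exceed. It must appear before MaxClients and
# takes effect only on a full restart, not a graceful one.
ServerLimit 2048
MaxClients  2048
```

If MaxClients is set higher than ServerLimit, the server lowers it to ServerLimit at startup, mirroring the 1.3 clamping behavior described above.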
Working for me in Apache 2 bundled with Red Hat 8.0.
As noted, this works in 2.0. It's hard (if not impossible) to fix for 1.3
without changing the binary module interface.