Bug 19088

Summary: <limits.h> mis-defines symbol LONG_BIT
Product: [Retired] Red Hat Linux
Reporter: Don Bennett <dpb>
Component: glibc
Assignee: Jakub Jelinek <jakub>
Status: CLOSED ERRATA
QA Contact: David Lawrence <dkl>
Severity: high
Priority: high
Version: 7.0
CC: fweimer
Hardware: i386
OS: Linux
Doc Type: Bug Fix
Last Closed: 2000-11-21 16:38:13 UTC

Description Don Bennett 2000-10-13 21:05:02 UTC
Some of the system limits are screwed up.
 I have determined that:
  1)  /usr/lib/gcc-lib/i386-redhat-linux/2.96/include/limits.h
 includes
  2)  /usr/lib/gcc-lib/i386-redhat-linux/2.96/include/syslimits.h
 which turns around and once again includes
  3)  /usr/lib/gcc-lib/i386-redhat-linux/2.96/include/limits.h,
 but on this second pass _GCC_LIMITS_H is already defined, so it
 goes ahead and pulls in /usr/include/limits.h, which in turn
 pulls in /usr/include/bits/xopen_lim.h.  That file relies on
 INT_MAX being defined before it is included (otherwise I get an
 incorrect default), but the original compiler version of limits.h (1)
 has not yet defined that symbol at this point.

I tried a few things to work around the problem, but I think I would rather
change my code not to depend on LONG_BIT.

Test program:


#define _GNU_SOURCE 1
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("size of long: %d\n", (int) sizeof(long));
    printf("LONG_BIT: %d\n", LONG_BIT);
    return 0;
}


Sample output:

[dpb@cayenne] 239% ./testing
size of long: 4
LONG_BIT: 64

Comment 1 Jakub Jelinek 2000-10-20 15:37:03 UTC
I think I have fixed this in
http://sources.redhat.com/ml/libc-hacker/2000-10/msg00064.html
I'm now waiting for it to be reviewed; if Uli agrees with that solution,
it will make it into the next glibc errata.
Thanks for noticing this (by the way: if you had passed -I/usr/include on the
command line, LONG_BIT and WORD_BIT would have had proper values).

Comment 2 Jakub Jelinek 2000-10-23 14:57:57 UTC
Ulrich actually checked it in, so you can expect it in the next glibc errata.

Comment 3 Jakub Jelinek 2000-10-23 14:59:32 UTC
*** Bug 19600 has been marked as a duplicate of this bug. ***

Comment 4 Jakub Jelinek 2000-10-30 06:22:46 UTC
*** Bug 19996 has been marked as a duplicate of this bug. ***

Comment 5 Jakub Jelinek 2000-10-30 08:26:49 UTC
*** Bug 20015 has been marked as a duplicate of this bug. ***

Comment 6 msuencks 2000-11-19 20:17:47 UTC
I've already upgraded to glibc-devel-2.1.94-3.i386.rpm, which
still exhibits the problem mentioned above (specifically, I want to
build Python 2.0).

Which files am I supposed to patch, anyway - those under /usr/include
or those under /usr/lib/gcc-lib?


Comment 7 msuencks 2000-11-21 16:34:57 UTC
More specifically: the Python FAQ states that I can
work around the LONG_BIT error by running "configure" like this:

 CC="gcc -DINT_MAX=2147483647" ./configure

However, this works only with the headers from
glibc-2.1.92, not with the 2.1.94 update.

Comment 8 msuencks 2000-11-21 16:38:11 UTC
(duplicate of comment 7)

Comment 9 Jakub Jelinek 2001-08-07 13:00:48 UTC
*** Bug 50849 has been marked as a duplicate of this bug. ***