Bug 131082 - ia64 double conversion bug
Status: CLOSED RAWHIDE
Product: Fedora
Classification: Fedora
Component: gcc
Version: rawhide
Hardware: ia64 Linux
Priority: medium  Severity: medium
Assigned To: Jakub Jelinek
Reported: 2004-08-27 08:09 EDT by Joe Orton
Modified: 2007-11-30 17:10 EST
Fixed In Version: 3.4.1-10
Doc Type: Bug Fix
Last Closed: 2004-10-01 05:46:34 EDT

Attachments: None
Description Joe Orton 2004-08-27 08:09:51 EDT
gcc-3.4.1-9, ia64, dist-fc3 buildroot.

bash-3.00# cat dbl.c
#include <stdio.h>
#include <limits.h>
int main(int argc, char **argv)
{
   double d = -12.345;
   long l = (d > LONG_MAX) ? (unsigned long) d : (long) d;
   printf("%ld\n", l);
   return 0;
}
bash-3.00# gcc -Wall dbl.c
bash-3.00# ./a.out
-9223372036854775808
bash-3.00# gcc -O2 -Wall dbl.c
bash-3.00# ./a.out
0

Yes, the program is weird and stupid, but it seems to have well-defined
behaviour under C99, AFAICT.

If the line is changed to, e.g.:

   long l = (d > LONG_MAX) ? (unsigned long) (puts(""), d) : (long) d;

it also prints -12.
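
For reference, a minimal sketch of the expected semantics (not taken from
the original build, just an illustration): since -12.345 is not greater
than LONG_MAX, only the (long) conversion in the second branch should be
evaluated, and C99 truncates that toward zero to -12, so the unsigned
conversion in the untaken branch should never come into play.

#include <stdio.h>
#include <limits.h>

int main(void)
{
   double d = -12.345;

   /* The comparison converts LONG_MAX to double; -12.345 is far below
      it, so the condition is false and only the (long) branch of the
      original conditional expression should be evaluated. */
   printf("d > LONG_MAX: %d\n", d > LONG_MAX);  /* expected: 0 */

   /* C99 6.3.1.4: converting a double to a signed integer type
      discards the fractional part (truncation toward zero). */
   printf("(long) d: %ld\n", (long) d);         /* expected: -12 */
   return 0;
}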
Comment 1 Joe Orton 2004-08-27 08:11:08 EDT
Forgot to mention: the program prints -12 on all our other platforms,
as expected.
