Bug 131082 - ia64 double conversion bug
Summary: ia64 double conversion bug
Keywords:
Status: CLOSED RAWHIDE
Alias: None
Product: Fedora
Classification: Fedora
Component: gcc
Version: rawhide
Hardware: ia64
OS: Linux
Priority: medium
Severity: medium
Target Milestone: ---
Assignee: Jakub Jelinek
QA Contact:
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2004-08-27 12:09 UTC by Joe Orton
Modified: 2007-11-30 22:10 UTC

Fixed In Version: 3.4.1-10
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2004-10-01 09:46:34 UTC
Type: ---
Embargoed:



Description Joe Orton 2004-08-27 12:09:51 UTC
gcc-3.4.1-9, ia64, dist-fc3 buildroot.

bash-3.00# cat dbl.c
#include <stdio.h>
#include <limits.h>
int main(int argc, char **argv)
{
   double d = -12.345;
   long l = (d > LONG_MAX) ? (unsigned long) d : (long) d;
   printf("%ld\n", l);
   return 0;
}
bash-3.00# gcc -Wall dbl.c
bash-3.00# ./a.out
-9223372036854775808
bash-3.00# gcc -O2 -Wall dbl.c
bash-3.00# ./a.out
0

Yes, the program is weird and stupid, but it seems to have well-defined
behaviour under C99, AFAICT.

If the line is changed to, e.g.:

   long l = (d > LONG_MAX) ? (unsigned long) (puts(""), d) : (long) d;

it also prints -12.

Comment 1 Joe Orton 2004-08-27 12:11:08 UTC
Forgot to mention: the program prints -12 on all our other platforms,
as expected.

