From Bugzilla Helper:
User-Agent: Mozilla/4.78 [en] (X11; U; Linux 2.4.19-pre8-ac1 i686)

Description of problem:
round(), nearbyint() and friends just return junk...

Version-Release number of selected component (if applicable):
glibc-2.2.4-13

How reproducible:
Always

Steps to Reproduce:
1. Compile and run the attached example.

Actual Results:
-1.998543 -1.998543 5.000000

Expected Results:
5.000000 5.000000 5.000000

Additional info:
Well, I can't believe such a bug can exist while the system stays rather stable at the same time... Maybe I am using these functions incorrectly, or maybe something is wrong with my local configuration... Anyway, after getting some sleep and reading the man pages, the bug is still there, so there's nothing left for me to do but report it...
Created attachment 57929 [details]
Example of the strange behavior
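(The attachment itself is not reproduced here. Purely as an illustration, and not necessarily the exact contents of attachment 57929, a test along these lines shows the reported behaviour when built without any C99 feature macros, e.g. with plain "cc -lm -o tst tst.c"; the test value 4.7 is an assumption based on the expected output of 5.000000:

#include <stdio.h>
#include <math.h>

int main(void)
{
    double x = 4.7;                 /* assumed test value */
    printf("%f\n", round(x));
    printf("%f\n", nearbyint(x));
    printf("%f\n", rint(x));
    return 0;
}
)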
Hmm, and btw, compiling this example with -Wall gives warnings about an implicit declaration of all 3 math functions, even though math.h is included. Something seems really wrong here... gcc-2.96-98
Yes, nearbyint and round are functions new in ISO C99; rint is more common (some BSDs, SYS V, XOpen >= 500, ISO C99), though it is not in all standards either. If you want them, you have to enable them using the appropriate feature macros (e.g. any of -D_GNU_SOURCE, -D_ISOC99_SOURCE, or -std=c99 should work). This is all documented in the glibc documentation.
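(For example, a sketch rather than a quote from the glibc documentation: building the test with

cc -std=c99 -Wall -o tst tst.c -lm

or, equivalently, defining the feature macro in the source before any #include,

#define _ISOC99_SOURCE
#include <math.h>

makes <math.h> declare round() and nearbyint(), and the example then prints 5.000000 three times as expected.)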
Hmm, while, strictly speaking, this is not a bug, this is still very confusing behavior and may lead to bugs. gcc silently compiles the code into something that doesn't work. I think an error message must be produced in this case. *Silently* producing wrong code is not the best way of pointing out to the user that he hasn't read the glibc docs...
I disagree. ISO C allows calling unprototyped functions and has exact rules for what has to happen in that case, so making it an error by default is not possible, since it is conforming code. Users can always use -Wstrict-prototypes -Werror to get errors for this.
Hello Jakub. I really appreciate your explanations, thank you. But I still don't see the light :(

> I disagree. ISO C allows calling unprototyped functions

Yes, but generally this only produces a warning and works correctly. Here *we don't have any warnings* (unless -Wall is used) AND the code is *non-functional*. The last part I can't understand at all: if it links fine, why doesn't it work? And btw, why *no warnings*?

> and has exact rules for what has to happen in that case,

Well, maybe there are some special rules that allow the code not to work without any warnings, but:

> Users can always use -Wstrict-prototypes -Werror
> to get errors for this.

Wait, stop here! I have only changed "int main()" to "int main(void)" in the attached example; no other changes were made. Then I do:

cc -lm -Wstrict-prototypes -Werror -o tst tst.c

And there are still *no warnings* and no errors! And I still get non-functional code, since it still compiles "successfully". So what I see is not what you say, which probably means that something is still not right.

What I dislike here is the pure silence from gcc and non-functional code at the end. Do you really consider this sane? Thanks for your patience.
You can write non-functional yet standard-conforming code in many ways. If, say, nearbyint is not prototyped (and it cannot be unless you request the ISO C99 feature set, since e.g.

#include <math.h>
char *nearbyint (char *p) { return p; }
int main (void) { char *p = nearbyint ("foo"); }

is a conforming program for ISO C89, and likewise for a bunch of other standards), then what should the compiler warn about by default? There are tons of programs not using prototypes out there; warning for this by default would lead to people not reading warnings at all.

With -Wstrict-prototypes, sorry, I've made a typo; I wanted to write -Wmissing-prototypes.
> There are tons of programs not using prototypes out there; warning for this by default
> would lead to people not reading warnings at all.

I can understand this. What I can't understand is why the unprototyped function stops working in this example. If it is prototyped, it works; if it is not prototyped, it doesn't. What is going wrong in that particular case? Is it being linked against something different than when it is prototyped? I've got a self-compiled glibc-2.2.5 here (from gnu.org, not Red Hat's one); could you please point me to the exact place in the sources where things go wrong?

> With -Wstrict-prototypes, sorry, I've made a typo; I wanted to write -Wmissing-prototypes.

OK, I have tried -Wmissing-prototypes as well as both -Wmissing and -Wstrict together with -Werror, and I'm still getting pure silence. That fact makes me bother you again...
It is simple. Unprototyped functions in C are implicitly declared as int function(); thus you get the effect of:

int nearbyint();
int main (void) { printf ("%f\n", nearbyint (24.5)); }

As nearbyint in libm returns double, not int (and the IA-32 calling convention returns integral return values differently than floating-point ones), there is garbage in the register in which C expects the return value (%eax). Then you pass this to printf..., which means the random garbage in %eax is stored onto the stack, and printf then tries to interpret that value as a double.

As for the warning switch, I should have actually tried it. You can use -Werror-implicit-function-declaration, -Wall -Werror, -Wimplicit-function-declaration -Werror, or -Wimplicit -Werror.
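(For what it's worth, a minimal sketch, not taken from this thread, of a workaround when the C99 feature macros can't be enabled: supplying the correct prototype by hand replaces the implicit int declaration, so the caller reads the floating-point return register instead of %eax:

#include <stdio.h>

double nearbyint(double);   /* matches what libm actually defines */

int main (void)
{
    printf ("%f\n", nearbyint (24.5));   /* 24.000000 under the default round-to-nearest-even mode */
    return 0;
}

It still has to be linked with -lm; enabling the proper feature macros as described above is of course the cleaner fix.)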