From Bugzilla Helper:
User-Agent: Mozilla/4.0 (compatible; MSIE 5.5; Windows 98; Tucows)
strlen and strrchr (the only two I tested) both segfault when passed a
NULL pointer. They should at least return 0 or -1 -- there _should_ be a
NULL check in these functions.
Steps to Reproduce:
1. int a = strlen (NULL);
2. char *b = strrchr (NULL, 'c');
Actual Results: Segmentation fault.
Expected Results: Returned an error.
E.g. ISO C99, 7.1.4 clearly states that:
If an argument to a function has an invalid value (such as ... a null pointer
...) ..., the behaviour is undefined.
This general rule is not overridden for any of the functions you named above,
so it applies.
And there is no reason to slow things down for the sake of invalid programs.
You can use wrappers around the standard functions which will check their
arguments before passing them to the libc implementation.
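For instance, a minimal sketch of such wrappers (the x-prefixed names are made up here for illustration, not glibc API), mapping NULL to a harmless result:

```c
#include <stddef.h>
#include <string.h>

/* Illustrative wrappers: check for NULL before calling into libc, so
 * the caller gets a defined result instead of undefined behaviour.
 * The names are examples only, not part of any standard library. */
static size_t xstrlen(const char *s)
{
    return s ? strlen(s) : 0;        /* treat NULL as an empty string */
}

static char *xstrrchr(const char *s, int c)
{
    return s ? strrchr(s, c) : NULL; /* NULL in, NULL out */
}
```

This keeps the checking cost in the programs that want it, without slowing down every caller of the standard functions.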
"Undefined" behaviour does not equate to a segmentation fault, it is simply
undefined (unspecified) and up to the programmer to interpret. If you believe
those two CPU cycles are so precious, far be it for me to disagree (glibc isn't
exactly _that_ efficient in the first place), but the stability of the system
should be a higher priority than sheer speed.
Note: in good programming practice, it is up to the library to check for
invalid parameters and tell the calling program.
A simple "if (value == NULL) return -1;" would be nice.
PS, a third opinion on this would be nice as well (I'll leave as 'resolved' in
Undefined behaviour means the routine can do anything, format your disks,
do nothing, send a signal, whatever.
Why is NULL so special, btw? Passing e.g. 0xdeadbeef (provided it does not
point to mapped memory) is in the same category (but much harder to track down).
Note that the standards explicitly say which functions allow NULL as an
argument, e.g. free(NULL) is valid.
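To illustrate the contrast: C99 7.20.3.2 spells out that for free(), "If ptr is a null pointer, no action occurs." A tiny sketch (the helper name is made up for illustration):

```c
#include <stdlib.h>

/* free() is one of the functions whose NULL handling the standard
 * defines explicitly (C99 7.20.3.2: a null pointer means no action).
 * This check function only returns if free(NULL) completed normally. */
static int free_null_is_safe(void)
{
    free(NULL);               /* defined no-op, unlike strlen(NULL) */

    char *p = malloc(16);     /* the ordinary case still works */
    free(p);
    free(NULL);               /* and NULL remains harmless afterwards */

    return 1;
}
```

So NULL acceptance is a per-function contract in the standard, not a blanket guarantee callers may assume.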
Even if glibc had this argument checking, if you tried to run your program
on Solaris, Irix, *BSD, you name it, it would crash there, so it would be highly
unportable.