Description of problem:

I have some code something like this:

  uint64_t be;
  printf("%" PRIu64, be64toh(be));

This compiles on F17, but not on F18 and rawhide. The error is:

  ldm.c:1079:5: error: format '%lu' expects argument of type 'long unsigned int', but argument 5 has type 'long long unsigned int' [-Werror=format]

In F17, __bswap_64 is defined as:

  # if __WORDSIZE == 64
  #  define __bswap_64(x) \
       (__extension__                                                  \
        ({ register unsigned long __v, __x = (x);                      \
  <snip>
  # else
  #  define __bswap_64(x) \
       (__extension__                                                  \
        ({ union { __extension__ unsigned long long int __ll;          \
  <snip>

However, on F18 and rawhide it becomes:

  # if __GNUC_PREREQ (4, 2)
  static __inline unsigned long long int
  __bswap_64 (unsigned long long int __bsx)
  {
    return __builtin_bswap64 (__bsx);
  }
  # elif __WORDSIZE == 64
  <as above>

This means that on F18 and rawhide it returns unsigned long long int regardless of word size, which isn't the case on F17, causing my code to not compile.

Version-Release number of selected component (if applicable):
glibc-headers-2.16.90-11.fc19.x86_64.rpm
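For reference, a minimal self-contained reproducer (the file name and value are made up for illustration; assumes an x86_64 Fedora install and -Werror=format, matching the error above):

  /* repro.c -- hypothetical minimal reproducer.
     Build: gcc -Wall -Werror=format -o repro repro.c */
  #include <endian.h>
  #include <inttypes.h>
  #include <stdio.h>

  int
  main (void)
  {
    uint64_t be = htobe64 (42);  /* any big-endian 64-bit value */

    /* On F18/rawhide, be64toh() expands to the __bswap_64() inline,
       which returns unsigned long long int; PRIu64 is "lu" on x86_64,
       so -Werror=format rejects this line.  On F17 it compiled.  */
    printf ("%" PRIu64 "\n", be64toh (be));
    return 0;
  }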
Unfortunately, you're going to need to fix your code, as I don't see this changing. Moving to a uniform "unsigned long long" return value makes it easier to avoid warnings in packages that are compiled for multiple architectures with varying word sizes.
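For anyone else hitting this, a sketch of what such a source-level fix looks like in practice (untested; it casts the result back to the documented type so the format string matches on every word size, which is the workaround discussed in the next comment):

  /* Cast be64toh()'s result back to uint64_t so it always matches
     PRIu64, whichever underlying type the header hands back.  */
  printf ("%" PRIu64 "\n", (uint64_t) be64toh (be));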
Reopening. I didn't mark this as a regression lightly :)

I'm interested to know what a 'fix' would look like in this case. The man page for be64toh() defines the function as:

  uint64_t be64toh(uint64_t big_endian_64bits);

If I cast the return of be64toh() to uint64_t in the above case, the code compiles fine. However, this is nonsense, as that is the defined return type! Rather than returning unsigned long long, the code could instead return its documented uint64_t, whose definition is wordsize-dependent.
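For context, this is (simplified) how glibc's <stdint.h> picks the underlying type for uint64_t, which is why the documented return type would have matched PRIu64 on both word sizes:

  /* Simplified sketch of the glibc <stdint.h> typedef.  */
  # if __WORDSIZE == 64
  typedef unsigned long int uint64_t;          /* PRIu64 == "lu"  */
  # else
  __extension__
  typedef unsigned long long int uint64_t;     /* PRIu64 == "llu" */
  # endif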
This was fixed in rawhide by a rebase, since upstream took this patch in October:
http://sourceware.org/git/?p=glibc.git;a=commitdiff;h=d394eb742a3565d7fe7a4b02710a60b5f219ee64
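I haven't quoted the commit verbatim, but the net effect in <bits/byteswap.h> should be along these lines: the inline is declared in terms of __uint64_t, so its return type tracks the platform's 64-bit type again:

  /* Paraphrased sketch of the post-fix definition, not copied from the
     commit: __uint64_t is unsigned long on 64-bit and unsigned long
     long on 32-bit, so "%" PRIu64 matches on both.  */
  static __inline __uint64_t
  __bswap_64 (__uint64_t __bsx)
  {
    return __builtin_bswap64 (__bsx);
  }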