Consider the following C code:
#include <stdio.h>

int main(int argc, char* argv[])
{
    const long double ld = 0.12345678901234567890123456789012345L;
    /* %zu is the portable conversion for size_t; %.36Lf prints 36 decimals */
    printf("%zu %.36Lf\n", sizeof(ld), ld);
    return 0;
}
Compiled with gcc 4.8.1 under Ubuntu 13.04 x64, it prints:
16 0.123456789012345678901321800735590983
This tells me that a long double weighs 16 bytes, but the decimals seem to be correct only up to the 20th place. How is that possible? 16 bytes corresponds to a quad, and a quad would give me between 33 and 36 significant decimal digits.
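One way to see how much precision the type actually carries, as opposed to how much storage it occupies, is to print the standard <float.h> macros next to the size. This is a minimal sketch, assuming the usual gcc/glibc behavior on this setup:

#include <stdio.h>
#include <float.h>

int main(void)
{
    /* sizeof reports storage, including any alignment padding */
    printf("sizeof(long double) = %zu bytes\n", sizeof(long double));
    /* LDBL_MANT_DIG: bits in the significand of long double */
    printf("LDBL_MANT_DIG = %d\n", LDBL_MANT_DIG);
    /* LDBL_DIG: decimal digits guaranteed to survive a round-trip */
    printf("LDBL_DIG = %d\n", LDBL_DIG);
    return 0;
}

If the significand turns out to be narrower than the 113 bits of IEEE binary128, that would suggest the storage size and the arithmetic precision are two different things.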