printf () bug?
From: Bill Schaffer
Subject: printf () bug?
Date: Wed, 24 Aug 2005 19:17:58 -0400
Is there a printf() problem using 'long double' vs 'double' on
Windows XP with i386?
printf("%.20lf", d) for a 'double' and printf("%.20Lf", ld) for a
'long double' both result in 16-17 digits of precision. For an IEEE
double that is expected: 1 sign bit and an 11-bit exponent leave
64 - 11 - 1 = 52 stored mantissa bits (53 with the implicit leading
bit). 2^53 = 9.0 * 10^15, or about 15-16 digits of precision. A
'long double' should do better than that.
Printing various examples shows 16-digit accuracy for both, even
though 'sizeof(long double)' equals 12 bytes and 'sizeof(double)'
equals 8 (the digit counts at the right mark the limit of accuracy):

sizeof fmt  value
            1.6867376747234567891345            Original
08     %lf  1.68673767472345681462              16 digits
12     %Lf  1.68673767472345681462

            161600.6867376747234567891          Original
08     %lf  161600.68673767472500912845         17 digits
12     %Lf  161600.68673767472500912845

            123456789123.123456789123           Original
08     %lf  123456789123.12345886230468750000   17 digits
12     %Lf  123456789123.12345886230468750000

            1234567891237.123456789123          Original
08     %lf  1234567891237.12353515625000000000  16 digits
12     %Lf  1234567891237.12353515625000000000
Both types print the same. It seems 'long double' should give better
precision.
Is this a printf() problem?
Or a compiler problem loading the 'long double' constant?
Or does 'long double' allocate 12 bytes while the i386 code only handles 8?
Or is it just me?
Regards
Bill Schaffer