Wed, 14 Jun 2000 14:40:59 +0300 (EEST)
When I tried the GMP library I discovered a mysterious behaviour of
the mpz_perfect_square_p function.
The included code initializes two integers (the second one is zero)
and multiplies them. Then it prints the result and tests whether the
result is a perfect square (it should be, because it is always zero).
Nonetheless, the test program prints that it is not.
I noticed that the initial value of n1 affects the result: if it is
changed, the new program may or may not trigger the bug.
The program was compiled with gcc 2.95.2 using the command
gcc -o test test.c -Wall -W -O2 -g -lgmp
and run using ./test
Libgmp is version 2.0.2.
The output of 'uname -a' is
Linux lemdara 2.2.12 #7 su maalis 26 16:26:00 EEST 2000 i686 unknown
The essential part of the test code:

    mpz_t n1, n2;
    mpz_init_set_ui(n1, 7);  /* value for illustration; any initial value */
    mpz_init_set_ui(n2, 0);  /* the second integer is zero */
    mpz_mul(n1, n1, n2);     /* the product is always zero */
    if (mpz_perfect_square_p(n1))
        printf("It's a perfect square.\n");
    else
        printf("It's not a perfect square.\n");
mpz_perfect_square_p bug, Juho Östman