[avr-libc-dev] Problem with _delay_ms()
From: Curtis Maloney
Subject: [avr-libc-dev] Problem with _delay_ms()
Date: Fri, 19 Aug 2005 15:00:22 +1000
User-agent: Debian Thunderbird 1.0.6 (X11/20050802)
Now, I've been working so far on the assumption that avr-libc is developed
solely for use with GCC, so it's not a problem to use GCC-isms in the headers.
Today I ran into a problem with _delay_ms(): it pulled in the floating-point
code. I found this odd, because my project had used _delay_ms() elsewhere
without this effect.
I traced it down to this:
The existing code called _delay_ms() only once, so gcc inlined it, and the
optimisers (even, or perhaps especially, at -Os) reduced the floating-point
calculation to a constant integer.
The new code called it twice, with different values, so the inline-able
function was not inlined, and thus the constant was never folded away.
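
A minimal sketch of the pattern, assuming the <avr/delay.h> header of the
1.2 series (later avr-libc moved it to <util/delay.h>) and an arbitrary
F_CPU of my own choosing:

    #define F_CPU 8000000UL
    #include <avr/delay.h>

    void blink_once(void)
    {
        _delay_ms(100);   /* one call site: gcc inlines it and the
                             double maths folds to a constant */
    }

    void blink_twice(void)
    {
        _delay_ms(100);   /* two call sites with different arguments:
                             gcc may emit _delay_ms out of line instead, */
        _delay_ms(250);   /* dragging in the software float routines */
    }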
I fixed this by giving the function declaration
__attribute__((always_inline)), and this cut some 0x600+ bytes off my
program image size.
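
Roughly, the change amounts to forcing the inline on the declaration. Shown
here on a toy delay helper of my own, just to illustrate the attribute (this
is not the actual avr-libc source):

    /* Forward declaration carrying the attribute. */
    static inline void toy_delay(double loops) __attribute__((always_inline));

    static inline void toy_delay(double loops)
    {
        /* gcc must now inline every call, so a constant argument lets the
           optimiser fold the double arithmetic away at compile time. */
        unsigned long n = (unsigned long)loops;
        while (n--)
            __asm__ volatile ("nop");
    }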
I suppose I should submit this to the bug tracker?
(gcc 4.0.1, binutils 2.16.1, avr-libc 1.2.5)
--
Curtis Maloney