I'm developing a small logic analyzer with 7 inputs. My target device is an
ATmega168 with a 20 MHz clock. To detect logic changes I use pin change interrupts. Now I'm trying to find out the shortest interval between pin changes that I can still detect. I determined a minimum of 5.6 µs (178.5 kHz); any signal faster than that I can't capture properly.
My code is written in C (avr-gcc). My interrupt routine looks like this:
Code:
ISR(PCINT1_vect)            // pin change interrupt for the PINC pins
{
    pinc = PINC;            // captured pin state (uint8_t)
    timestamp_ll = TCNT1L;  // Timer1 counter, low byte (uint8_t)
    timestamp_lh = TCNT1H;  // Timer1 counter, high byte (uint8_t)
    timestamp_h = timerh;   // software-extended high word (uint16_t)
    stack_counter++;        // number of captured events
}
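For reference, here is roughly how the supporting declarations and setup look. This is a simplified sketch: the variable names match my ISR, but the mask, prescaler and function name (capture_init) are only illustrative; the register names are from the ATmega168 datasheet.

Code:
#include <avr/io.h>
#include <avr/interrupt.h>

/* Globals shared with the capture ISR */
volatile uint8_t  pinc;           // last captured PINC state
volatile uint8_t  timestamp_ll;   // TCNT1 low byte at capture time
volatile uint8_t  timestamp_lh;   // TCNT1 high byte at capture time
volatile uint16_t timerh;         // software-extended upper 16 bits of the timebase
volatile uint16_t timestamp_h;    // snapshot of timerh at capture time
volatile uint8_t  stack_counter;  // number of captured events

// Extend the 16-bit Timer1 into a 32-bit timebase
ISR(TIMER1_OVF_vect)
{
    timerh++;
}

static void capture_init(void)
{
    DDRC   = 0x00;            // PC0..PC6 as inputs
    PCMSK1 = 0x7F;            // pin change enabled on PCINT8..PCINT14 (port C)
    PCICR |= (1 << PCIE1);    // enable pin change interrupt group 1

    TCCR1A = 0;               // Timer1 in normal mode
    TCCR1B = (1 << CS10);     // no prescaling: one tick = one CPU cycle (50 ns)
    TIMSK1 = (1 << TOIE1);    // overflow interrupt maintains timerh

    sei();                    // global interrupt enable
}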
My captured signal change is stored in pinc. To locate it in time I keep a 4-byte timestamp (the two TCNT1 bytes plus the 2-byte software counter timerh).
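In the main loop the full 32-bit timestamp is put back together roughly like this (a sketch; the helper name is just for illustration):

Code:
// Reassemble the 4-byte timestamp: Timer1 low/high byte plus the
// software-extended high word captured in the ISR.
static uint32_t event_timestamp(void)
{
    return ((uint32_t)timestamp_h  << 16)
         | ((uint32_t)timestamp_lh <<  8)
         |  (uint32_t)timestamp_ll;
}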
In the datasheet I read that it takes 5 clock cycles to enter the interrupt service routine and 5 clock cycles to return to the main program. I'm assuming each statement in my ISR takes 1 clock cycle to execute, so in total there should be an overhead of 5 + 5 + 5 = 15 clock cycles. At a 20 MHz clock rate one cycle lasts 1/20,000,000 s = 50 ns, so the total overhead should be 15 * 50 ns = 750 ns = 0.75 µs. Now I don't understand why I can't capture anything shorter than 5.6 µs. Can anyone explain what's going on?
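For reference, here is the overhead budget I'm assuming, written out (the cycle counts are my reading of the datasheet plus the 1-cycle-per-statement guess, not measured values):

Code:
// Expected ISR overhead under my assumptions:
//   entry latency : 5 cycles
//   ISR body      : 5 cycles (1 cycle per statement)
//   return        : 5 cycles
// One cycle at 20 MHz = 1 / 20,000,000 s = 50 ns
// Total           : 15 cycles * 50 ns = 750 ns = 0.75 µs
#define F_CPU_HZ        20000000UL
#define OVERHEAD_CYCLES (5 + 5 + 5)
#define OVERHEAD_NS     (OVERHEAD_CYCLES * (1000000000UL / F_CPU_HZ))   // 750 ns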