mutthunaveen
Joined: 08 Apr 2009   Posts: 100   Location: Chennai, India
Subject: delay in main causes delay in timer0 interrupt
Posted: Thu Oct 22, 2015 10:34 pm
Dear Forum,
Yesterday I started writing a program for a 16F676 uC, using the internal 4 MHz oscillator. It uses:
1. Timer0 -- toggles a YELLOW LED when a variable decremented inside the Timer0 ISR reaches zero.
2. EXT INT -- toggles a GREEN LED with a small delay (delay_ms(10)) inside the ISR.
3. void main() with an infinite loop -- toggles a RED LED with delay_ms(1000).

Problem:
When the delay in main is removed, Timer0 works fine.
When the delay in main is enabled, the Timer0 period increases correspondingly.

Where am I going wrong? Can someone help me?
Thank you
Ttelmah
Joined: 11 Mar 2010   Posts: 19504
Posted: Fri Oct 23, 2015 12:31 am
You will be getting a compiler warning saying that interrupts are disabled to prevent re-entrancy.

The problem is that you are using the delay routine both inside an interrupt (INT_EXT) and outside it. On the PIC a function cannot be called from inside itself (this is 're-entrancy'). So, to prevent the possibility of the delay function being interrupted and called again from within itself, interrupts have to be disabled whenever you use the delay in main.

Now, realistically, using a delay inside an interrupt (except for the very shortest ones, a couple of cycles) is not good practice. The general rule is that interrupt handlers should always exit as soon as possible. If you want something to happen a moment after an interrupt fires, program a timer to trigger at the required interval and act on that timer's interrupt instead.

As a separate point, consider not using delays in the main code at all. If, for instance, you have an interrupt triggering every few hundredths of a second, use it to time your delays with a simple counter. In the interrupt, have something like:
Code:
    //global variable
    unsigned int16 tick_counter;

    //then, in the interrupt handler
    if (tick_counter)
        --tick_counter;
This way the 'tick_counter' will count _down_ to zero, and stop when it gets there.
Then in the main, have something like:
Code:
    #define TICKS_PER_SECOND xxx   //whatever the number is

    //then, to delay for two seconds:
    tick_counter = (TICKS_PER_SECOND*2)-1;   //this calculation is done by the
                                             //compiler, so takes 'no time'
                                             //for the processor
    while (tick_counter)
    {
        //do any jobs you want to do while waiting
    }
This way you can do 'things' while you are waiting, which is a much better use of processor resources.
mutthunaveen
Joined: 08 Apr 2009   Posts: 100   Location: Chennai, India
Subject: Nice lesson
Posted: Fri Oct 23, 2015 2:06 am
Dear Ttelmah,
Thank you for the detailed explanation. Now I understand how a delay impacts interrupts and how to apply delays smartly. Best practice is not to use a delay anywhere in the code unless it is very short.
Thank you, Ttelmah