hayee
Joined: 05 Sep 2007 Posts: 252
|
How to generate delay in ms |
Posted: Fri Feb 12, 2010 1:41 am |
|
|
Hi,
I want to generate a delay in milliseconds. The range of the delay is 0-200 ms, and the values will be entered from the keypad.
What is the best way to generate this delay? I don't want to use any kind of interrupt, because I am already using the Timer0 interrupt, which fires every 100 ms. I am also using the external and serial interrupts.
Thanks |
|
|
bungee-
Joined: 27 Jun 2007 Posts: 206
|
|
Posted: Fri Feb 12, 2010 1:50 am |
|
|
Is delay_ms(x) not good enough for you? |
|
|
hayee
Joined: 05 Sep 2007 Posts: 252
|
|
Posted: Fri Feb 12, 2010 2:12 am |
|
|
Yes, the simplest solution is to use delay_ms(), but I am looking for another option.
What happens when I am inside the delay_ms() routine and an interrupt comes?
If the interrupt is serviced first, does the period of the delay routine increase or not? If yes, then that is the reason I am not willing to use the delay_ms() routine. |
|
|
Ttelmah Guest
|
|
Posted: Fri Feb 12, 2010 5:08 am |
|
|
It'd perhaps help if you actually said why you don't want to use delay_ms()....
Making a guess:
There are _three_ basic ways of timing on the PIC without using external hardware:
1) Use an interrupt. Downsides: needs a handler, adds latency, and interferes with other interrupt events.
2) Use a software timing loop - this is what 'delay_ms' does. Plus side: very simple. Downside: the timing is stretched by any time spent handling interrupts.
3) Use a hardware timer, _without_ an interrupt.
I'd suspect that you want your timing handled more accurately than the software loop allows, while still servicing your current interrupts.
If so, then option '3' may be the way to go, but you don't tell us enough to really give you an example.
What you would do depends on having another timer available, what clock rate you are running, etc..
However, the basic operation would be:
Set up the timer's prescaler so that its timeout period is slightly longer than your longest time - in your case 200 mSec. Calculate in advance, for the given clock rate, how many counts per mSec this timer will then have.
Then, when you want to start the timing operation:
1) Load the timer (assuming a 16-bit timer) with 65536 - (mSec * counts_per_mSec).
2) Clear the timer interrupt flag.
3) Do your other jobs, waiting, etc., until the timer flag sets.
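A minimal CCS C sketch of those three steps is below. It assumes a 4 MHz clock (1 MHz instruction rate), a spare Timer1 with a 1:8 prescaler (125 counts per mSec, so a full 16-bit rollover covers roughly 524 mSec), and that Timer1's interrupt is never enabled, so its flag simply sets on overflow. The names are illustrative; interrupt_active() is available in recent CCS compilers - on older ones test the TMR1IF bit directly.
Code: | // Sketch only - illustrative setup and names, not from this thread.
#define COUNTS_PER_MS  125             // 1MHz instruction clock / 8 prescale

void start_delay(unsigned int16 ms)    // ms = 0..200, as entered from the keypad
{
   setup_timer_1(T1_INTERNAL | T1_DIV_BY_8);    // free running, interrupt NOT enabled
   set_timer1(65536 - (ms * COUNTS_PER_MS));    // overflows after 'ms' milliseconds
   clear_interrupt(INT_TIMER1);                 // step 2: clear the overflow flag
}

int1 delay_expired(void)
{
   return interrupt_active(INT_TIMER1);         // step 3: poll the flag, no ISR involved
}

// Usage: start_delay(20); then do other jobs and poll delay_expired() until TRUE. |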
Best Wishes |
|
|
hayee
Joined: 05 Sep 2007 Posts: 252
|
|
Posted: Mon Feb 15, 2010 12:31 am |
|
|
Ttelmah, thanks.
I have a system that is connected to a GPS module (GPS-310F).
1- The GPS gives me a pulse every 1 second, which I read through an external interrupt (used to synchronize the system with GPS time).
2- The GPS also gives me the date and time, which I read through the serial interrupt.
3- I am generating pulses according to values entered from the keypad (the pulses have both an on time and an off time, each entered from the keypad). The range of these values is 0-999.9 in 0.1 steps, which is why I am using Timer0 to generate an interrupt every 100 ms.
What I further want is for the user to enter delay values from the keypad. There are two delays, on_delay and off_delay, each between 0-200 ms - for example 20 ms on and 10 ms off.
When the pulses start, the on delay should run before the pulse goes high, and the off delay should run before the pulse goes low; this process repeats continuously.
Another way to describe what I want is:
on_delay(20ms)->pulse_high->off_delay(10ms)->pulse_low->on_delay(20ms)->pulse_high->off_delay(10ms)->pulse_low..................cont
I hope you all understand what I am trying to do. |
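To illustrate how that sequence could be driven, here is a rough sketch reusing the hypothetical start_delay()/delay_expired() helpers from the Timer1 example above; PULSE_PIN, on_delay_ms and off_delay_ms are placeholder names, not from this thread.
Code: | // Sketch only - start_delay()/delay_expired() are the hypothetical Timer1
// helpers sketched above; PULSE_PIN and the delay variables are placeholders.
#define PULSE_PIN  PIN_B0              // placeholder output pin

unsigned int16 on_delay_ms  = 20;      // 0..200ms, from the keypad
unsigned int16 off_delay_ms = 10;      // 0..200ms, from the keypad

// inside main(), after the keypad values have been read:
while(TRUE)
{
   start_delay(on_delay_ms);           // on delay runs first...
   while(!delay_expired())
      ;                                // ...interrupts keep running meanwhile
   output_high(PULSE_PIN);             // ...then the pulse goes high

   start_delay(off_delay_ms);          // off delay runs...
   while(!delay_expired())
      ;
   output_low(PULSE_PIN);              // ...then the pulse goes low, and it repeats
} |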
|
|
Guest
|
|
Posted: Tue Feb 16, 2010 1:21 am |
|
|
Another method of delaying without pausing the entire code is:
Code: | if(time<(keypad/1000)) //keypad = the number entered in the keypad by the user
{
delay_us(1000); //1ms, this can be reduced if you want the code to check for interrupts more often
++time;
}
if(time>=keypad)
{
time=0; //reset
GO-TO-NEXT-FUNCTION();
} |
-Vinnie- |
|
|
Guest
|
|
Posted: Tue Feb 16, 2010 1:23 am |
|
|
My mistake, I meant this:
Code: | if(time<keypad) //keypad = the number entered in the keypad by the user
{
delay_us(1000*keypad); //1ms, this can be reduced if you want the code to check for interrupts more often
++time;
}
if(time>=keypad)
{
time=0; //reset
GO-TO-NEXT-FUNCTION();
} |
|
|
|
vinniewryan
Joined: 29 Jul 2009 Posts: 154 Location: at work
|
|
Posted: Tue Feb 16, 2010 1:26 am |
|
|
I forgot to fix the comment that time; I wasn't logged in, so I can't edit the post. I'm sure you get the basic idea: instead of delaying for the whole target duration at once, break it up into many small delays until the target duration is met; that way the code can check for inputs and interrupts in between. _________________ Vinnie Ryan |
|
|
Guest
|
|
Posted: Tue Feb 16, 2010 6:01 am |
|
|
Thanks vinniewryan.
But how does this scheme work?
Suppose 20 is entered from the keypad. This means a 20 ms delay is required.
If I put value=20 into keypad, it becomes
delay_us(1000*20) => delay_us(20000)
and if I put value=0.02 (in ms), it becomes
delay_us(1000*0.02) => delay_us(20)
In both cases the delay will be longer than intended; there is something missing in your code. |
|
|
Guest
|
|
Posted: Tue Feb 16, 2010 11:07 am |
|
|
It looks correct to me; maybe I'm missing something.
I'll use your example:
If you put value=20 into keypad, it becomes
delay_us(1000*20) => delay_us(20000), which is the same as 20 ms,
and if you put value=0.02 (in ms), it becomes
delay_us(1000*0.02) => delay_us(20), which is the same as 0.02 ms.
us = nano seconds, ms = milliseconds; 1000 nano seconds (us) is the same as 1 millisecond (ms), so if you entered 20 ms on the keypad it should delay for 20,000 us (nano seconds), which is the same as 20 ms.
Sorry if you already know this; it just seems like maybe you didn't realize it, because your example code looks like it will work.
If you do use this method, make sure you don't delay for the whole duration entered on the keypad all at once. Use the 'time' variable so the code never pauses long enough to miss an interrupt. I was tired when I posted that code and I already see a mistake. Here's the actual code that would work:
if(time<keypad)                  // keypad = the ms value entered by the user
{
   delay_ms(1);                  // 1 ms slice; reduce this to check for interrupts more often
   ++time;
}
if(time>=keypad)
{
   time=0;                       // reset for the next delay
   GO-TO-NEXT-FUNCTION();        // placeholder for whatever runs after the delay
} |
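Wrapped into a callable routine, the same idea might look like the sketch below - a rough CCS C illustration where 'keypad_ms' and next_function() are placeholder names, not from the posts above, and poll_delay() is meant to be called repeatedly from the main loop between other jobs.
Code: | // Sketch only - placeholder names; call poll_delay() repeatedly from main().
void next_function(void)
{
   // whatever should run once the delay has expired
}

unsigned int8 time = 0;                    // elapsed milliseconds so far

void poll_delay(unsigned int8 keypad_ms)   // keypad_ms = 0..200, from the keypad
{
   if(time < keypad_ms)
   {
      delay_ms(1);                         // 1 ms slice; shorten it to poll inputs more often
      ++time;
   }

   if(time >= keypad_ms)
   {
      time = 0;                            // reset for the next delay
      next_function();
   }
} |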
|
|
hayee
Joined: 05 Sep 2007 Posts: 252
|
|
Posted: Tue Feb 16, 2010 10:39 pm |
|
|
I think that ms stands for milliseconds and us stands for microseconds.
Maybe this scheme will work; I will try it and hope that it suits me. |
|
|
asmallri
Joined: 12 Aug 2004 Posts: 1635 Location: Perth, Australia
|
Re: How to generate delay in ms |
Posted: Tue Feb 16, 2010 11:15 pm |
|
|
hayee wrote: | Hi,
I want to generate a delay in milliseconds. The range of the delay is 0-200 ms, and the values will be entered from the keypad.
What is the best way to generate this delay? I don't want to use any kind of interrupt, because I am already using the Timer0 interrupt, which fires every 100 ms. I am also using the external and serial interrupts.
Thanks |
I would tackle this differently. Set Timer0 to give an interrupt every 1 ms instead of every 100 ms. Set up a variable, say 100ms_tick, which is initialised to 100 and decremented by 1 every ms. When it hits 0, set it back to 100 and then perform whatever operation you previously performed every 100 ms.
Variable delays are now easy. Set a variable (my_delay) to whatever delay you want. Then, in the 1 ms interrupt handler, if the value is not currently 0 (very important), decrement it. If, after decrementing, the value is zero, set the flag my_delay_flag in the interrupt handler.
Now, in your mainline, to start a variable delay you set my_delay to whatever delay value you are looking for and clear my_delay_flag. You then poll my_delay_flag periodically to see whether it is set. If it is set, your delay has expired.
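As a rough CCS C sketch of that scheme - the 1 ms Timer0 setup is assumed to be done elsewhere, and names like ms_tick and do_100ms_work() are illustrative placeholders, not from this post:
Code: | // Sketch only - assumes Timer0 has already been set up (elsewhere) to
// interrupt every 1ms; ms_tick and do_100ms_work() are illustrative names.
unsigned int8 ms_tick       = 100;    // counts down the original 100ms period
unsigned int8 my_delay      = 0;      // variable delay in ms (0-200 fits in 8 bits)
int1          my_delay_flag = 0;      // set by the ISR when my_delay expires

void do_100ms_work(void)
{
   // whatever was previously done in the 100ms interrupt
}

#INT_TIMER0
void timer0_isr(void)                 // now fires every 1ms
{
   if(--ms_tick == 0)                 // 100 x 1ms = the original 100ms job
   {
      ms_tick = 100;
      do_100ms_work();
   }

   if(my_delay != 0)                  // only decrement while a delay is running
   {
      if(--my_delay == 0)
         my_delay_flag = 1;           // tell the mainline the delay has expired
   }
}

// Mainline usage:
//    my_delay_flag = 0;
//    my_delay = 20;                  // start a 20ms delay
//    while(!my_delay_flag) { /* do other jobs, poll the keypad, etc. */ } |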
If you need delay values greater than 255 ms, you will need to use a 16-bit or wider value for my_delay. In that case, in the mainline, before you write the new value into my_delay you must disable the timer interrupt, set the desired delay, and then re-enable the timer interrupt. _________________ Regards, Andrew
http://www.brushelectronics.com/software
Home of Ethernet, SD card and Encrypted Serial Bootloaders for PICs!! |