hmmpic
Joined: 09 Mar 2010  Posts: 314  Location: Denmark
Change osc and have the delay_ms match...
Posted: Mon Jul 18, 2016 2:16 pm
I want to change the oscillator from 4 MHz to 16 MHz for some time-critical functions.
The switch itself is easy, but then the delay_xx() timings no longer match.
I tried putting #use delay(clock=16M,int) in the middle of my code, but that doesn't work at all.
It is possible to use #use delay(...) more than once, and the last one is what the compiler uses, but how does that work?
I can recalculate the delays when switching from 4 to 16 MHz, but if #use delay(clock=16M,int) could handle it, that would be nicest, I think.
Has anyone got this to work?
newguy
Joined: 24 Jun 2004  Posts: 1908
Posted: Mon Jul 18, 2016 3:20 pm
Have you tried just leaving #use delay(clock=4M) in place and, when you switch to the 16 MHz clock, calling an alternate delay_ms() function?
Code:
void my_16MHz_delay_ms(unsigned int16 delay) {
   delay_ms(4 * delay); // correct when the clock is actually 16MHz but the compiler thinks it's 4MHz
}
hmmpic
Joined: 09 Mar 2010  Posts: 314  Location: Denmark
Posted: Tue Jul 19, 2016 12:04 am
@newguy, what you suggest is what I have done as well. I just thought that if the compiler could do it all for me, I would prefer that.
Ttelmah
Joined: 11 Mar 2010  Posts: 19515
Posted: Wed Jul 20, 2016 10:16 am
At the end of the day, you won't be running the same code at low power as at full power. Use the same trick with a #define'd factor, and #define yourself another version of delay_xx. So:
Code:
#define delay_slow_ms(x) delay_ms((x) / factor)  // factor = fast clock / slow clock, e.g. 16MHz / 4MHz = 4
The maths is done at compile time, and in integer arithmetic.
jeremiah
Joined: 20 Jul 2010  Posts: 1349
Posted: Wed Jul 20, 2016 1:09 pm
Depending on the chip, your compiler version, and how complex your clock settings are, I have in the past used multiple #use delay directives in the following manner:
Code:
// This is a dummy one, just to get the right scaling for delay.
// Only use clock= in this #use delay.
#use delay(clock=4MHz)
#inline void delay_4MHz_ms(unsigned int16 msecs){
   delay_ms(msecs);
}

// This is the REAL clock setting for my PIC; can use internal, osc, etc.
#use delay(internal=8MHz, clock=8MHz)
#inline void delay_8MHz_ms(unsigned int16 msecs){
   delay_ms(msecs);
}

void main(){
   delay_8MHz_ms(1000);
   delay_4MHz_ms(1000);
   delay_ms(1000);
   while(TRUE);
}
The last #use delay seen should be the one the compiler actually sets the PIC up for, so the earlier ones just give me differently scaled delay functions. NOTE that you still have to change the clock settings manually at run time to physically switch speed; this only provides the scaled delay_ms() functions. I don't know if this is an intentionally supported feature, though.
And all the #use delay directives need to be near the top of your code, not mixed into the middle.
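To make that note concrete: the scaled delay functions only fix the timing maths; the clock itself still has to be switched at run time. A sketch of how that might look in CCS C. setup_oscillator() is the compiler's built-in, but the OSC_* constants are chip-specific, so treat these exact names (and the helper delay functions, built as in the post above) as assumptions for your particular PIC:

```c
// Compiler is told the final speed; earlier #use delay directives
// (as in the example above) provide the differently scaled helpers.
#use delay(internal=16MHz)

void run_fast_section(void) {
   setup_oscillator(OSC_16MHZ);   // switch the internal oscillator up
   delay_ms(100);                 // delay_ms() is scaled for 16MHz, so correct here
}

void run_slow_section(void) {
   setup_oscillator(OSC_4MHZ);    // drop back to 4MHz to save power
   delay_4MHz_ms(100);            // use a 4MHz-scaled delay while running slow
}
```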
hmmpic
Joined: 09 Mar 2010  Posts: 314  Location: Denmark
Posted: Thu Jul 21, 2016 12:03 am
Hi,
Thanks for all the hints.
Changing the clock on the fly can be complex depending on the setup: timers, ADC, PWM, and other peripherals. I will do the delay as I have done before; it looks like what @newguy posted. The CCS compiler has many nice features, but sometimes it takes too many hours to learn and know them all. I will keep it simple and do the delay maths in a custom function with some bit logic.
Thanks.