FJMSoft
Joined: 20 Oct 2013 Posts: 36
PIC12F675 Timer PWM + UART input
Posted: Fri Nov 01, 2013 1:54 am
Hello there.
I'm trying to implement a PWM using Timer0 and at the same time receive some UART data.
The PWM is implemented and working well.
The problem is the UART: as the uC doesn't have a hardware UART, it also doesn't have a serial interrupt.
How can I achieve this? Ideas?
Using just getc() is not working.
I also tried INT_EXT with a hand-coded UART, but something is going wrong (maybe my error, as I'm new to this language).
It's no problem to stop the PWM while receiving the UART data, but I can't freeze the uC waiting for the data.
To be more specific, the application is simple: the PWM duty will be set directly from the UART data, and this will happen at random times.
Thank you.
alan
Joined: 12 Nov 2012 Posts: 357 Location: South Africa
Posted: Fri Nov 01, 2013 2:19 am
Have a look at the code Ttelmah posted about a day or so ago, using Timer2 as a software UART.
Regards
Ttelmah
Joined: 11 Mar 2010 Posts: 19504
Posted: Fri Nov 01, 2013 4:59 am
You might even be able to use the same timer for the PWM...
Best Wishes
FJMSoft
Joined: 20 Oct 2013 Posts: 36
Posted: Fri Nov 01, 2013 1:11 pm
I have not found Ttelmah's post :(
As far as I understand, INT_EXT is the best option, but I really don't care whether it is INT_EXT or a timer, as long as it works.
The code I tried is this one, but it does not work and I don't know why.
Code:
int8 T;
#Define bt 100
#Define pt 110
#INT_EXT
void EXT_isr(void)
{
disable_interrupts(GLOBAL);
delay_us(pt);
shift_left(T,8,pin_a2);
delay_us(bt);
shift_left(T,8,pin_a2);
delay_us(bt);
shift_left(T,8,pin_a2);
delay_us(bt);
shift_left(T,8,pin_a2);
delay_us(bt);
shift_left(T,8,pin_a2);
delay_us(bt);
shift_left(T,8,pin_a2);
delay_us(bt);
shift_left(T,8,pin_a2);
delay_us(bt);
shift_left(T,8,pin_a2);
delay_us(bt);
delay_us(pt);
putc(T); // for testing
enable_interrupts(GLOBAL);
clear_interrupt(int_ext); // is this necessary?
}
Ttelmah
Joined: 11 Mar 2010 Posts: 19504
Posted: Fri Nov 01, 2013 1:31 pm
1) Don't use delays in interrupts - a search here will find a lot about this.
2) Never _ever_ _ever_ use enable_interrupts(GLOBAL) inside an interrupt; this is _forbidden_ on the PIC.
3) No, you don't need to clear the interrupt.
4) The code hasn't got a hope of working. The shifts and other code will take far too long to have even the faintest chance.
5) The first parameter to the shift is meant to be a variable _address_.
6) Then you are rotating 8 bytes. Ur....
Do some forum searching on how to generate a software PWM.
FJMSoft
Joined: 20 Oct 2013 Posts: 36
Posted: Fri Nov 01, 2013 2:14 pm
Oh god, I don't know where to run now :(
1) I thought that if all interrupts were disabled the delay would not be a problem.
2) I didn't know about this; I used to program the 8051 before, where disabling interrupts inside interrupts was very common.
Is the problem disabling global or any interrupt? Could I disable only the PWM timer then?
4) Serious? Oh god! I thought this was very fast; on the 8051 it was 1 or 2 cycles.
5) How can I do this?
6) My mistake! I misread byte as bit.
My PWM is working well: I'm using Timer0 to increase a counter and compare it with the duty value.
The PWM is not the problem, but the combination PWM+UART.
Anyway, I have searched the forum about the PWM as you said but haven't found anything helpful :/
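The counter-compare scheme described here (a timer ISR increments a free-running counter and drives the pin high while the counter is below the duty value) can be sketched in plain C. This is a simulation of the logic only, not CCS firmware, and all names are illustrative:

```c
#include <assert.h>
#include <stdint.h>

/* One tick of a software PWM: an 8-bit counter free-runs and the
 * output is high while the counter is below the duty value.
 * In real firmware this body would live in the Timer0 ISR. */
typedef struct {
    uint8_t counter; /* free-running 0..255 */
    uint8_t duty;    /* compare value: 0 = always low */
    uint8_t pin;     /* latest computed output level, 0 or 1 */
} soft_pwm;

void soft_pwm_tick(soft_pwm *p)
{
    p->pin = (p->counter < p->duty) ? 1 : 0;
    p->counter++;                  /* wraps naturally at 256 */
}

/* Helper: run n ticks and count how many were high. */
uint32_t soft_pwm_high_ticks(soft_pwm *p, uint32_t n)
{
    uint32_t high = 0;
    for (uint32_t i = 0; i < n; i++) {
        soft_pwm_tick(p);
        high += p->pin;
    }
    return high;
}
```

Over one full 256-tick period the output is high for exactly `duty` ticks, which is why writing the received UART byte straight into `duty` gives the behaviour the original post asks for.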
dbotkin
Joined: 08 Sep 2003 Posts: 197 Location: Omaha NE USA
Posted: Fri Nov 01, 2013 3:37 pm
Are you completely bound to the 12F675? The 12F1822 is a FAR more capable chip: pin compatible, actually a little cheaper in quantity, and it has a hardware EUSART in addition to many, many other improvements. Even little things like a WPU on the MCLR pin make life so much easier.
C Turner
Joined: 10 Nov 2003 Posts: 40 Location: Utah
Posted: Fri Nov 01, 2013 4:14 pm
I've implemented "software" UARTs inside ISRs, but it takes a bit of planning beforehand.
If at all possible, set the ISR rate to be an integer multiple of the baud rate of the UART. If you are using it only to transmit, this can be a 1:1 ratio, but if you are making a receiver, it needs to be at least 3x the baud rate so that you can make "early / on-time / late" determinations and adjust the timing as necessary.
In one instance I had a PWM and ISR both running at 19.53125 kHz (e.g. a 20 MHz clock, Timer2 with a divisor of 0 (which is 256) and an ISR post-divisor of 1).
For synthesizing a software UART I used 16x oversampling (a baud rate error of 1.7% or so), and in the same ISR that updated the PWM I also serviced the incoming/outgoing bits. If I found that a received bit was slightly early or late, I would bump that 16x oversample counter up or down as appropriate to keep the transition of the incoming data within the "centre" sample.
In theory I could probably have gotten away with up to 4800 baud with the 19.53125 kHz ISR - but not much higher (with a standard baud rate, anyway) unless I'd redone my ISR rate so that it equaled a multiple of the desired baud rate. The problem with doing this is that if you use Timer2 you can adversely impact your PWM resolution if you aren't careful: this is discussed in detail in the part's data sheet.
In my case I was using the A/D and PWM to perform some rudimentary DSP audio processing, and I could use ONLY one ISR lest I suffer severe timing jitter. The A/D read - then A/D start - was always followed by a PWM value calculated the last time I'd been in the ISR. Once I got the clock-cycle-critical portions out of the way, I was able to do the other portions that were less "jitter sensitive", such as updating system timers and servicing the "software UART".
* * *
As has been pointed out on this board before, if you *do* have a time-intensive ISR, the more conventional "software UARTs" (e.g. those outside the ISR, using printf, getc, etc.) often fail, since the DELAY being calculated by the compiler based on clock speed will be WRONG: some of the clock cycles used to time the delay will be eaten by the ISR.
In some cases - notably at low baud rates *AND* where the ISR was very consistent in how many clock cycles it ate and occurred at a rate much higher than the serial bit rate - I was successful after I wrote test code that used the DELAY function to toggle a pin, outside the ISR, with nothing else going on. By precisely measuring that pin's transition rate with a frequency counter and/or scope I was able to calculate how much the ISR slowed the software delays, work out the equivalent clock speed, and plug that into the #use delay statement.
Again, the caveat is that the ISR may cause a bit of timing jitter, and if both the baud rate and the ISR utilization percentage are high you could end up with a lot of timing variance and see dribbling errors in the communications. In general, the lower the baud rate the more forgiving it is of jitter, since the jitter integrates out over time.
CT
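The 3x-oversampling "centre sample" idea above can be sketched in plain C. This decodes one 8N1 frame from a buffer of line samples taken at 3x the baud rate; it is a simulation under stated assumptions (the function name and frame layout are illustrative), and a real receiver would additionally nudge its sample counter early/late as CT describes:

```c
#include <assert.h>
#include <stdint.h>

/* Decode one 8N1 frame from a line sampled at 3x the baud rate.
 * samples[] holds line levels (idle = 1).  Returns the received byte
 * (LSB first, as on a real UART), or -1 if no complete frame fits. */
int uart3x_decode(const uint8_t *samples, int n)
{
    int i = 0;
    while (i < n && samples[i] == 1)
        i++;                        /* hunt for the start-bit edge */
    /* start + 8 data bits = 9 bit times = 27 samples from the edge */
    if (i + 27 > n)
        return -1;
    int byte = 0;
    for (int bit = 0; bit < 8; bit++) {
        /* skip the start bit (3 samples), then 3 samples per data bit;
         * +1 lands on the centre sample of each bit cell */
        int s = i + 3 * (bit + 1) + 1;
        byte |= samples[s] << bit;
    }
    return byte;
}
```

Taking only the centre sample of each 3-sample bit cell is what gives the receiver its tolerance to edge jitter.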
ckielstra
Joined: 18 Mar 2004 Posts: 3680 Location: The Netherlands
Posted: Fri Nov 01, 2013 5:55 pm
FJMSoft wrote: "1) I thought if all interrupts were disabled the delay would not be a problem."
I'm sure some people here will disagree with me, but I think it is allowed, under special conditions, to have a long-delay interrupt handler.
It is quick-and-dirty programming, but when you know what you are doing it sometimes is more important to get the job done than to create a cosmetically beautiful program. A getc() call inside the ISR could do the job, and you will find several successful examples posted here on the forum.
That having been said, you have made enough rookie mistakes that I doubt you know what you are doing. Perhaps better to stick to the correct code.
Here is the link to Ttelmah's interrupt-driven software UART. It is in the 'Code Library' section of this forum: http://www.ccsinfo.com/forum/viewtopic.php?t=51403
FJMSoft wrote: "2) I didn't know about this, I used to program the 8051 before, where disabling interrupts inside interrupts was very common. Is the problem disabling global or any interrupt? Could I disable only the PWM timer then?"
Read Ttelmah's post again. Enabling and disabling individual interrupts inside an interrupt handler is perfectly allowed. The one thing you are _not_ allowed to do is enable the GLOBAL interrupt flag. Read the chapter on interrupts in the PIC data sheet. When the PIC detects an interrupt, the hardware disables GLOBAL to prevent the interrupt being interrupted by another interrupt. On finishing the interrupt, the hardware enables the GLOBAL flag again. If you manually enable GLOBAL inside the interrupt, all kinds of strange errors can occur when another interrupt immediately triggers recursively. Disabling GLOBAL inside the interrupt is allowed but has no effect, as it is already disabled.
Clearing the INT_EXT interrupt flag is not required, as the CCS compiler adds code to clear the interrupt for you.
FJMSoft wrote: "4) Serious? Oh god! I thought this was very fast, in the 8051 it was 1 or 2 cycles."
Yep, it is fast here as well. I don't know what Ttelmah was referring to.
FJMSoft wrote: "5) How can I do this?"
Basic C knowledge: add the '&' operator before the variable name, i.e. shift_left(&t, 1, PIN_A2);
Note that it is common practice to write variable names in (a mix of) small letters, and to write defined constants in all capital letters - just the other way around from what you did.
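The corrected shift_left() call assembles the received byte one sampled bit at a time. In plain C (outside the CCS built-in) the equivalent is the following sketch, with an illustrative function name; it shifts the accumulator left and brings each new line level in at the LSB, which is what calling shift_left(&t, 1, pin) eight times does:

```c
#include <assert.h>
#include <stdint.h>

/* Plain-C equivalent of eight shift_left(&t, 1, bit) calls:
 * shift the accumulator left and OR each sampled bit into the LSB,
 * so the first bit sampled ends up as the MSB. */
uint8_t assemble_msb_first(const uint8_t bits[8])
{
    uint8_t t = 0;
    for (int i = 0; i < 8; i++)
        t = (uint8_t)((t << 1) | (bits[i] & 1));
    return t;
}
```

Note that a real UART sends the LSB first, so depending on wiring the firmware may instead want to shift right and insert at the MSB.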
Ttelmah
Joined: 11 Mar 2010 Posts: 19504
Posted: Sat Nov 02, 2013 12:27 am
On 4, what I was referring to was that he was rotating 8 bytes.
A single byte can be rotated one bit in a couple of instructions, but rotating 8 bytes takes a lot longer, and doing this eight times makes the overall time even worse.
What I have to ask is what he really wants to do.
In the routine, he uses the pin number as if it were a variable (he is actually shifting a fixed bit out of the pin number value, not the input value).
He also hasn't told us what CPU clock rate is being used.
Now, the entire serial receive he shows can be replaced with a single CCS instruction, if he is prepared to have the PWM stop (as he says), using:
Code:
#use RS232 (RCV=PIN_A2, baud=9600, SAMPLE_EARLY)

int8 T;
int1 have_val = FALSE;

#INT_EXT
void edge(void)
{
   have_val = TRUE;
   T = getc();
}
Then in the main, while looping for the PWM, test 'have_val'. If it goes TRUE, set it FALSE and process 'T'.
Within the limitations posted (stopping the PWM etc.), this can do what is requested.
However, with the timer interrupt approach, provided the PWM period can be a multiple of the bit sample time used (1/38400th sec in the example), the PWM can stay running while receiving.
Best Wishes
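The timer-interrupt approach can be sketched as a plain-C state machine ticked at 4x the baud rate (the 1/38400th-sec sample time mentioned above, at 9600 baud). This is a simulation of the logic only, with illustrative names, not CCS firmware; the point is that the same timer ISR that calls this can also step the software PWM, so the PWM keeps running while a byte is received:

```c
#include <assert.h>
#include <stdint.h>

/* Timer-ISR software UART receiver, sampled at 4x the baud rate.
 * Call uart_rx_tick() once per timer tick with the current RX pin
 * level; it returns the received byte (0..255) when a frame with a
 * valid stop bit completes, else -1. */
typedef struct {
    enum { RX_IDLE, RX_BUSY } state;
    uint8_t phase;   /* ticks until the next sample point */
    uint8_t bitnum;  /* 0..7 = data bits, 8 = stop bit */
    uint8_t shreg;   /* shift register, LSB first */
} uart_rx;

int uart_rx_tick(uart_rx *u, uint8_t pin)
{
    switch (u->state) {
    case RX_IDLE:
        if (pin == 0) {          /* start-bit edge seen */
            u->state = RX_BUSY;
            u->phase = 6;        /* 1.5 bit times: centre of data bit 0 */
            u->bitnum = 0;
            u->shreg = 0;
        }
        break;
    case RX_BUSY:
        if (--u->phase == 0) {
            u->phase = 4;        /* one bit time to the next sample */
            if (u->bitnum < 8) {
                u->shreg |= (uint8_t)(pin << u->bitnum);
                u->bitnum++;
            } else {             /* stop bit */
                u->state = RX_IDLE;
                if (pin == 1)
                    return u->shreg;   /* valid frame */
            }
        }
        break;
    }
    return -1;
}
```

Because each call does only a decrement, a compare, and at most one OR, it adds little to the ISR and avoids the long in-interrupt delays criticised earlier in the thread.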
FJMSoft
Joined: 20 Oct 2013 Posts: 36
Posted: Mon Dec 02, 2013 12:23 am
I have given up on the idea; it was not worth the work anymore.
I tried what Ttelmah said, but I was sometimes getting junk bytes, I believe because of the PWM interrupt's execution time plus the external interrupt latency.
Even at 2400 baud I was getting junk bytes.
Maybe if I could set the external interrupt priority this could work...
But that is off-topic here; I may ask in another post.
Thank you anyway.