
Fast Interrupt - output pulse with input

 
CCS Forum Index -> General CCS C Discussion
prayami



Joined: 22 Dec 2004
Posts: 78


Fast Interrupt - output pulse with input
Posted: Wed Sep 14, 2005 5:46 pm

Hi... I am using an 18F4525 with a 10 MHz crystal and the PLL, so a 40 MHz clock.

For the first step, I am applying a 50 Hz input to CCP1 and want to generate the
same output pulse on pin C5. The program is simple and it generates the
output pulse correctly. But if I check the timing of the output pulse,
it lags the rising edge of the input by 4 µs.

I have checked the disassembly and there are only three instructions, so
it should be about 0.3 µs, or maybe a little more, but not 4 µs.

What is the minimum time required to service the interrupt?
I don't know assembly, so if a fast interrupt has to be written in assembly, how
can I write it?
Here is the code...
Code:

#CASE
#include <18F4525.h>
#fuses H4,NOWDT,NOLVP,NOBROWNOUT,NOPBADEN,PUT,NOXINST,NOMCLR,NOLPT1OSC,PROTECT,CPD,NOWRT,NOWRTD,EBTR,NOCPB,EBTRB,NOWRTC,NOWRTB
#use delay(clock=40000000)

int1 RPMOutStatus = 0;

#int_ccp1
void isr()
{
   if (RPMOutStatus == 0)
   {
      output_high(PIN_C5);
      setup_ccp1(CCP_CAPTURE_FE);   // now wait for the falling edge
      RPMOutStatus = 1;
   }
   else
   {
      output_low(PIN_C5);
      setup_ccp1(CCP_CAPTURE_RE);   // now wait for the rising edge
      RPMOutStatus = 0;
   }
}//#int_ccp1

void main()
{
   set_tris_c(0x9F);               // C5, C6 output

   setup_ccp1(CCP_CAPTURE_RE);     // capture on the rising edge first

   enable_interrupts(INT_CCP1);
   enable_interrupts(GLOBAL);

   output_high(PIN_C6);

   while (TRUE)
   {
      output_high(PIN_C6);         // for freq mode
   }//while(TRUE)
}//main



Last edited by prayami on Thu Sep 15, 2005 2:57 pm; edited 2 times in total
treitmey



Joined: 23 Jan 2004
Posts: 1094
Location: Appleton,WI USA


Posted: Thu Sep 15, 2005 7:55 am

Can you describe what your overall design is? I don't understand why you take in a signal and then want to make a copy of it.

Why don't you just route that signal to where you need it?
prayami



Joined: 22 Dec 2004
Posts: 78


Posted: Thu Sep 15, 2005 2:53 pm

Thanks for the reply...

Because in the next step I want to phase-shift the output pulse
relative to the input. So for this program, you can say the phase shift is zero.

Our input will be from 1 Hz to 150 Hz.

As the interrupt is serviced slowly, there is already a 5 µs delay between the
input pulse and the output pulse.

The problem is that at the maximum input frequency of 150 Hz, 5 µs is too much
for our application, because we want 0.1 degree resolution, so 5 µs hurts us
a lot.

I don't know why it is taking this much time. There are only around
four instructions before the output changes, so the difference between input
and output should be at most about 0.4 µs, not 5 µs.

Please give me some idea how to achieve this task.
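The resolution figures quoted above can be sanity-checked with a little host-side arithmetic (an illustrative sketch, not target code; the 150 Hz and 5 µs numbers come from the post, and the function names are invented for the example):

```c
/* Time, in microseconds, corresponding to a phase step of 'deg' degrees
   of a waveform at frequency 'hz'. */
double degrees_to_us(double hz, double deg)
{
    double period_us = 1.0e6 / hz;      /* one full cycle in microseconds */
    return period_us * deg / 360.0;     /* fraction of that cycle */
}

/* Phase error, in degrees, caused by a fixed latency at frequency 'hz'. */
double latency_to_degrees(double hz, double latency_us)
{
    return latency_us * hz * 360.0 / 1.0e6;
}
```

At 150 Hz a period is about 6667 µs, so 0.1 degree corresponds to roughly 1.85 µs; a fixed 5 µs latency is therefore about 0.27 degrees of phase error, well outside the wanted resolution.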
Ttelmah
Guest
Posted: Thu Sep 15, 2005 3:17 pm

Does the system have to do anything else, other than time-delay the waveform? If not, then forget the interrupt: poll the input, and control the output when it changes.

The problem you are seeing is interrupt latency. There are a number of parts to this. First, there is up to four clock cycles of latency in the hardware before the interrupt is even seen. Then the software has to save a copy of every register that may be in use in the main code (typically about 30 instructions), test which interrupt has occurred and whether that interrupt is enabled (perhaps another 8 instructions), jump to the handler code (another couple of instruction times), and finally execute your code (just a handful of instructions, as described). Given that each instruction takes four clock cycles (0.1 µs at your 40 MHz clock), you have something in the order of a 4.5 to 5 µs delay.
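The cycle counts above can be turned into numbers for this 40 MHz part (a host-side sanity check; the counts are the rough figures from the post, not measured values):

```c
/* On a PIC18, one instruction cycle is 4 clock periods; at a 40 MHz
   clock that is 0.1 us per instruction. */
double instruction_time_us(double clock_hz)
{
    return 4.0e6 / clock_hz;
}

/* Total latency for a given number of instruction cycles of overhead. */
double latency_us(double clock_hz, int cycles)
{
    return cycles * instruction_time_us(clock_hz);
}
```

With roughly 4 cycles of hardware latency, about 30 for the register save, about 8 for dispatch, a couple for the jump, and a few for the handler body itself, call it 45 to 50 cycles in all, this gives 4.5 to 5 µs, matching the measurement.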
If you poll instead of using an interrupt, and the system has to do nothing else, you should be able to get this below 0.5 µs fairly easily. However, if you want real accuracy, approach it completely differently: look at something like a hardware shift register, and change its clock frequency to alter the shift.
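The polling idea might look something like this in CCS C (a minimal sketch, assuming the input is wired to pin C2, the CCP1 pin on the 18F4525, and that nothing else needs CPU time; `input_state()` is used because it reads the port without touching TRIS):

```c
// Tight polled loop: mirror the input pin onto C5 with no interrupt
// overhead. The copy lags by only the few instructions in the loop
// body, a fraction of a microsecond at 40 MHz.
void mirror_input(void)
{
   int1 last = 0;
   int1 now;

   while (TRUE)
   {
      now = input_state(PIN_C2);      // read the input pin directly
      if (now != last)
      {
         if (now)
            output_high(PIN_C5);
         else
            output_low(PIN_C5);
         last = now;
      }
   }
}
```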
If there are no other interrupts in use, and all the code in the interrupt is distinct from that outside (so the compiler's interrupt-protection code, which disables interrupts in the main code, never gets executed), the delay will remain pretty near fixed (it will vary slightly depending on the clock relationship between the signals).
You could also consider doing the job in hardware. Program the delay interval into one CCP, and program the other to reset the corresponding timer on its change, sourced from the signal. Set this to count to just '1'; on the next edge the timer will reset, and the output of the second CCP will then change after the programmed period. Only some chips allow the CCPs to be set up this way, and you will need to reverse the active edge for the signal as soon as the timer starts, to catch the next edge.
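The hardware scheme described above could be sketched roughly as follows (hedged heavily: the available compare modes and whether the CCPs can share a timer this way vary by chip and compiler version, so treat this as an outline, not working code):

```c
// One CCP in "special event trigger" compare mode re-arms Timer1 in
// hardware; the other CCP, also in compare mode, drives its output pin
// after the programmed number of Timer1 ticks.
void setup_delayed_output(int16 delay_ticks)
{
   setup_timer_1(T1_INTERNAL | T1_DIV_BY_1);

   // CCP1: reset Timer1 on match, giving a hardware-timed restart point.
   setup_ccp1(CCP_COMPARE_RESET_TIMER);
   CCP_1 = 1;                      // match almost immediately after restart

   // CCP2: set its pin after delay_ticks of Timer1.
   setup_ccp2(CCP_COMPARE_SET_ON_MATCH);
   CCP_2 = delay_ticks;
}
```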

Best Wishes
bfemmel



Joined: 18 Jul 2004
Posts: 40
Location: San Carlos, CA.


Posted: Thu Sep 15, 2005 3:26 pm

prayami wrote:
Quote:
What is the minimum time required to service the interrupt?

I'm just guessing at the exact overhead you might have, but in my program, with several more interrupts, it takes 31 machine cycles from the time the interrupt is triggered until the actual routine is called. That would be 3.1 µs in your case, I guess.

I guess you can do two things. First, you can compensate for that offset in your output waveform; of course, you would never be able to get below 0.2 degrees of phase shift that way. Second, you could try using a FAST interrupt. Good luck on that one: many have tried, few have succeeded. With a FAST interrupt the overhead is not generated by the compiler, and the interrupt can preempt lower-priority interrupts, but you need to do the register saving yourself, so you would want to strip your routine to the bare bones.
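A FAST (high-priority) interrupt in CCS C might be sketched like this (an assumption-laden outline: `HIGH_INTS` support, the exact `FAST` semantics, and the availability of `output_toggle()` depend on the compiler version, and with FAST only W, STATUS and BSR are preserved by the hardware shadow registers):

```c
#device HIGH_INTS=TRUE     // enable PIC18 high/low interrupt priorities

// FAST: vectors via the shadow registers with no compiler-generated
// save/restore, so the body must touch nothing the main code uses
// (or save and restore it by hand).
#int_ccp1 FAST
void ccp1_isr(void)
{
   output_toggle(PIN_C5);  // minimal body: just flip the output pin
}
```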

Best of Luck
- Bruce
wireless



Joined: 02 Nov 2003
Posts: 16
Location: London England


Fast Interrupt
Posted: Thu Sep 15, 2005 3:39 pm

Prayami

I think the delay you describe is due to the interrupt start-up code, which saves a large number of registers.

If you are only using one interrupt source, can I suggest you look at the #INT_GLOBAL pre-processor directive? This does not generate any start-up or clean-up code, and so does not save any registers.

All you need to do is replace your #int_ccp1 with #int_global.
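An #INT_GLOBAL version of the handler could look like this (a sketch under the stated assumption that CCP1 is the only enabled interrupt; with #int_global the compiler saves and tests nothing, so the body must stay trivial and the interrupt flag must be cleared by hand. `clear_interrupt()` is assumed here; on older compilers clear the CCP1IF bit directly):

```c
int1 RPMOutStatus = 0;

#int_global
void isr(void)
{
   // Only INT_CCP1 is enabled, so no need to test which source fired.
   if (RPMOutStatus == 0)
   {
      output_high(PIN_C5);
      setup_ccp1(CCP_CAPTURE_FE);
      RPMOutStatus = 1;
   }
   else
   {
      output_low(PIN_C5);
      setup_ccp1(CCP_CAPTURE_RE);
      RPMOutStatus = 0;
   }
   clear_interrupt(INT_CCP1);   // must clear the flag ourselves
}
```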

I have had similar problems with a decoder for high-speed Manchester-coded data.

Good luck

Terry
prayami



Joined: 22 Dec 2004
Posts: 78


Posted: Thu Sep 15, 2005 5:01 pm

Thanks for replying...

But there will be five interrupts in total, and enough code inside
the while loop that I can't poll there. I have to use
interrupts, and thus can't use #INT_GLOBAL either.

So, as you guys stated, I think there is no way left to reduce the ~4 µs delay
between the two pulses.
Ttelmah
Guest
Posted: Fri Sep 16, 2005 2:43 am

If one of the other interrupts also occurs, the delay is going to be even worse, and will vary...

Consider something really low level. If you just need to delay the signal by an amount, use something like one of the really cheap PICs, sitting in a polled loop, which delays by an amount defined by an integer presented on its input pins. Have it do nothing else at all. The main chip then does the other jobs, and sets the required delay value fed to this chip. There are programmable delay chips on the market to do this, but most are more expensive than a PIC used this way.
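The dedicated-PIC idea might be sketched like this (illustrative only: the pin assignments, the 4-bit delay word, and the 10 µs step size are all invented for the example):

```c
// Runs on a small dedicated PIC doing nothing else: wait for the input
// edge, delay by N steps (N read as a 4-bit value from port B), then
// drive the output.
void delay_mirror(void)
{
   int8 n;

   while (TRUE)
   {
      while (!input(PIN_A0));       // wait for the rising edge
      n = input_b() & 0x0F;         // delay count set by the main chip
      delay_us(10 * n);             // hypothetical 10 us per step
      output_high(PIN_A1);

      while (input(PIN_A0));        // wait for the falling edge
      output_low(PIN_A1);
   }
}
```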

Best Wishes
prayami



Joined: 22 Dec 2004
Posts: 78


Posted: Sun Sep 18, 2005 10:58 pm

Thanks, Ttelmah...

I think we will have to use another processor then.
ckielstra



Joined: 18 Mar 2004
Posts: 3680
Location: The Netherlands


Posted: Mon Sep 19, 2005 2:09 am

prayami wrote:
Quote:
I think we will have to use another processor then.

Every processor that works with interrupts has to save and restore the working registers, and so incurs a delay. Newer processor types are faster, but often also have more registers to save. As an example, I once did a project using a 500 MHz Pentium running QNX that was only capable of handling 1000 interrupts per second: 1 ms per interrupt!

As Ttelmah already tried to explain, the problem is not the processor but your design/requirements.
Powered by phpBB © 2001, 2005 phpBB Group