Holding keyboard key, int_rda, and garbled program

 
evsource
Joined: 21 Nov 2006
Posts: 129

Posted: Thu Jan 04, 2007 8:40 am

Hi,

I'm using int_rda to detect incoming characters. Things work fine until I hold down a key on the keyboard. Then the program goes wacky. Here's the code:

Code:

BYTE input;

#int_rda
void data_ready() {
     disable_interrupts(GLOBAL);
     disable_interrupts(INT_RDA);
     disable_interrupts(INT_RTCC);

     input = fgetc(external);     // read the received character

     if(input == 0x80) {          // reset - bootloader sends this command
          delay_ms(150);
          reset_cpu();
     }

     enable_interrupts(INT_RDA);
     enable_interrupts(INT_RTCC);
     enable_interrupts(GLOBAL);
}


Thanks,

Ryan
Ttelmah
Guest
Posted: Thu Jan 04, 2007 10:15 am

Never, _ever_, enable the global interrupt inside the interrupt handler. This will crash the system.
You can remove all the interrupt enables and disables in the code, but this one is the killer. If a second interrupt occurs while execution is still inside the handler code, the system stack will overflow, and the system will go haywire.
The interrupt hardware already disables the interrupts, and automatically re-enables them at the moment of the return from the handler. You are overriding this protection...
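A minimal corrected version of the handler might look like this (a sketch only; it keeps the 'external' stream name and the bootloader check from the original post):

Code:

BYTE input;

#int_rda
void data_ready() {
     // No enable/disable calls needed here: the PIC clears the global
     // interrupt flag on entry, and the compiler's return-from-interrupt
     // sequence restores it automatically.
     input = fgetc(external);

     if(input == 0x80) { // reset - bootloader sends this command
          delay_ms(150);
          reset_cpu();
     }
}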

Best Wishes
Guest

Posted: Thu Jan 04, 2007 10:27 am

Hi,

OK, let me see if I have this correct: you told your program to stop everything each time a character is received (the INT_RDA interrupt), and now you say that the program goes "wacky" when you send it a continuous stream of characters? What exactly do you expect to happen under these conditions?


Seth
Ttelmah
Guest

Posted: Thu Jan 04, 2007 10:33 am

He has not told it to 'stop everything', except when character 0x80 is received. That is brutal, but should not cause a problem. However, enabling the global interrupts will result in the code destroying the contents of almost all the registers, and really will screw things up.

Best Wishes
evsource
Joined: 21 Nov 2006
Posts: 129

Posted: Thu Jan 04, 2007 11:45 am

Okay, that's what I needed to know. I wasn't aware that interrupt handlers don't allow themselves to be interrupted by other interrupts. It does make sense to have it work that way.

I appreciate the quick responses. Oh, and regarding the program going "wacky" with a continuous stream of characters ... I would just expect the code to respond slowly. Characters are only coming in at 19.2k (at 19200 baud, 8N1, that is at most one character roughly every 520 us), so the processor still has time to do other things. By "wacky", I mean strange characters popping up all over, etc. But that's probably a result of the erroneous global interrupt enabling I had in there.
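Incidentally, if the main loop ever does fall behind a continuous stream, the usual pattern is to buffer characters in the ISR and drain them from the main loop, along the lines of CCS's EX_SISR.C example. A rough sketch (the buffer size and function names here are illustrative, and the 'external' stream is assumed from the code above):

Code:

#define BUFFER_SIZE 32
BYTE buffer[BUFFER_SIZE];
BYTE next_in = 0;
BYTE next_out = 0;

#int_rda
void serial_isr() {
     // Store each received character and return quickly.
     buffer[next_in] = fgetc(external);
     next_in = (next_in + 1) % BUFFER_SIZE;
     if(next_in == next_out)                       // buffer full:
          next_out = (next_out + 1) % BUFFER_SIZE; // drop the oldest character
}

BYTE bgetc() {
     // Called from the main loop; waits until a character is available.
     BYTE c;
     while(next_in == next_out) ;
     c = buffer[next_out];
     next_out = (next_out + 1) % BUFFER_SIZE;
     return c;
}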

Thanks,

Ryan
Guest

Posted: Thu Jan 04, 2007 1:52 pm

Ttelmah,

With all due respect, when you code an interrupt service routine, you ARE telling the processor to stop whatever task it is undertaking, and to service the interrupt. Perhaps you misunderstood my meaning?

Seth
Ttelmah
Guest

Posted: Thu Jan 04, 2007 4:00 pm

In instruction-flow terms, the interrupt means 'pause, and get one character': a short interruption, not 'stop everything'. On 0x80 he does stop everything and jump to zero, so I assumed you were talking about this.

The big problem, though, is in understanding PIC interrupts. On machines with a hardware priority interrupt system, a particular interrupt jumps directly to its handler. On the PIC, this is not the case. When any interrupt is received, the hardware disables the global interrupt flag, and a 'global' handler is called (which, unless you write it yourself, you never see). This saves all the working registers into temporary storage, tests which interrupt has occurred, and then calls the required handler. When the handler returns, it clears the interrupt flag for the particular event, and returns to the global handler. This then restores all the registers, before executing a special return instruction which enables the global interrupt flag after it executes. This ensures that the code is 'back' in the main flow before a second interrupt can be responded to.

Now the problem is that if you enable global interrupts inside the handler, they are enabled the whole way through the restore part of the 'global' routine: typically about 30 instruction times. If a second interrupt occurs in this time, the processor calls the global handler a second time, and overwrites the stored register values with the ones from inside the interrupt handler. How many of the saved values get destroyed depends on how early the second event arrives. When this second call returns, execution goes back to the point in the restore code, and now restores the wrong register values. Hence memory values get destroyed. Perfect for giving a 'wacky' result...
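In outline, the sequence is something like this (conceptual pseudocode only, not the compiler's actual output; the helper names are illustrative):

Code:

// Conceptual sketch of the PIC's single interrupt entry point.
// The hardware has already cleared the global interrupt enable (GIE).
void global_handler() {
     save_registers();            // W, STATUS, FSR, scratch locations

     if(rda_interrupt_flagged()) {
          data_ready();           // the user's #int_rda handler
          clear_rda_flag();
     }
     // ...similar tests for the other enabled interrupts (INT_RTCC etc.)...

     restore_registers();         // ~30 instructions: re-entry here is fatal
     // Special return instruction (RETFIE): GIE is re-enabled only
     // after the return itself has executed.
}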

Best Wishes