bsodmike
Joined: 05 Aug 2006 Posts: 52
#define, what am I doing wrong?
Posted: Sat Aug 05, 2006 1:20 pm
Hiya,
I found something kinda strange today. Basically, I'm doing something like this:
#define WAIT_TIMEOUT 500
..
..
int8 timeOUT = WAIT_TIMEOUT;
Now in the rest of the code I count down to 0 with a 1us delay. I tried something like this:
printf("%d",timeOut);
and found that the variable was being initialised at 244. Is this because using '%d' is wrong, or is it something else?
I found that defining WAIT_TIMEOUT in hex works as it should, though.
Cheers, Mike
PCM programmer
Joined: 06 Sep 2003 Posts: 21708
Posted: Sat Aug 05, 2006 1:32 pm
Declare a variable as int16 to hold a value of 500. An int8 can only
hold a value from 0 to 255. int16 can hold 0 to 65535.
(CCS integer declarations are unsigned values by default.)
Use "%lu" to display an int16 (or int32) variable in printf.
Note that the first character in "%lu" is a lowercase "L".
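
For anyone hitting the same symptom: 500 is 0x01F4, so when it is forced into an int8 only the low byte survives, and 0xF4 is 244, which is exactly the value printed above. Below is a minimal sketch of the corrected countdown, assuming CCS C; the device header, #use directives, and the function name wait_for_event are illustrative, not from the original posts.

#include <16F877.h>                              // illustrative target; substitute your own device header
#use delay(clock=4000000)                        // assumed 4 MHz clock
#use rs232(baud=9600, xmit=PIN_C6, rcv=PIN_C7)   // assumed UART pins so printf has somewhere to go

#define WAIT_TIMEOUT 500

void wait_for_event(void)
{
   int16 timeOut = WAIT_TIMEOUT;     // int16 holds 0 to 65535, so 500 is not truncated

   printf("%lu\r\n", timeOut);       // prints 500; %lu (lowercase L) for int16/int32 in CCS

   while (timeOut != 0)              // count down to 0 with a 1 us delay per pass
   {
      delay_us(1);
      timeOut--;
   }
}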
bsodmike
Joined: 05 Aug 2006 Posts: 52
Posted: Sat Aug 05, 2006 10:33 pm
Thanks... I just woke up after a decent nap and the first thing that hit me was "How could you store more than 255 in an 8-bit variable, muppet!"...
Cheers, Mike