I found some example code that converts ASCII to decimal and it works well. If someone could explain why the code below doesn't, I would appreciate it.
I wanted to convert a 3-digit number to decimal, so I mistakenly used the code below.
Code:
int test;
//test=(string[0]-0x30)*100 + (string[1]-0x30)*10 + (string[2]-0x30);
Thanks for your help
PCM programmer (Joined: 06 Sep 2003, Posts: 21708)
Posted: Thu Nov 08, 2007 3:33 pm
Assuming that your 3 digits could be from 000 to 999, the result is
greater than an 'int' can hold. In CCS, an 'int' is an unsigned 8-bit
variable and can only hold 0 to 255. So you need to declare 'test'
as an int16, as shown below.
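A minimal sketch of that declaration change (keeping the variable name from your snippet):
Code:
int16 test;   // 16-bit unsigned in CCS (0 to 65535), so values up to 999 fit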
Also, you need to force the compiler to do 16-bit math on the first
part of the expression. You can do this by making the '100' constant
a 'long', which is done by appending an 'L' to it, as shown below:
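The corrected line would look something like this (reconstructed from the snippet above, since the original formatted code block didn't survive):
Code:
test = (string[0]-0x30)*100L + (string[1]-0x30)*10 + (string[2]-0x30);   // 100L forces the first multiply to use 16-bit math
Once the first product is computed at 16 bits, the remaining additions are carried out at 16 bits as well, so only the '100' needs the suffix.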