I have a question. I am designing a battery charger and I want to read the voltage with the ADC on my PIC. I am using a voltage divider to step the voltage down by a factor of 100 (so 12.65V becomes 0.1265V), which I plan to use later for reading higher voltages. I have the ADC outputting the right values, but I cannot figure out how to detect a small voltage change. For instance, if I read a voltage of 12.65V I get an ADC value of 25 ((0.1265 * 1023) / 5), but if I read a voltage of 12.45V I still get an ADC value of 25. Does anyone have a better way of doing this?
thanks
Douglas Kennedy
Joined: 07 Sep 2003 Posts: 755 Location: Florida
Posted: Thu Jun 24, 2010 11:07 am
The ADC counts 1 for every 5/1023 V, i.e. roughly every 5 mV. You take 12.65V and effectively make it 0.1265V with your divider. A 5 mV step then gives approximately 25 counts for 0.125V (25 x 5 mV), so 24 counts is 0.120V and 26 is 0.130V. With the divide-by-100 you are restricted to approximately 0.5V precision at the battery. The only way to keep good precision with a resistor divider is to scale the range to be measured to 5V. If 20V is your maximum, then you want 20V at the input to give 5V at the ADC pin (a count of 1023), so you need 15k and 5k for the resistors, or another pair that maintains the same 4:1 ratio.
SherpaDoug
Joined: 07 Sep 2003 Posts: 1640 Location: Cape Cod Mass USA
Posted: Thu Jun 24, 2010 6:48 pm
It's that 100:1 divider that is killing you. It gives you a 500V range, but 0.5V resolution.
Do you REALLY need a 500V range?
_________________
The search for better is endless. Instead simply find very good and get the job done.