I2C problems -- not posting as guest this time

 
teletype-guy



Joined: 11 Oct 2005
Posts: 8
Location: AZ


I2C problems -- not posting as guest this time
Posted: Tue Oct 11, 2005 9:44 am

My earlier "I2C problems" post showed up as guest for some reason. I've been told those posts are often ignored, so I am posting it again:

I think there may be an I2C problem in compiler 3.235 when using the software implementation. There was also definitely an I2C problem in compiler 3.207 when using the hardware implementation.

First let me point out that I have programmed pics and I2C since '92, starting with pic assembly and an i2c master/slave implementation on two pics. I moved to the CCS compiler many years ago and have since flushed most of what I knew about pic (actually parallax) assembly, but I still have a decent handle on i2c.

I have a current product that uses a pic master (was 16F876, now 18F252) and a pic slave (currently 16F876, but moving to 18F252). The pic master gets instructions from a computer over a 232 port, and talks over the i2c port (built-in i2c hardware master mode) to the slave pic. The pic slave uses the built-in i2c hardware slave mode to get instructions from the pic master. The pic slave then uses another set of pins for a second i2c port, using the CCS software i2c implementation to talk to a bunch of other i2c chips (this is the slave pic, but acting as a master on the second, software i2c port).

Now these other i2c chips are all identical, so the i2c data line is steered to the chips by a bi-directional mux (an HC4051 -- nothing fancy), with pull-up resistors on all data lines to keep those pins high when the mux is steering elsewhere. The i2c clock line is common to all chips, since it can toggle while data is high with no problem. The slave normally listens for commands on the i2c hardware port. When it needs to talk on the second i2c software (master) port, it does so by issuing #USE I2C... for the software port, doing the start/send/stop thing, and then issuing #USE I2C... for the hardware (slave) port again.
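
(For illustration only -- a sketch of the mux steering; MUX_A/MUX_B/MUX_C are hypothetical select-pin defines, the real pin names don't matter here:)

/////////////////////////////////////////////////////////////////////////////////////
// ----- sketch: steer the HC4051 so the software-port data line reaches the
// ----- target chip. chan is the mux channel (0-7) wired to that chip.
void select_chip (int8 chan) {
   output_bit (MUX_A, chan & 1);
   output_bit (MUX_B, (chan >> 1) & 1);
   output_bit (MUX_C, (chan >> 2) & 1);
}
/////////////////////////////////////////////////////////////////////////////////////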

None of this is a problem; I'm just describing the system. There are probably a thousand of these units in the field, and most have been in constant operation for years -- so this is not a lab-bench anomaly.

The system evolved as follows:

I2C HW Master / I2C HW Slave (& SW master)
-----------------------------------------------------
Initial design:
16F876 (PCM 3.114) / 16F876 (PCM 3.114) -- fine for years

Upgrade compiler for 18F and migrate master:
18F252 (PCMH 3.207) / 16F876 (PCM 3.114) -- fine for a year

Attempt to migrate slave to 18F:
18F252 (PCMH 3.207) / 18F252 (PCMH 3.207) -- PROBLEM-1

Upgrade compiler and test with original slave:
18F252 (PCMH 3.235) / 16F876 (PCM 3.114) -- works fine

Attempt to migrate slave to 18F:
18F252 (PCMH 3.235) / 18F252 (PCMH 3.235) -- fixed PROBLEM-1,
but now there is a new PROBLEM-2.

The problems appeared just recently, as I tried to port the old slave code from the 16F876 to the 18F252. Now this migration is well known to me, and the 16 vs. 18 stuff is not the issue.

PROBLEM-1 showed up when the master tried to read from the I2C slave. Writes were fine, but as soon as the slave tried to send the first byte back during a master-read op, things hosed up. I was overdue for a compiler upgrade anyway, so I got PCMH 3.235. This fixed the first problem (on the i2c hardware port), but exposed a new problem (on the i2c software port).

PROBLEM-2 showed up when the standard i2c write sequences did not terminate properly on the i2c software port. Note that none of this code has changed over the years. The slave (acting as a master in i2c software mode) will start/send/stop, but the i2c data line stays low -- it is being pulled low by the i2c chip (a standard Philips part) with the ACK bit. It appears as though the bus master (which is my pic slave, in this case) is not sending the last clock pulse to clear the ACK before the stop condition. (After the eighth data bit, the master floats the data line, raises the clock a ninth time to sample the ACK, then drops the clock again -- that falling edge is what lets the slave release the data line before the stop.) A brief description can be found at:
http://www.esacademy.com/faq/i2c/busevents/i2cgetak.htm

I could clear the ACK by pulsing the clock line low with (either a clip-lead or) an explicit bit of code after the stop. This clears up the chip I am currently talking to, but screws up the others. If I try to pulse the clock low before the stop, nothing works.

I do not have a logic analyzer at the moment to compare known good systems with the new one, but obviously something in the i2c software clocking is a problem.

Has anyone had similar I2C problems? I have been switching I2C ports with #USE I2C... statements for years -- is that perhaps not really proper, with the problem only showing up in recent compiler versions?

I am thinking that I should just use explicit I2C bit-banging code for this software master port -- does anyone have a link to some tested CCS-compatible snippets? I have written these before in parallax assembly (many, many years ago), but I don't want to reinvent the wheel for a C version.
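
(Roughly what I have in mind -- an untested sketch, not CCS library code; it reuses my I2C_DAT2/I2C_CLK2 pin defines, drives the lines open-drain style (drive low, float high and let the pull-ups do the work), and the half-bit delay is an arbitrary pick:)

/////////////////////////////////////////////////////////////////////////////////////
#define I2C_T  delay_us(10)        // half-bit time, roughly a 50-kHz bus

void sw_i2c_start (void) {
   output_float (I2C_DAT2);        // both lines released (pulled high)
   output_float (I2C_CLK2);
   I2C_T;
   output_low (I2C_DAT2);          // data falls while clock is high = start
   I2C_T;
   output_low (I2C_CLK2);
}

void sw_i2c_stop (void) {
   output_low (I2C_DAT2);          // make sure data is low before clock rises
   I2C_T;
   output_float (I2C_CLK2);
   I2C_T;
   output_float (I2C_DAT2);        // data rises while clock is high = stop
   I2C_T;
}

// ----- returns TRUE if the slave ACKed. note the ninth clock pulse,
// ----- whose falling edge is what lets the slave release the data line:
int1 sw_i2c_write (int8 b) {
   int8 i;
   int1 ack;

   for (i = 0; i < 8; i++) {       // msb first
      if (b & 0x80)
         output_float (I2C_DAT2);
      else
         output_low (I2C_DAT2);
      I2C_T;
      output_float (I2C_CLK2);     // clock the bit out
      I2C_T;
      output_low (I2C_CLK2);
      b <<= 1;
   }

   output_float (I2C_DAT2);        // release data so the slave can ACK
   I2C_T;
   output_float (I2C_CLK2);        // ninth clock: sample the ACK
   ack = !input (I2C_DAT2);        // low = ACK
   I2C_T;
   output_low (I2C_CLK2);          // falling edge: slave releases data
   I2C_T;
   return (ack);
}
/////////////////////////////////////////////////////////////////////////////////////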

thanks for any help you may offer.

gil smith
teletype-guy
gil--at--baudot.net
teletype-guy



Joined: 11 Oct 2005
Posts: 8
Location: AZ


code snippet
Posted: Tue Oct 11, 2005 10:15 am

Hi folks:

Note that the pic does not get stuck -- it is the I2C chip that was being written to that is stuck. This is the code snippet -- it is normally a SLAVE pic (using built-in i2c hardware), but changes to become a MASTER when needed (using i2c software, on different pins):

/////////////////////////////////////////////////////////////////////////////////////
// ----- disable ssp (i2c slave) receive interrupt while doing this write op:
disable_interrupts (INT_SSP);

// ----- use MASTER I2C on a software port:
//
#use I2C(MASTER, SDA=I2C_DAT2, SCL=I2C_CLK2, SLOW)

i2c_start();
i2c_write(addr);
i2c_write(sudaddr);
i2c_write(dat);
i2c_stop();

// ----- restore SLAVE I2C using the hardware port:
//
#use I2C(SLAVE, ADDRESS=100, SDA=I2C_DAT, SCL=I2C_CLK, SLOW, FORCE_HW)

// ----- enable ssp (i2c) receive interrupt:
enable_interrupts (INT_SSP);
/////////////////////////////////////////////////////////////////////////////////////

This has worked fine for years with a 16F876 and PCM 3.114 compiler.

Now, with an 18F252 and PCWH 3.235 compiler, when the code exits, the i2c data line (on the software port) is still low -- it is the actual i2c chip that is holding it low (the ACK bit), not the pic. I can clear the i2c chip by pulsing the clock line low with either a clip-lead, or by inserting:
delay_us(10);
output_low(I2C_CLK2);
delay_us(10);
output_float(I2C_CLK2);
just after the stop (putting this before the stop does not work).

This pulsing of the clock line is a hack of course, and not something I would leave in final code -- I am just trying to figure out what is happening. It appears as though the i2c_write is not sending the final clock pulse to clear the ACK.

If this is indeed a compiler issue, I need to just put software master routines directly in my code. Do they exist somewhere?

thanks,

gil
Mark



Joined: 07 Sep 2003
Posts: 2838
Location: Atlanta, GA


Posted: Tue Oct 11, 2005 4:18 pm

http://www.ccsinfo.com/forum/viewtopic.php?t=22105
teletype-guy



Joined: 11 Oct 2005
Posts: 8
Location: AZ


fixed it
Posted: Wed Oct 12, 2005 11:32 am

Hi folks:

I solved my problem. I still believe this is a bug in compiler 3.235.

The problem was that as an I2C master in software mode (not using pic i2c hardware), my i2c chips were holding the dat line low (as if their ack bit was never cleared). I could manually clear the i2c chip by pulsing the clk line low. This was just on a master write op -- I don't know whether a master read would do this as well.

The solution was to insert a delay before the i2c_stop(). Even the smallest delay of 200 ns (delay_cycles(1) at 20 MHz) would fix it. I ended up using a 1-us delay for some margin.
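
In context, the fixed write sequence is just the earlier snippet with the added delay:

i2c_start();
i2c_write(addr);
i2c_write(sudaddr);
i2c_write(dat);
delay_us(1);     // the fix: a brief pause before the stop
i2c_stop();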

I briefly looked at the signals on a scope: the bus rate was about 90 kHz (SLOW mode), and the set-up and other times I peeked at seemed fine. I did not have time to analyze the exact timing relationship near the i2c_stop(), and once I found that a delay fixed things, I had to move on.

This may have been specific to my i2c chip (TDA8524), or the pic used (18F252 at 20 MHz with compiler 3.235), but it was consistent across many units, and was never an issue with the earlier 16F876 (and compiler 3.114).

FWIW,

gil