
RS232 receive errors


hma


Hi,

Take a look at the included VI. While running in debug mode, the 16-character output string is read from the device normally; however, running at normal speed the VI reads only three characters, and the rest of the buffered characters, including the CR-LF combination in the middle, seem to have been flushed. I have inserted delays, but nothing seems to help. Why oh why?

Thanks in advance,

Hugo


Looking at your VI, I'm under the impression you have the Flush Buffer in there to "throw away" anything that is returned from the device after the "T1=#" command is sent. You might want to insert a delay between this write and the flush (with a flat sequence to preserve data flow) so that it doesn't flush too early.

I also notice that this device is operating at a very low baud rate (2400), so the default timeout value (in the VISA property node under "General Settings") of 2 seconds might in some cases not be long enough to wait for a response. I'd bump that up to 5-8 seconds.
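In text form, that pattern would look roughly like this (a minimal sketch using Python and pyserial as a stand-in for the LabVIEW VISA calls; the port name, "\r\n" command terminator, and set-point value are placeholder assumptions, not the device's documented protocol):

    import time
    import serial  # pyserial

    # Open at the device's low baud rate; pyserial's timeout is in seconds,
    # bumped well above the 2 s default discussed above.
    ser = serial.Serial("COM1", baudrate=2400, timeout=8)

    ser.write(b"T1=2550\r\n")   # send the set command (placeholder value)
    time.sleep(0.5)             # let the device's reply arrive first...
    ser.reset_input_buffer()    # ...then throw it away (the "flush")

    ser.write(b"T1\r\n")        # query the set point
    reply = ser.read(16)        # read the 16-character response

The point of the sleep is ordering: if the flush fires before the device has finished answering the set command, the stale reply lands in the buffer and pollutes the next read.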


QUOTE (orko @ Apr 15 2008, 08:53 PM)

Looking at your VI, I'm under the impression you have the Flush Buffer in there to "throw away" anything that is returned from the device after the "T1=#" command is sent. You might want to insert a delay between this write and the flush (with a flat sequence to preserve data flow) so that it doesn't flush too early.

I also notice that this device is operating at a very low baud rate (2400), so the default timeout value (in the VISA property node under "General Settings") of 2 seconds might in some cases not be long enough to wait for a response. I'd bump that up to 5-8 seconds.

Thank you all for your responses!

Well, in short: the device is programmed to a certain temperature set point with the command "T1=(°C*100)". Next, the receive buffer is flushed to be certain nothing can disturb the response to the next command, "T1", to which the device returns the programmed set point as a string formatted as defined in the comments in the VI.

I am using the default timeout of 10 seconds, as the help for the serial init VI describes. 2400 baud should allow more than 200 characters per second to be communicated, and my messages are only 16 characters. I have experimented with time delays between the send and receive commands, but nothing seems to help. The astonishing fact is, however, that the VI runs fine in "slow motion" or "debug mode", where everything is sloooow, while in normal run mode it returns timeout errors and only "$\r\n" as the answer string. So where is the first part of the message going? The fact that the buffer is flushed before the "T1" command is given, and the fact that the VI always sees the last 3 characters of the message, means that all 16 characters must have been sent by the device, but the first 13 characters are sent to a "black hole". It probably has something to do with the two CR-LFs in the message. I have to find that black hole!
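For what it's worth, the timing arithmetic backs that up (a back-of-envelope check, assuming 8N1 framing, i.e. 1 start + 8 data + 1 stop = 10 bits per character):

    bits_per_char = 10                    # 8N1: start + 8 data + stop bits
    chars_per_sec = 2400 / bits_per_char  # = 240 characters per second
    msg_time = 16 / chars_per_sec         # = ~0.067 s for a 16-char reply

So the full reply is on the wire in well under 100 ms; the question is only when the reader starts listening relative to when the flush fires.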

Regards,

Hugo


The timeout value is the time that it waits for a response (the time during which no data is seen on the line), not the time it takes for a response to be fully received.

As for it working in highlight execution mode, that definitely smells like a timing issue. Have you tried putting in the delays suggested by ASTDan and me? Make sure you are not just placing the waits floating on the diagram (without attaching them inline using data flow). I would try the two suggested waits with values of 3000 milliseconds each. If that works, then you can scale them down to the correct timing. If it doesn't work, post your updated VI (or a screenshot) here.


QUOTE (orko @ Apr 16 2008, 03:25 PM)

The timeout value is the time that it waits for a response (the time during which no data is seen on the line), not the time it takes for a response to be fully received.

As for it working in highlight execution mode, that definitely smells like a timing issue. Have you tried putting in the delays suggested by ASTDan and me? Make sure you are not just placing the waits floating on the diagram (without attaching them inline using data flow). I would try the two suggested waits with values of 3000 milliseconds each. If that works, then you can scale them down to the correct timing. If it doesn't work, post your updated VI (or a screenshot) here.

Well, I added 100 ms delays after each VISA VI; even after the Flush Buffer VI one was necessary. Now all is working.
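That matches the timing arithmetic above: each 16-character reply is on the wire in about 67 ms, so a 100 ms pause after every port operation gives the device time to finish before the next step runs. In the pyserial sketch from earlier, the fix amounts to the following (the 0.1 s value is simply what worked here, not a derived constant):

    ser.write(b"T1=2550\r\n")
    time.sleep(0.1)             # let the reply to the set command land
    ser.reset_input_buffer()    # so the flush discards the whole reply
    time.sleep(0.1)
    ser.write(b"T1\r\n")
    time.sleep(0.1)             # let the 16-character answer arrive
    reply = ser.read(16)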

Thank you all!

Hugo


QUOTE (hma @ Apr 17 2008, 11:12 AM)

Well, I added 100 ms delays after each VISA VI; even after the Flush Buffer VI one was necessary. Now all is working.

Thank you all!

Hugo

Most devices return an End Of Transmission (EOT) or similar character to inform the caller that all the data has been sent out.

It's usually a CR (ASCII 13), but sometimes it can be a combination of several characters.

I usually try to read in a loop until the EOT is found, or until the required number of characters has been received, etc.

It makes for more robust code. I'll use my own timeout code rather than rely on VISA's.

I've had many cases where it appears to work nicely until ..... and then it has to be revisited.

That's why I would not put too much reliance on delays.

If you are just trying to make a quick hack, then the code should be fine. If you need to design a permanent solution, then consider making the receiving end a bit more intelligent, as in the sketch below.
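A rough text-language version of that receive loop (Python/pyserial again; the "$\r\n" terminator comes from the reply format described earlier in the thread, while the length cap and 5 s deadline are assumptions for illustration):

    import time
    import serial

    def read_reply(ser, terminator=b"$\r\n", max_len=64, deadline_s=5.0):
        """Accumulate bytes until the terminator arrives, a length cap is
        hit, or our own deadline expires -- instead of trusting the
        driver's single timeout."""
        buf = bytearray()
        start = time.monotonic()
        while time.monotonic() - start < deadline_s:
            chunk = ser.read(1)   # the port's own timeout stays short
            if chunk:
                buf += chunk
                if buf.endswith(terminator) or len(buf) >= max_len:
                    break
        return bytes(buf)

    # Usage: open the port with a short per-read timeout so the loop
    # can keep checking its own deadline between bytes.
    ser = serial.Serial("COM1", baudrate=2400, timeout=0.2)
    ser.write(b"T1\r\n")
    answer = read_reply(ser)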

