Stop ASCII conversion during VISA write



Hi,

I am doing some basic comms between my PC and some hardware I have made, using the VISA serial write function. It all works fine and I have no problem sending data as long as the data I want to send is in ASCII format. My problem comes when I want to write a specific byte to the serial port WITHOUT it being converted to ASCII.

For example, if I want to write the hex value 0x05 (which would be interpreted in ASCII as ENQ), it gets sent out as the ASCII character "5" (i.e. 0x35). My hardware is looking for the value 0x05 to come in on the serial port, which it then never gets (as it obviously receives 0x35 instead). Is there any way to get my LV serial write to just write a specific byte without applying the ASCII conversion?

Thanks


QUOTE (postformac @ Apr 7 2009, 06:28 PM)

For example, if I want to write the hex value 0x05 (which would be interpreted in ASCII as ENQ), it gets sent out as the ASCII character "5" (i.e. 0x35). My hardware is looking for the value 0x05 to come in on the serial port, which it then never gets (as it obviously receives 0x35 instead). Is there any way to get my LV serial write to just write a specific byte without applying the ASCII conversion?

Thanks

Right-click on the string constant you are using and select "Hex Display", then type 05. If you are using "Scan From String", do the same thing on the format string.
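(For anyone more used to text-based languages, here is a minimal sketch of the same distinction being fixed here. Python with pyserial is assumed purely for illustration, and the port name is a made-up example; LabVIEW itself is of course graphical.)

# Illustration only: the character "5" and the byte 0x05 are different bytes on the wire.
import serial

port = serial.Serial("COM3", baudrate=9600, timeout=1)  # "COM3" is a hypothetical port name

port.write(b"5")           # sends 0x35, the ASCII code for the character "5"
port.write(bytes([0x05]))  # sends 0x05 (ENQ), the raw byte the hardware expects

port.close()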


I often have to send a lot of data with a CRC over a serial port. I have my data stored as a byte array (to do the CRC calculation) and then use Byte Array To String to send it out over the serial port. This primitive does not convert to an ASCII equivalent. Our standard here is to send a 0x02 at the beginning of each serial message as a sync byte, and I have had no problems with this method. But if you are only sending one byte, this method would not be very good for you; ShaunR probably has the best solution.

post-11268-1239192366.png?width=400
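(A rough text-language analogue of this approach, in case it helps: Python with pyserial is assumed, the payload and port name are made up, and the simple additive checksum is only a stand-in for whatever CRC the real protocol uses.)

# Sketch of the byte-array approach: build the message as numbers, append a
# checksum, then send the raw bytes. bytes() plays the role LabVIEW's
# Byte Array To String plays before VISA Write.
import serial

payload = [0x10, 0x22, 0x7F]          # example data bytes (hypothetical)
message = [0x02] + payload            # 0x02 sync byte at the start of each message
message.append(sum(message) & 0xFF)   # placeholder checksum, not a real CRC

port = serial.Serial("COM3", baudrate=9600, timeout=1)  # hypothetical port name
port.write(bytes(message))
port.close()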


QUOTE (postformac)

Is there any way to get my LV serial write to just write a specific byte without applying the ASCII conversion?

The way I understand it, LabVIEW isn't actually converting to ASCII: you're entering ASCII text into the string, and in the background LabVIEW stores/sees/uses the associated byte values. Hence, when you send a string containing "05", you're actually sending the bytes corresponding to the ASCII characters "0" and "5" that you entered.

If you right-click on any string indicator, control, or constant, you will see there are options in the menu for Normal Display, '\' Codes Display, Password Display, and Hex Display. These set how LabVIEW displays what is in the string in the case of indicators, and how LabVIEW reads what you've typed into controls and constants.

For example:

post-14639-1239194191.png?width=400

In this picture, I've typed "05" into a string control set to Normal display in the top half; the four indicators next to it are set to the four display modes as labelled (all four indicators are wired to the one control). The second control is set to Hex display, and again I've typed "05", with the indicators set up as for the first control. When I run the code, the indicators each read back from the relevant control. You can see that "05" entered as ASCII displays as "05" in Normal but as "3035" in Hex. Enter "05" in Hex mode, though, and it displays as 05 in Hex but as a non-printing ASCII character in Normal. In this example, Codes shows the same as Normal since I've not entered any special characters.

Anyway, if you want to enter the byte value 0x05 you must type it into a string control/constant set to Hex display. The alternative is to create a byte array, which lets you work with actual numeric data (helpful for the sort of thing crossrulz was talking about), and then convert that to the appropriate string to send over the serial bus...
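(As a text-language aside, the same distinction looks like this in Python; it is only meant to show that the characters "05" and the byte 0x05 are different data.)

# "05" typed into a Normal-display string is two characters, i.e. two bytes;
# "05" typed into a Hex-display string is the single byte 0x05.
normal = "05".encode("ascii")   # b"05" -> bytes 0x30 0x35
as_hex = bytes.fromhex("05")    # b"\x05" -> the single byte 0x05

print(normal.hex())  # prints "3035"
print(as_hex.hex())  # prints "05"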

Hope that helps. I remember struggling with the same thing when I first had to start playing with serial communications (having done some GPIB stuff previously, which was all done in legible words!).

Sorry, I've just seen the benefit of ShaunR's method: it allows you to mix ASCII strings and specific bytes in the same string control/constant. So that might be a good way to go if you're sending a mixture of things over the interface...

Cheers

Paul


Brilliant, thanks guys, got it working at last!

I actually went with the byte array and Byte Array To String conversion method, as the data to be sent comes in as numeric data from other parts of my program, where I am using it to switch case structures. Either way, it's all functioning properly now...

Cheers :worship:


Ok, next question on a similar theme but working the other way....

If I am receiving a floating-point number in ASCII format through my serial port, is there a simple way to turn this into a floating-point numeric value for display in a numeric indicator?

For example, if I receive "1.123" in ASCII format and want it displayed as "1.123" on a numeric indicator.

I tried using the "Decimal String to Number" function and it seems not to recognise the decimal point; I get a value of "1.000" on my display. I have currently got round this by taking the string one byte at a time and manually looking for the decimal point, then dividing whatever is after the point by 1000 and adding it to whatever was before the point, which works fine. Another way I thought of doing it is to split the number in my external hardware before transmission and design the comms routine to recognise which digits are whole numbers and which are after the decimal point, and then divide by 1000 and add again.

Can't help thinking there must be a simpler way though. Is there any built-in function that will recognise a signed floating-point decimal string and output it to my indicator correctly?


QUOTE (postformac @ Apr 9 2009, 10:33 AM)

I tried using the "Decimal String to Number" function and it seems not to recognise the decimal point; I get a value of "1.000" on my display. I have currently got round this by taking the string one byte at a time and manually looking for the decimal point, then dividing whatever is after the point by 1000 and adding it to whatever was before the point, which works fine.

Aaaagh! On the same palette as "Decimal String to Number", you will find "Fract/Exp String to Number". I'm not sure why these names can't be a bit better, but I guess they go way back.
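(For comparison, this is the same job in a text language: Python's built-in float() parses a plain or exponent-format decimal string, which is roughly what Fract/Exp String to Number does with the string read from the serial port.)

# Parse an ASCII decimal string, as received over the serial port, into a float.
reading = "1.123"        # example received string
value = float(reading)   # handles the decimal point (and "1.123e2"-style notation)
print(value)             # 1.123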

