Serial Port RS232

Sherif

Hello everybody,

Well, I'm using LabVIEW 8.0 in my project. In the project we have a link between the PC and an FPGA over an RS232 serial port.

Since we are using LabVIEW as the software on the PC, we use VISA Write and VISA Read to transmit and receive data over the serial port.

I have already performed a loopback test on the port, and then we tested transmission and reception over the serial cable using HyperTerminal rather than LabVIEW.

The problem now is that the output from LabVIEW is an analog pulse signal, so I need to convert it to a digital waveform and then to a binary representation (8 bits) in order to transmit it through the serial port [using the DWDT Analog to Digital VI followed by the Digital Waveform to Binary Array VI].

But VISA Write takes a string, so I used the Byte Array to String VI as a converter.

I need to know: is what I did right? [I mean, will I get the same analog signal (but in its equivalent binary format) after all these conversions, still varying continuously with time?]
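For reference, here is a rough Python sketch of the conversion chain just described, with plain functions standing in for the DWDT Analog to Digital and Digital Waveform to Binary Array VIs (the threshold and sample values are made up for illustration):

```python
# Sketch of the conversion chain: analog samples -> bits -> packed bytes.
# Stand-in for the LabVIEW VIs; values are illustrative only.

def analog_to_bits(samples, threshold=0.5):
    """Analog -> digital: one bit per sample (1-bit resolution)."""
    return [1 if s >= threshold else 0 for s in samples]

def bits_to_bytes(bits):
    """Pack 8 bits (MSB first) into each byte, zero-padding the tail."""
    padded = bits + [0] * (-len(bits) % 8)
    return bytes(
        sum(b << (7 - i) for i, b in enumerate(padded[j:j + 8]))
        for j in range(0, len(padded), 8)
    )

samples = [0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0]  # one pulse
bits = analog_to_bits(samples)                       # [0, 0, 1, 1, 1, 1, 0, 0]
payload = bits_to_bytes(bits)                        # 0b00111100 -> b'<'
print(bits, payload.hex())
```

The final `bytes` object plays the role of the string wired into VISA Write.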

Because when I connect the serial cable between the PC (LabVIEW) and the FPGA to test the received signal on the FPGA, the LEDs light as follows: when the signal is all ones, all LEDs are on,

but during the zero period of the signal the LEDs show the binary pattern 00001010, and I don't know why.

Also, when I change the Digital Waveform to Binary Array VI to the Digital Waveform to Binary U8 instance, the simulation stops and an error message appears (attached to this post). Does anybody know why?
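As an aside, the LED pattern itself can be decoded: 00001010 is 0x0A, the ASCII line-feed character, which often appears in serial streams as a termination character. Whether that is what the FPGA latched here is only a guess:

```python
# Decoding the LED pattern seen during the zero period (observation only;
# whether the FPGA really latched a termination character is a guess).
pattern = 0b00001010
print(hex(pattern))          # 0xa
print(chr(pattern) == "\n")  # True: 0x0A is the ASCII line feed
```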


QUOTE(BOBILLIER @ Jun 23 2007, 12:03 PM)

You can use a type cast, like in my example.

Why use Type Cast? To solve the error message?

So what about the conversions I made on the analog signal?

And what's the problem with using the Byte Array to String VI from the string palette? Because when I use the type cast, the simulation doesn't run continuously.


Well, I tried writing from HyperTerminal, and the ASCII codes of the written strings are correctly displayed as a binary representation on the FPGA board LEDs. But when I tried writing strings in LabVIEW using the Basic Read/Write example from the LabVIEW examples, the strings are not correctly represented in their equivalent binary format on the FPGA board LEDs.

Does anybody know why?

Plus, I still get the same error even when I use the Type Cast, i.e. nothing is solved.

Thanks in advance

Sherif Farouk (",)


Here's my code. Now I have connected the serial port between the PC (LabVIEW) and the FPGA board.

I receive the same signal that was written to VISA Write, but delayed, and the peak-to-peak full scale is 2 even if I change it; I don't know why.

Can anybody help?

Thanks in advance

Sherif Farouk (",)


It's kinda difficult to work out where the problem is - are you sure the string that you're sending out the port is exactly the same in both Hyperterminal and LabVIEW? Are you sure the serial port settings are the same? Are you sure the termination characters are the same?


QUOTE(crelf @ Jun 24 2007, 03:18 AM)

It's kinda difficult to work out where the problem is - are you sure the string that you're sending out the port is exactly the same in both Hyperterminal and LabVIEW? Are you sure the serial port settings are the same? Are you sure the termination characters are the same?

Yes, I'm sure of everything you stated above except the termination characters, because I don't know where to find that setting in HyperTerminal. If you can tell me, I'd be thankful.

Kindly see the attached HyperTerminal configuration I used in my test.


Try a loop back test (unplug the cable and put in a plug with pin 2 shorted to pin 3), then do the write in LabVIEW and read the response in Hyperterminal - that'll tell you what your LabVIEW program is really sending out.


QUOTE(Sherif @ Jun 23 2007, 09:39 PM)

Well, I tried writing from HyperTerminal, and the ASCII codes of the written strings are correctly displayed as a binary representation on the FPGA board LEDs. But when I tried writing strings in LabVIEW using the Basic Read/Write example from the LabVIEW examples, the strings are not correctly represented in their equivalent binary format on the FPGA board LEDs.

Does anybody know why?

Plus, I still get the same error even when I use the Type Cast, i.e. nothing is solved.

Thanks in advance

Sherif Farouk (",)

I have only checked the picture you supplied, and in this picture you are converting U16 values to string using the "U8 to string" conversion.

This approach will only use the lower 8 bits in the U16 values, i.e. the higher 8 bits will be lost using the "U8 to string" conversion.

Example:

If you are sending the two binary values 0x00FF and 0x0000, these will be sent as the two ASCII characters 0xFF and 0x00. If the FPGA expects 16-bit values, the serial data would be interpreted as 0xFF00, which is wrong.

To send all 16 bits, you will have to use a function other than "U8 to string" to do the conversion, e.g. typecasting.

/J
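The truncation described above can be sketched in Python, with `struct.pack` standing in for LabVIEW's big-endian Type Cast (the sample values are the ones from the example):

```python
import struct

values = [0x00FF, 0x0000]  # two 16-bit samples

# "U8 to string"-style conversion: keeps only the low byte of each value.
low_bytes = bytes(v & 0xFF for v in values)  # b'\xff\x00' -> high bytes lost

# Typecast-style conversion: all 16 bits, big-endian like LabVIEW's Type Cast.
full_bytes = struct.pack(">2H", *values)     # b'\x00\xff\x00\x00'

print(low_bytes.hex(), full_bytes.hex())
```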


You're skipping a couple of subtle details.

1. The array coming from the waveform is U32, and you're passing it to the Byte Array to String function, which accepts U8, so you're going to lose a lot of resolution in your data; it will not even look right.

2. The type cast to string may work, since it will convert anything into a string, but the result will include a lot of non-printable characters, and I'm not 100% sure they all go through the serial port. (Simple to test.)

3. If you use the type cast, it's important that on the receiving side the type prototype wired into the cast from string has the right numeric representation. Meaning, if it's a U32, then wire a U32 numeric constant into the top input. Otherwise it will convert the string incorrectly and you'll have junk data.

Item 3 you can test without the serial port: just cast the array to a string, then cast it back to an array and compare the graphs.

Good Luck!
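That round-trip check translates to something like this in Python, with `struct.pack`/`struct.unpack` standing in for Type Cast to and from a string (the U32 values are arbitrary):

```python
import struct

original = [10, 200000, 4294967295]    # U32 samples

# Cast to a "string" (big-endian bytes, like LabVIEW's Type Cast).
wire = struct.pack(">3I", *original)

# Correct prototype on the receive side: U32 -> data survives the round trip.
as_u32 = list(struct.unpack(">3I", wire))
assert as_u32 == original

# Wrong prototype: U8 -> same bytes, completely different (junk) numbers.
as_u8 = list(struct.unpack(">12B", wire))
print(as_u32, as_u8)
```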


QUOTE(crelf @ Jun 24 2007, 06:07 AM)

Try a loop back test (unplug the cable and put in a plug with pin 2 shorted to pin 3), then do the write in LabVIEW and read the response in Hyperterminal - that'll tell you what your LabVIEW program is really sending out.

OK, thank you crelf. I already tried the loopback test you suggested, but when I write, for example, "Ahmed", I receive "Amd".

I don't know why.

regards,

sherif farouk (",)

Thank you both, JackHamilton and JFM; you really helped me discover where the problem is.

But now I don't know why the pulse width of the read message is half of what was written to the port.

Do you have any ideas about this behavior?

Thanks in advance

Sherif Farouk (",)

-----------------


QUOTE(orko @ Jun 27 2007, 01:52 AM)

Sherif,

The reason you are getting half of the data is that the 16 bits per sample are being converted to 8 bits by the 8-bit array to string.

To fix this, wire up the resolution input of the Analog to Digital conversion VI, which defaults to 16 bits. Then change your Digital to Binary VI to the "Digital Waveform to Binary U8" instance. This seems to work for me.

I already did what you said, but still no change. Anyway, thank you for your help.

To illustrate the problem further:

When the analog pulse signal is converted to digital, sent over the serial port, and then read again in LabVIEW, the read signal has half the period of the written waveform (i.e. double the frequency, or double the rate).


QUOTE(Sherif @ Jun 26 2007, 05:12 PM)

I already did what you said, but still no change. Anyway, thank you for your help.

To illustrate the problem further:

When the analog pulse signal is converted to digital, sent over the serial port, and then read again in LabVIEW, the read signal has half the period of the written waveform (i.e. double the frequency, or double the rate).

My only suggestion is that perhaps, on the read side, the "binary to digital" conversion is acting on an array of U8 without knowing what the sample rate of the output digital waveform should be.

Remember that once the bits are captured and sent along the serial port as a string of data, they are just ones and zeros of the raw data and all the timing information on the original analog waveform is lost.

Perhaps you are going to have to capture the sample rate during the analog to digital conversion in the write code and encode it into the string it writes out to the serial port. Since I think you are intending to control an FPGA board with the serial port I/O directly, you may have to rethink how you're going to extract the timing information from the string that appears on the serial port.

Hope this helps.
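One way to carry the timing along, as suggested above, is to prepend the sample interval to the payload before it goes out the port. A Python sketch follows; note that the 4-byte header format here is an invention for illustration, not anything the FPGA or LabVIEW expects:

```python
import struct

def encode(dt_us, payload):
    """Prepend the sample interval (microseconds, U32 big-endian) to the raw bits."""
    return struct.pack(">I", dt_us) + payload

def decode(message):
    """Split the message back into the sample interval and the raw bits."""
    dt_us = struct.unpack(">I", message[:4])[0]
    return dt_us, message[4:]

msg = encode(1000, b"\x3c\xf0")
dt, data = decode(msg)
print(dt, data.hex())  # 1000 3cf0
```

The reader can then rebuild the digital waveform with the correct dt instead of falling back on a default.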


QUOTE(orko @ Jun 27 2007, 09:40 AM)

My only suggestion is that perhaps, on the read side, the "binary to digital" conversion is acting on an array of U8 without knowing what the sample rate of the output digital waveform should be.

Remember that once the bits are captured and sent along the serial port as a string of data, they are just ones and zeros of the raw data and all the timing information on the original analog waveform is lost.

Perhaps you are going to have to capture the sample rate during the analog to digital conversion in the write code and encode it into the string it writes out to the serial port. Since I think you are intending to control an FPGA board with the serial port I/O directly, you may have to rethink how you're going to extract the timing information from the string that appears on the serial port.

Hope this helps.

Thank you, orko. Now I'm sure that the error comes from the code loaded on the FPGA board.

I checked the VHDL code and it's OK. However, when I changed the VHDL process responsible for this problem to have double the period, the LabVIEW VI didn't work [i.e. the simulation stopped and a timeout error showed up].

Thus, I had to change the baud rate to fit the new values [the period doubled, so the baud rate should be half its old value]. Now LabVIEW is back to life and works, but unfortunately with the same problem.

This is because when I used a factor of two to reduce the baud rate, I doubled the period at the same time, so the two cancel each other out and I'm back to the same problem.

I don't know what I can do.

Do you have any ideas?

regards

Sherif Farouk (",)

-----------------


QUOTE(Sherif @ Jun 27 2007, 07:54 PM)

The baud rate of the serial connection has nothing to do with the sampling rate of the analog signal data in this scenario. Since you are converting the samples of the analog signal to raw binary data (ones and zeros), it doesn't matter how fast or slow they get to the receiving end of the serial I/O.

It does, however, matter what you set as your sampling rate in the "Simulate Signal" VI that is generating your square wave, and you must make sure that the "binary to digital" sampling rate is set to the same value. This will ensure that the period of the waveform can be correctly translated from the bits received on the read. If you look into the "binary to digital" VI's diagram, you will see that it uses this input as the "dt" of the resulting digital waveform (the time between each data point (bit) in the waveform), and that it defaults to 1000 µs. Looking at the front panel snapshot you gave us, I can tell that the sample interval of the input analog square wave is not exactly 1000 µs (it's close, but they need to match exactly for this to work reliably). This isn't the main cause of your double-frequency problem, however...

There is also another issue that I see happening, now that I look closer at your front panel. You are writing 100 bytes to the serial port, but only getting back 51 bytes of data on your read, which would explain why there is a "double frequency" problem since it looks as if you're missing every other byte of data in the LabVIEW read. I'm not sure why this is happening yet however. Perhaps a simplified test VI that exhibits this problem on your machine could be posted?

One guess I have is that the byte array to string/string to byte array is producing NULL (0x00h) and DEL (0x7Fh) characters, which may be confusing the serial communications.... a quick test on this is to replace the "byte array to string" and "string to byte array" with something that is not going to produce non-printable characters. Like perhaps something like this:

http://forums.lavag.org/index.php?act=attach&type=post&id=6261

This method will produce nice (human-readable) hex byte characters separated by commas (e.g. "00,00,7F,7F,..."), which are guaranteed not to interfere with the serial communications or termination characters.

Best of luck!
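A Python sketch of that hex-pair encoding (the function names are my own, not from the attached VI):

```python
def to_hex_csv(data):
    """Encode raw bytes as comma-separated hex pairs: printable, no NUL/DEL."""
    return ",".join(f"{b:02X}" for b in data)

def from_hex_csv(text):
    """Decode the comma-separated hex pairs back into raw bytes."""
    return bytes(int(tok, 16) for tok in text.split(","))

raw = bytes([0x00, 0x00, 0x7F, 0x7F])
wire = to_hex_csv(raw)               # '00,00,7F,7F'
assert from_hex_csv(wire) == raw     # lossless round trip
print(wire)
```

The trade-off is that each byte costs three characters on the wire, so throughput drops to roughly a third.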

