menghuihantang Posted February 16, 2009
A VB function requires a string input passed by value: function exec_init(ByVal FileName: String): Byte. In LabVIEW, I wired a string constant and passed it as a C string pointer, but it doesn't work. I tried to understand why. When converted to a C string pointer, a null terminator is appended to the end of the LabVIEW string; but a VB string doesn't use a null terminator, so the original LabVIEW string gets misinterpreted. Is this correct? If so, how can I pass a LabVIEW string to a VB function that expects it by value? A Pascal string pointer is definitely not an option, since it uses only 8 bits for the length information.
Aristos Queue Posted February 16, 2009
What is the memory layout of a VB string?
menghuihantang Posted February 16, 2009 (Author)
QUOTE (Aristos Queue @ Feb 15 2009, 02:34 AM) What is the memory layout of a VB string?
A 32-bit length followed by a Unicode character sequence, similar to a LabVIEW string.
Aristos Queue Posted February 16, 2009
QUOTE (menghuihantang @ Feb 15 2009, 11:10 AM) A 32-bit length followed by a Unicode character sequence, similar to a LabVIEW string.
This is not an authoritative answer -- someone else may know a better way. One thing you could do that would work is to create a LV string where the first 4 bytes are the length of the rest of the string and then pass it to the DLL call as a C-style string pointer.
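Purely as an illustration of the layout that suggestion produces (written in C here, since the actual steps would be done on the LabVIEW diagram, and the helper name is made up for the sketch): the buffer handed to the DLL starts with a 32-bit length, immediately followed by the character data, and the callee receives a pointer to the start of that buffer.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Illustrative only: [4-byte length][character data], pointer passed to
 * the start of the buffer.  In LabVIEW you would build this by flattening
 * the length and concatenating it with the string data. */
static char *make_length_prefixed(const char *data, int32_t len)
{
    char *buf = malloc(sizeof(int32_t) + (size_t)len);
    if (buf == NULL)
        return NULL;
    memcpy(buf, &len, sizeof(int32_t));               /* 4-byte length prefix */
    memcpy(buf + sizeof(int32_t), data, (size_t)len); /* character data       */
    return buf;                                       /* free() when done     */
}

int main(void)
{
    char *p = make_length_prefixed("hello", 5);
    free(p);
    return 0;
}
```

Note that, as pointed out further down in the thread, a real VB/COM string (BSTR) stores UTF-16 characters and hands the callee a pointer just past the length prefix, so this layout by itself is not a drop-in BSTR.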
LAVA 1.0 Content Posted February 16, 2009
QUOTE (menghuihantang @ Feb 15 2009, 06:10 PM) A 32-bit length followed by a Unicode character sequence, similar to a LabVIEW string.
Is that length the number of characters or the memory size in bytes? LabVIEW does not store strings in Unicode; it uses single-byte characters only, so the length in the first 4 bytes is equal to the number of characters. For Unicode you have a factor of 2. For a LabVIEW string to Unicode conversion, see http://forums.lavag.org/Convert-between-ASCII-and-Unicode-file2.html.
Ton
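As a rough C sketch of that factor of 2 (assuming the Windows ANSI code page for the single-byte side, which is an assumption and not something stated in the thread), converting a single-byte string to UTF-16 keeps the character count the same while doubling the byte count:

```c
#include <windows.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *ansi = "hello";   /* single-byte, LabVIEW-style text */

    /* Ask for the required size (includes the terminating NUL), then convert. */
    int chars = MultiByteToWideChar(CP_ACP, 0, ansi, -1, NULL, 0);
    wchar_t wide[64];
    MultiByteToWideChar(CP_ACP, 0, ansi, -1, wide, chars);

    /* Character count stays the same; the byte count doubles for UTF-16. */
    printf("characters: %d, ANSI bytes: %zu, UTF-16 bytes: %zu\n",
           chars - 1, strlen(ansi), (size_t)(chars - 1) * sizeof(wchar_t));
    return 0;
}
```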
Rolf Kalbermatter Posted February 17, 2009
QUOTE (Aristos Queue @ Feb 15 2009, 03:50 PM) This is not an authoritative answer -- someone else may know a better way. One thing you could do that would work is to create a LV string where the first 4 bytes are the length of the rest of the string and then pass it to the DLL call as a C-style string pointer.
Most likely he means an OLECHAR or BSTR string. It is basically a memory buffer with a 32-bit length in front, followed by the actual characters (as wide chars, UTF-16), but the pointer points to the first character, effectively hiding the length. The problem is not the zero character at the end; in fact this layout was probably devised so that a BSTR could still be interpreted as a C wide-char string if one makes sure to zero-terminate it. I posted a Unicode llb on the NI forums ages ago that can also create and deal with BSTRs, and I add it here for completeness. LabVIEW version >= 6.1. Download File: http://lavag.org/old_files/post-349-1234772122.llb
Rolf Kalbermatter
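For completeness, a minimal C sketch of that BSTR layout using the standard OLE allocation calls (the helper name bstr_from_ansi is invented here, and the ANSI code page is an assumption): the pointer you get back points at the first UTF-16 character, with the 32-bit byte length stored immediately in front of it and a terminating NUL appended after the data.

```c
#include <windows.h>
#include <oleauto.h>   /* SysAllocStringLen, SysStringLen, SysFreeString (link oleaut32) */
#include <stdio.h>

/* Convert a single-byte (LabVIEW-style) string into a BSTR.  The returned
 * pointer points at the first UTF-16 character; OLE keeps the 32-bit byte
 * length just in front of it and appends a terminating NUL. */
static BSTR bstr_from_ansi(const char *src, int srcLen)
{
    int wlen = MultiByteToWideChar(CP_ACP, 0, src, srcLen, NULL, 0);
    BSTR b = SysAllocStringLen(NULL, wlen);   /* allocates prefix + chars + NUL */
    if (b != NULL)
        MultiByteToWideChar(CP_ACP, 0, src, srcLen, b, wlen);
    return b;                                 /* release with SysFreeString() */
}

int main(void)
{
    BSTR b = bstr_from_ansi("hello", 5);
    printf("characters: %u, bytes: %u\n", SysStringLen(b), SysStringByteLen(b));
    SysFreeString(b);
    return 0;
}
```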