Ryan Vallieu Posted May 5, 2021

I have code from another developer that lets me interface with a server to get configuration data. I have compiled the C code into a .so on my Linux system without error, but in writing my LabVIEW CLFN I am unsure what to do with a pointer to a union when passing the data. The function in the C file:

void getValue(int so, CharacteristicHandle ch, Value *v)
{
    long longValue;
    unsigned long ulongValue;
    double doubleValue;
    char *c_ptr = 0;
    long msgType = g_val;

    sendLong( so, msgType );
    sendCHARhandle( so, ch );
    readLong( so, &longValue );
    v->order = longValue;
    readLong( so, &longValue );
    v->type = longValue;
    readLong( so, &longValue );
    v->label = longValue;

    switch( v->type )
    {
        case integer:
        {
            readLong( so, &longValue );
            v->v.l = longValue;
            break;
        }
        case uLong:
        {
            readLong( so, &ulongValue );
            v->v.ul = ulongValue;
            break;
        }
        case floating:
        {
            readDouble( so, &doubleValue );
            v->v.d = doubleValue;
            break;
        }
        case charString:
        {
            readString( so, &c_ptr );
            v->v.s = c_ptr;
            break;
        }
    }
    return;
}

The definition of "Value" from another header file:

/* Define a type to hold the value and type information for
 * a characteristic. */
typedef struct Value
{
    union              /* This holds the value. */
    {
        long l;
        unsigned long ul;
        double d;
        char *s;       /* If a string, user is responsible for
                        * allocating memory. */
    } v;
    long order;
    valueType type;    /* Int, float, or string */
    valueLabel label;  /* Description of characteristic -- what is it? */
} Value;

I'm a newbie with the CLFN. The primitive data types work fine in my other calls; this one I am not sure how to configure. Is this possible to do with the CLFN?
LogMAN Posted May 5, 2021

I'm no expert with CLFNs, but I know this much: LabVIEW has no concept of unions, but you can always pass an array of bytes or a numeric of the appropriate size and convert later. Just make sure that the container is large enough to hold the largest member of the union (64 bits, from what I can tell). Conversion should be simple enough for the numeric types, but the pointer requires additional work. Here is some information on dereferencing pointers in LabVIEW (there is a section about dereferencing strings):

https://forums.ni.com/t5/Developer-Center-Resources/Dereferencing-Pointers-from-C-C-DLLs-in-LabVIEW/ta-p/3522795
dadreamer Posted May 5, 2021

Just pass an 8-byte-wide element for the union (U64 / double / cluster of 8 U8s) and interpret it according to the type field after the function call. But I also see that you have to pass a struct (i.e., a cluster), not a single union, so you should bundle the order, type and label fields into your cluster as well. I don't see definitions for the valueType and valueLabel members of the Value struct. I assume they are an enum (I32) and a long (I32) - is that correct? I'm also somewhat unsure who is responsible for setting the return value (long, unsigned long, double or string) - the caller or the callee. If it's the caller and you want the return to be a string, you have to allocate the necessary amount of memory for char *s (DSNewPtr and friends) and deallocate it later. If it's the callee, then when you get the return as a string, you have to deallocate that string when you're done with it.
Ryan Vallieu Posted May 5, 2021

valueLabel is a typedef enum:

typedef enum
{
    noValue = 0, InstrumentServiceName, NetworkPortNumber, ComponentID,
    ScanRate, Slot, NumChannels, ChannelNumber, CardNumber, FirstChannelNum,
    IEEE488_Address, VXI_Address, SerialNumber, SensorID, ScanListItem,
    Gain, Filter, Excitation, Cluster, Rack, PCUtype, EUtype, RefType,
    PCUpressure, MaxPressure, AcqMode, CalTolerance, CalPressure1,
    CalPressure2, CalPressure3, CalPressure4, CalPressure5, DataFormat,
    ThermoSensorType, ThermoSensorSubType, SensorType, MaxValue, SwitchState,
    OrderForAction, RcalFlag, CalDelayTime, CalShuttleDelayTime, nfr, frd,
    msd, MeasSetsPerSec, ServerName, RefPCUCRS, CalMode, ZeroEnable,
    NullEnable, ZeroMode, RezeroDelay, SensorSubType, ModuleType, ModuleMode,
    MeasSetsPerTempSet, CardType, Threshold, ControlInitState,
    ControlPolarity, StateEntryLogicType, StateEntryDelay,
    StateExitLogicType, StateExitDelay, TriggerType, TriggerEdge,
    ScansPerRDB, InterruptLevel, CSR_Address, Program_Address, ScannerRange,
    Hostname, ConnectVersion, From, To, ShuntValue, ValveConfig,
    NextToLastLabel, lastLabel,
    EndOfValList = 9999
} valueLabel;

and valueType is also defined as an enum:

typedef enum
{
    noValueType = 0,
    integer = 1,
    floating = 2,
    charString = 3,
    uLong = 4,
    maxValueType,
    EndOfValTypeList = 9999
} valueType;
dadreamer Posted May 6, 2021

So try passing this cluster to your DLL (Adapt to Type -> Handles by Value) and see what happens. In theory you should receive NetworkPortNumber in the lower half (I32) of v.
Ryan Vallieu Posted May 6, 2021

Thanks, giving that a go this morning.
Rolf Kalbermatter Posted May 12, 2021

On 5/6/2021 at 4:55 PM, dadreamer said:
    So try passing this cluster to your DLL (Adapt to Type -> Handles by Value) and see what happens. In theory you should receive NetworkPortNumber in the lower half (I32) of v.

Note that "long order" is an int32 under Windows in any bitness, but an int64 under Linux for 64-bit! And the I32 portion in v might actually be in the higher-order half on big-endian platforms. For current LabVIEW versions that is only relevant for VxWorks RT targets; all other supported platforms are little-endian nowadays.
Ryan Vallieu Posted May 14, 2021

Thanks Rolf! I am on CentOS 7.6 64-bit on a PXIe-8135. Got sidetracked; I'm going to make test code that feeds constants through that data structure and wrap it so I know what data to expect.
Ryan Vallieu Posted May 17, 2021

That info about long being 64-bit on Linux cleared up a bunch of issues with the other function calls provided to me. The old system being interfaced is a Solaris system returning 32-bit long values, while the sendLong function on the PXIe CentOS system was sending 64-bit long values to the Solaris system, so every message carried 32 extra bits. I've made the owner of the API aware that long is not guaranteed to be 32 bits on a system, so the API must be reworked.
Rolf Kalbermatter Posted May 18, 2021

15 hours ago, Ryan Vallieu said:
    That info about long being 64-bit on Linux cleared up a bunch of issues with the other function calls provided to me. The old system being interfaced is a Solaris system returning 32-bit long values, while the sendLong function on the PXIe CentOS system was sending 64-bit long values, so every message carried 32 extra bits.

It's only a 64-bit value on 64-bit Linux. And it seems to be a GCC choice, while Microsoft chose to keep long as a 32-bit integer (and for some time didn't support long long, instead insisting on their private __int64 type). And while I'm not sure about the original Sun Solaris versions, which might only have existed as 32-bit anyway, the later Linux-kernel-based versions almost certainly use the same logic as the other Linux versions, although Sun had a tendency to customize things when they could, and sometimes even when they shouldn't. :-)
Ryan Vallieu Posted May 18, 2021

Well I have the basic union code isolated and the LabVIEW wrapper seems to work to call into the function. Thanks guys!
Ryan Vallieu Posted May 20, 2021

Using the valueLabel and then either TypeCast or GetValueByPointer.xnode to convert from the string is working. Thanks for the insight and the knowledge gained.
dadreamer Posted May 21, 2021

10 hours ago, Ryan Vallieu said:
    GetValueByPointer.xnode to convert from string

Good work! Another way would be to use the MoveBlock function to read out the string data: How to determine string length when dereferencing string pointer using LabVIEW MoveBlock

That way you could either read one byte at a time until you reach a NULL byte, or call StrLen, then allocate a U8 array of the proper length and finally call MoveBlock. From what I can vaguely recall, the GetValueByPointer XNode is not as fast as LabVIEW's native internal functions (if that matters to you). Also, I'm unsure whether you should deallocate that string once you have retrieved it in LabVIEW, or whether the library deallocates it on its own (it might use GC or some other technique). Perhaps you could ask the developer about that or study the source code. If you don't care, then just check for memory leaks by repeatedly retrieving a string (in a loop) and watching the memory taken by the LabVIEW process.
Rolf Kalbermatter Posted May 21, 2021

3 hours ago, dadreamer said:
    Another way would be to use the MoveBlock function to read out the string data ...

And I incidentally just had an application, inherited from someone, that I needed to debug: GetValueByPointer.xnode would return error 7 (File Not Found) when executed in a built app. Rather than digging into the xnode handling to find out why on earth it was returning such an error (for a reportedly valid pointer created with DSNewPtr), I simply replaced the whole thing with a call to StrLen and MoveBlock and was done with it!
Ryan Vallieu Posted October 22, 2021

Just found that the GetValueByPointer.xnode is not working in the built EXE; it works in the development environment. Bah. Looks like I will need to set up a CLFN for StrLen and use that with MoveBlock.
Rolf Kalbermatter Posted October 25, 2021

On 10/22/2021 at 10:59 PM, Ryan Vallieu said:
    Just found that the GetValueByPointer.xnode is not working in the built EXE; it works in the development environment.

It's most likely related to the xnode. Not sure why; it supposedly worked in the past, or the code in my inherited application would have made no sense, as it was always meant to run as a built executable and was used as such before. But I just threw it out, replaced it with direct calls to the LabVIEW manager functions StrLen and MoveBlock, and everything was fine.
Ryan Vallieu Posted October 25, 2021

I find that GetValueByPointer.vi works (in LabVIEW development mode) with the U64 returned in the 'v' union value, treating it as a pointer. But when I try the same thing with StrLen, passing the U64 as an unsigned pointer-sized integer, StrLen appears not to work: it returns a size of only 2 bytes when I expect 7, and when I then feed the U64 pointer and size 2 into MoveBlock, it doesn't return any characters.
Ryan Vallieu Posted October 25, 2021 Author Report Share Posted October 25, 2021 (edited) Ahhhhh... https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z0000019ZANSA2&l=en-GB Adding the ..\resource\lvimptsl.so to the project and then making sure to add it to the Always Included as specified in the link under a support \resource\ folder in the build directory allowed the EXE code to run properly. Of course now it is going to bug me why StrLen wasn't working correctly... Edited October 25, 2021 by Ryan Vallieu Quote Link to comment
ShaunR Posted October 25, 2021

2 hours ago, Ryan Vallieu said:
    Of course now it is going to bug me why StrLen wasn't working correctly...

If you are trying to dereference a string pointer, then try this: PtrToStr.vi
Ryan Vallieu Posted October 25, 2021

I got the same kind of strange behavior. With the pointer coming from v in the union as a U64, the GetValueByPointer.xnode works, but your example points to the wrong memory block and does not give the same answer. I do recognize WHERE the text came from, but not what alignment correction would be needed to get StrLen and MoveBlock to point to the correct memory location.
ShaunR Posted October 26, 2021

Are you sure you don't have a pointer to an array of strings?
Rolf Kalbermatter Posted October 26, 2021

9 hours ago, ShaunR said:
    If you are trying to dereference a string pointer, then try this: PtrToStr.vi

Technically there are two mistakes in this. StrLen() returns an int32_t, not a uint32_t (LabVIEW array lengths are, for historical reasons, always int32 values, even though a uint32_t would be more logical, as negative array lengths make no sense and there is no mechanism I'm aware of in LabVIEW that uses negative lengths as a special indicator of something). But this is a minor issue; if your string gets bigger than 2 billion characters, something is likely very wrong.

The other error is potentially more problematic, as the length parameter of MoveBlock() has for many, many moons been a size_t (it used to be an int32 in very early extcode.h headers, before the C99 standard became commonplace and introduced size_t and the other xxxx_t types). And yes, that is a 64-bit (usually unsigned) integer on 64-bit platforms. It will likely go well anyway, as LabVIEW will probably sign-extend the U32 to a U64 to fill the entire 64-bit stack location for this parameter, but that is not entirely guaranteed. Technically an optimizer would be allowed to fill in only the lower 32 bits and leave the upper 32 bits as whatever happens to be in memory, which could make the function misbehave very badly.
Rolf Kalbermatter Posted October 26, 2021

9 hours ago, Ryan Vallieu said:
    I got the same kind of strange behavior. With the pointer coming from v in the union as a U64, the GetValueByPointer.xnode works, but your example points to the wrong memory block and does not give the same answer.

This doesn't make much sense. His example points nowhere; it simply takes the pointer, interprets it as a zero-terminated C string and converts it to a LabVIEW string/byte array. If this doesn't return the right value, your v value is not what you think it is. And the GetValueByPointer xnode does, in principle, nothing else for a string type. But without an example to look at, ideally one we can test on our own machines, we won't be able to tell you more.
ShaunR Posted October 26, 2021

4 hours ago, Rolf Kalbermatter said:
    And yes, that is a 64-bit (usually unsigned) integer on 64-bit platforms.

Is that regardless of application bitness? Should this be an unsigned pointer-sized integer, or is a more complex solution required to ascertain the OS (target) bitness?
Rolf Kalbermatter Posted October 26, 2021

1 hour ago, ShaunR said:
    Is that regardless of application bitness?

size_t usually follows the platform bitness, since it makes little sense to have size parameters that could span more than 4 GB on a 32-bit platform. If you need a 64-bit value regardless, you need to use the explicit uint64_t. time_t is another type that can be 64-bit even on 32-bit platforms, but it doesn't have to be; it often depends on the platform libraries, such as the libc version used, or similar things.