
Strictly typed VI reference connector pane flag coding

Tomi Maila


Hi all,

I know I have these weird questions all the time... This time I need to know the coding scheme for the connector pane flags of a strictly typed VI reference.

I need to query, using the VariantDataType library, the connector pane settings of a strictly typed VI. The connector pane information is returned as an array of flags, but I don't know the coding scheme of these flags. Does anybody know the meaning of these flags, or do I need to reverse engineer them myself?

See the attached VI for details on how to retrieve these flags.




I've now done some research. Let's index the flag bits starting with zero from the most significant bit of the I16 flag.

  • Bit 0 describes whether the terminal is a dynamic dispatch terminal
  • Bits 1-2 are unknown
  • Bit 3 is 1 for required inputs and dynamic dispatch inputs, and 0 otherwise (inputs that must be connected)
  • Bit 4 is 1 for recommended or optional inputs and outputs, and 0 otherwise (terminals that don't need to be connected)
  • Bit 5 defines whether the output is in-place with an input (1) or not (0)
  • Bit 6 is unknown
  • Bit 7 describes whether the terminal is an input (0) or an output (1)
  • If bit 5 is 1, bits 8-15 (or possibly 12-15, I'm unsure) encode the index of the input terminal that the output is in-place with. The weird thing is that I cannot predict the in-placeness: I may have two identical VIs with identical block diagrams and identical connector panes, where one can pass the error cluster in-place and the other cannot. LabVIEW does not seem very deterministic in determining the in-placeness of an output. Actually, the data doesn't even have to be in-place for the output to be flagged as in-place: for example, an error out terminal connected to a constant can be flagged in-place with an error in input that is connected to nothing.

For unconnected terminals the flag is 0 for all 16 bits.
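To make the bit layout above concrete, here is a minimal Python sketch of a decoder for one I16 flag. Everything in it is based on the reverse-engineered scheme described in this post, not on any documented LabVIEW behavior, and the function name and field names are my own invention:

```python
def decode_conpane_flag(flag):
    """Decode one 16-bit connector pane flag.

    Bits are numbered from 0 starting at the MOST significant bit,
    matching the scheme described above. All field meanings are
    reverse-engineered guesses, not documented LabVIEW behavior.
    """
    flag &= 0xFFFF

    def bit(n):
        # bit n counted from the most significant bit of an I16
        return (flag >> (15 - n)) & 1

    info = {
        "dynamic_dispatch": bool(bit(0)),       # bit 0
        "required": bool(bit(3)),               # required or dyn. dispatch input
        "recommended_or_optional": bool(bit(4)),
        "inplace_with_input": bool(bit(5)),
        "is_output": bool(bit(7)),              # 0 = input, 1 = output
        "unconnected": flag == 0,               # all 16 bits zero
    }
    if info["inplace_with_input"]:
        # The low byte (bits 8-15 from the MSB) appears to hold the index
        # of the input terminal the output is in-place with; whether the
        # field is 8 or 4 bits wide is uncertain.
        info["inplace_partner_index"] = flag & 0xFF
    return info
```

For example, a hypothetical flag value of 0x0502 (output bit and in-place bit set, low byte 2) would decode as an output that is in-place with input terminal 2.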

EDITED Dec 5 2007, 15:17 GMT.



I believe you are correct for bits 0-7.

I created a simple VI with 10 terminals, with an error in and an error out connected on the block diagram. As I moved the Error In input from terminal to terminal, the least significant bits of my Error Out flag changed to the array index of the Error In flag. As I increased the number of terminals, the number of index bits grew when I placed the Error In at the top-left terminal of the pane.

When I used more than 12 terminals, the least significant bits went to zero. The only thing I changed was the number of terminals; I let LabVIEW automatically reassign the positions. I did not see any other flag bits change when using more than 12 terminals.
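The experiment above suggests that, per the proposed scheme, you can pair each in-place output with its input partner just by scanning the flag array. A hedged sketch, assuming the partner index lives in the low byte (unverified for panes with more than 12 terminals, where the field was observed to read zero); the function name and the sample flag values are hypothetical:

```python
def inplace_pairs(flags):
    """Given the array of I16 connector pane flags, return
    (output_index, input_index) pairs for each output whose
    in-place bit is set.

    Bit positions follow the scheme in this thread: bit 7 from
    the MSB marks an output, bit 5 marks in-placeness, and the
    low byte is assumed to hold the partner input's index.
    """
    pairs = []
    for i, flag in enumerate(flags):
        is_output = (flag >> 8) & 1    # bit 7 counted from the MSB
        inplace = (flag >> 10) & 1     # bit 5 counted from the MSB
        if is_output and inplace:
            pairs.append((i, flag & 0xFF))
    return pairs
```

With a made-up 10-terminal pane where a required input sits at index 3 and an Error-Out-style output at index 9 carries flag 0x0503, this would report the output at index 9 as in-place with the input at index 3, matching the Error In / Error Out behavior described above.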



