
LVOOP static method VI related issues



I encountered some LVOOP class static method VI related issues. I reported them to NI, but I want to share them with the community as well so that you are aware of them, should you encounter them.

When a child class is passed to a static method of a parent class, the method output is automatically cast to the child class type under certain conditions: if the class wire from the class input to the class output is continuous, then the output is automatically cast to the more specific type in the calling VI's block diagram. I liked this behaviour at first; I even expected LabVIEW to behave that way.
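
For readers more at home in textual languages, a rough C++ analogy may help (my own sketch, not anything LabVIEW actually generates): the behaviour is as if LabVIEW quietly gave such a method a generic signature whenever it can prove the object flows straight from input to output.

    #include <iostream>

    class Parent {
    public:
        virtual ~Parent() = default;
        virtual const char* name() const { return "Parent"; }
    };

    class Child : public Parent {
    public:
        const char* name() const override { return "Child"; }
    };

    // Analogy for a static method whose class wire is continuous from
    // input to output: the runtime type is provably preserved, so the
    // call site keeps the more specific type.
    template <typename T>
    T methodB(T obj) {
        // ...do some work; the object flows straight through...
        return obj;
    }

    int main() {
        Child c;
        Child out = methodB(c);          // no explicit downcast needed
        std::cout << out.name() << "\n"; // prints "Child"
    }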

However, I noticed today that there are some issues that make this behaviour a real problem. See the attached project. There are two classes, 1 and 2. Class 1 has static methods A and B, and Class 2 has dynamic methods C and D. In Main VI.vi there are the following examples:

  1. The output of Method A is not automatically typecast even though the class wire is continuous, because there is a Diagram Disable structure in the block diagram of A. This is a bug, I think.
  2. The output of Method B is automatically typecast because the class wire in B is continuous. OK.
  3. A method interface should completely define the behaviour of the method in OOP. This guarantees that the developer can later change the implementation of the method without breaking any VI that uses it. This is in the very soul of OOP and is what guarantees reusability. In LabVIEW this means that the method connector pane should completely define how the method can be used. However, this doesn't hold for static LabVIEW class methods, as the third example shows. Method B2 represents some later implementation of method B. Although the interfaces of methods B and B2 are identical, changing the internal implementation of B to B2 has broken a wire in the calling VI (see the sketch after this list). This, I think, is a real design flaw that should be addressed in the next major release of LabVIEW.
  4. Method D is broken because of a Diagram Disable structure. A definite bug!
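
To see why point 3 bites, here is the same C++ analogy continued (again a hedged sketch, not LabVIEW's actual mechanism): B2 keeps an identical connector pane, but because the object no longer flows straight through, the provable type propagation, and with it the effective signature, silently disappears.

    class Parent {
    public:
        virtual ~Parent() = default;
    };

    class Child : public Parent {};

    // "B2": same connector pane as B, but the object is rebuilt
    // internally, so the output can only be promised as Parent.
    Parent methodB2(const Parent& obj) {
        (void)obj;
        Parent result;            // freshly constructed object
        return result;            // runtime type not preserved
    }

    int main() {
        Child c;
        // Child out = methodB2(c); // broken: a Parent is not a Child,
                                    // even though the declared interface
                                    // never changed
        Parent out = methodB2(c);   // callers must now downcast themselves
        (void)out;
    }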

About the third point: what I suggest is that the connector pane should completely determine the behaviour of the VI. This means there should be something similar to the dynamic dispatch terminal for static VIs. This terminal wouldn't affect dispatching, but it would determine whether the output is automatically typecast. Methods written in LabVIEW 8.20 should be automatically upgraded to use this new type of terminal, so that every method whose output is typecast in 8.20 would get these new terminals instead of standard terminals.

One alternative is to rename the dynamic dispatch output to simply a dynamic output. Such a dynamic output would require that any wire connected to it be continuous from a class input. For dynamic VIs this input must be the dynamic input; for static VIs it can be any of the class inputs. The input should not change from case to case inside case structures.
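
In textual terms, this proposal amounts to moving the decision from inference to declaration. A hedged sketch of the two kinds of terminal, continuing the C++ analogy:

    class Parent { public: virtual ~Parent() = default; };
    class Child : public Parent {};

    // "Dynamic output" analogue: declared generic, so the output type
    // always propagates and the body must preserve the runtime type.
    template <typename T>
    T withDynamicOutput(T obj) { return obj; }

    // Plain static output analogue: declared concrete, so the output
    // type never propagates, even if the body happens to preserve it.
    Parent withPlainOutput(Parent obj) { return obj; }

    int main() {
        Child c;
        Child a = withDynamicOutput(c);  // guaranteed by the declaration
        // Child b = withPlainOutput(c); // never compiles: output is Parent
        (void)a;
    }

Either way, changing the body of a method could never silently change its signature.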

Download File:post-4014-1169496645.zip


Jimi, please... you gotta stop posting to both NI and here. Or repost my reply along with your crosspost. I can't keep up with all the forums.

----------later----------

Ok, here's my cross post from the ni.com forum...

The code disable structure (and conditional disable structure) bug is fixed in the next version of LV, not 8.2.1.

With regard to the interface change...

I don't consider it a design flaw. I've had this conversation a few times with various customers/internal developers. Effectively you *have* changed the interface -- you've made it so that the output type is no longer guaranteed to be the same as the type that was passed in. All callers of this method now have to worry about the possibility of a downcast that might fail. It is a hard-to-notice change in the interface, but a change nonetheless. Perhaps something might be done to call attention to the fact that the type used to propagate through but doesn't any longer, but whether such a call out is someday added or not doesn't change the fact that the interface no longer guarantees type safety. I think this is the same category as a parameter passed by pointer in C++ changing from "const type *" to "type *".
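
The C++ comparison in concrete form (my own spelling-out of the analogy, not code from the post):

    // Both functions accept a pointer to an int, and a casual reader
    // sees the "same" parameter list -- but dropping const withdraws a
    // promise, and callers that relied on it break.
    void processOld(const int* data) { (void)data; } // promises not to write
    void processNew(int* data)       { *data = 0; }  // promise withdrawn

    int main() {
        const int value = 42;
        processOld(&value);
        // processNew(&value); // error: cannot convert 'const int*' to 'int*'
        return 0;
    }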

Going so far as to provide a way to mark terminals to enforce runtime type propagation would be nice. It would help with those dynamic dispatch cases where you'd like multiple terminals to propagate type, something that can't currently be done because we can't tell which implementation will be invoked and any new child might not preserve the type on its own diagram. But we wouldn't turn off the automatic detection -- the ease-of-use factor is way too high. As you said, you expected it to work this way when you started, and requiring everyone to mark terminals on all VIs -- not just members of the class, but any VI that uses the class -- raises an extreme usability barrier.

> Effectively you *have* changed the interface -- you've made it so that the output type is no longer guaranteed to be the same as the type that was passed in. [...] But we wouldn't turn off the automatic detection -- the ease-of-use factor is way too high.

What if I want to have an interface for my VI such that the VI doesn't do this "coincidental automatic downcasting"? Perhaps I need such an interface because I'm not certain I can implement my VI so that coincidental automatic downcasting would still work in later revisions of the VI. However, the functionality of the VI would simply be setting an object's private data member value and then passing the object out. Such a VI interface would be considered to use coincidental automatic downcasting, and the problem is, I cannot think of any good way of preventing it. So I cannot write a VI that would preserve its interface. The only way I can think of to avoid the coincidental automatic downcasting would be placing a case structure with a constant selector. Doesn't sound like the Zen of LabVIEW.

What I'm trying to say is that I think there needs to be a "manual override" for the coincidental automatic downcasting, so that the developer could specify at development time whether this automatic behaviour will be used for a specific output terminal. Perhaps output terminals would automatically downcast by default, but you could turn this feature off and manually specify whether each output terminal downcasts or not.
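
In a textual language this override would simply be an explicit type annotation on the output. A hedged C++ sketch of what the "turned off" form would mean for the set-a-member VI described above:

    class Parent {
    public:
        virtual ~Parent() = default;
        int x = 0;
    };
    class Child : public Parent {};

    // Set a private data member and pass the object out, but with the
    // output pinned to the parent type by declaration, so the interface
    // never accidentally promises runtime-type preservation.
    Parent setXNoDowncast(Parent obj) {
        obj.x = 5;
        return obj;   // a Child input comes back as a plain Parent
    }

    int main() {
        Child c;
        Parent out = setXNoDowncast(c); // callers get Parent, by contract
        (void)out;
    }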

EDIT: If coincidental automatic downcasting can change the interface of the VI, then this interface should also be recognized by type-specifying VI constants and Call By Reference nodes. Currently this interface is ignored by these nodes. If you try to call your method dynamically, the node interface is different from what it is when your call is statically linked.

> What I suggest is that the connector pane would completely determine the behaviour of the VI. This means that there should be something similar to the dynamic dispatch terminal in static VIs. [...]

I reported this as well, but it was discarded as "not a bug". I also think you should be able to define what you want, or how you want to see it, and that LabVIEW should not try to think for the programmer. It may be that the programmer simply made a mistake. If LabVIEW suddenly thinks it can solve the problem by replacing the VI with the more generic one, this may have consequences the compiler could never have thought of. We all know that when the computer starts to think for us, things go wrong ;)

For example, say we have an application that feeds birds by calling a Bird.feed method. If we accidentally wire a Bear to this method, LabVIEW automatically replaces the Bird.feed VI with the more generic Animal.feed VI. This may yield unexpected problems, because feeding bears is a different ball game. We should have taken precautions before we started feeding the bear. To a non-OO programmer this may sound far-fetched because we don't deal with bears in LabVIEW, but similar problems occur in programs. As programmers we know the classes we work with. On some classes we can act immediately, but on others the same method may require other things to be done first. If we had wanted to be able to feed all animals, including bears, we would have written a method that could do that and would have allowed bears to be fed as well, by specifying Animal as the input type. If LabVIEW adjusts the output type to what it sees going into the VI on the diagram, this can have unexpected effects further on in the program.
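
A rough textual rendering of the bear scenario (my own sketch, with hypothetical Animal, Bird, and Bear classes):

    #include <iostream>

    class Animal { public: virtual ~Animal() = default; };
    class Bird : public Animal {};
    class Bear : public Animal {};

    void feed(Bird&)   { std::cout << "feeding a bird\n"; }      // what we wrote the caller for
    void feed(Animal&) { std::cout << "feeding some animal\n"; } // generic fallback

    int main() {
        Bear grizzly;
        feed(grizzly); // quietly resolves to the generic feed(Animal&);
                       // the accidental Bear is never flagged as a mistake
    }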

If you have to specify, at least you have thought about it.

Joris

You could add an option on the method call named "adjust output type to input". Ideally it would give a visible "bump" on the output to indicate that it is automatically cast. You would then be able to keep predictable, visible behaviour under all circumstances.

Joris

> What I'm trying to say is that I think there needs to be a "manual override" for the coincidental automatic downcasting, so that the developer could specify at development time whether this automatic behaviour will be used for a specific output terminal.
The case structure suggestion sounded perfect to me. Remind me to add to the Zen of LabVIEW that "Odd is fine if the case is rare." I'll wager that no one who is not a LAVA reader raises this question in the next two years. If anyone does, I'll reconsider the issue.

> EDIT: If coincidental automatic downcasting can change the interface of the VI, then this interface should also be recognized by type-specifying VI constants and Call By Reference nodes. Currently this interface is ignored by these nodes. If you try to call your method dynamically, the node interface is different from what it is when your call is statically linked.

You're not going through the same interface if you're using the Call By Reference node. The CBR node does not preserve inplaceness -- it would be odd to return an error that something you cannot see doesn't match. Similarly, the CBR does not preserve coincidental auto downcasting. The interface you're going through could invoke any VI with that type signature. There are really two options here, equally valid under the rules:

  1. If the CBR node were to demand that all VI refs that come to it must have the same coincidental terminal mapping, then it could return an error if one didn't.
  2. If the CBR allows any VI as long as the types match, then it can run any of them as long as it never does coincidental auto downcasting.

The question is whether you consider the CBR to be defining the interface to any VI of a given type signature or any VI of a given call signature. Given that the call signature is invisible to users -- and I've seen no feedback generally to change this -- then, like inplaceness, it should not be part of the CBR behavior.
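
A function-pointer analogy for that distinction (again my own hedged sketch): the reference type records only the type signature, so nothing implementation-specific can be promised through it.

    class Animal { public: virtual ~Animal() = default; };

    // A VI-reference analogue: the type records parameter and return
    // types only, and any conforming function may sit behind it.
    using MethodRef = Animal (*)(Animal);

    Animal passThrough(Animal a) { return a; }        // happens to preserve type
    Animal rebuild(Animal)       { return Animal{}; } // does not

    int main() {
        MethodRef ref = passThrough; // legal target
        ref = rebuild;               // equally legal target
        Animal out = ref(Animal{});  // so the output can only ever be Animal
        (void)out;
    }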

Now, we already have "strict VI references". We could establish "super strict VI references." Maybe instead of one orange star we could put two orange stars. These references would preserve inplaceness, coincidental auto downcasting and any other call interface features that LV invents as time goes forward. Only the most advanced users would ever be interested in them, but it is something we could do.

PS: In case anyone else is wondering, Jimi asked me for a name for the feature, and I suggested:

"explicit automatic downcasting" for what happens between the dynamic input and dynamic output FPTerminals, where the VI actually breaks if the runtime type is not preserved, and

"coincidental automatic downcasting" for when LV discovers that a static VI happens to preserve runtime type and so we enable automatic downcasting on callers of the VI.

