Tomi Maila

LabVIEW constant values change


I see the advantages of the "constructor" method solution (flexibility, DD/inheritance, ...), but it produces a huge init-VI block diagram (see the example for only 5 nested objects, and imagine the same for 100 objects). I'd like something more readable and maintainable. How do you manage initialization/"construction" of hundreds of nested objects?

Snap_001.png

9 minutes ago, Petr said:

I see the advantages of the "constructor" method solution (flexibility, DD/inheritance, ...), but it produces a huge init-VI block diagram (see the example for only 5 nested objects, and imagine the same for 100 objects). I'd like something more readable and maintainable. How do you manage initialization/"construction" of hundreds of nested objects?

Short answer: Make it table driven.

Long answer: As you scale up like that, you're entering the realm where you aren't going to code those values directly into G. You may be looking at

  • an array of values in G that you loop over to create objects
  • a binary file where you read back the objects you serialized earlier
  • or something as complex as a SQL database with relationships between multiple tables and you process the entire database to create your object layout. 

These are just suggestions; there are many other options. The point is, you need to make your construction data-driven instead of directly coded. This is true of every programming language I've seen, not just G. The notion of objects is such that they come into being as data drives their existence. It's relatively easy to create an object with a given value, and if you're building up objects as they come into existence within a system, everything works fine. But to create an entire system of objects in one burst, you need a data structure that can describe the entire system. That's why things such as object databases exist.
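G is graphical, so as a rough textual analogue, here is a hypothetical sketch in Python of the table-driven idea: a row-per-object table (which could just as well be loaded from a binary file or a SQL query) and a single loop that constructs everything. All names here are invented for illustration, not taken from any real project.

```python
# Hypothetical sketch of table-driven construction (Python standing in
# for G). Each row of the "table" describes one object; one loop turns
# the whole table into live objects instead of hundreds of hand-wired
# constructor calls on the block diagram.

from dataclasses import dataclass

@dataclass
class Channel:
    name: str
    gain: float
    offset: float

# The data that drives construction: this literal could instead come
# from an array constant in G, a deserialized binary file, or a database.
CHANNEL_TABLE = [
    ("temp_1", 2.0, 0.5),
    ("temp_2", 2.0, 0.75),
    ("press_1", 1.5, 0.0),
]

def build_channels(table):
    # One loop replaces the sprawling init VI.
    return [Channel(name, gain, offset) for name, gain, offset in table]

channels = build_channels(CHANNEL_TABLE)
print(len(channels))     # 3
print(channels[0].name)  # temp_1
```

Scaling from 5 objects to 100 then only means adding rows to the table, not rewiring the diagram.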

On 12/18/2018 at 5:10 AM, Petr said:

But it produces a huge init-VI block diagram (see the example for only 5 nested objects, and imagine the same for 100 objects)

It's worse than what you show. I see a lot of typedefs there. One issue with your approach is that there is no guarantee your values will remain when you update your typedefs. I have been using LabVIEW long enough to know that you should never trust typedefs on the block diagram to keep their values when they get updated. I would change your code right now, because it will fail in the future.

19 minutes ago, Michael Aivaliotis said:

It's worse than what you show. I see a lot of typedefs there. One issue with your approach is that there is no guarantee your values will remain when you update your typedefs. I have been using LabVIEW long enough to know that you should never trust typedefs on the block diagram to keep their values when they get updated. I would change your code right now, because it will fail in the future.

Is that true in recent versions? Typedefs got revision protection in LV 2015, I think. If the mutation would throw away data, LabVIEW does the same sort of "relink" behavior as when changing the conpane of a subVI, so you preserve the data.

2 minutes ago, Aristos Queue said:

Is that true in recent versions? Typedefs got revision protection in LV 2015, I think. If the mutation would throw away data, LabVIEW does the same sort of "relink" behavior as when changing the conpane of a subVI, so you preserve the data.

I'm confident you have data to support your argument. However, I've been burned so many times that I can't keep track of whether it was fixed, which version fixed it, or whether the fix itself is buggy and only works under certain conditions. I have lost money because of this and don't risk it anymore. Sorry. But the fact that it was an issue for so many years and was only addressed in 2015 gives me pause. Again, I'm not doubting your statement, but I don't trust it.

17 hours ago, Michael Aivaliotis said:

I'm confident you have data to support your argument. However, I've been burned so many times that I can't keep track of whether it was fixed, which version fixed it, or whether the fix itself is buggy and only works under certain conditions. I have lost money because of this and don't risk it anymore. Sorry. But the fact that it was an issue for so many years and was only addressed in 2015 gives me pause. Again, I'm not doubting your statement, but I don't trust it.

It's definitely fixed in LabVIEW 2016, although it can be a pain. LabVIEW will break all VIs containing a cluster or enum typedef whose elements hold values it cannot mutate unambiguously. You get a dialog that lists all those locations and shows you an old and a new view of the data (with a best guess for the new value), where you edit or select the new data value and then confirm to use it from then on. This applies to enum values that changed their name (or were removed), as well as to cluster elements with a different name or a reordering that makes it impossible to reassign the old values unambiguously. Simply removing an element from a cluster or enum does, however, leave the other elements' data intact, so the reassignment is based on label/name rather than just the ordinal order of the data.

It's a huge improvement, although it can feel a bit painful at times as an entire hierarchy can seem to break completely because of a minor edit in an enum label.
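The label-based reassignment described above can be sketched in code. The Python below is not NI's actual mutation algorithm, just a hypothetical illustration of migrating a saved cluster value by element label rather than by ordinal position, with a defaults table standing in for the best-guess values the dialog asks you to confirm.

```python
# Hypothetical sketch (not LabVIEW's real implementation) of migrating
# a stored cluster value across a typedef edit by matching element
# labels instead of positions.

OLD_TYPEDEF = ["gain", "offset", "timeout"]
NEW_TYPEDEF = ["offset", "gain", "retries"]  # reordered, one field replaced

def migrate(old_value, new_typedef, defaults):
    # Carry a value over only when its label still exists in the new
    # typedef; anything new or unmatched falls back to a default,
    # which is what the mutation dialog would ask you to confirm.
    return {label: old_value.get(label, defaults[label])
            for label in new_typedef}

old = {"gain": 2.0, "offset": 0.5, "timeout": 10}
defaults = {"offset": 0.0, "gain": 1.0, "retries": 0}
new = migrate(old, NEW_TYPEDEF, defaults)
print(new)  # {'offset': 0.5, 'gain': 2.0, 'retries': 0}
```

Note how "gain" and "offset" survive the reordering because they match by name, while the removed "timeout" is dropped and the new "retries" takes its default.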

Edited by Rolf Kalbermatter

5 hours ago, Rolf Kalbermatter said:

It's definitely fixed in LabVIEW 2016, although it can be a pain. LabVIEW will break all VIs containing a cluster or enum typedef whose elements hold values it cannot mutate unambiguously. You get a dialog that lists all those locations

Yes, I remember when this feature came out. I was very happy about it. Then I foolishly trusted it, and then I discovered a bug in the feature by accident: the auto-mutation failed. So I was burned again by the very feature I was supposed to trust to solve the original problem. You see why I'm shell-shocked.


