
LabVOOP Usage 1 Year Later


Recommended Posts

QUOTE(Val Brown @ Aug 24 2007, 12:11 AM)

As you point out, without CLEAR advantages to a new feature, most programmers doing real work won't -- and can't -- take the time to "play around" and see how it goes.

That is exactly my issue right now. And I don't see it changing significantly until NI supports training on LVOOP (or whatever we're calling it these days). Until then, all I'm saying to my boss is that I think I know how to use this stuff...kind of...and to give me more time to learn it. That isn't going to fly when we already spent thousands on advanced training that was based on the idea of having a certified developer on staff that knows what he's doing. Now I'm going to tell them that formal training isn't really needed to get a large project up and running with LVOOP...?? Uh uh. :nono:

The good news is that I hear through the grapevine that there may be new LVOOP courses on the way (a new two day addition to LV Adv)...no telling when that will happen though. Until then I'm afraid I may be stuck doing LVOOP on the side as a "hobby".

Link to comment

QUOTE(Justin Goeres @ Aug 24 2007, 06:30 AM)

Removing our access to clusters outside of classes would be one way to do it. Even that wouldn't truly force everyone to change, but it would most certainly ruin the experience for anyone refusing to do so. (I don't think NI should, or would do that.)

You've hit the nail on the head. The ONLY way(s) I can think of to FORCE everyone to use classes would be for them to remove one or more central components of LabVIEW and, thereby, kick their most dedicated and long-standing customers right in the teeth.

That would be the most classless thing that NI could do, no matter how good the overall implementation was or how comfortable it made the transition for current C++ and Java users.

If I'd wanted to have been COMPELLED to use objects I would have simply used C++ from the beginning. I really do NOT want G to simply become a "pretty" IDE for C++. It doesn't NEED to happen; doesn't bring ANY real benefits; violates the central design and organizing principles of LabVIEW; and is an affront to those who were not only "early adopters" of LabVIEW, but have used it consistently for years.

Link to comment

QUOTE(orko @ Aug 24 2007, 09:32 AM)

The good news is that I hear through the grapevine that there may be new LVOOP courses on the way (a new two day addition to LV Adv)...no telling when that will happen though. Until then I'm afraid I may be stuck doing LVOOP on the side as a "hobby".

I agree, training is essential.

We at Endevo have rewritten our two-day course (which has been running since 1998) to introduce OOP in LabVIEW; it covers LVOOP as well as the by-reference extension. We are currently writing the last words of the course book...

It will be released this autumn, and we are working to make it available to as many LabVIEW programmers as possible. At the least it will be available through:

Endevo in Sweden and Australia

VI Engineering in the US

Zühlke in Switzerland/Germany

And anywhere you can convince me to travel :shifty: (which is WW).

Jan

Link to comment

QUOTE(Val Brown @ Aug 24 2007, 10:00 AM)

If I'd wanted to have been COMPELLED to use objects I would have simply used C++ from the beginning. I really do NOT want G to simply become a "pretty" IDE for C++. It doesn't NEED to happen; doesn't bring ANY real benefits; violates the central design and organizing principles of LabVIEW; and is an affront to those who were not only "early adopters" of LabVIEW, but have used it consistently for years.

*chuckle* I'm sorry... reading this (and the other posts) I can't help thinking of Monty Python and the Holy Grail. "Help! Help! I'm being repressed!"

You're in no danger whatsoever of classes being forced on you by the removal of clusters. Nor any other method of moving you toward classes by NI. The only thing that will compel you toward classes will be the awe-inspiring beauty of the coherent libraries of VIs that your peers produce in the next few years and the shame that you feel when you compare it to your own VI hierarchies. :worship: Why should we try to force you to use classes when your own base desires (for good code and sustainable designs) will draw you inevitably toward it?

Tangent A: C++ does not compel the use of classes. You can backslide into C anytime you want. The C++ compiler accepts all C syntax.
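As a rough text-language illustration of Tangent A (not from the original post; the struct and function names below are invented), this is plain procedural code with no classes anywhere, and a C++ compiler will build it unchanged:

/* Plain procedural code, no classes anywhere. A C++ compiler accepts it as-is. */
#include <cstdio>

struct Reading { double volts; double amps; };   /* a C-style "cluster" */

double power(struct Reading r)   /* free function, passed by value */
{
    return r.volts * r.amps;
}

int main(void)
{
    struct Reading r = { 5.0, 0.25 };
    printf("%.3f W\n", power(r));
    return 0;
}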

Tangent B: G as a pretty IDE for C++??? What a HORRIBLE vision!! Have you *seen* C++? It is more of a hack than a language. Like Darth Vader, C++ "is more machine than man now." My hat goes off to the hackers who designed it... there are amazing amazing aspects to it. But for arcane syntax, it wins over just about every language I've ever seen. G should not be a pretty IDE for any of the traditional programming languages. What it should be is a pretty IDE for expressing human concepts and needs to the CPU of the machine, in the most elegant, efficient and intelligible way possible... which is why you'll eventually _want_ classes.

Link to comment

QUOTE(Justin Goeres @ Aug 24 2007, 06:30 AM)

I hadn't heard that rumor, but my belief is that if NI is going to teach LVOOP (and I think they should) it should be taught early in the course tree. If the drumbeat of LVOOP adherents is "every cluster should be a class" (which I think I stole from AQ, but I also think is a decent jumping-off point), then I'd like to see LVOOP taught in place of (or alongside) clusters, right at the Basics level.

That gets me thinking... How about marketing LVOOP to new users as "private cluster libraries"? Don't even say "object" and especially don't say "object-oriented programming". Simply teach new users about the features of LabVIEW private cluster libraries and why these features help them write better code.

The problem with calling LVOOP "object-oriented programming" is that people will then try to research traditional OOP, and then one of the following can happen:

A) They quickly get caught in a bottomless pit of information and get discouraged because there is too much to learn. (I've been a student of OOP for quite some time and don't feel like I fully understand everything about all flavors of OOP.)

B) They understand and like traditional OOP and become disappointed to learn that LVOOP is different.

Instead, teach them about the features and then surprise them later by saying "Guess what? You know how to do object-oriented programming in LabVIEW."
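To make the "private cluster library" framing concrete, here is a rough analogue in a text language (C++ used only because G is graphical; the class and member names are hypothetical): the data looks like an ordinary cluster, but only the library's own accessor routines may touch it.

// Rough analogue of a "private cluster library": the fields form an ordinary
// cluster, but they are private, so callers must go through the accessors.
#include <iostream>

class TemperatureChannel {
public:
    void set_limits(double low, double high) { low_ = low; high_ = high; }
    bool in_range(double reading) const { return reading >= low_ && reading <= high_; }
private:
    double low_  = 0.0;    // hidden "cluster" fields
    double high_ = 100.0;
};

int main()
{
    TemperatureChannel ch;
    ch.set_limits(10.0, 40.0);
    std::cout << std::boolalpha << ch.in_range(25.0) << "\n";   // prints: true
    return 0;
}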

Link to comment

QUOTE(Jim Kring @ Aug 24 2007, 07:18 PM)

That gets me thinking... How about marketing LVOOP to new users as "private cluster libraries"?

Actually I hate the fact that many things in the context of LabVIEW have names that differ from "industry standards". I think it actually slows down the learning process instead of speeding it up, as it's harder to tie the new concepts you hear to something you've previously heard of. I actually learned to code LabVIEW quite late compared to many of you, so the learning experience is still fresh in my memory :)

Link to comment

QUOTE(Jim Kring @ Aug 24 2007, 10:18 AM)

I agree with Tomi that it's not a good idea. People shouldn't have to relearn every concept under a new name just to find out that they already knew about it...

QUOTE(Val Brown @ Aug 24 2007, 01:11 AM)

It really doesn't make sense for NI to alienate and disenfranchise its long-standing, advanced programming community that has "come up" using dataflow. Why should dataflow-based programming be precluded by enforcing classes?

Val, have you even looked at LabVOOP? It's Dataflow! It's Objects! It's unique! It's, well, LabVOOP.

I'm pretty sure LabVOOP did more to "alienate" and "disenfranchise" the users most familiar with OOP than any other group. Getting them to rally around it has obviously been a problem...

Link to comment

QUOTE(Guillaume Lessard @ Aug 24 2007, 12:57 PM)

Yes, but part of the (perceived) problem with LVOOP is that people who (think they) are familiar with OO see it and exclaim, "But that's not OOP! I want it by reference!" If LVOOP as designed doesn't "look enough like" OOP the way lots of developers expect to see it, isn't that really just a semantic problem?

QUOTE(Guillaume Lessard @ Aug 24 2007, 12:57 PM)

Val, have you even looked at LabVOOP? It's Dataflow! It's Objects! It's unique! It's, well, LabVOOP.

It's all those things, but at its most basic: it's Clusters. Same old same old, with a little of the new.

QUOTE(Guillaume Lessard @ Aug 24 2007, 12:57 PM)

I'm pretty sure LabVOOP did more to "alienate" and "disenfranchise" the users most familiar with OOP than any other group. Getting them to rally around it has obviously been a problem...

The real problem, I think, is that there are 1000 slightly different, and conflicting, ideas about exactly what it means for something to be OOP. For some people, a critical component of an OOP implementation is that it's by-reference. LVOOP isn't (natively), so for those people the entire idea never gets off square one. But that's throwing the baby out with the bathwater.

I guess everyone is entitled to their personal crusades (Dog knows I have mine :angry: ), but after a year of LVOOP, I like it. I have to go back to LV711 on occasion, and it's jarring because my by-value objects are missing.
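For readers new to the by-value/by-reference distinction being argued here, the sketch below (C++ as a stand-in for G wires; the names are invented) shows the two behaviours side by side: branching a by-value wire copies the data, while a reference makes both branches see the same object.

// By-value vs. by-reference, sketched in C++ (illustrative only).
#include <memory>
#include <iostream>

struct Counter { int count = 0; };

int main()
{
    // By-value, like a native LabVIEW class on a branched wire:
    Counter a;
    Counter b = a;        // independent copy
    b.count = 5;          // 'a' is unaffected

    // By-reference, like the GOOP-style reference toolkits:
    auto p = std::make_shared<Counter>();
    auto q = p;           // both handles point at the same object
    q->count = 5;         // visible through 'p' as well

    std::cout << a.count << " " << p->count << "\n";   // prints: 0 5
    return 0;
}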

Link to comment

QUOTE(Tomi Maila @ Aug 24 2007, 05:05 PM)

However I don't think LVOOP should actually be called OOP at all. This is the part where I think NI made a wrong choice. The word object holds for LVOOP but the phrase OOP doesn't, at least not completely. I guess if NI had chosen a different term such as object-flow programming (http://expressionflow.com/2007/04/27/object-flow-programming-merging-dataflow-and-object-oriented-programming/), it would have been easier for LabVIEW users to consider it somewhat different from OOP. There would not have been all those "where are my by-reference objects" claims. People would have been generally happy that something new was introduced into LabVIEW. But selecting a word that already had a very strong meaning that didn't exactly coincide with what LVOOP provides caused, and still causes, a lot of problems.

I disagree completely. We don't need to introduce new terms for things that are adequately described by existing ones. LVOOP is object-oriented programming. If people are assuming by-reference behavior, we need to fix their assumption, not introduce new terms. If anything, I think the term LVOOP needs to go away and be replaced with just plain OOP. LVOOP smacks to me of an attempt to brand a preexisting concept with the LabVIEW name. We don't talk about CppOOP or JavOOP, so why should OOP in LabVIEW be any different?

Link to comment

QUOTE(Aristos Queue @ Aug 24 2007, 08:38 AM)

*chuckle* I'm sorry... reading this (and the other posts) I can't help thinking of Monty Python and the Holy Grail. "Help! Help! I'm being repressed!"

....

Tangent A: C++ does not compel the use of classes. You can backslide into C anytime you want. The C++ compiler accepts all C syntax.

Tangent B: G as a pretty IDE for C++??? What a HORRIBLE vision!! Have you *seen* C++? It is more of a hack than a language. Like Darth Vader, C++ "is more machine than man now." My hat goes off to the hackers who designed it... there are amazing amazing aspects to it. But for arcane syntax, it wins over just about every language I've ever seen. G should not be a pretty IDE for any of the traditional programming languages. What it should be is a pretty IDE for expressing human concepts and needs to the CPU of the machine, in the most elegant, efficient and intelligible way possible... which is why you'll eventually _want_ classes.

Wow, the "I'm being repressed" refrain is what I was hearing from the "we NEED by-reference" OOP crowd. I'm all for co-existence -- I was responding to the SPECIFIC comment that NI might COMPEL classes.

Re: Tangent A: Of course not, and that's part of my point in re: the above as well, but I wouldn't call USING C -- ANSI C -- "backsliding". I might actually describe it more like "returning to the source...", esp. in re: C++.

Re: Tangent B: Yes, my point exactly -- it's a TERRIBLE, horrific image that G might only be seen as an IDE for the "real code".

As for how much I might "eventually _want_ classes", I have to say: I don't really think so. But what I do think I want -- and already have access to (and have had for quite some time) -- is "Private Cluster Libraries". Now THAT makes direct sense, in terms of LabVIEW itself, for what "classes" are meant to do, at least as far as I understand them. "Classes" always seemed to me like precisely the wrong term -- class DEFINITIONS perhaps, but not just "Class".

One other footnote FWIW -- I've liked the internal, native to LV implementations -- ie the by-value implementations. But what I've also picked up in many of the discussions is that the by-reference adherents are really NOT happy. My other major point is that I really hope that NI doesn't try SO HARD to reach out to the "by reference" crowd that it loses touch with its core, historical community of users.

Link to comment

QUOTE(Aristos Queue @ Aug 22 2007, 11:16 PM)

If you have comments about the *overall* nature of LabVIEW classes, please post them here. Please don't post feature requests here. I'm interested mostly in getting a feel for adoption rates. I'd like to know how secure everyone feels doing professional development with LabVIEW classes. If there is a particular sticking point (other than you haven't upgraded at all yet) that is keeping you from developing with classes, I'd be interested in that.

I'm sorry to say I tried going head-first into LVOOP (while training in OOP) at its 8.2 release stage, which was (in hindsight) a bad combination of luck and timing.

Whatever...

I spent a good part of 3 months trying to sort it out (at my own risk) and finally concluded that I could only use LVOOP for protecting data in an elegant way. That is currently all the functionality I can manage (with any trust) and probably all I will attempt for the time being. I do have one data structure I am very proud of designing. I wanted to do much more with that one, but it is done now.

I am back to using Endevo by-reference objects and building many functioning classes in that toolkit. My main reasons for switching: 1) it was stable as a rock; 2) all the crashing, class-breaking and VI-breaking episodes ceased; 3) my work (value) began to increase linearly; and 4) it can be converted to an executable with no worries.

Like a lot of us here I cannot do much experimenting in LabVIEW. That is why this forum is so valuable to me. Nevertheless I really do think native classes are the way to go in LV. I too look forward to using LV classes when they are ready.

Link to comment

By the way, and just to add some fuel to the fire :D

If NI were to ask me (of course it doesn't, but if it did ;) ) what is more important for the further development of LabVIEW:

a) LVOOP

b) an Atmel AVR Module for LabVIEW, which allows the user to program an Atmel µC the way the FPGA Module does ...

I'd answer b! ... and again I have famous support :D


Link to comment

I am educated in computer science and I have tried to work with LabVOOP objects from the beginning. Those who have followed the OO-by-value-or-by-ref debate know my position. I think they are quite unintuitive if you've learned to see objects as self-sufficient things, items you can command and query, and which have their own knowledge of the part of reality that they represent. LabVOOP objects are not self-sufficient. You need to do work to store the state, and maybe lock that state if you are using the data from multiple places, which is prone to error. I know there are templates that implement that, but they always have disadvantages, like a huge number of subVIs, extra code in the method VIs, or wrapper VIs. This obfuscates the methods and makes the objects more difficult to use effectively. To summarize, this lack of self-sufficiency is my biggest problem with LV objects.

As I've stated before, I think the behaviour would not really be that different for beginners if the wires were references; beginners would not notice. But for an advanced user the by-ref behaviour is just so much more valuable. I really miss that. And only by-ref is ready for the future, with distributed objects anywhere. I don't care where my object is physically located, I just want to use it.

I know there has been a tremendous effort to get the object system working. I understand how much work it must have been, and it must have been hard on the crew to have to delay the release of a new feature yet another time. But, despite the comments I've made above, what you now have is really quite an achievement. So many obstacles have been overcome that no one thought, in the beginning, would even be present. Now place the final spoke and make it by-ref. That unleashes the unrivaled potential that OOP has.

Joris
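As an aside for readers following Joris's point, a rough sketch of the kind of "self-sufficient" object he describes might look like the following in a text language (C++ here; the class is hypothetical and not any shipping toolkit): the state lives inside the object and the object serializes access itself, so parallel callers do not have to manage storage or locking.

// Sketch of a "self-sufficient" by-reference object: it owns its state and
// its own lock, so callers in parallel loops need no extra plumbing.
#include <mutex>
#include <iostream>

class Motor {
public:
    void set_target(double rpm) {
        std::lock_guard<std::mutex> guard(lock_);   // the object locks its own state
        target_rpm_ = rpm;
    }
    double target() const {
        std::lock_guard<std::mutex> guard(lock_);
        return target_rpm_;
    }
private:
    mutable std::mutex lock_;
    double target_rpm_ = 0.0;
};

int main()
{
    Motor m;
    m.set_target(1500.0);
    std::cout << m.target() << " rpm\n";
    return 0;
}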

Link to comment

See. I think you've really stated the issue very clearly:

QUOTE(robijn @ Aug 27 2007, 10:28 AM)

I am educated in computer science and I have tried to work with LabVOOP objects from the beginning. Those who have followed the OO-by-value-or-by-ref debate know my position. I think they are quite unintuitive if you've learned to see objects as self-sufficient things...

Essentially what is being asked -- to say it in this precise way -- is to make LV work like OO does in Java and C++. This means basically making LV an IDE for OO as implemented in Java and C++.

Now I have no problem with this KIND of goal -- as long as it doesn't become mandatory in ANY WAY. I think co-existence is fine, but dominance, or demand for the "pure" OO implementation AS IF it were inherently superior, doesn't really jibe with reality and it especially doesn't jibe with LV and its rich history of dataflow programming.

Link to comment

QUOTE(Val Brown @ Aug 27 2007, 08:49 PM)

It is a proved concept, long discussed in the computer science world. OK, that world did not have parallel programming natively (excellently presented at NIWeek this year!), but for the rest most programming concepts are shared between LabVIEW, C, Java, Pascal and many other languages -- fortunately not pointers :) . In a way dataflow is also present in those languages, but only implicitly, not explicitly as in LabVIEW. LabVIEW is just a programming language, like many others, with advantages in certain areas. And I like its advantages a lot. Like editing a single VI and running/testing it right away. I don't want to have to write a whole program just to test a simple function. Actually this is a point that has not yet been filled in for LabVOOP; it is not rapid development yet. But I have no doubt that this is high on the developers' list.

QUOTE(Val Brown @ Aug 27 2007, 08:49 PM)

Now I have no problem with this KIND of goal -- as long as it doesn't become mandatory in ANY WAY. I think co-existence is fine, but dominance, or demand for the "pure" OO implementation AS IF it were inherently superior, doesn't really jibe with reality and it especially doesn't jibe with LV and its rich history of dataflow programming.

I completely agree with you. I have always disliked the way objects are presented as "the solution for everything". I think you should use every feature for the things it is best at. In many places objects should not be used, if you ask me. There are many places where they are very convenient, but also many places where they add nothing and rather distract from the simple things that are actually being performed.

Joris

Link to comment

Joris:

I think we agree on a number of points but also disagree on a few, or at least draw somewhat different conclusions from points on which we disagree.

Yes, OO is, as you put it, "a proved concept" in CS but, as you also point out, parallelism is a real problem for "traditional CS". It not only is NOT a problem for LV; rather, LV has been developed from the ground up with the seeds of true parallelism built right in. This is part of the reason that dataflow has been so fundamental to LV -- the way it is implemented in LV allows not only for parallelism but also for determinism, and there are multiple ways to implement these possibilities.

A big concern I have for NI is that the desire to "bring in" the traditional OO-based CS-educated groups may seduce NI -- or others! -- into simply "leaving out" dataflow and the kinds of by-value OO that are -- and have been! -- possible for quite a while in LV.

Adding in OO -- including by-reference -- is a wonderful idea. Implying that such an "addon" is the always and best way to program -- to make it into an implicit or explicit "best practice" -- is to do a profound disservice to the community of LV programmers, IMO.

I'm glad to hear you also agree that, while OO can be very useful in many instances, it is not only not the best approach in every situation, it is actually a less-than-ideal approach in a number of situations. That message needs to be posted alongside the efforts of the many dedicated and highly talented folks who have done so much to implement OO in LV, and who continue to refine and extend its capabilities.

Link to comment

QUOTE(Justin Goeres @ Aug 27 2007, 12:43 PM)

Do you mean that NI presents LVOOP this way, or that OO in general (in other contexts/languages) is presented this way?

I, for one, don't think NI is guilty of that.

No, I don't think NI is guilty of that YET, but my concern was provoked a bit recently by a thread here on LAVA that implied that NI might COMPEL the use of classes at some point in the future.

To my way of thinking that would be an incredibly stupid and misguided thing to do.

But my sense is that there are people "out there" who do present "OO uber alles"...

Link to comment
  • 2 months later...

QUOTE(Val Brown @ Aug 27 2007, 09:53 PM)

Yes but it is not an uncommon sentiment, usually stated in at least somewhat different language.

Its phrasing unfortunately refers to a piece of history which is disliked (pretty unanimously). It's quite insensitive to citizens of the country in question to bring this kind of thing into a post with the innuendo that it's representative of a bad thing.

This aside:

I've recently implemented a set of LVOOP VIs for an instrument family. I initially didn't get it at all, and had previously been a sceptic of LVOOP (indeed of OOP in general). But the penny has dropped. I find it an elegant way to achieve things that were previously very difficult to achieve. I now have the ability to address instruments with significantly different hardware implementations / interfaces / protocols as a general family of units bound by their base class definition.

It also makes it MUCH easier to add new devices in future if need be.
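A rough text-language sketch of the pattern Shane describes (C++ here; the instrument names are invented): one abstract family interface, with each physical device supplying its own override, so adding a new instrument means adding one more subclass.

// One "family" base class; each device overrides the behaviour it needs.
#include <memory>
#include <vector>
#include <iostream>

class Instrument {                        // the base-class definition that binds the family
public:
    virtual ~Instrument() = default;
    virtual double read() = 0;            // dynamic dispatch selects the right driver
};

class SerialDMM : public Instrument {
public:
    double read() override { return 1.23; }   // stand-in for a serial query
};

class UsbScope : public Instrument {
public:
    double read() override { return 4.56; }   // stand-in for a USB driver call
};

int main()
{
    std::vector<std::unique_ptr<Instrument>> rig;
    rig.push_back(std::make_unique<SerialDMM>());
    rig.push_back(std::make_unique<UsbScope>());
    for (const auto& inst : rig)
        std::cout << inst->read() << "\n";     // same call, device-specific behaviour
    return 0;
}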

I don't quite get the discussion over by-reference vs. by-value. I understand the differences, but I personally prefer the by-value concept because it's closer to what LV does in other instances.

I like LVOOP, and I'll be looking for ways to use it in future too, but by no means to the exclusion of "standard" LabVIEW.

Shane.

Link to comment
  • 4 weeks later...

Looking at the responses to the poll and the comments to date, the results are interesting. The first observation is that there is no question on why those that have adopted LV classes use them.

Between Aug 2006 and Aug 2007, have you used LabVIEW classes? 80% of the respondents have tried classes in some form.

If you have NOT used LabVIEW classes, why? Even though 20% of the respondents said they have not used LV classes, 54% still gave reasons why they have not used classes. Grouping these responses into similar categories gives the following breakdown of the 54%:

  • Training issues (training or learning) 25%
  • Application issues (stability, upgrading, targets) 17%
  • Other (by-ref, boycott, or other) 12%

Do you plan significant development using LV classes within the next year? Even with the issues described, over 60% still want to use classes in the near future.

Analysis of the results.

Between Aug 2006 and Aug 2007, have you used LabVIEW classes? Question: Do the respondents represent a normal population of the LabVIEW community? Or, another way to approach this, have 80% of all LabVIEW users throughout the world tried LabVIEW classes? The simple answer to this question is no. Question: Do the respondents represent a normal population within the certified developer community? Maybe, although this is still a little high.

If you have NOT used LabVIEW classes, why? Of the 54% that gave reasons why not, it appears that 42% would use classes as implemented if they knew how to use them and felt that LabVIEW classes were mature enough.

Do you plan significant development using LV classes within the next year? There is still a strong desire in this population to support LV classes, but it is unclear from these results whether the non-certified user would use or understand these tools.

Personally, I am beginning to understand the class mentality from an abstract perspective (i.e. a cow is an instance of a four-legged mammal). The basic concepts of utilizing private & public functions, dynamic dispatching, overrides, inheritance, etc. are powerful tools that can support re-use, data hiding, expandability, etc. These concepts, if not applied correctly, can also result in project overruns, debugging errors, confusion, etc. I personally have not found enough examples or material specific to the implementation of LV classes to significantly embrace this architecture in applications other than personal experimental purposes. A tool in the toolbox should either speed the development process or allow the ability to accomplish a task that otherwise could not be done. I have yet to find either of these conditions met by LV classes.

It appears that in order for native LV classes to mature, the following is needed:

  1. NI needs to continue to support bug fixes & new features (which appears they are doing)
  2. New training courses, added content to existing courses (maybe the advanced LabVIEW course), and example code are needed, similar to the State Machine and Producer/Consumer course material used for the CLD.

Link to comment
