Topic about LabVIEW just started on slashdot.org


QUOTE (dblk22vball @ Jan 30 2009, 11:28 AM)

I chimed in a bit for LV. People still want to call LabVIEW a language for kindergarten programmers. BOOO!!

I chimed in as well. The problem is that LV does make it "too easy" to do REAL programming without having a CS background. That generates jealousy and anger, especially from those who've invested years in learning the arcana of overloading functions from some obscure lib somewhere. The buggy whip manufacturers got really upset by automobiles...


QUOTE (Val Brown @ Jan 30 2009, 09:17 PM)

I chimed in as well. The problem is that LV does make it "too easy" to do REAL programming without having a CS background. That generates jealousy and anger, especially from those who've invested years in learning the arcana of overloading functions from some obscure lib somewhere. The buggy whip manufacturers got really upset by automobiles...

I chimed in too.

Man I really wish NI would do something about this "easy to use" and "no programming required" marketing guff. I mean WTF?

It's hardly surprising these people have these opinions; they're being forced down their throats by NI....

Shane.


QUOTE (shoneill @ Jan 30 2009, 12:44 PM)

I chimed in too.

Man I really wish NI would do something about this "easy to use" and "no programming required" marketing guff. I mean WTF?

It's hardly surprising these people have these opinions; they're being forced down their throats by NI....

Shane.

Shane,

I beg to differ. LV is easy to use, and no background in CS is needed -- at least to get SOMETHING running, and sometimes even to get quite serviceable code. Try to do THAT with C++...

What part of those phrases disturbs you? Is it the implication that one CAN potentially do more with less CS background or is it something else?


QUOTE (dblk22vball @ Jan 30 2009, 02:28 PM)

I chimed in a bit for LV. People still want to call LabVIEW a language for kindergarten programmers. BOOO!!

Just remind everyone that CERN runs the Large Hadron Collider with LV. If LV is good enough to create a black hole or a Higgs boson, it's good enough for me. :thumbup:


My take on this is that LV has two natures. It is more accessible than most other languages for getting up and running and getting something useful done with the investment of one or two afternoons.

And yet there is a richness and elegance in the language and its advanced features for accomplishing some truly amazing things. In other words, it's a totally awesome programming environment (both the language and the execution/task scheduler).

The problem is that not many people care that both of those are true, including the other posters on /.

I'm not sure how to make an argument that would affect anyone's opinion about it.

Even NI Marketing knows that this dual nature of LabVIEW is mostly just a problem for them.

QUOTE (jdunham @ Jan 30 2009, 03:22 PM)

Even NI Marketing knows that this dual nature of LabVIEW is mostly just a problem for them.

Except that it hasn't been a problem for most of our history. Our bread and butter comes from the zero-experience-with-CS engineers and scientists. Those of you who are skilled programmers and use LabVIEW as a "fully armed and operational battle station" are a much smaller group, though you're growing. But because the bulk of users are in that first group, we do tend to spin LabVIEW for only that first usage. Heretofore it has paid the bills well and we've spread in more advanced contexts by word of mouth and the lone LV user who smuggles a copy of LV into his/her circle of C++-using co-workers. It certainly will be a problem in the future if the "large, full applications" group starts eclipsing the "one while loop around three Express VIs" group, but my understanding is we're still a ways away from that inflection point.


QUOTE (Aristos Queue @ Jan 30 2009, 01:34 PM)

Except that it hasn't been a problem for most of our history. Our bread and butter comes from the zero-experience-with-CS engineers and scientists. Those of you who are skilled programmers and use LabVIEW as a "fully armed and operational battle station" are a much smaller group, though you're growing. But because the bulk of users are in that first group, we do tend to spin LabVIEW for only that first usage. Heretofore it has paid the bills well and we've spread in more advanced contexts by word of mouth and the lone LV user who smuggles a copy of LV into his/her circle of C++-using co-workers. It certainly will be a problem in the future if the "large, full applications" group starts eclipsing the "one while loop around three Express VIs" group, but my understanding is we're still a ways away from that inflection point.

How do you see that being a problem? FWIW, it seems to me that keeping both "groups" of LV users active and happy is what counts, and that's pretty "doable". At a certain level, other programmers likely won't get it for quite a while -- too much investment in "Betamax and 8-tracks" -- but perhaps that's just my opinion.


QUOTE (Aristos Queue @ Jan 30 2009, 03:34 PM)

Except that it hasn't been a problem for most of our history. Our bread and butter comes from the zero-experience-with-CS engineers and scientists. Those of you who are skilled programmers and use LabVIEW as a "fully armed and operational battle station" are a much smaller group, though you're growing. But because the bulk of users are in that first group, we do tend to spin LabVIEW for only that first usage. Heretofore it has paid the bills well and we've spread in more advanced contexts by word of mouth and the lone LV user who smuggles a copy of LV into his/her circle of C++-using co-workers. It certainly will be a problem in the future if the "large, full applications" group starts eclipsing the "one while loop around three Express VIs" group, but my understanding is we're still a ways away from that inflection point.

I have often thought that if NI were to market LabVIEW more as a pure development language and environment, that "much smaller group" would grow much faster. I have a CS degree and I use LabVIEW to develop large-scale applications. Most casual users who simply learn LabVIEW on their own would not be able to produce the kinds of designs I put together. There is a big difference between writing a simple program and designing a large-scale application. Many aspects of G make designing systems very easy for a programmer: multitasking, parallelism, synchronization via queues, and other features that greatly simplify the implementation of complex designs. Similar architectures in traditional programming languages are not as easy to implement, since the programmer is required to do more of the work himself. G encapsulates many things for the programmer. I firmly believe that if NI did a better job promoting LabVIEW as a pure development environment, its adoption would be much faster. Keeping it a "secret" does nothing to promote its growth.
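For readers coming from text languages, the queue-plus-parallel-loops pattern described above can be sketched roughly in Python. This is an illustrative analogy only, not G code, and none of these names come from the thread: two independent loops run concurrently, connected by a queue, much like two parallel while loops wired together through LabVIEW's queue functions.

```python
import queue
import threading

def producer(q, n):
    # Stands in for an acquisition loop enqueuing results.
    for i in range(n):
        q.put(i)
    q.put(None)  # sentinel: tell the consumer loop to stop

def consumer(q, results):
    # Stands in for a processing loop dequeuing and handling data.
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)  # placeholder "processing"

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(q, 5))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 2, 4, 6, 8]
```

In G, the thread creation, scheduling, and teardown made explicit here fall out of the dataflow semantics: drop two loops on the diagram and they run in parallel.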


QUOTE (Aristos Queue @ Jan 30 2009, 04:45 PM)

I have thought so too, but IANAM*.

*I Am Not A Marketer

Perhaps yes, perhaps no. I think the two biggest hurdles for LV gaining such "general" acceptance as a REAL programming environment are its "high" cost and its undercutting of how much traditional programmers need to do in architecting large, complex apps.

No, I'm not saying that LV costs too much; I'm saying the perception is that the cost is high. I've heard so-called "real" programmers complain that they can buy x, y, or z compiler or IDE for much, much less. Of course, those analyses leave out the costs of the additional libraries, reference materials, courses to attend, and overall development time (and cost). Those other aspects are simply seen as "the costs of doing business" and so seem not to "count".

The second issue is IMO the more profound one as it goes to the whole issue of identity and that issue drives a lot of what most humans do. The idea that one doesn't have to be a "guru" to do high-level work does seem to be offensive to a number of the more traditional text-based programmers I've known over the years.

I think a side note to some of that second issue is the way some live out their commitment to the open source initiative. Whether or not people know about OpenG and its wonderful efforts, there is a perception that LV is a "closed shop", and some of the more traditional text-based folks that I know really don't like that at all. I have always found that interesting in terms of the commitment that some of those same people have to MS products, but I guess "that's different", as Emily Litella would have said.


QUOTE (PaulG. @ Jan 31 2009, 07:18 AM)

That's a bit of a stretch; LabVIEW controls the collimators.

http://sine.ni.com/cs/app/doc/p/id/cs-10795

I would be sure it is used for other things as well, but "runs" is probably not quite accurate.

I would have tried to reply on Slashdot, but the responses I would have gotten are:

"Imagine a cluster of those"

"In Russia LabVIEW programs you"

"yes but will it run LabVIEW"

etc..


QUOTE (Aristos Queue @ Jan 30 2009, 06:45 PM)

I have thought so too, but IANAM*.

*I Am Not A Marketer

It seems like worldwide adoption of LabVIEW as a general purpose language would mean nothing but good things for NI.

But then again, I'm not a marketer either...

-D


The original question was why there aren't any open source alternatives to LabVIEW. Please correct me if I am wrong, but as far as I know NI-DAQmx still supports .NET, VB, Visual C/C++, C#, and even Matlab straight out of the box (with the DAQ toolbox in Matlab). Drivers for using C in Linux are also supported. I know many laboratory personnel who have used BASIC with NI-DAQ (the old version) with NI hardware for decades.

What would an open source initiative add to the world of DAQ? Data acquisition is inherently hardware based, and always will be. I have always thought that the real strength of NI is the NI-DAQ drivers, enabling the use of NI hardware for just about everything possible. LabVIEW is just an add-on enabling easy or fast use of NI hardware.

LabVIEW has grown way beyond being just easy access to NI-DAQ, but as a general programming language it simply falls short compared with text-based programming, with few exceptions. It is good for two things, data acquisition and user interfaces, but it is utterly closed: you can't even look at the source without installing LabVIEW.

An open source LabVIEW is certainly doable from a DAQ point of view; all the drivers are there. The "perfect LabVIEW", IMHO, would be a native by-ref object-based version with seamless integration with C++, more as a user interface builder for C++. But then again, this is more or less exactly what Qt is: it is open source, works on all operating systems, and there should be no problems using NI-DAQ with it :rolleyes:


QUOTE (bsvingen @ Jan 31 2009, 12:35 AM)

The original question was why there aren't any open source alternatives to LabVIEW. Please correct me if I am wrong, but as far as I know NI-DAQmx still supports .NET, VB, Visual C/C++, C#, and even Matlab straight out of the box (with the DAQ toolbox in Matlab). Drivers for using C in Linux are also supported. I know many laboratory personnel who have used BASIC with NI-DAQ (the old version) with NI hardware for decades.

What would an open source initiative add to the world of DAQ? Data acquisition is inherently hardware based, and always will be. I have always thought that the real strength of NI is the NI-DAQ drivers, enabling the use of NI hardware for just about everything possible. LabVIEW is just an add-on enabling easy or fast use of NI hardware.

LabVIEW has grown way beyond being just easy access to NI-DAQ, but as a general programming language it simply falls short compared with text-based programming, with few exceptions. It is good for two things, data acquisition and user interfaces, but it is utterly closed: you can't even look at the source without installing LabVIEW.

An open source LabVIEW is certainly doable from a DAQ point of view; all the drivers are there. The "perfect LabVIEW", IMHO, would be a native by-ref object-based version with seamless integration with C++, more as a user interface builder for C++. But then again, this is more or less exactly what Qt is: it is open source, works on all operating systems, and there should be no problems using NI-DAQ with it :rolleyes:

In other words, you're essentially a C++ guy, committed to open source, who just wants a graphical interface. FWIW, I think you're missing the point of what LV is and does but then again everyone is entitled to their own opinions.


QUOTE (bsvingen @ Jan 31 2009, 01:31 AM)

I am in fact essentially a LabVIEW guy :) and I tried to answer the original question of why there are no open source alternatives to LabVIEW. The answer is obvious: the NI-DAQ drivers can be used out of the box for C, C++, .NET, C#, VB, and Matlab (and probably many more with slight modifications, wrappers, etc.), so the need for yet another language really isn't there. And when you start thinking about it, an open source version would have to use those NI-DAQ drivers; there are no other options. You would also need an open source high performance window system, such as Qt (http://www.qtsoftware.com/). The point is, Qt is here, it works on all operating systems (even smartphones soon) in both C++ and Java, NI-DAQ is here and works on all operating systems, everything is here, but what purpose would it serve? You would be better off just using Qt as is, with the TONS of scientific software already available as open source, with full readable source code.

We even have an old Mac in our lab with a fully functioning LV 1.0 on it :D What point am I missing exactly?

You make my point. "Readable" means text-based, and that ain't LV, although, as you well know, LV can make use of text-based language constructs.


Wow, quite a few posts since I left....

My personal take is that NI needs to address both "camps" of programmers with LV. Power programmers are the ones who can show off what's REALLY possible with LV, whereas the ease-of-use is a double-edged sword. I can appreciate the "bread and butter" aspect of things. No company is stupid enough to isolate themselves from their main customer base. But looking at longer-term adoption of LV, making people aware from the get-go that there's a lot more to LV than "ease of use" or "no programming experience required" will simply put those people in a different frame of mind.

I think when many people take their first steps in LV, the "easy" thing lets them easily forget the two words "software engineering" and anything associated with them. Even the idea that there might be more to it might push more people to investigate the boundaries of LV instead of dismissing it as a "nice toy". By all means market the "ease of use", but please just spare a few words for us poor misunderstood "power programmers". I don't know HOW to do it, but hey, that's what marketing people get paid for, right?

Shane.


QUOTE (Aristos Queue @ Jan 30 2009, 01:34 PM)

Except that it hasn't been a problem for most of our history. Our bread and butter comes from the zero-experience-with-CS engineers and scientists. Those of you who are skilled programmers and use LabVIEW as a "fully armed and operational battle station" are a much smaller group, though you're growing. But because the bulk of users are in that first group, we do tend to spin LabVIEW for only that first usage. Heretofore it has paid the bills well and we've spread in more advanced contexts by word of mouth and the lone LV user who smuggles a copy of LV into his/her circle of C++-using co-workers. It certainly will be a problem in the future if the "large, full applications" group starts eclipsing the "one while loop around three Express VIs" group, but my understanding is we're still a ways away from that inflection point.

The problem is that your loyal users are all annoyed that the software development community won't take LabVIEW seriously, and any real projects in LabVIEW are under constant threat of conversion to other languages by order of management. So yeah, most of us agree that the current marketing focus on the noobs keeps the bills (and AQ's salary) paid, and that's generally good, but it does create problems. I wasn't trying to say it was a fatal problem. Then again, it could be the biggest problem, because it has taken a successful and brilliant product and doomed it to a minor niche in the world of computing (sorry for being overly dramatic, you can start flaming me now!)

QUOTE (bsvingen @ Jan 31 2009, 12:35 AM)

I have always thought that the real strength of NI is the NI-DAQ drivers, enabling the use of NI hardware for just about everything possible. LabVIEW is just an add-on enabling easy or fast use of NI hardware.

Dude, the real strength of LabVIEW is the intuitive nature of dataflow programming and the awesome task scheduling engine that allows multithreaded program execution with hardly any management code required. It's an excellent signal processing and control environment. For me it's so much more than an add-on for manipulating NI hardware.


QUOTE

... won't take LabVIEW seriously and any real projects in LabVIEW are under constant threat of conversion ....

What's certainly not helping, in my opinion, is the enormous complexity of turning a moderately complicated LV project into executable code.

(thinking of the LVOOP <--> app builder issues)


QUOTE (Dirk J. @ Feb 2 2009, 12:31 AM)

What's certainly not helping, in my opinion, is the enormous complexity of turning a moderately complicated LV project into executable code.

(thinking of the LVOOP <--> app builder issues)

Again, I think several of you guys are making several of my point(s). It SEEMS difficult to build with LV because you're benchmarking what/how LV works and does its builds against text-based programming IDEs, which have ENORMOUS amounts of "hacks" in them to make it all "seem easy". What you're used to using seems easy almost regardless of how easy the alternative is. Case in point within the LV environment: I don't yet use the wonderful option that Darren demonstrated so well at NI Week. Having to move to the keyboard while in LV is SO FOREIGN to me that it seems slower than just using my trackpad and dropping down the palettes. Now which way is really "easier"? If you're on a desktop system and need a mouse (or a laptop and USE a mouse), then moving to the keyboard to type a bit doesn't seem like a big deal; but if you're NOT using a mouse at all, then it really can become a far bigger deal. The feature is great and I'll get into using it "at some point", but it's not high on my list while I have the development and tech support demands on me that I currently do.

Yes, NI could do some things differently but I'm not so sure that it's just "noobs" who like the current orientation of the company.


QUOTE (jdunham @ Feb 2 2009, 06:16 AM)

Dude, the real strength of LabVIEW is the intuitive nature of dataflow programming and the awesome task scheduling engine that allows multithreaded program execution with hardly any management code required. It's an excellent signal processing and control environment. For me it's so much more than an add-on for manipulating NI hardware.

Yes, but everything about LabVIEW is tuned towards dataflow. This is intuitive for datalogging, where data is in fact flowing or streaming, but counterintuitive for just about everything else, where no data is flowing, or at least you would not naturally think in those terms. Classic GOOP is more intuitive in most situations, because the data has now stopped flowing; it is only changing based on the reference that is flowing. LVOOP is somewhat more tuned towards dataflow than ordinary dataflow, because in addition to the data flowing, the functions and the knowledge of what can be done to that particular data are also flowing (dynamic dispatch).

Another thing that is counterintuitive and IMO counterproductive for everything except streaming data is the left-to-right paradigm. The natural thing to do when you can program in two dimensions is to fully use those dimensions to build abstract programming topologies, or topologies representing real-world topologies where data and messages flow in all directions. This also requires objects with persistent data storage in the form of GOOP (or, more precisely, reentrant LV2-style globals) rather than LVOOP. Complex programs can be built easily using this "paradigm", because the topology inherent within all software is simply laid out on the screen. This can be done today in LV, but it is cluttered and confusing because of the left-to-right paradigm, and becomes utterly counterintuitive for someone thinking only in dataflow.
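The "reentrant LV2-style globals" mentioned above (functional global variables: a while loop holding state in an uninitialized shift register) can be approximated in a text language. This Python closure is a hypothetical illustration only, not code from the thread: state persists inside one routine between calls, and every access goes through that routine's action commands.

```python
def make_counter():
    # state plays the role of the uninitialized shift register:
    # it persists between calls and is reachable only through action().
    state = {"count": 0}
    def action(op, value=None):
        if op == "increment":
            state["count"] += value
        elif op == "read":
            return state["count"]
    return action

counter = make_counter()   # one independent instance of the "global"
counter("increment", 5)
counter("increment", 3)
print(counter("read"))  # 8
```

Each call to make_counter() yields an independent instance, which is roughly what "reentrant" buys you in the LV2-global pattern.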

LabVIEW is similar to FORTRAN. Both can be used to make any kind of program, but both are really only good at doing one single thing: datalogging and number crunching, respectively. As with FORTRAN, the reason for this is not inherent shortcomings in the language structure, but that both are tuned towards those specific tasks.


QUOTE (bsvingen @ Feb 2 2009, 04:39 PM)

Yes, but everything about LabVIEW is tuned towards dataflow. This is intuitive for datalogging, where data is in fact flowing or streaming, but counterintuitive for just about everything else, where no data is flowing, or at least you would not naturally think in those terms. Classic GOOP is more intuitive in most situations, because the data has now stopped flowing; it is only changing based on the reference that is flowing.

I think we can agree that a dataflow language is more useful when you have data flowing. I'm not sure we agree on much else though. I think by-reference objects in LabVIEW are less intuitive, even if they can be very useful.

QUOTE (bsvingen)

Another thing that is counterintuitive and IMO counterproductive for everything except streaming data is the left-to-right paradigm. The natural thing to do when you can program in two dimensions is to fully use those dimensions to build abstract programming topologies, or topologies representing real-world topologies where data and messages flow in all directions.

The natural thing to do when you have a bunch of cans of spray paint is to make a big muddy mess on the wall. I find the left-to-right paradigm helps me make sense of the dataflow; I can model the dataflow in my brain. Without the left-to-right, it would be much harder for me to create that model. Without that model I would find the code to be unmaintainable, unless LabVIEW had a totally different set of visual cues illustrating that omnidirectional flow.

QUOTE (bsvingen)

LabVIEW is similar to FORTRAN. Both can be used to make any kind of program, but both are really only good at doing one single thing: datalogging and number crunching, respectively. As with FORTRAN, the reason for this is not inherent shortcomings in the language structure, but that both are tuned towards those specific tasks.

All I can say is that I have written many VIs which do a lot more than datalogging and number crunching (though plenty of that too), and I think LV has been more of a help than a hindrance. Communications and feedback control come to mind right away. I have also had lots of success with data visualization.


QUOTE (bsvingen @ Feb 2 2009, 06:39 PM)

As jdunham stated, we can agree that LabVIEW is very useful when working with data flows, but to suggest that it is not as useful for anything else is a very simplistic view of application design. Any system can be designed with data flow as the primary design concern. Yourdon and DeMarco had several books on structured programming in the late 80's; their work had nothing to do with LabVIEW. All you need to do is look at the work on structured programming and you will see that its underlying principle was data flow. An extension of this (structured analysis) was to include events, which can be viewed as asynchronous data flow. LabVIEW just happens to be one of the languages that supports data flow directly. In general, data flow is not taught in most computer science or programming classes; they still teach people to view things as procedures and sequential programming techniques. However, once you learn to design systems with data flow, you naturally gravitate towards using LabVIEW because it is a natural fit. I have been working with LabVIEW since the mid 90's, and I can count on one hand how many times my applications were dealing with data capture. I use LabVIEW as a general purpose programming language because I see the value in designing systems using state machines and data flow (nothing to do with data logging). These designs tend to be more flexible and robust than viewing the system as a simple flow chart (sequential programming).
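The state-machine style of design mentioned above can be sketched in a text language; the states and events in this Python example are invented for illustration. Each iteration maps the pair (current state, event) to the next state, rather than stepping through a fixed sequential flow chart.

```python
def run_state_machine(events):
    # Transition table: (current state, event) -> next state.
    transitions = {
        ("idle", "start"): "running",
        ("running", "pause"): "idle",
        ("running", "stop"): "done",
        ("idle", "stop"): "done",
    }
    state = "idle"
    log = []
    for event in events:
        # Unknown (state, event) pairs leave the state unchanged.
        state = transitions.get((state, event), state)
        log.append(state)
        if state == "done":
            break
    return log

print(run_state_machine(["start", "pause", "start", "stop"]))
# ['running', 'idle', 'running', 'done']
```

In LabVIEW the equivalent is typically a while loop around a case structure, with the current state held in a shift register.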

