Michael Aivaliotis

Is LabVIEW a Functional or Object-Oriented language?


Just watched this presentation by Richard Feldman called "Why Isn't Functional Programming the Norm?"

As I was watching, many ideas came to mind about how LabVIEW stacks up in the various areas of the presentation, and I wanted to hear what the community thinks. We can all agree that LabVIEW is NOT a popular language (as defined in the video) and will probably never end up in a presentation like this one (I'd like that to change, though). However, the FP vs. OO discussion is taking place in our community right now: I know people who do not use OO in LabVIEW and many who swear by it. So this seems a fitting discussion. The core question of the presentation, as Richard puts it, is "Do OO features make a language popular?" His argument is NO.

I don't think OO by itself will make LabVIEW popular, but where does LabVIEW end up on the reasons for popularity as presented? Or better yet, what can make LabVIEW more popular? Is that something that anyone should care about?

Quote

Is LabVIEW a Functional or Object-Oriented language?

It's a dataflow language with some functional and OO features. One of these is not like the others, and you'll notice "state" is never mentioned in the video.


I don't think functional and OO are contradictory, per se.  Rather, it's by-reference objects contained in other objects that are very non-functional.  By-value objects and LabVIEW dataflow seem quite functional to me.


OO is contradictory to functional as practiced by C#/Java/C++. Those languages insist on handling classes by pointer or reference (C++ can do by-value, but it isn't commonly used).

OO is compatible with functional when it behaves by value, as it does in LabVIEW.

But many functional languages consider OO to be a half step toward more functional features: dynamic dispatching is subsumed by pattern matching, for example.

 

On 10/16/2019 at 11:31 PM, Aristos Queue said:

OO is contradictory to functional as practiced by C#/Java/C++. Those languages insist on handling classes by pointer or reference (C++ can do by-value, but it isn't commonly used).

Unless you use the STL, which makes a lot of use of them. The big question is whether the STL buys you much or just causes you even more trouble. It's better than using the C++ template feature yourself, though. That is pure evil.

7 hours ago, Rolf Kalbermatter said:

It's better than using the C++ template feature yourself, though. That is pure evil.

In the wrong hands, perhaps it is some evil. I made extensive use of it to give the sets and maps in LV 2019 acceleration for specific inner data types, with some quite readable code (according to my reviewers).

I save "pure evil" for multiple inheritance and single-character variable names.

7 hours ago, Rolf Kalbermatter said:

The big question is whether the STL buys you much or just causes you even more trouble.

This is a question??? In 2005, maybe. At this point, if you aren't using the STL in your C++ code, I suggest you change languages because you aren't using C++ right. (Note that if you are weeping constantly and your hands bleed and your stomach ulcers bloom, those are good signs that you're using C++ correctly. Incorrect C++ use is generally associated with euphoria and a belief that you've found "an easy way to do it!" by avoiding some part of the standard template library.)
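A tiny, assumed illustration (not AQ's actual code) of what "using the STL right" means in practice: the hand-rolled sorting and summing loops that feel like "an easy way to do it" collapse into two library calls that are already debugged and state the intent directly.

```cpp
#include <algorithm>
#include <numeric>
#include <vector>

// Sort in place and return the sum -- two hand-rolled loops' worth of
// work expressed as two STL algorithm calls.
int sort_and_sum(std::vector<int>& v) {
    std::sort(v.begin(), v.end());                  // replaces a manual sort loop
    return std::accumulate(v.begin(), v.end(), 0);  // replaces a running-total loop
}
```

Given `std::vector<int> v{3, 1, 4, 1, 5, 9};`, calling `sort_and_sum(v)` leaves `v` sorted and returns 23.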

On 10/18/2019 at 11:17 AM, Aristos Queue said:

single-character variable names.

That brings me back to my time learning BASIC programming on my VIC-20 (in the early 80s).

1 hour ago, jcarmody said:

That brings me back to my time learning BASIC programming on my VIC-20 (in the early 80s).

Did this today (and any other day I write C).

for (int i=0; i < len; i++){

The big question is .. should it have been "a" instead of "i"? :D

Edited by ShaunR

7 hours ago, ShaunR said:

 


for (int i=0; i < len; i++){

The big question is .. should it have been "a" instead of "i"? :D

Sounds like a good idea if your goal is to get other programmers to hate you.


On 10/18/2019 at 5:17 PM, Aristos Queue said:

This is a question??? In 2005, maybe. At this point, if you aren't using the STL in your C++ code, I suggest you change languages because you aren't using C++ right. (Note that if you are weeping constantly and your hands bleed and your stomach ulcers bloom, those are good signs that you're using C++ correctly. Incorrect C++ use is generally associated with euphoria and a belief that you've found "an easy way to do it!" by avoiding some part of the standard template library.)

That's probably why I gave up on C++ years ago. If I have to program something low-level, I prefer plain old standard C. Sure, nobody will build software like LabVIEW in C anymore, but that is not my target anyway.

12 hours ago, ShaunR said:

I also once heard that whether you use "a" or "i" depends on whether you came from a mathematical or an engineering background.

I often just use "x".  Curious where that puts me...


If you grew up on Fortran you'd know that I implicitly defaults to INTEGER and A to REAL (undeclared names starting with I through N are INTEGER; the rest are REAL).

2 hours ago, crossrulz said:

I often just use "x".  Curious where that puts me...

It makes you a target for the zealots who insist on "a" or "i". 🙂

On 10/22/2019 at 12:29 PM, ensegre said:

If you grew up on Fortran you'd know that I implicitly defaults to INTEGER and A to REAL.

If you grew up on Fortran I would have hoped they had let you retire by now. No rest for the wicked?


16 hours ago, ShaunR said:

If you grew up on Fortran I would have hoped they had let you retire by now. No rest for the wicked?

Fortran 77 was still being taught in at least one college in 1993, on green-and-black and amber-and-black terminals. It was even a mandatory class for freshmen.


In 2014 I took a 1-credit Fortran course. I got a degree in Mechanical Engineering, so it actually ended up being the only programming course I ever took in college.


I think it was in the early '90s that I took a Fortran class.  People only a year ahead of me had used punch-cards; I'm glad I missed that boat.


I was using Fortran for scientific computing until the mid-2000s, and that was in no way odd in the numerical analysis community. I stuck to FORTRAN 77, though, when everyone else had already transitioned to Fortran 90. You do still find reputable numerical physics codes around in Fortran.

20 hours ago, ShaunR said:

If you grew up on Fortran I would have hoped they had let you retire by now. No rest for the wicked?

The time dilation near a massive, dense, nearly-impenetrable object (aka Fortran) keeps them forever young. It's also why fixing a bug takes so long from the perspective of those standing further away. The LIGO gravitational-wave detector had to filter out Fortran code submissions to detect black holes (both cause merge collisions).

My second internship involved Fortran. I know of what I speak. I was grateful for the experience... it set me on the path to ever higher-level languages!!!

Edited by Aristos Queue

On 10/24/2019 at 7:37 PM, ensegre said:

I was using Fortran for scientific computing until the mid-2000s, and that was in no way odd in the numerical analysis community. I stuck to FORTRAN 77, though, when everyone else had already transitioned to Fortran 90. You do still find reputable numerical physics codes around in Fortran.

Most Popular Programming Languages 1965 - 2019


So what (however that data was collected)?  Did I say my community was using it because it aimed at being popular?

1 hour ago, ensegre said:

So what (however the data is collected).  Did I say my community was using it because it aimed at being popular?

Nothing. It's just an interesting data presentation showing the rise and fall of languages over time. It's a shame LabVIEW isn't on there.

Quote

Is LabVIEW a Functional or Object-Oriented language?

It's a dataflow programming language that supports both functional and object-oriented programming paradigms. Like C++ but not confusing :P

On 10/15/2019 at 1:29 AM, Michael Aivaliotis said:

I don't think OO by itself will make LabVIEW popular, but where does LabVIEW end up on the reasons for popularity as presented?

LabVIEW will likely never be popular by the definition in this video, because it is not just a programming language but an ecosystem of hardware and software. It requires a lot of trust in NI and its partners. You'd have to compare it to other proprietary programming languages with a similar ecosystem for it to be "popular" in comparison.

On 10/15/2019 at 1:29 AM, Michael Aivaliotis said:

Or better yet, what can make LabVIEW more popular?

The first thing that comes to mind is interoperability. Calling external code from LabVIEW and vice versa still requires a decent amount of voodoo (see the SQLite Library or OpenG ZIP as prime examples). To my knowledge there is no plug-and-play solution for these kinds of things. This is where the second-best solution is often good enough.

On 10/15/2019 at 1:29 AM, Michael Aivaliotis said:

Is that something that anyone should care about?

NI is of course interested in making LabVIEW more popular to grow its business. As users we should be interested in making it more popular so that NI and the community can cope with ever-growing requirements and open up new (business) opportunities. At the same time there is also a risk of growing too fast. The more popular LabVIEW gets, the more LabVIEW is used for tasks it wasn't originally designed for. This will inevitably result in more features being added, which increases the complexity of the entire ecosystem. If this process is too fast, chances are that poor decisions lead to more complex solutions, which are more expensive for NI to implement and maintain in the future. At some point they have to rethink their strategy and make some breaking changes. I assume this is where NXG comes into play.

Is this good or bad? I don't know. It probably depends :lol:

 

On 10/27/2019 at 8:37 PM, LogMAN said:

The first thing that comes to mind is interoperability. Calling external code from LabVIEW and vice versa still requires a decent amount of voodoo (see the SQLite Library or OpenG ZIP as prime examples). To my knowledge there is no plug-and-play solution for these kinds of things. This is where the second-best solution is often good enough.

It's called writing bindings to non-native libraries and functionality. It's a standard problem in every programming environment. The perceived difficulty always has a direct relation to the distance between the programming paradigm of the calling environment and that of the callee. In C it is almost non-existent, since you have to bother with memory management, thread management, etc. no matter what you try to call. In higher-level languages with a managed environment, like LabVIEW or .NET, it seems a lot more complicated. It isn't, really, but the gap between what you normally do in the calling environment and what you must do to call such non-native entities is much bigger. And each environment has, of course, a few special subtleties. The one currently causing me a lot of extra work for the OpenG ZIP library is the fact that LabVIEW always has assumed, and still does, that STRING==BYTEARRAY and that encoding does not exist on a LabVIEW platform.

A ZIP file can contain encoding in the stored file names, and nowadays regularly does. So the strings returned as file names in an archive need to be treated with care. Except that when I then try to turn such a string into a LabVIEW path to create the file, all that care is wasted, as the file path will either alter the name to something else or possibly even attempt to create a file with invalid characters. So the solution is to replace the Open File, Create File and Create Directory functions, along with some others (like Delete File), with my own versions that can handle the paths properly. A great idea, except that LabVIEW does not document, and hence does not guarantee, how the underlying file system object is mapped into a file refnum. So in order to be safe here I also have to create the Read File, Write File, Close File, File Size and similar functions. All doable, but serious work.

I'm basically rewriting a considerable amount of the LabVIEW File Manager and Path Manager functionality.
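For the encoding half of this problem (not the refnum-mapping half), the textual-language equivalent is to decode the stored entry name as UTF-8 explicitly instead of treating it as a raw byte array. A minimal C++17 sketch; the function name is hypothetical:

```cpp
#include <filesystem>
#include <string>

// A ZIP entry name arrives as raw UTF-8 bytes. Treating those bytes as
// STRING==BYTEARRAY mangles non-ASCII names; decoding them explicitly
// preserves them in the platform's native path encoding.
std::filesystem::path from_zip_entry(const std::string& utf8_name) {
    // u8path interprets the bytes as UTF-8 and converts to the native
    // encoding (C++17; deprecated in C++20 in favour of std::u8string).
    return std::filesystem::u8path(utf8_name);
}
```

This only covers building a correct OS-level path from the archive's bytes; the refnum replumbing Rolf describes has no standard-library shortcut.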

Edited by Rolf Kalbermatter

