
Can I ride the LV/TS train to retirement?


Phillip Brooks

Recommended Posts

On a related note, GDevCon 4 just ended in Glasgow with a record number of attendees... 👍 (Maybe that is partly because the alternatives are not as good/many anymore, but let me be optimistic for a while...) If Emerson does not want to kill or starve LabVIEW, maybe there is a nice future beyond 10 years as well. Retirement age is steadily getting pushed upwards though, so personally it would be nice if things went on for another 25 years or, even better, allowed our kids to make a career working with G everywhere in 2050 😀

Link to comment
4 hours ago, Mads said:

On a related note, GDevCon 4 just ended in Glasgow with a record number of attendees... 👍 (Maybe that is partly because the alternatives are not as good/many anymore, but let me be optimistic for a while...) If Emerson does not want to kill or starve LabVIEW, maybe there is a nice future beyond 10 years as well. Retirement age is steadily getting pushed upwards though, so personally it would be nice if things went on for another 25 years or, even better, allowed our kids to make a career working with G everywhere in 2050 😀

What is a record number of attendees in metric units?

BTW, the program looks interesting. Will it ever get online and will code be shared?

Link to comment
5 hours ago, X___ said:

What is a record number of attendees in metric units?

That I don't know.

5 hours ago, X___ said:

BTW, the program looks interesting. Will it ever get online and will code be shared?

They've shared the videos of the first 3 editions, so it's reasonable to hope they'll do the same for the 4th: https://www.youtube.com/@GDevCon/playlists

Follow on LinkedIn or search on GitHub; some (most?) presenters publish what they present.

Edited by Antoine Chalons
Link to comment
11 hours ago, Mads said:

Maybe that is partly because the alternatives are not as good/many anymore, but let me be optimistic for a while...

I must add that "small" doesn't necessarily imply "bad." I have never experienced LabVIEW, or the LabVIEW-centric world of electronic testing, to be anything other than a niche. But yeah, that niche can be strong and thriving or weak and struggling.

To be honest, I don't know why I'm speculating here. I've been working for over 10 years, exclusively for two companies. I know very little about the world around me. There might be strong pockets of hardware companies with R&D labs equipped with NI equipment everywhere. I just haven't heard of them because they get overshadowed by the noise surrounding software, AI, machine learning, and other "hyped" technologies :D

Link to comment
15 hours ago, Phillip Brooks said:

I can limp along, but I'm unhappy arriving at work every day and I'm not sure how long I can continue.

That's the point where most of us switch to consultancy. For that switch, though, you need good personal relationships with customers and have to be happy with an irregular income.

Link to comment
On 9/21/2023 at 7:09 AM, ShaunR said:

Well. If I were your manager then I'd ask you to find a contractor that can do it in a week and task you with managing them. :P

That would be the right attitude to have as a manager. But it also reduces my value to the company. Now, instead of doing the work, the company has a guy who has to hire a guy to get the work done. My value to the company only stays very high as long as NI and LabVIEW remain the accepted tools.

Link to comment
On 9/26/2023 at 3:47 AM, ShaunR said:

Or, in other words, "A Manager". That's all managers are! Your value isn't diminished; you've just been given the opportunity to increase your skill set.

Change is scary, especially when it is forced on you. I guess that summarizes how I feel about this whole situation.

Link to comment

I see no downside to learning a new language on the side. Even if you never use it professionally, it's still fun to learn new things and it will certainly help you think in different ways. My suggestion is to use the languages that keep you close to hardware:

C++ with Arduinos. This lets you connect to other chips that will give you your ADCs, DACs, GPIO, etc. (a minimal sketch follows this list)

Python with Raspberry Pis and other single-board computers

The LINX toolkit lets you get your feet wet with the above hardware in LabVIEW. You can go back and forth and see how each tool handles the same task.

Verilog so you can get back to graphical programming with the IP integrator. This will also get you high-speed ADCs and DACs if you need that performance.
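To make the C++/Arduino item concrete, here is a minimal sketch. The wiring is an assumption for illustration (an LED on pin 13, an analog sensor on A0): read the ADC, react on a GPIO pin, the kind of measure-and-respond loop a LabVIEW developer already thinks in.

```cpp
// Minimal Arduino-style C++ sketch. Hypothetical wiring: LED on pin 13,
// analog sensor on pin A0. Reads the ADC, streams the value over serial,
// and blinks the LED more slowly as the reading rises.
const int LED_PIN = 13;
const int SENSOR_PIN = A0;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);               // Open the serial port for monitoring.
}

void loop() {
  int raw = analogRead(SENSOR_PIN); // 10-bit ADC reading: 0..1023.
  Serial.println(raw);
  digitalWrite(LED_PIN, HIGH);
  delay(50 + raw);                  // Blink period tracks the sensor value.
  digitalWrite(LED_PIN, LOW);
  delay(50 + raw);
}
```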

In other good news, you're now forced to draw a clear line between the UI and the backend. I suggest JavaScript/HTML for the UI.

Link to comment
  • 1 month later...

I've read this post with great interest, having been a LabVIEW developer for more than 20 years (and hence with 20 more years to go).

I am also worried about the future of LabVIEW. What worries me the most is the help that text-based languages are now getting from AI. Code reviewing and code generation are already at a significant level, and they are getting better every single day.
GitHub Copilot, for example (trained on billions of lines of code), is a programmer's assistant that you cannot overestimate.

I don't see such a thing coming for LabVIEW, which will rapidly make it fall behind other languages.

LabVIEW is in great need of big investments/rework. As much as I disliked NXG, it was started for a valid reason. Without such investments, we keep working every day in a development environment whose foundations are (almost) 40 years old. 🙈

Link to comment
On 10/30/2023 at 12:08 PM, Thomas_VDB said:

... the help that text-based languages are now getting from AI ...

I've had similar thoughts. It seems to me the LabVIEW user base isn't big enough to justify the investment to get that type of thing up and running. However, I'd like to be proven wrong.

When I entered the LabVIEW world >20 years ago, one of its selling points was its ability to create desktop-based test and measurement applications faster and with less effort than the available alternatives. I felt confident that was true based on my (limited) perception of the alternatives, and because the user base seemed considerable. I'd attend user groups that would number 50+ attendees, NI reps were plentiful, and jobs seemed bountiful.

I'm not sure both those things are true anymore. I've been fortunate enough to find gainful employment with my LabVIEW skills that will probably (fingers crossed) take me to retirement, but the engineer in me is itching to pick up the next best tool, as I'm no longer quite so certain LabVIEW is still the king of the hill.

Who knows, maybe Emerson will surprise us all?

Link to comment
On 10/30/2023 at 4:08 PM, Thomas_VDB said:

LabVIEW is in great need of big investments/rework. As much as I disliked NXG, it was started for a valid reason. Without such investments, we keep working every day in a development environment whose foundations are (almost) 40 years old.

Well. C++ is just as old and C is 10 years older, so go figure! The whole software discipline hasn't really moved on in the last 60 years (check this out!), and while I do see AI changing that, it's probably not in the way you think. If AI is to be instrumental to programming, it must come up with a new way of programming that obsoletes all current languages, not automate existing ones. Software needs its Einstein-vs-Newton moment, and I've seen no sign of that coming from the industry itself. Instead we get more and more recycled and tweaked ideas, which I like to call Potemkin Programming or Potemkin languages.

On 10/30/2023 at 4:08 PM, Thomas_VDB said:

I don't see such a thing coming for LabVIEW, which will rapidly make LabVIEW fall behind in comparison to other languages.

I disagree. LabVIEW, so far, has been immune to AI, but it has also been trapped in the Potemkin Programming paradigm (ooooh, PPP :D). It needs another "Events" breakpoint (when they introduced the event structure and changed how we program). Of all the languages, LabVIEW has the potential to break out of the quagmire that all the other languages have languished in. What we don't need is imports from other languages making it just as bad as all the others. We need innovators (like the guy who invented malleable VIs for funsies) in their 20s and 30s, not their 70s.

Edited by ShaunR
  • Like 2
Link to comment
On 10/30/2023 at 5:08 PM, Thomas_VDB said:

I am also worried about the future of LabVIEW. What worries me the most is the help that text-based languages are now getting from AI. Code reviewing and code generation are already at a significant level, and they are getting better every single day.

I've always thought LabVIEW's higher level of abstraction somewhat reduced the need for that. But I'm not proficient enough in any other language to make a fair comparison.


Link to comment
On 11/1/2023 at 5:41 AM, ShaunR said:

It needs another "Events" breakpoint (when they introduced the event structure and changed how we program). Of all the languages, LabVIEW has the potential to break out of the quagmire that all the other languages have languished in.

Maybe separate the front panel and diagram into different files/entities so that the GUI can be used with other programming languages in addition to G.

Link to comment
On 11/3/2023 at 8:49 PM, smarlow said:

so that the GUI can be used with other programming languages in addition to G


On 11/1/2023 at 9:41 AM, ShaunR said:

What we don't need is imports from other languages making it just as bad as all the others.

Not a huge leap, but an improvement on events (IMO) would be named events (à la named queues): events that can be named but (perhaps more importantly) can work across applications, in a similar way that Windows messages can be hooked from an application, all driven from the Event structure. I initially experimented with a similar technology when VIMs were first discovered (although it didn't work across applications). Unfortunately, they broke the downstream polymorphism and made it all very manual with the Type Specialization Structure, so I dropped it.
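For anyone who hasn't hooked Windows messages before, here is a minimal Win32 C++ sketch of the cross-application mechanism being alluded to. The message name "LV_NamedEvent_Demo" and both functions are invented for illustration; RegisterWindowMessage returns the same message ID for the same string in every process, which is what makes the event effectively "named" system-wide.

```cpp
// Minimal Win32 sketch of a "named, cross-application event".
// The name "LV_NamedEvent_Demo" is invented for illustration.
#include <windows.h>

// Sender side: fire the named event to every top-level window.
void FireNamedEvent()
{
    const UINT msg = RegisterWindowMessageA("LV_NamedEvent_Demo");
    PostMessage(HWND_BROADCAST, msg, 0, 0);
}

// Receiver side: any window procedure can compare against the same name.
LRESULT CALLBACK WndProc(HWND hwnd, UINT uMsg, WPARAM wParam, LPARAM lParam)
{
    static const UINT namedEvent = RegisterWindowMessageA("LV_NamedEvent_Demo");
    if (uMsg == namedEvent) {
        // React to the cross-process event here.
        return 0;
    }
    return DefWindowProc(hwnd, uMsg, wParam, lParam);
}
```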

Another is callbacks in the Event Structure. Similar to the Panel Close event, they would have an output that can be described.

But getting on to the LabVIEW GUI: that needs to go completely in its current form. It's inferior to all other WYSIWYG languages because we cannot (reliably) create new controls or modify existing ones' behaviour. They gave us the halfway house of XControls, but that was awful, buggy and slow. What we need is to be able to describe controls in some form of markup that we can import to make native controls and indicators. Bonus points if we can start with an existing control and add or override functionality. All other WYSIWYG languages allow the creation of controls/indicators. This would open up a whole new industry segment of control packs for LabVIEW, as there is for the other languages, and we wouldn't have to wait until NI decides to release a new widget set. At the very least, allow us to leverage other GUI libraries (e.g. imgui or wxWidgets).
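As a point of contrast, here is what a bespoke indicator costs in Dear ImGui (the imgui mentioned above). This is a minimal sketch that assumes an already-initialized ImGui context and a backend render loop (not shown); the control's name, size and colours are invented for illustration.

```cpp
// Minimal Dear ImGui sketch of a custom "tank level" indicator.
// Assumes an initialized ImGui context and a backend render loop (GLFW,
// SDL, DirectX, ...) not shown here. Call between ImGui::Begin()/End().
#include "imgui.h"

void DrawTankLevel(const char* label, float fraction) // fraction in 0..1
{
    ImGui::Text("%s", label);
    ImDrawList* draw = ImGui::GetWindowDrawList();
    const ImVec2 p = ImGui::GetCursorScreenPos();
    const ImVec2 size(40.0f, 120.0f);
    // Tank outline.
    draw->AddRect(p, ImVec2(p.x + size.x, p.y + size.y),
                  IM_COL32(200, 200, 200, 255));
    // Fill from the bottom up, proportional to the level.
    const float fillTop = p.y + size.y * (1.0f - fraction);
    draw->AddRectFilled(ImVec2(p.x, fillTop),
                        ImVec2(p.x + size.x, p.y + size.y),
                        IM_COL32(80, 180, 80, 255));
    ImGui::Dummy(size); // Reserve the space we just drew into for layout.
}
```

Anything the draw list can paint becomes a reusable control, which is exactly the extensibility being asked for here.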

  • Like 2
Link to comment
On 11/3/2023 at 9:49 PM, smarlow said:

Maybe separate the front panel and diagram into different files/entities so that the GUI can be used with other programming languages in addition to G.

That's what LabWindows/CVI was. And yes, under the hood, the whole GUI library of LabWindows/CVI was the LabVIEW window manager from ca. LabVIEW 2.5! My guess (not hindered by any knowledge of the real code) is that at least 95% of the window manager layer in LabVIEW 2023 is still the same. I'm not expecting anyone from NI to chime in on that, but that LabWindows/CVI used a substantial part of the LabVIEW low-level C code is a fact I know for sure. But look where that all went! LabWindows/CVI 2020 was the latest version NI released, and while they have mostly refrained from publicly commenting on it, there were multiple semi-private statements from NI employees to clients that further development of LabWindows/CVI has not just been put in the refrigerator but has definitely been stopped.

As to AI as promoted by OpenAI and friends at this point: it has very little to do with real AI. It is, however, very good hype for raising venture capital for many useless things. Most of it will never be talked about again after the inventor goes off with the money, or it somehow vaporizes in other ways. It feels a lot like Fuzzy Logic some 25 years ago. Suddenly everybody was selling washing machines, coffee makers and motor controllers using Fuzzy Logic to achieve the next economical and ecological breakthrough. At least there was a sticker on the device claiming so, but whether the firmware contained anything other than an on-off controller was usually a big question. People were screaming that LabVIEW was doomed because there was no Fuzzy Logic Toolkit yet (someone in the community eventually created one, but I'm sure they sold maybe 5 licenses if they were really lucky). When did you last hear about Fuzzy Logic?

Sure, AI, once it is real AI, has the potential to make humans redundant. In the '90s of the last century there was already talk about AI, because a computer could make predictions based on a model that a programmer first had to enter. This new "AI" is slightly better than what we had back then, but an LLM is not AI; it is simply a complex algorithmic engine.


Edited by Rolf Kalbermatter
Link to comment

NXG's file types were mostly human-readable. The front panel and the block diagram were broken up into separate XML sections. At one point someone asked about having multiple front panels, and it was shown that the XML structure supported from 0 to N front panels for any GVI. NI never said they intended to support any number of front panels other than 1, but I thought that would have been an interesting thing to explore.

Link to comment
On 11/5/2023 at 6:58 AM, ShaunR said:

What we need is to be able to describe controls in some form of markup that we can import to make native controls and indicators. Bonus points if we can start with an existing control and add or override functionality. All other WYSIWYG languages allow the creation of controls/indicators. This would open up a whole new industry segment of control packs for LabVIEW, as there is for the other languages, and we wouldn't have to wait until NI decides to release a new widget set. At the very least, allow us to leverage other GUI libraries (e.g. imgui or wxWidgets).

SVG is the way to go. There is a mountain of open-source SVG rendering code that was built to make modern browsers SVG-compatible. The thing to do would be to create a custom SVG-based format for all LabVIEW controls/indicators and publish the specification. That way, anyone who wants to would be able to create their own front panel object design libraries. Users could create their own designs and looks for any application. I mentioned this over a decade ago in my Websocket post. They could also integrate the connector pane into the diagram as a boundary with multiple frames and tools for creating connector terminals. Unlimited sprawl on the diagram would be replaced with a "main structure" whose boundary is the connector pane. Users would create paginated code within the boundary, with the separate pages executing in parallel. This would be an option rather than a rule. There would still be a traditional connector pane, but it would go with the diagram rather than the front panel. The relationship between front panel objects and diagram terminals would be user-configurable (diagram files would be configurable to point to one or more front panel files, with the front panel objects therein becoming available to be included in the diagram at the user's discretion).

  • Like 1
Link to comment

Hi

The strength and the weakness of LabVIEW is that it is not a clearly patterned development platform.

The problem is that a newcomer to LabVIEW will google the world for ideas on how to use LabVIEW, and the suggestions will point in all directions.

Directions:

Classic (legacy?) LabVIEW - making programs without using events. That is how we did it (at least) until LabVIEW 6.1 in 2001, which introduced the Event. So all the mechanical engineers lured into the world of the simple LabVIEW suddenly had to consider Events, as used in common Windows programming. They had to become programmers instead of just being engineers solving a problem with this simple system called LabVIEW. There are plenty of books about this, describing both the legacy concept and the Event concept.

Objects in LabVIEW - LVOOP was introduced in 2006/2007 with LabVIEW 8.0/8.2. So now the non-programmer was really in trouble, and probably dead from a heart attack. There are not that many books about this subject.

But NI was on a wave of innovation.

So next came the Actor Framework, introduced in 2012. Even fewer books about this.

And there were more exotic (programmer-specific) additions to the LabVIEW language world later on, with even fewer books explaining them.

The problem here is not that LabVIEW developed into something with bells and whistles. That is natural. The problem is that NI never clearly stated who the intended targets of all those benefits/additions were. The users/customers had to figure that out themselves.

Just to make it clear: a mechanical engineer can still use the legacy methods of programming LabVIEW for simple tasks, ignoring Events and Objects. But he/she will surely be urged to do 'better'.

I am not a mechanical engineer, and I do use advanced features, like FPGAs.

Regards

Link to comment
12 hours ago, smarlow said:

All I know is that if they don't do something to make it a more powerful language, it will be difficult to keep it going in the long run. It was, in the past, always a powerful choice for cross-platform compatibility. With macOS deprecating (and eventually completely removing) support for OpenGL/OpenCL, we see the demise of the original LabVIEW platform.

I, for one, would like to see much heavier support for Linux and Linux RT. Maybe provide an option to order PXI hardware with an Ubuntu OS, and make the installers easier to use (NI Package Manager for Linux, etc.). They could make the Linux version of the Package Manager available from the Ubuntu app store. I know they say the market for Linux isn't that big, but I believe it would be much bigger if they made it easier to use. I know my IT department and test system hardware managers would love to get rid of Windows entirely. Our mission control software all runs in Linux, but LabVIEW still has good value in rapid application development and instrument bus control, etc. So we end up running hybrid systems that run Linux in a VM to operate the test executive software, and LabVIEW in Windows to control all our instruments and data buses.

Allowing users the option to port the Linux RT OS to lower-cost hardware, the way they did for the Phar Lap OS, would certainly help out, too. BTW, is it too much to ask to make all the low-cost FPGA hardware from Digilent LabVIEW-compatible? I can see IoT boards like the Arduino Portenta, with its 16-bit analog I/O, seriously eating their lunch in the near future. ChatGPT is pretty good at churning out Arduino and Raspberry Pi code that's not too bad. All of our younger staff use Digilent boards for embedded work, programming them in C and VHDL using Vivado. The LabVIEW old-timers are losing work because the FPGA hardware is too expensive. We used to get by in the old days buying myRIOs for simpler apps on the bench. But that device has not been updated for a decade, and it's twice the price of the ZYBO. Who has $10K to spend on an FPGA card anymore, not to mention the $20K PXI computer to run it? Don't get me wrong: PXI and CompactRIO (can we get a faster DIO module for the cRIO, please?) are still great choices for high performance and rugged environments. But not every job needs all that; sometimes you need something inexpensive to fill the gaps. It seems as if NI has been willing to let all that go and keep LabVIEW in the role of selling their very expensive high-end hardware. But as low-cost hardware gets more and more powerful (see the Digilent ECLYPSE Z7), and high-end LV-compatible hardware gets more and more expensive, LabVIEW fades more and more.

I used to teach LabVIEW in a classroom setting many years ago. NI always had a few "propaganda" slides at the beginning of Basics I extolling the virtues of LabVIEW to the beginners. One of these slides touted "LabVIEW Everywhere" as the roadmap for the language, complete with pictures of everything from IoT hardware to appliances. The reality of that effort became the very expensive "LabVIEW Embedded" product that was vastly overpriced, bug-filled (it never really worked), and only compatible with certain (Blackfin?) eval boards that were just plain terrible. It came and went in a flash, and the whole idea of "LabVIEW Everywhere" went with it. We had the sbRIOs, but their pricing and marketing (vastly overpriced, and targeted at the high-volume market) ensured they would not be widely adopted for one-off bench applications. Lower-cost FPGA evaluation hardware and the free Vivado WebPack have nearly killed LabVIEW FPGA. LabVIEW should be dominating. Instead you get this:

[image: Google Trends chart for LabVIEW, trending downward]

I've always been suspicious of Google Trends because it doesn't give context. The following is C++ but I refuse to believe it's marching to oblivion.

[image: Google Trends chart for C++, showing a similar downward trend]

But it's fun to play with :D

[image: another Google Trends comparison]

  • Like 2
Link to comment
