ShaunR Posted March 2, 2023
Chasing the golden geese, it seems. Interesting stuff throughout, but the real juice is at about 55 minutes in, in "Distribution of Resources". (Spoiler: it's a small club and we ain't in it.) The mention of porting Unicode made me laugh heartily, though. My final takeaway was too many C# coders left over from NXG.
Jordan Kuehn Posted March 2, 2023
I am intrigued by a separate "NI Create" event for developers.
X___ Posted March 2, 2023
19 hours ago, ShaunR said: Chasing the golden geese, it seems. [...]
Seriously: it's over (it actually starts at 54 minutes in).
leod Posted March 3, 2023
Seems like NI's head is turned away from LabVIEW. Mentioning modern tech (Docker, etc.) doesn't change the fact that LabVIEW has still only partially transitioned to 64-bit, still has BMP graphics made for 640x480 monitors, and that all the long-awaited stuff from NXG will be thrown away completely. BTW, LabVIEW 2023 Q1 was quietly released; you would be surprised how few new features there are (it got a new icon again, though!).
i2dx Posted March 5, 2023
On 3/2/2023 at 5:53 PM, X___ said: Seriously: it's over (it actually starts at 54 minutes in).
My gut feeling says you're right. Starting at minute 54:17: "we need patience, give NI a chance to prove itself" and "I am not convinced NI is going to deliver on that, but I'm hopeful that NI is going to deliver on that"... hope dies last, I guess. LabVIEW is a dead horse.
X___ Posted March 5, 2023
There is no need for gut feeling: the simple fact that they are not even trying to hype the future of LabVIEW speaks volumes. We implicitly learn that the LabVIEW team has been gutted, or at least not given the means it needs even to keep in sync with OSes (https://www.ni.com/en-us/support/documentation/supplemental/22/ni-product-compatibility-for-microsoft-windows-11-.html). Heck, TestStand and VeriStand themselves, of which I know nothing but understand are where NI is investing its resources, are not Windows 11 ready. As for the rest, I guess you can check the release notes for 2023 Q1 (https://www.ni.com/pdf/manuals/labview-2023-q1-and-drivers.html#examples) to understand that the T's are not even crossed anymore, since the examples for this version are apparently to be found in the \LabVIEW 2022\examples directory... I am not sure it is fun to work at SpaceX, but I doubt there is much feeling of exhilaration at LabVIEW R&D nowadays...
i2dx Posted March 5, 2023
53 minutes ago, X___ said: There is no need for gut feeling: the simple fact that they are not even trying to hype the future of LabVIEW speaks volumes. [...]
Yes, it's sad. I don't believe LV has a future either. The best we can hope for is that it is somehow maintained at the status quo. I guess the low acceptance of NXG broke their neck. But I told them that NXG is "Klick-Bunti-Kinder" software (hard to translate from German; it means something with too many useless features and far too much focus on look & feel instead of quality and functionality: overall, something you'd give your kids to play with, but not for serious work). They didn't listen, as so often. OK, it's not that big a problem for me, because I decided to walk a new path three years ago and switched to STM32, developing with open-source tools. What I did with cRIOs, LVRT + FPGA before, I now do with my own PCBs and C. I gave up my SSP two years ago, when they told me there was no SSP any more and I had to pay for a SAS plan instead at almost double the price. But I feel for all the developers who have put their effort into this tool, have maybe 20 years or more of work experience, and now have to learn that NI is letting them down and they have no choice but to reinvent themselves. Besides that: I'm not a native speaker, and I don't understand what you mean by the last sentence.
X___ Posted March 5, 2023
2 hours ago, i2dx said: Besides that: I'm not a native speaker, and I don't understand what you mean by the last sentence.
I only meant that 2023 Q1 is the first version to officially support Windows 11, and that apparently the head count of the LabVIEW R&D team is either classified or not to be admitted in public.
codcoder Posted March 6, 2023
Idk, why should the failure of NXG mark the failure of NI? Aren't there other examples of attempts to reinvent the wheel, discovering that it doesn't work, scrapping it, and then continuing to build on what you've got? I can think of XHTML, which the W3C tried to push as a replacement for HTML; when that failed, HTML5 became the accepted standard (based on HTML 4.01, while XHTML was more based on XML). Maybe the comparison isn't that apt, but maybe a little. And also, maybe I work in a backwater, slow-moving company, but to us, for example, utilizing LabVIEW FPGA and the FPGA module for the PXI is something of a "game changer" in our test equipment. And we are still developing new products and new systems based on NI tech. The major problem for me as an individual, of course, is that the engineering world in general has moved a lot from hardware to software in the last ten years, and working with hardware, whether or not you specialize in the NI ecosystem, gives you fewer opportunities. There are hyped things like app start-ups, fintech, AI, machine learning (and what not), and if you want to work in those areas, then sure, LabVIEW isn't applicable. But it never has been, and that shouldn't mean it can't thrive in the world it was originally conceived for.
ShaunR Posted March 6, 2023
2 hours ago, codcoder said: Idk, why should the failure of NXG mark the failure of NI?
I don't think anyone is saying that, so much, with respect to NI as a whole. But the effort and investment in NXG made LabVIEW (Classic?) the withered limb of NI. Now they have lots of C# devs who can't do jack with LabVIEW. From this seminar, it looks like this is a solution (lots of C# devs) looking for a problem (cloudy stuff), and they see LabVIEW as a stagnant technology instead of the axiomatic driver behind their hardware that it actually is. Don't get me wrong. They can very easily move into this space and do all the cloudy stuff. But their view of how this will fit together is flawed (IMO). They are viewing it purely like an IT system rolling out images (think AWS Compute, IaaS) when, in fact, those images will be highly specialised LabVIEW installations for very specific and usually custom hardware configurations.
2 hours ago, codcoder said: But it never has been, and that shouldn't mean it can't thrive in the world it was originally conceived for.
They lost Test and Measurement to Python a while ago, arguably the mainstay of LabVIEW.
codcoder Posted March 6, 2023
32 minutes ago, ShaunR said: They lost Test and Measurement to Python a while ago, arguably the mainstay of LabVIEW.
Yes, of course. I can't argue with that (and since English isn't my first language and all my experience comes from my small part of the world, it's hard for me to argue in general 😁). We don't use Python where I work. We mainly use TestStand to call C-based drivers that communicate with the instruments. That could have been done in LabVIEW or Python as well, but I assume the guy who was put on it knew C. But for other tasks we use LabVIEW FPGA. It's really useful, as it allows us to incorporate more custom stuff in a more COTS architecture. And we also use "normal" LabVIEW where we need a GUI (it's still very powerful at that, and easy to make). And in a few cases we even use LabVIEW RT where we need that capability. In none of those cases do we plan to throw NI/LabVIEW away. Their approach is the entire reason we use them! I don't really know where I'm heading here. Maybe something like: if you're used to LabVIEW as a general-purpose software tool, then yes, maybe Python is the best choice these days. Maybe, maybe, "the war is lost". But that shouldn't mean LabVIEW development is stagnating or dying. It's just that the areas where NI excels aren't, in general, as big and thriving as others. I.e. HW vs. SW.
ShaunR Posted March 7, 2023
20 hours ago, codcoder said: Yes, of course. I can't argue with that (and since English isn't my first language and all my experience comes from my small part of the world, it's hard for me to argue in general 😁).
English *is* my first language and I'm not as eloquent as you are. There is no real argument here, though.
21 hours ago, codcoder said: But that shouldn't mean LabVIEW development is stagnating or dying. [...]
I still use LabVIEW 2009 for development because there is little of significance that the later versions offered. It's also robust, stable and fast, which cannot always be said for some of the later versions (looking at you, 2011/2012). Some features that actually got us excited weren't even on the roadmap (VIMs, anyone?). NI has been so far behind the curve on features we want that we have all created our own solutions, so if one of them actually gets implemented, it's a moot feature. TLS/SSL, for example, was only released in LV2020, but I (and Rolf) had created solutions a decade before that. The one thing we have been yelling at NI about for about 15 years is Unicode, which we cannot really make a native solution for. This is why I laughed when it was mentioned in this talk. I moved to HTML UIs and relegated LabVIEW to a back-end service through WebSockets, which solved the problem, but it's a sledgehammer to crack a nut.
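[Editor's aside: the Unicode pain point is easy to demonstrate from outside LabVIEW. Classic LabVIEW strings are byte arrays interpreted in the local ANSI codepage, so UTF-8 text arriving from, say, an HTML front end shows up as mojibake in an ANSI-only string indicator. A minimal Python sketch of the effect, using Windows-1252 as the assumed codepage and an arbitrary example string:]

```python
# A UTF-8 string as it would arrive over the wire from a web UI...
utf8_bytes = "Grüße".encode("utf-8")   # b'Gr\xc3\xbc\xc3\x9fe'

# ...but rendered by a component that assumes the Windows-1252
# ANSI codepage, roughly what an ANSI-only string display does:
mangled = utf8_bytes.decode("cp1252")
print(mangled)  # GrÃ¼ÃŸe
```

Round-tripping the bytes back out of such a component is lossless only as long as nothing re-encodes the mangled text, which is why bolting Unicode onto a byte-string type after the fact is so painful.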
codcoder Posted March 7, 2023
Thanks for sharing! I guess I'm just trying to stay positive. Every time the future of LabVIEW is brought up here, things go so gloomy so fast! And I'm trying to wrap my head around what is true and what is nothing but mindset. It's ironic, though: in the last few months there has been so much hype around AI being used for programming, especially Google's AlphaCode ranking in the top 54% in competitive programming (https://www.deepmind.com/blog/competitive-programming-with-alphacode). Writing code from natural-language input. So we're heading towards a future where classic "coding" could soon be obsolete, leaving the mundane tasks to the AI. And still, there already is a tool that could help you with that, a tool that has been around for 30 years: a graphical programming tool. So how, how, could LabVIEW not fit in this future?
hooovahh Posted March 7, 2023
3 hours ago, ShaunR said: Some features that actually got us excited weren't even on the roadmap (VIMs, anyone?).
I do love how VIMs came to be. I'm having a real hard time finding it, but there was an idea on the Idea Exchange that there should be a function that can delay any data type, similar to the OpenG Wait, which takes an error in and passes it out. Jeff K. posted on the thread saying something like "Oh yeah, that is a thing, you just need to use a VIM; here is an example which uses XNodes." It blew my mind. Then in the next release of LabVIEW for the Mac, Jeff K. sneaked a new VIM onto the palette, which some higher-ups in R&D didn't know about, and which had the type specialization structure in it, which was also unreleased. I downloaded that version just to get the VIM and the structure. I get the feeling the reason VIMs seemingly came out of nowhere is that Jeff had been pushing for them for years, and when they were mostly stable he just put them out there to get the public's response. When everyone saw the potential that he also saw, R&D put effort into getting it out there. This is just my speculation from the outside.
crossrulz Posted March 7, 2023
3 hours ago, hooovahh said: I do love how VIMs came to be. [...]
Here's the starting point: Wait (ms) with error pass-through. From what I remember at CLA Summits, this was unofficial up to 2016 (called a "macro" at that point in time). But the cat was waaaaaaaaaaay out of the bag, so NI spent the time to make it a proper feature, and malleable VIs started in 2017. The Type Specialization Structure was not quite ready for prime time for another year (2018).
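[Editor's aside: for readers who haven't used malleable VIs, the behaviour being discussed, delaying execution while passing a value of any type through untouched, is trivial to express in a textual language. A rough Python analogue of what Stall Data Flow.vim does; the name and signature here are illustrative, not NI's:]

```python
import time

def stall_data_flow(value, delay_ms):
    """Wait delay_ms milliseconds, then return the input unchanged.

    Textual analogue of wiring any type through Stall Data Flow.vim:
    the data on the wire is untouched; only downstream execution
    order is delayed. Python's dynamic typing gives for free what
    malleable VIs add to LabVIEW's static wire types.
    """
    time.sleep(delay_ms / 1000.0)
    return value

# Works identically for any "wire type":
cluster = stall_data_flow({"status": False, "code": 0}, 50)
number = stall_data_flow(42, 50)
```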
dadreamer Posted March 7, 2023
3 hours ago, crossrulz said: The Type Specialization Structure was not quite ready for prime time for another year (2018).
The Type Spec Structure is accessible in LabVIEW 2017 if SuperSecretPrivateSpecialStuff=True is written to labview.ini and the user right-clicks on the Diagram Disable Structure and chooses the "Replace With Type Specialization Structure" menu entry.
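[Editor's aside: that is, the token dadreamer describes is a single line added to labview.ini:]

```ini
; Enables hidden/experimental features in the LabVIEW editor
SuperSecretPrivateSpecialStuff=True
```

With the token in place, the extra entry appears on the Diagram Disable Structure's right-click menu as described above.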
X___ Posted March 8, 2023
https://finance.yahoo.com/news/ni-acquires-set-gmbh-accelerate-140000459.html
ShaunR Posted March 8, 2023
8 hours ago, X___ said: https://finance.yahoo.com/news/ni-acquires-set-gmbh-accelerate-140000459.html
Interesting. Not the sort of behaviour of a company that's about to roll over.
hooovahh Posted March 8, 2023
18 hours ago, crossrulz said: Here's the starting point: Wait (ms) with error pass-through. [...]
Thanks, that's it. I couldn't find it because I was searching for completed ideas, but apparently it was declined. The Type Specialization Structure was in LabVIEW 2016 and was part of my NI Week demo that year. I frantically downloaded the Mac 2016 version and extracted the installer to get the VIM that contained the structure, so I could update my presentation and demo. When it was a macro it was an XNode behind the scenes, and even the 2017 beta used XNodes. But in the official 2017 release it was its own technology. XNodes and classes don't work well together, and locking libraries made editing them challenging, so it was necessary for them to be their own thing.
crossrulz Posted March 8, 2023
18 minutes ago, hooovahh said: Thanks, that's it. I couldn't find it because I was searching for completed ideas, but apparently it was declined.
But yet technically completed with the Stall Data Flow.vim that was released in 2017, along with a bunch of other malleable VIs.
HYH Posted March 8, 2023
8 hours ago, ShaunR said: Interesting. Not the sort of behaviour of a company that's about to roll over.
Indeed interesting. Such an acquisition makes very good sense to substantiate the value of their recent entry into power electronics testing. See this from 2022 as an example: https://www.businesswire.com/news/home/20220524005382/en/NI-Releases-Latest-Battery-Test-System-to-Enhance-Safety-and-Performance-of-Electric-Vehicles
Regards
X___ Posted March 11, 2023
https://www.fool.com/investing/2023/03/10/who-will-win-the-bidding-war-for-national-instrume/
X___ Posted March 23, 2023
https://seekingalpha.com/news/3949487-national-instruments-sale-process-likely-to-be-completed-in-early-april-report
codcoder Posted March 23, 2023
So it's settled then? NI will become a subdivision of Emerson?
EHS Posted March 23, 2023
Two weeks ago I checked on Bloomberg the buy/sell actions from insiders (the list is not complete):
Starkloff Eric Howard (CEO) sold 50,859 shares on 2023-02-01
Benjamin Thomas (EVP, CTO) bought 10,604 shares on 2023-03-03
Favre Ritu C (EVP) bought 12,118 shares on 2023-03-03
Rust Scott A (EVP) bought 9,846 shares on 2023-03-03
Daniel Berenbaum (CFO) bought 14,427 shares on 2023-02-16
Draw your own conclusions.