
Leaderboard

Popular Content

Showing content with the highest reputation since 07/28/2020 in Posts

  1. Dear Santa NI

I am now in my 40s with youngish kids, so despite the fact that all I got for Christmas this year was a Pickle Rick cushion, I am not actually complaining. However, I would like to get my order in to the Elves as early as possible. This is my wishlist, in no particular order. I expect this list will not be to everyone's taste, and that is OK; this is just my opinion.

- Make LabVIEW free forever. The war is over, Python has won. If you want to be relevant in 5 to 10 years you need to embrace this. The community edition is a great start, but it is probably not enough. Note: I accept it might be necessary to charge for FPGA stuff, where I presume you license Xilinx tools. NI is and has always been a hardware company.
- Make all toolkits free. See the above point.
- Remove all third-party licensing stuff. Nobody makes any money from this anyway. Encourage completely open sharing of code and lead by example.
- Take all the software engineering knowledge gained during the NXG experiment and start a deep refactor of the current-gen IDE. Small changes here please though... we should not have to wait 10 years.
- Listen to the feedback of your most passionate users during this refactor. NXG failed because you ignored us and just assumed we would consume whatever was placed in front of us. I am talking about people like those reading this post on Christmas Day, in their spare time, because they are so deeply committed to LabVIEW.
- My eyes are not what they used to be, so please bring in the NXG-style vector graphics support so I can adjust the zoom of my block diagram and front panel to suit.
- As part of the deep refactor, the run-time GUI needs to be modernised. We need proper support for resizable GUIs that react sensibly to high-DPI environments.
- Bring the best bits of NXG over to current gen, for example the dockable properties pane. (Sorry, not much else comes to mind.)
- Remove support for Linux and Mac and start to prune this cross-compatibility from the codebase. I know this is going to get me flamed for eternity by 0.1% of the users. (You pretty much made this decision for NXG already.) Windows 10 is a great OS and has won the war here.
- Get rid of the 32-bit version and make RT 64-bit compatible. You are a decade overdue here.
- Add Unicode support. I have only needed this a few times, but it is mandatory for a multicultural language in 2021 and going forward.
- Port the Web Module to current gen. All the news I have heard is that the Web Module is going to become a standalone product. Please bring this into current gen. This has so much potential.
- Stop adding features for a few years. Spend the engineering effort polishing. Fix the random weirdness we get when deploying to RT.
- Open source as many toolkits as you can. Move the Vision toolkit over to OpenCV and make it open source.
- Sell your hardware a bit cheaper. We love your hardware and the integration with LabVIEW, but when you are a big multiple more expensive than a competitor it is very hard to justify the cost.
- Allow people to source NI hardware through whatever channel makes most sense to them. Currently the rules on hardware purchasing across regions are ridiculous.
- Bring ni.com into the 21st century. The website is a dinosaur and makes me sad whenever I have to use it.
- Re-engage with universities to inspire the next generation of young engineers and makers. This will be much easier if the price is zero.
- Re-engage with the community of your most passionate supporters. Lately it has felt like there is a black hole when communicating with you.
- "Engineer ambitiously"? What does this even mean? The people using your products are doing their best; please don't patronise us with this slogan.
- Take the hard work done in NXG and make VIs into a non-binary, human-readable format so that we can diff and merge with our choice of SCC tools. Remove all hurdles to hand-editing of these files (no more pointless hashes for "protection" of .lvlibs and VIs etc.). Openly publish the file formats to allow advanced users to make toolkits. We have some seriously skilled users here who already know how to edit the binary versions! Embrace this; it can only help you.
- Introduce some kind of virtualenv à la Python, i.e. allow libraries and toolkits to be installed on a per-project basis. (I think this is something JKI are investigating with their new Project Dragon thing.)
- For the love of all that is holy, do not integrate Git deeply into LabVIEW. Nobody wants to be locked into someone else's choice of SCC. (That said, I do think everyone should use Git anyway; this is another war that has been won.)

That is about it for now. All I want is for you guys to succeed so my career of nearly 20 years does not need to be flushed down the toilet like 2020. Love you, Neil (Edited: added a few more bullets)
    14 points
  2. “There was this fence where we pressed our faces and felt the wind turn warm and held to the fence and forgot who we were or where we came from but dreamed of who we might be and where we might go...” -- Opening lines of “R is for Rocket” by Ray Bradbury

I spent 20 years building this G language of ours. It’s time for me to go enjoy the fruits of that labor as a user! I will still be employed by NI, but I will be working full time for Blue Origin. As part of the NI “Engineer in Residence” program, I will be on loan to Blue Origin to revise their engine and support test systems. They wanted a Certified LabVIEW Architect with deep knowledge of LVOOP, multiple years of experience with Actor Framework, and deep knowledge of cRIO and PXI. I asked, “Can we negotiate on that last part?” They said, “Yes, yes we can.” Turns out, based on the interview, I know more than I thought – apparently some hardware knowledge does rub off just by sitting near it for a couple of decades.

This new job runs for six months, but it is extensible through the end of 2021 at the discretion of myself and Blue Origin. When I come back, I do not know if I will be returning to LabVIEW. Spaceflight has long been a passion of mine. Over my 20 years with LabVIEW R&D, I have had the chance to help out with various Mars rovers, large telescopes, and rocket launches. It has been awesome, and I’m proud of the language advances I brought to LabVIEW that helped so many users along the way. Now, I am going to focus on just one customer for a while... a customer with very large rocket engines!

My last day will be Friday, October 23. I will not have the same availability to respond to posts as I have in the past, but Aristos Queue will still be around on the forums.

“And, walking, I went beyond the fence.” -- ending of “R is for Rocket”
    12 points
  3. I did not know. That possibility was not even on my radar. Even though the drumbeat of bad news had been going for a while, most corporations refuse to change direction on a bad decision. NI showed more sentience than I usually expect from massed humans: the sunk-cost fallacy is a trap that is very hard to get out of. I figured the very good engineers on NXG would either surge through it and make it fly, or we would bankrupt the company trying. That's the pattern established by plenty of other companies.

Mixed. I spent 4.5 years directly working on NXG (2011 to 2016) and countless hours in later years working with the NXG team to design a future G. I really wanted it to fly. There is so much good in that IDE, including some amazing things that I just don't see how we could ever do in the LabVIEW codebase without shattering compatibility. But at the same time, I was watching good friends toil on something that the market just wasn't adopting. The software had some problems that were going to take a long time to solve. The issues were all solvable, but the time needed to fix them... that was harder and harder to justify.

NXG gave us a GREAT platform for other software: VeriStand, FlexLogger, etc. That code is extremely modular and can be repurposed for all sorts of tools. We also learned a heck of a lot by building NXG -- some things that I thought we could never do in LabVIEW now seem possible. NXG gave us a sandbox to learn a whole lot about modern software engineering without putting the delivery schedule for mature software at risk, and those practices have been, and are being, brought back and applied to LabVIEW -- that will decrease the cost of maintaining older code. All in all, NXG was valuable -- the expenditure was not a complete loss.

I am very sorry to the few customers who did migrate to NXG. We don't have a reverse migration tool, and building one would be absurdly expensive. Leaving those folks stranded is going to hurt -- I hate letting our customers down and just saying, "We have no solution to help you." There aren't many of those folks (that's essentially the problem), but they do exist, and they are basically stuck having to rewrite their NXG apps in some other tool. I can only hope that they pick LabVIEW. I don't know if this will help us or hurt us with customers in the future... on one hand, people may say, "Well, you let us down on NXG, why should we trust you will be there for us on any new products?" On the other hand, this decision was clearly made listening to customer feedback, and it takes a lot of humility to swallow a loss that big, which may make customers trust our judgement more in the future. And, really, there's nothing to compare with the scale of NXG -- an entire computing platform -- so this does seem like something that needs to be judged in isolation.

I really like programming in G. I like being able to expand G to make it more powerful. I wanted NXG to succeed because it had the potential to be a better G. It failed. Its failure means more resources for the existing LabVIEW platform, which will directly help our customers in the short run. It leaves open some big questions for the long run. So, in summary: I think it was a decision that had to be made, and I'm happy to work for a company that can learn from new data, admit a mistake, and then figure out how to correct it.
    12 points
  4. Two questions about this: 1. Does something like this already exist? 2. Is this something that could be useful?

Every once in a while I need dynamic UI components that can be generated at runtime. One nice thing to use for this is a picture control; however, it doesn't lend itself as well to supporting other pieces of functionality, such as mouse-click events. I put together a mini library of UI functions for this that has the ability to be extended. The UI can be generated dynamically at runtime and can be anything you can draw in a picture control, using standard layout techniques that you might find in other GUI libraries. The hierarchy generation can always be simplified by using some type of templating string.

Example1.vi
Front Panel on Pgui2.lvproj_My Computer _ 2021-07-02 14-03-54.mp4
    5 points
  5. Does it help to re-ask the question as "where should LabVIEW have a future?" It is not difficult to name a number of capabilities (some already stated here) that are extremely useful to anyone collecting or analyzing data, and that are either unique to LabVIEW or much simpler in it. They're often taken for granted, and we forget how significant they are and how much power they unlock. For example (and others can add more):
- FPGA - much easier than any text-based FPGA programming, and so powerful to have deterministic computational access to the raw data stream
- Machine vision - especially combined with a card like the 1473R, though it's falling behind without CoaXPress
- Units - yes, no-one uses them, but they can extend strict programming to validation of correct algorithm implementation
- Parallel and multi-threaded programming - is there any language as simple for constructing parallel code? Not to mention natural array computations
- Real-time programming
- Data-flow - a coherent way of treating data as the main object of interest; fundamental, yet a near-unique programming paradigm with many advantages
- and all integrated into a single programming environment where all the compilation and optimization is handled under the hood (with almost enough ability to tweak that)
Unfortunately NI appear to be backing away from many of these strengths, and other features have been vastly overtaken (image processing has hardly been developed in the last 10 years; GUI design got sidetracked into NXG, unfortunately). But the combination of low-level control in a very high-level language seems far too powerful and useful to have no future at all.
    5 points
  6. I've just implemented this and posted a beta: https://forums.ni.com/t5/JDP-Science-Tools/BETA-version-of-JSONtext-1-5-0/m-p/4136116 Handles comments like this:

    // Supports Comments
    {
      "a": 1  // like this
      "b": 2  /* or this style
                 of multiline comment */
      "c": 3  /* oh, and notice I'm forgetting some commas
                 A new line will serve as a comma, similar to HJSON */
      "d": 4, // except I've foolishly left a trailing one at the end
    }
    5 points
  7. Are Italian LV developers more prone to producing spaghetti code? 🤨
    4 points
  8. I have been coming round to supporting comments in JSONtext, at least for ignoring comments on reading (which is quite simple to implement, I think), and possibly features to be more forgiving of common user error, such as missing commas.
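To illustrate how simple ignore-on-read can be, here is a rough sketch of the approach in Python (illustrative only -- JSONtext itself is written in G, and the helper name strip_json_comments is made up for this example):

    import json

    def strip_json_comments(text):
        """Drop // and /* */ comments, but leave string literals untouched."""
        out = []
        i, n, in_string = 0, len(text), False
        while i < n:
            c = text[i]
            if in_string:
                out.append(c)
                if c == "\\" and i + 1 < n:       # keep escaped chars, incl. \"
                    out.append(text[i + 1])
                    i += 1
                elif c == '"':
                    in_string = False
            elif c == '"':
                in_string = True
                out.append(c)
            elif text[i:i + 2] == "//":
                while i < n and text[i] != "\n":  # line comment: skip to end of line
                    i += 1
                continue
            elif text[i:i + 2] == "/*":
                j = text.find("*/", i + 2)        # block comment: skip past */
                i = n if j < 0 else j + 2
                continue
            else:
                out.append(c)
            i += 1
        return "".join(out)

    print(json.loads(strip_json_comments('{"a": 1 /* hi */, "b": 2 // bye\n}')))

Being forgiving about missing commas is the harder half, since the parser itself then has to treat a newline as a potential separator.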
    4 points
  9. Coming from my personal experience, I still lean towards no. I had a discussion with Nancy Hanson about this and we came to the conclusion that the CLA was not a destination, but the opening of doors to learn (yes, this was alluding to the CLA Summits). Personally, I had zero experience using OOP when I got my CLA. But after my second CLA Summit, I found an application that deserved a very basic OOP architecture. The CLA Summit opened that door for me. Now I would say ~50% of what I do is OOP. There is still A LOT you can do effectively without OOP. And keep in mind that part of being a CLA is making architectures that your less experienced developers can use and understand. If they can't use OOP, then your OOP architectures will not be effective. So should it be REQUIRED? No. Highly recommended? Absolutely.
    4 points
  10. I have made public a document detailing an old internal feature of LabVIEW that will be of great interest to those of you deploying Packed Project Libraries. Until recent conversations with a customer, I never considered that this would have much utility. The problem this solves: First, you build a packed project library (PPL) from source. Then, you write a VI that calls that PPL. It works fine. But now you load the caller VI under a different target in your project. The caller VI breaks because it tries to load the PPL, and the PPL refuses because it isn't built for the new target. Packed project libraries are compiled for one and only one specific target. How can you write ONE caller VI that will load DIFFERENT libraries depending upon the target without adding Conditional Disable structure complications? https://forums.ni.com/t5/Community-Documents/Resolution-of-Pseudopaths-in-LabVIEW-Per-Target-Invocation-of/ta-p/4087124
    4 points
  11. Yes; I assume you don't use many new features of LabVIEW from the last 10 years if you still develop in LabVIEW 2009.
    3 points
  12. NI didn't say they would be porting NXG features to 2021, but to future versions of LabVIEW. Technically, a promise to put them in 2021 would have been unfulfillable, since at the time the NXG demise was announced, LabVIEW 2021 was basically in a state where anything to be included had to be more or less fully finished and tested. A release of a product like LabVIEW is not like your typical LabVIEW project, where you might make last-minute changes to the program while testing your application at the customer's site. For a software package like LabVIEW, there is a complete code freeze except for critical bug fixes; then there is a testing, packaging and testing-again cycle for the beta release, which typically takes a month or two alone; then the beta phase of about 3 to 4 months; and finally the release. So about 6 months before the projected release date, anything that is not considered ready for prime time is simply not included in the product, or is sometimes hidden behind an undocumented INI file setting. Considering that, the expectation to see any significant NXG features in LabVIEW 2021 was simply naive and irrational.

I agree with you that LabVIEW is a unique programming environment that has some features that are simply unmatched by anything else. And there are areas where its age is clearly showing, such as the lack of proper Unicode support and, related to that, the lack of support for long path names. Personally, I feel I could tackle the lower-level part of full Unicode support in LabVIEW, including full Unicode path support, quite easily if I were part of the development team, but I have to admit that the higher-level integration into front panels and various interfaces is a very daunting task that I have no idea how I would solve. Still, reworking the lower-level string and path management in LabVIEW to fully support Unicode would be a first and fundamental step toward the later task of making this available to the UI. This low-level manager can exist in LabVIEW even if the UI and higher-level parts don't yet make use of it. The opposite is not possible. That is just one of many things that need serious investment to make the whole LabVIEW platform viable again for further development into the future.

This example also shows that some of the work needed to port NXG features back to LabVIEW first requires significant effort that will not be immediately visible in a new LabVIEW version. While a change as described above is definitely possible within a few months, the whole task of making all of LabVIEW fully Unicode-capable without breaking fundamental backwards compatibility is definitely something that will take more than one LabVIEW version to fully materialize. There are a few lower-hanging fruits that could help prepare for it and should have been done years ago, but they were discarded as "already fixed in NXG"; the full functionality just for Unicode support is going to be a herculean task to pull off without going down the NXG path of reinventing LabVIEW from scratch (which eventually proved to be an unreachable goal).

My personal feelings about the future of LabVIEW are mixed. Not so much because LabVIEW couldn't have a future, but because of the path NI as a company is heading down. They have been changing considerably over the last few years, from an engineering-driven to a management-driven company. While in the past engineers had some real say in what NI was going to do, nowadays it's mostly managers who see Excel charts, sales numbers and the stock price as the main decision-making inputs for NI. Anything else is subordinated to the bigger picture of a guaranteed minimum yearly growth percentage and stock price. The traditional test & measurement market NI has served for much of its existence cannot support those growth numbers anymore, so they are making heavy inroads into different markets and seem to consider the traditional T&M market by now a legacy rather than a significant contributor to their bottom line.
    3 points
  13. Working on the next JSONtext functionality: features to improve support for JSON config files. See https://forums.ni.com/t5/JDP-Science-Tools/BETA-version-of-JSONtext-1-6/td-p/4146235
    3 points
  14. VISA is ok but not everyone uses it so...
    3 points
  15. I've encountered a black imaq image display in exes, solved by unchecking the box to allow running in a later runtime version. Don't know if that is related to your problem.
    3 points
  16. I've been toying with the idea of implementing a new TOML library for LabVIEW. I've been using OpenG Variant Config for years, but I would prefer to use a more standardized format for my INI config files, and TOML is the best candidate. Erdosmiller's library is pretty good, but as the author points out, it is no longer maintained, and it didn't gracefully handle all of the datatypes that I wanted to use. It would be great to have the flexibility of JSONtext but for the TOML format. I'll post back here if I manage to get the project off the ground.
    3 points
  17. So this was posted on the NI forums: https://forums.ni.com/t5/LabVIEW/Our-Commitment-to-LabVIEW-as-we-Expand-our-Software-Portfolio/td-p/4101878?profile.language=en
    3 points
  18. Update: I used the dll call from the link @dadreamer provided, and made a Messenger-Library "actor" that I can use for debugging. Already found a couple of bugs with it.
    3 points
  19. Shaddap-a you face!
    3 points
  20. I agree with all your points. Definitely on making LabVIEW free for all purposes, if not even open source. NI may hang on to the mega-customers for a while with its current business model, but eventually it'll get marked as legacy software and slowly replaced by younger people with newer ideas and experience in different, more accessible languages. The idea that a company can sell a programming language these days is ridiculous when there are so many free alternatives. I am not counting the community edition; it needs to be free for any purpose.
    3 points
  21. I don't really expect many new language features or UX improvements in LabVIEW just because they stopped working on NXG. From what we know, there are only a few knowledgeable people at NI who are intimately familiar with the codebase and some of the intricate details which fundamentally drive LabVIEW. There are also many customers who rely on that technology for their own business. Because of that, NI can't just throw more developers at it and change LabVIEW fundamentally unless they find a way to stay compatible, or take a bold step and make breaking changes (which are inevitable in my opinion). LabVIEW will probably stay what it is today and only receive (arguably exciting) new features that NI will leverage from the NXG codebase to drive their business.

Unfortunately NI hasn't explained their long-term strategy (I'll assume for now that they are still debating it). In particular, what LabVIEW/G will be in the future. Will it be community-driven? Will it be a language that anyone can use to do anything? Will it be the means to drive hardware sales for NI and partners? Will it be a separate product altogether, independent of NI hardware and technology? There are also a lot of technology-related topics they need to address:
Does LabVIEW Support Unicode? - National Instruments
Comparing Two VIs in LabVIEW - National Instruments (ni.com)
Error 1316 While Using .NET Methods in LabVIEW - National Instruments (ni.com)
Using NULL Values or Pointers in LabVIEW - National Instruments (ni.com)
Not to forget UX. The list is endless and entirely different for any one of us. If and when these will be addressed is unknown. Don't get me wrong, I'm very excited and enthusiastic about LabVIEW and what we can do with it. My applications are driven by technology that other programming languages simply can't compete with. Scalability is through the roof. Need to write some data to a text file? Sure, no problem. Drive the next space rocket, land rover, turbine engine, etc.? Here is your VI. The clarity of code is exceptional (unless you favor spaghetti). The only problem I have with it is the fact that it is tied to a company that wants to drive hardware sales.
    3 points
  22. The first time you mentioned this I thought it was a nice gesture, now I think you are just desperate for friends...or an alcoholic. I'm down.
    3 points
  23. I want to remind once again that all this information is just for having fun playing with LabVIEW and is not intended for use in real projects. I believe you all understand that. 🙂 Not that big an opening here, and maybe not an opening at all for some, but I found this interesting enough to make a thread.

As you may already know, when some library is called using a CLF Node, LabVIEW enters ExtFuncWrapper first to do some guard checks to prevent itself from a silent crash and to output an error message to the user instead. I had always considered that wrapper boring enough to study, thus never looked inside. But when I once again found that I couldn't call some function through a CLFN and had to write my own wrapper library, I asked myself: why can't we call some code by its pointer, as in almost any well-known text language? So I decided to find out how ExtFuncWrapper calls the function. It turned out that ExtFuncWrapper receives the function's pointer (along with the parameters struct pointer) and later calls that pointer as it is (i.e., without doing any manipulations with it). So we can use it to call any function, and even any code chunk, directly from the diagram! After further research I found ExtFuncWrapper not very convenient to use, because to bypass some checks the parameters struct has to be prepared accordingly before the call. But there are many ExtFunc... wrappers in labview.exe, and ExtFuncCBWrapper is much easier to use. It has the following prototype:

    int32_t ExtFuncCBWrapper(uintptr_t CodeChunk, int32_t UseTLS, void *CodeParams);

Here CodeChunk is our func/code pointer, UseTLS is 0 (as we don't use LabVIEW's thread-local storage) and CodeParams is our parameters struct. When called, ExtFuncCBWrapper runs that CodeChunk, passing CodeParams to it, so we can freely use it to do what we want. Nuff said, here are the samples.

This one increments a Numeric: ExtFuncCBWrapper-Increment.vi. As you can see, I'm passing a Numeric value as the CodeParams pointer into ExtFuncCBWrapper, and in the assembly I have to pull that pointer out to deal with my parameters. I'm not that excellent with asm, so I used one of the many online x86 compilers-decompilers out there. It's even simpler in the 64-bit IDE, as the first parameter is already written into RCX.

Okay, here goes a more advanced example - it calculates the sum of two input Numerics: ExtFuncCBWrapper-SumOfTwo.vi. Here I'm passing a cluster of three parameters as the CodeParams pointer (two Numerics and the resulting Sum), and in the asm I'm grabbing the first parameter, adding it to the second one and writing the result into the third one. Pretty simple operations.

Now let's do some really wild asm on the diagram! 😉 The last example calls the MessageBox function from user32.dll: ExtFuncCBWrapper-MsgBox.vi. This is what Rolf calls diagram voodoo. 😃 I have to provide 4 parameters to MessageBox, 2 of which are string pointers. Thus I'm getting these pointers and writing them into my params cluster (along with the panel handle and the dialog type). When ExtFuncCBWrapper is called, in the asm code I have to use the prologue and epilogue pieces to prevent stack corruption, as I'm later pushing 4 parameters onto the stack to provide them to MessageBox. After the call I'm writing the result into the function return parameter. In the 64-bit IDE the prologue/epilogue is somewhat simpler.

Maybe you already noticed that I'm doing VirtualProtect before calling ExtFuncCBWrapper. This is done to get past Windows Data Execution Prevention (DEP).
I'm setting execute and read/write access for the memory page with my code; otherwise LabVIEW refuses to run my code and throws an exception. Surprisingly, the exception is thrown only in the 64-bit IDE; in 32-bit LV I can just wire the U8 array to ExtFuncCBWrapper without going through the DSNewPtr-MoveBlock-VirtualProtect-DSDisposePtr chain. I didn't try to figure out this behaviour. Well, to be honest, I doubt that anyone will find these samples really useful for their tasks, because these are very low-level operations and it's much easier to use a common CLFN or a helper DLL. They are here just to show that the things described are definitely doable from an ordinary diagram, without writing any libraries. With good asm skills it's even possible to implement callback functions or call some exotic functions (like C++ class methods). Some things could be improved as well, e.g. embedding a generic assembly compiler to make it possible to write the code instead of raw bytes on the diagram. Ideally it's even possible to implement an Inline Assembly Node. Unfortunately, I have neither the time nor much desire to do it myself.
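For comparison with the textual languages the post alludes to, here is a hedged Python ctypes sketch of the same core idea -- rebuilding a callable from a bare code pointer. Illustrative only (this is not the LabVIEW mechanism); it borrows the address of C's abs() so the example stays self-contained:

    import ctypes
    import ctypes.util

    # Load the C runtime (find_library("c") returns None on Windows; fall back to msvcrt).
    libc = ctypes.CDLL(ctypes.util.find_library("c") or "msvcrt")

    # Pretend all we have is a raw code pointer, like CodeChunk in ExtFuncCBWrapper.
    addr = ctypes.cast(libc.abs, ctypes.c_void_p).value

    # Rebuild a callable from the bare address: int abs(int)
    proto = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_int)
    call_by_pointer = proto(addr)
    print(call_by_pointer(-42))   # prints 42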
    3 points
  24. After starting NXG 5.0.0 and the traditional hang of the whole OS for 3 minutes, LabVIEW forgot how to write text on the screen. OK, not only LabVIEW, other apps too... There used to be a name for software that caused weird system behaviour: a virus...
    3 points
  25. Here are my points:
- By default it should list about 15 to 20 of the most recently updated packages. That way even new packages get promoted and it doesn't feel "static".
- I want to select between a few high-level categories (i.e. Frameworks, Utilities, Drivers).
- I want to specify the version of LV that the package should support (it should take older package versions into account).
- Each package should provide some key facts: name, version, author, summary, picture, rating, price, download count, download button.
- I want to open the details page if I find a package that I might like.
- I want to scroll the page, not a frame inside the page.
In my opinion there is no "right" way to browse for packages, so it should rather provide options that users are willing to use, and it should make some educated guesses on what default settings and key facts are important (maybe do UATs?). Since there are already a few links to our favorite pages, here is one of mine. It is for a game, but I think the "card" design could work for GCentral as well: https://mods.factorio.com/
    3 points
  26. on that note, stumbled upon this last night...
    3 points
  27. View File: Y Controls Support - Version 2.0.2.0 Installer.zip
Installs support for Y Controls. Requires LabVIEW 2019 or later.
Submitter: paul_cardinale
Submitted: 05/19/2021
Category: General
LabVIEW Version: Not Applicable
License Type: BSD (Most common)
    2 points
  28. LINX is now supported for commercial applications starting with 2020, BTW. Your opinion is valid, and you have reasons for it, but I think it might be a bit of a forest-from-the-trees situation here. LabVIEW tends to have one or two major bullet points of new features with each release, along with many smaller improvements that are less noteworthy. Some of these aren't very applicable to me and I don't see the benefit of the update, but I can still recognize that a bunch of effort was put into making it into the release, and it makes me think NI isn't sitting idle. I know I made a list like this in the past when a similar topic came up, but I'm having a hard time finding it.
2012 - Loop tunnel improvements with concatenating, conditional, indexing, and last value / Project Templates
2013 - Improved Web Services / WebDAV / SMTP / Bookmark Manager
2014 - Actor Framework (I might be off by a version or two) / 64-bit Mac and Linux support
2015 - Custom Right-Click Framework
2016 - Channel Wires
2017 - VIMs / Forward-compatible runtime
2018 - Command Line Interface / Python integration / Type-specialized structures for improved VIMs
2019 - Sets and Maps
2020 - Interfaces for classes / Free Community Edition with Application Builder
And here are a few of my favorite features that I can't remember what version they were added in: the Error Ring; improved VI calling under Application Control for starting asynchronous processes and static VI references; DVRs; conditional disables based on environment or project variables; the Linux Real-Time operating system, allowing 3rd-party and open-source tools to be installed and called with System Exec, plus an embedded HMI; User Events; the LINX toolkit for running LabVIEW VIs natively on a Raspberry Pi, or controlling an Arduino connected to the host; Quick Drop's plugin system allowing for all kinds of tools; filtering on search results; improved performance of tree and listbox controls; NIPM; and loads more scripting functions, with more added in each version. I sure hope LabVIEW has a future, because I've hitched my career to it. But even if NI closed its doors tomorrow, I feel like I'd still be using it for another 10 years or so, or until it didn't run on any supported version of Windows. But I feel your concern, and if I were a junior engineer starting out, I would feel safer in another language.
    2 points
  29. @The Q started such a thing on the LabVIEW Wiki: https://labviewwiki.org/wiki/Text-Based_terminology
    2 points
  30. Boss: There's no "i" in team. ShaunR: There's no "u" either but there is "me".
    2 points
  31. I've fixed quite a few bugs and changed the way timestamps are handled, now using JDP Science Common VIs for RFC 3339. You can follow on GitHub.
    2 points
  32. Here's my tuppence. Get an indication from your IT dept of when they expect the issue to be resolved, then get your project manager to bill the time against IT's budget. Add to your project plan the time they indicated, and add a note that the project delivery date will be delayed by at least that amount of time. Keep doing that if/when their indicated time expires, until they resolve the issue. If you have weekly project meetings, make sure it is in the minutes, and get an action on IT to resolve the issue and require status updates from IT so it never drops off the agenda. Ensure the IT issue is elevated to a high project-risk category. Then: download the VISA executable installer from "http://download.ni.com/evaluation/labview/ekit/other/downloader/" and tell your project manager you *may* have a temporary, sub-optimal solution which you are investigating for this particular package, but that you fully expect continuing issues that may not have similar solutions even if you are successful this time, which is not guaranteed.
    2 points
  33. I recently ran into an interesting problem: some calculations I was doing with parallelized loops were taking an inordinate amount of time (and consuming 100% of my i9 cores). Turning to profiling to figure out where I might be bugging out or able to find some optimization, I realized that the most active subVI was this: Error Cluster From Error Code.vi. There is an interesting discussion elsewhere about why this VI is a nuisance (even in its modernized version), which is compounded by the facts that:
- it is randomly used by NI in its code (some error codes are never converted, let alone passed, so good luck figuring out why your code fails)
- there is no particular discipline (from NI) on how it is used (for instance, random error codes (aka 1) are plumped on the diagram and connected to said ECfEC VI)
- it is used in locked VIs (super secret ugly code, presumably)
For kicks, I zapped it and replaced it with a simple version of mine everywhere I could (that is, except in the locked VIs) and reran my calculation. Same symptoms. The locked VIs were for sure not the problem, as they were not called during the calculation, so I had to find out where this VI was called from, and I narrowed it down to one caller. I opened up the diagram... and did not find it there. However, I had two Error Ring "constants" on the diagram which, as you probably know or have figured out by now (I didn't), call ECfEC.vi. One of the Error Rings, O Irony!, was a "no error" Error Ring "constant" (no comment). Therefore, merely running that subVI (which was supposed to be quasi-instantaneous) was launching LabVIEW into the ECfEC.vi maze and hogging my computer. I have now removed the incriminating Error Rings and moved on, but I thought this potential issue should be better advertised. My 2 cts.
    2 points
  34. I am just starting on trying to use Python code from a LabVIEW application (mostly for some image analysis stuff). This is for a large project where some programmers are more comfortable developing in Python than LabVIEW. I have not done any Python before, and there seems to be a bewildering array of options: many IDEs, libraries, and Python-LabVIEW connectors. So I was wondering if people who have been using Python with LabVIEW can give their experiences and describe what set of technologies they use.
    2 points
  35. 1. What type of source control software are you using? Git on GitLab, with Git Fork or Git Tower.
2. Do you love it, or hate it? LOVE it!!
3. Are you forced to use this source control because it's the method used in your company, or would you rather use something else? We get to choose, and Git is the SCC system of our choice.
4. Pros and cons of the source control you are using? Pros: decentralised, very popular (i.e. many teams use it), very flexible, very fast, feature branches, tagging, ... Cons: complicated to set up if using SSH; exotic use cases/situations can be very hard to solve and sometimes require turning to the command line.
5. Just how often does your source control software screw up and cause you major pain? Never. It's always me (or another user) doing something wrong. For our internal team of 5, it never happens. For our customers, the occasional problem occurs.
As mentioned above, we use Git Fork and/or Git Tower as our graphical UIs to Git. IMHO, using a graphical client is so superior to working with Git on the command line (but I'm posting to a LabVIEW forum, so who am I telling this?). In order to avoid having to merge VIs (we actually do not allow that in our internal way of working), we use the gitflow workflow. It helps us with planning who's working on which parts of an application (repo). I honestly can count the number of unexpected merge conflicts in the last few years on one hand. On top of Git, many management systems like GitHub, GitLab, Bitbucket etc. offer lots of functionality like merge requests, integration with issue tracking, and other project-management-type features.
    2 points
  36. Side note: I am definitely going with the native node in LabVIEW 2020. I think NI is underselling it by providing examples that are far too simple (including no examples of cluster-to-tuple or how to hold state data in your py module). I prototyped my analysis-template py module yesterday and it went easily. No head-banging frustrations at all.
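For anyone else hitting those same two gaps, here is a hedged sketch of what such a py module can look like; the function names and tuple layout are my own assumptions for illustration, not NI example code. The assumed mapping is cluster <-> tuple and 1-D DBL array <-> list, with module-level variables holding state between calls within the same Python session:

    # analysis.py - sketch of a stateful module for the LabVIEW Python Node.
    _state = {"baseline": 0.0, "window": 1, "calls": 0}   # persists between calls

    def configure(settings):
        """'settings' is a LabVIEW cluster, arriving here as a tuple."""
        baseline, window = settings            # cluster-to-tuple unpacking
        _state["baseline"] = float(baseline)
        _state["window"] = int(window)

    def analyze(samples):
        """'samples' is a LabVIEW 1-D array; the returned tuple maps back to a cluster."""
        _state["calls"] += 1
        data = [x - _state["baseline"] for x in samples]
        return (sum(data) / len(data), max(data), _state["calls"])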
    2 points
  37. I checked with LabVIEW R&D; they said there is no way to determine this information in G code.
    2 points
  38. Adding to crossrulz's suggestion: if you do want simple I2C or SPI, using an Arduino is a great solution in LabVIEW. NI has their LINX toolkit, which downloads a known set of firmware to the Arduino, and then in LabVIEW you have the LINX palette, which basically tells the microcontroller to execute some set of steps. These commands are just serial commands, and when plugged into USB the device should appear as a normal VISA device in LabVIEW. I don't have a Pharlap system to test with, but other remote devices I've used just get enumerated when they are plugged in. I personally have not used Pharlap much and am unaware of the difficulties getting that hardware working on it.
    2 points
  39. You got it right. "Delete branch" will delete the branch on your fork. It does not affect the clone on your computer. The idea is that every pull request has its own branch, which, once merged into master, can safely be deleted.

This can indeed be confusing if you are used to centralized VCSs. In Git, any repository can be remote. When you clone a Git repository, the source becomes remote to the clone. It doesn't matter if the remote is on your computer or on another server. You can even have multiple remote repositories if you wanted to. You'll notice that the clone - by default - only includes the master branch. Git allows you to pull other branches if you want, but that is not mandatory. Likewise, you can have as many branches of your own without having to push them to the remote (sometimes you can't because they are read-only).

On GitHub, when you fork a project, the original project becomes remote to your fork (you could even fork a fork if you wanted to...). When you clone the fork, the fork becomes remote to your clone. When you add a branch, you can push it to your fork (because you have write access). Then you can go to GitHub and open a pull request to ask the maintainer(s) of the original project to merge your changes (because you don't have write access). Once merged, you can delete the branch from your fork, because the changes are now part of master in the original project (there is no reason to keep it).

Notice that the master branch on your fork is now behind master of the original project (because your branch got merged). Notice also that this doesn't affect your local clone (you have to delete the branch manually). You can now update your fork on GitHub, pull from your fork, and finally delete the local branch (Git will warn you about deleting branches that have not been merged into master).

There is a page which describes the general GitHub workflow: Understanding the GitHub flow · GitHub Guides. Hope that helps.
    2 points
  40. That's not very American. Where's the guns?
    2 points
  41. To be honest, I would probably not put them anywhere in that view. It’s called Class View for a reason. 😀 It hadn’t really occurred to me that you would want to have the non-class VIs visible in there. Is that a flaw or just out of the box thinking?
    2 points
  42. The other day, I wrote up a lengthy response to a thread about NXG on the LabVIEW Champions Forum. Fortunately, my computer blue-screened before I could post it--I kind of had the feeling that I was saying too much and it was turning into a "drunk history of NXG" story. Buy me some drinks at the next in-person NIWeek/CLA/GLA Summit/GDevCon, and I'll tell you what I really think! First, I'll say that abandoning NXG was a brave move and I laud NI for making it. I'm biased; I've been advocating this position for many years, well before I left NI in 2014. I called it "The Brian Powell Plan". I'm hopeful, but realistic, about LabVIEW's future. I think it will take a year or more for NI to figure out how to unify the R&D worlds of CurrentGen and NXG--how to modify teams, product planning, code fragments, and everything else. I believe the CurrentGen team was successful because it was small and people left them alone (for the most part). Will the "new world without NXG" return us to the days of a giant software team where everyone inside NI has an opinion about every feature and how it's on the hook for creating 20-40% revenue growth? I sure hope not. That's what I think will take some time for NI to figure out. Will the best of NXG show up easily in CurrentGen? No! But I think the CurrentGen LabVIEW R&D team might finally get the resources to improve some of the big architectural challenges inside the codebase. Also, NI has enormously better product planning capability than they did when I left. I am optimistic about LabVIEW's future.
    2 points
  43. The future is Python for many of the applications. It is easy to get into for newcomers to programming, works great, has a strong package-management system and a large community, and can be applied to virtually any OS you can think of; it can even be used to program GPUs if you are so inclined.

The huge advantage of using G and LabVIEW is that, paired with NI hardware, in the hands of someone skilled with LV you can bang out a solid prototype of a product or a test-and-measurement system so fast it will make people's heads spin. NI hardware is absolutely top notch for high-end use cases, rapid prototyping, complex one-offs, scientific use, or the complicated end-of-line test space. However, in IoT there is strong competition for the NI SB RIO lineup: for an SB RIO you will pay 1500 US. There are so many cheap, programmable and capable pieces of hardware, such as the Jetson Nano (ARM7 + NVidia GPU for vision), the Raspberry Pi (ARM7 1.4 GHz with 8GB RAM) or even the Asus Tinker Board... which will serve so many purposes, have onboard GPIO, and can be purchased for 50-60 bucks... that in that space Python paired with Linux knowledge is really making headway. And if you want to go with Zynq from Xilinx, you can get a board with an FPGA for ~300 bucks, which is basically the same hardware as an SB RIO; all you have to do is use different software tools. If NI would consider unlocking the ability to use NI FPGA tools to deploy to non-NI hardware... I think this is where G absolutely would take off like wildfire and be used on millions of IoT devices everywhere in the world that are powered by an ARM7 + FPGA module... but as it stands now, if you use NI FPGA you must deploy on a target you've purchased from NI.

I'll really be a stickler here: if we're talking about the programming language, we should talk of G; LabVIEW is the IDE. Never say never, but I am not aware of any other graphical programming language which could be used for general-purpose programming and is as complete as G. If you have come across something interesting I would like to look at it, but I feel that it would be at best an academic project, which I would not use in production code.
    2 points
  44. We modify code while LV is running all the time 🙂 We even have a Quick Drop shortcut that toggles a Sub-VI's Loading Option between "Load with callers" and "Reload for each call".
    2 points
  45. Welcome to Lava! There is no way to intercept WaitForNextEvent, other than generating the event it is waiting for or the timeout occurring. It is, however, possible to handle the event directly in LabVIEW. Are you familiar with .NET Event Callbacks? Use the Register Event Callback function to execute a callback VI to generate a user event for an Event Structure. The event structure can then take care of the timeout as well as other events that may end execution prematurely. Here is an example. For simplicity reasons I didn't include error handling and the callback VI, but you should be able to work the rest out from here. Disclaimer: I haven't tested this, but I'm certain it will do what you want.
    2 points
  46. I'm kind of unsure whether this could be accomplished with a common File Dialog or an underlying yellow File Dialog and its ExtFileDialog internal function. But you could switch to .NET and use some third party libraries available. One of those is BetterFolderBrowser proposed here. I have just tested it on both 32- and 64-bit LabVIEW 2019/2020 and it works great. Here's the basic example: Untitled 1.vi
    2 points
  47. I don't think that would work well for GitHub and similar applications. FINALE seems to work (I haven't tried it yet) by running a LabVIEW VI (in LabVIEW, obviously) to generate HTML documentation from a VI or VI hierarchy. This is then massaged into a format that their special web app can handle to view the different elements. GitHub certainly isn't going to install LabVIEW on their servers in order to run this tool. They might (very unlikely) use the Python code created by Mefistoles and published here to try to scan project repositories to see if they contain LabVIEW files. But why would they bother with that? It's not like LabVIEW is ever going to be the next big hype in programming. What happened with Python is pretty unique and definitely not going to happen for LabVIEW. There never was a chance for that, and NI's stance of not seeking international standardization didn't help, but I doubt they could have garnered enough widespread support for this even if they had seriously wanted to. Not even if they had gotten HP to join the bandwagon, which would have been as likely as the devil going to church 😀.
    2 points
  48. I have the same board. This is what I did:
1. Download the Raspberry Pi Imager v1.2 and use that to format a microSD card for the Raspi. Select the first recommended OS: Raspbian.
2. Boot up the Pi with keyboard and mouse. Walk through the startup config (installing updates, etc.) and wifi setup. When asked to enter a new password, ignore this and just click next. Reboot as suggested.
3. Go to Raspberry Pi Configuration and, on the Interfaces tab, enable SSH.
4. Open a command prompt on the Raspi and type: sudo raspi-config
5. Select 7: Advanced.
6. Select A1: Expand file system (this will expand the file system if it's not already expanded).
7. Reboot.
8. In LabVIEW, select Tools > MakerHub > LINX > LINX Target Configuration.
9. Click the connection button and it should connect. Hostname: raspberrypi, username: pi, password: raspberrypi. These are all the defaults.
10. Click the Installation button.
11. Click the Update button on the installation page. It should go through the process of doing the update. At some point the Raspi will reboot; this is part of the process.
12. When the Raspi reboots, the LINX Target Configuration dialog will lose connection and give an error. This is normal; it will take a while to reconnect. Eventually it should come back. If not, click the Connection button and try to connect. The Installation panel should now show the installed version.
13. Click on Launch Example.
14. In LabVIEW, right-click on the Raspberry Pi target and select Connect. This should show the deployment progress dialog, and after connection a small green indicator will appear in the target tree.
15. You should be able to execute (run) the VI now.
Everything should be good to go now. Sometimes you will try to connect in the project and get an error. If that happens, just wait and try again. I find that the connection is more reliable if you use the IP address of the Raspi instead of the DNS name. To specify an IP address, right-click on the Raspberry Pi target and select Disconnect. Then right-click again and select Properties. In General, enter the IP address of the Raspi, then click OK. To find the IP address of the Raspi, type: hostname -I in a Raspi command prompt. I think the reason the log message states Raspberry Pi 2 B is that the LINX toolkit is old and that message was probably not updated to handle all the new boards that have come out since release? Not sure. I'm getting the same message on my system even though the board is a Pi 3.
    2 points

