
Leaderboard

Popular Content

Showing content with the highest reputation since 11/27/2020 in Posts

  1. Dear Santa NI. I am now in my 40s with youngish kids, so despite the fact that all I got for Christmas this year was a Pickle Rick cushion I am not actually complaining. However, I would like to get my order in to the Elves as early as possible. This is my wishlist, in no particular order. I expect this list will not be to everyone's taste; that is OK, this is just my opinion.
     - Make LabVIEW free forever. The war is over, Python has won. If you want to be relevant in 5 to 10 years you need to embrace this. The community edition is a great start but it is probably not enough. Note: I accept it might be necessary to charge for FPGA stuff where I presume you license Xilinx tools. NI is and has always been a hardware company.
     - Make all toolkits free. See the above point.
     - Remove all third-party licensing stuff. Nobody makes any money from this anyway. Encourage completely open sharing of code and lead by example.
     - Take all the software engineering knowledge gained during the NXG experiment and start a deep refactor of the current-gen IDE. Small changes here please though... we should not have to wait 10 years.
     - Listen to the feedback of your most passionate users during this refactor. NXG failed because you ignored us and just assumed we would consume whatever was placed in front of us. I am talking about the people like those reading this post on Christmas Day in their spare time because they are so deeply committed to LabVIEW.
     - My eyes are not what they used to be, so please bring in the NXG-style vector graphics support so I can adjust the zoom of my block diagram and front panel to suit.
     - As part of the deep refactor, the run-time GUI needs to be modernised. We need proper support for resizable GUIs that react sensibly to high-DPI environments.
     - Bring the best bits of NXG over to current gen. For example the dockable properties pane. (Sorry, not much else comes to mind.)
     - Remove support for Linux and Mac and start to prune this cross-compatibility from the codebase. I know this is going to get me flamed for eternity by 0.1% of the users. (You pretty much made this decision for NXG already.) Windows 10 is a great OS and has won the war here.
     - Get rid of the 32-bit version and make RT 64-bit compatible. You are a decade overdue here.
     - Add Unicode support. I have only needed this a few times, but it is mandatory for a multicultural language in 2021 and going forward.
     - Port the Web Module to current gen. All the news I have heard is that the Web Module is going to become a standalone product. Please bring this into current gen. This has so much potential.
     - Stop adding features for a few years. Spend the engineering effort polishing. Fix the random weirdness we get when deploying to RT.
     - Open source as many toolkits as you can. Move the Vision toolkit over to OpenCV and make it open source.
     - Sell your hardware a bit cheaper. We love your hardware and the integration with LabVIEW, but when you are a big multiple more expensive than a competitor it is very hard to justify the cost.
     - Allow people to source NI hardware through whatever channel makes most sense to them. Currently the rules on hardware purchasing across regions are ridiculous.
     - Bring ni.com into the 21st century. The website is a dinosaur and makes me sad whenever I have to use it.
     - Re-engage with universities to inspire the next generation of young engineers and makers. This will be much easier if the price is zero.
     - Re-engage with the community of your most passionate supporters. Lately it has felt like there is a black hole when communicating with you.
     - "Engineer ambitiously"? What does this even mean? The people using your products are doing their best, please don't patronise us with this slogan.
     - Take the hard work done in NXG and make VIs into a non-binary, human-readable format so that we can diff and merge with our choice of SCC tools.
     - Remove all hurdles to hand-editing of these files (no more pointless hashes for "protection" of .lvlibs and VIs etc).
     - Openly publish the file formats to allow advanced users to make toolkits. We have some seriously skilled users here who already know how to edit the binary versions! Embrace this, it can only help you.
     - Introduce some kind of virtualenv a la Python, i.e. allow libraries and toolkits to be installed on a per-project basis. (I think this is something JKI are investigating with their new Project Dragon thing.)
     - For the love of all that is holy, do not integrate Git deeply into LabVIEW. Nobody wants to be locked into someone else's choice of SCC. (That said, I do think everyone should use Git anyway; this is another war that has been won.)
     That is about it for now. All I want is for you guys to succeed so my career of nearly 20 years does not need to be flushed down the toilet like 2020. Love you, Neil (Edited: added a few more bullets)
    14 points
  2. I did not know. That possibility was not even on my radar. Even though the drumbeat of bad news had been going for a while, most corporations refuse to change direction on a bad decision. NI showed more sentience than I usually expect from massed humans: the sunk cost fallacy is a trap that is very hard to get out of. I figured the very good engineers on NXG would either surge through it and make it fly or we would bankrupt the company trying. That's the pattern established by plenty of other companies.
Mixed. I spent 4.5 years directly working on NXG (2011 to 2016) and countless hours in later years working with the NXG team to design a future G. I really wanted it to fly. There is so much good in that IDE, including some amazing things that I just don't see how we ever do in the LabVIEW codebase without just shattering compatibility. But at the same time, I was watching good friends toil on something that the market just wasn't adopting. The software had some problems that were going to take a long time to solve. The issues were all solvable, but the time needed to fix them... that was harder and harder to justify.
NXG gave us a GREAT platform for other software: Veristand, FlexLogger, etc. That code is extremely modular and can be repurposed for all sorts of tools. We also learned a heck of a lot by building NXG -- some things that I thought we could never do in LabVIEW now seem possible. NXG gave us a sandbox to learn a whole lot about modern software engineering without putting the delivery schedule for mature software at risk, and those practices [have been|are being] brought back and applied to LabVIEW -- that will decrease cost of maintaining older code. All in all, NXG was valuable -- the expenditure was not a complete loss.
I am very sorry to the few customers who did migrate to NXG. We don't have a reverse migration tool, and building one would be absurdly expensive. Leaving those folks stranded is going to hurt -- I hate letting our customers down and just saying, "We have no solution to help you." There aren't many of those folks (that's essentially the problem), but they do exist, and they are basically stuck having to rewrite their NXG apps in some other tool. I can only hope that they pick LabVIEW.
I don't know if this will help us or hurt us with customers in the future... on one hand, people may say, "Well, you let us down on NXG, why should we trust you will be there for us on any new products?" On the other hand, this decision was clearly made listening to customer feedback, and it takes a lot of humility to swallow a loss that big, which may make customers trust our judgement more in the future. And, really, there's nothing to compare with the scale of NXG -- an entire computing platform -- so this does seem like something that needs to be judged in isolation.
I really like programming in G. I like being able to expand G to make it more powerful. I wanted NXG to succeed because it had the potential to be a better G. It failed. Its failure means more resources for the existing LabVIEW platform, which will directly help our customers in the short run. It leaves open some big questions for the long run.
So, in summary: I think it was a decision that had to be made, and I'm happy to work for a company that can learn from new data, then admit a mistake, and then figure out how to correct it.
    12 points
  3. Two questions about this: 1. Does something like this already exist? 2. Is this something that could be useful? Every once in a while I need dynamic UI components that can be generated at runtime. One nice thing to use for this is a picture control; however, it doesn't lend itself as well to keeping other pieces of functionality such as mouse-click events. I put together a mini library of UI functions for this that has the ability to be extended. The UI can be generated dynamically at runtime and can be anything you can draw in a picture, using standard layout techniques that you might find in other GUI libraries. The hierarchy generation can always be simplified by using some type of templating string. Example1.vi Front Panel on Pgui2.lvproj_My Computer _ 2021-07-02 14-03-54.mp4
    6 points
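For readers unfamiliar with box layouts, here is a minimal sketch in Python (illustrative only; the post's library is written in LabVIEW G, and the widget names and weights here are invented) of the kind of layout pass such a picture-control GUI toolkit performs: a parent rectangle is split among child widgets according to relative weights, and the resulting rectangles are what you would then draw into the picture control.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Widget:
        name: str
        weight: int = 1                    # relative share of the parent's height
        children: List["Widget"] = field(default_factory=list)

    def layout_vbox(widget, left, top, width, height, out):
        # Recursively assign a rectangle to every widget (vertical box layout).
        out[widget.name] = (left, top, width, height)
        total = sum(c.weight for c in widget.children) or 1
        y = top
        for child in widget.children:
            h = height * child.weight // total
            layout_vbox(child, left, y, width, h, out)
            y += h
        return out

    root = Widget("panel", children=[Widget("graph", weight=3), Widget("buttons", weight=1)])
    print(layout_vbox(root, 0, 0, 400, 300, {}))
    # {'panel': (0, 0, 400, 300), 'graph': (0, 0, 400, 225), 'buttons': (0, 225, 400, 75)}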
  4. So I managed to find the underlying issue and at least one solution to it - sharing the information here. The Icon Editor enumerates the font list in Linux with the command 'fc-list'. With the OpenSUSE 43.2 / LV 2016 combo the font list looks like this: The listed fonts are essentially a list of the font files with full paths, and that does not work well with the font tool LV uses. If this list looks similar to yours and the fonts are not looking pretty, I have a solution for you - read on. To fix this, the command 'fc-list : family' should be used instead, to come up with a list like this: There are two solutions (and I'm sure there are more) - you can decide which one to pursue depending on your expertise. As the Icon Editor in LV2016 (starting with LV2011, I think) is in a packed library for execution-optimization purposes, the Icon Editor code can't directly be altered to fix this. One can come up with a solution where the command 'fc-list' is overridden in Linux so it always uses the (proper for LabVIEW) format 'fc-list : family'. This may have some unwanted side effects if other programs use the command in a similar fashion, so this may not be the best solution, but it would be pretty easy to use for assessing whether this could be your problem too. There are multiple trivial ways of implementing this, so I won't be giving an exact solution - here is a list of them: https://lmgtfy.app/?q=override+command+in+linux Darren Nattinger has provided the source code for the Enhanced Icon Editor at https://forums.ni.com/t5/Enhanced-Icon-Editor/Icon-Editor-Source-Files-for-LabVIEW-2016/m-p/3538808 - you can replace the packed library LabVIEW uses as the editor with this source. There are easy 3-step instructions on the site - even I managed to do that. Please give kudos to Darren should you go this way. When you have the shipped Icon Editor replaced with the source, you can directly edit the file /usr/local/natinst/LabVIEW-2016/resource/plugins/NIIconEditor/Miscellaneous/Font/Linux.vi so it uses the correct form, like this:
    5 points
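As a rough illustration of what the corrected command returns (a sketch only, assuming a Linux host with fontconfig installed; this is not part of the Icon Editor code), the snippet below prints the family-name list that 'fc-list : family' produces, instead of the raw font-file paths that plain 'fc-list' gives.

    import subprocess

    # 'fc-list : family' asks fontconfig for the family element only,
    # which is the form the LabVIEW font tool can digest.
    raw = subprocess.run(["fc-list", ":", "family"],
                         capture_output=True, text=True, check=True).stdout
    families = sorted({line.split(",")[0].strip()
                       for line in raw.splitlines() if line.strip()})
    print("\n".join(families))   # e.g. DejaVu Sans, Liberation Mono, ...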
  5. Does it help to re-ask the question as "where should LabVIEW have a future?" It is not difficult to name a number of capabilities (some already stated here) that are extremely useful to anyone collecting or analyzing data that are either unique, or much simpler, using LabVIEW. They're often taken for granted and we forget how significant they are and how much power they unlock. For example (and others can add more): FPGA - much easier than any text-based FPGA programming, and so powerful to have deterministic computational access to the raw data stream Machine vision - especially combined with a card like the 1473R, though it's falling behind without CoaXPress Units - yes no-one uses them , but they can extend strict programming to validation of correct algorithm implementation Parallel and multi-threaded programming - is there any language as simple for constructing parallel code? Not to mention natural array computations Real-time programming Data-flow - a coherent way of treating data as the main object of interest, fundamental, yet a near-unique programming paradigm with many advantages and all integrated into a single programming environment where all the compilation and optimization is handled under the hood (with almost enough ability to tweak that) Unfortunately NI appear to be backing away from many of these strengths, and other features have been vastly overtaken (image processing has hardly been developed in the last 10 years, GUI design got sidetracked into NXG unfortunately). But the combination of low-level control in a very high-level language seems far too powerful and useful to have no future at all.
    5 points
  6. I've just implemented this and posted a beta: https://forums.ni.com/t5/JDP-Science-Tools/BETA-version-of-JSONtext-1-5-0/m-p/4136116 Handles comments like this:
// Supports Comments
{
  "a":1 // like this
  "b":2 /*or this
          style of multiline comment*/
  "c": 3 /*oh, and notice I'm forgetting some commas
           A new line will serve as a comma, similar to HJSON*/
  "d":4, // except I've foolishly left a trailing one at the end
}
    5 points
  7. Since Latin for six is "sex", we could have gone for "sexidecimal".
    4 points
  8. View File: BlueSerialization
Serialize and deserialize LabVIEW classes using either JSONtext or TOML
Official documentation: https://github.com/justiceb/BlueSerialization
Submitter: bjustice
Submitted: 08/05/2021
Category: General
LabVIEW Version: 2018
License Type: BSD (Most common)
    4 points
  9. Are Italian LV developers more prone to producing spaghetti code? 🤨
    4 points
  10. VISA is ok but not everyone uses it so...
    4 points
  11. I have been coming round to supporting comments in JSONtext, at least for ignoring comments on reading (which is quite simple to implement, I think). And possibly features to be more forgiving of common User error, such as missing commas.
    4 points
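For illustration, here is a minimal comment-stripping pre-pass in Python (a sketch only, not the JSONtext implementation, and it does not attempt the missing-comma forgiveness mentioned above): remove // and /* */ comments that occur outside string literals, then hand the result to an ordinary JSON parser.

    import json

    def strip_json_comments(text):
        # Walk the text once, tracking whether we are inside a string literal,
        # and drop // line comments and /* ... */ block comments outside strings.
        out, i, n, in_str = [], 0, len(text), False
        while i < n:
            c = text[i]
            if in_str:
                out.append(c)
                if c == "\\" and i + 1 < n:   # keep the escaped character, e.g. \"
                    out.append(text[i + 1])
                    i += 1
                elif c == '"':
                    in_str = False
            elif c == '"':
                in_str = True
                out.append(c)
            elif text.startswith("//", i):
                j = text.find("\n", i)
                i = n if j < 0 else j          # keep the newline itself
                continue
            elif text.startswith("/*", i):
                j = text.find("*/", i + 2)
                i = n if j < 0 else j + 2
                continue
            else:
                out.append(c)
            i += 1
        return "".join(out)

    print(json.loads(strip_json_comments('{"a":1, /*note*/ "b":2 // trailing\n}')))
    # {'a': 1, 'b': 2}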
  12. Coming from my personal experience, I still lean towards no. I had a discussion with Nancy Hanson about this and we came to the conclusion that the CLA was not a destination, but the opening of doors to learn (yes, this was alluding to the CLA Summits). Personally, I had 0 experience using OOP when I got my CLA. But after my second CLA Summit, I found an application that deserved a very basic OOP architecture. The CLA Summit opened that door for me. Now I would say ~50% of what I do is OOP. There is still A LOT you can do effectively without OOP. And keep in mind that part of being a CLA is to make architectures that your less experienced developers can use and understand. If they can't use OOP, then your OOP architectures will not be effective. So should it be REQUIRED? No. Highly recommended? Absolutely.
    4 points
  13. Try: apt install xfonts-75dpi xfonts-100dpi Then reboot.
    3 points
  14. "The software you are refactoring had one and it is too costly to change the training materials". How's that? It is likely that a great deal of pressure had to be applied to even allow a rewrite and the manager probably had a hard time arguing for a budget to do it. Getting hot under the collar over a button isn't a hill I would die on. I would concentrate of making my life easier supporting it going forward than what it looked like-which was decided a long time ago.
    3 points
  15. Shameless plug, but feedback would be great since there's a lot that's changed. A few more eyes on it would help immensely before going live. The Encryption Compendium for LabVIEW (ECL) has had a major update. It's complete and a couple of weeks away from release as the documentation is in progress, so there may be a few errors on that front. It's a commercial package, so the "gibs free" crowd will, I hope, be disappointed. So what's new?
IPv6 support (thanks Rolf for your help with that). I've been wanting this for sooooo long, but it required moving away from the NI primitives and nasty text programming. I bit the bullet and now have it, although with a lot less hair.
SFTP. Since NI released their poultry offering I thought I'd productionise the SFTP I had in my back pocket for quite a while and offer something actually useful. It comes with a full client example and supports recursive uploads, downloads and deletions (be careful with that last one). It also uses events for progress and status feedback in a similar fashion to the Websockets below. The event topology will become a common feature of protocols in future additions.
Websockets. The ECL is a better home than a separate product, since it was very difficult to distribute when trying to support TLS. Reworked from the ground up, it sits nicely in ECL and is the start of more encrypted protocol support in the future.
Oh yes, almost forgot: the ability to use the Windows Certificate Store because... why not? (And I wish I had thought of it.) If anyone would like to play before its release proper in a couple of weeks' time, I can give you a trial copy (valid for 30 days). PM me and I'll send you a link.
    3 points
  16. I can say without any doubt that online clothing stuff is good quality or not.
    3 points
  17. So this was posted on the NI forums: https://forums.ni.com/t5/LabVIEW/Our-Commitment-to-LabVIEW-as-we-Expand-our-Software-Portfolio/td-p/4101878?profile.language=en
    3 points
  18. I like resizable UIs, and that often means custom code to handle some of the operations in the UI. One such feature I like is a vertical array that shows more rows as the window gets larger. I've used code like this a few times and figured the community might like it too. When doing this, there are times when I want the user to be able to add more rows, and sometimes the row count needs to be either static or set by some other outside variable. So this demo also includes a way to link a separate vertical scrollbar control to the array, and have it move and get set so that it looks like it is linked to the array control. But it will coerce, and not allow more rows to be seen than the number of elements in the array. At the moment this only supports 1D arrays, and only in the vertical direction. Code is saved in 2020. Array Resize and Scrollbar Link.zip
    3 points
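The bookkeeping behind that demo can be summarised in a few lines. This is a sketch in Python (illustrative only; the actual demo is LabVIEW G and the parameter names here are invented): compute how many rows fit at the current window height, then coerce the external scrollbar's range and position so that no more rows than array elements can ever be shown.

    def resize_and_coerce(window_height, header_px, row_px, n_elements, scroll_pos):
        # Rows that fit in the panel at the current window height (at least one).
        visible_rows = max(1, (window_height - header_px) // row_px)
        # Highest allowed top index so every visible row is still a real element.
        max_index = max(0, n_elements - visible_rows)
        # Coerce the scrollbar position into the valid range.
        scroll_pos = min(max(scroll_pos, 0), max_index)
        return visible_rows, max_index, scroll_pos

    print(resize_and_coerce(window_height=600, header_px=40, row_px=25,
                            n_elements=100, scroll_pos=95))
    # (22, 78, 78) -> 22 rows visible, scrollbar clamped to top index 78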
  19. Or even better: Replace "Write To Text file"/"Read From Text File" with "Write To Binary File"/"Read From Binary File". The output of "Flatten to String" is not text. (String != Text)
    3 points
  20. Right-Click on the "Read From Text File" method and unselect "Convert EOL". The number "13" (0xD or Carriage Return) gives it away. If you look at the HEX representation of what you save and what you read from file, you'll notice the 0xD gets converted to 0xA unless you uncheck that box. Tip: If you deal exclusively with HEX/binary data, you might want to save it using byte arrays instead. The result will be the same in your file, but you won't get bitten by hidden config parameters in your node...
    3 points
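The same pitfall exists outside LabVIEW, which may help make the behaviour concrete. A small Python sketch (illustrative only, not related to the Read From Text File node itself): reading a file in text mode applies end-of-line conversion, while reading it in binary mode returns the bytes exactly as written.

    data = b"\x01\x02\x0d\x0a\x03"              # contains CR LF (0x0D 0x0A)

    with open("raw.bin", "wb") as f:            # write the raw bytes
        f.write(data)

    with open("raw.bin", "r", encoding="latin-1", newline=None) as f:
        print(f.read().encode("latin-1"))       # b'\x01\x02\n\x03' -- CR LF became LF

    with open("raw.bin", "rb") as f:            # binary mode: bytes untouched
        print(f.read())                         # b'\x01\x02\r\n\x03'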
  21. LabVIEW Software Engineers, here's your opportunity to work with us at SpaceX! I'm hiring for a Sr. Software Engineer on the Ground Software team. We build control and monitoring systems running the Falcon launchpads, mission systems launching astronauts to the International Space Station onboard Dragon, and a fleet of sea-faring recovery vessels bringing spacecraft and rockets back to shore. The screens you see in Mission Control are the UIs we build. Read more about the role and apply here: https://boards.greenhouse.io/spacex/jobs/5480997002?gh_jid=5480997002
    3 points
  22. There are many links on the Internet that tell you how to configure git to use custom tools for VIs. Many are wrong. Yesterday, I and another developer outside NI worked through the sequence and got it working repeatably on both of our machines. Here is the process. Save both of the attached files someplace permanent on your hard drive that is outside of any particular git repo. We used C:\Users\<<username>>\AppData\Local\Programs\GIT\bin
_LVCompareWrapper.sh
_LVMergeWrapper.sh
Modify your global git config file. It is saved at C:\Users\<<username>>\.gitconfig. You need to add the following lines:
    [mergetool "sourcetree"]
        cmd = 'C:/Users/smercer/AppData/Local/Programs/GIT/bin/_LVMergeWrapper.sh' \"$BASE\" \"$LOCAL\" \"$REMOTE\" \"$MERGED\"
        trustExitCode = true
    [difftool "sourcetree"]
        cmd = 'C:/Users/smercer/AppData/Local/Programs/GIT/bin/_LVCompareWrapper.sh' \"$REMOTE\" \"$LOCAL\"
    [merge]
        tool = sourcetree
    [diff]
        tool = sourcetree
That's it. There are lots of ways to edit the .gitconfig from the command line or by using SourceTree's UI... if you know those ways, go ahead and use them.
    3 points
  23. Yes, I don't suppose you use many of the new features LabVIEW has gained in the last 10 years if you still develop in LabVIEW 2009.
    3 points
  24. NI didn't say they would be porting NXG features to 2021, but to future versions of LabVIEW. Technically such a promise would have been unfulfillable, since at the time the NXG demise was announced, LabVIEW 2021 was basically in a state where anything that was to be included in 2021 had to be more or less fully finished and tested. A release of a product like LabVIEW is not like your typical LabVIEW project, where you might make last-minute changes to the program while testing your application at the customer's site. For a software package like LabVIEW, there is a complete code freeze except for breaking bug fixes; then there is a testing, packaging and testing-again cycle for the beta release, which typically takes a month or two alone, then the beta phase of about 3 to 4 months, and finally the release. So about 6 months before the projected release date, anything that is not considered ready for prime time is simply not included in the product, or is sometimes hidden behind an undocumented INI file setting. Considering that, the expectation to see any significant NXG features in LabVIEW 2021 was simply naive and irrational.
I agree with you that LabVIEW is a unique programming environment that has some features that are simply unmatched by anything else. And there are areas where its age is clearly showing, such as the lack of proper Unicode support and, related to that, the lack of support for long path names. Personally I feel I could tackle the lower-level part of full Unicode support in LabVIEW, including full Unicode path support, quite easily if I were part of the development team, but I have to admit that the higher-level integration into front panels and various interfaces is a very daunting task that I have no idea how I would solve. Still, reworking the lower-level string and path management in LabVIEW to fully support Unicode would be a first and fundamental step that allows the other task of making this available to the UI in a later stage. This low-level manager can exist in LabVIEW even if the UI and higher-level parts don't yet make use of it. The opposite is not possible.
That is just one of many things that need some serious investment to make the whole LabVIEW platform viable again for further development into the future. This example also shows that some of the work needed to port NXG features back to LabVIEW first requires significant effort that will not immediately be visible in a new LabVIEW version. While such a change as described above is definitely possible to do within a few months, the whole task of making all of LabVIEW fully Unicode capable without breaking fundamental backwards compatibility is definitely something that will take more than one LabVIEW version to eventually fully materialize. There are a few lower-hanging fruits that can help prepare for that and should have been done years ago already, but they were discarded as "being already fixed in NXG"; the full functionality just for Unicode support in LabVIEW is going to be a herculean task to pull off without going the path of NXG and reinventing LabVIEW from scratch (which eventually proved to be an unreachable feat).
My personal feelings about the future of LabVIEW are mixed. Not so much because LabVIEW couldn't have a future, but because of the path NI as a company is heading down. They have been changing considerably over the last few years, from an engineering-driven to a management-driven company. While in the past engineers had some real say in what NI was going to do, nowadays it's mostly managers who see Excel charts, sales numbers and the stock price as the main decision-making inputs for NI. Anything else has to be subordinated to the bigger picture of a guaranteed minimum yearly growth percentage and stock price. The traditional Test & Measurement market NI has served for much of its existence is not able to support those growth numbers anymore, so they are making heavy inroads into different markets and seem to consider the traditional T&M market by now more as a legacy business than a significant contributor to their bottom line.
    3 points
  25. Working on the next JSONtext functionality, which is features to improve support of JSON Config Files. See https://forums.ni.com/t5/JDP-Science-Tools/BETA-version-of-JSONtext-1-6/td-p/4146235
    3 points
  26. I've encountered a black imaq image display in exes, solved by unchecking the box to allow running in a later runtime version. Don't know if that is related to your problem.
    3 points
  27. I've been toying with the idea of implementing a new TOML library for LabVIEW. I've been using the OpenG variant config for years, but I would prefer to use a more standardized format for my INI-style config files, and TOML is the best candidate for this. Erdosmiller's library is pretty good, but as the author points out, it is no longer maintained, and it didn't gracefully handle all of the datatypes that I wanted to use. It would be great to have the flexibility of JSONtext but for the TOML format. I'll post back here if I manage to get the project off the ground.
    3 points
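For anyone wondering why TOML is attractive here, a short sketch (Python 3.11+, using the standard tomllib module; the config keys below are invented for the example): unlike classic INI, values keep their datatypes, which is exactly the graceful datatype handling the post is after.

    import tomllib

    cfg_text = """
    [daq]
    device      = "cDAQ1Mod1"
    sample_rate = 51200            # Hz, stored as an integer
    range_volts = 10.5             # float
    channels    = ["ai0", "ai1"]   # array of strings
    enabled     = true             # boolean
    """

    cfg = tomllib.loads(cfg_text)
    print(cfg["daq"]["sample_rate"] + 1)     # 51201 -- already an int, no string parsing
    print(type(cfg["daq"]["range_volts"]))   # <class 'float'>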
  28. Update: I used the dll call from the link @dadreamer provided, and made a Messenger-Library "actor" that I can use for debugging. Already found a couple of bugs with it.
    3 points
  29. Shaddap-a you face!
    3 points
  30. I agree with all your points. Definitely on making LabVIEW free for all purposes, if not even open source. NI may hang on to the mega-customers for a while with its current business model, but eventually it'll get marked as legacy software and slowly replaced by younger people with newer ideas and experience in different, more accessible languages. The idea that a company can sell a programming language these days is ridiculous when there are so many free alternatives. I am not counting the community edition. It needs to be free for any purpose.
    3 points
  31. I don't really expect many new language features or UX improvements in LabVIEW just because they stopped working on NXG. From what we know, there are only a few knowledgeable people at NI who are intimately familiar with the codebase and some of the intricate details which fundamentally drive LabVIEW. There are also many customers who rely on that technology for their own business. Because of that, NI can't just throw more developers at it and change LabVIEW fundamentally unless they find a way to stay compatible, or take a bold step and make breaking changes (which are inevitable in my opinion). LabVIEW will probably stay what it is today and only receive (arguably exciting) new features that NI will leverage from the NXG codebase to drive their business. Unfortunately NI hasn't explained their long-term strategy (I'll assume for now that they are still debating it), in particular what LabVIEW/G will be in the future. Will it be community-driven? Will it be a language that anyone can use to do anything? Will it be the means to drive hardware sales for NI and partners? Will it be a separate product altogether, independent of NI hardware and technology? There are also a lot of technology-related topics they need to address:
Does LabVIEW Support Unicode? - National Instruments
Comparing Two VIs in LabVIEW - National Instruments (ni.com)
Error 1316 While Using .NET Methods in LabVIEW - National Instruments (ni.com)
Using NULL Values or Pointers in LabVIEW - National Instruments (ni.com)
Not to forget UX. The list is endless and entirely different for any one of us. If and when these will be addressed is unknown. Don't get me wrong, I'm very excited and enthusiastic about LabVIEW and what we can do with it. My applications are driven by technology that other programming languages simply can't compete with. Scalability is through the roof. Need to write some data to a text file? Sure, no problem. Drive the next space rocket, land rover, turbine engine, etc.? Here is your VI. The clarity of code is exceptional (unless you favor spaghetti). The only problem I have with it is the fact that it is tied to a company that wants to drive hardware sales.
    3 points
  32. The first time you mentioned this I thought it was a nice gesture, now I think you are just desperate for friends...or an alcoholic. I'm down.
    3 points
  33. The TDF team is proud to offer for free download the scikit-learn library adapted for LabVIEW, in open source. LabVIEW developers can now use our library for free as simple and efficient tools for predictive data analysis, accessible to everybody and reusable in various contexts. It features various classification, regression and clustering algorithms, including support vector machines, random forests, gradient boosting, k-means and DBSCAN, and is designed to interoperate with the Python numerical and scientific libraries NumPy and SciPy, just like the famous scikit-learn Python library it is based on. Coming soon: our team is working on the « HAIBAL Project », a deep learning library written in native LabVIEW, fully compatible with CUDA and NI FPGA. But why deprive ourselves of the power of ALL FPGA boards? No reason, which is why we are working on our own compiler to make HAIBAL fully compatible with all Xilinx and Intel Altera FPGA boards. HAIBAL will offer more than 100 different layers, 22 initializers, 15 activation types, 7 optimizers and 17 losses. As we like the AI products of Facebook and Google, we will of course make HAIBAL natively compatible with PyTorch and Keras. Sources are available now on our GitHub for free: https://www.technologies-france.com/?page_id=487
    2 points
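For context, this is the kind of plain-Python scikit-learn call such wrappers presumably mirror (a generic scikit-learn example, not taken from the TDF library): fit a classifier on training data, then score it on held-out data.

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)                 # train
    print(model.score(X_test, y_test))          # accuracy on held-out data, e.g. ~0.97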
  34. Well, I started in April 1992, went to the US for 4 months in May, and heard there that there was this big news about LabVIEW not being for Macintosh only anymore, but telling anyone outside of the company would be equivalent to asking to be terminated 😀. They were VERY secretive about this and nobody outside the company was supposed to know until the big release event. In fall of 1992 LabVIEW for Windows 3.1 was announced and the first version shipped was 2.5. It was quickly followed by 2.5.1, which ironed out some nasty bugs, and then there was a 2.5.2 release later on that made everything more stable, before they went on to release the 3.0 version, which was the first one to be really multiplatform. 2.2.1 was the last Mac version before that and 2.5.2 was the Windows version. They could not read each other's files. This was Windows 3.1, which was 16-bit and still just a graphical shell on top of DOS. LabVIEW used the DOS/4GW DOS extender from Tenberry Software, which was part of the Watcom C development environment used to compile LabVIEW for Windows, to provide a flat 32-bit memory model to the LabVIEW process, without nasty segment:offset memory addressing. It was also the reason that interfacing to Windows DLLs was quite a chore, because of the difference in memory model between the LabVIEW environment and the underlying OS and DLLs. Only when LabVIEW was available for true 32-bit environments like Windows 95 and NT did that barrier go away.
NI was at that time still predominantly a GPIB hardware company. A significant part of support was for customers trying to get the various GPIB boards installed on their computers, and you had at that time more very different computer architectures than you could count on both hands and feet. There was of course the Macintosh and the IBM-compatible computers, the latter all running DOS, which Windows computers at that point still essentially were. Then you had the "real" IBM computers, which had abandoned the ISA bus in favor of their own, more closed-down Microchannel bus and were also starting to run OS/2 rather than Windows, and about a dozen different Unix-based workstations, each with its own totally incompatible Unix variant. And then even more exotic beasts like DEC VAX computers with their own expansion slots. Supporting those things was often a nightmare, as there was literally nobody who knew how these beasts worked except the software driver developer in Austin and the customer's IT administrator.
NI had just entered the data acquisition market and was battling against more established manufacturers like Keithley, Data Translation, and some other small-scale speciality-market providers. The turning point was likely when NI started to create their own ASICs, which allowed them to produce much smaller, cheaper and more performant hardware at a fraction of the cost their competitors had to pay to build their own products, while still selling it at a premium, as they also provided full software support with drivers and everything for their own popular software solutions. With other manufacturers you usually had to buy the various drivers, especially for NI software, as an extra, and some of them had simply taken the blueprints of the hardware, copied them, and blatantly told their customers to request the software drivers from their competitor, as the hardware was register-for-register compatible with theirs. The NI ASICs made copying of hardware by others pretty much impossible, so NI was never concerned about making their drivers available for free.
    2 points
  35. I take it you don't have a dedicated connection between the two devices. Things that come to mind that can break the system are the routing and duplicate IP addresses on the network.
    2 points
  36. That's fairly paranoid, considering that any VI, even when running in a PPL, is basically still executing inside the same process. There are a lot more things it can do that could be much more dangerous, but you have to strike a balance between security and performance. Starting to isolate each PPL completely from the rest of the system would take a huge amount of development effort and also cause a lot of performance loss. You wouldn't like that at all! VI Server has some strict limitations when it is operating across LabVIEW contexts, but limiting it even inside the same context would be too restrictive, and it would also mean that you have to consider the entire scripting interface in LabVIEW as very dangerous. And yes, if you use PPLs they could be swapped out by an attacker. But if that is really your concern you may have a lot of other, graver trouble. Who lets such a person even have access to that computer? Why would they attempt to attack a PPL on that system when they can have the entire cake and eat it too? It's many times easier to attack DLLs, yes even signed DLLs, and take over the entire system than to try to hack into a PPL with its proprietary format and only get crude control over a single LabVIEW application on that system.
    2 points
  37. @The Q started such a thing on the LabVIEW Wiki: https://labviewwiki.org/wiki/Text-Based_terminology
    2 points
  38. My intention for this template was to fill the gap between the Actor Framework and a standard QMH. I never wanted to recreate the AF. When I first got a look at the AF I didn't get it. After a tutorial on YouTube I understood the AF, but I still feel a distance to it that I can't describe. It feels like the AF is free-floating in hyperspace or so. Because of this I wanted to create a template with some of the benefits of OOP but more grounded, and so I started this template. And yes, the AF is a great way to get things done, but in my opinion it is not very easy to understand. Thanks, Peter
    2 points
  39. I checked with LabVIEW R&D, they said there is no way to determine this information in G code.
    2 points
  40. Shameless plug: Proper Way to Communicate Over Serial
    2 points
  41. Actually Swiss cheese doesn't have holes, French cheese does. Sometimes proverbs are inaccurate. I'm French... so no 😉 Lots of white flags in my code 😆
    2 points
  42. Finally here it is. Refnum_to_Pointer-Linux.zip Refnum_to_Pointer-MacOS.zip Tested in LV 2020 on Ubuntu 20.10 and macOS Big Sur. It's assumed to work in 64-bit editions of LabVIEW only.
    2 points
  43. - LVCompare and LVMerge should be unlocked with the LabVIEW base edition. Or even better, an open source merge and compare tool could be released to the community.
    2 points
  44. In the world of C it is up to us to declare variables such as those Refnums as 'volatile' so the compiler knows they may change despite the appearance of being loop invariants. I would say it is a bug that the LV compiler does not treat Refnums as volatile. I'd say most of your workarounds would work, but are in (slight) danger of being optimized away as the compiler improves. Personally, I'd drop an 'Always Copy' node instead of the IPES.
    2 points
  45. Free is maybe pushing it. Zero technical support without an SSP would be an appropriate model associated with open source. Of course this wouldn't prevent vibrant community support (independent from NI) from existing, but this is not what the big industrial companies using LV would go for. They would remain NI's proverbial cash cows. I think the misunderstanding on the corporate side of why open source is beneficial for code safety and reproducibility is understandable, as I can witness the same ambivalence, if not resistance, in academia. As for NXG and webUI (or whatever they call it now), as discussed elsewhere, it looks like NI doesn't have the resources to bring the vision (whatever it was originally) to fruition, and their recent decision to abandon it will probably lead (or has already led) to morale cratering and talent effusion, so I wouldn't hold my breath... One thing I'd add to the list is this: stop the yearly versioning breaking backward compatibility. This is frankly moronic, and the clear and only reason it exists is NI's pricing scheme. Adopt the scheme suggested in the first paragraph and this can go right away.
    2 points
  46. To be honest, I would probably not put them anywhere in that view. It’s called Class View for a reason. 😀 It hadn’t really occurred to me that you would want to have the non-class VIs visible in there. Is that a flaw or just out of the box thinking?
    2 points
  47. The future is Python for many applications: it is easy to get into for newcomers to programming, works great, has a strong package management system and a large community, and can be applied to virtually any OS you can think of; it can even be used to program GPUs if you are so inclined. The huge advantage of using G and LabVIEW is that, paired with NI hardware, someone skilled with LV can bang out a solid prototype of a product or a test and measurement system so fast it will make people's heads spin. NI hardware is absolutely top notch for high-end use cases, rapid prototyping, complex one-offs, scientific use or the complicated end-of-line test space. However, in the IoT space there is strong competition for the NI SB RIO line-up: for an SB RIO you will pay 1500 USD. There are so many cheap, programmable and capable pieces of hardware, such as the Jetson Nano (ARM + NVIDIA GPU for vision), the Raspberry Pi (ARM at 1.4 GHz with 8 GB RAM) or even the Asus Tinker Board, which serve so many purposes, have onboard GPIO and can be purchased for 50-60 bucks, that in that space Python paired with Linux knowledge is really making headway. And if you want to go with Zynq from Xilinx you can get a board with FPGA for ~300 bucks, which is basically the same hardware as an SB RIO; all you have to do is use different software tools. If NI would consider unlocking the ability to use NI FPGA with deployment to non-NI hardware... I think this is where G absolutely would take off like wildfire and be used on millions of IoT devices everywhere in the world that are powered by an ARM + FPGA module. But as it stands now, if you use NI FPGA you must deploy on a target you've purchased from NI. I'll really be a stickler, but if we're talking about the programming language we should talk of G; LabVIEW is the IDE. Never say never, but I am not aware of any other graphical programming language which could be used for general purpose programming and is as complete as G. If you have come across something interesting I would like to look at it, but I feel that it would be at best an academic project, which I would not use in production code.
    2 points
  48. I think this is a great decision. Admitting they made a mistake is a bold and courageous step. Onwards and upwards from here.
    2 points
  49. I thought this was an interesting exercise, so here is my attempt. OpenG has some image tools and one of them is the ability to open a GIF, but for some reason it crapped out and died with your GIF even after resaving it to something much smaller. I did find some other GIF API over on the dark side and used that instead. Attached is a zip; extract it and run Demo Saving Button. It will show the first image. Then when you click the image it cycles through the first half of the GIF and waits for the simulated save process to complete. Once that is complete it rotates through the second half of the images, and then after a few seconds returns to the first. Parsing the GIF takes time, so I put the GIF images in as a constant, along with the code to parse the GIF. I also set the pane to be the color of the (0,0) pixel in the hope it will blend in better. Honestly this could be turned into a QControl and made very seamless. Demo Saving Button Gif.zip
    2 points