hooovahh

Everything posted by hooovahh

  1. That's fine, it's just helpful for others to know that a duplicate conversation is taking place elsewhere, so that we know not to suggest things that have already been tried, or that won't work for some reason. Oh, and I have used this toolkit before, but have had issues with Windows picking the right printer.
  2. In cases like these I assume (possibly incorrectly) that NI's compiler can handle these lower level operations better than me. They have more intimate knowledge about what is happening, and can make performance improvements to handle the delay and notification of data to dequeue better than me. Then again I know more about what my application needs than NI, and maybe I could write more specific software for a queue, and NI needs a more general purpose solution. I don't really have any more information on how NI implemented or handles queues and polling. I'm just glad that over the years it seems to be pretty robust, and works well on the various platforms supported.
  3. In a setup like this I might not even have a timeout, or rather have a -1 timeout. This allows the dequeue function to just sit idle waiting for a request to take an action. That action might be to quit, or something else. Another common thing you might see in examples is to use the error out of the dequeue to stop the loop. If the queue reference is destroyed while the dequeue is waiting, it will return an error, and that can be a way to tell the loop to shut down. A better method is to use a quit message of some kind, but the destroy-the-queue approach is seen in several NI examples.
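The same pattern can be sketched outside LabVIEW; here is a minimal Python analogue of a dequeue with an infinite (-1) timeout and an explicit quit message (all names are made up for illustration):

```python
import queue
import threading

def run_consumer(q, handled):
    """Consumer loop: blocks indefinitely on the queue (LabVIEW's -1
    timeout) and exits on an explicit quit message, rather than relying
    on the queue reference being destroyed to produce an error."""
    while True:
        msg = q.get()        # no timeout: sits idle until a message arrives
        if msg == "quit":    # explicit shutdown message
            break
        handled.append(msg)  # stand-in for actually handling the request

q = queue.Queue()
handled = []
consumer = threading.Thread(target=run_consumer, args=(q, handled))
consumer.start()
q.put("measure")
q.put("log")
q.put("quit")
consumer.join()
```

The quit message makes shutdown explicit and testable, instead of treating a destroyed-reference error as the stop signal.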
  4. The design choices of LabVIEW make it impossible to encrypt the entire file. The block diagram has to be readable at some point which means access to it will be possible in a variety of ways. I'm sure there's some kind of compromise between restricting VIs to a specific target and build of LabVIEW, and the current implementation. I just wanted to highlight the fact that the solution to protect the block diagram isn't a simple one.
  5. Yes, there is no real native solution for this. NXG did tease dynamic control creation but I'm not sure it made it far into development. Depending on your needs I would suggest different solutions. I have accomplished similar functions with picture controls in the past for creating a dynamic ribbon interface. Here a running VI can generate an image that looks like a set of grouped buttons and tabs, and allows for setting various background colors. Another less polished solution revolves around parent and child relationships of windows, and dynamically running VIs. I called this the Multi Panel Interface and honestly it didn't make it past the proof of concept phase. But for simple stuff, and a finite set of controls, you're probably better off just having the controls, then showing or hiding them as needed.
  6. I hope for that future too, however I get the feeling that with 35+ years of LabVIEW development, parts of the system are in a state where open sourcing the project might take a larger effort than NI would want to put into it. Then there is the liability issue if some kind of exploit were found because the source was released. I'm lucky enough to work in an environment where my boss asks for work to get done, and is less interested in the means of getting it done. If I were making test systems for external customers I'd be more nervous, but internally I can just continue to use a legacy platform. COBOL, anyone?
  7. You don't need to hold your breath for the answer. There is a public beta, it can be downloaded right now for free, only requiring an NI.com account. It probably won't surprise you to know the answer to your question.
  8. Yes, I don't assume you use many new features of LabVIEW from the last 10 years if you still develop in LabVIEW 2009.
  9. LINX is now supported in commercial applications starting in 2020, BTW. Your opinion is valid, and you have reasons for it, but I think it might be a bit of a forest-for-the-trees situation here. LabVIEW tends to have one or two major bullet points of new features with each release, with many smaller improvements that are less noteworthy. Some of these aren't very applicable to me and I don't see the benefit of the update, but I can still recognize that a bunch of effort was put into making it into the release, which makes me think NI isn't sitting idle. I know I made a list like this in the past when a similar topic came up, but I'm having a hard time finding it.
     2012 - Loop tunnel improvements with concatenating, conditional, indexing, and last value / Project Templates
     2013 - Improved Web Services / WebDAV / SMTP / Bookmark Manager
     2014 - Actor Framework (I might be off by a version or two) / 64-bit Mac and Linux support
     2015 - Custom Right-Click Framework
     2016 - Channel Wires
     2017 - VIMs / Forward-compatible runtime
     2018 - Command Line Interface / Python integration / Type Specialization Structure for improved VIMs
     2019 - Sets and Maps
     2020 - Interfaces for classes / Free Community Edition with Application Builder
     And here are a few of my favorite features that I can't remember what version they were added in: the Error Ring; improved VI calling under Application Control for starting asynchronous processes and static VI references; DVRs; conditional disables based on environment or project variables; the Linux Real-Time operating system, allowing 3rd party and open source tools to be installed and called with System Exec, and then adding an embedded HMI; User Events; the LINX toolkit for running LabVIEW VIs natively on a Raspberry Pi, or controlling an Arduino connected to the host; QuickDrop's plugin system allowing for all kinds of tools; filtering on search results; improved performance of tree and listbox controls; NIPM; and loads more scripting functions, with more added with each version. I sure hope LabVIEW has a future because I've hitched my career to it. But even if NI closed its doors tomorrow I feel like I'd still be using it for another 10 years or so, or until it didn't run on any supported version of Windows. But I feel your concern, and if I were a junior engineer starting out, I would feel safer in another language.
  10. Oh boy 2006, I think that predates the LAVA 1.0 migration issues, and then several other newer revamps. I certainly don't have this, and have no way of getting something so old. I can contact Michael if you'd like.
  11. Not sure if you are past this point or not, but be sure to take a VM snapshot at some point. Having a fresh VM for a semi-standard environment like LabVIEW 2017 SP1 with DAQmx, Vision, RT (or whatever is common for your industry) is super useful. Taking this another step, you can have a snapshot for each project too. Sharing this with team members can be a pain, so I generally just go through this process at, or near, the end of a project, then archive the VM somewhere on the network for safekeeping.
  12. I think combining what PiDi gave with a single-column listbox, and an event structure that gets triggered on the Double Click event, would get you most of the way there. You can also try to capture a <CTRL>+G on the OS to go to the next result. I think Initialize Keyboard and Acquire Input would work.
  13. Okay, this is possible, but it may need extra work when making the file. TDMS channels by themselves are just 1D data. They have no way of knowing how much time there is between each sample, or the start time. But you can write to the properties of a TDMS channel and put in extra information like the start time and the time between samples. This works great for things like the Waveform data type; when you write this data type to a channel, these properties are already written. So you can read the start time property, the dT (time between samples), and the number of samples in a channel (a property automatically written), and with this information you could make a slider where the user selects what section of data to load. You would have to convert the slider's input into the number of samples to read and the offset of the read, but that can be done based on the sample rate. If your data isn't being written at a consistent rate, you can also have a separate channel that is Time. Then you can read the first sample and know the start time, and read the last and get the end time. Intelligently grabbing just the data you want would likely take some kind of binary search, but that would be faster than reading all samples when the channel contains lots of data. This requires that when you write the channel's samples, you also write the samples' time data, and these two channels will need to have the exact same number of samples. There are a few options, and all of them go outside of this toolkit's functionality.
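None of this is LabVIEW, but the index arithmetic described above can be sketched in Python. It assumes the start time (t0), sample interval (dt), and sample count have already been read from the channel properties; the function names and the binary-search variant for a separate Time channel are illustrations, not part of any TDMS API:

```python
from bisect import bisect_left

def window_from_waveform_props(t0, dt, n_samples, win_start, win_end):
    # Convert a user-selected time window into a read offset and count,
    # using the start time (t0) and sample interval (dt) channel properties.
    offset = max(0, int(round((win_start - t0) / dt)))
    last = min(n_samples - 1, int(round((win_end - t0) / dt)))
    return offset, max(0, last - offset + 1)

def window_from_time_channel(times, win_start, win_end):
    # For data without a fixed rate: binary search a sorted Time channel
    # (read separately) instead of scanning every sample.
    offset = bisect_left(times, win_start)
    end = bisect_left(times, win_end)
    return offset, max(0, end - offset)
```

For example, with t0 = 0 s, dt = 1 ms, and 10,000 samples, a 1 s to 2 s window maps to an offset of 1000 and 1001 samples to read.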
  14. I like TestStand, but it is a one-size-fits-all type of software. NI can't know what features you'll prioritize in a sequencer, so it either has too few features and can only satisfy the simplest of projects, or it has everything thrown at it, in which case there are parts almost no one will use, and it might have a steep learning curve. That's why one of the justifications I agree with when it comes to a home-grown sequencer is that you know the types of features you need, and the types you don't. Now obviously requirements change, and you need to change with them, but high-level things you know you won't use can help trim the feature set when replacing TestStand. For me the custom sequencer grew out of the need to have it run completely on RT. The Windows application is just there for starting the test, status, final report generation, and manual control. I've used PTP Sequencer in the past, and it was pretty basic and not really a TestStand replacement. But it was helpful in getting some ideas on where I wanted our sequencer to go. It wasn't open source and I still don't think it is. As for cost savings, all the analysis I've seen has favored TestStand when it comes to the cost of developing your own.
  15. I downloaded everything here just fine.
  16. Good discussions are timeless. There's lots of great content on LAVA that goes dormant, but it's still there. I would almost put XControls in both the best features and worst features lists, personally. I still stand by my debugging comment from almost 11 years ago.
  17. A seemingly highly specialized piece of equipment can usually justify the price. But not meeting your needs and costing that much seems crazy. It gets me wondering if there are products NI offers that, over the life of the product, never sell a single unit. Years ago I was in a hands-on session with some PXI card NI was selling that allowed for taking many temperature readings using fiber optics. Something like the fiber optic had microscopic cuts in it that allowed for taking many channels of temperature readings, on the order of 100s. I looked up the card and I can't remember the price, but I thought it was like $20k or something. I figured no one would use this, but looking online it seems people did; it has since been made obsolete.
  18. Not that I know of. I'm not sure if this is an option for you or not, but you could completely shut down the LinuxRT device, then have power removed, then reapply power. I believe the BIOS of these controllers allows waking when power is applied, and your app can be run on startup. So maybe have some kind of external device work like a watchdog: while your program is running, keep hitting the watchdog; when your program shuts down, the timer starts counting down, removes power, waits a second, then applies power again. Just a thought and probably not a great solution, and someone with more Linux experience may have better suggestions.
  19. Here is a discussion about it on LAVA. If this is on Windows I'd suggest trying a different compression tool like 7zip or the native zip to see if it will work for you.
  20. I have seen bad DBCs in the past. But in those cases I would just load it up in XNet Database Editor, re-export it, and then it was fine. I haven't seen it not load in XNet before.
  21. Oh man, and I was just touting NI support as being one of the few "fine" parts of NI.com. Never mind. I'm really confused, and I think the report-a-bug feature is worse than you made it sound. So I got there and it says "In which product did you encounter the bug". It asks for a Serial Number. I entered "LabVIEW" or "LabVIEW 2020" and it didn't work. So I tried entering the serial number of my hardware, or its model number, and that didn't work. I think I'm supposed to put in the serial number of my LabVIEW license, but I'm on a disconnected license from a VLM so I have no serial. Am I not allowed to report bugs? I'm thinking I'll make another post on the dark side to try to get some kind of answers. EDIT: Okay, it seems I can find my license number, but I needed to dig into the quote to find it. I couldn't find it anywhere else.
  22. Very neat, thanks for the background on a toolkit I didn't know existed. I'm sure NI was in a "damned if you do / damned if you don't" situation, but this can be seen as another example of NI's first-party solution stifling 3rd-party toolkits. Also, I generally don't mind bundling in standalone binaries in cases like this if it means an easier experience for users, similar to bundling the SQLite DLL in the build.
  23. You have to create a free NI account, and login with that. Once logged in and connected to the internet it should activate. If you have any problems with activation you can try contacting NI, or post on their forums.
  24. The LabVIEW Wiki has lots of content. I usually recommend new developers checkout the Getting Started or the Online Training sections.
  25. The problem is that sometimes the compiler gets a bit too aggressive and does something that it thinks won't functionally change the code, but does. Like, what if the compiler mistakenly thinks the close reference function can't be called? Then it will think that node can safely be removed and nothing will change. But if the close was actually being called in the IDE, and now it isn't in the RTE, that could be a problem. The Always Copy function has been known as a band-aid because in some cases it forces the compiler to leave things alone instead of trying to optimize the code. This would then have the code no longer leaking memory. It seems to be a real bug, and NI should fix it. But in the meantime you might want to sprinkle in some Always Copies and see if anything changes. IMAQ images are references, so I don't know if it will actually help or not. I don't have Vision to test with.