Everything posted by hooovahh
-
Is there a reason the running average needs to be in the file? Just curious why you don't have a circular buffer in your program and calculate the average with that. Even if you do really want it to be in a file for some reason, is there a reason you can't have two files? It sounds like the running average is filled with dummy data anyway and could be saved in a temp location.
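To illustrate the in-memory alternative, here is a minimal sketch of a circular-buffer running average in Python (LabVIEW equivalents would be a lossy queue or a shift-register array; the class name and sizes here are just for illustration):

```python
from collections import deque

class RunningAverage:
    """Fixed-size circular buffer for a running average, kept in
    memory instead of being persisted to a file."""

    def __init__(self, size):
        self.buffer = deque(maxlen=size)  # oldest sample falls off automatically
        self.total = 0.0

    def add(self, sample):
        if len(self.buffer) == self.buffer.maxlen:
            self.total -= self.buffer[0]  # subtract the value about to be evicted
        self.buffer.append(sample)
        self.total += sample
        return self.total / len(self.buffer)
```

Keeping the running total avoids re-summing the whole buffer on every sample; only the evicted and appended values touch it.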
-
I've never really thought about it, but you are totally right. I suspect it has a lot more to do with the target audience of LabVIEW, scientists and engineers who may have little or no software-discipline background. Who needs SCC when it's just me in a lab making a single giant VI? Why would I need requirements tracking? Bug tracking? Documentation? Coding style? Certification?
-
In the NI world people usually talk about Requirements Gateway (RG). It was a tool made by another company that NI bought and modified to support NI software tools a little better. I don't know that it can do all the things you asked; it basically links Word documents and LabVIEW source, finding tags in the code, usually to signify that a requirement is fulfilled by that part of the code. Imagine you had a detailed requirements document. RG can pull out the text and tag information so it knows "R12-345" means "The software shall have a UI". Then in LabVIEW you can put down a comment on the BD or FP stating that you cover that requirement. Then you can write a test document that proves the requirement is in the build, by having a user test that the UI exists and stating that it covers that tag. Then you could have test results logged showing that the test was performed and which steps passed, meaning the requirements in those steps were tested and implemented, and so they were coded.

When all this linking works well, it is beautiful. RG can generate matrix tables showing where the requirements go from start to finish: from creation, to implementation, to test document, to test passing. This requirements traceability matrix can be stored for each build, and you could record coverage this way. It's a lot of work, and honestly most of the time it isn't worth it. But for medical devices, customers like to see the ducks in a row. Bug logging for me is separate, and I haven't seen a good integration. Not that it can't work; I just haven't seen it integrate with all the other tools for traceability.
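The core of that tag linking can be sketched in a few lines. This is not how RG is actually implemented — the tag format and the coverage structure here are assumptions for illustration only:

```python
import re

# Hypothetical tag format: "R" + digits + "-" + digits, e.g. "R12-345".
TAG = re.compile(r"\bR\d+-\d+\b")

def extract_tags(text):
    """Return the set of requirement tags found in a document, comment, or test."""
    return set(TAG.findall(text))

def coverage(requirements, implementation_text, test_text):
    """Map each requirement tag to whether it appears in the code
    comments and in the test document -- a tiny traceability matrix."""
    impl = extract_tags(implementation_text)
    tests = extract_tags(test_text)
    return {r: {"implemented": r in impl, "tested": r in tests}
            for r in requirements}
```

A real tool then renders this mapping as the traceability matrix described above, flagging requirements that were never implemented or never tested.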
-
Watch the Wednesday Keynote >> NI's Software Platform >> around the 7 minute mark http://www.ni.com/niweek/keynote-videos/ "Port the entire LabVIEW engine to JavaScript". Yeah, I was messing with the Show Public URL function, so that was probably it. I commend your efforts and hope it is in a semi-complete state when I have a need for this type of thing.
-
I never had good luck with this. Maybe it was older versions of TDMS, but it never seemed to work right. I can try it again and see if it behaves now.
-
Yeah, I'm a big fan of TDMS and I still learn things every once in a while by experimenting. One thing that helps, as you already noticed, is writing chunks of data: basically calling the Write function as few times as possible. If you are getting samples one at a time, put them into a buffer, then write when you have X samples. Got Y channels of the same data type which get new data at the same rate? Try writing X samples for Y channels as a 2D array. I think writing all the data in one group at a time helps too, but again that might have been a flawed test of mine; alternating between writing in multiple groups seemed to make for fragmented data. Because of all of these best practices I usually end up writing an actor that takes care of the TDMS calls, which can do things like buffering, periodic defrag, and optimized writing techniques. A bit of a pain for sure when you are used to just writing to a text file and appending data, but the benefits are worth it in my situations.
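The buffer-then-flush idea is file-format agnostic. A minimal sketch in Python, where `write_chunk` stands in for whatever actually hits the disk (a TDMS write, a file append, etc. — the names and chunk size are illustrative):

```python
class ChunkedWriter:
    """Accumulate single samples and flush them in chunks, so the
    underlying write call runs as few times as possible."""

    def __init__(self, write_chunk, chunk_size=1000):
        self.write_chunk = write_chunk  # e.g. a TDMS or file-append call
        self.chunk_size = chunk_size
        self.pending = []

    def add_sample(self, sample):
        self.pending.append(sample)
        if len(self.pending) >= self.chunk_size:
            self.flush()

    def flush(self):
        """Write whatever is buffered; also call this on shutdown so the
        tail of the data is not lost."""
        if self.pending:
            self.write_chunk(self.pending)
            self.pending = []
```

The actor described above is essentially this, plus a timer for periodic flushes and defrags.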
-
Pretty neat. I couldn't actually get it to display live data for some reason. I tried in the source and the EXE, and when running both, only 8080 gave me a static image of the FP, and 8001 was page not available. That being said, I was able to follow the code and understand it, and see where you can add methods for other control types. This isn't anything new really, but as far as I know no one has released a free version that worked well and is fully open. Don't get me wrong, I think the toolkits others are making are awesome and I hope they make lots of money off of them. But in many cases the bosses just believe "software is free" and are often leery about buying 3rd party toolkits. Wezarp is another one you've probably seen doing some self promotion. Even NI has been dabbling in it at some level, according to their keynote this year. Until recently my development world was in a place where I would never really have a need for this type of software. But now I could see some places where this could be very handy, especially given my somewhat limited experience with web design. EDIT: Oh, and VIRemote is another.
-
Okay, new version 1.3.0:

- Adds support for having only the 2015 runtime engine installed
- Adds the ability to abort all VIs in private contexts (still not clones), found in the settings window because I didn't think this was a commonly needed feature
- Several bug fixes with the registry, and with screwing up which program controls which extensions
- Removed a few external dependencies (like pv.exe)
- Added a listbox to the settings for which file extensions this program takes over (by default VI and CTL)
- Moved the config file location from the Program Files to the ProgramData folder
- Simplified the open file process
- A few bug fixes for picking the right version of LabVIEW when selecting the only default option
- Added some asynchronous calls to help prevent a lockup from a non-responsive LabVIEW version

I really think this is the most stable version, and I have been using it for a few weeks without any real issues. I still have some locking-up issues when LabVIEW doesn't respond, but it doesn't happen often, and a restart of the tray program fixes it for now. I updated the page on my personal site, but here are the direct links. If for some reason there is an issue with it, closing it from the system tray (or Task Manager) and then relaunching LabVIEW should return things to the way they normally are. This is because on startup LabVIEW writes to the registry to take over the file extensions. Source 1.3.0 Installer 1.3.0
-
Either you misunderstood what I said, or you misunderstand what that expression means.
-
The problem is the customers. You must know that a requirements document, or rather a text document from a customer that very loosely describes what they want the software to do, is going to vary in format, wording, and technical level from customer to customer. I've seen plenty of documents that were supposed to describe the software a customer wanted, but were more of a stream of consciousness, describing what they wanted, including but not limited to things like "the operator won't get bored using the software". Good luck getting them to use a word like Boolean or Enum. This is hardly pseudo code, and it requires some amount of magic and hand waving when it comes to bidding on a project with this type of specification. For me the real meat of what needs to happen is a sit-down conversation with the end user, asking what they want it to do. Flesh out what they really need, and what they want. Understand priorities, and try to think of all the pitfalls, technical limitations, and contradictions where they ask for something in one place and contradict it elsewhere. I'm not saying it can't be improved, but I'm trying to explain why, in my world, so much effort is put into translating this document into an output like software. That being said, I agree that you can fail the CLD/CLA simply by not understanding what it wants. I remember hearing of someone taking the CLD coffee maker exam having never drunk coffee, or knowing what it was. If it were me in real life, I'd sit down with the customer and discuss what they really want and how they want it to work. I do.
-
Huge fan of TDMS over here, so personally I'd probably go with that, but I've heard good things about SQLite, so that is probably an option too. With TDMS the write is very fast in just about all cases; the read is where it can be less efficient. As mentioned before, file fragmentation is the biggest cause of long read and open times. In my logging routines I would have a dedicated actor for logging, and among other things it would periodically close, defrag, and re-open the file to help with this issue. But if you write in decent-sized chunks you might not have an issue. There are probably lots of ways to write a 4D array to a TDMS file. Obviously it is only supposed to be a 2D type of structure, where you have something like an Excel worksheet. But just like Excel you can have another layer, which is groups. So there we have a way of logging a 3D array, where you have groups, channels, and samples. How you decide to implement that 4th dimension is up to you: you could have many groups, or many channels in a group. Then in your read routine you'd want to encapsulate that, so as you said, you request X vs Y and it takes care of where in the file it needs to read. Another neat benefit of TDMS is the offset and length options on read. So you can read chunks of the file if it is too large, or just as a way to be efficient if the software can only show you part of it at a time anyway. Conceptualizing a 3D array of data can be difficult, let alone a 4D one. Regardless of file type and method, you are probably going to have a hard time knowing if it even works right. I wanted to write a test, but since I'm using made-up data I can't tell whether it works correctly.
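One possible way to fold the extra dimensions into the group/channel hierarchy — purely an assumption for illustration, the naming scheme is made up — is to encode two of the indices in the channel name:

```python
def tdms_location(i, j, k):
    """Map the first three indices of a 4D array [i][j][k][sample] to a
    (group, channel) pair; the 4th dimension becomes the samples appended
    to that channel. The "group_i" / "channel_j_k" scheme is hypothetical."""
    return ("group_%d" % i, "channel_%d_%d" % (j, k))
```

A read routine built on the same function can then translate a 4D index request back into a channel read with an offset and length, hiding the flattening from callers.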
-
The all-users Start Menu on Windows 7 is in this folder: C:\ProgramData\Microsoft\Windows\Start Menu. If you open the Start Menu and then right-click All Programs, you get a menu for opening this folder. I've never had to do this, but can an NI installer make a shortcut in this location?
-
Are you sure these toolkits aren't just free, supporting 2014 and 2015? I downloaded them from these links and installed in 2015, was never prompted about a trial, and ran examples in the sparse toolkit. I didn't run any GPU ones because I was missing the CUDA DLLs. The package that installs them is also used for LabVIEW 2014. http://sine.ni.com/nips/cds/view/p/lang/en/nid/210829 http://sine.ni.com/nips/cds/view/p/lang/en/nid/210525 I then installed the sparse toolkit in 2014 and it worked the same.
-
So I haven't had any new projects use 2015. I installed it, and the only real development I've done is updating reuse, confirming functionality, and doing some EXE and installer builds for applications that are pure LabVIEW. Calling .NET and system DLLs are the most interesting things these EXEs do. So it might be too early for me to say for sure, but I've had no issues with my limited use. To be honest, 2014 and 2013 SP0 have been pretty good. It probably is obvious, but NI has been focusing on stability. I guess what I'm saying is, if you were forced to use 2014 or 2013 SP0 when it first came out, I don't think you'd have had any real issues. I knew of some toolkits going free in 2013 (maybe it was 2014), I think report generation and PID; which ones were added in 2015?

EDIT: Okay, here is a bit more information from the 2014 release notes:

The LabVIEW 2014 Full and Professional Development Systems include all of the functionality of the LabVIEW PID and Fuzzy Logic Toolkit except the PID (FPGA) Express VI, which is part of the LabVIEW 2014 FPGA Module. LabVIEW 2014 Professional Development System now includes the following toolkits:
– LabVIEW Database Connectivity Toolkit
– LabVIEW Desktop Execution Trace Toolkit
– LabVIEW Report Generation Toolkit
– LabVIEW Unit Test Framework Toolkit
– LabVIEW VI Analyzer Toolkit
The following toolkit consolidations also provide additional functionality:
• LabVIEW 2014 Digital Filter Design Toolkit includes the LabVIEW Adaptive Filter Toolkit.
• LabVIEW 2014 Control Design and Simulation Module and LabVIEW 2014 Advanced Signal Processing Toolkit include the LabVIEW System Identification Toolkit.
• The LabVIEW 2014 FPGA Module includes the FPGA Compile Farm Toolkit, which is now known as the FPGA Compile Farm Server, and the FPGA IP Builder.
• The LabVIEW 2014 Real-Time Module includes the Real-Time Trace Viewer.
(LabVIEW 2014 Release Notes, page 19)
-
Yup, that's why you use randomized data, and use that same randomized data in the other functions you are trying to compare it to. At some point we are splitting hairs, but if we are seeing orders of magnitude of difference between the execution times of two methods, then I agree: updating the UI and thread swapping probably aren't adding that much uncertainty to our time measurement.
-
UI elements are updated and polled asynchronously. Having control values read or written during the timing measurement will change the results, especially if you are reading or writing elements continually in a loop. Proper timing should read the control before taking the start time and write to the indicator after taking the stop time. For bonus points, turn off automatic error handling and debugging.
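The same read-before/write-after rule applies in any language. A minimal Python sketch of the pattern (the harness shape is illustrative, not tied to any LabVIEW feature):

```python
import time

def benchmark(func, args, repeats=100):
    """Time only the code under test: inputs are materialized before the
    start timestamp ("read the control"), and the result is consumed only
    after the stop timestamp ("write the indicator")."""
    args = tuple(args)             # capture inputs up front, outside the timing
    start = time.perf_counter()
    for _ in range(repeats):
        result = func(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed / repeats  # report the result after the stop time
```

Anything touching the display (printing, plotting, updating indicators) happens after `elapsed` is captured, so it never contaminates the measurement.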
-
Schur Decomposition and Matrix Square root
hooovahh replied to jaehongyoon's topic in LabVIEW General
Post your code, but I'd guess that Altenbach over on the NI forums is more versed in advanced math in LabVIEW. You might want to post over there and see if anyone else picks it up. But again, post your results and code there if you do. -
LabVIEW EXE Running on a $139 quad-core 8" Asus Vivotab tablet
hooovahh replied to smarlow's topic in LabVIEW General
I can't say for certain what the limitation would be, only that when we were developing applications on the dual-core Atom we had at times a USB cDAQ or at times an ethernet-based cDAQ, which would do finite DAQ sampling, but primarily single-point stuff. It was a pretty slow system, and we really just needed to check signals once in a while and turn on solenoids once we reached a certain flow/pressure. My point is we didn't push the limits of data transfer. -
Most secure camera connection
hooovahh replied to infinitenothing's topic in Machine Vision and Imaging
So I've had good experiences with GigE, other than the fact that it is ethernet traffic (duh) and anti-virus / firewall software on the host PC can cause issues. Just be aware to set exceptions on the physical port, or disable it altogether. At the time it was some of those AVT cameras. I've used Camera Link before too; there were a couple from Imperx, and another that I can't remember at the moment, which converted Camera Link to GigE. That was handy for the large number of cameras we needed to work with running into a gigabit switch. I don't think USB3 is really all that new any more, but you'll likely find more options in other form factors. One benefit is that the increased power capability of USB3 means you can usually power the camera and get data over a single USB port. -
I was resistant to the NI Downloader at first. Why do I need another application doing who knows what? There are several downloaders (I'm looking at you, CNET) that bundle crapware you need to make sure you don't install, and I guess I thought it would be like that. But the NI ones seem to work well, resume has never been an issue, and I've noticed it resumes after a restart too. But of course if you need to use FTP they won't work.
-
Seriously? You posted this on a Sunday and waited 12 hours; please be patient next time. I confirmed the behavior in 2014 SP1 and 2015. In the hanging example I needed to zoom in and then try to scroll horizontally; zooming alone wasn't enough. I'd say this is probably some kind of corrupted VI or control. In 2015 I forced a recompile by pressing Ctrl+Shift and clicking the run button, and the VI that did hang no longer hangs.
-
LabVIEW EXE Running on a $139 quad-core 8" Asus Vivotab tablet
hooovahh replied to smarlow's topic in LabVIEW General
It's hard to say for sure, but based on the performance I'd say some DAQ work shouldn't be a problem. I'd guess you'd see significant performance issues if you tried taking something like N channels and N samples at a very high rate, and then tried to post-process the data. But this might be fine for, say, logging to TDMS and then displaying a subset of data. A quad-core Atom is nothing to sneeze at. In the past I actually deployed a test system that ran on a dual-core Atom running Windows XP. It had a couple of ethernet-based cDAQ systems performing a sequence of events for controlling solenoids and measuring mass flow. Ultimately it was used to measure the efficiency of semi-truck air dryers. We went with this PC because it was pretty small. It had room for one PCI slot and was passively cooled; it was just a big heat sink that we could put in the cabinet with all the equipment. We just attached a monitor on an arm and didn't need the whole second cabinet we were replacing. -
LabVIEW EXE Running on a $139 quad-core 8" Asus Vivotab tablet
hooovahh replied to smarlow's topic in LabVIEW General
Yeah, I think the same thing. Really the language doesn't matter: if a program was developed with a mouse and keyboard in mind, it probably doesn't work well on a touch screen. If I knew I'd be targeting a touch screen, I would probably need a different set of hack-ish UI tools. The first change that comes to mind is that right-clicking would probably never be used, or if it was, a custom popup would be needed. There probably wouldn't be a menu bar, or if there was, it would be replaced with custom large icons. And I imagine there would be more drag and drop, as well as larger controls. -
LabVIEW EXE Running on a $139 quad-core 8" Asus Vivotab tablet
hooovahh replied to smarlow's topic in LabVIEW General
Fair enough. I think people worry too much about a proper category. Generally the only time a move is requested is when the category has nothing to do with the subforum, and in those cases it takes like 5 seconds to move a post to another section. I guess speaking of performance, I'd expect any normal USB DAQ or DMM to work with this hardware. In a pinch I've used a myDAQ for a DMM and basic scope; having one of these tablets be the front end could be handy. I'd much rather have a VirtualBench, if, say, the cost were half what it currently is.