Posts posted by asbo
-
Any ideas what makes up the performance gap between Debug and Debug + PWD?
-
Can you reproduce this in safe mode with networking? The error message basically means that something outside of your application's control is aborting the connection, so I would try to eliminate potential sources of that. Do you have problems with anything else TCP-based, in or out of LabVIEW?
-
Ha! If only I'd clicked through... They referred only to "LVCompare.exe", so I assumed...
-
Backsaved to LabVIEW 8.0.
There seems to be a bug in the Modbus CRC calculation vi in the NI library. Replace it with the attached VI that I got from Steve Brooks on the NI website.
Do you know what the changes were in this VI?
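For anyone comparing implementations: whatever the bug in the NI VI was, a correct Modbus RTU CRC-16 uses init 0xFFFF and the reflected polynomial 0xA001, with the low byte transmitted first. A plain-Python sketch of the standard algorithm (this is the textbook version, not NI's or Steve Brooks's actual VI):

```python
def modbus_crc16(data: bytes) -> int:
    """CRC-16/MODBUS: init 0xFFFF, reflected poly 0xA001, no final XOR."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

# Standard check value: CRC-16/MODBUS over ASCII "123456789" is 0x4B37.
print(hex(modbus_crc16(b"123456789")))  # → 0x4b37
```

A replacement VI should match this bit-for-bit against the 0x4B37 check value before you trust it on the wire.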
-
One of my ideas finally made it to 'In Development'! WooHoo!
-
Based on the narrow scope of this KB article (Comparing Two VIs in LabVIEW), other platforms don't get a compare tool? That seems crazy, though, since it's only written in LV anyway (I think).
-
-
I'm curious how people decide whether they are doing alpha or beta testing? I've always considered it alpha testing if there are large chunks of functionality that have not been implemented, the UI is unfinished, etc. Beta testing is when the software is mostly (> ~80%) feature complete, the UI is mostly in place, etc. I've had others tell me they don't consider the software to be in beta testing until it is feature complete and all you're looking for is bugs. Thoughts?
https://en.wikipedia...ife_cycle#Alpha
Before having read it, I agreed with the feature-complete notion of a beta, which is primarily what separates alpha from beta in my mind.
-
One solution is to close Windows Explorer when starting your test application (or never allowing it to start on the operator account in the first place).
Alternatively, use a file format with protections built in. For example, I have a project which reports to XLS format and utilizes the built-in spreadsheet protection functionality. Even knowing where the file is, the operator cannot modify the sheet without knowing the password.
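A lighter-weight variant of the same idea (not what the post above used — that relied on the XLS sheet password — but in the same spirit): mark the report file read-only after writing it, so a casual edit from Explorer or Notepad fails. A minimal sketch in Python; the `report.csv` name is just for illustration:

```python
import os
import stat
import tempfile

def make_read_only(path: str) -> None:
    """Clear all write bits so a casual edit (Explorer, Notepad) fails."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

# Example: write a test report, then lock it down.
report = os.path.join(tempfile.mkdtemp(), "report.csv")
with open(report, "w") as f:
    f.write("serial,result\n123,PASS\n")
make_read_only(report)
```

This only deters casual tampering — an operator with sufficient permissions can clear the attribute — which is why in-file protection like the spreadsheet password is the stronger option.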
-
http://zone.ni.com/reference/en-XX/help/371361J-01/lvinstio/visa_flush_i_o_buffer/ is about as illuminating as it gets.
The node only flushes the receive buffer by default, so it's possible you've left something in the transmit buffer and your device is echoing it back?
-
I almost feel like you could make 2 installers: one that installs MAX, DAQmx, VISA, and any other things you need (NI-DMM, Switch, NI-Power), and then a second installer that is your 9MB program which imports your nce as a post-install function. Then have the user install both if it is the first time on a new machine, or just run your smaller installer if it is an upgrade. That way, when you push a new version of your code, you will only have to deliver a new 9MB file instead of a new 330MB file, with 321MB being the same as the last release.
Yeah, this is the proper solution, but it would be nice if I could tell the installer "assume this software is present" instead of it silently including extra components. I think that's probably what annoyed me the most: there was zero indication that I was auto-including other software. It still kills me to have a 321MB "Hardware Configuration Installer". Blech.
As for the auto-import behavior - if something fails, it will prompt you to retry the auto-import or to do it manually. Must have changed since the last time you tried it?
-
Use a class?
You've been around a long time, asbo... I thought you had already jumped to OOP?
I should have expected that response. More often than not, I don't come in at ground level on a project and OOP was not part of the architecture in most projects I've worked on. I've been applying it as I can with newer projects, though.
-
It is perhaps way too easy to carry on using cases down into the low-level operations that really should be subVIs. I did make that mistake at first.
I find that if I use a case from more than three other cases, it is doomed to become a subVI instead. The only annoying part is that I tend to pass in my clustersaurus into it, which ruins the front panel.
-
This is an option for NI installer build specifications. Gist is, you can include a .nce file which gets merged into (or replaces) the MAX config on the target machine at install time. I thought, "Oh, cool!" and ticked it recently for a project. It wasn't until I got to distributing the installer that I realized it ballooned my installer to 330MB! Without that option ticked, it's around 9MB. The installer will automatically include the NI System Configuration Runtime (and possibly MAX itself?), but unfortunately does not indicate this on the Additional Installers page (else I might have caught it). All in all, I'm disappointed there's such a hefty tax for this feature.
Does anyone use this regularly? More importantly, has anyone used this, been appalled at the ballooned install size, and made a nice, lightweight alternative? Is it even worth it, I wonder? We have a couple VIs in our reuse library which could facilitate this, but it would have to be done using an executable which is automatically run after the installation finishes.
-
Sure it is. LV Speak is open source and implements speech recognition through the Microsoft Speech SDK.
-
Frequency is visualized as Y and time as X; how do I visualize frequency as X and amplitude as Y?
Sounds like you need to use an XY Graph or XY Chart.
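What the OP is after is a spectrum: amplitude on Y against frequency on X. In LabVIEW that's typically an FFT or Spectral Measurements VI wired to a graph; the underlying math can be sketched in plain Python (a naive DFT, for illustration only — real code would use an FFT):

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive DFT; returns |X[k]| for k = 0..N-1 (amplitude per frequency bin)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# A sine at 2 cycles per record should peak at frequency bin k = 2.
n = 16
signal = [math.sin(2 * math.pi * 2 * t / n) for t in range(n)]
mags = dft_magnitudes(signal)
peak_bin = mags.index(max(mags[: n // 2]))  # positive frequencies only
print(peak_bin)  # → 2
```

Plotting `mags[: n // 2]` against the bin index (scaled by sample rate / N to get hertz) gives exactly the frequency-on-X, amplitude-on-Y view the OP wants.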
-
Come on, guys, at least tell him there are loads of examples that probably do this exact thing.
http://forums.ni.com...SX/td-p/1980101
James-D's answer was pretty reasonable, I thought.
To be fair, James D is being paid to write those answers.
-
If you were to swap to using Menu Rings instead of Text Rings, you could avoid this problem altogether because the menu ring doesn't have up/down controls.
-
Never heard of that, never read that in any doc...
Is there a way in Windows to monitor the GPU's activity?
I've used tools like GPU-z to monitor temperature and clock speeds of GPUs, but I've never heard of pulling utilization stats.
I know that there was a beta for CUDA toolkit of sorts (nVidia's flavor of GPU interface) with LabVIEW perhaps a year ago. It's no longer in the list (it was released with LV2012), but that was the first I'd heard of anything from NI using GPU resources.
-
Pretty interesting, which is why I told you about it. You posting it kinda feels like you are stealing my internet points. I'll just have to make some posts around here that don't add anything to the conversation to get my post count up.
Well, you'd certainly be the first one to try that technique.
The writeup is pretty interesting. I implemented a MIPS processor, as well as a variety of custom logic, in VHDL and really liked playing with the language. Maybe I'll be picking up a Spartan 3E soon...
-
(probably late one sunday evening with a bottle of Bud close by!!).
A bottle of Bud?? You're in Germany, you can do better than that!
-
In the simplest of solutions, you could use a matrix switch to route your four USB lines to each DUT. NI has some Ethernet-based hardware that you could use to pull this off, I think.
We have had to do USB switching before, and used the J-Works SSB118. However, it doesn't fulfill your Ethernet requirement.
As for 1-to-all versus 1-to-1, I don't think the former would work because each host (your DUT) expects that only one host will be on the bus - there would probably be a mess of collisions if you plug x DUTs into one dongle.
-
My money's on the VISA version LV2011 forced you to install. You won't be able to go back to your old version without uninstalling LV2011 (and maybe not even then).
-
I agree with rolf - there *shouldn't be* any exceptions where a VI configured to inline won't inline, but only NI could tell us for sure. The fact that some VIs see a performance decrease from inlining tells me that, at minimum, LV is not disabling inlining conditionally as an optimization. The only conditions I know of that break inlining are those flagged by the exclamation icons in the Execution settings pane of the VI.
Also, "subroutine" for inline code makes no sense, unless there are some catches somewhere.
In non-inlined code, it is guaranteed that a subroutined VI will execute as one clump (that is, atomically). I would hope that holds true for inlined code, but that's something else I can't speak to.
Symbio GDS unable to activate with community key (posted in Object-Oriented Programming)
I'm not usually one for personal advice, but you might want to see a doctor about that.