asbo
Members
  • Posts: 1,256
  • Days Won: 29

Everything posted by asbo

  1. I'm not usually one for personal advice, but you might want to see a doctor about that.
  2. Any ideas what makes up the performance gap between Debug and Debug + PWD?
  3. Can you reproduce this in safe mode with networking? The error message basically means that something outside of your (application's) control is aborting the connection, so I would try and eliminate potential sources of that. Do you have problems with anything else TCP-based, in or out of LabVIEW?
  4. Ha! If only I'd clicked through... They referred only to "LVCompare.exe", so I assumed...
  5. Backsaved to LabVIEW 8.0. MBcrc.LV80.vi Do you know what the changes were in this VI?
  6. Based on the narrow scope of this KB article (Comparing Two VIs in LabVIEW), other platforms don't get a compare tool? That seems crazy, though, since it's only written in LV anyway (I think).
  7. https://en.wikipedia...ife_cycle#Alpha Before having read it, I agreed with the feature-complete notion of a beta, which is primarily what separates alpha from beta in my mind.
  8. One solution is to close Windows Explorer when starting your test application (or to never allow it to start on the operator account in the first place). Alternatively, use a file format with protection built in. For example, I have a project which reports to XLS format and uses the built-in spreadsheet protection functionality. Even knowing where the file is, the operator cannot modify the sheet without knowing the password. (There's a rough sketch of this idea after this list.)
  9. http://zone.ni.com/reference/en-XX/help/371361J-01/lvinstio/visa_flush_i_o_buffer/ is about as illuminating as it gets. The node only flushes the receive buffer by default, so it's possible you've left something in the transmit buffer and your device is echoing it back? (See the buffer-clearing sketch after this list.)
  10. Yeah, this is the proper solution, but it would be nice if I could tell the installer "assume this software is present" instead of it silently including extra components. I think that's probably what annoyed me the most: there was zero indication that I was auto-including other software. It still kills me to have a 321MB "Hardware Configuration Installer". Blech. As for the auto-import behavior - if something fails, it will prompt you to retry the auto-import or to do it manually. Must have changed since the last time you tried it?
  11. I should have expected that response. More often than not, I don't come in at ground level on a project and OOP was not part of the architecture in most projects I've worked on. I've been applying it as I can with newer projects, though.
  12. I find that if I use a case from more than three other cases, it is doomed to become a subVI instead. The only annoying part is that I tend to pass in my clustersaurus into it, which ruins the front panel.
  13. This is an option for NI installer build specifications. The gist is, you can include a .nce file which gets merged into (or replaces) the MAX config on the target machine at install time. I thought, "Oh, cool!" and ticked it recently for a project. It wasn't until I got to distributing the installer that I realized it ballooned my installer to 330MB! Without that option ticked, it's around 9MB. The installer will automatically include the NI System Configuration Runtime (and possibly MAX itself?), but unfortunately does not indicate this on the Additional Installers page, else I might have caught it. All in all, I'm disappointed there's such a hefty tax for this feature. Does anyone use this regularly? More importantly, has anyone used this, been appalled at the ballooned install size, and made a nice, lightweight alternative? Is it even worth it, I wonder? We have a couple VIs in our reuse library which could facilitate this, but it would have to be done using an executable which is automatically run after the installation finishes.
  14. Sure it is. LV Speak is open source and implements speech recognition through the Microsoft Speech SDK.
  15. Sounds like you need to use an XY Graph or XY Chart.
  16. Come on, guys, at least tell him there are loads of examples that probably do this exact thing. To be fair, James D is being paid to write those answers.
  17. If you were to swap to using Menu Rings instead of Text Rings, you could avoid this problem altogether because the menu ring doesn't have up/down controls.
  18. I've used tools like GPU-Z to monitor temperature and clock speeds of GPUs, but I've never heard of pulling utilization stats. I know there was a beta of a CUDA toolkit of sorts (CUDA being nVidia's flavor of GPU interface) for LabVIEW perhaps a year ago. It's no longer in the list (it was released with LV2012), but that was the first I'd heard of anything from NI using GPU resources. (There's a utilization-monitoring sketch after this list.)
  19. Well, you'd certainly be the first one to try that technique. The writeup is pretty interesting. I've implemented a MIPS processor, as well as a variety of custom logic, in VHDL and really liked playing with the language. Maybe I'll be picking up a Spartan 3E soon...
  20. A bottle of Bud?? You're in Germany, you can do better than that!
  21. In the simplest of solutions, you could use a matrix switch to route your four USB lines to each DUT. NI has some Ethernet-based hardware that you could use to pull this off, I think. We have had to do USB switching before, and used the J-Works SSB118. However, it doesn't fulfill your Ethernet requirement. As for 1-to-all versus 1-to-1, I don't think the former would work because each host (your DUT) expects that only one host will be on the bus - there would probably be a mess of collisions if you plug x DUTs into one dongle.
  22. My money's on the VISA version that LV2011 forced you to install. You won't be able to go back to your old version without uninstalling LV2011 (and maybe not even then).
  23. I agree with rolf - there *shouldn't be* any exceptions where a VI configured to inline won't inline, but only NI could tell us for sure. The fact that some VIs see a performance decrease from inlining tells me that, at minimum, LV is not disabling inlining conditionally as an optimization. The only conditions I know of that break inlining are the ones flagged by the exclamation icons on the Execution settings page of the VI's properties. In non-inlined code, it is guaranteed that a subroutine-priority VI will execute as one clump (that is, atomically). I would hope that holds true for inlined code, but that's something else I can't speak to.
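
A rough sketch of the spreadsheet-protection idea from post 8. The original used LabVIEW report generation to XLS; this sketch uses Python and openpyxl (so an .xlsx file instead), and the column names, file name, and password are made-up placeholders:

      # Sketch only: openpyxl/.xlsx stand in for the LabVIEW-to-XLS reporting in the post.
      from openpyxl import Workbook

      wb = Workbook()
      ws = wb.active
      ws.append(["Serial", "Result"])        # hypothetical report columns
      ws.append(["SN-0001", "PASS"])

      # Lock the sheet so an operator who finds the file can't edit it
      # without the password ("changeme" is a placeholder).
      ws.protection.sheet = True
      ws.protection.password = "changeme"

      wb.save("report.xlsx")                 # placeholder path

Sheet protection alone covers the "operator can't edit the data" case from the post; the spreadsheet format also offers workbook-level protection if you want to stop sheets being added or removed.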
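On the buffer point in post 9, a minimal sketch of "clear everything before you query", written with Python and pyvisa rather than the LabVIEW VISA Flush I/O Buffer node discussed in the post; the resource name is a placeholder:

      import pyvisa

      rm = pyvisa.ResourceManager()
      inst = rm.open_resource("ASRL1::INSTR")   # placeholder resource name

      # Device clear discards pending I/O on the session, so a command left
      # sitting in the transmit buffer can't be echoed back into the next read.
      inst.clear()

      print(inst.query("*IDN?"))

      inst.close()
      rm.close()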
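And for the utilization question in post 18: GPU utilization, temperature, and clock stats are exposed through nVidia's NVML library. A minimal sketch using the pynvml Python bindings (an assumed route; the post itself only mentions GPU-Z and the old NI CUDA toolkit beta):

      # pip install nvidia-ml-py   (provides the pynvml module; nVidia GPUs only)
      import pynvml

      pynvml.nvmlInit()
      handle = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU

      util = pynvml.nvmlDeviceGetUtilizationRates(handle)    # % busy (GPU and memory)
      temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
      clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)

      print(f"GPU: {util.gpu}%  memory: {util.memory}%  "
            f"temp: {temp} C  SM clock: {clock} MHz")

      pynvml.nvmlShutdown()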