
Neville D

Members
  • Posts: 752
  • Joined
  • Last visited
  • Days Won: 2

Everything posted by Neville D

  1. QUOTE (sahara agrasen @ Oct 2 2008, 09:54 AM) After about 30 s of looking, here are 1146 results (http://search.ni.com/nisearch/main/p/sn/ssnav:sol/q/pxi) for using a PXI system. Maybe you need to work a little harder. Neville.
  2. I'm curious: why does it take 10-20 mins? (1) Do you have FPGA code? (2) Are you compiling for multiple targets simultaneously? (If so, is the code for each of those targets exactly the same?) (3) Do you have a slow PC or a lack of memory? I have a fairly large project (~1000+ VIs) for a PXI-RT target that compiles on my slow laptop in about 2 minutes or so. On the desktop it is faster. N.
  3. Check the NI website under PXI solutions. N.
  4. QUOTE (Eugen Graf @ Sep 26 2008, 06:11 AM) Not exactly. Having a .NET equivalent of "#bytes at serial port", followed by "serial read" in a tight 2ms loop for fast data communication at 115KBaud might show the weaknesses of the .NET implementation. There have been many discussions here talking about the performance issues with calling .NET assemblies in fast loops with LabVIEW, and some of Brian Tyler's recommendations in his blog on how to deal with it. Neville.
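A sketch of the "#bytes at port, then read" pattern from that discussion, in plain Python against an in-memory stand-in for the port (`FakePort` and `poll_once` are illustrative names, not part of any serial API):

```python
from collections import deque

class FakePort:
    """In-memory stand-in for a serial port, for illustration only."""
    def __init__(self):
        self._buf = deque()

    def feed(self, data):
        # Simulates the hardware delivering bytes into the RX buffer.
        self._buf.extend(data)

    @property
    def in_waiting(self):
        # Equivalent of "#bytes at serial port".
        return len(self._buf)

    def read(self, n):
        # "Serial read" of exactly n buffered bytes.
        n = min(n, len(self._buf))
        return bytes(self._buf.popleft() for _ in range(n))

def poll_once(port):
    """One pass of the tight polling loop: read only what has arrived."""
    n = port.in_waiting
    return port.read(n) if n else b""
```

Reading only `in_waiting` bytes keeps each iteration non-blocking, which is the whole point of the pattern in a fast 2 ms loop.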
  5. Did you know there is an upgrade to Vision 8.6.1? Also, since you were using 7.1 before, is your LV version compatible with Vision 8.6x? I am having a LOT of problems building an exe with Vision 8.6.1 and LV RT 8.6. I have narrowed it down to an IMAQ Find Circular Edge.VI; if it is excluded, the build succeeds. I will have to go back to LV 8.5.1 + Vision 8.6.1, where everything worked fine. N.
  6. When building an exe, the front panels of subVIs are automatically removed. Go into the build and manually uncheck the "remove front panel" checkbox. But this alone may not be enough to display the front panel. Re-architect your code to call the subVI and display it as a subpanel in your top-level VI. This way it is always visible. This screenshot shows how to open a VI as a subpanel, given the VI name and path. The loop is just a retry in case the open fails. You can add a tab to the top-level VI and embed the subpanel in it, so that you have additional screen space. Put the data coming out of the subpanel VI on a queue and read that in the top-level VI. N.
  7. QUOTE (bazookazuz @ Sep 23 2008, 03:15 PM) The Windows Performance Monitor I mentioned earlier, does EXACTLY that. N.
  8. QUOTE (schneidley @ Sep 23 2008, 12:36 PM) What exactly have you played with, and how? Using MAX? Your own code? What sort of relay are you trying to drive? Are you getting any output at all without the relay connected? Relays might need a lot more current to drive them than the DAQ card can provide. PS. See this: http://www.catb.org/%7Eesr/faqs/smart-questions.html N.
  9. QUOTE (Antoine Châlons @ Sep 23 2008, 12:39 AM) A quick recap of your experience here would be helpful to all, I imagine. N.
  10. QUOTE (jmccoy04 @ Sep 22 2008, 09:29 AM) Yes, all of the above is possible with LabVIEW. QUOTE (jmccoy04 @ Sep 22 2008, 09:29 AM) 1. I am not sure what SBC to get so that LabVIEW will run on it? Does the SBC have to have an OS for LabVIEW to run? LabVIEW needs an OS to run. You can use LabVIEW RT, which runs on the Pharlap OS and is supplied by NI. But to run RT you have to match the hardware requirements for the SBC very carefully. You are better off loading Windows XP or Vista onto an SBC and then running LabVIEW off it. It will make your life much easier, in the sense that setting up the hardware platform will be simple and running in Windows is trouble-free. QUOTE (jmccoy04 @ Sep 22 2008, 09:29 AM) 2. For the SBC will I need some sort of video capture card to connect the camera to, or are there cable adapters to change the camera connections? It depends what sort of camera you are using. If you go with a FireWire or Gigabit Ethernet camera, they should connect to a PC fairly simply. You will need a PCI FireWire card and ideally an additional Gig-E port on your PC, or you could share with a hub. You will need additional NI Vision software and Vision Acquisition drivers from NI. If you are building an executable with NI Vision, then you will need an additional Vision Runtime licence as well. QUOTE (jmccoy04 @ Sep 22 2008, 09:29 AM) 3. Will it be possible to create some sort of code that will do the things that I want it to do, such as looping the video recording until the accelerometer has measured past the specified limit? Yes, when it comes to the code, you are only limited by your imagination as to what is possible. Neville.
  11. QUOTE (bazookazuz @ Sep 19 2008, 03:24 PM) Take a look at the Windows Performance Monitor (http://decibel.ni.com/content/docs/DOC-2051), which uses .NET calls to access CPU load (including per-core values) and other system parameters. There is a similar VI for RT applications on NI's website. N.
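As an aside, on Linux the same per-core figures those .NET counters expose can be read straight from /proc/stat. A minimal parser sketch (the function name is mine; the columns are the standard per-CPU jiffy fields):

```python
def parse_proc_stat(text):
    """Extract per-core jiffy counters from /proc/stat contents.

    Returns {"cpu0": [user, nice, system, idle, ...], ...},
    skipping the aggregate "cpu" line.
    """
    cores = {}
    for line in text.splitlines():
        fields = line.split()
        if fields and fields[0].startswith("cpu") and fields[0] != "cpu":
            cores[fields[0]] = [int(v) for v in fields[1:]]
    return cores
```

Sampling twice and differencing the counters gives per-core utilization over the interval.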
  12. QUOTE (Daklu @ Sep 19 2008, 11:10 AM) The extra gig will definitely help a bit. Get a faster hard drive as well. I find applications on my desktop load much faster than on my laptop, which has a slower HD. QUOTE (Daklu @ Sep 19 2008, 11:10 AM) Does LV run in 64-bit OS's, and if so, does it benefit much from the increased memory that is available? Is the Labview dev environment designed to take advantage of multiple processors? My laptop has two cores but still runs like a dog. Does LV itself benefit from quad core processors? (Again, I'm referring to the dev environment, not LV applications I write.) I think LV does run on 64-bit Vista, but not in "native" mode, so it is not able to access the extra memory. There is an LV 64-bit beta program that you can join if you are interested. LV applications do take advantage of multi-core, but I don't see much performance difference in the development environment. Loading an application's project is still painfully slow compared to just opening the top-level VI. Starting up LV 8x is still quite slow (6-8 s) on my quad-core DELL desktop. Opening the Search menu for the first time after boot-up still takes a good 4-5 s as well. N.
  13. QUOTE (liuman @ Sep 19 2008, 08:25 AM) I would check all the serial settings: baud rate, stop bits, etc. Next I would check the EOL character. Does the instrument require a carriage return or similar at the end of the command string? Does it work with HyperTerminal? N.
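The EOL point above is the one most often missed; a tiny helper that frames each command with the terminator before sending (the name and default are illustrative, not any instrument's documented protocol):

```python
def frame_command(cmd, eol="\r"):
    """Append the instrument's end-of-line terminator before sending.

    Many instruments silently ignore a command that lacks the expected
    terminator (CR, LF, or CRLF) -- the symptom is exactly "no reply".
    """
    if not cmd.endswith(eol):
        cmd += eol
    return cmd.encode("ascii")
```

Check the instrument manual for whether it wants CR, LF, or CRLF, and pass that as `eol`.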
  14. QUOTE (BrokenArrow @ Sep 19 2008, 06:54 AM) Keep in mind, LVOOP is NOT yet supported on LV-RT. N.
  15. As far as I can remember, the USB DAQ cards should work fine with DAQmx but are probably NOT supported under Traditional DAQ. Get an older version of DAQmx and you should be OK. Why don't you call NI tech support? They should be able to guide you as to which driver version works with LV 7x and your card. N.
  16. QUOTE (Antoine Châlons @ Sep 17 2008, 06:59 AM) I have a large 1000-VI Vision project as well. The upgrade didn't go as smoothly. For some reason, after mass compile, LV 8.6 would crash. I narrowed it down to a particular VI and then to the IMAQ image display in it. Replacing this display and saving the VI caused the project to open correctly. Then I had problems with building exes under LV and LV-RT. LV 8.6 has a new web server for RT as well, and this web server interferes with the install process when creating an image of a PXI using NI's System Replication tool (http://zone.ni.com/devzone/cda/tut/p/id/3937). Specifically, an image of an RT target cannot be applied to another RT target of the same type, since somehow the web server files or something are different. This is a very important feature for us, allowing upgrades to PXIs remotely. I don't remember what the problems with the LV exe build were. Something to do with disconnecting the type-defs? At this point I gave up and went back to LV 8.5.1. At least Vision 8.6 and its upgrade to 8.6.1 went smoothly, and that has a lot of good features I can use right away in LV 8.5.1. I don't really miss the new features of LV 8.6. If mass compile crashes LV 8.6, don't open the whole project; open up individual folders and recompile them one at a time. N.
  17. QUOTE (hfettig @ Sep 13 2008, 05:06 AM) I think it was introduced to better support FPGA type applications. I'm not exactly sure how, since I don't use the FPGA stuff. N.
  18. QUOTE (Antoine Châlons @ Sep 9 2008, 06:24 AM) Did you see the NI System Monitor API? Looks like it can do everything you need. N.
  19. What is the value you are reading? Hopefully it isn't too small. Try using NI MAX to see if the value reads correctly before using your own code. How have you connected the inputs? Read up on referenced single-ended, non-referenced single-ended, and differential connections. Use differential connections for the best noise rejection. N.
  20. QUOTE (bmoyer @ Sep 8 2008, 12:24 PM) Well, some of it is. If you look through your own link, someone does mention that the R-B planes issue can be fixed by swapping them. And since no source code has been posted for the DLLs, you can't fix the issue yourself either. QUOTE (bmoyer @ Sep 8 2008, 12:24 PM) Someone should accurately describe any known issues so that people are properly warned before wasting their time implementing/debugging this if there is in fact a problem. Bruce Well, I just described it for you; and if you trawl through LAVA, you will find links that show the problem as well. N.
  21. QUOTE (bmoyer @ Sep 8 2008, 05:20 AM) Beware: the IMAQ JPEG Encode and Decode VIs downloadable from NI's website have a memory leak that will crash the application if run over a period of time. I don't remember specifically whether it's the Encode or the Decode, but I would use either of them with great care. Also, the Encode VI swaps the red and blue color planes, so an image decoded by any application other than the supplied Decode will appear blue-ish. If you have Vision, there is an alternative to the Encode called IMAQ Write String.vi (which will also let you convert any type of image to a string), but I don't know of an alternative to the Decode VI. Neville.
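The plane swap described in that post is its own inverse, so an affected image can be repaired by swapping the planes back after decoding; a sketch on pixels held as (r, g, b) tuples (the function is illustrative, not a Vision VI):

```python
def swap_red_blue(pixels):
    """Exchange the red and blue planes of an RGB image.

    `pixels` is a flat list of (r, g, b) tuples; applying this twice
    returns the original image, since the swap is self-inverse.
    """
    return [(b, g, r) for (r, g, b) in pixels]
```

The same one-liner fixes the "blue-ish" look whether the swap happened on encode or on decode.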
  22. QUOTE (shoneill @ Sep 5 2008, 05:00 AM) For every monochrome Basler, there is an equivalent color model. They are quite prevalent. N.
  23. I don't know what you're doing, but it seems to work OK when I tried it. I had to disconnect the inflection-data type-def to make it work, and added a boolean to select between the two data sets; here is my edited version to compare with. Neville.
  24. QUOTE (Aaron L @ Sep 3 2008, 10:35 AM) Great! As a side note, if you are using the Express VI, double-click it, convert it to a VI, and then edit it to remove unnecessary functionality and data conversions for better performance. Digging around inside might also point you to why you seem to be losing the last point on resample. <EDIT>: Playing around with the Open or Closed Interval boolean in the Express VI seems to generate the extra last point, but I can't quite get my head around the concept at the moment. N.