
Neville D

Members · Posts: 752 · Days Won: 2
Everything posted by Neville D

  1. I would not try to update a GUI more than five times a second; any faster is overkill, and an update every 0.5 s is usually good enough. You could use the "Defer Panel Updates" property to limit the number of redraws, and then you wouldn't need to worry about the speed of your processor. Make sure that: (1) all loops in your code have a Wait (ms) with an appropriate time (even 0 ms is better than nothing); (2) you limit transparent graphical objects on your panel; (3) objects do not overlap. Note also that raw processor speed is not a measure of system performance with LabVIEW: if you are doing a lot of array manipulation, poor LabVIEW code will require more and more contiguous memory, and the resulting calls to the LabVIEW memory manager drastically affect performance. Read the LabVIEW Performance and Optimization application note for further info. Neville.
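The throttling idea above can be sketched in text form (Python used purely for illustration; the acquisition and GUI calls are stand-ins, not a real LabVIEW API):

```python
import time

def run_loop(read_sample, update_gui, n_iterations=10, update_interval_s=0.5):
    """Acquire as fast as needed, but redraw the display at most
    once every update_interval_s seconds."""
    last_update = 0.0
    for _ in range(n_iterations):
        latest = read_sample()                    # fast acquisition path
        now = time.monotonic()
        if now - last_update >= update_interval_s:
            update_gui(latest)                    # slow display path, rate-limited
            last_update = now
        time.sleep(0.001)                         # always yield some time to the OS
```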
  2. Why don't you make a subpanel on one of the tab sheets and display your subVI in it? That way you can show the front panel of the VI you are running with a much simpler UI. This is the quick-and-dirty way to do it. Another way is to develop an architecture in which data is moved from the acquisition loop to the UI loop for display; then it is just a matter of selecting what you want to display. Neville.
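The acquisition-loop/UI-loop split mentioned above is a producer/consumer pattern. A minimal sketch in Python, where `queue.Queue` stands in for a LabVIEW queue and the names are illustrative, not from the post:

```python
import queue
import threading

def acquisition_loop(out_q, n_samples):
    # Producer: the (simulated) acquisition pushes data into the queue.
    for i in range(n_samples):
        out_q.put(i)
    out_q.put(None)              # sentinel: tell the UI loop to stop

def ui_loop(in_q, display):
    # Consumer: the UI loop pulls data and decides what to display.
    while True:
        item = in_q.get()
        if item is None:
            break
        display(item)

q = queue.Queue()
shown = []
t = threading.Thread(target=acquisition_loop, args=(q, 5))
t.start()
ui_loop(q, shown.append)
t.join()
```

The queue decouples the two loops, so a slow display never stalls the acquisition.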
  3. Take a look at Scott Hannah's Serial library; it already has a simple terminal emulator. http://sthmac.magnet.fsu.edu/labview/vi_library.html Neville.
  4. 0 ms will work, but add a reasonable delay. If you can live with 50 ms updates, then use 50 ms. It's just good practice not to hog all the processor time if you don't need it. Neville.
  5. AAAHH....!! I remember doing that once on an RT Target. If an error was generated in the RT code, I would automatically reboot the RT target, and the application was set to run at startup. I think I used the Reboot VI.. Let me see if I can dig out what I used.. Will get back to you tomorrow. Neville.
  6. Lots of free, useful tools for array/number/Boolean/file/string manipulation, etc. Yes, you need to download and install the OpenG Commander, then start the Commander from the Tools menu and download and install all the separate packages. This is so that each package can be updated independently of all the other OpenG packages. Installing it will do nothing to your original LV install. It just places a bunch of useful VIs in your user.lib folder, and various other files in your other LV folders (all of it pretty harmless). As far as I know, it does not make any entries in your registry (it is, after all, platform-independent). The OpenG Builder enhances the functionality of your application builder (I haven't used it); you can't use it if you don't already have the app builder as part of your original LV install. It's great stuff.. give it a try. Open up the OpenG Commander for a lesson in good LV programming style. Neville.
  7. Are you using LV 8? Then all the OpenG stuff appears in a single subpalette. If using a version prior to 8, you should have a full set of tools in all the appropriate locations. Check your user.lib folder in NI/LV x.x/user.lib; if correctly installed, you should see a whole bunch of OpenG tools there. Then cruise over to the block diagram, right-click to bring up the palette and pin it, click Options (top right corner), and select the dynamic palette view. You should see the OpenG functions in each of the subpalettes. Check the array functions to see if you have the OpenG array tools. I would really urge you to download and install these tools. They are well written, extremely useful and in general bug-free. Neville.
  8. They are different implementations of the same thing. The feedback node was introduced later (in LV 7 or so) to clean up the diagram a bit, so you don't have to route wires from one end of the diagram to the other. It is just a matter of taste which you use. I don't think there is any speed advantage. I do remember there being a bug with the feedback node a long time ago, but that was probably fixed. There should be a note in the LV Programming manual explaining the finer points of both. Neville.
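For readers coming from text languages: both a shift register and a feedback node simply carry a value from one loop iteration into the next, which in Python is nothing more than a variable reassigned inside the loop. A rough analogy, not LabVIEW code:

```python
def running_sum(values):
    """Accumulate a sum across loop iterations, the way a shift
    register or feedback node carries state in a LabVIEW loop."""
    acc = 0            # plays the role of the shift register's initializer terminal
    for v in values:
        acc = acc + v  # the new value is "fed back" into the next iteration
    return acc
```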
  9. Sorry, you lost me.. I still don't understand what you are trying to do (or why). Neville.
  10. Things that affect performance:
      1. Overlapping graphic objects.
      2. Transparent objects overlapping each other or other graphics.
      3. New-style (6i) controls/indicators have a bit more overhead because of the shading etc.
      4. Static graphic objects (boxes, arrows etc.), if placed, should be placed at the BACK (Group menu > Move to Back).
      5. If placing things like logos, do not overlap them with other objects. If you have no choice, move them to the back.
      6. Do you really need a 17 Hz update on the panel meters? Use the "Defer Panel Updates" property to update indicators every 250 ms or so; anything faster is useless anyway (unless captured on a plot).
      7. Get the latest drivers for your graphics card (this may not make too much of a difference).
      8. Increase system memory to 1 GB; 512 MB is barely enough nowadays.
      9. Upgrade to a newer laptop if you have the option; I have had display-related issues with older laptops.
      10. Separate user-interface updates into a loop of their own, apart from the acquisition/control code.
      Don't worry about graphics on a tab sheet that is not displayed; those won't affect performance, since hidden pages don't have to be drawn. Calculations etc. are usually very fast in LabVIEW; however, avoid building arrays in a loop, which calls the memory manager and will slow down your system. You can add logic to display every alternate, or every 3rd or 4th, value based on the length of the GUI queue: if there is too much backlog, display less data. Neville.
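The backlog-based decimation suggested at the end of the list above can be sketched as follows (Python for illustration; the thresholds are made up, not from the post):

```python
def decimation_stride(backlog):
    """Pick how many queued points to skip based on how far
    behind the UI has fallen. Thresholds are illustrative only."""
    if backlog < 10:
        return 1        # keeping up: show every point
    elif backlog < 50:
        return 2        # slightly behind: show every other point
    else:
        return 4        # badly behind: show every 4th point

def thin(points, stride):
    """Keep only every `stride`-th point for display."""
    return points[::stride]
```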
  11. What "hardware" do you want to restart? If you mean the LV-RT platform itself, you could reboot it with a VI on your HOST system, targeting the RT platform. It's called Reboot RT Controller.vi or something like that; I don't have RT loaded on my PC at the moment. Neville.
  12. One other thing: when you install LV 8 or one of the newer DAQ drivers (from NI-DAQ 7.5 onwards, I think), the older-style NI-DAQ does NOT get installed by default; you have to expressly check the option to install the Traditional NI-DAQ driver. To the best of my knowledge, every new version of NI-DAQ holds the Traditional DAQ driver at 6.9.3 while upgrading and expanding the DAQmx part of the driver. Neville.
  13. Hook up the device to one PC, and then in NI-MAX on the other PC, configure the device to be accessible as a "remote VISA" device (Tools > NI-VISA > VISA Options > General Settings > Remote). That way you have access to it through VISA and can use the same software on both PCs, one accessing the device "locally" and the other using the remote VISA functionality. I believe you can also use this method to get access to the serial ports of a different PC. Speed MIGHT be an issue if you are getting large amounts of data from the instrument. PS: I haven't tried it this way before, but it looks like it should work. Neville.
  14. We already received our LV 8 PDS CDs last month in North America. Maybe check with your local NI rep to see when you will receive them in Poland. Of course, the installation of the LV 8 debug license was such a pain that I have not bothered to install 8 on my development PC. Neville.
  15. Hi Fei, Remove the shift register which is used to collect all the message strings and append the new ones to the old list. See attached image: Neville.
  16. Hi, Here is an implementation of a lookup table. The LUT can be either a 2D array of an X column and a Y column, OR a 1D array of (X,Y) clusters. Given an X, it will interpolate a Y value. Note that the LUT must be monotonically increasing in X for the LV function to work! You can add additional logic as needed to check for out-of-range X values, generate errors, interpolate X given Y, etc. Cheers, Neville. Download File:post-2680-1134673935.vi
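Since the attached VI needs LabVIEW to open, here is the same lookup-table idea sketched in Python (illustrative only; the clamping behavior at the ends is my assumption, not necessarily what the attached VI does):

```python
from bisect import bisect_left

def lut_interpolate(xs, ys, x):
    """Linear interpolation in a lookup table.
    xs must be monotonically increasing, as the post notes."""
    if len(xs) != len(ys) or len(xs) < 2:
        raise ValueError("need two columns of equal length >= 2")
    if x <= xs[0]:
        return ys[0]            # clamp below the table range (assumption)
    if x >= xs[-1]:
        return ys[-1]           # clamp above the table range (assumption)
    i = bisect_left(xs, x)      # first index with xs[i] >= x
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
```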
  17. Your email is very unclear. How are you communicating with the device: GPIB, serial, USB, or Ethernet? I am guessing you are using a canned example VI to acquire data? To speed up the processing, structure your code to set up the instrument in the required mode first, and then put just the data-acquisition commands in a loop with the required timing. Maybe you can set up the instrument in burst mode to acquire all the values and then transmit an array of values. I am not sure what you mean by "rapidly", but if you want fast, deterministic control, use an analog control line on the instrument (if it has one): connect the control line to an NI DAQ card, and then control the voltage output as rapidly as you desire. This should give you much finer control than GPIB. Neville.
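The structure suggested above, configure once and loop only the fast query, looks like this in sketch form (Python; the Instrument class is a stand-in, not a real driver API):

```python
class Instrument:
    """Stand-in for an instrument driver, to show loop structure only."""
    def __init__(self):
        self.configured = 0
        self.reads = 0
    def configure(self, mode):
        self.configured += 1   # slow: send all setup commands once
    def read(self):
        self.reads += 1        # fast: only the data query runs per iteration
        return self.reads

def acquire(inst, n):
    inst.configure("burst")    # setup stays OUTSIDE the loop
    return [inst.read() for _ in range(n)]
```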
  18. Read the above reply AGAIN, and click on the hyperlinks. Maybe spend less time playing the guitar and more time working with LabVIEW?? Neville.
  19. Hi Eric, You are not really meant to use the "stop" button on the toolbar to stop your code; that is an ABORT button. It's like pulling the plug on your coffee maker to stop it. Write your code to exit, clean up, close files, etc. when the user presses a front-panel Boolean of your choice. For your serial code, when the user hits this STOP button, use VISA Close to close the serial session and free up the serial port for access again by either LabVIEW or another program (like HyperTerminal). Then you don't need your series of steps to regain access to the serial port in LV. Hope this helps. Neville.
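The shutdown pattern described above, always releasing the port when the loop ends for any reason, can be sketched in Python (FakePort is a stand-in for a VISA/serial session, not a real API):

```python
class FakePort:
    """Stand-in for a serial/VISA session."""
    def __init__(self):
        self.is_open = True
    def read(self):
        return "data"
    def close(self):
        self.is_open = False

def run_until_stopped(port, should_stop):
    try:
        while not should_stop():   # the front-panel STOP button's role
            port.read()            # normal serial traffic
    finally:
        port.close()               # always free the port for other programs

port = FakePort()
count = [0]
def should_stop():
    count[0] += 1
    return count[0] > 3            # "press stop" after a few iterations

run_until_stopped(port, should_stop)
```

The `finally` clause plays the role of the cleanup code after the LabVIEW loop: the port is released even if an error aborts the loop body.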
  20. The EVAL version of LabVIEW may not have hardware support. But from your comments it looks like, at the very least, you need to install VISA for the serial-port stuff to work. Neville.
  21. I have used the USB-6015 with 5 analog input channels with no problems. I am using an old Toshiba laptop which has only 2 USB ports, so I have the DAQ connected to a powered hub along with a USB mouse, and a USB-serial converter on the other USB port direct to the laptop. It all works like a charm; the 6015 is like an E-series card but running off USB. Works great. Note that the USB DAQs use interrupts rather than DMA for data transfer, and this may affect speed. Neville.
  22. Hi All, has anyone figured out how to get all the OpenG palette views back in LV 8? I just loaded it this morning and manually copied all the OpenG stuff over to the required LV8 folders. Suffering from OpenG withdrawal!! Neville.
  23. Hi Azazel, Looking at the diagram, you have a lot of coercion dots on arrays. These are a killer for memory allocation. The output of the arctan is probably a DBL, but the output array is SGL. Use Numeric > Conversion > To Single Precision Float to cast the data to the right type. You should see a huge performance boost. Neville.
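For readers unfamiliar with the DBL/SGL distinction, here is a quick text-language illustration of the precision change involved (Python's struct module is used to emulate a single-precision store; this is an analogy, not LabVIEW behavior):

```python
import struct

def to_single(x):
    """Round-trip a Python float (double precision) through a 4-byte
    single-precision representation: the analogue of an explicit
    DBL-to-SGL conversion."""
    return struct.unpack('f', struct.pack('f', x))[0]

# 0.1 is not exactly representable in binary, so the single-precision
# value differs slightly from the double. An implicit coercion at an
# array boundary makes this same rounding happen silently, and in
# LabVIEW it also forces an extra buffer allocation per coercion dot.
```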