
Neville D

Members · 752 posts · 2 days won

Everything posted by Neville D

  1. I think you will help yourself greatly by working through some example programs and a few chapters of an introductory text on LabVIEW, maybe a little tutorial, before you wade into trying to detect motion in an image. Try looking up clusters, shift registers, and maybe a few image examples after you feel comfortable with LabVIEW. We are all here to help, and I am not trying to discourage you, but your questions seem to indicate a lack of very basic LabVIEW knowledge. Neville. PS A MathScript node is the last thing you want to use in an image application. It is extremely slow, and its main benefit is being able to run MATLAB scripts in a LabVIEW environment.
  2. Maybe you can "add" the images? Or best apply a mask to limit the pixels that will be manipulated. Neville.
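The masking idea above can be sketched outside LabVIEW as well; here is a minimal NumPy illustration of limiting a manipulation to masked pixels only (the image sizes and pixel values are invented for the example):

```python
import numpy as np

# Hypothetical 8-bit grayscale images of the same size.
base = np.zeros((4, 4), dtype=np.uint8)
stamp = np.full((4, 4), 200, dtype=np.uint8)

# A boolean mask limits which pixels will be manipulated,
# analogous to applying an IMAQ mask before an operation.
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True

result = base.copy()
result[mask] = stamp[mask]  # only masked pixels are modified
```

Pixels outside the mask keep their original values, so the rest of the image is untouched.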
  3. You might try this library modified by Rolf K over at the NI site: Raw Socket Ping Neville.
  4. How about a quick call to NI to find out which hardware will work best in your special case? This is exactly what Application Engineers and your local NI rep are there for. Often they can recommend hardware that you never knew existed in their catalog. Maybe you need an FPGA card, I don't know.. Neville.
  5. Can you save the data and process it offline to see if there is something in the data causing your processing to leak memory? Can you isolate just the TCP stuff and run that to see if it's the communication that's causing the problem? Are there any other processes running on your machine that could cause issues? Can you run the processing routines in a loop (without the TCP) to see if there is something in there causing a problem? Are you using a state machine architecture? I usually have a "Next State.vi" to flip states, so I can log every state transition that occurs. This helps isolate the states where trouble is brewing. There is also a version that logs states into a string, but this can grow very large, so it should be used with caution for limited runs: Troubleshooting is never easy, and the basic principles are the same: isolate modules, figure out which one is causing the problem, then fix it (easier said than done) :headbang: Neville.
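For what a logging "Next State" helper looks like in text form, here is a rough Python sketch of the same idea; the state names and transition table are invented for illustration, and like the string version above, the log should be capped for long runs:

```python
from enum import Enum, auto

class State(Enum):
    INIT = auto()
    ACQUIRE = auto()
    PROCESS = auto()
    SHUTDOWN = auto()

transition_log = []  # grows one entry per transition; cap it for long runs

def next_state(current, log=transition_log):
    """Central place to flip states, mirroring a 'Next State.vi':
    every transition is recorded, so troublesome states can be isolated."""
    table = {
        State.INIT: State.ACQUIRE,
        State.ACQUIRE: State.PROCESS,
        State.PROCESS: State.ACQUIRE,   # loop acquire/process
        State.SHUTDOWN: State.SHUTDOWN,
    }
    nxt = table[current]
    log.append((current.name, nxt.name))
    return nxt

state = State.INIT
for _ in range(3):
    state = next_state(state)
```

Because every transition goes through one function, the log shows exactly where the machine was when trouble started.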
  6. I would like to see a benchmark for: 1) Vision functions: threshold/extract image, copy image, image to image, management of large numbers of image buffers. 2) Manipulation of complicated, deeply nested structures, e.g. an array of clusters of arrays of edge (x,y) coords, as output from some of the image functions like IMAQ Concentric Rake. See the Search Arcs output and the ROI Descriptor input in this picture: Thanks, Neville.
  7. Can you please post JPEGs and use the "Insert Image" icon above so that they are visible in your post? Thanks, Neville.
  8. QUOTE (dallas @ Feb 25 2009, 02:33 AM) As far as I remember you can't change the GPIB address programmatically. You have to do it from the front panel of the instrument. Why would you want to change it programmatically? Once set, just send commands to that address. You can do a *IDN? query and then based on the response from all the instruments, you can figure out what instrument is at what address. Neville.
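A rough sketch of that *IDN?-based discovery in plain Python follows; the bus query is faked here (a real program would use a VISA query such as PyVISA's `inst.query("*IDN?")`), and the addresses and reply strings are invented:

```python
# Fake GPIB bus: address -> *IDN? reply (manufacturer,model,serial,firmware).
FAKE_BUS = {5: "KEITHLEY,2400,123,r1", 12: "AGILENT,34401A,456,r2"}

def query_idn(address, bus=FAKE_BUS):
    """Stand-in for a real VISA *IDN? query; None means nothing answered."""
    return bus.get(address)

def map_instruments(addresses):
    """Figure out what instrument is at what address from the *IDN? replies."""
    found = {}
    for addr in addresses:
        idn = query_idn(addr)
        if idn:
            found[addr] = idn.split(",")[1]  # model field of the *IDN? reply
    return found

instruments = map_instruments(range(1, 31))  # GPIB primary addresses 1..30
```

Once the map is built, you just send commands to the address whose model matches the instrument you want.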
  9. QUOTE (diogo @ Feb 22 2009, 03:17 AM) Read this: http://www.catb.org/~esr/faqs/smart-questions.html N.
  10. QUOTE (shadowzero2 @ Feb 20 2009, 01:11 AM) The JPEG encode/decode VIs definitely have a memory leak. And you're right, you do need to swap the R-B colour planes. Those were unsupported VIs posted on the NI website, and I have tried many times to get them to fix the bugs, but no luck. There are other posts on LAVA about them. You can use the IMAQ ReadFile, IMAQ Write File2, or IMAQ Write String VIs to do what you need. You might have to save the string to file and then open it to duplicate the "jpeg decode". PS Next time resize your JPEG screenshot to a reasonable size and post here using the "Insert Image" button above. It was almost unreadably small. Neville.
  11. QUOTE (Vende @ Feb 16 2009, 10:59 AM) Yes, in Windows. How can you change it in MAX if you can't detect the camera there? N.
  12. QUOTE (JimCo @ Feb 16 2009, 06:46 AM) I think you're doing it the right way. The reason is that most use cases need the overlay to stay with the image and transform as the image is transformed. But your case is the exact opposite, so you have to transform the image and then copy the overlay onto your image. That's fine. I do that all the time. I maintain overlays in separate image buffers and only copy them over just before displaying. This allows me to maintain multiple overlays displaying the results of different processes, not all of which may be needed at display time. It doesn't seem to be very processor- or memory-intensive when used to display a few circles and lines and 4-5 lines of short text. What are you overlaying? Text alone? In that case, shift the text and copy over the overlay at the last instant. N.
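The separate-overlay-buffer approach can be sketched with NumPy; the buffer sizes and pixel values below are placeholders, and "non-zero pixel" stands in for whatever the real overlay mask is:

```python
import numpy as np

# Keep each overlay in its own buffer and composite only at display
# time, so the source image is never disturbed by the overlays.
image = np.zeros((8, 8), dtype=np.uint8)   # acquired frame
overlay_a = np.zeros_like(image)           # e.g. result circles
overlay_b = np.zeros_like(image)           # e.g. short text
overlay_a[2, 2] = 255
overlay_b[5, 5] = 255

def composite(img, overlays):
    """Copy the non-zero pixels of each overlay onto a display copy."""
    out = img.copy()                       # source buffer stays intact
    for ov in overlays:
        nz = ov > 0
        out[nz] = ov[nz]
    return out

display = composite(image, [overlay_a, overlay_b])
```

Because compositing happens on a copy, you can pick which overlays to show on any given frame without touching the acquired image.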
  13. QUOTE (Vende @ Feb 16 2009, 08:41 AM) Yes, select the camera, right-click, and use the NI driver instead of the one supplied by the camera manufacturer. I have my cameras connected to a PXI-RT chassis, so I can't walk you through the exact steps, but that's what I have done in the past. Neville.
  14. Start by looking at the DAQmx examples. Note that you can't run Traditional DAQ and DAQmx simultaneously. There are two VIs somewhere on the palettes that allow you to "clear" or "close" one: one for Traditional DAQ and the other for DAQmx. Use those at the end if you want to flip to the other style of DAQ (without having to close out of LV). This lets you run your code in both old and new styles for debugging. Sorry I can't be of more help, but that's what I remember from a few years ago when I did the conversion. I would say start off with a simple read-trigger-write and then move to the advanced stuff. Use the DAQmx wizard to generate code, convert it to a VI, and pick through it to help with your conversions. That's how I learned about it. You should be able to do everything with DAQmx that you could with old-style DAQ (and much more), with better performance. Neville.
  15. Look at the Straight Edge Detection Example.vi and Edge Detection Example.vi Try to follow what they're doing, and repeat for your case. N.
  16. Are you using a serial mouse? Try a USB mouse instead. N.
  17. There is a VI on the NI site called Performance Monitor, based on .NET. See if you can use it to track your processes being created, or maybe modify it for your purpose. N.
  18. Check the NI website for an app note about using encoders. N.
  19. QUOTE (rolfk @ Feb 9 2009, 11:43 PM) I have a number of PXI 8187's running LV-RT. They are based on the Pentium-M mobile single core processor, with reasonable performance, but like I said, VI-Server performance is still shaky. N.
  20. QUOTE (Nima @ Feb 9 2009, 11:51 AM) Another reason to stay away from them in the future. Camera manufacturers are a dime a dozen these days; Allied Vision Technologies, Prosilica (they were acquired by AVT), and Basler, to name a few, are very good. No need to stick with vendors who are difficult to work with. QUOTE (Nima @ Feb 9 2009, 11:51 AM) Is there any way to acquire the images in LabView with GigE standard? How did these GigE cameras work before the GigE Vision era?!!! Before GigE, NI maintained a list of supported cameras. If your camera was on the list, it was supported. If your camera is GigE Vision compliant, it means the command structure of the camera is somewhat standardised, and additional commands that are specific to a camera model or manufacturer are specified in an XML file that the software (like MAX or your LV code) can read. You might call NI and start a service request on this one, to see if they can get that camera to work. You could even send the camera in to them, and they might give you a patch or something to make it work. But you should start the dialog with them. Neville.
  21. QUOTE (Chris Davis @ Feb 5 2009, 07:17 PM) Yes, that's exactly what I saw as well. It doesn't seem to make any difference. Better to let the OS/LV optimize CPU switching. Thanks for the reply! N.
  22. QUOTE (amila @ Feb 6 2009, 03:31 AM) Yes, everything is possible. Look at the LV examples for edge detection. You will learn a lot. You need to define a ROI that encompasses the feature. Use this ROI for the edge detection. N.
  23. How about you add a "Done" boolean that the user clicks when they have finished selecting the ROI? Or else set the Done to be a double-click on the front panel? Neville.
  24. Also make sure your DAQ can provide enough gain for thermocouple measurement. The signal from a TC is in the microvolt range, so you need a gain of at least 1000 to get something meaningful. Make sure your Cold Junction Compensation is set correctly as well. N.
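As a quick worked example of why that gain matters (the ~41 µV/°C figure is the approximate sensitivity of a K-type thermocouple, not something from the post):

```python
# Approximate K-type thermocouple sensitivity, in microvolts per degree C.
SEEBECK_UV_PER_C = 41
GAIN = 1000

# A 100 degC reading is only a few millivolts at the terminals...
signal_v = 100 * SEEBECK_UV_PER_C * 1e-6   # ~4.1 mV

# ...so a gain of 1000 is needed to bring it into a usable ADC range.
amplified_v = signal_v * GAIN              # ~4.1 V
```

Without that gain, the signal sits in the bottom few counts of the ADC and is dominated by noise.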
  25. OK, then just get them to build their app into an exe which your RT app can call as a plug-in. They can drag their exe onto the RT target using Windows Explorer as an FTP interface. But it can get complicated in terms of inputs and outputs. It's really difficult writing an app for OTHERS to modify. I know customers start out saying "I want to be able to change this", but in reality they probably won't touch it, or will complain that the process is too complicated. N.
