
Neville D


Posts posted by Neville D

  1. QUOTE(JFM @ Sep 26 2007, 10:21 PM)

    Thanks for your comments, but my guess is that this is only solved by some hidden KEY in the NI-RT.ini file.

    Currently Broadcast is not an option, it has been ruled out earlier.

    I have tried to add a delay, but that has no effect. What matters is how you set the Maximum Number of Sockets in MAX.

    If you break the limit given in MAX, you'll get error code 11 (Too many files open); if you then increase the number in MAX, you will instead run into another limit that throws error 42 when it is reached.

    I need to open my sockets on an RT target; I guess this tool is only for Windows, or can it be used on Pharlap as well?

    For the record I have tried to open 1000 UDP sockets on Windows, without any problem.

    /J

    I think it is time to get NI involved. Open up a service request and get some of NI's RT experts to take a look. Your problem is easily duplicated, so it's something they can follow up on pretty quickly.

    Good luck.

    N.

  2. QUOTE(ASTDan @ Sep 20 2007, 09:37 PM)

    Hi Dan,

    Just saw your email today. Things to check:

    1 Is your VI actually running?

    2 Is the web-server on the CVS enabled?

    3 Is your front panel very complicated with multiple image displays etc? (try first with a simple 1 second loop with a boolean flash)

    4 Can you see the front panel if you access it from LV, i.e. not using a web browser? (LV>Operate>Connect to Remote Panel)

    5 The web server is a lower-priority task; is your VI running at very high priority? If so, it will block the web server from publishing the web page.

    6 The CVS has very little RAM, so if your code is using a lot of image buffers etc., it may have trouble with the web stuff.

    7 Did you place the frontpanel.htm file in the required spot on the CVS?

    8 Is your loop rate very high? (i.e. is there enough time for the web server to publish the panel?)

    9 Does your frontpanel.htm actually match the latest version of your code? (I seem to remember that this didn't matter in the past, but check to make sure anyway.)

    With some versions of LV there was a bug where image displays on a front panel didn't update when viewed remotely. It is fixed in LV 8.5 and Vision 8.5, but it didn't work with the previous version of Vision, worked with the version prior to that, and not before, etc.; quite annoying when a "fixed" bug appeared again! Even having the image display on the front panel caused all the other indicators and controls to be affected, where only some updated and others didn't.

    I usually view remote panels through LV and avoid the web browser route. I have a vision application on PXI running LV-RT; everything has been running fine since at least LV 7 and Vision 7 (except for the image display issue); currently on LV 8.5 + Vision 8.5.

    Neville.

  3. QUOTE(nil @ Jul 20 2004, 07:21 AM)

    So I just want to know what they are, and how to make a .mnu file myself, with LabVIEW or another editor?

    You can't create .mnu files by hand. These are the files that define your palette views. The ones you generate when changing the default palette set are stored in *\LabVIEW Data\8x\Palettes.

    If you want to change your palette views, go to Tools>Advanced>change palette set (I don't remember if this was the location for changing palettes in 7.x, but just search the LV help for "editing palettes").

    Neville.

    Use IMAQ Convert Pixels to Real-World in the Calibration palette under Vision Utilities. But you must pass a calibrated image to this VI before you can get valid results.

    Calibration is as simple as marking a Region of Interest on the picture and defining its vertical and horizontal size in real-world coordinates (use IMAQ Simple Calibration). You can get fancy and use a non-linear calibration by taking a picture of a grid of dots and specifying the grid spacing. Once you have the calibration data, you can transfer it to a real-world image (they must obviously be of the same size) using IMAQ Set Calibration Info.
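
    As a rough worked example (with made-up numbers): if the ROI you mark spans 400 pixels horizontally and you tell IMAQ Simple Calibration it corresponds to 100 mm, the calibration works out to 0.25 mm per pixel, so a feature measured at 60 pixels comes back as 15 mm in real-world units.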

    The idea here is to keep the calibration info as part of the IMAQ image, so you don't have to carry around a (potentially) large 2D array of numbers in your code. The IMAQ image reference (for the calibrated image) is a pointer to this data in memory, and you can use the Calibration VIs to access parts of it as required. This way, when performing calculations in calibrated units, the routines only access parts of this data (using the user-specified ROI information), substantially speeding up processing. It is also a good way to hide the nitty-gritty of pixel<->real-world conversion, and use either as required.

    You really should go through the examples some more.. there are a lot of good pointers there, though they are definitely not examples of good LabVIEW style, being coded mostly in sequence structures.

    Also read the calibration chapter of the IMAQ Vision Concepts Manual. In my opinion it is one of the best-written NI manuals out there. The latest version is July 2007 (includes explanations of the new edge finding routine introduced in Vision 8.5).

    Neville.

  5. I would go with more memory and the fastest machine you can buy.

    Will you be processing the images as well or just acquiring? What frame rate?

    Does the app need the portability of a laptop?

    If not, you could probably get a PXI chassis running LV-RT with a Core 2 Duo processor, and put the DAQ and image processing on different cores. The PXIs are very rugged, and we use them in sawmills all the time; never had one fail yet. Obviously, this is the most expensive solution, in the range of $8-10K, plus the need for an enclosure and/or cooling for the PXI.

    Note that the FireWire port on a laptop is different; if I remember correctly, it's 4-pin and doesn't provide power. You should check that your camera will work with the FireWire port on the laptop.

    Neville.

  6. QUOTE(Gary Rubin @ Aug 28 2007, 07:19 AM)

    I hate ctrl-t. I especially hate that it's next to the ctrl-R, so it inadvertently gets hit sometimes and I have to resize my windows again.

    While I'm ranting about my fat-fingering, I wish ctrl-E and ctrl-W weren't next to each other...

    You can change (or remove) the shortcut associations for the ones you don't use, or re-assign them as convenient.

    Neville.

  7. QUOTE(p27182 @ Aug 28 2007, 04:13 AM)

    I tried the non-Express integration, but I must have been using it wrong despite reading the help file, because I kept getting all zeros.

    thanks again for the help

    -pat

    Save a copy of your original VI.

    * Right-click on the Express VI and select convert to VI. This operation cannot be reversed (hence back up the original).

    * Now comb through the Express VI's code and remove all the stuff that is unnecessary for your application.

    * Save this VI with a different name.

    * Make sure it is re-entrant since you have 3 copies of it running in parallel in the loop.

    You're done.

    Again, try to work with what you have instead of jumping off in all different directions. Forget DLLs and better hardware for now. Analyze where your performance is poor and improve that area. A few days' work should get you all the performance you need.

    If you do a search for integration on LAVA, you should pull up posts from people writing their own integration routines using the rectangular or trapezoidal rule.

    Neville.

    I would open up the Express VIs you have for integration and extract only the functionality you need for your calculations. Avoid the blue dynamic data type wires if you want speed.

    Better yet, write your own integral VI; that should be pretty trivial for someone who is fresh with all the math. Or you might use the pt-by-pt integral VI.
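
    A minimal sketch of what such a routine boils down to (the trapz_cumulative name and the sample data below are made up for illustration; the loop is the part you would recreate in G, or in simplified form inside a formula node):

    #include <stdio.h>

    /* Trapezoidal-rule integral of equally spaced samples y[0..n-1] with
       spacing dt. Fills out[] with the running integral, so out[i]
       approximates the integral from t = 0 to t = i*dt. */
    static void trapz_cumulative(const double y[], int n, double dt, double out[])
    {
        out[0] = 0.0;
        for (int i = 1; i < n; i++)
            out[i] = out[i - 1] + 0.5 * (y[i - 1] + y[i]) * dt;
    }

    int main(void)
    {
        /* made-up data: y = 2*t sampled at dt = 0.5 s; the exact integral is t^2 */
        double y[] = {0.0, 1.0, 2.0, 3.0, 4.0};
        double out[5];
        trapz_cumulative(y, 5, 0.5, out);
        for (int i = 0; i < 5; i++)
            printf("t = %.1f  integral = %.2f\n", i * 0.5, out[i]);
        return 0;
    }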

    Use the "defer panel updates" property to not redraw every single iteration.. maybe once or twice a second might be good enough.

    You can selectively make the different UI elements visible or invisible to see which of them is contributing to your slow behaviour. This should help you figure out if it's the calculations or the display that are causing the slowdown.

    In general, going to a DLL won't buy you more speed compared to well-written LabVIEW code. Optimizing LabVIEW code should give you a speed increase equivalent to C code. If you are bent on using C, use a formula node and type C code into it. It is equivalently fast, without the hassles of debugging a DLL in a different environment.

    Also, to speed up certain calculation sections, change the VI's priority to "high" (or even "subroutine" for certain fast, repetitive calculations), and disable debugging on everything. Using a built executable will also buy you a slight increase in speed, since this removes the diagrams, disables debugging, and removes all unnecessary parts of the application.

    You could also do a simple decimation for the display (display only every alternate point).

    Neville.

  9. QUOTE(ASTDan @ Aug 23 2007, 06:56 AM)

    Hello,

    My comment is: where is NI headed in regards to documentation? I feel NI's documentation in general is good but could be better. Sometimes when researching something I wish they had more detail. Are they moving away from detailed manuals? Is documentation just chunks of HTML posts now, and is the era of the all-encompassing manual over?

    Definitely seems like it. Gone are the days of thorough PDF documentation for new features, and of well-written PDF documents that took you through all the steps necessary for the task.

    It seems even the new LabVIEW RT and the newer hardware modules only have "help file" documentation: messy, crude, and incomplete in terms of links.

    If you have a help file, it should at least have links (that work) to examples that illustrate that particular concept. No luck.

    Neville.

  10. QUOTE(ashwink27 @ Aug 21 2007, 06:55 AM)

    What is the difference (advantages) between Vision Development Module 8.0 and Vision Development Module 8.5?

    Look at the release notes for the two products on the NI site. But off the top of my head, new things in 8.5:

    1 New edge finding algorithm

    2 New straight edge finding algorithm using Hough technique

    3 Support for recognizing 2-D bar codes

    4 Vision Assistant Express VI

    5 Ability to write images to strings in memory (i.e. convert an image to a JPEG/BMP/PNG string)

    All submodules of the NI Vision 8.2.1 Development Module have the following new features:

    · Support for LabVIEW 8.2.1.

    · Support for Microsoft Windows Vista.

    · Golden Template Comparison—Functions for comparing images to a golden template reference image.

    · Data Matrix—Enhancements in speed and accuracy and functions that output the ISO 16022 (AIM) grade for a given Data Matrix barcode.

    · Optical Character Verification—Functions for verifying the accuracy of characters within an image. For each character, the algorithm compares the character from the image with the reference character, and outputs a score based on the comparison.

    · Geometric Matching—Enhancements in speed and accuracy, support for searching for multiple templates within a single target image, and searching for templates within a calibrated image.

    · Shape Detection—Functions for detecting rectangles, lines, ellipses, and circles within an image.

    · Watershed Transform—Computes the watershed transform on an image. Refer to the NI Vision Concepts Manual for more information about watershed transform.

    · Local Adaptive Threshold—Thresholds an image into a binary image based on the specified local adaptive thresholding method (e.g. Niblack, Background Correction).

    · JPEG2000 File I/O—Support for reading and writing JPEG2000 files.

    Vision Assistant 8.2.1 includes the following additions:

    · GigE Vision Camera support—Allows users to acquire images from GigE Vision cameras.

    · Annulus ROI—Added support for the annulus ROI tool in the OCR/OCV step.

    · Run LabVIEW VI Step—Allows users to call custom LabVIEW VIs from within Vision Assistant scripts.

    · Image Overlay step—Overlay figures, text, and bitmaps onto an image without destroying the image data.

    · Image Annotation—Save data with an image file.

    · 64-bit RGB images—Added support for 64-bit RGB images.

    In summary, it is worth upgrading.

    Neville.

  11. QUOTE(jccorreu @ Aug 20 2007, 03:25 PM)

    In a state machine, you could easily calculate the elapsed time in each state and decide what to do about it (select the next state based on this).

    I don't think you need a "timed loop" structure per se, unless the required timing granularity is very high.
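
    A rough sketch of that idea (the state names, timeout values, and tick_ms helper are made up for illustration; in G this would be a while loop with a case structure and shift registers holding the current state and its start time):

    #include <stdio.h>
    #include <time.h>

    typedef enum { IDLE, ACQUIRE, PROCESS, DONE } State;

    /* crude millisecond tick for the sketch; a real application would use a proper timer */
    static double tick_ms(void)
    {
        return 1000.0 * (double)clock() / CLOCKS_PER_SEC;
    }

    int main(void)
    {
        State  state       = IDLE;
        double state_start = tick_ms();

        while (state != DONE) {
            double elapsed = tick_ms() - state_start;  /* time spent in the current state */
            State  next    = state;

            switch (state) {
            case IDLE:     /* after 100 ms of idling, move on to acquisition */
                if (elapsed > 100.0) next = ACQUIRE;
                break;
            case ACQUIRE:  /* give up on acquisition if it takes longer than 500 ms */
                if (elapsed > 500.0) next = PROCESS;
                break;
            case PROCESS:
                next = DONE;
                break;
            default:
                break;
            }

            if (next != state) {                       /* restart the timer on every state change */
                state       = next;
                state_start = tick_ms();
            }
        }
        printf("done\n");
        return 0;
    }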

    The notifier-based approach you have will only work if you have one of these loops. Notifiers of the same type interfere with each other and cause some of the notifications to be missed; this is due to performance optimizations in the NI code for notifiers, not a bug. You would be better off using a single-element queue instead of the notifier if you really wanted to implement this architecture.

    Neville.

    I have had good luck with Dell Inspirons. I take mine to sawmills all the time: a VERY dirty, dusty, hot environment with continuous high vibration.

    I frequently have sawdust on them from the static on the display; still running fine after a couple of years of this treatment.

    They aren't marketed as "rugged/industrial", but they sure work fine. They don't come with a serial port though; you would have to use a USB-to-RS-232 converter.

    I have seen people in the mills use the Panasonic Toughbook; it looks pretty rugged, but they usually lag behind in terms of the latest-generation hardware.

    Neville.
