Everything posted by Neville D

  1. Any recommendations for an SBC running LabVIEW-RT in a VME chassis? Neville.
  2. QUOTE(sprezzaturon @ Nov 1 2007, 10:03 AM) What kind of Developer Suite do you have? If you got all those disks, you would seem to have a full Vision suite; in that case, just activate it and start using the built-in routines. Neville.
  3. QUOTE(stamatios @ Oct 30 2007, 02:27 PM) Try Shift+Click to zoom out. It is usually helpful to use Vision Assistant just for fine-tuning the basic image algorithm; after that you should use LabVIEW to code the application as you need it. The remaining steps may include, but are not limited to:
     * accessing the image data (camera? file?)
     * selecting the ROI programmatically (searching the image for features of interest)
     * dynamically adjusting thresholding parameters for different images
     * user-interface issues: position of windows, selection of the image to process, etc.
     * any additional buffers you may need to store intermediate images for display, etc.
     In my opinion, Vision Assistant makes these other steps hopelessly complicated and error-prone. If I have time, I will take a look at the code you have posted. Neville.
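To make those steps concrete in text form, here is a minimal Python sketch of the same kind of pipeline; OpenCV stands in for IMAQ Vision, and the file name, ROI coordinates, and Otsu thresholding are illustrative assumptions, not anything from the original post:

```python
import cv2

# Access the image data (from file here; a camera grab would work the same way).
img = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)

# Select the ROI programmatically instead of drawing it in Vision Assistant.
roi = img[100:300, 50:400]

# Dynamically adjust the threshold per image (Otsu picks it from the histogram).
level, binary = cv2.threshold(roi, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Keep the intermediate buffer around for display.
cv2.imshow("thresholded ROI", binary)
cv2.waitKey(0)
```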
  4. QUOTE(monzue @ Oct 26 2007, 02:08 PM) Ben, you are right: when the decode is put in a loop, LV memory usage goes way up.. there seems to be a memory leak ONLY with RGB images. No way to fix that without getting NI involved. Unfortunately, your code is quite a mess :headbang: and I wasn't able to figure out all the things you said you are doing. It looks like you are transmitting a JPEG over TCP and then reading the JPEG on the receive side. Fair enough; that should work. I tried out my suggestion of using a fresh buffer every time in my version of the code. It doesn't help. Your best bet is to try saving to file and viewing from there. Not sure how you are able to use these functions without IMAQ Vision? Neville.
  5. QUOTE(monzue @ Oct 26 2007, 12:21 PM) Can you post some example code that demonstrates the problem? I have been using Decode and Encode for a while now with no problems. What are you trying to do? Maybe, if it's blowing up, you could just save the JPEG string from the Encode to file and then open the file instead? How about trying to kill the buffer after the decode and then using a fresh buffer on the next iteration? If there really is a bug in the Decode, only NI can fix it (since they have the source code), and you will have to open a service request with them. Neville.
  6. QUOTE(daro @ Oct 25 2007, 08:00 PM) In what order did you install your software? You should install LV first, then the Vision stuff. If you are using Vision 7.1.1, then you need a number of components installed from different CDs. As far as I remember: IMAQ Vision, IMAQ for 1394, etc. It's been a while since I used 7.1.1, but the newer versions of Vision have everything integrated into a couple of CDs instead of a million separate components. Also, have you activated your Vision package? Maybe it is not allowing you to generate VIs because of that. Another thing: check that LV 8 is indeed compatible with Vision 7.1.1 (I think it should be, but check on the NI website or in the LV 8 release notes). Neville.
  7. QUOTE(stamatios @ Oct 25 2007, 03:51 PM) Here is a helpful link: http://www.catb.org/%7Eesr/faqs/smart-questions.html Neville.
  8. QUOTE(Hacti @ Oct 25 2007, 02:15 AM) I'm no expert in using shared variables, but why don't you save these properties or whatever to a file instead of to an llb, and then read them from the file at the start of your code? If all else fails, here is a brute-force approach: buy the $900 NI debug license that allows you to legally install the full LabVIEW development environment on the PC (in the field) and use it like that. Neville.
  9. Vincenzo, you will HAVE to write code for some parts of your application. Try searching for the following two VIs; they may be of some help in locating the points of interest. Transition Measurements 1 chan.vi Trigger Detection for 1 chan.vi They may not be on the palettes directly; I probably found them in the examples or by deconstructing an Express VI. No matter, just search the palettes for these two VIs. Neville.
  10. Are you sure all the VIs are 8.5? I had some weird issues where some example code I had downloaded from NI's site, and which was in my old project, was loading a DLL from my LabVIEW 8.2 folder into my upgraded LabVIEW 8.5 project. Look under the project's Files tab to make sure everything is loaded from the /labview 8.5 folder. Neville.
  11. Vincenzo, Threshold 1D Array ONLY works for non-descending data, i.e., the data set must be increasing. Split the waveform into increasing and decreasing sections. For the decreasing sections, use Reverse 1D Array so that those sections are increasing as well. Note that this will affect the solution index. Apply the threshold to each section, find the x value, and adjust the x value if required (if it is from a reversed section of data). Neville.
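The VI itself is graphical, but the split/reverse/adjust recipe reduces to per-segment linear interpolation; here is a minimal Python sketch of the equivalent logic, assuming a made-up test waveform and threshold:

```python
import numpy as np

def threshold_crossings(y, thresh):
    # Equivalent to splitting the data into monotonic sections, reversing the
    # descending ones, applying Threshold 1D Array, and fixing up the index:
    # each adjacent sample pair is itself monotonic, so interpolate within it.
    crossings = []
    for i in range(len(y) - 1):
        lo, hi = sorted((y[i], y[i + 1]))
        if lo <= thresh <= hi and y[i] != y[i + 1]:
            # Fractional index, like Threshold 1D Array's interpolated output.
            crossings.append(i + (thresh - y[i]) / (y[i + 1] - y[i]))
    return crossings

t = np.linspace(0.0, 1.0, 200)
wave = np.sin(2 * np.pi * 3 * t)
print(threshold_crossings(wave, 0.5))  # convert indices to x via the t spacing
```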
  12. QUOTE(orko @ Oct 18 2007, 08:48 AM) The advantage of the remote panel method is that you don't need anything additional on the server (remote system) side.. just enable the web server on the cRIO and allow users to connect to the front panel and control it; that's it. These settings can be made programmatically in your cRIO code, or else access the remote target in your project and change its properties from there. (Now, I have never used the cRIO platform, but I am assuming it is similar to other NI LV-RT based platforms.) Then write the client-side code on the PC from which you want to access the remote panel, and you should see the remote panel exactly as if the cRIO code were running on that PC. However, that VI must be running on the cRIO for you to be able to access it. About update rates.. 50 Hz seems pretty high, and you will probably miss updates in between, but as a human you are not going to be able to eyeball that much data anyway, so you should be OK. The advantage of the other two methods is that you can enable buffering if you desire, plus the responsiveness of the application will be much higher. For example, if your cRIO is busy running something and you press a button on a remote panel or in a web browser, it might ignore that command until it gets around to servicing its lower-priority browser task. With shared variables or TCP, the response can be built into your application to any degree of sensitivity you desire. Neville.
  13. Joe, I prefer just accessing the remote panel using VI Server, and tabs work there. The browser just adds another layer in between, looks uglier, and with past versions of LV certain types of controls or indicators did not show up correctly in the browser (IMAQ image displays? Subpanels?). I am not sure whether subpanels on your VI would show up in the browser. On the other hand, you need to write some code and install the LV runtime on each machine from which you plan on accessing your application. The code is quite simple, actually:
      1. Enable the web server on the remote target and allow access.
      2. Invoke the Application method RemotePanel:Open Connection To Server.
      3. Supply the remote VI's name, and you're ready to go.
      You might also need to consider at what rate your panel indicators are updating and what sort of response you need. In case performance isn't up to snuff with the browser approach or with remote panels, you will have to use shared variables or your own TCP-based client-server application. PS: I am not sure that tabs DON'T work with the web browser approach.. it's been a while since I went that route. Neville.
  14. cFP rate: In the past, I found that reading all required channels in a single read is faster than reading individual channels one at a time. Also, if I remember correctly, the TC module was much slower to read than the other module types... the reason escapes me at the moment.. this was about 3 yrs ago. Neville.
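cFP channels are read through the FieldPoint driver rather than DAQmx, but the bulk-read principle can be sketched with NI's nidaqmx Python package; the device and channel names below are assumptions:

```python
import nidaqmx

# One task covering all eight channels, one driver round-trip (fast):
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
    all_data = task.read(number_of_samples_per_channel=100)

# ...versus one read per channel: eight round-trips and far more overhead.
per_channel = []
for ch in range(8):
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(f"Dev1/ai{ch}")
        per_channel.append(task.read(number_of_samples_per_channel=100))
```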
  15. Have you tried entering the number of rows and the offset? Are you using this VI as a subVI? It is not clear how you are using the front panel of this VI. In any case, if all this doesn't work, just read the whole file and extract the rows you want using the Array Subset function from the Array palette. Neville.
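The read-the-whole-file-and-slice fallback looks like this in a numpy sketch; the file name, offset, and row count are placeholders:

```python
import numpy as np

data = np.loadtxt("results.txt")        # read the whole spreadsheet file
offset, n_rows = 10, 5
subset = data[offset:offset + n_rows]   # the Array Subset step, applied to rows
```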
  16. Maybe there is a hidden label or caption placed somewhere on the tab control, and the control resizes (auto-grows) to fit it. Try turning auto-grow off? Otherwise, just rebuild the tab control in 8.5 and delete the old one. Neville.
  17. QUOTE(rolfk @ Oct 12 2007, 01:11 AM) Thanks Rolf. I was wondering why changing the threads didn't seem to make a darned bit of difference to my "multi-threaded" multi-loop application!! That also answers my second question as to how to set up threads for an executable. Neville.
  18. QUOTE(crelf @ Oct 12 2007, 05:12 AM) True, but you can also use the Align & Resample VIs under Signal Processing > Waveform Conditioning. They give you a lot more flexibility than just throwing out unwanted samples (i.e., decimation). You can resample at an arbitrarily lower sample rate and choose what kind of interpolation to use between the points, as well as the alignment (starting point) for the original and resampled waveforms. These are indeed powerful VIs with lots of flexibility and functionality. A good starting point is the Express VI; drill down from it to select the bits you really need. Neville.
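The Align & Resample VIs are graphical, but the core idea (build a new sample grid with your chosen starting point, then interpolate) can be sketched in numpy; the rates and signal below are made-up placeholders:

```python
import numpy as np

fs_in, fs_out = 1000.0, 320.0                 # arbitrary, non-integer ratio
t_in = np.arange(0.0, 1.0, 1.0 / fs_in)
x = np.sin(2.0 * np.pi * 7.0 * t_in)

# Choose the alignment (starting point) of the output grid, then interpolate.
t_out = np.arange(t_in[0], t_in[-1], 1.0 / fs_out)
y = np.interp(t_out, t_in, x)                 # linear interpolation between pts
```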
  19. Do you have to run Thread config.vi in your code every time? Or just once, after which LV maintains the thread configuration internally? Neville.
  20. QUOTE(gottfried @ Oct 11 2007, 06:01 AM) Short answer: you cannot do this in hardware. DAQ boards have only limited resources (scan clocks, timebases, etc.) that have to be shared. Workaround: sample both at the highest rate, and then downsample one of the signals to your required lower sample rate. There are VIs available in LV that will let you change the sampling rate pretty easily. Neville.
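As a sketch of the sample-high-then-downsample workaround, scipy.signal.decimate applies the anti-alias filter and downsampling in one call; the rates and signals here are assumptions:

```python
import numpy as np
from scipy.signal import decimate

fs = 10000.0                       # acquire both signals at the highest rate
t = np.arange(0.0, 1.0, 1.0 / fs)
fast = np.sin(2.0 * np.pi * 50.0 * t)
slow = np.sin(2.0 * np.pi * 5.0 * t)

# Bring the second signal down to 1 kS/s: anti-alias filter, keep every 10th.
slow_1k = decimate(slow, 10)
```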
  21. QUOTE(alukindo @ Oct 9 2007, 04:19 PM) Just use a multimeter to ohm out the ground leads going into the 1314 and coming out of the 1314. Also check the board that connects your SCXI to your DAQ card in the PC.. hopefully this is a straightforward board like the SCB-68, on which you can eyeball the traces.. but follow the same procedure: check that the ground channels into the PCB ohm out with the ground channels exiting the PCB. Neville.
  22. QUOTE(alukindo @ Oct 9 2007, 10:33 AM) Anthony, like I said before, check the grounding on the BREAKOUT boards (i.e., the interconnect board for the DAQ card, and the terminal block between the SCXI and the real-world signals). The SCXI boards and probably the DAQ card are all OK. This is probably why all the signals are floating.. the ground plane on the interconnect must be fried. You will probably not need a spare SCXI chassis and 1520s to troubleshoot this; what terminal block are you using for the 1520? Check the ground plane on that. Neville.
  23. Have you checked the terminal block (or breakout board) ground leads? In general, when you screw up, a large current flows, and usually the ground plane on the breakout board gets fried. Examine it carefully. Also, check out each SCXI channel by attaching a known good voltage (a battery, for example) and seeing that it is read correctly, channel by channel.. you could probably use NI MAX to do this. Neville.
  24. There are functions in the IMAQ Vision tool-kit from NI that allow you to do that and much more. It is expensive, though. Neville.
  25. Hi Dan, sorry I didn't try out your posted VI earlier. I don't have a CVS handy, and since your code used CVS-specific FPGA calls, I would have had to build a different little test VI and try it out on my PXI; but I have been very busy with projects lately and never got around to trying that out. Good luck with NI's troubleshooting. Neville.