
Neville D


Posts posted by Neville D

  1. Don't know if this will help, but if you can upgrade to LV 8.x, you can automatically add NI components like MAX, DAQmx, etc. to the installer for your application.

    This will save you the hassle of trying to build a global installer with all these components yourself. And why do you have two different versions of MAX? Just upgrade one to match the other.

    Also, if you use DAQmx you have the ability to programmatically define channels, so you wouldn't need the virtual channels of the old-style DAQ.

    You can also convert old-style virtual channels to their DAQmx equivalents ("DAQmx Global Virtual Channels") in NI-MAX.

    Neville.

  2. QUOTE(Thang Nguyen @ Mar 30 2007, 09:59 AM)

    Hi,

    I have a problem with the queue. After creating and using the queue to pass data, I try to flush the queue then destroy it. But I meet this error: "Error 1 occurred at Release Queue in ...". I don't have much experience working with queues. Please give me some recommendations on this issue.

    Thank you in advance for your time in reading and answering this question.

    Thang Nguyen

    How about some example code or a quick screen shot?

    Neville.

  3. QUOTE(Vladimir Drzik @ Mar 29 2007, 08:26 AM)

    Hi guys

    I am using a reentrant VI which has several clones dynamically created through Open VI Reference. I would like to open the FPs of all those clones for debugging purposes (most of the time it is enough for me to have a look at what the FPs display). I know I can do the clone FP opening programmatically, but can I do it from the IDE once my application is already running?

    Vladimir

    Why don't you make the clone VI's re-entrant and then set the re-entrant VI to "open front panel when called"? That way all instances will automatically open up at run-time. If you don't want some of them open, just close the window (it won't stop the VI from running).

    The only issue with this method is that it's hard to figure out which instance is which.

    Neville.

  4. QUOTE(Jon Sweeney @ Mar 22 2007, 06:58 PM)

    Neville,

    I'm not sure what the connection between Palette loading and the Help Window is, but I'll give your suggestion a try tomorrow.

    Thank you,

    Jon

    Another thing you could try is to copy over your labview.ini file from your work PC to the home PC (back it up beforehand). There might be some setting in there that is incorrect or different.

    Neville.

  5. QUOTE(Jon Sweeney @ Mar 22 2007, 09:56 AM)

    On my work computer (NI PXI-8105, 2GHz, 504 MB, quick internet access), there is always a long delay (~ 70 - 80 secs) before the "LabVIEW Help" window gets populated the first time (and sometimes the second time) the "Search the LabVIEW Help" option is used after LabVIEW startup. I do not have nearly this long of a delay on my home PC, which in all other ways is quite a bit slower. I have LV8.2 on both.

    Any ideas of the cause or the cure?

    Thank you,

    Jon

    Try playing around with the Palette Loading options under Tools>Options>Controls/Functions Palettes.

    I have mine set to Load palettes in background.

    You have to re-start LV after making a change here.

    Neville.

  6. QUOTE(BenD @ Mar 21 2007, 12:26 PM)

    Hi everyone,

    I'm building an application with a plug-in architecture that calls VIs not built with the application dynamically. The app builds and runs just fine, but when I get to the part of the executable that starts calling the VIs, the application seems to choke on opening them and spits back an "Error 1003" from the Open VI By Reference.vi (I think that's the name, anyway)

    This architecture has worked fine in the past but I've inherited this code and it's the first time I've tried to build the application as there have been some minor changes to the VI that the executable is built from. These small changes are also not ones that would cause this error to occur.

    I do remember the previous owner of the code going through considerable trouble getting it to work as an exe but I'm not finding their notes to be enough of a help to get it up and running. Has anyone worked with this kind of architecture before?

    Thanks!

    First thing to check would be the paths to the plug-in VI's. You might need to double-strip the path in the exe, but only single-strip to get at the VI in the dev environment.

    Put an indicator up on the main VI so you can see what path it's using.
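    LabVIEW's Strip Path drops the last component of a path. Inside a built exe the top-level VI's path gains an extra level (the exe itself behaves like a folder), which is why it has to be stripped twice there but only once in the dev environment. A rough Python equivalent of that idea, with hypothetical paths:

```python
import ntpath  # Windows path semantics, regardless of host platform

# Hypothetical paths, for illustration only.
dev_path = r"C:\project\Main.vi"          # VI in the dev environment
exe_path = r"C:\project\App.exe\Main.vi"  # same VI inside the built exe

# One "strip" = drop the last path component (like LabVIEW's Strip Path).
print(ntpath.dirname(dev_path))                  # C:\project
print(ntpath.dirname(ntpath.dirname(exe_path)))  # C:\project
```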

    Neville.

  7. QUOTE(CraigGraham @ Mar 20 2007, 10:41 AM)

    I'm playing with a firewire camera in IMAQdx, LV7.1. Yeah, not got round to updating to 8.

    I've done an ugly as hell test VI (that I'm not going to post in its present form) that's based around the code in "IMAQdx Snap.vi". The only functional changes are to put the Start Acquisition...Get Image...Stop Acquisition group of sub VIs inside a while loop to snap an image every time I hit a button, and to monitor and display the output of various digital lines as it's happening. What I've discovered is that the camera is actually taking 2 shots each time. Further investigation reveals that, even if I load the original "IMAQdx Snap.vi" and put a breakpoint after the "Get Image", the camera keeps triggering. The "one shot" doesn't seem to actually mean single shot.

    Anyone else come across this?

    What are the camera trigger settings when you examine in NI-MAX?

    I would check there. Maybe it's set to "free-run" mode instead of software-trigger mode.

    Try the old-style IMAQ example to see if there is any issue there.

    I don't have any camera hardware to check at present, but last I played with it, IMAQdx worked fine under LV 7.1, Windoze as well as LV 7.1 RT for Snap and Grab.

    I have about 8 installs running LV8.2 RT, IMAQdx RT on PXI, with no problems (hardware triggered grab image).

    Neville.

  8. QUOTE(rkesmodel @ Mar 17 2007, 11:11 AM)

    LabVIEW 8.01, Full Development, no added toolkits

    I need a simple, fast, adjustable method of filtering out intermittent, short peaks (noise) in incoming data. The method we have come up with is to perform a Boxcar (or moving) average on the data. To explain, say you have an array of 100 elements. Take the first 10 elements and average them. This is your new first element. Now take elements 2 through 11 and average them; this is your new second element, etc. You can do this with an incoming data stream, you just lose the first 9.

    I have what I think is a very efficient, fast method for doing this. If anyone can come up with a better way, I would appreciate knowing. Also, I would like to know how fast this will execute. Of course, this will depend on how many elements are 'boxcarred' and the speed of the processor. I have included a 'throw together' test I created that I think tells me it will execute at about 6 nanoseconds per element with a per run overhead of about 60 nanoseconds.

    How did I get this? In the test routine I input a file of a little over 12K elements (attached). Before and after each run of the boxcar I capture the Tick Count, subtract them, and output the values. Looking at the graph you see periodic peaks of 1 millisecond. I believe this is where the tick just happens to increment exactly when the boxcar is running. So, if you calculate the number of runs between each peak and divide that into 1 millisecond it should be close to the run time of the boxcar (yes, I know it takes some time to get the second Tick Count).

    Running this on a 1.69 GHz, Pentium M, Gateway laptop, I calculated (all times in seconds) 9.5e-7 for 1 element, 1.14e-6 for 10, 1.69e-6 for 100, 6.7e-6 for 1000, and 3.03e-5 for 5000.

    Dividing the 5000 element time by 5000 gave me the 6 nanosecond result, and the 60 nano overhead is an EWAG (Educated Wild A** Guess) based on the 100 and 1000 elements numbers.

    Better/faster method? All opinions appreciated.

    BTW. I give the Boxcar routine freely to the forum. Anyone may use it for any reason (though I wouldn't mind credit).

    Sorry to be so long winded.

    Roy

    Couldn't you just use the Pt. by Pt. function Mean Pt by Pt.vi?

    (under Signal Processing>Pt by Pt>Probability & Statistics>Mean).

    It seems to be doing the same thing.
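    For reference, the boxcar average described in the question can be sketched in ordinary Python (the original routine is LabVIEW code; the function name here is made up). Keeping a running sum makes each step O(1), independent of the window width, which is likely why the measured per-element time barely grows:

```python
# Boxcar (moving) average: each output point is the mean of a sliding
# window over the input; the first width-1 points are dropped, as in
# the post above.
def boxcar(data, width):
    if width < 1 or width > len(data):
        raise ValueError("window width out of range")
    running = sum(data[:width])        # sum of the first full window
    out = [running / width]
    for i in range(width, len(data)):
        running += data[i] - data[i - width]   # slide the window in O(1)
        out.append(running / width)
    return out

print(boxcar([1, 2, 3, 4, 5, 6], 3))   # [2.0, 3.0, 4.0, 5.0]
```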

    Neville.

  9. QUOTE(Rick @ Mar 14 2007, 08:54 AM)

    During that conversation I asked about the imminent quarterly service release (shipping very soon) and was told that there are NO LabVIEW-related fixes in this service release. The primary update being NI-DAQmx 8.5.

    Rick

    Rick,

    My NI Rep told me the new version should be out in about 2 weeks. There are already new versions of NI Vision out on the NI website (Vision 8.2.1), so the LV version should be just around the corner.

    Neville.

  10. QUOTE(brianafischer @ Mar 9 2007, 05:33 PM)

    I do have the Professional development system, so X controls are an option and I am using LabVIEW 8.2.

    Current applications I deal with (that others designed) utilize multiple for loops which is really bad (i.e. 4 for loops for 4 stations!). I haven't seen any good examples beyond a simple state machine. My thoughts thus far are to utilize GOOP to create a class for each station. However, I could use some advice in how to implement the control logic. I currently use re-entrant VI's, but I could also use some advice on how to debug a re-entrant VI while multiple stations are calling it.

    If you already have LV 8.2, debugging re-entrant VI's is a snap. Just double-click on the VI and you can open its front panel, and probe its diagram just like a regular VI.. multiple instances show up as different VI's with a number attached to the name.. Example.vi_1, Example.vi_2 etc.

    As a matter of style, it is better to use while loops (instead of FOR) with the option of stopping the test on error or user-set Stop.

    Neville.

  11. QUOTE(yen @ Mar 10 2007, 11:30 AM)

    No, my first statement was "correct". If you use the pair of provided VI's to first encode and then decode, the output is definitely correct; however if you use another environment (trying to decode) with C, then the color planes appear to be swapped.

    I'm not exactly certain, but I suspect the issue is the fact that the JPEG file format (http://en.wikipedia.org/wiki/JPEG_File_Interchange_Format#Color_Space) does not define the color planes to be used, and hence the original NI developer of the code was free to interpret the order of the planes as required.. just that interpreting the image in Windoze shows the planes as swapped.

    The reason for using the JPEG VI's is the built-in compression & flexibility to adjust it up or down depending on file size (transmitting the 12k image takes about 100ms via tcp-ip).

    I'm not sure how I could use the picture VI's to do the same thing?

    And, yes, I am using IMAQ and a PXI RT platform for this time-critical image processing application. It's definitely matured a lot since I first started using it around LabVIEW 7.

    Neville.

  12. Figured it out.

    I have been using the IMAQ JPEG Encode.vi and IMAQ JPEG Decode.vi from the Developer Zone. There is a "bug" or issue with the Encode, namely that the Red and Blue color planes are swapped, resulting in an image with incorrect color on the receive side, when used with a non-LabVIEW application.

    However, if you use the JPEG Decode.vi on the receive side, you don't see the problem, since it interprets the color planes correctly.

    The fix is to swap the Red and Blue planes before using JPEG Encode.
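    The swap can be illustrated with plain Python; here an image is just a list of (R, G, B) pixel tuples standing in for the IMAQ color planes (the real fix manipulates the IMAQ image buffer before calling JPEG Encode):

```python
# Swap the red and blue planes of an RGB image, modelled here as a
# flat list of (R, G, B) pixel tuples. This mirrors the fix described
# above: swap R and B before encoding so the receiver sees true color.
def swap_red_blue(pixels):
    return [(b, g, r) for (r, g, b) in pixels]

image = [(255, 0, 0), (0, 0, 255)]   # one red pixel, one blue pixel
print(swap_red_blue(image))          # [(0, 0, 255), (255, 0, 0)]
```

    Note the swap is its own inverse, which is why decoding with the matching LabVIEW VI (which swaps back) hides the problem.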

    Neville.

  13. QUOTE(mermeladeK @ Mar 2 2007, 04:36 AM)

    Here are the VI's. The main one is the one with the word "TEST" in the name. The other 2 are subVI's. The "Acquire 1 divided by f idealized spectrum parameters.vi" is the VI supposed to use the Non linear curve fit.

    Nil

    Thanks,

    I will take a look at it shortly.. been out of town and lots to catch up on.

    Neville.

  14. Hi All,

    I am using JPEG VI's downloaded from NI, to take a color image and transform to a JPEG string.

    This string is transmitted via TCP-IP to a VME based computer and my colleague uses C to capture this image.

    Problem is the image he receives seems to have the colours messed up. It looks blue-ish.

    I can't seem to be able to upload the JPEG images to LAVA..

    Anybody have any ideas?

    Thanks,

    Neville.

  15. QUOTE(mermeladeK @ Feb 9 2007, 04:32 AM)

    Hello SciWare,

    Sorry for the delay. I thought I had posted a message on wednesday but it seems I didn't.

    The data wired to the Block is in the next picture. It is a spectrum of a 1/f noise plus a white noise.

    I also attach a graph plot of the data contained in the array in the cluster of the spectrum graph.

    Nil

    Why don't you post the VI's (and data if any) instead of a picture of the BD.. that will save people time in trying to re-build your work just to re-create your error.

    Neville.

  16. QUOTE(dthomson @ Feb 26 2007, 10:35 AM)

    My understanding is that the three computers can all be at work, as long as you are only using one at a time. The installation at home is separate and existed before the change to the 3-computer installation.

    Dave T.

    Hi guys,

    I had talked to my sales rep a while back on this.. what you can do is install on all the machines you might need to use it on, then activate the one you are using, and if you are going to switch to another machine then the Licence Mgr has the option to "de-activate" a licence for the time you are not using that machine.

    Neville.

  17. QUOTE(TiT @ Feb 21 2007, 09:54 AM)

    Hi all,

    I'm about to get a new PC at work and the first thing I'll do after installing LV will be to copy my LabVIEW.ini from my old computer to the new one.

    Now I wish I could do the same with my customized tool and function palette ; is there a simple way to do that or do I have to re-edit it all ?

    Thanks in advance

    Hi TiT,

    Just copy your /LabVIEW/menus folder from one install to the other. That should copy all the palette setups. (Be sure to back up the old copy first, just in case.)

    Neville.

  18. QUOTE(gustav @ Feb 21 2007, 06:50 AM)

    I'm running a counter task for pulse generation, using an external signal as a source for ticks. I'm wondering if it's possible to switch the input terminal for the ticks while the task is running?

    This means I would have two sources of pulses connected to two different pins on my NI6602 card, and while the pulse generation task is running I want to be able to switch from which pin it gets the ticks. When I try to do this I get an error, so it appears to me it is not possible. However I thought I should check with the expert before ruling out the possibility, or perhaps someone could tell me another way to accomplish what I need :) . Any answers would be apreciated.

    /Gustav

    You're right.. once the task is started, you can't switch pins mid-stride.

    Neville.

  19. QUOTE(Val Brown @ Feb 20 2007, 11:46 AM)

    It's been a real problem that has been "solved" by a two-fold process:

    1. An older deployed version of my app built using LV7x and having the option to use or not use VISA.

    2. The current deployed version built using LV8.0.1 and requiring the use of VISA.

    In my experience the serpdrv was more forgiving of error conditions and seldom generated any errors (even if error conditions existed). This gave the false impression that things were fine.

    You could try to trap the errors you are getting and either automatically reset the serial port and restart your communication or else try ignoring them to see if you can still communicate.

    Another issue is that the output voltages on USB adapters seldom reach the values defined by the "recommended standard" (RS-232). It might be a low-voltage issue causing the errors, or a floating ground.

    It might be worth adding a few delays of 5 or 10 ms in a tight "bytes at port" loop.. it might be incorrect serial port initialization settings.. or another application having access to the particular port (HyperTerminal open?).

    It might be a missing termination character on your data string.

    You could try monitoring your serial ports using "portmon" (or was it serialmon ? Just google it) while under use with VISA and with serpdrv.

    It might be poor design in the software.. without more specific info (post your code), or more detailed debugging on your side (simplest serial comm that exhibits your problems), it's difficult to play "doctor".
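    The "bytes at port" loop with small delays mentioned above looks roughly like this Python sketch (FakePort is invented here as a stand-in for a VISA/serial session so the sketch runs without hardware; real code would also need a timeout):

```python
import time

class FakePort:
    """Stand-in for a serial/VISA session, so the sketch needs no hardware."""
    def __init__(self, data):
        self.buffer = bytearray(data)

    @property
    def bytes_at_port(self):
        return len(self.buffer)

    def read(self, n):
        chunk, self.buffer = self.buffer[:n], self.buffer[n:]
        return bytes(chunk)

def read_until_terminator(port, term=b"\r\n", delay_s=0.005):
    """Poll 'bytes at port'; sleep 5 ms when idle instead of spinning."""
    out = bytearray()
    while not out.endswith(term):
        n = port.bytes_at_port
        if n:
            out += port.read(n)
        else:
            time.sleep(delay_s)   # the small delay suggested in the post
    return bytes(out)

print(read_until_terminator(FakePort(b"OK\r\n")))   # b'OK\r\n'
```

    A missing termination character shows up immediately with this pattern: the loop never sees the terminator and hangs, which is one of the failure modes listed above.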

    Neville.

  20. QUOTE(gsussman @ Feb 20 2007, 11:46 AM)

    LabVIEW System Replication Tools

    Take a look at the following link

    http://zone.ni.com/devzone/cda/tut/p/id/3937

    The LabVIEW System Replication tools make this much.....much easier.

    Configure one RT system the way you need it

    Download an image of the system

    Upload new image to as many targets as you need

    I am not sure if the bug has been fixed, however there was a small issue with the built EXE that is included in the ZIP file. It required that the MAC address was the same in order to upload the image.

    If you run the source code version, you can easily modify this behavior to disregard the MAC address.

    Thanks for the tip!! Sometimes you can't see the forest for the trees. I had already modified the utility to backup my RT images. I can use the RT<->RT feature to move over software.

    You are right, you can't upload the image if the MAC address doesn't match. There is no way to modify this part of the code (it's behind a password-protected VI). I suspect it is an anti-piracy feature to prevent copying the LV-RT runtime from a licensed target to an unlicensed target.

    Neville.

  21. QUOTE(Val Brown @ Feb 19 2007, 05:10 PM)

    Brian:

    I'm talking about the serpdrv VIs and I do believe that you will find that the code was already "ripped out" as of LV 8.0. Do you have any specific suggestions for my restoring that functionality? That would be REALLY helpful.

    I can give you prior SRs for this issue, going back to the "serial compatibility" VIs from LV 7x. They NEVER worked and so, throughout LV 7x days I used the "workaround" to continue using the serpdrv-related VIs. This is what I would truly love to do now but am blocked from doing so, as I pointed out above.

    BTW, the issue isn't USB-serial -- we had the same problems with the "serial compatibility" versions of LV7x when using 9pin serial ports, ie without USB-serial converters.

    What details would you like about the difficulties I'm seeing under VISA?

    Hi,

    I have used serial VI's using VISA from LV 6.0.2 all the way through LV 7.1 with PC serial ports as well as USB-Serial adapters on a laptop. Some of the applications involved quite high data rates and ran for days on end (monitoring fuel-cell applications).

    I have never experienced the kind of instability you are talking about. What version of VISA are you using? There used to be a problem with version 2.6 (or was it 3.0?) but that was fixed ages ago.

    Like Brian mentioned, even starting LV 7, the "serpdrv" VI's were just wrappers on top of the VISA functions. They had already been "ripped out". So essentially you have been using VISA for a long time.

    Have you tried

    1 A Different PC

    2 A different USB adapter

    3 A different version of VISA (earlier or later)

    4 A different serial cable

    I don't remember what brand of USB adapter we used, but it was some cheap OTS unit that (thankfully) worked right out of the box. But if you do have issues with your USB adapter try looking to see if NI offers a product. It may not be cheap, but you are guaranteed it will work with VISA/LV.

    Neville.

  22. QUOTE(Tom Eilers @ Feb 20 2007, 03:12 AM)

    A: With LabView 8.2 there is a special deployment application for multiple targets.

    B: If you don't have LabView 8.2, then start MAX and select your remote target.

    Right-click and you will see an ftp option; start this.

    Go to the startup directory on your target and copy the startup.exe from and to the other targets.

    That's very simple.

    Sure, I am using LV 8.2. Problem is with each RT as a separate target, you have to build the SAME application again and deploy it again for each separate target.

    Trying to do all this on the factory floor with a laptop and a wireless connection is quite difficult. It would be nice to have a two-button solution: "Build" followed by "deploy all".

    Neville.

  23. Hi all,

    I have multiple RT targets (6). I am deploying the same executable on all 6 of them. Currently, I have the 6 targets set up in my project, and when making a change, I build the exe on one target and copy the build folder on my desktop over to the other five targets' build folders. This is to avoid the loooonng wait involved in re-building the same application 5 more times...

    Then I deploy the exe by right clicking on each target.. a very slow and cumbersome procedure.

    1 Is there a way to build and deploy simultaneously onto all 6 targets at once?

    2 Is there a way to copy build settings from one target to another?

    3 Can the OpenG build tools help with this?

    Thanks,

    Neville.

  24. QUOTE(JStoddard @ Feb 16 2007, 11:46 AM)

    I'm currently doing a VI for a Leak Test machine that may run up to 24 hours. In this time period I'm graphing the collected pressure traces, and that's it.

    Is there a way to dump the history of the graph to a CSV file? Rather than collecting the data into another array, and then either trashing the array once they start a new test, or saving it to a file if it's involked. It just seems like better memory managment to use the history feature... i guess.

    Thanks!

    Jason

    You mean "chart"?

    Neville.

  25. How do you create executable VIs (.exe files) in LabVIEW? Is it possible to run a VI without LabVIEW actually being installed on the system?

    You need to get the Application Builder, either purchased separately or included if you have one of the LV Pro packages.

    It is under Tools>Build Executable.

    Once you build the exe, you need to install the LabVIEW Runtime Engine, along with your exe on the machine that you want to run your exe on.

    The RTE needs to be installed only once; after that you can replace your exe or add others, and they should all run fine.

    Neville.
