
Neville D

Members
  • Posts: 752
  • Joined
  • Last visited
  • Days Won: 2

Posts posted by Neville D

  1. QUOTE(BrokenArrow @ Jul 17 2007, 05:51 AM)

    Hi all!

    If you had a system that ran 9 com ports and did a lot of file I/O, with multiple parallel loops, would you choose Windows XP or 2000 Pro? The choice IS limited to those two, no Linux or MS Bob, etc. :)

    If you know of any reasons why one is better than another and can share that with me, I'd appreciate it.

    THANKS!

    p.s. this is for a built app LV 5.1.1

    LV 5.1.1 probably isn't supported on XP. I don't know about 2000 Pro either, but I would go with 2000 Pro for such an old application, just to make sure everything works.

    Otherwise just re-build using a later version of LV (7.1.1 is pretty rock-solid, 8.2.1 isn't bad) and use XP.

    Neville.

  2. QUOTE(jlau @ Jul 11 2007, 08:13 AM)

    Thanks for all those quick answers :) .

    So I cannot really simulate an RT target :( . I am going to run the target application on my desktop computer, with a Conditional Disable Structure for the TCP communication VIs (which cause errors because there is no network).

    (The application is driving a motor and receiving position data from the robot driven by this motor.)

    Jean

    TCP communication should work regardless of platform. Have you tried running the code on the desktop? The TCP primitives themselves should handle any platform-specific details invisibly.

    Like others mentioned, there are a few functions (usually from the RT palette) that won't work on the desktop platform (and vice versa for any ActiveX calls etc. on the RT platform), but the large majority of the code, I would say 99%, should work fine on the desktop.

    -------------------edit-------------------------

    Looking at your earlier screenshot, it looks like you have some RT FIFO VIs; these won't work on the desktop, but you could probably replace them with queue VIs (use a case structure wherever you have the FIFO VIs and substitute the equivalent queue VIs) and the code should run. A rough sketch of that substitution idea follows below.

    Neville.
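
    To illustrate the substitution idea outside LabVIEW, here is a minimal sketch in Python; it is not the RT FIFO or queue API, just the pattern of hiding the buffer behind one small wrapper and picking the implementation per platform (the names are illustrative assumptions):

        # Sketch of swapping an RT FIFO for a plain queue on the desktop,
        # the way a case structure / Conditional Disable would.
        import queue

        class DesktopBuffer:
            """Stands in for the RT FIFO when running on the desktop."""
            def __init__(self, depth=100):
                self._q = queue.Queue(maxsize=depth)

            def write(self, element):
                self._q.put(element)

            def read(self, timeout_s=1.0):
                return self._q.get(timeout=timeout_s)

        def make_buffer(on_rt_target: bool):
            if on_rt_target:
                # On a real RT target, return an RT-FIFO-backed implementation
                # here instead; that side is omitted in this sketch.
                raise NotImplementedError("RT FIFO branch not shown")
            return DesktopBuffer()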

  3. QUOTE(mateo @ Jun 26 2007, 09:52 AM)

    Dear friends,

    I am using the IMAQ JPEG DECODE and IMAQ JPEG ENCODE, but this 2nd VI wastes a great amount of system memory. I think the cause is the Call Library Function Node called "IMAQ_JPG_JpegBuffer2Image", but I don't know how to dispose of the memory allocation.

    Could anybody help me? Thanks in advance.

    What do you mean by "great amount"?

    Is it affecting performance?

    How large of a JPEG file are you trying to encode?

    The buffer in Encode may be re-usable for future calls. I wouldn't worry about it unless it is drastically affecting performance.

    Neville.

  4. QUOTE(orko @ May 24 2007, 10:34 AM)

    I'm getting very similar stats (~30 second first launch on 8.2.1; ~4-5 seconds after that) on my dev box, which is a Dell Precision 380 (2.8 GHz, 1 GB RAM).

    A question: About half of the startup time for the first launch (~14-16 seconds) is spent on the "Converting toolkits" phase. What exactly is LabVIEW doing during this time? This may point toward the cause of most of the initial delay.

    I found that mass-compiling the \National Instruments\Advanced Signal Processing Toolkit sped up my launch times to near light speed!

    Strange, since this folder has VIs saved in 7.0 without diagrams, and mass-compiling should not change them.

    Neville.

  5. QUOTE(EJW @ Apr 25 2007, 12:56 PM)

    Since this is being done in the initialization state of my program, does it really matter whether I use the local or the property node? I would think that during this state of the program, execution speed is not an issue. Am I correct in assuming that property nodes don't create copies of the data, while local variables do? However, property nodes do switch to the UI thread to obtain a value. Are these assumptions correct as well?

    I think you are correct. Property nodes won't create copies, BUT they are even slower than locals, probably due to the thread switch to the UI thread, among other things.

    In terms of speed:

    A wire is fastest, followed by a local variable, followed by a property node.

    For initialization, it may NOT matter which one you use for a few (or even a few hundred) controls, and property nodes can even help keep the block diagram neat: bury the nodes in a sub-VI and reuse it for resetting the properties of similar controls. I guess that was the idea behind having the Value property.

    Neville.

  6. QUOTE(ooth @ May 10 2007, 05:59 PM)

    Hmm... I haven't seen that. I have an RT utility application using the Internet Toolkit to download and upload files, reboot targets, back up their drives, etc.

    It was written in 7.0 and migrated (and modified) all the way to 8.2.1.

    It has an event structure that detects either the stop button being pressed or the panel close event, and then gracefully exits (fast).

    Do you have some open references and such lying around at the end that are being closed by LV before exit?

    I would look at the shutdown sequence in the code.

    Maybe look at the top-level VI settings, and play with those.

    Neville.

    QUOTE(jhoskins @ May 10 2007, 03:11 PM)

    This is really annoying.

    I got a complaint from one of my fellow engineers on the floor. His complaint was that when he closed my program it took the program a long time to disappear from the taskbar (not really a big deal to me). I mean the FP goes away but the taskbar indicator stays up for as long as 30 sec or more. I tried it on my development machine and sure enough it does the same thing. So I decided to put in the Quit LabVIEW function, same result. Then I decided to downgrade the program to LV 8.0. I did that and built it into an EXE and it worked perfectly. As soon as I closed the front panel it was removed from the taskbar within a second or two. So I then built (using 8.2.1) a really, really small app (while loop, random #, indicator) and built that into an EXE. I ran it and it had the same results. Has anyone seen this type of behavior using LV 8.2.1?

    I have attached that original problem VI.

    I took a look at your app. On panel close, the DAQ task isn't wired, which means the Clear Task is going to error out. Try it with that fixed.

    Neville.

  7. QUOTE(young grasshopper @ May 7 2007, 09:07 AM)

    Nope, I wouldn't need simultaneous AI & AO; the signal would be based on the one sample and should never change in form. The frequency is low because this is a waveform associated with a human heartbeat; the application is a patient simulator for testing pulse oximeters.

    Thanks

    I think hardware- or software-timed AO should work fine. Take a look at the DAQmx examples; you should see one that matches your needs.

    N.

  8. QUOTE(young grasshopper @ May 7 2007, 08:03 AM)

    I have no experience using analog outputs in LabVIEW, and I'm just trying to determine whether this would be possible, so I don't kill myself trying to do something which can't be done...

    If I acquired a sample of an analog waveform (somewhat resembling a 1V 80 Hz pulse), would there be any way to turn around and use that sample as the basis for a new signal being generated on the analog output line of a DAQ board? What I would like to be able to do is reproduce the original waveform, but change the frequency and amplitude as desired.

    I haven't looked into this that much, so I'm sorry if this is a dumb question, but from what I've read it seems like signals are mostly generated in LabVIEW through either an equation, or using generic signal types like a sine wave.

    Thanks for any advice!

    Your question isn't very clear... do you want simultaneous analog input and analog output, or is the AO calculated off-line?

    The frequencies seem fairly low (80 Hz?), so you should be able to get away with software-timed analog output. This will allow you to change the frequency (a sketch of the off-line scaling follows at the end of this post). You should be able to use hardware-timed analog input.

    I'm not sure if you can do hardware-timed AI and AO simultaneously while changing the frequency of the AO; most probably not. DAQ cards usually have only one on-board FIFO, which limits you to only one of the above.

    Using a more expensive DAQ card (for example M-Series as opposed to the cheaper E-Series) will give you more DMA channels for simultaneous AI/AO (but still won't allow changing the AO frequency on the fly).

    Speak to your local NI rep or call NI for more in-depth info.

    Neville.
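
    To make the off-line scaling concrete, here is a minimal sketch in Python/NumPy (not the DAQmx API); it takes one captured cycle of the waveform and rescales its amplitude and its number of samples before a software-timed AO loop would write it out. The function name and sample counts are illustrative assumptions.

        import numpy as np

        def rescale_cycle(cycle, gain=1.0, new_length=None):
            """Return one cycle with a new amplitude and sample count.

            cycle      : 1-D array holding one period of the acquired waveform
            gain       : amplitude scaling factor
            new_length : samples in the output cycle; at a fixed output rate,
                         fewer samples per cycle means a higher output frequency
            """
            if new_length is None:
                new_length = len(cycle)
            old_x = np.linspace(0.0, 1.0, len(cycle), endpoint=False)
            new_x = np.linspace(0.0, 1.0, new_length, endpoint=False)
            return gain * np.interp(new_x, old_x, cycle)

        # Example: one captured cycle of 1000 samples, doubled in amplitude
        # and compressed to 500 samples (twice the output frequency at the
        # same update rate).
        captured = np.sin(2 * np.pi * np.linspace(0.0, 1.0, 1000, endpoint=False))
        scaled = rescale_cycle(captured, gain=2.0, new_length=500)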

  9. QUOTE(Pablo Bleyer @ May 4 2007, 02:57 PM)

    Don't think so. I upgraded two different systems from 8.2 to 8.2.1 and did a completely new install of 8.2.1 on another. Same behavior on all three machines. The frequency of the problem is related to the size of the files. It *has* crashed a couple of times during development, like when changing the properties of some controls, but I guess that is the regular crash behavior of any application these days :-/. LV recovered the VI backup correctly on those occasions, though.

    I even thought it was the version of DAQmx installed, but I tried downgrading to 8.3 and had the same trouble. I am also trying to avoid any Express VIs; they seem to use more resources than regular VIs, and the crashes are more frequent when I use them.

    Cheers.

    I meant the VI's that you have written might be corrupted, not the vi.lib VI's.

    Your comment about "LV recovered the VI backup" brings to mind a similar problem I had a while back. The so-called "recovered" VI from backup was the one that was corrupted in my case. It caused all sorts of strange crashes until I replaced it with the same VI re-written from scratch.

    If you remember which of these "recovered" VIs you used, I would advise just re-writing them, or else obtaining a copy from an old backup and modifying it. I'm pretty sure that's your problem.

    Neville.

  10. QUOTE(Pablo Bleyer @ May 4 2007, 01:05 PM)

    Ok. I am starting to regret having upgraded to 8.2.1. It is crashing on my machines almost on a daily basis with my current applications. Before I submit a report to NI, is anybody else having problems processing *big* data files in LV and displaying them in graphs? (See attachment.) After working with the resource-hungry 8.0 and 8.2 series, I have a deep longing for the stability and reliability of the old LV versions... In fact, I am seriously considering downgrading to 7.1 at this point.

    Regards.

    Maybe some VI in your project got corrupted after the upgrade process?

    Is it really related to the size of your data file (doesn't crash with smaller files)?

    Go back to a previous backup of the VI or project and try to mass-compile/re-compile the project. See if it re-occurs.

    Neville.

  11. QUOTE(Thang Nguyen @ May 4 2007, 07:43 AM)

    I see that they put everything in the buffer and then plot it out. We can use an array, append a new value to this array, and plot it out at each interval. But if I run a test for three days, this will become a big memory problem for multiple plots. Is there any better way?

    I also need to update the plot at every value, with a dynamic interval.

    Thank you for your answer anyway,

    Thang Nguyen

    If you do a test "for 3 days", do you really have to see every single point of data you have acquired? You need to think about the resolution of your monitor with respect to the number of points you are trying to view.

    Or maybe resample your data points so that the dt is the same for all plots, and then just use a chart. There is an Express VI to resample data; extract the necessary VIs from it for your use.

    Or else, some form of intelligent data decimation is required (a rough sketch of the idea follows below). There is an article on the NI site on data decimation for display.

    Neville.
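
    To illustrate the decimation idea outside LabVIEW, here is a minimal sketch in Python; it keeps the min and max of each bucket so spikes survive, and it plots only a few thousand points regardless of how long the test runs. The bucket size and names are illustrative assumptions, not NI's algorithm.

        def decimate_min_max(samples, target_points=2000):
            """Reduce a long 1-D sequence to roughly target_points for plotting."""
            if len(samples) <= target_points:
                return list(samples)
            # Each bucket contributes two points (its min and its max).
            bucket = max(1, (2 * len(samples)) // target_points)
            out = []
            for i in range(0, len(samples), bucket):
                chunk = samples[i:i + bucket]
                out.append(min(chunk))
                out.append(max(chunk))
            return out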

  12. QUOTE(Louis Manfredi @ Apr 26 2007, 06:07 AM)

    Yes, same thing here in USA, always delivered by Fed Ex or some other really expensive means. Since my business is in a suburban residential area, the Fed Ex truck often makes a special trip.

    Maybe they are justifying the ridiculous shipping charges they now levy, which are a percentage (10%?) of the purchase price, i.e. if you bought SOFTWARE worth a few thousand dollars, you get charged hundreds of dollars to SHIP a few CDs!!

    Maybe this justifies the expensive looking cardboard carton as well.

    Neville.

  13. QUOTE(eaolson @ Apr 24 2007, 03:01 PM)

    Hmm... I seem to be able to write the data and read it back, as long as the image buffer is maintained.

    But saving the image doesn't seem to save the meta data as well.

    Silly me!! There is a SEPARATE VI, "IMAQ Read Image and Vision Info"; the regular "IMAQ Read Image" won't work.

    Neville.

  14. Hi all,

    Is there a way to embed some additional text information in a PNG image file? I want to be able to save some image crop information with the image to avoid having to maintain an additional file.

    There seem to be some VIs, "IMAQ Write Custom Data" and "IMAQ Read Custom Data", but they don't seem to be able to save the information to the PNG file itself (a general-purpose PNG text-chunk approach is sketched below).

    Anybody know what those specific VI's are for?

    Thanks,

    Neville
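
    For what it's worth, PNG files can carry text chunks, so crop information can ride along inside the image file itself. Here is a minimal sketch in Python/Pillow (not the IMAQ VIs); the key name and helper functions are illustrative assumptions, and out_path is assumed to end in .png.

        from PIL import Image
        from PIL.PngImagePlugin import PngInfo

        def save_with_crop_info(image_path, out_path, crop_rect):
            # Write the crop rectangle as a PNG text chunk alongside the pixels.
            img = Image.open(image_path)
            meta = PngInfo()
            meta.add_text("crop_info", ",".join(str(v) for v in crop_rect))
            img.save(out_path, pnginfo=meta)

        def read_crop_info(path):
            # Pillow exposes PNG text chunks through the .text dictionary.
            return Image.open(path).text.get("crop_info")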

  15. QUOTE(kate @ Apr 24 2007, 03:21 AM)

    hi there,

    I need some help to convert an image to a 2D array and a 2D array into a 1D array.

    If the image is 24-bit, I need to convert it to 8-bit grayscale.

    Do you have the IMAQ Vision toolkit? It is an add-on NI product. I would recommend you get it if you don't already have it (it is expensive, though, and also needs a separate paid run-time licence for executables).

    There are functions to convert an image buffer to a 2D array, as well as functions to cast an image to another type. These are optimized for speed when dealing with large image buffers. (The underlying grayscale math is sketched below.)

    Neville.
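
    For reference, here is a minimal sketch in Python/NumPy (outside IMAQ Vision) of the underlying math: a 24-bit RGB image is a height x width x 3 array, an 8-bit grayscale image is a weighted sum of the three channels, and flattening gives the 1-D array. The weights are the common ITU-R BT.601 luma coefficients; the function names are illustrative.

        import numpy as np

        def rgb_to_gray8(rgb):
            """rgb: uint8 array of shape (height, width, 3) -> uint8 (height, width)."""
            weights = np.array([0.299, 0.587, 0.114])
            gray = rgb.astype(np.float64) @ weights
            return np.clip(gray, 0, 255).astype(np.uint8)

        def to_1d(gray2d):
            # Flatten the 2-D grayscale image in row-major order.
            return gray2d.reshape(-1)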

  16. QUOTE(ControlEngineer @ Apr 17 2007, 10:11 PM)

    Hello

    Thank you Neville, I don't want to use analog outputs because I don't have enough channels to control more than two motors at the same time.

    AAAhhh..!!! You should have stated this earlier. I suspect you will be able to control AT MOST 2 motors with a single DAQ card. The reason is that each card has only one FIFO, and if you are going to be dynamically changing the frequency and/or the output value using hardware-timed I/O, you will be able to do it for at most one output channel.

    I think you will need to use counters for the PWM and not the digital outputs. You are limited to two counters and may even need both for one PWM. Sorry, I haven't used DAQmx in a while, but it's going to be difficult to control a whole bunch of PWMs with a single DAQ card.

    Just build a little "analog voltage in -> PWM out" circuit and control it via the analog output.

    Neville.

  17. QUOTE(Sally @ Apr 13 2007, 03:23 AM)

    Sorry, my topic must not have been titled correctly with "menu".

    So sorry.

    But a chart is also plotted versus time, yes? I want to change the x axis.

    Is there no option to wire both x and y values?

    How?

    I think what you're after is a chart that can take variable time (x) values as well. Currently, there is no such plot available in LabVIEW. You will have to build your own based on a graph, with the history data stored in shift registers (a rough sketch of the idea follows below).

    I think there is an example of that in LabVIEW... just search.

    Note that performance is not that great (for intensive applications) with a graph, since all the points are re-drawn for every update.

    Neville
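
    To illustrate the history-buffer idea outside LabVIEW, here is a minimal sketch in Python; it keeps a bounded history of (t, y) pairs and hands both arrays to an XY graph on every update. The class name and history length are illustrative assumptions.

        from collections import deque

        class XYChartHistory:
            def __init__(self, max_points=5000):
                self._t = deque(maxlen=max_points)  # variable-dt time stamps
                self._y = deque(maxlen=max_points)

            def append(self, t, y):
                self._t.append(t)
                self._y.append(y)

            def plot_data(self):
                # Hand these two arrays to an XY graph; the oldest points
                # drop off automatically once max_points is reached.
                return list(self._t), list(self._y)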

  18. QUOTE(ecarrig @ Apr 13 2007, 09:37 AM)

    hello lavaland ~

    I just got an error message in LabVIEW 7.1 while trying to use the SCXI-1125 with the 1313 terminal block. Apparently the EEPROM went bad on the card itself. This is the second unit that has failed in this manner. Has anyone else had problems with this card? Can anyone suggest any work-arounds? I need to measure up to four signals ranging from 0 to 300 volts very accurately, and would like to avoid building my own static attenuating circuit.

    thanks

    Eamon

    If it's under warranty, you could probably get it repaired by NI. Check that you are using an ESD strap when installing the unit into the chassis.

    I have used them in the past and never had problems measuring quite high voltages (around 60 V) as well as much smaller values simultaneously.

    Neville.

  19. QUOTE(Tomi Maila @ Apr 4 2007, 06:59 AM)

    I'd love to see a combination of an enum and a ring that would have the benefits of both of them. It would be like an enum in all other respects, but it would have a separate "display string" and "value string" for each element. This way one could localize the display string without affecting the value string.

    Tomi

    Try the combo box control. It allows separate display and value strings. I think it came out with LV 7.0.

    Neville.
