Posts posted by BrokenArrow

  1. QUOTE (Gary Rubin @ May 15 2008, 10:48 AM)

    My experience is that these are usually so poorly written (i.e. overuse of sequence structures) that we end up rewriting them almost from scratch anyway.

    Yep, or so old that they use the old serial driver interface! I just re-wrote a handful of turd balls from [popular signal generator company] into one VI. (note I said one "VI" not one "driver") ;)

  2. QUOTE (rolfk @ May 14 2008, 04:25 AM)

    As to being compiled, as far as LabVIEW is concerned there should be little difference between development system and runtime system performance.

    Rolf Kalbermatter

    Agreed! I wonder if the benchmarking routine itself could be to blame? Maybe "Tick Count" works differently in an EXE than in the dev environment, and what I'm seeing is the overhead of the Tick Count rather than any real time difference in the code between the ticks. (?)
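    Since a LabVIEW diagram can't be pasted here as text, here's a rough Python sketch of the concern above: measuring the cost of the tick call itself before trusting what it reports about the code between the ticks. (The summed loop is just a stand-in workload.)

```python
import time

def tick_ms() -> float:
    """Millisecond tick, loosely analogous to LabVIEW's Tick Count (ms)."""
    return time.perf_counter() * 1000.0

# Cost of the tick itself: two back-to-back calls.
t0 = tick_ms()
t1 = tick_ms()
timer_overhead = t1 - t0

# Time the code under test between two ticks.
start = tick_ms()
total = sum(range(100_000))  # stand-in for the code being benchmarked
elapsed = tick_ms() - start

# If timer_overhead is a sizable fraction of elapsed, the benchmark is
# mostly measuring the timer, not the code between the ticks.
```

    If the timer overhead differs between the dev environment and the built EXE, the benchmark numbers will differ even when the real code doesn't.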

  3. Thanks for all the responses. Great stuff. I've come to three conclusions:

    1. I've concluded that this topic is "entirely arbitrary and subjective" to borrow a phrase from Rolf. :worship:
    2. When the Quality Department asks me to divide my Software Release Document into the VI's that are drivers and the VI's that are not, I will just make a best guess and that'll be that. It may be up to non-LabVIEW programmers to make my mind up for me. :laugh:
    3. I know that when I add "EasyDAQ.dll" alongside my VI's, there's going to be heck to pay. :wacko:

    Richard

  4. I agree with the points that Gary and Paul have brought up. To take it a step further, a driver (to me) is something that a piece of hardware *can't* work without (think printer driver), whereas if you've written a LabVIEW VI that controls a device, the device may still stand on its own with something written in (god forbid) C or VB or Cobol or Fortran or Pascal or JAVA or HPV.... ... ...

    Thanks for the input, I've been trying to get people to stop calling LabVIEW VI's "drivers" because I think that terminology can confuse a customer/user.

    OK, back to my handful of turd balls. These arrays aren't going to index themselves.

    Richard

  5. When is a VI a "driver" ?

    1. It never is
    2. It always is, if it talks to hardware
    3. When the VI works only on a specific brand/model/part number of device
    4. When its main purpose is to make API calls
    5. When your manager, vendor, or girlfriend refers to it as such

    Thanks! :rolleyes:

  6. QUOTE (rolfk @ May 13 2008, 03:20 AM)

    Flush Buffer will delete data already in either the receive or transmit buffer. Discard Event will discard any events that might have already been queued for the current refnum. Those are two very different things.

    Rolf Kalbermatter

    Rolf,

    Indeed they are different; I didn't mean to imply that they were similar, but does it make sense to create an event and immediately discard it, as this example is doing? The reason I suggested the Flush was: if the hardware were filling the buffer with "stuff", by the time he got around to the next Read/Write there would be trash in the buffer, so a tabula rasa would make the reads clean. I thought maybe whatever the Discard asserts on the line might be analogous to clearing the line, but it isn't - thanks for the clarification ;)

  7. All of these serial devices have idiosyncrasies that often just have to be shotgunned. Still looks odd to me, the Discard. Maybe Flush Buffer would work? But hey, if it works it works! :thumbup:

    If you can't string together your commands with ";" or CR/LF (the typical method), you could still combine your Serial Writes with a For Loop.
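    The two framing options above can be sketched like this (Python rather than G, and the SCPI-style commands are made up for illustration):

```python
# Hypothetical instrument commands -- not from any real driver.
commands = ["FREQ 1000", "AMPL 2.5", "OUTP ON"]

# Option 1: chain everything into one write, separated by ";"
combined = ";".join(commands) + "\r\n"

# Option 2: one write per command (a For Loop in LabVIEW),
# each terminated with CR/LF.
framed = [cmd + "\r\n" for cmd in commands]
for msg in framed:
    pass  # a serial/VISA Write of msg would go here on real hardware
```

    Whether the device accepts chained commands or needs one write per command is device-specific; check the programming manual.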

    Congrats on the success, thanks for checking back in.

  8. QUOTE (fuzzyspot @ May 4 2008, 11:59 PM)

    ......I only typed in things like "BD1", "H+", etc. Should I convert them to HEX or DEC format? .....

    1. You need to leave the String input "Normal", not HEX and definitely not DEC. Those are plain ASCII characters you're sending.
    2. You need to send a CR with the string to satisfy the <enter> that you'd normally press in HyperTerm. Concatenate a CR or a CR/LF combo with your string when you send it. See attached JPG.
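    For what it's worth, point 2 boils down to this (a Python sketch of the string handling only; the helper name is mine, and "BD1" / "H+" are the same example commands from the quote):

```python
CR = "\r"  # carriage return -- the byte HyperTerminal sends for <enter>
LF = "\n"

def frame(cmd: str, terminator: str = CR) -> str:
    """Append the line terminator the instrument expects."""
    return cmd + terminator

frame("BD1")          # -> "BD1\r"
frame("H+", CR + LF)  # -> "H+\r\n"
```

    The characters stay plain ASCII the whole way; no HEX or DEC conversion is involved.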

    Richard

  9. QUOTE (Götz Becker @ May 5 2008, 04:13 AM)

    Hi again :) ,

    after the switch to 8.5.1 together with a new PXI controller the memory consumption remained stable......

    That's great news. :yes: NI said the next release (after 8.5) would fix some of the leaks. I'm glad to see that upgrading may be worth the trouble. Thanks for checking back in.

  10. Guten tag Markus,

    I don't have that Keithley driver (vi), so I can't steer you to an exact solution, but I do have a few comments:

    • Why do you Enable the Service request then immediately Discard it?
    • If Keithley supplied a set of VI's to you, then I doubt you even need to enable a VISA service. Check with Keithley for an example.

    Richard

  11. QUOTE (rolfk @ Apr 29 2008, 06:39 PM)

    ... getting rid of globals might have done this trick in two ways. First saving lots of data copies just for the sake of the removed global itself and the necessary changes in architecture might have given LabVIEW a chance to actually use optimization in other places too.

    My thoughts exactly. Many of those globals were being used as locals, and those "variables" were used to avoid shift registers. I think the customer got a lot more than he thinks from the cleanup labor, which he was initially skeptical of.

    Thanks for the responses! I need some fodder to explain why a 5.1 to 8.2 conversion runs faster. Getting rid of a lot of globals, using VISA rather than serpdrv, and removing the un-called panels (that used to run in the background) are the only things I can think of that programmatically sped things up. It likely has more to do with better folding / compiling since 5.1.

    Richard

  13. QUOTE (crelf @ Mar 20 2008, 04:38 PM)

    I'd like to nominate Jim Kring (http://forums.lavag.org/Jim-Kring-m17.html) for a (IMHO long overdue) TALM award. Jim's personal dedication to the LabVIEW community is truly wonderful:

    Agreed! Jim Kring is the Neil Peart of LabVIEW!

    On a side note: One thing I really like about LAVA is the diversity. All are welcome, but experts hold the fort. When I open a thread and see it peppered with names like crelf, Jim, Aristos, et al., it makes me want to read further. These experts add a sense of validity to a thread. I mean, who wants to see answers from squids like me? (although I nominate myself as the Champion of the One Hit Wonder).

    But.. what about the guru lurkers out there? I know of several mighty LabVIEW Champion members that just don't post. Attach the ChipIn widget and get to postin' !

    Richard

  14. Reem,

    That's not a DAQ, it's a breakout board. I honestly think you're missing some high-level knowledge. Is there anyone where you work who has programmed in LV and can help you? "Creating a Task" is plain LabVIEW talk. Read up on MAX (Measurement & Automation Explorer) and DAQmx, and find your DAQ!

  15. QUOTE (SPM @ Apr 28 2008, 10:49 AM)

    Can the configuration be made persistent so that it doesn't disappear between TestStand test steps?

    Certainly. Load the system parameters from a text file into the VI's that need the info. If VI's share information dynamically, a file is an easy sell-off to agencies because the data is "always there". As TestStand calls the VI, the VI will load what it needs from the file. It only takes a few ms to read several hundred lines of configuration data. When each VI exits, it relinquishes its memory (or at least it should - close your references), but that's OK because we were reading the parameters on each call. Would this scenario work? How are you doing it now?
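    A minimal sketch of that read-on-every-call pattern (Python rather than G; the parameter names and file format are invented for illustration):

```python
import os
import tempfile

def load_params(path: str) -> dict:
    """Re-read key = value pairs on every call; nothing has to persist
    between TestStand steps because the file is always there."""
    params = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            key, _, value = line.partition("=")
            params[key.strip()] = value.strip()
    return params

# Demo with a throwaway file standing in for the real config file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("# instrument setup\nbaud = 9600\ntimeout_ms = 500\n")
    cfg_path = f.name

params = load_params(cfg_path)  # would run at the top of each VI call
os.remove(cfg_path)
```

    Because the file is re-read at the top of each call, the VI can release all of its memory on exit and still "remember" the configuration next time.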

    edit: .... also ask yourself.. do you really need TestStand? What is it doing in this case that LabVIEW couldn't do by itself?

  16. SPM,

    You are implying that there is something intrinsically wrong with how LabVIEW handles memory. If that is what you truly believe, then you've answered your own question.

    You might need to explain your complaint more specifically in order to get help. For example, can you give an example where a VI had a hard time retaining a value? Also, how is TestStand going to tell LabVIEW to dump memory?

    Richard
