
BrokenArrow

Members • Posts: 257 • Joined • Last visited

Posts posted by BrokenArrow

  1. I've never seen a use of Notifiers where you couldn't have done the same thing with a Queue. That said, I like to use Notifiers to send status messages to a tiny parallel loop that sits there, un-spinning and patient, waiting for the Notifier to be updated. Then again, you can do that with a Queue too, but why? A "real" SEQ (single-element queue) is bulky to set up and use compared to a Notifier.
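    The distinction can be sketched in text code. A hedged Python analogy (the class and method names are mine, not LabVIEW's): a Notifier holds only the newest value and broadcasts it to anyone waiting, while a Queue buffers every message.

```python
import threading

class Notifier:
    """Latest-value-only broadcast: a reader sees just the newest status.
    A Queue, by contrast, would buffer every message ever sent."""

    def __init__(self):
        self._cond = threading.Condition()
        self._value = None
        self._version = 0

    def send(self, value):
        with self._cond:
            self._value = value        # overwrite: older statuses are dropped
            self._version += 1
            self._cond.notify_all()    # wake every waiting loop

    def wait(self, last_seen=0):
        """Block until a value newer than `last_seen` arrives."""
        with self._cond:
            self._cond.wait_for(lambda: self._version > last_seen)
            return self._value, self._version

# The tiny parallel status loop just calls wait() and sleeps in between;
# it never spins, and it only ever sees the latest status.
n = Notifier()
n.send("running")
n.send("done")          # "running" is overwritten, as a Notifier would
value, seen = n.wait(0)
```

The overwrite in `send` is the whole point: the status loop wakes up holding only the most recent message, which is exactly the behavior you'd otherwise have to fake with a single-element queue.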

  2. QUOTE (bmoyer @ Jun 5 2009, 07:52 AM)

    I think the release of the software included with LabVIEW 8.6.1 NI Developer Suite was the straw that broke the camel's back.

    Agreed. I've never regretted upgrading until I went to 8.6.1. I can't believe all the trial software it loaded that we don't own, and that I could have sworn I said not to load. I think uninstalling some of that trial software causes issues. I also have a system that can't make sense of a FieldPoint card, lost its installed COM port card, and wants me to install some generic PCI driver. Nothing was wrong before upgrading, so....

    As much as I like Linked Tunnels and smaller file sizes, it wasn't worth it. 8.5.1 was fine.

  3. QUOTE (James Beleau @ Jun 4 2009, 11:36 AM)

    ...you need to wipe that machine...

    Yep, after two weeks of support calls and a VI method of exporting as a work-around, that's NI's take on it too. I was told to uninstall MAX, but that can only be done by uninstalling everything that touches MAX, which is everything.

    MAX is a horrendously complex, ever-morphing beast, capable of changing its own code and causing ripples in the space-time thingy.

  4. Well, for those who are interested, this is a known issue at NI. If you have a computer with a lot of, um, "history", such as lots of LV installs (7.1 > 8.2 > 8.5 > 8.6.1) and perhaps one occasion of a corrupt MAX database that you've fixed (or it fixed itself), MAX may forget how to export channels and scales. It may or may not happen.

    I personally don't like to use custom scales (I do it programmatically), nor do I set up channels in MAX (programmatically again), so it's not a huge deal to me. But because I maintain three legacy machines running dozens of custom scales that have to be synchronized whenever someone adds or modifies a scale, this has been epic. :headbang:

    'sall i got

  5. After upgrading to 8.6.1, I can no longer export my channels and scales to a *.nce file with MAX. It just does not give me the option.

    On the dark side, I found a thread about DAQmx 8.8 possibly having problems, but after upgrading to 8.9, the problem still persists.

    I was wondering if someone with 8.6.1 can try to export scales - to see if you have the problem.

    thanks. :)

  6. QUOTE (JFM @ Apr 7 2009, 09:39 AM)

    Since more than one application might share a resource like the COM port, I think it is good practice to setup the communications within your application, instead of relying on the "global" settings.

    My 2c

    /J

    me too.

  7. QUOTE (d_nikolaos @ Apr 6 2009, 08:09 AM)

    Both of them are Windows Xp. The one that make the program has 2.4ghz processor and the other that i make the test is i think 3.2ghz.

    2.4ghz or 3.2ghz makes about as much difference for your issue as the brand of monitor.

    1) Again, like crossrulz said, make sure you have VISA installed by looking in MAX.

    2) Are you getting an error? If so, what's the error number/source?

    3) If you have VISA and are returning no errors: check your cable, and make sure it's not a null-modem cable (pins 2 and 3 crossed). Make sure the computer is set up properly; check Device Manager. Perhaps the COM port is configured wrong for your application.

    (SnagIt edge effects rule! :laugh: )

  8. Regarding the original topic, I am having an issue with the system numeric spinner on 8.5.1 where the min and max values I have entered will not be honored by the control. For example, in order to get a 4 to 20mA range, I had to enter 3 to 21 as min/max :angry: .

    Is this a known bug on 8.5.1?

    Edit: Never mind. I looked into it further. Not a bug. The increment resolution changes as it approaches the max. Not a bug, maybe, but not intuitive IMO.

  9. QUOTE (PSUstudent @ Mar 30 2009, 04:23 PM)

    Hello I am looking at reading a voltage from an external source, and using that voltage to trigger an event if it is above some value or not. Is this possible to do with the serial port.

    I know some of the other posts on serial port things involve a stream coming into the serial port, but we don't have a stream. If the serial port is not possible to use, then might you have a suggestion of what I could use. Our options are USB, serial, PS/2, ethernet port, and firewire port. We are using LabVIEW 8.0

    Thanks

    You need an A/D converter between your computer's port and the voltage source. Do you have that?

    You said you want to monitor the voltage and trigger upon a certain value. If you have a "voltage", then you have a "stream", so I don't understand when you say "we don't have a stream". You don't have a stream because you haven't built a pipe (interface cable) and selected a pump (hardware) to get the water from the stream to your computer.

  10. QUOTE (levin.hua @ Mar 29 2009, 02:17 AM)

    ...

    i'd like to know what skills are essential for being a LabVIEW engineer.

    You should have a working knowledge of the popular hardware interfaces: GPIB, Serial, USB, PXI, SCXI, etc.

    Be prepared to talk about more than one or two interfaces. When someone is being interviewed, and they wax poetic about a specific piece of equipment (such as Agilent), but don't bring up anything else, it's a red flag.

    Besides the listed hardware interfaces, you should be familiar with (at least) MAX (Measurement & Automation Explorer), Setting up and using DAQmx Tasks, and File I/O (logging to file). If I were interviewing someone and they balked at any of those basic items, it would be another red flag.

    Also, know your audience. Unless the person is a LabVIEW programmer, they will not care that you know a Queue from a Notifier, but they might be delighted that you can click a solenoid.

  11. QUOTE (neBulus @ Mar 18 2009, 09:33 AM)

    See reply #12 in this thread on the dark side (http://forums.ni.com/ni/board/message?board.id=170&view=by_date_ascending&message.id=191864#M191864), where Greg McKaskle explained why the VI Analyzer wants controls and indicators that are on the icon connector to be on the "root" of the diagram.

    Ben

    I read that post a while back and was wondering where it was - it was on my mind this week, thanks for finding it.

    "The caller has to protect the data on its wire from the subVI"... :blink: that kind of stuff blows my biscuit.

  12. QUOTE (ned @ Feb 26 2009, 12:46 PM)

    Not directly related, no.

    QUOTE (ned @ Feb 26 2009, 12:46 PM)

    With a local variable, there's no value pre-compiled into the code so need to make a copy at run-time.

    Perfect. Thanks for the explanation.

    What was that thing I saw a few weeks ago about the difference between wiring a constant to the top or bottom of a comparison or simple math function, and the effect on memory? I can't find it.

  13. QUOTE (crelf @ Feb 26 2009, 11:37 AM)

    I'm really hoping that's a typo tha should be 1Gb...

    1Gb, yes yes, 1Gb. I still can't get used to typing that.

    Back on topic, this VI has a few arrays that have values set as defaults (as opposed to being empty by default). The profiler would probably catch those.

  14. Hello everyone! I have some questions / comments about memory management.

    1)

    I noticed today that after I disabled some stuff, my Data Space memory went UP a bit (by 0.2 kB). Curious, I then deleted two local variables and replaced them with constants. My memory went UP again (by 0.2 kB). Putting the locals back, the memory went back down. Does this make any sense? It seems counter to what should happen.

    2)

    If you could take a look at the attached JPG: do these ratios look OK? If my "code" is taking 161 kB and "data" is taking up 99 kB, it seems like maybe I haven't wired my code as efficiently as I could to reduce data-space allocation. Or maybe that conclusion can't be gleaned from these figures? This code has no globals, but quite a few locals. It also has lots of dead-end tunnels, which I'm slowly working out, but I don't think those make copies of data. (?)

    3)

    Lately, I have been adding the Request Deallocation function to VIs that run just once per application and never get a chance to run again. Also lately, I have noticed a bit more disk swapping on one of my computers, which has only 1MB of RAM. I haven't tried a before-and-after, because the disk swapping is hard to pin down. Has anyone had issues with Request Deallocation?

    Thanks!

  15. QUOTE (torekp @ Jan 28 2009, 12:53 PM)

    Maybe I misunderstood your question, but you can use NI-MAX to create simulated DAQ devices, complete with all the usual channels. I've done that and gotten some good mileage out of it.

    I'm not at my NI-MAX computer right now, so the following is from memory.

    Just right-click in the place in the tree where your devices are listed, and choose Add DAQ device. You can choose a virtual/simulated device.

    You understood perfectly. That's a good idea. As long as the task type is the same as the other channels in the task, that'll work. I'm not too fond of having to carry around or modify a MAX setup wherever the app goes, though.

    The idea I came up with was to use some of the internal channels that these cards have (CJ temperature, excitation voltage, etc). Set the unused channels to one of those and it should be off and running. Haven't tried it yet though.

    Thanks

  16. QUOTE (Paul_at_Lowell @ Jan 28 2009, 11:14 AM)

    I don't think you will encounter any issues using a subVI. Have you experienced any issues after deploying the application and monitoring the PC's memory usage?

    Well, no, not on this application, but in the past, I had an RT application that had a huge memory leak. Some of that issue ended up being a bug that NI admitted (fixed on 8.5.1). But since then, I get very nervous around applications that run for days on end.

    QUOTE (neBulus @ Jan 28 2009, 11:28 AM)

    The only thing I see that could be different is the cluster. In the lower no-sub-VI loop, LV can repeatedly throw the last value into the buffer that is the output tunnel of the while loop. In the sub-VI flavor, the sub-VI does not know the cluster is not being used while looping, so it has to return a copy of that data each time.

    On the non-SubVI flavor, which was a paste from the SubVI, I forgot the shift register for the incoming cluster. It should be there. Actually, since the cluster will change every iteration, I could just wire a constant there. Thanks Ben, I like your new name.

    ------

    What do we think about setting a SubVI like this one to 'Load and retain' ?

  17. In a loop performing reads on a card 24 hours a day (analog inputs in this case), are there any issues with regard to using a sub-VI or not? Note the JPG, and note that in this case the loop is slow, so we don't care so much about the 2.5 µs (?) it takes to open and close the VI every loop iteration - I'm just wondering about the effect on memory in the long term. I would definitely turn off debugging, but what about any gains from the Priority settings? Or Reload for each call vs. load and retain vs. load with callers ... etc.

  18. Is there a way to substitute a false channel in a DAQmx task? I think this would be handy. For example, if you have a 12-channel task, and once in a while you didn't want channels 8 and 10. Since the code downstream may be expecting 12 channels, 12 elements, a cluster of 12 etc etc, it would be nice to just seed those channels with false data - put something in there, even though there's no real data, so the task still has 12 channels.

    Any ideas?

    Thanks!
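    As far as I know, DAQmx has no built-in "dummy channel", so the seeding could instead be done just after the read. A rough Python sketch (the function name and the NaN filler are my choices, not a DAQmx feature) of re-expanding a reduced read back to the full channel count:

```python
import math

def pad_channels(samples, disabled, total=12, filler=math.nan):
    """Re-expand a reduced DAQ read back to the full channel count.

    samples  -- one value per *acquired* channel, in channel order
    disabled -- set of channel indices left out of the task
    Returns `total` values with `filler` in the disabled slots, so
    downstream code expecting `total` channels keeps working.
    """
    acquired = iter(samples)
    return [filler if ch in disabled else next(acquired)
            for ch in range(total)]

# Example: a 12-channel task read with channels 8 and 10 dropped
row = pad_channels(list(range(10)), disabled={8, 10})
```

Downstream code still sees 12 elements per scan; anything that cares about the fake channels can test for NaN instead of re-indexing.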

  19. QUOTE (normandinf @ Jan 12 2009, 04:11 PM)

    Hi all,

    I'm building a list of available DO physical channels and when a user selects multiple channels, the result I get is a string like this:

    "Dev2/port1/line0, Dev2/port1/line1, Dev2/port1/line2, Dev2/port1/line3, Dev2/port1/line5, Dev2/port1/line6"

    but when I select multiple channels from the Browse button of the physical channels control, then I get a much simpler "Dev2/port1/line0:3, Dev2/port1/line5:6".

    Is there an easy way to group a list of single channels?

    Sorry that I don't have an answer for you, but I will supply a "bump" for your thread ;) . Maybe someone has created a VI that compiles a nice ranged (colons) grouping of channels based on user input that does not use strings?

    There's this thing: (sorry for two pictures - should be one; either my computer or the IPS is misbehaving.)
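    For what it's worth, the grouping could also be done with a bit of string handling. A rough Python sketch (function name is mine; this is not a shipping VI) that collapses consecutive line numbers into DAQmx-style colon ranges:

```python
import re
from itertools import groupby

def group_channels(chan_list):
    """Collapse "Dev2/port1/line0, Dev2/port1/line1, ..." into
    DAQmx-style ranges such as "Dev2/port1/line0:1"."""
    by_prefix = {}
    for chan in chan_list.split(","):
        # split "Dev2/port1/line5" into prefix + trailing line number
        m = re.match(r"(.*?)(\d+)$", chan.strip())
        by_prefix.setdefault(m.group(1), []).append(int(m.group(2)))

    parts = []
    for prefix, nums in by_prefix.items():
        nums.sort()
        # consecutive numbers share a constant (value - index) within a run
        for _, run in groupby(enumerate(nums), lambda p: p[1] - p[0]):
            run = [n for _, n in run]
            if len(run) == 1:
                parts.append(f"{prefix}{run[0]}")
            else:
                parts.append(f"{prefix}{run[0]}:{run[-1]}")
    return ", ".join(parts)
```

Feeding it the string from the post should give the same compact form the Browse button produces.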

  20. QUOTE (Yair @ Dec 20 2008, 11:49 AM)

    But that's exactly the point of an indexable and searchable database. Why is opening a book or a virtual document any easier than searching using the search function (or Quick Drop, both available using a couple of clicks)? In both cases you would have to KNOW to do what you want.

    Very true. But the point was, an alphabetical list will show you things regardless of the location in the palette. Take my example - when the user happens upon "Mean.vi" while looking around for a function to take an average, and no other VI's in the adjacent palettes start with "Mean", he/she is likely to give up, and may not think to use Search to find something else that starts with Mean.

    Who would think, intuitively, that:

    Mean.vi is in Mathematics > Probability and Statistics

    MeanPtByPt.vi is in Signal Processing > Point by Point > Probability and Statistics Pt By Pt

    An alphabetical list view would have revealed the other VI to the user.

  21. QUOTE (NI Guy @ Dec 19 2008, 11:01 AM)

    ...... http://sine.ni.com/nips/cds/view/p/lang/en/nid/2454

    I'd be curious to hear any feedback on these documents. Did you know about them already? Are they answering any of the questions you have in mind? Is there a better way to organize these kind of docs?

    That is a very decent database, thanks for the update. No, I didn't know it existed - I don't typically look at those pages because I assumed they are targeted at promoting LabVIEW to novices.

    So, while that is close, is there an alphabetical index of every VI and function? A good example of the need for this: say you are browsing the palette and find Mean.vi - you might assume you've found all the VIs having to do with taking the mean, but there's also the Point by Point version, in a completely different major palette. I dare say that if you hadn't used Search for "Mean", you might never have known of Mean PtByPt's existence. However, if you were looking at an alphabetical list, the two choices would be side by side, regardless of where NI has decided they should go on the palette.
