Everything posted by jdunham

  1. QUOTE (Ic3Knight @ Feb 13 2009, 02:04 PM) It might be possible, but it's certainly foolish. A better approach is to make one process to write the DAQ data to disk, and make a separate process to crunch your data. The DAQmx API allows you to read the same data more than once, so each thread can keep track of its own read mark and operate independently. I have done this lots of times.
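     A minimal sketch of that pattern using NI's nidaqmx Python package (the channel name, rate, and read sizes are placeholders, and the property names assume the current nidaqmx API): each consumer keeps its own read mark and reads relative to the first sample, so the logger and the cruncher never disturb each other.

         import nidaqmx
         from nidaqmx.constants import AcquisitionType, ReadRelativeTo

         task = nidaqmx.Task()
         task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
         task.timing.cfg_samp_clk_timing(10000.0, sample_mode=AcquisitionType.CONTINUOUS)
         task.start()

         logger_mark = 0    # read position of the disk-logging consumer
         cruncher_mark = 0  # read position of the number-crunching consumer

         def read_from(mark, n=1000):
             """Read n samples starting at an absolute position in the buffer."""
             task.in_stream.relative_to = ReadRelativeTo.FIRST_SAMPLE
             task.in_stream.offset = mark
             return task.read(number_of_samples_per_channel=n), mark + n

         chunk, logger_mark = read_from(logger_mark)      # write this chunk to disk
         chunk, cruncher_mark = read_from(cruncher_mark)  # crunch this one separately
         task.close()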
  2. QUOTE (stilts @ Feb 13 2009, 02:08 PM) No. The VISA resource is just a reference to a session; the value on the wire refers to the open session, and the output is just there for your convenience. Only running VISA Close or terminating your VIs can close the session (the latter only works if you have a certain optional setting turned on). QUOTE (stilts) 2) Assuming I am required to connect "VISA resource name out," would I be able to use the close VISA session function in one of my case structure cases, or would I need to place this function outside the while loop? You can close the session anywhere you like, though I think the session may automatically reopen if you run one of your other cases after calling close. Close is still useful so that other programs may take control of the port.
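     The same life cycle in rough pyvisa terms (the resource name is a placeholder): the string is only an identifier, and nothing closes the session until you ask.

         import pyvisa

         rm = pyvisa.ResourceManager()
         inst = rm.open_resource("ASRL1::INSTR")  # opens the session
         print(inst.query("*IDN?"))               # uses the open session
         inst.close()                             # releases the port for other programs
         # Talking to inst again after close() requires opening the session again.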
  3. QUOTE (torekp @ Feb 13 2009, 05:09 AM) I don't think there's anything else going on. Some of NI's error handling is totally crappy, and some is just lame. Where's the call chain, or the ability to insert arbitrary run-time data (like the filename) and have it auto-labeled? The functions for generating errors and clearing errors are worthless. For example, the Clear Errors function clears any error, not just one matching the error code(s) you might be expecting. Part of the problem was that the original design (the error cluster itself) was not flexible enough, and another part is that they never use typedefs for internal functions, so there's no way for them to change a data structure like that once it's let loose. (I guess that could have consequences for reading old versions of the typedefs back from files, but I think they could have worked around that.) If you poke around the NI stuff, you can see they have a sort of syntax for adding items like the file name, so you can write code that detects a new error and adds the info you care about to the error.source field in the format that their dialog boxes understand.
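     As a hedged illustration of the "clear only the errors you expect" idea, here is the error cluster modeled as a plain Python tuple; none of this is an NI API, and -1073807339 is the standard VISA timeout code.

         EXPECTED = {-1073807339}  # VISA timeout: the one error we anticipate

         def clear_errors(error, expected_codes):
             """Clear the error only if its code matches one we were expecting."""
             status, code, source = error
             if status and code in expected_codes:
                 return (False, 0, "")  # expected: swallow it
             return error               # unexpected: pass it through untouched

         err = (True, -1073807339, "VISA Read in Poll Device.vi")
         err = clear_errors(err, EXPECTED)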
  4. QUOTE (Daklu @ Feb 13 2009, 01:16 PM) Geez, I hate to deconstruct a joke too far, but the point is that the software developer was the one who projected all the extra requirements by over-generalizing, while the king just wanted a toaster. It's a cautionary tale, to be sure.
  5. QUOTE (menghuihantang @ Feb 11 2009, 02:02 PM) Make sure you double-check all the parameter data types, and then check C calling convention versus Pascal/WinAPI calling convention. You can try them both, and usually one will crash and one won't.
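     In text form, the same distinction looks like this with Python's ctypes (the DLL and function names are invented for illustration; WinDLL exists only on Windows):

         import ctypes

         lib_cdecl = ctypes.CDLL("mydevice.dll")      # C (cdecl) calling convention
         lib_stdcall = ctypes.WinDLL("mydevice.dll")  # Pascal/WinAPI (stdcall) convention

         # Declaring parameter types explicitly is the ctypes equivalent of
         # double-checking the data types in the Call Library Function node.
         lib_stdcall.ReadValue.argtypes = [ctypes.c_int, ctypes.POINTER(ctypes.c_double)]
         lib_stdcall.ReadValue.restype = ctypes.c_int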
  6. QUOTE (OptimisticDan @ Feb 11 2009, 01:04 PM) The "...\documents\" folders are probably not going to contain settings. I found an XP computer and checked, and my file is at C:\Documents and Settings\All Users\Application Data\National Instruments\NIvisa\visaconf.ini. (I left out "Application Data" in my earlier path, and XP probably tries to do you a favor by skipping that folder when you search.) If you don't have that folder, then create it, run MAX, check your ports (maybe rename one), and then run LabVIEW. If that doesn't help, and you have already reinstalled VISA, then you'll have to keep working with NI tech support. Once you figure it out, everything should go smoothly. Anything you can do in HyperTerminal, you should be able to do in LabVIEW. Good luck!
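     If you'd rather script the check than browse for it, something like this works (the path is the XP one given above; adjust for Vista):

         import os

         path = (r"C:\Documents and Settings\All Users\Application Data"
                 r"\National Instruments\NIvisa\visaconf.ini")
         if os.path.exists(path):
             print(open(path).read())  # eyeball the COM port aliases
         elif not os.path.isdir(os.path.dirname(path)):
             os.makedirs(os.path.dirname(path))  # create the folder, then run MAX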
  7. QUOTE (OptimisticDan @ Feb 11 2009, 11:55 AM) I once had a problem where the "NIvisa" folder was not there and MAX was too dumb to either create it automatically or throw an error. Can you find any National Instruments folder anywhere in your Documents and Settings folder? (I may have made an error converting from my Vista path to the normal XP one). If you do have the NI folder with lots of other stuff but no NIvisa folder, then create the folder, run MAX, and try again with LabVIEW.
  8. Well I would check that C:\Documents and Settings\All Users\National Instruments\NIvisa\Nivisa.conf exists (slightly different path on Vista) and has contents that look reasonable. Mostly that just helps you map ASRL1::INSTR to the more normal COM1. If it's totally missing I think bad things will happen and you won't necessarily get an error (this burned me a couple years ago, but maybe they've fixed it). Have you tried NI tech support? It's good to ask on LAVA, but in general this is the kind of question they will handle better and faster. Good luck. Jason
  9. QUOTE (rom76 @ Feb 10 2009, 06:30 AM) Well, there are lots of ways. There is a timestamp data type which can be bundled in a cluster with whatever data type you are using for your messages. If that's not enough help, maybe you can post your code and tell us what's not working.
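     The "timestamp bundled in a cluster" idea, sketched as a Python dataclass (the field names are invented for the example):

         from dataclasses import dataclass
         from datetime import datetime

         @dataclass
         class StampedMessage:
             stamp: datetime  # when the message was produced
             payload: str     # whatever your message data type is

         msg = StampedMessage(datetime.now(), "valve opened")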
  10. QUOTE (Mads @ May 27 2008, 03:20 AM) What usually takes the time is the recursive parser. Re-parsing the same data type on every read and write is a big waste of time. I have some ideas about caching the decomposed type structure, but that wouldn't speed up the first call. It would be interesting to combine this approach with faster INI-file I/O routines.
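     The caching idea, in schematic Python (parse_type_structure is a stand-in for the recursive parser described above): decompose each type once, then reuse the result on every subsequent read and write.

         _type_cache = {}

         def parse_type_structure(fields):
             """Stand-in for the expensive recursive parser."""
             return [(name, kind) for name, kind in fields]

         def cached_structure(fields):
             key = tuple(fields)
             if key not in _type_cache:
                 _type_cache[key] = parse_type_structure(fields)  # slow, once per type
             return _type_cache[key]                              # fast ever after

         layout = cached_structure([("gain", "float"), ("channel", "int")])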
  11. QUOTE (AutoMeasure @ Feb 10 2009, 08:28 AM) We like Trac.
  12. First, I would try to do as much as possible with the "Timestamp" data type. Many LabVIEW analysis and plotting functions can deal with it. Second, I would only have your program operate on timestamps or values in seconds. If you simply must convert to days/hours/minutes/seconds, only do so just before displaying the value to the user. You don't want to do arithmetic on a complicated time data structure. Don't worry about optimization. Just keep your code as simple and clean as possible, and only worry about performance if you are having a problem. Good luck.
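     The same rule of thumb in Python terms: do all the arithmetic on plain seconds, and only format into days/hours/minutes/seconds at the display edge.

         import time

         start = time.time()
         # ... the program does its work ...
         elapsed = time.time() - start  # arithmetic stays in seconds

         def for_display(seconds):
             """Convert to d/h/m/s only at the last moment, for the user's eyes."""
             d, rem = divmod(int(seconds), 86400)
             h, rem = divmod(rem, 3600)
             m, s = divmod(rem, 60)
             return f"{d}d {h:02}:{m:02}:{s:02}"

         print(for_display(elapsed))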
  13. Also, if your commands don't change, you should use an enum rather than a ring. Then you don't need a property node, because you can use the OpenG "Get Strings From Enum" VI. Also, don't worry so much about 'efficiency'. Just make your code readable and maintainable. An RS232 instrument is going to be orders of magnitude slower than your LabVIEW program, so your program's speed will be throttled by the comms. Who cares if the rest of the processor cycles are taken up with some LabVIEW code you could have improved?
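     Loosely, the enum-instead-of-ring advice maps to this in Python (the command names are placeholders): the names and strings travel with the type itself, so no property node is needed.

         from enum import Enum

         class Command(Enum):
             IDN = "*IDN?"
             RESET = "*RST"

         for cmd in Command:
             print(cmd.name, cmd.value)  # every name/string pair, straight from the type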
  14. QUOTE (Michael Aivaliotis @ Feb 3 2009, 08:23 PM) By the way, I would fix the appFont, the systemFont and the dialogFont, just in case your existing work uses any of those. We don't usually call out the FPFont and the BDFont, and they will usually point to appFont.
  15. QUOTE (Val Brown @ Feb 3 2009, 09:52 PM) No, he was implying that test & measurement is like a toy or game compared to the big-boy world of consumer products and business applications. Think about HP and where their test & measurement business went. It was basically too small to bother with, so they spun it off. If 20% of Americans knew how to use LabVIEW and if X10 home automation were replaced by NI products and were in 30% of all dwellings and 90% of new construction, NI would probably be selling its T&M business to Jim Kring. Since HP went through exactly the same thing, you can be sure that NI has at some point tried to figure out whether they could pull this off.
  16. QUOTE (MJE @ Feb 3 2009, 08:43 PM) Yeah, I agree. When we build our application, we make sure those fonts are in the application's "labview.ini" file, because everything looks wretched otherwise. Forget about any kind of cross-platform GUIs. It sure would have been nice for NI to have dealt with this a bit better, though I know fonts have always been a pain for them.
  17. QUOTE (Aristos Queue @ Feb 4 2009, 08:12 AM) We have been vexed by this too, though it has been difficult to isolate into something we can report to NI. Thanks for the update!
  18. QUOTE (cheekychops @ Feb 3 2009, 05:40 PM) The Index Array function has a row input and a column input. You have wired the row input, so you are getting the rows in your 1-D arrays. If you wire the bottom input instead, you will get the columns you crave. Have fun wiring!
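     The same row/column distinction in NumPy terms, for reference (arr is a stand-in 2-D array):

         import numpy as np

         arr = np.arange(12).reshape(3, 4)
         row = arr[1, :]  # indexing the row: one row as a 1-D array
         col = arr[:, 2]  # indexing the column instead: the column you crave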
  19. QUOTE (Michael Aivaliotis @ Feb 3 2009, 05:03 PM) Just go into LabVIEW.ini and change all of your fonts to "Tahoma 13". It's just not worth the trouble to put up with the new font.
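     For reference, the lines in question look something like this (the exact token format can vary by LabVIEW version, so treat these as approximate):

         appFont = "Tahoma" 13
         systemFont = "Tahoma" 13
         dialogFont = "Tahoma" 13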
  20. QUOTE (Val Brown @ Feb 3 2009, 04:15 PM) Well, the advantage would be that the user base would probably increase by a factor of 20 to 50 in short order (wildly guessing, of course). The world would be divided into people who use LabVIEW and people who program "the old-fashioned way". What if 80% of the kids in the USA had built a LabVIEW-controlled robot by the time they reached 10th grade? I have no doubt NI evaluated this future and decided it was too risky and that they didn't know how to do it in a way that would benefit their shareholders. I don't know how to do it either, but I don't agree that it's naive to contemplate it.
  21. QUOTE (bsvingen @ Feb 2 2009, 04:39 PM) I think we can agree that a dataflow language is more useful when you have data flowing. I'm not sure we agree on much else, though. I think by-reference objects in LabVIEW are less intuitive, even if they can be very useful. QUOTE (bsvingen) Another thing that is counterintuitive and IMO counterproductive for everything except streaming data is the left-to-right paradigm. The natural thing to do when you can program in two dimensions is to fully use those dimensions to build abstract programming topologies, or topologies representing real-world topologies where data and messages flow in all directions. The natural thing to do when you have a bunch of cans of spray paint is to make a big muddy mess on the wall. I find the left-to-right paradigm helps me make sense of the dataflow; I can model the dataflow in my brain. Without the left-to-right convention, it would be much harder for me to create that model, and without that model I would find the code unmaintainable, unless LabVIEW had a totally different set of visual cues illustrating that omnidirectional flow. QUOTE (bsvingen) LabVIEW is similar to FORTRAN. They both can be used to make any kind of program, but they are both really only good at doing one single thing; datalogging and number crunching respectively. Like FORTRAN, the reason for this is not inherent shortcomings in the language structure, but because they both are tuned towards those specific tasks. All I can say is that I have written many VIs which do a lot more than datalogging and number crunching (though plenty of that too), and I think LabVIEW has been more of a help than a hindrance. Communications and feedback control come to mind right away. I have also had lots of success with data visualization.
  22. QUOTE (Daklu @ Feb 2 2009, 09:08 AM) OK, I'll ask the obvious... Why do you write test VIs for the private subvis? It seems to me that the nature of classes is that they expose a specific interface (all the public parts) and if you test the interface, then by definition your class works correctly. There shouldn't be any need to test the private methods inside the class. If they are not exercised by calling the public methods, then why do they exist at all? Having said that, I have to admit my unit testing is sometimes lacking, so I'm willing to learn and be corrected.
  23. QUOTE (Aristos Queue @ Feb 2 2009, 12:03 AM) Well, the OP said he was new to LabVIEW, and a lot of new users don't understand that the sort routine will take an array of clusters, that you can drop the index in there, and that this can be extremely useful. I hadn't thought about your idea of searching the array rather than sorting a cluster, though in most cases you would have to be very careful about duplicated search values. You might have to find all of the search values in the original array and come up with a way to handle them. That's another reason why the OP's request is totally bogus: if his FFT were symmetric, he would get duplicate median values on either side of the peak. I just can't imagine how that would be useful.
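     The cluster-with-index sort, sketched in Python (data is a stand-in array); note how the duplicated values make "the" index ambiguous, exactly as discussed above.

         data = [3.2, 1.5, 9.9, 1.5]
         clusters = [(v, i) for i, v in enumerate(data)]  # bundle each value with its index
         clusters.sort()                                  # sorts on value first
         median_value, original_index = clusters[len(clusters) // 2]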
  24. QUOTE (Aristos Queue @ Jan 30 2009, 01:34 PM) The problem is that your loyal users are all annoyed that the software development community won't take LabVIEW seriously, and any real projects in LabVIEW are under constant threat of conversion to other languages by order of management. So yeah, most of us agree that the current marketing focus on the noobs keeps the bills (and AQ's salary) paid, and that's generally good, but it does create problems. I wasn't trying to say it was a fatal problem, or even the biggest problem. Then again, it could be the biggest problem, because it has taken a successful and brilliant product and doomed it to a minor niche in the world of computing (sorry for being overly dramatic, you can start flaming me now!). QUOTE (bsvingen @ Jan 31 2009, 12:35 AM) I have always thought that the real strength of NI is the NI-DAQ drivers, enabling the use of NI hardware for just about everything possible. LabVIEW is just an add-on enabling easy or fast use of NI hardware. Dude, the real strength of LabVIEW is the intuitive nature of dataflow programming and the awesome task-scheduling engine that allows multithreaded program execution with hardly any management code required. It's an excellent signal processing and control environment. For me it's so much more than an add-on for manipulating NI hardware.
  25. QUOTE (Aristos Queue @ Feb 1 2009, 03:17 PM) ... if the array is sorted! Most interesting physical data is not sorted (until you run the Median function on it, but that function does not return the sort order). QUOTE (Anders Björk @ Jan 31 2009, 04:19 PM) You could sort your data first and take out index at 50% (of max index) of your sorted data. If you go this route, you should run your array into a FOR loop and cluster each point with its index value. Then sort the array of clusters. The median will be at the halfway point of the sorted array, and the second element in that cluster will be the original index. QUOTE (Laur @ Jan 31 2009, 03:54 PM) I am finding the frequency spectrum of a simulated sine wave, and I can get the correct median value; however, I am unsuccessful in finding the index value of where that median value is ... Any suggestions will be extremely appreciated! Laur, what the heck are you trying to measure? The FFT of a sine wave should be a sharp peak at the fundamental. The majority of the FFT values will represent the tiny amounts of noise at the rest of the frequencies, so whichever one happens to be the median will be at a nearly random frequency. A much more interesting measurement is the median power frequency, which divides the spectrum into equal areas. However, that is also trivial for a sine wave, since it should be at the fundamental.
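     Median power frequency, as described above, sketched with NumPy: find the frequency that splits the power spectrum into two equal areas. The signal here is a simulated 50 Hz sine, so the answer lands on the fundamental, which is exactly why the measurement is trivial in this case.

         import numpy as np

         fs = 1000.0
         t = np.arange(0, 1.0, 1 / fs)
         x = np.sin(2 * np.pi * 50.0 * t)    # simulated sine wave

         power = np.abs(np.fft.rfft(x)) ** 2 # one-sided power spectrum
         freqs = np.fft.rfftfreq(len(x), 1 / fs)
         cumulative = np.cumsum(power)
         mpf = freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]
         print(mpf)                          # ~50 Hz: the fundamental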