Posts posted by jdunham

  1. QUOTE (jorwex @ Aug 7 2008, 07:34 AM)

    I'm sure there's a good reason for it ...

    That's the true sign of a newbie. :rolleyes: No, there's no good reason. It's just a complicated feature set that mostly works and doesn't generate too many complaints so it has remained the same. I think you figured out why I didn't want to type up a whole set of instructions.

    I'm glad you got it working. One final tip: you can use the Control Editor to specify the exact size of the plot area. LabVIEW scales the axes automatically, but if you match the plot area to the size of your X and Y dimensions, then the rendering of the intensity graph will be faster and more accurate. If you're off by a few pixels, then LV has to do a 2D interpolation of the whole data set.

  2. Well a simple example I just made does not reproduce the problem. Basically I was passing App.MenuLaunchApp and App.MenuLaunchVI into the "Open VI Reference" function and getting error 1004, VI not in memory. However it's working fine right now. I'll be sure and post back if I can recreate the problem.

    OK. I figured out my problem. I tried to re-open the VI reference by name, but the second time, I did not wire the proper application reference. Thanks for listening!

  3. QUOTE (Darren @ Aug 6 2008, 12:58 PM)

    MenuLaunchApp should always give the application instance that owns the VI from which you launched the Tools menu option. If you're seeing different behavior, it might be a bug. Can you give me specific steps to reproduce the problem? I'm not seeing it here...

    -D

    Thanks for the quick reply.

    I have two projects open, one of which contains the VI I launched from, and one which doesn't. MenuLaunchApp is returning whichever project is frontmost. (still using LV 8.5, sorry).

    I'll try to get a test VI together.

    Jason

  4. I wrote some VIs which launch from the Tools menu (by the well-known trick of storing the VIs in the "C:\Program Files\National Instruments\LabVIEW 8.5\project\" folder). When I select the menu item, there is a property node for App.MenuLaunchVI, which gives the VI name, and there is App.MenuLaunchApp. I was hoping that gave the application instance which owns the launching VI, but it just gives the topmost lvproj which is open. Does anyone know how to get the AppInstance which owns the launching VI?

    Jason

  5. My pet peeve is the file dialog box. It's great that it starts in the last folder you accessed. However, only one last position is remembered, and it's shared between opening VIs and other code from the File menu and opening data files through the various run-time file methods. Only a crazy person keeps programs and data anywhere near each other on the hard drive, and it's very irritating to have to navigate back and forth.

    This gets even worse if you have parallel branches of similar code (managed with a professional source code control system, of course). Right after you switch branches, it is very easy to browse into the wrong working copy on your hard drive, since the folder structure is usually identical, and LV's file dialog box will almost always start you in the wrong one.

    With the miracle of lvproj "application instances" you can have both trees open at the same time, and the file dialog box will work as hard as possible to encourage you to cross-link your code. Often, loading the VIs will throw warnings, which you ignore at your peril, but the right place to fix this is the file dialog box's starting location.

    My wishes:

    1. For loading VIs, it should always start in the lvproj folder or one of its subfolders

    2. For loading data files with File VIs, path control browse buttons, the File Dialog function, etc., it should maintain a separate "last used folder" and start there.

    I have already submitted something like this to the LV product suggestion center, but I encourage others to do so too.
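    Wish number 2 amounts to keying the "last used folder" by context instead of sharing one global value. A minimal Python sketch of that bookkeeping (the context names and paths here are my own invention, not LabVIEW settings):

```python
# Sketch of per-context "last used folder" bookkeeping, as wished for above.
# The context keys ("source", "data") are hypothetical illustrations.

class LastFolderStore:
    def __init__(self, default="/"):
        self._default = default
        self._last = {}  # context -> last folder used in that context

    def get(self, context):
        # Each context remembers its own folder instead of sharing one global.
        return self._last.get(context, self._default)

    def remember(self, context, folder):
        self._last[context] = folder

store = LastFolderStore(default="/home/user")
store.remember("source", "/home/user/projects/myapp")
store.remember("data", "/home/user/measurements")
# Browsing for a VI no longer disturbs the data-file starting folder:
print(store.get("source"))  # /home/user/projects/myapp
print(store.get("data"))    # /home/user/measurements
```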

  6. QUOTE (Max1971 @ Jun 25 2008, 06:06 AM)

    I've found many examples on the net regarding synchronization between AI, AO, AI and Counter, etc., but nothing about two counters.

    You are on the right track.

    I don't see why those examples shouldn't help, though I could be more specific if you had links to those examples. The main reason DAQmx is better than the old driver is that the same functions work whether your tasks are AI, AO, or Counters. In the DAQmx example ...\LabVIEW 8.5\examples\DAQmx\Synchronization\Multi-Function.llb\Multi-Function-Ctr Retrigg Pulse Train Generation for AI Sample Clock.vi,

    you can see that the counter channel starts counting based on a trigger from another task. You should be able to use that same subvi (...\vi.lib\DAQmx\configure\trigger.llb\DAQmx Start Trigger (Digital Edge).vi) to start your counter task from another counter task's start or input signal.

    Good luck.

  7. One probable cause is a LabVIEW annoyance. If you are editing your code in the BranchA working copy, and then you close it and open your BranchB working copy by double-clicking the lvproj file or grabbing it from the recent files list, then when you use the Open... or Save As... functions, you usually end up browsing in the folders of the wrong branch! :blink: This is because LabVIEW remembers only the most recent folder you used with the file dialog box, whether you were opening a VI or accessing a data file while your program was running (I've complained many times about this). :angry:

    Of course in many cases, the wrong branch looks indistinguishable from the right branch, so if you don't inspect the file dialog box to see the complete path, it's very easy to save or load VIs from the wrong working copy. You have to be very vigilant. There is a lot NI can do to help this, but it doesn't seem to be much of a priority. I think they don't want to second-guess the user's on-disk folder layout, but this makes it very easy to make a mistake.

    OK, so if you are already cross-linked, the best thing to do is to move the working copy to a completely different place on the hard drive to make sure LabVIEW's relative path links can't find the cross-linked VIs from the other working copies. Make sure the working copy is moved to a different level (number of folders from the root) to confuse the links. Then when you load, you will still have to re-link a lot of the VIs yourself, but when you are done, hopefully everything is fixed. Of course you want to immediately Save All and then do an SVN commit.

    Godspeed

  8. QUOTE (James N @ Jun 18 2008, 10:00 AM)

    We had a similar problem working with branches early on.

    Our repo is structured like so:

    ../svn/scc/ProjectFiles/Project_xyz/Trunk/

    ../svn/scc/ProjectFiles/Project_xyz/Branches/

    The working copy on our hard drives looks like

    ..\ProjectFiles\Project_xyz - Pointing to code from the /trunk or /branch directory on the repo.

    Do not include \trunk or \branch in this path! This got us the first few times too. LabVIEW VIs know the absolute path to subVIs. If you change the absolute path to any subVIs, things can't be found. As you know.

    It's also important that anyone else co-developing the software keep the exact same directory structure on their hard drive.

    We do not have the \trunk revision and the \branch revision checked out on the same computer. You work with one -or- the other at any given time.

    -James

    First off, you should not be having these problems. You don't need to switch to Perforce. Something is going wrong.

    I don't agree with James; LabVIEW should be keeping only relative, not absolute, paths to each VI.

    I keep checkouts on my hard drive from a half-dozen active branches of the same code base, and they don't get cross-linked. Of course I am somewhat careful to make sure there are no links outside a given working copy except to the LabVIEW installation folder (vi.lib).

    I can open up multiple branches at the same time, as long as I rename one of the lvproj files (I usually revert afterwards rather than checking the renamed lvproj into SVN).

    I can move the working copies of any branch anywhere on my computer and it will open just fine, though all developers on our team do keep the same hard drive layout to make builds easier.

    Make sure you are double-checking the repository location of files you are opening by checking in the subversion tab of the windows file properties display.

    Look at your VI hierarchy with full paths turned on, and make sure that all VIs are inside your current working copy or else in vi.lib.

    Good luck, and if you have some more information or some samples, we can probably help some more.
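    To illustrate the relative-vs-absolute point in text code (this is a Python sketch of the idea, not how LabVIEW stores its links internally): if a caller records subVI locations relative to itself, the whole working copy can move and the links still resolve inside it. All paths here are hypothetical.

```python
import os.path

# Hypothetical paths for a caller VI and its subVI in one working copy.
caller = "/work/BranchA/ProjectX/main.vi"
subvi = "/work/BranchA/ProjectX/libs/helper.vi"

# A relative link is recorded against the caller's own folder.
rel = os.path.relpath(subvi, start=os.path.dirname(caller))
print(rel)  # libs/helper.vi

# Move the whole working copy anywhere; the relative link still resolves
# inside it, whereas an absolute link would point back at the old tree.
moved_dir = "/backup/deeper/BranchA/ProjectX"
resolved = os.path.normpath(os.path.join(moved_dir, rel))
print(resolved)  # /backup/deeper/BranchA/ProjectX/libs/helper.vi
```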

  9. QUOTE (menghuihantang @ Jun 17 2008, 12:55 PM)

    I am trying to finish a VI which can listen to a TCP/IP connection and also respond to user actions at the same time. I put them in one while loop and keep checking each thing, like whether anything arrived from TCP/IP and whether the user did anything on the front panel.

    But the result is the VI is not working at all. When it goes to listen to TCP/IP, it stops and waits for data. I wonder if there is event-driven listening for TCP/IP so that the VI doesn't need to be stuck there waiting for incoming data. Or another structure to implement what I want: both monitoring the TCP/IP and responding to user actions?

    Thanks.

    Use parallel while loops. That's what LabVIEW is great at. Keep each loop as simple as possible and minimize the interaction with the other loop. If you need to share data or events, you are going to need some kind of communication mechanism outside of the dataflow, like local or global variables, queues, notifiers, functional globals, or DataSockets.

    Your best bet is probably to have one loop for TCP/IP listening, one for user event listening, and then a third loop where all the action takes place, fed by your own events generated in the other loops and then passed into the main execution loop. Queues are the favorite way to do this. Do a search on "producer-consumer architecture" on this site and at zone.ni.com.
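    The shape of that layout, sketched in Python with threads and a queue standing in for LabVIEW's parallel loops and Queue functions (the message names are made up for illustration):

```python
import queue
import threading

events = queue.Queue()  # shared channel, like a LabVIEW queue refnum

def tcp_listener():
    # In the real program this loop would block on a TCP read;
    # here it just posts one fake message and exits.
    events.put(("tcp", b"hello"))

def ui_listener():
    # Stands in for the user-event loop; posts one fake button press.
    events.put(("ui", "start_button"))

def main_loop(n_expected):
    # The consumer loop: all the action happens here, fed by the producers.
    handled = []
    for _ in range(n_expected):
        source, _payload = events.get()  # blocks until a producer posts
        handled.append(source)
    return handled

producers = [threading.Thread(target=tcp_listener),
             threading.Thread(target=ui_listener)]
for t in producers:
    t.start()
for t in producers:
    t.join()

handled = sorted(main_loop(2))
print(handled)  # ['tcp', 'ui']
```

Neither "listener" ever blocks the consumer, which is the whole point of the producer-consumer layout.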

  10. QUOTE (Eugen Graf @ Jun 13 2008, 12:44 PM)

    But I saw that LV-Programmers use commonly a cluster (not a Variant!) with ENUM and Variant inside.

    If you are frequently using a typedef which contains an enum describing the type of data along with a variant containing the data itself, then you have a prime candidate for using LabVIEW lvclass objects (LabVOOP) instead. With LabVOOP, you can put different data types (different child classes) on the same wire without having to have an enum and lots of case structures wherever the variant data is interpreted. Instead, you drop your code into dynamically dispatched VIs, and at run time, the correct DD VI is called based on the type of data in the wire. This should work very well in combination with the use of queues to pass the data around.
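    The same contrast shows up in text languages. A Python sketch (all names illustrative): the enum-plus-variant style needs a case structure at every point of use, while subclasses with an overridden method pick the right code automatically, like a dynamically dispatched VI chosen by the object on the wire.

```python
# Enum + variant style: every consumer needs a case structure.
def describe_variant(kind, value):
    if kind == "temperature":
        return f"{value} degC"
    elif kind == "pressure":
        return f"{value} kPa"
    raise ValueError(kind)

# Class style: the case structure disappears into dynamic dispatch.
class Measurement:
    def __init__(self, value):
        self.value = value
    def describe(self):
        raise NotImplementedError

class Temperature(Measurement):
    def describe(self):
        return f"{self.value} degC"

class Pressure(Measurement):
    def describe(self):
        return f"{self.value} kPa"

# Different child classes travel on the same "wire" (list) together.
readings = [Temperature(21.5), Pressure(101.3)]
print([r.describe() for r in readings])  # ['21.5 degC', '101.3 kPa']
```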

  11. QUOTE (marp84 @ Jun 11 2008, 07:59 PM)

    I use LabVIEW 8.0 Professional. I've got NI DAQmx, MCC DAQ for USB, and VI Logger too, but this is all I get (see attachment).

    Can someone tell me how I can find the complete AI and AO library?

    Thanks

    I would just try to reload your Device Driver CD from your LabVIEW 8.0 install disks. At the beginning of the install, there are checkboxes for all the components to be installed. Make sure that "LabVIEW 8.0 Support" (or something similar) is activated under the DAQmx item.

    Good luck.

  12. QUOTE (Justin Goeres @ Jun 10 2008, 05:52 AM)

    I did feel bad about jumping all over the optimization point without offering anything constructive. Now that there is more information available, I can see you're doing it exactly right, Eugen. Code it up the easiest way first, and see whether it's fast enough, and only try to mess with it if it has a serious problem.

    QUOTE (Eugen Graf @ Jun 10 2008, 06:44 AM)

    I have to program an application which postprocesses some data, so it should be like MS Excel. My program should read raw data from flash over 7 slots x 15000 pages x 60 datasets (each dataset contains ca. 10 doubles as binary) and show this data in a table.

    I don't show all the data in one table, but showing one of the 7 slots is enough to make my PC slower and eat some RAM. The problem is not only that this data is duplicated; the bigger problem is the table with the data, because I convert binary data to ASCII!

    That was the first part of the program. The second part should read the saved table, convert it back to doubles (for plots), and do the postprocessing. After postprocessing I have to show the raw data and postprocessed data in one table. One row contains about 25 values! And this data is twice in RAM: doubles for the plots and ASCII for the table.

    I would consider keeping the storage of the huge datasets separate from the user interface. Instead of keeping them in ASCII, you could keep all the data in native binary, stored in a functional global (uninit shift register).

    You can't look at all that data at once anyway. You could just pull small segments of the data out of the storage and write them to the table. If someone modifies the table, you catch the event and update the storage in the right place. If the user scrolls the table or changes pages, then you go back to the storage, and get the new section of data and toss it into the table. Even if they scroll around a lot and your code has to do a lot of gymnastics it may still be quite a bit faster than keeping all the data in the table, and using the table as a storage mechanism.

    Jason
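    The paging idea in sketch form, with a Python class standing in for the functional global and hypothetical method names:

```python
# Storage kept in native binary form (a Python list stands in for the
# uninitialized shift register); the UI table only holds one visible page.

class DataStore:
    def __init__(self, data):
        self._data = data  # full dataset stays in native form

    def page(self, start, count):
        # Pull just the visible slice and format it for display on demand.
        return [f"{x:.2f}" for x in self._data[start:start + count]]

    def update(self, index, text):
        # A table-edit event writes back to the store at the right place.
        self._data[index] = float(text)

store = DataStore([float(i) for i in range(15000)])
visible = store.page(100, 3)  # only 3 rows are ever converted to ASCII
print(visible)                # ['100.00', '101.00', '102.00']
store.update(100, "99.50")    # user edits a cell; storage stays in sync
print(store.page(100, 1))     # ['99.50']
```

Scrolling or changing pages just calls page() again with a new start index, so the full dataset is never duplicated in ASCII.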

  13. QUOTE (Xrockyman @ Jun 9 2008, 03:30 PM)

    I'm using LabVIEW 8.2.1 with an NI DAQ 6015 (USB), and I have a rain sensor that I have to use. The device is only a switch that toggles every time 'x' mm^3 of water accumulates.

    I wanted to use a hardware counter that increments every time a change occurs in the sensor, so that whenever I want, I could check the value of the counter in my LabVIEW code.

    Another way to view the problem is to use an external "trigger" that works like an interrupt in a microcontroller: every time the "trigger" is activated, it forces a function to execute and afterwards the rest of the program continues. Is this possible to do?

    Sure, the NI DAQ products work great as hardware counters. LabVIEW doesn't really have a true interrupt-handling system which can be connected to a DAQ event, so you will probably need to poll the counter. There are lots of examples for reading from the counters in the LabVIEW Examples.

    If you have tried writing some code, and it's just not working, then try to post some code and you might get some more specific help to point you in the right direction.

    Salut,

    Jason
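    The polling approach amounts to a loop like this Python sketch (read_count here is a stand-in for a DAQmx counter read, not a real driver call):

```python
import time

def poll_counter(read_count, interval_s, n_polls):
    """Poll a hardware counter and report each change, like rain-gauge tips.

    read_count is any callable returning the current count; in the real
    program it would be a DAQmx counter read.
    """
    last = read_count()
    changes = []
    for _ in range(n_polls):
        time.sleep(interval_s)
        now = read_count()
        if now != last:  # one or more tips happened since the last poll
            changes.append(now - last)
            last = now
    return changes

# Simulated counter values returned by successive reads.
ticks = iter([0, 0, 1, 1, 3])
print(poll_counter(lambda: next(ticks), 0.001, 4))  # [1, 2]
```

Note that if several tips occur between two polls, the count difference still captures them all, which is exactly why a hardware counter beats software polling of the raw switch.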

  14. QUOTE (Eugen Graf @ Jun 9 2008, 03:55 PM)

    This absolutely applies to LabVIEW.

    The Java page was just the first one that came up on Google. People way smarter than me swear by these rules, in all languages. As my co-worker often says "the compiler is smarter than you". It's just not a good use of your time to try to outwit the development system, especially if it makes your code harder to read or harder to maintain. You should do what you are good at, which is coming up with nifty things to ask the computer to do for you, and you should let the computer do what it's good at, which is getting it done really freaking fast.

    I think you will find that programs with clean designs and clean diagrams actually run pretty fast.

    QUOTE (crelf @ Jun 9 2008, 03:27 PM)

    :blink:

    Wow - that's one hell of a generalisation. Even the page that it links to is over simplified. Whilst I agree, at least in part, with the sentiment, I've gotta give my standard response to generalisations like this: "it depends"
    :P
    Saying "Don't optimize as you go" is like saying don't write using punctuation. Also "...making sure that the code is clean... and understandable" can be considered forms of optimisation.

    I don't think that's a useful definition of optimiZation (c'mon, crelf, you live in North America now :P ). Using punctuation is like having clean, easy-to-read LabVIEW diagrams that humans can understand, which should be the primary design consideration.

    Human labor time is much more precious than computer time, so it doesn't make sense to optimize unless there is a problem (i.e. the user interface is sluggish for other humans), and if there's a problem, you can usually fix it by profiling the code and fixing it in just a very few places. If you write code that other humans (maybe yourself two years from now) are going to waste time understanding or debugging because it's so confusing, then where is the optimization in that?

    I think there's a lot of value in having clean code and doing sensible things, but Eugen's original question was about whether he should make his code messy in order to fix a 'problem' that was just speculative.

    BTW I learned all of this the hard way!

  15. QUOTE (Eugen Graf @ Jun 9 2008, 01:37 PM)

    1. That's what wires are for, so use them! Putting data out on the connector pane does not necessarily make a copy.

    2. The first rule of optimization (http://www.cs.cmu.edu/%7Ejch/java/rules.html) is: Don't do it!

    Your computer uses the same amount of electricity (in general) whether or not your code is efficient.

    From the above link:

    QUOTE

    Write your program without regard to possible optimizations, concentrating instead on making sure that the code is clean, correct, and understandable. If it's too big or too slow when you've finished, then you can consider optimizing it.

  16. QUOTE (rolfk @ Jun 4 2008, 10:05 PM)

    You inspired me, so I submitted this. Other people should go submit related ideas at the Product Suggestion Center (http://digital.ni.com/applications/psc.nsf/default?OpenForm). Vote early and vote often! :thumbup:

    QUOTE (LabVIEW Product Suggestion)

    Numeric controls and indicators are only one line, so they resize when the font is resized. String controls get larger when the fonts get larger, but are not reduced in size when the font is reduced. This makes it difficult when the VI is moved to a new OS, or loaded on a different computer with different font display settings. I realize this is made more complicated since there is some limited support for multiple font sizes in the same control (not used in 99.99% of cases).

    It's a real pain to get string inputs to line up (especially arrays of strings) when the display changes. Our current workaround is to set the fonts in stone and put Tahoma 13 in the INI files for all our built applications, but this is very limiting.

    What I would suggest:

    Add a string property which is "Scale to font size" and if this property is active, have a property "NumLines" (this would be analogous to "NumRows" in an array display). Those properties could also be combined by considering NumRows=-1 to be a fixed user-defined size (the current behavior).

    If a string input is changed to "limit to single line" = TRUE, then default NumLines to 1.

  17. QUOTE (rolfk @ Jun 4 2008, 01:02 PM)

    No! The problem is that numeric controls adapt their height to the font applied to the number inside, whereas strings do not do that. This is in fact a copying of Windows control behaviour which NI would have better left out IMHO. You can also see that you cannot resize numerics in height but only in length, whereas strings can be sized to any height independent of the font they display in.

    Rolf Kalbermatter

    I think we were trying to say the same thing.

    It would be great if you could set a string control (especially if input is limited to a single line) to scale with the font the way the numerics do.

  18. QUOTE (BrokenArrow @ Jun 3 2008, 06:11 AM)

    That's good info, I didn't know that, thanks.

    I know about that INI font deal; in fact it's all over the NI forums and I was on one of those threads years ago. However, my issue back then had to do with the fact that the EXE did not differentiate between a size 13 and size 14 Application Font, and would make them all 13, and we're talking about being on the same computer, same OS. On a side note, I recently built an application in 8.5 which had 13 and 14 size fonts, and it did not exhibit this problem, and I didn't change the INI, so maybe it's gotten smarter.

    I'll try the INI deal on our one Vista computer and see if it fixes my array in Dev mode. What is odd is how it (Vista) chose a different font for the Integer and String arrays, even though they were both Application 14.

    Richard

    I looked at your image. I think the fonts are exactly the same, but they are rendered to a different pixel size in Vista than they were in XP. Remember that integers can only be one line, so LabVIEW will always resize the numeric for the specific font size. For strings, the control itself does not resize automatically.

    As a test, select all of those objects and change the font size to 8 or 9 or something. The numeric array will get a lot smaller, but the string arrays won't change, even though their fonts do.

  19. QUOTE (BrokenArrow @ Jun 2 2008, 02:57 PM)

    I think it is the same issue. The first thing I had to do when I opened LabVIEW in Vista was change the default fonts from the Vista fonts (Segoe UI, which doesn't exist on XP) back to Tahoma, and then my existing GUIs were not all ruined. I think the folks in that thread could have solved their problems by changing the LabVIEW fonts to non-default (XP-compatible) settings rather than changing the theme. Changing the theme would be equivalent, but changing only the LV fonts would not affect the other applications.

    It's also possible that there are other NI display bugs, but I haven't seen similar issues, and I have been running Vista for 6 months, with old-style XP fonts, of course.

  20. QUOTE (James N @ May 8 2008, 09:06 AM)

    It's very simple if all VIs are loaded into memory.

    Open the top-level VI and hit Ctrl-F to bring up the "Find" dialog. Or via the menu, Edit>Find and Replace...

    Click the "Text" radio button.

    Type in the text to search.

    -James

    Note that the Find dialog lets you limit your text search to the block diagram only, and to labels versus data, which can be really handy when looking for more common words. The search box will (thankfully) remember your choices, which can trip you up next time, but generally I only search on block diagrams.

    This is really handy if you are searching for specific bundling and unbundling of items in a cluster. The item name will be on every front panel of every VI using the cluster, but often it will be in just a handful of places on diagrams.

  21. QUOTE (Jim Kring @ Apr 23 2008, 12:44 PM)

    Well, it's your club, so you can define it however you wish. I just figured that 12-bit club meant you needed to have 2^12 posts to be a member -- just as a million post club would probably mean you'd need a million points to be a member.

    I think it's more like how a 'six-figure income' is anything over $1E5.

  22. QUOTE (Dirk J. @ Apr 17 2008, 02:16 PM)

    I have the nagging feeling I'm overlooking something here...

    I want to create an instance of class "child" inheriting from a specific instance of class "parent", keeping the "parent" values.

    To clarify, lets say I have a class parent.lvclass with some attribute A which is initialized at some value.

    After that, I want to create (a couple of) instances of child.lvclass /keeping/ the previously initialized value of A.

    What I've seen so far is that if I try to directly cast parent --> child, I get an error 1448 saying "Bad type cast. LabVIEW cannot treat the run-time value of this LabVIEW class as an instance of the given LabVIEW class." Other casting (child --> parent --> child) loses the parent settings.

    Any thoughts?

    I don't understand how you can "create an instance" with previously initialized data, since LabVIEW classes are by-value. Are you just branching the parent wire and then casting it as child? I think if you post some example code, you will get more responses.
