Posts posted by Jon Kokott

  1. In general, references are automatically destroyed when their hierarchy goes idle or leaves memory. The hierarchy is determined by the top-level VI, which in your case would be the daemon. One way of working around this for queues and notifiers is to obtain the references by name. Then, the reference which was created in the daemon will be destroyed, but if the queue still has other references, it will not be destroyed.

    Correct me if I am wrong, but memory allocations for queues do not appear in a given VI's memory when using the VI Profiler. I assumed from this that queues would be "owned" (in terms of memory) by the active application. It seems counterintuitive that a VI hierarchy owns memory allocations rather than the owning application, if this is the way LabVIEW displays memory allocations.

    I might be venting a little bit here, as I would rather be given the ability to define the application/"VI hierarchy" that owns a given chunk of memory than have LabVIEW decide for me. I suppose I can always go back to C (joking...)

  2. I noticed something interesting today.

    If the daemon completes its run (the calling VI finishes execution) and has created any references (queues, notifiers, etc.), it destroys them upon completion.

    I've tried to see if this was because it wasn't opened using identical application references, but that does not seem to change the behavior.

    This means if you need any shared application memory you must open all references outside of the daemon source VI, or be ready to release them before the daemon completes.

    I'm not really sure why a queue reference is owned by the daemon alone, and cannot exist after the VI is closed.

    I'll think about adding an auto-close-reference option to the VI daemon, to make it wait indefinitely until the reference is manually closed.

  3. I've been dabbling in XNodes for a while, and it seems that the act of opening a VI containing an XNode generally causes code within the XNode to run. Is there any way to prevent an XNode from running its code at the time the VI is opened/compiled? Compiling is automatic in LabVIEW on open, as far as I know.

    ~Jon

  4. I still can't find any info about LV2010 SP2. Or did you mean SP1, which I already work with?

    I screwed up; this issue was fixed in SP1, so it should be working.

    I know that even after upgrading to SP1, you are forced to "rewire" all the property nodes that you created before SP1. The property nodes did all kinds of strange things pre-SP1 on my machine, but I've used them pretty extensively, mostly in dynamic dispatch methods (and I use the wizard often, but not always). Did you upgrade to 2010 SP1 after creating the property node methods? I basically had to scrap every single one of them when I upgraded to SP1 just to prevent the type of issue you are having now: it would hang occasionally, a lot of the time it would hard-crash LabVIEW (no save, just a total abort), and sometimes it would just garble the data and make it look like total junk everywhere.

    Sorry about that; it really is supposed to be fixed in SP1.

    ~Jon

  5. Oh, I didn't know about SP2... I'll try to find it.

    I did know about a problem with class properties in LV2010 (Known Issue ID 246263)... but as always I try to avoid x.x.0 releases. SP1 should have this fixed, but maybe I'm seeing some other bug!?

    The problem is with the way the properties are unbundled/bundled in the scripting of the property node. It has nothing to do with the static/dynamic nature of the accessor. The fact that you changed the implementation probably forced a wire propagation during compiling, and the act of rewiring is known to "correct" the error. At any rate, upgrading to SP2 will fix your problem.

  6. Hi,

    does anyone know about problems with dynamic dispatch accessor VIs for property nodes in general?

    I spent about a week hunting down the cause of application freezes. In the end, some LVOOP property nodes just never returned and the OS flagged the app as "Not Responding" (LV2010 SP1). I generated all my accessors with the dynamic dispatch wizard/template by default, which turned out to be a bad choice. After switching them all to static dispatch, the app finally works again.

    You must upgrade to SP2.

    Property nodes have problems in SP1.

    http://lavag.org/topic/13940-class-property-node-failure-when-in-lvlib/

    http://lavag.org/topic/13300-lvoop-property-nodes-and-dvrs/

  7. I'm trying to register for a Mouse Down? event on an element of an array which is on the FP of an XControl. I don't need this event to be dynamic, so I put it in a First Call case before the event structure in the facade. The event reg refnum is in an uninitialized shift register, so I expected this to work.

    I'm not using the Init and Uninit abilities for this, but aside from not unregistering the event, this seems like a simple way to do it. It works in development, but in a built exe the registered-for events aren't firing.

    Anyone know why this wouldn't work in an exe?

    Is the event you are registering for on the front panel of the XControl's Facade.vi?

    Why do you need to register for events in the first place?

    ~Jon

  8. I'm scratching my head a bit trying to figure out why one of these two ways of doing something is ~5 times slower than the other.

    I'm generating a command curve: an array of angles for a motor. I'm translating these to 2 of the 3 motor phases for output to a motor driver. I then take each of these phases and pass it through a zero-order hold and a low-pass filter.

    If, instead, I zero-order hold and low-pass filter the command and then translate that into the angles, it's about 5 times faster. Now I'd be happy to just leave it at that, but there are some artifacts if I do it this way near the trigonometric asymptotes (it causes short-duration spikes where it shouldn't). I know that I'm calling ZOH and the LPFilter twice as often the first way, but I can't understand why it's 4-5 times slower instead of twice as slow. Even more puzzling: if I turn the filter off (give it an input <= 0) and on in pre-filter mode, it only adds about 50% overhead vs. 400% when post-filtering.

    Anyone got any bright ideas or have I just made some stupid wiring mistake?

    The attached VIs are LV2010 and require the MathScript module to be installed to run. (Actually, you could get past that requirement by opening the Command to DAQ output tester, removing the Generate command curve VI, and replacing it with an initialized array of, say, 20000 doubles; the behavior is the same.)

    The two processes which you have executing in parallel (both inside case structures) are causing a lot of thread swapping (that's my guess, anyway).

    Force those operations to occur serially (only one at a time) and you should get the expected improvement in processing time.

    I'd be interested if building the VI into an executable helped the situation at all.

    ~Jon

  9. Thanks a lot for your suggestion. I am able to disconnect the VIs from the library. However, I am not able to "Set the 'VI Control Type' property to control", which is mentioned in your Step 4. I would really appreciate it if you could modify the attached VI to incorporate Step 4.

    Thanks,

    Neo

    You probably don't have VI Scripting enabled in the LabVIEW environment. Do a quick Google search and it should tell you how to enable it depending on the version of LabVIEW you are using (it requires a download pre-2010, I believe).

    A couple of pointers for your script: load all the VI references first, before modifying anything. Sometimes save dialogs will come up if you do not do it this way, as VIs enter/leave memory. I would modify the code to do the following:

    one for loop to load the VIs (get the VI reference)

    one for loop to modify (disconnect from the library, then test whether it is a control; if it is, set it to a standard control)

    one for loop to save them all to a new path.

    one more for loop to simply "save" them (do not supply a path)

    You have to double-save because you are moving subVI dependencies, and that matters to the loaded VIs. So even though you are saving the VI you modified, you have to re-save to account for the re-pathing.

    Do not save copies. The references will get all out of whack with cross-links if you try to load your newly copied VIs. Just back up your work first in case something gets hosed.

    One more thing: when you get all the descendants of the library, feed it the constant string "VI".

    You don't have to for this particular .lvlib, since there are only VIs in it, but if there were a virtual folder or something else in there, it would break your script.

    ~Jon

    I'd modify it for you, but I can't upload code right now. It's probably better in the long run that you learn to do it anyway.

    ~Jon

  10. You can disconnect a VI from its owning library with the "disconnectFromLibrary" method.

    As far as disconnecting all the typedefs from a library of VIs, I'm not aware of any way to do this without scripting.

    The best way to do this would be to

    1. Open the library reference (I prefer to open a VI in the library and use a "Library" property node read.)

    2. Use the "Get All Descendants" invoke node on the library and get the control VIs from the result.

    3. Read the "VI Reference" in a for loop over the descendant results.

    4. Set the "VI Control Type" property to control.

    5. Save each VI with an invoke node save.

    6. While you are at it, get all the standard VIs and call the "disconnectFromLibrary" method on them.

    7. Save standard VIs

    8. Save the library.

    That will disconnect all the typedefs by saving them as standard controls, instead of manually disconnecting them in each VI (that is a huge pain; I wouldn't even try).

    The disconnectFromLibrary method will take them all outside of the .lvlib as well.

    Before you try any of this, make sure you back everything up, including the projects that use this library.

    Additionally, I would run your script from the same project instance as the one you use these VIs in. Sometimes LabVIEW gets confused otherwise when you reopen the project.

    ~Jon

  11. If you want to save some money, in the past NI has offered the associate developer's test free of charge at annual regional developer days. (if you are in a hurry that might not be an option.)

    As far as the CLD is concerned, you can usually wrangle a $200 gift certificate for showing up to various NI-sponsored events (though being able to attend might be an issue, considering your employer's stance on the certification in the first place).

    I would talk to your local NI rep about your situation, I've found them to be very reasonable as far as certification billing is concerned (they might even waive the fee.)

    ~Jon

  12. I agree and don't use them in my code. It was just the easiest way to illustrate the key-value concept of identifying the correct queue in the array.

    Your solution might work, but I'm missing a few things. Couple questions:

    1. How are you storing the queues in the queue collection class (the first class)? Storing each one as a separate control isn't very scalable, especially if you have 1000s of queues. The best solution will allow an arbitrary number of queues.

    2. How does the message class (the dynamic dispatch vi) know which queue to get from the queue collection class (the first class) at runtime? It needs to have some way to tell the queue collection which queue to put the message on.

    3. How does the loop dequeue the message?

    There is, but it might not help you in this case. The 'Preserve Run-Time Class' prim downcasts based on object type, whereas the 'To More Specific Class' prim downcasts based on wire type. I don't know what will happen at run time if you feed the PRT output into the Obtain Queue prim. I suspect you'll get a run time queue of the child type. In any event, you're resorting to some (as you suggested) obscure code for (imo) very little gain.

    Yes, if the parent has accessor methods. I often create protected static accessor methods specifically for allowing children access to parent data.

    1. Both your example and mine are constructed such that one queue applies per class. There are much more elegant solutions available, but given the starting point I kept the construction as simple as your presented solution.

    2. Again, they are explicitly located in the dynamic dispatch. The collection class and the data classes are NOT decoupled; they don't have to be. (He's not writing the end-all-be-all messaging protocol.)

    3. With a dequeue? I don't understand the question.
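
    For comparison, the key-value lookup mentioned at the top of this post, written out in text rather than G, is roughly the following (a minimal Python sketch, not LabVIEW; the queue names are made up for illustration):

    from queue import Queue

    # one queue per message type, looked up by its name (the key)
    queues = {"command": Queue(), "status": Queue(), "error": Queue()}

    def enqueue(key, message):
        queues[key].put(message)   # the key identifies the destination queue

    enqueue("status", "motor started")
    print(queues["status"].get())  # -> "motor started"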

  13. Here is my suggestion:

    2 Class hierarchies

    Hierarchy 1.

    The first hierarchy is a class containing the 7 queues you are interested in using. (This does not actually need to be a class; the only reason to make it one is extensibility. Generally, if you don't use the dynamic dispatching feature, use an lvlib.)

    Hierarchy 2

    A class hierarchy with a placeholder parent (must override); the children are the datatypes you want to move to the dequeue destinations.

    A dynamic dispatch VI for enqueue (with one input for the class in hierarchy 1).

    Once you are in the dynamic dispatch VI, select the queue from the first class containing all the references (property node read, read-member VI, unbundle, whatever). There is a rough text sketch of this structure below.

    Why this is better:

    It's a lot faster than testing for named queues (even if there are only 7, and especially if there are 1000s).

    Naming queues is really really bad practice if you ask me.
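
    To put the same structure in text form (a rough Python analogue, not LabVIEW; the class and queue names are made up), the collection class owns the queues and each child message class picks its queue in its overriding enqueue method:

    from queue import Queue

    class QueueCollection:
        # hierarchy 1: a plain container for the queues of interest
        def __init__(self):
            self.command_queue = Queue()
            self.status_queue = Queue()

    class Message:
        # hierarchy 2: placeholder parent; children must override enqueue
        def enqueue(self, queues: QueueCollection):
            raise NotImplementedError

    class CommandMessage(Message):
        def __init__(self, text):
            self.text = text
        def enqueue(self, queues):
            queues.command_queue.put(self)   # this type always goes to the command queue

    class StatusMessage(Message):
        def __init__(self, text):
            self.text = text
        def enqueue(self, queues):
            queues.status_queue.put(self)

    queues = QueueCollection()
    for msg in (CommandMessage("move"), StatusMessage("idle")):
        msg.enqueue(queues)                  # dispatch on the class, no name lookup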

    ~Jon

  14. I have a subpanel in a VI which is running (call it "VI CONTAINER"), but the front panel is not visible. It inserts another VI (call it "VI INSERT").

    When I execute the code (from a top level), I invoke "frontpanel.open (activate=true, state=standard)" on "VI CONTAINER"; THEN I invoke Subpanel.Insert("VI INSERT"). The front panel of "VI INSERT" appears in the subpanel contained in "VI CONTAINER", but the controls/indicators never update.

    This only happens the first time I run the main VI. If I stop execution and then hit the run arrow again, "VI INSERT" updates its front panel normally after the Subpanel.Insert is called.

    What gives?

    This is especially bad because in a built application it never works, since it is always the first run. (It will work in built applications too: if you stop execution and hit run again after running it once, it works normally.)

    I am calling "frontpanel.defer frontpanelupdates = FALSE" on "VI INSERT"'s front panel in numerous places and at numerous times; it doesn't help.

  15. I posted a little prematurely and wanted to explore some options before I posted a question. I was trying to remove the post, but it appeared there was no way to do that.

    that said...

    I would like to remove the alpha history from my LVCLASSes. The mutation history is a great feature, but the build times are getting ridiculous for a project with LVCLASS history spanning 8.5 to 2010 SP1 (with service pack updates on many of the versions in between).

    I wanted to know if anyone had a tool to do this, but it appears it should be reasonably easy to write from scratch.

    I'm going to get all the classes in the project, rename them to some temporary name, save, then rename them to a different name. According to the documentation, that should clear the alpha history (and hopefully improve my build time from an hour to something like 10 minutes).

    ~Jon

  16. Class property nodes are broken pre-SP1 for LV2010, and I haven't tested whether they fixed the issue in SP1 yet, though it is supposed to be fixed.

    Here is a thread where I pretty much describe the same issue (LV crashing); there was a CAR assigned at some point.

    http://lavag.org/top...nodes-and-dvrs/

    CAR 255982

    ReturnChild Class Property Node does not output until deleting and rewiring any part of the VI

    This issue pertains to Object-Oriented Programming in LabVIEW. Essentially upon creating a Parent class and Child class, we are unable to read the child class properties in the main VI through property nodes unless we have changed the VI recently (i.e. unwire & rewire).

    Workaround: The workaround for this issue is to use subVIs created from 'VI for Dynamic Dispatch Template.' Within the subVIs you are able to unbundle and bundle the object.

    Reported Version: 2010 32-bit. Resolved Version: 2010 SP1 32-bit. Added: 12/31/2010

  17. Have you read the actual document that describes the flatten format of LabVIEW data? For the fundamental datatypes like scalars and structs, it can't get much more standard than the default C data format. The only LabVIEW specifics are the prepended string and array sizes, the structure alignment of 1 byte, and the default big-endian byte order.

    It only gets LabVIEW-specific when you talk about the aforementioned array sizes that get prepended, complex numbers and the extended-precision datatype, and other LabVIEW-specific datatypes such as timestamps, refnums, etc. As for open source, there exists an implementation, although not in C but in LabVIEW. Check out the OpenG lvdata Toolkit. Feel free to translate that to a C library or any other language of your choice :D

    I would be interested in a document describing the flattening format (I didn't think that this information was released). This is the closest thing I could find on a Google search:

    http://mp.ustu.ru/Users/Kalinin/Files/LabView/National%20Instruments6.0/LabVIEW%206/manuals/datastrg.pdf

    Something on ni.com or in the help files would be preferred.
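
    In the meantime, the rules quoted above (big-endian byte order, an i32 size prepended to strings and arrays, 1-byte alignment for clusters) can be sketched in a few lines of Python. Treat this as my reading of those rules, not an NI reference, and the function names are mine:

    import struct

    def flatten_string(data: bytes) -> bytes:
        # i32 length (big-endian) followed by the raw bytes
        return struct.pack(">i", len(data)) + data

    def flatten_dbl_array(values) -> bytes:
        # i32 element count, then each DBL in big-endian order
        return struct.pack(">i", len(values)) + b"".join(struct.pack(">d", v) for v in values)

    def flatten_cluster(flattened_elements) -> bytes:
        # elements are simply concatenated; 1-byte alignment means no padding
        return b"".join(flattened_elements)

    # flatten_string(b"abc") -> b'\x00\x00\x00\x03abc'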

  18. I've been tangling a little with some classes designed for data serialization and also with just trying to serialize some classes.

    One of the things I'm not wild about is the way that default values in a natively serialized object (using the binary or native XML tools) are omitted from the data. I feel like this breaks one of the goals of serializing the data, which is to separate the serialized data from the implementation of the object.

    I'm considering making it a convention to try to use default values in class private data that are null or otherwise illegal (in context).

    For new objects, this would require writing a constructor that sets the intended defaults.

    For deserialized objects, this would require writing a method to validate an object against a class invariant. I'll claim this is a good habit.

    Does anyone do it like this? Any drawbacks I'm missing? Any comments or ideas are appreciated.

    Thanks,

    -B

    When you say deserialized objects, are you talking about references to objects (SEQ/DVR/LCOD or whatever implementation)?

    In most implementations, getting a reference to an object is a private operation anyway; what do you need to validate for?
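
    As a text illustration of the convention being proposed (a Python sketch with a made-up class, fields, and JSON format; not LabVIEW and not the actual classes under discussion): sentinel defaults in the private data, a constructor that sets the real defaults, and a validation step against the class invariant after deserialization.

    import json

    class MotorConfig:
        def __init__(self):
            # defaults that are illegal in context, so an "unconstructed" object is obvious
            self.max_rpm = -1
            self.axis_name = ""

        @classmethod
        def create(cls, max_rpm=3000, axis_name="X"):
            # the "constructor": sets the intended defaults explicitly
            obj = cls()
            obj.max_rpm = max_rpm
            obj.axis_name = axis_name
            return obj

        def validate(self):
            # class invariant: reject anything still carrying a sentinel value
            if self.max_rpm <= 0 or not self.axis_name:
                raise ValueError("deserialized object violates the class invariant")

        @classmethod
        def from_json(cls, text):
            obj = cls()
            obj.__dict__.update(json.loads(text))
            obj.validate()   # validation happens on every deserialization
            return obj

    cfg = MotorConfig.from_json('{"max_rpm": 1500, "axis_name": "Y"}')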
