Posts posted by bjustice

  1. drjdpowell touched upon the primary (and initial) purpose for the code.  I had an issue where I wanted to use user events, but the rule of "who starts first" came into play.  This eliminated that worry.  I could spin up a sub-process, register it for the user event, and force the user event case (only in that subprocess) to initialize with the most recent data.  And, it's clean!  Notice how I'm able to share the "mycluster" user event case structure with both the initialization UE and the normal UE.  No tacky need for a separate event case that only handles first-call initialization, or anything similar.

  2. I made a fun piece of code, and thought that I'd share.  Maybe it'll spark some good discussion.  Here it is:

    Push-Notifier.zip

     

    I wanted to accomplish 2 things with this code:

    1) I wanted to be able to create a notifier that I can register as a user event.  Basically, I want a user event to be generated whenever a new notification is sent.

    2) When I register for a user event, I want to be able to also force the event structure to generate a user event using the most recent notification.  (Helpful for initialization of data in the event structure)
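
    Since it's hard to paste a block diagram as text, here's a rough Python analogue of the pattern (a hypothetical sketch, not the LabVIEW implementation): every new subscriber immediately receives the most recent notification before any later ones, so "who starts first" stops mattering.

        import queue
        import threading

        class PushNotifier:
            """Notifier that replays the latest value to each new subscriber."""

            def __init__(self):
                self._lock = threading.Lock()
                self._latest = None
                self._has_latest = False
                self._subscribers = []

            def register(self):
                """Subscribe; immediately receive the most recent notification, if any."""
                q = queue.Queue()
                with self._lock:
                    if self._has_latest:
                        q.put(self._latest)  # the forced "initialization" event
                    self._subscribers.append(q)
                return q

            def send(self, value):
                """Publish a notification to every registered subscriber."""
                with self._lock:
                    self._latest, self._has_latest = value, True
                    for q in self._subscribers:
                        q.put(value)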

     

    Demo VI looks like this:

    [screenshot]

     

    Anyways, thought this might be fun to share

     

  3. Benoit,  thanks for looking at this with me.

    It looks like the error you generated there results from the "Vertical Arrangement" property not being applicable to boolean text.  I get the following possible reasons for that error:

    [screenshot]

     

    So, this is good proof that not everything in the "text property palette" is compatible with boolean text.  Which is fine, but the text select end/start should return a similar error in order to indicate the lack of support.

    Also, as I said earlier, I can copy/paste text into the boolean text and have it retain properties.  This suggests that the properties themselves work; the property node interface just doesn't support setting them programmatically.

    [screenshot]

     

     

  4. X-POST to NI forums.  (No luck there so far.)
    https://forums.ni.com/t5/LabVIEW/button-with-different-size-text/m-p/3864798/highlight/false#M1095147

    I am hoping to determine if there is a programmatic method to have a button with boolean text with varying font size/color.  This is achievable with a string control using text selection end/start.  Booleans expose these same properties in the property node... but they don't appear to do anything.

    Even more interesting is that I can copy/paste text with varying color/size into the boolean text... and it will work.

    Thoughts?  Thanks! 

     

    bool_text.vi

  5. Hey guys!

    This might be a dumb question... so ignore me if I'm out of touch.

    But, I'm a big fan of the OpenG toolkit.  Is there intent to ever rewrite any of this codebase using malleable VIs?  I know that many of the tools (especially the array VIs) use polymorphic VIs in order to handle different data types.  When malleable VIs were announced, I always envisioned that this toolkit would immediately benefit from this new capability.

    I'm mostly just curious

     

    2018-10-16_16-30-17.png

  6. I haven't heard of TestScript before.  Sounds interesting.  But I have tried both the Enthought toolkit as well as the LabVIEW Python Node.

    I've been majorly impressed with the Enthought toolkit.  I've passed fairly large data to/from Python and the performance has been pretty great.  And it certainly solves some deployment headaches: it has a build mechanism that generates a self-contained Python environment containing only the libraries required to run your code.  You can copy/paste this environment directly onto a deployed machine and it will run!  No need to install and configure a Python environment from scratch for every deployed machine.  10/10 recommend

  7. Co-worker found the solution!

    2018-08-16_15-28-14.png

     

    This option in the packed library build spec will exclude the lvanalys.dll from being placed into the packed library build directory.

    Furthermore, when the application EXE build spec runs, it seems to be smart enough to include lvanalys.dll in the resource folder when the packed lib is a dependency.  (Cool!)

  8. Good people of LAVA,

    I am running into a peculiar issue involving packed libraries.  I have attached a zipped folder that illustrates this issue in a simplified format.  When I run the build spec in the attached folder, I get the following error:

    2018-08-16_13-46-45.png

    My understanding is that the application builder is running into this issue as a result of trying to include the lvanalys.dll twice:
    - lvanalys.dll is a dependency of G-code in the main.vi
    - lvanalys.dll is a dependency of the packed library build: "math.lvlibp"

     

    One workaround that I've discovered is to rename the dll in the packed library build spec:

    2018-08-16_14-02-20.png

    As you might expect, this results in the Main application builder including both lvanalys.dll and lvanalys_mathlib.dll in the resource folder.  However, since the names are different, this doesn't generate a namespace collision warning.  The downside is that this feels a bit kludgy, and it would be nicer to find a solution where the application builder is smart enough to understand that it only needs one copy of this dll in the resource folder.

    Thoughts?  Am I missing something?  I have only recently started to explore packed libraries.  Thanks!

     

    math.zip

  9. You guys all rock!

    Ok, so, for future readers, here are my lessons learned here:

    1. As you would expect, converting a U64 to a DBL loses precision in the fractional seconds (around the millisecond/microsecond regime).  As such, feeding the "Seconds to Date/Time" function a DBL input will lose precision in the fractional-seconds output
    2. The LabVIEW "timestamp" data structure consumes 16 bytes of memory, and can maintain precision at the nanosecond regime so long as I typecast the structure correctly.  (Cool!  I always assumed that the timestamp structure was a DBL under the hood.)
    3. The Seconds to Date/Time function can accept either a UINT64 or a TIMESTAMP as an input.  This yields a lossless (less lossy?) calculation.  It might even be possible that the UINT64 input option is a completely lossless calculation since there is no floating point math involved (maybe...)
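
    Lesson 1 is easy to demonstrate with any IEEE-754 double; here's a quick Python stand-in for the LabVIEW DBL (illustration only):

        ns = 1528210282999999871  # U64 nanosecond count; needs ~61 bits of integer precision
        print(int(float(ns)))     # 1528210282999999744 -- the low bits don't survive the 52-bit mantissa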

     

    After a bunch of unit testing and poking, I've decided to implement @infinitenothing's solution with one small tweak.  screenshot:

    2018-06-10_18-50-18.png

    The small tweak here is that I convert the remainder directly to microseconds as opposed to converting to seconds and then adding to the output of the Date/Time function.  This yields slightly better data.
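
    For readers without LabVIEW handy, the integer-only part of the math looks roughly like this in Python (a hypothetical rendering of the approach, not the actual VI; it assumes the 1904-01-01 LabVIEW epoch and that the UINT16 field holds microseconds within the current millisecond, since a UINT16 can't count microseconds across a whole day):

        NS_PER_DAY = 86_400 * 10**9

        def lv_ns_to_sun(ns_since_epoch):
            day, ns_of_day = divmod(ns_since_epoch, NS_PER_DAY)  # lossless integer division
            ms_of_day, ns_rem = divmod(ns_of_day, 10**6)         # whole milliseconds of the day
            us_of_ms = ns_rem // 1000                            # sub-millisecond microseconds
            return day, ms_of_day, us_of_ms

        # The two problem inputs from the original post now agree -- no rollover:
        print(lv_ns_to_sun(1528210282999999871))  # (17687, 53482999, 999)
        print(lv_ns_to_sun(1528210282999999872))  # (17687, 53482999, 999)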

     

    Thanks again everyone.  I was banging my head against a wall for a few hours last week on this.

     

  10. Hey guys!

    I have a problem that is giving me a headache.  Kudos to anyone with any suggestions.  I have a small subvi that needs to do the following:

    input = (UINT64) nanoseconds since the start of the LabVIEW time epoch

    output = (cluster) timestamp expressed as: (INT16) Julian day,  (UINT32) milliseconds since start of the day,  (UINT16) microseconds since start of the day.

     

    Now, I could indeed just use the LabVIEW seconds to Date/Time VI.  However, this VI gets very lossy near the fractional second regime.  (It is using a DBL after all.)  I've tried to correct for this by doing the sub-second math myself.  See my current attempt:
     

    2018-06-05_20-22-27.png

     

    However, this solution still has a bad edge case.  Try the following inputs:

    1528210282999999871

    1528210282999999872

    This shows that the "seconds" output of the LabVIEW Seconds to Date/Time function rolls over unexpectedly (I assume due to the lossiness of the function).
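
    For reference, the rollover reproduces with plain IEEE-754 doubles in any language; e.g., this Python stand-in for the DBL conversion (an illustration of the rounding only, not the VI):

        print(1528210282999999871 / 1e9)  # 1528210282.9999998
        print(1528210282999999872 / 1e9)  # 1528210283.0  <- rounds into the next whole second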

     

    Does anyone have any suggestions?  I could write this seconds to Date/Time code from scratch myself, but I'm unfamiliar with the annoying calculation for determining the number of days since the start of the LabVIEW epoch.

     

    Thoughts?  Suggestions?  Thanks!

    LV timestamp to SUN timestamp.vi

  11. Hey guys!

    I recently upgraded an application used across my company from LV2012 SP1 to LV2017.  (woot woot!)

    The #1 complaint (and this is a bit silly, really) is that LV2017 changed a characteristic of user input to numeric controls displaying relative time.  More specifically, take a look at the attached VI/screenshot.  In LV2012, a user is able to double-click the numeric input field (highlighting all text) and then enter a number.  If the display format is set to %<%M:%S>t, then this input is recognized as a desired number of minutes.  Thus, entering "2" yields a value of 120 in the control.  Fantastic!

    However, in LV2017, if a user highlights all text and then enters a number, no change is applied to the numeric value.  The input is rejected.

    Is there any mechanism for restoring the LV2012 behavior of highlighting all text and entering a value?

    Thanks!

    2018-01-22_15-44-45.png

    numeric.vi

  12. Hello everyone!

    I've noticed a behavior in LabVIEW that has me a bit puzzled, and I wanted to know what you guys think.

    (See attachments)

    I'm currently working in LabVIEW 2017.  As the image points out, I am able to use my mouse wheel to scroll control arrays.  However, whenever I mouse-wheel over an array indicator, the indicator jumps to index = 0.  Is this intended behavior?

    Since I like to be able to mouse-wheel on arrays in GUIs, I find myself creating control arrays, setting them to a disabled state, and then updating the controls via property node, so that I effectively have an indicator with mouse-wheel scrolling capability.

    Thoughts?

    Thanks!

    mousescroll.vi

    2017-12-11_9-33-17.png

  13. On 6/2/2017 at 10:29 PM, smithd said:

    As for the structure, you may wish to watch the JeffK + Mercer presentation from NI Week, which you can get here.

    My understanding: basically it is a disable structure where, instead of manually enabling/disabling, the compiler runs through all cases and enables the first case which doesn't cause a compiler error.  When used in conjunction with a VIM you can do fancy things.  For example, if you wanted to make one node that did "concatenate stuff", you could have two cases: case 1 assuming the inputs are strings, and case 2 assuming they are scalars or arrays.  If the type passed in is not a string, case 1 will cause a compiler error, and the compiler will go on to case 2 with the more flexible Build Array, which will compile.  In the NI Week presentation it sounded like it was mostly solid, but too early to be comfortable throwing it out to the masses yet. 

    Smith, thank you for your answer and for linking these pages.  I found them very helpful.
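
    In text-language terms, the net effect is something like dispatch picking the first implementation whose input types fit.  A loose Python analogue (run-time dispatch here, whereas the LabVIEW structure resolves everything at compile time; the function name is made up for illustration):

        from functools import singledispatch

        @singledispatch
        def concat_stuff(a, b):
            # fallback case: build an array (list)
            return [a, b]

        @concat_stuff.register
        def _(a: str, b: str):
            # specialized case: string concatenation
            return a + b

        print(concat_stuff("foo", "bar"))  # foobar
        print(concat_stuff(1, 2))          # [1, 2]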

  14. Hey guys!

    So, I've installed LabVIEW 2017 and I'm starting to play around with it.  Malleable VIs are cool!  (No more giant OpenG toolkits where there are 10 instances of the same VI for multiple datatypes.)

    Another cool thing that I've observed is that the example "sort 2D Array" function can support a scalar and an array input for index.  Upon further digging, there appears to be an interesting disable structure that intelligently selects the upstream input that yields non-broken code. 

    Does anyone have more information about this structure?  Is this related to the experimental structure that Hooovah talked about in his XNodes presentation at NI Week?  I'm just curious if this is stable... if I can start using it... and if there is any documentation that would tell me how to use it.
     

     

    2017-06-02_16-53-54.png

    2017-06-02_16-55-24.png

  15. Hi Hooovah!

    I suppose that I've lurked long enough; maybe I should post something.  So, hello!  I just got back from NI week --> this was my 2nd time attending.  I bought you a beer at the LAVA BBQ.  (That means I get Kudos, right?  How does this internet thing work?)  The advanced users track brought me back to the conference again this year.  What a great group of people :)

    I have a degree in aerospace engineering with no formal software engineering education.  I sorta taught myself LabVIEW in college for various projects. I've been working pretty heavily in LabVIEW during the last 4 years.  I was at SpaceX for a bit, and now I'm at Blue Origin.

    I don't post very much simply because I don't have the time.  Work keeps me busy, even at a full sprint.  When I jump on the forums, it's usually because I've run into a limitation of LabVIEW and am looking for a workaround.  If I don't find an acceptable workaround in 20 minutes, I usually move on to a different solution.  I don't have the time to post a question, collect further feedback, or contribute to a discussion.

    I wish that I could share more code with the community, but most of what I do is work related... and cannot be shared.  Nonetheless, I do have one project from college that I've posted online.  It's pretty old now, and I've come a long way since then, but maybe it's a project worth sharing in this thread.  Enjoy!: Link

    Anyways, I'll try to post more and participate in discussions.  You guys rock, cya at NI Week next year!

    -Brent

     
