Everything posted by bjustice

  1. Thanks Hooovahh, I've used your TDMS concatenate VIs in a few places. Really convenient to see this wrapped up in a VIPM package with a few other tools. I'll install this right alongside Hooovahh Arrays.
  2. I threw this together, and maybe someone will find it useful. I needed to interact with cmd.exe a bit more than the native System Exec.vi primitive allows, so I used .NET to get the job done. Some notable capabilities:
     - The user can see standard output and standard error in real time
     - The user can write a command to standard input
     - The user can query whether the process has completed
     - The user can abort the process by sending a Ctrl-C command
     Aborting the process was the trickiest part. I found a solution at the following article: http://stanislavs.org/stopping-command-line-applications-programatically-with-ctrl-c-events-from-net/#comment-2880 The ping demo illustrates this capability: to abort ping.exe from the command line, the user needs to send a Ctrl-C. We achieve this by invoking KERNEL32 to attach a console to the process ID and then sending a Ctrl-C event to that process. This is a clean solution that safely aborts ping.exe, and the best part is that it doesn't require any console window to be visible. An alternate solution was to start the cmd.exe process with a visible window and then issue a MainWindowClose command, but that required the window to be visible. I put this code together to let me interact better with HandBrakeCLI and FFmpeg. Enjoy! NET_Proc.zip
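     (Not the .NET code from the attached zip, but for anyone who wants the gist of the Ctrl-C trick in text form, here is a minimal sketch using Python and ctypes on Windows; it assumes you already have the process ID of the console process you want to interrupt.)

     import ctypes

     kernel32 = ctypes.windll.kernel32   # Windows-only
     CTRL_C_EVENT = 0

     def send_ctrl_c(pid):
         """Attach to the target's console and raise a Ctrl-C event in it."""
         kernel32.FreeConsole()                      # detach from our own console, if any
         if not kernel32.AttachConsole(pid):         # attach to the target process's console
             return False
         kernel32.SetConsoleCtrlHandler(None, True)  # keep the Ctrl-C from interrupting us too
         ok = kernel32.GenerateConsoleCtrlEvent(CTRL_C_EVENT, 0)  # 0 = whole console group
         kernel32.FreeConsole()
         kernel32.SetConsoleCtrlHandler(None, False)
         return bool(ok)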
  3. drjdpowell touched upon the primary (and initial) purpose of the code. I had an issue where I wanted to use user events, but the rule of "who starts first" came into play. This eliminated that worry: I could spin up a sub-process, register it for the user event, and force the user event case (only in that sub-process) to initialize with the most recent data. And it's clean! Notice how I'm able to share the "mycluster" user event case with both the initialization UE and the normal UE. There's no tacky need for another event case that only handles first-call initialization, or anything similar.
  4. Oh, and it's saved in LabVIEW 2018 and makes heavy use of VIMs, so apologies to those using older versions of LabVIEW.
  5. I made a fun piece of code and thought I'd share it. Maybe it'll spark some good discussion. Here it is: Push-Notifier.zip
     I wanted to accomplish two things with this code:
     1) I wanted to be able to create a notifier that I can register as a user event. Basically, I want a user event to be generated whenever a new notification is sent.
     2) When I register for a user event, I want to be able to also force the event structure to generate a user event using the most recent notification. (Helpful for initializing data in the event structure.)
     The demo VI looks like this: Anyway, thought this might be fun to share.
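     (If it helps to see the idea outside of LabVIEW: a rough text-language analogue, with hypothetical names rather than anything from the attached code, is a notifier that stores its latest value and replays it to any listener at registration time. A minimal Python sketch of that pattern:)

     class PushNotifier:
         """Toy analogue: new listeners immediately get the most recent notification."""
         def __init__(self):
             self._listeners = []
             self._latest = None
             self._has_value = False

         def register(self, callback):
             self._listeners.append(callback)
             if self._has_value:
                 callback(self._latest)      # replay the latest notification on registration

         def send(self, value):
             self._latest, self._has_value = value, True
             for cb in self._listeners:      # normal notification path
                 cb(value)

     A late-registering listener still receives the last value sent, which is the same "no separate first-call initialization case" property described above.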
  6. Interesting history, thanks for the information. I've reported this to NI and will post here if they generate a CAR. Thanks, everyone!
  7. Benoit, thanks for looking at this with me. It looks like the error you generated there is a result of the "Vertical Arrangement" property not being allowed on boolean text. I get the following possible reasons for that error: So this is good proof that not everything in the text property palette is compatible with boolean text. Which is fine, but then Text Selection End/Start should return a similar error to indicate the lack of support. Also, as I said earlier, I can copy/paste text into the boolean text and have it retain its properties, which means the property node interface simply doesn't appear to support programmatic interaction.
  8. X-POST to the NI forums (no luck there so far): https://forums.ni.com/t5/LabVIEW/button-with-different-size-text/m-p/3864798/highlight/false#M1095147 I am hoping to determine whether there is a programmatic method to give a button boolean text with varying font size/color. This is achievable with a string control using Text Selection End/Start. Booleans expose these same properties in the property node... but they don't appear to do anything. Even more interesting, I can copy/paste text with varying color/size into the boolean text... and it will work. Thoughts? Thanks! bool_text.vi
  9. I still actively use the OpenG toolkit. I would say that 90% of my usage lies in the Array and File palettes. I also recently learned that Hooovahh rewrote the Array palette using VIMs, so I've been using that in newer projects: Hooovahh Arrays. It might be nifty if OpenG added Hooovahh Arrays to the OpenG Arrays palette...
  10. Fantastic! This is exactly what I was envisioning. Thank you.
  11. Hey guys! This might be a dumb question... so ignore me if I'm out of touch. But I'm a big fan of the OpenG toolkit. Is there any intent to ever rewrite this codebase using malleable VIs? I know that many of the tools (especially the array VIs) use polymorphic VIs to handle different data types. When malleable VIs were announced, I envisioned that this toolkit would immediately benefit from the new capability. I'm mostly just curious.
  12. 7 years later, this information helped me! Thanks for coming back to post
  13. I haven't heard of TestScript before. Sounds interesting. But I have tried both the Enthought toolkit and the LabVIEW Python Node. I've been majorly impressed with the Enthought toolkit: I've passed fairly large data to/from Python and the performance has been pretty great. It certainly solves some deployment headaches, too. It has a build mechanism that generates a self-contained Python environment with only the libraries required to run your code. You can copy/paste this environment directly onto a deployed machine and it will run! No need to install and configure a Python environment from scratch for every deployed machine. 10/10, recommend.
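     (For anyone curious what the Python side looks like: as I understand it, both the Python Node and the Enthought toolkit call plain module-level functions, so the deployed environment only needs the interpreter plus whatever those functions import. A hypothetical module, with made-up names, might be as simple as:)

     # process.py -- hypothetical module called from LabVIEW
     import numpy as np

     def rms(samples):
         """Return the RMS of a 1-D array of samples passed in from LabVIEW."""
         data = np.asarray(samples, dtype=float)
         return float(np.sqrt(np.mean(data * data)))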
  14. A co-worker found the solution! This option in the packed library build spec excludes lvanalys.dll from being placed into the packed library build directory. Furthermore, when the application EXE build spec runs, it seems to be smart enough to include lvanalys.dll in the resource folder when the packed library is a dependency. (Cool!)
  15. Good people of LAVA, I am running into a peculiar issue involving packed libraries. I have attached a zipped folder that illustrates the issue in simplified form. When I run the build spec in the attached folder, I get the following error:
     My understanding is that the application builder runs into this because it tries to include lvanalys.dll twice:
     - lvanalys.dll is a dependency of the G-code in main.vi
     - lvanalys.dll is a dependency of the packed library build, "math.lvlibp"
     One workaround I've discovered is to rename the DLL in the packed library build spec. As you might expect, this results in the main application build including both lvanalys.dll and lvanalys_mathlib.dll in the resource folder. However, since the names are different, this doesn't generate a namespace collision warning. The downside is that it feels a bit kludgy; it would be nicer to find a solution where the application builder is smart enough to understand that it only needs one copy of this DLL in the resource folder. Thoughts? Am I missing something? I have only recently started to explore packed libraries. Thanks! math.zip
  16. You guys all rock! OK, for future readers, here are my lessons learned:
     - As you would expect, converting a U64 to a DBL loses precision in the fractional seconds (near the millisecond/microsecond regime). As such, feeding the Seconds to Date/Time function a DBL input results in a loss of precision in the fractional-seconds output.
     - The LabVIEW timestamp data structure consumes 16 bytes of memory and can maintain precision down to the nanosecond regime, so long as I typecast the structure correctly. (Cool! I always assumed that the timestamp was a DBL under the hood.)
     - The Seconds to Date/Time function can accept either a UINT64 or a timestamp as input, which yields a lossless (or at least less lossy) calculation. It may even be that the UINT64 input option is completely lossless, since no floating-point math is involved (maybe...).
     After a bunch of unit testing and poking, I've decided to implement @infinitenothing's solution with one small tweak (screenshot): I convert the remainder directly to microseconds instead of converting it to seconds and adding it to the output of the Date/Time function. This yields slightly better data. Thanks again, everyone. I was banging my head against a wall for a few hours last week on this.
  17. Hey guys! I have a problem that is giving me a headache. Kudos to anyone with suggestions. I have a small subVI that needs to do the following:
     input = (UINT64) nanoseconds since the start of the LabVIEW time epoch
     output = (cluster) timestamp expressed as: (INT16) Julian day, (UINT32) milliseconds since start of the day, (UINT16) microseconds since start of the day
     Now, I could just use the LabVIEW Seconds to Date/Time VI. However, that VI gets very lossy near the fractional-second regime (it is using a DBL, after all). I've tried to correct for this by doing the sub-second math myself; see my current attempt. However, this solution still has a bad edge case. Try the following inputs:
     1528210282999999871
     1528210282999999872
     This shows that the "seconds" output of the LabVIEW Seconds to Date/Time function rolls over unexpectedly (I assume due to the lossiness of the function). Does anyone have any suggestions? I could write the seconds-to-date/time code from scratch myself, but I'm unfamiliar with the annoying calculation for determining the number of days since the start of the LabVIEW epoch. Thoughts? Suggestions? Thanks! LV timestamp to SUN timestamp.vi
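     (For what it's worth, the pure-integer split is easy to sketch outside of LabVIEW. Python below, with two assumptions of mine that aren't spelled out in the post: "Julian day" is taken to mean whole days since the LabVIEW epoch, and the UINT16 microseconds field is taken to be the sub-millisecond remainder, since microseconds since the start of the day would overflow a UINT16.)

     NS_PER_DAY = 86_400 * 1_000_000_000
     NS_PER_MS = 1_000_000
     NS_PER_US = 1_000

     def split_ns(ns_since_epoch):
         """Integer-only split of UINT64 nanoseconds since the LabVIEW epoch."""
         day, ns_of_day = divmod(ns_since_epoch, NS_PER_DAY)   # whole days, leftover ns
         ms_of_day, ns_rem = divmod(ns_of_day, NS_PER_MS)      # ms since start of day
         us = ns_rem // NS_PER_US                              # sub-millisecond remainder, 0..999
         return day, ms_of_day, us

     # The two edge-case inputs from the post stay in the same millisecond here,
     # because no floating-point rounding is involved:
     print(split_ns(1528210282999999871))
     print(split_ns(1528210282999999872))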
  18. Hey guys! I recently upgraded an application used across my company from LV2012 SP1 to LV2017 (woot woot!). The #1 complaint (and this is a bit silly, really) is that LV2017 changed how user input is handled by numeric controls displaying relative time. More specifically, take a look at the attached VI/screenshot. In LV2012, a user is able to double-click the numeric input field (highlighting all text) and then enter a number. If the display format is set to %<%M:%S>t, this input is interpreted as a desired number of minutes, so entering "2" writes a value of 120 to the control. Fantastic! However, in LV2017, if a user highlights all text and then enters a number, no change is applied to the numeric value; the input is rejected. Is there any mechanism for enabling the LV2012 behavior of highlighting all text and entering a value? Thanks! numeric.vi
  19. Ok, I've cross-posted to the NI forums: https://forums.ni.com/t5/LabVIEW/mouse-scroll-array-indicators-bug-LAVA-x-post/td-p/3730630 I linked the LAVA forum there as well. (Not sure of etiquette here) thanks!
  20. Hello everyone! I've noticed a behavior in LabVIEW that has me a bit puzzled, and I wanted to know what you guys think (see attachments). I'm currently working in LabVIEW 2017. As the image points out, I am able to use my mouse wheel to scroll arrays that are controls. However, whenever I mouse-wheel over an array indicator, the indicator jumps to index 0. Is this intended behavior? Since I like to be able to mouse-wheel over arrays in GUIs, I find myself creating array controls, setting them to a disabled state, and then updating them via property node so that I effectively have an indicator with mouse-wheel scrolling. Thoughts? Thanks! mousescroll.vi
  21. Smith, thank you for your answer and for linking these pages. I found them very helpful.
  22. TortoiseSVN with the command-line tools installed. This lets me call SVN from the LabVIEW System Exec, which I use for various applications.
  23. Hey guys! So, I've installed LabVIEW 2017 and I'm starting to play around with it. Malleable VIs are cool! (No more giant OpenG toolkits where there are 10 instances of the same VI for multiple data types.) Another cool thing I've observed is that the example "Sort 2D Array" function can support either a scalar or an array input for the index. Upon further digging, there appears to be an interesting disable structure that intelligently selects the upstream input that yields non-broken code. Does anyone have more information about this structure? Is this related to the experimental structure that Hooovahh talked about in his XNodes presentation at NI Week? I'm just curious whether this is stable, whether I can start using it, and whether there is any documentation that would tell me how to use it.