A Scottish moose

Members
  • Content Count: 42
  • Joined
  • Last visited
  • Days Won: 1

A Scottish moose last won the day on June 9, 2017

A Scottish moose had the most liked content!

Community Reputation

5

About A Scottish moose

  • Rank: More Active

LabVIEW Information

  • Version: LabVIEW 2016
  • Since: 2010

Recent Profile Visitors

1,523 profile views
  1. Hey everyone! I've looked through Google and the lavag forums for this problem and haven't found much, so I thought I'd ask and see if anyone had some ideas. Problem: I have an executable that is launching applets from several different PPLs. I am defining which PPL should be loaded by passing an INI file as a command line argument, and that file also provides any other important start-up information (a rough config-reading sketch appears after this post list). I've also been trying to force the application's icon to update so that, depending on what the user is doing, the icon matches the process I'm presenting to them. Originall
  2. Hello everyone, TL;DR - Any thoughts on whether it's a good or bad idea to spin off an actor from a QMH framework? (A queue-based analogy appears after this post list.) I've got some library code that would be valuable, but the project itself doesn't justify AF. I'm brainstorming ideas for a tester that I'll be building over the next few months. The project that I'm currently winding down is an Actor Framework test system that has come out really nice. Part of the project was an actor built to handle all of the DAQ and digital control. The main actor spins it off and tells it when and what tasks to launch; it sends back data and confirms com
  3. Anyone know of a way you can do a ctrl+drag on the block diagram (adding or removing space) that only impacts the current structure? I find myself wondering if this functionality exists often enough that I thought I'd ask. Basically, if you have a for loop inside a case structure, is there a way to do a drag inside the for loop and grow that structure without impacting the case structure's size or position? Thanks, Tim
  4. Yes. They are... however, this was out of the classic palette. Perhaps you've taught me something new. I was under the impression that only system-dependent controls are in the system palette, but I bet you are correct. The system button in the classic palette is OS dependent. Good catch. Don't listen to my previous posts, then, because my conclusion was incorrect. Also, thank you for your observation. Tim
  5. Update: I was able to replicate the original look with a 'Flat Square Button' and some color matching in about 5 or so minutes. Just be aware that if you move to SP1 of 2016, some of those graphics will change on you. Replicating them isn't a big effort, however. Cheers, Tim
  6. Hello Everyone, I noticed an interesting change today that I thought was worth mentioning. I've been using the classic system button for some of my UIs. It's flat and looks clean. Less busy. I noticed today that the graphics for this button are different from one of my PCs to another. My laptop, which is running 2016.0, looks like the original. My desktop, which is running 2016.0f2, looks like the 'different' one. Both are 2016, but one has no depth graphic and one does. I've dug through the menus folder and it looks like the classic and modern controls are not housed there, whi
  7. Hello Everyone, A couple of CLA Summits ago (and I think even at an NI Week or two) one of the NI R&D guys demo'd a software library that used variant attributes to create a lookup table. It was a class-based interface that could be swapped out for an actual database later if needed (a small sketch of this idea appears after this post list). I'm in the process of developing an application that would really be able to use this concept. I remember sitting in on this discussion, but I don't have any notes on the name of the project or where to find it. I did some digging on NI.com and LVPM but came up with nothing so far. Does anyone on here rememb
  8. Hey everyone! I'm working on a test system right now that requires the operators to sign the test reports. In the previous generation this was done by the print/sign/scan method. During one of the meetings it was mentioned that getting around this requirement would be nice. I recommended we look into a digital signature pad and see what would be required to integrate one. I've been thinking about ordering one and just giving it a go, but I thought I'd first ask and see who has done this with LabVIEW before. I know someone has; I just haven't found the documentation online yet.
  9. What about stripping the VI of its borders during runtime? This is what I've done and it's worked well. If the VI is just a box during runtime, you'll end up with that nice splash screen look... The attached shows how I set up my splash screens, and that has worked for me in 2015 and 2016.
  10. Using 2016. Checked the strict typedef idea; nope, using a regular typedef. Good idea though! I've tried to reproduce it a bit today but haven't had much luck. Yay for non-repeatable failures!
  11. Exactly! This was my thought too, and the exact reason I add to the end of an enum if at all possible. Going to work on it some more today and see if I can figure out why.
  12. Hey everyone, I'm working on a midsize (Actor Framework) project that has a main command message that gets used by all of my nested actors. This command message's payload is a typedef enum that defines what job gets done (a short enum sketch appears after this post list). As I've been developing the project I've been adding values to the enum and handling the resulting job in the 'send command.vi' message.... This works well because all of my nested actor messages that don't have a payload just send a constant enum to the controller to request a job... so I'm adding a new function to said enum, and after editing the typedef LabVIEW r
  13. How does purging the mutation history work? Any white papers you could link to on this? Thanks,
  14. I forget sometimes how easy it is to test a function or behavior in LabVIEW. Ctrl+N and throw a few VIs on the block diagram and see what it does. So very powerful for sure!
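
For the INI-driven launcher in post 1, here is a minimal sketch in Python of the idea (a LabVIEW build would use the Configuration File VIs instead); the section and key names are hypothetical and not taken from the original post.

    # Minimal sketch: pick which packed library (PPL) / applet to launch based
    # on an INI file passed on the command line. Section and key names
    # ("Launcher", "ppl_path", "applet", "icon") are hypothetical.
    import sys
    import configparser

    def read_launch_config(ini_path):
        config = configparser.ConfigParser()
        config.read(ini_path)
        launcher = config["Launcher"]
        return {
            "ppl_path": launcher.get("ppl_path"),         # which PPL to load
            "applet": launcher.get("applet"),             # which applet inside it
            "icon": launcher.get("icon", fallback=None),  # optional per-applet icon
        }

    if __name__ == "__main__":
        settings = read_launch_config(sys.argv[1])
        print(f"Would launch {settings['applet']} from {settings['ppl_path']}")

The point is only that a single command line argument (the INI path) carries everything the executable needs to decide which PPL and applet to present.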
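For the QMH question in post 2, a rough text-language analogy of spinning off a helper loop is a worker thread with its own command queue that reports results back on a second queue. All names here are illustrative; none of this is LabVIEW API.

    # Rough analogy of a QMH spinning off a helper: the main handler launches a
    # worker with its own command queue and receives data back on a results queue.
    import queue
    import threading

    def daq_worker(commands, results):
        while True:
            cmd = commands.get()
            if cmd == "stop":
                break
            # Pretend to run the requested task and report back.
            results.put(f"done: {cmd}")

    commands, results = queue.Queue(), queue.Queue()
    worker = threading.Thread(target=daq_worker, args=(commands, results))
    worker.start()

    commands.put("launch voltage task")
    print(results.get())        # confirmation/data from the helper
    commands.put("stop")
    worker.join()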
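Post 7's variant-attribute lookup table behind a class interface translates, roughly, to a small abstract store with an in-memory implementation that a database-backed class could replace later. The class and method names below are made up for illustration.

    # Sketch of the swap-in idea: a lookup interface with an in-memory
    # implementation (the variant-attribute analogue); a database-backed class
    # could be substituted later without changing callers.
    from abc import ABC, abstractmethod

    class LookupStore(ABC):
        @abstractmethod
        def write(self, key, value): ...
        @abstractmethod
        def read(self, key): ...

    class InMemoryStore(LookupStore):
        """Dictionary-backed store, like LabVIEW variant attributes."""
        def __init__(self):
            self._data = {}
        def write(self, key, value):
            self._data[key] = value
        def read(self, key):
            return self._data[key]

    store = InMemoryStore()
    store.write("DUT42", {"limit_v": 5.0})
    print(store.read("DUT42"))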
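Posts 11 and 12 touch on why new values get added to the end of the command enum: the existing constants keep their meaning, and only a handler for the new value has to be written. A small Python analogy, with invented value names:

    # Analogy for the typedef-enum command payload: appending a new value leaves
    # the existing ones untouched, so constants already wired into nested actors
    # keep requesting the same jobs. Value names are invented.
    from enum import Enum, auto

    class Command(Enum):
        INIT = auto()
        MEASURE = auto()
        REPORT = auto()
        SELF_TEST = auto()   # new value added at the end; earlier values unchanged

    def handle(cmd):
        if cmd is Command.SELF_TEST:
            return "running self test"
        return f"handling {cmd.name.lower()}"

    print(handle(Command.MEASURE))
    print(handle(Command.SELF_TEST))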