
Everything posted by A Scottish moose

  1. This is a good summation of the problem. Thanks for sharing!
  2. Ehh why not... &lt;gets chair and looks intensely at camera&gt; I think that NI will sell in the next 2-3 years. I agree with X on the churn rate. There's zero chance NI comes out on top in the long term with this plan. NXG is dead; LabVIEW as a competitive language is no more from a professional standpoint. It's firmly an enthusiast language now. That means, like other enthusiast languages, its user base will continue to shrink from here on out. Now you've got two options to deal with this problem: embrace it or hasten its demise. NI is obviously going with the latter.

     2-3 (maybe 5?) years of increased revenue while people work their way off the LabVIEW bandwagon (which they were going to do anyway when NXG was nuked) and then they are moving on. It's possible NI just understands the 'make hay while the sun is shining' concept and is going to extract every bit of value from the product in the next half decade because, either way, LabVIEW is dead weight on the company in 5-10 years.

     The other possibility is that subscription revenue has a higher impact on company value (on paper) than one-off sales. I think subs are a 2-3x multiplier on estimated value. If NI is looking to sell, moving everything to subs and holding for a couple of years until they hit the peak of the revenue curve in 2025, then shopping for a buyer, makes the company look 50-100% more valuable than it was in 2021.

     All that's conjecture and theory. I'm more than happy to be proven incorrect, but I believe I am saying the quiet part out loud here and I think that's a good thing. (I hope)

     Best, Tim
  3. My approach has been class-based inheritance. Have a parent class that accepts the messages in a VI and then override that VI in your child classes. Using inheritance allows you to decide who gets to handle the command first: the youngest class in the hierarchy gets a chance to handle the command before passing it up (or not). I find that it makes it easy to answer "What do I do if both controllers handle the same command differently?" As an example: MsgHndlrParent.lvclass:core.vi, ControllerChild1.lvclass:core.vi, ControllerChild2.lvclass:core.vi. Child 2 would attempt to handle the command in its override of core.vi; if it can't, it sends it to Child 1's core.vi, and if Child 1 can't handle it, it passes it to MsgHndlrParent for error handling. Hope that helps! Tim
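     Since LabVIEW is graphical, here is a rough text-language analogy (Python) of the inheritance chain described above: the youngest child tries to handle the command first and calls up to its parent's implementation when it can't. Class and command names mirror the post's example but are illustrative, not from a real project.

     ```python
     class MsgHndlrParent:
         def core(self, command):
             # Last stop in the chain: nobody handled the command.
             return f"error: unhandled command '{command}'"

     class ControllerChild1(MsgHndlrParent):
         def core(self, command):
             if command == "start":
                 return "child1 handled start"
             return super().core(command)  # can't handle it; pass it up

     class ControllerChild2(ControllerChild1):
         def core(self, command):
             if command == "stop":
                 return "child2 handled stop"
             return super().core(command)  # can't handle it; try Child 1

     handler = ControllerChild2()
     print(handler.core("stop"))   # Child 2 handles it directly
     print(handler.core("start"))  # falls through to Child 1
     print(handler.core("jump"))   # falls through to the parent's error handler
     ```

     The nice property, as the post says, is that the conflict "both controllers handle the same command" is resolved deterministically by hierarchy order.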
  4. I have always found the core LabVIEW community to be kind and welcoming and more than willing to support and bring up new developers. This is one of my favorite things about LabVIEW; that people actually have a passion for the language and the culture that it creates. You write Python because everyone else writes Python. You write LabVIEW because it's a unique experience, and sharing that experience is part of the package. The Architects' Summit every year was one of my favorite weeks of the year for this reason (aside from the free food). The people on this forum are some of those most responsible for keeping that community expressly *not* like other programming communities out there. It's a big selling point for the language and why I think it was so successful (IMO) in the mid '10s.
  5. As a mid-career guy I think I can speak for some of the younger developers in the world. If you are working on a programming problem, Googling "How do I do &lt;this&gt; in &lt;programming language&gt;" is basically the de facto solution to that problem. Whether you get the answer from YT, Reddit, Twtr, IG, FB, etc. is irrelevant, as the first answer on Google always wins. This has been true for me for the last 10 years at least, for whatever that's worth. If I could be so bold... I actually think one of LabVIEW's challenges is that it doesn't take well to posting on text forums and requires PNG'd metafiles at the least to pass information around. A 'dump VI to text' option similar to what was used in old video game saves (or Factorio blueprints if you are familiar with that game) would vastly improve the share-ability of the language. I might be showing my inexperience here if this already exists. It would do a ton of good for the language.
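     For anyone unfamiliar with the Factorio-blueprint idea, here is a hedged Python sketch of what such a 'dump to text' scheme typically looks like: serialize a structure to JSON, compress it, and base64-encode it so it can be pasted into a text forum. The VI stand-in payload here is entirely made up; a real exporter would need an actual diagram representation.

     ```python
     import base64
     import json
     import zlib

     def to_blueprint(data: dict) -> str:
         """Pack a structure into a forum-pasteable text string."""
         raw = json.dumps(data, sort_keys=True).encode("utf-8")
         return base64.b64encode(zlib.compress(raw)).decode("ascii")

     def from_blueprint(text: str) -> dict:
         """Recover the original structure from a blueprint string."""
         raw = zlib.decompress(base64.b64decode(text.encode("ascii")))
         return json.loads(raw)

     # Hypothetical stand-in for a VI's contents:
     vi_stub = {"name": "demo.vi", "nodes": ["Add", "Multiply"], "wires": [[0, 1]]}
     encoded = to_blueprint(vi_stub)
     assert from_blueprint(encoded) == vi_stub  # round-trips losslessly
     print(encoded)
     ```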
  6. I have frequently wondered about the 'make it possible, but really inconvenient' approach to software legacy support that has become the norm among many large software corporations in the last few years. I'm sure there is a reason for taking this approach from the seller's side, but as an enthusiast/developer/consumer all I see is bad customer service...
  7. I hope this is true. LabVIEW is the only language I enjoy programming in. I had thought community edition was the solution to this problem but I've basically seen LV interest continually wane in the 10 years I've been using it.
  8. Corp gave the rubber stamp so I'm planning on being there.
  9. Harsh but true. To his credit it takes a brave presenter to take live comments, especially when they are posted directly behind you.
  10. Hey everyone! I've looked through Google and the LAVA forums for this problem and haven't found much, so I thought I'd ask and see if anyone had some ideas.

      Problem: I have an executable that is launching applets from several different PPLs. I define which PPL should be loaded by passing an INI file as a command line argument, and I also provide any other important startup information in that file. I've been trying to also force the application's icon to change so that, depending on what the user is doing, the icon matches the process that I'm presenting them. Originally I thought this to be rather trivial, as LabVIEW provides an invoke node to set the icon, but after attempting it (and then checking the documentation) I realized that it is not available at run time (also it doesn't support .ico files, according to the docs): http://zone.ni.com/reference/en-XX/help/371361R-01/lvprop/vi_set_vi_icon/

      My hope is that I can force the application icon on the taskbar and the title bar to match the .ico file that I'm providing to LabVIEW from the INI file. Any thoughts on this? I assumed it would be rather simple, just a function call to the right property/invoke/ActiveX node, but I've not come up with much at this point.

      Thanks for your help, Tim
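      One common workaround (sketched here in Python/ctypes rather than a LabVIEW Call Library node) is to bypass LabVIEW entirely and ask Windows directly: load the .ico with `LoadImage` and send `WM_SETICON` to the window handle. This is a Windows-only sketch under assumptions; the window-title lookup and paths are illustrative, and a LabVIEW deployment would make the equivalent user32.dll calls.

      ```python
      import ctypes

      WM_SETICON = 0x0080
      ICON_SMALL, ICON_BIG = 0, 1
      IMAGE_ICON = 1
      LR_LOADFROMFILE = 0x0010

      def set_window_icon(window_title: str, ico_path: str) -> None:
          """Point a top-level window's title-bar/taskbar icon at an .ico file."""
          user32 = ctypes.windll.user32  # only exists on Windows
          hwnd = user32.FindWindowW(None, window_title)
          if not hwnd:
              raise RuntimeError(f"window '{window_title}' not found")
          hicon = user32.LoadImageW(None, ico_path, IMAGE_ICON, 0, 0,
                                    LR_LOADFROMFILE)
          if not hicon:
              raise RuntimeError(f"could not load icon '{ico_path}'")
          # Title-bar icon and taskbar/alt-tab icon respectively.
          user32.SendMessageW(hwnd, WM_SETICON, ICON_SMALL, hicon)
          user32.SendMessageW(hwnd, WM_SETICON, ICON_BIG, hicon)
      ```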
  11. Hello everyone, TL;DR - Any thoughts on whether it's a good or bad idea to spin off an actor from a QMH framework?

      I've got some library code that would be valuable, but the project itself doesn't justify AF. I'm brainstorming ideas for a tester that I'll be building over the next few months. The project that I'm currently winding down is an Actor Framework test system that has come out really nice. Part of the project was an actor built to handle all of the DAQ and digital control. The main actor spins it off and tells it when and what tasks to launch. It sends back data and confirms commands, uses dynamic events to keep up with the generated signals, and uses actor tasks to ship the data back to the controller. Works amazingly well. Nothing revolutionary for sure, but very handy.

      This next project doesn't really need Actor Framework; it's much smaller and has a much shorter list of requirements. That being said, I'm curious about integrating my DAQ actor (Dactor, because who can resist an amalgamation?!) into the project. Any thoughts on whether it's a good or bad idea to spin off an actor from a QMH framework? Is this even possible based on how the AF tree is designed to work?

      Thanks for reading! Tim
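      Conceptually the pattern being asked about is just two message queues, which is language-agnostic. A minimal Python sketch (threads and queues standing in for a QMH loop and a spun-off "Dactor"-style worker); message names are made up for illustration:

      ```python
      import queue
      import threading

      def dactor(commands: queue.Queue, replies: queue.Queue) -> None:
          """DAQ-actor stand-in: handle commands until told to stop."""
          while True:
              msg = commands.get()
              if msg == "stop":
                  replies.put("stopped")
                  break
              replies.put(f"ack: {msg}")  # confirm command / ship data back

      commands, replies = queue.Queue(), queue.Queue()
      worker = threading.Thread(target=dactor, args=(commands, replies))
      worker.start()

      # The QMH side just enqueues messages like any other case,
      # and drains the reply queue when convenient.
      commands.put("launch task")
      commands.put("stop")
      print(replies.get())  # ack: launch task
      print(replies.get())  # stopped
      worker.join()
      ```

      The design point: as long as the spun-off worker only talks through its queues, the caller doesn't need to be an actor itself, which is why hybrid QMH-plus-actor layouts can work.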
  12. Anyone know of a way to do a Ctrl+drag on the block diagram (adding or removing space) that only impacts the current structure? I find myself wondering whether this functionality exists often enough that I thought I'd ask. Basically, if you have a For Loop inside a Case Structure, is there a way to Ctrl+drag inside the For Loop and grow that structure without impacting the Case Structure's size or position? Thanks, Tim
  13. Yes, they are... however, this one was from the Classic palette. Perhaps you've taught me something new. I was under the impression that only system-dependent controls live in the System palette, but I bet you are correct: the system button in the Classic palette is OS dependent. Good catch. Don't listen to my previous posts then, because my conclusion was incorrect. Also, thank you for your observation. Tim
  14. Update: I was able to replicate the original look with a 'Flat Square Button' and some color matching in about 5 or so minutes. Just be aware that if you move to sp1 of 2016 some of those graphics will change on you. Replicating them isn't a big effort however. Cheers, Tim
  15. Hello Everyone, I noticed an interesting change today that I thought was worth mentioning. I've been using the classic system button for some of my UIs. It's flat and looks clean; less busy. I noticed today that the graphics for this button are different from one of my PCs to another. My laptop, which is running 2016.0, looks like the original. My desktop, which is running 2016.0f2, looks like the 'different' one. Both 2016, but one has no depth graphic and one does. I've dug through the menus folder and it looks like the classic and modern controls are not housed there, which makes sense really, since they are probably proprietary. I'd prefer the original look, as it matches the flat Windows 10 style that's in vogue right now. Just something I noticed and thought was worth mentioning. Cheers, Tim
  16. Hello Everyone, A couple CLA summits ago (and I think even at an NI week or two) one of the NI R&D guys demo'd a software library that used variant attributes to create a lookup table. It was a class based interface that could be swapped into an actual database later if needed. I'm in the process of developing an application that would really be able to use this concept. I remember sitting in on this discussion but I don't have any notes on the name of the project or where to find it. I did some digging on NI.com and LVPM but came up with nothing so far. Does anyone on here remember that discussion and perhaps know what I'm talking about? TL/DR - A couple years ago an NI R&D engineer demo'd a lookup table that was class based and ran on the variant attribute engine in the background. Trying to remember what it was called, looking for tips. Thanks, Tim
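      For anyone else hunting for it, the pattern itself is easy to sketch. Here is a hedged Python reconstruction of the idea (not the NI R&D library itself): callers code against an abstract lookup-table class, the default backend is an in-memory map (playing the role of LabVIEW's variant-attribute engine), and a database-backed subclass could be dropped in later without touching callers. All names are illustrative.

      ```python
      from abc import ABC, abstractmethod

      class LookupTable(ABC):
          """Interface callers depend on; backends are interchangeable."""
          @abstractmethod
          def put(self, key: str, value) -> None: ...
          @abstractmethod
          def get(self, key: str): ...

      class InMemoryTable(LookupTable):
          """Variant-attribute-style backend: fast keyed storage in memory."""
          def __init__(self):
              self._attrs = {}
          def put(self, key, value):
              self._attrs[key] = value
          def get(self, key):
              return self._attrs[key]

      # A hypothetical DatabaseTable(LookupTable) subclass could replace
      # this later; the calling code would not change.
      table: LookupTable = InMemoryTable()
      table.put("DUT-42", {"limit": 3.3})
      print(table.get("DUT-42"))
      ```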
  17. Hey everyone! I'm working on a test system right now that requires the operators to sign the test reports. In the previous generation this was done by the print/sign/scan method. During one of the meetings it was mentioned that getting around this requirement would be nice. I recommended we look into a digital signature pad and see what would be required to integrate one. I've been thinking about ordering one and just giving it a go, but I thought first I'd ask and see who has done this with LabVIEW before. I know someone has; I just haven't found the documentation online yet. Here's how I expect it would go:

      1. The software prompts the user to sign at the end of the test.
      2. The signature pad saves the image to the hard drive or provides it to the client through an API (any experience on how this usually works is appreciated).
      3. My software would acquire the image and save it to a named range in Excel using the Report Generation Toolkit, currently my report writing tool of choice.
      4. ...Profit!

      Does this theory match with reality? What are your experiences? Do you have any models you prefer to work with? I dug for a few minutes on this and didn't come up with much, so perhaps a discussion on the subject is valuable. Thanks for the help! Tim
  18. What about stripping the VI of its borders during run time? This is what I've done and it's worked well. If the VI is just a box during run time you'll end up with that nice splash screen look... The attached shows how I set up my splash screens, and that has worked for me in 2015 and 2016.
  19. Using 2016. Checked the strict Type Def idea, nope, using regular type def. Good idea though! I've tried to reproduce it a bit today but haven't had much luck. Yay for non-repeatable failures!
  20. Exactly! This was my thought too, and the exact reason I add at the end of an enum if at all possible. Going to work on it some more today and see if I can figure out why.
  21. Hey everyone, I'm working on a midsize (Actor Framework) project that has a main command message that gets used by all of my nested actors. This command message's payload is a typedef'd enum that defines what job gets done. As I've been developing the project I've been adding values to the enum and handling the resulting job in the 'send command.vi' message... This works well because all of my nested actor messages that don't have a payload just send a constant enum to the controller to request a job. So I'm adding a new function to the said enum, and after editing the typedef, LabVIEW resets ALL constant instances of this typedef to default across my entire project. Every instance in memory of this enum gets set back to the zero value. I have everything backed up in SVN so this isn't a big deal from a time standpoint for me, but I don't think this is intended behavior. Has anyone else seen this?

      TL/DR - I have a typedef'd enumeration, and after adding a value to the end of the enum it resets ALL constant instances of this enum to value 0... ideas? Thanks, Tim
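      As background for why "add at the end" is the safe edit (per the earlier reply in this thread), here is a small Python illustration of the general failure mode: a constant stored as its integer value keeps its meaning when a new entry is appended, but silently changes meaning if an entry is inserted in the middle. This sketches the general enum hazard, not LabVIEW's typedef mechanics, and the command names are made up.

      ```python
      from enum import IntEnum

      class CommandV1(IntEnum):
          STOP = 0
          START = 1
          MEASURE = 2

      # A constant saved somewhere as its raw integer value:
      saved = int(CommandV1.MEASURE)  # 2

      class CommandAppended(IntEnum):   # new value added at the END
          STOP = 0
          START = 1
          MEASURE = 2
          RESET = 3

      class CommandInserted(IntEnum):   # new value inserted in the MIDDLE
          STOP = 0
          RESET = 1
          START = 2
          MEASURE = 3

      print(CommandAppended(saved))  # still MEASURE: meaning preserved
      print(CommandInserted(saved))  # now START: the saved constant silently changed
      ```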
  22. How does purging the mutation history work? Any white papers you could link to on this? Thanks,
  23. I forget sometimes how easy it is to test a function or behavior in LabVIEW. Ctrl+N and throw a few VIs on the block diagram and see what it does. So very powerful for sure!