Leaderboard

Popular Content

Showing content with the highest reputation on 07/24/2017 in Posts

  1. How dare they make a choice when they are asked to? You cannot approve the dialog automatically, because it is shown only after the application starts (or the computer restarts), at which point you no longer have elevated access. However, you could add the firewall rules as part of the installation procedure (see below). None that I know of. It is also very annoying to be prompted over and over again after making a choice. Why even bother asking if there is no choice in the first place? Software that does that is malware, in my opinion. Also, I can see users taking three courses of action:
     - Allow access (stupid, since it could be malware; it also teaches your users bad habits)
     - Decline and inform IT (IT will knock on your door, so go for it if you need to talk to them urgently)
     - Uninstall (yeah, might not be IT who is knocking next)
     You can actually run command-line instructions during installation in order to add firewall rules. Have a look at the instructions over at TechNet: https://technet.microsoft.com/en-us/library/dd734783(v=ws.10).aspx Once you have figured out the necessary firewall rules (e.g. by checking a computer that accepted the rule), you can build the commands and execute them during installation; see the example command after this post. It should be possible to run the instructions using the post-install action (run application after installation), though I'm not sure if it will actually be run in elevated mode. Another option is to use a custom installer (we made our own using Inno Setup) and pack the LabVIEW installer inside the custom installer.
    1 point
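     A hedged example of the kind of command the TechNet page above documents, run once during installation from an elevated process; the rule name, program path, and profile list are placeholders, not values from the original post:

         netsh advfirewall firewall add rule name="My LabVIEW App" dir=in action=allow program="C:\Program Files\My LabVIEW App\MyApp.exe" enable=yes profile=domain,private

     A matching "netsh advfirewall firewall delete rule name=..." in the uninstaller keeps the firewall policy clean when the application is removed.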
  2. Classes on the terminals of XNodes are somewhat unstable and crashy. The key way to avoid a crash is to put a "Request Deallocation" primitive, wired with a constant TRUE, on the block diagram of every VI that has a variant that could contain a LabVIEW class. But you then also have to be careful not to use your XNode in two separate user contexts at the same time unless the class is defined identically in both user contexts. The two features were never designed to work with each other (they came into existence at the same time, and neither was built with the other's needs in mind). One or the other would have to be completely redesigned to accommodate the other. There are some safe paths, but I can't give much guidance there... I have to derive them specially for every XNode I work on... which is why I rarely use XNodes. YMMV. But, seriously... add the Request Deallocation nodes. You'll save yourself a lot of pain.
    1 point
  3. I see this issue regularly. I do a lot of work where the "real" UI is just a big picture indicator. One stray coordinate on that blue picture wire will send the Picture to Pixmap VI into a tizzy, as that's the first time a flattened 2D map is made of the image data. That VI also has horrible error handling -- which is to say it has none -- it'll just crash LabVIEW with an out-of-memory error if asked to do something absurd. Some general guidelines for interfaces I write that rely on the Picture to Pixmap VI (a sketch of the first guideline follows after this post):
     - Constrain dimensions to some reasonable maximum density. I usually use 25 MP or so; your mileage may vary depending on the expected system power and color depth.
     - Test VIs which create picture data. Those pictures have a 32-bit coordinate space (16-bit for each axis), which allows for some very unreasonable map sizes if you're not careful. Any VIs you use to generate picture data should have their edge cases well defined and tested, to make sure you're not trying to draw data which may demand terabytes' worth of RAM when converted to a map.
     - Be aware of vector vs. (bit)map data. The native picture controls and indicators can work fine with pure vector data, which is more tolerant of large map spaces but generally performs really poorly if there's any real data density. Rendering maps is far quicker, but demands the use of the offending Picture to Pixmap VI.
     - Cringe with fear that you've been reduced to having to work with the LabVIEW picture API. There's really no good way out if you have to go down this route.
    1 point
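     LabVIEW block diagrams can't be quoted as text, so here is a minimal sketch in Python of the dimension guard described in the first guideline above; the 25 MP budget, the 4 bytes per pixel, and the function name are illustrative assumptions, not part of the picture API:

         # Sketch of a pre-rasterization sanity check: refuse to build a flattened
         # pixmap whose pixel count exceeds a fixed budget.
         MAX_PIXELS = 25_000_000      # ~25 MP budget from the guideline above (assumption)
         BYTES_PER_PIXEL = 4          # assume a 32-bit RGBA map

         def picture_bounds_ok(width, height):
             """Return (ok, estimated_bytes) for a width x height pixmap."""
             estimated_bytes = width * height * BYTES_PER_PIXEL
             return width * height <= MAX_PIXELS, estimated_bytes

         # Worst case permitted by a 16-bit coordinate axis:
         ok, size = picture_bounds_ok(65535, 65535)
         print(ok, round(size / 2**30, 1), "GiB")   # False, ~16.0 GiB for one flattened map

     The same kind of check belongs in any VI that generates picture data, before the picture wire ever reaches the Picture to Pixmap VI.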
  4. I definitely get what you mean by burning out. Even on a long project, it is easy for me to lose interest. With LabVIEW it always feels like getting 90% complete is pretty easy and quick, and there are plenty of results to see. That last 10% seems to take so long to wrap up, and so much work, that I find myself just putting it off and doing other work instead. Of course when timelines are involved and you need to actually work on it, it is hard to stay motivated. Maybe it's hard because I enjoy doing LabVIEW and it doesn't feel like work when it's fun, but when it isn't fun it is a drag. For large multi-member projects it gets a little better, since you can all share in the accomplishments of everyone, so staying motivated is easier. From a program-management standpoint, I've seen plenty of projects eat up eight or more developers full time for months (years?) more than originally planned due to poor management and unclear requirements. Those are pretty bad sometimes too, because the morale of the team is poor. Constantly getting beaten up while working hard can make good people want to quit.
    1 point
  5. One of our illustrious admins (hooovahh) is shaving for charity - please donate here!
    1 point