Everything posted by hooovahh

  1. I too am interested in hearing how people tackle this issue. Personally I think it starts with keeping the number of custom error codes to a minimum. I have a tool (that I can't seem to find online, but thought I posted) which lets you search for existing errors that might match what you need. Then I use that as the error code and put the custom text in an Error Ring, using the optional %s inputs. Of course one step could be eliminated if you could search from within the Error Ring. I understand the benefit of custom errors, but the dependency issues and code ranges usually mean I just stick with the ones already on the system.
  2. I worked at a company that had all USB devices disabled. Not just USB memory devices, but also USB CAN, USB DAQ, USB Serial, USB GPIB, everything. For the DAQ stuff we would just simulate the hardware, then deploy the code to test it. It was a pain but not too bad. The USB memory devices were easy to get around. The policy made USB sticks read-only, so to write to one you would mount the USB drive (E:\ or wherever it showed up) into an empty folder like C:\USB (a sketch of one way to set up such a folder mount appears after this list). Then you could read and write to the C:\USB folder all you wanted, which actually wrote to the USB device. I guess they had just set up software to disable writing to drives that were marked as removable. That DgAgent sounds nasty.
  3. I'm loving this thread, it is stirring up so many memories and stories I mostly forgot. I was at a customer's site that had two production lines going which a previous vendor had built using LabVIEW and TestStand. The company I was working for had been tasked with updating the systems by adding a few new features, updating to the latest LabVIEW/TestStand versions, and updating the OS. Luckily they sent the 3rd system to our facility to update and upgrade while they were still using the other two. We updated the system and made the changes they asked for, but there was still some onsite work required because we couldn't interface with their network databases until we were there. Once onsite I started setting up the 3rd stand right next to the other two that weren't upgraded yet. My first day there I started asking around about the credentials used to access the network and database that I needed to have my software log to. No one seemed to know, and even the IT guys were confused about what I was talking about. So with the permission of the program manager I looked at the source on the 1st stand to see how it was logging to the databases. Sure enough there was the user name and password, and it was something like User: TestUser and Password: TestUserPassword. So I updated our code, got it working, and the customer was happy. I saw the IT guy walking around so I let him know. "Oh yeah, I found the credentials used for the production systems, it is the TestUser account." His eyes got really large and he said "No one should be using that." And I'm like "Oh okay, well we're just using the same credentials the last vendor was using and it can be changed later." He said okay and left. About 10 minutes later I noticed the other two production systems were having all kinds of problems, failing every part they tested. Just then the IT guy came up and said "By the way, I deleted that user account so no one can use it." Luckily the program manager was within earshot and started blaming him for causing the production lines to go down, and demanded he fix it. So IT made a new account with a new password and we used that. I'm not sure why he thought it was okay to delete that user when I had mentioned how the production lines were using it, but he did, and I was glad we didn't spend too much time trying to fix it. Oh, and we also have a No Sleep program on our test systems, which is an AutoIt EXE (a sketch of one way to do the same thing appears after this list). As for updates, we have the systems download updates but not install them, and then we just need to remember to reboot the machines when testing is finished, which might not be for months. Every once in a while IT comes down and says there is some major new virus and we need to update all machines now, which is a pain, but we haven't had any issues yet.
  4. Use off the shelf software? Heavens no. Why don't we just create our own approval software that doesn't integrate with anything, requires extra training, is supported by no one that speaks English, is buggy, and charges a licensing cost calculated by the number of mouse clicks used during the approval... Another story before I forget it. I was deploying a test system to a production facility. The system involved two GigE cameras and was looking at the DUT's display to determine if it was working correctly. Everything worked great before we got to the facility, but then all of a sudden the images from the Ethernet cameras would sometimes get black bars, obscuring the image to where the vision system would fail to work. We looked into all kinds of jumbo packet settings and Windows network settings, and just before going to buy a new network card I asked if anyone had used the machine, and a worker mentioned someone from IT had used it. I talked to them and learned they had installed network spying software to make sure it was locked down. Turns out it was also disrupting all network traffic, including the cameras. We asked if it could be set to not sniff the two Ethernet ports going to the cameras, and they said the software didn't allow for that level of control. We tried to uninstall their software, but it required a password during the uninstall, and IT wouldn't give it to us and wouldn't allow us to uninstall it. At this point the customer was getting increasingly frustrated with their own IT department, especially since it was costing time and money for me and others to be there while the system was crippled. We were still local administrators on the PC, so we just disabled the Windows services and had their software not run on startup anymore, and production was running again. The customer was happy the system was running, and IT didn't really care as long as their software was still installed on the PC.
  5. I made a new thread and started with a story that you reminded me of.
  6. So based on conversations in another thread, we've come to find several instances in our careers where IT and corporate policies that mean well get in the way of getting work done. Policies intended to keep IP from leaking, viruses from taking over our PCs, and ransomware at bay tend to make software developers find workarounds so they can get their job done. So I thought we could make a thread where we share some of our IT horror stories, focusing on previous work for obvious reasons... One company I worked with had locked down the work laptops to where you couldn't really do anything other than answer emails and write Word documents. So when we needed to install LabVIEW, which required administrator privileges, there was a whole approval process to get the IDE and some of the DAQ drivers installed. This included several levels of approvals and justification for why you wanted to be an admin. You would apply to have administrator privileges given to you, but you only had a 2-hour time slot, and after that the credentials you were given wouldn't work and you had to apply again. Of course the first thing we would do in those 2 hours was use the administrator account we were given to promote our own user to a local administrator, so we never had to apply for it again. This company also had an app store for PC software. To me the concept of having a PC app store seemed odd. I sorta consider the whole internet the app store, and the idea that an offline app store could stay up to date with the newest software from across the internet seemed silly.
  7. Yes, but I'm guessing the security concern is not stealing IP, but instead plugging in a random USB drive that might have some malware or rootkits on it. Of course this should be no different than putting in a DVD from said company, since the USB mounts the same as optical media, but I guess you wouldn't know that unless you plug it in. NI has been doing this for various bundles for at least a couple of years now. I personally like it just so I have one thing I need to copy to the network for network-based installs, instead of having to copy 4 or 5 DVDs and then do something to merge them into a single folder so it doesn't prompt for the next disc. If you want to really discuss corporate policies getting in the way of getting your job done, or the efforts in reinventing the wheel at a large company, then I have lots to say. I'm not certain that we have the most restrictive policies in the industry, but I know we would be top contenders.
  8. Great example, and one I'd forgotten about. So you have a couple of options (all of which sound like git to me anyway). You can start editing the code anyway even though you don't have the lock, and send that person an email saying you are editing some code they have locked. When they get in they will either say "Oh yeah, sorry, I didn't make any changes", in which case you commit yours over theirs (breaking the lock), or they say they have made changes, in which case you'd need to merge the changes just like a git commit when you both made changes at the same time (am I wrong?). Or you can steal the lock while they are on vacation and force them to have the conversation with you when they get in and try to commit, at which point they again will either say they had pending changes and a merge needs to happen, or they don't (a sketch of the lock commands involved appears after this list). Isn't it still just like git, in that if changes only happen in one place at a time, that is what gets committed, and if there are multiple changes at one time, a merge needs to happen? Again I want to make it clear I don't use git, so my understanding might be incorrect, but as an outsider I didn't see that as a compelling reason to change.
  9. I've not used git, but I hear some of these pros/cons. The one I've heard is that git is sometimes better because it forces you to communicate with your developers to know what they are working on. But if this were a server-side SVN setup with locks, you'd know what other people are working on because it's locked. If you need to work on something that is locked, ask the person that locked it if they are done and can unlock it. I don't need to track down every developer and ask what they are working on or if I can work on a code module. If something I want to work on is locked, I talk to that developer about it; if it isn't locked, I'm free to do whatever. The server-less side of things does allow you to have the whole code base at your disposal, but you could do something similar with SVN when you aren't connected to a server. You'll just have lots of things to resolve when you do get back online, since there is a chance multiple people changed the same VI, just as there would be with git. Since I'm in the office 99% of my development time, having it be server-dependent is a non-issue, and that other 1% of the time I either VPN in, or I'm going onsite to deploy a system and will typically lock the whole project before going out, or not lock it and merge the changes I make while offsite when I get back. I was also confused by the overhead comments. Do you mean workflow overhead?
  10. I know this is off topic, but I do see this issue once in a while. On Chrome, when this happens for me, it doesn't do anything and my text is still there, so I just copy it, refresh the page, and make a new post, pasting it in. Are you saying you hit submit, the page refreshes, but your post isn't there? Not that I'm equipped to troubleshoot these types of issues.
  11. Oh, I'm sorry, I was not familiar with a Signature Line. One thing that I find useful is to use the Excel feature of recording a macro, then looking at the generated code, and then translating that into LabVIEW. The problem is I can't even create a signature line in a footer in normal Excel. If I click into the footer, then Insert >> Text is all greyed out. Still, I enabled macro recording, recorded my operations adding a signature line at a normal place in the worksheet, and looked at the generated code, which was the following: ActiveWorkbook.Signatures.AddSignatureLine "{00000000-0000-0000-0000-000000000000}". Not sure what it means, but I'm guessing that you can't automate the creation of that signature line using VBA this way, and that it was designed to have a dialog prompt you for the settings for some reason.
  12. That makes me think that maybe your role is more of an architect than a developer. I didn't find the CLD very difficult personally. It was just a single loop QMH using arrays of strings, and I finished early and spent the extra time double checking my work. The CLA I barely passed, and worked up to the last minute.
  13. I use MAX as much as I can. I try not to reinvent the wheel unless there are limitations that I really need to work around, and MAX just does so much that I try to avoid custom things. I'll still incorporate MAX in my application by doing things like opening test panels for hardware, or creating and editing channels, tasks, and scales using a VI I posted here. As for the tasks and how they work in applications: I'll usually make the tasks in MAX and test them there, then export them into an NCE file, which can go along with the source code and documents how the channels are all set up. Then in application builder you can specify this NCE file to be imported when the installer runs. I do however remember some issues with editing tasks that have already been opened. I remember having my software open a task, then trying to edit the scale of a channel and finding the new scale wouldn't be applied until I closed and reopened my software. I must not have been stopping or closing all of the references to the scale, or to the channel using the scale, so that might not be a real limitation of DAQmx, but it is something to look out for. I've had no experience with INIs or XML as a result.
  14. I made a suggestion for this on the Idea Exchange to give more font control over listboxes and tables. If you want this you should go vote there.
  15. Personally I think this is a good idea, and NI is already doing something similar by tagging NXG posts so they can be identified. I do think there are some issues with making a new subforum. One is that if we do make an NXG subforum, what happens when it isn't called NXG anymore? I've heard that some day NI will just call it "LabVIEW", and the LabVIEW we all know and love might be called something else, like "Classic LabVIEW". But then again, that might not be for many years, until NXG has functional parity with LabVIEW. There are also going to be times when a subject is about NXG and Object Oriented Programming; which subforum should it go into? Now we'd say NXG, but in a few years would it make more sense to put it in OO? Anyway, I'll talk to some admins and see if we can agree on something.
  16. Attached is an example that does this using the Report Generation Toolkit, but through ActiveX calls (a sketch of the equivalent calls from outside LabVIEW appears after this list). I think this needs to be run on a workbook after all other writing has taken place, because it applies the footer per worksheet. In my case I had N worksheets already made, some with data and some with charts and graphs, and calling this set the text and logo in the footer. Saved in 2015. Footer in Excel.vi
  17. TortoiseSVN. I've used Mercurial and Perforce too, but SVN is the one I'm most familiar with, and the setup was stupid simple. I only use the LabVIEW tools for performing a rename in a project and in SCC at the same time.
  18. The NI Package Builder is less of a replacement for VIPM in my opinion, and more of a distribution method for installing software in general. NI may mature their package manager to one day replace VIPM, but as it is now NIPM is not so much focused on installing LabVIEW packages as it is focused on installing software. Now LabVIEW packages can be software (of course), but you won't find things like palette editing tools, running pre/post install VIs, or anything LabVIEW-specific at the moment. It is more like an app store, where you can one day search for tools, network packages, or larger packages like NI DAQmx. This is how NXG is currently distributed: by installing NIPM and then choosing to install NXG and its dependent packages. There is also an offline installer, but I suspect it is still just NIPM with the NXG packages as an offline repository. Of course this also opens up the possibility of making your own packages, and you can have the "Hooovahh's Awesome Program" package which may depend on the NXG run-time, DAQmx, NI-DMM, and NI-VISA. Then these packages can be included in an offline repository that gets installed, or (potentially) an online repository, so users of my software can update to newer versions of my awesome program, or newer versions of NI's packages if dependencies allow it. Because of this I highly doubt it will ever replace VIPM for current generation LabVIEW needs. At the moment this just doesn't seem to be what NIPM is made for.
  19. Yeah I remember you, I don't remember what I was drinking but I remember you. Glad you had a good time at NI Week and the BBQ. That project looks very cool by the way.
  20. VIMs are amazing and will cause a revamping of polymorphic reuse libraries. I did a talk on this in August at NI Week which covers XNodes, and the second half talks about VIMs and this structure, which was unofficially named the "Type Enabled Structure" in 2016 when it was introduced. http://forums.ni.com/t5/2016-Advanced-User-Track/TS9451-XNodes-Treasures-of-Reuse-in-LabVIEW-s-Attic/gpm-p/3538650 As others have said, it works like a disable structure, where it goes through trying to enable each case one at a time and will enable the one that creates no broken wires. This can mean even more reuse if you can make your code more generic to accept scalars or arrays, as it does in this case. The structure itself is not official in 2017 and has some editor issues. The impression I get is that it is stable (I mean, it has to be, NI is using it), but only after the code has been written and compiled. It still has odd cases where, when you are first developing with it, the VIM won't adapt the way you want, but I've found it works just fine with code that is written and tested.
  21. If you have an active SSP (not sure how you'd get NXG without it) then you should have access to the Self-Paced Online training. No idea how much it covers or how well it is put together. http://sine.ni.com/myni/self-paced-training/app/main.xhtml It looks like they have a Core 1 with Acquire Analyze Present, and another set of sessions on transitioning to NXG 1.0.
  22. Not sure how a waveform can be patronizing. Especially when it is an opinion of my own, based on my observations of...myself. I'm not saying you are wrong (because you're not), but I am going to say you are seemingly glossing over some of the positive things NI has been trying to do to get into the cheaper hobbyist space. Buying LabVIEW Home for $50, which comes with application builder and can deploy to a Raspberry Pi as a target, is a pretty positive thing that NI didn't need to do, and it doesn't directly add revenue in terms of hardware sales. And giving away the student edition for free is also nice. I can't speak to the Linux LabVIEW thing, other than I've heard it's poor. My only experience with Linux in the last 13 years has been running Linux on NI hardware. It's not something I've seen used in the testing career path I've been in, and I haven't needed it for the hobbyist projects I've had. If Linux is important to you, NI is failing you, because they seem to prioritize Windows. Anyway, I don't work for NI, I don't need to defend NI, and I don't know what NI has planned for the future. I'm just trying to stay neutral and show some positives when the negatives are being highlighted (maybe I'm just on the sine wave coming down). But I am not denying the valid negatives mentioned here. I also want others to join in on the conversation. No one wants to hear me spew the same opinion in different sets of words over and over.
  23. They're at NI marketing. I don't think anyone can make a very well-informed decision, because so much of the platform is "the next version will have this feature and fix this issue", and we can't really assess it until we have enough features to actually deploy a full real system and find the things we like and dislike. I also have concerns about much of what you mentioned. I'll be presenting on NXG and 2017 at our next user group, so I thought this might be a good time to illustrate what I call the NXG Cycle. The thing to keep in mind here is that some at NI, and some LabVIEW insiders, have been seeing NXG since 2013 or earlier. I've known about it, but not for that long. Four years or more of this cycle, all the while questioning all the current work you are doing and its relevance, can make for some very jaded feelings. You tend to stop having high highs and low lows, and instead are just ready for it to be finished so you can deal with the change.
  24. There certainly is a fine line between being on the cutting edge of a new version and being the first to find bugs and issues with a new version of software. I know a lot of companies that just hold off upgrading as a result. Most I know wait for at least the SP1 release of LabVIEW before upgrading. I like to participate in betas, so the added excitement of new features, and thinking about how I can improve my code with them, is enough to push me onto the latest version. The message NI is trying to send with NXG is that it will replace the LabVIEW we know today...eventually, but that the knowledge we all have about the language doesn't go away. Someone even mentioned how there are not going to be two different CLDs, one for LabVIEW and one for NXG. You will be allowed to use either IDE because in the end they both are creating the same kind of code. Things like data flow, graphical programming, palette navigation, and good coding practices like state machines aren't IDE specific. That being said, I know what you mean: there is some intrinsic knowledge about the current LabVIEW that will become obsolete, and we will need to learn about NXG's quirks. But honestly this is several years out; NXG 1.0 in my opinion is closer to Signal Express than it is to LabVIEW. But with each new version you will see more current-gen LabVIEW features make their way into next gen.
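
For the USB workaround in post 2: a minimal Python sketch of one way to mount a removable volume into an empty NTFS folder using the Windows mountvol tool. The drive letter, folder path, and the choice of driving it from Python (rather than Disk Management or a plain command prompt) are illustrative assumptions, not details from the original post, and it needs to run elevated.

    # Sketch: mount a USB volume into an empty NTFS folder (Windows, run as admin).
    import subprocess

    USB_DRIVE = "E:\\"        # wherever the USB stick showed up (placeholder)
    MOUNT_FOLDER = r"C:\USB"  # an existing, empty NTFS folder (placeholder)

    # Ask Windows for the volume GUID path behind the USB drive letter.
    volume_name = subprocess.run(
        ["mountvol", USB_DRIVE, "/L"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

    # Mount that same volume into the empty folder; writes to C:\USB then land on the stick.
    subprocess.run(["mountvol", MOUNT_FOLDER, volume_name], check=True)
    print(MOUNT_FOLDER + " now points at " + volume_name)

Whether a given drive-lockdown product ignores folder mount points the way the one in the story did is obviously not guaranteed.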
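
For the No Sleep helper mentioned in post 3: a minimal sketch of one common way to keep a Windows test PC awake, using the Win32 SetThreadExecutionState call from Python. The original tool was an AutoIt EXE; this is just an equivalent idea, not that tool.

    # Sketch: keep Windows from sleeping while a long test runs.
    import ctypes
    import time

    ES_CONTINUOUS       = 0x80000000
    ES_SYSTEM_REQUIRED  = 0x00000001
    ES_DISPLAY_REQUIRED = 0x00000002

    kernel32 = ctypes.windll.kernel32

    # Tell Windows the system (and display) are required until cleared.
    kernel32.SetThreadExecutionState(
        ES_CONTINUOUS | ES_SYSTEM_REQUIRED | ES_DISPLAY_REQUIRED
    )
    try:
        while True:          # keep the process, and therefore the request, alive
            time.sleep(60)
    except KeyboardInterrupt:
        # Clear the request so normal power management resumes.
        kernel32.SetThreadExecutionState(ES_CONTINUOUS)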
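
For the SVN locking discussion in posts 8 and 9: a minimal sketch of the lock, steal, and commit steps using the command-line svn client from Python. The file path and messages are placeholders; TortoiseSVN exposes the same operations through its GUI.

    # Sketch: the lock-based workflow, assuming a working copy and the svn CLI on PATH.
    import subprocess

    def svn(*args):
        """Run an svn command in the current working copy and return its output."""
        return subprocess.run(["svn", *args], capture_output=True, text=True, check=True).stdout

    VI_PATH = "Source/My Module.vi"   # placeholder file under version control

    # Take a lock before editing, so other developers see the file is in use.
    print(svn("lock", VI_PATH, "-m", "Editing state machine"))

    # If someone is on vacation holding a lock, --force steals it; the conversation
    # (and any merge) still has to happen when they get back, as described above.
    print(svn("lock", "--force", VI_PATH, "-m", "Stealing stale lock"))

    # Committing releases the lock by default; status -u shows lock info from the server.
    print(svn("commit", VI_PATH, "-m", "Updated module"))
    print(svn("status", "-u", VI_PATH))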
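
For the Excel footer example in post 16: a minimal sketch of the same per-worksheet ActiveX approach, driven from Python via pywin32 instead of the Report Generation Toolkit. The workbook path and footer text are placeholders, and this version only sets text, not the logo.

    # Sketch: set a footer on every worksheet through the Excel ActiveX object model.
    # Assumes Windows, Excel installed, and "pip install pywin32".
    import win32com.client

    EXCEL_FILE = r"C:\Reports\results.xlsx"            # placeholder path
    FOOTER_TEXT = "Acme Test Systems - Confidential"   # placeholder footer text

    excel = win32com.client.Dispatch("Excel.Application")
    excel.Visible = False
    workbook = excel.Workbooks.Open(EXCEL_FILE)

    # Apply the footer per worksheet, after all other writing is done,
    # mirroring what the attached VI does through the same object model.
    for sheet in workbook.Worksheets:
        sheet.PageSetup.CenterFooter = FOOTER_TEXT

    workbook.Save()
    workbook.Close()
    excel.Quit()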