Everything posted by Bryan

  1. I tried to log into the Wiki using my LAVA username/password and it wouldn't let me. So then, I tried to reset my password by specifying my username and email address. It said that there was no email address recorded for my username. Next, I tried creating an account with my username, and of course it won't let me since "Bryan" already exists. So, I gave up and went with "SouthPaw". But, I would still like to access the Wiki under my LAVA username. Would it be possible for a Wiki admin to make that happen? If it's too much of a pain, I can just stick with "SouthPaw".
  2. Loving this central location of organized and searchable LabVIEW knowledge! (Just like LAVA.😁) I've made a couple of small contributions so far, mostly clicking on "Random Article" and reviewing it for typos and broken links. I've set a personal goal of trying to do this at least once per workday. Every little bit helps, right? Granted, they're not contributions on a grand scale yet (i.e., I'm not creating articles), but maybe I'll get there. This is my first time contributing to a Wiki of any kind, so I'm a n00b at it.
  3. Judging by the poll results thus far, it looks like there is a LOT of interest in having a LAVA code repository for those that are not using VIPM Pro. That's to be expected though. If a public repository were to be set up somewhere, I'm assuming that JKI would have the option of adding it to the list of canned repositories for the Pro and Free versions if they so choose.
  4. From my experience, TS seemed to be overkill for a lot of the applications I've been involved with, like "trying to swat a fly with a Buick" (to quote a friend). Add to that the cost of deployment/debug licenses for multiple test stations, and the bill goes up quickly. The learning curve also comes into play: TS has seemingly become its own programming language and is not as easy to learn as LabVIEW for those without the budget or time to take the training classes. With a "roll your own", after the initial time investment of creating a test sequencer in LabVIEW, you can deploy to as many stations as you want without the cost of additional deployment licenses by just building EXEs. Where I'm currently working, TestStand is just starting to gain a foothold, but ease of deployment and the cost of deployment licensing have been a source of contention. Now, my opinions are based solely on my experience using TS on and off for several years without formal TS training. Ironically, a couple of my coworkers and I are going to be taking TestStand classes in the coming weeks, and I'm interested in whether my opinion (and theirs) will change.
At my previous employer, we still had a system running the precursor to TS: Test Executive. We had kept updating it until we ran into an incompatible version of LabVIEW, then just "froze it in time". Nobody there was very familiar with TestStand or how much effort a migration would take, as the station tested a large variety of boards. Because there was only one operating station, the risk of downtime during a migration was considered too great, so it was never done. Test Executive seemed much simpler and a better match for the needs at the time. A second station was later created to do the same job as the one running Test Executive.
Test Executive wasn't compatible with the newer version of Windows, and they still didn't want to go the TS route, so we ended up going with a LabVIEW-based test sequencer from CalBay (now Averna) called "iVVivi". I became very well versed in iVVivi, and it was definitely simpler than TS. It also had an integrated LabVIEW OOP-based HAL, which was very attractive. However, in some respects it was TOO simple and required some creative LabVIEW routines to mimic functionality available in Test Executive. I don't believe iVVivi is even supported anymore, so they may eventually be forced to go the TS route, or beg NI for the password to access/update the protected VIs to recompile in later LV versions.
EDIT (3/2020): After having had the training and applied what I've learned over the past year, I can say that I understand TestStand much better, including its quirks. I still think it can be suited to some applications, but it's still far too complicated for the majority of applications I deal with day to day, and the additional costs per seat/deployment are still a deterrent. It still fits the "swatting a fly with a Buick" analogy I used before.
  5. Here's another quick and dirty example using just local variables and two separate while loops. I don't like using sequences and local variables in practice myself, but this is a simple way to show you how to control parallel loops. It's quirky, but I hope it gives you an idea. There are much better ways to implement parallel loop control and communication; I just wanted to provide a quick example. The "Stop" button is set up with latch functionality, which isn't compatible with using it as a local variable, so I had to create a separate "Stop All" indicator to "hold" the value for the second loop. Please don't take this as a best-practice example, as it doesn't really show good LabVIEW programming, but it will at least show you what's needed for parallel loops. Again, I hope I've helped you out! Untitled 2.vi
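Since a LabVIEW diagram can't be pasted inline here, here is a rough text-based analogy of the same idea in Python (the names and numbers are mine, purely illustrative): a shared flag playing the role of the "Stop All" local variable, read by two independent loops running in parallel.

```python
import threading
import time

stop_all = threading.Event()  # plays the role of the "Stop All" local variable

def control_loop():
    # First loop: does its work, then latches the stop flag for everyone else.
    time.sleep(0.05)   # stand-in for real work / waiting on the user
    stop_all.set()     # like writing TRUE to the "Stop All" indicator

def worker_loop(results):
    # Second loop: polls the shared flag once per iteration, just as the
    # second while loop reads the local variable each time around.
    count = 0
    while not stop_all.is_set():
        count += 1
        time.sleep(0.005)  # stand-in for the loop's iteration delay
    results.append(count)

results = []
t1 = threading.Thread(target=control_loop)
t2 = threading.Thread(target=worker_loop, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
print("worker iterations before stop:", results[0])
```

Like the local-variable approach on the diagram, this only works because the second loop re-reads the flag every iteration; it's polling, not messaging, which is why the producer/consumer pattern is the better long-term answer.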
  6. You're probably going to have to implement a producer/consumer design pattern if you want truly parallel processes. There are many examples of it among the LabVIEW examples. I wasn't sure whether going into producer/consumer would be overkill for what you wanted, but you may want to look into going that route based on your description. For simplicity's sake (and my lack of time), and to illustrate what I was talking about in my first paragraph, I've attached a VI (in LabVIEW 2016) that uses the timeout case. The default timeout is -1, which means the event structure will wait indefinitely until an event occurs. In my attached VI, it waits 100ms. You can make the timeout as long or short as you want, as long as it's greater than -1. To keep the value in your shift register, you'll have to wire the shift register value straight through the timeout case; if you don't, it will be overwritten by a default value (normally "0"). This method is a quick and dirty way of letting the event structure run alongside your case structure in the same loop without the while loop hanging on it. I hope this helps! Untitled 1.vi
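For anyone reading along without LabVIEW handy, the producer/consumer pattern with a timeout can be sketched in Python (this is just an analogy, with names I made up; the queue stands in for the LabVIEW queue reference, and the 100ms `get` timeout stands in for the event structure's timeout case):

```python
import queue
import threading

data = queue.Queue()  # the communication pipe between the two loops

def producer():
    # Producer loop: enqueues work, then a sentinel telling the consumer to quit.
    for i in range(5):
        data.put(i)
    data.put(None)  # sentinel value, like a "stop" message

def consumer(totals):
    # Consumer loop: waits up to 100 ms per attempt, like a 100 ms timeout case.
    total = 0  # plays the role of the shift register value
    while True:
        try:
            item = data.get(timeout=0.1)
        except queue.Empty:
            continue  # "timeout case": nothing arrived, carry the total through
        if item is None:
            break
        total += item
    totals.append(total)

totals = []
t = threading.Thread(target=consumer, args=(totals,))
t.start()
producer()
t.join()
print(totals[0])  # 0+1+2+3+4 = 10
```

The `continue` branch is the analogue of wiring the shift register straight through the timeout case: on a timeout nothing is produced, so the running value must be carried forward untouched.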
  7. A while loop will not iterate until all nodes within it have completed execution. First thing: do you have a "Timeout" event case defined? If not, your loop is going to "hang" and only iterate when your "Numeric" value changes. The wire you currently have running to the "timeout" terminal of your event structure defines how long the event structure "waits" for an event before proceeding via the "Timeout" event case, so on its own it won't do what you want. You'll need to wire a separate constant for your iteration delay (i.e., the timeout value), say 1000ms. If you wanted these two cases to be truly independent, you'd have to set up a second while loop and pass data between them. I don't know what the end product is for what you're working on, so just for simplicity's sake, I'd say to set up your timeout event case. Note: when this is running with the timeout case set up, there will be up to a 1000ms (1s) delay from the time you press "Stop" until the VI stops execution, unless you have a "Value Change" event set up for your "Stop" button.
  8. I agree with you, but I still have to run it by them before plugging in anything not provided by the company. You know how large companies can be: knee-jerk reactions become long-term policy, and then convincing them otherwise is like talking to a tree unless you have serious pull within the organization (which I do not). I've sent a request asking if there is an accepted process or if I can get a waiver. If there's no way, then I'll have to wait for our company-wide VLA to come up to speed... which could be a while. Currently, they only provide up to LabVIEW 2015 through our engineering software "store". As I said in another post, we're normally two to three versions behind (which is why I let my CLD expire). The only way I've been able to get anything newer has been via the DVDs/CDs and my "pseudo" administrative rights on my computer. I would definitely be able to contribute as well!
  9. I just received the "disks" for our SSP today with LabVIEW 2017 and NXT. I was eager to install and play with it. But instead of disks, I received a 32GB USB stick. While it's a neat concept by NI, it's a BIG no-no at my company to use USB media sticks that are not explicitly provided by the company, for information security reasons. (Anyone remember Stuxnet?) I'm friends with someone in information security, and I can ask if there are acceptable processes/procedures for using the stick, but I'm pretty sure I already know the answer. These are the times when working for the type of company I do is a pain.
  10. We used to use Visual SourceSafe and built toolkits for our ATE systems to ensure we were running the latest versions of software (and to perform self-checks). My company tried to force IBM Rational ClearCase on our group but I was able to fight them off and am currently using SVN/TortoiseSVN. Funny though, my company has started setting up SVN databases since so many people balked at ClearCase.
  11. We've generated a simulated 0-150VAC 3-phase source before with an NI-6733 and amplifiers, shifting the phase of each channel to 0°, 120° and 240° respectively. Given the number of 3-phase signals we were generating into a high-impedance load for that particular setup, it was more cost effective to do it that way than to purchase many AC voltage sources. I was kind of hoping there was a similar solution for AC current, perhaps a similar setup using a lower voltage and a step-up transformer, assuming the amplifier has enough wattage per channel. This time, however, we're looking at a low-impedance load where a 0-5A AC current transformer would normally be connected in the field. I'd looked at the programmable sources you cited, but with space and budget limitations, I'm not sure we could do it. If it's the only way, then I'll have to convince the decision makers to make some room and cough up some additional dough. The type of simulation we're doing (for a test bed) requires that each line (phase) be individually controllable, to test behavior under desired and undesired phases, frequencies and levels. The Chroma source would do everything we need, but is really overkill for our needs. I've also looked into equivalent Elgar 3-phase sources as well. Thanks for the ideas so far though; they give me things to look into!
  12. I've been tasked with coming up with a way to simulate the output of a Current Transformer (CT). Basically, a way to generate a 0-5A AC signal that can be controlled via automation to simulate a total of 15 CTs (simulation of five 3-phase lines). The option of using 15 programmable AC Current sources is currently being kicked around, but space and cost may raise some eyebrows here. Not being an EE myself, I'm having difficulty coming up with a solution. Does anyone have any ideas that they could toss my way? Thanks!
  13. There are three postings on Indeed.com for a Test Engineer at Northrop Grumman in Charlottesville, Virginia. I believe the first one may be for a "new grad". Test Engineer I: https://ngc.taleo.net/careersection/ngc_coll/jobdetail.ftl?job=1195044&src=JB-10200 Test Engineer I (experienced): https://ngc.taleo.net/careersection/ngc_pro/jobdetail.ftl?job=1190027&src=JB-10200 Test Engineer II: https://ngc.taleo.net/careersection/ngc_pro/jobdetail.ftl?job=1190176&src=JB-10200
  14. It really depends on your current situation and your future aspirations with LabVIEW in my opinion. In my case, I let my CLD certification expire a couple of years ago. I've been working for the same company for 13 years and having the certification didn't benefit me in any way aside from being able to prove a level of LabVIEW programming competency and provide a sense of achievement for myself. To date, my reputation as the "resident LabVIEW guru" has been more beneficial to me in my job than having my CLD was. Since my company is always 2-3 LabVIEW versions behind the latest releases, I was struggling during re-certification since the exams weigh heavily on having the knowledge of the latest features and functions. It ended up being more of a pain to me than it was worth, so I just let it expire.
  15. I've been on here for quite a long time. I don't post much anymore, though I used to be pretty active; normally I find the answers I'm looking for via searching. I don't get to play with newer versions of LabVIEW very often. The company I work for has a tendency to freeze our programs in time, and it's rare that we have the opportunity to move to the latest and greatest versions of LabVIEW, so I don't know much about newer features and functions. This led me to let my CLD expire a few years ago. I wasn't really "using it", and re-certifying normally meant doing a lot of homework on my own time in order to pass the CLD recert. My previous employers were more supportive of me being the resident "LabVIEW Guru". I'm known as such at my current job as well, but as I said, I don't get to use and explore the latest features of LabVIEW. I'm only able to use LabVIEW sporadically and not as much as I would like; it's normally feast or famine. Most of the LabVIEW work I end up doing is for things I don't enjoy, like keeping an old LabVIEW 6.1 ATE running NI Test Executive and our current LabVIEW 2010 3rd-party Test Executive ATE up and running. My technical background is mostly test engineering, so "Jack of all trades" type of stuff due to our small department. My favorite project is one that I'm actually just winding down on for a sister site of ours: upgrading a version of test software that had been converted up from LabVIEW 5.1 to 2014 before I came on board with the project. The previous developer was a good programmer, and I was able to figure things out. I can't go into a lot of detail about it, but I feel it was my most professional undertaking with LabVIEW yet, allowing me to create a professional EXE and installer, graphics, etc. Actual software-engineering stuff that I rarely get to do in pieces, let alone in its entirety.
I've been using LabVIEW since 1999 and have always wanted to go to NI Week, and the LAVA BBQs once they started. The companies I've worked for either never had the budget to send me, or something else came up that took priority. The same happened this past year. (There's always next year, right?) I just heard about LabVIEW NXG via THIS forum today. I'll have to spend some time looking into it before I form my impressions, but I'm worried by what I'm reading about it possibly being a replacement for LabVIEW in the future, making my limited knowledge and experience old hat. But that may be a knee-jerk reaction at this time.
  16. I checked out the Exaprom toolkit a bit last week. I agree, it has a lot of really neat features. If I can convince the customer for more time and budget for a better PDF solution than just printing the FPs to a virtual printer to maintain the separation of text and images, I may use the Exaprom kit to do so. Thanks!
  17. Thanks for the clarification! It's nice to know more of what's going on under the hood when LabVIEW sends a FP to a printer. I just assumed that it was automatically doing some sort of OCR. Thinking back, when I generated a PDF with a FP as an image, the OCR wasn't very reliable. That should have clued me that something else was going on instead of OCR. This is really the first time I've had to deal with PDFs and LabVIEW together. I've always wanted to mess with the PDF toolkits for LabVIEW, but in the applications I'm dealing with at work, I haven't really had the opportunity to date.
  18. Hey guys/gals. Long-time member/infrequent poster here with a PDF challenge to share. I'm working on modifying some old LabVIEW code for an internal customer where the reports generated by the compiled EXE are essentially LabVIEW FPs printed to a PDF printer. I've been asked to streamline the process by preventing the prompt for filename/location for each report that is "printed" (and there are many), as well as auto-generating the PDF filenames. The problem I've run into is that with the LabVIEW PDF toolkits I've found on VIPM and NI's website, the FPs are added to the PDF files as images, whereas when using a PDF "printer" to print the FP, it appears that an OCR engine is used to break up the FP image into images and searchable text. Having searchable text is what the customer wants to keep if possible. However, with the level of control they're requesting over PDF generation, I'm having a difficult time finding a solution that can be done in LabVIEW and meet all of the requests without changing the report generation scheme from FP "printing" to a proper report scheme... which they don't want me to do. I could dig into programmatically controlling their "PDF printer" from LabVIEW, but I would prefer the application not be coupled too tightly to a 3rd-party application that could change or disappear at any time (gov't contractor). Anyone have any ideas? It would be nice if the available PDF toolkits out there had some sort of OCR function for images containing text. If this is too much of a pain, I may just have to tell them it will have to be one way or the other. I love LabVIEW programming, so if I can do it, I will.
  19. I still use them quite frequently as I've become the code maintainer for an old LV6.1 / Test Executive ATE system that should have been put out to pasture long ago.
  20. The fact that it made me click on this thread link makes me think of Facebook "click-bait" postings. "She sticks a butter knife in an electrical outlet. The result? I'm SO doing this!"
  21. Look to see if the Pickering card is being detected at a different address/location by the processor when the Keithley is ON vs when it's OFF. You should be able to do it either via NI MAX or a soft front panel of some kind.
  22. @stefanusandika: If I remember correctly, you have to design the VI that you want to run as a "service" in such a way that it can be safely aborted, because when Windows stops the service the VI is killed outright, which prevents proper shutdown of VI execution and cleanup of references, etc. I haven't created a service using this method, so I don't know if there are techniques that allow a safe shutdown of the VI. I was able to find an NI article on it, but it does involve tinkering with the Windows Registry. @JamesMc86's nssm method does appear to be worth a look.
  23. I remember seeing an article many moons ago about creating a Windows Service-like LabVIEW application. It involved using some files like "instsrv.exe" and "srvany.exe" (were these the files you're thinking of ShaunR?) or something of that nature. The only service-like interaction that an app using the method could do would be for Windows to start and stop it.
  24. I've been a LAVA member for quite some time. I actually found my current job through this website back in 2004 (and I'm still employed at the same location). I don't post much, as I haven't done as much LabVIEW development in my career as I would like, so I've fallen behind the curve on the latest tips, tricks, architectures and methods. I haven't even used the Actor Framework yet (we're still primarily using LV2010). So I haven't been able to lend much advice and help on LAVA to those using all of these newfangled toys and methods, nor have I had much need to seek help myself. This is possibly just a personal preference, but I prefer LAVA to the NI forums, and I rarely use the latter. I like how the smaller-community feel is so concentrated with LabVIEW expertise. I wish I could spend more time on here and be one of the regulars like I had been on previous forums where I was a longtime member, but I just haven't had the need or knowledge to do so. That all being said, I may be on here a little more in the coming months, as I've been asked by one of our company locations to modify a neat LabVIEW application. I've already seen the code, and the previous developer was definitely someone I would consider a LabVIEW Architect. For all I know, they may be a member on here.
  25. Aside from moving your serial configuration VI outside the while loop, you may also want to add a delay to the loop. You may be pounding the Arduino with data faster than it can handle it (maybe; I've never had the opportunity to play with an Arduino). Additionally, how is your Arduino set up to handle termination characters? I didn't see anything in the Arduino code specifying anything other than the baud rate. By default, the serial config VI enables a termination character using a line feed. I don't know what the VB serial configuration defaults are.
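To show why the termination character matters, here's a minimal Python sketch (not Arduino or LabVIEW code; the helper name and sample values are mine) of how a line-feed terminator splits a raw incoming byte stream into complete messages, which is essentially what a terminated serial read gives you:

```python
def split_messages(buffer: bytes, terminator: bytes = b"\n"):
    """Split a raw serial buffer into complete messages plus any leftover bytes
    belonging to a message whose terminator hasn't arrived yet."""
    *messages, remainder = buffer.split(terminator)
    return messages, remainder

# Two complete readings, plus a partial third still "in flight":
msgs, leftover = split_messages(b"23.5\n24.1\n24.")
print(msgs)      # [b'23.5', b'24.1']
print(leftover)  # b'24.'
```

If the sender never transmits the terminator the reader is expecting, every read either times out or returns a partial buffer like `leftover` above, which is why both ends have to agree on the termination character.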