Everything posted by Tim_S

  1. Google produces: Queued State Machine with User Input; Basic LabVIEW Design Patterns - National Instruments; Queued State Machine QSM - LabVIEW Design Patterns
  2. Property and invoke nodes can also cause 'Remove front panel' to be unchecked when 'Use default save settings' is checked. Ones that reference the front panel certainly do (e.g., Front Panel->Open), but I'm not sure about ones that reference controls.
  3. What it sounds like you're asking for is an area NI has gotten out of (<100 Hz), because there are tons of remote I/O options (Rockwell, Siemens, GE, etc.) already on the market. All of these use some form of bus (RS232, RS485, PROFIBUS/PROFINET, EtherNet/IP, CAN...). Depending on what you pick, you can spend just as much as on a cDAQ in time and materials. NI acquired Measurement Computing some years ago. I've never used the hardware, but they do list having LabVIEW drivers. Some devices look to be able to handle industrial logic levels (24 V).
  4. I'm not terribly proficient in Python, but I'm wondering if Python is reacting to a termination character.
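Since LabVIEW is graphical, a textual sketch has to stand in here: a minimal Python illustration (io.BytesIO standing in for a pyserial port, which is an assumption about the poster's setup) of how a termination character changes what a read returns.

```python
import io

# Simulate a serial stream from an instrument that terminates replies with '\n'.
# (io.BytesIO is a stand-in for a pyserial Serial object; the data is made up.)
stream = io.BytesIO(b"IDN?OK\nNEXT\n")

# readline() stops at the first termination character, much like pyserial's
# Serial.readline() -- so a missing or unexpected terminator changes what
# Python hands back to your code.
first = stream.readline()   # everything up to and including the first '\n'
second = stream.readline()  # the next terminated chunk

print(first, second)
```

If the instrument terminates with '\r' instead, readline() will not split there, which is exactly the kind of mismatch the post is hinting at.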
  5. Which is a big point in why we didn't use TestStand: there is a cost to learning the system, and that cost has to fit within the budget and schedule of a project.
  6. I've run into TestStand twice... The first was a project I got pulled into at the 11th-and-then-some hour to make some changes. I got a quick intro to implementing LabVIEW code in TestStand and made the changes. I didn't get to do much with it; however, the impression I got from people who had been with the project was very negative about how difficult the system was to use to do what was wanted, and they intended not to use it again. This was with TestStand version 1 (and I believe LabVIEW 5) from people who (today) would be CLD-certified, so take that for what you will. The second time I ran into it was as part of a major revision and re-write of an existing system, including a sequencer written in LabVIEW. We (two from my company and a local alliance partner) seriously considered using TestStand, but concluded that the amount of work in learning TestStand, porting existing code to work with TestStand, building a system around this that did all the other things that were needed, and making it look like a homogeneous system would be more costly than just writing it in LabVIEW (version 8.6 at the time).
  7. You can easily check the work by calculating the value by hand and then comparing it with the output of the VIs. If you had done this, you would have answered your own question. You do have errors in your code. Things you need to watch for are that data types matter and how the Logical Shift primitive works. As for style, making code neat and organized is important in any language. LabVIEW is a 'visual' language rather than a text language, so what constitutes neat and organized means something a little different; there is a code cleanup feature up in the button bar area that can help with this.
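To illustrate why the data type and the Logical Shift primitive's behavior matter, here is a hedged Python sketch that emulates a logical shift on an unsigned 8-bit value (the helper name and the U8 choice are mine, not from the original post).

```python
def logical_shift_u8(x: int, n: int) -> int:
    """Emulate a LabVIEW-style Logical Shift on an unsigned 8-bit value.

    Positive n shifts left, negative n shifts right; bits shifted out are
    discarded and vacated bits fill with zero -- unlike Python's unbounded
    integers, where nothing ever falls off the top.
    """
    x &= 0xFF
    if n >= 0:
        return (x << n) & 0xFF  # mask back to 8 bits: the top bit is lost
    return x >> -n

# Data type matters: shifting 0b1000_0001 left by 1 drops the top bit on a U8.
print(logical_shift_u8(0b1000_0001, 1))   # top bit discarded
print(logical_shift_u8(0b1000_0001, -1))  # zero-filled right shift
```

Doing the same shift on a plain Python int (or a wider LabVIEW type like U32) keeps that top bit, which is one easy way hand calculations and VI output end up disagreeing.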
  8. It appears you need help with the math expression. The LabVIEW help topics of "Formula Node" and "Precedence of Operators in Formula Nodes and Expression Nodes" should help. Have you read these?
  9. If you are using a formula node then it's just a matter of setting up the inputs and outputs of the node and typing in the formula. LabVIEW has very good help that installs with it, but there is a little tutorial here. The first formula has one gotcha but the error message tells you what the issue is. The second formula is a little wonky in that it looks like x-squared is being squared. Make sure you read the help on the formula node.
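As a side note on the precedence gotcha: exponentiation groups right-to-left in most formula syntaxes, Python's `**` included, so a quick sketch of the difference parentheses make:

```python
# Exponentiation is right-associative, so 2**3**2 parses as 2**(3**2),
# not (2**3)**2 -- the classic "x squared is being squared" confusion.
chained = 2 ** 3 ** 2        # 2**(3**2) = 2**9
parenthesized = (2 ** 3) ** 2  # 8**2

print(chained, parenthesized)
```

When a formula-node expression looks ambiguous, adding explicit parentheses is cheaper than debugging it.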
  10. Had some good luck with Acromag modules the couple of times I've used them. Curious how these particular units work out (or don't) for you.
  11. Haven't used it with a cRIO, but have with E and X series cards set up as a voltage divider.
  12. Potentially speed. Possibly stability. Certain individuals have very strong opinions on .NET, mostly involving the firing range. Otherwise, if you're programming for Windows only, it's a valid choice.
  13. If you google "LabVIEW modbus" you'll find some solutions. Top one for me was a 2015 article on NI's website.
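For anyone curious what actually goes over the wire, a Modbus RTU "Read Holding Registers" request is simple to build by hand. This Python sketch is independent of any NI library (the helper names are mine); it constructs the classic example frame for slave 1, register 0, one register.

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS: init 0xFFFF, reflected polynomial 0xA001."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers(slave: int, start: int, count: int) -> bytes:
    """Build a Modbus RTU 'Read Holding Registers' (function 0x03) request."""
    pdu = bytes([slave, 0x03]) + start.to_bytes(2, "big") + count.to_bytes(2, "big")
    crc = crc16_modbus(pdu)
    return pdu + crc.to_bytes(2, "little")  # CRC goes low byte first on the wire

# Classic example frame: slave 1, register 0, one register -> 01 03 00 00 00 01 84 0A
print(read_holding_registers(1, 0, 1).hex())
```

A library (LabVIEW's or Python's) handles all of this for you; the sketch is only to demystify the protocol.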
  14. There are plenty of examples on serial communication that ship with LabVIEW, so that's just a matter of looking at the examples and the ADV manuals/documentation on your part. The forums are a great place to ask questions and share information; however, you seem to be asking for someone to spend a great deal of time working with you on this code. If that is the case, people here make a living writing LabVIEW code, and I'm sure someone would be glad to provide a quote.
  15. Unfortunately, you're asking a very broad question. There is a very long list of ways LabVIEW can collect data from devices (voltage, current loop, ICP, RS232 serial, GPIB, TCP, UDP, CAN, manufacturer-provided DLL...), so you need to know how the device talks as a start.
  16. Hexadecimal as in an array of numeric values? The String to Byte Array primitive does that.
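For comparison, the same conversion sketched in Python (since the original is a LabVIEW primitive): a string becomes an array of byte values, which can then be rendered as hexadecimal.

```python
# Equivalent of LabVIEW's String to Byte Array: the string's characters
# become numeric byte values, which you can then format as hex if needed.
text = "Hi!"
byte_values = list(text.encode("ascii"))      # numeric array of byte values
as_hex = [f"{b:02X}" for b in byte_values]    # same values rendered as hex

print(byte_values, as_hex)
```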
  17. INI files were the way to go years ago, but they have challenges when handling complex data types and arrays. I used XML (DCOM) in the last revision of my core application (designed for medium to large systems). It worked well at small scale, but there was a significant impact (30+ seconds) when my configuration editor and application tried reading in the file for a full system. I switched to JSON using this package, which has greatly improved performance. Each plugin can have its own section for any configuration, so additions are easy. The files aren't intended to be edited directly, but I do install Notepad++ on the PCs to make it easy to go in and take a quick look (useful for repairs should anything bad happen).
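The per-plugin-section idea can be sketched with the standard json module (the section and key names below are hypothetical, not from the actual application):

```python
import json

# Hypothetical layout: one file, one top-level section per plugin, so adding
# a plugin only adds its own key and never touches anyone else's settings.
config = {
    "motion": {"axis_count": 3, "limits_mm": [0.0, 250.0]},
    "daq":    {"sample_rate_hz": 1000, "channels": ["ai0", "ai1"]},
}

text = json.dumps(config, indent=2)   # indent keeps it readable in Notepad++
restored = json.loads(text)           # round-trips arrays and nested types

print(restored["daq"]["sample_rate_hz"])
```

Unlike INI, nested structures and arrays survive the round trip without any custom flattening.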
  18. I used the NI CVT as a reference. There was nothing wrong with the CVT itself, but I needed something that worked with objects in packed libraries (plugins), which don't seem to share the same memory space. I didn't use the CCC for my inter-application communication. I've not used DCAF before... just getting back to looking at it now because of your question. Looking at a demo video of DCAF, it's not an equivalent to the CVT. The CVT only stores values for read/write, whereas DCAF can interface with hardware, run PID loops, etc.; storing values is a subset of what it does. This video goes through how the DCAF engine works. The CVT is meant for asynchronous access to a central repository; DCAF appears to iterate through each object that is configured/loaded in the system, using strict by-value data transfer.
  19. Drat, you're right... Today I've had an IT-pushed Windows update and reboot that has changed the behavior to be closer to your screenshot. I was able to eliminate the memory leak by completely eliminating all of the Read Variables in my application, so I know something's related.
  20. Sorry, should have mentioned I set the capture settings to have a memory threshold of 500 bytes. There are a lot of typically uninteresting allocations/deallocations occurring that spam the capture otherwise. The memory allocated for a shared variable handle I would expect to be the same for a particular data type independent of the contents, though I've not dived this deep into the bowels of shared variables before. The use of the occurrence is an (old) way to throttle loop rate and control parallel loop termination; you used to see it quite a bit before events were added to LabVIEW. The bottom loop keeps running until the top loop ends, at which point the occurrence is set. The timeout of the occurrence acts the same as a Wait (ms) of the same time.
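A rough Python analogue of that occurrence pattern, using threading.Event (the names and timings are illustrative, not from the original VIs):

```python
import threading
import time

# The event plays the role of the occurrence: the bottom loop waits on it
# with a timeout, which both throttles its rate (like Wait (ms)) and lets
# the top loop terminate it immediately by setting the event.
stop = threading.Event()
iterations = []

def bottom_loop():
    while True:
        iterations.append(time.monotonic())   # stand-in for the loop's real work
        # Wait up to 50 ms; returns True as soon as stop is set, so the
        # loop ends without waiting out the full timeout.
        if stop.wait(timeout=0.05):
            break

worker = threading.Thread(target=bottom_loop)
worker.start()

time.sleep(0.2)   # the "top loop" runs for a while...
stop.set()        # ...then ends and fires the occurrence
worker.join()

print(len(iterations))
```

The same structure in LabVIEW is two parallel while loops with Set Occurrence in one and Wait on Occurrence (with timeout) in the other.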
  21. If I have to go back and refactor, then I'll have something closer to that. The code is a communication library that gets used by anything trying to talk to an application. The Initialize launches a VI that opens and maintains the connection, and then individual VIs get used to read the shared variables (thus code using the library only reads what it needs). There are certainly other ways to do this, but it's worked well except for the memory leak.
  22. Thought I'd pass this along and see if anyone can reproduce it with different versions of LabVIEW. I'd appreciate it if anyone has seen this and has a fix. I'm using shared variables to communicate between applications (1:N). I'd been seeing some memory creep that was inconsistent and somewhat bizarre. I eventually managed to track it down to the fact that I'm programmatically opening a connection to a shared variable in one loop, then reading the value in a different loop (the separate loops have to do with reconnecting on connection loss and startup). There is a functional global used to pass the variable to the second loop. The Read Variable primitive deallocates all but 4 bytes of memory for the previous loop's handle and then allocates memory for a new handle on each iteration of the while loop, hence creating a leak. This behavior does not occur if there is only one loop containing an open, a while loop with a read, and a close. Main.vi demonstrates the issue. Main 2.vi is more like the NI example. I've got service request #7728859 with NI going, but I think I got the guy on his first day. LabVIEW 2015 SP1 32-bit on Win7 64-bit. Shared Variables memory leak.zip
  23. Tried loading the code in LV2012 and LV2015; in both there was an error attempting to load the .NET control from PDFBox-0.7.3.dll. I expect some other DLL is needed, which is a rabbit hole to start going down. With no documentation and no knowledge of the contents of the .NET control, it is very difficult to provide suggestions. With LV2012, the .NET control attempts to use an object which is NULL, so it throws an unhandled exception that is reported back up to LabVIEW. There is no information as to which object is the issue. Without being able to see the properties or methods, it's impossible to relate what is missing. Error 1386, which I had to look up as "The specified .NET class is not available in LabVIEW.", implies that something is missing or broken in the .NET control. I'm expecting it's missing a file.
  24. You changed the settings in LabVIEW, which get saved in the LabVIEW INI file. There is an INI file named the same as your executable that you have to copy the settings into. You can create a custom INI file and include it in your build so the build generates everything as you want.
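The settings copy can be sketched with Python's configparser (the file, section, and key names below are placeholders, not the actual LabVIEW key list; optionxform is set because LabVIEW INI keys are case-sensitive and configparser lowercases them by default):

```python
import configparser
import io

# Source: a key you changed in LabVIEW.ini ("DefaultPrinter" is a placeholder).
source = configparser.ConfigParser()
source.optionxform = str  # preserve key case; LabVIEW INI keys are case-sensitive
source.read_string("[LabVIEW]\nDefaultPrinter=Lab Printer\n")

# Target: the INI named after your executable. The executable reads the
# section named after itself ("MyApp" stands in for your build's name).
target = configparser.ConfigParser()
target.optionxform = str
target["MyApp"] = dict(source["LabVIEW"])

# Write to a buffer here; in practice this would be MyApp.ini next to the exe.
buf = io.StringIO()
target.write(buf)
print(buf.getvalue())
```

Including such a pre-built INI in the build spec, as the post suggests, avoids having to hand-copy keys after every build.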
  25. The musical alarms bring back (traumatic) memories of midi files playing Mary Had A Little Lamb and the opening to The A-Team at auto plants in Mexico.