Everything posted by smarlow

  1. I believe it is related to having the Embedded UI enabled. I turned off the Embedded UI and ran the system for a few days without the problem. I did not see this issue with LabVIEW 2020 with the app I am running, so I believe it to be a new issue introduced since then (I updated the system to LV 2023 Q3).
  2. The device originally came with an FTDI VID, and a "custom" PID from the range FTDI assigns to its volume customers. This is to prevent the device from interfering with the setup of other devices that use the same chip. This unrecognized PID caused the device to show up in RT Linux as a VISA USB RAW device, which is the default for unrecognized VID/PID USB devices on the NI RTOS. I have some experience working with FTDI chips in the past on the Phar Lap RTOS. I once wrote a set of drivers for the FT232R using the FTDI packet specs and the USB control pipe and VISA read/write functions. However, those drivers don't work for this particular chip, which is a different model. The mfg. agreed to change the PID to the standard one (the VCP load option will still be off), and I am trying to prepare for dealing with the chip without the VCP or the USB RAW drivers.
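The enumeration logic at issue can be sketched in Python: the OS matches the stock FTDI driver only when the device reports FTDI's vendor ID together with one of the default product IDs. This is a minimal sketch; the PID list covers common FTDI parts, and `is_default_ftdi_id` is my own name, not any NI or FTDI API.

```python
# FTDI's USB vendor ID and the product IDs its common parts ship with.
# A "custom" PID from FTDI's volume-customer range will not match this
# set, so the OS falls back to a generic (e.g. VISA USB RAW) binding.
FTDI_VID = 0x0403
FTDI_DEFAULT_PIDS = {
    0x6001,  # FT232R / FT245R
    0x6010,  # FT2232H
    0x6011,  # FT4232H
    0x6014,  # FT232H
    0x6015,  # FT-X series (FT230X/FT231X/FT234X)
}

def is_default_ftdi_id(vid: int, pid: int) -> bool:
    """True if the VID/PID pair is one FTDI ships by default, so the OS
    (including NI Linux RT) can match it to the stock FTDI driver."""
    return vid == FTDI_VID and pid in FTDI_DEFAULT_PIDS
```

With the manufacturer's original custom PID, `is_default_ftdi_id(0x0403, custom_pid)` is False, which is why the device appeared as USB RAW; after the PID change it matches the default set.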
  3. After reading the AN you posted, it seems that for the Linux OS, the VCP and D2XX drivers are not connected. Does this mean that if one of my FTDI chips needs the D2XX driver, none of my VCP devices can be used?
  4. It is my understanding that the VCP interface sits on top of the D2XX driver. The EEPROM for any FTDI chip can be programmed to optionally load the VCP driver or not. The device I am dealing with is programmed to not load the VCP.
  5. Since FTDI chips are recognized by NI RT Linux out of the box these days, I assume the OS comes packaged with the FTDI chip drivers for Linux. Are there wrapper VI's available for the RTOS for calling the driver library? If not, does anyone know off hand where the external driver library is located so I can set up the Call Library function for it? Thanks in advance.
  6. Rolf: Thanks for the info. I did suspect it might be related to the SD card. However, I did see the issue before adding the logging module. I will still try a run without it to see if it helps, since the non-logging version that froze was before the updates.
  7. Thanks for the links and info. I will keep plugging away. BTW, I am logging at 1 Hz to an SD card. I do not leave the file open.
  8. *SIGH* Lock-up issue has returned even after firmware upgrade.
  9. Thanks for the info. I believe I got it fixed. I ran the latest update for 2023Q3 (f2). I also did not realize there is a new cRIO firmware image for the cRIO-9047 (23.5). I was running 8.8. I am surprised that MAX did not flag the 8.8 firmware as being too old when I installed the 2023 base image. Anyway, after updating everything, the system has been running for 16 hours with no problem. I'll let it run over the weekend. Hopefully, the problem is resolved. EDIT: I did completely reformat the cRIO drive after updating the firmware.
  10. Having trouble with a cRIO locking up:
      • cRIO-9047 with the Embedded UI enabled
      • LabVIEW 2023Q3 on host and cRIO
      • Using a 30 sec. restart watchdog and whacking it at ~1 Hz
      • Not a really complicated diagram; FPGA programming mode (not Scan Interface mode)
      • I am monitoring the memory usage and it does not appear to be increasing
      • Usually locks up 2-20 hours after start; program and display completely frozen
      • Cannot connect from the LV Project or MAX
      • Watchdog does not reboot the controller; a hard reset is required to get it running again
      • NI tech support says no internal known issues reported
      Has anybody else seen this? Thanks for any help.
  11. This implementation would mean the data for the "callback" has the same type as the dynamic event. But if you use a cluster or class, it would be easy enough to separate the calling data from the return using sub-clusters. Copies of the calling data would be in the return, but that would be relatively harmless.
  12. I have a design that uses a Notifier to return data from a dynamic event back to the thread that generated it. I used the data type associated with the event as the type for the notifier. It's kind of clunky, but it works. By typecasting the event refnum to a string, and using it as the name of the notifier, I believe it guarantees a unique notifier name. If they were to build this "notification" into the event, it would act as a return. The node could be on the right as you describe, and would have the same type as the dynamic event data. A "Wait on Return" or "Callback" function would be added to the Events palette that functions like a "Wait on Notification" function. I think they could wrap up this implementation into a tidy solution for returning data from a dynamic event:
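The pattern in that post can be sketched in Python, with a queue standing in for the named Notifier and object identity standing in for the typecast event refnum. All names here are illustrative, not a LabVIEW API; the point is that a unique per-event key gives each dynamic event its own return channel.

```python
import queue
import threading

# Registry of per-event "notifiers", keyed by the unique identity of the
# event reference -- analogous to naming a LabVIEW Notifier after the
# typecast event refnum so the name is guaranteed unique.
_notifiers = {}
_lock = threading.Lock()

def notifier_for(event_ref):
    """Get-or-create the queue uniquely tied to one event reference."""
    key = id(event_ref)  # unique per live object, like a refnum value
    with _lock:
        return _notifiers.setdefault(key, queue.Queue(maxsize=1))

def handle_event(event_ref, data):
    # Event-structure side: process the event, then "send notification"
    # carrying the return data back on the event's own channel.
    notifier_for(event_ref).put(data.upper())

def fire_and_wait(event_ref, data, timeout=1.0):
    # Generating side: fire the dynamic event (handle_event stands in for
    # Generate User Event), then "wait on notification" for the return.
    handle_event(event_ref, data)
    return notifier_for(event_ref).get(timeout=timeout)
```

A built-in "Wait on Return" node, as described above, would collapse this whole registry into a single primitive with the same type as the event data.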
  13. It's killing LabVIEW. They need to stop trying to use LabVIEW only to sell their top-end $$$ hardware. It's going to end up like HP-VEE.
  14. Do you mean like a return from a dynamic event?
  15. C++ marching into oblivion? No, not yet. But there is a lot of C# and Java nowadays. The point is that LabVIEW is not a general-purpose language; it is very specialized. If they don't maintain and preserve LabVIEW as the preferred language within that specialized field, it is as good as dead. It might chug along as a zombie for a while to maintain expensive systems that are already running it.
  16. All I know is that if they don't do something to make it a more powerful language, it will be difficult to keep it going in the long run. It was always, in the past, a powerful choice for cross-platform compatibility. With macOS deprecating (and eventually completely removing) support for OpenGL/OpenCL, we see the demise of the original LabVIEW platform. I, for one, would like to see much heavier support for Linux and Linux RT. Maybe provide an option to order PXI hardware with an Ubuntu OS, and make the installers easier to use (NI Package Manager for Linux, etc.). They could make the Linux version of the Package Manager available from the Ubuntu app store. I know they say the market for Linux isn't that big, but I believe it would be much bigger if they made it easier to use. I know my IT department and test system hardware managers would love to get rid of Windows entirely. Our mission control software all runs on Linux, but LabVIEW still has good value in rapid application development and instrument bus control, etc. So we end up running hybrid systems: Linux in a VM to operate the test executive software, and LabVIEW in Windows to control all our instruments and data buses. Allowing users the option to port the RT Linux OS to lower-cost hardware, the way they did for the Phar Lap OS, would certainly help as well. BTW, is it too much to ask to make all the low-cost FPGA hardware from Digilent LabVIEW compatible? I can see IoT boards like the Arduino Portenta, with its 16-bit analog I/O, seriously eating their lunch in the near future. ChatGPT is pretty good at churning out Arduino and Raspberry Pi code that's not too bad. All of our younger staff use Digilent boards for embedded work, programming them in C and VHDL using Vivado. The LabVIEW old-timers are losing work because the FPGA hardware is too expensive. We used to get by in the old days buying myRIOs for simpler apps on the bench. But that device has not been updated for a decade, and it's twice the price of the ZYBO. Who has $10K to spend on an FPGA card anymore, not to mention the $20K PXI computer to run it? Don't get me wrong: PXI and CompactRIO (can we get a faster DIO module for the cRIO, please?) are still great choices for high-performance and rugged environments. But not every job needs all that. Sometimes you need something inexpensive to fill the gaps. It seems as if NI has been willing to let all that go and keep LabVIEW in the role of selling their very expensive high-end hardware. But as low-cost hardware gets more and more powerful (see the Digilent ECLYPSE Z7), and high-end LabVIEW-compatible hardware gets more and more expensive, LabVIEW fades more and more. I used to teach LabVIEW in a classroom setting many years ago. NI always had a few "propaganda" slides at the beginning of Basics I extolling the virtues of LabVIEW to the beginners. One of these slides touted "LabVIEW Everywhere" as the roadmap for the language, complete with pictures of everything from IoT hardware to appliances. The reality of that effort became the very expensive "LabVIEW Embedded" product, which was vastly over-priced, bug-filled (it never really worked), and only compatible with certain (Blackfin?) eval boards that were just plain terrible. It came and went in a flash, and the whole idea of "LabVIEW Everywhere" went with it. We had the sbRIOs, but their pricing and marketing (vastly over-priced, and targeted at the high-volume market) ensured they would not be widely adopted for one-off bench applications. Lower-cost FPGA evaluation hardware and the free Vivado WebPack have nearly killed LabVIEW FPGA. LabVIEW should be dominating. Instead you get this:
  17. SVG is the way to go. There is a mountain of open-source SVG rendering code that was built to make modern browsers SVG compatible. The thing to do would be to create a custom SVG-based format for all LabVIEW controls/indicators, and publish the specification. That way, anyone who wants to would be able to create their own front panel object design libraries. Users could create their own designs and looks for any application. I mentioned this over a decade ago in my Websocket post. They could also integrate the connector pane into the diagram as a boundary with multiple frames and tools for creating connector terminals. Unlimited sprawl on the diagram would be replaced with a "main structure" whose boundary is the connector pane. Users would create paginated code within the boundary, with the separate pages executing in parallel. This would be an option rather than a rule. There would still be a traditional connector pane, but it would go with the diagram rather than the front panel. The relationship between front panel objects and diagram terminals would be user-configurable (diagram files would be configurable to point to one or more front panel files, with the front panel objects therein becoming available to be included in the diagram, at the user's discretion).
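As a toy illustration of such a published format, a slider control could be emitted as ordinary SVG with custom namespaced attributes carrying the control metadata. The `lv:` namespace, its attributes, and the `slider_svg` helper are invented for this sketch; any real specification would define its own vocabulary.

```python
# Sketch of an open, SVG-based control format: a slider rendered as plain
# SVG, with control metadata (type, label, range) in a custom namespace so
# third-party tools could both render and edit it.

def slider_svg(label: str, value: float,
               vmin: float = 0.0, vmax: float = 10.0) -> str:
    frac = (value - vmin) / (vmax - vmin)
    knob_x = 10 + frac * 180  # track runs from x=10 to x=190
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'xmlns:lv="urn:example:lv-controls" width="200" height="40" '
        f'lv:type="slider" lv:label="{label}" lv:min="{vmin}" lv:max="{vmax}">'
        f'<rect x="10" y="18" width="180" height="4" fill="#888"/>'
        f'<circle cx="{knob_x:.1f}" cy="20" r="8" fill="#3a6"/>'
        f'</svg>'
    )
```

Because the output is standard SVG, any browser renders it as-is, while a LabVIEW-aware editor could read the `lv:` attributes back out to reconstruct the control.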
  18. Maybe separate the front panel and diagram into different files/entities so that the GUI can be used with other programming languages in addition to G.
  19. The DAQmx Read VI is controlling the timing, although it's not clear how from your example. You are not calling the DAQmx Timing function in the example, so the sample rate and number of samples per channel are not defined in the task. See LogMAN's example in the post above for controlling the loop timing with the DAQ hardware operation. He has defined the DAQ task to read 1000 samples at 1 kHz (the sample rate input is unwired, but has a default value of 1 kS/s), using the DAQmx timing function. Inside the loop he is reading 500 samples at 1 kS/s, which will take 500ms. If you want to change the period of a loop that is timed by a DAQmx read operation, you need to call the DAQmx Timing function outside the loop as LogMAN has done. Changing the sample rate and number of samples to acquire will change the loop timing.
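The arithmetic behind that hardware-timed loop rate is simple enough to state as a one-liner (a sketch of the relationship, not a DAQmx call):

```python
# With continuous hardware-timed acquisition, DAQmx Read blocks until the
# requested samples arrive, so the loop period is samples-per-read divided
# by the task sample rate configured with DAQmx Timing.

def loop_period_s(samples_per_read: int, sample_rate_hz: float) -> float:
    """Seconds per loop iteration when a DAQmx read paces the loop."""
    return samples_per_read / sample_rate_hz
```

For LogMAN's numbers above, 500 samples read from a 1 kS/s task gives a 0.5 s loop; reading all 1000 samples would give a 1 s loop.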
  20. If there is no DAQ task in the loop, how are you collecting samples? It's going to be impossible to tell you what is happening if you don't post the diagram.
  21. That kit uses a DS9400 USB/I2C adapter as an interface. According to DigiKey, that device is no longer made, so you are probably going to have a hard time finding any drivers for it. The DS2484EVKIT has some software you can download from Analog. The last listed compatible OS is Windows 8. That package probably has a DLL driver for the DS9400, but you'll probably have to write your own LabVIEW wrapper, unless you can find one here. If you just want to connect a 1-wire network to USB, you can use an Arduino. There is an Arduino library for operating a 1-wire network, and you can buy a 1-wire shield from DigiKey (part# MAXREFDES132#-ND). That Arduino shield has the same DS2484 chip (and RJ connector) as the DS2484EVKIT, so you can use the Arduino I2C bus to read and write data to the 1-wire bus, and use the Arduino serial interface for the PC. You can also select a jumper to run it in "bit-bang" mode using the Arduino GPIO instead of the DS2484 chip. I have used this Arduino setup in bit-bang mode and it works extremely well. You'll have to do a little bit of programming on the Arduino to implement a serial command to scan the 1-wire bus and return the data (there is a bunch of example code). But you can make up your own data format (such as CSV ASCII). The programming on the LabVIEW side to read the Arduino is a simple VISA serial connection. I think I would go this direction, since the DS9400 device is obsolete.
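On the PC side, parsing such a reply is trivial. Here is a Python sketch assuming a made-up CSV reply format of one `<ROM id>,<value>` line per 1-wire device; the format and the `parse_onewire_scan` name are mine, defined by whatever scan command you program on the Arduino.

```python
# Parse the (hypothetical) CSV-ASCII reply to a 1-wire scan command, e.g.:
#   "28FF640E,23.5\n28FF1A2B,24.0\n"
# Each line: hex ROM id of the device, comma, reading as a decimal number.

def parse_onewire_scan(text: str) -> dict:
    """Map each 1-wire ROM id to its reading from the CSV reply."""
    readings = {}
    for line in text.strip().splitlines():
        rom_id, value = line.split(",")
        readings[rom_id.strip()] = float(value)
    return readings
```

In LabVIEW the equivalent would be a VISA Read followed by Spreadsheet String to Array, which is part of why the serial-CSV approach stays so simple.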
  22. If you are using a DAQ device, it's probably a hardware-timed loop. The loop iteration interval is determined by the amount of time it takes to acquire the data. For example, if you define a DAQmx task to acquire data at 1 kHz sample rate, and acquire 500 samples, the DAQmx read operation will take 500ms.