Everything posted by codcoder

  1. A heads up though: not all PXI modules are fully supported. We replaced a PXI controller running Phar Lap with one running Linux and there were some issues. The LabVIEW application transferred smoothly if I remember correctly (this was last year, around August) but we never got one of the modules to fully work: the 6683H timing module, I think. And NI was aware of this. We ended up keeping the Phar Lap controller in this application.
  2. Here is a picture of the module in all its glory.
  3. So on a lighter side of everything: I was browsing the NI website and discovered that at least one PXI module has been updated to match the (heavily discussed) new graphical profile: https://www.ni.com/en-us/support/model.pxie-8301.html Cool grey instead of boring beige. Still waiting for a green chassis though. Will NI be offering compatibility stickers to allow older modules and cover plates to match the new graphical profile? Will they update older modules? Or will we have to live with mismatched colour combinations for the coming decades? 😜 (posting this in the lounge section instead of hardware for obvious reasons)
  4. Do you have any examples about this you can share? Because I tried using NI-Trace -- and perhaps it's about my lack of knowledge about the tool -- but to my understanding what was logged was nothing but the "high level" writes to property nodes and subVIs which I did in my top-level VIs. There was no information about the underlying USB communication.
  5. For posterity: I've been in contact with NI Technical Support who confirms that the USB-8451 isn't designed to work with PXI Real Time controllers.
  6. Thanks for the quick answers. I was afraid changing hardware would be the answer. Doing that isn't that easy either, as the hardware configuration is quite fixed at the moment. My current "temporary" solution is to use a Windows PC connected directly to the USB device, running a looped LabVIEW *.vi which constantly writes over I2C to the device every second or so. This works. I will look into other, neater solutions as soon as I get the time. I promise to post an update here if I figure out anything useful.
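For what it's worth, the Windows-side workaround above (a loop writing a couple of bytes over I2C every second) does not strictly need LabVIEW. A rough Python sketch calling the NI-845x C DLL through ctypes follows; this is an assumption-laden illustration, not a tested program. The DLL path, handle type, 7-bit address 0x50 and the register/value payload are all placeholders, and the function names are taken from my reading of the ni845x C API, so double-check them against the installed header.

```python
# Hypothetical sketch: periodic I2C write via the NI-845x C API (Windows only).
# Function names follow the ni845x.h header as I understand it; the resource
# name, slave address and payload below are placeholders for illustration.
import ctypes
import time

def build_payload(register, value):
    """Pure helper: pack a register/value pair into the two bytes to write."""
    return bytes([register & 0xFF, value & 0xFF])

def write_loop(resource=b"USB-8451", period_s=1.0):
    dll = ctypes.windll.LoadLibrary("Ni845x.dll")     # assumed default install
    dev = ctypes.c_uint64()                           # pointer-sized NiHandle assumed
    cfg = ctypes.c_uint64()
    dll.ni845xOpen(resource, ctypes.byref(dev))
    dll.ni845xI2cConfigurationOpen(ctypes.byref(cfg))
    dll.ni845xI2cConfigurationSetAddress(cfg, 0x50)   # placeholder 7-bit address
    data = build_payload(0x01, 0xAB)                  # placeholder register/value
    buf = (ctypes.c_uint8 * len(data)).from_buffer_copy(data)
    try:
        while True:
            dll.ni845xI2cWrite(dev, cfg, len(data), buf)
            time.sleep(period_s)
    finally:
        dll.ni845xI2cConfigurationClose(cfg)
        dll.ni845xClose(dev)

if __name__ == "__main__":
    write_loop()
```

This still only runs on the Windows PC, of course; it just removes the LabVIEW dependency from the "temporary" box.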
  7. Hi, I have a problem here. Hope I can get some help for a solution. The task at hand is to use an NI USB-8451 to communicate with a device over I2C. This would have been trivial if it wasn’t for the fact that the USB-8451 is connected to a PXI controller running LabVIEW Real-Time OS (Phar Lap ETS) and not a Windows PC. I haven’t chosen the hardware combination personally and the person who did just assumed they would work together. And I did too… I mean, it's NI's ecosystem! Just download the NI-845x Driver Software from NI’s webpage, install it on my PC and then install whatever software is needed on the controller through MAX. Easy, right? But no. It wasn't that easy. So I wonder: (1) Has anyone had any success controlling the NI USB-8451 from anything but a Windows PC? And if so… how?? The PXI system finds the USB-8451 and lists it as a USB VISA raw device, but the supplied VIs/drivers cannot be executed on the controller. (2) Since what I want to do is actually very little (just writing a couple of bytes once to the I2C device) I have started toying with the idea of recording the USB traffic from the Windows laptop somehow via Wireshark and then replaying it on the PXI system… has anyone had any success doing something like that? (3) And finally: if it is impossible to get the USB-8451 to work with my PXI system, is there some other I2C hardware that is known to work? I’m getting the impression that NI only has this USB device, apart from implementing the I2C protocol on a dedicated FPGA, and that seems a little too much. Sorry for the long post. Getting a little desperate here. Deadline approaching. I'm running LabVIEW 2019 SP1 by the way and the controller is a PXIe-8821 if that makes any difference. BR, E
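To make idea (2) concrete: since the controller sees the 8451 as a USB VISA raw resource, the replay would amount to converting a Wireshark capture into bytes and pushing it to the device with a raw VISA write. A minimal sketch of that shape in Python is below; the resource string and hex dump are made up, and whether the 8451 firmware accepts replayed raw writes at all is exactly the open question.

```python
# Hypothetical sketch of the record-and-replay idea: turn a captured hex dump
# into bytes and send it to the device as a raw VISA write.
# The resource string and dump below are placeholders (0x3923 is NI's USB
# vendor ID); this only illustrates the mechanics, not a verified solution.

def hexdump_to_bytes(dump):
    """Pure helper: 'aa bb 01' (Wireshark 'copy as hex stream' style) -> bytes."""
    return bytes(int(tok, 16) for tok in dump.split())

def replay(dump, resource="USB0::0x3923::0x7166::01234567::RAW"):
    import pyvisa  # third-party; not available on the RT target itself
    rm = pyvisa.ResourceManager()
    dev = rm.open_resource(resource)
    dev.write_raw(hexdump_to_bytes(dump))
    dev.close()

if __name__ == "__main__":
    replay("02 00 10 ab")  # placeholder capture, not a real 8451 transaction
```

On the RT target itself the equivalent would be VISA Open plus VISA Write on the raw resource; the hard part is that the 845x protocol between driver and firmware is undocumented, so the captured bytes may include session setup that is difficult to reproduce.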
  8. codcoder

    Dear NI

    Couldn't agree more. I also believe NI really needs to push LabVIEW more aggressively to the maker community. This is a group of people who adore things that are "free", "open", "lite", and today LabVIEW simply doesn't make the cut. Of course the coolness factor will still be an issue (a night/dark mode could probably help?) but surely there must be a way to position LabVIEW as a great tool to help creative geniuses focus less on grit and more on, well, actually creating stuff. I work in the aerospace and defense industry. With hardware. But we are fewer and fewer who do. The majority of the engineers now are software engineers who either work purely with software or come from that background. Explaining to them why NI's offering makes sense is extremely difficult. Of course, for now, what we do is relatively complicated and closely tied to the performance of the hardware. So using NI's locked-in ecosystem saves us time and money. But NI's pyramid is crumbling from the base. At other companies I've seen well-made LabVIEW applications ripped out and replaced with "anything" written in Python. Of course it didn't work any better (quite the opposite), but it was Python and not LabVIEW. And that single argument was strong enough.
  9. Extremely bold. I work in a conservative organisation that uses NI's offerings a lot and I know many colleagues who will not take this lightly. And it will probably bug me a little that any future NI hardware we purchase will not visually match the current generation. But subjectively I like it. Feels contemporary. But will it stand the test of time?
  10. I like the side discussion here. About the future of LabVIEW. I've been in the field for close to seven years now and I constantly ponder whether basing my career on LabVIEW is a wise decision. Because I mean, I'm no junior anymore, and just as much by chance as by choice LabVIEW has become my speciality now, and changing track gets harder with each year. Last year I attended my first NI Week and judging from NI's marketing it was clear, at least to me, that they have moved away from the use case of an engineer/scientist with a benchtop instrument controlled by a Windows PC with LabVIEW. Measuring a voltage or communicating with an IC or something like that. And I can understand why. That is a solved problem and you can just as easily do it with Python/Arduino/Raspi (everybody knows at least one scripting language these days). A lot more focus was on (physically) big system-wide solutions, mostly for testing within the semiconductor industry and radio/5G. And I guess that makes sense as well. These are areas where hardware matters and the price point is still quite high. Perhaps their vision is that you will buy a full solution from NI in the future and only use the GUI (LabVIEW) for some small tweaks? So where does that leave full-fledged LabVIEW developers? I don't know. As career advice I wouldn't recommend anyone who wants to be a pure software developer to go for LabVIEW. But I honestly believe using a high-level tool like LabVIEW has its benefits, like allowing you to be a domain expert in the field you are operating in while also being able to develop the stuff without having to focus too much on grit. And I hope having that combination will still make me employable.
  11. This is simply amazing! Speedy answer, working solution. Thanks enserge! To be clear, for anyone with the same problem, it is the two steps YScale.ScaleFit=0 and YScale.ScaleFit=2 that do the trick (I guess you force LabVIEW to internally update the chart or something). Will you report this to NI, enserge?
  12. Hi guys, I'm trying to programmatically update the names of five plots in a digital waveform graph. The plots are in a particular order which corresponds neatly with the order in the legend as well as the order under the "Plots" tab in the properties window. However, when I try to update the Plot.Name property for each plot by stepping through the ActPlot property, the order seems to be 1-2-3-4-0, i.e. when I try to change the name of ActPlot=0 it affects the last plot instead of the first. What gives? Is there any other indexing property instead of ActPlot I can use which LabVIEW uses to keep track of the plot order? I'm just updating an old VI so it would be nice if it could be solved without altering the waveform used as an input to the graph.
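If the off-by-one observation above holds consistently (the first plot sits at ActPlot 1 and the last at ActPlot 0), one workaround is simply to remap the loop index before writing Plot.Name: for the plot at legend position j, set ActPlot to (j + 1) mod n. In LabVIEW that is one Quotient & Remainder node on the loop counter; the mapping itself, modelled in Python for clarity (this models the behaviour reported in the post, not documented LabVIEW behaviour):

```python
# Index remapping implied by the observed 1-2-3-4-0 ordering:
# to address the plot at legend position j (0-based), use ActPlot (j + 1) mod n.

def actplot_for_legend_index(j, n_plots=5):
    """Map a 0-based legend position to the ActPlot value that addresses it."""
    return (j + 1) % n_plots

# Legend positions 0..4 map to ActPlot values 1, 2, 3, 4, 0, matching the
# observation that ActPlot=0 lands on the last plot.
```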
  13. You've probably already decided which system to go with, but I would recommend the cRIO over the PXI any day of the week. The cRIO is rugged, very cheap (compared to PXI), is more modern, has no irritating fan and is far more flexible with the built-in FPGA capabilities. The PXI has some modules with mechanical relays (the cRIO only has one) and a lot of modules for radio applications (but I find them somewhat archaic, and with the current progress in SDR and USRP we'll hopefully see some sort of useful radio module soon). So if you don't have a specific requirement only the PXI can fulfil, I honestly can't see any pros with it.
  14. Good news everyone! Couldn’t find the SSD but I hooked up an old hard drive to the sbRIO with a SATA-to-USB adapter. Ran some benchmarking beforehand, which revealed it to be around 10x faster than my USB flash drive. And sure enough, the sbRIO had no problem storing data to the hard drive while simultaneously streaming the data over Ethernet to my PC. The next step is to find a suitable storage medium, but I now know that the sbRIO itself is capable of handling my application. Thanks for the help guys! Cheers, CC
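For anyone wanting to repeat the "benchmarking beforehand" step on their own target: the idea is just to write a fixed number of sector-aligned chunks, time it, and convert to Mbit/s. A minimal sketch (the path, chunk size and chunk count are placeholders to adapt to your medium):

```python
# Minimal sequential-write benchmark: write sector-sized chunks, time it,
# report throughput in Mbit/s. Path and sizes are placeholders.
import os
import time

def throughput_mbit(n_bytes, seconds):
    """Pure helper: bytes written in `seconds` -> megabits per second."""
    return n_bytes * 8 / seconds / 1e6

def bench_write(path, chunk_size=4096, n_chunks=2560):
    """Write n_chunks chunks of chunk_size bytes and return Mbit/s achieved."""
    chunk = os.urandom(chunk_size)
    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(n_chunks):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually reached the medium
    elapsed = time.perf_counter() - t0
    return throughput_mbit(chunk_size * n_chunks, elapsed)

if __name__ == "__main__":
    # Point at a file on the USB drive vs. the internal disk and compare.
    print(f"{bench_write('bench.bin'):.1f} Mbit/s")
```

The fsync matters: without it you mostly measure how fast the OS page cache absorbs writes, not the drive.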
  15. Thanks for the input so far! For obvious reasons (as stated) I will never reach the theoretical maximum of 480 Mbit/s, and nor do I need to, but still 0.2% of that is "amazingly" poor. I use the FPGA on the sbRIO to store the bit stream in a DMA FIFO consisting of uint8 elements. On the RT processor I have two timed loops: one that pops a fixed number of elements from the DMA FIFO and pushes them as a byte array element on a queue, and another loop that simply pops an element (a byte array) at a time and stores it on the USB flash drive using the standard write-to-binary-file block (with byte order set to native). I’ve set the number of elements I read from the DMA FIFO in each iteration as a multiple (currently x8) of the sector size of my storage medium. I’ve tried replacing the USB storage with both storing on the internal memory of the sbRIO (thanks for the tip smithd!) and sending the data to my PC as a network stream. Both these variations have worked successfully with no loss in data, i.e. the sbRIO can send data fast enough both over Ethernet and to its internal memory, but the system design requires the data to be stored locally (and the internal memory is too small for my application). I’ve also tried a couple of different USB sticks I have lying around with no better result, but I’m gonna get my hands on an external SSD later this afternoon. I’ll let you know if that works. Cheers, CC
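For readers less familiar with the RT-side pattern described above (one loop pulling fixed-size blocks off the acquisition FIFO, a second loop draining a queue to disk), its shape in a text language would be roughly the following producer-consumer sketch. The zero-filled blocks stand in for the DMA-FIFO reads, and the 512-byte sector size is an assumption:

```python
# Rough text-language model of the two-loop RT pattern: a producer pushes
# fixed-size byte blocks onto a queue, a consumer drains the queue and
# appends each block to a binary file. Block size stays a multiple of the
# assumed 512-byte sector size, matching the "x8 sector size" choice.
import os
import queue
import threading

SECTOR = 512
BLOCK = SECTOR * 8          # one queue element, like one DMA-FIFO read
SENTINEL = None             # tells the consumer to stop

def producer(q, n_blocks):
    # Stand-in for the DMA-FIFO read loop: here just zero-filled blocks.
    for _ in range(n_blocks):
        q.put(bytes(BLOCK))
    q.put(SENTINEL)

def consumer(q, path):
    with open(path, "wb", buffering=0) as f:   # unbuffered, like the RT VI
        while (block := q.get()) is not SENTINEL:
            f.write(block)

def run(path, n_blocks=100):
    q = queue.Queue(maxsize=64)   # bounded, so a slow disk back-pressures
    t = threading.Thread(target=consumer, args=(q, path))
    t.start()
    producer(q, n_blocks)
    t.join()
    return os.path.getsize(path)
```

The bounded queue is the interesting design choice: if the disk stalls, the producer blocks instead of the queue growing without limit, which is also the failure mode to watch for in the LabVIEW version.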
  16. Hi guys, Long time reader, first time writer. I'm currently trying to develop a data gathering system using an NI sbRIO-9606. The premise is quite straightforward: I have a digital signal (data and clock) with a bitrate of 10 Mbit/s which I want to store to a USB flash drive in real time. The single-board RIO has no problem reading and processing the bitstream, but storing the data to the USB flash drive is apparently not as easy as I thought. According to the manual the USB port supports a transfer speed of up to 480 Mbit/s (USB 2.0 standard?) but I've been unable to achieve any transfer speed faster than around 1 Mbit/s (1.3 Mbit/s worst-case). This is unacceptable for my application! I've tried disabling buffering for the Open/Create File function and only storing files in multiples of the sector size, but it does not seem to help. Are there any preferable USB flash drives one should use? Or am I unaware of some "high speed file streaming" options in LabVIEW? Sincerely, CC
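On the "multiples of the sector size" point above: the usual reasoning is that once buffering is disabled, a write whose length is not a whole number of sectors forces the filesystem onto a slow read-modify-write path, so every write should be rounded up to a sector boundary. The rounding itself is a one-liner, sketched here with an assumed 512-byte sector:

```python
# Pure helper for sector-aligned writes: round a byte count up to the next
# multiple of the sector size (512 bytes assumed; check your medium).

def round_up_to_sector(n, sector=512):
    """Smallest multiple of `sector` that is >= n."""
    return -(-n // sector) * sector
```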