Posts posted by mcduff

  1. I don't have extensive experience with PXIe systems, but I have built a few, past and present. A current system collects data continuously at 20-30 MSa/s across 32 channels; at that rate the 15 TB RAID array fills up in a few hours.

    Advantages:

    1. Throughput. Unless your modular instruments are attached via Thunderbolt, PXIe throughput is hard to beat. However, it may not be needed in your case.
    2. Triggering. Simple to implement triggering or advanced triggering through the backplane. Can be done with modular instruments also, but more wires and more hassles. If you need a synchronized start, the PXI backplane is your friend.
    3. Synchronization. Can share reference/sample clocks through the backplane. Can be done with modular instruments also, but more wires and more hassles.
    4. Compact. Somewhat more compact than modular instruments.

    Disadvantages:

    1. Expensive. As noted in the previous message.
    2. Support. If you have instruments from different vendors and there is a problem, each vendor may blame the other. I had a chassis, a controller, and digitizers from three different companies. When there was an issue with the cards and the slots they could occupy, everyone at first blamed the others. Eventually, it was found that the chassis had an issue with the interrupts. PXI is supposed to be standardized, but ...
    3. Future proofing. The embedded PXIe controllers always seem to be a generation or two behind current CPU offerings. In addition, their components are difficult to upgrade or have limited upgrade options. You may also want to purchase an external Thunderbolt controller card; this lets you attach the chassis to a computer via the Thunderbolt port and control it from that computer instead of using an embedded controller.
  2. I don't work in industry; I work in an R&D facility, but I am slightly pessimistic. My gut feeling is that Python will take over sometime in the near future. I have seen it before: when I first joined my group about 15 years ago, all of the analysis was done using MATLAB; now everyone uses Python. NI seems to be pushing solutions such as FlexLogger, InstrumentStudio, etc., instead of LabVIEW. (Interestingly, those solutions look like they were built with NXG. :) ) On the plus side, we recently had a presentation by an NI rep who detailed plans for new DAQ equipment that was/is in development. They were looking for feedback. The future is interesting.

  3. On 7/6/2024 at 7:29 AM, drjdpowell said:

    I think your second point is wrong; VIs without the front panel loaded don't use any resources.  EXEs don't even include the code for those front panels.  

     

    9 hours ago, ShaunR said:

    This is simply not true and is a fundamental misunderstanding of how exe's are compiled.

    Excuse my ignorance and stupidity, but I never really understood the following settings in the Compile Settings; I always leave them at their default values. Do those settings remove both the front panel and block diagram from VIs in the EXE? I know the settings may affect debugging an EXE file and possibly some of the tools that reconstruct VIs from an EXE, but should I be checking or unchecking those options when building an EXE? Thanks

    [Attached screenshot: the compile settings options in question]

  4. On 3/5/2024 at 5:40 AM, dadreamer said:

    There's an obscure "Run At Any Loop" option, activated with this ini token:

    showRunAtAnyLoopMenuItem=True

    Does this token need to be added to the INI file of a compiled EXE, or does it get included automatically?

  5. Quote

    FYI, if you zoom in, the Excel export will only show that data, and it's very strange that it is using 1k and 10k instead of 1000 or 10000

    This is why "Export to Excel" is a useless feature. The data is exported using the display format of the plot's axes; since the plot uses SI notation, the values are expressed as 10k, 100k, etc. This does not help if someone wants to plot the data in Excel.

  6. Before you convert to a string, why not take the string from the Read function, convert it to a byte array, split it into the fixed lengths you want, and then convert each piece back to a string? This can be done in a For Loop: keep an array of lengths and use it to split the data.
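
    A minimal sketch of that split-by-lengths idea, written in Python just to show the logic (the data and field lengths here are made up for illustration):

    # Split the raw bytes from a read into fixed-length fields,
    # then convert each field back to a string.
    raw = b"ABCD1234EFGH"      # stand-in for the string returned by the Read function
    lengths = [4, 4, 4]        # hypothetical field widths; use your real ones

    fields = []
    offset = 0
    for n in lengths:          # same idea as a LabVIEW For Loop over the lengths array
        fields.append(raw[offset:offset + n].decode("ascii"))
        offset += n

    print(fields)              # ['ABCD', '1234', 'EFGH']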

  7. To make things really efficient, echoing what was said earlier:

    1. Read about 100 ms of data at a time (1/10 of the sample rate), but round the number of samples to the nearest even multiple of the disk sector size. The sector size is typically 512 B, so, for example, at 100 kSa/s you would read 10240 samples instead of 10000.
    2. Set the buffer size to an even multiple of (1) that is at least 4 - 8 s worth of data.
    3. Use the built-in logging in DAQmx to save the data; don't roll your own with a queue as in your example.
    4. Get rid of the unbounded array in your example; it keeps growing.

    I have not taken data non-stop for years, but using 1 and 2 above I have acquired 8 channels at 2 MSa/s each, continuously for 2 weeks, using a laptop and a USB-6366.
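
    Roughly, points 1-3 above translate to the following with the nidaqmx Python package (just a sketch; the device name, rate, and file name are assumptions, and the same settings exist as DAQmx properties/VIs in LabVIEW):

    # Continuous acquisition with DAQmx doing the logging itself (point 3).
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, LoggingMode, LoggingOperation

    RATE = 2_000_000                                 # Sa/s per channel (assumed)
    SAMPLES_PER_READ = (RATE // 10 // 512) * 512     # ~100 ms, rounded to a multiple of 512 (point 1)
    BUFFER_SIZE = SAMPLES_PER_READ * 80              # multiple of the read size, roughly 8 s of data (point 2)

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")   # assumed device/channels
        task.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS,
                                        samps_per_chan=BUFFER_SIZE)
        task.in_stream.input_buf_size = BUFFER_SIZE
        task.in_stream.configure_logging("data.tdms", LoggingMode.LOG_AND_READ,
                                         operation=LoggingOperation.CREATE_OR_REPLACE)
        while True:
            # Reads keep the buffer drained; DAQmx streams to the TDMS file on its own.
            task.read(number_of_samples_per_channel=SAMPLES_PER_READ)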

  8. 2 hours ago, ShaunR said:

    Interesting. I assumed you used the TEK one here.

    It looks very much like the whole API is a "work in progress" as many functions are not supported and...

    Do they distribute the DLL source code as part of an SDK?

    Their "LabVIEW distribution" is nothing more than using the DLL import function for the RSA.DLL.

    I have not looked at the other sources on the GitHub page; not sure if the source code for the DLL is included on that site or not. I have only downloaded the DLLs.

  9. TEK did not make a LabVIEW API; I used the DLL Import Wizard along with the RSA DLL. Not sure what API MATLAB is using, but below is a screenshot from the programming manual; there is no device ID for a disconnect. :( These are the functions exposed in their DLL.

    This spectrum analyzer has its own DLL, and it can continuously stream data at up to 56 MB/s; as ShaunR said, this type of streaming is not well suited for a COM port.

    This device is not using SCPI commands with the provided DLL. However, it can use SCPI commands if you install their software (SignalVu) and create a virtual VISA port. But if you do that, then there is no high-speed streaming, which is needed for this application.

    [Attached screenshot: functions exposed by the RSA DLL, from the programming manual]

  10. Yes, it has both of those functions and a device disconnect also. 

    The Device Connect call specifies a particular instrument. All other calls have no specifiers, even Device Disconnect. The API kind of stinks. My guess is that you specify an instrument at the beginning and that is it; everything else defaults to that instrument.

    EDIT: These functions are using the RSA.DLL

    [Attached screenshot: the relevant functions in RSA.DLL]

  11. This is a somewhat out-of-the-box idea, and I want to bounce it off people to see if there are any issues. (It would take some time to test, so I want to check whether anyone has done anything similar.)

    I am trying to write a program that controls a TEK RSA507A. It has a DLL API that can interface with LabVIEW.

    The main problem is that the DLL does not support multiple instruments running at the same time; that is, there is no handle/address analogous to a VISA address.

    On this site it says "In order to communicate with multiple RSA’s simultaneously you will need to call multiple instances of the RSA API. The API will need to be called for each RSA. "

    I assume that means if I name the DLL differently for each instance, then I can call multiple instruments.

    Idea:

    1. When the EXE opens, detect the number of instruments, then copy and rename RSA.DLL into a temporary folder, e.g., RSA_1.dll, etc.
    2. Use the temporary DLLs to call functions for each instrument.
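
    Something like this Python/ctypes sketch is what I have in mind, just to illustrate the mechanics (the source path and temp-folder handling are assumptions):

    # Copy-and-rename the DLL so each instrument gets its own loaded module.
    import ctypes, os, shutil, tempfile

    def load_rsa_instance(index, source_dll=r"C:\path\to\RSA.DLL"):  # hypothetical path
        # Windows returns the same module handle for the same file path,
        # so each instrument needs its own renamed copy of the file.
        tmp_dir = tempfile.mkdtemp(prefix="rsa_api_")
        dll_copy = os.path.join(tmp_dir, "RSA_%d.dll" % index)
        shutil.copy2(source_dll, dll_copy)
        return ctypes.CDLL(dll_copy)

    rsa_1 = load_rsa_instance(1)   # one loaded copy per detected instrument
    rsa_2 = load_rsa_instance(2)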

    Possible Problems:

    1. These DLLs call other DLLs; I do not know whether there will be any data corruption when that happens, since I cannot rename those other DLLs.
    2. Will Windows let me copy and use a new DLL? Is that some kind of security problem?

    Am I missing anything else?

    Cross posted here

  12. On 10/18/2022 at 4:10 PM, MikaelH said:

    FYI, I've found the WebView2 WebBrowser integration into LabVIEW + a fancy DataGrid JavaScript library is a much more flexible approach.
    We've posted the LV-64 bit supported WebView2 support here: https://github.com/ANSCenter/LcWebView2
    I use http://tabulator.info/  to make nice DataGrids in LabVIEW.

     

    Do you have any advice for installing WebView2? I have gotten it to work on one computer but not on another with what I think is the exact same installation. Thanks
