Everything posted by mcduff
-
I don't have extensive experience with PXIe systems, but I have built a few in the past and present. A current system collects data continuously at 20-30 MSa/s on 32 channels; at that rate a 15 TB RAID array fills up in a few hours.

Advantages:
- Throughput. Unless your modular instruments are attached via Thunderbolt, PXIe throughput is hard to beat. However, it might not be needed in your case.
- Triggering. Simple or advanced triggering is easy to implement through the backplane. It can be done with modular instruments too, but with more wires and more hassle. If you need a synchronized start, the PXI backplane is your friend.
- Synchronization. Reference/sample clocks can be shared through the backplane. Again, possible with modular instruments, but with more wires and more hassle.
- Compact. Somewhat more compact than modular instruments.

Disadvantages:
- Expensive. Noted in the previous message.
- Support. If you have instruments from different vendors and there is a problem, each vendor may blame the other. I had a chassis, a controller, and digitizers from three different companies. When there was an issue with which slots the cards could occupy, everyone at first blamed the others. Eventually it turned out the chassis had an issue with interrupts. PXI is supposed to be standardized, but ...
- Future proofing. The embedded PXIe controllers always seem to be a generation or two behind current CPU offerings, and their components are difficult or impossible to upgrade.

You may also want to purchase an external Thunderbolt controller card. This lets you attach the chassis to a computer via the Thunderbolt port and control it from that computer instead of using an embedded controller.
-
I don't work in industry, I work in an R&D facility, but I am slightly pessimistic. My gut feeling is that Python will take over sometime in the near future. I have seen it before: when I first joined my group about 15 years ago, all of the analysis was done using Matlab; now everyone uses Python. NI seems to be pushing solutions such as FlexLogger, InstrumentStudio, etc., instead of LabVIEW. (Interestingly, those solutions look like they were built with NXG.) On the plus side, we recently had a presentation by an NI rep who detailed plans for new DAQ equipment that is in development. They were looking for feedback. The future is interesting.
-
Excuse my ignorance and stupidity, but I never really understood the following settings in the Compile Settings; I always leave them at their default values. Do those settings remove both the front panel and the block diagram from VIs in the EXE? I know the settings may affect debugging an EXE and possibly some of the tools that reconstruct VIs from an EXE, but should I be checking or unchecking those options when building an EXE? Thanks
-
You may not be able to specify the channels in any order you want. If I recall correctly, for some DAQs you can only specify them in ascending order. Not sure whether this holds true for simulated instruments.
-
Std Deviation and Variance.vi outputs erroneous value in corner cases
mcduff replied to X___'s topic in LabVIEW Bugs
I do not think it is a bug, just floating-point math; you don't have infinite precision. See below for another example.
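Not the original poster's VI, but a small Python sketch of the same class of problem: a naive single-pass variance formula loses precision when the mean is large compared with the spread of the data.

```python
import numpy as np

# Data with a large offset and a tiny spread: the variance should be ~0.667.
x = 1e8 + np.array([0.0, 1.0, 2.0])

# Naive "sum of squares" formula: E[x^2] - E[x]^2.
# Two nearly equal large numbers are subtracted (catastrophic cancellation).
naive_var = np.mean(x**2) - np.mean(x)**2

# Two-pass formula: subtract the mean first, then square.
two_pass_var = np.mean((x - np.mean(x))**2)

print(naive_var)     # prints an inaccurate value such as 0.0 or 2.0
print(two_pass_var)  # prints 0.6666666666666666
```
-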
Does this token need to be added to the INI file of a compiled exe or does it automatically get included?
-
The curve shown in the XY graph is inconsistent with the actual data
mcduff replied to ChinaOliver's topic in LabVIEW General
This is why "Export to Excel" is a useless feature. The data is exported using the display format of the plot's axes; since the plot uses SI formatting, it is expressed as 10k, 100k, etc. This does not help if someone wants to plot the data in Excel. -
Including solicitation of interest from potential acquirers
mcduff replied to gleichman's topic in LAVA Lounge
Good Read here, a bit depressing https://nihistory.com/nis-commitment-to-labview/ -
The LabVIEW Multicore Analysis and Sparse Matrix Toolkit is definitely faster, but it seems to have issues with large numbers of points. For example, on my laptop I cannot do an FFT of 20M points with it, yet the built-in FFT handles 100M points with no problem.
-
Read and copy the file in chunks; there is no need to load the whole file at once. To increase speed, write in multiples of the disk sector size. A rough sketch of the idea is below.
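A minimal Python sketch of that pattern (the chunk size and file names are placeholders; the original advice is for LabVIEW, this is only to show the idea):

```python
# Copy a file in fixed-size chunks instead of reading it all into memory.
# The chunk size is a multiple of a typical 512-byte disk sector (here 1 MiB).
SECTOR_SIZE = 512
CHUNK_SIZE = SECTOR_SIZE * 2048  # 1 MiB per read/write

def copy_in_chunks(src_path: str, dst_path: str) -> None:
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:          # end of file
                break
            dst.write(chunk)

# Example usage (hypothetical paths):
# copy_in_chunks("big_data.bin", "big_data_copy.bin")
```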
-
Including solicitation of interest from potential acquirers
mcduff replied to gleichman's topic in LAVA Lounge
Is this number greater than the number of COBOL programmers? It seems like it's getting close. -
Before you convert to a string, why not take the string from the Read function, convert it to a byte array, split that into the fixed lengths you want, and then convert each piece back to a string? This can be done in a For Loop: keep an array of lengths and use it to split the byte array. A sketch of the splitting step is below.
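The same splitting logic as a Python sketch (the lengths and input data are made up; in LabVIEW this would be a For Loop with a shift register holding the running offset):

```python
def split_by_lengths(data: bytes, lengths: list[int]) -> list[str]:
    """Split a byte string into pieces of the given fixed lengths."""
    pieces = []
    offset = 0
    for n in lengths:                       # one iteration per field, like a For Loop
        pieces.append(data[offset:offset + n].decode("ascii"))
        offset += n                         # shift-register-style running offset
    return pieces

# Example with made-up field lengths:
print(split_by_lengths(b"AB123XYZW", [2, 3, 4]))  # ['AB', '123', 'XYZW']
```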
-
You should link this to the original discussion on the darkside: https://forums.ni.com/t5/LabVIEW/Help-issues-with-arrays/m-p/4299181 There you marked the VI by Altenbach that you posted here as the solution to your problem. What array do you want to extend? Your question isn't clear to me. Maybe some of the learning resources at NI would help?
-
To make things really efficient, echoing what was said earlier:
1. Read about 100 ms of data at a time (1/10 of the sample rate), except set the number of samples to the closest even multiple of the disk sector size. The sector size is typically 512 B, so in that case set the number of samples to 10240.
2. Set the buffer size to an even multiple of (1) that holds at least 4-8 s worth of data.
3. Use the built-in logging feature in DAQmx to save the data; don't roll your own with a queue as in your example.
4. Get rid of the unbounded array in your example; it keeps growing.
I have not taken data non-stop for years, but using 1 & 2 above I have taken data continuously for 2 weeks, 8 channels at 2 MSa/s each, using a laptop and a USB-6366. A rough sketch of the configuration is below.
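For what it's worth, here is roughly what that configuration looks like with the nidaqmx Python API (device name, channel list, file path, and the loop count are placeholders; in LabVIEW the equivalent of the logging step is DAQmx Configure Logging.vi):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, LoggingMode, LoggingOperation

SAMPLE_RATE = 100_000               # 100 kSa/s per channel
SAMPS_PER_READ = 10_240             # ~100 ms of data, rounded to a multiple of 512
BUFFER_SIZE = SAMPS_PER_READ * 50   # several seconds of buffer, a multiple of the read size

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")        # placeholder channels
    task.timing.cfg_samp_clk_timing(
        SAMPLE_RATE,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=BUFFER_SIZE,
    )
    # Built-in TDMS logging: DAQmx streams the data to disk itself.
    task.in_stream.configure_logging(
        "stream.tdms",                                         # placeholder file path
        logging_mode=LoggingMode.LOG_AND_READ,
        operation=LoggingOperation.CREATE_OR_REPLACE,
    )
    task.start()
    for _ in range(100):                                       # replace with your own stop condition
        data = task.read(number_of_samples_per_channel=SAMPS_PER_READ)
```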
-
Their "LabVIEW distribution" is nothing more than using the DLL import function for the RSA.DLL. I have not looked at the other sources on the GitHub page; not sure if the source code for the DLL is included on that site or not. I have only downloaded the DLLs.
-
TEK did not make a LabVIEW API. I used the DLL Import Wizard along with the RSA DLL. Not sure what API Matlab is using, but below is a screenshot from the programming manual; there is no device ID for the disconnect. These are the functions exposed in their DLL. This spectrum analyzer has its own DLL, and it can continuously stream data at up to 56 MB/s; as ShaunR said, this type of streaming is not well suited for a COM port. The device does not use SCPI commands with their provided DLL. However, it can use SCPI commands if you install their software (Signal Vu) and create a virtual VISA port. But if you do that, there is no high-speed streaming, which is needed for this application.
-
Yes, it has both of those functions and a device disconnect as well. The Device Connect call specifies a particular instrument; all the other calls have no specifiers, even Device Disconnect. The API kind of stinks. My guess is you specify an instrument at the beginning and that is it; everything else defaults to that instrument. EDIT: These functions are using the RSA.DLL
-
This is a somewhat out-of-the-box idea and I want to bounce it off people to see if there are any issues. (It would take some time to test, so I want to check whether anyone has done anything similar.) I am trying to write a program that controls a TEK RSA507A. It has a DLL API that can interface with LabVIEW. The main problem is that the DLL does not support multiple instruments running at the same time; that is, there is no handle/address analogous to a VISA address. On this site it says "In order to communicate with multiple RSA’s simultaneously you will need to call multiple instances of the RSA API. The API will need to be called for each RSA." I assume that means if I name the DLL differently for each instance, then I can call multiple instruments.
Idea: When the EXE opens, detect the number of instruments, then copy and rename RSA.DLL into a temporary folder, e.g., RSA_1.dll, RSA_2.dll, etc. Use the temporary DLLs to call functions for each instrument. A sketch of the concept is below.
Possible problems:
- These DLLs call other DLLs; I don't know if there will be any data corruption when that happens. I cannot rename the other DLLs.
- Will Windows let me copy and use a new DLL? Is that some kind of security problem?
- Am I missing anything else?
Cross posted here
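For what it's worth, the renamed-copy trick can be prototyped quickly outside LabVIEW with ctypes; Windows loads each differently named file as a separate module with its own global state. The DLL path and the DEVICE_Connect call in the last comment are assumptions for illustration, not the vendor's verified API:

```python
import ctypes
import shutil
import tempfile
from pathlib import Path

RSA_DLL = Path(r"C:\Tektronix\RSA_API\lib\x64\RSA_API.dll")  # placeholder path
NUM_INSTRUMENTS = 2

# One renamed copy of the DLL per instrument; each copy is loaded as its own module.
tmp_dir = Path(tempfile.mkdtemp())
dlls = []
for i in range(NUM_INSTRUMENTS):
    copy_path = tmp_dir / f"RSA_{i + 1}.dll"
    shutil.copy(RSA_DLL, copy_path)
    # Note: the dependent DLLs must still be findable on the search path,
    # which is exactly one of the concerns raised above.
    dlls.append(ctypes.CDLL(str(copy_path)))

# Each entry in `dlls` could now be pointed at a different instrument,
# e.g. dlls[0].DEVICE_Connect(0) and dlls[1].DEVICE_Connect(1)
# (function name/signature assumed, not verified here).
```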
-
Full DataGridView for LabVIEW - OPEN SOURCE project underway
mcduff replied to Mike King's topic in User Interface
Do you have any advice for installing WebView2? I have gotten it to work on one computer but not another, with what I think is the exact same installation. Thanks -
Event on Colour Change while widget is open, is this possible?
mcduff replied to Neil Pate's topic in User Interface
Here's a start to a speedier solution, both at startup and after selecting a new color. ColorPicker_GUI_DrawColorRectangle_mcduff_MODS.vi -
@dadreamer Thanks for all of your help, will have to try it out later today when I have the instrument.
-
Sorry for my haste when making the diagram; I made some typos. So please excuse my stupidity and ignorance, but here it goes. The IQSTRMIQINFO structure contains the following content:
IQSTRMIQINFO(Structure): [('timestamp', c_uint64), ('triggerCount', c_int), ('triggerIndices', POINTER(c_int)), ('scaleFactor', c_double), ('acqStatus', c_uint32)]
So I should set up my cluster as shown below and use DSNewPtr and MoveBlock if I want to read the data. If I have the function in a tight loop, can I make the pointer outside the loop, reuse it on every iteration, and dispose of it after I exit the loop? Thanks
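Just to illustrate the allocate-once pattern being asked about (in ctypes terms rather than the LabVIEW memory-manager calls; the 100-element trigger-index array matches the size mentioned in the post below, and the loop body is a stand-in for the real DLL call):

```python
from ctypes import c_int, memmove, sizeof

N_TRIGGER_INDICES = 100          # size of the triggerIndices array per the post below

# Allocate the destination buffer once, outside the loop.
trigger_indices = (c_int * N_TRIGGER_INDICES)()

for i in range(10):              # stand-in for the tight acquisition loop
    # Pretend the DLL filled a source buffer on this iteration;
    # memmove plays the role that MoveBlock plays in LabVIEW.
    source = (c_int * N_TRIGGER_INDICES)(*range(i, i + N_TRIGGER_INDICES))
    memmove(trigger_indices, source, sizeof(source))

# One buffer, many iterations; with DSNewPtr/MoveBlock the equivalent is a
# single DSNewPtr before the loop and a single DSDisposePtr after it.
```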
-
@OP Sorry, don't mean to hijack this thread but... @dadreamer A few more quick questions. I am trying to interface with a TEK RSA using their RSA API. There is a function that streams the data from the instrument to the client, called IQSTREAM_GetIQData. The function prototype looks like this:
ReturnStatus IQSTREAM_GetIQData(void* iqdata, int* iqlen, IQSTRMIQINFO* iqinfo)
iqdata is an array of data, and I think I can pass it using the Array Data Pointer setting in the Call Library Function node; iqlen is just the length of the array. I believe the problem I am having is the IQSTRMIQINFO structure. The structure looks like this:
IQSTRMIQINFO(Structure): [('timestamp', c_uint64), ('triggerCount', c_int), ('triggerIndices', POINTER(c_int)), ('scaleFactor', c_double), ('acqStatus', c_uint32)]
There is a pointer to an array of 100 elements in the structure. If I am understanding you correctly, I should use the pointer from DSNewPtr in the cluster instead of an array. (See example below.) In the Call Library Function node, what should the data format be: Handles by Value, Pointers to Handles, Array Data Pointer, or Interface to Data? Thanks again for all of your help.
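For reference, here is how that prototype and structure map onto a ctypes definition in Python. The struct fields are copied verbatim from the post; the DLL path, sample count, and the assumption of interleaved float I/Q data are placeholders for illustration and should be checked against the vendor header:

```python
from ctypes import (CDLL, POINTER, Structure, byref, cast,
                    c_double, c_float, c_int, c_uint32, c_uint64)

class IQSTRMIQINFO(Structure):
    # Fields taken from the post above; verify order/packing against the C header.
    _fields_ = [('timestamp', c_uint64),
                ('triggerCount', c_int),
                ('triggerIndices', POINTER(c_int)),
                ('scaleFactor', c_double),
                ('acqStatus', c_uint32)]

rsa = CDLL(r"C:\Tektronix\RSA_API\lib\x64\RSA_API.dll")   # placeholder path

MAX_SAMPLES = 65536                          # assumed buffer size
iq_data = (c_float * (2 * MAX_SAMPLES))()    # interleaved I/Q, float assumed
iq_len = c_int(0)
trigger_indices = (c_int * 100)()            # the 100-element array behind the pointer
info = IQSTRMIQINFO()
info.triggerIndices = cast(trigger_indices, POINTER(c_int))

# ReturnStatus IQSTREAM_GetIQData(void* iqdata, int* iqlen, IQSTRMIQINFO* iqinfo)
status = rsa.IQSTREAM_GetIQData(iq_data, byref(iq_len), byref(info))
print(status, iq_len.value, info.scaleFactor)
```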