
Techniques for saving DAQ data at a specified file rate



I have an application that reads 100 samples of analog data from a task running at a hardware rate of 1 kHz into a 1D array of waveforms.

The specification requires data logging at a rate of 50 Hz to 0.1 Hz. The while loop that contains the file-save VI runs at approximately 20 Hz (but this can change due to software timing).

Regarding program architecture, the program should write to file at the data logging rate (i.e. 50 writes per second at 50 Hz and one write every ten seconds at 0.1 Hz).

Requirements:

The number of samples read has to be greater than the logging rate

The data written should be decimated and averaged over the last 100 samples read from the DAQ

My initial thought was to create a VI that does the following:

Create a while loop that checks the time since the last save.

If the elapsed time exceeds the logging interval, write to file.

Number of rows to write = number of logging-rate intervals since the last save (i.e. with a 50 Hz logging rate and 1 second since the last save, write 50 rows of data).

The problem is that the decimate function only accepts an integer decimating factor, which could cause round-off error to accumulate over time. So my question is: what techniques can be used to make this VI operate properly?

Thanks!
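One standard way around the integer-only decimating factor is to track the ideal row boundary as a fraction and round only when extracting each block, so the round-off never accumulates. A minimal sketch of just the arithmetic, in Python rather than LabVIEW, with a hypothetical logging rate that does not divide the sample rate evenly:

```python
# Sketch: avoid cumulative round-off when samples-per-row is fractional.
# Track the ideal (fractional) row boundary and derive each integer block
# size from it, so the error never drifts over time.
SAMPLE_RATE_HZ = 1000.0   # hardware rate from the original post
LOG_RATE_HZ = 3.0         # hypothetical rate that does not divide 1000 evenly

samples_per_row = SAMPLE_RATE_HZ / LOG_RATE_HZ   # 333.33... samples per row
boundary = 0.0            # ideal sample index of the next row boundary
consumed = 0              # whole samples consumed so far

for row in range(6):
    boundary += samples_per_row
    block = round(boundary) - consumed   # 333 or 334, never drifting
    consumed += block
    print(f"row {row}: average {block} samples (total consumed: {consumed})")
```

After six rows this has consumed exactly 2000 samples (six ideal intervals of 333.33 samples), whereas always truncating to 333 would fall behind by one sample every three rows.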


I'm not entirely sure I understand your requirements, but here's what I would suggest -

  • Read the DAQ as fast as you can into a circular buffer LV2 global (you can see a quick example here, but be sure to use Replace Array Subset and Rotate Array instead of building and deleting. You will also need to replace more than a single element). Make this buffer large (e.g. 1K or 2K samples).
  • In another loop, read this global at your logging rate, take the last 100 samples from it, average them and throw them into a queue.
  • You can now dequeue the same queue in a third, lower priority VI, or wait until the measurement is over and then save everything at once.
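A rough sketch of this three-loop recipe, in Python rather than LabVIEW (a deque stands in for the LV2-global circular buffer, a Queue for the LabVIEW queue, and the DAQ read is simulated):

```python
import threading
import time
import random
from collections import deque
from queue import Queue, Empty

BUF_SIZE = 2048                   # "make this buffer large (e.g. 1K or 2K)"
buf = deque(maxlen=BUF_SIZE)      # circular buffer; oldest samples fall off
buf_lock = threading.Lock()
log_q = Queue()                   # averaged points on their way to disk
stop = threading.Event()

def daq_loop():
    """Loop 1: read the DAQ as fast as you can into the circular buffer."""
    while not stop.is_set():
        samples = [random.random() for _ in range(100)]  # stand-in DAQ read
        with buf_lock:
            buf.extend(samples)
        time.sleep(0.1)           # 100 samples at 1 kHz

def average_loop(log_rate_hz=20.0):
    """Loop 2: at the logging rate, average the last 100 samples and enqueue."""
    while not stop.is_set():
        with buf_lock:
            last = list(buf)[-100:]
        if last:
            log_q.put(sum(last) / len(last))
        time.sleep(1.0 / log_rate_hz)

def file_loop():
    """Loop 3: lower-priority consumer; dequeue and write to file."""
    with open("log.txt", "w") as f:
        while not stop.is_set() or not log_q.empty():
            try:
                f.write(f"{log_q.get(timeout=0.5)}\n")
            except Empty:
                pass

for fn in (daq_loop, average_loop, file_loop):
    threading.Thread(target=fn).start()
time.sleep(2.0)   # acquire for a while...
stop.set()        # ...then shut all three loops down
```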


QUOTE(Yen @ Oct 21 2007, 05:14 PM)

I'm not entirely sure I understand your requirements, but here's what I would suggest -

Wouldn't it be possible to skip the circular buffer and use a VIG with some sort of algorithm to determine how much data to write each time it is called? This is along the lines of what I was thinking, but I'm having some issues implementing it.


QUOTE(brianafischer @ Oct 21 2007, 11:10 PM)

Wouldn't it be possible to skip the circular buffer and use a VIG with some sort of algorithm to determine how much data to write each time it is called? This is along the lines of what I was thinking, but I'm having some issues implementing it.

Pardon me if you already know this. A circular buffer is the best way to acquire data in the majority of cases. It allows the hardware to go about its business in the manner it was designed for. There is nothing to it, either; it is how all buffered acquisitions work, nothing special. I am assuming your aversion to it comes from a lack of experience. Yen could just as easily have said "do a buffered acquisition."

Usually, acquiring buffered data and post-processing it offline from the acquisition is the most useful way to operate if there is no feedback step. It is OK, and often much simpler, to take great gobs of data and parse them later.

Your description is kind of hard to follow and seems to be making a simple problem hard. Could you describe at a higher level what you need to do, without assuming what the best way to do it may be? The lack of replies probably reflects the complicated presentation of your questions.

Mike


I'm one of the people Mike was referring to. The requirements seem to be a bit overconstrained -- i.e., some of them imply a certain freedom while others take it back away. Here are some specific questions:

A. data logging at a rate of 50 Hz to 0.1 Hz

Is this a requirement for the rate at which file writes actually happen? Why such a large allowable range? It seems strange for a spec to suggest a rate as high as 50 Hz if it also allows a rate as low as 0.1 Hz.

Once logged, must the stored data represent samples that are equally spaced in time?

B. data written should be decimated and averaged over the last 100 samples read from the DAQ

So, each data point that is logged represents an average of the most recent 100 samples? Is overlap allowed between those sets of 100 samples? Are you allowed to miss any of the DAQ data? Or must you produce exactly 1 logged data point for every distinct set of 100 DAQ samples?

C. number of samples read has to be greater than the logging rate

Another strange spec, due to a units mismatch (# vs. rate). I get what it means -- "don't store all the raw data, reduce it first." But a spec phrased that way suggests a certain lack of clarity about the needs of the overall app.

-Kevin P.


QUOTE

Your description is kind of hard to follow and seems to be making a simple problem hard. Could you describe at a higher level what you need to do, without assuming what the best way to do it may be? The lack of replies probably reflects the complicated presentation of your questions.

I apologize for the confusion; let me try to clear this task up.

The user wishes to select a data save rate in msec (see the attached image).

[attached image: front panel control for selecting the data save rate in msec]

A data row should be written at each time interval, with averaging or "smoothing". For the slower data save rates, the averaging window is capped at 0.1 seconds of data (i.e. 100 samples at 1 kHz).

I was planning on using the same averaging technique as Decimate Single-Shot.vi ("If averaging is TRUE, each output point in the decimated array is the mean of the decimating factor input points").
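For reference, the quoted averaging behavior is straightforward to sketch in Python (the function name is illustrative; the real implementation would be the LabVIEW VI):

```python
# Sketch of "decimate with averaging": each output point is the mean of
# `factor` consecutive input points, which is what the quoted
# Decimate Single-Shot.vi behavior describes when averaging is TRUE.
def decimate_avg(samples, factor):
    n_out = len(samples) // factor   # whole blocks only
    return [sum(samples[i * factor:(i + 1) * factor]) / factor
            for i in range(n_out)]

# 100 samples reduced by a factor of 50 -> 2 averaged points, i.e. a
# 1 kHz acquisition becomes a 20 Hz log.
print(decimate_avg(list(range(100)), 50))   # [24.5, 74.5]
```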

When the user stops logging, the correct number of rows should have been written to the file (duration in seconds x data save rate).

QUOTE

A.
data logging at a rate of 50 Hz to 0.1 Hz

Is this a requirement for the rate at which file writes actually happen? Why such a large allowable range? It seems strange for a spec to suggest a rate as high as 50 Hz if it also allows a rate as low as 0.1 Hz.

Once logged, must the stored data represent samples that are equally spaced in time?

The file writes do not have to occur at the data logging rate, but the samples must be spaced at the data save rate (i.e. for 10 Hz: 0.1, 0.2, 0.3, ...). If the user turns logging on and off quickly, the correct number of rows should exist in the file (duration in seconds x data save rate).

QUOTE

B.
data written should be decimated and averaged over the last 100 samples read from the DAQ

So, each data point that is logged represents an average of the most recent 100 samples? Is overlap allowed between those sets of 100 samples? Are you allowed to miss any of the DAQ data? Or must you produce exactly 1 logged data point for every distinct set of 100 DAQ samples?

Let me clear this up with an example. If the data rate is 0.1 Hz, the 100 samples read from 0.0 to 0.1 s should be averaged into a single number. If the data rate is 20 Hz, the 100 samples just read should be decimated (with averaging) to produce 20 rows of data per second, with a timestamp column incrementing according to the data rate (0.00, 0.05, 0.10, 0.15, ...).
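Tying those numbers together, assuming the 1 kHz hardware rate from the top of the thread (the helper below is illustrative only):

```python
# Sketch relating save rate to decimating factor, averaging window, and
# the timestamp column. Assumes 1 kHz acquisition, per the original post.
SAMPLE_RATE_HZ = 1000.0
MAX_AVG_SAMPLES = 100   # averaging window is capped at 0.1 s of data

def plan(save_rate_hz, duration_s):
    factor = round(SAMPLE_RATE_HZ / save_rate_hz)     # samples per row
    n_avg = min(factor, MAX_AVG_SAMPLES)              # samples actually averaged
    n_rows = round(duration_s * save_rate_hz)         # duration x data save rate
    timestamps = [i / save_rate_hz for i in range(n_rows)]
    return factor, n_avg, n_rows, timestamps

# 20 Hz for 0.2 s: factor 50, 4 rows at t = 0.00, 0.05, 0.10, 0.15
print(plan(20.0, 0.2))
# 0.1 Hz for 60 s: one row per 10 s; only the last 100 samples are averaged
print(plan(0.1, 60.0))
```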


QUOTE(brianafischer @ Oct 22 2007, 08:11 PM)

I apologize for the confusion; let me try to clear this task up.

SNIP

to the data rate (0.00, 0.05, 0.10, 0.15, ...)

I guess I was too cryptic; by "higher level" I meant: what is the top-level goal? Removed from any real-world context, it is like a class assignment or something. What is the point of this? What is the data about? Is the analysis strictly eye-to-brain (why else smooth the data)? What decisions will be made from the result?

On some level an acquisition tells a story. Mostly the stories mine tell are: where and how does this Hall effect sensor respond to changing flux? Or: when I vary the pulse width of this signal, how does this actuator change its angular location, and what are the secondary conditions that may impact this? A favorite I remember from another list was: how much urine do these mice produce, and when? I never did hear how that turned out.

Truth is I still don't get it, so I can't say what you should do. I can't do better than Yen's answer, which is a general recipe that could work.

Do you still have a problem with the idea of a circular buffer and how that fits into the scheme of things with LabVIEW?

Mike


QUOTE(mross @ Oct 22 2007, 09:04 PM)

I guess I was too cryptic; by "higher level" I meant: what is the top-level goal?

SNIP

Mike,

The top-level goal is to acquire data from a mix of pressure, temperature, torque, motor power, motor speed, volume flow and other sensors while exciting a component in a circuit that simulates a vehicle cooling system. Some performance reports look at a step response, which would require a data rate of 20 Hz. Other reports track temperature loss over time (0.1 Hz for an hour straight). Hence the need for multiple sample rates.

I am currently using a circular buffer for the analog data. I was trying to avoid creating another while loop, and had the idea of a "save data" case in my state machine that gets called roughly 20 times a second. The purpose of the post was to ask about techniques for accurately extracting data from the analog signal, and it sounds like Yen's approach is the best suggestion so far.

Thanks for the responses


QUOTE(brianafischer @ Oct 22 2007, 10:14 PM)

The top-level goal is to acquire data from a mix of pressure, temperature, torque, motor power, motor speed, volume flow and other sensors while exciting a component in a circuit that simulates a vehicle cooling system.

SNIP

One simplification is to take all the data at the fastest rate needed and simply display the results in whatever decimated, smoothed or averaged form provides the best utility for the human operator. This only causes trouble if the readings are not all analog voltages; a mix of serial, analog and digital, for instance, complicates things considerably.

You say that the output and interface are reports. This implies there is no real-time, or even quasi-real-time, response needed from the operator. This also simplifies things: all the data may be post-processed.

You have two regimes with very different time scales. The Producer/Consumer architecture is designed for this sort of situation; if you aren't familiar with it, you may find it useful. Numerous parallel loops can come in handy, so don't shy away from the idea of introducing another loop. It is what LabVIEW is good at.


Thanks for both the big picture and the details. I think I understand that you need the hardware data acq rate to be 100x the data save rate, right? That is, a data save rate of 20 Hz would require a data acq rate of 2000 Hz, so that you decimate 100 non-overlapping samples per "row" saved?

Here are some things I'd do:

A. Easy but helpful UI stuff -- You have a single "Click to Log" button. When a user clicks it to start logging, you should make some kind of clear indication that logging is active. Also, you'll want that click to disable the "data save rate" control to prevent a user from changing the rate on the fly.

B. As Mike recommended, an extra While loop for processing / saving will be helpful and worth the effort, particularly at the higher data acq rates. The key idea is that file writes can exhibit highly variable timing, so you want that loop decoupled from the loop that services the data acq tasks, where extra delays can lead to unrecoverable data acq errors.

C. I probably wouldn't read only 100 samples at a time from my data acq tasks, at least not for data acq rates above 100 or 200 Hz. I'd instead read some multiple of 100 samples that corresponds to maybe 0.5 to 1.0 sec. The key is to stick with a multiple of 100 so the decimation will work cleanly.

D. I'd size my data acq buffers for 5 - 10 sec worth of sample space. (In general, I aim for a minimum margin of 2x between buffer size and read size, but I go with 5x or 10x whenever the system isn't particularly stressed.)

E. Since you have no live display of the data, the overall solution can be simpler. You probably don't need the software circular buffer after all. You could just read your data acq data and send it into a queue (or queues). A separate loop will read from the queue(s) and write to file. The queue itself will buffer the data in case your file writing loop occasionally lags behind.

-Kevin P.
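A minimal sketch of the queue-only approach from point E, in Python rather than LabVIEW (the DAQ read and the file format are stand-ins):

```python
# Queue-only architecture: the acquisition loop enqueues raw chunks; a
# separate loop dequeues and writes, so file-write jitter never blocks
# the DAQ. No software circular buffer is needed.
import threading
import random
from queue import Queue

data_q = Queue()   # the queue itself buffers data if the writer lags

def acq_loop(n_reads=10, read_size=500):
    """Read a multiple of 100 samples per iteration and enqueue each chunk."""
    for _ in range(n_reads):
        chunk = [random.random() for _ in range(read_size)]  # stand-in DAQ read
        data_q.put(chunk)
    data_q.put(None)   # sentinel: acquisition finished

def write_loop(path="log.txt"):
    """Dequeue chunks and write them to file until the sentinel arrives."""
    with open(path, "w") as f:
        while (chunk := data_q.get()) is not None:
            f.write(",".join(f"{x:.4f}" for x in chunk) + "\n")

t = threading.Thread(target=acq_loop)
t.start()
write_loop()
t.join()
```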

