
Question about the use of ProducerConsumerData (Multiple Queues)


Recommended Posts

Here is what I'm trying to do:

Data Acquisition vi:

-Read data at high speed (500-800 Hz), calculate an average value.

-Write to a file the average values at 1 Hz frequency.

-Display calculated values on the user interface.

First I had everything in the same while loop which was slowing down the vi. So after asking on the forum I was told to use the Producer/Consumer template.

Now my vi is working perfectly fine for short measurements at high speed (writing to file at 500-800Hz).

Depending on the test we want to run on our machine, I will only need one value per second. So I decided to use the ProducerConsumerData (Multiple Queue Array).vi.

In my Producer queue, I'm reading the signal at high speed, calculating the average and then sending the values to 2 consumer loops.

One consumer loop is displaying calculated values to the user interface. Everything is fine.

The second consumer loop is writing to the file and that's where my 2 questions are.

-1st question:

When I reach a certain number of values (which I set before starting the VI), I want to stop writing to the file and then stop the producer loop (which will in turn stop the VI). What is the best method to send the stop signal from a consumer loop to the producer loop?

-2nd question:

What is the best way to only write one value per second to my file?

I've tried placing a wait function in my consumer loop, but then it only read the first value in the queue, which kept filling at 500 Hz.

As the average is already calculated in the producer loop, I only need to keep the first value and then empty the queue of the other values stored during the previous second.

The attached VI shows what I'm trying to do.

Bigl00z3

Download File:post-6798-1164197967.zip

Link to comment

"-Read data at high speed (500-800 Hz), calculate an average value."

How do you propose to calculate this average? Is it a running average of the data taken before and after a particular reading? Is it the average of all the accumulated data?

"-Write to a file the average values at 1 Hz frequency."

Why do you want to do this every second? Why not acquire all the data and post-process it?

What do you gain by writing the data every second?

You can certainly stream data to disk, but what is the point? LabVIEW was set up to make this much easier by letting you gather lots of data into an array at very high speeds (1.25 MHz for an analog channel on an E Series DAQ board), then, after the acquisition is done, analyze the data, store the results, and present the analysis.

The issue with writing to files is OS related. You are fighting a headwind if you write to files all the time, so make sure that is really what you want to do before you do it.

I haven't peeked at your VI yet, but I will. I am not sure the PC architecture is the right one for what you are doing. Maybe you don't need any special architecture at all. Producer/Consumer is used to mediate between inputs or requests (often GUI) that may arrive in batches the consumer cannot handle immediately. The PC pattern allows tasks to stack up and be dealt with as resources become available. Merely reading some channels, doing a little analysis, and saving the data to a file doesn't require any special effort.

Link to comment
-1st question:

When I reach a certain number of values (which I set before starting the VI), I want to stop writing to the file and then stop the producer loop (which will in turn stop the VI). What is the best method to send the stop signal from a consumer loop to the producer loop?

After the producer loop, drop a Release Queue primitive. When that prim executes, it will invalidate the refnum. Any waiting Dequeue in the consumer loop will stop waiting and return an error. Have your consumer loop stop if an error is returned.
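For anyone following along outside LabVIEW, here is a rough text-language sketch of that shutdown pattern in Python. Python's queue module has no Release Queue that invalidates waiting readers, so a sentinel value stands in for the error a waiting Dequeue would return; every name here is illustrative, not a LabVIEW API.

import queue
import threading

STOP = object()  # sentinel playing the role of the "queue released" error

def producer(q, n_items):
    for i in range(n_items):
        q.put(i)             # Enqueue Element
    q.put(STOP)              # analogue of Release Queue: wake the consumer and tell it to quit

def consumer(q):
    while True:
        item = q.get()       # Dequeue Element (blocks like a waiting Dequeue)
        if item is STOP:     # analogue of the Dequeue returning an error
            break
        print("consumed", item)

q = queue.Queue()
t = threading.Thread(target=consumer, args=(q,))
t.start()
producer(q, 5)
t.join()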

Your second question is something I'm going to leave to others with more experience with determinism, but I suspect the Timed Loop is what you're looking for. Though, with a timing window of a full second, just putting a Wait Milliseconds wired with 1000 into your loop might do the trick.

Link to comment
"-Read data at high speed (500-800 Hz), calculate an average value."

How do you propse to calculate this average? Is it a running average of the data taken before and after a particular reading? Is it the average of all the accumulated data?

I'm using a floating average VI based on this VI. It calculates the average of the last "x" (generally 50) points for 3 of my 8 signals.
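The same sliding-window average in text form, as a minimal Python sketch (the window size and names are only illustrative, not the VI linked above):

from collections import deque

def make_floating_average(window=50):
    """Average of the last `window` points, as in the floating-average VI described above."""
    buf = deque(maxlen=window)        # old points drop off once the window is full
    def update(x):
        buf.append(x)
        return sum(buf) / len(buf)
    return update

avg = make_floating_average(50)
for sample in (1.0, 2.0, 3.0, 4.0):
    print(avg(sample))                # running average of everything seen so far, capped at 50 points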
"-Write to a file the average values at 1 Hz frequency."

Why do you want to do this every second? Why not acquire al the data and post process the data?

What do you gain by writing the data every second?

On my system I have a pump generating high oscillations of pressure.

-For some measurements, I want all the data (without the average) over a short period of time (max 30 seconds) so I can analyse all the oscillations. <-- this I can already do and it's working perfectly fine.

-For some longer measurements (45 minutes or more), I only need my average pressure, and to save post-processing time & server space I only want 1 value per second.

You can certainly stream data to disk, but what is the point? LabVIEW was set up to make this much easier by letting you gather lots of data into an array at very high speeds (1.25 MHz for an analog channel on an E Series DAQ board), then, after the acquisition is done, analyze the data, store the results, and present the analysis.

The issue with writing to files is OS related. You are fighting a headwind if you write to files all the time, so make sure that is really what you want to do before you do it.

I haven't peeked at your VI yet, but I will. I am not sure the PC architecture is the right one for what you are doing. Maybe you don't need any special architecture at all. Producer/Consumer is used to mediate between inputs or requests (often GUI) that may arrive in batches the consumer cannot handle immediately. The PC pattern allows tasks to stack up and be dealt with as resources become available. Merely reading some channels, doing a little analysis, and saving the data to a file doesn't require any special effort.

I'm using the Producer/Consumer because of my previous problem described in this post. The refresh rate of the display used to slow down my entire VI.

bigl00z3

Link to comment
-For some longer mesurements (45 minutes or more), I only need my average pressure and to save post processing time & server space I only want 1 value per second.

Hmm ... if you only need the average of your values @ 1 Hz, maybe you want to use my "IBB Logger Base"?

It acquires data internally at 1 kHz and you can decide how you want to evaluate your data (average, RMS, etc.). You can select a "Sample Rate" from 10 Hz down to 0.1 Hz (?), and all data are displayed on the front panel and logged into a database, from which you can export them to a *.CSV file.

You can download it here: IBB Logger Base 2.0.1

Please let me know if your DAQ hardware is not supported. As long as it can be used with DAQmx, I can give you an update that supports your hardware ...

Link to comment

I looked at your code and you have gone too deep.

All you are trying to do is acquire data at a very SLOW rate - orders of magnitude slower than what LabVIEW is designed to handle.

You want to find the average (still undefined exactly how this is to be done).

You want to write the average to a file.

There are basic functions to acquire data, calculate an average, and write the results to a file. None of these requires the complication of a Producer Consumer format.

Save the Producer Consumer for later.

I made you a VI that acquires the data all at once, calculates the average overall and as a running average, and writes it to a file.

You really don't want to overdo this. If LabVIEW does nothing else, it makes what you have described extremely easy.

Rather than pursue the Producer/Consumer, take a look at the acquisition example programs. If you can get a DAQ card and the Signal Accessory you can really make some headway learning the basics. When you start doing actual tasks in the lab or the field, or analysis with real data, then perhaps the other architectures will be handy. Remember there are other architectures; not all tasks require (or are appropriate for) the PC form. There are state machines, queued state machines, and user-event-driven PC, to name a few popular ones. There are good examples of all of these at ni.com.

Mike

Here is what I'm trying to do:

Data Acquisition vi:

-Read data at high speed (500-800 Hz), calculate an average value.

-Write to a file the average values at 1 Hz frequency.

-Display calculated values on the user interface.

Download File:post-48-1164212953.vi

Link to comment
-1st question:

When I reach a certain number of values (which I set before starting the VI), I want to stop writing to the file and then stop the producer loop (which will in turn stop the VI). What is the best method to send the stop signal from a consumer loop to the producer loop?

I think you should use a Notifier. Create a notifier with a data type of boolean. Wire the Notifier reference to the producer and consumer loops. In the Producer loop, place a Wait for Notification with a timeout of zero (default is -1, or FOREVER! not what you want). You can OR the output of the Wait on Notification with your Stop control to exit the Producer loop.

In the Consumer loop, you place a Send Notification primitive, wire the reference, and use your stop condition logic to send a True to the producer. The Producer will read the notification, the producer loop will exit, the Release Queue that Aristos mentioned will invalidate the refnums and the consumer loops will "error out".
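For readers outside LabVIEW, a rough Python analogue of this Notifier pattern, with a threading.Event standing in for the Notifier and None standing in for the released queue; the numbers and names are only illustrative, not the LabVIEW primitives named above:

import queue
import threading
import time

stop_notifier = threading.Event()     # plays the role of the Notifier
data_q = queue.Queue()

def producer():
    i = 0
    # "Wait for Notification, timeout 0" == just check the flag each iteration
    while not stop_notifier.is_set():
        data_q.put(float(i))          # Enqueue Element
        i += 1
        time.sleep(0.002)             # stand-in for the ~500 Hz acquisition
    data_q.put(None)                  # analogue of Release Queue, so the consumer exits

def consumer(max_values):
    written = 0
    while True:
        value = data_q.get()          # Dequeue Element
        if value is None:             # queue "released" -> error out
            break
        written += 1                  # stand-in for the file write
        if written >= max_values:
            stop_notifier.set()       # Send Notification: tell the producer to stop

p = threading.Thread(target=producer)
c = threading.Thread(target=consumer, args=(100,))
p.start(); c.start()
p.join(); c.join()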

I noticed you are converting the data to a variant and also using named queues. In the example you provided, this is a lot of overhead. Change the datatype of the queue to an array of doubles, then get rid of all the variant conversions.

Instead of naming and retrieving the queues, just pass the queue refnums into the loops. In your example you are actually obtaining a reference to the queues every time you send the data ... 45 minutes of data while creating TWO references every 125 ms is something like 43,000 refs! I tried to edit and post, but I've run out of time (have to buy my turkey!).

Here is a picture of your code and my suggestions (not tested just the idea...)

Your original

post-949-1164214249.png?width=400

My suggestions

post-949-1164214314.png?width=400

Link to comment
I'm using a floating average VI based on this VI. It calculates the average of the last "x" (generally 50) points for 3 of my 8 signals.

On my system I have a pump generating high oscillations of pressure.

-For some measurements, I want all the data (without the average) over a short period of time (max 30 seconds) so I can analyse all the oscillations. <-- this I can already do and it's working perfectly fine.

-For some longer measurements (45 minutes or more), I only need my average pressure, and to save post-processing time & server space I only want 1 value per second.

I'm using the Producer/Consumer because of my previous problem described in this post. The refresh rate of the display used to slow down my entire VI.

bigl00z3

Don't show the data on screen; if the refresh rate really is the problem, it will go away.

It could simply be that opening and closing the files repeatedly (a big no-no) is slowing you down. You haven't shown us the write operation so there is no way to tell; you just show an indicator for a single datum. You have to look at the file-writing VIs and put the file open and close functions before and after the loops, not in them. Opening and closing depend on the OS, which has its own agenda. Once the connection is established you just keep appending data to the open file. Writing once a second this way should not be a problem until the OS starts to get grumpy about the file size.
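In text form the same advice looks roughly like this (a Python sketch, not LabVIEW; the file name and data are placeholders):

# Open once before the loop, write inside it, close once after the loop is done.
values = [101.3, 101.5, 101.4]            # stand-in for one averaged reading per second

with open("pressure_log.txt", "w") as f:  # the open happens exactly once
    for v in values:
        f.write(f"{v}\n")                 # only the write is inside the loop
# The file closes here, when the with-block ends - not once per iteration.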

Nothing you are describing requires writing the data to a file continuously - maybe you shouldn't do it. As a last resort you can gather it into larger chunks and write it periodically. You do not want to be putting millions of data points into a spreadsheet. You need to decimate in some sensible way. At this point it would be helpful to know what you think you will see in the data, and what decisions you will make. Maybe you only need to look at the data taken 2 or 3 minutes before and after a peak. Maybe you only want to get a spectrum analysis of the data. Huge amounts of data are lots of trouble so it all has to be useful or your efforts are wasted.

It sounds like you might not need even 1 Hz for long measurements. If you know the nature of your pressure fluctuations then you can use Nyquist to set an appropriate sampling rate. If your fastest pressure cycle is 30 seconds, then sample once every 3 seconds. If your fastest cycle of interest is really short, then you are going to miss something at 1 Hz. You should sample at a minimum of 3 times your fastest cycle's frequency, and 10 times is much better.

Assuming you really do need massive amounts of data written to disk, you could keep a big circular buffer and process the data as you go, triggering captures only when they are really necessary. Again, do not open and close the file for each write operation.
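A minimal sketch of that circular-buffer idea in Python (a deque as the buffer, a made-up threshold as the capture trigger; every name and number is illustrative):

from collections import deque

SAMPLE_RATE = 500                                     # Hz, as in the acquisition above
BUFFER_SECONDS = 120
buffer = deque(maxlen=SAMPLE_RATE * BUFFER_SECONDS)   # circular: oldest samples drop off

def on_new_sample(sample, log_file):
    buffer.append(sample)
    if sample > 5.0:                                  # made-up trigger condition
        # Capture the recent history in one chunked write
        # instead of writing every sample as it arrives.
        log_file.write("\n".join(str(s) for s in buffer) + "\n")
        buffer.clear()

with open("capture.txt", "w") as log_file:            # opened once, outside the loop
    for s in (0.1, 0.2, 6.0, 0.3):                    # stand-in for the acquisition loop
        on_new_sample(s, log_file)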

There is nothing to stop you from post-processing the amount of data you would get in 45 minutes at 1 Hz. When I say post-processing, I mean after the acquisition but before writing to the file. LabVIEW was built to do this. It won't even take that long if done properly. When you have read over 2 million data points, most of them are garbage you will never look at. You need a way to decimate the data and find the information of interest. Decimating data is wholly another topic.

Here is an analogous example. If you ever have heart palpitations they will want to put a monitor on you and record your EKG for 24 hours. Ever wonder how the heck they find anything in all that info? It only works if the patient marks the record when they think they feel weird. No one is going to sit and watch 24 hours of EKG looking for rare and intermittent events.

It is similar for your work. You know a visual record of 45 minutes of data is not useful. For shorter periods, visual observation is a great way to spot interesting events, but there are real limits to this. (You should really not show any of the data on screen while it is being read.) You need to decide how you are going to find the information of interest in all that data. Throwing some away is a good idea if you can figure out how not to lose something important.

Anyway, don't prejudge the problem and try to shoehorn it into the PC form. I don't see that you have the need for any parallel operations at all. If you need to acquire this much data and you really want it all written to a file, you should be unloading a buffer now and then to reduce the number of write operations. Also, I wouldn't write it to a file at all; I would write to a database. It is much faster, and databases handle large amounts of data much better. That too is a whole other ball of wax and has its own learning curve.

You need to look at the acquisition VIs - start taking data even if it isn't the actual signal (hook up to a function generator or something). Lose the Simulate Signal and the Producer/Consumer. They are a dead end.

Mike

I'm using a floating average VI based on this VI. It calculates the average of the last "x" (generally 50) points for 3 of my 8 signals.

On my system I have a pump generating high oscillations of pressure.

-For some measurements, I want all the data (without the average) over a short period of time (max 30 seconds) so I can analyse all the oscillations. <-- this I can already do and it's working perfectly fine.

-For some longer measurements (45 minutes or more), I only need my average pressure, and to save post-processing time & server space I only want 1 value per second.

I'm using the Producer/Consumer because of my previous problem described in this post. The refresh rate of the display used to slow down my entire VI.

bigl00z3

Here is your floating-average VI rearranged to operate on a pre-existing array. You can use this to post-process your acquired data.

Download File:post-48-1164219804.vi

Link to comment

I've started using the TDMS in version 8.2 for saving data during the run. Currently, I easily log 16 DAQ channels every millisecond for test runs of 45 minutes or more. I tried out the code with a run for over 6 hours.

In the past, my application would trend the data during the run and then allow the test operator to analyze the data. Occasionally, something would go wrong and the test operator would lose the data at the end of the run. The UUT needed to cool for 3+ hours before running a test again.

Although I haven't experimented yet, I'm hoping the TDMS file is still usable even if the computer abruptly powered off or locked up during a test.

Link to comment
I've started using the TDMS in version 8.2 for saving data during the run. Currently, I easily log 16 DAQ channels every millisecond for test runs of 45 minutes or more. I tried out the code with a run for over 6 hours.

In the past, my application would trend the data during the run and then allow the test operator to analyze the data. Occasionally, something would go wrong and the test operator would lose the data at the end of the run. The UUT needed to cool for 3+ hours before running a test again.

Although I haven't experimented yet, I'm hoping the TDMS file is still usable even if the computer abruptly powered off or locked up during a test.

Well, that is a new one on me in my little circumscribed corner of the LabVIEW world. What is "the TDMS?"

Mike

Link to comment
Well, that is a new one on me in my little circumscribed corner of the LabVIEW world. What is "the TDMS?"

"Technical Data Management Streaming".

Introduced in LV8.2. It's a file format created by NI and usable with CVI, LV, and DIAdem, specifically for handling data -- streaming, archiving, reviewing, manipulating -- and is designed for huge data sets.

That's everything I know about it. Check the online help for LV8.2 for more information.

Link to comment
"Technical Data Management Streaming".

Introduced in LV8.2. It's a file format created by NI and usable with CVI, LV, and DIAdem, specifically for handling data -- streaming, archiving, reviewing, manipulating -- and is designed for huge data sets.

That's everything I know about it. Check the online help for LV8.2 for more information.

Sounds interesting. I wonder if there is ever a reason NOT to use it?

Link to comment

Thank you for all your inputs.

LV Punk

You were right, the Notifier was exactly what I was looking for. It's now stopping at the right time.

I definitely have to look closely at your comment about passing queue refnums into the loops. That may be useful.

Karl Rony

TDMS sounds interesting; I should look at it when I have a bit more time.

Link to comment
>>SNIP

I will try to re-explain my needs.

On my system, I have a pump which is running at 50 Hz (the pressure oscillates at the same frequency).

For some tests, I need to analyze the height/shape/rate of the oscillations so I want to read at 500Hz to have enough points in my file for post processing.

To answer your question, we already trigger the capture by pressing "start writing to file" when we need the massive amount of data. We know when we want to start our 10-30 second capture. Analyzing those data cannot be done in LV yet, as we don't always really know what we are looking for before running the test.

That is an odd statement. Do you not know how to do the analysis in LabVIEW? There are a great many tools for analyzing data. You could use LabVIEW for many years and never exhaust the methods of analysis.

Furthermore, LabVIEW is much, much better than Excel or an office spreadsheet, particularly when the data set is quite large.

Analyzing those data is then done in an office and can take a lot of time (but that's another story…)

For some other tests, I only need data at a slow rate, but if I just get the value at 1Hz, I get points randomly positioned in my oscillations. That's why I'm acquiring at high speed and then averaging the values on the 3 signals related to the pressure (the other 5 signals don't need averaging as they are much more stable).

For this kind of test, we run for a longer period of time.

You say:

People using the system need to see the values on the screen in real time.

I ask you: Why? What decisions will be made based on the visual display?

Frankly, if I need to visually investigate a signal, I find a good fast oscilloscope to be much more useful. Then I use that information to guide me in choosing the correct speed and resolution for a LabVIEW acquisition. I then use LabVIEW to acquire a limited data set (not continuous), process the data, store it, analyze it, and present it graphically.

The display doesn't need a high refresh rate, but we still need to see the value to know exactly what we are doing and when to start writing to the file.

Refresh rate is an issue only if you acquire data continuously and display it onscreen in a graph, numeric indicator, or array indicator. If you are not reacting to the visual display in some manner, then you have no real need to display it. You could just show an hourglass if that makes the operator feel better. If you are reacting to the visual display in real time, you need to qualify and quantify what you are going to do with it. It is also possible to apply PID control to the process, if that is desired, using LabVIEW functionality.


>>SNIP

-The second one for people who wants to acquire at 1 Hz. This version still needs to acquire the values at high rate to make sure that the 3 pressure sensors have the right average.

I am sorry, that is a self-contradictory statement: ...wants to acquire at 1 Hz ... needs to acquire at high rate....

And how does acquiring make sure the three sensors have the right average?

Are you implying some sort of decision-making process and manual control of the system?

>>SNIP

The writing to file is done as shown below.

attachment here

So my question is still the same.

My queue keeps filling. Can I empty it once a second, just after writing one set of my 8 signals? What would be the best way to get the first or last values for that particular second for each of my 8 signals (which are in a 1D array)?

I hope I explained it a bit more clearly this time. I'm still learning and I don't know the LV "vocabulary". ;)

Bigl00z3

I said this earlier. You need to look deeper into the function for writing to file and get the functions that open and close the write operation out of any loops.

post-48-1164725817.jpg?width=400

I haven't used this example made in v8.2, so I apologise if it isn't actually functional, but it illustrates the idea. Opening and closing the write operation depends on the OS, which can take a lot of time. So to make it run faster you need to leave the file open and call ONLY the Set Position and Write to Text File functions inside the loop. Only close the file after you are completely done writing to it.

The example I show is a couple of layers of subVIs deep inside the basic Write To Spreadsheet File.vi function. You keep double-clicking on subVIs until you find the one that is opening and closing the file; that is where you make the modifications.

The function you showed in your image is unlabeled and I don't know where to find it to look at it. Code is always better than screen dumps.

Link to comment

Sorry for not replying earlier; I ran out of time yesterday.

Do you not know how to do the analysis in LabVIEW? There are a great many tools for analyzing data. You could use LabVIEW for many years and never exhaust the methods of analysis.

Furthermore, LabVIEW is much, much better than Excel or an office spreadsheet, particularly when the data set is quite large.

No, I don't know yet how to do the analysis in LabVIEW. I know and I can see that it's possible to do most of the data analysis in it. But I think that in my situation it's not the best solution.

As my system is running on a machine that is constantly modified by people wanting to study different parameters, it's easier for everybody to have the raw data on their personal computer.

I know that Excel is far from the best solution for handling large data sets, but at least it's on everybody's computer and it works for the amount of data we need.

I ask you: Why? What decisions will be made based on the visual display?

Frankly, if I need to visually investigate a signal, I find a good fast oscilloscope to be much more useful. Then I use that information to guide me in choosing the correct speed and resolution for a LabVIEW acquisition. I then use LabVIEW to acquire a limited data set (not continuous), process the data, store it, analyze it, and present it graphically.

Again you are right. But for this project I can't really get more hardware ($) so I'm trying to do it on the computer in LabVIEW.

Until now, we were using PicoLog, which showed the values on the screen.

There are times when we have to check the display to know the values. We start some parts of the machine when certain values are reached. That's why I want to show it on the display.

Until now, when we were running tests which needed fast data acquisition, we were using an Agilent 34970A, which recorded the data without displaying anything. So we had to run PicoLog at the same time to have the values displayed on a screen.

I am sorry, that is a self-contradictory statement: ...wants to acquire at 1 Hz ... needs to acquire at high rate....

And how does acquiring make sure the three sensors have the right average?

Are you implying some sort of decision-making process and manual control of the system?

Sorry I don't think I explained it well.

If, for the 1 Hz measurements, I want the average pressure (the pressure is oscillating at 50 Hz), I read the data at high rate and then calculate a floating average over the last 50 values. That way the value my floating average outputs is correct (I already checked the values against our other data acquisition software and they are correct, so my idea shouldn't be completely wrong).

If I don't do this, the value is just not correct at all (values are read "randomly" in different parts of the oscillation).

Attached is what I have done so far:

bigl00z3

Download File:post-6798-1164875769.zip

Link to comment
Mross:

My running average is like this:

post-6798-1164320168.jpg?width=400

post-6798-1164320148.jpg?width=400

People using the system need to see the values on the screen in real time. The display doesn't need a high refresh rate, but we still need to see the value to know exactly what we are doing and when to start writing to the file.

Bigl00z3

Just a few comments on displaying large volumes of data fast:

1 If you have a LabVIEW version other than the Base version, you could use the "point-by-point" libraries to calculate mean/median/mode and MANY other functions on a point-by-point basis (instead of waiting for 50 points to collect at a time).

This should help in smoothing out the data.

2 To speed up data display you could acquire, say, 500 points and use the Decimate.vi to decrease the plot data by whatever factor you like before plotting it on screen (see the sketch after this list).

3 Another trick is to use the "defer panel updates" property to decrease the number of screen redraws. You could probably draw the panel once a second or so to decrease the screen refresh burden.
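A rough text-language sketch of the decimation idea in point 2 - plain Python slicing/averaging rather than the Decimate VI, with the factor chosen arbitrarily for illustration:

def decimate(samples, factor=10, average=True):
    """Keep one point per block of `factor`, optionally averaging each block,
    so 500 acquired points become 50 points before they ever hit the graph."""
    if average:
        blocks = [samples[i:i + factor] for i in range(0, len(samples) - factor + 1, factor)]
        return [sum(b) / len(b) for b in blocks]
    return samples[::factor]

plot_data = decimate(list(range(500)), factor=10)   # 500 points in, 50 points out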

Hope this helps.

Neville.

Link to comment
1 If you have a LabVIEW version other than the Base version, you could use the "point-by-point" libraries to calculate mean/median/mode and MANY other functions on a point-by-point basis (instead of waiting for 50 points to collect at a time).

This should help in smoothing out the data.

Well, I have the LabVIEW Base version. The person who was in charge of the project at the time they ordered LabVIEW thought it would be enough.

But if I really need the Pro version for the tests we need to do, they may decide to invest.

Link to comment
