Decimating without losing Mins and Maxes


Download File: post-6833-1163530445.vi

I've found that the Decimate (continuous, DBL) works great. Very fast on large arrays of data.

But by its nature it only captures a spike in the data if the peak of the spike happens to be one of the values passed through.
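To make that failure mode concrete, here is a tiny Python illustration (my own, not from the thread; plain decimation is modeled as simply keeping every Nth point):

```python
# Plain decimation keeps every Nth point, so a spike that falls
# between the kept samples disappears entirely.
data = [0, 0, 0, 100, 0, 0, 0, 0, 0, 0]
decimated = data[::5]  # keep every 5th point -> [0, 0]; the 100 spike is gone
```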

I've tried creating my own version of code to decimate and capture peaks and valleys. It works well but is slow.

Is there a way, or some code out there or shipped that I've overlooked, that will decimate the data and pass out the min and max of each segment that is decimated?

The code snippet I created uses a shift register to keep track of the data array.

Delete From Array.vi pulls out chunks of the original data. Those array chunks are sent through Array Max & Min.vi. The indices from Array Max & Min.vi go through Max & Min.vi to get the correct order. Then I get the two desired points from the original chunk in the correct order, build an array of the two points, and concatenate that array onto an array being built on a second shift register.

It's not really as involved as it sounds, but it's difficult to put into words. This process on a large 1D array takes about 100 ms. Too long.
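For reference, the chunked approach described above can be sketched in Python like this (illustrative names only; the original is a LabVIEW diagram). Doing per-chunk work inside a loop is what makes this style slow on large arrays:

```python
def decimate_minmax(data, factor):
    """Keep each chunk's min and max, preserving their original order."""
    out = []
    for start in range(0, len(data), factor):
        chunk = data[start:start + factor]
        lo = min(range(len(chunk)), key=chunk.__getitem__)  # index of chunk min
        hi = max(range(len(chunk)), key=chunk.__getitem__)  # index of chunk max
        first, second = sorted((lo, hi))  # whichever occurred first stays first
        out.append(chunk[first])
        out.append(chunk[second])
    return out
```

Decimating [0, 9, 1, 2, 3, 8, 0, 1, 2, 3] by 5 this way yields [0, 9, 8, 0], so both the 9 and the 8 spikes survive.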

Any suggestions for making this run in a couple of milliseconds would be greatly appreciated.

Thanks

veghead

That was very clever.

I'm not savvy enough to know how to add a picture of my code, but with some additions your idea works great.

To rebuild a new decimated array after the min/max function, you need to take the Min Index and Max Index from Array Max & Min.vi.

Wire those to the Comparison palette's Max & Min.vi to find which came first: the min or the max, the chicken or the egg.

Using Index Array.vi, wire the min index, then the max index. Wire the outputs to the edge of the For Loop. After the For Loop, use the Interleave 1D Arrays function to put the min/max values back together into an array and graph it.

It very closely resembles the original data and includes all abnormal data points.

Takes 2 to 4 ms to decimate a 10,000-point array by 5.
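The same order-preserving interleave can be expressed compactly with NumPy (a sketch, assuming any ragged tail shorter than the factor is dropped; it mirrors the For Loop plus Interleave diagram described above, but vectorized so there is no per-chunk loop):

```python
import numpy as np

def decimate_minmax_np(data, factor):
    """Min/max decimation: keep each chunk's min and max in original order."""
    n = (len(data) // factor) * factor          # drop any ragged tail
    chunks = np.asarray(data[:n]).reshape(-1, factor)
    lo = chunks.argmin(axis=1)                  # index of each chunk's min
    hi = chunks.argmax(axis=1)                  # index of each chunk's max
    first = np.minimum(lo, hi)                  # whichever comes first...
    second = np.maximum(lo, hi)                 # ...keeps the original order
    rows = np.arange(chunks.shape[0])
    out = np.empty(2 * chunks.shape[0], dtype=chunks.dtype)
    out[0::2] = chunks[rows, first]             # interleave the min/max pairs
    out[1::2] = chunks[rows, second]
    return out
```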

Fantastic idea.

Thanks,

Veghead

I don't know exactly what you want, but what about something like this...

post-4014-1163531536.png?width=400

It very closely resembles the original data and includes all abnormal data points.

I hope you understand that this kind of data modification changes the statistical distribution of the data, i.e. extrema become more probable than in the original data. Therefore you shouldn't run statistical tests on data decimated this way unless you modify the tests to take the modification into account (which may be very hard to do properly).
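A quick demonstration of this point (my own illustration, not from the thread): min/max decimation of Gaussian noise visibly inflates the sample spread, because only the tails of each chunk survive.

```python
import numpy as np

rng = np.random.default_rng(0)
raw = rng.standard_normal(10_000)
chunks = raw.reshape(-1, 10)        # decimate by 10 -> 2 points per chunk
kept = np.concatenate([chunks.min(axis=1), chunks.max(axis=1)])
# kept.std() comes out well above raw.std(): extremes are over-represented
print(raw.std(), kept.std())
```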

That is a good point. Thank you.

Fortunately, I'm doing this only for graphing in the GUI during acquisition.

Post process analysis will include all data points.

The techs watch the data on the screen during the test and look for odd spikes that might indicate an issue with the fixture or setup. So I wanted to make sure to capture all of the spikes and make them obvious.

