A colleague of mine was tasked with creating a program that takes as input 1000 data points each second, along with two "masks" (one for max and one for min), and reports TRUE if any section of the 1000 data points falls entirely between the two masks. He wanted to do this in C++, but I told him this type of data analysis sounds like LabVIEW's bread and butter. I browsed some of the Waveform VIs, especially the "Window" palette, but those don't seem to be what I want. I have an algorithm in mind to do this manually in LabVIEW, but I'd like some expert opinions on whether there is an easier way to do this with existing VIs before I attempt to reinvent the wheel; perhaps OpenG has something I can leverage as well?
Just to drive home what I'm looking for, below is a picture of the case where the program would report TRUE.
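In case it helps frame the question, here is roughly the brute-force check I have in mind, sketched in C++ since that's what my colleague was going to use. This is just an illustration, not a final implementation: it assumes both masks have already been expanded onto the same 1000-sample grid as the pulse, and it reads "falls between" as a simple pointwise comparison over the record (the picture is what really defines the TRUE condition).

#include <vector>

// Returns true when every sample sits inside the [minMask, maxMask]
// envelope. All three vectors are assumed to have the same length
// (1000 samples in our case).
bool fitsBetweenMasks(const std::vector<double>& data,
                      const std::vector<double>& minMask,
                      const std::vector<double>& maxMask)
{
    for (std::size_t i = 0; i < data.size(); ++i) {
        if (data[i] < minMask[i] || data[i] > maxMask[i]) {
            return false; // this sample escapes the envelope
        }
    }
    return true;
}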
I'm not sure how much this matters, but I should mention that the two masks are NOT given as 1000 data points the way the pulse to analyze is. Rather, since they are simple step functions, they are given as level (V) + duration (ms) pairs. For example, one of the masks could be described as:
0.1 250
10.0 500
0.1 250
This means the mask sits at 0.1 volts for 250 ms, then steps up to 10 volts for 500 ms, and finally steps back down to 0.1 volts for the remaining 250 ms.
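If it helps, here is how I would expand those level/duration pairs into per-sample values, again sketched in C++ (the function name and the sampleRateHz parameter are mine, just to keep the assumptions visible). Since we get 1000 points per second, one sample covers 1 ms, so the durations in ms map directly to sample counts.

#include <utility>
#include <vector>

// Expand a mask given as (level in volts, duration in ms) pairs into a
// per-sample array. At 1000 samples/second a sample is 1 ms wide, so a
// duration in ms is also a sample count.
std::vector<double> expandMask(const std::vector<std::pair<double, double>>& segments,
                               double sampleRateHz = 1000.0)
{
    std::vector<double> samples;
    for (const auto& [level, durationMs] : segments) {
        const auto count =
            static_cast<std::size_t>(durationMs * sampleRateHz / 1000.0);
        samples.insert(samples.end(), count, level); // repeat the level
    }
    return samples;
}

// The example mask above would then be:
// auto mask = expandMask({{0.1, 250.0}, {10.0, 500.0}, {0.1, 250.0}});
// and mask.size() comes out to 1000.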
Thanks for your help!