
Gary Rubin

Members
  • Posts: 633
  • Joined
  • Last visited
  • Days Won: 4

Posts posted by Gary Rubin

  1. I agree. My main issue with them is that they typically don't do quite what I want, or don't give me quite the control I'm looking for.

     I find that they often do too much. The few times I've used them, I've immediately converted them into a subVI, then stripped out all the cases and conditions I didn't want or need. I guess I use them as templates/examples.

  2. You used to be able to get this close to runways here in the States if you were fortunate enough to live near a major airport. I spent a few years in Denver, and the now-defunct Stapleton International Airport had a hotspot for watching planes. It was a place to kill a Friday or Saturday night and drink beer. On Friday nights a 747 would be doing touch-and-goes for hours.

     Sadly, the modern age of terrorism won't allow much of that any more.

     You can still do that at Reagan National Airport. There's a park just north of the airport that the planes overfly as they take off and land. You can hear the vortices in the air after the incoming planes pass over.

  3. Your comments got me to look closer at my simulated data. I knew my real data would be noisier, but it will also look more like a sine wave than the triangle wave I had. I bumped up the sample frequency, and now it looks better. Sorry to say, though, that the contributed ideas won't work on the new data. I'm back to my original plan of finding peaks and differentiating; by finding the time between peaks in the derivative, I can get the gaps. OK, anybody up for round 2?

    George

    Fing gaps 2.vi

     The effectiveness of all of these depends on your SNR.

     How about taking a simple moving average (SMA) of the absolute value of the data and looking for the result to drop below some threshold? The size of the SMA window would depend on the noisiness of the data. That uses a for loop, unfortunately.

     A slight variation would be to traverse the data array and check whether the last N elements are all below a threshold. Again, this uses a for loop. :(

     Or a combination of my original idea and ShaunR's... do the diff (or sum) with the shifted array, compare to a threshold, then convert from boolean to 0/1. (A rough sketch of the SMA approach appears after this list.)

  4. When I've got a big bug late in the afternoon, I don't worry too much about it because I just know I'll dream of the solution... Most times, I solve these bugs the next day before my colleagues even arrive at work... I don't remember feeling tired, but I do feel I work too much! ;)

    My solutions usually come to me in the shower.

  5. I'm definitely in one of those chaotic/"agile" environments where nobody pays you for doing any internal documentation of your code AND the requirements change every time you talk with customers.

    I'm in that boat too.

     I get the impression that those of us who use LabVIEW for data acquisition and processing operate in a very different world than those doing ATE. In ATE, I would imagine that the system requirements are pretty clear from the outset: you know your process, and you know the manner in which you want it automated.

    For data processing applications, the development process may be much more iterative, as you may not know what has to happen in step 3 of the processing until you can see the results of step 2. Likewise, the user may not know what his UI requirements are until he sees some initial results. This is especially true when using LabVIEW to develop new processing algorithms.

  6. That should definitely be part of the RCF's competitor, the Wrong-Klik Framework, with such plugins as "Finish this project", "Get the idea", "Solve this problem", "Earn money for me", and the less spectacular but still effective "Wire diagonally", "Zoom out", "Measure wire cross-talk", and "Ask on LAVA why this is not working". Some of us would also welcome "Generate post for alpha thread".

    Don't forget "make coffee" and "get donuts"
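
Since the attachment in post 3 is a LabVIEW VI that can't be shown inline, here is a rough NumPy sketch of the SMA-below-threshold idea from that post. The window size, threshold, and example signal are illustrative assumptions, not values from the thread.

    import numpy as np

    def find_gaps(signal, window=100, threshold=0.3):
        # Simple moving average of |signal|; the convolution stands in for the for-loop SMA.
        sma = np.convolve(np.abs(signal), np.ones(window) / window, mode="same")
        quiet = sma < threshold        # True where the smoothed signal has "gone quiet"
        return quiet.astype(int)       # boolean -> 0/1, as in the diff/shift suggestion

    # Illustrative data only: a noisy 50 Hz sine with a quiet gap in the middle.
    t = np.linspace(0.0, 1.0, 2000)
    sig = np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.randn(t.size)
    sig[800:1200] = 0.05 * np.random.randn(400)

    mask = find_gaps(sig)
    edges = np.flatnonzero(np.diff(mask))   # sample indices where gaps begin and end
    print(edges)

The window and threshold would need tuning against the SNR of the real data, exactly as noted in the post.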
