
Sub-millisecond timestamp



I have written a LabVIEW program in 6.1 to control a force-displacement setup in order to measure material properties of muscle. I have refined the algorithm to sample at a rate >100 Hz, and I want each sample to have a timestamp so I can accurately plot the data.

At the moment I'm using the "tick count" feature, but that only increments in milliseconds - too slow for me. I have looked at the "Get attributes" in the "Counter" VIs but can't seem to get them to work. I'm using a Lab-PC+ analogue card to capture the force data and another NI digital I/O card to control the mike-drive.

If there are any suggestions as to how I can stamp each sample I'd love you forever!

Cheers.


It sounds like you're polling the AI, and that's where the uncertainty comes from. If you convert your polled AI into a continuous, buffered AI, the delta time between samples becomes known and constant. Not only that, but the oversampling possibilities would also benefit the data quality.
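(A minimal sketch of the idea, in Python pseudocode since LabVIEW is graphical: once the acquisition is buffered and hardware-timed, each sample's relative timestamp follows from its index and the configured rate alone. The rate and data values below are made up for illustration.)

    # Illustration only: relative timestamps implied by buffered, hardware-timed acquisition.
    sample_rate = 200.0                 # S/s, set when the AI task is configured (assumed value)
    dt = 1.0 / sample_rate              # constant delta time between samples: 5 ms here

    block = [0.12, 0.15, 0.11, 0.14]    # one read from the buffer (made-up force readings)
    timestamps = [i * dt for i in range(len(block))]   # 0.000, 0.005, 0.010, 0.015 s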

Does this help?

Randy


James,

Since you are programming in LabVIEW Real-Time, the best solution to this problem is to make your time-critical loop a hardware-timed single scan. In LabVIEW Real-Time this is where your power comes from. Basically, you set up a non-buffered single scan which controls the timing of your loop; you then read the iteration counter to determine how much time has passed. The resolution of your count then depends solely on how fast you run that loop, minus the slop from the hardware crystal oscillator on the DAQ board (which is minimal).
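(A quick sketch of the arithmetic Derek describes, not actual LabVIEW RT code: when a hardware-timed single scan paces the loop, elapsed time is just the iteration count times the scan period. The scan rate below is an assumed value.)

    # Illustration only: elapsed time from the iteration counter of a hardware-timed loop.
    scan_rate = 1000.0              # scans per second (assumed)
    period = 1.0 / scan_rate        # one scan per loop iteration

    for i in range(5):              # stands in for the time-critical while loop
        elapsed = i * period        # time since the loop started, to the resolution of one scan
        # ... read the scan, run the control calculation, update the drive ...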

Derek Lewis


Thanks guys, but I am now a bit more confused... I'm not sure what the difference between LV and RT is - I'm using standard LabVIEW 6.1?

I have used the LV "AI config", "AI Start" and "AI Read" for acquiring the analogue force measurements and written some custom VIs to interact with the digital mike drive.

I have attached the main VI (not including the digital ones for size reasons).

Thank you so much for your help!!!

James

Download File:post-1372-1105912480.vi


Hi James,

If you don't know what RT is, then, as you figured out, you don't have it :) It is very cool however, if you need it. Just look in the NI catalog or website to get a description.

You're already running in the fashion I suggested, that is, buffered input. When you set the AI Start to continuous / 200 samples/sec, you made the AI hardware-timed by the board itself. This means that when you read the samples, you know that they have a delta time of 1/200 s (5 ms). There's your timestamp right there.

When you pluck out the first sample of 50 (the amount you read per loop) and go off to the motor control section, you know the delta time for these samples is 250 ms (50/200 s). BTW, you'd get better data if you averaged the 50 samples of channel 3 instead of just using the first element.
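(A sketch of that averaging suggestion, in Python for illustration; the 50-sample blocks and the 200 S/s rate come from the posts above, while the function name and data handling are made up.)

    # Illustration only: each loop reads 50 samples at 200 S/s, so one block spans 0.25 s.
    sample_rate = 200.0
    block_size = 50
    block_dt = block_size / sample_rate        # 0.25 s between successive blocks

    def process_block(block_index, channel3_block):
        block_time = block_index * block_dt    # relative timestamp for this block
        force = sum(channel3_block) / len(channel3_block)   # mean beats using only element 0
        return block_time, force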

So, to answer your first question, to plot this data just wire it to a waveform chart. Since you have evenly spaced data, there's no need for an XY graph. Set the chart X axis delta time to match the data if you want accurate X scale time values. Even if you don't do this, the data will plot correctly... except that the X axis would be in units of samples instead of time.

Hope this helps,

Randy


Randy,

That makes sense... BUT if I have the AI, control system and motor control in the same loop, then there is a significant delay before the motor responds to the force change. I am aiming for (and have achieved) a response time of less than 1 s. But because of the timing inconsistency, the timestamp stamps more than one entry with the same millisecond count - sometimes three, sometimes four or five. Thus it is hard to know exactly what the timestamp is.

Also, the plotting on a waveform chart takes significant time, and by removing it I can at least double my sampling rate... I can plot the results in Excel after the data gathering :)

Thanks so much,

James


Now I'm a bit confused... Time stamping the AI is implied... There's no reason to check your clock, you can tell by the sample number (kept track of by you) what time the sample happened at (relative time, that is). This makes me wonder if you're trying to timestamp the drive commands sent by you? That's an entirely different animal. Is this the case?

Randy


Do you mean that because I've set the sample rate to be 200 Hz, every sample will be at 1/200 s? I would have thought so too, but when I had the rate at 100 Hz and timestamped my output file using the tick counter, I found uneven steps at the millisecond resolution of the counter.

I rationalised this by the fact that the acquisition, feedback system, mike drive commands and file writing take time. That is why I put the writing of data to the file all in one place (so it would get written at the same time). But because I'm using the tick counter with a sampling rate where I write samples faster than a millisecond apart, it is difficult to know the exact (relative) time each sample was taken / written... so I want to replace the millisecond (tick counter) timer with a sub-millisecond one...

I have tried to increase the sampling rate of the AI but the buffer fills up causing the system to slow down and stop! :(

Hope we don't confuse each other too much :)

Thank you muchly!!!

James


Your initial impression that 200 samples/sec means the sample delta time is 1/200 s is correct. You let yourself get talked out of it due to the time stamps created during the *polling* of the AI buffer. Just because your polling of this buffer is non-deterministic does not mean that the determinism during *filling* of the buffer is suspect. The continuous analog input task you have set up is hardware-timed, meaning the board has its own clock that is running the A/D conversions. You can depend on the delta times of the samples being very accurate.
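(To illustrate that last point with a made-up Python sketch: no matter how unevenly the loop empties the buffer, keeping a running count of samples already read recovers exact relative timestamps, because the spacing is fixed by the board's clock, not by when the read happened.)

    # Illustration only: uneven, jittery reads, but exact timestamps from a running sample count.
    sample_rate = 200.0
    dt = 1.0 / sample_rate

    samples_seen = 0
    for block in ([0.10, 0.20, 0.30], [0.25, 0.27], [0.31, 0.29, 0.28, 0.26]):  # made-up reads
        for offset, value in enumerate(block):
            t = (samples_seen + offset) * dt   # depends only on the sample's position in the stream
            # ... write (t, value) to the data file ...
        samples_seen += len(block)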

Now, how you synchronize your drive signals with the AI data is another issue altogether and completely depends on how you write your program.

Let me know if this does anything for you :)

Randy


It does it for me all right!!

I have changed the code and run the program with the little lightbulb (execution highlighting) on to see how things are running... I can see that each cycle of the loop does everything once, and therefore the time for sampling, updating the motor and writing to file is all contained within the loop. However, the output shows timestamps being the same and then jumping by 16 ms (I originally thought the jump was 1 ms - hence the need for greater resolution). I have attached the VI and output for clarification.

Thanks

James

Download File:post-1372-1106106723.vi

Download File:post-1372-1106106725.txt


The reason the timestamps are jumping is that you are coercing a really big U32 to a single-precision float. Just type one of your timestamps into a float control and try to add one millisecond to it; you'll see what I mean. Depending on how long your program runs, you may be able to make this work by making your timestamp a delta time... get the tick before the loop and then subtract that original value from the current tick. Also, converting milliseconds to seconds will help, since you can then use some of the part of the float to the right of the decimal point.
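(The precision problem is easy to demonstrate outside LabVIEW; here is a short Python/NumPy check of the same arithmetic. The tick values are invented, but the single-precision behaviour is the point.)

    import numpy as np

    # A big millisecond tick count (machine up for ~28 hours) coerced to single precision:
    tick = np.float32(100_000_000)
    print(tick + np.float32(1) == tick)    # True: adding 1 ms changes nothing (24-bit mantissa)

    # The fix suggested above: subtract the starting tick and convert to seconds,
    # so the values stay small and the fractional part of the float carries the resolution.
    start_tick = 100_000_000
    current_tick = 100_000_016
    elapsed_s = (current_tick - start_tick) / 1000.0   # 0.016 s, easily representable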

Keep it coming, we'll get this going :)

Randy


BTW, I should mention that, as I understand it, the tick count represents milliseconds since the computer was booted. I tell you this so you understand why it sometimes has small values and seems to work, and other times it has large values. This timer will roll over in 49.7 days; however, if you always subtract "tick then" from "tick now" and keep the answer as an unsigned integer, the rollover will not affect the delta time. It's cool how this works out. Like this: 2 - 255 still equals 3 in the case of U8s... So you can see, the delta time always gets the right answer unless you roll the timer twice, which is unlikely with a U32.
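(The wraparound arithmetic can be checked in a couple of lines; Python integers don't wrap on their own, so the mask below stands in for the unsigned width. The tick values are made up.)

    # Illustration only: unsigned subtraction survives one rollover between reads.
    then_u8, now_u8 = 255, 2
    print((now_u8 - then_u8) & 0xFF)              # 3, matching the U8 example above

    then_u32, now_u32 = 4_294_967_290, 10         # ms tick rolled over near the 49.7-day mark
    print((now_u32 - then_u32) & 0xFFFFFFFF)      # 16 ms elapsed, despite the rollover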

Randy


Randy,

AWESOME!!!

You are amazing!

:worship:

I thought I was getting fantastic resolution when in fact I was skipping time and getting rubbish. Now I realise that 200-250 Hz is about the limit because of the mike-drive control.

Thank you for your time, patience and wisdom.

James.
