mwebster

Drift between Date/Time and Tick Count (ms)

Recommended Posts

I can't seem to find a definitive source delineating the difference between Get Date/Time in Seconds and the Tick Count (ms) timer.  I was thinking that maybe Get Date/Time checks the RTC at startup and keeps track from there, perhaps with occasional corrections.  But there is definitely some difference in behavior between Windows 7 and XP.

 

[Attached image: post-22117-0-46716500-1380295677.jpg]

 

I've attached a test VI and below are a couple of pictures showing the results between my two systems.
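For anyone who wants to poke at this without LabVIEW, here's a rough Python analog of the attached VI (my own sketch, not the VI itself): `time.time()` stands in for Get Date/Time in Seconds and `time.monotonic()` for Tick Count (ms) -- an assumed mapping, not how LabVIEW implements them.

```python
import time

def sample_drift(n_samples=50, interval_s=0.01):
    # time.time()      ~ Get Date/Time in Seconds (wall clock)
    # time.monotonic() ~ Tick Count (ms)          (free-running tick)
    # Record the offset between elapsed wall time and elapsed tick time.
    t0_wall = time.time()
    t0_tick = time.monotonic()
    offsets = []
    for _ in range(n_samples):
        time.sleep(interval_s)
        offsets.append((time.time() - t0_wall) - (time.monotonic() - t0_tick))
    return offsets

offsets = sample_drift()
print("offset range: %.6f .. %.6f s" % (min(offsets), max(offsets)))
```

Plotting those offsets over a long run should show the same kind of drift curve as the VI's graph.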

[Attached image: post-22117-0-87809600-1380295626.jpg]

Windows 7 (normal result)

[Attached image: post-22117-0-82071100-1380295575.jpg]

Windows 7 (only happened once out of a few dozen trials)

 

[Attached image: post-22117-0-72727000-1380295354.jpg]

Windows XP

 

Now, Windows XP does drift eventually.  Left overnight, it got up to ~800ms.

 

For Windows 7, the stairstep repeats over time and the drift rate is quite a bit higher, even though the "noise" amplitude seen on XP is absent.  A second Windows 7 system's results look almost identical to mine, and a Windows XP VM looked very much like the XP desktop I tested on.

 

Anyway, if anyone could enlighten me I'd appreciate it.

 

Thanks,

Mike

[Attachment: Timer Drift Tester.vi]


Hi mwebster,

 

I've downloaded your VI and confirmed the behaviour on my Win7 system with LV2011. The stepping effect in your diagrams, however, never occurred for me. I think the term 'drift' is misleading you here.

The 'date/time in seconds' value will always drift slightly relative to the 'tick count', as the tick count is less accurate (significant digits are missing and the tick counter is rounded!). You can see that in your graph, which on my attempt spans a range from -0.0006 ms to +0.0008 ms. That's a range of only 1.4 µs!

 

For example:

 

date/time:  15.3388772010803223

tick count: 15.3390000000000004

difference: -0.0001227989196781
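Just to check that arithmetic, here is the same subtraction in plain Python (the numbers are the ones from the example above):

```python
date_time = 15.3388772010803223   # high-resolution date/time reading
tick = 15.3390000000000004        # tick counter, rounded to the ms
difference = date_time - tick     # the sub-millisecond residual
print(difference)
```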

 

That's what causes the 'drifting' effect in your graph: the timer needs many iterations to line up again in terms of microseconds. Eventually both values round up together and the cycle starts from the beginning. There is always a chance that one of the two functions executes slightly after the other, since your system is not a real-time system and the two functions have different internal complexity. Non-real-time systems also habitually suspend execution of some threads to allow execution of others; maybe that's causing the stepping effect in your graph. Depending on the number of CPU cores and the way Microsoft handles threads (this also changed between XP / Vista / Win7 / Win8 / ...), your experience might change too.
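The rounding cycle can be sketched numerically (a toy model, not LabVIEW's internals): a tick clock that rounds to the nearest millisecond produces a difference that ramps between roughly -0.5 ms and +0.5 ms and then wraps, which is exactly a sawtooth.

```python
def quantized_diff(t, tick_resolution=0.001):
    # Tick clock reading: true time rounded to the nearest tick (1 ms here).
    tick = round(t / tick_resolution) * tick_resolution
    return t - tick

# Sweep true time in 0.1 ms steps: the difference ramps up, wraps to
# negative at the half-tick boundary, and repeats.
diffs = [quantized_diff(k * 0.0001) for k in range(30)]
print(["%+.4f" % (d * 1000) for d in diffs])
```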

 

I'm not sure, though, why your XP system had a difference of 800 ms... I remember something about one of the timers possibly being handled by the BIOS instead of the CPU. A quick search turned this up: http://digital.ni.com/public.nsf/allkb/4E12F6841016929D86257126007A9D94

I assume the system synchronized its time over the internet at some point over the weekend. If the computer is not connected to the internet, I've no clue.

 

As for the differences you asked about: the date/time is more accurate and absolute, whereas the tick count is less accurate and relative to the start of your system (ms since boot). If you need an accurate way to measure execution times or delays, the tick count is all you need; the system only resolves down to milliseconds anyway. Use date/time whenever you need an exact timestamp.
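The same division of labour, sketched in Python (the analogous clocks, not the LabVIEW primitives themselves):

```python
import time

# Elapsed times and delays: use the tick-style monotonic clock; it is
# immune to wall-clock adjustments (NTP sync, user changing the date).
start = time.monotonic()
time.sleep(0.05)
elapsed = time.monotonic() - start

# Absolute timestamps: use the wall-clock date/time.
stamp = time.time()  # seconds since the Unix epoch
print("elapsed %.3f s, timestamp %.3f" % (elapsed, stamp))
```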


Those steps are 1 ms tall on my box, and they keep diverging ever further apart without resyncing.  However, you did remind me that the XP Get Date/Time function only has ~16.67 ms resolution, which accounts for the "noise" I see on the XP box.  In fact, all the noise on the XP system probably hides this divergence until enough time has passed to make the "DC" component apparent, since the "AC" part is going to be around ±8 ms.
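A quick way to see how a coarse update period buries a slow drift (toy numbers; the ~16.67 ms update period is the figure quoted above, everything else is illustrative):

```python
RES = 1 / 60  # ~16.67 ms: assumed update period of the XP date/time source

def xp_datetime(t):
    # The date/time reading only advances once every RES seconds.
    return (t // RES) * RES

# The reading lags true time by anywhere from 0 to ~16.67 ms, so a
# sub-millisecond drift stays hidden inside that band until it has
# accumulated past it.
errors = [t - xp_datetime(t) for t in (0.001 * k for k in range(2000))]
print("max quantization error: %.4f s" % max(errors))
```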

 

On my Windows 7 laptop, it takes ~15 seconds for one of those 1ms steps to occur, so you might need to let it run a little while to see the behavior.

 

 

Weird ... The last two or three times I've run this, I'm getting results that are consistently like this:


[Attached image: post-22117-0-29893900-1380316000_thumb.j]

 

This is an 8-core CPU, so maybe it has to do with how threads are assigned to CPU cores?  But if you look at my first post, it's apparent that it can shift from one pattern to the other.  In fact, this most recent change happened while running: it started with the flat stairsteps, then after a couple of minutes went over to the sawtooth pattern.

 

Edit again:

And now it's back to stairsteps again...

Edited by mwebster


Windows 7 long run:

[Attached image: post-22117-0-88452400-1380328895_thumb.j]

 

When zoomed in, the increasing areas show the sawtooth pattern and the decreasing areas the stairstep.  The x-axis is in seconds, with markers placed at approximately the inflection points.

 

Windows XP, on the other hand, holds steady for much longer periods and then shows relatively sudden deviations:

[Attached image: post-22117-0-78684400-1380329527_thumb.j]

 

Data prior to 7000 seconds was identical to the first section of the graph.  The data was still descending slightly approaching 9000 seconds, then became rock solid again for another 3500 seconds, at which point I ended the run.  Eh, what the heck, here's the zoomed-out view:

[Attached image: post-22117-0-57295100-1380329816_thumb.j]


You're forgetting to take into account that almost all systems nowadays are set to synchronize their time to a time server (if they have any internet connection). This means the RTC is periodically adjusted to whatever the configured time server reports (and smarter algorithms attempt this adjustment incrementally rather than in one big step).

 

The timer tick, however, is simply an internal interrupt timer derived from the crystal oscillator that drives the CPU. This oscillator is not very accurate, as it really doesn't matter much whether your CPU runs at 2.210 GHz or 2.211 GHz. The timer tick is used to drive the RTC in between time-server synchronizations, but not more.
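The incremental adjustment mentioned above (clock slewing) can be sketched like this; the 0.5 ms/s slew rate is made up purely for illustration:

```python
def slew_schedule(offset_ms, max_slew_ms_per_s=0.5):
    # Remove the offset a bounded amount per second, instead of
    # stepping the clock in one visible jump (illustrative rate).
    steps = []
    remaining = offset_ms
    while abs(remaining) > 1e-9:
        step = max(-max_slew_ms_per_s, min(max_slew_ms_per_s, remaining))
        steps.append(step)
        remaining -= step
    return steps

# Slewing out an 800 ms offset at 0.5 ms/s takes 1600 seconds --
# it would appear as a long, gentle ramp on the graphs above.
steps = slew_schedule(800.0)
print(len(steps), sum(steps))
```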




