
Hi

 

I need the UTC time and to convert it to ticks.

I use this code (LV2012 / .NET 2.0):

[Attached image: UTCNow DotNet.png]

 

 

 

Why is the Now time (local) the same as the UTC time? There should be a difference of 2 hours!

 

I made a program in C# and it shows the right time (a 2-hour difference between local and UTC).

 

regards Bjarne

Why not? The .NET DateTime structure has a Kind property which can have the value DateTimeKind.Utc, DateTimeKind.Local, or DateTimeKind.Unspecified. LabVIEW timestamps are internally ALWAYS the number of seconds since midnight, Jan 1st, 1904, UTC. So LabVIEW properly translates the .NET DateTime structure into its own timestamp format and takes care of doing the proper translation depending on the DateTimeKind value in that structure. If you want to display UTC in the LabVIEW control, you have to change the display format of that control accordingly, not change the internal time value of the timestamp. :D
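As a quick C# illustration of that Kind behaviour (just a sketch, not taken from your VI or program):

using System;

class KindDemo
{
    static void Main()
    {
        DateTime local = DateTime.Now;     // Kind == DateTimeKind.Local
        DateTime utc   = DateTime.UtcNow;  // Kind == DateTimeKind.Utc

        Console.WriteLine("Now:        {0}  (Kind = {1})", local, local.Kind);
        Console.WriteLine("UtcNow:     {0}  (Kind = {1})", utc, utc.Kind);

        // Both values describe (nearly) the same instant, so converting the
        // local value to universal time reproduces the UtcNow reading.
        Console.WriteLine("Now as UTC: {0}", local.ToUniversalTime());
    }
}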

 

I find it cleaner to change the property of the display element (or use the ToString() method) than to maintain all kinds of extra flags with the timestamp itself, although that does have some implications when you move timestamps between timezones. On the other hand, also maintaining the relative timezone properties with each timestamp, while more flexible, requires a lot more data storage and all kinds of special-case logic.
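In C# terms the idea looks roughly like this (only a sketch): the stored instant stays the same and only the textual rendering changes.

using System;

class DisplayDemo
{
    static void Main()
    {
        DateTime now = DateTime.Now;  // one stored instant

        // Same underlying value, two different presentations for display.
        Console.WriteLine("Shown as local time: {0}", now.ToString("yyyy-MM-dd HH:mm:ss"));
        Console.WriteLine("Shown as UTC:        {0}", now.ToUniversalTime().ToString("yyyy-MM-dd HH:mm:ss"));
    }
}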

Hi rolfk


Thank you.

I don't think I understand it all, but what I need is the UTC time in ticks, and I can get that from .NET.

I don't need the timestamp format, although it would be nice and easy to read :-).

I have made a .NET DLL and call it in LabVIEW, and I am getting what I want.
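The DLL boils down to something like this (the namespace and method names here are only an example, not necessarily what I actually used):

using System;

namespace UtcTicksHelper
{
    // One static method that returns the current UTC time as .NET ticks
    // (100 ns units counted from midnight, January 1, 0001).
    public static class UtcClock
    {
        public static long GetUtcNowTicks()
        {
            return DateTime.UtcNow.Ticks;
        }
    }
}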

 

regards
Bjarne

UTC time in ticks is very unspecific. Ticks are simply an arbitrary unit with an arbitrary reference time. Traditionally, a tick was the 55 ms interval of the timer tick used under DOS and 16-bit Windows, and even early versions of Windows NT used that timer tick. Newer Windows versions use a 1 ms timer tick internally but still have functions that scale it to 55 ms. The reference time is usually the start of the machine.
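A quick C# illustration (my addition) of how differently "ticks" can be defined depending on which API you ask:

using System;

class TickDemo
{
    static void Main()
    {
        // Environment.TickCount: milliseconds since the machine started
        // (a 32-bit value, so it wraps after roughly 25 days).
        Console.WriteLine("Milliseconds since boot:             {0}", Environment.TickCount);

        // DateTime.Ticks: 100 ns intervals counted from midnight, January 1, 0001.
        Console.WriteLine(".NET ticks for the current UTC time: {0}", DateTime.UtcNow.Ticks);
    }
}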

 

Obviously, when you talk about UTC you most likely mean a more absolute value than the start time of the computer. Still, the reference time is rather arbitrary. While the .NET DateTime uses midnight, January 1, 0001 (supposedly UTC, but the .NET DateTime.Ticks description is entirely unclear about this) with 100 ns resolution, LabVIEW uses midnight, Jan 1, 1904, GMT as reference with a 1 s resolution. Windows itself has several different formats, such as the C runtime time_t, which is typically referenced to midnight, Jan 1, 1970, UTC with a 1 s resolution (the same as what most Unixes, or more specifically the C runtime library on Unix, use). But Windows also has a FILETIME format, which is referenced to midnight, Jan 1, 1601, UTC with a resolution of 100 ns.
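To make those reference times concrete, here is a small C# sketch (mine; it sticks to plain DateTime arithmetic so it also works on old framework versions) that expresses the same instant against each of the epochs mentioned:

using System;

class EpochDemo
{
    static void Main()
    {
        DateTime utcNow = DateTime.UtcNow;

        DateTime labviewEpoch = new DateTime(1904, 1, 1, 0, 0, 0, DateTimeKind.Utc); // LabVIEW timestamp epoch
        DateTime unixEpoch    = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc); // C runtime time_t epoch

        Console.WriteLine(".NET ticks (100 ns since 1/1/0001): {0}", utcNow.Ticks);
        Console.WriteLine("FILETIME (100 ns since 1/1/1601):   {0}", utcNow.ToFileTimeUtc());
        Console.WriteLine("time_t seconds (since 1/1/1970):    {0}", (long)(utcNow - unixEpoch).TotalSeconds);
        Console.WriteLine("LabVIEW seconds (since 1/1/1904):   {0}", (utcNow - labviewEpoch).TotalSeconds);
    }
}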

 

Now, LabVIEW's timestamp format in fact supports fractional seconds with a resolution of 1/2^32 s and internally retrieves its values from a FILETIME value, so if you convert the timestamp into a floating-point value you still get about 1/2^20 s accuracy (roughly 1 µs) for the foreseeable future. So if your reference time doesn't have to be specifically the .NET DateTime value, all you would likely need to do is simply convert the LabVIEW timestamp into a floating-point value, and you can forget about any external DLLs.
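A rough C# check (again only a sketch) of that resolution claim: with the roughly 3.5 billion seconds since 1904 sitting to the left of the decimal point, the spacing between adjacent double values is still well below a microsecond.

using System;

class ResolutionDemo
{
    static void Main()
    {
        DateTime labviewEpoch = new DateTime(1904, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        double seconds = (DateTime.UtcNow - labviewEpoch).TotalSeconds;

        // The spacing between adjacent doubles at this magnitude is the effective
        // fractional-second resolution after converting the timestamp to a double.
        double resolution = Math.Pow(2.0, Math.Floor(Math.Log(seconds, 2.0)) - 52.0);

        Console.WriteLine("Seconds since 1904: {0}", seconds);
        Console.WriteLine("Double resolution:  {0} s (sub-microsecond)", resolution);
    }
}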


  • Similar Content

    • By drjdpowell
      In an attempt to standardize my handling of formatting timestamps as text, I have added functions to "JDP Science Common Utilities" (a VI support package, on the Tools Network). These are used by SQLite Library (version just released) and JSONtext (next release), but they can also be used by themselves (LabVIEW 2013+). Follows RFC3339, and supports local-time offsets.


    • By Manudelavega
      A part of my code converts a time stamp into a string. Another part will have to read this string and convert it back to a time stamp. However, when I run this quick example I've created, I get an error. Does anybody know what I'm missing?
       
      Thanks

    • By Phillip Brooks
      My new job has me starting from scratch DB wise. I worked with a homegrown schema for many years, but it doesn't match my needs here and I want to make this supportable long term (for the next poor soul), so I was thinking of using the out-of-the-box schema with MySQL.
       
      I've got MySQL 5.1 and TS 2010 installed; the TS 2010 MySQL implementation uses INSERT statements which lack my one pet peeve, a timestamp data type. I think this may be because MySQL, at the time TS 2010 was developed, only had a date-time data type.
       
      Does anyone know if a newer version of TestStand includes support for a timestamp data type with the MySQL template?
       

    • By mwebster
      Greetings,
      I've been playing around with the timestamp recently and I'd like to throw out some of my findings to see if they're expected behavior or not.
      One issue is that if you call Get Date/Time in Seconds, the minimum interval is ~1ms. Now this is much better than ~2009 where the minimum interval was ~14ms, if I recall correctly. Meaning, even if you call Get Date/Time in Seconds a million times in a second, you only get ~1000 updates.
      The other oddities involve converting to double or extended and from double/extended back to timestamp. It seems that when converting to double, at least with a current day timestamp, your resolution is cut off at ~5e-7 (0.5us). This is expected given that you only have so many significant digits in a double and a lot of them are eaten up by the ~3.4 billion seconds that have elapsed since 7:00PM 12-31-1903 (any story behind that start date, btw?) However, when you convert to extended, you get that same double-level truncation, no extra resolution is gained.
      Now, when you convert from a double to a timestamp, you have that same 0.5us step size. When you convert an extended to a timestamp, you get an improved resolution of 2e-10 (0.2ns). However, the internal representation of the timestamp (which, as far as I know, is a 128-bit floating point) allows down to ~5E-20 (this is all modern era with 3.4 billion to the left of the decimal).
      One other oddity, if you convert a timestamp to a date-time record instead of an extended number (which gets treated like a double as noted above), you once again get the 2e-10 step size in the fractional second field.
      I've tried a few different things to "crack the nut" on a timestamp like flattening to string or changing to variant and back to extended, but I've been unable to figure out how to access the full resolution except via the control face itself.
      Attached is one of my experimentation VI's. You can vary the increment size and see where the arrays actually start recording changes.
      Ah, one final thing I decided to try, if I take the difference of two time stamps, the resulting double will resolve a difference all the way down to 5e-20, which is about the step size I figured the internal representation would hold.
      I would like to be able to uncover the full resolution of the timestamp for various timing functions and circular buffer lookups, but I would like to be able to resolve the math down to, say 100MHz or 1e-8. I guess I can work around this using the fractional part of the date-time record, but it would be nice to be able to just use the timestamps directly without a bunch of intermediary functions.
      Time Tester.vi
    • By Runjhun
      Hi,
      I'm plotting a bar graph which takes only double data input.
      My X-axis should be the timestamp and my Y-axis is the double data.
      But here I can't figure out any way to convert the timestamp to double so that I can use it as an input.
      Any suggestions?
      Regards,
      Runjhun A.