
Date/Time To Seconds Function and Daylight Savings



I just ran into a problem with my code, and I think I found the issue; I just don't understand the why. After the time change this last weekend, my elapsed time has been off by an hour: it added an hour to the actual elapsed time.

My software grabs the time in seconds when something starts and uses the Elapsed Time to calculate how long it's been running. This has been working just fine until Daylight Savings Time hit me.

[image: cluster constant wired to Date/Time To Seconds, with DST set to 1]

I'm guessing that with DST set to 1 in the cluster constant, my elapsed time calculations have been correct for the last several months because Daylight Savings Time was actually in effect (1), but when the time changed this last weekend the DST value should have become 0. The help file for the Date/Time To Seconds function states the following:

DST indicates whether the time is standard (0) or daylight savings time (1). You also can set DST to -1 to have the VI determine the correct time automatically each time you run the VI. If is UTC is TRUE, the function ignores the DST setting and uses Universal Time.

This statement tells me I should have set DST to -1 in the cluster constant. My work computer won't allow me to change the date on my computer, so I can't think of an easy way to test this.

I'm hoping you guys' years of experience can tell me if I'm right. I know this works for now, but I don't want the problem to return when we spring forward next spring... Are there pitfalls to setting DST to -1 and just letting the OS take care of Daylight Savings Time?
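For what it's worth, LabVIEW's DST input follows the C runtime's tm_isdst convention, so the behavior of 0, 1 and -1 can be reproduced with any C-style mktime. Here's a minimal Python sketch of the idea (POSIX-only because of tzset; the timezone is just an example, not anything from this thread):

```python
import os
import time

os.environ["TZ"] = "America/Los_Angeles"  # example zone; tzset is POSIX-only
time.tzset()

# A wall-clock time that falls inside DST: 1 July 2021, 12:00 local.
fields = (2021, 7, 1, 12, 0, 0, 0, 0)

as_dst      = time.mktime(fields + (1,))   # caller asserts DST is in effect
as_standard = time.mktime(fields + (0,))   # caller asserts standard time
automatic   = time.mktime(fields + (-1,))  # let the C library decide

print(as_standard - as_dst)  # 3600.0: the one-hour error from a stale DST flag
print(automatic == as_dst)   # True: -1 picks the right answer by itself
```

With -1, the library consults the timezone database for that particular date, which is why it keeps working across the transitions instead of baking in whichever flag was correct when the constant was created.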

Thanks in advance for your help... You guys always have the right answers...

Bob

 

 


This is not strictly the answer to your question, but if you only use this for elapsed time, I suggest you switch to UTC time instead... There is no daylight savings time in the UTC frame of reference, so you are certain it won't hit you.
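The same idea in a minimal Python sketch (the sleep is just a stand-in for the process being timed): epoch or monotonic seconds never observe DST, so the subtraction can't jump by an hour.

```python
import time

start = time.monotonic()  # monotonic seconds: immune to DST and clock adjustments
time.sleep(0.2)           # stand-in for the thing being timed
elapsed = time.monotonic() - start

print(f"elapsed: {elapsed:.1f} s")
```

time.time() (UTC-based epoch seconds) also never sees DST, but it can still jump if the clock is set manually or by NTP; a monotonic clock avoids even that, which makes it the safer choice for pure elapsed-time measurement.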

Also, here's a library (shameless plug) you might find useful: https://www.vipm.io/package/labview_open_source_lib_epoch_datetime/
It serves Unix, GPS and ISO time, including support for leap years (and even leap seconds... all 37 of them)


I also strongly recommend using UTC usually (with conversion to Local Time for display).  And using an existing, hopefully well tested, library to convert to/from ISO-8601 (my own functions are in https://www.vipm.io/package/jdp_science_lib_common_utilities/, though they are intended just for the RFC3339 internet standard and may not be as complete as Francois').


But to answer your question, I would go with DST=-1. There will be one hour per year (fall back) where LabVIEW won't be able to know what the UTC time actually is (as there are two possibilities), but otherwise it would work. Note, though, that I never use that function you are using.


Not sure if this is related, but I have had an issue where, if the LabVIEW program had been running and the timezone was changed, the time in LabVIEW would not be updated until the program was restarted. It appears LabVIEW caches the timezone information at the beginning of the program and never receives an event that the timezone has been updated.

I wonder if it does the same for DST.

And yes, I use UTC times for saving metadata in the file, but the users of the program like a local time stamp on the file name such that it matches their notebook log.


LabVIEW also caches DST.

That's no fun if you measure some data at 15:00 local time in the USA (with or without DST), then go to Europe (with or without DST, and with or without restarting the program), then load your data and want to see the correct measurement time in all possible cases.

 


The issues that I have found are frustrating are the following:

  1. Staff go on field tests with their laptops in another timezone and forget to change the timezone. Although the metadata is in UTC, when they look at the times in their file stamps (time created, etc.), they get all confused.
  2. Timezone caching bit me on a field test; luckily I noticed. I started the program, realized I had not changed the TZ, and then wondered why my timed start did not update correctly.

Wow, I think I'm happy this Daylight Savings issue hasn't bitten just me... Sorry guys, but I'm always happier in a full boat, even if it's taking on water!!!

I think the overall consensus is to use UTC within the software and convert to a timezone-corrected display for the user. I'm guessing UTC just keeps marching on and never changes for Daylight Savings Time, but my conversion for the user would need to change. I'm guessing that to do the conversion I would use the DST value, 0 for standard time and 1 for Daylight Savings Time, to determine how I'm converting. I'm hoping this DST value changes right at the Daylight Savings Time transition. I think all this makes sense.

I'm finding it difficult to wrap my hard head around Dr. Powell's comment: "I would go with DST=-1. There will be one hour per year (fall back) where LabVIEW won't be able to know what the UTC time actually is (as there are two possibilities), but otherwise it would work. Note, though, that I never use that function you are using." How are there two possibilities for the actual UTC time? I can't have one hour a year when time isn't correct; my luck would have that one hour out of 8,760 be the very hour I truly need.

Dr. Powell says he never uses the Date/Time To Seconds function... What do you use? What makes your method stronger? I'm sorry, I'm just crazy busy right now and I'm trying to get the answers I need to correct my issue, so when I get time I won't be asking you guys questions... That said, I haven't taken the time to look at your libraries yet. I will soon, I promise.

Any extra thoughts, concerns, knowledge would be greatly appreciated.

Thanks,

Bob

8 hours ago, rharmon@sandia.gov said:

I'm finding it difficult to wrap my hard head around Dr. Powell's comment: "I would go with DST=-1. There will be one hour per year (fall back) where LabVIEW won't be able to know what the UTC time actually is (as there are two possibilities), but otherwise it would work. Note, though, that I never use that function you are using." How are there two possibilities for the actual UTC time? I can't have one hour a year when time isn't correct; my luck would have that one hour out of 8,760 be the very hour I truly need.

There I was thinking about a simple fix to your immediate problem, rather than the much bigger change of switching to UTC.  Using DST=-1 is to use true Local Time, which is what your Users will find most intuitive.  But one time a year your times will jump back one hour, which is a rather problematic behaviour.  You could use Local Time never corrected for Summer Time; easier for you but confusing to Users.
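The "two possibilities" are easy to see in any timezone-aware library. A small Python illustration (zone and date chosen purely as an example): during fall back, one wall-clock reading maps to two different UTC instants, distinguished only by the `fold` flag.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

tz = ZoneInfo("America/New_York")
# 01:30 on 7 Nov 2021 happened twice: once in EDT, then again in EST.
first = datetime(2021, 11, 7, 1, 30, tzinfo=tz)           # fold=0: the EDT reading
second = datetime(2021, 11, 7, 1, 30, fold=1, tzinfo=tz)  # fold=1: the EST reading

print(first.astimezone(timezone.utc))   # 2021-11-07 05:30:00+00:00
print(second.astimezone(timezone.utc))  # 2021-11-07 06:30:00+00:00
```

A bare local timestamp recorded during that hour simply cannot say which of the two UTC instants it meant, which is exactly why storing UTC (or an explicit offset) sidesteps the problem.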

Aside: I suffer the programmer disadvantage of living in the Western European Timezone, which is the same as UTC for half the year, making it hard to notice errors in my UTC-Local conversion.  I often have my computer set to a far away Timezone when I'm debugging time-related things.

9 hours ago, rharmon@sandia.gov said:

Dr. Powell says he never uses the Date/Time To Seconds function... What do you use? What makes your method stronger?

I used to use the Scan From String function, with %T codes:

[image: block diagram snippet using Scan From String with a %T format code]

But I wouldn't say this was any stronger than your method.

Now I use a reusable library for RFC3339 (a subset of ISO-8601). It mostly uses Scan From String, but is robust against a variety of edge cases. It also handles things like the Local Time plus Offset format, which is nice in that it stores both Local Time and UTC time:

[image: block diagram snippet of the RFC3339 library in use]
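As an aside, the Local Time plus Offset idea can be sketched with nothing but a standard-library parser (the timestamp string here is a made-up example, not output of the library above):

```python
from datetime import datetime, timezone

s = "2021-11-11T09:01:25-05:00"  # hypothetical RFC3339 "local time + offset" stamp

dt = datetime.fromisoformat(s)        # Python 3.7+ parses the offset
local_wall = dt.replace(tzinfo=None)  # what the operator's clock showed
utc = dt.astimezone(timezone.utc)     # the unambiguous instant

print(local_wall)  # 2021-11-11 09:01:25
print(utc)         # 2021-11-11 14:01:25+00:00
```

Because the string carries both the wall-clock reading and its offset from UTC, nothing is lost: you can always recover either view without consulting a timezone database.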

Edited by drjdpowell

Thanks for your great explanation. As I look into modifying/correcting my issue, changing to UTC time would be very involved and would probably take more time than I have available. I'm leaning toward just changing the DST to -1 and moving on.

Just so I think I understand: by changing DST to -1, the operating system will make the corrections for Daylight Savings. When we spring forward, my data will appear to be missing one hour. The data will still be taken, but the timestamps will jump ahead, making it appear I lost one hour of data.

And when we fall back, my data will appear to repeat one hour. The data will still be correct; it will just appear that I have taken two sets of data during that one hour... Does that make any sense at all???? My head is spinning.
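That is exactly what happens, and walking UTC time across both transitions makes it visible. A small Python sketch (zone and dates are just examples; the LabVIEW code in this thread is graphical, so Python stands in):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

tz = ZoneInfo("America/New_York")

def local_stamps(start_utc, steps):
    """Render a run of evenly spaced UTC instants as local wall-clock labels."""
    return [(start_utc + timedelta(minutes=30 * i)).astimezone(tz)
                .strftime("%H:%M") for i in range(steps)]

# Spring forward (13 Mar 2022): the 02:xx hour never appears.
print(local_stamps(datetime(2022, 3, 13, 6, 30, tzinfo=timezone.utc), 4))
# ['01:30', '03:00', '03:30', '04:00']

# Fall back (6 Nov 2022): the 01:xx labels appear twice.
print(local_stamps(datetime(2022, 11, 6, 5, 0, tzinfo=timezone.utc), 5))
# ['01:00', '01:30', '01:00', '01:30', '02:00']
```

In spring the 02:xx labels never occur (the apparent gap); in fall, 01:00 and 01:30 occur twice (the apparent duplicate hour). The underlying UTC instants march on uniformly the whole time.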

Thank you all again... 


ISO 8601, just like its semi-sibling RFC3339, also supports an optional timezone offset identifier. Mr. Powell's library deals with that, and you should probably use his library.

Basically, if it is not present you need to treat the timestamp as being in the local timezone (which can be tricky when interpreting data that was acquired somewhere else). Local timezone offset means using the -1 indicator for the DST value in the cluster. And yes, LabVIEW still needs to learn to listen to the OS message WM_SETTINGCHANGE, which indicates (among many other things) a date/time attribute change; requesting that information from the OS every time it deals with time conversions would be too costly.

Edited by Rolf Kalbermatter
