Let's say my time zone is America/Chicago. When Daylight Saving Time ends on Nov 1st, the clock will tick from 1:59:59 CDT (UTC -05:00) to 1:00:00 CST (UTC -06:00), if I understand DST correctly.
This means that on Nov 1st, there will be two instances of the time span 1am to 1:59:59am. When I create an event in Google Calendar on Nov 1st from 1:30am to 1:45am, how does it know which span of 15min I'm referring to?
Naturally, it seems like the way to distinguish between the first instance of 1:30am to 1:45am America/Chicago and the second instance of 1:30am to 1:45am America/Chicago is by their respective offsets (UTC -05:00 and UTC -06:00, respectively).
The issue is that Google Calendar doesn't seem to deal with offsets, only time zones, but in this case there are two different instances of 1:30am to 1:45am America/Chicago, since America/Chicago has two different offsets (UTC -05:00 during DST, UTC -06:00 standard time).
Am I missing something, or is this an edge case / UI issue?
Thanks!
This will depend on the user's perspective. As you can see in the documentation:
You can change your time zone and create events in certain time zones. No matter where you create an event, everyone will see it in their own time zone. This can help with travel plans or make it easy to create events for people around the world.
Calendar uses UTC Time Zone, so:
When events are created, they're converted into UTC, but you'll always see them in your local time.
This means your event at UTC-05:00 from 1:30 am to 1:45 am will show up at that time, and after the time change it will not appear again, because the clock falls back at 1:59:59, after the event has ended.
If your event instead ran from 1:50 am to "2:15 am", then with the time change it would actually start at 1:50 am UTC-05:00 and end at 1:15 am UTC-06:00.
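If it helps to see the overlap concretely, here is a minimal sketch in Java with java.time (I am assuming the year 2015, one of the years when US DST ended on November 1st). The same wall-clock time corresponds to two different moments in UTC, and a calendar that stores UTC has to pick one of them:
// A minimal java.time sketch (Java 8 and later), assuming the year 2015.
LocalDateTime ambiguous = LocalDateTime.of(2015, 11, 1, 1, 30); // 1:30 AM on the fall-back night; not yet a specific moment.
ZoneId chicago = ZoneId.of("America/Chicago");
ZonedDateTime first = ZonedDateTime.of(ambiguous, chicago);     // java.time resolves the overlap to the earlier offset, -05:00 (CDT).
ZonedDateTime second = first.withLaterOffsetAtOverlap();        // The repeat of 1:30 AM an hour later, at offset -06:00 (CST).
System.out.println(first.toInstant());  // 06:30 UTC
System.out.println(second.toInstant()); // 07:30 UTC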
I have a few queries regarding time zones:
Can the time be captured in UTC alone?
Are UTC -6 and GMT -6 the same, and does that mean it is US local time?
Say, I have UTC time as "02-01-2018 00:03" does that mean my US local time is "01-01-2018 18:00"?
I have searched on Wikipedia and many related websites but haven't found a relevant explanation.
Astronomy versus Atomic clock
By the original definitions the difference is that GMT (also officially known as Universal Time (UT), which may be confusing) is based on astronomical observations while UTC is based on atomic clocks. Later, GMT came to be used, at least unofficially, to refer to UTC, which blurs the distinction somewhat.
GMT stands for Greenwich Mean Time, the mean solar time at the Royal Observatory in Greenwich, on the south bank of the Thames in east London, UK. When the sun is at its highest point exactly above Greenwich, it is 12 noon GMT. Except: the Earth spins slightly unevenly, so 12 noon is defined as the annual average (mean) of the times when the sun is at its highest point, its culmination. In GMT there can never be any leap seconds because Earth’s rotation doesn’t leap.
UTC, which stands for Coordinated Universal Time in English, is defined by atomic clocks, but is otherwise the same. In UTC a second always has the same length. Leap seconds are inserted in UTC to keep UTC and GMT from drifting apart. By contrast, in GMT the seconds are stretched as necessary, so in principle they don’t always have the same length.
For roughly 100 years GMT was used as the basis for defining time around the world. Since the world these days mostly bases precise definition of time on atomic clocks, it has become customary to base the definition of time on UTC instead.
Edit: The original meaning of GMT is somewhat useless these days, but the three letter combination doesn’t seem to go away. I take it that it is often used without regard to whether UTC is really intended, so don’t put too much trust into the strict definition given above.
For your questions:
Yes, time can be captured in UTC alone. Storing time in UTC and using UTC for transmitting date-time information is generally considered good practice (a short sketch follows after these answers).
I suppose it’s up to each state of the US to define its time. And I don’t know, but I suppose that today they (officially or in practice) define time as an offset from UTC rather than GMT. The difference between the two will always be less than a second, so for many purposes you will not need to care. Central Standard Time (for example America/Chicago) is at offset -6, as is Mountain Daylight Time (for example America/Denver). On the other hand, offset -6 doesn’t necessarily imply a time in the US. Parts of Canada and Mexico use it too, of course, and also Galapagos and Easter Island.
I don’t think you got your example time exactly right, but yes, 2 January 2018 at 00:00 UTC is the same point in time as 1 January 2018 at 18:00 in Chicago and other places that are at UTC-6 in winter (winter on the Northern hemisphere, that is).
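To make the first point concrete, here is a minimal sketch in Java with java.time (the variable names are mine) of capturing a moment in UTC and round-tripping it through ISO 8601 text:
// Assumes java.time from Java 8 or later.
Instant now = Instant.now();              // The current moment in UTC, independent of any time zone.
String iso = now.toString();              // ISO 8601 text, for example 2018-01-02T00:03:45.123456789Z; safe to store and transmit.
Instant restored = Instant.parse(iso);    // Parses back to the very same moment; nothing is lost.
System.out.println(restored.equals(now)); // true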
Further reading:
Systems of Time.
Current Millis with a simple and a complex take on UTC vs. GMT.
❌ The accepted Answer is neither correct nor useful.
✅ In contrast, the Answer by Ole V.V. correctly summarizes the technical differences — for details follow the links to detailed pages in Wikipedia.
For programmers building business-oriented apps, the upshot is that UTC is the new GMT. You can use the terms interchangeably, with the difference being literally less than a second. So for all practical purposes in most apps, no difference at all.
Here is some more practical advice, with code examples.
Strings
Say, I have UTC time as "02-01-2018 00:03" does that mean my US local time is "01-01-2018 18:00"?
That first part is a bad example, with the date-time string lacking an indicator of its offset or zone.
If a string indicates a specific moment, it must indicate a time zone (Continent/Region formatted name), an offset-from-UTC as a number of hours-minutes-seconds, or both. If the string is meant to represent a moment at UTC itself, that means an offset-from-UTC of zero.
To write that string with an offset, various conventions may be applied. The best in practice is with both hours and minutes along with a colon, such as +00:00, +05:30, or -08:00. The leading zero and the colon are both optional but I have seen libraries break when encountering a value such as -0800 or -8.
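For what it is worth, the java.time classes discussed below accept the abbreviated forms, though writing the full ±HH:MM remains the safest habit. A small sketch:
ZoneOffset full = ZoneOffset.of( "-08:00" ) ;    // The recommended form: sign, two-digit hours, colon, two-digit minutes.
ZoneOffset compact = ZoneOffset.of( "-0800" ) ;  // Also accepted by this particular library.
ZoneOffset bareHour = ZoneOffset.of( "-8" ) ;    // Accepted too, but other libraries may reject it.
System.out.println( full.equals( compact ) && full.equals( bareHour ) ) ;  // true, all three mean the same offset.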
Zulu
As a shortcut for an offset of zero, the letter Z is commonly used to mean UTC itself. Pronounced Zulu.
ISO 8601
Furthermore, best practice in formatting date-time textually for computing is to use the ISO 8601 standard formats. For a date-time, the format YYYY-MM-DDTHH:MM:SS±HH:MM is used. The T separates the date portion from the time-of-day portion. This format has advantages such as being largely unambiguous, easy to parse by machine, and easy to read by humans across cultures. Another advantage is that alphabetical sorting is also chronological. The standard accepts the Z abbreviation as well.
So your example UTC time as "02-01-2018 00:03" is better stated as 2018-01-02T00:03Z.
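For example, with java.time (discussed in the next section), a string in that ISO 8601 form parses directly, offset and all, with no extra steps:
OffsetDateTime odt = OffsetDateTime.parse( "2018-01-02T00:03Z" ) ;  // The Z is understood as an offset of zero, UTC itself.
System.out.println( odt.getOffset() ) ;  // Z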
java.time
Be very aware that most programming languages, libraries, and databases have very poor support for date-time handling, usually based on a poor understanding of date-time issues. Handling date-time is surprisingly complicated and tricky to master.
The only decent library I have encountered is the java.time classes (see Tutorial) bundled with Java 8 and later, and its predecessor the Joda-Time project (also loosely ported from Java to .Net in the Noda Time project).
In java.time, a moment is represented in three ways. All have a resolution of nanoseconds.
Instant: Always in UTC. Technically, a count of nanoseconds since the epoch reference of the first moment of 1970 (1970-01-01T00:00:00Z).
OffsetDateTime: A date with time-of-day in the context of a certain number of hours-minutes-seconds ahead of, or behind, UTC.
ZonedDateTime: A date with time-of-day in the context of a certain time zone.
So what is the difference between a time zone and an offset-from-UTC? Why do we need separate classes? An offset-from-UTC is simply a number of hours-minutes-seconds, three numbers, no more, no less. A time zone is much more. A time zone is a history of the past, present, and future changes to the offset used by the people of a particular region.
What changes? Changes dictated by the whims or wisdom of their politicians. Politicians around the world have shown a predilection for changing the offset used by the time zone(s) in their jurisdiction. Daylight Saving Time (DST) is one common pattern of changes, with its schedule often revised and the decision to adopt or abandon DST sometimes reversed. Other changes happen too: in recent years North Korea changed its clock by half an hour to sync with South Korea, Venezuela turned its clock back half an hour only to jump forward again less than a decade later, Turkey this year canceled the scheduled change from DST back to standard time with little forewarning, and Russia has made multiple such changes.
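The difference shows up directly in code: an offset is a fixed number, while a zone's rules answer the question "what offset applies at this moment?" differently across the year. A small sketch (the 2021 dates are simply my own choice for illustration):
ZoneId zone = ZoneId.of( "America/Chicago" ) ;
ZoneOffset winter = zone.getRules().getOffset( Instant.parse( "2021-01-15T12:00:00Z" ) ) ;
ZoneOffset summer = zone.getRules().getOffset( Instant.parse( "2021-07-15T12:00:00Z" ) ) ;
System.out.println( winter ) ;  // -06:00 (standard time)
System.out.println( summer ) ;  // -05:00 (Daylight Saving Time)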
Back to your example in your point # 3, let's look at some code.
Say, I have UTC time as "02-01-2018 00:03" does that mean my US local time is "01-01-2018 18:00"?
Your example strings have another problem. The 03 minute in the first part is ignored in your second part, an apparent typo. I know because there is no time zone adjustment in effect in the Americas on that date involving a fractional hour of 57 minutes.
Not a moment
First, we parse your input string. Lacking any indicator of zone or offset, we must parse it as a LocalDateTime. The name LocalDateTime may be misleading, as it does not mean a specific locality. It means any or all localities. For more explanation, see What's the difference between Instant and LocalDateTime?.
String input = "2018-01-02T00:03" ; // Text of a date with time-of-day but without any context of time zone or offset-from-UTC. *Not* a moment, *not* a point on the timeline.
LocalDateTime ldt = LocalDateTime.parse( input ) ; // Parsing the input as a `LocalDateTime`, a class representing a date with time but no zone/offset. Again, this does *not* represent a moment, is *not* a point on the timeline.
UTC
By the facts given in the Question, we know this date and time was intended to represent a moment in UTC. So we can assign the context of an offset-from-UTC of zero hours-minutes-seconds, for UTC itself. We apply the ZoneOffset constant UTC to get an OffsetDateTime object.
OffsetDateTime odt = ldt.atOffset( ZoneOffset.UTC ); // We are certain this text was intended to represent a moment in UTC. So correct the faulty text input by assigning the context of an offset of zero, for UTC itself.
Time zone
The Question asks to see this moment through the wall-clock time used in the United States, six hours behind UTC. One time zone with such an offset is America/Chicago.
Specify a proper time zone name in the format of Continent/Region, such as America/Montreal, Africa/Casablanca, or Pacific/Auckland. Never use the 2-4 letter abbreviations such as CST, EST, or IST, as they are not true time zones, not standardized, and not even unique(!).
ZoneId z = ZoneId.of( "America/Chicago" ) ; // A time zone in the US whose wall-clock time runs six hours behind UTC in winter.
ZonedDateTime zdt = odt.atZoneSameInstant( z ) ; // Adjust from UTC to that time zone. Same moment, different wall-clock time.
See this code run live at IdeOne.com.
odt.toString(): 2018-01-02T00:03Z
zdt.toString(): 2018-01-01T18:03-06:00[America/Chicago]
Same moment, different wall-clock time
The odt and zdt objects both represent the same simultaneous moment, the same point on the timeline. The only difference is the wall-clock time.
Let's work an example using Iceland, where the time zone uses an offset-from-UTC of zero hours-minutes-seconds. So the zone Atlantic/Reykjavik has a wall-clock time identical to UTC. At least currently their wall-clock time matches UTC; in the past or future it may be different, which is why it is incorrect to say “UTC is the time zone of Iceland”. Anyway, our example: say someone in Reykjavík, Iceland, with 3 minutes after midnight on the clock hanging on their wall, makes a phone call to someone in the US. That US person lives in a place using the Chicago region time zone. As the person called picks up their phone, they glance at the clock hanging on their wall and see that the time is just after 6 PM (18:03). Same moment, different wall-clock time.
Also, the calendars hanging on their walls are different, as it is “tomorrow” in Iceland but “yesterday” in mainland US. Same moment, different dates!
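Here is that phone call in code, with the 2018 date being my own choice for illustration:
Instant moment = Instant.parse( "2018-01-02T00:03:00Z" ) ;  // One single point on the timeline.
ZonedDateTime reykjavik = moment.atZone( ZoneId.of( "Atlantic/Reykjavik" ) ) ;  // 2018-01-02T00:03, just after midnight, “tomorrow”.
ZonedDateTime chicago = moment.atZone( ZoneId.of( "America/Chicago" ) ) ;       // 2018-01-01T18:03, just after 6 PM, “yesterday”.
System.out.println( reykjavik.toLocalDate().equals( chicago.toLocalDate() ) ) ;  // false. Same moment, different dates.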
About java.time
The java.time framework is built into Java 8 and later. These classes supplant the troublesome old legacy date-time classes such as java.util.Date, Calendar, & SimpleDateFormat.
The Joda-Time project, now in maintenance mode, advises migration to the java.time classes.
To learn more, see the Oracle Tutorial. And search Stack Overflow for many examples and explanations. Specification is JSR 310.
You may exchange java.time objects directly with your database. Use a JDBC driver compliant with JDBC 4.2 or later. No need for strings, no need for java.sql.* classes.
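For example, with a JDBC 4.2 driver you can pass and retrieve an OffsetDateTime directly; the table name event_ and column name when_ here are hypothetical, purely for illustration:
// Assumes a java.sql.Connection named `conn`, a JDBC 4.2 driver, and a surrounding method that handles or declares SQLException.
OffsetDateTime odt = OffsetDateTime.now( ZoneOffset.UTC ) ;
try ( PreparedStatement ps = conn.prepareStatement( "INSERT INTO event_ ( when_ ) VALUES ( ? )" ) ) {
    ps.setObject( 1 , odt ) ;  // Pass the java.time object directly; no conversion to java.sql.Timestamp needed.
    ps.executeUpdate() ;
}
try (
    Statement stmt = conn.createStatement() ;
    ResultSet rs = stmt.executeQuery( "SELECT when_ FROM event_" )
) {
    while ( rs.next() ) {
        OffsetDateTime retrieved = rs.getObject( "when_" , OffsetDateTime.class ) ;  // Retrieve directly as a java.time object.
        System.out.println( retrieved ) ;
    }
}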
Where to obtain the java.time classes?
Java SE 8, Java SE 9, Java SE 10, Java SE 11, and later - Part of the standard Java API with a bundled implementation.
Java 9 adds some minor features and fixes.
Java SE 6 and Java SE 7
Most of the java.time functionality is back-ported to Java 6 & 7 in ThreeTen-Backport.
Android
Later versions of Android bundle implementations of the java.time classes.
For earlier Android (<26), the ThreeTenABP project adapts ThreeTen-Backport (mentioned above). See How to use ThreeTenABP….
The ThreeTen-Extra project extends java.time with additional classes. This project is a proving ground for possible future additions to java.time. You may find some useful classes here such as Interval, YearWeek, YearQuarter, and more.
There is no time difference between Coordinated Universal Time and Greenwich Mean Time.
7:17 AM Friday, Coordinated Universal Time (UTC) is
7:17 AM Friday, Greenwich Mean Time (GMT)
Key difference: Both UTC and GMT are time standards that differ in terms of their derivation and their use.
To quote timeanddate.com:
The Difference Between GMT and UTC:
Greenwich Mean Time (GMT) is often interchanged or confused with Coordinated Universal Time (UTC). But GMT is a time zone and UTC is a time standard.
Although GMT and UTC share the same current time in practice, there is a basic difference between the two:
GMT is a time zone officially used in some European and African countries. The time can be displayed using both the 24-hour format (0 - 24) or the 12-hour format (1 - 12 am/pm).
UTC is not a time zone, but a time standard that is the basis for civil time and time zones worldwide. This means that no country or territory officially uses UTC as a local time.
GMT is a mean solar time calculated at the Greenwich meridian. https://www.rmg.co.uk/discover/explore/greenwich-mean-time-gmt
UTC is based on the extremely regular "ticking" of caesium atomic clocks. https://en.wikipedia.org/wiki/Coordinated_Universal_Time
They are neither based on the same time nor calculated the same way. IMHO, the wording on https://currentmillis.com is misleading, at best, if not just flat out incorrect.
How can I know if the server where DB2 is running is configured with DST?
The CURRENT TIMEZONE special register gives me the difference between UTC and the local timestamp, but that difference can change (summer/winter). Also, I am not sure which time zone the server is configured with.
Let's suppose Paris is at GMT+1. In summer, CURRENT TIMEZONE is 20000 because DST is active; that means 2 hours. In winter, CURRENT TIMEZONE indicates 10000, which corresponds exactly to GMT+1.
I would like to retrieve the name of the current time zone, and I could do that by matching the names and offsets of the time zones (stored in a table or inside a function) against the CURRENT TIMEZONE register and the DST state.
But how can I get the DST?
https://en.wikipedia.org/wiki/Daylight_saving_time_by_country
https://en.wikipedia.org/wiki/Time_zone
You said:
I would like to retrieve the name of the current time zone, and I could do that by matching the names and offsets of the time zones (stored in a table or inside a function) against the CURRENT TIMEZONE register and the DST state.
That is incorrect. Any list that maps a time zone offset to a time zone name or identifier is bound to be full of errors, even if it has a flag for whether DST is observed.
The problem is that many different time zones use the same offset and use DST, but enter and exit DST at different dates and/or different times. Time zones also undergo changes to their rules for what offsets they follow (such as Samoa did in 2011), or for when DST starts and stops (such as the USA did in 2007). And in some cases, DST can be permanently set on or off (such as Russia did in 2011 and is undoing this year).
If you just want to know whether the time zone where your database server is running is currently in DST, you can check the UTC offsets for January 1st and July 1st. If they are the same, then the time zone doesn't use DST. If they are different, then check the offset for the current time and see which it aligns with. If it is the larger offset, then DST is in effect; if it is the smaller offset, then DST is not in effect.
Disclaimer: I am assuming DB2 has a function to get the offset for a particular date. I have not verified that and have little knowledge of DB2 itself.
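Purely to make the comparison concrete, here is the same logic sketched in Java with java.time; in DB2 you would substitute whatever mechanism gives you the offset for a given date, which is exactly the part I have not verified. The year 2018 is just an example:
ZoneId zone = ZoneId.systemDefault(); // The time zone the server is configured with.
ZoneOffset jan = zone.getRules().getOffset(LocalDate.of(2018, 1, 1).atStartOfDay(zone).toInstant());
ZoneOffset jul = zone.getRules().getOffset(LocalDate.of(2018, 7, 1).atStartOfDay(zone).toInstant());
if (jan.equals(jul)) {
    System.out.println("This zone does not use DST.");
} else {
    ZoneOffset current = zone.getRules().getOffset(Instant.now());
    ZoneOffset larger = (jan.getTotalSeconds() > jul.getTotalSeconds()) ? jan : jul; // The DST offset is the one further ahead of UTC.
    System.out.println(current.equals(larger) ? "DST is currently in effect." : "DST is not currently in effect.");
}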
I have a WCF service. One service method returns an array of objects, and each object contains some date values, for example {14-05-2013 08:00:00}, Kind: Unspecified.
In debug mode I can see this value just before the return point in the method.
On the client side I get a JSON object that contains the wrong date value for my property:
Date(1368511200000+0200)
It is equal to Tue May 14 2013 09:00:00 GMT+0300 (FLE Daylight Time).
This happens only when the client (browser) and the IIS server are in different time zones.
Why do I see shifted date values, and how can I fix it?
Thanks.
The date values stay the same, but the presentation shifts because your timezone changes.
08:00 in Berlin is 07:00 in London.
If you want to transfer the same presentation, even though it will no longer represent the same instant in time once it crosses time zones, you could send it as a string instead of a date.
You could also change the Kind of your DateTime to UTC, but that would have implications on your server side as well.
More information about time zone conversion is available here.
I've got times saved in a SQL database in UTC format. I'm displaying those times in a GridView, however they are still in UTC format. I'd like to convert them to the client browser's local time. The problem is that although I can get the time zone offset, that is only for the current date/time. That offset could change if some of those dates in the future end up occurring during daylight saving time. I'm relatively new to web programming, but it seems like what I need to do is run some JavaScript as each entry binds to the GridView that somehow takes the C# DateTimeOffset object and converts it to a local time. Or maybe that's not possible?
This can be done on the server side if you have a TimeZoneInfo object. You can use the static ConvertTimeFromUtc() method.
In C#:
DateTime localTime = TimeZoneInfo.ConvertTimeFromUtc(myDbDateTime, myTimeZoneInfo);
If you do not have the time zone on the server side, things get tricky, since JavaScript does not provide the client's time zone (unless they are in the US, and even then only in some browsers). In this case, it may be best to force the user to select their current time zone and store it against their account. If this is being displayed to anonymous users, you should probably display in UTC by default and provide an option to refresh it for a selected time zone.
Update
There are several issues which appear when trying to automatically determine a user's timezone.
Timezone is not provided to the server by the user agent.
Javascript does not provide access to the timezone (except in some browsers, sometimes).
The javascript function getTimezoneOffset() may initially sound like a good idea, but since there are multiple timezones with the same offset, this is not a unique value. The difference between many of these non-unique zones is their implementation of daylight saving time.
Example: Indiana does not regard DST. Therefore, for half the year their offset matches eastern time, while the other half their offset is equal to central time.
If, however, your user base is located primarily in the US and uses IE, Chrome, Safari, or Firefox, then you can use the toString() method on a Date object to obtain the timezone. These browsers append the timezone to the date string in different ways. Outside the US, the timezone is not included in all browsers (though some may still show it).
Open http://jsbin.com/onulo3 to observe:
IE8: Sun Feb 14 22:12:22 EST 2010
Chrome: Sun Feb 14 2010 22:12:22 GMT-0500 (Eastern Standard Time)
Safari: Sun Feb 14 2010 22:12:22 GMT-0500 (Eastern Standard Time)
Firefox: Sun Feb 14 2010 22:12:22 GMT-0500 (Eastern Standard Time)
With some parsing, you can now determine the timezone for all your American users. For everyone else you can display the time in UTC (with a notice to that effect).
I found the following on "Indiana Daylight Savings Time":
http://www.timetemperature.com/tzus/indiana_time_zone.shtml
As of now, 1/18/2010, the Microsoft system library call TimeZoneInfo.ConvertTimeFromUtc seems to reflect this behavior; I am checking.
Here are two ways to convert.
Way #1: If you have the datetime in seconds-since-epoch format, multiply by 1000, because the JavaScript Date constructor expects milliseconds since the epoch:
var d = new Date(seconds_since_epoch * 1000); // Date takes milliseconds since 1970-01-01T00:00:00Z
document.write(d.toString()); // rendered in the browser's local time zone
Way #2: If you have only the year, month, day, hour, minute, and second in UTC time, wrap them in Date.UTC() so they are interpreted as UTC rather than local time:
var d = new Date(Date.UTC(yyyy, mo - 1, dd, hh, mi, ss)); // in JavaScript, January is month 0
document.write(d.toString()); // rendered in the browser's local time zone