What does SSMS show for a DateTimeOffset(7)?

It looks like such a simple thing, but neither of us knows it for sure.
I have a DateTimeOffset(7) column defined in a table.
SSMS shows this value:
2014-09-11 08:00:00.0000000 +02:00
(we are currently in GMT +2)
Does this mean it's 08:00 in OUR timezone (so we can determine the GMT time by subtracting the +2 from 08:00, giving 06:00), or is it 08:00 GMT, so that it's 10:00 (08:00 + 2) in OUR timezone?

From the documentation for DateTimeOffset:
hh is two digits that range from 00 to 14 and represent the number of hours in the time zone offset.
mm is two digits, ranging from 00 to 59, that represent the number of additional minutes in the time zone offset.
+ (plus) or – (minus) is the mandatory sign for a time zone offset. This indicates whether the time zone offset is added or subtracted from the UTC time to obtain the local time. The valid range of time zone offset is from -14:00 to +14:00.
So in your example, it's 08:00 in YOUR timezone (+02:00); the corresponding UTC time is 06:00.
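A quick way to confirm this in C# (a minimal sketch; the literal is the exact value from the question, and DateTimeOffset.Parse/UtcDateTime are standard .NET members):

using System;
using System.Globalization;

class OffsetDemo
{
    static void Main()
    {
        // The value exactly as SSMS displays it.
        var dto = DateTimeOffset.Parse("2014-09-11 08:00:00.0000000 +02:00",
                                       CultureInfo.InvariantCulture);

        Console.WriteLine(dto.DateTime);     // 08:00 -> the local wall-clock time that was stored
        Console.WriteLine(dto.UtcDateTime);  // 06:00 -> the same instant expressed in UTC
    }
}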

Related

Scheduling events with changing Time zone

I need help with this scenario:
1) Currently it is summer time. I need to create a time interval for June 9 Monday 6 PM - 7 PM EDT, and every week after that until the end of 2018. This interval is for students to schedule appointments with a tutor. The client right now sends that as a request to create a start time at June 9 Monday 2 PM UTC (EDT is a -4 hour offset). The server creates a start time in the db for June 9 2 PM UTC and adds 7 days' worth of milliseconds to create the recurring sessions.
^ This causes an issue because of DST. Let's say it is now November 5th (after the daylight saving change). The DB still has Nov 5, 2 PM UTC saved as the value. But because my UTC offset changed, instead of offsetting by 4 hours as I did in June, I now offset by 5 hours, so what should be the "6 PM session in my timezone" becomes "7 PM in my timezone". This is the error.
The solution is one of the following two (or a combination of both):
1) Instead of adding 7 days' worth of milliseconds, add 1 week's worth of milliseconds relative to the user's timezone. Currently there's no way to recover a person's timezone from a UTC offset alone (-4:00, which is right now the east-coast USA offset, also applies to parts of Canada, the Caribbean, South America, etc.). We need to save a user's timezone as a string rather than as a UTC offset; there is an international standard (IANA) for timezone names. A sketch of this approach follows the list.
2) ?? something else
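A minimal sketch of option 1 in C#, assuming the user's timezone is stored as the IANA id "America/New_York" (an assumption for illustration; TimeZoneInfo.FindSystemTimeZoneById accepts IANA ids on .NET 6+, while older Windows frameworks need the id "Eastern Standard Time"):

using System;

class RecurringSessions
{
    static void Main()
    {
        // Store the IANA zone name, not a UTC offset.
        TimeZoneInfo zone = TimeZoneInfo.FindSystemTimeZoneById("America/New_York");

        // Recur in *local* time: every session is 6 PM wall-clock.
        var firstLocal = new DateTime(2018, 6, 9, 18, 0, 0); // Kind = Unspecified

        foreach (int week in new[] { 0, 26 })  // one summer and one winter occurrence
        {
            DateTime local = firstLocal.AddDays(7 * week);
            DateTime utc = TimeZoneInfo.ConvertTimeToUtc(local, zone);
            Console.WriteLine($"{local:yyyy-MM-dd HH:mm} local -> {utc:yyyy-MM-dd HH:mm} UTC");
        }
        // 2018-06-09 18:00 local -> 2018-06-09 22:00 UTC  (EDT, -4)
        // 2018-12-08 18:00 local -> 2018-12-08 23:00 UTC  (EST, -5)
    }
}

Because the recurrence is advanced in local time and converted to UTC per occurrence, the stored instant shifts with the DST change instead of drifting by an hour.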

What date format is 636529536000000000?

I have to maintain an ASPX page that increments the date/time by passing a value in the querystring in this format:
636529536000000000 in reference to 31 January 2018
636530400000000000 in reference to 01 February 2018
The url format is: /reservas.aspx?t=636530400000000000
What is this date/time format?
It is the number of ticks where a tick is one hundred nanoseconds or one ten-millionth of a second. The number of ticks is measured since the epoch DateTime.MinValue (12:00:00 midnight, January 1, 0001). For example:
new DateTime(636529536000000000).ToString("F", CultureInfo.InvariantCulture)
outputs:
Wednesday, 31 January 2018 00:00:00
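A short self-check of the arithmetic (a sketch; a day is 86,400 s × 10,000,000 ticks/s = 864,000,000,000 ticks, which is exactly the difference between the two querystring values):

using System;
using System.Globalization;

class TicksDemo
{
    static void Main()
    {
        var dt = new DateTime(636529536000000000L);
        Console.WriteLine(dt.ToString("F", CultureInfo.InvariantCulture));
        // Wednesday, 31 January 2018 00:00:00

        // One day later is exactly 864,000,000,000 ticks more.
        Console.WriteLine(dt.AddDays(1).Ticks);  // 636530400000000000
    }
}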
It could be a number of days from a certain date, similar to a Julian date calculation:
https://en.wikipedia.org/wiki/Julian_day#Julian_date_calculation
Potentially incorporating the time as well?
Without details of the code I can't really advise from a provided value.

What does the last number in a UNIX timestamp mean?

I have a few UNIX timestamps that I've been converting back and forth, and I notice that the last number of the timestamp would change without causing any difference in the date.
For example, if you convert this number to normal date:
1452120848 > 6-1-16 17:54
But if you convert it back:
6-1-16 17:54 > 1452120840
As you can see the last number was changed to a zero. I tried some of the online converters and discovered that the last number could be any number and the date wouldn't change. What does it mean?
Unix time is the time in seconds since 1970.
You don't convert the seconds part of your date back, so it's 'lost': the two numbers may differ by up to 59 seconds.
The timestamp of 1452120848 is actually: Wed Jan 6 22:54:08 2016
So you're missing 8 seconds.
The UNIX timestamp gives you the seconds since 1st January 1970 00:00:00 UTC. Since this is in seconds and you are only printing up to minutes, the difference is not shown.
However, they are not the same date:
$ date -d @1452120848
Wed Jan 6 23:54:08 CET 2016
$ date -d @1452120840
Wed Jan 6 23:54:00 CET 2016
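The same truncation, sketched in C# (DateTimeOffset.FromUnixTimeSeconds/ToUnixTimeSeconds are standard .NET APIs; the values are the ones from the question):

using System;

class UnixSecondsDemo
{
    static void Main()
    {
        // 1452120848 s -> 2016-01-06 22:54:08 UTC; the :08 lives in the
        // seconds field, which a minute-precision display simply drops.
        var full = DateTimeOffset.FromUnixTimeSeconds(1452120848);
        Console.WriteLine(full.ToString("u"));  // 2016-01-06 22:54:08Z

        // Rebuilding the timestamp from minute-precision data zeroes the
        // seconds, so the round trip can lose up to 59 seconds.
        var truncated = new DateTimeOffset(full.Year, full.Month, full.Day,
                                           full.Hour, full.Minute, 0, TimeSpan.Zero);
        Console.WriteLine(truncated.ToUnixTimeSeconds());  // 1452120840
    }
}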

Understanding epoch time to calculate password ageing in UNIX

my_current_epoch=15684 equivalent time stamp is Thu, 01 Jan 1970 04:21:24
last_password_reset_epoch_time=15547 equivalent time stamp is Thu, 01 Jan 1970 04:19:07
I am not able to understand how the difference of these two will give the days since the last password reset.
As per my understanding, epoch time is denoted in seconds elapsed since Jan 1, 1970.
Can someone please help me understand this?
man 5 shadow on a Linux box says:
The date of the last password change is given as the number of days since Jan 1, 1970. The password may not be changed again until the proper number of days have passed, and must be changed after the maximum number of days. If the minimum number of days required is greater than the maximum number of days allowed, this password may not be changed by the user.
So, you can find out to within 24 hours when a password was changed by multiplying the value from /etc/shadow by 86400 (the number of seconds in a day — but you didn't need me to tell you that, did you?).
For the values given (bc to the rescue):
15684*86400 = 1355097600
15547*86400 = 1343260800
And:
$ timestamp -u 1355097600 1343260800
1355097600 = Mon Dec 10 00:00:00 2012
1343260800 = Thu Jul 26 00:00:00 2012
$
Timestamp is my program; modern versions of date can handle this too. The -u means 'report in UTC (aka GMT)' rather than in my time zone.
"epoch" value in /etc/shadow = 15684
the seconds in 24 hours (because normally "epoch" value shows in seconds but for some reason (to make compact view, maybe) in /etc/shadow file "epoch" value displays in days, not in seconds) = 24 * 60 * 60 = 86400
And by multipliying these two numbers: 15684 (days) x 86400 (seconds per day); we will get the number 1355097600.
Afterwards, either paste the result into an epoch converter to get the date, or
just use the date --date @$(( 15684 * 86400 )) command in the CLI.
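The whole calculation in a few lines of C# (a sketch using the two values from the question; the point is that the shadow fields are already in days, so the age is a plain subtraction):

using System;
using System.Globalization;

class PasswordAge
{
    static void Main()
    {
        int currentEpochDays = 15684;  // "today" in /etc/shadow units (days)
        int lastChangeDays   = 15547;  // last password change (days)

        // The fields are already day counts, so the age is a subtraction.
        Console.WriteLine(currentEpochDays - lastChangeDays);  // 137 days

        // Convert a day count to a calendar date for display.
        var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        Console.WriteLine(epoch.AddDays(lastChangeDays)
                               .ToString("ddd dd MMM yyyy", CultureInfo.InvariantCulture));
        // Thu 26 Jul 2012
    }
}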

How to calculate day of the week from timestamp? (DST)

I am developing code for a device where no datetime library is available (note: floats are also unavailable), so I have to do the math myself.
My timestamp is seconds from 1 Jan 2000 (in UTC).
In configuration of device I have:
current timezone as number of hours +/- from UTC
dst as number of hours to add
I need to know:
current day of week
current hour
Calculating the current hour is pretty easy:
seconds = timestamp % 86400 # seconds from midnight
hour = seconds / 3600 # integer division
Calculating day of the week (1 = Monday, 7 = Sunday):
dayofweek = (timestamp - 86400) % (86400*7) / 86400
if dayofweek = 0:
dayofweek = 7
notes:
86400 = seconds in one day
But before calculations I should:
1. add timezone hours
2. add DST hours
The problem is how to calculate whether the DST hours (for European Summer Time only) should be added or not. I need to do this efficiently because I have very limited computing power and I need to do this as fast as possible :-)
To determine whether DST applies, you need to know the day and month as well. In Europe, the change happens on the last weekend in March and the last weekend in October. I would suggest you apply the timezone offset without DST, do your calculations to get hour, day of week, day and month, and then, if you are in DST, adjust any or all of these values (depending on the original value of hour, it may be that only the hour needs adjusting).
By doing the timezone offset first, you get the local hour/day-of-week/day values correct without DST, and then the DST adjustment is trivial.
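A sketch of the EU rule in C#, just to make the boundary dates concrete (summer time runs from 01:00 UTC on the last Sunday of March to 01:00 UTC on the last Sunday of October; on the device the same logic translates to integer day arithmetic):

using System;

class EuDst
{
    // True if the given UTC moment falls within European Summer Time.
    static bool IsEuSummerTime(DateTime utc)
    {
        DateTime start = LastSunday(utc.Year, 3).AddHours(1);   // 01:00 UTC, last Sunday of March
        DateTime end   = LastSunday(utc.Year, 10).AddHours(1);  // 01:00 UTC, last Sunday of October
        return utc >= start && utc < end;
    }

    static DateTime LastSunday(int year, int month)
    {
        var last = new DateTime(year, month, DateTime.DaysInMonth(year, month));
        return last.AddDays(-(int)last.DayOfWeek);  // DayOfWeek.Sunday == 0
    }

    static void Main()
    {
        Console.WriteLine(IsEuSummerTime(new DateTime(2024, 7, 1)));   // True
        Console.WriteLine(IsEuSummerTime(new DateTime(2024, 12, 1)));  // False
    }
}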
