I have a gap in my understanding of Unix time. Unix time started being counted on 1/1/1970, but in what timezone?
Say it's 11 p.m. on 31 December 1969 in London (Unix time -3600).
In Sydney it is 9 a.m. on 1 January 1970 (Unix time 32 400) at the same moment.
So my question is: when did they start counting Unix time? 1/1/1970 in which timezone?
Thank you
"Unix time" should always be UTC.
http://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap04.html#tag_04_15
Wikipedia has some further verbiage around this at https://en.wikipedia.org/wiki/Unix_time#UTC_basis:
The precise definition of Unix time as an encoding of UTC is only
uncontroversial when applied to the present form of UTC. Fortunately,
the fact that the Unix epoch predates the start of this form of UTC
does not affect its use in this era: the number of days from 1 January
1970 (the Unix epoch) to 1 January 1972 (the start of UTC) is not in
question, and the number of days is all that is significant to Unix
time.
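To illustrate that the count is independent of timezone, here is a minimal sketch in Python (using the standard-library zoneinfo module, Python 3.9+): the same instant gives the same Unix timestamp no matter which zone the wall-clock time is expressed in.

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# One instant, expressed as wall-clock time in three different zones.
utc = datetime(1969, 12, 31, 23, 0, tzinfo=timezone.utc)
london = utc.astimezone(ZoneInfo("Europe/London"))
sydney = utc.astimezone(ZoneInfo("Australia/Sydney"))

# .timestamp() always counts seconds since 1970-01-01 00:00:00 UTC,
# so all three values are identical (-3600.0).
print(utc.timestamp(), london.timestamp(), sydney.timestamp())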
Is there any math that can work out the date from a time in milliseconds? (e.g. 1544901911)
It is possible to get the time of day from the overall value by first taking it modulo 86400, then dividing by 3600 for the hour, and taking the remainder modulo 3600 divided by 60 for the minute.
Is it possible to get the date from this as well? I really don't know how it works (I just know that it counts from 1 Jan 1970 onwards).
I'm not asking about any particular programming language, just the mathematics behind it.
I have problems making sense of what you wrote. 86400 is the number of seconds in a day, so if you have the time in seconds and want the time of day, then modulo 86400 makes sense. Since your question starts with time in milliseconds, modulo 86400000 would be more appropriate. But I guess we get the idea either way.
So, as I said, extracting the time of day works because you know the number of seconds in a day. The number of seconds in a year is harder, as you have to deal with leap days. You can have a look at existing standard library implementations, e.g. Python's datetime. That starts by taking the time (or rather the number of days, i.e. the time divided by the time per day, whatever the unit) modulo 400 years, since the number of whole days in 400 years is fixed. Then it goes on looking at 100-year cycles, 4-year cycles and single years, with case distinctions for leap years, tables of month lengths, and so on. Yes, this can be done, but it's tedious, and it's already part of the standard library of most languages.
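To make the arithmetic concrete, here is a sketch in Python using plain integer math (no date libraries). The time-of-day part is the modulo arithmetic from the question; the date part uses the well-known days-to-civil-date algorithm built on 400-year (146097-day) cycles, which is essentially what the standard-library implementations mentioned above do.

def civil_from_days(days):
    """Convert days since 1970-01-01 to (year, month, day).

    Based on the public-domain civil_from_days algorithm: shift the
    epoch to 0000-03-01 so leap days fall at the end of each cycle,
    then peel off 400-year eras, years within the era, and months.
    """
    z = days + 719468                      # days since 0000-03-01
    era = z // 146097                      # 400-year era
    doe = z - era * 146097                 # day of era [0, 146096]
    yoe = (doe - doe // 1460 + doe // 36524 - doe // 146096) // 365  # year of era [0, 399]
    y = yoe + era * 400
    doy = doe - (365 * yoe + yoe // 4 - yoe // 100)  # day of year, March-based [0, 365]
    mp = (5 * doy + 2) // 153              # March-based month [0, 11]
    d = doy - (153 * mp + 2) // 5 + 1      # day of month [1, 31]
    m = mp + 3 if mp < 10 else mp - 9      # calendar month [1, 12]
    return y + (1 if m <= 2 else 0), m, d

ts = 1544901911                            # the example value, taken as seconds
secs_of_day = ts % 86400
hour, minute, second = secs_of_day // 3600, secs_of_day % 3600 // 60, secs_of_day % 60
year, month, day = civil_from_days(ts // 86400)
print(year, month, day, hour, minute, second)  # 2018 12 15 19 25 11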
Note: For convenience, PowerShell is used to demonstrate the behavior, but the question is about the behavior of the System.DateTimeOffset .NET type.
Note: As Matt Johnson points out, the behavior in question only happens on Unix-like platforms (macOS, Linux, using .NET Core).
Seemingly, when System.DateTimeOffset converts a local date, the resulting time-zone offset (offset from UTC) differs for dates before 19 Nov 1883.
Consider the following PowerShell command:
([datetime] '1883-11-18'), ([datetime] '1883-11-19') |
ForEach-Object { ([datetimeoffset] $_).Offset.ToString() }
which, when invoked in the Eastern Time Zone, yields:
-04:57:00
-05:00:00
(DST (daylight-saving time) didn't come into play until 1918.)
I suspect that this is related to the following, excerpted from Wikipedia, emphasis added:
[...] had brought the US railway companies to an agreement which led to standard railway time being introduced at noon on 18 November 1883 across the nation.
Can anyone confirm this and provide more information?
Why, specifically, did the offset change by 3 minutes?
Searching for 1883 in the entire .NET documentation yields just the following passing reference:
[...] the U.S. Eastern Standard Time zone from 1883 to 1917, before the introduction of daylight saving time in the United States
When running on Unix-like platforms (Linux, OSX, etc.), the time zone data in the operating system originates from the IANA time zone database. Early dates in this data set are referenced by their Local Mean Time (LMT), which is calculated from the latitude and longitude of the reference city. In this case, US Eastern Time, represented by America/New_York, has an LMT entry that aligns with your reported findings.
From the tzdb:
# Zone  NAME                    GMTOFF    RULES  FORMAT  [UNTIL]
Zone    America/New_York        -4:56:02  -      LMT     1883 Nov 18 12:03:58
                                -5:00     US     E%sT    1920
                                -5:00     NYC    E%sT    1942
                                -5:00     US     E%sT    1946
                                -5:00     NYC    E%sT    1967
                                -5:00     US     E%sT
Note that seconds in the offset are truncated on import due to the precision of offsets supported by DateTimeOffset.
You can read more about LMT (and lots more) in the theory file in the tz database.
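If you want to check the underlying IANA data yourself, here is a quick sketch in Python using the standard-library zoneinfo module (Python 3.9+); the exact values depend on the tzdata compiled into your system, but with a typical build it reflects the LMT entry shown above.

from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

ny = ZoneInfo("America/New_York")

# Before the 1883 railway-time transition: Local Mean Time for New York.
before = datetime(1883, 11, 17, tzinfo=ny).utcoffset()
print(before.total_seconds())  # typically -17762.0, i.e. -4:56:02

# After the transition: Eastern standard time.
after = datetime(1883, 11, 20, tzinfo=ny).utcoffset()
print(after.total_seconds())   # -18000.0, i.e. -5:00:00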
On Windows, you would not see this behavior, because Windows doesn't have time zone data for the Eastern Standard Time time zone earlier than the 2006-2007 transition. Thus dates earlier than 1987 may be converted incorrectly. If that matters to you, you'll need another source of time zone information, such as the copy of tzdb included with Noda Time.
Also note that local time is implied when casting from DateTime to DateTimeOffset, but only when the input value's .Kind is DateTimeKind.Local or DateTimeKind.Unspecified. If instead it's DateTimeKind.Utc, then the resulting DateTimeOffset will have a zero offset.
In the docs for DateTime, I see that Calendar.std_offset is described as "The time zone standard offset in seconds (not zero in summer times)" (from this link).
Calendar.utc_offset is the offset in seconds from Coordinated Universal Time (UTC), according to Wikipedia. So what is the purpose of Calendar.std_offset? What does it do? It seems you could specify an offset purely with utc_offset. Is std_offset there to account for daylight saving time only?
The standard offset is the additional offset applied on top of the standard-time/UTC offset during summer/daylight saving time for the given zone. So given a UTC offset of 5 hours and a standard offset of 1 hour, the total summer/daylight-saving-time offset is 6 hours, while the standard-time offset from UTC remains 5 hours.
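As a minimal arithmetic sketch (Python, with hypothetical values; the variable names simply mirror the two offsets discussed above), the wall-clock offset is the sum of the two:

# Hypothetical zone: UTC+5 in standard time, observing one hour of DST.
utc_offset = 5 * 3600   # offset of standard time from UTC, in seconds
std_offset = 1 * 3600   # extra offset applied while DST is in effect (0 otherwise)

total_offset = utc_offset + std_offset
print(total_offset / 3600)  # 6.0 hours during DST; 5.0 when std_offset is 0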
I have a program which saves some data to an NFC tag. The NFC tag only has a few bytes of memory, and because I need to save a date and time in minutes (as a decimal number) to the tag, I need to store it in the most memory-efficient way possible. For instance, the decimal number 23592786 requires 36 bits, but if the decimal number is converted to a base-36 value it only requires 25 bits of memory.
The number 23592786 requires 25 bits, because the binary representation of this number is 25 bits long. You can save some bits if the date range is limited. One year contains about 526,000 minutes, so an interval in minutes from 00:00 on 1 Jan 2000 (an arbitrary start date) will fit in 24 bits (3 bytes) and can represent dates up to the year 2031.
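A minimal sketch of that scheme in Python (assuming the arbitrary 2000-01-01 00:00 UTC start date suggested above and a big-endian 3-byte encoding):

from datetime import datetime, timedelta, timezone

EPOCH = datetime(2000, 1, 1, tzinfo=timezone.utc)  # arbitrary start date

def encode_minutes(dt):
    """Pack a datetime as minutes since EPOCH into 3 bytes (big-endian)."""
    minutes = int((dt - EPOCH).total_seconds()) // 60
    if not 0 <= minutes < 2**24:
        raise ValueError("date out of range for 3 bytes")
    return minutes.to_bytes(3, "big")

def decode_minutes(data):
    """Unpack 3 bytes back into a datetime with minute resolution."""
    return EPOCH + timedelta(minutes=int.from_bytes(data, "big"))

tag_value = encode_minutes(datetime(2018, 12, 15, 19, 25, tzinfo=timezone.utc))
print(tag_value.hex(), decode_minutes(tag_value))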
The simplest might be to use Unix time, which gives the number of seconds since 1 Jan 1970 and typically takes 32 bits. As MBo has said, you can reduce the number of bits by about 6 by just counting minutes, or further by choosing a more recent start date. However, there are advantages in using an industry standard. Depending on your application, you might be able to get it down to 2 bytes, which could represent about 45 days.
Time is often converted into a numeric value (e.g., milliseconds or other units) elapsed since a reference date (the epoch).
The overview on Wikipedia is very incomplete:
http://en.wikipedia.org/wiki/Epoch_%28reference_date%29
What is the list of epoch dates for all major OS platforms and programming languages?
(e.g., R running on different OS platforms (Unix, Windows, Android, Apple), Perl, Python, Ruby, C++, Java).
In most modern frameworks, it's the Unix/POSIX standard of 1/1/1970.
You asked about R: it's 1/1/1970. Reference here.
Most languages/frameworks that are cross-platform either do this internally or abstract it. It would be too painful otherwise. Imagine having to compensate for a different epoch every time you re-targeted. That would be awful.
BTW - There is another list here that may be more interesting to you.
I have got the following data:
In a computing context, an epoch is the date and time relative to which a computer's clock and timestamp values are determined. The epoch traditionally corresponds to 0 hours, 0 minutes, and 0 seconds (00:00:00) Coordinated Universal Time (UTC) on a specific date, which varies from system to system. Most versions of Unix, for example, use January 1, 1970 as the epoch date; Windows uses January 1, 1601; Macintosh systems use January 1, 1904, and Digital Equipment Corporation's Virtual Memory System (VMS) uses November 17, 1858.
Reference: Epoch
You can also see this: Epoch Computer
In Java, the epoch is the same as Unix, i.e. midnight on January 1, 1970 (UTC), which is widely used in programming.
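As a sanity check on the epochs listed above, here is a small sketch in Python that computes how far each one lies from the Unix epoch (all taken as midnight UTC on the given date):

from datetime import datetime, timezone

unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

epochs = {
    "Unix / POSIX / Java": datetime(1970, 1, 1, tzinfo=timezone.utc),
    "Windows (FILETIME)": datetime(1601, 1, 1, tzinfo=timezone.utc),
    "Classic Macintosh": datetime(1904, 1, 1, tzinfo=timezone.utc),
    "VMS / MJD": datetime(1858, 11, 17, tzinfo=timezone.utc),
}

for name, epoch in epochs.items():
    seconds = int((unix_epoch - epoch).total_seconds())
    print(f"{name}: the Unix epoch is {seconds} seconds after this epoch")
    # e.g. Windows: 11644473600, Macintosh: 2082844800, VMS: 3506716800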