Convert UTC micros to Julia DateTime - julia

I have a datetime that I'm getting from Go, expressed in Unix microseconds (the number of microseconds since January 1st, 1970).
1652681499679534
I want to get it into a Julia DateTime. What is the proper calculation for that?

julia> unix2datetime(1652681499679534 / 10^6)
2022-05-16T06:11:39.680
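Note that Julia's DateTime has millisecond precision, so the 679534 microseconds are rounded to .680 in the result above. For comparison, a rough Python sketch (not part of the original answer) that keeps the full microsecond precision:

from datetime import datetime, timedelta, timezone

micros = 1652681499679534  # microseconds since the Unix epoch
# Adding an integer timedelta avoids the floating-point rounding you would
# get from dividing by 10^6 first.
dt = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=micros)
print(dt)  # 2022-05-16 06:11:39.679534+00:00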

Related

Date Time Conversions in PySpark

Can someone please explain to me how the epoch time below
epoch time/unix-timestamp: 1668443121840
converts to the date: 2022-11-14T16:25:21.840+0000
How does the conversion take place, and how can I identify whether an epoch timestamp is given in seconds, milliseconds, microseconds or nanoseconds?
Additionally, is there a function in PySpark to convert the date back to an epoch timestamp?
Thanks in advance!
I tried a number of methods but am not achieving the expected result:
import datetime

t = datetime.datetime.strptime('2021-11-12 02:12:23', '%Y-%m-%d %H:%M:%S')
print(t.strftime('%s'))
With this, I am not able to control the format or precision in terms of seconds, milliseconds, microseconds or nanoseconds.
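As an aside on the plain-Python attempt above (not part of the original question or answers): %s is not a portable strftime directive, and where it works it interprets the naive datetime in the local timezone. A sketch of a portable alternative, assuming the input string represents UTC:

from datetime import datetime, timezone

# Parse the string and mark it explicitly as UTC (assumption: the source data is UTC).
dt = datetime.strptime('2021-11-12 02:12:23', '%Y-%m-%d %H:%M:%S').replace(tzinfo=timezone.utc)

# timestamp() returns seconds since the epoch; multiply by 1000 for milliseconds.
epoch_millis = int(dt.timestamp() * 1000)
print(epoch_millis)  # 1636683143000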
The epoch time/unix-timestamp uses a reference date: 00:00:00 UTC on 1 January 1970. It counts the seconds/milliseconds from that date.
The value you are looking for is in milliseconds, so you would have to calculate the milliseconds and concatenate them with the epoch seconds:
import pyspark.sql.functions as F
df = spark.createDataFrame([('2022-11-14T16:25:21.840+0000',)]).toDF("timestamp")
df\
.withColumn("timestamp",F.to_timestamp(F.col("timestamp")))\
.withColumn("epoch_seconds",F.unix_timestamp("timestamp"))\
.withColumn("epoch_miliseconds",F.concat(F.unix_timestamp("timestamp"), F.date_format("timestamp", "S")))\
.show(truncate=False)
# +----------------------+-------------+-----------------+
# |timestamp |epoch_seconds|epoch_miliseconds|
# +----------------------+-------------+-----------------+
# |2022-11-14 16:25:21.84|1668443121 |16684431218 |
# +----------------------+-------------+-----------------+
The unix timestamp counts the seconds that have elapsed since 00:00:00 UTC on 1 January 1970.
To convert dates to unix-timestamp in PySpark, you can use the unix_timestamp function.
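On the question of telling seconds from milliseconds, microseconds or nanoseconds: there is no marker inside the number itself. The usual heuristic is the order of magnitude, since a present-day timestamp has roughly 10 digits in seconds, 13 in milliseconds, 16 in microseconds and 19 in nanoseconds. A plain-Python sketch of that heuristic (an illustration, not a PySpark built-in):

from datetime import datetime, timezone

def guess_epoch_unit(value: int) -> str:
    # Heuristic only: assumes the timestamp refers to a roughly present-day date.
    digits = len(str(abs(int(value))))
    if digits <= 10:
        return "seconds"
    if digits <= 13:
        return "milliseconds"
    if digits <= 16:
        return "microseconds"
    return "nanoseconds"

ts = 1668443121840
unit = guess_epoch_unit(ts)                              # 'milliseconds'
divisor = {"seconds": 1, "milliseconds": 10**3,
           "microseconds": 10**6, "nanoseconds": 10**9}[unit]
print(unit, datetime.fromtimestamp(ts / divisor, tz=timezone.utc))
# milliseconds 2022-11-14 16:25:21.840000+00:00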

Get number of milliseconds for a localised date, taking into account daylight savings

I have data in Google BigQuery that looks like this:
sample_date_time_UTC      time_zone      milliseconds_between_samples
-----------------------   ------------   ----------------------------
2019-03-31 01:06:03 UTC   Europe/Paris   60000
2019-03-31 01:16:03 UTC   Europe/Paris   60000
...
Data samples are expected at regular intervals, indicated by the value of the milliseconds_between_samples field.
The time_zone is a string that represents a Google Cloud supported timezone value.
I'm then checking the ratio of the actual number of samples to the expected number over any single day, where the day is expressed as a local date for the given time_zone:
with data as
(
select
-- convert sample_date_time_UTC to equivalent local datetime for the timezone
DATETIME(sample_date_time_UTC,time_zone) as localised_sample_date_time,
milliseconds_between_samples
from `mytable`
where sample_date_time_UTC between '2019-03-31 00:00:00.000000+01:00' and '2019-04-01 00:00:00.000000+02:00'
)
select date(localised_sample_date_time) as localised_date, count(*)/(86400000/avg(milliseconds_between_samples)) as ratio_of_daily_sample_count_to_expected
from data
group by localised_date
order by localised_date
The problem is that this has a bug: I've hardcoded the expected number of milliseconds in a day to 86400000. This is incorrect, because when daylight saving begins in the specified time_zone (Europe/Paris), a day is 1 hour shorter, and when daylight saving ends, the day is 1 hour longer.
So the query above is incorrect. It queries data for 31 March 2019 in the Europe/Paris timezone, which is when daylight saving started in that timezone. The number of milliseconds in that day should be 82800000.
Within the query, how can I get the correct number of milliseconds for the specified localised_date?
Update:
I tried doing this to see what it returns:
select DATETIME_DIFF(DATETIME('2019-04-01 00:00:00.000000+02:00', 'Europe/Paris'), DATETIME('2019-03-31 00:00:00.000000+01:00', 'Europe/Paris'), MILLISECOND)
That didn't work - I get 86400000
You can get the difference in milliseconds between the two timestamps by removing the +01:00 and +02:00 offsets. Note that this gives the difference between the two local (Europe/Paris) wall-clock representations of those UTC instants: 90000000 ms, which is not the same as the actual time that passed.
You can do something like this to get the milliseconds for one day:
select 86400000 + (86400000 - DATETIME_DIFF(DATETIME('2019-04-01 00:00:00.000000', 'Europe/Paris'), DATETIME('2019-03-31 00:00:00.000000', 'Europe/Paris'), MILLISECOND))
For the 31 March 2019 example this evaluates to 86400000 + (86400000 - 90000000) = 82800000 ms, i.e. a 23-hour day.
Thanks @Juta for the hint on using UTC times for the calculation. As I'm grouping my data for each day by a localised date, I figured out that I can work out the milliseconds for each day by getting the beginning and end datetimes (in UTC) of my 'localised' date, using the following logic:
-- get UTC start datetime for localised date
-- get UTC end datetime for localised date
-- this then gives the milliseconds for that localised date:
datetime_diff(utc_end_datetime, utc_start_datetime, MILLISECOND);
So, my full query becomes:
with daily_sample_count as (
with data as
(
select
-- get the date in the local timezone, for sample_date_time_UTC
DATE(sample_date_time_UTC,time_zone) as localised_date,
time_zone,
milliseconds_between_samples
from `mytable`
where sample_date_time_UTC between '2019-03-31 00:00:00.000000+01:00' and '2019-04-01 00:00:00.000000+02:00'
)
select
localised_date,
count(*) as daily_record_count,
avg(milliseconds_between_samples) as daily_avg_millis_between_samples,
datetime(timestamp(localised_date, time_zone)) as utc_start_datetime,
datetime(timestamp(date_add(localised_date, interval 1 day), time_zone)) as utc_end_datetime
from data
group by localised_date, time_zone
)
select
localised_date,
-- apply calculation for ratio_of_daily_sample_count_to_expected
-- based on the actual vs expected number of samples for the day
-- the number of milliseconds in the day changes when transitioning into/out of daylight saving, so we calculate the milliseconds in the day
daily_record_count/(datetime_diff(utc_end_datetime, utc_start_datetime, MILLISECOND)/daily_avg_millis_between_samples) as ratio_of_daily_sample_count_to_expected
from
daily_sample_count
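Outside of BigQuery, the same "UTC end of the local day minus UTC start of the local day" idea is easy to sanity-check. A sketch in Python using the standard-library zoneinfo module (Python 3.9+; the function name is just for illustration):

from datetime import date, datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

def millis_in_local_day(local_date: date, tz_name: str) -> int:
    tz = ZoneInfo(tz_name)
    # Local midnight at the start of the given day and of the following day.
    start = datetime(local_date.year, local_date.month, local_date.day, tzinfo=tz)
    nxt = local_date + timedelta(days=1)
    end = datetime(nxt.year, nxt.month, nxt.day, tzinfo=tz)
    # Subtracting two timezone-aware datetimes gives the true elapsed interval,
    # so DST transitions are accounted for automatically.
    return int((end - start).total_seconds() * 1000)

print(millis_in_local_day(date(2019, 3, 31), "Europe/Paris"))   # 82800000 (23-hour day)
print(millis_in_local_day(date(2019, 10, 27), "Europe/Paris"))  # 90000000 (25-hour day)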

Get current time in milliseconds

I am trying to make an API call which requires a time in milliseconds. I am pretty new to R and have been Googling for hours to find something like what in Java would be:
System.currentTimeMillis();
The only thing I see is stuff like
Sys.Date() and Sys.time()
which return a formatted date instead of the time in milliseconds.
I hope someone can give me a one-liner which solves my problem.
Sys.time does not return a "formatted time". It returns a POSIXct classed object, which is the number of seconds since the Unix epoch. Of course, when you print that object, it returns a formatted time. But how something prints is not what it is.
To get the current time in milliseconds, you just need to convert the output of Sys.time to numeric, and multiply by 1000.
R> print(as.numeric(Sys.time())*1000, digits=15)
[1] 1476538955719.77
Depending on the API call you want to make, you might need to remove the fractional milliseconds.
There is no need to set the global option digits.secs.
See ?strptime for details.
# Print milliseconds of current time
# See ?strptime for details, specifically
# the formatting option %OSn, where 0 <= n <= 6
as.numeric(format(Sys.time(), "%OS3")) * 1000
To get current epoch time (in second):
as.numeric(Sys.time())
If you want to get a time difference (for computing a duration, for example), just subtract two Sys.time() values and you will get a nicely formatted difftime object:
currentTs <- Sys.time()
# about five seconds later
elapsed <- Sys.time() - currentTs
print(elapsed) # Time difference of 4.926194 secs
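For comparison, the equivalent of Java's System.currentTimeMillis() in plain Python (a sketch, outside the scope of the original R answers):

import time

# time.time() returns seconds since the Unix epoch as a float;
# scale to milliseconds and truncate to an integer.
millis = int(time.time() * 1000)
print(millis)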

In what date format is 1339698600000 = 15 June 2012?

I am using bootstrap-datepicker and get a value of 1339698600000 for the selected date of 15th June 2012.
What date format is this? How do I convert it to a human-readable format?
Is there any resource where I can find many more formats?
That is the number of milliseconds since January 1, 1970 (the POSIX epoch). You can divide it by 1000 to get the number of seconds since the epoch, which is a standard way to represent time.
It's the number of milliseconds since 1/1/1970. To convert it to a human-readable format, add that many milliseconds to a 1/1/1970 date object.
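To illustrate with Python (a sketch; the value likely corresponds to local midnight in the asker's timezone, which the question doesn't state):

from datetime import datetime, timezone

millis = 1339698600000
dt_utc = datetime.fromtimestamp(millis / 1000, tz=timezone.utc)
print(dt_utc)  # 2012-06-14 18:30:00+00:00, i.e. midnight on 15 June 2012 at UTC+05:30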

Converting datetime character string to double value of milliseconds since 1 Jan 1960

From a related question, I've found out how to convert a Stata datetime value (milliseconds since 1 Jan 1960) in R (see below):
as.POSIXct(874022400000/1000, origin="1960-01-01")
I am looking to do the opposite in R: i.e., given a datetime expressed as a character string, return the value as milliseconds since 01 Jan 1960 00:00:00. Any suggestions would be much appreciated.
Use as.numeric to coerce the date-time back into seconds since the epoch. Since R uses 1970 as its origin, you have to additionally account for the 1960-1970 offset. Lastly, of course, take care of the seconds to milliseconds conversion.
> mydate = as.POSIXct(874022400000/1000, origin="1960-01-01")
> 1000 * (as.numeric(mydate) - as.numeric(as.POSIXct('1960-01-01')))
[1] 874022400000
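For reference, the same conversion sketched in Python (an illustration only; it assumes the character string represents a UTC time, and the helper name is made up):

from datetime import datetime, timezone

STATA_EPOCH = datetime(1960, 1, 1, tzinfo=timezone.utc)

def to_millis_since_1960(dt_string: str, fmt: str = "%Y-%m-%d %H:%M:%S") -> int:
    # Parse the character string and treat it as UTC (assumption).
    dt = datetime.strptime(dt_string, fmt).replace(tzinfo=timezone.utc)
    # Milliseconds elapsed since 1 Jan 1960 00:00:00.
    return int((dt - STATA_EPOCH).total_seconds() * 1000)

print(to_millis_since_1960("1987-09-12 00:00:00"))  # 874022400000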
