Datetime not formatting properly as d:hh:mm - datetime

I have a Google spreadsheet where I'm finding the amount of time between two datetime values. The calculation seems to be working properly, because when I output the value as just a number, it looks correct. However, when I try to format it as d:hh:mm, for any values where the two dates are on the same day, it is showing me 30 days instead of 0. Some sample data:
Datetime | Diff as # | Diff as d:hh:mm
8/13/2016 20:24 | |
8/27/2016 00:09 | 13.15625 | 13:03:45 (this line is correct)
8/27/2016 04:43 | 0.190278 | 30:04:34 (this is incorrect, should be 0 days, not 30)
When I try the same thing in Excel, it works as expected (instead of 30:04:34, I get 00:04:34), so I think it's some difference between the two that I'm not familiar with. Can someone assist?

At present, Google Sheets doesn't have a custom-format option to show a duration in days; you can only show it in hours and/or minutes and/or seconds. (The stray 30 is almost certainly a day-of-month: a difference of ~0.19 gets rendered as a date at the very start of the spreadsheet epoch, December 30, 1899, so d shows 30, whereas Excel renders serial 0 as "January 0, 1900" and therefore shows 00.)
Still, there is a workaround to get the desired format; try the following formula in cell C3:
=(datevalue(A3)-datevalue(A2))&text(time(0,0,(A3-A2)*24*60*60),":HH:mm")
EDIT (after viewing the comment):
Try the following formula:
=rounddown(A3-A2)&" days "&text(time(0,0,(A3-A2)*24*60*60),"HH:mm")
With the sample data above this should display 13 days 03:45, and 0 days 04:34 when filled down to the next row.

Related

Is there any way to parse a date in jq which contains milliseconds, or is there a way to get the interval between two dates without parsing?

So basically I need to get an interval between two dates, and I have found how to do that using mktime and then subtracting the two numbers.
But from my searching in the strptime C library, there doesn't seem to be a way to parse a date containing milliseconds, so I'm asking if there is any way around this, or any other way to parse a date containing milliseconds.
Edit: if there is any way to round the date up or down, or just remove the milliseconds, that would work too.
example:
{
"ActiveFrom": "2022-02-13T11:32:01.321345+04:00",
"ActiveTo": "2022-02-13T11:33:13.031743+04:00"
}
You could strip the fractional seconds with a substitution, and then go via strptime (required because of non-UTC timezone) to get a Unix timestamp:
echo '{
"ActiveFrom": "2022-02-13T11:32:01.321345+04:00",
"ActiveTo": "2022-02-13T11:33:13.031743+04:00"
}' \
| jq '.ActiveFrom
| sub("\\.[[:digit:]]+"; "")
| strptime("%Y-%m-%dT%H:%M:%S%z")
| mktime'
resulting in 1644751921 (jq playground).
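If the goal is the interval itself, the same trick can be applied to both fields and the two epochs subtracted. A minimal sketch along the same lines (to_epoch is just an illustrative helper name, not a jq builtin):
echo '{
"ActiveFrom": "2022-02-13T11:32:01.321345+04:00",
"ActiveTo": "2022-02-13T11:33:13.031743+04:00"
}' \
| jq 'def to_epoch: sub("\\.[[:digit:]]+"; "") | strptime("%Y-%m-%dT%H:%M:%S%z") | mktime;
      (.ActiveTo | to_epoch) - (.ActiveFrom | to_epoch)'
For the sample above this should print 72, i.e. the interval in whole seconds (the stripped fractional seconds are lost, which matches the "just remove the milliseconds" option in the question).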
If your dates are Date objects (e.g. in JavaScript), you can easily calculate the interval like this:
var interval = date_later - date_early;
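That subtraction coerces both Date objects to epoch milliseconds. A small JavaScript sketch using the question's sample instants (fractional seconds dropped, purely for illustration):
// Subtracting two Date objects yields the difference in milliseconds
var dateEarly = new Date("2022-02-13T11:32:01+04:00");
var dateLater = new Date("2022-02-13T11:33:13+04:00");
var intervalMs = dateLater - dateEarly;   // 72000
console.log(intervalMs / 1000);           // 72 seconds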

neo4j datetime wrong translation to 1970

I have nodes with a datetime value, and when using the conversion described in:
https://community.neo4j.com/t/cannot-construct-date-time-from-no-value-failure-when-processing-file/34973/3
https://community.neo4j.com/t/cannot-construct-date-time-from-no-value-failure-when-processing-file/34973/4
I get wrong dates, back in 1970. Can you help me find what is wrong with this query?
MATCH (n:Resource)
with n, datetime({epochmillis: toInteger(n.created_at)}) as time
return n.created_at, toInteger(n.created_at), time
The results in the time column do not make sense; they should be in 2022.
https://www.epochconverter.com/
Got this result:
"n.created_at" │"toInteger(n.created_at)"│"time" │
╞═════════════════╪═════════════════════════╪════════════════════════════════╡
│1651750310.706613│1651750310 │"1970-01-20T02:49:10.310000000Z"│
├─────────────────┼─────────────────────────┼────────────────────────────────┤
│1651750359.453425│1651750359 │"1970-01-20T02:49:10.359000000Z"│
├─────────────────┼─────────────────────────┼────────────────────────────────┤
│1651751391.714048│1651751391 │"1970-01-20T02:49:11.391000000Z"│
You are using epochmillis, which expects the timestamp in milliseconds, but your created_at is in seconds. Just multiply it by 1000 before passing it to epochmillis.
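A sketch of the corrected query under that assumption (created_at stored as a numeric value in fractional seconds):
MATCH (n:Resource)
// created_at is in seconds; epochmillis expects milliseconds, so scale by 1000
WITH n, datetime({epochmillis: toInteger(n.created_at * 1000)}) AS time
RETURN n.created_at, time
For the first sample row, 1651750310.706613 becomes 1651750310706 ms, which lands in May 2022 as expected.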

Kusto query help for Time chart

I am writing a Kusto query to display the status of build results in a time chart. That is, the first column will show the time in 5-minute bins and the remaining columns will have the count for each build status (success, failed, in progress).
After applying all the filters, I am using the query below:
| summarize count= count() by Status ,bin(timestamp(), 1h)
| render timechart
It says "unknown function" and I am not sure how to display a time chart. So, for each status, how do I get the count for every 5 minutes? Thanks for any input.
It seems that the issue is that you are using function notation when telling the bin function which column to use, instead of simply providing the name of the column. In other words, remove the parentheses after the column name timestamp, as follows:
T
| summarize count = count() by Status, bin(timestamp, 1h)
| render timechart
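Since the question mentions 5-minute buckets, the same query with a 5-minute bin would look like this, assuming timestamp is the datetime column of your table T:
T
| summarize count = count() by Status, bin(timestamp, 5m)  // 5m is a KQL timespan literal
| render timechart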

Snowflake - convert string to datetime

I am using Snowflake and I have a date as a string in this format:
'2021-04-01 08:00:05.577209+00'
I want to convert it to a datetime. I used the code below to do this (I trim the '+00' from each string first). However, I think I defined something wrong, because I keep getting errors.
TO_TIMESTAMP_NTZ(left(ts,len(ts)-4),'YYYY-MM-DD HH24:MI:SS.FF'),
Why do you want to trim the +00 off? Just do it like this:
select to_timestamp_ntz('2021-04-01 08:00:05.577209+00', 'YYYY-MM-DD HH24:MI:SS.FF+00')
It would be better to use left(ts, len(ts)-3) instead of left(ts, len(ts)-4), to trim the last 3 characters.
Can you check your data and make sure it is '2021-04-01 08:00:05.577209+00'? Because it works as expected (tested with both):
select ts,
       left(ts, len(ts)-3) trimmed,
       TO_TIMESTAMP_NTZ(left(ts, len(ts)-3), 'YYYY-MM-DD HH24:MI:SS.FF') result
from values ('2021-04-01 08:00:05.577209+00') tmp (ts);
Result:
+--------------------------------+-----------------------------+-------------------------+
| TS                             | TRIMMED                     | RESULT                  |
+--------------------------------+-----------------------------+-------------------------+
| 2021-04-01 08:00:05.577209+00  | 2021-04-01 08:00:05.577209  | 2021-04-01 08:00:05.577 |
+--------------------------------+-----------------------------+-------------------------+
I have found the answer to my question. I was reading data from CSV files on Azure Data Lake and hadn't noticed quotes in the columns. When I deleted them, everything worked fine.
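If editing the files by hand is not practical, a CSV file format can strip the enclosing quotes at load time instead. A minimal sketch (the format, stage, and table names here are made up):
-- Treat "..." as field enclosures so the quotes are removed during the load
-- instead of ending up inside the column values
create or replace file format my_csv_format
  type = csv
  field_optionally_enclosed_by = '"';

copy into my_table
from @my_stage/path/
file_format = (format_name = 'my_csv_format');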

Calculate a date 24 hours behind in Hive

My requirement is really simple: basically I need to go back 24 hours in a timestamp column in Hive.
So far I have tried two different ways, but neither is working:
select
recordDate, --original date
cast(date_sub(cast(recorddate as timestamp),1) as timestamp), -- going one day behind without hour
cast((cast(cast(recorddate as timestamp) AS bigint)-1*3600) as timestamp) -- crazy year
from mtmbuckets.servpro_agents_events limit 10;
My output does not look right. I appreciate any support you can give me, thanks.
There is no straightforward function in Hive to do this. Either:
1. create a UDF to do it, or
2. convert the date to a number of seconds, do the calculation (-24*60*60 seconds), then convert back to a date.
Use from_unixtime and unix_timestamp to achieve this, as in the code below.
select from_unixtime(unix_timestamp(recorddate) - 86400)
from mtmbuckets.servpro_agents_events limit 10;
from_unixtime
Converts a Unix timestamp (in seconds) back to a time string for that moment, using the default time zone, e.g. from_unixtime(1237573801) = '2009-03-20 11:30:01'.
unix_timestamp
Converts a time string in the format yyyy-MM-dd HH:mm:ss to a Unix timestamp (in seconds), using the default timezone and the default locale; returns 0 on failure: unix_timestamp('2009-03-20 11:30:01') = 1237573801
