I have a table with timestamps in milliseconds stored in it. I want to convert those timestamps into a human-readable form.
Here is a sample output from my table:
SELECT date raw, strftime('%d-%m-%Y', (date/1000)) as_string
FROM my_table
+---------------+------------+
| raw           | as_string  |
+---------------+------------+
| 1444687200000 | 06-47-3950 |
| ...           | ...        |
+---------------+------------+
As you can see, the date as string is quite strange (06-47-3950).
How can I obtain 12-10-2015?
Try this:
SELECT date raw, strftime('%d-%m-%Y', datetime(date/1000, 'unixepoch')) as_string
FROM my_table
You need to convert the timestamp to a date/time value first.
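strftime also accepts the 'unixepoch' modifier directly, so the intermediate datetime() call is optional. A quick sketch with the sample value from the question:
SELECT strftime('%d-%m-%Y', 1444687200000 / 1000, 'unixepoch');
-- 12-10-2015 (interpreted as UTC; add the 'localtime' modifier for your own zone)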
You need to convert the timestamp to a datetime first. There was an answer on one forum; I quote it here.
Here you are: try these queries to see why and how.
select julianday('1899-12-30 00:00:00');
-- that gives 2415018.5 (remember Julian dates start at noon)
select datetime('40660.9454658044', '+2415018 days', '+12 hours', 'localtime');
-- gets you 2011-04-28 00:41:28 (depending on your local time)
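The same conversion collapses into a single expression, because SQLite also reads a bare number passed to datetime() as a Julian day number. A sketch using the value from the quote:
select datetime(julianday('1899-12-30') + 40660.9454658044, 'localtime');
-- 2011-04-28 00:41:28 (again, depending on your local time)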
Is there a conversion to go from an integer (e.g. 54600) to a date-time format? I am seeing these integers in a healthcare flowsheet, where they are used to record a time.
CONVERT(varchar, DATEADD(ms, b22.meas_value * 1000, 0), 114) AS 'START TIME' is the code that worked in MS SQL, but Snowflake does not recognize the function.
You can try to use TO_DATE. The data type of the returned value is DATE.
Syntax
TO_DATE( '<integer>' )
where
<integer>
An expression that evaluates to a string containing an integer, for example ‘15000000’. Depending upon the magnitude of the string, it can be interpreted as seconds, milliseconds, microseconds, or nanoseconds.
For details and other usage, see the Snowflake TO_DATE documentation.
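For instance, with the integer that appears later in this thread (a hedged sketch; the results follow the magnitude rules quoted above):
select to_date('1637804567') as d;     -- read as seconds      -> 2021-11-25
select to_date('1637804567000') as d;  -- read as milliseconds -> 2021-11-25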
There are a few different ways:
select 1637804567::varchar::date as date;
+------------+
| DATE       |
|------------|
| 2021-11-25 |
+------------+
select to_date(1637804567::varchar) as date;
+------------+
| DATE       |
|------------|
| 2021-11-25 |
+------------+
TO_DATE or casting requires the value to be a string, so you need to convert the integer to a string before converting it to a date.
For your example, it would look like this in Snowflake:
select dateadd(ms, 54600*1000, to_date(1637804567::varchar)) as date;
+-------------------------------+
| DATE                          |
|-------------------------------|
| 2021-11-25 15:10:00.000000000 |
+-------------------------------+
TRY_TO_TIME() (there's a joke in there somewhere)
select try_to_time('54600') AS "START TIME"
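If you need a full timestamp rather than just a time of day, one option is to combine a date with that time using TIMESTAMP_FROM_PARTS. A sketch, assuming the integer really is seconds since midnight and CURRENT_DATE is the date you want:
select timestamp_from_parts(current_date, try_to_time('54600')) as "START TIMESTAMP";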
I'm using the following code to convert a string datetime variable to datetime, but the converted value is missing the SSS (milliseconds) part.
Code used:
cast(FROM_UNIXTIME(UNIX_TIMESTAMP(oldtime, "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"),"yyyy-MM-dd HH:mm:ss.SSS") as timestamp) as newtime
The outcome:
2019-03-08T18:28:36.901Z is converted to 08MAR2019:18:28:36.000000
Some other oldtime values as strings:
2020-03-09T16:05:06:827Z
2020-03-09T16:03:19:354Z
2020-03-11T16:03:57:280Z
2020-03-10T16:02:57:642Z
2020-03-10T16:04:07:455Z
2020-03-10T16:04:09:737Z
2020-03-10T16:03:57:280Z
2020-03-10T16:02:46:816Z
The SSS part '901' is missing in the converted time. I would like help keeping the SSS part, since I need to sort the records by their exact time.
Thank you!
from_unixtime only goes down to second precision (yyyy-MM-dd HH:mm:ss), so to get milliseconds we need a workaround.
We extract the milliseconds from old_time using regexp_extract, concatenate that onto the from_unixtime result, and finally cast to timestamp.
Example:
select old_time,
timestamp(concat_ws(".", --concat_ws with . and cast
FROM_UNIXTIME(UNIX_TIMESTAMP(old_time, "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"),"yyyy-MM-dd HH:mm:ss"), -- from_unixtime and unix_timestamp to convert without millisecs
regexp_extract(string(old_time),".+\\.(.*)(?i)z",1))) as newtime from --regexp_extract to extract last 3 digits before z then concat
(select string("2020-03-11T21:14:41.335Z")old_time)e
+------------------------+-----------------------+
|old_time                |newtime                |
+------------------------+-----------------------+
|2020-03-11T21:14:41.335Z|2020-03-11 21:14:41.335|
+------------------------+-----------------------+
UPDATE:
Your sample data has ':' before the milliseconds. Try the query below:
select old_time,
timestamp(concat_ws(".", --concat_ws with . and cast
FROM_UNIXTIME(UNIX_TIMESTAMP(old_time, "yyyy-MM-dd'T'HH:mm:ss:SSS'Z'"),"yyyy-MM-dd HH:mm:ss"), -- from_unixtime and unix_timestamp to convert without millisecs
regexp_extract(string(old_time),".+\\:(.*)(?i)z",1))) as newtime from --regexp_extract to extract last 3 digits before z then concat
(select string("2020-03-11T21:14:41:335Z")old_time)e
Simply replace 'T' with a space ' ', remove 'Z', and replace the last ':' with a dot, like this:
select regexp_replace('2020-03-09T16:05:06:827Z','(.*?)T(.*?):([^:]*?)Z$','$1 $2\\.$3');
Result:
2020-03-09 16:05:06.827
Also read this answer if you need to convert to a different format while preserving milliseconds: https://stackoverflow.com/a/59645846/2700344
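If you need an actual timestamp rather than a string, the result of the replace can simply be cast. A sketch along the same lines:
select cast(regexp_replace('2020-03-09T16:05:06:827Z','(.*?)T(.*?):([^:]*?)Z$','$1 $2\\.$3') as timestamp);
-- 2020-03-09 16:05:06.827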
I need to convert a string to a datetime (date and time together).
I tried this:
cast(to_date(from_unixtime(unix_timestamp('20190303164305', 'yyyyMMddHHmmss'))) as date) as date_data_chamada
timezone: Brazil
But this returns just the date, like 2019-03-03, and I need 2019-03-03 16:43:05.
Thanks!
Full code:
INSERT INTO p_b.este PARTITION (dt_originacao_fcdr)
SELECT
tp_registro_fcdr,
seq_registro_fcdr,
tp_cdr_fcdr,
dt_atendimento_fcdr,
data_atendimento_completa_fcdr,
cast(from_unixtime(unix_timestamp(data_atendimento_completa_fcdr, 'yyyyMMddHHmmss'),"yyyy-MM-dd HH:mm:ss")as timestamp) as date_data_atendimento_fcdr,
hr_atendimento_fcdr,
duracao_atend_fcdr,
hr_originacao_fcdr,
duracao_total_fcdr,
duracao_chamada_tarifada_fcdr,
st_chamada_fcdr,
fim_sel_orig_fcdr
FROM p_b.norm;
Remove the date cast and the to_date function, as you are expecting a timestamp!
Example:
hive> select from_unixtime(unix_timestamp('20190303164305', 'yyyyMMddHHmmss'),"yyyy-MM-dd HH:mm:ss") as date_data_chamada;
RESULT:
2019-03-03 16:43:05
If you use to_date or cast('string' as date), then Hive returns only the date (yyyy-MM-dd)!
Ex:
hive> select to_date(from_unixtime(unix_timestamp('20190303164305', 'yyyyMMddHHmmss'),"yyyy-MM-dd HH:mm:ss")) as date_data_chamada;
--2019-03-03
Pass the format string as the second argument to from_unixtime. Note that the returned type is string.
from_unixtime(unix_timestamp('20190303164305','yyyyMMddHHmmss'),'yyyy-MM-dd HH:mm:ss')
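If you need the value as a real TIMESTAMP rather than a STRING (as the INSERT above does), wrap it in a cast. A minimal sketch:
select cast(from_unixtime(unix_timestamp('20190303164305','yyyyMMddHHmmss'),'yyyy-MM-dd HH:mm:ss') as timestamp) as date_data_chamada;
-- 2019-03-03 16:43:05, now as a TIMESTAMP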
I have two columns, time_stamp and time_offset; both are of STRING data type.
How can I convert the time_stamp column into UTC with the help of the second column, which holds the UTC offset? Is there any Hive or Unix solution to convert the time_stamp column into UTC?
hive> select time_stamp from table1 limit 2;
OK
20170717-22:31:57.348
20170719-21:10:15.393
[yyyyMMdd-HH:mm:ss.SSS] (this column is in local time)
hive> select time_offset from table1 limit 2;
OK
-05:00
+05:00
['+hh:mm' or '-hh:mm'] (this column is the offset from UTC)
You can use the Hive Date Functions unix_timestamp and from_unixtime to perform the conversion.
Code
WITH table1 AS (
SELECT '20170717-22:31:57.348' AS time_stamp, '-05:00' AS time_offset UNION ALL
SELECT '20170719-21:10:15.393' AS time_stamp, '+05:00' AS time_offset
)
SELECT
time_stamp,
time_offset,
unix_timestamp(concat(time_stamp, ' ', time_offset), 'yyyyMMdd-HH:mm:ss.SSS X') AS unix_timestamp_with_offset,
from_unixtime(unix_timestamp(concat(time_stamp, ' ', time_offset), 'yyyyMMdd-HH:mm:ss.SSS X'), 'yyyyMMdd-HH:mm:ss.SSS') AS string_timestamp_with_offset
FROM table1
;
Result Set
+-----------------------+-------------+----------------------------+------------------------------+
| time_stamp            | time_offset | unix_timestamp_with_offset | string_timestamp_with_offset |
+-----------------------+-------------+----------------------------+------------------------------+
| 20170717-22:31:57.348 | -05:00      | 1500348717                 | 20170717-20:31:57.000        |
| 20170719-21:10:15.393 | +05:00      | 1500480615                 | 20170719-09:10:15.000        |
+-----------------------+-------------+----------------------------+------------------------------+
Explanation
unix_timestamp can accept an optional format string in the same syntax as Java SimpleDateFormat. I am guessing that your offsets are using the ISO 8601 syntax, so let's use the X format specifier. Then, we can use the concat String Operator to combine time_stamp and time_offset before passing to unix_timestamp.
The unix_timestamp function results in a numeric timestamp specified as seconds since epoch. To convert that back to a string representation, we can pass the result obtained from unix_timestamp through from_unixtime, this time specifying our original format specifier.
(Please do test thoroughly to make sure the results are making sense in your environment. Time zone math can be tricky.)
I have a table of price quotes for multiple symbols
Table QUOTES
ID INT
SYMBOL NVARCHAR(6)
DT DATETIME
PRICE DECIMAL(18,5)
Table TempSymbol
SYMBOL NVARCHAR(6)
I want to extract from QUOTES only those rows whose symbols are also in a temp table that could vary based on user request.
Create TABLE TempSymbol
(
SYMBOL NVARCHAR(6) NOT NULL
);
INSERT INTO TempSymbol(SYMBOL) VALUES ('MSFT');
INSERT INTO TempSymbol(SYMBOL) VALUES ('INTC');
INSERT INTO TempSymbol(SYMBOL) VALUES ('AAPL');
I want a query that will return from QUOTES the following data...
datetime            | symbol1 | price1 | symbol2 | price2 | symbol3 | price3
2012-11-12 12:10:00 | MSFT    | 12.10  | INTC    | 5.68   | AAPL    | 16.89
2012-11-12 12:15:00 | MSFT    | 12.22  | INTC    | 5.97   | AAPL    | 16.22
...
SELECT DT, SYMBOL, PRICE FROM QUOTE AS Q INNER JOIN TempSymbol AS TS ON Q.SYMBOL = TS.SYMBOL
This returns records that I need to pivot, but PIVOT isn't available in SQLite. Is there another way I should be attempting this? Any help is appreciated.
Try this:
SELECT DT, SYMBOL, PRICE FROM QUOTE where SYMBOL in (Select SYMBOL from TempSymbol)
SQL is doing the part of your problem that it's designed to do: retrieve the data. You can add ORDER BY DT to make the records for the same date-time adjacent.
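For example (a sketch that qualifies the columns and reuses the join from the question):
SELECT Q.DT, Q.SYMBOL, Q.PRICE
FROM QUOTE AS Q
INNER JOIN TempSymbol AS TS ON Q.SYMBOL = TS.SYMBOL
ORDER BY Q.DT;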
If you think about it for a minute, you'll see that a SELECT can't possibly return what you want. It returns table rows, and SQL rows have a fixed set of columns. So doing what you call a "pivot" is not a SELECT operation. You may be thinking of pivots in spreadsheets. Databases aren't spreadsheets.
After that, producing the report you want is best done with a little program in any of the languages with an SQLite interface (in Android for example that's Java; otherwise C or TCL). Make the query. Get the rows back as hashes, arrays, or ODM records. The rest is a couple of loops over this data. The algorithm is:
last_dt = null
for row in all rows
    if row.dt != last_dt
        start new output line
        print row.dt
        last_dt = row.dt
    end
    print ' | ', row.symbol, ' | ', row.price
end
Another note: With advanced DB features like stored procedures and XML objects you could implement this in SQL. XML objects can have variable numbers of fields. Here the limit is SQLite, which doesn't provide these features.
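That said, if the symbol list is fixed and known in advance (as with the three-row TempSymbol above), conditional aggregation can emulate the pivot in plain SQLite; it just cannot adapt to a variable number of symbols. A sketch, using the table name as written in the queries above:
SELECT DT,
       'MSFT' AS symbol1, MAX(CASE WHEN SYMBOL = 'MSFT' THEN PRICE END) AS price1,
       'INTC' AS symbol2, MAX(CASE WHEN SYMBOL = 'INTC' THEN PRICE END) AS price2,
       'AAPL' AS symbol3, MAX(CASE WHEN SYMBOL = 'AAPL' THEN PRICE END) AS price3
FROM QUOTE
GROUP BY DT
ORDER BY DT;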