I have a SQLite database with a TEXT column that stores a DateTime value formatted like this: '11/08/2019 00:00:00'. I would like to convert the entire column's contents to a UTC epoch timestamp (the stored values are in the local timezone).
Is there an UPDATE SQL statement with a DateTime function that I could run, using supported SQL syntax, to perform this task, or should I just write a quick C# console application to do it?
I have not found any example online or on SO that does what I need in this situation.
An UPDATE SQL such as the following could be used:
UPDATE mytable SET mycolumn =
CASE WHEN substr(mycolumn,3,1) = '/' /* crude check: is the value still in dd/mm/yyyy form? */
THEN
/* rearrange dd/mm/yyyy hh:mm:ss into yyyy-mm-dd hh:mm:ss and convert to epoch seconds */
strftime('%s',substr(mycolumn,7,4)||'-'||substr(mycolumn,4,2)||'-'||substr(mycolumn,1,2)||' '||substr(mycolumn,12,8))
ELSE
mycolumn /* already converted, leave as-is */
END
;
Example
Perhaps consider the following, which converts the column only if it still matches the dd/mm/yyyy format, leaving values that have already been converted (or that do not match) untouched.
Note that the code below just checks the 3rd character for /; a more rigorous check could be used if desired:
DROP TABLE IF EXISTS mytable;
CREATE TABLE IF NOT EXISTS mytable (mycolumn TEXT);
/* Load the testing data */
INSERT INTO mytable VALUES
('11/08/2019 00:00:00'),
('01/08/2019 00:00:00'),
('31/01/2019 00:00:00'),
('31/01/2019 13:25:33.004') /* test for fractional seconds (dropped by the conversion) */;
/* display data before conversion */
SELECT * FROM mytable;
/* Convert the data to unix epoch seconds */
UPDATE mytable SET mycolumn =
CASE WHEN substr(mycolumn,3,1) = '/'
THEN
strftime('%s',substr(mycolumn,7,4)||'-'||substr(mycolumn,4,2)||'-'||substr(mycolumn,1,2)||' '||substr(mycolumn,12,8))
ELSE
mycolumn
END
;
/* Display data as is, as formatted localised and as formatted UTC */
SELECT *, datetime(mycolumn,'unixepoch','localtime') AS local, datetime(mycolumn,'unixepoch') AS utc FROM mytable;
Note the above would NOT cater for dates such as 1/1/2019; such dates would need a more complex CASE clause.
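For completeness, here is a sketch of one such CASE clause (an illustration, not part of the original answer). It locates the slashes with instr() and zero-pads with printf(), so single-digit days and months are handled, and the LIKE test is a slightly more rigorous check than inspecting just the 3rd character. It assumes a time portion always follows the date:
UPDATE mytable SET mycolumn =
CASE WHEN mycolumn LIKE '%/%/%'
THEN
strftime('%s',
    /* yyyy : the 4 characters after the second slash */
    substr(mycolumn, instr(mycolumn,'/') + instr(substr(mycolumn, instr(mycolumn,'/') + 1),'/') + 1, 4)
    ||'-'||
    /* mm : between the slashes, zero-padded */
    printf('%02d', CAST(substr(mycolumn, instr(mycolumn,'/') + 1, instr(substr(mycolumn, instr(mycolumn,'/') + 1),'/') - 1) AS INTEGER))
    ||'-'||
    /* dd : before the first slash, zero-padded */
    printf('%02d', CAST(substr(mycolumn, 1, instr(mycolumn,'/') - 1) AS INTEGER))
    ||' '||
    /* hh:mm:ss : the 8 characters after the space */
    substr(mycolumn, instr(mycolumn,' ') + 1, 8))
ELSE
mycolumn
END
;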
Note that UTC is Coordinated Universal Time, i.e. a single worldwide value is stored, and you adjust from UTC according to the local time zone.
Results
Note: testing was done in a timezone that is +10 hours.
When first run, the (omitted) screenshots show the pre-conversion data and then the post-conversion epoch values. On a rerun with the DROP commented out, the pre-conversion data is mixed (some rows already converted), and the post-conversion results show that already-converted rows are left untouched.
Related
This is the BTEQ call I'm using:
.EXPORT REPORT FILE = OUTPUT_FILE;
SET SESSION DATEFORM = ANSIDATE;
SELECT * FROM TABLE_NAME
;
Dates keep coming up as IntegerDates YY/MM/DD
Teradata 16.xx. Is this not supported by BTEQ/Unix?
Update:
Fred's solution (worked like a charm):
ALTER TABLE TABLE_NAME ADD COLUMN_NAME DATE FORMAT 'MM/DD/YYYY';
The SESSION DATEFORM is a default. It applies to string values being supplied for dates, or to date expressions that don't have an explicit FORMAT. It will also be used to set the FORMAT for date columns if you don't specify one in your DDL. But if a table already exists, the defined column FORMAT will override the session default.
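To illustrate that precedence, a minimal Teradata sketch (the table and column names are hypothetical):
CREATE TABLE demo_dates (
    d1 DATE FORMAT 'MM/DD/YYYY',  -- explicit FORMAT: overrides the session DATEFORM
    d2 DATE                       -- no FORMAT: inherits the DATEFORM default in effect at CREATE time
);
SELECT d1, d2 FROM demo_dates;    -- d1 renders as MM/DD/YYYY regardless of the session DATEFORM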
Hi, I'm trying to find rows where the unix timestamp is 0:
WHERE datetimeField = datetime(0,'unixepoch','localtime') -> does not work
WHERE datetimeField = datetime(0,'unixepoch','utc') -> does not work
WHERE datetimeField = '1970-01-01T01:00:00' -> works
Any ideas why no datetime format works?
Because you're doing a string comparison (SQLite's = is not timestamp-aware; it simply compares two values for equality). Your datetimeField is presumably stored as the string '1970-01-01T01:00:00', which SQLite's datetime functions can deal with gracefully, but it is still ultimately a string, because SQLite doesn't have an intrinsic datetime type.
Note that all three of these are different strings, and thus won't return true for the equality check:
sqlite> select datetime(0,'unixepoch','localtime');
1969-12-31 16:00:00
sqlite> select datetime(0,'unixepoch','utc');
1970-01-01 08:00:00
sqlite> select '1970-01-01T01:00:00';
1970-01-01T01:00:00
sqlite>
To clarify, a simple solution to this problem is to convert everything to another format that you're confident it can be converted to and that is easy to compare. Since you've already brought it up, I quite like epoch seconds. Given your date of one hour after midnight on the first of January 1970:
sqlite> select strftime('%s', '1970-01-01T01:00:00');
3600
sqlite>
So from your code:
{stuff} WHERE CAST(strftime('%s', datetimeField) AS INTEGER) = 0
(The CAST matters: strftime() returns text, and in SQLite an integer never compares equal to a text value.)
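A quick self-contained check of that predicate (a sketch; the table and rows are hypothetical):
CREATE TABLE demo (datetimeField TEXT);
INSERT INTO demo VALUES
    ('1970-01-01T00:00:00'),  /* epoch second 0 (treated as UTC) */
    ('1970-01-01T01:00:00');  /* epoch second 3600 */
SELECT * FROM demo
WHERE CAST(strftime('%s', datetimeField) AS INTEGER) = 0;  /* returns only the first row */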
I would like to run a query that joins a table to a manually generated list, but I am stuck trying to generate the manual list. Here is an example of what I am attempting to do:
SELECT *
FROM ('29/12/2014', '30/12/2014', '31/12/2014') dates;
Ideally I would want my output to look like:
29/12/2014
30/12/2014
31/12/2014
What's your Teradata release?
In TD14 there's STRTOK_SPLIT_TO_TABLE:
SELECT *
FROM TABLE (STRTOK_SPLIT_TO_TABLE(1 -- any dummy value
,'29/12/2014,30/12/2014,31/12/2014' -- any delimited string
,',' -- delimiter
)
RETURNS (outkey INTEGER
,tokennum INTEGER
,token VARCHAR(20) CHARACTER SET UNICODE) -- modify to match the actual size
) AS d
You can easily put this in a derived table and then join to it (a sketch follows the column notes below).
inkey (here the dummy value 1) is a numeric or string column, usually a key. Can be used for joining back to the original row.
outkey is the same as inkey.
tokennum is the ordinal position of the token in the input string.
token is the extracted substring.
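A sketch of that derived-table join (the joined table mytable and its date_str column are hypothetical names; the tokens are compared as strings):
SELECT t.*
FROM mytable AS t
JOIN (
    SELECT token
    FROM TABLE (STRTOK_SPLIT_TO_TABLE(1
                ,'29/12/2014,30/12/2014,31/12/2014'
                ,','
                )
         RETURNS (outkey INTEGER
                 ,tokennum INTEGER
                 ,token VARCHAR(20) CHARACTER SET UNICODE)
         ) AS d
) AS dates
ON t.date_str = dates.token;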
Try this:
select '29/12/2014'
union
select '30/12/2014'
union
...
It should work in Teradata as well as in MySQL.
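For the original goal of joining to the manually generated list, the union could sit in a derived table like this (a sketch; mytable and date_str are hypothetical names):
SELECT t.*
FROM mytable AS t
JOIN (
    SELECT '29/12/2014' AS dt
    UNION SELECT '30/12/2014'
    UNION SELECT '31/12/2014'
) AS dates
ON t.date_str = dates.dt;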
I am reading a big csv (>1 GB, big for me!). It contains a timestamp field.
I read it (100 rows to start with) with fread from the excellent data.table package.
ddfr <- fread(input="~/file1.csv",nrows=100, header=T)
Problem 1 (RESOLVED): the timestamp fields (called "ts" and "update"), e.g. "02/12/2014 04:40:00 AM", are converted to strings. I convert the fields back to timestamps with the lubridate package's mdy_hms. Splendid.
ddfr$ts <- mdy_hms(ddfr$ts)
Problem 2 (NOT RESOLVED): The timestamp is created with a time zone, as per POSIXlt.
How do I create a timestamp with NO time zone in R? Is that possible?
Now I use another (new) great package, PivotalR, to write the dataframe to PostgreSQL 9.3 using as.db.data.frame. It works like a charm.
x <- as.db.data.frame(ddfr, table.name= "tbl1", conn.id = 1)
Problem 3 (NOT RESOLVED): As the original dataframe's timestamp fields had time zones, the table is created with "timestamp with time zone" fields. Ultimately the data needs to be stored in a table with fields configured as "timestamp without time zone".
But in my table in Postgres the data is stored as "2014-02-12 04:40:00.0", where the .0 at the end is the UTC offset. I think I need to have "2014-02-12 04:40:00".
I tried
ALTER TABLE tbl ALTER COLUMN ts type timestamp without time zone;
Then I copied across. While Postgres accepts the ALTER COLUMN command, when I try to copy (using INSERT INTO tbls SELECT ...) I get an error:
"column "ts" is of type timestamp without time zone but expression is of type text
Hint: You will need to rewrite or cast the expression."
Clearly the .0 at the end is not liked (but then why does Postgres accept the ALTER COLUMN? Who knows!).
I tried to do what the error suggested using CAST in the INSERT INTO query:
INSERT INTO tbl2 SELECT CAST(ts as timestamp without time zone) FROM tbl1
But I get the same error (including the suggestion to use CAST, aargh!).
The table directly created by PivotalR (based on the dataframe) has this CREATE script:
CREATE TABLE tbl2
(
businessid integer,
caseno text,
ts timestamp with time zone
)
WITH (
OIDS=FALSE
);
ALTER TABLE tbl1
OWNER TO mydb;
The table I'm inserting into has this CREATE script:
CREATE TABLE tbl1
(
id integer NOT NULL DEFAULT nextval('bus_seq'::regclass),
businessid character varying,
caseno character varying,
ts timestamp without time zone,
updated timestamp without time zone,
CONSTRAINT busid_pkey PRIMARY KEY (id)
)
WITH (
OIDS=FALSE
);
ALTER TABLE tbl1
OWNER TO postgres;
My apologies for the convoluted explanation, but potentially a solution could be found at any step in the chain, so I preferred to put all my steps in one question. I am sure there has to be a simpler method...
I think you're confused about copying data between tables.
INSERT INTO ... SELECT without a column list expects the columns from source and destination to be the same. It doesn't magically match up columns by name; it just assigns columns from the SELECT to the INSERT from left to right until it runs out of columns, at which point any remaining columns are assumed to be null. So your query:
INSERT INTO tbl2 SELECT ts FROM tbl1;
isn't doing this:
INSERT INTO tbl2(ts) SELECT ts FROM tbl1;
it's actually picking the first column of tbl2, which is businessid, so it's really attempting to do:
INSERT INTO tbl2(businessid) SELECT ts FROM tbl1;
which is clearly nonsense, and no casting will fix that.
(Your error in the original question doesn't match your tables and queries, so the details might be different as you've clearly made a mistake in mangling/obfuscating your tables or posted a newer version of the tables than the error. The principle remains.)
It's generally a really bad idea to assume your table definitions won't change and column order won't change anyway. So always be explicit about columns. In this case I think your intention might have actually been:
INSERT INTO tbl2(businessid, caseno, ts)
SELECT CAST(businessid AS integer), caseno, ts
FROM tbl1;
Note the cast, because the type of businessid is different between the two tables.
Hi, I am trying to convert a column defined as DATE to a date with format DD-MON-YYYY, but I can't get it working.
select
to_date(to_char
(DESIGN_COMPLETION_DATE,'DD-MON-YYYY'),'DD-MON-YYYY') as Assessment_Completion_Date
from nbi_dates
Also Tried
Select to_date(DESIGN_COMPLETION_DATE,'DD-MON-YYYY') as Assessment_Completion_Date
from nbi_dates
What works is the following, but I can't do any calculations on it as it is char:
Select to_char(DESIGN_COMPLETION_DATE,'DD-MON-YYYY') as Assessment_Completion_Date
from nbi_dates
Thanks
If you have a DATE that includes a time portion but are only interested in the actual date part, you can use the trunc function:
select trunc(design_completion_date) as assessment_completion_date from nbi_dates
An example of the difference using sysdate; notice the time on the trunc'd version has been set to midnight:
SQL> alter session set nls_date_format = 'DD/MM/YYYY HH24:MI:SS';
Session altered.
SQL> select sysdate, trunc(sysdate) from dual;
SYSDATE TRUNC(SYSDATE)
------------------- -------------------
11/04/2013 15:14:31 11/04/2013 00:00:00
A DATE has no inherent format. DD-MON-YYYY is a format mask applied to display the date, or to convert it to a string representation, which is usually only necessary for display anyway. What you have as your third option is right for that purpose, but not if you want to do any further date calculations with the result.
Select to_date(to_char(DESIGN_COMPLETION_DATE,'DD-MON-YYYY'),'DD-MON-YYYY') as Assessment_Completion_Date
from nbi_dates
or simply (in case you want a date object for calculations but not for rendering):
Select DESIGN_COMPLETION_DATE as Assessment_Completion_Date
from nbi_dates
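To illustrate why keeping the plain DATE matters for calculations, a small sketch (the 30-day offset and the alias names are arbitrary, hypothetical examples):
Select DESIGN_COMPLETION_DATE + 30 as Follow_Up_Date,                     -- date arithmetic works on the DATE itself
       to_char(DESIGN_COMPLETION_DATE,'DD-MON-YYYY') as Display_Date     -- format only when rendering
from nbi_dates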