I am working on a TPT script that receives a flat file as input in the format below.
File.txt
12191|2010-04-06|2010-09-08
28036|2009-05-20|2009-05-20
30742|2016-05-27|2016-05-27
Below is the relevant portion of the TPT script. I am not able to run it because it fails while casting the values into the date format.
APPLY
(
'INSERT INTO '||#WorkingDatabase||'.'||#TargetTable||'(
id
,actvtn_dt
,end_dt
,load_dt_tm
) VALUES
(
:id
, CAST(:actvtn_dt AS DATE FORMAT ''YYYY-MM-DD'')
, CAST(:end_dt AS DATE FORMAT ''YYYY-MM-DD'')
, CURRENT_TIMESTAMP(0)
);'
)
Can anyone help in this regard?
My source is a file and I am loading into a SQL Server table. I'm working on a scenario where I have to convert a string like '2019-04-02T21:24:00.065' to the Informatica datetime format.
I tried the expression below, but it sometimes fails because on some occasions we do not receive milliseconds from our source file.
IIF(NOT ISNULL(DATEFIELD),TO_DATE(SUBSTR (DATEFIELD, 0, 10) || ' ' || SUBSTR(DATEFIELD, 12, 12), 'YYYY-MM-DD HH24.MI.SS.US'),NULL)
I'm looking for a permanent fix to handle all types of datetime formats regardless of what we receive in the file.
Well... I'm sorry to say, but there is no magic component that will recognize all possible date and time formats (including, e.g., dates spelled out in Swahili).
You will need to detect the format yourself. You can use a DECODE function, e.g.:
DECODE(True,
IS_DATE(your_input_port, 'DD/MM/YYYY'), TO_DATE(your_input_port, 'DD/MM/YYYY'),
...)
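Adapted to the two layouts described in the question (with and without milliseconds), a sketch along those lines could look like the following; it reuses the SUBSTR split from the question, and the exact format codes (in particular MS for milliseconds) are assumptions:
DECODE(True,
    IS_DATE(SUBSTR(DATEFIELD, 1, 10) || ' ' || SUBSTR(DATEFIELD, 12), 'YYYY-MM-DD HH24:MI:SS.MS'),
        TO_DATE(SUBSTR(DATEFIELD, 1, 10) || ' ' || SUBSTR(DATEFIELD, 12), 'YYYY-MM-DD HH24:MI:SS.MS'),
    IS_DATE(SUBSTR(DATEFIELD, 1, 10) || ' ' || SUBSTR(DATEFIELD, 12), 'YYYY-MM-DD HH24:MI:SS'),
        TO_DATE(SUBSTR(DATEFIELD, 1, 10) || ' ' || SUBSTR(DATEFIELD, 12), 'YYYY-MM-DD HH24:MI:SS'),
    NULL)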
If you are completely sure that only the seconds/milliseconds part can be missing, you can check the length: if it is less than 12, use RPAD on the second part of your SUBSTR to fill in the missing portion. Otherwise you can use DECODE as suggested by #maciejg and write code for all possible date formats.
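A minimal sketch of that padding idea, assuming the milliseconds are either fully present or fully absent (the MS format code is an assumption):
IIF(NOT ISNULL(DATEFIELD),
    TO_DATE(SUBSTR(DATEFIELD, 1, 10) || ' ' ||
            -- RPAD pads e.g. '21:24:00' out to '21:24:00.000';
            -- a full '21:24:00.065' is already 12 characters and stays unchanged
            RPAD(SUBSTR(DATEFIELD, 12), 12, '.000'),
            'YYYY-MM-DD HH24:MI:SS.MS'),
    NULL)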
Thanks for your inputs, guys. Since we are not sure which date format we will receive, we decided to go with a simple fix. I have changed the target field to varchar and am simply replacing the 'T' with a ' ' (space), since this is a staging mapping.
Again thanks for your time and inputs.
I have a table with a column called 'Start_Date', in the format DD/MM/YYYY.
I want to convert this into a datetime within a query, and have been trying various methods without success.
I currently have...
SELECT CONVERT(DATETIME, start_date, 103) FROM product_backfile;
...but get the error message...
Error Code: 1064
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'start_date, 103)
FROM product_backfile LIMIT 0, 1000' at line 1
I should add that I've tried this too, with no luck...
SELECT CONVERT(DATE, start_date, 103) FROM product_backfile;
The full syntax I'm trying to implement is...
CONCAT(DATE_FORMAT(start_Date, '%Y-%m-%d'),'T', '00:00:00Z'), CHAR(93), ' TO ', CHAR(91), CONCAT(DATE_FORMAT(End_Date, '%Y-%m-%d'),'T', '00:00:00Z')
I've narrowed the problem down to the way in which the Start_Date and End_Date fields are being interpreted, and have previously got the full CONCAT string to work in MySQL Workbench...
I'm at the end of my tether...please help!! :)
Answered my own question, after some more investigation....! :)
DATE_FORMAT(STR_TO_DATE(start_date, '%d/%m/%Y'), '%Y-%m-%d')
Full syntax I was after...
CONCAT(DATE_FORMAT(STR_TO_DATE(start_date, '%d/%m/%Y'), '%Y-%m-%d'),'T', '00:00:00Z')
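For example, against the product_backfile table from the question:
SELECT CONCAT(DATE_FORMAT(STR_TO_DATE(start_date, '%d/%m/%Y'), '%Y-%m-%d'),
              'T', '00:00:00Z') AS start_date_iso
FROM product_backfile;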
I have a .txt file that contains a timestamp column like 20180607093059000 along with some other columns, but while importing it into Teradata SQL Assistant I get an invalid timestamp error. Please help me find a way to do this,
as I need to import the file data into a volatile table and do a MINUS operation between this file table and the actual table.
The default format for a timestamp is 'YYYY-MM-DD HH:MI:SS', so you can apply TO_TIMESTAMP:
To_Timestamp(ts, 'yyyymmddhhmissff3')
which results in a TIMESTAMP(6).
To get a TIMESTAMP(3) you need to CAST using a FORMAT after inserting the period before the fractional seconds:
Cast(Substring(ts From 1 FOR 14) || '.' || Substring(ts From 15) AS TIMESTAMP(3) FORMAT 'yyyymmddhhmiss.s(3)')
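For example, with the sample value from the question plugged in directly (TO_TIMESTAMP requires a reasonably recent Teradata release; that is an assumption here):
SELECT
    To_Timestamp('20180607093059000', 'yyyymmddhhmissff3') AS ts6,
    Cast(Substring('20180607093059000' From 1 FOR 14) || '.' ||
         Substring('20180607093059000' From 15)
         AS TIMESTAMP(3) FORMAT 'yyyymmddhhmiss.s(3)') AS ts3;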
I have a column COL1 in TABLE1 which is of string data type. This table is loaded by an Informatica session (the data comes from a mainframe) and the format of COL1 is YYYY-MM-DD. Now I have to use TABLE1 as the source in my next mapping. In the SQL override query of the second mapping I will be casting COL1 to a date using the query below.
SELECT
CAST(COL1 AS DATE FORMAT 'YYYY-MM-DD') AS CHK_DT FROM TABLE1
But when I try to execute this query in Teradata SQLA, just to check if it runs fine, it gives me the error below.
SELECT Failed. 2666: Invalid date supplied for COL1.
Can you please help me resolve this issue? This is not the only date column with this problem; there are two more date columns. I guess the resolution is the same for all three columns.
P.S. Just to verify, I updated all rows of COL1 in TABLE1 to 2016-12-12 and ran the select statement; the select worked fine. I then updated COL1 in all rows to 2016-13-12, and it gave the same error. Whenever either the DD or the MM part is more than 12, it gives me the error.
Thanks
If the date is represented/stored as an ANSI standard literal, YYYY-MM-DD, the CAST will work:
SELECT CAST('2016-12-13' AS DATE FORMAT 'YYYY-MM-DD') AS Date1
However, I doubt that is the case for you.
The date is most probably in YYYY-DD-MM format. In that case the ANSI standard format will throw the error; you need FORMAT 'YYYY-DD-MM':
select CAST('2016-13-12' AS DATE FORMAT 'YYYY-DD-MM') AS Date2
P.S. You can confirm the conversion to a date using the TYPE() function. It should return DATE in your case.
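For example:
SELECT TYPE(CAST('2016-13-12' AS DATE FORMAT 'YYYY-DD-MM'));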
Hi, please try this piece of code
CAST(CAST(date_col AS FORMAT 'YYYY-MM-DD') AS VARCHAR(15))
instead of the transformation you are using.
Thanks for your response. However, the issue was something else. Some of the incoming records had spaces in this column, so I had to tweak my Informatica mapping to put a trim on the date column. Now the select is running fine. Thanks for your time.
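For reference, if a database-side fix is ever preferred instead, the same cast can simply be wrapped around a TRIM (just a sketch of that alternative):
SELECT CAST(TRIM(COL1) AS DATE FORMAT 'YYYY-MM-DD') AS CHK_DT
FROM TABLE1;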
I had a CSV file containing records like these:
1,cat,2012-06-16,2013-06-16
1,cat,2013-06-16,
I am trying to load it into a temporal table with a valid_dt PERIOD(DATE) column using a FastLoad script:
nonsequenced validtime
INSERT INTO financial.test1 (id,name,valid_dt) values
(:id,:name,period( cast(:start_dt as date FORMAT 'YYYY-MM-DD'),cast(:end_dt as date FORMAT 'YYYY-MM-DD'))
);
The error I got is RDBMS error 3618: Expression not allowed in Fast Load
Insert, column INTERNALPERIODDATETYPE.
I could not find anything about this in the manuals; they only said it should be possible with FastLoad.
Thank you.
FastLoad doesn't allow ANSI-style CAST; it must be the old Teradata style instead:
:start_dt (date, FORMAT 'YYYY-MM-DD')
But there's no old-style PERIOD cast, and FastLoad doesn't allow any kind of expression anyway; PERIOD(...) is an expression.
So you can only load data which can be automatically converted to a PERIOD, like:
1;cat;(2012-06-16, 2013-06-16)
1;cat;(2013-06-16, 9999-12-31)
including the parentheses, the blank after the comma, and a different delimiter (since the PERIOD value itself contains a comma)...
I would suggest simply loading the data as DATEs (or CHARs) into a staging table using FastLoad or MultiLoad, followed by a
nonsequenced validtime
INSERT INTO financial.test1 (id, name, valid_dt)
SELECT id, name, PERIOD(start_dt, COALESCE(end_dt, DATE '9999-12-31'))
FROM stagingtable;
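A rough sketch of the staging-table FastLoad piece, using the old-style cast shown above (LOGON/LOGOFF omitted; the staging table layout, error table names, and file name are assumptions):
SET RECORD VARTEXT ",";
DEFINE
    id       (VARCHAR(10)),
    name     (VARCHAR(20)),
    start_dt (VARCHAR(10)),
    end_dt   (VARCHAR(10))
FILE = file.csv;
BEGIN LOADING stagingtable ERRORFILES stg_err1, stg_err2;
INSERT INTO stagingtable (id, name, start_dt, end_dt)
VALUES (
    :id,
    :name,
    :start_dt (DATE, FORMAT 'YYYY-MM-DD'),
    :end_dt   (DATE, FORMAT 'YYYY-MM-DD')
);
END LOADING;
With VARTEXT the empty end_dt in the second record should come in as NULL, which the COALESCE in the INSERT-SELECT above then turns into DATE '9999-12-31'.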