Informatica giving error while loading Teradata Number column - teradata

In a Teradata (13.5) target table we have a column with data type NUMBER; when we try to load this table using an Informatica flow, it gives the following error:
Severity Timestamp Node Thread Message Code Message
ERROR 4/1/2015 3:08:52 PM node01_<host_name> WRITER_1_*_1 WRT_8229 Database errors occurred:
FnName: Execute -- [Teradata][ODBC Teradata Driver] Illegal data conversion
We have tried everything including:
1. Changing Informatica target datatype to decimal, bigint, integer, varchar
2. Importing the target table into Informatica using the Informatica Target Designer, but this Number field is imported as Varchar(0)
Please suggest how to solve this, as changing the target data type is not an option for us.

You can import the target table into Informatica using the Informatica Target Designer, and then edit the datatype for the column to anything you want. It can be altered without issues.
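If it helps to pick a matching precision when editing that datatype, the stored precision and scale of the NUMBER column can be read from the Teradata data dictionary. A minimal sketch (the database and table names are placeholders):
-- DecimalTotalDigits/DecimalFractionalDigits give the precision and scale
SELECT ColumnName, ColumnType, DecimalTotalDigits, DecimalFractionalDigits
FROM DBC.Columns
WHERE DatabaseName = 'mydb'
  AND TableName = 'target_table';
With the actual precision and scale in hand, the column in Informatica can be set to a decimal of matching size instead of the Varchar(0) that the import produces.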

Related

Are Sybase datetime fields supported in PolyBase external tables?

I'm working with PolyBase on SQL Server 2022. I have some tables with type "datetime NULL" on Sybase ASE 16.
When declaring an external table, e.g.
CREATE EXTERNAL TABLE SYBASESCHEMA.SomeTable
(
[SomeNiceTime] DATETIME NULL
)
WITH (LOCATION = N'SomeNiceDatabase.dbo.SomeTable', DATA_SOURCE = SYBASE_DS);
I receive an error message like the following one:
105083;The following columns in the user defined schema are incompatible with the external table schema for table 'SomeTable': 'SomeNiceTime' failed to be reflected with the error: 'The detected ODBC SQL_TYPE 11 is not supported for external generic tables.'
Does anyone know how this could be resolved?
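One workaround worth trying (a sketch only, not verified against ASE 16; the view name, convert style, and lengths are assumptions) is to expose the datetime through a Sybase view that casts it to a character type, then declare the external column as a matching varchar:
-- On the Sybase side: cast the datetime to a string the driver maps cleanly
CREATE VIEW dbo.SomeTableCompat AS
SELECT CONVERT(VARCHAR(30), SomeNiceTime, 109) AS SomeNiceTime
FROM dbo.SomeTable;

-- On the SQL Server side: point the external table at the view instead
CREATE EXTERNAL TABLE SYBASESCHEMA.SomeTableCompat
(
[SomeNiceTime] VARCHAR(30) NULL
)
WITH (LOCATION = N'SomeNiceDatabase.dbo.SomeTableCompat', DATA_SOURCE = SYBASE_DS);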

How to handle clob column in iics/informatica cloud

I am trying to map data from my Oracle database to a flat file. But as I have a CLOB column in my source table, my synchronization job is failing with the error "Internal error. The DTM process terminated unexpectedly. Contact Informatica Global Customer Support". If I convert the CLOB using to_char it works, but only for data of less than 4000 characters, and I have data well above this size. Please suggest.
Can you please check the data type of said column in Informatica? Have you tried using the STRING or TEXT data type?
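If changing the data type alone doesn't help, one Oracle-side workaround (a sketch; the chunk size, column, and table names are assumptions) is to split the CLOB into string-typed pieces with DBMS_LOB.SUBSTR, since TO_CHAR on a CLOB in a SQL context is capped at 4000 bytes:
-- DBMS_LOB.SUBSTR(lob, amount, offset) returns a VARCHAR2, so each chunk
-- stays inside the 4000-byte SQL limit; map the chunks as separate ports
SELECT id,
       DBMS_LOB.SUBSTR(clob_col, 4000, 1)    AS chunk_1,
       DBMS_LOB.SUBSTR(clob_col, 4000, 4001) AS chunk_2,
       DBMS_LOB.SUBSTR(clob_col, 4000, 8001) AS chunk_3
FROM src_table;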

MariaDB: SELECT INSERT from ODBC CONNECT engine from SQL Server keeps causing "error code 1406 data too long"

Objective: Using MariaDB, I want to read some data from MS SQL Server (via the ODBC CONNECT engine) and SELECT INSERT it into a local table.
Issue: I keep getting "error code 1406 data too long" even if the source and destination varchar fields have the very same size (see further details).
Details:
The query which I'm trying to execute is in the form:
INSERT INTO DEST_TABLE(NUMERO_DOCUMENTO)
SELECT SUBSTR(TRIM(NUMERO_DOCUMENTO),0,5)
FROM CONNECT_SRC_TABLE
The above is the minimal subset of fields that causes the problem.
The source CONNECT table is actually a view inside SQL Server. The destination table has been defined to be identical to the ODBC CONNECT table (same field names, same NULL constraints, same field types and sizes).
There's no issue on a couple of other VARCHAR fields.
The issue is happening with a field NUMERO_DOCUMENTO VARCHAR(14) DEFAULT NULL where the max length from the input table is 14.
The same issue is also happening with 2 other fields on the same table.
All in all, it seems to be an issue with the source data rather than the destination table.
Attempted workarounds:
I tried to force silent truncation but, reasonably, this does not make any difference: Error Code: 1406. Data too long for column - MySQL
I tried enlarging the destination field, with no appreciable effect: NUMERO_DOCUMENTO VARCHAR(100) DEFAULT NULL
I tried to TRIM the source field (hidden spaces?) and to limit its size at the source, to no avail: INSERT INTO DEST_TABLE(NUMERO_DOCUMENTO) SELECT SUBSTR(TRIM(NUMERO_DOCUMENTO),0,5) FROM CONNECT_SRC_TABLE but the very same error is always returned
Workaround:
I tried performing the same thing using a FOR x IN (src_query) DO INSERT .... END FOR and this solution seems to work: this means that the problem is not in the data itself but in how the engine performs the INSERT SELECT query.
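For reference, a minimal sketch of that row-by-row workaround (assumes MariaDB 10.3+ for FOR loops in compound statements; the table and column names are the ones from the question):
DELIMITER //
BEGIN NOT ATOMIC
  -- row-by-row inserts bypass whatever conversion the bulk
  -- INSERT ... SELECT path applies to the CONNECT column metadata
  FOR x IN (SELECT TRIM(NUMERO_DOCUMENTO) AS NUMERO_DOCUMENTO
            FROM CONNECT_SRC_TABLE) DO
    INSERT INTO DEST_TABLE (NUMERO_DOCUMENTO) VALUES (x.NUMERO_DOCUMENTO);
  END FOR;
END //
DELIMITER ;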

Get error message with date field in Informatica when the workflow is run

I am getting the following error when I try to link a date field from Source Qualifier to Target table in Informatica:
ERROR 7/19/2019 9:05:26 AM node01_dev WRITER_1_*_1 WRT_8229 Database errors occurred:
FnName: Execute -- [Informatica][ODBC SQL Server Wire Protocol driver]Timestamp parameters with zero scale must have a precision of 13, 16, or 19. Parameter number: 1, precision: 12.
FnName: Execute -- [DataDirect][ODBC lib] Function sequence error
I have done the same thing (used datetime for a target) with another workflow and it ran successfully.
I have done a search on the internet with this error message but none of the solutions from my search resolved the problem.
The target table SA_Cases needs to have the data inserted into it. Right now, the Monitor shows that all of the rows are rejected.
The source is a table in Oracle. The target is a table in Microsoft SQL Server.
Here is the mapping that works.
The SA_Cases table, which is the Target table, has fields with spaces. I replaced the spaces with underscores and it works. The problem was the spaces in the field names.
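If renaming at the database level is an option, a sketch of the SQL Server side ('Case Date' is a made-up example column, not one from the post):
-- replace the space-bearing column name with an underscore version
EXEC sp_rename 'dbo.SA_Cases.[Case Date]', 'Case_Date', 'COLUMN';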

ORACLE 11g Know Insert Record Details which Failed to insert

I have started auditing insert failures by user on any table in my Oracle 11g database. I have used the following command to do so:
AUDIT INSERT ANY TABLE BY SHENA BY ACCESS WHENEVER NOT SUCCESSFUL;
I would like to know: whenever a record insert fails, can I find out which records failed to insert into the table?
Where can we see such information? Or, if you know any other way of auditing the same, please suggest. One way I know of is to write a trigger on insert, handle the insert-failure EXCEPTION in that trigger, and save those values to some table.
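As for where to see it: with standard auditing (AUDIT_TRAIL=DB), failed statements land in the audit trail views. A sketch of the lookup (note that the trail records the statement and its error code, not the individual column values, unless extended auditing is enabled):
-- RETURNCODE holds the ORA- error number of the failed statement; 0 = success
SELECT username, obj_name, action_name, returncode, timestamp
FROM dba_audit_trail
WHERE returncode <> 0
ORDER BY timestamp DESC;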
Use the SQL*Loader utility with the following control file format.
options(skip=1,rows=65534,errors=65534,readsize=16777216,bindsize=16777216)
load data
infile 'c:\users\shena\desktop\1.txt'
-- rejected rows (datatype mismatches, constraint violations) land here
badfile 'C:\Users\shena\Desktop\test.bad'
discardfile 'C:\Users\shena\Desktop\test.dsc'
append
into table ma_basic_bd
fields terminated by '|' optionally enclosed by '"' trailing nullcols
-- the quoted "DATE" column and adjdate are converted with to_date() on the way in
(fs_perm_sec_id,
"DATE" "to_date(:DATE,'YYYY-MM-DD')",
adjdate "to_date(:adjdate,'YYYY-MM-DD')",
currency,
p_price,
p_price_open,
p_price_high,
p_price_low,
p_volume)
Use conventional path loading so that the rejected records (rejected because of datatype mismatches or business rule violations) are captured in the .bad file. Conventional path loading is the default option.
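A usage sketch for invoking it (the connect string and file names are assumptions; note that the log file is named on the command line rather than inside the control file):
sqlldr userid=shena@mydb control=c:\users\shena\desktop\test.ctl log=c:\users\shena\desktop\test.log
Rows rejected for datatype or constraint reasons end up in test.bad, and the reason for each rejection is written to the log.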
The following URL can be used for detailed knowledge: https://youtu.be/eovTBGAc2RI
There are four videos in total. Very helpful.
