How to handle a CLOB column in IICS/Informatica Cloud

I am trying to map data from my Oracle database to a flat file, but because my source table has a CLOB column, my synchronization job fails with the error "Internal error. The DTM process terminated unexpectedly. Contact Informatica Global Customer Support". If I convert the CLOB using TO_CHAR it works, but only for data of fewer than 4000 characters, and my data is a lot larger than that. Please suggest.

Can you please check the data type of the said column in Informatica? Have you tried using the STRING or TEXT data type?
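If the task still cannot handle the CLOB directly, one workaround on the Oracle side is to split the CLOB into VARCHAR2-sized chunks with DBMS_LOB.SUBSTR in the source query and map each chunk to its own string field, reassembling them in the target if needed. A minimal sketch, assuming a hypothetical table big_text_table with an id column and a clob_col (extend the pattern with more chunks as your data requires):

-- Split a CLOB into 4000-character chunks that fit ordinary string ports
SELECT id,
       DBMS_LOB.SUBSTR(clob_col, 4000, 1)    AS chunk_1,
       DBMS_LOB.SUBSTR(clob_col, 4000, 4001) AS chunk_2,
       DBMS_LOB.SUBSTR(clob_col, 4000, 8001) AS chunk_3
FROM   big_text_table;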

Related

How to interpret or decode the Hostdata column from a Teradata error table

The use case: there is an Informatica Cloud mapping which loads from SQL Server to a Teradata database. If there are any failures at run time, the mapping writes all the failed rows to an error table in the Teradata database. The key column in this error table, as I understand it, is HOSTDATA. I am trying to decode the HOSTDATA column so that if a similar ETL failure happens in production, it will help identify the root cause much more quickly. By default HOSTDATA is a column of type VARBYTE.
To decode the HOSTDATA column, I converted it to ASCII and to base-16 (hex) format. Neither was of any use.
I then tried the approach below from the Teradata forum: extracting the data from the error table using a BTEQ script. The data is exported into a .err file and loaded back into the Teradata database using a FastLoad script. FastLoad is unable to load the data because there is no specific delimiter, and the data in the .err file looks like gibberish. (Snapshot of the .err file data omitted.)
My end goal is to interpret the Hostdata column in a more human-readable way. Any suggestions in this direction are also welcome.
The Error Table Extractor command, twbertbl, which is part of the "Teradata Parallel Transporter Base" software, is designed to extract and format HOSTDATA from the error table's VARBYTE column.
Based on the screenshot in your question, I suspect you will need to specify FORMATTED as the record format option for twbertbl (default is DELIMITED).

Oracle 11g: know the details of records which failed to insert

I have started auditing insert failures, per user, on any table in my Oracle 11g database. I used the following command to do so:
AUDIT INSERT ANY TABLE BY SHENA BY ACCESS WHENEVER NOT SUCCESSFUL;
I would like to know: whenever a record insert fails, can I find out which records failed to insert into the table?
Where can we see such information? If you know any other way of auditing this, please suggest it. One way I know of is to write a trigger on insert, handle the insert-failure EXCEPTION in that trigger, and save those values to some table.
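For what it's worth, with the audit option above enabled, the failed attempts can typically be seen in DBA_AUDIT_TRAIL (the rejected column values themselves are only available via SQL_TEXT when AUDIT_TRAIL is set to DB,EXTENDED):

-- Failed INSERT attempts recorded by standard auditing
SELECT username, obj_name, returncode, timestamp, sql_text
FROM   dba_audit_trail
WHERE  action_name = 'INSERT'
AND    returncode <> 0;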
Use the SQL*Loader utility with the following control file format.
options(skip=1,rows=65534,errors=65534,readsize=16777216,bindsize=16777216)
load data
infile 'c:\users\shena\desktop\1.txt'
badfile 'C:\Users\shena\Desktop\test.bad'
discardfile 'C:\Users\shena\Desktop\test.dsc'
-- the log file is given on the sqlldr command line (log='C:\Users\shena\Desktop\test.log'), not in the control file
append
into table ma_basic_bd
fields terminated by '|' optionally enclosed by '"' trailing nullcols
(fs_perm_sec_id,
"DATE" "to_date(:DATE,'YYYY-MM-DD')",
adjdate "to_date(:adjdate,'YYYY-MM-DD')",
currency,
p_price,
p_price_open,
p_price_high,
p_price_low,
p_volume)
Use conventional path loading so that the rejected records (rejected because of datatype mismatches or business rule violations) end up in the .bad file. Conventional path loading is the default option.
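For reference, a conventional-path run with this control file might be invoked like this (the control file name, credentials, and connect string are placeholders):

sqlldr userid=shena/password@orcl control='c:\users\shena\desktop\test.ctl' log='c:\users\shena\desktop\test.log'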
The following URL can be used for more detailed knowledge:
https://youtu.be/eovTBGAc2RI
There are four videos in total; they are very helpful.

Insert a WCF-SAP response message into a SQL column of XML datatype

I came across a scenario where I need to insert the WCF-SAP response into a SQL staging table column of XML datatype.
Is it possible to use a BizTalk map, since I have other columns in the staging table whose values need to be populated?
This was solved by generating the schema in an assignment shape and inserting the XML into the SQL column.
I faced an issue where the parent node accepts only text while mine was XML; that was also fixed when I wrapped the XML in a CDATA section.
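For reference, a minimal sketch of such a staging table and insert on the SQL Server side, with hypothetical table and column names:

-- Hypothetical staging table with an XML column for the SAP response
CREATE TABLE dbo.SapResponseStaging (
    ResponseId INT IDENTITY(1,1) PRIMARY KEY,
    ReceivedAt DATETIME2 DEFAULT SYSUTCDATETIME(),
    Payload    XML
);

-- SQL Server converts a well-formed string to XML on insert
INSERT INTO dbo.SapResponseStaging (Payload)
VALUES ('<Response><Status>OK</Status></Response>');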

Which data type is used to store big statements?

I created an online exam web application in ASP.NET, but when I update a question with more than 250 characters it shows the error "String or binary data would be truncated. The statement has been terminated."
What should I do to store this data?
This error occurs because the field is not big enough to hold your data. If you are using SQL Server, just change the type of that field in the table (e.g. question or answer) to:
NVARCHAR(MAX)
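For example, assuming a hypothetical Questions table with a QuestionText column:

-- Widen the column so text over the old limit is no longer truncated
ALTER TABLE Questions ALTER COLUMN QuestionText NVARCHAR(MAX);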

Why does Teradata throw "Too many data records packed in one USING row" only sometimes?

I am using JDBC to upload data to Teradata. I previously used batches of 100,000 rows and it always worked fine; no dataset ever failed to upload.
Now, when I try to upload a one-column table (all integers), I get "Too many data records packed in one USING row". When I changed the batch size to 16,383 it worked.
I found that I can still use 100,000-row batches for tables with multiple columns, but when I try to upload a table with a single column, it throws "Too many data records packed in one USING row". I just can't understand why. Intuitively, a single-column table should be easier to upload, right? What is going on here?
16,383 is the limit on the number of rows in a PreparedStatement batch for a non-FastLoad INSERT with the Teradata JDBC driver. With wider multi-column rows, the request message presumably fills up and is split before that record count is ever reached, which would explain why only the narrow single-column table hits the limit.
Have you considered adding TYPE=FASTLOAD to your connection parameters, allowing the Teradata JDBC driver to invoke the FastLoad API to bulk-load your data for INSERT statements that FastLoad supports? The JDBC FastLoad mechanism is suggested for inserts of 100K records or more. The big factor here is that your target table in Teradata must be empty.
If it isn't empty, you may be able to FastLoad into an empty stage table and then use the ANSI MERGE operator to perform an UPSERT of the stage data into the target table.
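Assuming the standard Teradata JDBC URL format, enabling FastLoad is just a connection-string change (host and database name are placeholders):

jdbc:teradata://your-teradata-host/DATABASE=your_db,TYPE=FASTLOAD

And a minimal sketch of the stage-and-merge upsert, with hypothetical two-column stage and target tables (Teradata's MERGE requires the ON clause to match the target's primary index, assumed here to be id):

MERGE INTO target_table tgt
USING stage_table stg
  ON (tgt.id = stg.id)
WHEN MATCHED THEN
  UPDATE SET val = stg.val
WHEN NOT MATCHED THEN
  INSERT (id, val) VALUES (stg.id, stg.val);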
