Need help - Teradata TPT script failing

TPT SCRIPT:
DEFINE JOB LD_CAN_IN_TRANSIT
DESCRIPTION 'Load canada retail in transit'
(
DEFINE SCHEMA schema_Canada_Retail_Intransit_Wrk
(
VOUCHER_DATE VARCHAR(50),
VOU_NO VARCHAR(50),
PO_NO VARCHAR(50),
DOC_QTY VARCHAR(50),
DESCRIP VARCHAR(50),
UPC VARCHAR(50),
STORE_ADDR6 VARCHAR(50),
COUNTRY VARCHAR(50),
VENDOR_CODE VARCHAR(50),
LINENUMBER VARCHAR(50)
);
DEFINE OPERATOR DDL_OPERATOR
TYPE DDL
ATTRIBUTES
(
VARCHAR PrivateLogName = 'ddl_log',
VARCHAR TdpId = '******',
VARCHAR LogonMech = 'LDAP',
VARCHAR UserName = '*************',
VARCHAR UserPassword = '*************',
VARCHAR ErrorList = '3807'
);
DEFINE OPERATOR dml_canada_retail_int
TYPE UPDATE
SCHEMA *
ATTRIBUTES
(
VARCHAR LogonMech = 'LDAP',
VARCHAR TdpId = '*******',
VARCHAR UserName = '************',
VARCHAR UserPassword = '********',
VARCHAR TargetTable = 'ODM_SCD_STG_T.abc',
VARCHAR LogTable = 'EIS_AUX_T.log_table',
VARCHAR ErrorTable1 = 'EIS_AUX_T.err_ET',
VARCHAR ErrorTable2 = 'EIS_AUX_T.err_RL',
VARCHAR DeleteTask = 'Y'
);
DEFINE OPERATOR prod_can_ret_int
TYPE DATACONNECTOR PRODUCER
SCHEMA schema_Canada_Retail_Intransit_Wrk
ATTRIBUTES
(
VARCHAR DirectoryPath= '<path_to_file>',
VARCHAR FileName = #data_file,
VARCHAR Format = 'Delimited',
VARCHAR OpenMode = 'Read',
VARCHAR TextDelimiter =',',
INTEGER SkipRows = 1
);
DEFINE OPERATOR load_can_ret_int
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
VARCHAR LogonMech = 'LDAP',
VARCHAR TdpId = '********',
VARCHAR UserName = '*****************',
VARCHAR UserPassword = '***************',
VARCHAR TargetTable = 'ODM_SCD_STG_T.abc',
VARCHAR LogTable = 'EIS_AUX_T.log_table',
VARCHAR ErrorTable1 = 'EIS_AUX_T.err_ET',
VARCHAR ErrorTable2 = 'EIS_AUX_T.err_RL'
);
STEP Setup_Tables
(
APPLY
('DROP TABLE EIS_AUX_T.log_table;'),
('DROP TABLE EIS_AUX_T.err_ET;'),
('DROP TABLE EIS_AUX_T.err_RL;')
TO OPERATOR (DDL_OPERATOR);
);
STEP stSetup_Tables
(
APPLY
('DELETE ODM_SCD_STG_T.abc;')
TO OPERATOR (dml_canada_retail_int);
);
STEP stLOAD_CAN_RET_INT
(
APPLY
('<insert statement>')
TO OPERATOR (load_can_ret_int)
SELECT * FROM OPERATOR(prod_can_ret_int);
);
);
ERROR:
Teradata Parallel Transporter Version 16.20.00.14 64-Bit
The global configuration file '/opt/teradata/client/16.20/tbuild/twbcfg.ini' is used.
Log Directory: /opt/teradata/client/16.20/tbuild/logs
Checkpoint Directory: /opt/teradata/client/16.20/tbuild/checkpoint
TPT_INFRA: TPT03624: Warning: tbuild -s option argument specifies the first job step;
no job steps will be skipped (unless this is a restarted job).
Job log: /opt/teradata/client/16.20/tbuild/logs/ec2-user-207.out
Job id is ec2-user-207, running on ip-10-179-114-26.us-west-2.compute.internal
Teradata Parallel Transporter SQL DDL Operator Version 16.20.00.14
DDL_OPERATOR: private log specified: ddl_log
DDL_OPERATOR: connecting sessions
DDL_OPERATOR: sending SQL requests
DDL_OPERATOR: TPT10508: RDBMS error 3807: Object 'EIS_AUX_T.err_ET' does not exist.
DDL_OPERATOR: TPT18046: Error is ignored as requested in ErrorList
DDL_OPERATOR: TPT10508: RDBMS error 3807: Object 'EIS_AUX_T.err_RL' does not exist.
DDL_OPERATOR: TPT18046: Error is ignored as requested in ErrorList
DDL_OPERATOR: disconnecting sessions
DDL_OPERATOR: Total processor time used = '0.012241 Second(s)'
DDL_OPERATOR: Start : Thu Jan 9 20:22:28 2020
DDL_OPERATOR: End : Thu Jan 9 20:22:28 2020
Job step Setup_Tables completed successfully
Teradata Parallel Transporter Update Operator Version 16.20.00.14
dml_canada_retail_int: private log not specified
dml_canada_retail_int: connecting sessions
dml_canada_retail_int: preparing target table(s)
**dml_canada_retail_int: TPT10508: RDBMS error 3524: The user does not have CREATE TABLE access to database ODM_SCD_STG_T.**
dml_canada_retail_int: disconnecting sessions
dml_canada_retail_int: Performance metrics:
dml_canada_retail_int: MB/sec in Acquisition phase: 0
dml_canada_retail_int: Elapsed time from start to Acquisition phase: 2 second(s)
dml_canada_retail_int: Elapsed time in Acquisition phase: 0 second
dml_canada_retail_int: Elapsed time in Application phase: 0 second
dml_canada_retail_int: Elapsed time from Application phase to end: < 1 second
dml_canada_retail_int: Total processor time used = '0.0397 Second(s)'
dml_canada_retail_int: Start : Thu Jan 9 20:22:28 2020
dml_canada_retail_int: End : Thu Jan 9 20:22:30 2020
Job step stSetup_Tables terminated (status 12)
Job ec2-user terminated (status 12)
Job start: Thu Jan 9 20:22:28 2020
Job end: Thu Jan 9 20:22:30 2020
Question:
The goal is to create the error tables in the EIS_AUX_T database, and the user I'm running as has CREATE TABLE access to it. What I don't understand is why I need CREATE TABLE access to the database that holds the target table (ODM_SCD_STG_T in this case). The process is failing at the Load operator step.
1. The user has CREATE TABLE access in the EIS_AUX_T database.
2. The user does NOT have CREATE TABLE access in the ODM_SCD_STG_T database.
3. The user has DML privileges in the ODM_SCD_STG_T database.
4. The error and log tables should be created in the EIS_AUX_T database.

The UPDATE operator uses a work table in addition to the log and error tables; since you didn't specify the WorkTable or WorkingDatabase attributes, TPT tries to create the work table in the target table's database, which is why it needs CREATE TABLE rights on ODM_SCD_STG_T.
Also, using the UPDATE operator's DeleteTask just to truncate a table is extra overhead. Use the DDL operator for that instead.
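As a sketch, one way to resolve the privilege error is to point the work table at the database where the user does have CREATE TABLE rights. WorkTable is a standard Update operator attribute; the work-table name below is a placeholder, and the TdpId/credential attributes are elided as in the original:

```
DEFINE OPERATOR dml_canada_retail_int
TYPE UPDATE
SCHEMA *
ATTRIBUTES
(
/* LogonMech, TdpId, UserName, UserPassword as before */
VARCHAR TargetTable = 'ODM_SCD_STG_T.abc',
VARCHAR LogTable = 'EIS_AUX_T.log_table',
VARCHAR ErrorTable1 = 'EIS_AUX_T.err_ET',
VARCHAR ErrorTable2 = 'EIS_AUX_T.err_RL',
/* hypothetical name; directs the work table into EIS_AUX_T */
VARCHAR WorkTable = 'EIS_AUX_T.abc_WT',
VARCHAR DeleteTask = 'Y'
);
```

Alternatively, a single WorkingDatabase attribute can redirect all generated tables at once; per the answer above, though, replacing the DeleteTask step with a DELETE through the DDL operator avoids the work table entirely.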

Related

Data load with Teradata TPT failing

Here I'm trying to load a CSV file into Teradata tables using the TPT utility, but it is failing with an error.
Here is my TPT script:
DEFINE JOB test_tpt
DESCRIPTION 'Load a Teradata table from a file'
(
DEFINE SCHEMA SCHEMA_EMP_NAME
(
NAME VARCHAR(50),
AGE VARCHAR(50)
);
DEFINE OPERATOR od_EMP_NAME
TYPE DDL
ATTRIBUTES
(
VARCHAR PrivateLogName = 'tpt_log',
VARCHAR LogonMech = 'LDAP',
VARCHAR TdpId = 'TeraDev',
VARCHAR UserName = 'user',
VARCHAR UserPassword = 'pwd',
VARCHAR ErrorList = '3807'
);
DEFINE OPERATOR op_EMP_NAME
TYPE DATACONNECTOR PRODUCER
SCHEMA SCHEMA_EMP_NAME
ATTRIBUTES
(
VARCHAR DirectoryPath= '/home/hadoop/retail/',
VARCHAR FileName = 'emp_age.csv',
VARCHAR Format = 'Delimited',
VARCHAR OpenMode = 'Read',
VARCHAR TextDelimiter =','
);
DEFINE OPERATOR ol_EMP_NAME
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
VARCHAR LogonMech = 'LDAP',
VARCHAR TdpId = 'TeraDev',
VARCHAR UserName = 'user',
VARCHAR UserPassword = 'pwd',
VARCHAR LogTable = 'EMP_NAME_LG',
VARCHAR ErrorTable1 = 'EMP_NAME_ET',
VARCHAR ErrorTable2 = 'EMP_NAME_UV',
VARCHAR TargetTable = 'EMP_NAME'
);
STEP stSetup_Tables
(
APPLY
('DROP TABLE EMP_NAME_LG;'),
('DROP TABLE EMP_NAME_ET;'),
('DROP TABLE EMP_NAME_UV;'),
('DROP TABLE EMP_NAME;'),
('CREATE TABLE EMP_NAME(NAME VARCHAR(50), AGE VARCHAR(2));')
TO OPERATOR (od_EMP_NAME);
);
STEP stLOAD_FILE_NAME
(
APPLY
('INSERT INTO EMP_NAME
(Name,Age)
VALUES
(:Name,:Age);
')
TO OPERATOR (ol_EMP_NAME)
SELECT * FROM OPERATOR(op_EMP_NAME);
);
);
Call TPT:
tbuild -f test_tpt.sql
The above TPT script fails with the following error:
Teradata Parallel Transporter Version 15.10.01.02 64-Bit
TPT_INFRA: Syntax error at or near line 6 of Job Script File 'test_tpt.sql':
TPT_INFRA: At "NAME" missing RPAREN_ in Rule: Explicit Schema Element List
TPT_INFRA: Syntax error at or near line 8 of Job Script File 'test_tpt.sql':
TPT_INFRA: TPT03020: Rule: DEFINE SCHEMA
Compilation failed due to errors. Execution Plan was not generated.
Job script compilation failed .
Am I missing any detail here?
The messages certainly could be clearer, but the issue is that NAME is a restricted word in TPT scripts.
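One fix, as a sketch assuming the rest of the script stays the same, is to rename the schema columns away from the restricted word. The INSERT's :placeholders must match the schema column names, while the target table's own column names are unaffected:

```
DEFINE SCHEMA SCHEMA_EMP_NAME
(
/* renamed from NAME, a restricted word in TPT scripts */
EMP_NAME VARCHAR(50),
EMP_AGE VARCHAR(50)
);

/* ...and in the load step the placeholders follow the schema names: */
APPLY
('INSERT INTO EMP_NAME (Name, Age) VALUES (:EMP_NAME, :EMP_AGE);')
TO OPERATOR (ol_EMP_NAME)
SELECT * FROM OPERATOR(op_EMP_NAME);
```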

Kafka JDBC source connector time stamp mode failing for sqlite3

I tried to set up a database with two tables in SQLite. One of my tables has a timestamp column. I am trying to use timestamp mode to capture incremental changes in the DB. Kafka Connect is failing with the below error:
ERROR Failed to get current time from DB using Sqlite and query 'SELECT
CURRENT_TIMESTAMP'
(io.confluent.connect.jdbc.dialect.SqliteDatabaseDialect:471)
java.sql.SQLException: Error parsing time stamp
Caused by: java.text.ParseException: Unparseable date: "2019-02-05 02:05:29"
does not match (\p{Nd}++)\Q-\E(\p{Nd}++)\Q-\E(\p{Nd}++)\Q
\E(\p{Nd}++)\Q:\E(\p{Nd}++)\Q:\E(\p{Nd}++)\Q.\E(\p{Nd}++)
Many thanks for the help
Config:
name=test-query-sqlite-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:sqlite:employee.db
query=SELECT users.id, users.name, transactions.timestamp, transactions.payment_type FROM users JOIN transactions ON (users.id = transactions.user_id)
mode=timestamp
timestamp.column.name=timestamp
topic.prefix=test-joined
DDL:
CREATE TABLE transactions(id integer primary key not null,
payment_type text not null,
timestamp DATETIME DEFAULT(STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')),
user_id int not null,
constraint fk foreign key(user_id) references users(id)
);
CREATE TABLE users (id integer primary key not null,name text not null);
The Kafka Connect JDBC connector detects changes in the 'timestamp' column without trouble if the values are in UNIX-timestamp format.
sqlite> CREATE TABLE transact(timestamp TIMESTAMP DEFAULT (STRFTIME('%s', 'now')) not null,
...> id integer primary key not null,
...> payment_type text not null);
sqlite>
The values can be inserted as:
sqlite> INSERT INTO transact(timestamp,payment_type,id) VALUES (STRFTIME('%s', 'now'),'cash',1);
The timestamp changes are then detected by the Kafka JDBC source connector, and they can be consumed as follows:
kafka-console-consumer --bootstrap-server localhost:9092 --topic jdbc-transact --from-beginning
{"timestamp":1562321516,"id":2,"payment_type":"card"}
{"timestamp":1562321790,"id":1,"payment_type":"online"}
I've reproduced this, and it is already logged as an issue for the JDBC Source connector. You can monitor it here: https://github.com/confluentinc/kafka-connect-jdbc/issues/219

Executing SQL script in server ERROR: Error 1064

I'm trying to create a database from the E/R diagram I just created using MySQL Workbench. Can someone please help?
Executing SQL script in server
ERROR: Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near
CONSTRAINT `IDProveedor`
FOREIGN KEY (`IDproveedor`)
REFERENCES `Repu
at line 11
SQL Code:
-- -----------------------------------------------------
-- Table `Repuestos`.`Articulo`
-- -----------------------------------------------------
CREATE TABLE IF NOT EXISTS `Repuestos`.`Articulo` (
`IDArticulo` INT NOT NULL AUTO_INCREMENT,
`Nombre` VARCHAR(45) NOT NULL,
`IDproveedor` INT NOT NULL,
`Articulocol` VARCHAR(45) NOT NULL,
`Valor_Unitario` INT NOT NULL,
PRIMARY KEY (`IDArticulo`),
INDEX `NitProveedor_idx` (`IDproveedor` ASC) VISIBLE,
CONSTRAINT `IDProveedor`
FOREIGN KEY (`IDproveedor`)
REFERENCES `Repuestos`.`Proveedor` (`IDProveedor`)
ON DELETE NO ACTION
ON UPDATE NO ACTION)
ENGINE = InnoDB
SQL script execution finished: statements: 6 succeeded, 1 failed
Fetching back view definitions in final form.
Nothing to fetch.

How can I use IF statements in Teradata without using BTEQ

I'm trying to create some deployment tools and I don't want to use BTEQ. I've been trying to work with the Teradata.Client.Provider in PowerShell but I'm getting syntax errors on the creation of a table.
[Teradata Database] [3706] Syntax error: expected something between
';' and the 'IF' keyword.
SELECT * FROM DBC.TablesV WHERE DatabaseName = DATABASE AND TableName = 'MyTable';
IF ACTIVITYCOUNT > 0 THEN GOTO EndStep1;
CREATE MULTISET TABLE MyTable ,
NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
MyColId INTEGER GENERATED ALWAYS AS IDENTITY
(START WITH 1
INCREMENT BY 1
MINVALUE 0
MAXVALUE 2147483647
NO CYCLE)
NOT NULL,
MyColType VARCHAR(50) NULL,
MyColTarget VARCHAR(128) NULL,
MyColScriptName VARCHAR(256) NULL,
MyColOutput VARCHAR(64000) NULL,
isMyColException BYTEINT(1) NULL,
ExceptionOutput VARCHAR(64000) NULL,
MyColBuild VARCHAR(128) NULL,
MyColDate TIMESTAMP NOT NULL
)
PRIMARY INDEX PI_MyTable_MyColLogId(MyColLogId);
LABEL EndStep1;
I would rather not use BTEQ, as it has not worked well in other deployment tools we have created and requires a few hacks. Is there anything I can use that would avoid that tool?
What parse error?
The CREATE will also fail due to the double INTEGER in MyColId and the VARCHAR(max) in ExceptionOutput; that is an unknown datatype in Teradata.
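If the point is just conditional DDL without BTEQ, one alternative, sketched here with placeholder database and procedure names, is to move the IF logic into a Teradata stored procedure and issue a single CALL from PowerShell. The existence check mirrors the DBC.TablesV query above, and the DDL is run through DBC.SysExecSQL:

```
REPLACE PROCEDURE MyDb.EnsureMyTable()
BEGIN
  DECLARE tbl_count INTEGER;

  /* same existence check as the BTEQ version, but into a variable */
  SELECT COUNT(*) INTO :tbl_count
  FROM DBC.TablesV
  WHERE DatabaseName = DATABASE AND TableName = 'MyTable';

  IF tbl_count = 0 THEN
    /* dynamic SQL, so the CREATE only runs when the table is absent */
    CALL DBC.SysExecSQL('CREATE MULTISET TABLE MyDb.MyTable (
      MyColId INTEGER GENERATED ALWAYS AS IDENTITY NOT NULL,
      MyColDate TIMESTAMP NOT NULL
    ) PRIMARY INDEX (MyColId);');
  END IF;
END;
```

From the client it is then just `CALL MyDb.EnsureMyTable();`, which Teradata.Client.Provider can execute without any LABEL/GOTO logic.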

SQL Error: No more data to read from socket while inserting data in table

I have created a table in Oracle XE:
create table tbl_unit_mst
(
id number(10,0) constraint id_pk primary key,
unit_code char(2) not null constraint unit_code_uk unique,
unit_name varchar2(30) not null constraint unit_name_uk unique,
crtd_date date default sysdate,
is_active number(1,0) default 1 constraint is_active_ck check(is_active in (0,1)),
crtd_by varchar2(6)
);
and then created a sequence:
create sequence seq_tbl_unit
start with 1
increment by 1
nocache
nocycle;
Then I created a trigger:
create trigger trig_id_increment
before insert
on tbl_unit_mst for each row
begin
select seq_tbl_unit.nextval into : new.id from dual;
end;
Now when I try to run an insert statement
insert into tbl_unit_mst (unit_code, unit_name) values ('01','Ajbapur');
it gives the error SQL Error: No more data to read from socket.
If I disable the trigger then it works fine.
Can anyone help me find out where I am making a mistake?
