I am trying an incremental import through Sqoop from Teradata to Hadoop, but it is not working in my case.
From the error it appears that Sqoop is internally generating SQL that is syntactically incorrect. I even tried the --verbose option, but it produced no useful information.
Here is the table schema in Teradata that I am importing to Hadoop:
CREATE TABLE Employee (
  EmpNo INT NOT NULL,
  EmpName CHAR(30),
  DOB DATE,
  Mob INTEGER,
  LastUpdated TIMESTAMP
);
Here is the import command:
sqoop import \
  --connect jdbc:teradata://XXXXXXXX/Database=XXXXX \
  --driver com.teradata.jdbc.TeraDriver \
  --username XXXXX --password XXXXXX \
  --table Employee \
  --target-dir /user/hive/incremental_emp_table \
  -m 1 \
  --check-column LastUpdated \
  --incremental lastmodified \
  --last-value "2001-12-17 07:36:01.280000"
and I get the following:
Warning: /usr/share/hadoop_echosystem/sqoop-1.4.5//../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/share/hadoop_echosystem/sqoop-1.4.5//../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
Note: /tmp/sqoop-cloud/compile/917cdf768aea5267d838a949502ed0d0/Employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/share/hadoop_echosystem/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/hadoop_echosystem/hbase-0.96.1-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/hadoop_echosystem/apache-hive-1.0.0-bin/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/06/16 14:20:16 ERROR manager.SqlManager: SQL exception accessing current timestamp: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database] [TeraJDBC 14.10.00.26] [Error 3706] [SQLState 42000] Syntax error: expected something between '(' and ')'.
com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database] [TeraJDBC 14.10.00.26] [Error 3706] [SQLState 42000] Syntax error: expected something between '(' and ')'.
at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:307)
at com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:109)
at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:314)
at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:202)
at com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:123)
at com.teradata.jdbc.jdbc_4.statemachine.StatementController.run(StatementController.java:114)
at com.teradata.jdbc.jdbc_4.TDStatement.executeStatement(TDStatement.java:384)
at com.teradata.jdbc.jdbc_4.TDStatement.executeStatement(TDStatement.java:326)
at com.teradata.jdbc.jdbc_4.TDStatement.doNonPrepExecuteQuery(TDStatement.java:314)
at com.teradata.jdbc.jdbc_4.TDStatement.executeQuery(TDStatement.java:1091)
at org.apache.sqoop.manager.SqlManager.getCurrentDbTimestamp(SqlManager.java:960)
at org.apache.sqoop.tool.ImportTool.initIncrementalConstraints(ImportTool.java:328)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
15/06/16 14:20:16 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not get current time from database
at org.apache.sqoop.tool.ImportTool.initIncrementalConstraints(ImportTool.java:330)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
I have walked through the implementation of org.apache.sqoop.manager.SqlManager.getCurrentDbTimestamp(), which builds its query from:
protected String getCurTimestampQuery() {
    return "SELECT CURRENT_TIMESTAMP()";
}
SqlManager uses "SELECT CURRENT_TIMESTAMP()" to get the current timestamp, which is syntactically incorrect for Teradata.
For Teradata, it should be "SELECT CURRENT_TIMESTAMP" (without the parentheses).
Please help me resolve this issue.
It's a bug: the current-timestamp query needs to be DB-specific.
I have raised a JIRA for it:
https://issues.apache.org/jira/browse/SQOOP-2402
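Until that is fixed in Sqoop itself, one possible workaround is to plug in a custom connection manager that overrides getCurTimestampQuery(). The sketch below is untested; the package and class name are made up, and only SqlManager, GenericJdbcManager, and the overridden method come from the Sqoop source shown above.

package com.example.sqoop; // hypothetical package

import com.cloudera.sqoop.SqoopOptions;
import org.apache.sqoop.manager.GenericJdbcManager;

// Generic JDBC manager that issues a Teradata-friendly current-timestamp
// query instead of the default "SELECT CURRENT_TIMESTAMP()".
public class TeradataTimestampManager extends GenericJdbcManager {

    // Constructor mirrors GenericJdbcManager's (driver class + options).
    public TeradataTimestampManager(final String driverClass, final SqoopOptions opts) {
        super(driverClass, opts);
    }

    @Override
    protected String getCurTimestampQuery() {
        // Teradata accepts CURRENT_TIMESTAMP without parentheses.
        return "SELECT CURRENT_TIMESTAMP";
    }
}

The jar would go on Sqoop's classpath and the class would be named via the standard --connection-manager option; I have not verified exactly which constructor Sqoop uses when instantiating a custom manager, so the wiring may need adjusting.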
Related
When Exchange imports Hive data, I get the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Table or view not found
Check whether the -h parameter was omitted from the command that submits the NebulaGraph Exchange task, confirm that the table and database names are correct, and run the user-configured exec statement in spark-sql to verify that the statement itself is valid.
I'm trying to migrate a Snowflake schema with Flyway, using the following command:
flyway migrate (with the URL and other required parameters)
I got the error below:
Flyway Community Edition 6.3.0 by Redgate
Database: jdbc:snowflake://.snowflakecomputing.com:443/ (Snowflake 4.8)
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by net.snowflake.client.jdbc.internal.io.netty.util.internal.ReflectionUtil (file:/C:/flyway-6.3.0/drivers/snowflake-jdbc-3.12.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of net.snowflake.client.jdbc.internal.io.netty.util.internal.ReflectionUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
ERROR:
Unable to check whether table "PUBLIC"."flyway_schema_history" exists
SQL State : 02000
Error Code : 2043
Message : SQL compilation error:
Object does not exist, or operation cannot be performed.
'warehouse' is not recognized as an internal or external command,
operable program or batch file.
'role' is not recognized as an internal or external command,
operable program or batch file.
I don't know why I am getting this error, because I'm passing all the parameters correctly.
From the error, it looks like Flyway can connect but can't find "PUBLIC"."flyway_schema_history".
When an object is double-quoted like that, Snowflake treats the name as case-sensitive.
You might want to:
Try connecting to Snowflake directly via your regular web login and see whether select top 1 * from <database>."PUBLIC"."flyway_schema_history" works. Also check that you are passing in a database parameter, since I can't see that in your query (see the sketch after these suggestions).
See whether select top 1 * from <database>.public.flyway_schema_history (without the double quotes) also works, both in the Snowflake web UI and through Flyway, so that you can rule out case sensitivity.
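If it helps to see where the database parameter goes, here is a rough sketch using the Flyway Java API that is equivalent to what the command line needs. The account, database, warehouse, role, and credentials are placeholders; db, schema, warehouse, and role are standard Snowflake JDBC URL parameters.

import org.flywaydb.core.Flyway;

public class SnowflakeMigrate {
    public static void main(String[] args) {
        // Placeholders in angle brackets must be replaced with real values.
        Flyway flyway = Flyway.configure()
                .dataSource(
                        "jdbc:snowflake://<account>.snowflakecomputing.com:443/"
                                + "?db=<database>&schema=PUBLIC&warehouse=<warehouse>&role=<role>",
                        "<user>",
                        "<password>")
                .schemas("PUBLIC")
                .load();
        flyway.migrate();
    }
}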
I am trying to create migration schemas for a CorDapp as per the instructions here. I am running the following command:
java -jar corda-tools-database-manager-3.1.jar
--base-directory /opt/User
--create-migration-sql-for-cordapp fnolUseCase.state.FNOLSchema
However, I am getting the following error:
-- 2018-08-22T13:29:23,145Z migration.tool.invoke - Creating database migration
files for schema: fnolUseCase.state.FNOLSchema into /opt/User/migration
Failed to create datasource.
Please check that the correct JDBC driver is installed in one of the following
folders:
- /opt/User/drivers
Caused By java.lang.ClassCastException: fnolUseCase.state.FNOLSchema cannot be cast
to net.corda.core.schemas.MappedSchema
What should I be doing differently?
It seems the tool is having trouble locating your fnolUseCase.state.FNOLSchema class. Try dropping the schema name from the end of your command; this will cause a migration schema to be created for every schema in your application:
java -jar corda-tools-database-manager-3.1.jar
--base-directory /opt/User
--create-migration-sql-for-cordapp
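For reference, the ClassCastException in your output means the database manager expects the class passed to --create-migration-sql-for-cordapp to extend net.corda.core.schemas.MappedSchema. Below is a minimal sketch of such a class; the entity name, table name, and version are made up for illustration and are not taken from your CorDapp.

package fnolUseCase.state;

import net.corda.core.schemas.MappedSchema;
import net.corda.core.schemas.PersistentState;

import javax.persistence.Entity;
import javax.persistence.Table;
import java.util.Arrays;

// The class named on the command line must itself be a MappedSchema.
public class FNOLSchema extends MappedSchema {

    public FNOLSchema() {
        // schema family, schema version, and the JPA entity classes it maps
        super(FNOLSchema.class, 1, Arrays.asList(PersistentFNOL.class));
    }

    @Entity
    @Table(name = "fnol_states")
    public static class PersistentFNOL extends PersistentState {
        // columns for the state's queryable fields would go here
    }
}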
I am trying to refresh a materialized view using Flyway, but the error below keeps coming up.
Help... is this supported or not?
Here is the SQL:
ALTER MATERIALIZED VIEW TEST.TBL_M_V REFRESH COMPLETE ON DEMAND;
EXECUTE DBMS_MVIEW.REFRESH('TEST.TBL_M_V','C');
ALTER MATERIALIZED VIEW TEST.TBL_M_V NEVER REFRESH;
And here is the error:
ERROR: Migration of schema "TEST" to version 4.1 failed! Please restore backups and roll back database and code!
ERROR:
Migration V4_1__MViewRefresh_Test.sql failed
--------------------------------------------------
SQL State : 42000
Error Code : 900
Message : ORA-00900: invalid SQL statement
Location : C:/dev/flyway-3.1/sql/V4_1__MViewRefresh_Test.sql
Line : 8
Statement : EXECUTE DBMS_MVIEW.REFRESH('TEST.TBL_M_V','C')
As the error shows, this is not really a Flyway issue: the Oracle JDBC driver rejected your statement. EXECUTE is a SQL*Plus client command, not SQL, so it cannot be sent through JDBC; use an anonymous PL/SQL block instead.
This is the correct syntax you should use:
BEGIN
  DBMS_MVIEW.REFRESH('TEST.TBL_M_V','C');
END;
I'm trying to run a mixture of SQL and Java migrations via Maven, based on the example from Axel Fontaine here: http://www.methodsandtools.com/tools/flyway.php
Basically I am trying to execute several SQL migrations, followed by a Java migration (to load BLOBs into a table), followed by another SQL migration.
The first set of SQL migrations runs fine. If I give the Java migration a .java file extension, it gets ignored. If I give it a .sql extension, it runs in the correct sequence, but I get the following error:
[ERROR] com.googlecode.flyway.core.api.FlywayException: Error executing statement at line 1: package db.migration
[ERROR] Caused by org.postgresql.util.PSQLException: ERROR: syntax error at or near "package" Position: 1
[ERROR] com.googlecode.flyway.core.api.FlywayException: Migration of schema "test" to version 1.0.0106 failed! Changes successfully rolled back.
Here is the head of my Java migration file:
package db.migration;
import com.googlecode.flyway.core.api.migration.jdbc.JdbcMigration;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.io.File;
Any ideas as to what I'm doing wrong?
Okay, I finally figured out what was going on. While Flyway allows version numbers that contain "." in the name (e.g. V1.0.0000_filename), apparently that is not supported for Java migration class names. I changed the class name to use "_" instead of "." (V1_0_1000_filename), and that allowed me to get past the original error.
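In case it helps anyone else, here is a minimal sketch of a Java migration using that underscore naming. The class name, table, and statement are illustrative only; the JdbcMigration interface and imports are the same ones shown in the question.

package db.migration;

import com.googlecode.flyway.core.api.migration.jdbc.JdbcMigration;

import java.sql.Connection;
import java.sql.PreparedStatement;

// Version separators in the class name use "_" instead of ".", with the
// double underscore separating the version from the description.
public class V1_0_1000__Load_blobs implements JdbcMigration {

    @Override
    public void migrate(Connection connection) throws Exception {
        // Illustrative body only: the real migration loads BLOBs from files.
        try (PreparedStatement stmt = connection.prepareStatement(
                "INSERT INTO documents (name, content) VALUES (?, ?)")) {
            stmt.setString(1, "placeholder.bin");
            stmt.setBytes(2, new byte[0]);
            stmt.executeUpdate();
        }
    }
}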