Sqoop Can't parse input data '""' - airflow

I get a "can't export data" error when running a Sqoop export. How can I solve this problem?
Error: java.io.IOException: Can't export data, please check failed map task logs
Caused by: java.lang.RuntimeException: Can't parse input data: '""' at
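In most cases this means a field in the export files does not match what the generated record class expects, typically an empty string '""' or a NULL written as literal quotes in a numeric column, or a wrong field delimiter. A minimal sketch of an export that spells out the delimiter and the NULL encoding; the JDBC URL, table, paths, and delimiters below are placeholders, not values from the question:

# --input-null-string / --input-null-non-string tell Sqoop how NULLs are encoded
# in the HDFS files, so empty quotes are not handed to the column parsers as data.
sqoop export \
  --connect jdbc:mysql://dbhost/mydb \
  --username myuser \
  --password-file /user/me/.password \
  --table target_table \
  --export-dir /user/me/export_data \
  --fields-terminated-by ',' \
  --input-null-string '\\N' \
  --input-null-non-string '\\N'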

Related

How to solve the error `Exception in thread "main" org.apache.spark.sql.AnalysisException: Table or view not found` in NebulaGraph Exchange?

When Exchange imports Hive data, I get the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Table or view not found
Check whether the -h parameter was omitted from the command that submits the NebulaGraph Exchange task, confirm that the database and table names are correct, and run the user-configured exec statement in spark-sql to verify that the statement itself is valid, as in the sketch below.
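The last check can be done straight from the shell; a small example of verifying an exec statement with spark-sql (the database and table names are placeholders for whatever your Exchange configuration uses):

# If this query succeeds, the exec statement itself is fine and the problem is
# more likely the submit command (e.g. a missing -h) or the database/table names.
spark-sql -e "SELECT id, name FROM mydb.person LIMIT 10"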

FindException: unable to derive module descriptor

Error occurred during initialization of boot layer
java.lang.module.FindException: Unable to derive module descriptor for C:\JavaFX\jlfgr-1_0.jar
Caused by: java.lang.IllegalArgumentException: jlfgr.1.0: Invalid module name: '1' is not a Java identifier
I keep getting this error. I have tried all the solutions available on this site, but the problem is still not fixed. Can anyone please help me?
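The root cause is in the last log line: Java derives the automatic module name from the jar file name, and jlfgr-1_0.jar turns into jlfgr.1.0, which is not a valid module name. Two common workarounds, sketched here for a Windows command prompt (the jar path is taken from the error message, everything else is an assumption):

# Show the module name Java derives from the jar's file name:
jar --describe-module --file C:\JavaFX\jlfgr-1_0.jar
# Rename the jar so the derived automatic module name becomes plain "jlfgr",
# then point --module-path at the renamed file:
ren C:\JavaFX\jlfgr-1_0.jar jlfgr.jar
# Alternatively, keep the jar on the classpath (-cp) instead of --module-path;
# classpath entries do not need a module descriptor at all.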

Why am I getting a connection reset error in Sqoop?

I am using Sqoop 1.4.6 and Hadoop 2.7.1.
I am importing data from an Oracle DB using ojdbc6.jar.
It works fine, but sometimes I get the following error:
19/03/15 16:27:23 INFO mapreduce.Job: Task Id : attempt_1552649108375_0013_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLRecoverableException: IO Error: Connection reset
How do I resolve this issue?
Any help regarding this would be appreciated.
I found something for you, let me know if it helps:
This problem occurs primarily due to the lack of a fast random number generation device on the host where the map tasks execute.
Please refer to the Sqoop user guide for a detailed explanation:
https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_oracle_connection_reset_errors
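The workaround documented there is to point the map-task JVMs at /dev/urandom so SecureRandom does not block waiting for entropy. A sketch of what that looks like; the connection details are placeholders:

# The -D generic option must come before the tool-specific arguments.
sqoop import \
  -D mapred.child.java.opts="-Djava.security.egd=file:/dev/../dev/urandom" \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username myuser -P \
  --table MY_TABLE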

Talend (7.0.1) - Guess Schema error - org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset

I am getting the error below while designing and running a job in Talend (i.e. when I hit the Guess Schema button in the tHiveInput component). I have tried all the options but am unable to fix this. Any help would be appreciated.
Talend version : 7.0.1
OS : RHEL 7
ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default})
at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)
at org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:168)
at org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:155)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:680)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:200)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
at org.talend.metadata.managment.connection.manager.HiveConnectionManager$1.call(HiveConnectionManager.java:259)
at org.talend.metadata.managment.connection.manager.HiveConnectionManager$1.call(HiveConnectionManager.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:748)
18/09/14 13:48:00 WARN jdbc.HiveConnection: Failed to connect to :
ERROR:
java.sql.SQLException: java.util.concurrent.ExecutionException: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://:/: Could not establish connection to jdbc:hive2://:/: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default})
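The "Required field 'client_protocol' is unset" message usually means the Hive JDBC driver the client loads is a different (typically newer) version than the HiveServer2 it talks to. A rough way to compare the two sides; the paths, and the assumption that Talend ships its own hive-jdbc jar, should be adapted to your install:

# On the cluster node running HiveServer2:
hive --version
# On the Talend machine, see which hive-jdbc jar the job actually uses,
# then set the matching Hive distribution/version in the tHiveInput component:
find /path/to/Talend -name 'hive-jdbc*.jar'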

How to resolve the error in Oozie compilation

I'm getting the following error when compiling Oozie.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile
(default-compile) on project oozie-core: Compilation failure
[ERROR] error: error reading /home/oozie/.m2/repository/javax/jdo/jdo2-api/2.3-ec/jdo2-api-2.3-ec.jar; error in opening zip file.
Please help me to rectify this.
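The "error in opening zip file" part points at a corrupted or partially downloaded artifact in the local Maven repository rather than at the Oozie sources. A sketch of the usual fix, deleting the broken jar so Maven re-downloads it on the next build (the build command is whatever you normally use for Oozie):

# Remove the corrupt artifact from the local repository:
rm -rf ~/.m2/repository/javax/jdo/jdo2-api/2.3-ec
# Rebuild; Maven will fetch a fresh copy of jdo2-api-2.3-ec.jar:
mvn clean install -DskipTests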
