Importing Avro data using the Java API in Sqoop - bigdata

The problem is how to run a Java program for a Sqoop import.
I am using Sqoop 1.4.7 and Hadoop 2.7.2, and I am trying to run it in NetBeans IDE 8.1.
Here is the code:
package sqoop5;

import java.io.File;
import java.io.FileInputStream;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Properties;

import com.cloudera.sqoop.SqoopOptions;
import com.cloudera.sqoop.tool.BaseSqoopTool;
import com.cloudera.sqoop.tool.ImportTool;

public class Sqoop5 {

    public static void main(String[] args) throws Exception {
        String driver = "org.postgresql.Driver";
        Class.forName(driver).newInstance();

        SqoopOptions options = new SqoopOptions();
        options.setConnectString("jdbc:postgresql://127.0.0.1:5432/postgres");
        options.setTableName("new");
        options.setUsername("postgres");
        options.setPassword("********");
        options.setNumMappers(1);
        options.setTargetDir("hdfs://127.0.0.1:9000/usr/new11");
        options.setFileLayout(SqoopOptions.FileLayout.AvroDataFile);

        new ImportTool().run(options);
    }
}
The error message is as follows:
cause: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
Aug 05, 2019 10:46:29 AM org.apache.sqoop.tool.ImportTool run
SEVERE: Encountered IOException running import job: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
    at org.apache.hadoop.ipc.Client.call(Client.java:1113)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at com.sun.proxy.$Proxy6.getProtocolVersion(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

If you read this post, you will see that it describes something very similar to your problem.
It is almost certainly a version mismatch between the Hadoop client libraries on your classpath and the Hadoop version running on your cluster: "Server IPC version 9 cannot communicate with client version 4" means the server is Hadoop 2.x (IPC version 9) while the client jars your program loads are from Hadoop 1.x (IPC version 4). Check which Hadoop jars your NetBeans project includes versus the Hadoop version of your cluster, and make sure they match the 2.7.2 cluster (hadoop-common, hadoop-hdfs and the hadoop-mapreduce-client jars rather than an old hadoop-core 1.x jar).
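Once the classpath carries the Hadoop 2.7.2 client jars, a sketch along the following lines should run. This is only an illustration under those assumptions (it also assumes the PostgreSQL JDBC driver is on the classpath and reuses the connection details from the question); it drives the same import through org.apache.sqoop.Sqoop.runTool with an explicit Configuration instead of the deprecated com.cloudera.sqoop classes:

package sqoop5;

import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;

public class Sqoop5 {

    public static void main(String[] args) throws Exception {
        // Point the client at the Hadoop 2.x NameNode used in the question.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://127.0.0.1:9000");

        // The same import expressed as Sqoop command-line arguments;
        // --as-avrodatafile corresponds to FileLayout.AvroDataFile above.
        String[] sqoopArgs = {
            "import",
            "--connect", "jdbc:postgresql://127.0.0.1:5432/postgres",
            "--username", "postgres",
            "--password", "********",
            "--table", "new",
            "--num-mappers", "1",
            "--as-avrodatafile",
            "--target-dir", "hdfs://127.0.0.1:9000/usr/new11"
        };

        // runTool returns 0 on success and a non-zero code on failure.
        int exitCode = Sqoop.runTool(sqoopArgs, conf);
        System.exit(exitCode);
    }
}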

Related

How to import external library in QT Python?

I have an issue when importing an external Python library in a Qt app.
The program crashes when I try to import canlib.
However, no import error is caught; the only message from the application is:
program finished with code -1
When I comment out the canlib import, the program runs fine.
import os
from pathlib import Path
import sys
import random

from Connection import Connection
from PySide6.QtGui import QGuiApplication
from PySide6.QtQml import QQmlApplicationEngine
from PySide6.QtCore import QTimer
from canlib import kvadblib  # here is the error source

if __name__ == "__main__":
    app = QGuiApplication(sys.argv)
    engine = QQmlApplicationEngine()

    # Connection QT <---> Python
    connection = Connection()
    engine.rootContext().setContextProperty("connection", connection)
    # End

    # Hardware Init
    db = kvadblib.Dbc(filename='battery_monitoring_app.dbc')
    ch = communication.open_virtual_channel(1)
    # End

    engine.load(os.fspath(Path(__file__).resolve().parent / "qml/main.qml"))
    if not engine.rootObjects():
        sys.exit(-1)

    ### DO WHILE TRUE STUFF
    def doStuff():
        connection.testSignalVariable.emit(random.uniform(0.00, 2.00))
    ### END

    timer = QTimer()
    timer.timeout.connect(doStuff)
    timer.start(100)

    sys.exit(app.exec_())

Project does not compile after migrating from Vaadin 7 to Vaadin 8

The current version of Vaadin is 7.3.6.
Here is some of my code:
import com.vaadin.data.Property;
import com.vaadin.data.Property.ValueChangeEvent;
import com.vaadin.ui.NativeSelect;
import com.vaadin.ui.TextField;
import com.vaadin.ui.UI;
import com.vaadin.ui.VerticalLayout;

private NativeSelect currencySelector;

private void initCurrencySelector(String providerId) {
    currencySelector = new NativeSelect();
    List<String> selectCurrencyList;
    currencySelector.removeAllItems();
}
This code compiled successfully.
But after I upgraded to Vaadin 8.12.0, it no longer compiles.
The errors are in these lines:
import com.vaadin.data.Property;
import com.vaadin.data.Property.ValueChangeEvent;
import com.vaadin.event.FieldEvents.TextChangeEvent;
import com.vaadin.event.FieldEvents.TextChangeListener;
and in this line:
currencySelector.removeAllItems();
The new imports should be:
import com.vaadin.data.HasValue.ValueChangeEvent;
import com.vaadin.event.FieldEvents;
TextChangeEvent and TextChangeListener were probably replaced by HasValue.ValueChangeEvent and HasValue.ValueChangeListener.
currencySelector.removeAllItems(); should be
currencySelector.setDataProvider(new ListDataProvider(new ArrayList()));
A list of incompatible changes can be found here: https://vaadin.com/download/prerelease/8.0/8.0.0/8.0.0.beta1/release-notes.html#incompatible
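To illustrate the Vaadin 8 style, here is a minimal sketch of the selector after migration. The caption, the currency values and the listener body are made-up examples; only the field and method names mirror the question:

import java.util.Arrays;
import java.util.List;

import com.vaadin.ui.NativeSelect;

public class CurrencySelectorExample {

    // NativeSelect is generic in Vaadin 8.
    private NativeSelect<String> currencySelector;

    private void initCurrencySelector(String providerId) {
        currencySelector = new NativeSelect<>("Currency");

        // Items are supplied via setItems(...) or a DataProvider instead of
        // addItem()/removeAllItems().
        List<String> selectCurrencyList = Arrays.asList("EUR", "USD", "GBP");
        currencySelector.setItems(selectCurrencyList);

        // HasValue.ValueChangeListener replaces Property.ValueChangeListener.
        currencySelector.addValueChangeListener(event ->
                System.out.println("Selected: " + event.getValue()));
    }
}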

package EJB module as EAR or JAR?

Should this hello world EJB be packaged as an EAR or a JAR for deployment?
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ cat src/java/net/bounceme/dur/ejb/NewSessionBean.java
package net.bounceme.dur.ejb;

import java.util.logging.Logger;
import javax.ejb.Stateless;

@Stateless(name = "FooBean", mappedName = "ejb/FooBean")
public class NewSessionBean implements NewSessionBeanRemote {

    private final static Logger log = Logger.getLogger(NewSessionBean.class.getName());

    @Override
    public void sayHello() {
        log.info("hi");
    }
}
thufir@dur:~/NetBeansProjects/EJBModule1$
deploy:
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ asadmin list-applications
Nothing to list.
No applications are deployed to this target server.
Command list-applications executed successfully.
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ ant -p
Buildfile: /home/thufir/NetBeansProjects/EJBModule1/build.xml
Builds, tests, and runs the project EJBModule1.
Main targets:
-profile-pre72 Profile a J2EE project in the IDE.
clean Clean build products.
compile Compile project.
debug Debug project in IDE.
default Build whole project.
dist Build distribution (JAR).
dist-directory-deploy Build distribution (JAR) - if directory deployment is not supported.
dist-ear Build distribution (JAR) to be packaged into an EAR.
javadoc Build Javadoc.
profile Profile a J2EE project in the IDE.
run Deploy to server.
test Run unit tests.
test-single Run single unit test.
test-single-method Run single unit test.
Default target: default
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ ant run
Buildfile: /home/thufir/NetBeansProjects/EJBModule1/build.xml
-pre-init:
-init-private:
-init-userdir:
-init-user:
-init-project:
-init-macrodef-property:
-do-init:
-post-init:
-init-check:
-init-ap-cmdline-properties:
-init-macrodef-javac-with-processors:
-init-macrodef-javac-without-processors:
-init-macrodef-javac:
-init-macrodef-test-impl:
-init-macrodef-junit-init:
-init-macrodef-junit-single:
-init-test-properties:
-init-macrodef-junit-batch:
-init-macrodef-junit:
-init-macrodef-junit-impl:
-init-macrodef-testng:
-init-macrodef-testng-impl:
-init-macrodef-test:
-init-macrodef-junit-debug:
-init-macrodef-junit-debug-batch:
-init-macrodef-junit-debug-impl:
-init-macrodef-test-debug-junit:
-init-macrodef-testng-debug:
-init-macrodef-testng-debug-impl:
-init-macrodef-test-debug-testng:
-init-macrodef-test-debug:
-init-macrodef-java:
-init-debug-args:
-init-macrodef-nbjpda:
-init-macrodef-debug:
-init-taskdefs:
-init-ap-cmdline-supported:
-init-ap-cmdline:
init:
-init-cos:
-init-deploy:
-deps-module-jar:
-deps-ear-jar:
deps-jar:
-pre-pre-compile:
-pre-compile:
-copy-meta-inf:
-do-compile:
-post-compile:
compile:
-library-inclusion-in-archive-weblogic:
-library-inclusion-in-archive-by-user:
library-inclusion-in-archive:
-pre-dist:
-do-tmp-dist-without-manifest:
-do-tmp-dist-with-manifest:
-do-dist-directory-deploy:
-post-dist:
dist-directory-deploy:
pre-run-deploy:
-pre-nbmodule-run-deploy:
-run-deploy-nb:
-init-deploy-ant:
-init-cl-deployment-env:
-parse-glassfish-web:
-parse-sun-web:
-no-parse-sun-web:
-add-resources:
-deploy-ant:
-deploy-without-pw:
[echo] Deploying dist/EJBModule1.jar
[get] Getting: http://localhost:4848/__asadmin/deploy?path=/home/thufir/NetBeansProjects/EJBModule1/dist/EJBModule1.jar&force=true&name=EJBModule1
[get] To: /tmp/gfv3945180939
[delete] Deleting: /tmp/gfv3945180939
-deploy-with-pw:
-run-deploy-am:
-post-nbmodule-run-deploy:
post-run-deploy:
-do-update-breakpoints:
run-deploy:
run:
BUILD SUCCESSFUL
Total time: 1 second
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ asadmin list-applications
EJBModule1 <ejb>
Command list-applications executed successfully.
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ cat src/java/net/bounceme/dur/ejb/NewSessionBean.java
package net.bounceme.dur.ejb;

import java.util.logging.Logger;
import javax.ejb.Stateless;

@Stateless(name = "FooBean", mappedName = "ejb/FooBean")
public class NewSessionBean implements NewSessionBeanRemote {

    private final static Logger log = Logger.getLogger(NewSessionBean.class.getName());

    @Override
    public void sayHello() {
        log.info("hi");
    }
}
thufir@dur:~/NetBeansProjects/EJBModule1$
It's just a "module"; to my understanding this means that there's no web component (WAR file). But it could still be an EAR...or not?
(The build file/etc are all defaults from Netbeans.)
An EAR is used to bundle JAR modules and WAR modules together.
Hence, an EAR makes sense when your enterprise application has:
multiple JAR modules
multiple WAR modules
multiple JAR and WAR modules
There are also classpath implications in that setup.
If your application has just a single JAR module or WAR module, as here, I would deploy it directly as a JAR.
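For reference, the only other piece a plain EJB JAR like this needs is the remote business interface that the bean implements. The question does not show NewSessionBeanRemote, so the following is just a guess at its shape:

package net.bounceme.dur.ejb;

import javax.ejb.Remote;

// Hypothetical remote business interface matching the bean above;
// the actual file is not shown in the question.
@Remote
public interface NewSessionBeanRemote {

    void sayHello();
}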

How to use Spark 1.2.0 in Play 2.2.3 project as it fails with NoSuchMethodError: akka.util.Helpers?

Have you ever had a problem with the Play framework? In my case, I first built everything into one jar, spark-assembly-1.2.0-hadoop2.4.0.jar, and Spark works perfectly from the shell. But there are two questions:
Should I use this assembled Spark jar in the Play project, and how? I tried moving it into the lib directory, but that did not make any Spark imports available.
Or should I define the Spark library as a dependency, like "org.apache.spark" %% "spark-core" % "1.2.0"?
PLAY FRAMEWORK CODE:
Build.scala :
val appDependencies = Seq(
  jdbc
  , "org.apache.spark" %% "spark-streaming" % "1.2.0"
  , "org.apache.spark" %% "spark-core" % "1.2.0"
  , "org.apache.spark" %% "spark-sql" % "1.2.0"
)
TestEntity.scala :
package models

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.hadoop.conf.Configuration
import models.SparkMain
import org.apache.spark.rdd.RDD

object TestEntity {

  val TestEntityPath = "/home/t/PROD/dict/TestEntity .txt"
  val TestEntitySpark = SparkMain.sc.textFile(TestEntityPath, 4).cache
  val TestEntityData = TestEntitySpark.flatMap(_.split(","))

  def getFive(): Seq[String] = {
    println("TestEntity.getFive")
    TestEntityData.take(5)
  }
}
SparkMain.scala :
package models

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.hadoop.conf.Configuration
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{ Seconds, StreamingContext }
import StreamingContext._
import org.apache.spark.sql.SQLContext
import org.apache.spark.SparkConf

object SparkMain {

  val driverPort = 8080
  val driverHost = "localhost"

  val conf = new SparkConf(false) // skip loading external settings
    .setMaster("local[4]") // run locally with enough threads
    .setAppName("firstSparkApp")
    .set("spark.logConf", "true")
    .set("spark.driver.port", s"$driverPort")
    .set("spark.driver.host", s"$driverHost")
    .set("spark.akka.logLifecycleEvents", "true")

  val sc = new SparkContext(conf)
}
and the controller code, which uses the Spark objects:
def test = Action {
  implicit req => {
    val chk = TestEntity.getFive
    Ok("it works")
  }
}
At runtime I get these errors:
[info] o.a.s.SparkContext - Spark configuration:
spark.akka.logLifecycleEvents=true
spark.app.name=firstSparkApp
spark.driver.host=localhost
spark.driver.port=8080
spark.logConf=true
spark.master=local[4]
[warn] o.a.s.u.Utils - Your hostname, uisprk resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
[warn] o.a.s.u.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
[info] o.a.s.SecurityManager - Changing view acls to: t
[info] o.a.s.SecurityManager - Changing modify acls to: t
[info] o.a.s.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(t); users with modify permissions: Set(t)
[error] application -
! #6l039e8d5 - Internal server error, for (GET) [/ui] ->
play.api.Application$$anon$1: Execution exception[[RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;]]
at play.api.Application$class.handleError(Application.scala:293) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.DefaultApplication.handleError(Application.scala:399) [play_2.10-2.2.3.jar:2.2.3]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:166) [play_2.10-2.2.3.jar:2.2.3]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:163) [play_2.10-2.2.3.jar:2.2.3]
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) [scala-library-2.10.4.jar:na]
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185) [scala-library-2.10.4.jar:na]
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
at play.api.mvc.ActionBuilder$$anon$1.apply(Action.scala:314) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
at play.utils.Threads$.withContextClassLoader(Threads.scala:18) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:108) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:107) ~[play_2.10-2.2.3.jar:2.2.3]
Caused by: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
at akka.remote.RemoteSettings.<init>(RemoteSettings.scala:48) ~[akka-remote_2.10-2.3.4-spark.jar:na]
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:114) ~[akka-remote_2.10-2.3.4-spark.jar:na]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.7.0_72]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) ~[na:1.7.0_72]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.7.0_72]
at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_72]
How should I tie in the library: through a dependency or the assembled jar?
Any advice, please.
A NoSuchMethodError exception is 100% due to a mismatch of jar versions between compile time and runtime; check the versions of your jars. I also have some questions about the architecture of your app: instead of calling Spark code from the Play framework, you can call spark-submit from a shell script, which looks better in your case. You can even do that from within your Play application (a rough sketch follows after these answers); there is no need to include the Spark jar on the Play app's classpath.

The problem with this configuration is the Akka dependency of Apache Spark and Play Framework: they both depend on Akka and, as you have seen, the different, incompatible versions have to be resolved at build time; the evicted command in sbt shows how they were resolved. You may also want to run the update command and look at the reports under target/resolution-cache/reports, which are quite useful.
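To make the spark-submit suggestion concrete, here is a rough sketch of launching an external Spark job from a JVM process instead of embedding a SparkContext in Play. It is written in plain Java for illustration; the spark-submit path, the job's main class and the job jar are all hypothetical placeholders:

import java.io.IOException;

public class SparkJobLauncher {

    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "/opt/spark/bin/spark-submit",   // assumed Spark installation path
                "--class", "models.SparkJob",    // hypothetical main class of the job jar
                "--master", "local[4]",
                "/opt/jobs/spark-job.jar");      // hypothetical jar built from the Spark code
        pb.inheritIO();                          // forward spark-submit's stdout/stderr

        int exitCode = pb.start().waitFor();
        System.out.println("spark-submit finished with exit code " + exitCode);
    }
}

This keeps Spark's Akka off the Play classpath entirely, so the two incompatible Akka versions never meet in one JVM.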

Error: java.lang.NoClassDefFoundError: org/openqa/selenium/HasInputDevices when running tests using ghostdriver

I am trying to run my WebDriver test cases using ghostdriver (PhantomJS), but it gives the error java.lang.NoClassDefFoundError: org/openqa/selenium/HasInputDevices. Everything looks fine to me, but I don't understand why there is an error.
OS - WIN7
Coding - JAVA 1.7
Framework : java1.7+testng6.5.2+maven3
Selenium-java version 2.35.0
Test case:
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.phantomjs.PhantomJSDriver;
import org.openqa.selenium.phantomjs.PhantomJSDriverService;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.testng.annotations.Test;

public class ghosttest {

    WebDriver driver;

    @Test
    public void testing() {
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setJavascriptEnabled(true);
        caps.setCapability(
                PhantomJSDriverService.PHANTOMJS_EXECUTABLE_PATH_PROPERTY,
                "D:/dumps/phantomjs-1.9.1-windows/phantomjs-1.9.1-windows/phantomjs.exe");
        driver = new PhantomJSDriver(caps);
        driver.get("http://www.google.com");
        String Logintext = driver.findElement(By.linkText("Maps")).getText();
        System.out.println(Logintext);
    }
}
Maven dependency for ghostdriver:
<dependency>
    <groupId>com.github.detro.ghostdriver</groupId>
    <artifactId>phantomjsdriver</artifactId>
    <version>1.0.3</version>
</dependency>
Your problem is that ghostdriver is not compatible with Selenium 2.35.
If you change your dependency to 2.34, you will be fine. Unfortunately, if you specifically need Selenium 2.35, you will have to wait for a new PhantomJSDriver release.
Also, the latest version of phantomjsdriver is currently 1.0.4; you have 1.0.3.
