So, I am running into the same issue that many are experiencing - see the error below.
WARN scheduler.TaskSetManager: Lost task 0.0 in stage 3.0 (): java.io.IOException: java.lang.reflect.InvocationTargetException
Caused by: java.lang.reflect.InvocationTargetException
Caused by: java.lang.NoClassDefFoundError: org/apache/htrace/Trace
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.Trace
The instructions for fixing this seem to be missing, incomplete, or overly complex. I've resolved other missing-class issues simply by adding the relevant artifacts to the --packages list in spark-submit. I've tried the same with this htrace dependency, but with no luck.
This is my current spark-submit command:
spark-submit --class com.biz.HbasePassthrough \
--packages \
org.apache.spark:spark-streaming-kafka_2.10:1.3.0,org.apache.hbase:hbase-common:1.0.0,org.apache.hbase:hbase-client:1.0.0,org.apache.hbase:hbase-server:1.0.0,org.json4s:json4s-jackson_2.10:3.2.11,org.apache.htrace:htrace:3.1.0-incubating,org.apache.htrace:htrace-core:3.1.0-incubating,org.apache.htrace:htrace-hbase:3.1.0-incubating,org.apache.hbase:hbase-annotations:1.0.0 \
./my-spark_2.10-1.0.8.jar \
>spark_log 2>&1
Output of grep -n1 "htrace" spark_log
8-org.json4s#json4s-jackson_2.10 added as a dependency
9:org.apache.htrace#htrace added as a dependency
10:org.apache.htrace#htrace-core added as a dependency
11:org.apache.htrace#htrace-hbase added as a dependency
12-org.apache.hbase#hbase-annotations added as a dependency
--
36- found org.mortbay.jetty#jetty-util;6.1.26 in central
37: found org.apache.htrace#htrace-core;3.1.0-incubating in central
38- found org.apache.hbase#hbase-client;1.0.0 in central
--
75- found com.fasterxml.jackson.core#jackson-core;2.3.1 in central
76: found org.apache.htrace#htrace;3.1.0-incubating in central
77: found org.apache.htrace#htrace-hbase;3.1.0-incubating in central
78-:: resolution report :: resolve 6423ms :: artifacts dl 45ms
--
111- org.apache.hbase#hbase-server;1.0.0 from central in [default]
112: org.apache.htrace#htrace;3.1.0-incubating from central in [default]
113: org.apache.htrace#htrace-core;3.1.0-incubating from central in [default]
114: org.apache.htrace#htrace-hbase;3.1.0-incubating from central in [default]
115- org.apache.kafka#kafka_2.10;0.8.1.1 from central in [default]
--
182-15/11/30 23:39:20 INFO spark.SparkContext: Added JAR file:/home/ubuntu/.ivy2/jars/json4s-jackson_2.10.jar at http://192.168.240.209:37644/jars/json4s-jackson_2.10.jar with timestamp 1448926760838
183:15/11/30 23:39:20 INFO spark.SparkContext: Added JAR file:/home/ubuntu/.ivy2/jars/htrace-core.jar at http://192.168.240.209:37644/jars/htrace-core.jar with timestamp 1448926760845
184:15/11/30 23:39:20 INFO spark.SparkContext: Added JAR file:/home/ubuntu/.ivy2/jars/htrace-hbase.jar at http://192.168.240.209:37644/jars/htrace-hbase.jar with timestamp 1448926760846
185-15/11/30 23:39:20 INFO spark.SparkContext: Added JAR file:/home/ubuntu/.ivy2/jars/hbase-annotations.jar at http://192.168.240.209:37644/jars/hbase-annotations.jar with timestamp 1448926760847
All of that output at the start of the job suggests the packages were found and will be used, yet the task still fails with the NoClassDefFoundError above.
I did finally find something that I think got me past the issue - here. I adapted it by adding the following two lines to my spark-submit command above:
--driver-class-path "/opt/cloudera/parcels/CDH/lib/hbase/lib/htrace-core-3.1.0-incubating.jar" \
--conf "spark.executor.extraClassPath=/opt/cloudera/parcels/CDH/lib/hbase/lib/htrace-core-3.1.0-incubating.jar" \
Why didn't the --packages reference work?
When using sbt run as follows:
sbt "project epa-recon" "run"
We see that two main classes are found:
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
Multiple main classes detected, select one to run:
[1] com.lash.epa.recon.EPAReconApp
[2] com.lash.epa.recon.EPAReconApp47D
So then we should be able to use runMain, no?
sbt "project epa-recon" "runMain com.lash.epa.recon.EPAReconApp"
well .. no ..
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
java.lang.RuntimeException: No main class detected.
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last epa-recon/compile:runMain for the full output.
[error] (epa-recon/compile:runMain) No main class detected.
These messages are contradictory. So .. any insights into what was actually the problem here?
I just ran the exact syntax you provided in the question, and it worked. Using sbt 1.2.3 on Linux.
sbt "project subproject1" "runMain com.myco.SomeClassName"
I have a pseudo-distributed cluster with Oozie 4.2.0, Hadoop 2.7, Hive 1.1.2 and Java 1.8. After building the Oozie distribution with these components, I am trying to copy the "shared lib" to HDFS. When I run the command it gives me the error below; I think a JAR file is missing (or at least that's what it says).
I am not a Java person and have no knowledge about this error whatsoever. But I think that if I have built Oozie successfully with all the required JAR files, this error should not crop up. I browsed through other similar Oozie issues involving JNI errors but found no credible answer. Can someone help me here, please?
oozie-setup.sh sharelib create -fs hdfs://localhost:9000
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
I found the solution for this myself:
Step 1: Copy $HADOOP_INSTALL/share/common/*.jar to $OOZIE_INSTALL/libext
Step 2: Rebuild the .war file.
Step 3: Restart Oozie.
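As a rough sketch of those three steps (the exact jar directory under $HADOOP_INSTALL depends on your layout; on many Hadoop 2.7 installs the common jars live under share/hadoop/common, and prepare-war is the oozie-setup.sh command that rebuilds the WAR from libext):
cp $HADOOP_INSTALL/share/hadoop/common/*.jar $OOZIE_INSTALL/libext/
cd $OOZIE_INSTALL
bin/oozie-setup.sh prepare-war
bin/oozie-setup.sh sharelib create -fs hdfs://localhost:9000
bin/oozied.sh start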
set -e pipefail; sbt -Dsbt.log.noformat=true -DchiselVersion="latest.release" "run Parity --genHarness --compile --test --backend c --vcd " | tee Parity.out
Getting org.scala-sbt sbt 0.13.8 ...
problems summary ::
WARNINGS
module not found: org.scala-sbt#sbt;0.13.8
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.scala-sbt#sbt;0.13.8: not found
::::::::::::::::::::::::::::::::::::::::::::::
ERRORS
Server access Error: Connection refused url=https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt/0.13.8/ivys/ivy.xml
Server access Error: Connection refused url=https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.8/sbt-0.13.8.pom
Server access Error: Connection refused url=https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.8/sbt-0.13.8.jar
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
unresolved dependency: org.scala-sbt#sbt;0.13.8: not found
Error during sbt execution: Error retrieving required libraries
Error: Could not retrieve sbt 0.13.8
This might be a proxy issue.
Edit the sbtconfig.txt file in the $SBT_HOME/conf directory and add the following entries:
-Dhttp.proxyHost=<proxy server>
-Dhttp.proxyPort=<proxy port>
-Dhttp.proxyUser=<username>
-Dhttp.proxyPassword=<password>
-Dhttps.proxyHost=<proxy server>
-Dhttps.proxyPort=<proxy port>
-Dhttps.proxyUser=<username>
-Dhttps.proxyPassword=<password>
-Dftp.proxyHost=<proxy server>
-Dftp.proxyPort=<proxy port>
-Dftp.proxyUser=<username>
-Dftp.proxyPassword=<password>
Notes:
The https settings are necessary, as many of the URLs sbt refers to are HTTPS-based.
Don't include "http://" in the proxy host values.
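Note that on Linux the sbt launcher script may not read sbtconfig.txt at all; in that case the same flags can usually be supplied through the SBT_OPTS environment variable instead (the proxy host and port below are placeholders):
export SBT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"
sbt -Dsbt.log.noformat=true -DchiselVersion="latest.release" "run Parity --genHarness --compile --test --backend c --vcd"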
I faced a similar issue. It turned out to be a problem with the Java installation being used: by mistake my environment was pointing to a JRE rather than a JDK. After pointing JAVA_HOME to the right location, as shown below, sbt clean package compile worked fine.
[root@spark-sql-perf]# update-alternatives --config java
There are 2 programs which provide 'java'.
Selection Command
*+ 1 java-1.8.0-openjdk.ppc64le (/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.ppc64le/jre/bin/java)
2 java-1.7.0-openjdk.ppc64le (/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.121-2.6.8.0.el7_3.ppc64le/jre/bin/java)
Enter to keep the current selection[+], or type selection number: q
There are 2 programs which provide 'java'.
Selection Command
*+ 1 java-1.8.0-openjdk.ppc64le (/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.ppc64le/jre/bin/java)
2 java-1.7.0-openjdk.ppc64le (/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.121-2.6.8.0.el7_3.ppc64le/jre/bin/java)
Enter to keep the current selection[+], or type selection number: ^C
[root@spark-sql-perf]# export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.ppc64le/
[root@spark-sql-perf]# export PATH=$JAVA_HOME/bin:$PATH
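A quick way to verify the switch took effect, relying on the fact that javac ships only with a JDK and not with a plain JRE:
which java javac
java -version
javac -version
echo $JAVA_HOME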
I have built a Camel project in Eclipse with Maven dependencies. It ran successfully, and I also built the JAR file and ran it from the command prompt, where it runs as required. But when I moved the JAR file onto our Linux machine, which acts as a job-manager server, and tried to run it, I got the error message below.
This is the command I used:
$ java -jar mycamelproject
I get the error below, even though I do have the mentioned dependency in the Dependency-Jars folder.
Exception in thread "main" java.lang.NoClassDefFoundError: org/springframework/c
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
at java.lang.Class.getMethod0(Class.java:2774)
at java.lang.Class.getMethod(Class.java:1663)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.springframework.context.support
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 6 more
Then I tried running with the below command.
$ mvn -X exec:java -Dexec.mainClass=mycamelpackage.mycamelmainclass
I get a series of errors such as the following:
[DEBUG] Could not find metadata org.codehaus.mojo/maven-metadata.xml in local (/home/ec2-user/.m2/repository)
[DEBUG] Skipped remote update check for org.codehaus.mojo/maven-metadata.xml, already updated during this session.
[WARNING] Failure to transfer org.codehaus.mojo/maven-metadata.xml from http://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced. Original error: Could not transfer metadata org.codehaus.mojo/maven-metadata.xml from/to central (http://repo.maven.apache.org/maven2): proxy.host.net
[ERROR] No plugin found for prefix 'exec' in the current project and in the plugin groups [org.apache.maven.plugins, org.codehaus.mojo] available from the repositories
How should java -jar mycamelproject work if no classpath is set? And mycamelproject should be a JAR file anyway (if it is, you should add the .jar extension).
Besides that, I guess your local Maven repository may be corrupt. That is usually the reason why you see the "resolution will not be reattempted" error message. The easiest way to fix this is to remove the corrupt directories and run Maven again.
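For illustration, assuming the artifact is really named mycamelproject.jar and its dependencies sit in the Dependency-Jars folder next to it (both names taken from the question), the two fixes might look like this: run the jar with an explicit classpath, or clear the cached org.codehaus.mojo metadata and retry the exec plugin.
java -cp "mycamelproject.jar:Dependency-Jars/*" mycamelpackage.mycamelmainclass
rm -rf ~/.m2/repository/org/codehaus/mojo
mvn exec:java -Dexec.mainClass=mycamelpackage.mycamelmainclass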
I'm a newbie with Storm, and I have set up Storm-on-Yarn on an HDP cluster using the instructions on the HDP Storm-on-Yarn page and the storm-yarn-master branch from anfeng's storm-yarn git project.
I'm able to get Nimbus running and even submit topologies and see them in the Storm UI. However, the spouts and bolts don't seem to be "working" (the counts of tuples emitted stay at 0).
I did some digging around and realized that my worker daemons are not starting. The supervisor log spits out these:
2014-03-13 11:22:03 b.s.d.supervisor [INFO] 18bf93a1-1cea-4e99-93da-8f36a4e9c056 still hasn't started
I tried launching the worker command from the "Launching worker with command" line in the supervisor log, and I got this error:
Exception in thread "main" java.lang.NoClassDefFoundError: backtype/storm/daemon/worker
Caused by: java.lang.ClassNotFoundException: backtype.storm.daemon.worker
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: backtype.storm.daemon.worker. Program will exit.
It looks like it can’t find the worker class although it’s present in the storm-core jar.
Any ideas on how I can proceed with troubleshooting this? I’ve attached the nimbus and the supervisor logs. The worker logs don't seem to have been created.
Nimbus Log - http://paste.ubuntu.com/7089418/
Supervisor Log - http://paste.ubuntu.com/7089422/
Hadoop Version - 2.2
Storm Version - 0.9.0-wip21
I've had an issue like this when the JAR file I was creating did not exclude the Storm binaries; i.e., in the pom.xml file, make sure you have the storm-core dependency set with:
<scope>provided</scope>
I also had issues where multiple versions of Netty were installed in the Storm lib folder (I had to delete the old version's JAR). This was also causing NoClassDefFoundErrors to be thrown (albeit different ones than you are experiencing).
I would suggest looking at the classpath that shows up when you submit the topologies (you can do that with ps -Af | grep storm).
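For example (mytopology.jar is just a placeholder for your topology jar):
ps -Af | grep storm
jar tf mytopology.jar | grep backtype/storm
If the second command lists storm-core classes, the provided scope mentioned above is not being applied to your build.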