Adding jars to a Spark app

Currently, when I run spark-submit, I provide a whole bunch of paths to jars with the '--jars' option:
./spark-submit --class "AppName" --master spark://server24:7077 --jars /path1.jar,path2.jar,path3.jar /pathAppName_2.10-1.0.jar arg1 arg2
Is there a cleaner way to include the jar files passed to --jars in the command above?
I tried adding them to spark.driver.extraClassPath in spark-defaults.conf, but that does not seem to help. I couldn't find anything else in the Spark documentation.
Does anyone know?

You can specify the jars you depend on when creating a SparkContext:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("example")
  .setJars(Array("/path/to/dependencies/first.jar",
    "/path/to/dependencies/second.jar"))
val sc = new SparkContext(conf)
This is basically what's happening under the covers when you use the --jars argument of spark-submit.
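If the goal is only to avoid retyping the list on every submit, the same comma-separated value can also be set once in conf/spark-defaults.conf via the spark.jars property (the paths below are illustrative, not from the question):

```
spark.jars  /path/to/dependencies/first.jar,/path/to/dependencies/second.jar
```

With that in place, spark-submit picks the jars up without a --jars flag on the command line.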

The way I solved this in my Java Spark app was by using the Maven Shade plugin to create a fat jar with all of the external dependencies in it. Otherwise, if you're using Scala, this link may help you. For Java, I would reference this.
As for another way of doing this out of the box with Spark, I don't think there's a cleaner way; at least if there is, I never found one.
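As a small command-line convenience (not a Spark feature), the --jars list can be assembled from a directory with shell globbing instead of being typed by hand. The directory and application paths here are made up for illustration:

```shell
# Build a comma-separated list from every jar in a directory
# (globs expand in alphabetical order; paths are examples).
JARS=$(echo /path/to/deps/*.jar | tr ' ' ',')
echo "$JARS"

# Then pass it to spark-submit, e.g.:
# ./spark-submit --class "AppName" --master spark://server24:7077 \
#   --jars "$JARS" /path/AppName_2.10-1.0.jar arg1 arg2
```

This keeps the submit command short while leaving the jar list in one obvious place.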

Related

How do I set Java options in SBT for the test configuration only?

I currently have a command line sbt -Dsome.configuration.option test doing what I want, but I would like it to apply that configuration option automatically for sbt test (and no other sbt phase). If my terminology is correct, then I want to set a Java Option for the Test Configuration. How do I do this?
Searching on these terms has led me to http://www.scala-sbt.org/release/docs/Testing.html but I have not yet been able to understand it.
This question looks similar to mine: Define custom test configurations in sbt
Try this:
testOptions in Test +=
  Tests.Setup(() => sys.props += "some.configuration.option" -> "true")
Caveat:
Because you're not forking, this mutates a system property in the JVM running sbt itself.
That means that after running test once, the system property will also be set if you, for instance, run your main class from within sbt (run/runMain).
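If that leakage is a problem, one common alternative (sketched here with sbt 0.13-style keys, matching the question's sbt version) is to fork the test JVM and pass the option only to it:

```scala
// build.sbt — fork tests into a separate JVM so the property
// never touches the JVM running sbt itself
fork in Test := true
javaOptions in Test += "-Dsome.configuration.option=true"
```

Note that javaOptions only takes effect when fork is true; without forking it is silently ignored.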

Combine sbt tasks from different scopes

I use sbt with the native packager plugin in order to build Debian packages for our Play 2.2 applications. We use the debian:publish command to upload the packages to our Artifactory server, and the publish command to publish the regular Java jars.
I'd like the regular publish command to publish both the jar files and the Debian packages. I think I need to somehow combine the publish task in the Debian scope with the regular one in the Compile scope, but I can't really find any documentation on how to do that.
I came up with the following code, which works, but seems to me to be the 'wrong' way to do it:
publish := { // Also publish deb files
  val value = publish.value
  (publish in Debian).value
}
Especially the second line seems wrong, since it ignores the value. The val is only there to silence a warning, which is another smell.
Is there a better way to do this?
You can use triggeredBy. In your build.sbt, add the following line:
publish in Debian <<= (publish in Debian).triggeredBy(publish in Compile)
PS: I think the way you did it is also fine. If you're worried about the warning, you can assign the result to some val.
Alternatively, the dependsOn task is appropriate if you don't care about the return value:
publish := publish.dependsOn(publish in Debian).value

How to know available commands (=named operations) in sbt

I use sbt 0.13.1.
To list the tasks in a sbt project, one can use sbt tasks -V.
A "command" looks similar to a task: it's a named operation that can be executed from the sbt console. Typically, you would resort to a command when you need to do something that's impossible in a regular task. So how can I know what commands are available for an sbt project?
Is there a similar command that lists all of the commands in an sbt project?
Say the sbt-idea plugin is installed in an sbt project. How could I query the project to find out about the gen-idea command?
It's confusing given the comment of @Mark Harrah ("gen-idea is a command, not a task.") and the documentation of the sbt-idea plugin, which says "Use the gen-idea sbt task to create Idea project files" (note the use of task).
Doesn't "help" without any arguments do that? From my understanding, "tasks" without any arguments will list the available tasks, and "help" without arguments will do a similar thing for commands.
I'd argue it's an implementation detail.
Do you have a real use case where you require to list only commands? :-)
--
Update 1:
sbt
$ <tab><tab>
Display all 339 possibilities? (y or n) [y]
# ...
gen-idea
# ...
Simply tabbing in the terminal gives you all actions you can perform, including gen-idea - your use-case.

adobe brackets-shell : cef extract failed

I followed all the steps mentioned in the URL below to build my project (I am using Windows 7):
https://github.com/adobe/brackets-shell/wiki/Building-brackets-shell
Actually, I want to create a Brackets installer (I have WiX 3.7 installed), but I am getting a cef-extract failed error.
I also tried grunt cef-extract --force, but after that it throws a new error: create-project failed. After that I am not able to proceed further.
Can someone help me?
Thanks in advance.
Regards,
Ashish
If you include the exact console output you're seeing, it would be much easier to help you. But based on snags other people have encountered recently, you can try these things:
Make sure your PATH includes Python 2.7 (otherwise "create-project" will fail).
Delete all these folders to be sure you're starting from a clean slate: deps, Debug, include, libcef_dll, Release, Resources.
Just run the high-level tasks grunt setup and grunt build, following the Building brackets-shell instructions. (There's a known bug where grunt cef-extract fails when run standalone).

Java - ant cannot find rt.jar

I'm compiling DrJava by following these instructions.
But when I run ant jar, I get the error:
/Users/arthur/dj/drjava/build.xml:1270: Can't find rt.jar in the Java 7 home: ${env.JAVA7_HOME}
I know that rt.jar is in /Library/Java/JavaVirtualMachines/jdk1.7.0_07.jdk/Contents/Home/jre/lib.
How do I fix it?
This answer is the same as what others have provided, just a little more detailed. What you simply need to do is type this in your shell:
export JAVA7_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_07.jdk/Contents/Home/jre/lib
Now, you have mentioned: "I know that rt.jar is in /Library/Java/JavaVirtualMachines/jdk1.7.0_07.jdk/Contents/Home/jre/lib."
Usually you would not have Contents/Home inside a JDK. A standard Java installation should have jdk1.7.0_07.jdk/jre/lib. Check that you have provided the correct path above and that rt.jar is indeed in there.
You should make sure you point ant to a JDK rather than to a JRE.
Set the path to the JDK. This will solve the issue.
I ran into this same problem. Pulak was close, but not quite right: you don't need the jre/lib part at the end. The command that ultimately did the trick for me was:
export JAVA7_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_60.jdk/Contents/Home/
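To take the guesswork out of where the variable should point, a quick shell check can confirm that a candidate JAVA7_HOME actually contains jre/lib/rt.jar before ant is run. The helper function name here is made up for illustration:

```shell
# Sanity-check a JDK home directory: ant's build.xml expects
# $JAVA7_HOME/jre/lib/rt.jar to exist.
check_java7_home() {
  if [ -f "$1/jre/lib/rt.jar" ]; then
    echo "ok: rt.jar found under $1"
  else
    echo "missing: $1/jre/lib/rt.jar" >&2
    return 1
  fi
}

# Example (path is illustrative):
# check_java7_home /Library/Java/JavaVirtualMachines/jdk1.7.0_07.jdk/Contents/Home
```

If the check fails, adjust the path (dropping a trailing jre/lib, as discussed above) until it passes, then export that path as JAVA7_HOME.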
