Ignoring subproject while building fat JAR of root project - sbt

I have a root project from which I want to build a fat JAR with sbt-assembly. It has a subproject that depends on root, and I want that subproject to be ignored when building the fat JAR (as if it didn't exist). How do I do this?
Basically I want the root project packaged as if the localMode subproject from the build.sbt in Try 1 didn't exist.
Try 1
My build.sbt is
import sbt.Keys._
name := "myprojectname"
version := "0.0.1-SNAPSHOT"
scalaVersion in ThisBuild := "2.11.8"
mainClass in(Compile, run) := Some("com.mywebsite.MyExample")
mainClass in(Compile, packageBin) := Some("com.mywebsite.MyExample")
mainClass in assembly := Some("com.mywebsite.MyExample")
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % Provided
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % Test
lazy val localMode = project.
  in(file("localMode")).
  dependsOn(RootProject(file("."))).
  settings(
    name := "myprojectname_local",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % Compile,
    mainClass in(Compile, run) := Some("com.mywebsite.MyExample"),
    mainClass in(Compile, packageBin) := Some("com.mywebsite.MyExample")
  )

fullClasspath in assembly := {
  (fullClasspath in assembly).value diff
    (fullClasspath in assembly in localMode).value
}
Currently I get the error message:
[error] (localMode/*:assembly) deduplicate: different file contents found in the following:
[error] ~/.ivy2/cache/org.sonatype.sisu/sisu-guice/jars/sisu-guice-2.1.7-noaop.jar:com/google/inject/AbstractModule.class
[error] ~/.ivy2/cache/com.google.inject/guice/jars/guice-3.0.jar:com/google/inject/AbstractModule.class
[error] deduplicate: different file contents found in the following:
[error] ~/.ivy2/cache/org.sonatype.sisu/sisu-guice/jars/sisu-guice-2.1.7-noaop.jar:com/google/inject/Binder.class
[error] ~/.ivy2/cache/com.google.inject/guice/jars/guice-3.0.jar:com/google/inject/Binder.class
[error] deduplicate: different file contents found in the following:
[error] ~/.ivy2/cache/org.sonatype.sisu/sisu-guice/jars/sisu-guice-2.1.7-noaop.jar:com/google/inject/ConfigurationException.class
[error] ~/.ivy2/cache/com.google.inject/guice/jars/guice-3.0.jar:com/google/inject/ConfigurationException.class
and so on...
If I run sbt root/assembly I get:
[error] Expected ID character
[error] Not a valid command: root (similar: reboot, boot, project)
[error] Expected project ID
[error] Expected configuration
[error] Expected ':' (if selecting a configuration)
[error] Expected key
[error] Not a valid key: root (similar: products)
[error] root/assembly
[error] ^
Try 2
My second build.sbt cannot be built:
import sbt.Keys._
lazy val commonSettings = Seq(
  version := "0.0.1-SNAPSHOT",
  scalaVersion in ThisBuild := "2.11.8",
  mainClass in(Compile, run) := Some("com.mywebsite.MyExample"),
  mainClass in(Compile, packageBin) := Some("com.mywebsite.MyExample"),
  libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % Test
)

lazy val root = project.
  in(file("root")).
  dependsOn(RootProject(file("."))).
  settings(
    name := "myprojectname",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % Provided,
    mainClass in assembly := Some("com.mywebsite.MyExample")
  )

lazy val localMode = project.
  in(file("localMode")).
  dependsOn(RootProject(file("."))).
  settings(
    name := "myprojectname_local",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % Compile
  )

I think you can do it with the assembly::fullClasspath setting. By default it's set to fullClasspath or (fullClasspath in Runtime). So probably you can do something like this:
fullClasspath in assembly := {
  (fullClasspath in assembly).value diff
    (fullClasspath in assembly in localMode).value
}
In the absence of details about your particular project configuration, I'm guessing localMode is the name of the subproject you want to exclude.
UPDATE
There are some problems with the build.sbt in your Try 2:
- you don't add the common settings to your projects
- "root" is the one in the, well, root of your project dir (i.e. in file("."))
- if you explicitly define a root project, the other one should depend on it, instead of on the RootProject, which is just a way to refer to the "implicitly" defined root project
import sbt.Keys._
lazy val commonSettings = Seq(
  version := "0.0.1-SNAPSHOT",
  scalaVersion in ThisBuild := "2.11.8",
  mainClass in(Compile, run) := Some("com.mywebsite.MyExample"),
  mainClass in(Compile, packageBin) := Some("com.mywebsite.MyExample"),
  libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % Test
)

lazy val root = project.in(file(".")).
  settings(commonSettings: _*).
  settings(
    name := "myprojectname",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % Provided,
    mainClass in assembly := Some("com.mywebsite.MyExample")
  )

lazy val localMode = project. // by default the name of the project val is the name of its base directory
  dependsOn(root).
  settings(commonSettings: _*).
  settings(
    name := "myprojectname_local",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % Compile
  )
Check the sbt docs on Multi-project builds. For your questions about the root project, there's a chapter called Default root project. Now, with these fixes, does root/assembly work as expected?
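With this layout, sbt root/assembly should produce the fat jar of the root project alone (spark-core stays out of it, since it is Provided there), while sbt localMode/assembly would be the variant that also bundles spark-core for running locally.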

Try putting this in your build.sbt:
aggregate in (Compile, assembly) := false
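A sketch of how that might be applied to the build.sbt from Try 1, where the implicitly defined root project aggregates localMode, so a plain assembly also triggers localMode/assembly (that is where the deduplicate errors above are reported). The plain task-scoped form aggregate in assembly is used here as an assumption; the answer scopes it as (Compile, assembly):
// top level of build.sbt, i.e. settings of the implicit root project
aggregate in assembly := false // `assembly` on root no longer also runs localMode/assembly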

Related

Build a Jar with and without provided dependencies

I have an SBT project, with spark dependencies. These dependencies are provided at runtime, and hence I import them under provided scope.
val hadoop = Seq("org.apache.hadoop" % "hadoop-client" % "3.3.1" % provided)

val spark = Seq(
  "org.apache.spark" %% "spark-core" % SparkVersion % provided,
  "org.apache.spark" %% "spark-sql" % SparkVersion % provided,
  "org.apache.spark" %% "spark-mllib" % SparkVersion % provided
)

lazy val coreDto = (project in file("xxxx"))
  .enablePlugins(BuildInfoPlugin)
  .enablePlugins(PackPlugin)
  .settings(
    name := "xxxx",
    moduleName := "xxxx",
    version := "1.0",
    libraryDependencies ++= (hadoop ++ spark))
All is well till now. Now I have a new scenario where I have to publish the jar to our Maven repository, and I am able to publish it successfully. The issue now is the provided scope: the compile-time dependencies are not set appropriately for consumers of the published artifact.
Question: How do I configure things so that the provided scope is ignored during publishing, but still applied when I package the jar?
Using this to publish, in case it's helpful:
lazy val publishSettings = Seq(
  publishMavenStyle := true,
  publishTo := {
    val url = "https://xxxxl/maven/v1/"
    if (version.value.trim.toUpperCase.endsWith("SNAPSHOT"))
      Some("snapshots".at(url))
    else
      Some("releases".at(url))
  },
  aetherDeploy / logLevel := Level.Info,
  aetherOldVersionMethod := true,
  credentials += Credentials(Path.userHome / ".sbt" / ".credentials")
)
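One possible direction (not from the thread, and sketched under the assumption that the publish step picks up the POM produced by makePom) is to keep the dependencies as provided for packaging and rewrite the scope in the generated POM just before publishing, e.g. via sbt's pomPostProcess:
import scala.xml.{Elem, Node => XmlNode, NodeSeq => XmlNodeSeq}
import scala.xml.transform.{RewriteRule, RuleTransformer}

// Rewrite <scope>provided</scope> to compile in the published POM, so consumers
// resolve the Spark/Hadoop dependencies at compile time while local packaging
// still treats them as provided.
pomPostProcess := { node: XmlNode =>
  new RuleTransformer(new RewriteRule {
    override def transform(n: XmlNode): XmlNodeSeq = n match {
      case e: Elem if e.label == "scope" && e.text == "provided" => <scope>compile</scope>
      case other => other
    }
  }).transform(node).head
}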

How to add dependent project in build.sbt for running sbt compile

I am new to sbt builds.
I would like to add the Java files of a dependent project (say Proj A) to my compiling project (Proj B).
Running sbt compile in Proj B throws an error that the dependent project's Java packages/classes are not found.
I went through the link https://www.scala-sbt.org/0.13/docs/Multi-Project.html but it's not clear to me how to add this dependency to make it work.
I tried adding the below line in build.sbt, but it didn't work.
lazy val projB = project.dependsOn(/projA)
Updated
build.sbt of projB:
organization := "com.org"
name := "projB"
version := "1"
resolvers ++= Seq(
  "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
  "Java.net Maven2 Repository" at "http://download.java.net/maven/2/",
)
lazy val projB = project.dependsOn(projA)
// the library dependencies of springframework here
build.sbt of Proj A:
organization := "com.org"
name := "proj A"
version := "1"
resolvers ++= Seq(
  "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
  "Java.net Maven2 Repository" at "http://download.java.net/maven/2/",
)
// the library dependencies of springframework here
When I do sbt compile on Proj B, it throws an error that the dependent classes are not found. Class Hbase is in Proj A.
[error] import com.org.config.Hbase;
[error] **\hbase\HbaseDAO.java:38:1:
cannot find symbol
[error] symbol: class Hbase
[error] location: class com.org.hbase.HbaseDAO
[error] private Hbase hbase;
[error] (Compile / compileIncremental) javac returned non-zero exit code
[error] Total time: 6 s, completed 29/08/2019 9:58:39 AM
Updated build.sbt after the suggestion:
inThisBuild(
  Seq(
    organization := "com.org",
    version := "1",
    resolvers ++= Seq(
      "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
      "Java.net Maven2 Repository" at "http://download.java.net/maven/2/",
    )
  )
)
lazy val root = project
  .in(file("."))
  .aggregate(projA, projB)
lazy val projA = project.settings(
  // project A settings and library dependencies
  libraryDependencies += "org.springframework.boot" % "spring-boot-starter-parent" % "2.1.6.RELEASE" pomOnly()
  libraryDependencies += "org.springframework.boot" % "spring-boot-starter-web" % "2.1.6.RELEASE"
  libraryDependencies += "org.springframework.data" % "spring-data-hadoop-hbase" % "2.3.0.RELEASE"
  libraryDependencies += "org.mortbay.jetty" % "jetty" % "7.0.0.pre5"
  libraryDependencies += "io.netty" % "netty-all" % "5.0.0.Alpha2"
  libraryDependencies += "commons-beanutils" % "commons-beanutils" % "1.9.4"
  libraryDependencies += "commons-beanutils" % "commons-beanutils-core" % "1.8.3"
  libraryDependencies += "xerces" % "xercesImpl" % "2.12.0"
  libraryDependencies += "org.apache.hadoop" % "hadoop-yarn-server-nodemanager" % "3.2.0"
  libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "3.2.0"
  libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0"
  libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "3.2.0"
  libraryDependencies += "org.apache.hbase" % "hbase-client" % "2.1.1"
  libraryDependencies += "org.apache.hbase" % "hbase" % "2.1.1" pomOnly()
  libraryDependencies += "org.apache.hbase" % "hbase-common" % "2.1.1"
)
lazy val projB = project
  .dependsOn(projA)
  .settings(
    // project B settings and library dependencies
    libraryDependencies += "org.springframework.boot" % "spring-boot-starter-parent" % "2.1.6.RELEASE" pomOnly()
    libraryDependencies += "org.springframework.boot" % "spring-boot-starter-web" % "2.1.6.RELEASE"
    libraryDependencies += "org.springframework.data" % "spring-data-hadoop-hbase" % "2.3.0.RELEASE"
    libraryDependencies += "org.mortbay.jetty" % "jetty" % "7.0.0.pre5"
    libraryDependencies += "io.netty" % "netty-all" % "5.0.0.Alpha2"
    libraryDependencies += "commons-beanutils" % "commons-beanutils" % "1.9.4"
    libraryDependencies += "commons-beanutils" % "commons-beanutils-core" % "1.8.3"
    libraryDependencies += "xerces" % "xercesImpl" % "2.12.0"
    libraryDependencies += "org.apache.hadoop" % "hadoop-yarn-server-nodemanager" % "3.2.0"
    libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.10.0.pr2"
    libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "3.2.0"
    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "3.2.0"
    libraryDependencies += "org.apache.hbase" % "hbase-client" % "2.1.1"
    libraryDependencies += "org.apache.hbase" % "hbase" % "2.1.1" pomOnly()
    libraryDependencies += "org.apache.hbase" % "hbase-common" % "2.1.1"
  )
An error is thrown during sbt compile for the below library dependency in both project settings, projA and projB:
libraryDependencies += "org.springframework.boot" % "spring-boot-starter-web" % "2.1.6.RELEASE"
')' expected but string literal found is thrown for this line in projA's settings, and
';' expected but string literal found is thrown for this line in projB's settings.
I couldn't get much of a clue from this error.
Looking at the two snippets you posted, I'm guessing that you have two separate build.sbt files, one for each subproject. This makes them independent, and one project just doesn't see the other. While it may be possible to have multiple build.sbt files for the subprojects, it's recommended to define the whole multi-project build in a single build.sbt file in the root of the project.
For example, if you structure your project like this:
├── project
│ ├── build.properties
│ └── plugins.sbt
├── projA
│ └── src
├── projB
│ └── src
└── build.sbt
Then you can put all the build settings and subproject relations in the root build.sbt:
inThisBuild(
  Seq(
    organization := "com.org",
    version := "1",
    resolvers ++= Seq(
      "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
      "Java.net Maven2 Repository" at "http://download.java.net/maven/2/",
    )
  )
)

lazy val root = project
  .in(file("."))
  .aggregate(projA, projB)

lazy val projA = project
  .settings(
    // project A settings and library dependencies
  )

lazy val projB = project
  .dependsOn(projA)
  .settings(
    // project B settings and library dependencies
  )
Then if you launch an sbt shell from the root of the project, you can call compile (or any other task) to compile both projA and projB, or you can call projA/compile to compile that subproject specifically.
You are already reading the documentation, so you know where to find more information. Note that the link you provided points to the old documentation; at the top there is a banner pointing to the new page: https://www.scala-sbt.org/1.x/docs/Multi-Project.html
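As an aside on the parse errors in the updated build.sbt above: the entries passed to .settings(...) are arguments to a single method call, so consecutive libraryDependencies += ... lines need to be separated by commas, or folded into one libraryDependencies ++= Seq(...). A trimmed sketch with only a few of the dependencies, assuming that is indeed the cause of the ')' expected / ';' expected errors:
lazy val projA = project
  .settings(
    // settings inside .settings(...) are comma-separated
    libraryDependencies ++= Seq(
      "org.springframework.boot" % "spring-boot-starter-web" % "2.1.6.RELEASE",
      "org.apache.hadoop" % "hadoop-client" % "3.2.0",
      "org.apache.hbase" % "hbase-client" % "2.1.1"
    )
  )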

Excluding a dependency in creating fat jar using SBT

I am writing an Akka application. While creating a fat jar of the application, I don't want the Scala libraries to be packaged with the jar. My build.sbt looks as follows:
lazy val root = (project in file(".")).
  settings(
    name := "akka-app",
    version := "1.0",
    scalaVersion := "2.10.4",
    mainClass in Compile := Some("sample.hello.HelloWorld")
  )

libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.3.4" % "provided",
  "com.typesafe" % "config" % "1.2.1"
)

// META-INF discarding
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => MergeStrategy.first
  }
}
But sbt still packages Scala into the jar. I want only the com.typesafe.config library to be present in the jar. Any solution for how to achieve this?
You can exclude Scala by modifying the option in the assemblyOption setting:
assemblyOption in assembly :=
(assemblyOption in assembly).value.copy(includeScala = false)
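For this particular build, that should leave only the com.typesafe config classes and the project's own classes in the fat jar: the Scala library is dropped by the setting above, and akka-actor is already excluded because it is marked "provided".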

sbt assembly not publishing fat jar

I can sbt assembly myself a fat jar without an issue with the below build.sbt file. However, when I try to publish this "fat jar", sbt publish dumps only 1 KB .jar files into the S3 bucket.
Unzipping the .jar file shows that it only contains a manifest file.
How do I get the fat jar into my repo?
Update: the struck-through text has been altered since the initial question was posed. Removed the name override, and it now publishes the build code but without the external libraries.
Below is my build.sbt file:
name := "util_myutil"
version := "1.0.1"
scalaVersion := "2.10.4"
scalacOptions += "-target:jvm-1.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.5.0-cdh5.5.2" % "provided"
unmanagedJars in Compile += file(".lib/my.jar")
unmanagedJars in Compile += file(".lib/some_other.jar")
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyJarName in assembly := s"${name.value}-${version.value}.jar"
ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
resolvers ++= Seq(
  "Cloudera repos" at "https://repository.cloudera.com/artifactory/cloudera-repos",
  "Cloudera releases" at "https://repository.cloudera.com/artifactory/libs-release",
  "Era7 maven releases" at "https://s3-eu-west-1.amazonaws.com/releases.era7.com"
)
s3sse := true
s3region := com.amazonaws.services.s3.model.Region.US_Standard
s3acl := com.amazonaws.services.s3.model.CannedAccessControlList.Private
s3overwrite := true
publishMavenStyle := true
publishTo := {
  val suffix = if (isSnapshot.value) "snapshots" else "releases"
  Some(s3resolver.value(s"IT Insights Artifacts $suffix", s3("my-mvn-repo." + suffix)))
}
from https://github.com/sbt/sbt-assembly:
add this to your build.sbt:
artifact in (Compile, assembly) := {
  val art = (artifact in (Compile, assembly)).value
  art.copy(`classifier` = Some("assembly"))
}

addArtifact(artifact in (Compile, assembly), assembly)
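With those lines in place, sbt publish should upload the assembly jar alongside the regular (thin) jar, distinguished by the assembly classifier, so the fat jar ends up in the repository as well.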

How to tell sbteclipse to ignore src/main/java?

How can I get the sbteclipse plugin to skip adding/creating src/main/java and src/test/java in the Eclipse .classpath?
I don't have these folders, and when I run > eclipse the sbteclipse plugin creates those folders and adds them to the Eclipse .classpath.
build.sbt file
name := "myproject"
version := "1.0"
scalaVersion := "2.10.1"
resolvers += "google-api-services" at "http://google-api-client-libraries.appspot.com/mavenrepo"
libraryDependencies += "org.scalatest" %% "scalatest" % "1.9.1" % "test"
libraryDependencies += "junit" % "junit" % "4.10" % "test"
libraryDependencies += "com.novocode" % "junit-interface" % "0.10-M1" % "test"
EclipseKeys.createSrc := EclipseCreateSrc.ValueSet(EclipseCreateSrc.Unmanaged, EclipseCreateSrc.Source, EclipseCreateSrc.Resource)
project/plugins.sbt file
resolvers += Classpaths.typesafeResolver
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.2.0")
Thanks.
It is sbt's default behavior to have both the javaSource and scalaSource directories on the classpath; them showing up in Eclipse is just a consequence.
It can be changed with (for a Java-only project):
unmanagedSourceDirectories in Compile := (javaSource in Compile).value :: Nil
or (for a Scala-only project):
unmanagedSourceDirectories in Compile := (scalaSource in Compile).value :: Nil
or just remove them all:
unmanagedSourceDirectories in Compile := Nil
You can do it like this:
unmanagedSourceDirectories in Test <<= (sourceDirectory){ src => src / "somerandompathfortestsources" :: Nil}
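Since the question mentions both src/main/java and src/test/java, presumably the same would be done for the Test configuration as well; a sketch for a Scala-only project (assuming sbteclipse then stops generating the Java folders):
unmanagedSourceDirectories in Compile := (scalaSource in Compile).value :: Nil
unmanagedSourceDirectories in Test := (scalaSource in Test).value :: Nil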
To see what they are (in sbt console):
show unmanagedSourceDirectories
show sources
...
To see what defines them:
inspect unmanagedSourceDirectories
...
More about:
http://www.scala-sbt.org/0.13.0/docs/Detailed-Topics/Java-Sources.html
