Say I have the following multi-project structure:
my-project/
├── build.sbt
├── app1/
├── app2/
├── app3/
├── sharedA/
└── sharedB/
I use sbt-assembly to create fat JARs for the app subprojects. The shared subprojects are just library code so I disable AssemblyPlugin for those.
I use the default root project syntax so the subprojects are aggregated without me having to list them all out.
name := "my-project"
lazy val app1 = (project in file("app1"))
.dependsOn(sharedA)
lazy val app2 = (project in file("app2"))
.dependsOn(sharedB)
lazy val app3 = (project in file("app3"))
.dependsOn(sharedA, sharedB)
lazy val sharedA = (project in file("sharedA"))
.disablePlugins(AssemblyPlugin)
lazy val sharedB = (project in file("sharedB"))
.disablePlugins(AssemblyPlugin)
When I run sbt assembly, this mostly works: I get assembly jars for the apps and none for the shareds. However, I also get a my-project-assembly.jar which I don't need. How do I disable this?
If I disable it at the top level
name := "my-project"
disablePlugins(AssemblyPlugin)
...
Then the plugin is disabled completely, even for subprojects.
I know I can explicitly declare a root project, and this works:
lazy val root = (project in file("."))
.aggregate(app1, app2, app3, sharedA, sharedB)
.disablePlugins(AssemblyPlugin)
...
But, in reality, there can be a lot of subprojects so I would like to stick to the default root project style.
If you discard those paths in the assemblyMergeStrategy, I think it will behave as you wish.
assemblyMergeStrategy in assembly := {
case PathList("sharedA", xs @ _*) => MergeStrategy.discard
case PathList("sharedB", xs @ _*) => MergeStrategy.discard
...
}
Related
In my project there are 3 sub projects under root. build.sbt is as below.
proj_C depends on proj_A and proj_B.
If I create the assembly package for proj_C with the command below, it succeeds and the assembly package can be imported into other projects.
sbt "project proj_C" assembly
If I publish with "sbt publish", as I defined addArtifact in proj_C's settings, an assembly jar is also generated and then published. But when I try to compile another project that imports this assembly jar, it fails with the error below:
[error] unresolved dependency: proj_A;1.0.0: not found
Part of my build.sbt is below. Could anyone point out what I am doing wrong?
Many thanks!
artifact in (Compile, assembly) := {
val art = (artifact in (Compile, assembly)).value
art.withClassifier(Some("assembly"))
}
lazy val assemblySettings = Seq(
  assemblyMergeStrategy in assembly := {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case _ => MergeStrategy.first
  }
)
lazy val root = (project in file("."))
.disablePlugins(sbtassembly.AssemblyPlugin)
.aggregate(proj_A, proj_B, proj_C)
.settings(
commonSettings,
skip in publish := true,
name := "proj_root"
)
lazy val proj_A= (project in file("proj_A"))
.disablePlugins(sbtassembly.AssemblyPlugin)
.settings(
commonSettings,
skip in publish := true,
name := "proj_A"
)
lazy val proj_B= (project in file("proj_B"))
.disablePlugins(sbtassembly.AssemblyPlugin)
.settings(
commonSettings,
skip in publish := true,
name := "proj_B"
)
lazy val proj_C= (project in file("proj_C"))
.settings(
commonSettings,
assemblySettings,
addArtifact(artifact in (Compile, assembly), assembly),
name := "proj_C"
) dependsOn(proj_A, proj_B)
First of all, I hope you know that publishing a fat jar is not recommended, and to be honest, in your case I see no real benefit in doing so.
If you simply publish A, B, C separately and then add the dependency in your other project it will all be automatically downloaded (along with dependencies of those projects). And the dependency management will be much easier...
But since you want to add the assembly dependency, the error suggests you are actually adding the wrong jar. My guess is that you publish both C.jar and C-assembly.jar, and you added the dependency like:
"your.organisation" %% "C" % "version"
but you should have:
"your.organisation" %% "C" % "version" classifier "assembly"
I'm building a Docker image with a fat jar. I use the sbt-assembly plugin to build the jar, and the sbt-native-packager to build the Docker image. I'm not very familiar with SBT and am running into the following issues.
I'd like to declare a dependency on the assembly task from the docker:publish task, so that the fat jar is created before it's added to the image. I did as instructed in the docs, but it's not working: assembly doesn't run unless I invoke it explicitly.
publish := (publish dependsOn assembly).value
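One possible culprit is scoping: docker:publish resolves publish in the Docker scope, while the line above rewires the unscoped publish task. A hedged sketch of the Docker-scoped wiring (assuming both sbt-assembly and sbt-native-packager's DockerPlugin are enabled):

```scala
// Hedged sketch: attach the assembly dependency to `publish in Docker`,
// which is the task `docker:publish` actually runs, rather than to the
// unscoped `publish` task.
publish in Docker := ((publish in Docker) dependsOn (assembly in Compile)).value
```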
One of the steps in building the image is copying the fat jar. Since the assembly plugin creates the jar in target/scala_whatever/projectname-assembly-X.X.X.jar, I need to know the exact scala_whatever and the jar name. assembly seems to have a key assemblyJarName, but I'm not sure how to access it. I tried the following, which fails:
Cmd("COPY", "target/scala*/*.jar /app.jar")
Help!
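As for locating the jar without globbing, a hedged sketch that reads sbt-assembly's own keys (the base image and entrypoint here are placeholders, not taken from the question):

```scala
import com.typesafe.sbt.packager.docker.{Cmd, ExecCmd}

// Hedged sketch: `assemblyJarName in assembly` yields the bare file name the
// assembly task will produce, so the COPY line needs no target/scala_* globbing.
dockerCommands := {
  val jarName = (assemblyJarName in assembly).value // e.g. "myapp-assembly-X.X.X.jar"
  Seq(
    Cmd("FROM", "openjdk:8"),                             // placeholder base image
    Cmd("COPY", s"opt/docker/lib/$jarName", "/app.jar"),  // staged by the Universal mappings
    ExecCmd("ENTRYPOINT", "java", "-jar", "/app.jar")     // placeholder entrypoint
  )
}
```

This still assumes the fat jar has been added to the Universal mappings under lib/, as the answer below does.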
Answering my own questions, the following works:
enablePlugins(JavaAppPackaging, DockerPlugin)
assemblyMergeStrategy in assembly := {
case x => {
val oldStrategy = (assemblyMergeStrategy in assembly).value
val strategy = oldStrategy(x)
if (strategy == MergeStrategy.deduplicate)
MergeStrategy.first
else strategy
}
}
// Remove all jar mappings in universal and append the fat jar
mappings in Universal := {
val universalMappings = (mappings in Universal).value
val fatJar = (assembly in Compile).value
val filtered = universalMappings.filter {
case (file, name) => !name.endsWith(".jar")
}
filtered :+ (fatJar -> ("lib/" + fatJar.getName))
}
dockerRepository := Some("username")
import com.typesafe.sbt.packager.docker.{Cmd, ExecCmd}
dockerCommands := Seq(
Cmd("FROM", "username/spark:2.1.0"),
Cmd("WORKDIR", "/"),
Cmd("COPY", "opt/docker/lib/*.jar", "/app.jar"),
ExecCmd("ENTRYPOINT", "/opt/spark/bin/spark-submit", "/app.jar")
)
I completely overwrite the docker commands because the defaults add a couple of scripts I don't need, since I overwrite the entrypoint as well. Also, the default workdir is /opt/docker, which is not where I want to put the fat jar.
Note that the default commands are shown by show dockerCommands in sbt console.
I have a single-module client-server project with a main class for each.
I'm trying to use sbt-native-packager to generate start scripts for both.
project/P.scala
object Tactic extends Build {
lazy val root =
  (project in file("."))
    .configs(Client, Server)
    .settings(inConfig(Client)(Defaults.configTasks): _*)
    .settings(inConfig(Server)(Defaults.configTasks): _*)
lazy val Client = config("client") extend Compile
lazy val Server = config("server") extend Compile
}
build.sbt
mainClass in Client := Some("myProject.Client")
mainClass in Server := Some("myProject.Server")
enablePlugins(JavaAppPackaging)
When I run client:stage the directory target/universal/stage/lib is created with all the necessary jars but the bin directory is missing. What am I doing wrong?
Subsidiary question: what is the key to set the starting script name?
I would recommend setting up your project as a multi-module build, instead of creating and using new configurations. I tried your multiple configuration route and it gets hairy very quickly.
For example (I created a shared project for anything shared between client & server):
def commonSettings(module: String) = Seq[Setting[_]](
organization := "org.tactic",
name := s"tactic-$module",
version := "1.0-SNAPSHOT",
scalaVersion := "2.11.6"
)
lazy val root = (project in file(".")
settings(commonSettings("root"))
dependsOn (shared, client, server)
aggregate (shared, client, server)
)
val shared = (project
settings(commonSettings("shared"))
)
val client = (project
settings(commonSettings("client"))
enablePlugins JavaAppPackaging
dependsOn shared
)
val server = (project
settings(commonSettings("server"))
enablePlugins JavaAppPackaging
dependsOn shared
)
Note I'm enabling sbt-native-packager's JavaAppPackaging in the client and server.
Then run stage.
Also, the key for the starting script name is executableScriptName.
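For instance (the script name here is chosen purely for illustration):

```scala
// sbt-native-packager: overrides the generated start script's name
// (the default is derived from the project name)
executableScriptName := "tactic"
```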
I have the following project structure:
lazy val root = project.aggregate(rest,backend)
lazy val rest = project
lazy val backend = project
When I execute the "run" task from the parent, I want a specific class from the "backend" project to have its main method executed. How would I accomplish this?
lazy val root = project.aggregate(rest,backend).dependsOn(rest,backend) //<- don't forget dependsOn
lazy val rest = project
lazy val backend = project.settings(mainClass in (Compile, run) := Some("fully.qualified.path.to.MainClass"))
run in Compile := (run in Compile in backend).evaluated
In Maven you can have Profiles, which can set up a build configuration for different environments. For example DEV, QA, UAT, PRODUCTION
In order to support continuous integration, there must be a way to tell SBT which environment to run against.
How do I set up different environments in SBT, for example DEV, QA, UAT, PRODUCTION?
Thanks!
You can do this by creating a custom configuration.
val ProfileDev = config("dev") extend(Runtime)
val ProfileQA = config("qa") extend(Runtime)
val root = (project in file(".")).
configs(ProfileDev, ProfileQA). // add config here!
settings(
name := "helloworld",
....
).
settings(inConfig(ProfileDev)(Classpaths.configSettings ++ Defaults.configTasks ++ Defaults.resourceConfigPaths ++ Seq(
unmanagedResourceDirectories += {baseDirectory.value / "src" / configuration.value.name / "resources"}
)): _*).
settings(inConfig(ProfileQA)(Classpaths.configSettings ++ Defaults.configTasks ++ Defaults.resourceConfigPaths ++ Seq(
unmanagedResourceDirectories += {baseDirectory.value / "src" / configuration.value.name / "resources"}
)): _*)
You then place your config file in src/dev/resources and src/qa/resources, and it should be part of your classpath when you say dev:run or dev:package. Here's a quick test:
object Main extends App {
println(xml.XML.load(this.getClass.getResource("/config.xml")))
}