As per SBT documentation (version 1.2.x)
-> https://www.scala-sbt.org/1.x/docs/Artifacts.html
there is a packageWar task.
// disable .jar publishing
publishArtifact in (Compile, packageBin) := false

// create an Artifact for publishing the .war file
artifact in (Compile, packageWar) := {
  val previous: Artifact = (artifact in (Compile, packageWar)).value
  previous.withType("war").withExtension("war")
}

// add the .war file to what gets published
addArtifact(artifact in (Compile, packageWar), packageWar)
The packageWar key does not exist in SBT, though. Is it a mistake in the SBT documentation, or is my understanding of how it should be used incorrect?
Related
I'd like to create an SBT project with inheritance and shared dependencies.
With Maven's POM files, there is the idea of Project Inheritance where you can set a parent project. I'd like to do the same thing with SBT.
The xchange-stream library uses Maven's Project Inheritance to resolve subproject dependencies when compiled from the parent project.
Here is my idea of what the file structure would look like:
sbt-project/
  project/
    dependencies.scala  # Contains dependencies common to all projects
  build.sbt             # Contains definition of parent project with
                        # references to subprojects
  subproject1/
    build.sbt           # Contains `subproject3` as a dependency
  subproject2/
    build.sbt           # Contains `subproject3` as a dependency
  subproject3/
    build.sbt           # Is a dependency for `subproject1` and `subproject2`
Where subproject1 and subproject2 can include subproject3 in their dependency lists like this:
libraryDependencies += "tld.organization" % "subproject3" % "1.0.0"
Such that when subproject1 or subproject2 is compiled by invoking sbt compile from within its subdirectory, or when the parent sbt-project is compiled from the main sbt-project directory, subproject3 will be compiled and published locally with SBT, or otherwise be made available to the projects that need it.
Also, how would shared dependencies be specified in sbt-project/build.sbt, or anywhere in the sbt-project/project directory, such that they are usable within subproject1 and subproject2 when invoking sbt compile within those subdirectories?
The following examples don't help answer either of the above points:
jbruggem/sbt-multiproject-example:
Uses recursive build.sbt files, but doesn't share dependencies among child projects.
Defining Multi-project Builds with sbt: pbassiner/sbt-multi-project-example:
Uses a single build.sbt file for the projects in their subdirectories.
sachabarber/SBT_MultiProject_Demo:
Uses a single build.sbt file.
Such that when subproject1 or subproject2 are compiled by invoking sbt compile from within their subdirectories...
Maybe Maven is meant to be used together with the shell environment and the cd command, but that's not how sbt works, at least as of sbt 1.x in 2019.
The sbt way is to use sbt as an interactive shell, and start it at the top level. You can then either invoke compilation as subproject1/compile, or switch into it using project subproject1, and call compile in there.
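For example, an interactive session over the layout from the question might look like this:
$ sbt
sbt:sbt-project> subproject1/compile
sbt:sbt-project> project subproject1
sbt:subproject1> compile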
house plugin
A feature similar to parent POM would be achieved by creating a custom plugin.
package com.example

import sbt._
import Keys._

object FooProjectPlugin extends AutoPlugin {
  override def requires = plugins.JvmPlugin

  val commonsIo = "commons-io" % "commons-io" % "2.6"

  override def buildSettings: Seq[Def.Setting[_]] = Seq(
    organization := "com.example"
  )

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    libraryDependencies += commonsIo
  )
}
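For a subproject's own build to pick this up, the plugin has to be on that build's classpath; a minimal sketch, assuming the plugin is published under the hypothetical coordinates com.example:foo-project-plugin:
// subproject1/project/plugins.sbt -- coordinates are hypothetical
addSbtPlugin("com.example" % "foo-project-plugin" % "0.1.0")

// subproject1/build.sbt
lazy val subproject1 = (project in file("."))
  .enablePlugins(FooProjectPlugin)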
sbt-sriracha
It's not exactly what you are asking for, but I have an experimental plugin that allows you to switch between source dependency and binary dependency. See hot source dependencies using sbt-sriracha.
Using that you could create three individual sbt builds for project1, project2, and project3, all located inside $HOME/workspace directory.
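Each of those builds would first need the plugin on its meta-build classpath; a sketch, assuming the coordinates from the sbt-sriracha README:
// project/plugins.sbt -- version is an assumption, check the README
addSbtPlugin("com.eed3si9n" % "sbt-sriracha" % "0.1.0")
Then project1's build.sbt could look like this: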
ThisBuild / scalaVersion := "2.12.8"
ThisBuild / version := "0.1.1-SNAPSHOT"

lazy val project3Ref = ProjectRef(workspaceDirectory / "project3", "project3")
lazy val project3Lib = "tld.organization" %% "project3" % "0.1.0"

lazy val project1 = (project in file("."))
  .enablePlugins(FooProjectPlugin)
  .sourceDependency(project3Ref, project3Lib)
  .settings(
    name := "project1"
  )
With this setup, you can launch sbt -Dsbt.sourcemode=true and it will pick up project3 as a subproject.
You can use Mecha's super-repo concept. Take a look at the setup and docs here: https://github.com/storm-enroute/mecha
The basic idea is that you can combine dependent sbt projects (each with its own build.sbt) under a single root super-repo sbt project:
/root
  /project/plugins.sbt
  repos.conf
  /project1
    /src/..
    /project/plugins.sbt
    build.sbt
  /project2
    /src/..
    /project/plugins.sbt
    build.sbt
Please note that there is no build.sbt in the root folder!
Instead there is a repos.conf file. It contains the definitions of the sub-repos and looks like the following:
root {
  dir = "."
  origin = ""
  mirrors = []
}
project1 {
  dir = "project1"
  origin = "git@github.com:some_user/project1.git"
  mirrors = []
}
project2 {
  dir = "project2"
  origin = "git@github.com:some_user/project2.git"
  mirrors = []
}
Then you can specify the inter-project, source-level dependencies within individual projects.
There are two approaches:
a dependencies.conf file
or the build source code
For more details, please see the docs.
I'm attempting to use the sbt-aspectj plugin with the sbt native packager and am running into an issue where the associated -javaagent path to the aspectj load time weaver jar references an ivy cache location rather than something packaged.
That is, after running sbt stage, executing the staged application via bash -x target/universal/stage/bin/myapp results in this javaagent:
exec java -javaagent:/home/myuser/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.10.jar -cp /home/myuser/myproject/target/universal/stage/lib/org.aspectj.aspectjweaver-1.8.10.jar:/home/myuser/myproject/target/universal/stage/lib/otherlibs.jar myorg.MyMainApp args
My target platform is Heroku where the artifacts are built before being effectively 'pushed' out to individual 'dynos' (very analogous to a docker setup). The issue here is that the resulting -javaagent path was valid on the machine in which the 'staged' deployable was built, but will not exist where it's ultimately run.
How can one configure the sbt-aspectj plugin to reference a packaged lib rather than one from the ivy cache?
Current configuration:
project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.10.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.5")
build.sbt (selected parts):
import com.typesafe.sbt.SbtAspectj._

lazy val root = (project in file(".")).settings(
  aspectjSettings,
  javaOptions in Runtime ++= (AspectjKeys.weaverOptions in Aspectj).value,
  // see: https://github.com/sbt/sbt-native-packager/issues/598#issuecomment-111584866
  javaOptions in Universal ++= (AspectjKeys.weaverOptions in Aspectj).value
    .map { "-J" + _ },
  fork in run := true
)
Update
I've tried several approaches including pulling the relevant output for javaOptions from existing mappings, but the result is a cyclical dependency error thrown by sbt.
I have something that technically solves my problem but feels unsatisfactory. As of now, I'm including an aspectjweaver dependency directly and using the sbt-native-packager concept of bashScriptExtraDefines to append an appropriate javaagent:
updated build.sbt:
import com.typesafe.sbt.SbtAspectj._

lazy val root = (project in file(".")).settings(
  aspectjSettings,
  bashScriptExtraDefines += scriptClasspath.value
    .filter(_.contains("aspectjweaver"))
    .headOption
    .map("addJava -javaagent:${lib_dir}/" + _)
    .getOrElse(""),
  fork in run := true
)
You can add the following settings in your sbt config:
.settings(
  retrieveManaged := true,
  libraryDependencies += "org.aspectj" % "aspectjweaver" % aspectJWeaverV
)
The AspectJ weaver JAR will then be copied to ./lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-[aspectJWeaverV].jar in your project root.
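That stable, project-local path can then be passed to the forked JVM; a minimal sketch, assuming sbt is launched from the project root and aspectJWeaverV is defined as above:
fork in run := true

// path follows the lib_managed layout described above
javaOptions in run += s"-javaagent:lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-$aspectJWeaverV.jar"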
I actually solved this by using the sbt-javaagent plugin to add agents to the runtime.
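A minimal sketch of that approach, assuming the plugin's published coordinates and reusing the weaver version from the question:
// project/plugins.sbt -- version is an assumption, check the plugin's README
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.5")

// build.sbt: the plugin adds the -javaagent flag to the generated start script,
// pointing at a jar that ships inside the package
lazy val root = (project in file("."))
  .enablePlugins(JavaAppPackaging, JavaAgent)
  .settings(
    javaAgents += "org.aspectj" % "aspectjweaver" % "1.8.10"
  )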
I'm trying to publish a jar obfuscated by ProGuard using sbt. I have this code so far, but it is not pushing the obfuscated jar into the local ivy2 repo with sbt publish-local:
artifact in (Proguard, ProguardKeys.proguard) ~= { art =>
  art.copy(classifier = Some("proguard"))
}

addArtifact(Artifact("myJar", "jar", "jar"), assembly in ProguardKeys.proguard)

publishArtifact in ProguardKeys.proguard := true
Have you done such things before, or do you have any ideas?
Thank you
Here is the trick:
// do not publish source, javadoc and default jar
publishArtifact in (Compile, packageBin) := false
publishArtifact in (Compile, packageDoc) := false
publishArtifact in (Compile, packageSrc) := false
// add the Proguard jar for publishing
addArtifact(artifact in (Compile, ProguardKeys.proguard), (ProguardKeys.proguard in Proguard) map { xs => xs.head })
With this config I disable publishing of the sources, javadoc and default jar, and add the jar generated by Proguard to be published. Now publish[Local] tasks publish only the pom and the Proguard jar.
Here's an example build.sbt:
import AssemblyKeys._
assemblySettings
buildInfoSettings
net.virtualvoid.sbt.graph.Plugin.graphSettings
name := "scala-app-template"
version := "0.1"
scalaVersion := "2.9.3"
val FunnyRuntime = config("funnyruntime") extend(Compile)
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"
sourceGenerators in Compile <+= buildInfo
buildInfoPackage := "com.psnively"
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)
assembleArtifact in packageScala := false
val root = project.in(file(".")).
  configs(FunnyRuntime).
  settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
    libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
  )): _*)
The goal is to have spark-core "provided" so it and its dependencies are not included in the assembly artifact, but to reinclude them on the runtime classpath for the run- and test-related tasks.
It seems that using a custom scope will ultimately be helpful, but I'm stymied on how to actually cause the default/global run/test tasks to use the custom libraryDependencies and hopefully override the default. I've tried things including:
(run in Global) := (run in FunnyRuntime)
and the like to no avail.
To summarize: this feels essentially a generalization of the web case, where the servlet-api is in "provided" scope, and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.
For a similar case, I used the following in assembly.sbt:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
Update:
@rob's solution seems to be the only one working on the latest SBT version; just add the following to your settings in build.sbt:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run)).evaluated
Adding to @douglaz' answer,
runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
is the corresponding fix for the runMain task.
Another option is to create separate sbt projects for assembly vs run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run for running directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project in order to e.g. specify the spark master in code when running with sbt.
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion

val commonSettings = Seq(
  name := "Project",
  libraryDependencies ++= Seq(...) // Common deps
)

// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
  .settings(
    commonSettings,
    assembly / mainClass := Some("com.example.Main"),
    libraryDependencies += sparkDep % "provided"
  )

// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
  .settings(
    // Projects' target dirs can't overlap
    target := target.value.toPath.resolveSibling("target-runtest").toFile,
    commonSettings,
    // If separate main file needed, e.g. for specifying spark master in code
    Compile / run / mainClass := Some("com.example.RunMain"),
    libraryDependencies += sparkDep
  )
If you use sbt-revolver plugin, here is a solution for its "reStart" task:
fullClasspath in Revolver.reStart <<= fullClasspath in Compile
Update: for sbt 1.0 you may use the new assignment form:
fullClasspath in Revolver.reStart := (fullClasspath in Compile).value
I have an internal maven repository located at file:///some/path/here. I would like to publish my sbt artifacts to this location. I gleaned that the following should work.
publishMavenStyle := true
publishTo <<= version { (v: String) =>
  val path = "file:///some/path/here/"
  if (v.trim.endsWith("SNAPSHOT"))
    Some("snapshots" at path + "maven-snapshots")
  else
    Some("releases" at path + "maven")
}
However, this fails with the following exception.
[info] delivering ivy file to .../target/scala-2.9.2/ivy-1.0-SNAPSHOT.xml
java.lang.UnsupportedOperationException: URL repository only support HTTP PUT at the moment
        at org.apache.ivy.util.url.BasicURLHandler.upload(BasicURLHandler.java:202)
        at org.apache.ivy.util.FileUtil.copy(FileUtil.java:150)
        at org.apache.ivy.plugins.repository.url.URLRepository.put(URLRepository.java:84)
How can I publish artifacts using sbt to a repository specified by a file path?
Use this format to publish to a local file path:
publishTo := Some(Resolver.file("file", new File("/some/path/here")))
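If you also want the snapshots/releases split from the question, a sketch along the same lines (directory names are just the question's placeholders):
publishTo := {
  val base = new File("/some/path/here")
  if (version.value.trim.endsWith("SNAPSHOT"))
    Some(Resolver.file("file", base / "maven-snapshots"))
  else
    Some(Resolver.file("file", base / "maven"))
}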