I can make a Makefile that has a target that processes all sources in the directory.
SOURCE_DIR := src
TARGET_DIR := target
SOURCES := $(wildcard $(SOURCE_DIR)/*)
$(TARGET_DIR)/%: $(SOURCE_DIR)/%
	md5sum $^ > $@
all: $(SOURCES:$(SOURCE_DIR)/%=$(TARGET_DIR)/%)
A nice advantage here is that each file is a separate target, so the files can be processed incrementally and concurrently (e.g. with make -j). The concurrent part is important in this situation.
I am trying to do something similar with SBT, but am finding it surprisingly difficult. The SBT analog of a Make target seems to be a task, so I tried creating one task that aggregates a variable number of smaller tasks.
import org.apache.commons.codec.digest.DigestUtils
all <<= Def.task().dependsOn({
file(sourceDir.value).listFiles.map { source =>
val target = rebase(sourceDir.value, targetDir.value)(source)
Def.task {
IO.write(target, DigestUtils.md5Hex(IO.readBytes(source)))
}
}
}: _*)
I get the error
`value` can only be used within a task or setting macro, such as :=, +=, ++=,
Def.task, or Def.setting
How can I make a proper SBT build file that resembles my Makefile, with a dynamic number of concurrent targets/tasks?
I needed flatMap.
all <<= (sourceDir, targetDir).flatMap { (sourceDir, targetDir) =>
task{}.dependsOn({
file(sourceDir).listFiles.map { source =>
task {
val target = rebase(sourceDir, targetDir)(source).get // rebase(a, b) returns a File => Option[File]
IO.write(target, DigestUtils.md5Hex(IO.readBytes(source)))
}
}
}: _*)
}
There might be a slicker way to do task{}.dependsOn(...: _*), but I don't know what it is.
I have an SBT project which pulls in dependencies. I only want to pull in the direct dependencies - not any transitive dependencies. I'd like to find the filename of the dependency that's pulled in, so that I can copy it somewhere.
e.g. given a build.sbt file with the following contents:
libraryDependencies += "org.eclipse.jetty" % "jetty-server" % "9.4.28.v20200408"
I would like to know where is the jetty-server jar on the file system.
I have tried adding the following to my build.sbt file:
lazy val mytaskKey: TaskKey[Unit] = TaskKey[Unit]("mytask")
def mytask: Def.Setting[Task[Unit]] = mytaskKey := {
val updateReport = update.value
updateReport.allFiles foreach { f =>
println(f)
}
}
mytask
When I run this, I get a full list of dependencies:
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-library.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/eclipse/jetty/jetty-server/9.4.28.v20200408/jetty-server-9.4.28.v20200408.jar
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-compiler.jar
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-reflect.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/scala-lang/modules/scala-xml_2.12/1.0.6/scala-xml_2.12-1.0.6.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/jline/jline/2.14.6/jline-2.14.6.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/fusesource/jansi/jansi/1.12/jansi-1.12.jar
I don't want that full list - I just want the jetty jar. i.e.
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/eclipse/jetty/jetty-server/9.4.28.v20200408/jetty-server-9.4.28.v20200408.jar
How might I get this list?
Yes, there is: mark the dependency with either the intransitive() or notTransitive() modifier. It's documented here.
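For example, a minimal sketch reusing the jetty-server coordinates from the question, so that only the direct artifact is resolved:
libraryDependencies += ("org.eclipse.jetty" % "jetty-server" % "9.4.28.v20200408").intransitive()
With the dependency marked intransitive, jetty-server's own transitive dependencies are not resolved at all; note that the Scala and sbt tool jars in the output above come from elsewhere and are not affected by this.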
I want my scalacOptions to be more strict (more linting) when I issue my own command validate.
What is the best way to achieve that?
A new scope (strict) did work, but it requires compiling the project twice when you run test, so that's not an option.
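For reference, a rough hypothetical sketch of what that ruled-out "new scope" approach might look like (Strict is a made-up configuration name, not the approach recommended below):
lazy val Strict = config("strict") extend Compile
inConfig(Strict)(Defaults.configSettings)
scalacOptions in Strict ++= Seq("-Ywarn-unused:imports", "-Xfatal-warnings")
// strict:compile compiles the sources separately from compile:compile,
// so running both (e.g. validate and then test) compiles the project twice.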
An sbt custom command allows temporary modification of the build state, which can be discarded after the command finishes:
def validate: Command = Command.command("validate") { state =>
import Project._
val stateWithStrictScalacSettings =
extract(state).appendWithSession(
Seq(Compile / scalacOptions ++= Seq(
"-Ywarn-unused:imports",
"-Xfatal-warnings",
"...",
)),
state
)
val (s, _) = extract(stateWithStrictScalacSettings).runTask(Test / test, stateWithStrictScalacSettings)
s
}
commands ++= Seq(validate)
Or, more succinctly, using the :: convenience method for State transformations:
commands += Command.command("validate") { state =>
"""set scalacOptions in Compile := Seq("-Ywarn-unused:imports", "-Xfatal-warnings", "...")""" ::
"test" :: state
}
This way we can use sbt test during development, while our CI hooks into sbt validate which uses stateWithStrictScalacSettings.
I'm building a Docker image with a fat jar. I use the sbt-assembly plugin to build the jar, and the sbt-native-packager to build the Docker image. I'm not very familiar with SBT and am running into the following issues.
I'd like to declare a dependency on the assembly task from the docker:publish task, so that the fat jar is created before it's added to the image. I did as instructed in the docs, but it's not working: assembly doesn't run unless I invoke it explicitly.
publish := (publish dependsOn assembly).value
One of the steps in building the image is copying the fat jar. Since the assembly plugin creates the jar in target/scala_whatever/projectname-assembly-X.X.X.jar, I need to know the exact scala_whatever and the jar name. assembly seems to have an assemblyJarName key, but I'm not sure how to access it. I tried the following, which fails.
Cmd("COPY", "target/scala*/*.jar /app.jar")
Help!
Answering my own questions, the following works:
enablePlugins(JavaAppPackaging, DockerPlugin)
assemblyMergeStrategy in assembly := {
case x => {
val oldStrategy = (assemblyMergeStrategy in assembly).value
val strategy = oldStrategy(x)
if (strategy == MergeStrategy.deduplicate)
MergeStrategy.first
else strategy
}
}
// Remove all jar mappings in universal and append the fat jar
mappings in Universal := {
val universalMappings = (mappings in Universal).value
val fatJar = (assembly in Compile).value
val filtered = universalMappings.filter {
case (file, name) => !name.endsWith(".jar")
}
filtered :+ (fatJar -> ("lib/" + fatJar.getName))
}
dockerRepository := Some("username")
import com.typesafe.sbt.packager.docker.{Cmd, ExecCmd}
dockerCommands := Seq(
Cmd("FROM", "username/spark:2.1.0"),
Cmd("WORKDIR", "/"),
Cmd("COPY", "opt/docker/lib/*.jar", "/app.jar"),
ExecCmd("ENTRYPOINT", "/opt/spark/bin/spark-submit", "/app.jar")
)
I completely override the docker commands because the defaults add a couple of scripts that I don't need, since I override the entrypoint as well. Also, the default workdir is /opt/docker, which is not where I want to put the fat jar.
Note that the default commands can be shown with show dockerCommands in the sbt console.
I have a Scala project that is divided into several subprojects:
lazy val core: Seq[ProjectReference] = Seq(common, json_scalaz7, json_scalaz)
I'd like to make the core lazy val conditional on the Scala version I'm currently using, so I tried this:
lazy val core2: Seq[ProjectReference] = scalaVersion {
case "2.11.0" => Seq(common, json_scalaz7)
case _ => Seq(common, json_scalaz7, json_scalaz)
}
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0 (when the value of the scalaVersion setting is "2.11.0").
This however gives me the following compilation error:
[error] /home/diego/work/lift/framework/project/Build.scala:39: type mismatch;
[error] found : sbt.Project.Initialize[Seq[sbt.Project]]
[error] required: Seq[sbt.ProjectReference]
[error] lazy val core2: Seq[ProjectReference] = scalaVersion {
[error] ^
[error] one error found
Any idea how to solve this?
Update
I'm using sbt version 0.12.4
This project is the Lift project, which compiles against "2.10.0", "2.9.2", "2.9.1-1", "2.9.1" and now we are working on getting it to compile with 2.11.0. So creating a compile all task would not be practical, as it would take a really long time.
Update 2
I'm hoping there is something like this:
lazy val scala_xml = "org.scala-lang.modules" %% "scala-xml" % "1.0.1"
lazy val scala_parser = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.1"
...
lazy val common =
coreProject("common")
.settings(description := "Common Libraries and Utilities",
libraryDependencies ++= Seq(slf4j_api, logback, slf4j_log4j12),
libraryDependencies <++= scalaVersion {
case "2.11.0" => Seq(scala_xml, scala_parser)
case _ => Seq()
}
)
but for the projects list
Note how, depending on the Scala version, I add the scala_xml and scala_parser libraries.
You can see the complete build file here
Cross building a project
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0
The built-in support in sbt for this is called cross building, which is described in Cross-Building a Project. Here's an excerpt from that section, with a small correction:
Define the versions of Scala to build against in the crossScalaVersions setting. For example, in a .sbt build definition:
crossScalaVersions := Seq("2.10.4", "2.11.0")
To build against all versions listed in crossScalaVersions, prefix the action to run with +. For example:
> +compile
Multiple-project builds
sbt also has built-in support to aggregate tasks across multiple projects, which is described in Aggregation. If what you eventually need are normal built-in tasks like compile and test, you could set up a dummy aggregate that leaves out json_scalaz.
lazy val withoutJsonScalaz = (project in file("without-json-scalaz"))
  .aggregate(liftProjects filterNot {_ == json_scalaz}: _*)
From the shell, you should be able to use this as:
> ++2.11.0
> project withoutJsonScalaz
> test
Getting values from multiple scopes
Another feature you might be interested in is ScopeFilter. This has the ability to traverse multiple projects beyond usual aggregation and cross building. You would need to create a setting whose type is ScopeFilter and set it based on scalaBinaryVersion.value. With scope filters, you can do:
val coreProjects = settingKey[ScopeFilter]("my core projects")
val compileAll = taskKey[Seq[sbt.inc.Analysis]]("compile all")
coreProjects := {
(scalaBinaryVersion.value) match {
case "2.10" => ScopeFilter(inProjects(common, json_scalaz7, json_scalaz))
}
}
compileAll := compileAllTask.value
lazy val compileAllTask = Def.taskDyn {
val f = coreProjects.value
(compile in Compile) all f
}
In this case compileAll would have the same effect as +compile, but you could aggregate the result and do something interesting like sbt-unidoc.
val webAssemblyTask = TaskKey[Unit](
"web-assembly",
"assembly web/war like run-time package"
)
var out: TaskStreams = _
val baseSettings: Seq[Setting[_]] = Seq(
webAssemblyOutputDir <<= (sourceManaged) { _ / "build" },
webAssemblyTask <<= (
streams,
target,
sourceDirectory,
outputDirProjectName
) map {
(out_log, targetDir, sourceDir, outputDirProjectName) => {
out_log.log.info("web-assembly start")
out_log.log.info("sourceDir:" + sourceDir.getAbsolutePath)
out_log.log.info("targetDir:" + targetDir.getAbsolutePath)
val sourceAssetsDir = (sourceDir / "webapp" / "assets").toPath
val classesAssetsDir = (targetDir / "scala-2.10" / "classes" / "assets").toPath
Files.createSymbolicLink(classesAssetsDir, sourceAssetsDir)
}
}
)
val webAssemblySettings = inConfig(Runtime)(baseSettings)
I wrote an sbt plugin.
When I type webAssembly in the sbt console, the plugin runs fine.
But I want it to run after compile and before run; how can I do that?
how to set sbt plugins invoke scope?
I think you're confusing the configuration (also known as Maven scope) name with tasks like compile and run. They happen to have related names, but that doesn't mean the compile task is identical to the Compile configuration.
I could interpret this question as: how can a plugin setting invoke tasks scoped in some other configuration? For that you use the in method, like key in (Config) or key in (Config, task). Another way to interpret it may be: how can plugin tasks be scoped in a configuration? For that you use inConfig(Config)(...), which you're already doing. But you'd typically want plugins to be configuration neutral. See my blog post for more details on this.
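For instance, here is a small hypothetical task (showScopes is a made-up key) that reads keys from other scopes using both forms of the in syntax:
val showScopes = taskKey[Unit]("print values read from other scopes")
showScopes := {
  // key in (Config): a setting scoped to the Compile configuration
  println((sourceDirectory in Compile).value)
  // key in (Config, task): a key scoped to a configuration and a task
  println((scalacOptions in (Compile, console)).value)
}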
I want to run after compile, before run, how can I do it?
This makes much more sense. In sbt you mostly focus on the preconditions of the tasks. One of the useful commands is inspect tree key. You can run it for the run task and get the entire tree of tasks/settings that it depends on. Here's where you see it calling compile:compile (another notation for compile in Compile):
helloworld> inspect tree run
[info] compile:run = InputTask[Unit]
[info] +-runtime:fullClasspath = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | +-runtime:exportedProducts = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | | +-compile:packageBin::artifact = Artifact(sbt-sequential,jar,jar,None,List(compile),None,Map())
[info] | | +-runtime:configuration = runtime
[info] | | +-runtime:products = Task[scala.collection.Seq[java.io.File]]
[info] | | | +-compile:classDirectory = target/scala-2.10/sbt-0.13/classes
[info] | | | +-compile:copyResources = Task[scala.collection.Seq[scala.Tuple2[java.io.File, java.io.File]]]
[info] | | | +-compile:compile = Task[sbt.inc.Analysis]
This is useful in discovering runtime:products, which is "Build products that get packaged" according to the help products command:
helloworld> help products
Build products that get packaged.
Since runtime:products happens before compile:run, if it depends on your task, your task will be called before compile:run (inspect tree also shows that run resolves to compile:run).
To simplify your plugin task, I'm just going to call it sayHello:
val sayHello = taskKey[Unit]("something")
sayHello := {
println("hello")
}
You can rewire products in Runtime as follows:
products in Runtime := {
val old = (products in Runtime).value
sayHello.value
old
}
This satisfies the "before run" part. You also want to make sure that this runs after compile. Again, just add a task dependency to it:
sayHello := {
(compile in Compile).value
println("hello")
}
When the user runs the run task, sbt will correctly calculate the dependencies and run the sayHello task somewhere between compile and run.