Migrating task dependencies from <<= to := - sbt

In sbt 0.13,
docker <<= docker dependsOn assembly
gives a deprecation warning; the := operator is recommended instead. However,
docker := {
  assembly.value
  docker.value
}
does not work, because the order of execution is not guaranteed. I need these two tasks to run serially.
What's the trick?
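One pattern that preserves the ordering without <<= (the same shape as publish := (publish dependsOn assembly).value further down this page) is, as a minimal sketch:
// sketch: the <<=-free equivalent, rewiring docker to depend on assembly
docker := (docker dependsOn assembly).value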

A related sbt-docker snippet (note that referencing assembly.value inside the dockerfile task is already enough to make docker depend on assembly):
dockerfile in docker := {
  val artifact: File = assembly.value
  val artifactTargetPath = (assemblyOutputPath in assembly).value
  new Dockerfile {
    from("java:8-jre")
    add(artifact, artifactTargetPath)
  }
}

Related

javaOptions for custom test configuration with SBT

I'd like to suspend the forked VM and wait for a connection from an external debugger, but only when using the IntegrationDebug config.
With reference to the 'shared sources' section in http://www.scala-sbt.org/1.x/docs/Testing.html, I came up with the following configuration:
import sbt.Keys._
lazy val IntegrationDebug = config("itd") extend IntegrationTest
val scalaTestV = "3.0.4"
lazy val root = project.in(file("."))
  .configs(
    IntegrationTest,
    IntegrationDebug
  )
  .settings(
    Defaults.itSettings,
    inConfig(IntegrationDebug)(Defaults.testTasks),
    libraryDependencies ++= Seq(
      "org.scalactic" %% "scalactic" % scalaTestV,
      "org.scalatest" %% "scalatest" % scalaTestV
    ),
    fork in IntegrationTest := true,
    javaOptions in IntegrationDebug += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8123"
  )
However, it doesn't work as expected:
it:test -> vm not suspended (expected)
itd:test -> vm not suspended (unexpected!!)
If I change the scope of the javaOptions to IntegrationTest, i.e.
...
javaOptions in IntegrationTest += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8123",
...
then
it:test -> vm suspended (unexpected!!)
itd:test -> vm suspended (expected)
Is there any way to make it work like:
it:test -> vm not suspended
itd:test -> vm suspended
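(Not part of the original question, but when chasing scope issues like this, sbt's inspect command shows which scope a key's value is actually resolved from:)
> inspect itd:javaOptions
> inspect it:javaOptions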
OK, if I'm not mistaken, I might have found a solution to the problem.
Replacing line:
javaOptions in IntegrationDebug += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8123",
with
testOptions in IntegrationDebug += Tests.Argument("-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8123")
should do the trick (in this case the VM is suspended only for the itd:test command).
I know it's been a long time, but someone might find this useful: I had practically the same problem, and this is what solved it for me.
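Once the suspended VM is listening on the port, a debugger can be attached from the command line with standard JDK tooling (an illustrative usage example, not from the original posts):
jdb -connect com.sun.jdi.SocketAttach:hostname=localhost,port=8123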

SBT: How to Dockerize a fat jar?

I'm building a Docker image with a fat jar. I use the sbt-assembly plugin to build the jar, and sbt-native-packager to build the Docker image. I'm not very familiar with sbt and am running into the following issues.
I'd like to declare a dependency on the assembly task from the docker:publish task, such that the fat jar is created before it's added to the image. I did as instructed in the doc, but it's not working: assembly doesn't run unless I invoke it explicitly.
publish := (publish dependsOn assembly).value
One of the steps in building the image is copying the fat jar. Since the assembly plugin creates the jar at target/scala_whatever/projectname-assembly-X.X.X.jar, I need to know the exact scala_whatever and the jar name. assembly seems to have a key assemblyJarName, but I'm not sure how to access it. I tried the following, which fails:
Cmd("COPY", "target/scala*/*.jar /app.jar")
Help!
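(As a side note on the second issue: sbt-assembly exposes both the jar name and the full output path as keys, so neither has to be hard-coded; a minimal sketch of reading them from inside another task or setting:)
// sketch: sbt-assembly's keys for the fat jar's name and location
val jarName = (assemblyJarName in assembly).value      // e.g. projectname-assembly-X.X.X.jar
val jarFile = (assemblyOutputPath in assembly).value   // full path, including the scala_whatever dir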
Answering my own questions, the following works:
enablePlugins(JavaAppPackaging, DockerPlugin)

assemblyMergeStrategy in assembly := {
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    val strategy = oldStrategy(x)
    if (strategy == MergeStrategy.deduplicate) MergeStrategy.first
    else strategy
}

// Remove all jar mappings in universal and append the fat jar
mappings in Universal := {
  val universalMappings = (mappings in Universal).value
  val fatJar = (assembly in Compile).value
  val filtered = universalMappings.filter {
    case (file, name) => !name.endsWith(".jar")
  }
  filtered :+ (fatJar -> ("lib/" + fatJar.getName))
}

dockerRepository := Some("username")

import com.typesafe.sbt.packager.docker.{Cmd, ExecCmd}
dockerCommands := Seq(
  Cmd("FROM", "username/spark:2.1.0"),
  Cmd("WORKDIR", "/"),
  Cmd("COPY", "opt/docker/lib/*.jar", "/app.jar"),
  ExecCmd("ENTRYPOINT", "/opt/spark/bin/spark-submit", "/app.jar")
)
I completely overwrite the docker commands because the defaults add a couple of scripts I don't need, since I overwrite the entrypoint as well. Also, the default workdir is /opt/docker, which is not where I want to put the fat jar.
Note that the default commands can be inspected with show dockerCommands in the sbt console.

sbt run fails in this trivial example

This is a minimal example showing a very weird behavior of sbt run; if I try to make the example any smaller, the problem no longer arises. Am I missing something, or is this a bug in sbt?
build.sbt
name := "testsbt"
version := "1.0"
scalaVersion := "2.11.8"
project/build.properties
sbt.version = 0.13.13
src/main/scala/application/Test.scala
package application

import java.io._

case class Page(items: List[Item])
case class Item(text: String)

object Test extends App {
  val page = Page(List(Item("item1")))
  val filename = "./page.obj"
  val oos = new ObjectOutputStream(new FileOutputStream(filename))
  try oos.writeObject(page)
  finally oos.close()
  println("read: " + new ObjectInputStream(new FileInputStream(filename)).readObject())
}
Executing this with sbt run fails:
$ sbt run
[error] (run-main-0) java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field application.Page.items of type scala.collection.immutable.List in instance of application.Page
However, building with sbt package and running the jar directly works:
$ sbt package
$ scala -cp target/scala-2.11/testsbt_2.11-1.0.jar application.Test
read: Page(List(Item(item1)))
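(The question is left open above; a commonly cited explanation, offered here as an assumption rather than a quoted answer, is that sbt run executes the program inside sbt's own JVM under a layered classloader, which Java deserialization of Scala collections trips over. Forking the run into a fresh JVM is the usual workaround; a one-line sketch for build.sbt:)
// sketch: run the app in a separate JVM with a plain application classloader
fork in run := true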

How to combine InputKey and TaskKey into a new InputKey?

I have an sbt multi-project build with two sub-projects. One is an ordinary Scala web server project and the other just contains web files. With my self-written SBT plugin I can run Gulp on the web project; this Gulp task runs asynchronously. So with
sbt "web/webAppStart" "server/run"
I can start the Gulp development web server and my Scala backend server in parallel. Now I want to create a new task that combines them both. So afterwards
sbt dev
for example should do the same. Here is what I tried so far:
// project/Build.scala (only the relevant stuff)
object Build extends sbt.Build {
  lazy val runServer: InputKey[Unit] = run in server in Compile
  lazy val runWeb: TaskKey[Unit] = de.choffmeister.sbt.WebAppPlugin.webAppStart
  lazy val dev = InputKey[Unit]("dev", "Starts a development web server")

  // Scala backend project
  lazy val server = (project in file("project-server"))

  // Web frontend project
  lazy val web = (project in file("project-web"))

  // Root project
  lazy val root = (project in file("."))
    .settings(dev <<= runServer map { _ =>
      // do nothing
    })
    .aggregate(server, web)
}
This works so far. Now I have no idea how to make dev also depend on the runWeb task. If I just add the runWeb task like
.settings(dev <<= (runWeb, runServer) map { (_, _) =>
  // do nothing
})
then I get the error
[error] /Users/choffmeister/Development/shop/project/Build.scala:59:value map is not a member of (sbt.TaskKey[Unit], sbt.InputKey[Unit])
[error] .settings(dev <<= (runWeb, runServer) map { (_, _) =>
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
Can anyone help me with this please?
The optimal solution would pass the arguments given to dev on to the runServer task, but I could also live with making dev a TaskKey[Unit] and hard-coding it to run runServer with no arguments.
tl;dr Use .value macro to execute dependent tasks or just alias the task sequence.
Using .value macro
Your case seems overly complicated to my eyes because of the pre-0.13 syntax (<<=) and the use of project/Build.scala (both of which often confuse rather than help people new to sbt).
You should just execute the two tasks inside another task as follows:
dev := {
  runWeb.value
  runServer.value
}
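(One caveat, echoing the first question on this page: tasks referenced via .value are dependencies and may run in any order. If runWeb must strictly precede runServer, sbt 0.13.8+ offers Def.sequential; a sketch:)
// sketch: force runWeb to complete before runServer starts
dev := Def.sequential(runWeb, runServer).value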
The complete example:
lazy val server = project

lazy val runServer = taskKey[Unit]("runServer")
runServer := {
  println("runServer")
  (run in server in Compile).value
}

lazy val runWeb = taskKey[Unit]("runWeb")
runWeb := {
  println("runWeb")
}

lazy val dev = taskKey[Unit]("dev")
dev := {
  println("dev")
}
dev <<= dev dependsOn (runServer, runWeb)
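(In the spirit of the first question on this page, the last line can also be written without the deprecated <<= on newer 0.13.x; a sketch:)
dev := (dev dependsOn (runServer, runWeb)).value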
Using alias command
sbt offers the alias command:
[sbt-learning-space]> help alias
alias
        Prints a list of defined aliases.
alias name
        Prints the alias defined for `name`.
alias name=value
        Sets the alias `name` to `value`, replacing any existing alias with that name.
        Whenever `name` is entered, the corresponding `value` is run.
        If any argument is provided to `name`, it is appended as argument to `value`.
alias name=
        Removes the alias for `name`.
Just define the tasks/commands you want to execute in an alias as follows:
addCommandAlias("devAlias", ";runServer;runWeb")
Use devAlias as if it were a built-in task:
[sbt-learning-space]> devAlias
runServer
[success] Total time: 0 s, completed Jan 25, 2015 6:30:15 PM
runWeb
[success] Total time: 0 s, completed Jan 25, 2015 6:30:15 PM

SBT execute code based on value

I want to do something like the following in SBT:
CrossVersion.partialVersion(scalaVersion.value) match {
  case Some((2, 11)) =>
  case Some((2, 10)) =>
}
But I don't want to assign that to anything, I simply want to run some code based on the value of the current cross version.
I could create a Task and then execute the task, but can I do this without needing the task?
I know you've said you didn't want to create a task, but I would say that's the cleanest way of doing it, so I'll post it as one of the solutions anyway.
Make compile depend on the task
val printScalaVersion = taskKey[Unit]("Prints Scala version")

printScalaVersion := {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => println("2.11")
    case Some((2, 10)) => println("2.10")
    case _             => println("Other version")
  }
}

compile in Compile := ((compile in Compile) dependsOn printScalaVersion).value
Override the Compile Task
If you really don't want to create a new task, you can redefine the compile task and add your code there (though I think it's not as clean as the solution above).
compile in Compile := {
  val analysis = (compile in Compile).value
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => println("2.11")
    case Some((2, 10)) => println("2.10")
    case _             => println("Other version")
  }
  analysis
}
Just a small "enhancement" to what @lpiepiora offered.
There could be a setting that'd hold the value of CrossVersion.partialVersion(scalaVersion.value) as follows:
lazy val sv = settingKey[Option[(Int, Int)]]("")
sv := CrossVersion.partialVersion(scalaVersion.value)
With the setting:
> sv
[info] Some((2,10))
> ++ "2.9.3"
[info] Setting version to 2.9.3
[info] Set current project to projectA (in build file:/C:/dev/sandbox/scalaVersionSetting/)
> sv
[info] Some((2,9))
> ++ "2.10.4"
[info] Setting version to 2.10.4
[info] Set current project to projectA (in build file:/C:/dev/sandbox/scalaVersionSetting/)
> sv
[info] Some((2,10))
> ++ "2.11"
[info] Setting version to 2.11
[info] Set current project to projectA (in build file:/C:/dev/sandbox/scalaVersionSetting/)
> sv
[info] Some((2,11))
...and so on.
That gives a setting to pattern-match on.
lazy val printScalaVersion = taskKey[Unit]("Prints Scala version")
printScalaVersion := {
  sv.value foreach println
}
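(Other settings can branch on it the same way; a hypothetical sketch, with the compiler flag chosen purely for illustration:)
// sketch: vary another setting by Scala version using sv
scalacOptions ++= (sv.value match {
  case Some((2, 11)) => Seq("-Ywarn-unused") // hypothetical per-version flag
  case _             => Seq.empty
})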
