SBT task dependsOn

Since sbt 0.13.13 the <<= operator is deprecated, so this is deprecated:
compile in Compile <<= (compile in Compile).dependsOn(apiDoc)
So the only way I found to do it is this:
compile in Compile := {
  apiDoc.value
  (compile in Compile).value
}
But now I have a warning about a useless expression apiDoc.value.
But this is not useless!
I can't find any documentation about the new way to do this.

I haven't found docs for this, but you can create a dependsOn like:
compile.in(Compile) := compile.dependsOn(apiDoc).value
Note that if you're doing this for an InputTask, you'll need to use evaluated instead of value:
myInputTask := myInputTask.dependsOn(apiDoc).evaluated

I've struggled with the problem of dependent tasks, and it all became clear after I read this page: http://www.beyondthelines.net/computing/understanding-sbt/
TL;DR: to make a task depend on another inside a task definition, you have to use Def.sequential (examples from the page):
lazy val A = taskKey[Unit]("Task A")
A in Global := { println("Task A") }
lazy val B = taskKey[Unit]("Task B")
B in Global := { println("Task B") }
lazy val C = taskKey[Unit]("Task C")
C := Def.sequential(A, B).value
So for your case, it would be:
compile in Compile := Def.sequential(apiDoc, compile in Compile).value
Or, if you use the new sbt syntax and have different scoping:
ThisBuild / Compile / compile := Def.sequential(apiDoc, Compile / compile).value

Related

Conditional scalacSettings / settingKey

I want my scalacSettings to be more strict (more linting) when I issue my own command validate.
What is the best way to achieve that?
A new scope (strict) did work, but it requires compiling the project twice when you issue test. So that's not an option.
An sbt custom command allows temporary modification of the build state, which can be discarded after the command finishes:
def validate: Command = Command.command("validate") { state =>
  import Project._
  val stateWithStrictScalacSettings =
    extract(state).appendWithSession(
      Seq(Compile / scalacOptions ++= Seq(
        "-Ywarn-unused:imports",
        "-Xfatal-warnings",
        "..."
      )),
      state
    )
  val (s, _) = extract(stateWithStrictScalacSettings).runTask(Test / test, stateWithStrictScalacSettings)
  s
}
commands ++= Seq(validate)
Or, more succinctly, using the :: convenience method for State transformations:
commands += Command.command("validate") { state =>
  """set scalacOptions in Compile := Seq("-Ywarn-unused:imports", "-Xfatal-warnings", "...")""" ::
    "test" :: state
}
This way we can use sbt test during development, while our CI hooks into sbt validate which uses stateWithStrictScalacSettings.

Using runTask(...mainClass...) inside inputTask with command line args, := vs <<= or?

I want to define a task which runs MyMainClass (defined in the same module where the task is defined), passing the task's command line arguments to it:
$ sbt
> project myModule
> myKey2 someArgument
...compiles MyMainClass
...runs MyMainClass.main("someArgument")
Without command line args, this works:
val myKey1 = taskKey[Unit]("myKey1")
lazy val myModule = project.settings(
  myKey1 <<= runTask(Compile, "MyMainClass", "mode1"),
  myKey1 <<= myKey1.dependsOn(compile in Compile)
)
But I could not make it work with command line args. Trying to use Def.spaceDelimited().parsed with taskKey gives me a compilation error explicitly saying that I must use inputKey instead; trying to use <<= with inputKey does not compile either; this compiles but does not work:
val myKey2 = inputKey[Unit]("myKey2")
lazy val myModule = project.settings(
  ...
  myKey2 := runTask(
    Compile, "MyMainClass", "mode2",
    {
      val args = Def.spaceDelimited().parsed
      // This line is executed, but MyMainClass.main() is not:
      System.err.println("***** args=" + args)
      args.head
    }
  ),
  myKey2 <<= myKey2.dependsOn(compile in Compile)
)
Tried SBT 0.13.7 and 0.13.9. Please help. Thanks. :)
UPD. Or maybe I'm doing this in a completely wrong (deprecated) way? I could not find any mention of <<= in the SBT 0.13 docs at all.
Rewritten in new style (:= instead of <<=).
This worked:
myKey1 := {
  // Works without this line, but kept it for clarity and just in case:
  val _ = (compile in Compile).value
  runTask(Compile, "MyMainClass1", "mode1").value
},
myKey2 := {
  val _ = (compile in Compile).value
  runInputTask(Compile, "MyMainClass", "mode2").evaluated
}
BTW, directly accessing .value in this procedural style feels conceptually much simpler than the old ways I used to use (I guess that was before sbt was rewritten using macros).

Including project in build depending on setting's value, e.g. scalaVersion?

I have a Scala project that is divided into several subprojects:
lazy val core: Seq[ProjectReference] = Seq(common, json_scalaz7, json_scalaz)
I'd like to make the core lazy val conditional on the Scala version I'm currently using, so I tried this:
lazy val core2: Seq[ProjectReference] = scalaVersion {
  case "2.11.0" => Seq(common, json_scalaz7)
  case _ => Seq(common, json_scalaz7, json_scalaz)
}
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0 (when the value of the scalaVersion setting is "2.11.0").
This however gives me the following compilation error:
[error] /home/diego/work/lift/framework/project/Build.scala:39: type mismatch;
[error] found : sbt.Project.Initialize[Seq[sbt.Project]]
[error] required: Seq[sbt.ProjectReference]
[error] lazy val core2: Seq[ProjectReference] = scalaVersion {
[error] ^
[error] one error found
Any idea how to solve this?
Update
I'm using sbt version 0.12.4
This project is the Lift project, which compiles against "2.10.0", "2.9.2", "2.9.1-1", "2.9.1" and now we are working on getting it to compile with 2.11.0. So creating a compile all task would not be practical, as it would take a really long time.
Update 2
I'm hoping there is something like this:
lazy val scala_xml = "org.scala-lang.modules" %% "scala-xml" % "1.0.1"
lazy val scala_parser = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.1"
...
lazy val common =
  coreProject("common")
    .settings(description := "Common Libraries and Utilities",
      libraryDependencies ++= Seq(slf4j_api, logback, slf4j_log4j12),
      libraryDependencies <++= scalaVersion {
        case "2.11.0" => Seq(scala_xml, scala_parser)
        case _ => Seq()
      }
    )
but for the projects list
Note how depending on the scala version, I add the scala_xml and scala_parser_combinator libraries
You can see the complete build file here
Cross building a project
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0
The built-in support in sbt for this is called cross building, which is described in Cross-Building a Project. Here's the relevant part of that section, with a bit of correction:
Define the versions of Scala to build against in the crossScalaVersions setting. For example, in a .sbt build definition:
crossScalaVersions := Seq("2.10.4", "2.11.0")
To build against all versions listed in crossScalaVersions, prefix the action to run with +. For example:
> +compile
Multiple-project builds
sbt also has built-in support to aggregate tasks across multiple projects, which is described in Aggregation. If what you need eventually is normal built-in tasks like compile and test, you could set up a dummy aggregate without json_scalaz.
lazy val withoutJsonScalaz = (project in file("without-json-scalaz"))
  .aggregate(liftProjects filterNot {_ == json_scalaz}: _*)
From the shell, you should be able to use this as:
> ++2.11.0
> project withoutJsonScalaz
> test
Getting values from multiple scopes
Another feature you might be interested in is ScopeFilter, which has the ability to traverse multiple projects beyond the usual aggregation and cross building. You would need to create a setting whose type is ScopeFilter and set it based on scalaBinaryVersion.value. With scope filters, you can do:
val coreProjects = settingKey[ScopeFilter]("my core projects")
val compileAll = taskKey[Seq[sbt.inc.Analysis]]("compile all")
coreProjects := {
  scalaBinaryVersion.value match {
    case "2.10" => ScopeFilter(inProjects(common, json_scalaz7, json_scalaz))
    case _ => ScopeFilter(inProjects(common, json_scalaz7))
  }
}
compileAll := compileAllTask.value
lazy val compileAllTask = Def.taskDyn {
  val f = coreProjects.value
  (compile in Compile) all f
}
In this case compileAll would have the same effect as +compile, but you could aggregate the result and do something interesting like sbt-unidoc.

Using sbt-aether-deploy with sbt-native-packager

Has anyone published an sbt-native-packager produced artifact (tgz in my case) using sbt-aether-deploy to a nexus repo? (I need this for the timestamped snapshots, specifically the "correct" version tag in nexus' artifact-resolution REST resource).
I can do one or the other but can't figure out how to add the packagedArtifacts in Universal to the artifacts that sbt-aether-deploy deploys to do both.
I suspect the path to pursue would be to addArtifact() the packagedArtifacts in Universal, or to create another AetherArtifact and then override/replace the deploy task to use that AetherArtifact?
Any help much appreciated.
I am the author of the sbt-aether-deploy plugin, and I just came across this post.
import aether.AetherKeys._
crossPaths := false // needed if you want to remove the scala version from the artifact name
enablePlugins(JavaAppPackaging)
aetherArtifact := {
  val artifact = aetherArtifact.value
  artifact.attach((packageBin in Universal).value, "dist", "zip")
}
This will also publish the other main artifact.
If you want to disable publishing of the main artifact, then you will need to rewrite the artifact coordinates. Maven requires a main artifact.
I have added a way to replace the main artifact for this purpose, but I can now see that approach is kind of flawed: it still assumes that the artifact is published as a jar file. The main artifact type is locked down to that, since the POM packaging is set to jar by default by sbt.
If this is an app, that limitation is probably OK, since Maven will never need to resolve it as a dependency.
The "proper" way in Maven terms is to add a classifier to the artifact and change the "packaging" in the POM file to "pom". We will see if I get around to changing that particular part.
Ok, I think I got it, amazingly enough. If there's a better way to do it I'd love to hear it. Not loving that blind Option.get there...
val tgzCoordinates = SettingKey[MavenCoordinates]("the maven coordinates for the tgz")
lazy val myPackagerSettings = packageArchetype.java_application ++ deploymentSettings ++ Seq(
  publish <<= publish.dependsOn(publish in Universal),
  publishLocal <<= publishLocal.dependsOn(publishLocal in Universal)
)
lazy val defaultSettings = buildSettings ++ Publish.settings ++ Seq(
  scalacOptions in Compile ++= Seq("-encoding", "UTF-8", "-target:jvm-1.7", "-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls"),
  testOptions in Test += Tests.Argument("-oDF")
)
lazy val myAetherSettings = aetherSettings ++ aetherPublishBothSettings
lazy val toastyphoenixProject = Project(
  id = "toastyphoenix",
  base = file("."),
  settings = defaultSettings ++ myPackagerSettings ++ myAetherSettings ++ Seq(
    name in Universal := name.value + "_" + scalaBinaryVersion.value,
    packagedArtifacts in Universal ~= { _.filterNot { case (artifact, file) => artifact.`type`.contains("zip") } },
    libraryDependencies ++= Dependencies.phoenix,
    tgzCoordinates := MavenCoordinates(organization.value + ":" + (name in Universal).value + ":tgz:" + version.value).get,
    aetherArtifact <<= (tgzCoordinates, packageZipTarball in Universal, makePom in Compile, packagedArtifacts in Universal) map {
      (coords: MavenCoordinates, mainArtifact: File, pom: File, artifacts: Map[Artifact, File]) =>
        createArtifact(artifacts, pom, coords, mainArtifact)
    }
  )
)
I took Peter's solution and reworked it slightly, avoiding the naked Option.get by creating the MavenCoordinates directly:
import aether.MavenCoordinates
import aether.Aether.createArtifact
name := "mrb-test"
organization := "me.mbarton"
version := "1.0"
crossPaths := false
packageArchetype.java_application
publish <<= (publish) dependsOn (publish in Universal)
publishLocal <<= (publishLocal) dependsOn (publishLocal in Universal)
aetherPublishBothSettings
aetherArtifact <<= (organization, name in Universal, version, packageBin in Universal, makePom in Compile, packagedArtifacts in Universal) map {
  (organization, name, version, binary, pom, artifacts) =>
    val nameWithoutVersion = name.replace(s"-$version", "")
    createArtifact(artifacts, pom, MavenCoordinates(organization, nameWithoutVersion, version, None, "zip"), binary)
}
The nameWithoutVersion replace works around SBT native packager including the version in the artifact name:
Before: me/mbarton/mrb-test-1.0/1.0/mrb-test-1.0.zip
After: me/mbarton/mrb-test/1.0/mrb-test-1.0.zip
crossPaths avoids the Scala postfix on the version.

sbt-assembly: skip specific test

I would like to configure sbt-assembly to skip a specific test class.
Is there any way to do this? If it helps, I tagged the test using ScalaTest #Network tag.
See Additional test configurations with shared sources. This allows you to come up with an alternative "test" task in the FunTest configuration while reusing your test sources.
After you have fun:test working with whatever filter you define using testOptions in FunTest := Seq(Tests.Filter(itFilter)), you can then rewire:
test in assembly := (test in FunTest).value
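For context, here is a minimal sketch of what that FunTest wiring could look like in a build.sbt; the itFilter predicate (skipping anything whose name ends in NetworkSpec) is just an illustrative assumption:
lazy val FunTest = config("fun") extend Test
// Hypothetical filter: run everything except the network-dependent specs.
def itFilter(name: String): Boolean = !name.endsWith("NetworkSpec")
lazy val root = (project in file("."))
  .configs(FunTest)
  .settings(
    inConfig(FunTest)(Defaults.testTasks),
    testOptions in FunTest := Seq(Tests.Filter(itFilter)),
    test in assembly := (test in FunTest).value
  )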
Eugene is right (obviously), but that wasn't quite enough information for me to get it to work - I have a build.scala file. I am defining baseSettings like this:
val baseSettings = Defaults.defaultSettings ++
  buildSettings ++
  Seq(sbt.Keys.test in assembly := {})
You can tag your tests with ignore, then sbt/ScalaTest won't run them. See ScalaTest docs on Tagging tests.
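For example, a minimal ScalaTest sketch of that approach (the spec name and body are placeholders):
import org.scalatest.FunSuite
class NetworkSpec extends FunSuite {
  // `ignore` keeps the test compiled, but ScalaTest reports it as ignored instead of running it.
  ignore("talks to the external service over the network") {
    // ...
  }
}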
Just for completeness: if you want to skip all tests in the assembly task, or run only particular ones, you can customize it with test in assembly := { ... }
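For instance, skipping the tests entirely when assembling is just:
test in assembly := {}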
Based on @eugene-yokota's reply, I found how to use the tag exclusion flag from ScalaTest:
lazy val UnitTest = config("unit") extend (Test)
lazy val companyApp = (project in file("applications/"))
  .assembly("com.company.app", "app.jar")
  .configs(UnitTest)
  .settings(
    inConfig(UnitTest)(Defaults.testTasks),
    UnitTest / testOptions ++= Seq(
      Tests.Argument(
        TestFrameworks.ScalaTest,
        "-l",
        "com.company.tag.HttpIntegrationTest"
      ),
      Tests.Argument(
        TestFrameworks.ScalaTest,
        "-l",
        "com.company.tag.EsIntegrationTest"
      )
    ),
    test in assembly := (UnitTest / test).value
  )
