How to execute clean task in dependent projects from a task? - sbt

How do I clean dependent projects in SBT from inside the code of a task?
I've already checked these related questions:
SBT: Traverse project dependency graph
get reference to "child" projects from "parent" in sbt
but I'm getting a little lost with the unfamiliar syntax.
I've tried this:
projectDependencies.value.foreach { p =>
System.out.println(s"Cleaning ${p.name}")
(clean.all(ScopeFilter(inProjects(new LocalProject(p.name))))).value
}
but SBT complains about dynamic scope:
Illegal dynamic reference: p

Use the following in build.sbt:
val selectDeps = ScopeFilter(inDependencies(ThisProject))
clean in Compile := clean.all(selectDeps).value

Based on the solution offered by Jacek Laskowski (thanks), here is a more complete snippet:
val cleanDependencies = taskKey[Seq[Unit]]("Clean dependencies of current project")
lazy val MyProject = project.settings(Seq(
cleanDependencies <<= clean.all(ScopeFilter(inDependencies(ThisProject))),
`package` <<= `package`.dependsOn(clean, cleanDependencies)
): _*)
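For reference, on sbt 1.x (where `<<=` has been removed) a sketch of the same idea might look like the following; the key name cleanDependencies is just the one from the snippet above, and the slash syntax replaces `in`:

```scala
// sbt 1.x sketch: use := with .value instead of <<=, and dependsOn on the task.
val cleanDependencies = taskKey[Unit]("Clean dependencies of current project")

lazy val MyProject = project.settings(
  cleanDependencies := {
    clean.all(ScopeFilter(inDependencies(ThisProject))).value
    ()
  },
  Compile / packageBin := (Compile / packageBin)
    .dependsOn(clean, cleanDependencies)
    .value
)
```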

Related

Define configuration with compiler plugin

I'd like to create a compile configuration which is the same as the default one but adds a compiler plugin. In my particular case, I want a "dev" configuration that adds the linter plugin (https://github.com/HairyFotr/linter), because the plugin slows down compile times and there's no need to run it in production or continuous integration.
Now this is what I tried:
lazy val Dev = config("dev") extend Compile
lazy val root = (project in file(".")).configs(Dev).settings(
inConfig(Dev)(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12")): _*)
and it should work, since when I inspect dev:libraryDependencies, it's what I expect it to be: it has org.psywerx.hairyfotr:linter:0.1.12:plugin->default(compile). Normally, if I add the library with a "plugin" scope, it does work for the default settings:
libraryDependencies += "org.psywerx.hairyfotr" %% "linter" % "0.1.12" % "plugin"
It just does not work if I add this under a different configuration, so there must be something else going on here.
This solves the problem, though not exactly in the way that was asked. Here's the full build.sbt:
libraryDependencies ++= Seq(
"org.psywerx.hairyfotr" %% "linter" % "0.1.14" % "test")
val linter = Command.command("linter")(state => {
val linterJar = for {
(newState, result) <- Project.runTask(fullClasspath in Test, state)
cp <- result.toEither.right.toOption
linter <- cp.find(
_.get(moduleID.key).exists(mId =>
mId.organization == "org.psywerx.hairyfotr" &&
mId.name == "linter_2.11"))
} yield linter.data.absolutePath
val res = Project.runTask(scalacOptions, state)
res match {
case Some((newState, result)) =>
result.toEither.right.foreach { defaultScalacOptions =>
Project.runTask(compile in Test,
Project.extract(state).append(
scalacOptions := defaultScalacOptions ++ linterJar.map(p => Seq(s"-Xplugin:$p")).getOrElse(Seq.empty),
newState))
}
case None => sys.error("Couldn't get defaultScalacOptions")
}
state
})
lazy val root = (project in file(".")).configs(Test).settings(commands ++= Seq(linter))
The fact that you return the unmodified state means you don't change the project settings. So if you run sbt linter, you should get your project compiled with the additional scalacOptions, but if you run compile in the same sbt session, it will not use those additional settings.
The tricky thing here is that scalacOptions is actually a TaskKey, not a SettingKey. I don't know why that is, but to get its value you have to run that task. One reason might be that in sbt you cannot make a setting depend on a task, but you can make a task depend on a task. In other words, scalacOptions can depend on another task's value, and maybe internally it does; I haven't checked. If the current answer works for you, I can try to think of a more elegant way of achieving the same result.
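To illustrate that distinction with a small sketch (printOptions is a made-up key, not part of sbt):

```scala
// A task can depend on another task's value:
val printOptions = taskKey[Unit]("Print the scalac options")
printOptions := println((scalacOptions in Compile).value.mkString(" "))

// ...but a setting cannot. This would be rejected by sbt,
// because a setting cannot depend on a task:
// name := (scalacOptions in Compile).value.mkString
```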
EDIT: modified the code to specify the scalacOptions for the linter plugin proper. Please note the plugin has to be a managed dependency, not just a downloaded jar, for this solution to work. If you want to keep it unmanaged, there's a way, but I won't go into it for now. Additionally, I've taken the liberty of making it also work for test code, for illustration purposes.
Looking at Defaults.scala in the source, it seems that the compile command always takes its options from the compile scope. So if I'm correct, you can only have one set of compilation options!
This seems to be confirmed by the fact that scalacOptions behaves the same way, which is also why I don't see a non-hacky answer to these similar questions:
Different scalac options for different scopes or tasks?
Different compile options for tests and release in SBT?
I'd be happy to be proven wrong.
EDIT: FWIW, one might not be able to define another scalac options profile in the same project, but you could do so in a "different" project:
lazy val dev = (project in file(".")).
settings(target := baseDirectory.value / "target" / "dev").
settings(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12"): _*)
This has the disadvantage that it has a separate output directory, so it will take more space and, more importantly, will not get incremental compiles between the two projects. However, after spending some time thinking about it, this may be by design. After all, even though linters don't, some scalac compilation options could conceivably change the output. This would make it meaningless to try to keep the metadata for incremental compilation from one set of scalac options to another. Thus different scalac options would indeed require different target directories.

How to declare subproject dependency from another project?

I have two projects A and B in sibling directories. I know I can make A depend on B using RootProject:
object A extends Build {
lazy val A = project in file(".") dependsOn(B)
lazy val B = RootProject(file("../B"))
}
But B includes two core and extensions subprojects, and I want to depend on core only. Is this possible?
The answer turns out to be given at SBT dependsOn RootProject: doesn't compile the dependency:
lazy val BCore = ProjectRef(file("../B"), "core")
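Putting it together, the build definition from the question would become something like this (a sketch; the id "core" must match whatever id B's build assigns to that subproject):

```scala
object A extends Build {
  lazy val A = project in file(".") dependsOn BCore
  // Reference just the "core" subproject of the build at ../B
  lazy val BCore = ProjectRef(file("../B"), "core")
}
```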

Where to place resources, e.g. images, that scaladoc can use?

I am currently writing the documentation of an API written in Scala. I would like to include several diagrams to make the code more understandable.
I am wondering where to put resources (such as diagrams) in order that they can be automatically imported by an invocation of scaladoc, and how to refer to these resources in the documentation of the code.
For example, let's assume that I use sbt. My code is located in the src/main/scala directory. Here is an example of a scala package object for package foo:
/**
* Provides main classes of the bar API.
*
* ==Overview==
* Main classes are depicted on the following diagram:
* <img src="path/to/diagram-foo.svg" />
*
*/
package object foo {
}
Where should 'diagram-foo.svg' be located in my project in order to be visible to scaladoc? Subsequently, what is the correct value of path/to/ in the img tag?
WARNING It may be a hack as I know very little about scaladoc.
Since <img src="../path/to/diagram-foo.svg" /> is just regular HTML, you only need to copy the necessary assets into the doc target path so the img resolves.
You can use the following custom copyDocAssetsTask, which, combined with (doc in Compile) and a src/main/doc-resources directory, gives you what you want. The point is to copy the images to the directory where the documentation is generated, i.e. (target in (Compile, doc)).value.
build.sbt:
lazy val copyDocAssetsTask = taskKey[Unit]("Copy doc assets")
copyDocAssetsTask := {
println("Copying doc assets")
val sourceDir = file("src/main/doc-resources")
val targetDir = (target in (Compile, doc)).value
IO.copyDirectory(sourceDir, targetDir)
}
copyDocAssetsTask <<= copyDocAssetsTask triggeredBy (doc in Compile)
Obviously the directory where you place the images is arbitrary, and when you decide otherwise, just update the custom task accordingly.
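For what it's worth, on sbt 1.x (where <<= and the triggeredBy operator form are gone) the same task might be sketched like this:

```scala
lazy val copyDocAssets = taskKey[Unit]("Copy doc assets")

copyDocAssets := {
  val sourceDir = baseDirectory.value / "src" / "main" / "doc-resources"
  val targetDir = (Compile / doc / target).value
  IO.copyDirectory(sourceDir, targetDir)
}

copyDocAssets := copyDocAssets.triggeredBy(Compile / doc).value
```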
Thanks, I used an adaptation of this that I hope might help others, particularly on multi-module projects:
First, unidoc at https://github.com/sbt/sbt-unidoc will merge your scaladoc from multi-module projects into a single location, which is typically what you want. Then the following in build.sbt:
lazy val copyDocAssetsTask = taskKey[Unit]("Copy unidoc resources")
copyDocAssetsTask := {
println("Copying unidoc resources")
val sourceDir = file("src/main/doc-resources")
val targetDir = (target in (Compile, doc)).value.getParentFile
println(s"from ${sourceDir.getAbsolutePath} to ${targetDir.getAbsolutePath}")
IO.copyDirectory(sourceDir, new java.io.File(targetDir, "unidoc"))
}
copyDocAssetsTask := (copyDocAssetsTask triggeredBy (unidoc in Compile)).value
Then put your documents under src/main/doc-resources in the root project, in subdirectories that mirror the package of the class whose scaladoc includes the diagram (this just saves you having to mess around with parent directories in the URL), and embed something like:
<img src="DesignModel.svg" width="98%"/> in your scaladoc
e.g. if this scaladoc was in a class in package com.someone.thing in any project in the multi-module build, the DesignModel.svg file would go in src/main/doc-resources/com/someone/thing inside the root project.

How to call our own .class postprocessor after compile?

My company is switching from ant to sbt to ease Scala integration into our huge existing Java code base (a smart move, if you ask me).
After compiling, we usually post-process all the generated .class files with a tool of our own, which is itself a result of the compilation.
I have been trying to do the same in sbt, and it appears more complicated than expected.
I tried:
- calling our postprocessor with fullRunTask. This works fine, but we would like to pass "products.value" to locate the .class files, and that does not work
- an even better solution would be to extend compile (compile in Compile ~= { result => ...). But I did not find how the code after "result =>" can call our postprocessor
- we are looking at other solutions: multiple projects, one for the postprocessor and one for the rest of the code. This would be clean, but because the source code is entangled it is not as easy as it seems (and we would still have the first problem)
Any help?
I would just write a simple plugin that runs after the other stages. It can inspect the target folder for all the .class files.
You could then do something like sbt clean compile myplugin in your build server.
This is the approach taken by the proguard plugin[1]. You could take a look at that as a starting point.
[1] https://github.com/sbt/sbt-proguard
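If you'd rather not write a plugin, another option is to re-define compile so it runs the post-processor after regular compilation. This is only a sketch; PostProcessor.run is a placeholder for your own tool, not a real API:

```scala
// Sketch: override compile to run a post-processor over the produced .class files.
compile in Compile := {
  val analysis = (compile in Compile).value          // run the real compile first
  val classesDir = (classDirectory in Compile).value // where the .class files land
  // PostProcessor.run(classesDir)                   // hypothetical call to your tool
  analysis                                           // return the analysis unchanged
}
```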
Finally, I found a solution after reading "SBT in Action" and other documents. It is very simple but understanding SBT is not (at least for me).
name := "Foo"
version := "1.0"
scalaVersion := "2.11.0"
fork := true
lazy val foo = TaskKey[Unit]("foo")
val dynamic = Def.taskDyn {
val classDir = (classDirectory in Compile).value
val command = " Foo "+classDir
(runMain in Compile).toTask(command)
}
foo := {
dynamic.value
}
foo <<= foo triggeredBy(compile in Compile)
The sample project contains a Foo.scala with the main function
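A minimal Foo.scala compatible with the build above might look like the following (a sketch; the actual post-processing logic is up to you). It takes the class directory as its first argument and lists the .class files under it:

```scala
import java.io.File

object Foo {
  // Recursively collect all .class files under the given directory.
  def classFiles(dir: File): Seq[File] = {
    val children = Option(dir.listFiles).map(_.toSeq).getOrElse(Seq.empty)
    children.flatMap {
      case d if d.isDirectory                => classFiles(d)
      case f if f.getName.endsWith(".class") => Seq(f)
      case _                                 => Seq.empty
    }
  }

  def main(args: Array[String]): Unit =
    classFiles(new File(args(0))).foreach(f => println(f.getPath))
}
```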

How to share version values between project/plugins.sbt and project/Build.scala?

I would like to share a common version variable between an sbtPlugin and the rest of the build.
Here is what I am trying:
in project/Build.scala:
object Versions {
val scalaJs = "0.5.0-M3"
}
object MyBuild extends Build {
//Use version number
}
in plugins.sbt:
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
results in
plugins.sbt:15: error: not found: value Versions
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
Is there a way to share the version number specification between plugins.sbt and the rest of the build, e.g. project/Build.scala?
sbt-buildinfo
If you need to share a version number between build.sbt and hello.scala, what would you normally do? I don't know about you, but I would use sbt-buildinfo, which I wrote.
This can be configured using buildInfoKeys setting to expose arbitrary key values like version or some custom String value. I understand this is not exactly what you're asking but bear with me.
meta-build (turtles all the way down)
As Jacek noted, and as stated in the Getting Started Guide, the build in sbt is itself a project, defined by the build located in the project directory one level down. To distinguish the builds, let's call the normal build the proper build, and the build that defines the proper build the meta-build. For example, we can say that an sbt plugin is a library of the root project in the meta-build.
Now let's get back to your question. How can we share info between project/Build.scala and project/plugins.sbt?
using sbt-buildinfo for meta-build
We can just define another level of build by creating project/project and add sbt-buildinfo to the (meta-)meta-build.
Here are the files.
In project/project/buildinfo.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.3.2")
In project/project/Dependencies.scala:
package metabuild
object Dependencies {
def scalaJsVersion = "0.5.0-M2"
}
In project/build.properties:
sbt.version=0.13.5
In project/buildinfo.sbt:
import metabuild.Dependencies._
buildInfoSettings
sourceGenerators in Compile <+= buildInfo
buildInfoKeys := Seq[BuildInfoKey]("scalaJsVersion" -> scalaJsVersion)
buildInfoPackage := "metabuild"
In project/scalajs.sbt:
import metabuild.Dependencies._
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % scalaJsVersion)
In project/Build.scala:
import sbt._
import Keys._
import metabuild.BuildInfo._
object Builds extends Build {
println(s"test: $scalaJsVersion")
}
So there's a bit of boilerplate in project/buildinfo.sbt, but the version info is shared across the build definition and the plugin declaration.
If you're curious where BuildInfo is defined, peek into project/target/scala-2.10/sbt-0.13/src_managed/.
For the project/plugins.sbt file, you'd have to have another project under project with the Versions.scala file. That would make the definition of Versions.scalaJs visible.
The reason is that *.sbt files belong to the project build definition at the current level, with *.scala files under project to expand on it. And it's... turtles all the way down, i.e. sbt is recursive.
I'm not sure how much the following can help, but it might be worth trying out. To share versions between projects (the plugins and the main one) you'd have to use ProjectRef, as described in the answer to RootProject and ProjectRef:
When you want to include other, separate builds directly instead of
using their published binaries, you use "source dependencies". This is
what RootProject and ProjectRef declare. ProjectRef is the most
general: you specify the location of the build (a URI) and the ID of
the project in the build (a String) that you want to depend on.
RootProject is a convenience that selects the root project for the
build at the URI you specify.
My proposal is a hack. For example, in build.sbt you can add a task:
val readPluginSbt = taskKey[String]("Read plugins.sbt file.")
readPluginSbt := {
val lineIterator = scala.io.Source.fromFile(new java.io.File("project","plugins.sbt")).getLines
val linesWithValIterator = lineIterator.filter(line => line.contains("scalaxbVersion"))
val versionString = linesWithValIterator.mkString("\n").split("=")(1).trim
val version = versionString.split("\n")(0) // only val declaration
println(version)
version
}
When you call readPluginSbt, it reads plugins.sbt, filters for the scalaxbVersion line and prints the extracted value. You can parse this file and extract the variable the same way.
For example:
resolvers += Resolver.sonatypeRepo("public")
val scalaxbVersion = "1.1.2"
addSbtPlugin("org.scalaxb" % "sbt-scalaxb" % scalaxbVersion)
addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.5.1")
You can extract scalaxbVersion with regular expressions/split:
scala> val line = """val scalaxbVersion = "1.1.2""""
line: String = val scalaxbVersion = "1.1.2"
scala> line.split("=")(1).trim
res1: String = "1.1.2"
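A regular expression tends to be a bit more robust than split for this kind of extraction. Here is a sketch (the val name scalaxbVersion is taken from the example above):

```scala
import scala.util.matching.Regex

// Matches lines like: val scalaxbVersion = "1.1.2"
val VersionPattern: Regex = """val\s+scalaxbVersion\s*=\s*"([^"]+)"""".r

// Return the first version found in the given lines, if any.
def extractVersion(lines: Seq[String]): Option[String] =
  lines.flatMap(l => VersionPattern.findFirstMatchIn(l).map(_.group(1))).headOption
```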
