how to set sbt plugins invoke scope?

import java.nio.file.Files

val webAssemblyTask = TaskKey[Unit](
  "web-assembly",
  "assembly web/war like run-time package"
)

var out: TaskStreams = _

// webAssemblyOutputDir and outputDirProjectName are keys defined elsewhere in the plugin
val baseSettings: Seq[Setting[_]] = Seq(
  webAssemblyOutputDir <<= (sourceManaged) { _ / "build" },
  webAssemblyTask <<= (
    streams,
    target,
    sourceDirectory,
    outputDirProjectName
  ) map {
    (out_log, targetDir, sourceDir, outputDirProjectName) => {
      out_log.log.info("web-assembly start")
      out_log.log.info("sourceDir:" + sourceDir.getAbsolutePath)
      out_log.log.info("targetDir:" + targetDir.getAbsolutePath)
      val sourceAssetsDir = (sourceDir / "webapp" / "assets").toPath
      val classesAssetsDir = (targetDir / "scala-2.10" / "classes" / "assets").toPath
      Files.createSymbolicLink(classesAssetsDir, sourceAssetsDir)
    }
  }
)
val webAssemblySettings = inConfig(Runtime)(baseSettings)
I wrote an sbt plugin.
When I type webAssembly in the sbt console, the plugin runs fine.
But I want it to run after compile and before runtime. How can I do that?

how to set sbt plugins invoke scope?
I think you're confusing the configuration (also known as Maven scope) name with tasks like compile and run. They happen to have related configurations, but that doesn't mean the compile task is identical to the Compile configuration.
I could interpret this question as: how can a plugin setting invoke tasks scoped in some other configuration? For that you use the in method, like key in (Config) or key in (Config, task). Another way to interpret it may be: how can plugin tasks be scoped in a configuration? For that you use inConfig(Config)(...), which you're already doing. But you'd typically want plugins to be configuration neutral. See my blog post for more details on this.
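As a quick illustration of the two forms, a sketch using standard sbt 0.13 keys (the option values are made up):
scalacOptions in (Compile, console) := Seq("-deprecation")  // a key scoped to a configuration and a task

inConfig(Runtime)(baseSettings)  // a whole sequence of settings scoped into a configuration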
I want to run after compile, before run, how can I do it?
This makes much more sense. In sbt you mostly focus on the preconditions of tasks. One of the useful commands is inspect tree <key>. You can run it for the run task and get the entire tree of tasks/settings that it depends on. Here's where you see it calling compile:compile (another notation for compile in Compile):
helloworld> inspect tree run
[info] compile:run = InputTask[Unit]
[info] +-runtime:fullClasspath = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | +-runtime:exportedProducts = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | | +-compile:packageBin::artifact = Artifact(sbt-sequential,jar,jar,None,List(compile),None,Map())
[info] | | +-runtime:configuration = runtime
[info] | | +-runtime:products = Task[scala.collection.Seq[java.io.File]]
[info] | | | +-compile:classDirectory = target/scala-2.10/sbt-0.13/classes
[info] | | | +-compile:copyResources = Task[scala.collection.Seq[scala.Tuple2[java.io.File, java.io.File]]]
[info] | | | +-compile:compile = Task[sbt.inc.Analysis]
This is useful in discovering runtime:products, which is "Build products that get packaged" according to the help products command:
helloworld> help products
Build products that get packaged.
Since runtime:products happens before compile:run, if it depends on your task, your task will be called before compile:run (inspect tree also shows that run resolves to compile:run).
To simplify your plugin task, I'm just going to call it sayHello:
val sayHello = taskKey[Unit]("something")

sayHello := {
  println("hello")
}
You can rewire products in Runtime as follows:
products in Runtime := {
  val old = (products in Runtime).value
  sayHello.value
  old
}
This satisfies the "before run" part. You also want to make sure that it runs after compile. Again, just add a task dependency to it:
sayHello := {
  (compile in Compile).value
  println("hello")
}
When the user runs the run task, sbt will correctly calculate the dependencies and run the sayHello task somewhere between compile and run.
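Putting the two pieces together, a minimal build.sbt sketch (assuming sbt 0.13; sayHello is the toy task from above):
lazy val sayHello = taskKey[Unit]("something")

sayHello := {
  (compile in Compile).value  // ensures compile runs first
  println("hello")
}

products in Runtime := {
  val old = (products in Runtime).value
  sayHello.value  // runs before anything that needs runtime:products, such as run
  old
}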

Related

Key in Configuration: how to list Configurations and Keys?

The sbt in Action book introduces the concept of a Key in Configuration.
It then lists the default configurations:
Compile
Test
Runtime
IntegrationTest
Q1) Is it possible to print out a list of all Configurations from an sbt session? If not, can I find information on Configurations in the sbt documentation?
Q2) For a particular Configuration, e.g. Compile, is it possible to print out a list of its Keys from an sbt session? If not, can I find information on a Configuration's Keys in the sbt documentation?
List of all configurations
For this you can use a setting like so:
val allConfs = settingKey[List[String]]("Returns all configurations for the current project")

val root = (project in file("."))
  .settings(
    name := "scala-tests",
    allConfs := {
      configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
        .map(_.name)
    }
  )
This shows the names of all configurations. You can access more details about each configuration inside the map.
Output from the interactive sbt console:
> allConfs
[info] * provided
[info] * test
[info] * compile
[info] * runtime
[info] * optional
If all you want is to print them, you can use a settingKey[Unit] and println inside the setting definition.
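For example, a sketch of that variant (the key name printConfs is made up; note that a setting's body is evaluated when the build is loaded, so the output appears at load/reload time rather than on demand):
val printConfs = settingKey[Unit]("Prints all configurations for the current project")

printConfs := {
  configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value
    .map(_.name)
    .foreach(println)
}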
List of all the keys in a configuration
For this we need a task (there might be other ways, but I haven't explored them; in sbt I'm satisfied once something works...) and a parser for the user input.
Combined with the above setting, it all comes together in this snippet:
import sbt._
import sbt.Keys._
import complete.DefaultParsers._

val allConfs = settingKey[List[String]]("Returns all configurations for the current project")
val allKeys = inputKey[List[String]]("Prints all keys of a given configuration")

val root = (project in file("."))
  .settings(
    name := "scala-tests",
    allConfs := {
      configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
        .map(_.name)
    },
    allKeys := {
      val configHints = s"One of: ${
        configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList.mkString(" ")
      }"
      val configs = spaceDelimited(configHints).parsed.map(_.toLowerCase).toSet
      val extracted: Extracted = Project.extract(state.value)
      val l = extracted.session.original.toList
        .filter(set => set.key.scope.config.toOption.map(_.name.toLowerCase)
          .exists(configs.contains))
        .map(_.key.key.label)
      l
    }
  )
Now you can use it like:
$ sbt "allKeys compile"
If you are in interactive mode you can press tab after allKeys to see the prompt:
> allKeys
One of: provided test compile runtime optional
Since allKeys is a task, its output won't appear on the sbt console if you just "return it", but you can print it.
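Alternatively, the built-in show command prints a task's result, so the returned list can be displayed without any println:
> show allKeys compile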

SBT Subprojects do not recognize plugin commands

I'm having an issue with getting SBT Subprojects to recognize commands provided by plugins. I have the following plugin source:
object DemoPlugin extends AutoPlugin {
  override lazy val projectSettings = Seq(commands += demoCommand)

  lazy val demoCommand =
    Command.command("demo") { (state: State) =>
      println("Demo Plugin!")
      state
    }
}
Which is used by a project configured as follows:
lazy val root = project in file(".")

lazy val sub = (project in file("sub")).
  enablePlugins(DemoPlugin).
  settings(
    //...
  )
The plugin is, of course, listed in project/plugins.sbt. However, when I open up sbt in the project, I see the following:
> sub/commands
[info] List(sbt.SimpleCommand@413d2cd1)
> sub/demo
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: demo (similar: doc)
[error] sub/demo
Even stranger, using consoleProject, I can see that the command in the project is the one defined by DemoPlugin!
scala> (commands in sub).eval.map { c => c.getClass.getMethod("name").invoke(c) }
res0: Seq[Object] = List(demo)
I'm looking to be able to type sub/demo, and have it perform the demo command. Any help would be much appreciated!
Commands aren't per-project. They only work for the top-level project.
It's also recommended to try to use tasks, or input tasks if needed, in places where you might want to use a command.
If you really need a command, there's a way to have a sort of "holder" task, see the answer to Can you access a SBT SettingKey inside a Command?
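For reference, a sketch of the task-based variant (the autoImport object is the standard AutoPlugin idiom; the task body just mirrors the original command):
import sbt._

object DemoPlugin extends AutoPlugin {
  object autoImport {
    // a hypothetical task key replacing the "demo" command
    val demo = taskKey[Unit]("Prints a demo message")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    demo := println("Demo Plugin!")
  )
}

Since tasks are per-project, sub/demo then resolves as expected.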

How to combine InputKey and TaskKey into a new InputKey?

I have an sbt multi-project build which includes two subprojects. One is an ordinary Scala web server project and the other is just some web files. With my self-written sbt plugin I can run Gulp on the web project. This Gulp task runs asynchronously. So with
sbt "web/webAppStart" "server/run"
I can start the Gulp development web server and my Scala backend server in parallel. Now I want to create a new task that combines them both, so that afterwards
sbt dev
for example should do the same. Here is what I tried so far:
// Build.scala (only the relevant stuff)
object Build extends sbt.Build {
  lazy val runServer: InputKey[Unit] = run in server in Compile
  lazy val runWeb: TaskKey[Unit] = de.choffmeister.sbt.WebAppPlugin.webAppStart

  lazy val dev = InputKey[Unit]("dev", "Starts a development web server")

  // Scala backend project
  lazy val server = (project in file("project-server"))

  // Web frontend project
  lazy val web = (project in file("project-web"))

  // Root project
  lazy val root = (project in file("."))
    .settings(dev <<= (runServer) map { _ =>
      // do nothing
    })
    .aggregate(server, web)
}
This works so far. But now I have no idea how to make dev also depend on the runWeb task. If I just add the runWeb task like
.settings(dev <<= (runWeb, runServer) map { (_, _) =>
  // do nothing
})
then I get the error
[error] /Users/choffmeister/Development/shop/project/Build.scala:59:value map is not a member of (sbt.TaskKey[Unit], sbt.InputKey[Unit])
[error] .settings(dev <<= (runWeb, runServer) map { (_, _) =>
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
Can anyone help me with this please?
The optimal solution would pass the arguments given to dev on to the runServer task. But I could also live with making dev a TaskKey[Unit] and hard-coding runServer to run with no arguments.
tl;dr Use .value macro to execute dependent tasks or just alias the task sequence.
Using .value macro
Your case seems overly complicated to my eyes because of the pre-0.13 syntax (<<=) and the use of project/Build.scala (which often confuses rather than helps people new to sbt).
You should just execute the two tasks inside another one as follows:
dev := {
  runWeb.value
  runServer.value
}
The complete example:
lazy val server = project

lazy val runServer = taskKey[Unit]("runServer")

runServer := {
  println("runServer")
  // if the input-task reference doesn't compile in your sbt version,
  // (run in server in Compile).toTask("").value is the usual idiom
  (run in server in Compile).value
}

lazy val runWeb = taskKey[Unit]("runWeb")

runWeb := {
  println("runWeb")
}

lazy val dev = taskKey[Unit]("dev")

dev := {
  println("dev")
}

dev <<= dev dependsOn (runServer, runWeb)
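A note on ordering: .value and dependsOn only declare dependencies, so runServer and runWeb may still run in parallel. If strict sequencing matters, later sbt versions (0.13.8+) offer Def.sequential, as in this sketch:
dev := Def.sequential(runServer, runWeb).value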
Using alias command
sbt offers the alias command that...
[sbt-learning-space]> help alias
alias
        Prints a list of defined aliases.
alias name
        Prints the alias defined for `name`.
alias name=value
        Sets the alias `name` to `value`, replacing any existing alias with that name.
        Whenever `name` is entered, the corresponding `value` is run.
        If any argument is provided to `name`, it is appended as argument to `value`.
alias name=
        Removes the alias for `name`.
Just define what tasks/commands you want to execute as an alias, as follows:
addCommandAlias("devAlias", ";runServer;runWeb")
Use devAlias as if it were a built-in task:
[sbt-learning-space]> devAlias
runServer
[success] Total time: 0 s, completed Jan 25, 2015 6:30:15 PM
runWeb
[success] Total time: 0 s, completed Jan 25, 2015 6:30:15 PM

Why does inConfig(conf)(settings) not pick some settings?

I thought that inConfig(conf)(settings) would copy all settings into the given configuration. But this doesn't seem to do what I would expect.
Given a configuration:
lazy val Monkjack: Configuration = config("monkjack")
Then I do:
inConfig(Monkjack)(Defaults.compileSettings)
So I can do compile as I would expect:
sbt clean monkjack:compile
[info] Compiling 17 Scala sources to ...
[success] Total time: 9 s, completed 01-Sep-2014 09:40:41
So now I want to adjust the scalac options when using this new config (the actual options are irrelevant; this one is just useful because it has verbose output, so it's easy to see whether it's being used or not):
scalacOptions in Monkjack := Seq("-Yshow-syms")
When I run monkjack:compile, I don't see this option being triggered. It's as if the above line wasn't added. But if I also add the following lines, it works!
sources in Monkjack := (sources in Compile).value
sourceDirectory in Monkjack := (sourceDirectory in Compile).value
So why do I need the final two lines, and what is inConfig actually doing if it's not doing what I expect? As an additional oddity, when I do the above, although it works, I get two compile phases, one going to target/classes and one going to target/monkjack-classes.
Edit (inspect without the sources/sourceDirectory settings)
> inspect tree monkjack:compile
[info] monkjack:compile = Task[sbt.inc.Analysis]
[info] +-monkjack:compile::compileInputs = Task[sbt.Compiler$Inputs]
[info] | +-*:compilers = Task[sbt.Compiler$Compilers]
[info] | +-monkjack:sources = Task[scala.collection.Seq[java.io.File]]
[info] | +-*/*:maxErrors = 100
[info] | +-monkjack:incCompileSetup = Task[sbt.Compiler$IncSetup]
[info] | +-monkjack:compile::streams = Task[sbt.std.TaskStreams[sbt.Init$ScopedKey[_ <: Any]]]
[info] | | +-*/*:streamsManager = Task[sbt.std.Streams[sbt.Init$ScopedKey[_ <: Any]]]
[info] | |
[info] | +-*/*:sourcePositionMappers = Task[scala.collection.Seq[scala.Function1[xsbti.Position, scala.Option[xsbti.Position]]]]
[info] | +-monkjack:dependencyClasspath = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | +-monkjack:classDirectory = target/scala-2.11/monkjack-classes
[info] | +-monkjack:scalacOptions = Task[scala.collection.Seq[java.lang.String]]
[info] | +-*:javacOptions = Task[scala.collection.Seq[java.lang.String]]
[info] | +-*/*:compileOrder = Mixed
[info] |
[info] +-monkjack:compile::streams = Task[sbt.std.TaskStreams[sbt.Init$ScopedKey[_ <: Any]]]
[info] +-*/*:streamsManager = Task[sbt.std.Streams[sbt.Init$ScopedKey[_ <: Any]]]
[info]
tl;dr No sources for a new configuration means no compilation and hence no use of scalacOptions.
From When to define your own configuration:
If your plugin introduces either a new set of source code or its own library dependencies, only then you want your own configuration.
inConfig does the (re)mapping only so that all the keys are initialised for a given scope, in this case the monkjack configuration.
In other words, inConfig computes values for the settings in a new scope.
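Conceptually, a sketch of what that means (not the actual implementation): every key in the given sequence that isn't already scoped gets scoped to the configuration, so
inConfig(Monkjack)(Seq(scalacOptions := Seq("-Yshow-syms")))
is roughly equivalent to
scalacOptions in Monkjack := Seq("-Yshow-syms")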
The settings with the most influence here are sourceDirectory and sourceManaged, which are set in Defaults.sourceConfigPaths as follows:
lazy val sourceConfigPaths = Seq(
  sourceDirectory <<= configSrcSub(sourceDirectory),
  sourceManaged <<= configSrcSub(sourceManaged),
  ...
)
configSrcSub gives the answer (reformatted slightly to ease reading):
def configSrcSub(key: SettingKey[File]): Initialize[File] =
  (key in ThisScope.copy(config = Global), configuration) { (src, conf) =>
    src / nameForSrc(conf.name)
  }
This leads to the answer: if you moved your sources to src/monkjack/scala, it would work fine. That's described in Scoping by configuration axis:
A configuration defines a flavor of build, potentially with its own classpath, sources, generated packages, etc. (...)
By default, all the keys associated with compiling, packaging, and running are scoped to a configuration and therefore may work differently in each configuration.
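So a minimal self-contained sketch of the setup, relying on the default source layout (sources under src/monkjack/scala):
lazy val Monkjack = config("monkjack")

lazy val root = (project in file("."))
  .configs(Monkjack)
  .settings(inConfig(Monkjack)(Defaults.compileSettings): _*)
  .settings(scalacOptions in Monkjack := Seq("-Yshow-syms"))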

Including project in build depending on setting's value, e.g. scalaVersion?

I have a Scala project that is divided into several subprojects:
lazy val core: Seq[ProjectReference] = Seq(common, json_scalaz7, json_scalaz)
I'd like to make the core lazy val conditional on the Scala version I'm currently using, so I tried this:
lazy val core2: Seq[ProjectReference] = scalaVersion {
  case "2.11.0" => Seq(common, json_scalaz7)
  case _ => Seq(common, json_scalaz7, json_scalaz)
}
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0 (when the value of the scalaVersion setting is "2.11.0").
This however gives me the following compilation error:
[error] /home/diego/work/lift/framework/project/Build.scala:39: type mismatch;
[error] found : sbt.Project.Initialize[Seq[sbt.Project]]
[error] required: Seq[sbt.ProjectReference]
[error] lazy val core2: Seq[ProjectReference] = scalaVersion {
[error] ^
[error] one error found
Any idea how to solve this?
Update
I'm using sbt version 0.12.4
This project is the Lift project, which compiles against "2.10.0", "2.9.2", "2.9.1-1" and "2.9.1", and now we are working on getting it to compile with 2.11.0. So creating a compile-all task would not be practical, as it would take a really long time.
Update 2
I'm hoping there is something like this:
lazy val scala_xml = "org.scala-lang.modules" %% "scala-xml" % "1.0.1"
lazy val scala_parser = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.1"
...
lazy val common =
  coreProject("common")
    .settings(
      description := "Common Libraries and Utilities",
      libraryDependencies ++= Seq(slf4j_api, logback, slf4j_log4j12),
      libraryDependencies <++= scalaVersion {
        case "2.11.0" => Seq(scala_xml, scala_parser)
        case _ => Seq()
      }
    )
but for the projects list
Note how, depending on the Scala version, I add the scala_xml and scala_parser libraries.
You can see the complete build file here
Cross building a project
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0
The built-in support in sbt for this is called cross building, which is described in Cross-Building a Project. Here's an excerpt from that section, with a small correction:
Define the versions of Scala to build against in the crossScalaVersions setting. For example, in a .sbt build definition:
crossScalaVersions := Seq("2.10.4", "2.11.0")
To build against all versions listed in crossScalaVersions, prefix the action to run with +. For example:
> +compile
Multiple-project builds
sbt also has built-in support to aggregate tasks across multiple projects, which is described in Aggregation. If what you eventually need is just normal built-in tasks like compile and test, you could set up a dummy aggregate without json_scalaz.
lazy val withoutJsonScalaz = (project in file("without-json-scalaz"))
  .aggregate(liftProjects filterNot { _ == json_scalaz }: _*)
From the shell, you should be able to use this as:
> ++2.11.0
> project withoutJsonScalaz
> test
Getting values from multiple scopes
Another feature you might be interested in is ScopeFilter. This has the ability to traverse multiple projects beyond usual aggregation and cross building. You would need to create a setting whose type is ScopeFilter and set it based on scalaBinaryVersion.value. With scope filters, you can do:
val coreProjects = settingKey[ScopeFilter]("my core projects")
val compileAll = taskKey[Seq[sbt.inc.Analysis]]("compile all")

coreProjects := {
  scalaBinaryVersion.value match {
    case "2.10" => ScopeFilter(inProjects(common, json_scalaz7, json_scalaz))
    // add cases for the other Scala versions here
  }
}

compileAll := compileAllTask.value

lazy val compileAllTask = Def.taskDyn {
  val f = coreProjects.value
  (compile in Compile) all f
}
In this case compileAll would have the same effect as +compile, but you could aggregate the result and do something interesting like sbt-unidoc.
