Why does inConfig(conf)(settings) not pick some settings? - sbt

I thought that inConfig(conf)(settings) would copy all settings into the given configuration. But this doesn't seem to do what I would expect.
Given a configuration:
lazy val Monkjack: Configuration = config("monkjack")
Then I do:
inConfig(Monkjack)(Defaults.compileSettings)
So I can do compile as I would expect:
sbt clean monkjack:compile
[info] Compiling 17 Scala sources to ...
[success] Total time: 9 s, completed 01-Sep-2014 09:40:41
So now I want to adjust the scalac options when using this new config (the actual options are irrelevant; this one is just useful because it has verbose output, so it's easy to see whether it's being used or not):
scalacOptions in Monkjack := Seq("-Yshow-syms")
When I run monkjack:compile, I don't see this option being triggered. It's as if the above line wasn't added. But if I also add the following lines, it works!
sources in Monkjack := (sources in Compile).value
sourceDirectory in Monkjack := (sourceDirectory in Compile).value
So why do I need the final two lines, and what is inConfig actually doing if it's not doing what I expect? As an additional oddity, when I do the above, although it works, I get two compile phases: one going to target/classes and one going to target/monkjack-classes.
Edit (inspect without the sources/sourceDirectory settings)
> inspect tree monkjack:compile
[info] monkjack:compile = Task[sbt.inc.Analysis]
[info] +-monkjack:compile::compileInputs = Task[sbt.Compiler$Inputs]
[info] | +-*:compilers = Task[sbt.Compiler$Compilers]
[info] | +-monkjack:sources = Task[scala.collection.Seq[java.io.File]]
[info] | +-*/*:maxErrors = 100
[info] | +-monkjack:incCompileSetup = Task[sbt.Compiler$IncSetup]
[info] | +-monkjack:compile::streams = Task[sbt.std.TaskStreams[sbt.Init$ScopedKey[_ <: Any]]]
[info] | | +-*/*:streamsManager = Task[sbt.std.Streams[sbt.Init$ScopedKey[_ <: Any]]]
[info] | |
[info] | +-*/*:sourcePositionMappers = Task[scala.collection.Seq[scala.Function1[xsbti.Position, scala.Option[xsbti.Position]]]]
[info] | +-monkjack:dependencyClasspath = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | +-monkjack:classDirectory = target/scala-2.11/monkjack-classes
[info] | +-monkjack:scalacOptions = Task[scala.collection.Seq[java.lang.String]]
[info] | +-*:javacOptions = Task[scala.collection.Seq[java.lang.String]]
[info] | +-*/*:compileOrder = Mixed
[info] |
[info] +-monkjack:compile::streams = Task[sbt.std.TaskStreams[sbt.Init$ScopedKey[_ <: Any]]]
[info] +-*/*:streamsManager = Task[sbt.std.Streams[sbt.Init$ScopedKey[_ <: Any]]]
[info]

tl;dr No sources for a new configuration means no compilation and hence no use of scalacOptions.
From When to define your own configuration:
If your plugin introduces either a new set of source code or its own library dependencies, only then do you want your own configuration.
inConfig does the (re)mapping only so that all the keys are initialized for a given scope, in this case the monkjack configuration.
In other words, inConfig computes values for the settings in a new scope.
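To illustrate, inConfig(conf)(ss) re-scopes each setting in ss to the given configuration (unless a setting is already explicitly scoped), so the following two forms are roughly equivalent. This is a sketch of the behavior, not the actual implementation:

```scala
// Both of these scope scalacOptions to the monkjack configuration axis:
inConfig(Monkjack)(Seq(scalacOptions := Seq("-Yshow-syms")))
// ...is roughly the same as:
scalacOptions in Monkjack := Seq("-Yshow-syms")
```

Crucially, inConfig only initializes keys in the new scope; it does not conjure up sources for the configuration.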
The settings that matter most here are sourceDirectory and sourceManaged, which are set in Defaults.sourceConfigPaths as follows:
lazy val sourceConfigPaths = Seq(
  sourceDirectory <<= configSrcSub(sourceDirectory),
  sourceManaged <<= configSrcSub(sourceManaged),
  ...
)
configSrcSub gives the answer (reformatted slightly to ease reading):
def configSrcSub(key: SettingKey[File]): Initialize[File] =
  (key in ThisScope.copy(config = Global), configuration) { (src, conf) =>
    src / nameForSrc(conf.name)
  }
That leads to the answer: if you moved your sources to src/monkjack/scala, it would work fine. That's described in Scoping by configuration axis:
A configuration defines a flavor of build, potentially with its own classpath, sources, generated packages, etc. (...)
By default, all the keys associated with compiling, packaging, and running are scoped to a configuration and therefore may work differently in each configuration.
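Putting the answer together, a minimal build.sbt sketch (assuming the sources are moved to src/monkjack/scala; the .configs call, which registers the configuration with the project, is an addition here):

```scala
lazy val Monkjack: Configuration = config("monkjack")

lazy val root = (project in file("."))
  .configs(Monkjack)                                        // register the configuration
  .settings(inConfig(Monkjack)(Defaults.compileSettings): _*) // initialize compile keys in monkjack scope
  .settings(scalacOptions in Monkjack := Seq("-Yshow-syms"))
```

With sources under src/monkjack/scala, monkjack:compile picks them up via configSrcSub and the extra scalacOptions take effect.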

Related

Key in Configuration : how to list Configurations and Keys?

The sbt in Action book introduces a concept of Key in Configuration
It then lists the default configurations:
Compile
Test
Runtime
IntegrationTest
Q1) Is it possible to print out a list of all Configurations from a sbt session? If not, can I find information on Configurations in the sbt documentation?
Q2) For a particular Configuration, e.g. 'Compile', is it possible to print out a list of Keys for the Configuration from a sbt session? If not, can I find information on a Configuration's Keys in the sbt documentation?
List of all configurations
For this you can use a setting like so:
val allConfs = settingKey[List[String]]("Returns all configurations for the current project")
val root = (project in file("."))
  .settings(
    name := "scala-tests",
    allConfs := {
      configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
        .map(_.name)
    }
  )
This shows the names of all configurations. You can access more details about each configuration inside the map.
Output from the interactive sbt console:
> allConfs
[info] * provided
[info] * test
[info] * compile
[info] * runtime
[info] * optional
If all you want is to print them you can have a settingKey[Unit] and use println inside the setting definition.
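For instance, a print-only variant might look like this (printAllConfs is a hypothetical key name):

```scala
// Prints the configuration names at project load instead of returning them.
val printAllConfs = settingKey[Unit]("Prints all configurations for the current project")

printAllConfs := {
  configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value
    .map(_.name)
    .foreach(println)
}
```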
List of all the keys in a configuration
For this we need a task (there might be other ways, but I haven't explored them; in sbt I'm satisfied if something works...) and a parser to parse the user input.
Combined with the setting above, the full snippet looks like this:
import sbt._
import sbt.Keys._
import complete.DefaultParsers._
val allConfs = settingKey[List[String]]("Returns all configurations for the current project")
val allKeys = inputKey[List[String]]("Prints all keys of a given configuration")
val root = (project in file("."))
  .settings(
    name := "scala-tests",
    allConfs := {
      configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
        .map(_.name)
    },
    allKeys := {
      val configHints = s"One of: ${
        configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList.mkString(" ")
      }"
      val configs = spaceDelimited(configHints).parsed.map(_.toLowerCase).toSet
      val extracted: Extracted = Project.extract(state.value)
      val l = extracted.session.original.toList
        .filter(set => set.key.scope.config.toOption.map(_.name.toLowerCase)
          .exists(configs.contains))
        .map(_.key.key.label)
      l
    }
  )
Now you can use it like:
$ sbt "allKeys compile"
If you are in interactive mode you can press tab after allKeys to see the prompt:
> allKeys
One of: provided test compile runtime optional
Since allKeys is a task, its output won't appear on the sbt console if you just "return it", but you can print it.

generate resources from an AutoPlugin in sbt

I have created the following plugin, like so many plugins before it...
/**
 * This plugin automatically generates a version number based on the configured
 * minor version and today's date and time.
 */
object DateVersionPlugin extends AutoPlugin {
  //override def trigger = allRequirements

  def dateFormat(fmt: String) =
    new java.text.SimpleDateFormat(fmt).format(new java.util.Date())

  def versionNumber(majorVersion: String,
                    versionTrimFront: Int,
                    versionDateFormat: String) =
    "%s.%s".format(
      majorVersion, dateFormat(versionDateFormat).substring(versionTrimFront)
    )

  /**
   * Defines all settings/tasks that get automatically imported
   * when the plugin is enabled.
   */
  object autoImport {

    /**
     * The number of characters to trim off the front of the date string.
     *
     * This is used to achieve a date string which doesn't include the
     * present millennium. The century, at a stretch, can be imagined as
     * conceivable - but few civilizations have lasted multiple millennia.
     */
    lazy val versionTrimFront = settingKey[Int]("Number of characters to remove from front of date")

    /**
     * The format to use for generating the date part of this version number.
     */
    lazy val versionDateFormat = settingKey[String]("The date format to use for versions")

    /**
     * The major version to place at the front of the version number.
     */
    lazy val versionMajor = settingKey[String]("The major version number, default 0")

    /**
     * The filename of the generated resource.
     */
    lazy val versionFilename = settingKey[String]("The filename of the file to generate")

    /**
     * The name of the property to place in the version number.
     */
    lazy val versionPropertyName = settingKey[String]("The name of the property to store as version")

    /**
     * Generate a version.conf configuration file.
     *
     * This task generates a configuration file of the name specified in the
     * settings key.
     */
    lazy val generateVersionConf = taskKey[Seq[File]]("Generates a version.conf file.")
  }

  import autoImport._

  /**
   * Provide default settings.
   */
  override def projectSettings: Seq[Setting[_]] = Seq(
    versionFilename := "version.conf",
    versionPropertyName := "version",
    versionDateFormat := "YY.D.HHmmss",
    versionTrimFront := 0,
    versionMajor := "0",
    (version in Global) := versionNumber(versionMajor.value,
      versionTrimFront.value, versionDateFormat.value),
    generateVersionConf <<=
      (resourceManaged in Compile, version, versionFilename, versionPropertyName, streams) map {
        (dir, v, filename, propertyName, s) =>
          val file = dir / filename
          val contents = propertyName + " = \"" + v.split("-").head + "\""
          s.log.info("Writing " + contents + " to " + file)
          IO.write(file, contents)
          Seq(file)
      },
    resourceGenerators in Compile += generateVersionConf.taskValue
  )
}
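As an aside, the version-number logic above reduces to the following standalone sketch (the explicit Date parameter is an addition here for illustration; the plugin itself always uses the current time):

```scala
import java.text.SimpleDateFormat
import java.util.Date

// Standalone sketch of versionNumber: major version + formatted date,
// with the first `trimFront` characters of the date string removed.
def versionNumber(major: String, trimFront: Int, fmt: String, date: Date): String = {
  val formatted = new SimpleDateFormat(fmt).format(date)
  "%s.%s".format(major, formatted.substring(trimFront))
}
```

With the default "YY.D.HHmmss" pattern, this yields versions such as 0.16.17.094041 (two-digit year, day of year, then time).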
The generate-version-conf task behaves as desired, generating the file I'm looking for. The version setting is updated as expected by projects that use this plugin. Yet the following are not happening, and I'm not clear why:
The config file is not generated by compile.
The config file is not packaged in the jar by package.
The config file is not in the classpath when the run task is used.
Note I have also tried a dozen or so variations on this, and I have further tried:
resourceGenerators in Compile <+= generateVersionConf
Which, as I understand it, should result in more or less the same behavior.
Inspecting the runtime attributes of this, I see some of the settings are applied successfully:
> inspect version
[info] Setting: java.lang.String = 0.15.338.160117
[info] Description:
[info] The version/revision of the current module.
[info] Provided by:
[info] */*:version
[info] Defined at:
[info] (com.quantcast.sbt.version.DateVersionPlugin) DateVersionPlugin.scala:101
[info] Reverse dependencies:
[info] *:isSnapshot
[info] *:generateVersionConf
[info] *:projectId
[info] Delegates:
[info] *:version
[info] {.}/*:version
[info] */*:version
[info] Related:
[info] */*:version
However, this is not true for compile:resourceGenerators, which shows that it still maintains the defaults.
> inspect compile:resourceGenerators
[info] Setting: scala.collection.Seq[sbt.Task[scala.collection.Seq[java.io.File]]] = List(Task(_))
[info] Description:
[info] List of tasks that generate resources.
[info] Provided by:
[info] {file:/home/scott/code/quantcast/play/sbt-date-version/sbt-test/}root/compile:resourceGenerators
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:207
[info] (sbt.Defaults) Defaults.scala:208
[info] Dependencies:
[info] compile:discoveredSbtPlugins
[info] compile:resourceManaged
[info] Reverse dependencies:
[info] compile:managedResources
[info] Delegates:
[info] compile:resourceGenerators
[info] *:resourceGenerators
[info] {.}/compile:resourceGenerators
[info] {.}/*:resourceGenerators
[info] */compile:resourceGenerators
[info] */*:resourceGenerators
[info] Related:
[info] test:resourceGenerators
My question is (now that I've continued to research this more): what could be keeping my changes to resourceGenerators in Compile from being applied?
This plugin needs to require the JvmPlugin, because the JvmPlugin defines the setting dependencies. Without it, the compile:resourceGenerators setting is overwritten by the defaults, which redefine the set of resource generators to Nil and build from there.
So the solution is to include the following line in the AutoPlugin definition:
override def requires = plugins.JvmPlugin
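With that, the plugin skeleton becomes (sketch; generateVersionConf and the remaining settings are as defined in the listing above):

```scala
object DateVersionPlugin extends AutoPlugin {
  // Require the JvmPlugin so its defaults (including the initial
  // resourceGenerators) are loaded before this plugin's projectSettings,
  // instead of after them.
  override def requires = plugins.JvmPlugin

  override def projectSettings: Seq[Setting[_]] = Seq(
    resourceGenerators in Compile += generateVersionConf.taskValue
    // ...remaining settings as in the listing above
  )
}
```

Ordering is the key point: an AutoPlugin's projectSettings are applied after those of the plugins it requires, so declaring the dependency guarantees your += appends to the defaults rather than being clobbered by them.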

Why does publishing plugin project fail with RuntimeException: Repository for publishing is not specified?

I am trying to publish an SBT plugin to a repository. I'm not sure if this has any relevance, but our plugin loads the sbt-twirl plugin. Googling around, it seems like publishConfiguration might be overridden:
new PublishConfiguration(None, "dotM2", arts, Seq(), level)
When I run the publish task, artifacts are deployed to the repo, but the sbt task then fails:
sbt (my-sbt-plugin)> publish
[info] Loading global plugins from ...
...
[info] Done packaging.
[info] published sbt-my-sbt-plugin to http://my.repo.com/.../sbt-my-sbt-plugin-0.1-SNAPSHOT.jar
java.lang.RuntimeException: Repository for publishing is not specified.
.... stack trace here ....
[error] (my-sbt-plugin/*:publishConfiguration) Repository for publishing is not specified.
What is causing the error, and what could I do to stop the publishing from failing?
** Update ** Here is inspect publish
sbt (my-sbt-plugin)> inspect publish
[info] Task: Unit
[info] Description:
[info] Publishes artifacts to a repository.
[info] Provided by:
[info] {file:/path/to/my-sbt-plugin/}my-sbt-plugin/*:publish
[info] Defined at:
[info] (sbt.Classpaths) Defaults.scala:988
[info] Dependencies:
[info] my-sbt-plugin/*:ivyModule
[info] my-sbt-plugin/*:publishConfiguration
[info] my-sbt-plugin/*:publish::streams
[info] Delegates:
[info] my-sbt-plugin/*:publish
[info] {.}/*:publish
[info] */*:publish
[info] Related:
[info] plugin/*:publish
Here's how I've configured publishing (with some of the plugin settings, excluding libraryDependencies and one or two other settings):
lazy val plugin = project
  .settings(publishSbtPlugin: _*)
  .settings(
    name := "my-sbt-plugin",
    sbtPlugin := true,
    addSbtPlugin("com.typesafe.sbt" % "sbt-twirl" % "1.0.2")
  )

def publishSbtPlugin = Seq(
  publishMavenStyle := true,
  publishTo := {
    val myrepo = "http://myrepo.tld/"
    if (isSnapshot.value) Some("The Realm" at myrepo + "snapshots")
    else Some("The Realm" at myrepo + "releases")
  },
  credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")
)
tl;dr Don't use lazy val plugin = project to define a project (for reasons as yet unknown).
After a few comments it turned out that the issue was the project's name, plugin, as defined using lazy val plugin = project. It seems that the name is somehow reserved. Change the project's name to anything other than plugin and start over.
Specifying a project name other than "plugin" resolved the issue. I simplified the build definition a bit by removing a redundant build.sbt in one of the projects, and am just using a full build definition in the project directory. The root project that hosts the multi-project build is also configured for no publishing:
lazy val root =
  Project("sbt-my-plugin-root", file("."))
    .settings(noPublishing: _*)
    .aggregate(sbtMyPluginModule)

lazy val sbtMyPluginModule =
  Project("sbt-my-plugin-module", file("sbt-my-plugin-module"))
    .settings(publishSbtPlugin: _*)
    .settings(
      name := "sbt-my-plugin-module",
      organization := "com.my.org",
      sbtPlugin := true
    )

lazy val noPublishing = Seq(
  publish := (),
  publishLocal := ()
)

lazy val publishSbtPlugin = Seq(
  publishMavenStyle := true,
  publishArtifact in Test := false,
  publishTo := {
    val myrepo = "http://myrepo.tld/"
    if (isSnapshot.value) Some("The Realm" at myrepo + "snapshots")
    else Some("The Realm" at myrepo + "releases")
  },
  credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")
)
If you're trying this locally, use publishLocal (not publish) as follows:
sbt clean compile publish-local

SBT execute code based on value

I want to do something like the following in SBT:
CrossVersion.partialVersion(scalaVersion.value) match {
  case Some((2, 11)) =>
  case Some((2, 10)) =>
}
But I don't want to assign that to anything, I simply want to run some code based on the value of the current cross version.
I could create a Task and then execute the task, but can I do this without needing the task?
I know you've said you didn't want to create a task, but I would say that's the cleanest way of doing it, so I'll post it as one of the solutions anyway.
Depends on Compile
val printScalaVersion = taskKey[Unit]("Prints Scala version")

printScalaVersion := {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => println("2.11")
    case Some((2, 10)) => println("2.10")
    case _ => println("Other version")
  }
}
compile in Compile := ((compile in Compile) dependsOn printScalaVersion).value
Override the Compile Task
If you really wouldn't like to create new task, you could redefine the compile task and add your code there (I think it's not as clean as the solution above).
compile in Compile := {
  val analysis = (compile in Compile).value
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => println("2.11")
    case Some((2, 10)) => println("2.10")
    case _ => println("Other version")
  }
  analysis
}
Just a small "enhancement" to what @lpiepiora offered.
There could be a setting that holds the value of CrossVersion.partialVersion(scalaVersion.value), as follows:
lazy val sv = settingKey[Option[(Int, Int)]]("")
sv := CrossVersion.partialVersion(scalaVersion.value)
With the setting:
> sv
[info] Some((2,10))
> ++ "2.9.3"
[info] Setting version to 2.9.3
[info] Set current project to projectA (in build file:/C:/dev/sandbox/scalaVersionSetting/)
> sv
[info] Some((2,9))
> ++ "2.10.4"
[info] Setting version to 2.10.4
[info] Set current project to projectA (in build file:/C:/dev/sandbox/scalaVersionSetting/)
> sv
[info] Some((2,10))
> ++ "2.11"
[info] Setting version to 2.11
[info] Set current project to projectA (in build file:/C:/dev/sandbox/scalaVersionSetting/)
> sv
[info] Some((2,11))
...and so on.
That would give a setting to match on.
lazy val printScalaVersion = taskKey[Unit]("Prints Scala version")

printScalaVersion := {
  sv.value foreach println
}

how to set sbt plugins invoke scope?

val webAssemblyTask = TaskKey[Unit](
  "web-assembly",
  "assembly web/war like run-time package"
)

var out: TaskStreams = _

val baseSettings: Seq[Setting[_]] = Seq(
  webAssemblyOutputDir <<= (sourceManaged) { _ / "build" },
  webAssemblyTask <<= (
    streams,
    target,
    sourceDirectory,
    outputDirProjectName
  ) map { (out_log, targetDir, sourceDir, outputDirProjectName) =>
    out_log.log.info("web-assembly start")
    out_log.log.info("sourceDir:" + sourceDir.getAbsolutePath)
    out_log.log.info("targetDir:" + targetDir.getAbsolutePath)
    val sourceAssetsDir = (sourceDir / "webapp" / "assets").toPath
    val classesAssetsDir = (targetDir / "scala-2.10" / "classes" / "assets").toPath
    Files.createSymbolicLink(classesAssetsDir, sourceAssetsDir)
  }
)

val webAssemblySettings = inConfig(Runtime)(baseSettings)
I wrote an sbt plugin.
When I type webAssembly in the sbt console, the plugin runs OK.
But I want it to run after compile and before run; how can I do that?
how to set sbt plugins invoke scope?
I think you're confusing the configuration (also known as Maven scope) name with tasks like compile and run. They happen to have related configurations, but that doesn't mean the compile task is identical to the Compile configuration.
I could interpret this question as: how can a plugin setting invoke tasks scoped in some other configuration? For that you use the in method, as in key in (Config) or key in (Config, task). Another way to interpret it may be: how can plugin tasks be scoped in a configuration? You use inConfig(Config)(...), which you're already doing. But you'd typically want plugins to be configuration neutral. See my blog post for more details on this.
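For example, the scoping forms mentioned above look like this (the key choices here are illustrative):

```scala
// A key scoped to a configuration:
scalacOptions in Test
// A key scoped to a configuration and a task:
scalacOptions in (Compile, console)
// Plugin settings scoped wholesale into a configuration:
inConfig(Runtime)(baseSettings)
```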
I want to run after compile, before run, how can I do it?
This makes much more sense. In sbt you mostly focus on the preconditions of tasks. One of the useful commands is inspect tree key. You can run it for the run task and get the entire tree of tasks/settings that it depends on. Here's where you see it calling compile:compile (another notation for compile in Compile):
helloworld> inspect tree run
[info] compile:run = InputTask[Unit]
[info] +-runtime:fullClasspath = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | +-runtime:exportedProducts = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | | +-compile:packageBin::artifact = Artifact(sbt-sequential,jar,jar,None,List(compile),None,Map())
[info] | | +-runtime:configuration = runtime
[info] | | +-runtime:products = Task[scala.collection.Seq[java.io.File]]
[info] | | | +-compile:classDirectory = target/scala-2.10/sbt-0.13/classes
[info] | | | +-compile:copyResources = Task[scala.collection.Seq[scala.Tuple2[java.io.File, java.io.File]]]
[info] | | | +-compile:compile = Task[sbt.inc.Analysis]
This is useful in discovering compile:products, which is "Build products that get packaged" according to the help products command:
helloworld> help products
Build products that get packaged.
Since runtime:products happens before compile:run, if it depends on your task, your task will be called before compile:run (inspect tree also shows that run resolves to that).
To simplify your plugin task, I'm just going to call it sayHello:
val sayHello = taskKey[Unit]("something")

sayHello := {
  println("hello")
}
You can rewire products in Runtime as follows:
products in Runtime := {
  val old = (products in Runtime).value
  sayHello.value
  old
}
This will satisfy "before run" part. You want to make sure that this runs after compile. Again, just add task dependency to it:
sayHello := {
  (compile in Compile).value
  println("hello")
}
When the user runs the run task, sbt will correctly calculate the dependencies and run the sayHello task somewhere between compile and run.
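Putting both pieces together, the complete wiring might look like this:

```scala
val sayHello = taskKey[Unit]("something")

// Runs after compile, via the declared task dependency...
sayHello := {
  (compile in Compile).value
  println("hello")
}

// ...and before run, because runtime:products now depends on it.
products in Runtime := {
  val old = (products in Runtime).value
  sayHello.value
  old
}
```

Note that sbt infers the ordering from these dependencies alone; there is no explicit "run X before Y" declaration.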
