I'd like to create a compile configuration which is the same as the default one but adds a compiler plugin. In my particular case, I want to have a "dev" configuration but with the linter plugin (https://github.com/HairyFotr/linter) because it slows down compile times and there's no need to run it in production or continuous integration.
Now this is what I tried:
lazy val Dev = config("dev") extend Compile
lazy val root = (project in file(".")).configs(Dev).settings(
inConfig(Dev)(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12")): _*)
and it should work, since when I inspect dev:libraryDependencies, it's what I expect it to be: it has org.psywerx.hairyfotr:linter:0.1.12:plugin->default(compile). Normally, if I add the library with a "plugin" scope, it does work for the default settings:
libraryDependencies += ("org.psywerx.hairyfotr" %% "linter" % "0.1.12" % "plugin"
It just does not work if I add this under a different configuration, so there must be something else going on here.
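Presumably the intended usage (had this worked) would be:
$ sbt compile      # production / CI compile, no linter
$ sbt dev:compile  # development compile with the linter plugin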
This solves the problem, but not exactly in the way that was asked. Here's the full build.sbt:
libraryDependencies ++= Seq(
  "org.psywerx.hairyfotr" %% "linter" % "0.1.14" % "test")

val linter = Command.command("linter")(state => {
  val linterJar = for {
    (newState, result) <- Project.runTask(fullClasspath in Test, state)
    cp <- result.toEither.right.toOption
    linter <- cp.find(
      _.get(moduleID.key).exists(mId =>
        mId.organization == "org.psywerx.hairyfotr" &&
          mId.name == "linter_2.11"))
  } yield linter.data.absolutePath
  val res = Project.runTask(scalacOptions, state)
  res match {
    case Some((newState, result)) =>
      result.toEither.right.foreach { defaultScalacOptions =>
        Project.runTask(compile in Test,
          Project.extract(state).append(
            scalacOptions := defaultScalacOptions ++ linterJar.map(p => Seq(s"-Xplugin:$p")).getOrElse(Seq.empty),
            newState))
      }
    case None => sys.error("Couldn't get defaultScalacOptions")
  }
  state
})
lazy val root = (project in file(".")).configs(Test).settings(commands ++= Seq(linter))
The fact that the command returns the unmodified state means it doesn't change the project settings. So if you run sbt linter, you should get your project compiled with the additional scalacOptions, but if you run compile in the same sbt session, it will not use those additional settings.
The tricky thing here is that scalacOptions is actually a TaskKey, not a SettingKey. I don't know why that is, but to get its value you have to run that task. One reason might be that in sbt a setting cannot depend on a task, but a task can depend on another task. In other words, scalacOptions can depend on another task's value, and maybe internally it does; I haven't checked. If the current answer works for you, I can try to think of a more elegant way of achieving the same result.
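To illustrate the setting-versus-task distinction (myDemo is a made-up key, not part of the build above):
lazy val myDemo = taskKey[Unit]("Demonstrates depending on a task's value")

// A setting cannot depend on a task; this would be rejected at load time:
// name := scalacOptions.value.mkString(" ")

// A task, however, can depend on another task's value:
myDemo := {
  val opts = scalacOptions.value // runs the scalacOptions task
  println(s"current scalac options: ${opts.mkString(" ")}")
}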
EDIT: modified the code to specify the scalacOptions for the linter plugin proper. Please note the plugin has to be a managed dependency, not just a downloaded jar, for this solution to work. If you want to keep it unmanaged, there is a way, but I won't go into it for now. Additionally, I've taken the liberty of making it also work for the test code, for illustration purposes.
Looking at Defaults.scala in the sbt source, it seems like the compile task always takes its options from the compile scope. So if I'm correct, you can have only one set of compilation options!
This seems to be confirmed by the fact that scalacOptions behaves the same way, and this is also why I don't see a non-hacky answer for these similar questions:
Different scalac options for different scopes or tasks?
Different compile options for tests and release in SBT?
I'd be happy to be proven wrong.
EDIT: FWIW, one might not be able to define another scalac options profile in the same project, but you could do so in a "different" project:
lazy val dev = (project in file(".")).
settings(target := baseDirectory.value / "target" / "dev").
settings(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12"): _*)
This has the disadvantage that it has a separate output directory, so it will take more space and, more importantly, will not share incremental compilation between the two projects. However, after spending some time thinking about it, this may be by design. After all, even though linters don't, some scalac compilation options could conceivably change the output. That would make it meaningless to try to keep incremental-compilation metadata from one set of scalac options to another, so different scalac options would indeed require different target directories.
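With that setup, the two modes would be invoked like this (a sketch, assuming the main project is the default one):
$ sbt compile      # default project, no linter
$ sbt dev/compile  # same sources compiled with the linter plugin, into target/dev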
Related
I have a semi-complicated SBT process because I need to conditionally include a different config file based on what kind of build is needed. I solved this problem through sub-projects:
lazy val app = project
  .in(file("."))
  .enablePlugins(JavaAppPackaging)
  .settings(
    commonSettings, // Seq() of settings to be shared between projects
    sourceGenerators in Compile += (avroScalaGenerateSpecific in Compile).taskValue,
    (avroSpecificSourceDirectory in Compile) := new java.io.File("src/main/resources/com/coolCompany/folderName/avro")
  )
lazy val localPackage = project
  .in(file("build/local"))
  .enablePlugins(JavaAppPackaging)
  .settings(
    organization := "com.coolCompany",
    version := "0.1.0-SNAPSHOT",
    scalaVersion := "2.11.8",
    name := "my-neat-project",
    scalacOptions := compilerOptions, // Seq() of compiler flags
    sourceDirectory in Compile := (sourceDirectory in (app, Compile)).value,
    mappings in Universal += {
      ((sourceDirectory in Compile).value / "../../conf/local/config.properties") -> "lib/config.properties"
    }
  )
  .dependsOn(app)
val buildNumber = inputKey[String]("The version number of the artifact.")
lazy val deployedPackage = project
  .in(file("build/deployed"))
  .enablePlugins(JavaAppPackaging)
  .settings(
    organization := "com.coolCompany",
    buildNumber := {
      val args: Seq[String] = spaceDelimited("<arg>").parsed
      println(s"Input version number is ${args.head}")
      args.head
    },
    version := buildNumber.inputTaskValue + "-SNAPSHOT", // "0.1.0-SNAPSHOT",
    scalaVersion := "2.11.8",
    name := "my-cool-project",
    scalacOptions := compilerOptions,
    sourceDirectory in Compile := (sourceDirectory in (app, Compile)).value,
    mappings in Universal += {
      ((sourceDirectory in Compile).value / "../../conf/deployed/config.properties") -> "lib/config.properties"
    }
  )
  .dependsOn(app)
Now I need to allow the version number to be passed in by a build tool when building. You can see what I've attempted to do already: I created an inputKey task called buildNumber, then tried to access that in the version := definition. I can run the buildNumber task itself just fine:
$ sbt 'deployedPackage/buildNumber 0.1.2'
Input version number is 0.1.2
So I can at least verify that my input task works as expected. The issue is that I can't figure out how I actually get to that input value when running the actual packageBin step that I want.
I've tried the following:
$ sbt 'deployedPackage/universal:packageBin 0.1.2'
[error] Expected key
[error] Expected '::'
[error] Expected end of input.
[error] deployedPackage/universal:packageBin 0.1.2
So it clearly doesn't understand what to do with the version number. I've tried a bunch of different input variations, such as [...]packageBin buildNumber::0.1.2, [...]packageBin -- buildNumber 0.1.2, or [...]packageBin -- 0.1.2, and all of them give that error or something similar indicating it doesn't understand what I'm trying to pass in.
Now, ultimately, these errors make sense. buildNumber, the task, is what knows what to do with the command line values, but packageBin does not. How do I set up this task or these set of tasks to allow the version number to be passed in?
I have seen this question but the answers link to an sbt plugin that seems to do about 100 more things than I want it to do, including quite a few that I would need to find a way to explicitly disable. I only want the version number to be able to be passed in & used in the artifact.
Edit/Update: I resolved this issue by switching back to Maven.
Another answer notes: "Settings are set when sbt loads and then cannot be changed until you reload the session. So whatever you set version to, it will be fixed; it cannot be dynamic and depend on user input or a task."
This is true. However, you can still compute a SettingKey's value at load time. We use environment variables to set our build version.
version := "1." + sys.env.getOrElse("BUILD_NUMBER", "0-SNAPSHOT")
Explanation
"1." is the major version prefix. Increment this manually as you like.
BUILD_NUMBER is an environment variable that is typically set by a CI server, e.g. Jenkins. If it is not set, 0-SNAPSHOT is used as the suffix, yielding 1.0-SNAPSHOT (see the example below).
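Assuming the setting above:
$ BUILD_NUMBER=42 sbt 'show version'   # prints 1.42
$ sbt 'show version'                   # prints 1.0-SNAPSHOT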
We use this in our company for our continuous deployment pipeline.
Hope that helps,
Muki
I think you're a bit confused about the way sbt settings work. Settings are set when sbt loads and then cannot be changed until you reload the session. So whatever you set version to, it will be fixed; it cannot be dynamic and depend on user input or a task (which, in contrast, is dynamic and is evaluated every time you call it).
You have an input task buildNumber which depends on user input and is evaluated every time you call it:
> show buildNumber 123
Input version number is 123
[info] 123
[success] Total time: 0 s, completed Dec 24, 2017 3:41:43 PM
> show buildNumber 456
Input version number is 456
[info] 456
[success] Total time: 0 s, completed Dec 24, 2017 3:41:45 PM
It returns whatever you give it and doesn't do anything else (besides the println). More importantly, it doesn't affect the version setting in any way (and couldn't, even in theory).
When you use buildNumber.inputTaskValue, you refer to the input task itself, not its value (which is unknown, because it depends on the user input). You can see this by checking the version setting value:
> show version
[info] sbt.InputTask@6c996907-SNAPSHOT
So it's definitely not what you want.
I suggest you review your approach in general and read a bit more of the sbt docs, for example the Task graph page (and the whole Getting Started chapter).
If you still really need to set version on sbt load according to your input, you can do it with a command. It would look like this:
commands += Command.single("pkg") { (state0, buildNumber) =>
val state1 = Project.extract(state0).append(Seq(version := buildNumber + "-SNAPSHOT"), state0)
val (state2, result) = Project.extract(state1).runTask(packageBin in (deployedPackage, universal), state1)
state2
}
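With that command defined, a hypothetical invocation would be:
$ sbt 'pkg 0.1.2'
which packages deployedPackage with version set to 0.1.2-SNAPSHOT for the duration of that command.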
But you should be really careful when dealing with the state manually. Again, I recommend you review your approach and change it to a more sbt-idiomatic one.
I suggest you set the version setting only for the project deployedPackage, just before you call the task that needs the version. This is the simplest way of setting the version, and since setting keys have scopes, this should work as intended.
I used something like this for a big multi-module project, where one of the projects had a separate versioning history from the other projects.
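For example, from the command line (a sketch; the version literal is arbitrary):
$ sbt 'set version in deployedPackage := "0.1.2-SNAPSHOT"' deployedPackage/universal:packageBin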
There's the command sbt flywayMigrate from flywaydb.org. The command requires us to set flywayUrl, flywayUser, and flywayPassword beforehand. So far so good.
Now I want to be able to use sbt flywayMigrate for two different environments; their variables should be different.
I tried to make two new commands: sbt flywayMigrateDev and sbt flywayMigrateProd. I couldn't figure out how to connect the new commands to flywayMigrate.
I tried creating a new scope. But I couldn't figure out how to wire the variables and tasks properly.
I wonder if anyone can give me an example on how to do this. I'd like to see a code example.
We can simplify the problem to: there's the command sbt flywayMigrate, which depends on flywayUrl. How do we allow the command to use different flywayUrls across different sbt invocations (any other approach is fine, too)?
Thank you!
You should use configurations for this.
Example .sbt file contents:
// Set up your configs.
lazy val prodConfig = config("prod")
lazy val devConfig = config("dev")

// Set up any configuration that's common between dev and prod.
val commonFlyway = Seq(
  // For the sake of example, a couple of shared settings.
  flywayUser := "pg_admin",
  flywayLocations := Seq("filesystem:migrations")
)

// Set up prod and dev.
inConfig(prodConfig)(flywayBaseSettings(prodConfig) ++ commonFlyway)
flywayUrl.in(prodConfig) := "jdbc:etc:proddb.somecompany.com"
// Or however you want to load your production password.
flywayPassword.in(prodConfig) := sys.env.getOrElse("PROD_PASSWD", "(unset)")

inConfig(devConfig)(flywayBaseSettings(devConfig) ++ commonFlyway)
flywayUrl.in(devConfig) := "jdbc:etc:devdb.somecompany.com"
flywayPassword.in(devConfig) := "development_passwd"
Now you can run prod:flywayMigrate and dev:flywayMigrate to migrate production and development, respectively.
See the Flyway docs page for other examples.
My company is switching from ant to sbt to ease Scala integration into our huge existing Java code base (a smart move, if you ask me).
After compiling, we usually post-process all the generated .class files with a tool of our own, which is itself a product of the compilation.
I have been trying to do the same in sbt, and it appears more complicated than expected.
I tried:
calling our postprocessor with fullRunTask. This works fine, but we would like to pass products.value to it to find the .class files, and that does not work
another, even better solution would be to extend compile (compile in Compile ~= { result => ... }). But I could not find out how the code after "result =>" can call our postprocessor
we are looking at other solutions: multiple projects, one for the postprocessor and one for the rest of the code. This would be clean, but because the source code is entangled it is not as easy as it seems (and we would still have the first problem)
Any help?
I would just write a simple plugin that runs after the other stages. It can inspect the target folder for all the .class files.
You could then do something like sbt clean compile myplugin in your build server.
This is the approach taken by the proguard plugin[1]. You could take a look at that as a starting point.
[1] https://github.com/sbt/sbt-proguard
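For illustration, a minimal sketch of that idea (the postprocess key is made up here; the actual invocation of your tool is left as a comment):
lazy val postprocess = taskKey[Unit]("Post-processes compiled .class files")

postprocess := {
  val log = streams.value.log
  // depend on compile so the classes exist before we inspect them
  val analysis = (compile in Compile).value
  val classDir = (classDirectory in Compile).value
  val classFiles = (classDir ** "*.class").get
  classFiles.foreach { f =>
    log.info(s"post-processing $f")
    // e.g. shell out to your tool here: Process(Seq("mytool", f.getAbsolutePath)).!
  }
}
You could then run sbt clean compile postprocess on your build server, or attach the task to compile with triggeredBy as the next answer does.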
Finally, I found a solution after reading "sbt in Action" and other documents. It is very simple, but understanding sbt is not (at least for me).
name := "Foo"
version := "1.0"
scalaVersion := "2.11.0"
fork := true
lazy val foo = TaskKey[Unit]("foo")
val dynamic = Def.taskDyn {
val classDir = (classDirectory in Compile).value
val command = " Foo "+classDir
(runMain in Compile).toTask(command)
}
foo := {
dynamic.value
}
foo <<= foo triggeredBy(compile in Compile)
The sample project contains a Foo.scala file with the main function.
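For reference, a minimal sketch of what that Foo.scala might contain (the real post-processing logic goes where the comment is):
object Foo {
  def main(args: Array[String]): Unit = {
    val classDir = new java.io.File(args(0)) // the class directory passed in by the taskDyn above
    println(s"Post-processing classes under $classDir")
    // walk classDir and rewrite the generated .class files here
  }
}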
I would like to share a common version variable between an sbtPlugin and the rest of the build.
Here is what I am trying:
in project/Build.scala:
object Versions {
  val scalaJs = "0.5.0-M3"
}

object MyBuild extends Build {
  // Use version number
}
in plugins.sbt:
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
results in
plugins.sbt:15: error: not found: value Versions
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
Is there a way to share the version number specification between plugins.sbt and the rest of the build, e.g. project/Build.scala?
sbt-buildinfo
If you need to share a version number between build.sbt and hello.scala, what would you normally do? I don't know about you, but I would use sbt-buildinfo, which I wrote.
It can be configured using the buildInfoKeys setting to expose arbitrary key values, like version or some custom String value. I understand this is not exactly what you're asking, but bear with me.
meta-build (turtles all the way down)
As Jacek noted, and as stated in the Getting Started Guide, a build in sbt is a project defined by the build located in the project directory one level down. To distinguish the builds, let's call the normal build the proper build, and the build that defines the proper build the meta-build. For example, we can say that an sbt plugin is a library dependency of the root project in the meta-build.
Now let's get back to your question. How can we share info between project/Build.scala and project/plugins.sbt?
using sbt-buildinfo for meta-build
We can just define another level of build by creating project/project and adding sbt-buildinfo to the (meta-)meta-build.
Here are the files.
In project/project/buildinfo.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.3.2")
In project/project/Dependencies.scala:
package metabuild
object Dependencies {
  def scalaJsVersion = "0.5.0-M2"
}
In project/build.properties:
sbt.version=0.13.5
In project/buildinfo.sbt:
import metabuild.Dependencies._

buildInfoSettings

sourceGenerators in Compile <+= buildInfo

buildInfoKeys := Seq[BuildInfoKey]("scalaJsVersion" -> scalaJsVersion)

buildInfoPackage := "metabuild"
In project/scalajs.sbt:
import metabuild.Dependencies._

addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % scalaJsVersion)
In project/Build.scala:
import sbt._
import Keys._
import metabuild.BuildInfo._
object Builds extends Build {
  println(s"test: $scalaJsVersion")
}
So there's a bit of boilerplate in project/buildinfo.sbt, but the version info is shared across the build definition and the plugin declaration.
If you're curious where BuildInfo is defined, peek into project/target/scala-2.10/sbt-0.13/src_managed/.
For the project/plugins.sbt file, you'd have to have another project under project with the Versions.scala file. That would make the definition of Versions.scalaJs visible.
The reason is that *.sbt files belong to the build definition at the current level, with *.scala files under project expanding on it. And it's... turtles all the way down; i.e., sbt is recursive.
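Concretely (a sketch; the object and value names are arbitrary), a file at project/project/Versions.scala is compiled as part of the meta-meta-build and is therefore visible to both project/plugins.sbt and project/Build.scala:
// project/project/Versions.scala
object Versions {
  val scalaJs = "0.5.0-M3"
}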
I'm not sure how much the following can help, but it might be worth trying out: to share versions between projects - the plugins and the main one - you'd have to use ProjectRef, as described in the answer to RootProject and ProjectRef:
When you want to include other, separate builds directly instead of using their published binaries, you use "source dependencies". This is what RootProject and ProjectRef declare. ProjectRef is the most general: you specify the location of the build (a URI) and the ID of the project in the build (a String) that you want to depend on. RootProject is a convenience that selects the root project for the build at the URI you specify.
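For illustration, a minimal sketch of a source dependency (the URI and project ID here are placeholders):
// build.sbt: depend on project "otherProject" defined in a sibling build
lazy val other = ProjectRef(file("../other-build"), "otherProject")

lazy val root = (project in file(".")).dependsOn(other)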
My proposal is a hack. For example, in build.sbt you can add a task:
val readPluginSbt = taskKey[String]("Read plugins.sbt file.")

readPluginSbt := {
  val lineIterator = scala.io.Source.fromFile(new java.io.File("project", "plugins.sbt")).getLines
  val linesWithValIterator = lineIterator.filter(line => line.contains("scalaxbVersion"))
  val versionString = linesWithValIterator.mkString("\n").split("=")(1).trim
  val version = versionString.split("\n")(0) // only the val declaration
  println(version)
  version
}
When you call readPluginSbt, it reads plugins.sbt and extracts (and prints) the version variable. You can adapt the parsing to extract whatever value you need.
For example:
resolvers += Resolver.sonatypeRepo("public")
val scalaxbVersion = "1.1.2"
addSbtPlugin("org.scalaxb" % "sbt-scalaxb" % scalaxbVersion)
addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.5.1")
You can extract scalaxbVersion with regular expressions/split:
scala> val line = """val scalaxbVersion = "1.1.2""""
line: String = val scalaxbVersion = "1.1.2"
scala> line.split("=")(1).trim
res1: String = "1.1.2"
I'd like the ScalaDoc I generate with sbt to link to external libraries, and in sbt 0.13 we have autoAPIMappings which is supposed to add these links for libraries that declare their apiURL. In practice though, none of the libraries I use provide this in their pom/ivy metadata, and I suspect some of these libraries will never do so.
The apiMappings setting is supposed to help with just that, but it is typed as Map[File, URL] and hence geared towards setting doc urls for unmanaged dependencies. Managed dependencies are declared as instances of sbt.ModuleID and cannot be inserted directly in that map.
Can I somehow populate the apiMappings setting with something that will associate a URL with a managed dependency?
A related question is: does sbt provide an idiomatic way of getting a File from a ModuleID? I guess I could try to evaluate some classpaths and get back Files to try and map them to ModuleIDs but I hope there is something simpler.
Note: this is related to https://stackoverflow.com/questions/18747265/sbt-scaladoc-configuration-for-the-standard-library/18747266, but that question differs by linking to the scaladoc for the standard library, for which there is a well-known File, scalaInstance.value.libraryJar; that is not the case here.
I managed to get this working for referencing scalaz and play by doing the following:
apiMappings ++= {
  val cp: Seq[Attributed[File]] = (fullClasspath in Compile).value
  def findManagedDependency(organization: String, name: String): File = {
    (for {
      entry <- cp
      module <- entry.get(moduleID.key)
      if module.organization == organization
      if module.name.startsWith(name)
      jarFile = entry.data
    } yield jarFile).head
  }
  Map(
    findManagedDependency("org.scalaz", "scalaz-core") -> url("https://scalazproject.ci.cloudbees.com/job/nightly_2.10/ws/target/scala-2.10/unidoc/"),
    findManagedDependency("com.typesafe.play", "play-json") -> url("http://www.playframework.com/documentation/2.2.1/api/scala/")
  )
}
YMMV of course.
The accepted answer is good, but it'll fail when assumptions about exact project dependencies don't hold. Here's a variation that might prove useful:
apiMappings ++= {
  def mappingsFor(organization: String, names: List[String], location: String, revision: (String) => String = identity): Seq[(File, URL)] =
    for {
      entry: Attributed[File] <- (fullClasspath in Compile).value
      module: ModuleID <- entry.get(moduleID.key)
      if module.organization == organization
      if names.exists(module.name.startsWith)
    } yield entry.data -> url(location.format(revision(module.revision)))

  val mappings: Seq[(File, URL)] =
    mappingsFor("org.scala-lang", List("scala-library"), "http://scala-lang.org/api/%s/") ++
      mappingsFor("com.typesafe.akka", List("akka-actor"), "http://doc.akka.io/api/akka/%s/") ++
      mappingsFor("com.typesafe.play", List("play-iteratees", "play-json"), "http://playframework.com/documentation/%s/api/scala/index.html", _.replaceAll("[\\d]$", "x"))

  mappings.toMap
}
(Including scala-library here is redundant, but useful for illustration purposes.)
If you perform mappings foreach println, you'll get output like (note that I don't have Akka in my dependencies):
(/Users/michaelahlers/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar,http://scala-lang.org/api/2.11.7/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-iteratees_2.11/jars/play-iteratees_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-json_2.11/jars/play-json_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
This approach:
Allows for none or many matches to the module identifier.
Concisely supports linking multiple modules to the same documentation, or, with Nil provided for names, all modules for an organization.
Defers to the module as the version authority, but lets you map over versions as needed (as with Play's libraries, where x is used for the patch number).
Those improvements allow you to create a separate sbt file (call it scaladocMappings.sbt) that can be maintained in a single location and easily copied and pasted into any project.
Alternatively to my last suggestion, the sbt-api-mappings plugin by ThoughtWorks shows a lot of promise. Long term, that's a far more sustainable route than each project maintaining its own set of mappings.