How to share version values between project/plugins.sbt and project/Build.scala? - sbt

I would like to share a common version variable between an sbtPlugin and the rest of the build.
Here is what I am trying:
in project/Build.scala:
object Versions {
  val scalaJs = "0.5.0-M3"
}

object MyBuild extends Build {
  // Use version number
}
in plugins.sbt:
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
results in
plugins.sbt:15: error: not found: value Versions
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
Is there a way to share the version number specification between plugins.sbt and the rest of the build, e.g. project/Build.scala?

sbt-buildinfo
If you need to share a version number between build.sbt and hello.scala, what would you normally do? I don't know about you, but I would use sbt-buildinfo, which I wrote.
This can be configured via the buildInfoKeys setting to expose arbitrary key values, like version or some custom String value. I understand this is not exactly what you're asking, but bear with me.
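For the plain build.sbt-to-hello.scala case the setup is short. Here is a sketch in sbt 0.13-era syntax, matching the settings used later in this answer; the custom key and the package name are assumptions:

```scala
// build.sbt: generate a BuildInfo object exposing version plus a custom value.
buildInfoSettings

sourceGenerators in Compile <+= buildInfo

buildInfoKeys := Seq[BuildInfoKey](name, version, "someCustomValue" -> "hello")

buildInfoPackage := "hello"
```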
meta-build (turtles all the way down)
As Jacek noted, and as the Getting Started Guide states, the build definition in sbt is itself a project, defined in the project directory one level down. To distinguish the builds, let's call the normal build the proper build, and the build that defines the proper build the meta-build. For example, we can say that an sbt plugin is a library of the root project in the meta-build.
Now let's get back to your question. How can we share info between project/Build.scala and project/plugins.sbt?
using sbt-buildinfo for meta-build
We can just define another level of build by creating project/project and adding sbt-buildinfo to the (meta-)meta-build.
Here are the files.
In project/project/buildinfo.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.3.2")
In project/project/Dependencies.scala:
package metabuild
object Dependencies {
  def scalaJsVersion = "0.5.0-M2"
}
In project/build.properties:
sbt.version=0.13.5
In project/buildinfo.sbt:
import metabuild.Dependencies._

buildInfoSettings

sourceGenerators in Compile <+= buildInfo

buildInfoKeys := Seq[BuildInfoKey]("scalaJsVersion" -> scalaJsVersion)

buildInfoPackage := "metabuild"
In project/scalajs.sbt:
import metabuild.Dependencies._
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % scalaJsVersion)
In project/Build.scala:
import sbt._
import Keys._
import metabuild.BuildInfo._
object Builds extends Build {
  println(s"test: $scalaJsVersion")
}
So there's a bit of boilerplate in project/buildinfo.sbt, but the version info is shared across the build definition and the plugin declaration.
If you're curious where BuildInfo is defined, peek into project/target/scala-2.10/sbt-0.13/src_managed/.

For the project/plugins.sbt file you'd have to have another project under project with the Versions.scala file. That would make the definition of Versions.scalaJs visible.
The reason is that *.sbt files belong to the build definition at the current level, while *.scala files under project expand on it. And it's...turtles all the way down, i.e. sbt is recursive.
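Concretely, that extra level looks like this (the file name and the contents are assumed for illustration):

```scala
// project/project/Versions.scala — compiled one level further down, so the
// Versions object becomes visible to the *.sbt files in project/.
object Versions {
  val scalaJs = "0.5.0-M3"
}

// project/plugins.sbt can then reference it directly:
// addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
```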
I'm not sure how much the following can help, but it might be worth trying out. To share versions between projects - the plugins and the main one - you'd have to use ProjectRef, as described in the answer to RootProject and ProjectRef:
When you want to include other, separate builds directly instead of
using their published binaries, you use "source dependencies". This is
what RootProject and ProjectRef declare. ProjectRef is the most
general: you specify the location of the build (a URI) and the ID of
the project in the build (a String) that you want to depend on.
RootProject is a convenience that selects the root project for the
build at the URI you specify.
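A minimal sketch of such a source dependency (the path and the project ID are assumptions):

```scala
// build.sbt: depend on the project with ID "core" defined in ../other-build.
lazy val other = ProjectRef(file("../other-build"), "core")

lazy val root = (project in file(".")).dependsOn(other)
```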

My proposal is to hack. For example, in build.sbt you can add a task:
val readPluginSbt = taskKey[String]("Read plugins.sbt file.")

readPluginSbt := {
  val lineIterator = scala.io.Source.fromFile(new java.io.File("project", "plugins.sbt")).getLines
  val linesWithValIterator = lineIterator.filter(line => line.contains("scalaxbVersion"))
  val versionString = linesWithValIterator.mkString("\n").split("=")(1).trim
  val version = versionString.split("\n")(0) // keep only the val declaration line
  println(version)
  version
}
When you call readPluginSbt you will see the contents of plugins.sbt. You can parse this file and extract the variable.
For example:
resolvers += Resolver.sonatypeRepo("public")

val scalaxbVersion = "1.1.2"

addSbtPlugin("org.scalaxb" % "sbt-scalaxb" % scalaxbVersion)

addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.5.1")
You can extract scalaxbVersion with regular expressions/split:
scala> val line = """val scalaxbVersion = "1.1.2""""
line: String = val scalaxbVersion = "1.1.2"
scala> line.split("=")(1).trim
res1: String = "1.1.2"
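A regex with a capture group is somewhat more robust than splitting on = — a sketch, reusing the scalaxbVersion key from the example above:

```scala
// Match lines like: val scalaxbVersion = "1.1.2"
val VersionLine = """.*val\s+scalaxbVersion\s*=\s*"([^"]+)".*""".r

def extractVersion(line: String): Option[String] = line match {
  case VersionLine(v) => Some(v) // capture group: the quoted version
  case _              => None
}
```

With this, `extractVersion("""val scalaxbVersion = "1.1.2"""")` yields `Some("1.1.2")`, and lines that don't declare the key yield `None` instead of throwing.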

Related

Sbt: is it possible to import a file outside the project in build.sbt?

Context:
I have several applications and each has its own build.sbt.
Those applications depend on an other app with
lazy val sharedApp = RootProject(file("../shared-app"))
The problem is that I have to repeat the configuration of the shared app in every build.sbt, which is cumbersome.
How can I avoid that?
Is there a way to import a file that I would place in this shared app (and thus in the parent folder of each build.sbt file)?
You could use Scala code in the build "project" to do whatever you want, and that fits here if you store the parent project config there.
I see it something like this:
Parent.scala
import sbt._
object Dependencies {
  // Versions
  lazy val someVersion = "2.3.8"

  // Libraries
  val someLib = "com.typesafe.akka" %% "akka-actor" % someVersion
  val someLib2 = "com.typesafe.akka" %% "akka-cluster" % someVersion
}
and later in your build.sbt you could import it and it could reduce your cumbersome code, e.g.
build.sbt
import Dependencies._

// use the keys from Parent.scala to set up your project more quickly, e.g.
libraryDependencies ++= Seq(someLib, someLib2)
Does that make any sense for your use case?

Where to place resources, e.g. images, that scaladoc can use?

I am currently writing the documentation of an API written in Scala. I would like to include several diagrams to make the code more understandable.
I am wondering where to put resources (such as diagrams) in order that they can be automatically imported by an invocation of scaladoc, and how to refer to these resources in the documentation of the code.
For example, let's assume that I use sbt. My code is located in the src/main/scala directory. Here is an example of a scala package object for package foo:
/**
* Provides main classes of the bar API.
*
* ==Overview==
* Main classes are depicted on the following diagram:
* <img src="path/to/diagram-foo.svg" />
*
*/
package object foo {
}
Where should 'diagram-foo.svg' be located in my project in order to be visible to scaladoc? Subsequently, what is the correct value of path/to/ in the img tag?
WARNING It may be a hack as I know very little about scaladoc.
Since <img src="../path/to/diagram-foo.svg" /> is just regular HTML, you only need to copy the necessary assets to the doc target path so the img resolves.
You can use the following custom copyDocAssetsTask, which, together with (doc in Compile) and a src/main/doc-resources directory, gives you what you want. The point is to copy the images to the directory where the documentation is generated, i.e. (target in (Compile, doc)).value.
build.sbt:
lazy val copyDocAssetsTask = taskKey[Unit]("Copy doc assets")

copyDocAssetsTask := {
  println("Copying doc assets")
  val sourceDir = file("src/main/doc-resources")
  val targetDir = (target in (Compile, doc)).value
  IO.copyDirectory(sourceDir, targetDir)
}
copyDocAssetsTask <<= copyDocAssetsTask triggeredBy (doc in Compile)
Obviously the directory where you place the images is arbitrary, and when you decide otherwise, just update the custom task accordingly.
Thanks, I used an adaptation of this that I hope might help others, particularly on multi-module projects:
First, unidoc at https://github.com/sbt/sbt-unidoc will merge your scaladoc from multi-module projects into a single location, which is typically what you want. Then the following in build.sbt:
lazy val copyDocAssetsTask = taskKey[Unit]("Copy unidoc resources")

copyDocAssetsTask := {
  println("Copying unidoc resources")
  val sourceDir = file("src/main/doc-resources")
  val targetDir = (target in (Compile, doc)).value.getParentFile
  println(s"from ${sourceDir.getAbsolutePath} to ${targetDir.getAbsolutePath}")
  IO.copyDirectory(sourceDir, new java.io.File(targetDir, "unidoc"))
}
copyDocAssetsTask := (copyDocAssetsTask triggeredBy (unidoc in Compile)).value
Then put your documents under src/main/doc-resources in the root project, in subdirectories following the package structure of the class whose scaladoc includes the diagram (this just saves you having to mess around with parent directories in the URL), and embed something like:
<img src="DesignModel.svg" width="98%"/> in your scaladoc
e.g. if this scaladoc was in a class in package com.someone.thing in any project in the multi-module build, the DesignModel.svg file would go in src/main/doc-resources/com/someone/thing inside the root project.

How to call our own .class postprocessor after compile?

My company is switching from ant to sbt to ease Scala integration into our huge existing Java code base (smart move if you ask me).
After compiling, we usually post-process all the generated .class with a tool of our own which is a result of the compilation.
I have been trying to do the same in sbt and it appears more complicated than expected.
I tried:
calling our postprocessor with fullRunTask. It works fine, but we would like to pass "products.value" to locate the .class files, and that does not work
another, even better, solution would be to extend compile (compile in Compile ~= { result => ... }). But I did not find out how the code after "result =>" can call our postprocessor
we are looking at other solutions: multiple projects, one for the postprocessor and one for the rest of the code. This would be clean, but because the source code is entangled it is not as easy as it seems (and we would still have the first problem)
Any help?
I would just write a simple plugin that runs after the other stages. It can inspect the target folder for all the .class files.
You could then do something like sbt clean compile myplugin in your build server.
This is the approach taken by the proguard plugin[1]. You could take a look at that as a starting point.
[1] https://github.com/sbt/sbt-proguard
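That idea can also be sketched as a plain custom task rather than a full plugin; the task name is an assumption, and the println stands in for the real post-processor invocation:

```scala
// build.sbt (sbt 0.13-era syntax): post-process every .class file after compile.
lazy val postProcess = taskKey[Unit]("Post-process compiled .class files")

postProcess := {
  val classDir = (classDirectory in Compile).value
  val classFiles = (classDir ** "*.class").get // sbt PathFinder, recursive glob
  classFiles.foreach(f => println(s"post-processing $f"))
}

postProcess <<= postProcess triggeredBy (compile in Compile)
```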
Finally, I found a solution after reading "SBT in Action" and other documents. It is very simple but understanding SBT is not (at least for me).
name := "Foo"
version := "1.0"
scalaVersion := "2.11.0"
fork := true
lazy val foo = TaskKey[Unit]("foo")

val dynamic = Def.taskDyn {
  val classDir = (classDirectory in Compile).value
  val command = " Foo " + classDir
  (runMain in Compile).toTask(command)
}

foo := {
  dynamic.value
}
foo <<= foo triggeredBy(compile in Compile)
The sample project contains a Foo.scala file with the main function.
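The answer doesn't show Foo itself; a sketch of what it might look like (the real post-processing logic is application-specific, so a println stands in for it):

```scala
import java.io.File

// Hypothetical post-processor entry point: the class directory is passed as
// the first argument by the runMain invocation built in the task above.
object Foo {
  // Recursively collect all .class files under `dir`.
  def classFiles(dir: File): Seq[File] = {
    val entries = Option(dir.listFiles).map(_.toSeq).getOrElse(Seq.empty)
    entries.flatMap {
      case d if d.isDirectory                => classFiles(d)
      case f if f.getName.endsWith(".class") => Seq(f)
      case _                                 => Seq.empty
    }
  }

  def main(args: Array[String]): Unit =
    classFiles(new File(args(0))).foreach(f => println(s"post-processing $f"))
}
```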

Can I use sbt's `apiMappings` setting for managed dependencies?

I'd like the ScalaDoc I generate with sbt to link to external libraries, and in sbt 0.13 we have autoAPIMappings which is supposed to add these links for libraries that declare their apiURL. In practice though, none of the libraries I use provide this in their pom/ivy metadata, and I suspect some of these libraries will never do so.
The apiMappings setting is supposed to help with just that, but it is typed as Map[File, URL] and hence geared towards setting doc urls for unmanaged dependencies. Managed dependencies are declared as instances of sbt.ModuleID and cannot be inserted directly in that map.
Can I somehow populate the apiMappings setting with something that will associate a URL with a managed dependency?
A related question is: does sbt provide an idiomatic way of getting a File from a ModuleID? I guess I could try to evaluate some classpaths and get back Files to try and map them to ModuleIDs but I hope there is something simpler.
Note: this is related to https://stackoverflow.com/questions/18747265/sbt-scaladoc-configuration-for-the-standard-library/18747266, but that question differs by linking to the scaladoc for the standard library, for which there is a well known File scalaInstance.value.libraryJar, which is not the case in this instance.
I managed to get this working for referencing scalaz and play by doing the following:
apiMappings ++= {
  val cp: Seq[Attributed[File]] = (fullClasspath in Compile).value
  def findManagedDependency(organization: String, name: String): File = {
    (for {
      entry <- cp
      module <- entry.get(moduleID.key)
      if module.organization == organization
      if module.name.startsWith(name)
      jarFile = entry.data
    } yield jarFile).head
  }
  Map(
    findManagedDependency("org.scalaz", "scalaz-core") -> url("https://scalazproject.ci.cloudbees.com/job/nightly_2.10/ws/target/scala-2.10/unidoc/"),
    findManagedDependency("com.typesafe.play", "play-json") -> url("http://www.playframework.com/documentation/2.2.1/api/scala/")
  )
}
YMMV of course.
The accepted answer is good, but it'll fail when assumptions about exact project dependencies don't hold. Here's a variation that might prove useful:
apiMappings ++= {
  def mappingsFor(organization: String, names: List[String], location: String, revision: (String) => String = identity): Seq[(File, URL)] =
    for {
      entry: Attributed[File] <- (fullClasspath in Compile).value
      module: ModuleID <- entry.get(moduleID.key)
      if module.organization == organization
      if names.exists(module.name.startsWith)
    } yield entry.data -> url(location.format(revision(module.revision)))

  val mappings: Seq[(File, URL)] =
    mappingsFor("org.scala-lang", List("scala-library"), "http://scala-lang.org/api/%s/") ++
      mappingsFor("com.typesafe.akka", List("akka-actor"), "http://doc.akka.io/api/akka/%s/") ++
      mappingsFor("com.typesafe.play", List("play-iteratees", "play-json"), "http://playframework.com/documentation/%s/api/scala/index.html", _.replaceAll("[\\d]$", "x"))

  mappings.toMap
}
(Including scala-library here is redundant, but useful for illustration purposes.)
If you perform mappings foreach println, you'll get output like (note that I don't have Akka in my dependencies):
(/Users/michaelahlers/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar,http://scala-lang.org/api/2.11.7/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-iteratees_2.11/jars/play-iteratees_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-json_2.11/jars/play-json_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
This approach:
Allows for none or many matches to the module identifier.
Concisely supports multiple modules to link the same documentation.
Or, with Nil provided to names, all modules for an organization.
Defers to the module as the version authority.
But lets you map over versions as needed.
As in the case with Play's libraries, where x is used for the patch number.
Those improvements allow you to create a separate SBT file (call it scaladocMappings.sbt) that can be maintained in a single location and easily copied and pasted into any project.
Alternatively to my last suggestion, the sbt-api-mappings plugin by ThoughtWorks shows a lot of promise. Long term, that's a far more sustainable route than each project maintaining its own set of mappings.
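As an aside, the revision function used for the Play mappings above is plain Scala and easy to check in isolation:

```scala
// Rewrite the trailing patch digit to "x", matching Play's doc URL scheme.
val playRevision: String => String = _.replaceAll("[\\d]$", "x")
```

For example, `playRevision("2.4.6")` gives `"2.4.x"`, the form seen in the printed mappings above.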

Does sbt have something like gradle's processResources task with ReplaceTokens support?

We are moving into Scala/SBT from a Java/Gradle stack. Our gradle builds were leveraging a task called processResources and some Ant filter thing named ReplaceTokens to dynamically replace tokens in a checked-in .properties file without actually changing the .properties file (just changing the output). The gradle task looks like:
processResources {
  def whoami = System.getProperty('user.name')
  def hostname = InetAddress.getLocalHost().getHostName()
  def buildTimestamp = new Date().format('yyyy-MM-dd HH:mm:ss z')
  filter ReplaceTokens, tokens: [
    "buildsig.version"    : project.version,
    "buildsig.classifier" : project.classifier,
    "buildsig.timestamp"  : buildTimestamp,
    "buildsig.user"       : whoami,
    "buildsig.system"     : hostname,
    "buildsig.tag"        : buildTag
  ]
}
This task locates all the template files in the src/main/resources directory, performs the requisite substitutions and outputs the results at build/resources/main. In other words it transforms src/main/resources/buildsig.properties from...
buildsig.version=#buildsig.version#
buildsig.classifier=#buildsig.classifier#
buildsig.timestamp=#buildsig.timestamp#
buildsig.user=#buildsig.user#
buildsig.system=#buildsig.system#
buildsig.tag=#buildsig.tag#
...to build/resources/main/buildsig.properties...
buildsig.version=1.6.5
buildsig.classifier=RELEASE
buildsig.timestamp=2013-05-06 09:46:52 PDT
buildsig.user=jenkins
buildsig.system=bobk-mbp.local
buildsig.tag=dev
Which, ultimately, finds its way into the WAR file at WEB-INF/classes/buildsig.properties. This works like a champ to record build specific information in a Properties file which gets loaded from the classpath at runtime.
What do I do in SBT to get something like this done? I'm new to Scala / SBT so please forgive me if this seems a stupid question. At the end of the day what I need is a means of pulling some information from the environment on which I build and placing that information into a properties file that is classpath loadable at runtime. Any insights you can give to help me get this done are greatly appreciated.
The sbt-buildinfo plugin is a good option. Its README shows an example of how to define custom mappings, including mappings that should be recomputed on each compile. In addition to the straightforward addition of normal settings like version shown there, you want a section like this:
buildInfoKeys ++= Seq[BuildInfoKey](
  "hostname" -> java.net.InetAddress.getLocalHost().getHostName(),
  "whoami" -> System.getProperty("user.name"),
  BuildInfoKey.action("buildTimestamp") {
    java.text.DateFormat.getDateTimeInstance.format(new java.util.Date())
  }
)
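For reference, the object sbt-buildinfo then generates under src_managed looks roughly like this (a sketch, not verbatim plugin output; the values are illustrative):

```scala
// Generated as e.g. target/scala-2.10/src_managed/main/sbt-buildinfo/BuildInfo.scala;
// the enclosing package is controlled by buildInfoPackage.
object BuildInfo {
  val name: String = "myproject"
  val version: String = "1.0"
  val hostname: String = "bobk-mbp.local"
  val whoami: String = "jenkins"
  val buildTimestamp: String = "May 6, 2013 9:46:52 AM"
}
```

Your code then reads these values from the classpath-visible object at runtime instead of a .properties file.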
Would the following be what you're looking for:
sbt-editsource: An SBT plugin for editing files
sbt-editsource is a text substitution plugin for SBT 0.11.x and
greater. In a way, it’s a poor man’s sed(1), for SBT. It provides the
ability to apply line-by-line substitutions to a source text file,
producing an edited output file. It supports two kinds of edits:
Variable substitution, where ${var} is replaced by a value.
sed-like regular expression substitution.
This is from Community Plugins.
