Setting a key in local.sbt so that it is applied in a multi-project build - sbt

How do I set a key in a local.sbt file in such a way that every subproject finds it?
I'm trying to use the Coursier plugin in a multi-project build, but since I'm only testing it, I don't want to check it into our git repo.
So I put coursierUseSbtCredentials := true in my project/local.sbt.
This has no visible effect.
The authenticated Nexus repository is defined in the commonSettings val in my build.sbt:
val commonSettings = Seq(
  ...
  resolvers += "my-nexus" at "http://blah",
  credentials += ...
)
which every sub-project uses with .settings(commonSettings) (as per the best-practices guide).
If I put coursierUseSbtCredentials := true in commonSettings it does work, but then I'd have to add it to my build.sbt, which I would rather not do.
How do I set this key so that every subproject can see it, in such a way that it is external to the build.sbt file (e.g. a local.sbt)?

Create a local plugin at project/SetupCoursierPlugin.scala:
import sbt._
import coursier.CoursierPlugin, CoursierPlugin.autoImport._
object SetupCoursierPlugin extends AutoPlugin {
  override def requires = CoursierPlugin
  override def trigger = allRequirements
  override def projectSettings = Seq(
    coursierUseSbtCredentials := true
  )
}
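Because trigger is allRequirements, the plugin is enabled automatically on every project that has CoursierPlugin, so each subproject picks up coursierUseSbtCredentials without touching build.sbt, and a .scala file under project/ is easy to keep out of git. Alternatively, as an untested sketch (assuming the key is an ordinary project-level setting, so that it delegates to the build-level scope), you could set it for the whole build from a root-level local.sbt:
coursierUseSbtCredentials in ThisBuild := true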


Find master project root directory in sbt multi-project

Is it possible to reference the master project from a subproject, in a multi-project sbt file?
I am writing a custom task, and I need to find two directories:
from the master project: the baseDirectory
from the subproject: the target
Of course, each of these is available inside its own project. But I need to access them in the same code.
How can I do that?
The project layout is:
some/dir/build.sbt
val masterRoot = baseDirectory.value.getAbsolutePath // this works
lazy val root = (project in file(".")).aggregate(subproject)
some/dir/subproject/build.sbt
lazy val someTask = TaskKey[String]("someTask")

someTask := {
  val subprojectTarget = target.value.getAbsolutePath // this works
  val masterRootBroken = baseDirectory.in(root).value.getAbsolutePath // root is not found
  // I need access to subprojectTarget AND masterRoot here
}
Alternatively, can I set a value into a SettingKey in the graph in the master project, and read it in the subproject?
I think there are two options available to you.
The first option is the multi-file project structure that you already have.
build.sbt:
val sub = (project in file("sub"))
val root = (project in file("."))
Note: neither of the above lines is mandatory; they are defined just to represent some possible additional logic, like aggregation.
And sub/build.sbt with the content:
val root = (project in file("..")) //Note that ".." is used to refer to root project folder
val combinedPath = TaskKey[String]("combinedPath")
combinedPath := {
target.value.getAbsolutePath + baseDirectory.in(root).value.getAbsolutePath
}
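With this layout, the task can be run from the root build, assuming the subproject id is sub as above, with something like:
sbt sub/combinedPath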
The second option is to combine all build.sbt files into one build.sbt in the root project with the content:
val combinedPath = TaskKey[String]("combinedPath")

val sub = (project in file("sub"))
  .settings(
    combinedPath := {
      target.value.getAbsolutePath + baseDirectory.in(root).value.getAbsolutePath
    }
  )

lazy val root = (project in file("."))
The definition of the combinedPath task is done in the settings of the sub project, and it can refer to baseDirectory.in(root) of the root project.
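As for the alternative raised at the end of the question (setting a value in the master project and reading it in the subproject), a sketch of that approach would be to scope a custom setting to ThisBuild in the root build.sbt, assuming the key (masterRoot below is a made-up name) is defined somewhere both build files can see it, such as a .scala file under project/:
val masterRoot = settingKey[String]("Absolute path of the master project root")

masterRoot in ThisBuild := baseDirectory.value.getAbsolutePath
A subproject can then read masterRoot.value and get the ThisBuild value through scope delegation.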

How can I make a task depend on another task?

I'm new to sbt and I'm trying to create a script that either deploys my application, or deploys and runs it.
What already works for me is
sbt deploy
which will successfully deploy the final .jar file to the remote location.
However, I don't know how to make deployAndRunTask dependent on deployTask. I've tried several things but none of them worked so far.
My last hope was
deployAndRunTask := {
  val d = deployTask.value
}
However, this does not seem to work.
This is the script I currently have, but sbt deploy-run will only execute the deployAndRunTask task, not the deployTask.
// DEPLOYMENT
val deployTask = TaskKey[Unit]("deploy", "Copies assembly jar to remote location")

deployTask <<= assembly map { (asm) =>
  val account = "user@example.com"
  val local = asm.getPath
  val remote = account + ":" + "/home/user/" + asm.getName
  println(s"Copying: $local -> $remote")
  Seq("scp", local, remote) !!
}

val deployAndRunTask = TaskKey[Unit]("deploy-run", "Deploy and run application.")

deployAndRunTask := {
  val d = deployTask.value
}

deployAndRunTask <<= assembly map { (asm) =>
  println(s"Running the script ..")
}
What is the problem here?
The problem is that you define your task and then redefine it, so only the latter definition is taken into account. You cannot separate a task's definition from its dependency on another task. You're also using a couple of outdated things in sbt:
use the taskKey macro; then you don't need to think about the task name, because it's the same as the key name:
val deploy = taskKey[Unit]("Copies assembly jar to remote location")
val deployAndRun = taskKey[Unit]("Deploy and run application.")
Then you can refer to them as deploy and deployAndRun, both in build.sbt and in the sbt shell.
replace <<= with := and keyname map { (keyvalue) => ... } with just keyname.value. Things are more concise and easier to write.
You can read more about Migrating from sbt 0.13.x.
So here's your deployAndRun task definition with these changes:
deployAndRun := {
  val d = deploy.value
  val asm = assembly.value
  println(s"Running the script ..")
}
It's dependent on both the deploy and assembly tasks and will run them before doing anything else. You can also use dependsOn, but I think it's unnecessary here.
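For reference, a dependsOn sketch (re-wiring the existing task so that deploy runs first) might look like:
deployAndRun := deployAndRun.dependsOn(deploy).value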
You may also be interested in looking into Defining a sequential task with Def.sequential and Defining a dynamic task with Def.taskDyn.
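For instance, a minimal Def.sequential sketch (available since sbt 0.13.8), assuming the steps must run strictly in this order rather than in parallel:
deployAndRun := Def.sequential(
  assembly,
  deploy,
  Def.task { println("Running the script ..") }
).value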

Rename file using sbt-assembly

I have a Scala project that uses ConfigFactory to set up the application configuration. For building I use sbt (together with sbt-assembly).
Depending on whether I create an assembly with sbt-assembly or whether I just run the project, I would like to use different config files (application.conf when running the project, assembly.conf when running the assembly of the project).
I thought of using assemblyMergeStrategy for this purpose: when assembling the jar, I would discard application.conf and rename assembly.conf. My idea was something like:
assemblyMergeStrategy in assembly := {
  case PathList("application.conf") => MergeStrategy.discard
  case PathList("assembly.conf") => MergeStrategy.rename
  ...
}
What I would like to achieve is that, when assembling the jar, the file assembly.conf is renamed to application.conf and is therefore picked up by ConfigFactory, whereas the original application.conf is discarded.
The code above obviously does not work, as I cannot specify what filename assembly.conf should be renamed to. How can I achieve this?
You need to define your own MergeStrategy (in the project directory) that renames files to application.conf, and then redefine assemblyMergeStrategy in assembly to discard the original application.conf and apply MyMergeStrategy to assembly.conf:
import java.io.File
import sbtassembly.MergeStrategy

class MyMergeStrategy extends MergeStrategy {
  override def name: String = "Rename to application.conf"

  override def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] = {
    Right(files.map(_ -> "application.conf"))
  }
}
And then use it in build.sbt:
val root = (project in file(".")).settings(Seq(
assemblyMergeStrategy in assembly := {
case PathList("application.conf") => MergeStrategy.discard
case PathList("assembly.conf") => new MyMergeStrategy()
case x =>
val oldStrategy = (assemblyMergeStrategy in assembly).value
oldStrategy(x)
}
))
This will do for your case, but for more complicated cases I would look at how they do it in sbt-native-packager:
https://www.scala-sbt.org/sbt-native-packager/recipes/package_configuration.html
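If you would rather not add a class under project/, the same idea can presumably be written inline in build.sbt as an anonymous subclass, replacing the assembly.conf case above (a sketch based on the class shown earlier):
case PathList("assembly.conf") => new MergeStrategy {
  override def name: String = "Rename to application.conf"
  override def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    Right(files.map(_ -> "application.conf"))
}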

How to load setting values from a Java properties file?

Is there a way for me to dynamically load a setting value from a properties file?
I mean, instead of hardcoding it in build.sbt:
name := "helloWorld"
Have some application.properties file with
name=helloWorld
And then, in the build.sbt file, have name := application.properties["name"]
(the last example is purely schematic, but I hope the idea is clear)
You can create a setting key which holds properties read from a file.
import java.util.Properties

val appProperties = settingKey[Properties]("The application properties")

appProperties := {
  val prop = new Properties()
  IO.load(prop, new File("application.properties"))
  prop
}

name := appProperties.value.getProperty("name")
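Note that getProperty returns null when the key is missing, so a variant with a fallback (the "helloWorld" default here is just for illustration) could be:
name := Option(appProperties.value.getProperty("name")).getOrElse("helloWorld")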
Cheating a bit on the answer from @daniel-olszewski.
In project/build.sbt, declare a dependency on Typesafe Config:
libraryDependencies += "com.typesafe" % "config" % "1.2.1"
In build.sbt, load the properties using Typesafe Config and set the settings:
import com.typesafe.config.{ConfigFactory, Config}

lazy val appProperties = settingKey[Config]("The application properties")

appProperties := {
  ConfigFactory.load()
}

name := {
  try {
    appProperties.value.getString("name")
  } catch {
    case _: Exception => "<empty>"
  }
}
You could define a def that would set values from the properties, too.
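For example, such a def might look like this sketch (stringOrElse is a hypothetical helper name), reusing the try/catch fallback above:
def stringOrElse(conf: Config, key: String, default: String): String =
  try conf.getString(key) catch { case _: Exception => default }

name := stringOrElse(appProperties.value, "name", "<empty>")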

Different scalac options for different scopes or tasks?

I am trying to use a compiler plugin with sbt (I'm on 0.13.5), passed along in my build.sbt as:
autoCompilerPlugins := true
scalacOptions += "-Xplugin:myCompilerPluginJar.jar"
This works and the plugin runs; however, I would really like to run the plugin only on some explicit compiles (perhaps with a scoped compile task or a custom task).
If I try something like:
val PluginConfig = config("plugin-config") extend(Compile)
autoCompilerPlugins := true
scalacOptions in PluginConfig += "-Xplugin:myCompilerPluginJar.jar"
The plugin does not run on "plugin-config:compile". In fact, if I have
scalacOptions in Compile += "-Xplugin:myCompilerPluginJar.jar"
The plugin still runs on "test:compile" or compile in any other scope. I would guess I am probably not understanding something correctly about the configs/scopes.
I also tried:
lazy val pluginCommand = Command.command("plugincompile") { state =>
  runTask(compile in Compile,
    append(Seq(scalacOptions in Compile += "Xplugin:myCompilerPluginJar.jar"), state)
  )
  state
}

commands += pluginCommand
But the plugin doesn't actually run on that command, so again I am probably not understanding something there.
Any and all help welcome.
So I have come to a hacky solution; I thought I would share it here in case anyone else stumbles upon this question.
val safeCompile = TaskKey[Unit]("safeCompile", "Compiles, catching errors.")

safeCompile := (compile in Compile).result.value.toEither.fold(
  l => {
    println("Compilation failed.")
  },
  r => {
    println("Compilation success. " + r)
  })

// Hack to allow "-deprecation" and "-unchecked" in scalacOptions by default
scalacOptions <<= scalacOptions map { current: Seq[String] =>
  val default = "-deprecation" :: "-unchecked" :: Nil
  if (current.contains("-Xplugin:myCompilerPluginJar.jar")) current else default
}

addCommandAlias("depcheck", "; set scalacOptions := Seq(\"-deprecation\", \"-unchecked\", \"-Xplugin:myCompilerPluginJar.jar\"); safeCompile; set scalacOptions := Seq(\"-deprecation\", \"-unchecked\")")
As a quick guide, this code:
Defines a custom task, safeCompile, that runs the Compile:compile task but succeeds even on errors (this is needed so that the sequence of commands defined later doesn't break on a compilation failure).
Declares scalacOptions to be dependent on a function that checks whether the plugin is turned on (leaving the options untouched if it is) and otherwise sets the options to the defaults I want for the project (Seq("-deprecation", "-unchecked")). This is a hack so that these settings are on by default and so that a bare scalacOptions := definition doesn't override the settings made in the aliased command sequence. (Using Seq append and .distinct might be a nicer way to do this hacky part; see the sketch after this list.)
Defines an aliased command sequence that turns the plugin on, runs safeCompile, and turns the plugin off.
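For instance, the append-and-distinct idea from point 2 might be sketched as:
scalacOptions ++= Seq("-deprecation", "-unchecked")
scalacOptions := scalacOptions.value.distinct
This appends the defaults and de-duplicates instead of overwriting, so a plugin flag set earlier would survive.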
Comments are welcome, and if you get something cleaner to work, please share!
