How to set the value of a SettingKey based on different sbt commands?

There's the command sbt flywayMigrate from flywaydb.org. The command requires us to set flywayUrl, flywayUser, and flywayPassword beforehand. That has worked well so far.
Now I want to be able to use sbt flywayMigrate for two different environments; their variables should be different.
I tried to make two new commands: sbt flywayMigrateDev and sbt flywayMigrateProd. I couldn't figure out how to connect the new commands to flywayMigrate.
I tried creating a new scope, but I couldn't figure out how to wire the variables and tasks properly.
I wonder if anyone can give me an example of how to do this; I'd like to see a code example.
We can simplify the problem to:
There's the command sbt flywayMigrate that depends on flywayUrl. How do we allow the command to use different flywayUrls via different sbt commands (or any other way; that's fine, too)?
Thank you!

You should use configurations for this.
Example .sbt file contents:
// Set up your configs.
lazy val prodConfig = config("prod")
lazy val devConfig = config("dev")
// Set up any configuration that's common between dev and prod.
val commonFlyway = Seq(
  // For the sake of example, a couple of shared settings.
  flywayUser := "pg_admin",
  flywayLocations := Seq("filesystem:migrations")
)
// Set up prod and dev.
inConfig(prodConfig)(flywayBaseSettings(prodConfig) ++ commonFlyway)
flywayUrl.in(prodConfig) := "jdbc:etc:proddb.somecompany.com"
// Or however you want to load your production password.
flywayPassword.in(prodConfig) := sys.env.getOrElse("PROD_PASSWD", "(unset)")
inConfig(devConfig)(flywayBaseSettings(devConfig) ++ commonFlyway)
flywayUrl.in(devConfig) := "jdbc:etc:devdb.somecompany.com"
flywayPassword.in(devConfig) := "development_passwd"
Now you can run prod:flywayMigrate and dev:flywayMigrate to migrate production and development, respectively.
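If you also want the exact sbt flywayMigrateDev and sbt flywayMigrateProd commands from the question, one possible sketch (assuming the configs above) is to add command aliases that simply forward to the config-scoped task:
// Hypothetical aliases; they just forward to the scoped task shown above.
addCommandAlias("flywayMigrateDev", "dev:flywayMigrate")
addCommandAlias("flywayMigrateProd", "prod:flywayMigrate")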
See the Flyway docs page for other examples.

Related

Pass in version number when building in sbt

I have a semi-complicated SBT process because I need to conditionally include a different config file based on what kind of build is needed. I solved this problem through sub-projects:
lazy val app = project
  .in(file("."))
  .enablePlugins(JavaAppPackaging)
  .settings(
    commonSettings // Seq() of settings to be shared between projects
    , sourceGenerators in Compile += (avroScalaGenerateSpecific in Compile).taskValue
    , (avroSpecificSourceDirectory in Compile) := new java.io.File("src/main/resources/com/coolCompany/folderName/avro")
  )
lazy val localPackage = project
  .in(file("build/local"))
  .enablePlugins(JavaAppPackaging)
  .settings(
    organization := "com.coolCompany",
    version := "0.1.0-SNAPSHOT",
    scalaVersion := "2.11.8",
    name := "my-neat-project",
    scalacOptions := compilerOptions, //Seq() of compiler flags
    sourceDirectory in Compile := (sourceDirectory in (app, Compile)).value,
    mappings in Universal += {
      ((sourceDirectory in Compile).value / "../../conf/local/config.properties") -> "lib/config.properties"
    }
  )
  .dependsOn(app)
val buildNumber = inputKey[String]("The version number of the artifact.")

lazy val deployedPackage = project
  .in(file("build/deployed"))
  .enablePlugins(JavaAppPackaging)
  .settings(
    organization := "com.coolCompany",
    buildNumber := {
      val args: Seq[String] = spaceDelimited("<arg>").parsed
      println(s"Input version number is ${args.head}")
      args.head
    },
    version := buildNumber.inputTaskValue + "-SNAPSHOT", //"0.1.0-SNAPSHOT",
    scalaVersion := "2.11.8",
    name := "my-cool-project",
    scalacOptions := compilerOptions,
    sourceDirectory in Compile := (sourceDirectory in (app, Compile)).value,
    mappings in Universal += {
      ((sourceDirectory in Compile).value / "../../conf/deployed/config.properties") -> "lib/config.properties"
    }
  )
  .dependsOn(app)
Now I need to allow the version number to be passed in by a build tool when building. You can see what I've attempted to do already: I created an inputKey task called buildNumber, then tried to access that in the version := definition. I can run the buildNumber task itself just fine:
$ sbt 'deployedPackage/buildNumber 0.1.2'
Input version number is 0.1.2
So I can at least verify that my input task works as expected. The issue is that I can't figure out how I actually get to that input value when running the actual packageBin step that I want.
I've tried the following:
$ sbt 'deployedPackage/universal:packageBin 0.1.2'
[error] Expected key
[error] Expected '::'
[error] Expected end of input.
[error] deployedPackage/universal:packageBin 0.1.2
So it clearly doesn't understand what to do with the version number. I've tried a bunch of different input variations, such as [...]packageBin buildNumber::0.1.2, [...]packageBin -- buildNumber 0.1.2, or [...]packageBin -- 0.1.2, and all of them give that error or something similar indicating it doesn't understand what I'm trying to pass in.
Now, ultimately, these errors make sense. buildNumber, the task, is what knows what to do with the command-line values, but packageBin does not. How do I set up this task, or this set of tasks, to allow the version number to be passed in?
I have seen this question but the answers link to an sbt plugin that seems to do about 100 more things than I want it to do, including quite a few that I would need to find a way to explicitly disable. I only want the version number to be able to be passed in & used in the artifact.
Edit/Update: I resolved this issue by switching back to Maven.
I think you're a bit confused about the way sbt settings work. Settings are set when sbt loads and then cannot be changed until you reload the session. So whatever you set version to, it will be fixed; it cannot be dynamic and depend on user input or on a task (which, by contrast, is dynamic and is evaluated every time you call it).
This is true. However, you can still compute a SettingKey's value when it is first set. We use an environment variable to set our build version.
version := "1." + sys.env.getOrElse("BUILD_NUMBER", "0-SNAPSHOT")
Explanation
The leading 1. is the major version; increment it manually as you like.
BUILD_NUMBER is an environment variable that is typically set by a CI server, e.g. Jenkins. If it is not set, 0-SNAPSHOT is used as the version suffix.
We use this in our company for our continuous deployment pipeline.
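If your CI passes the build number as a JVM system property instead of an environment variable, the same idea works with sys.props (the property name build.number is just an example, not something sbt defines):
// Sketch: invoking sbt -Dbuild.number=42 would produce version 1.42
version := "1." + sys.props.getOrElse("build.number", "0-SNAPSHOT")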
Hope that helps,
Muki
I think you're a bit confused about the way sbt settings work. Settings are set when sbt loads and then cannot be changed until you reload the session. So whatever you set version to, it will be fixed; it cannot be dynamic and depend on user input or on a task (which, by contrast, is dynamic and is evaluated every time you call it).
You have an input task buildNumber which depends on user input and is evaluated every time you call it:
> show buildNumber 123
Input version number is 123
[info] 123
[success] Total time: 0 s, completed Dec 24, 2017 3:41:43 PM
> show buildNumber 456
Input version number is 456
[info] 456
[success] Total time: 0 s, completed Dec 24, 2017 3:41:45 PM
It returns whatever you give it and doesn't do anything else (besides the println). More importantly, it doesn't affect the version setting in any way (and couldn't, even in theory).
When you use buildNumber.inputTaskValue, you refer to the input task itself, not to its value (which is unknown, because it depends on user input). You can see this by checking the version setting's value:
> show version
[info] sbt.InputTask#6c996907-SNAPSHOT
So it's definitely not what you want.
I suggest you review your approach in general and read a bit more of the sbt docs, for example the page on the Task graph (and the whole Getting Started chapter).
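For reference, tasks in sbt depend on one another via .value; here is a tiny sketch (the showArtifact key is hypothetical, purely for illustration) of a task that consumes another task's result:
// Sketch: a task that depends on packageBin and prints the artifact path
lazy val showArtifact = taskKey[Unit]("Prints the path of the packaged artifact")
showArtifact := {
  val artifact = (packageBin in Compile).value
  println(s"Packaged: ${artifact.getAbsolutePath}")
}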
If you still really need to set version on sbt load according to your input, you can do it with a command. It would look like this:
commands += Command.single("pkg") { (state0, buildNumber) =>
  val state1 = Project.extract(state0).append(Seq(version := buildNumber + "-SNAPSHOT"), state0)
  val (state2, result) = Project.extract(state1).runTask(packageBin in (deployedPackage, Universal), state1)
  state2
}
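With that command defined, invoking it from the shell would look roughly like this (the version number is just an example):
$ sbt 'pkg 0.1.2'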
But you should be really careful when dealing with the state manually. Again, I recommend reviewing your approach and changing it to a more sbt-idiomatic one.
I suggest you set the version setting only for the deployedPackage project, just before you call the task that needs the version. This is the simplest way of setting the version, and since setting keys are scoped, it should work as intended.
I used something like this for a big multi-module project where one of the projects had a separate versioning history from the other projects.
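A minimal sketch of that approach from the command line, assuming the build above (the version number is just an example):
$ sbt 'project deployedPackage' 'set version := "0.1.2-SNAPSHOT"' universal:packageBin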

Define configuration with compiler plugin

I'd like to create a compile configuration which is the same as the default one but adds a compiler plugin. In my particular case, I want to have a "dev" configuration but with the linter plugin (https://github.com/HairyFotr/linter) because it slows down compile times and there's no need to run it in production or continuous integration.
Now this is what I tried:
lazy val Dev = config("dev") extend Compile
lazy val root = (project in file(".")).configs(Dev).settings(
  inConfig(Dev)(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12")): _*)
and it should work, since when I inspect dev:libraryDependencies, it's what I expect it to be: it has org.psywerx.hairyfotr:linter:0.1.12:plugin->default(compile). Normally, if I add the library with a "plugin" scope, it does work for the default settings:
libraryDependencies += ("org.psywerx.hairyfotr" %% "linter" % "0.1.12" % "plugin")
It just does not work if I add this under a different configuration, so there must be something else going on here.
This solves the problem, but not exactly in the way that was asked. Here's the full build.sbt:
libraryDependencies ++= Seq(
  "org.psywerx.hairyfotr" %% "linter" % "0.1.14" % "test")

val linter = Command.command("linter")(state => {
  val linterJar = for {
    (newState, result) <- Project.runTask(fullClasspath in Test, state)
    cp <- result.toEither.right.toOption
    linter <- cp.find(
      _.get(moduleID.key).exists(mId =>
        mId.organization == "org.psywerx.hairyfotr" &&
          mId.name == "linter_2.11"))
  } yield linter.data.absolutePath
  val res = Project.runTask(scalacOptions, state)
  res match {
    case Some((newState, result)) =>
      result.toEither.right.foreach { defaultScalacOptions =>
        Project.runTask(compile in Test,
          Project.extract(state).append(
            scalacOptions := defaultScalacOptions ++ linterJar.map(p => Seq(s"-Xplugin:$p")).getOrElse(Seq.empty),
            newState))
      }
    case None => sys.error("Couldn't get defaultScalacOptions")
  }
  state
})

lazy val root = (project in file(".")).configs(Test).settings(commands ++= Seq(linter))
The fact that the command returns the unmodified state means it doesn't change the project settings. So if you run sbt linter, you should get your project compiled with the additional scalacOptions, but if you then run compile in the same sbt session, it will not use those additional settings.
The tricky thing here is that scalacOptions is actually a TaskKey, not a SettingKey. I don't know why that is, but to get its value you have to run that task. One reason might be that in sbt you cannot make a setting depend on a task, but you can make a task depend on a task. In other words, scalacOptions can depend on another task's value, and maybe internally it does; I haven't checked. If the current answer works for you, I can try to think about a more elegant way of achieving the same result.
EDIT: modified the code to specify the scalacOptions for the linter plugin proper. Please note that the plugin has to be a managed dependency, not just a downloaded jar, for this solution to work. If you want to have it unmanaged, there's a way, but I won't go into it for now. Additionally, I've taken the liberty of making it also work for test code, for illustration purposes.
Looking at Defaults.scala in the source, it seems like the compile command is always taking the options from the compile scope. So if I'm correct, you can have only one set of compilation options!
This seems to be confirmed by the fact that scalacOptions behaves the same way, and this is also why I don't see a non-hacky answer for these similar questions:
Different scalac options for different scopes or tasks?
Different compile options for tests and release in SBT?
I'd be happy to be proven wrong.
EDIT: FWIW, one might not be able to define another scalac options profile in the same project, but you could do so in a "different" project:
lazy val dev = (project in file(".")).
  settings(target := baseDirectory.value / "target" / "dev").
  settings(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12"))
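You would then compile with the linter enabled by invoking the dev project explicitly, e.g. (a sketch; the regular compile stays plugin-free):
$ sbt dev/compile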
This has the disadvantage that it has a separate output directory, so it will take more space and, more importantly, will not get incremental compiles between the two projects. However, after spending some time thinking about it, this may be by design. After all, even though linters don't, some scalac compilation options could conceivably change the output. This would make it meaningless to try to keep the metadata for incremental compilation from one set of scalac options to another. Thus different scalac options would indeed require different target directories.

Automatically clearing the screen on recompile in sbt

I am using sbt with the sbt-revolver plugin and I want to clear the terminal screen (^L) when the project is recompiled (~ re-start). How can this be done?
You could define a new command, clear, which uses JLine to clear the screen. sbt uses JLine internally, so you shouldn't have to add any extra dependency.
build.sbt
def clearConsoleCommand = Command.command("clear") { state =>
  val cr = new jline.console.ConsoleReader()
  cr.clearScreen
  state
}
val root = project.in(file(".")).settings(commands += clearConsoleCommand)
Now you can run your compile like this: ~;clear;compile. This will clear the console and then run compile on each file change (assuming this is what you want).
Another solution, based on @SethTisue's answer:
alias clearScreen=eval "\u001B[2J\u001B[0\u003B0H"
This line should be added to ~/.sbtrc, so that sbt knows about a clearScreen command. You can either invoke the command with ~;clearScreen;compile, or define an additional alias like alias cc=~;clearScreen;compile.
On Twitter, Paul Phillips suggested this method:
alias cc = ~ ;eval "\u001B[2J\u001B[0\u003B0H" ;compile
source: https://twitter.com/extempore2/status/403564233775775744
This specifically helps when you're doing something in continuous mode, a la ~compile:
maxErrors := 5
triggeredMessage := Watched.clearWhenTriggered
This works as of 0.13.7. The second line clears the screen before each triggered run; the first line limits the number of errors. With this config you only ever have one screenful of errors to work through. Obviously, you could adjust maxErrors depending on the size of your sbt window.

How to set fork in Test when -jvm-debug given on command line?

Is there a way to conditionally disable forking if the project is run in debug mode:
sbt -jvm-debug 9999
Then in my build:
fork in Test := {
  // find a key that lets me know if debugging is set up
  !isDebugging.value
}
Specifying flywayUrl through system property in SBT should be of some help.
Add the following to build.sbt:
lazy val isDebugging = settingKey[Boolean]("true when xdebug is true; false otherwise")
isDebugging := System.getProperty("xdebug") == "true"
fork in Test := !isDebugging.value
When you execute sbt -Dxdebug=true it gives you what you want.
BTW, I see no references to jvm-debug in the sbt sources, but it is indeed in the shell script I'm using to fire it up. It could be that you'd have to change sbt-launch-lib.bash to set the xdebug property when -jvm-debug is given.
Jacek's suggestion points in the right direction, but it didn't work for me (does it work at all?). System.getProperty cannot retrieve the -Xdebug flag set by sbt's Bash script: when you call System.getProperties, -Xdebug is not listed there, nor is any other non-standard JVM option (-Xmx, for example).
What's worked for me is this:
import java.lang.management.ManagementFactory

lazy val isDebug = settingKey[Boolean]("true when -Xdebug is set, false otherwise")
isDebug := ManagementFactory.getRuntimeMXBean.getInputArguments.contains("-Xdebug")
fork in Test := !isDebug.value
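Depending on the launcher version, the debug option may show up as -agentlib:jdwp=... rather than -Xdebug; in that case, a variant of the same idea (a sketch, not tested against every launcher) would check for either form:
import scala.collection.JavaConverters._

isDebug := ManagementFactory.getRuntimeMXBean.getInputArguments.asScala
  .exists(arg => arg == "-Xdebug" || arg.startsWith("-agentlib:jdwp"))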
Cheers!

Does sbt have something like gradle's processResources task with ReplaceTokens support?

We are moving into Scala/SBT from a Java/Gradle stack. Our gradle builds were leveraging a task called processResources and some Ant filter thing named ReplaceTokens to dynamically replace tokens in a checked-in .properties file without actually changing the .properties file (just changing the output). The gradle task looks like:
processResources {
  def whoami = System.getProperty( 'user.name' );
  def hostname = InetAddress.getLocalHost().getHostName()
  def buildTimestamp = new Date().format('yyyy-MM-dd HH:mm:ss z')
  filter ReplaceTokens, tokens: [
    "buildsig.version"    : project.version,
    "buildsig.classifier" : project.classifier,
    "buildsig.timestamp"  : buildTimestamp,
    "buildsig.user"       : whoami,
    "buildsig.system"     : hostname,
    "buildsig.tag"        : buildTag
  ]
}
This task locates all the template files in the src/main/resources directory, performs the requisite substitutions and outputs the results at build/resources/main. In other words it transforms src/main/resources/buildsig.properties from...
buildsig.version=#buildsig.version#
buildsig.classifier=#buildsig.classifier#
buildsig.timestamp=#buildsig.timestamp#
buildsig.user=#buildsig.user#
buildsig.system=#buildsig.system#
buildsig.tag=#buildsig.tag#
...to build/resources/main/buildsig.properties...
buildsig.version=1.6.5
buildsig.classifier=RELEASE
buildsig.timestamp=2013-05-06 09:46:52 PDT
buildsig.user=jenkins
buildsig.system=bobk-mbp.local
buildsig.tag=dev
Which, ultimately, finds its way into the WAR file at WEB-INF/classes/buildsig.properties. This works like a champ to record build specific information in a Properties file which gets loaded from the classpath at runtime.
What do I do in SBT to get something like this done? I'm new to Scala / SBT so please forgive me if this seems a stupid question. At the end of the day what I need is a means of pulling some information from the environment on which I build and placing that information into a properties file that is classpath loadable at runtime. Any insights you can give to help me get this done are greatly appreciated.
The sbt-buildinfo plugin is a good option. The README shows an example of how to define custom mappings and mappings that should run on each compile. In addition to the straightforward addition of normal settings like version shown there, you want a section like this:
buildInfoKeys ++= Seq[BuildInfoKey](
  "hostname" -> java.net.InetAddress.getLocalHost().getHostName(),
  "whoami"   -> System.getProperty("user.name"),
  BuildInfoKey.action("buildTimestamp") {
    java.text.DateFormat.getDateTimeInstance.format(new java.util.Date())
  }
)
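For completeness, here is a sketch of the surrounding setup and of how the generated object could then be read at runtime; the package name com.coolCompany and the enablePlugins line are assumptions for illustration, not something from the question:
// build.sbt (sketch)
lazy val root = (project in file("."))
  .enablePlugins(BuildInfoPlugin)
  .settings(
    buildInfoPackage := "com.coolCompany"
  )

// application code (sketch): each key above becomes a field on the generated object
// println(com.coolCompany.BuildInfo.hostname)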
Would the following be what you're looking for?
sbt-editsource: An SBT plugin for editing files
sbt-editsource is a text substitution plugin for SBT 0.11.x and
greater. In a way, it’s a poor man’s sed(1), for SBT. It provides the
ability to apply line-by-line substitutions to a source text file,
producing an edited output file. It supports two kinds of edits:
variable substitution, where ${var} is replaced by a value, and sed-like regular expression substitution.
This is from Community Plugins.
