SBT setting fallback search path - sbt

I ran the following in the sbt console:
inspect version
And I get something like the following:
[info] Delegates:
[info] *:version
[info] {.}/*:version
[info] */*:version
So what is actually the difference between the last two? I've read the documentation over and over, but I can't tell them apart. One is ThisBuild (a.k.a. the entire build, a.k.a. {.}) while the other is Global.
Why does {.} in the project axis have precedence over * in the project axis?
The values {.} and * look pretty much the same to me.
Thanks!

The order of the last two delegates in:
*:version -> try the current project
{.}/*:version -> then try this build
*/*:version -> finally try Global
says that whatever version you specify for this build should override anything that may have been defined in Global.
Example: Key "version"
For the Global scope it is defined in Defaults.scala with the value "0.1-SNAPSHOT".
For the projects in this build you might want to override that with:
version in ThisBuild := "3.0.1"
So, because {.}/*:version has precedence over */*:version, whenever you look up version in your projects you get "3.0.1" instead of "0.1-SNAPSHOT".
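As a concrete illustration, here is a minimal build.sbt sketch (the project names core and app and the version "4.0.0" are made up) showing how the delegation plays out:
// Global default from sbt itself (*/*:version) is "0.1-SNAPSHOT"
// {.}/*:version applies to every project in this build
version in ThisBuild := "3.0.1"
// core defines no version of its own, so *:version delegates to {.}/*:version and yields "3.0.1"
lazy val core = project
// app sets its own *:version, which wins over both ThisBuild and Global, yielding "4.0.0"
lazy val app = project.settings(version := "4.0.0")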

Related

How to show the result of an SBT task in Console

In the sbt console it is possible to run show <settingKey> to view the value of a setting, for example:
> show resourceManaged
[info] /Users/code/my_project/target/scala-2.11/resource_managed
Is there a way to do this for tasks? That is, to execute a task and view its result in the console?
Yes, you can also print the result of a task in the console with show. For example, run sbt 'show fullClasspath' from the shell, or show fullClasspath in sbt interactive mode.
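For instance, with a small custom task of your own (illustrative only, not part of the original answer), show both runs the task and displays its return value:
lazy val gitHash = taskKey[String]("Current git commit hash")
gitHash := scala.sys.process.Process("git rev-parse HEAD").!!.trim
Typing gitHash at the prompt runs the task without displaying the result; show gitHash runs it and prints the returned string.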
Remember that keys are scoped:
Examples of scoped key notation
fullClasspath specifies just a key, so the default scopes are used: current project, a key-dependent configuration, and global task scope.
test:fullClasspath specifies the configuration, so this is fullClasspath in the test configuration, with defaults for the other two scope axes.
*:fullClasspath specifies Global for the configuration, rather than the default configuration.
doc::fullClasspath specifies the fullClasspath key scoped to the doc task, with the defaults for the project and configuration axes.
{file:/home/hp/checkout/hello/}default-aea33a/test:fullClasspath specifies a project, {file:/home/hp/checkout/hello/}default-aea33a, where the project is identified with the build {file:/home/hp/checkout/hello/} and then a project id inside that build default-aea33a. Also specifies configuration test, but leaves the default task axis.
{file:/home/hp/checkout/hello/}/test:fullClasspath sets the project axis to “entire build” where the build is {file:/home/hp/checkout/hello/}.
{.}/test:fullClasspath sets the project axis to “entire build” where the build is {.}. {.} can be written ThisBuild in Scala code.
{file:/home/hp/checkout/hello/}/compile:doc::fullClasspath sets all three scope axes.
You can use inspect fullClasspath to see the scopes for that task in your project, as shown in the related question above.
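The same scopes can also be written as scoped keys in Scala build-definition syntax; a rough sketch (sbt 0.13-style, and the correspondence is approximate):
fullClasspath in Test              // test:fullClasspath
fullClasspath in (Compile, doc)    // compile:doc::fullClasspath
fullClasspath in (ThisBuild, Test) // {.}/test:fullClasspath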

Making package a dependency of a new sbt task

At the sbt CLI I can just type package and everything works fine: two jar files are produced. But I want to make package a dependency of a new task I am creating, so that packaging happens as part of the build script. This is what I have:
lazy val deployTask = TaskKey[Unit]("deploy")
deployTask := { println("deploy happening now!") }
deployTask := {
  (deployTask.in(file("."))).value
  (Keys.`package` in Compile).value
}
My reading of the documentation tells me that Compile really means file("src/main/scala"), which is not what I want. It seems that I have to put in <Something>. What do I need to put instead of <Something> to get package to mean what it means when I type it at the CLI?
At the CLI I should be able to run:
clean
show deploy
but unfortunately this does not do the packaging I expect.
These are the projects:
projects
[info] In file:/C:/dev/v2/atmosphere/
[info] atmosphereJS
[info] atmosphereJVM
[info] * root
So it makes sense that when I run package from the CLI the root project is used.
So another way of asking this question might be: "How do I make package work for root from the deploy task I am creating?"
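One way to wire this up (a sketch only, not from the original thread; it assumes the two subprojects are available in the root build.sbt as project references named atmosphereJS and atmosphereJVM) is to depend on package in each aggregated subproject explicitly. A plain (Keys.`package` in Compile).value only packages the current project; typing package at the CLI produces both jars because the root project aggregates the subprojects, and that aggregation does not carry over to task dependencies declared with .value.
lazy val deployTask = taskKey[Unit]("deploy")
deployTask := {
  // package both subprojects first, then deploy
  (Keys.`package` in (atmosphereJVM, Compile)).value
  (Keys.`package` in (atmosphereJS, Compile)).value
  println("deploy happening now!")
}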

How to set system properties for runMain on command line?

How can I set a system property for runMain upon executing it from command line on Windows?
I'd like to be able to run the following command:
sbt -Dconfig.resource=../application.conf "runMain akka.Main com.my.main.Actor"
Regardless of whether fork is true, whether I put it in SBT_OPTS, or how else I pass it in, I cannot accomplish this. I am familiar with both "Setting value of setting on command line when no default value defined in build?" and "Setting system properties with "sbt run"" but neither answers my question.
Other questions seem to indicate that you can't even easily view the Java invocation arguments in SBT. Any help is appreciated.
This works:
sbt '; set javaOptions += "-Dconfig.resource=../application.conf" ; runMain akka.Main com.my.main.Actor'
If this isn't a "friendly" enough syntax, wrap it in a little shell script.
(Note this assumes you have fork set to true for running. If you don't, see akauppi's comment.)
You could use the envVars setting. I'm not sure how idiomatic that is in SBT, though.
> help envVars
Environment variables used when forking a new JVM
The following (very minimalistic) build.sbt worked fine.
fork := true
envVars := Map("msg" -> "hello")
Once you get it running, setting envVars to any value with set does the trick.
> help set
set [every] <setting-expression>
Applies the given setting to the current project:
1) Constructs the expression provided as an argument by compiling and loading it.
2) Appends the new setting to the current project's settings.
3) Re-evaluates the build's settings.
This command does not rebuild the build definitions, plugins, or configurations.
It does not automatically persist the setting(s) either.
To persist the setting(s), run 'session save' or 'session save-all'.
If 'every' is specified, the setting is evaluated in the current context
and the resulting value is used in every scope. This overrides the value
bound to the key everywhere.
I've got a simple app to run.
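The app itself is not shown in the original answer; presumably it just prints the msg environment variable, along these lines:
// Hello.scala (a guess at the app; anything that reads sys.env("msg") would do)
object Hello extends App {
  println(sys.env.getOrElse("msg", "msg not set"))
}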
$ sbt run
[info] Set current project to fork-testing (in build file:/C:/dev/sandbox/fork-testing/)
[info] Running Hello
[info] hello
With the envVars setting changed on the command line the output would change as follows:
$ sbt 'set envVars := Map("msg" -> "Hello, Chad")' run
[info] Set current project to fork-testing (in build file:/C:/dev/sandbox/fork-testing/)
[info] Defining *:envVars
[info] The new value will be used by *:runner, compile:run::runner and 1 others.
[info] Run `last` for details.
[info] Reapplying settings...
[info] Set current project to fork-testing (in build file:/C:/dev/sandbox/fork-testing/)
[info] Running Hello
[info] Hello, Chad
runMain is no different from run in this case.
$ sbt 'set envVars := Map("msg" -> "Hello, Chad")' 'runMain Hello'
[info] Set current project to fork-testing (in build file:/C:/dev/sandbox/fork-testing/)
[info] Defining *:envVars
[info] The new value will be used by *:runner, compile:run::runner and 1 others.
[info] Run `last` for details.
[info] Reapplying settings...
[info] Set current project to fork-testing (in build file:/C:/dev/sandbox/fork-testing/)
[info] Running Hello
[info] Hello, Chad
If you're trying to set SBT properties, like plugin settings, then the above won't work (as far as I can tell) as of 0.13+, in my experience. The following, however, did work when passing Liquibase settings, like the password, from our CI frameworks.
In your build.sbt
It's ugly, but it supplies defaults and optionally reads from System properties, so both the default and override cases are covered.
def sysPropOrDefault(propName: String, default: String): String =
  Option(System.getProperty(propName)).getOrElse(default)
liquibaseUsername := sysPropOrDefault("liquibase.username", "change_me")
liquibasePassword := sysPropOrDefault("liquibase.password", "chuck(\\)orris")
From the commandline
Now just override via -Dprop=value as you would with Maven or other JVM programs. Note that the properties appear before the SBT task.
sbt -Dliquibase.password="shh" -Dliquibase.username="bob" liquibase:liquibase-update

Defining plugin dependency between subprojects in SBT?

EDIT:
Since I put up the bounty, I thought I should restate the question
How can an SBT project P, with two sub-projects A and B, set up B to have a plugin dependency on A, which is an SBT plugin?
Giving P a plugin dependency on A does not work, since A depends on other things in P, which results in a circular dependency graph.
It has to be a plugin dependency, because A is a plugin needed to run B's test suite.
dependsOn doesn't work because, well, it has to be a plugin dependency.
I'd like to know either of
How to do this, or
Why this is impossible, and what the next best alternatives are.
EDIT: clarified that it's a plugin-dependency, since build-dependency is ambiguous
When you have a multi-project build configuration with "project P and two sub-projects A and B" it boils down to the following configuration:
build.sbt
lazy val A, B = project
As per design, "If a project is not defined for the root directory in the build, sbt creates a default one that aggregates all other projects in the build." It means that you will have an implicit root project, say P (but the name is arbitrary):
[plugin-project-and-another]> projects
[info] In file:/Users/jacek/sandbox/so/plugin-project-and-another/
[info] A
[info] B
[info] * plugin-project-and-another
That gives us the expected project structure. On to defining the plugin dependency between B and A.
The only way to define a plugin in an SBT project is to use the project directory, which is the plugins project's build definition - "A plugin definition is a project in <main-project>/project/." It means that the only way to define a plugin dependency on the project A is the following:
project/plugins.sbt
addSbtPlugin("org.example" % "example-plugin" % "1.0")
lazy val plugins = project in file(".") dependsOn(file("../A"))
In this build configuration, the plugins project depends on another SBT project that happens to be our A, which is in turn a plugin project.
A/build.sbt
// http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
sbtPlugin := true
name := "example-plugin"
organization := "org.example"
version := "1.0"
A/MyPlugin.scala
import sbt._
object MyPlugin extends Plugin {
  // configuration points, like the built-in `version`, `libraryDependencies`, or `compile`
  // by implementing Plugin, these are automatically imported in a user's `build.sbt`
  val newTask = taskKey[Unit]("A new task.")
  val newSetting = settingKey[String]("A new setting.")
  // a group of settings ready to be added to a Project
  // to automatically add them, do
  val newSettings = Seq(
    newSetting := "Hello from plugin",
    newTask := println(newSetting.value)
  )
  // alternatively, by overriding `settings`, they could be automatically added to a Project
  // override val settings = Seq(...)
}
The two files - build.sbt and MyPlugin.scala in the directory A - make up the plugin project.
The only missing piece is to add the plugin A's settings to the project B.
B/build.sbt
MyPlugin.newSettings
That's pretty much all you can do in SBT. If you want a multi-project build configuration with a plugin dependency between (sub)projects, you don't have much choice other than what's described above.
With that said, let's see if the plugin from the project A is accessible.
[plugin-project-and-another]> newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:31 AM
[plugin-project-and-another]> B/newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:36 AM
[plugin-project-and-another]> A/newTask
[error] No such setting/task
[error] A/newTask
[error] ^
As you may have noticed, newTask (which comes from the plugin in project A) is available in the (default) root project and in project B, but not in A.
As Jacek said, it cannot be done the way I would like, since a subproject cannot have an SBT plugin that the root project does not have. On the other hand, this discussion on the mailing list contains several alternatives and will no doubt be useful to anyone who comes across this question in the future.
EDIT: In the end the alternatives mentioned (sbt scripted, etc.) turned out to be hard and clunky to use. My final solution was to have a separate project (not a subproject) inside the repo that depends on the original project via its Ivy coordinates, using bash to publishLocal the first project, then going into the second project and running its tests:
sbt publishLocal; cd test; sbt test; cd ..
I always thought the point of something like SBT was to avoid doing this kind of bash gymnastics, but desperate times call for desperate measures...
This answer may contain a solution: https://stackoverflow.com/a/12754868/3189923
In short, set exportJars := true and use exportedProducts in Compile to obtain the jar file paths for a (sub)project.
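A sketch of how that might look (illustrative; showJars is a made-up task, while exportJars and exportedProducts are the keys named in the linked answer):
exportJars := true
lazy val showJars = taskKey[Unit]("Print this project's exported jar paths")
showJars := (exportedProducts in Compile).value.map(_.data).foreach(println)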
Leaving the facts about plugins aside, you have a parent project P with sub-projects A and B. You then state that A depends on P. But P is an aggregate of A and B and hence depends on A, so you already have a circular dependency between A and P. This can never work.
You have to split P in two parts: the part that A depends on (let's call it A') and the rest (let's call it P_rest). Then you throw away P and make a new project P_rest consisting of A', A and B. And A depends on A'.
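A rough build.sbt sketch of that restructuring (the names aCore and pRest are made up; A' is spelled aCore since a prime is not a legal identifier):
lazy val aCore = project                   // A': the part of the old P that A needs
lazy val A     = project.dependsOn(aCore)  // the plugin project; now depends only on aCore
lazy val B     = project                   // B's plugin dependency on A is a separate concern (see above)
lazy val pRest = (project in file("."))    // P_rest: the rest of the old P, aggregating the others
  .aggregate(aCore, A, B)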

sbt key that corresponds to command that I type in

I want to make my tests run every time I type universal:package-zip-tarball. I know that to do this, I have to put something like
someKey <<= someKey dependsOn (test in Test)
in my project/Build.scala, where someKey is the key of the task that I want to depend on the test run; in this case, universal:package-zip-tarball.
But my generic question is: how do I find out what someKey should be?
Note that this is a Play framework project, and I don't even know if universal:package-zip-tarball is provided by Play, or by some other sbt plugin.
Is there any way sbt can just tell me, without me having to go searching for the source code repository containing the relevant code?
Use the inspect command:
> inspect universal:package-zip-tarball
[...]
[info] Defined at:
[info]   (com.typesafe.sbt.packager.universal.UniversalPlugin) UniversalPlugin.scala:73
This is actually the location where the code of the task is defined, but it is close enough to help, because it lets us find the key (the key will be in the same sbt plugin).
From this we can find out that the key is:
com.typesafe.sbt.packager.universal.Keys.packageZipTarball
Unfortunately, just substituting this in doesn't work - it says:
[error] Reference to undefined setting:
[error]
[error] my-project/*:packageZipTarball from my-project/*:packageZipTarball
[error] Did you mean my-project/universal-docs:packageZipTarball ?
[error]
[error] Use 'last' for the full log.
So to fix this, the only thing remaining is to translate the universal: prefix. It is in fact this:
packageZipTarball in Universal <<= packageZipTarball in Universal dependsOn (test in Test)
but it just needs an extra import to make it compile:
import com.typesafe.sbt.SbtNativePackager._
(In this case, SbtNativePackager is the main plugin object, I think. Other plugins might require importing something else, to translate such a prefix.)

Resources