Dependent task runs only when main task called directly - sbt

So, I have an SBT project with two custom tasks, jooq:codegen and flyway:migrate, from https://github.com/sean8223/jooq-sbt-plugin and https://github.com/sean8223/flyway-sbt-plugin.
Not that it is relevant here, but the flyway:migrate task creates the schema in the database, and jooq:codegen generates code from that schema. Hence, it is imperative that flyway:migrate runs before jooq:codegen. So, I added the following line to my build.sbt:
(codegen in JOOQ) <<= (codegen in JOOQ) dependsOn (migrate in Flyway)
Also, compile needs the generated code from jooq:codegen, but the plugin takes care of it by default.
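(On newer sbt versions, where the <<= operator has been removed, the same wiring would presumably be written with :=, assuming the same plugin-provided keys; a rough sketch:)
// sbt 1.x sketch of the same dependency; codegen and migrate are the keys provided by the two plugins
codegen in JOOQ := ((codegen in JOOQ) dependsOn (migrate in Flyway)).value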
Here is the weird part. When I run sbt compile, I get:
~/N/p/d/davion git:data-access ❯❯❯ sbt compile
[info] Loading project definition from /Users/rohan/Nomadly/projects/davion-projects/davion/project
[info] Set current project to davion (in build file:/Users/rohan/Nomadly/projects/davion-projects/davion/)
[info] Done updating.
[info] Initialising properties : /jooq-config4460313300896426081.xml
... OUTPUT TRUNCATED ...
[info] Table records generated : Total: 669.172ms, +44.536ms
[info] Routines fetched : 0 (0 included, 0 excluded)
[info] Packages fetched : 0 (0 included, 0 excluded)
[info] GENERATION FINISHED! : Total: 688.254ms, +19.082ms
[info] Compiling 7 Scala sources and 17 Java sources to /Users/rohan/Nomadly/projects/davion-projects/davion/target/classes...
[success] Total time: 24 s, completed 14 May, 2015 12:13:35 PM
So, the flyway:migrate task doesn't run. But when I run sbt jooq:codegen, this happens:
~/N/p/d/davion git:data-access ❯❯❯ sbt jooq:codegen
[info] Loading project definition from /Users/rohan/Nomadly/projects/davion-projects/davion/project
[info] Set current project to davion (in build file:/Users/rohan/Nomadly/projects/davion-projects/davion/)
[info] Flyway (Command-line Tool) v.2.0.3
[info]
[info] Current schema version: 6
[info] Schema is up to date. No migration necessary.
[info] Initialising properties : /jooq-config6431105742854017589.xml
[info] License parameters
... OUTPUT TRUNCATED ...
I have no idea why this is happening. If a task chain is set up where 'A' depends on 'B', which depends on 'C', then shouldn't running 'A' execute both 'C' and 'B' (and in that order)? Why does 'C' not run as a transitive dependency, and how can I remedy this?
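To illustrate, with three throwaway keys (names made up purely for illustration), I would expect running a to execute c, then b, then a:
// Throwaway keys purely to illustrate the chain I expect: running a should run c, then b, then a.
val a = taskKey[Unit]("a")
val b = taskKey[Unit]("b")
val c = taskKey[Unit]("c")

a := println("running a")
b := println("running b")
c := println("running c")

a <<= a dependsOn b
b <<= b dependsOn c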

Related

sbt: Invoking tests after other tasks

I have a multi-project build, and want to run tests after starting two Docker containers. This is my custom task:
runTestsWithDocker := Def.taskDyn {
  startDirectoryServer.value
  val containerId = buildOrStartTestDatabase.value
  Def.task {
    (test in Test).value
    sLog.value.info("running inside dynamic task")
    containerId
  }
}.value
As you can see from the output below, the Docker containers are started, and the log message is written from the dynamic task. However, there's no test output (and the build executes far too quickly for the tests to have run).
> runTestsWithDocker
[info] logging into ECR registry 123456789012.dkr.ecr.us-east-1.amazonaws.com
[info] checking repository for image container1:1.2.3-1200
[info] successfully logged-in to ECR registry 123456789012.dkr.ecr.us-east-1.amazonaws.com
[info] DockerSupport: pulling 123456789012.dkr.ecr.us-east-1.amazonaws.com/container2:latest
[info] DockerSupport: docker run -d -p 389:389 123456789012.dkr.ecr.us-east-1.amazonaws.com/container2:latest
[info] container ID: 80d16a268c6e13dd810f8c271ca8778fc8eaa6835f2d0640fa62d032ff052345
[info] image already exists; no need to build
[info] DockerSupport: pulling 123456789012.dkr.ecr.us-east-1.amazonaws.com/container1:1.2.3-1200
[info] DockerSupport: docker run -d -p 5432:5432 123456789012.dkr.ecr.us-east-1.amazonaws.com/container1:1.2.3-1200
[info] container ID: 2de559b0737e69d61b1234567890123bd123456789012d382ba8ffa40e0480cf
[info] Updating {file:/home/ubuntu/Workspace/mybuild/}mybuild...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] running inside dynamic task
[success] Total time: 2 s, completed Jun 5, 2019 9:05:20 PM
I'm assuming that my scope is incorrect, and that I need to refer to test in some other scope, but I have no idea what that might be (I've tried Compile and ThisBuild as random stabs in the dark).
I've also seen (test in Test).result.value in other questions on SO. Thinking that maybe the test task was doing something non-standard, I tried that too, but with the same (non-)result.
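(My understanding is that .result just wraps the task's outcome in a Result, so a failing test run no longer aborts the surrounding task; a hypothetical lenient wrapper, purely as a sketch:)
// Hypothetical sketch: Result is either Inc (the failure) or Value (the task's result)
val testsLenient = taskKey[Unit]("Run tests but only log failures")
testsLenient := {
  val log = streams.value.log
  (test in Test).result.value match {
    case Inc(cause) => log.warn(s"tests failed: $cause")
    case Value(_)   => log.info("tests passed")
  }
}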
Lastly, I'm running SBT 0.13.16, so any convincing argument (as in a bug report) that it's a problem with that version would make me upgrade sooner than planned (my current goal is to refactor the build then upgrade).
Update: here's the output from inspect. It doesn't show the dependency on test, but I'm assuming that's because it's invoked from a dynamic task.
> inspect runTestsWithDocker
[info] Task: java.lang.String
[info] Description:
[info] Runs the test suite, after starting the LDAP server and running/initializing the test database
[info] Provided by:
[info] {file:/home/ubuntu/Workspace/mybuild/}mybuild/*:runTestsWithDocker
[info] Defined at:
[info] /home/ubuntu/Workspace/mybuild/build.sbt:597
[info] Dependencies:
[info] mybuild/*:buildOrStartTestDatabase
[info] mybuild/*:startDirectoryServer
[info] mybuild/*:settingsData
[info] Reverse dependencies:
[info] mybuild/*:publishTestDatabase
[info] Delegates:
[info] mybuild/*:runTestsWithDocker
[info] {.}/*:runTestsWithDocker
[info] */*:runTestsWithDocker
Update: if I specify a single sub-project, it correctly runs the tests in that sub-project.
runTestsWithDocker := Def.taskDyn {
  startDirectoryServer.value
  val containerId = buildOrStartTestDatabase.value
  Def.task {
    (test in (subproject, Test)).result.value
    containerId
  }
}.value
So it looks like maybe the root project isn't aggregating? We're relying on the "default root" project, so I think my next change will be to create an explicit root project.
It turned out that the default root project was not in fact "aggregat[ing] all other projects in the build." Once I created an explicit root project and aggregated the other sub-projects under it, I was able to specify my task like so:
runTestsWithDocker := Def.taskDyn {
  startDirectoryServer.value
  val containerId = buildOrStartTestDatabase.value
  Def.task {
    (test in (root, Test)).result.value
    containerId
  }
}.value
:shrug:
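For reference, a minimal sketch of that explicit root project (sub-project names are hypothetical):
// build.sbt: an explicit root that aggregates the sub-projects, so that
// test in (root, Test) fans out to every aggregated project
lazy val subA = project in file("subA")
lazy val subB = project in file("subB")

lazy val root = (project in file("."))
  .aggregate(subA, subB)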

Assign name to root project while keeping the default aggregation

I have a fairly large sbt project (about 30 sub-projects). From what I've figured out, sbt will use the name of the root directory as the name of the root project, if none is declared explicitly in build.sbt. Depending on where the project is checked out, e.g. in a CI environment, that name may change. I'm currently using sbt 1.2.8.
My issue is that I would like to assign a stable name to the root project so that I can e.g. run all tests using sbt root/test [0], leveraging the default aggregation of the root project over all sub-projects. The only way I have found so far to assign a name to the root project is by explicitly declaring the project. But this will disable the default aggregation.
Is there a way to assign a name to the root project that will keep the default aggregation over all sub-projects? Or is there another way to access the root project on the command line without relying on its name?
[0]: The default project is changed by the build.sbt using onLoad in Global := (Command.process("project ...", _)) compose (onLoad in Global).value. So just running sbt test won't work.
Here is a potential solution without having to explicitly refer to the root project.
Given the following project structure, consisting of the root project and the sub-projects core and util:
build.sbt
core
project
src
target
util
and the following build definition in build.sbt
lazy val util = (project in file("util"))
lazy val core = (project in file("core"))
onLoad in Global := { Command.process("project util", _: State) } compose (onLoad in Global).value
ThisBuild / libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test
we can run tests from all projects, whilst being in any particular sub-project, by defining a custom task testAll that evaluates test using the inAnyProject scope filter:
val testAll = taskKey[Unit]("Run tests in all projects whilst being in any particular sub-project")

ThisBuild / testAll := Def.taskDyn {
  (Test / test).all(ScopeFilter(inAnyProject))
}.value
Now executing sbt will load util sub-project by default, nevertheless testAll should run all tests from all projects:
sbt:util> testAll
[info] RootSpec:
[info] The Root object
[info] - should say root hello
[info] UtilSpec:
[info] The Util object
[info] - should say util hello
[info] CoreSpec:
[info] The Core object
[info] - should say core hello
[info] Run completed in 349 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Run completed in 309 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Run completed in 403 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 1 s, completed 11-Apr-2019 16:29:26
sbt:util>
where RootSpec, CoreSpec, and UtilSpec are at:
src/test/scala/example/RootSpec.scala
core/src/test/scala/example/CoreSpec.scala
util/src/test/scala/example/UtilSpec.scala
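Each spec is just a minimal "hello" assertion. RootSpec, for instance, might look roughly like this (the Root object and its greeting member are assumptions, not part of the original build):
package example

import org.scalatest.{FlatSpec, Matchers}

// Hypothetical minimal spec matching the output above; Root.greeting is assumed to exist.
class RootSpec extends FlatSpec with Matchers {
  "The Root object" should "say root hello" in {
    Root.greeting shouldEqual "root hello"
  }
}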

SBT Autoplugin and task modification with "dependsOn"

I've created an Autoplugin for an SBT project to launch middleware inside Docker containers for integration tests (Zookeeper and Kafka).
My first version, without an AutoPlugin, was to add this manually to the project's settings:
(test in Test) <<= (test in Test) dependsOn zkStart
That was working very well.
Now, with an AutoPlugin, I have the following code:
override def projectSettings: Seq[Def.Setting[_]] = Seq(
  (test in Test) <<= (test in Test) dependsOn ZookeeperPlugin.zkStart
)
but Zookeeper is no longer started before the tests.
When I run:
[core_akka_cluster] $ inspect test
[info] Task: Unit
[info] Description:
[info] Executes all tests.
[info] Provided by:
[info] {file:/Users/xx/Projects/../../}core_akka_cluster/test:test
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:394
We can see that test:test is defined only by the default SBT settings (Defaults.scala).
When I manually add the previous setting directly to my project's build definition, it works once more, and inspect shows the following:
[core_akka_cluster] $ inspect test
[info] Task: Unit
[info] Description:
[info] Executes all tests.
[info] Provided by:
[info] {file:/Users/xx/Projects/../../}core_akka_cluster/test:test
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:394
[info] (com.ingenico.msh.sbt.KafkaPluginSettings) KafkaPlugin.scala:36
Any idea about precedence in this case?
Thanks
Are you making the auto plugin a triggered plugin? Since test is also added by an sbt auto plugin (JvmPlugin), you should require JvmPlugin so that your settings are applied after its defaults rather than being overwritten by them.
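A minimal sketch of what that looks like (the plugin object name is made up; ZookeeperPlugin.zkStart is the task from the question):
import sbt._
import sbt.Keys._
import sbt.plugins.JvmPlugin

object ZookeeperTestPlugin extends AutoPlugin {
  // Requiring JvmPlugin makes sbt apply these settings after the defaults,
  // so the dependsOn wiring is not overwritten by the stock test:test definition.
  override def requires = JvmPlugin
  override def trigger  = allRequirements

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    (test in Test) <<= (test in Test) dependsOn ZookeeperPlugin.zkStart
  )
}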

Why does "sbt stage" fail with Not a valid command?

I am getting errors when I try to stage my application using sbt clean compile stage:
[error] Not a valid command: stage
[error] Not a valid project ID: stage
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: stage
[error] stage
[error] ^
I have done this hundreds of times on other machines without a problem. I have SBT 0.13.5 -- has anyone seen this before? I have read this other post, but I'm not on Heroku. Thanks.
After the comments above, I realized that you just want the stage command without bringing the entire Play framework in.
The stage command is part of sbt-native-packager, whose stated goal is:
The goal [of the plugin] is to be able to bundle up Scala software built with SBT for native packaging systems, like deb, rpm, homebrew, msi.
One of the features of the plugin is the stage command, described by sbt's help:
> help stage
Create a local directory with all the files laid out as they would be in the final distribution.
Just add the following to project/plugins.sbt to make the plugin available in the project (following Muki's comment, the example uses the latest version, 1.0.0-M1, which has the autoplugin feature):
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0-M1")
You will also have to add the following to build.sbt:
enablePlugins(JavaAppPackaging)
And that's it! You're all set now.
Execute stage.
> stage
[info] Packaging /Users/jacek/dev/sandbox/command-build-scala/target/scala-2.10/command-build-scala_2.10-0.1-SNAPSHOT-sources.jar ...
[info] Done packaging.
[info] Updating {file:/Users/jacek/dev/sandbox/command-build-scala/}command-build-scala...
[info] Wrote /Users/jacek/dev/sandbox/command-build-scala/target/scala-2.10/command-build-scala_2.10-0.1-SNAPSHOT.pom
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Packaging /Users/jacek/dev/sandbox/command-build-scala/target/scala-2.10/command-build-scala_2.10-0.1-SNAPSHOT-javadoc.jar ...
[info] Done packaging.
[info] Packaging /Users/jacek/dev/sandbox/command-build-scala/target/scala-2.10/command-build-scala_2.10-0.1-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 0 s, completed Nov 5, 2014 2:55:55 PM
After lots of digging, I found out that 'stage' is implemented by a plugin from the Play framework, which I do use in my other projects; that explains why sbt was accepting the stage command there.

In SBT, how does one `addCompilerPlugin` from Git?

Normally, when not using Git, you can just write:
addCompilerPlugin("something" % "blah" ...)
scalacOptions += "-P:blah:..."
This addCompilerPlugin takes a ModuleID. But here... I've tried adding:
lazy val root = project in file(".") dependsOn
  uri("git://github.com/puffnfresh/wartremover.git#master")
to the project/plugins.sbt as well as:
lazy val wartRemover = RootProject(
  uri("git://github.com/puffnfresh/wartremover.git#master"))

lazy val root = Project(...).settings(
  ...
  scalacOptions += "-P:wartremover:...",
  ...
) dependsOn wartRemover
Both result in:
[error] (root/*:update) sbt.ResolveException: unresolved dependency:
org.brianmckenna#wartremover_2.10.3;0.5-SNAPSHOT: not found
tl;dr The project wartremover has not been published for Scala version 2.10.3. Downgrade yours with the following in build.sbt among the other necessary settings:
scalaVersion := "2.10.2"
Detailed procedure focusing on Scala 2.10.3
The following in build.sbt
addCompilerPlugin("org.brianmckenna" % "wartremover" % "0.5" cross CrossVersion.full)
adds org.brianmckenna:wartremover:0.5:plugin->default(compile) to libraryDependencies.
[sbt-0-13-2]> show libraryDependencies
[info] List(org.scala-lang:scala-library:2.10.3, org.brianmckenna:wartremover:0.5:plugin->default(compile))
So to use a RootProject that points at the project wartremover at GitHub I had to use the following in build.sbt (this is the complete file):
scalacOptions in root += "-P:wartremover:traverser:org.brianmckenna.wartremover.warts.Unsafe"

lazy val root = project in file(".") dependsOn wartRemover % "plugin->default(compile)"

lazy val wartRemover = RootProject(
  uri("git://github.com/puffnfresh/wartremover.git#master"))
Since the project wartremover is not published for 2.10.3 I followed the steps below:
Show available projects
[root]> projects
[info] In file:/Users/jacek/sandbox/so/sbt-0.13.2/
[info] * root
[info] In git://github.com/puffnfresh/wartremover.git#master
[info] wartremover
Switch to wartremover and publishLocal it for scalaVersion set to 2.10.3.
[wartremover]> set scalaVersion := "2.10.3"
[info] Defining wartremover/*:scalaVersion
[info] The new value will be used by wartremover/*:allDependencies, wartremover/*:assemblyPackageScala::assemblyDefaultJarName and 12 others.
[info] Run `last` for details.
[info] Reapplying settings...
[info] Set current project to wartremover (in build git://github.com/puffnfresh/wartremover.git#master)
[wartremover]> publishLocal
[info] Packaging /Users/jacek/.sbt/0.13/staging/d6dd3d2e3d818e69943a/wartremover/target/scala-2.10/wartremover_2.10.3-0.6-SNAPSHOT-sources.jar ...
[info] Updating {git://github.com/puffnfresh/wartremover.git#master}wartremover...
...
[info] published ivy to /Users/jacek/.ivy2/local/org.brianmckenna/wartremover_2.10.3/0.6-SNAPSHOT/ivys/ivy.xml
[success] Total time: 7 s, completed Jan 18, 2014 11:34:07 PM
Switch to the project root and do update. It should now work fine.
[wartremover]> project {file:/Users/jacek/sandbox/so/sbt-0.13.2/}
[info] Set current project to root (in build file:/Users/jacek/sandbox/so/sbt-0.13.2/)
[root]> update
[info] Updating {file:/Users/jacek/sandbox/so/sbt-0.13.2/}root...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[success] Total time: 0 s, completed Jan 18, 2014 11:36:24 PM
console should work fine now, too.
[root]> console
[info] Starting scala interpreter...
[info]
<console>:5: error: var is disabled
var value: scala.tools.nsc.interpreter.IMain = _
^
Welcome to Scala version 2.10.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45).
Type in expressions to have them evaluated.
Type :help for more information.
scala>
The error message worries me, though. I don't know the plugin, nor do I know how to get rid of the error. It also happens when I follow the steps described in Compiler plugin when scalaVersion := "2.10.2" is set in build.sbt (so the compiler plugin is available in the repo).
