sbt: Invoking tests after other tasks

I have a multi-project build, and want to run tests after starting two Docker containers. This is my custom task:
runTestsWithDocker := Def.taskDyn {
  startDirectoryServer.value
  val containerId = buildOrStartTestDatabase.value
  Def.task {
    (test in Test).value
    sLog.value.info("running inside dynamic task")
    containerId
  }
}.value
As you can see from the output below, the Docker containers are started, and the log message is written from the dynamic task. However, there's no test output (and the build executes far too quickly for the tests to have run).
> runTestsWithDocker
[info] logging into ECR registry 123456789012.dkr.ecr.us-east-1.amazonaws.com
[info] checking repository for image container1:1.2.3-1200
[info] successfully logged-in to ECR registry 123456789012.dkr.ecr.us-east-1.amazonaws.com
[info] DockerSupport: pulling 123456789012.dkr.ecr.us-east-1.amazonaws.com/container2:latest
[info] DockerSupport: docker run -d -p 389:389 123456789012.dkr.ecr.us-east-1.amazonaws.com/container2:latest
[info] container ID: 80d16a268c6e13dd810f8c271ca8778fc8eaa6835f2d0640fa62d032ff052345
[info] image already exists; no need to build
[info] DockerSupport: pulling 123456789012.dkr.ecr.us-east-1.amazonaws.com/container1:1.2.3-1200
[info] DockerSupport: docker run -d -p 5432:5432 123456789012.dkr.ecr.us-east-1.amazonaws.com/container1:1.2.3-1200
[info] container ID: 2de559b0737e69d61b1234567890123bd123456789012d382ba8ffa40e0480cf
[info] Updating {file:/home/ubuntu/Workspace/mybuild/}mybuild...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] running inside dynamic task
[success] Total time: 2 s, completed Jun 5, 2019 9:05:20 PM
I'm assuming that my scope is incorrect, and that I need to refer to test in some other scope, but I have no idea what that might be (I've tried Compile and ThisBuild as random stabs in the dark).
I've also seen (test in Test).result.value in other questions on SO. Thinking that maybe the test task was doing something non-standard, I tried it, but with the same (non-)result.
Lastly, I'm running SBT 0.13.16, so any convincing argument (as in a bug report) that it's a problem with that version would make me upgrade sooner than planned (my current goal is to refactor the build then upgrade).
Update: here's the output from inspect. It doesn't show the dependency on test, but I'm assuming that's because it's invoked from a dynamic task.
> inspect runTestsWithDocker
[info] Task: java.lang.String
[info] Description:
[info] Runs the test suite, after starting the LDAP server and running/initializing the test database
[info] Provided by:
[info] {file:/home/ubuntu/Workspace/mybuild/}mybuild/*:runTestsWithDocker
[info] Defined at:
[info] /home/ubuntu/Workspace/mybuild/build.sbt:597
[info] Dependencies:
[info] mybuild/*:buildOrStartTestDatabase
[info] mybuild/*:startDirectoryServer
[info] mybuild/*:settingsData
[info] Reverse dependencies:
[info] mybuild/*:publishTestDatabase
[info] Delegates:
[info] mybuild/*:runTestsWithDocker
[info] {.}/*:runTestsWithDocker
[info] */*:runTestsWithDocker
Update: if I specify a single sub-project, it correctly runs the tasks in that sub-project.
runTestsWithDocker := Def.taskDyn {
  startDirectoryServer.value
  val containerId = buildOrStartTestDatabase.value
  Def.task {
    (test in (subproject,Test)).result.value
    containerId
  }
}.value
So it looks like maybe the root project isn't aggregating? We're relying on the "default root" project, so I think my next change will be to create an explicit root project.

It turned out that the default root project was not in fact "aggregat[ing] all other projects in the build." Once I created this project and explicitly aggregated the other sub-projects under it, I was able to specify my task like so:
runTestsWithDocker := Def.taskDyn {
  startDirectoryServer.value
  val containerId = buildOrStartTestDatabase.value
  Def.task {
    (test in (root,Test)).result.value
    containerId
  }
}.value
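For reference, the explicit root project looked roughly like this (a sketch; the subproject names here are placeholders, not the real ones from the build):

```scala
// build.sbt sketch (hypothetical subproject names): an explicit root that
// aggregates the subprojects, so that test in (root, Test) reaches them all
lazy val subprojectA = project in file("subprojectA")
lazy val subprojectB = project in file("subprojectB")

lazy val root = (project in file(".")).
  aggregate(subprojectA, subprojectB)
```

Because running a task on an aggregating project also runs it in every aggregated project, scoping test to the explicit root fans the test run out across the build.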
:shrug:

Related

SBT Autoplugin and task modification with "dependsOn"

I've created an Autoplugin for an SBT project to launch middleware inside Docker containers for integration tests (Zookeeper and Kafka).
My first version, without an AutoPlugin, was to add the setting manually to each project's settings, such as:
(test in Test) <<= (test in Test) dependsOn zkStart
That was working very well.
Now, with an AutoPlugin, I have the following code:
override def projectSettings: Seq[Def.Setting[_]] = Seq(
  (test in Test) <<= (test in Test) dependsOn ZookeeperPlugin.zkStart
)
but Zookeeper is no longer started before the tests.
When I run:
[core_akka_cluster] $ inspect test
[info] Task: Unit
[info] Description:
[info] Executes all tests.
[info] Provided by:
[info] {file:/Users/xx/Projects/../../}core_akka_cluster/test:test
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:394
We can see that the test:test task is provided by sbt's defaults.
When I manually add the previous setting to my project's build definition, it works once more, and we have the following analysis:
[core_akka_cluster] $ inspect test
[info] Task: Unit
[info] Description:
[info] Executes all tests.
[info] Provided by:
[info] {file:/Users/xx/Projects/../../}core_akka_cluster/test:test
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:394
[info] (com.ingenico.msh.sbt.KafkaPluginSettings) KafkaPlugin.scala:36
Any idea about precedence in this case?
Thanks
Are you making the auto plugin a triggered plugin?
Since test is also added by one of sbt's own auto plugins (JvmPlugin), you should require JvmPlugin.
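A sketch of how that requirement might look in the plugin (the object name is an assumption; ZookeeperPlugin.zkStart is taken from the question):

```scala
import sbt._
import sbt.Keys._
import sbt.plugins.JvmPlugin

// plugin sketch: requiring JvmPlugin makes sbt apply this plugin's settings
// after the defaults, so the dependsOn re-wiring of test:test is not overwritten
object ZookeeperTestPlugin extends AutoPlugin {
  override def requires = JvmPlugin

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    (test in Test) <<= (test in Test) dependsOn ZookeeperPlugin.zkStart
  )
}
```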

How to set system property for xsbt-web-plugin's jetty()?

I've migrated my project to 0.13.5 and started using the xsbt-web-plugin.
I'd like to configure logback to use a configuration file outside the classpath, pointed to by the system property logback.configurationFile (so I can keep the log config outside the war file).
Previously I would simply set:
System.setProperty("logback.configurationFile", "/some/path/logback.xml")
inside the project/build.scala and logback would pick it up.
However, after upgrading sbt to 0.13.5 and migrating to the xsbt-web-plugin, system properties set in sbt don't seem to be available at runtime (Jetty).
I've tried setting system properties in different ways, also by passing it along using the -D flag when starting sbt.
On the sbt console I can see the property:
eval sys.props("logback.configurationFile")
[info] ans: String = /some/path/logback.xml
But it's not available inside the webapp.
Any ideas on how to set system properties to be available inside the webapp?
I've tried both jetty() and tomcat(). Same behaviour.
Update:
I ended up with:
jetty(options = new ForkOptions(runJVMOptions = Seq("-Dlogback.configurationFile=/some/path/logback.xml")))
that works.
Use javaOptions in container += "-Dlogback.configurationFile=/some/path/logback.xml" as described in Set forked JVM options.
javaOptions alone (without in container) should work too, as can be seen in inspect actual under Dependencies and Delegates:
[play-new-app] $ inspect actual container:javaOptions
[info] Task: scala.collection.Seq[java.lang.String]
[info] Description:
[info] Options passed to a new JVM when forking.
[info] Provided by:
[info] {file:/Users/jacek/sandbox/play-new-app/}root/container:javaOptions
[info] Defined at:
[info] /Users/jacek/sandbox/play-new-app/build.sbt:26
[info] Dependencies:
[info] */*:javaOptions
[info] Delegates:
[info] container:javaOptions
[info] *:javaOptions
[info] {.}/container:javaOptions
[info] {.}/*:javaOptions
[info] */container:javaOptions
[info] */*:javaOptions
[info] Related:
[info] */*:javaOptions

How to publish webjar assets with publish/publishLocal in Play 2.3?

Since Play Framework 2.3, assets are packaged into a single jar archive file. I would like to publish this jar automatically with the project, i.e. upon publish or publishLocal I want the assets jar to be published as well.
How can I achieve that?
Using inspect tree dist, I managed to find the playPackageAssets task that generates the assets file:
[play-publish-webjar] $ inspect playPackageAssets
[info] Task: java.io.File
[info] Description:
[info]
[info] Provided by:
[info] {file:/Users/jacek/sandbox/play-publish-webjar/}root/*:playPackageAssets
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:641
[info] Dependencies:
[info] *:playPackageAssets::packageConfiguration
[info] *:playPackageAssets::streams
[info] Reverse dependencies:
[info] *:scriptClasspath
[info] universal:mappings
[info] Delegates:
[info] *:playPackageAssets
[info] {.}/*:playPackageAssets
[info] */*:playPackageAssets
A naive solution might be to attach the assets webjar, as generated by playPackageAssets, to the publishLocal task's artifacts. Add the following to build.sbt (the type annotations are there to show what you work with):
import play.PlayImport.PlayKeys._

packagedArtifacts in publishLocal := {
  val artifacts: Map[sbt.Artifact, java.io.File] = (packagedArtifacts in publishLocal).value
  val assets: java.io.File = (playPackageAssets in Compile).value
  artifacts + (Artifact(moduleName.value, "asset", "jar", "assets") -> assets)
}
Repeat this for the other tasks that should exhibit similar behaviour.
However, I'm quite doubtful it's the best solution.

SBT Config extend vs DefaultSettings

If I define an SBT config with
val MyConfig = config("my") extend Test
is that basically the same as doing
val MyConfig = config("my")
val mySettings = inConfig(MyConfig)(Defaults.testSettings)
and then adding mySettings to a build definition?
No, calling the extend method is not the same thing as calling inConfig. extend just returns a new configuration with the passed-in configurations prepended to extendsConfigs; it will not introduce any new settings.
When you add MyConfig into the project, it becomes part of the scoped key resolution path:
val MyConfig = config("my") extend Test

val root = (project in file(".")).
  configs(MyConfig)
Suppose you type my:test in the sbt shell. Since the test task is not found under the my configuration, sbt will traverse extendsConfigs and check whether the task is available under any of them. The first one it hits is Test, since we prepended it. You can check this by running inspect my:test:
root> inspect my:test
[info] Task: Unit
[info] Description:
[info] Executes all tests.
[info] Provided by:
[info] {file:/Users/eugene/work/quick-test/sbt-so/}root/test:test
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:365
[info] Delegates:
[info] my:test
[info] test:test
[info] runtime:test
[info] compile:test
[info] *:test
[info] {.}/my:test
[info] {.}/test:test
[info] {.}/runtime:test
[info] {.}/compile:test
[info] {.}/*:test
[info] */my:test
[info] */test:test
[info] */runtime:test
[info] */compile:test
[info] */*:test
[info] Related:
[info] test:test
"Provided by" says it delegated to root/test:test. This mechanism allows you to share some of the settings but override others, but you still have to know the inner wiring of the settings scoped to tasks etc, so it's tricky business. You probably already know, but I'll link to Additional test configurations, which specifically discusses configurations for testing.

Developing task to synchronize managed dependencies with another directory in SBT?

I'm trying to write a task which retrieves the dependency jars and puts them in a single directory at the project root. Here's what I have so far:
retrieveManaged := true

val libDir = settingKey[File]("the directory to retrieve dependency libraries to")
lazy val getLibs = taskKey[Unit]("retrieves all dependency libraries to the libDir")

libDir := baseDirectory.value / "libs"

getLibs := {
  val a = update.value
  sbt.IO.delete(libDir.value)
  for (src <- (managedDirectory.value ** "*.jar").get) {
    sbt.IO.copyFile(src, libDir.value / src.getName, true)
  }
}
I use retrieveManaged := true so that sbt will copy the dependencies to managedDirectory. I then copy those libraries to a directory which I have defined. My questions are:
1. I want my task to depend on the update task, to ensure that the dependencies have been copied to managedDirectory first. If I understand How can I call another task from my SBT task?, I should be able to do this by calling update.value. But this doesn't seem to work.
2. Instead of copying the files, I'd really like to "synchronize". This means that only newly added files should be copied, and any files which no longer exist should be removed. How can I do this?
Update
Thanks to Jacek's suggestion, I was able to come up with the following solution, which addresses #1. I still need to figure out how to do #2 (synchronize instead of copy).
getLibs := {
  sbt.IO.delete(libDir.value)
  val depFiles = update.value.matching((s: String) => Set("compile", "runtime") contains s)
  for (depFile <- depFiles) {
    sbt.IO.copyFile(depFile, libDir.value / depFile.getName, true)
  }
}
I think the answer is to really use the value of the update task and parse its result. That would sort out issues 1 and 2. If it does not, attach the symptoms to your question.
All val a = update.value does is execute the update task and assign its value to a. Since you don't use a at all, you lose all the information update gives you. To see what it is, execute show update in the shell.
[triggeredby]> help update
Resolves and optionally retrieves dependencies, producing a report.
[triggeredby]> show update
[info] Updating {file:/Users/jacek/sandbox/so/triggeredby/}triggeredby...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Update report:
[info] Resolve time: 2152 ms, Download time: 114 ms, Download size: 0 bytes
[info] compile:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] runtime:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] test:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] provided:
[info] optional:
[info] compile-internal:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] runtime-internal:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] test-internal:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] plugin:
[info] sources:
[info] docs:
[info] pom:
[info] scala-tool:
[info] org.scala-lang:scala-compiler:2.10.3: (Artifact(scala-compiler,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-compiler.jar)
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] org.scala-lang:scala-reflect:2.10.3 (): (Artifact(scala-reflect,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-reflect.jar)
[info] org.scala-lang:jline:2.10.3: (Artifact(jline,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/jline.jar)
[info] org.fusesource.jansi:jansi:1.4: (Artifact(jansi,jar,jar,None,ArraySeq(master),None,Map()),/Users/jacek/.ivy2/cache/org.fusesource.jansi/jansi/jars/jansi-1.4.jar)
[success] Total time: 3 s, completed Mar 14, 2014 12:16:26 AM
I think it's pretty much what you need - use the return value to fit your needs.
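For question 2 (synchronizing instead of wiping and recopying), one approach is to diff the jar names in libDir against the current dependency report; a sketch, assuming sbt 0.13's sbt.IO helpers and the libDir/getLibs keys defined in the question:

```scala
// build.sbt sketch: copy only missing or stale jars, delete leftovers
getLibs := {
  val deps = update.value.matching((s: String) => Set("compile", "runtime") contains s)
  val target = libDir.value
  sbt.IO.createDirectory(target)
  val wanted = deps.map(f => f.getName -> f).toMap
  val existing = Option(target.listFiles).map(_.toSeq).getOrElse(Seq.empty)
  // remove files that no longer correspond to a dependency
  existing.filterNot(f => wanted.contains(f.getName)).foreach(sbt.IO.delete)
  // copy dependencies that are missing or older than the source jar
  for ((name, src) <- wanted) {
    val dest = target / name
    if (!dest.exists || dest.lastModified < src.lastModified)
      sbt.IO.copyFile(src, dest, true)
  }
}
```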