Passing JVM options to a forked test in SBT

I am trying to use, in a forked test, a JVM option that was set externally, before SBT was launched. I'm also setting additional JVM options like so:
javaOptions in ThisBuild ++= List("-Xmx3072m")
To my understanding, based on the SBT documentation, the JVM options provided to the SBT process should be available to the forked process:
By default, a forked process uses the same Java and Scala versions being used for the build and the working directory and JVM options of the current process.
However, I can't seem to retrieve those "external" JVM options in the forked tests; i.e., System.getProperty("foo") always returns null. Given that I am trying to pass along a password, I can't set it directly in the build file. My questions therefore are:
Is there an SBT task / key to access the JVM options passed to the JVM in which SBT is running? That way I could attempt to add them to javaOptions.
Is there any other means by which to pass external Java options to a forked test?

You can control your options with testGrouping. Below is a copy-and-paste from one of my projects. It properly handles hierarchical projects, as well as a root project without tests. Options are merged from javaOptions in run and a test.options file. This allows modifying the arguments without reloading the project. That project takes more than a minute to load, so I use test.options to quickly switch between production and debug mode with -Xrunjdwp:transport=dt_...
testGrouping in Test <<= (definedTests in Test, javaOptions in run, baseDirectory in LocalRootProject) map { (tests, javaOptions, baseDirectory) ⇒
  if (tests.nonEmpty) {
    // Read extra JVM options from an optional test.options file in the root project
    val testOptionsFile = baseDirectory / "test.options"
    val externalOptions = if (testOptionsFile.exists()) {
      val source = scala.io.Source.fromFile(testOptionsFile)
      val options = source.getLines().toIndexedSeq
      source.close()
      options
    } else Nil
    // Fork each test in its own JVM with the merged options
    tests map { test ⇒
      new Tests.Group(
        name = test.name,
        tests = Seq(test),
        // runPolicy = Tests.InProcess)
        runPolicy = Tests.SubProcess(javaOptions = javaOptions ++ externalOptions))
    }
  } else {
    Seq(new Tests.Group(
      name = "Empty",
      tests = Seq(),
      runPolicy = Tests.InProcess))
  }
},
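If the external option is a system property, one direct approach is to read it in the build definition itself (which runs inside the sbt JVM) and forward it explicitly to the forked tests. A minimal sketch, assuming the property is named foo (a placeholder):
fork in Test := true
// sys.props reads the sbt JVM's system properties when the build loads;
// "foo" is a placeholder for your externally supplied property name
javaOptions in Test ++= sys.props.get("foo").map(v => s"-Dfoo=$v").toSeq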

Related

Is it possible to disable publish without disabling publishLocal in sbt?

I have an sbt project where docker:publishLocal will create a docker image on my machine for testing, and docker:publish will publish the image to a repository and also publish jar files from the build to a repository.
If my project is a snapshot, I would like to disable publishing to the repositories, while still being able to build the local image.
ThisBuild / publishArtifact := ! isSnapshot.value
does the right thing for the publish command, but it also disables publishLocal.
I want to write something like
if (isSnapshot.value) {
  publish := { }
}
but that gives me an error that I do not understand at all:
[info] Loading project definition from /Users/dev/project
/Users/dev/build.sbt:1: error: type mismatch;
 found   : Unit
 required: sbt.internal.DslEntry
if (isSnapshot.value) {
^
Past experience dictates that redefining publish to conditionally call the original version won't work, as
publish := {
  if (!isSnapshot.value) publish.value
}
gives warnings that the task is always evaluated.
Is there a way to do this?
The problem with this code is that it evaluates publish.value regardless of the if structure. I recommend reading the documentation on task dependencies. If you want to "delay" the evaluation of a task in one of the if branches, you need to use a dynamic task definition:
publish := Def.taskDyn {
  if (isSnapshot.value)
    Def.task {} // doing nothing
  else
    Def.task { publish.value } // could be written as just publish
}.value
But apart from fixing your code, you should be aware that there is a special setting for the functionality you want, it's called skip:
publish/skip := isSnapshot.value
Another thing to notice is the scoping. If you want to override docker:publish, which is the same as Docker/publish in the new slash syntax, you should add the Docker/ scope prefix to every mention of publish in the code above.
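For instance, a sketch of the skip approach scoped to the Docker configuration (assuming Docker/publish honors the skip setting the way the standard publish task does):
Docker / publish / skip := isSnapshot.value // assumption: Docker/publish respects skip
This would leave Docker/publishLocal (docker:publishLocal) untouched.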

Override project's setting inside of SBT task

In my build.sbt a compilation phase depends on running scapegoat inspection
(compile in Compile) := (compile in Compile).dependsOn(scapegoat).value
I'm trying to introduce a new task for running tests (for development purposes, to speed things up) that does not depend on scapegoat, like this:
lazy val fastTests = taskKey[Unit]("")
fastTests := {
  scapegoat in Compile := {}
  (test in Test).value
}
but the override gets ignored
You cannot do it with a task because tasks cannot change settings. You can solve it either with different configurations or with a command (which can change settings). See for example:
How to disable “Slow” tagged Scalatests by default, allow execution with option?
(for the configurations approach)
How to change setting inside SBT command?
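As a sketch of the command approach, assuming sbt 1.x (for appendWithSession) and that scapegoat is a Unit-returning task key, as in your snippet:
// Hypothetical command: commands can rewrite settings before running a task
lazy val fastTestsCmd = Command.command("fastTests") { state =>
  val extracted = Project.extract(state)
  // No-op the scapegoat task for this invocation only
  val tempState = extracted.appendWithSession(Seq(scapegoat in Compile := {}), state)
  Project.extract(tempState).runTask(test in Test, tempState)
  state // return the original state so the override does not persist
}
commands += fastTestsCmd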

How to publish an artifact with pom-packaging in SBT?

I have a multi-project build in SBT where some projects should aggregate dependencies and contain no code. So then clients could depend on these projects as a single dependency instead of directly depending on all of their aggregated dependencies. With Maven, this is a common pattern, e.g. when using Spring Boot.
In SBT, I figured I can suppress the generation of the empty artifacts by adding this setting to these projects:
packagedArtifacts := Classpaths.packaged(Seq(makePom)).value
However, the makePom task writes <packaging>jar</packaging> in the generated POM. But now that there is no JAR anymore, this should read <packaging>pom</packaging> instead.
How can I do this?
This question is a bit old, but I just came across the same issue and found a solution. The original answer does point to the right page where this info can be found, but here is an example. It uses the pomPostProcess setting to transform the generated POM right before it is written to disk. Essentially, we loop over all the XML nodes, looking for the element we care about and then rewrite it.
import scala.xml.{Node => XmlNode, NodeSeq => XmlNodeSeq, _}
import scala.xml.transform._

pomPostProcess := { node: XmlNode =>
  val rule = new RewriteRule {
    override def transform(n: XmlNode): XmlNodeSeq = n match {
      // Replace the <packaging> element that makePom wrote
      case e: Elem if e.label == "packaging" =>
        <packaging>pom</packaging>
      case _ => n
    }
  }
  new RuleTransformer(rule).transform(node).head
},
Maybe you could modify the resulting POM as described here: Modifying the generated POM
You can disable publishing the default artifacts (JAR, sources, and docs), then explicitly opt in to publishing the POM. sbt then produces and publishes only the POM, with <packaging>pom</packaging>.
// This project has no sources; I want <packaging>pom</packaging> with dependencies
lazy val bundle = project
  .dependsOn(moduleA, moduleB)
  .settings(
    publishArtifact := false,           // disable jar, sources, and docs artifacts
    publishArtifact in makePom := true  // but still publish the POM
  )
lazy val moduleA = project
lazy val moduleB = project
lazy val moduleC = project
Run sbt bundle/publishM2 to verify the POM in ~/.m2/repository.
I dare say this is almost intuitive, a rare moment of pleasant surprise with sbt 😅
I confirmed this with current sbt 1.3.9 and with 1.0.1, the oldest launcher I happen to have installed on my machine.
The Artifacts page in the reference docs may be helpful; perhaps this trick should be added there.

Different mappings for stage and tarball

I want to produce a different log configuration for my staged app and for the tarball that I use to distribute my application.
I have a task that generates the configuration, and I want it to be called with different parameters for stage and for packageZipTarball. This is my config:
mappings in Universal in stage += {
  val f = generateLoggingConfigTask(LogType.ConsoleAndFiles).value
  f -> ("conf/" + f.getName)
},
mappings in Universal in packageZipTarball += {
  val f = generateLoggingConfigTask(LogType.Files).value
  f -> ("conf/" + f.getName)
},
The first task fires only when running stage, but on packageZipTarball both tasks run; furthermore, they run in an unpredictable order, so sometimes I end up with one config and sometimes with the other.
Any hints on how to proceed?
The issue is that packageZipTarball depends on stage; that is how native-packager works. stage always creates a "ready to build" directory, which the packaging format then uses to create the package. Making these two diverge would be confusing and inconsistent with native-packager's behavior.
I would recommend one of the following options:
1. Build your app with stage and pass the configuration explicitly via a CLI parameter (see the sketch below)
2. Create a bash script that adds the logging configuration parameter explicitly, and call this script to start your app
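A sketch of option 1, assuming the generateLoggingConfigTask from the question can be made to write two distinct files (the destination names below are placeholders): ship both configurations in the universal mappings and select one at startup.
mappings in Universal ++= Seq(
  // Hypothetical destination names; adjust to whatever your task actually produces
  generateLoggingConfigTask(LogType.ConsoleAndFiles).value -> "conf/logger-console.xml",
  generateLoggingConfigTask(LogType.Files).value -> "conf/logger-files.xml"
)
Then start the staged app with, for example, bin/your-app -Dlogback.configurationFile=conf/logger-files.xml.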

Setting Logback Configuration file Programmatically

I'm using the sbt run command to run my project. My project uses the Logback logging mechanism, and if I want to enable logging, I have to use the following command:
sbt -Dlogback.configurationFile=/path/to/log/file/app-logger.xml run
Is there a way that I could set this programmatically? I mean, I would like to just say
sbt run
and have it pick up the app-logger.xml automagically via the application.
This is how I do it!
// Call this before the first Logback logger is created, so the property is
// already set when Logback initializes ("logger" is assumed to be in scope)
def loadLogger() = Option(System.getProperty("logback.configurationFile")) match {
  case Some(logXml) =>
    logger.info(s"using logger $logXml")
  case None =>
    val path = s"${System.getProperty("user.dir")}/conf/app-logger.xml"
    System.setProperty("logback.configurationFile", path)
    logger.info(s"using logger $path")
}
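If you fork run, you can also set the property from the build itself so a plain sbt run picks it up; a sketch (the conf/app-logger.xml path is an assumption):
fork in run := true
// Options added here are passed to the forked run JVM; the path is an assumption
javaOptions in run += s"-Dlogback.configurationFile=${baseDirectory.value}/conf/app-logger.xml"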
