SBT "package" depend on "test" - sbt

How do I make target "package" depend on target "test" ?
There is a solution here: Force sbt 0.11 to run tests
But it doesn't really work with the xsbt-web-plugin.

Here's what I did:
Keys.`package` <<= (test in Test, Keys.`package` in Compile) map { (_, in) =>
  println("after package & test")
  in
}
It runs the tests and, only if they succeed, runs the package task. (Tested on a fresh install of lift-2.5-RC2.)
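For reference, the same dependency can be expressed without the deprecated <<= operator. A minimal sketch in sbt 1.x slash syntax, assuming the default Compile configuration:
// Run the test suite before packaging; packaging is skipped if tests fail.
Compile / Keys.`package` := (Compile / Keys.`package`)
  .dependsOn(Test / test)
  .value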

Related

Is it possible to disable publish without disabling publishLocal in sbt?

I have an sbt project where docker:publishLocal will create a docker image on my machine for testing, and docker:publish will publish the image to a repository and also publish jar files from the build to a repository.
If my project is a snapshot, I would like to disable publishing to the repositories, while still being able to build the local image.
ThisBuild / publishArtifact := ! isSnapshot.value
does the right thing for the publish command, but it also disables publishLocal.
I want to write something like
if (isSnapshot.value) {
  publish := { }
}
but that gives me an error that I do not understand at all:
[info] Loading project definition from /Users/dev/project
/Users/dev/build.sbt:1: error: type mismatch;
found : Unit
required: sbt.internal.DslEntry
if (isSnapshot.value) {
^
Past experience dictates that redefining publish to conditionally call the original version won't work, as
publish := {
  if (!isSnapshot.value) publish.value
}
gives warnings that the task is always evaluated.
Is there a way to do this?
The problem with this code is that it evaluates publish.value regardless of the if structure. I recommend reading the documentation on task dependencies. If you want to "delay" the evaluation of a task in one of the if branches, you need to use a dynamic task definition:
publish := Def.taskDyn {
  if (isSnapshot.value)
    Def.task {} // doing nothing
  else
    Def.task { publish.value } // could be written as just publish
}.value
But apart from fixing your code, you should be aware that there is a special setting for exactly this functionality; it's called skip:
publish/skip := isSnapshot.value
Another thing to note is the scoping. If you want to override docker:publish, which is the same as Docker/publish in the new syntax, you should add the Docker/ scope prefix to every mention of publish in the code above.
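For example, a hedged sketch of the dynamic-task fix scoped under Docker, assuming the sbt-native-packager Docker configuration:
Docker / publish := Def.taskDyn {
  if (isSnapshot.value)
    Def.task {} // skip publishing snapshot images
  else
    Def.task { (Docker / publish).value } // delegate to the original task
}.value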

Making package a dependency of a new sbt task

When at the sbt CLI I can just type package and everything works fine - two jar files are produced. But I want to make package a dependency of a new task I am creating, so I want to make packaging happen as part of the build script. This is what I have:
lazy val deployTask = TaskKey[Unit]("deploy")

deployTask := { println("deploy happening now!") }

deployTask := {
  (deployTask.in(file("."))).value
  (Keys.`package` in Compile).value
}
My reading of the documentation tells me that Compile really means file("src/main/scala"), which is not what I want. It seems that I have to put in <Something>. What do I need to put in instead of <Something> to get package to mean what it means when I type it at the CLI?
At the CLI I should then be able to run:
clean
show deploy
but unfortunately it does not do the packaging I expect.
These are the projects:
projects
[info] In file:/C:/dev/v2/atmosphere/
[info] atmosphereJS
[info] atmosphereJVM
[info] * root
So it makes sense that when I run package from the CLI the root project is used.
So another way of asking this question might be: "How do I make package work for root from the deploy task I am creating?"
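One hedged sketch of a fix, assuming atmosphereJS and atmosphereJVM are the lazy vals defining the subprojects listed above: since package at the CLI runs in every aggregated project, the deploy task can depend on each subproject's package task explicitly (aggregation does not apply to dependencies declared inside a task body):
lazy val deployTask = TaskKey[Unit]("deploy")

deployTask := {
  // hypothetical project references, matching the `projects` listing above
  (Keys.`package` in (atmosphereJS, Compile)).value
  (Keys.`package` in (atmosphereJVM, Compile)).value
  println("deploy happening now!")
}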

ScalaJs PhantomJsEnv doesn't work when phantomjs is installed from npm

I am trying to use phantomjs as installed via npm to perform my unit tests for ScalaJS.
When I run the tests I am getting the following error:
/usr/bin/env: node: No such file or directory
I believe that is because of how phantomjs, when installed with npm, loads node.
Here is the first line from phantomjs:
#!/usr/bin/env node
If I change that first line to hardcode to the node executable (this involves modifying a file installed by npm so it's only a temporary solution at best):
#!/home/bjackman/cgta/opt/node/default/bin/node
Everything works.
I am using PhantomJS, by the way, because moment.js doesn't work in the NodeJSEnv.
Workaround:
After looking through the plugin source, here is the workaround: I am forwarding the environment from sbt to the PhantomJSEnv:
import scala.scalajs.sbtplugin.ScalaJSPlugin._
import scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv
import scala.scalajs.sbtplugin.env.phantomjs.PhantomJSEnv
import scala.collection.JavaConverters._

val env = System.getenv().asScala.toList.map { case (k, v) => s"$k=$v" }

olibCross.sjs.settings(
  ScalaJSKeys.requiresDOM := true,
  libraryDependencies += "org.webjars" % "momentjs" % "2.7.0",
  ScalaJSKeys.jsDependencies += "org.webjars" % "momentjs" % "2.7.0" / "moment.js",
  ScalaJSKeys.postLinkJSEnv := {
    if (ScalaJSKeys.requiresDOM.value) new PhantomJSEnv(None, env)
    else new NodeJSEnv
  }
)
With this I am able to use moment.js in my unit tests.
UPDATE: The relevant bug in Scala.js (#865) has been fixed. This should work without a workaround.
This is indeed a bug in Scala.js (issue #865). For future reference: if you would like to modify the environment of a jsEnv, you have two options (this applies to Node.js and PhantomJS equally):
Pass in additional environment variables as an argument (just like in KingCub's example):
new PhantomJSEnv(None, env) // env: Map[String, String]
Passed-in values will take precedence over default values.
Override getVMEnv:
protected def getVMEnv(args: RunJSArgs): Map[String, String] =
  sys.env ++ additionalEnv // this is the default
This will allow you to:
Read/Modify the environment provided by super.getVMEnv
Make your environment depend on the arguments to the runJS method.
The same applies for arguments that are passed to the executable.
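For instance, a hedged sketch of option 2; the subclass name, the constructor arguments, and the PATH tweak are all illustrative assumptions on top of the API shown above:
class MyPhantomJSEnv extends PhantomJSEnv(None, Map.empty) {
  override protected def getVMEnv(args: RunJSArgs): Map[String, String] =
    // start from the default environment, then prepend a node install to PATH
    super.getVMEnv(args) + ("PATH" -> ("/opt/node/bin:" + sys.env.getOrElse("PATH", "")))
}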

Specifying rootProject in build.sbt

In my Build.scala, I have:
override def rootProject = Some(frontendProject)
I'm trying to convert to the newer build.sbt format, but don't know the equivalent of this line. How do I set the project for sbt to load by default when using build.sbt?
I'm still not sure I understood you right, but since you mention a multi-project build, I assume you want to define a root project that aggregates the subprojects. Here is how you can do that (in your root build.sbt):
lazy val root = project.in(file(".")).aggregate(subProject1, subProject2)
lazy val subProject1 = project in file("subProject1")
lazy val subProject2 = project in file("subProject2")
See sbt documentation about multi-projects.
Then, if you want the project loaded by default on sbt startup to be a sub-project, in addition to your answer to this SO question, I can suggest either:
running sbt with the shell command sbt "project XXX"
or adding this line to your build.sbt:
onLoad in Global := { Command.process("project XXX", _: State) } compose (onLoad in Global).value
In both cases sbt first loads the root project and then the subproject.
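On current sbt versions the same trick reads a little more cleanly in slash syntax; a minimal sketch, again assuming a subproject id of XXX:
Global / onLoad := {
  val previous = (Global / onLoad).value
  // queue a "project XXX" command right after the build loads
  previous andThen { state => "project XXX" :: state }
}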
I've found the following script to be helpful:
#!/usr/bin/env bash
exec "sbt" "project mysubproject" "shell"
exit $?

Passing JVM option to forked test in SBT

I am trying to use a JVM option in a forked test, which has been set externally to SBT prior to its launch. I'm also setting additional JVM options like so:
javaOptions in ThisBuild ++= List("-Xmx3072m")
To my understanding, based on the SBT documentation, the JVM options provided to the SBT process should be available to the forked process:
By default, a forked process uses the same Java and Scala versions being used for the build and the working directory and JVM options of the current process.
However, I can't seem to retrieve those "external" JVM options in the forked tests, i.e. System.getProperty("foo") will always return null. Given that I am trying to pass along a password, I can't set it directly in the build file. My questions therefore are:
is there an SBT task / key to access the JVM options passed to the JVM in which SBT is running? That way I would attempt to add the key to the javaOptions
is there any other means by which to pass external Java Options to a forked test?
You may control your options with testGrouping. Below is a copy-and-paste from one of my projects. It properly handles hierarchical projects, and a root project without tests. Options are merged from javaOptions in run and a test.options file, which allows modifying arguments without reloading the project. That project takes more than a minute to load, so I use test.options to switch quickly between production and debug mode with -Xrunjdwp:transport=dt_...
testGrouping in Test <<= (definedTests in Test, javaOptions in run, baseDirectory in LocalRootProject) map { (tests, javaOptions, baseDirectory) ⇒
  if (tests.nonEmpty) {
    val testOptionsFile = baseDirectory / "test.options"
    val externalOptions = if (testOptionsFile.exists()) {
      val source = scala.io.Source.fromFile(testOptionsFile)
      val options = source.getLines().toIndexedSeq
      source.close()
      options
    } else Nil
    tests map { test ⇒
      new Tests.Group(
        name = test.name,
        tests = Seq(test),
        // runPolicy = Tests.InProcess)
        runPolicy = Tests.SubProcess(javaOptions = javaOptions ++ externalOptions))
    }
  } else {
    Seq(new Tests.Group(
      name = "Empty",
      tests = Seq(),
      runPolicy = Tests.InProcess))
  }
},
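For the specific case in the question, a hedged sketch of a more direct route in current sbt syntax; "foo" is the placeholder property name from the question:
import scala.collection.JavaConverters._

Test / fork := true

// forward one specific system property from the sbt JVM to the forked JVM
Test / javaOptions ++= sys.props.get("foo").map(v => s"-Dfoo=$v").toList

// or forward every JVM argument of the sbt process, read via JMX
Test / javaOptions ++=
  java.lang.management.ManagementFactory.getRuntimeMXBean.getInputArguments.asScala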
