SBT: running end-to-end tests

I'm trying to set up sbt to run end-to-end tests.
What it needs to do:
Build and run the app: sbt re-start
Execute integration tests against it: sbt e2e:test
At the moment I can do both things separately, but not together with one command.
The end result should be sbt e2eTests, which executes both tasks.
I added this to my build.sbt:
val EndToEndTest = config("e2e") extend(Test)

val e2eSettings =
  inConfig(EndToEndTest)(Defaults.testSettings) ++
    Seq(
      fork in EndToEndTest := false,
      parallelExecution in EndToEndTest := false,
      scalaSource in EndToEndTest := baseDirectory.value / "src/e2e/scala")
plus
lazy val root =
  Project("root", file("."))
    .configs(EndToEndTest)
    .settings(inConfig(EndToEndTest)(Defaults.testSettings): _*)
I created a dummy test in src/e2e/scala and I can run it just fine with sbt e2e:test.
I know that tasks can depend on each other, but I have no idea how to depend on the e2e:test task. How do I reference it in my new e2eTests task?
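One minimal sketch: since re-start and e2e:test already work as separate commands, a command alias can simply chain them, so no task-level wiring is needed at all:

// Runs re-start first, then the tests in the e2e configuration.
addCommandAlias("e2eTests", ";re-start ;e2e:test")

If a real task is preferred instead, the e2e test task is referenced as test in EndToEndTest (EndToEndTest / test in newer sbt syntax), and a custom task can be made to depend on it.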

Related

Get scrooge to generate source files in test phase?

I have a multi-module build that looks kind of like this:
lazy val root = (project in file(".")).
  settings(common).
  aggregate(finagle_core, finagle_thrift)

lazy val finagle_core =
  project.
    settings(common).
    settings(Seq(
      name := "finagle-core",
      libraryDependencies ++= Dependencies.finagle
    ))

lazy val finagle_thrift =
  project.
    settings(common).
    settings(Seq(
      name := "finagle-thrift",
      libraryDependencies ++= Dependencies.finagleThrift,
      scroogeThriftSourceFolder in Test <<= baseDirectory {
        base => {
          base / "target/thrift_external/"
        }
      },
      scroogeThriftDependencies in Test := Seq(
        "external-client"
      ),
      scroogeBuildOptions in Test := Seq(
        WithFinagle
      )
    )).dependsOn(finagle_core)
Here finagle_thrift depends on a jar file, external-client, that has thrift files in it. I want it to extract the thrift files to target/thrift_external and compile them into a client.
This does work, but I have to execute sbt twice to get it to work. The first time I run sbt, it doesn't extract the files; the second time it does. I am at a loss as to why that is happening.
==
EDIT:
I see what's happening. It does unpack the dependencies on test, but because the settings are evaluated before the unpack, the generated code doesn't get the list of files that are generated. The second time it runs, the files are already extracted, so it picks up the thrift files.
==
EDIT 2:
I solved this in a super janky way:
addCommandAlias("build", "; test:scroogeUnpackDeps; compile")
And now it gets unpacked first, then compiled.
SBT resolves the scroogeThriftSourceFolder directory when it loads (before running the tasks), at which point the external files are not there yet.
Performing a reload will make it discover the downloaded files:
sbt scroogeUnpackDeps reload compile
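Building on the alias from EDIT 2, the reload step can presumably be folded into the alias itself, so a single command unpacks the dependencies, reloads the build definition, and then compiles:

// Sketch: assumes the commands queued after reload still run once the build has reloaded.
addCommandAlias("build", "; test:scroogeUnpackDeps; reload; compile")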

How to avoid activator executing compile task twice upon accessing Play page?

I am trying to run a custom task before compilation of a Play 2.3 application. I have this in my build.sbt file:
lazy val helloTask = TaskKey[Unit]("hello", "hello")

helloTask := {
  println("hello test")
}

(compile in Compile) <<= (compile in Compile) dependsOn helloTask
When I run activator ~run and then open a page in the browser, I get the following output:
C:\Development\test>activator ~run
[info] Loading project definition from C:\Development\test\project
[info] Set current project to play (in build file:/C:/Development/test/)
--- (Running the application from SBT, auto-reloading is enabled) ---
[info] play - Listening for HTTP on /0:0:0:0:0:0:0:0:9000
(Server started, use Ctrl+D to stop and go back to the console...)
hello test
[success] Compiled in 418ms
hello test
hello test
[info] play - Application started (Dev)
It seems my custom task is running three times. Is there a way I can avoid this?
I had the same problem and I found a solution.
In sbt you have three scopes on the configuration axis:
Compile, which defines the main build (src/main/scala).
Test, which defines how to build tests (src/test/scala).
Runtime, which defines the classpath for the run task.
You must use Runtime instead of Compile. It should look like this:
lazy val helloTask = taskKey[Unit]("hello")
helloTask := println("hello test")
(compile in Runtime) <<= (compile in Runtime) dependsOn helloTask
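In newer sbt versions, where <<= is deprecated, the same dependency can presumably be written with the plain assignment form:

// Equivalent wiring without the deprecated <<= operator (slash syntax).
Runtime / compile := ((Runtime / compile) dependsOn helloTask).value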
This was the first result on Google, so I would like to post my current solution to the problem, which actually works with Play 2.8 and a multi-project build. It is slightly modified; the proposed solution by @bartholomaios results in a compile loop for me.
lazy val helloTask = taskKey[Unit]("hello")

helloTask := println("hello test")

lazy val module1: Project = (project in file("modules/module1"))

// Run a task before sbt module1/run
((module1 / run) in Compile) := (((module1 / run) in Compile) dependsOn Compile / helloTask).evaluated

// Run a task before sbt module1/docker:stage
((module1 / stage) in Docker) := (((module1 / stage) in Docker) dependsOn Compile / helloTask).value

Excluding dependency transitively for assembly only?

Here's an example build.sbt:
import AssemblyKeys._
assemblySettings
buildInfoSettings
net.virtualvoid.sbt.graph.Plugin.graphSettings
name := "scala-app-template"
version := "0.1"
scalaVersion := "2.9.3"
val FunnyRuntime = config("funnyruntime") extend(Compile)
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"
sourceGenerators in Compile <+= buildInfo
buildInfoPackage := "com.psnively"
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)
assembleArtifact in packageScala := false
val root = project.in(file(".")).
  configs(FunnyRuntime).
  settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
    libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
  )): _*)
The goal is to have spark-core "provided" so it and its dependencies are not included in the assembly artifact, but to reinclude them on the runtime classpath for the run- and test-related tasks.
It seems that using a custom scope will ultimately be helpful, but I'm stymied on how to actually cause the default/global run/test tasks to use the custom libraryDependencies and hopefully override the default. I've tried things including:
(run in Global) := (run in FunnyRuntime)
and the like to no avail.
To summarize: this feels like essentially a generalization of the web case, where the servlet-api is in "provided" scope, and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.
For a similar case I used in assembly.sbt:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
Update:
@rob's solution seems to be the only one working on the latest SBT version; just add this to the settings in build.sbt:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
Adding to @douglaz' answer,
runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
is the corresponding fix for the runMain task.
Another option is to create separate sbt projects for assembly vs run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run for running directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project in order to e.g. specify the spark master in code when running with sbt.
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion

val commonSettings = Seq(
  name := "Project",
  libraryDependencies ++= Seq(...) // Common deps
)

// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
  .settings(
    commonSettings,
    assembly / mainClass := Some("com.example.Main"),
    libraryDependencies += sparkDep % "provided"
  )

// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
  .settings(
    // Projects' target dirs can't overlap
    target := target.value.toPath.resolveSibling("target-runtest").toFile,
    commonSettings,
    // If separate main file needed, e.g. for specifying spark master in code
    Compile / run / mainClass := Some("com.example.RunMain"),
    libraryDependencies += sparkDep
  )
If you use the sbt-revolver plugin, here is a solution for its reStart task:
fullClasspath in Revolver.reStart <<= fullClasspath in Compile
UPD: for sbt-1.0 you may use the new assignment form:
fullClasspath in Revolver.reStart := (fullClasspath in Compile).value

Inter-module dependencies dependent on config possible in SBT?

I'm attempting to scope a dependency to a module in the same project using SBT's configurations.
In production, this dependency is satisfied by a jar on the classpath, but during dev it would be nice to do server/config-a:run or server/config-b:run to select the dependency manually.
Currently, I have something like this:
lazy val configA = config("config-a") extend Runtime
lazy val configB = config("config-b") extend Runtime

lazy val DevConfigA = Project(id = "dev-config-a", base = file("dev-config-a"))
lazy val DevConfigB = Project(id = "dev-config-b", base = file("dev-config-b"))

lazy val server = Project(id = "server",
  base = file("server"),
  dependencies = Seq(common))
  .configs(configA, configB)
  .dependsOn(DevConfigA % configA, DevConfigB % configB)
DevConfigA and DevConfigB bring in resources used for configuration. We want exactly one of them to be loaded. The goal is that server/config-a:run would depend on the DevConfigA module, and not DevConfigB.
I had to move the configs and dependsOn out of the call to Project.apply to get it to compile. After that, the DevConfig* dependencies don't show up when I run server/config-a:run or when I call show server/config-a:dependency-classpath.
Is there a way to make inter-module dependencies dependent on the config?
Yes, there's a way to make dependencies configuration-dependent: use libraryDependencies scoped to the configuration.
I'm using the latest stable release of SBT.
[server]> show sbtVersion
[info] 0.13.1
Let's assume you need different versions of a library, e.g. scalaz, based upon what configuration you execute run with. As a matter of fact, you don't have to worry about the task itself, but about the dependencies available in a given configuration, and since libraryDependencies drives them, I'm going to use it.
[server]> help libraryDependencies
Declares managed dependencies.
Here's the build.sbt that gives what you want.
build.sbt
lazy val configA = config("config-a") extend Runtime
lazy val configB = config("config-b") extend Runtime
lazy val server = project in file(".") configs(configA, configB)
val scalaz705 = "org.scalaz" %% "scalaz-core" % "7.0.5"
val scalaz710_M5 = "org.scalaz" %% "scalaz-core" % "7.1.0-M5"
libraryDependencies in configA += scalaz705
libraryDependencies in configB += scalaz710_M5
With the above build.sbt, sbt lets us pick different versions of Scalaz based upon the configuration.
[server]> show libraryDependencies
[info] List(org.scala-lang:scala-library:2.10.3)
[server]> show config-a:libraryDependencies
[info] List(org.scala-lang:scala-library:2.10.3, org.scalaz:scalaz-core:7.0.5)
[server]> show config-b:libraryDependencies
[info] List(org.scala-lang:scala-library:2.10.3, org.scalaz:scalaz-core:7.1.0-M5)

Passing JVM option to forked test in SBT

I am trying to use, in a forked test, a JVM option that has been set externally, before SBT is launched. I'm also setting additional JVM options like so:
javaOptions in ThisBuild ++= List("-Xmx3072m")
To my understanding, based on the SBT documentation, the JVM options provided to the SBT process should be available to the forked process:
By default, a forked process uses the same Java and Scala versions being used for the build and the working directory and JVM options of the current process.
However, I can't seem to retrieve those "external" JVM options in the forked tests, i.e. System.getProperty("foo") will always return null. Given that I am trying to pass along a password, I can't set it directly in the build file. My questions therefore are:
is there an SBT task / key to access the JVM options passed to the JVM in which SBT is running? That way I would attempt to add the key to the javaOptions
is there any other means by which to pass external Java options to a forked test?
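For the second point, here is one minimal sketch, assuming the external value reaches the sbt JVM as a system property with the hypothetical name foo: read it from the sbt process with sys.props and re-emit it as a -D flag for the forked test JVMs.

// Hypothetical property name "foo": copy it from the sbt JVM into the forked test JVMs, if set.
javaOptions in Test ++= sys.props.get("foo").map(value => s"-Dfoo=$value").toSeq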
You may control your options with testGrouping. Below is a copy-and-paste from one of my projects. It properly handles hierarchical projects, and a root project without tests, too. The options are merged from javaOptions in run and a test.options file. This allows modifying arguments without reloading the project. That project takes more than a minute to load, so I use test.options to switch quickly between production and debug mode with -Xrunjdwp:transport=dt_...
testGrouping in Test <<= (definedTests in Test, javaOptions in run, baseDirectory in LocalRootProject) map { (tests, javaOptions, baseDirectory) ⇒
  if (tests.nonEmpty) {
    val testOptionsFile = baseDirectory / "test.options"
    val externalOptions = if (testOptionsFile.exists()) {
      val source = scala.io.Source.fromFile(testOptionsFile)
      val options = source.getLines().toIndexedSeq
      source.close()
      options
    } else Nil
    tests map { test ⇒
      new Tests.Group(
        name = test.name,
        tests = Seq(test),
        // runPolicy = Tests.InProcess)
        runPolicy = Tests.SubProcess(javaOptions = javaOptions ++ externalOptions))
    }
  } else {
    Seq(new Tests.Group(
      name = "Empty",
      tests = Seq(),
      runPolicy = Tests.InProcess))
  }
},
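Since test.options is read line by line above, it simply holds one JVM argument per line. A minimal example file (values assumed, reusing foo from the question) might be:

-Xmx3072m
-Dfoo=bar

Each line is appended to the javaOptions of the forked test JVM, so System.getProperty("foo") returns "bar" inside the tests, and the file can be edited without reloading the build.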
