How to show all config scope axes? - sbt

From the sbt documentation, I see there are 3 scope axes in sbt:
project
config
task
For project and task, I can use the commands:
projects
tasks
to list them for the project.
But how do I see the configurations?

It'd be ivyConfigurations.
> help ivyConfigurations
The defined configurations for dependency management. This may be different
from the configurations for Project settings.
> ivyConfigurations
[info] List(compile, runtime, test, provided, optional, compile-internal,
runtime-internal, test-internal, plugin, sources, docs, pom, scala-tool)
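For reference, configurations beyond the built-in ones come from the build definition itself. A minimal build.sbt sketch of adding one (the Benchmark/bench names are purely illustrative), after which it also shows up in the ivyConfigurations list above:
lazy val Benchmark = config("bench") extend Test
lazy val root = (project in file("."))
  .configs(Benchmark)                                        // registers the configuration with the project
  .settings(inConfig(Benchmark)(Defaults.testSettings): _*)  // gives it the usual test keys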

Related

opendaylight: using external jar file

I am developing an application on opendaylight Carbon (based on Karaf). I need to use a library (specifically dnsjava) in my bundle. How do I go about including this?
I tried the following which did not work:
In my features/pom.xml, I included an mvn dependency for my jar file.
In my features/src/main/features/features.xml, I added a bundle:
<bundle>wrap:mvn:dnsjava/dnsjava/${dnsjava.version}</bundle>
However, I still have an error when I go to start my feature:
Error executing command: Error executing command on bundles:
Unable to execute command on bundle 278: The bundle "gov.nist.sdnmud.impl_0.1.0.SNAPSHOT [278]" could not be resolved. Reason: Missing Constraint: Import-Package: org.xbill.DNS; version="[2.1.0,3.0.0)"
Thanks for any help.
I'm not an expert, but if the artifact doesn't have OSGi metadata in its jar (which is likely why you've added the "wrap" prefix), then you have to set the required OSGi properties manually on the features.xml dependency line, using a somewhat odd microformat syntax.
In our environment, we have to do something like this:
wrap:mvn:<group>/<artifact>/<version>$Bundle-SymbolicName=<bundlename>&Bundle-Version=<version>
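Filled in with the dnsjava coordinates from the question, it would look something like this (the symbolic name here is only an assumption, and inside features.xml the & has to be escaped as &amp;):
wrap:mvn:dnsjava/dnsjava/${dnsjava.version}$Bundle-SymbolicName=dnsjava&Bundle-Version=${dnsjava.version}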
This issue doesn't have anything to do with opendaylight.

How to run sbt Revolver in test scope?

In an Akka project we're using the SBT Revolver plugin to run the application.
During development it would be useful to run the application in the test scope, so that the test logging and application configuration get loaded.
However, running 'sbt test:re-start' does not seem to use the test classpath, and therefore does not run the correct application or use the correct configuration files.
Looking at the Revolver page, it looks like the plugin creates its own scope.
Does anyone know how to use the test scope when running the Revolver plugin?
Try to configure the fullClasspath setting of revolver and add the Test classpath to it:
fullClasspath in Test in reStart <<= Classpaths.concatDistinct(fullClasspath in Test, fullClasspath in Runtime)
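In newer sbt versions, where <<= is deprecated, roughly the same wiring can be written with := and slash syntax. This is only a sketch; whether Revolver picks up exactly this scope may depend on the plugin version:
Test / reStart / fullClasspath :=
  ((Test / fullClasspath).value ++ (Runtime / fullClasspath).value).distinct  // Test classpath plus Runtime, deduplicated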

Is it possible to add a resolver to an SBT project's build via an AutoPlugin?

I am creating an AutoPlugin that wraps a non-auto plugin, flywaydb. Unfortunately, the non-auto plugin requires a custom resolver. When I publish our AutoPlugin, the resolver is not used in the client project's meta build, causing SBT to fail to start with a big stack trace that begins with:
sbt.ResolveException: unresolved dependency: org.flywaydb#flyway-sbt;3.2.1: not found
I did not catch this locally because I had the flyway artifacts cached in ~/.ivy/cache/scala_2.10/sbt_0.13/org.flywaydb/
The error is obviously due to SBT not using the custom flyway resolver when loading the build with the AutoPlugin enabled. My question is: is there a way to add a meta build resolver via an AutoPlugin setting, or must all plugin dependencies be resolvable via the default SBT resolvers?
A secondary question (which could be the real issue): could it be an SBT bug that the client project's meta build does not transitively pick up the dependencies of enabled plugins?
This project is OSS. Links to code in case it is helpful:
Here is where the plugin project adds the resolver:
https://github.com/allenai/sbt-plugins/blob/a3ea78319836fd39cc8f2e13305e85bb9bfef5c7/build.sbt#L44-L45
Here is the auto plugin:
https://github.com/allenai/sbt-plugins/blob/a3ea78319836fd39cc8f2e13305e85bb9bfef5c7/src/main/scala/org/allenai/plugins/DatabasePlugin.scala
I found out that I was pointing to the wrong resolver; the correct one is
resolvers += "Flyway" at "https://flywaydb.org/repo"
Unfortunately, the user of your build still needs to add the resolver setting inside project/*.sbt.
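Concretely, each client build needs a line like the following in one of its own project/*.sbt files (the file name below is arbitrary) so the meta build can resolve the plugin's transitive flyway-sbt dependency:
// project/resolvers.sbt -- any *.sbt file under project/ works
resolvers += "Flyway" at "https://flywaydb.org/repo"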

SBT Plugin to modify compile and test tasks

I am in the process of creating a plugin that will modify both the compile:compile and test:test tasks. My ultimate aim is to be able to do sbt monkjack or sbt monkjack:test (either is fine). In the compile:compile scope I need to add a compiler plugin, and in the test:test scope I need to run some code after the tests have finished.
My first attempts were around creating a custom configuration, but it was unclear which one to extend, Compile or Test, as both are needed. (At the moment I have two configurations, and I copy the CustomTest settings into CustomCompile and then run monkjack:test.) My second attempt focused on a custom task that in turn invoked (compile in Compile).value and (test in Test).value after setting various options.
I realize my knowledge of SBT tasks and how they are related/inherited/scoped is not great.
Q1. Is there a chain of tasks like in Maven? In Maven, if you execute test, it will execute the other phases in order, so mvn clean test will automatically run prepare-sources, compile, etc. So in SBT, if I run sbt test, how are the other tasks automatically executed? (A small sketch of this chaining follows Q4.)
Q2. If you execute a task with a custom config, e.g. sbt millertime:test, will that config propagate to the other tasks that run? E.g. is this the same as sbt monkjack:compile monkjack:test, or the same as sbt compile monkjack:test, or neither? :)
Q3. How do tasks know which is their default config? If I do sbt compile, how does SBT know that means sbt compile:compile?
Q4. Which is the best way to go here, a custom configuration or a new task?
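Regarding Q1: sbt chains tasks through declared dependencies rather than fixed phases; when one task uses another task's value, sbt runs the dependency first. A minimal sketch, reusing the monkjack name from the question purely for illustration:
lazy val monkjack = taskKey[Unit]("illustrative task that runs after compilation")
monkjack := {
  // .value declares the dependency, so compile:compile (and whatever it depends on) runs first
  val analysis = (compile in Compile).value
  streams.value.log.info("monkjack ran after compile:compile")
}
This is roughly why sbt test works the way it does: test depends on test:compile, which in turn depends on compile:compile.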

Add JavaFX (of JDK8) in sbt (using the play framework)

The goal is to have a standalone Play Framework (2.2) application having an additional status window open containing some javafx (javafx-8) elements.
Since JavaFX classes are now on the default runtime classpath of an Oracle Java 8 installation, using javafx.* in my classes and compiling with sbt should just work.
However, sbt can't find these classes and quits with
play.api.UnexpectedException: Unexpected exception[NoClassDefFoundError: javafx/application/Application]
when executing
..\path-to-play-framework-2.2\play project run
The best way to fix this problem seems to be the modification of build.sbt in the project directory. What can I do to add the missing (class) path?
Sadly, JavaFX doesn't link that easily into an sbt build. You need to set your JAVA_HOME environment variable and make some modifications to your build file.
Here I have a repository where this is set up. The important bit, if you are using an sbt build rather than a Scala build, is this one:
unmanagedJars in Compile += Attributed.blank(
  file(System.getenv("JAVA_HOME") + "/jre/lib/jfxrt.jar")
)

fork in run := true
The reason for this is that jfxrt.jar is the archive containing the JavaFX runtime and it is not included in the classpath of an sbt project by default.
Another way is to set the classpath for sbt. This can be done on machines which can't resolve JavaFX.
SBT_OPTS="-Xbootclasspath/p:/usr/share/java/openjfx/jre/lib/ext/jfxrt.jar"
