I am migrating from Activator with sbt 0.13.x to sbt 1.x.
I used to compile my modules like this: $ activator clean compile publish-local -Dversion=1
Now I am trying to do it with sbt, since Activator has been deprecated, but I cannot find out how to migrate to something similar, like $ sbt clean compile publish-local -Dversion=1.
Activator (the CLI part) was just a wrapper around sbt with some custom commands. So what you wrote should work the same, except that the hyphenated task names were deprecated in favor of camelCase:
sbt clean compile publishLocal
If you need to pass a variable to the Java runtime with -D, you have to place it before any commands: sbt -Dversion=1 ....
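For context, this only matters if the build actually reads that system property. A minimal sketch of how a build.sbt might pick it up (the property name and the fallback value here are assumptions, not taken from the question):
// build.sbt -- hypothetical sketch: derive the project version from -Dversion=...
version := sys.props.getOrElse("version", "0.1.0-SNAPSHOT")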
Note that you are running the commands in batch mode; the sbt documentation warns:
Running in batch mode requires JVM spinup and JIT each time, so your build will run much slower. For day-to-day coding, we recommend using the sbt shell or Continuous build and test feature described below.
To follow this recommendation, just run sbt and then enter those commands one by one. Or to run them all sequentially, enter ; clean; compile; publishLocal.
VSCode, Java 11, JavaFX 18.0.2
I am trying to package my code up for distribution as a desktop app. In my case I want a fully self-contained app because of my target user's profile.
I have been through Jenkov and the Oracle docs here and here, which suggest I need ant-javafx.jar. That jar file seems to have been dropped from the standard Java SDK some time around Java 7 and put into the regular JavaFX install lib folder.
It's not there in the build I have.
JavaFX seems to have gone to openjfx.io and nowhere in there can I see support for the ant packaging jar. In fact I see openjfx as a retrograde step as they are increasingly forcing everyone into paid plans (try going round and round the loop of downloading anything that doesn't require an LTS payment).
I have a suspicion that there is some silent assumption that everyone will use something from maven or gradle, and maybe the packaging tools are buried away in one of those build tools. For historical reasons I don't use either and it should be possible to do this packaging without one of them.
So where do I get the JavaFX Ant build tasks from without having to pay someone?
I have found that the following works as an alternative with Java 19 and OpenJFX 19. I use the maven-dependency-plugin to copy all the dependency jars (excluding JavaFX, which I use as modules from a "full" JDK, one that includes JavaFX) into the target/lib directory.
#!/bin/bash
set -o errexit
set -o noclobber
set -o xtrace
# find dependency modules of required modules
DEP_MODS=$(jdeps -quiet --class-path "target/lib/*" --add-modules java.base,java.logging,java.sql,javafx.controls,javafx.fxml --multi-release base --ignore-missing-deps --print-module-deps target/myapp-4.0-beta.jar)
# create a modular runtime image
jlink --compress=1 --no-header-files --no-man-pages --add-modules "java.logging,java.sql,javafx.controls,javafx.fxml,$DEP_MODS" --output target/myapp-4.0-beta
# Example of running it out of the runtime image
# TEST target/myapp-4.0-beta/bin/java -cp "../../myapp-4.0-beta.jar:../../lib/*" org.myapp.App
# symlink to the artifact jar from the lib directory
(cd target/lib && ln -s ../myapp-4.0-beta.jar)
# use the lib directory and modular runtime image as input to jpackage
jpackage --input target/lib --runtime-image target/myapp-4.0-beta --main-jar myapp-4.0-beta.jar --main-class org.myapp.App --type app-image --app-version 4.0 --name app --dest target/dist/bundle --mac-entitlements src/dist/mac/entitlements.plist
In the following project, https://github.com/spark-jobserver/spark-jobserver, we recently upgraded sbt from 0.13.12 to 1.1.6.
With 0.13.12 we used to run sbt "testOnly *JobServerSpec" and the test cases for that specific class would run, but with 1.1.6 this is no longer the case.
To reproduce the issue:
NOTE: If you get a "Not in GZIP format" exception, execute rm -rf **/target in the project root.
Clone the project; it is currently using sbt 1.1.6.
git checkout d7e231ea4ee9981e49b411d09f132e396c901b98 will switch to a point before sbt 1.1.6 was introduced.
Execute sbt "testOnly *JobServerSpec"; the tests should run.
git checkout master to switch back to the version with sbt 1.1.6.
Execute sbt "testOnly *JobServerSpec".
It should not run any tests.
I was wondering if sbt has something similar to the Gradle Wrapper.
I would like to use it on a CI server without having to install sbt first.
The documentation mentions an sbt-launcher, but that seems to be geared towards running an actual application rather than builds.
Yes. sbt-extras is a bash script that you can commit to your repository to act like the Gradle Wrapper.
The sbt-extras project is centered around a stand-alone script called sbt, which can be used directly to run sbt without having it installed on the machine first.
The script has logic to determine the proper version of sbt for the project, download the correct version of the sbt jar, and then run the tasks through sbt.
If you copy the sbt script into your project, you can simply call it — from your CI server, locally, or wherever — to run sbt tasks without needing sbt installed separately.
I am in the process of creating a plugin that will modify both the compile:compile and test:test tasks. My ultimate aim is to be able to do sbt monkjack or sbt monkjack:test (either is fine). In the compile:compile scope I need to add a compiler plugin, and in the test:test scope I need to run some code after the tests have finished.
My first attempts were around creating a custom configuration, but it was unclear which to extend, Compile or Test, as both are needed (at the moment I have two, and I copy the CustomTest into the CustomCompile and then run monkjack:test). My second attempts focused on a custom task that in turn invoked (compile in Compile).value and (test in Test).value after setting various options.
I realize my knowledge of SBT tasks and how they are related/inherited/scoped is not great.
Q1. Is there a chain of tasks like in Maven? In Maven, if you execute test, it will execute the other phases in order, so mvn clean test will automatically run prepare-sources, compile, etc. So in sbt, if I run sbt test, how are the other tasks automatically executed?
Q2. If you execute a task with a custom config, e.g. sbt millertime:test, will that config propagate to the other tasks that run? E.g. is this the same as sbt monkjack:compile monkjack:test, or the same as sbt compile monkjack:test, or neither? :)
Q3. How do tasks know which is their default config? If I do sbt compile, how does sbt know that means sbt compile:compile?
Q4. Which is the best way to go here, a custom configuration or a new task?
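To make the setup concrete, here is a minimal sketch of one possible shape of the custom-configuration approach in build.sbt terms; the MonkJack configuration, the compiler-plugin path, and the log message are hypothetical, not taken from any existing plugin:
// build.sbt -- hypothetical sketch of a custom configuration extending Test
lazy val MonkJack = config("monkjack") extend Test

lazy val root = (project in file("."))
  .configs(MonkJack)
  .settings(
    // reuse the standard test tasks (compile, test, testOnly, ...) under monkjack:*
    inConfig(MonkJack)(Defaults.testSettings),
    // enable a compiler plugin only when compiling in this configuration
    scalacOptions in MonkJack += "-Xplugin:/path/to/monkjack-plugin.jar",
    // run extra code after the tests in this configuration have finished
    test in MonkJack := {
      (test in MonkJack).value
      streams.value.log.info("monkjack: post-test step")
    }
  )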
I have a multi-module build, and would like to run tests for different sub-projects independently.
Is there a way to do this in sbt? E.g. if my multi-project build has core and commons projects, I'd like to run test only in the commons project.
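For concreteness, a minimal build.sbt for such a layout might look like this (the project names come from the question; everything else is assumed):
// build.sbt -- assumed multi-module layout with commons and core sub-projects
lazy val commons = (project in file("commons"))
  .settings(name := "commons")

lazy val core = (project in file("core"))
  .dependsOn(commons)
  .settings(name := "core")

// aggregation is why a plain sbt test at the root runs the tests of every sub-project
lazy val root = (project in file("."))
  .aggregate(core, commons)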
Run sbt commons/test. See detailed explanation in Scopes.
You may also combine two sbt commands: change the current project using project and then execute test.
sbt "project commons" test
You can also use
sbt "; project commons; test"
If you are running sbt in interactive mode:
> project commons
> test
You can switch back to core with:
> project core
Never mind, I came across this:
How to execute package for one submodule only on Jenkins?
sbt "project core" test
Another way to do this:
Get into sbt interactive mode:
$ sbt
sbt:core> commons / test
No need to switch between projects.
To run sbt test for only the submodules that have added, modified, or deleted files, if you use git:
while read -r proj ; do sbt "project $proj" test ; \
done < <(git status --porcelain | cut -c 3- | cut -d/ -f1)
There is another way to have sbt open for a particular module.
Go to the module's directory and run the sbt command.
sbt will then open in interactive mode for that module only.