I'm creating an sbt project that I'd like to deploy to Heroku. The Heroku buildpack uses the 'stage' task to compile the project. For my project I want to set the 'mainClass' that gets built into the Unix executable that 'stage' creates, so that Heroku starts things correctly. I've found plenty of help on how to set the main class for the 'run' and 'package' tasks, and that works great for me. I just can't find any directions on how to set it so that the 'stage' task picks it up. Is there any documentation on this that someone could point me to? (I'm using sbt version 0.13.5)
Thanks,
Ryan
It probably uses mainClass in Compile in stage (guessing that stage comes from sbt-native-packager).
As for the documentation, it mentions mainClass once: http://www.scala-sbt.org/sbt-native-packager/DetailedTopics/archetypes.html
Note that mainClass is equal to mainClass in Compile for most builds, since Compile is set as the default configuration.
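If that's the case, a Compile-scoped mainClass in build.sbt should be picked up. A minimal sketch (sbt 0.13.x syntax; com.example.Main is just a placeholder, and the stage key is assumed to come from sbt-native-packager):
mainClass in Compile := Some("com.example.Main")
If the plain Compile-scoped setting isn't honored, scoping it to the stage task explicitly is worth a try:
mainClass in (Compile, stage) := Some("com.example.Main")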
I have this exception, please help me!
"Error occurred during initialization of boot layer
java.lang.module.FindException: Module test not found"
But I am passing the VM options "--module-path "D:\UT java\javafx-sdk-17.0.1\lib" --add-modules javafx.controls,javafx.fxml"
and I have this module-info.java:
requires javafx.fxml;
requires javafx.controls;
requires javafx.graphics;
requires java.sql;
requires java.desktop;
requires jdk.jfr;
I added my SDK. If I create a JavaFX demo project and run it, it works, but as soon as I start changing the FXML file and the controller, I get this exception.
I have IntelliJ IDEA 2021, javafx-sdk-17.0.1, and JDKs 8, 11, and 16.
Steps to fix:
1. Delete the JavaFX SDK (you don't need it).
2. Delete old Java versions (they are obsolete).
3. Update your IntelliJ IDE and IDE plugins to the most recent release, 2021.3.2+.
4. Create a new JavaFX project using JDK and JavaFX 17.0.2+.
5. Select Maven for the build system unless you know and prefer Gradle.
6. Do not set VM arguments; you don't need them. Adding modules via the --add-modules VM argument is unnecessary when you have a valid module-info.java file. The --module-path is still required so that the modules can be found, but IDEA will provide the correct path for your modules automatically when it recognizes them through your Maven dependencies, so you don't need to define the --module-path VM argument yourself for a Maven-based build (that would be difficult to do anyway, because the modules are all downloaded to different directories in your local Maven repository).
7. Test that it works by following the IDEA "create new JavaFX project" execution instructions.
8. Add additional modules one at a time by adding their Maven dependency to pom.xml and the requires clause to module-info.java. Ensure you synchronize the Maven and IDEA projects between each addition. See, for example, this question on correctly adding the javafx.media module; adding other modules such as javafx.web, javafx.fxml, or javafx.swing follows a similar pattern.
9. Test between each addition by building and running the project, to ensure you haven't broken anything.
10. Copy your original source code into the appropriate package directories under the new project's source directory (src/main/java), and place resources in src/main/resources, following the Eden resource location guide.
11. Fix any errors, ensure everything compiles and runs, then test it.
So, I'm writing a library and decided to have proper unit testing for it (TDD and so on). The QtTest framework looked suitable to start with. The library itself is fine, and so is the test.
But when I added both the library and the test project to my work project, the CI build surprisingly failed. It turned out that the test executable (LibraryTest.exe, or the equivalent binary on Linux) was being copied:
to %QTDIR%/tests in a separate folder on Windows
to /usr/tests on Linux
My test project settings added this behavior to the "install" build stage. Here they are (the important ones):
QT += core testlib
# The problem is below
CONFIG += c++11 qt warn_on depend_includepath #testcase
CONFIG -= app_bundle
LIBS *= -L$$PWD -lmylibrary # not exact, does not matter
TARGET = LibraryTest
SOURCES += \
tst_my_library_test.cpp
DEFINES *= QT_FORCE_ASSERTS
DESTDIR = $$PWD/bin
As you can see, after commenting out CONFIG += testcase the executable is no longer copied anywhere. I've read that this configuration option is used for automated tests, which sounds useful, but nothing is written about any special install stage. The test executable exists in DESTDIR just fine, so this is not some accidental error.
My question is: why is this happening? Can I specify some other folder?
Automation is useful, but even if implemented it would probably be bound to some more convenient directory.
Am I using QtTest the wrong way? Thanks in advance for your attention.
Okay, after hacking the tests in my own way and finding enough time to study the problem thoroughly, it turned out that Qt's tests are designed to be used in quite a different way than I had thought.
It was obvious that the testcase makefile differs from the regular one, but the official documentation just states the following:
For testcase projects, qmake will insert a check target into the generated Makefile. This target will run the application. The test is considered to pass if it terminates with an exit code equal to zero.
This gave a hint about what exactly results in the makefile having the extra install_target: first FORCE rule with its strange and wrong file copying, but it didn't explain the behavior any further.
After some more searching I found the following here:
Note also that Qt tests have only been tested with a non-installing Qt (the -prefix $PWD option above). The test project files override the make install target, so they are not installable. And Qt doesn't work at all if it's not at its installation path.
Since my project heavily uses the install build step and the tests were part of the project tree, this explained the problem.
In an Akka project we're using the sbt-revolver plugin to run the application.
During development it would be useful to be able to run the application in the Test scope, so that the test logging and application configuration get loaded, which helps during development.
However, running 'sbt test:re-start' does not seem to use the test classpath and therefore does not run the correct application and does not use the correct configuration files.
Looking at the Revolver page, it looks like the plugin creates its own scope.
Does anyone know how to be able to use the test scope for running the Revolver plugin?
Try configuring Revolver's fullClasspath setting and adding the Test classpath to it:
fullClasspath in Test in reStart <<= Classpaths.concatDistinct(fullClasspath in Test, fullClasspath in Runtime)
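On a newer sbt where the <<= operator is deprecated, an equivalent setting (a sketch, assuming the default sbt-revolver key names) would be:
fullClasspath in (Test, reStart) := ((fullClasspath in Test).value ++ (fullClasspath in Runtime).value).distinct
With that in place, test:reStart should run the application with the test resources (e.g. the configuration under src/test/resources) on the classpath.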
The goal is to have a standalone Play Framework (2.2) application that opens an additional status window containing some JavaFX (JavaFX 8) elements.
Since JavaFX classes are now on the default runtime classpath of an Oracle Java 8 implementation, using javafx.* in my classes and compiling with sbt should just work.
However, sbt can't find these classes and quits with
play.api.UnexpectedException: Unexpected exception[NoClassDefFoundError: javafx/application/Application]
when executing
..\path-to-play-framework-2.2\play project run
The best way to fix this problem seems to be modifying build.sbt in the project directory. What can I do to add the missing (class)path?
Sadly, JavaFX doesn't link that easily into an sbt build. You need to set your JAVA_HOME environment variable and make some modifications to your build file.
Here I have a repository where this is set up. The important bit, if you are using an sbt (.sbt) build rather than a Scala build, is this one:
unmanagedJars in Compile += Attributed.blank(
  file(System.getenv("JAVA_HOME") + "/jre/lib/jfxrt.jar"))

fork in run := true
The reason for this is that jfxrt.jar is the archive containing the JavaFX runtime and it is not included in the classpath of an sbt project by default.
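If some machines have a JRE layout where jfxrt.jar is missing or lives elsewhere, a slightly more defensive variant of the same idea (a sketch under that assumption, not taken from the linked repository) only appends the jar when it actually exists:
unmanagedJars in Compile ++= {
  // Resolve jfxrt.jar relative to JAVA_HOME and skip the setting silently if it isn't there
  val jfxrt = file(sys.env.getOrElse("JAVA_HOME", "")) / "jre" / "lib" / "jfxrt.jar"
  if (jfxrt.exists) Seq(Attributed.blank(jfxrt)) else Seq.empty
}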
Another way is to set the classpath for sbt itself. This can be done on the machines which can't resolve JavaFX:
SBT_OPTS="-Xbootclasspath/p:/usr/share/java/openjfx/jre/lib/ext/jfxrt.jar"
I'm trying to add a 3rd party jar to my java library path. If I invoke sbt with -Djava.library.path=a-3rd-party-lib.jar, then it works for the first invocation of run-main MyClass inside sbt, but thereafter the 3rd party code complains that the jar is not in the java library path. I have also tried adding javaOptions += "-Djava.library.path=a-3rd-party-lib.jar" to my build.sbt file, but this hasn't worked (even for the first run). Qualifying this command as javaOptions in (Test,run) += "-Djava.library.path=a-3rd-party-lib.jar" (as seen in the docs) hasn't worked either.
Am I doing something wrong, or is this a strange bug?
FYI I'm using sbt 0.13.0
javaOptions only takes effect if you fork run, and sbt does not fork by default. See the Forking documentation for details; forking is enabled for run and runMain with:
fork in run := true
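Putting the two together in build.sbt (a minimal sketch; the -Djava.library.path value is just the example path from the question) would look like:

fork in run := true

javaOptions in run += "-Djava.library.path=a-3rd-party-lib.jar"

With forking enabled, the option is passed to a fresh JVM on every run / runMain invocation, not just the first one.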