Are the locations intentional? I have an issue where sbt always seems to fetch dependencies externally rather than from local. When external resolvers are removed, the dependency cannot be resolved because the path .ivy2/local does not exist.
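(For what it's worth, ~/.ivy2/local is normally only created once something has been published to it, e.g. by running, in the project that provides the dependency:
sbt publishLocal
so a missing ~/.ivy2/local just means nothing has been published locally yet.)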
I have this exception, please help me!
"Error occurred during initialization of boot layer
java.lang.module.Find Exception: Module test not found"
But I set the VM options "--module-path "D:\UT java\javafx-sdk-17.0.1\lib" --add-modules javafx.controls,javafx.fxml"
and I have this module-info.java:
requires javafx.fxml;
requires javafx.controls;
requires javafx.graphics;
requires java.sql;
requires java.desktop;
requires jdk.jfr;
I added my SDK. If I create a JavaFX demo project and run it, it works, but once I start changing the FXML file and the controller, I get this exception.
I have IntelliJ IDEA 2021, javafx-sdk-17.0.1, and JDK 8, 11, and 16.
Steps to fix:
Delete the JavaFX sdk (you don’t need it).
Delete old Java versions (they are obsolete).
Update your IntelliJ IDE and IDE plugins to the most recent release, 2021.3.2+.
Create a new JavaFX project using JDK and JavaFX 17.0.2+.
Select Maven for the build system unless you know and prefer Gradle.
Do not set VM arguments, you don’t need them.
Adding modules via the --add-modules VM arguments is unnecessary when you have a valid module-info.java file.
The --module-path is still required so that the modules can be found, but Idea will provide the correct path for your modules automatically when it recognizes the modules through your Maven dependencies.
So you don't need to explicitly define the --module-path VM argument yourself for a Maven based build (that would be difficult to do anyway because the modules are all downloaded to different directories in your local maven repository).
Test that it works by following the Idea "create new JavaFX project" execution instructions.
Add additional modules one at a time by adding their Maven dependency to pom.xml and the requires clause to module-info.java (a minimal sketch follows these steps).
Ensure you synchronize the Maven and Idea projects between each addition.
See, for example, this question on correctly adding the javafx.media module.
Adding other modules such as javafx.web, javafx.fxml or javafx.swing follows a similar pattern.
Test between each addition by building and running the project, to ensure you haven’t broken anything.
Copy your original source code into the appropriate package directories under the new project source directory:
src/main/java
Place resources in:
src/main/resources
following the Eden resource location guide.
Fix any errors, ensure everything compiles and runs, then test it.
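As a concrete illustration of the module-adding step above, adding javafx.media to a Maven build would mean roughly this dependency in pom.xml (the org.openjfx coordinates are the published ones; the 17.0.2 version is just an example, match it to your setup):
<dependency>
    <groupId>org.openjfx</groupId>
    <artifactId>javafx-media</artifactId>
    <version>17.0.2</version>
</dependency>
plus the matching line in module-info.java:
requires javafx.media;
After re-synchronizing the Maven project, Idea picks the new module up on the module path for you.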
I am developing an application on OpenDaylight Carbon (based on Karaf). I need to use a library (specifically dnsjava) in my bundle. How do I go about including this?
I tried the following which did not work:
In my features/pom.xml, I included a mvn dependency for my jar file.
In my features/src/main/features/features.xml, I added a bundle:
<bundle>wrap:mvn:dnsjava/dnsjava/${dnsjava.version}</bundle>
However, I still have an error when I go to start my feature:
Error executing command: Error executing command on bundles:
Unable to execute command on bundle 278: The bundle "gov.nist.sdnmud.impl_0.1.0.SNAPSHOT [278]" could not be resolved. Reason: Missing Constraint: Import-Package: org.xbill.DNS; version="[2.1.0,3.0.0)"
Thanks for any help.
I'm not an expert, but if the artifact doesn't have OSGi metadata in the jar (which is likely why you've added the "wrap" prefix), then you have to set the required OSGi properties manually on the features.xml dependency line, using an odd microformat syntax.
In our environment, we have to do something like this:
wrap:mvn:<group>/<artifact>/<version>$Bundle-SymbolicName=<bundlename>&Bundle-Version=<version>
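For dnsjava specifically, the features.xml line might end up looking something like this (the header values are illustrative; also note that the & separating the headers has to be escaped as &amp; inside the XML):
<bundle>wrap:mvn:dnsjava/dnsjava/${dnsjava.version}$Bundle-SymbolicName=dnsjava&amp;Bundle-Version=${dnsjava.version}</bundle>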
This issue doesn't have anything to do with opendaylight.
I publish my project to a Nexus/Sonatype staging repo. Then I try to pull that dependency with sbt update on another project.
I get an unresolved dependency error. sbt shows me all the directories it tried, including the path for the staging repo I expected to succeed.
I initially thought it was a deployment/delay thing with Nexus (waiting for the package to propagate), but I can reliably hit the same pom file link that sbt says failed from a browser with no problems (cut 'n' paste of the precise URL).
Does sbt have some kind of resolution cache not cleared by a clean?
If the pom can be downloaded from Nexus, then the issue isn't with Nexus; it is with sbt. You might try explicitly running update from sbt to clear its cache.
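A minimal sequence to try, assuming the default Ivy cache location under ~/.ivy2/cache:
sbt update
rm -rf ~/.ivy2/cache/<your-organization>
sbt update
Removing the cached metadata for the organization forces Ivy to go back to the remote resolvers instead of reusing a failed or stale resolution.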
How can I download all of the jars Robolectric depends on, to avoid downloading them at runtime and make it work offline? I need to use Robolectric.buildActivity(), which is part of the 2.x.x versions.
Any ideas on this?
Starting with Robolectric 2.4 they added two system properties to allow you to tell the Robolectric test runner to use local copies of the dependencies. See the Configuring Robolectric page.
The settings are:
robolectric.offline - Set to true to disable runtime fetching of jars
robolectric.dependency.dir - When in offline mode, specifies a folder containing runtime dependencies
One way to figure out which files you need to copy to the dependency dir is to run gradlew testDebug -i (or maybe with -d) and watch the output to see which jars are being downloaded at runtime. Then copy them to a known location on your build machine. (Another way to see which files you need is to look at SdkConfig.java and get the dependency jars mentioned there, along with their dependencies.)
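For example, on a Unix-like shell something like this will surface those downloads in the log (using the flags mentioned above):
gradlew testDebug -i | grep -i download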
For the current Robolectric 3.0-rc2, these are the files it needs:
accessibility-test-framework-1.0.jar
android-all-5.0.0_r2-robolectric-1.jar
icu4j-53.1.jar
json-20080701.jar
robolectric-annotations-3.0-rc2.jar
robolectric-resources-3.0-rc2.jar
robolectric-utils-3.0-rc2.jar
shadows-core-3.0-rc2.jar
sqlite4java-0.282.jar
tagsoup-1.2.jar
vtd-xml-2.11.jar
Copy these files to a known location, like say /home/jenkins/robolectric-files/, and then edit your build.gradle with something like this:
afterEvaluate {
    project.tasks.withType(Test) {
        systemProperties.put('robolectric.offline', 'true')
        systemProperties.put('robolectric.dependency.dir', '/home/jenkins/robolectric-files/')
    }
}
Here is how I solved it for org.robolectric:robolectric:3.0
https://gist.github.com/kotucz/60ae91767dc71ab7b444
It downloads the runtime dependencies into the build folder and configures the tests to use them - see where the system properties are set.
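The gist boils down to roughly the following build.gradle sketch (the configuration and task names here are my own, and the dependency list is abridged to the android-all jar; the gist itself lists every runtime artifact):
configurations {
    robolectricRuntime
}

dependencies {
    // one of the runtime jars from the list above; add the others the same way
    robolectricRuntime "org.robolectric:android-all:5.0.0_r2-robolectric-1"
}

task fetchRobolectricDependencies(type: Copy) {
    from configurations.robolectricRuntime
    into "$buildDir/robolectric-dependencies"
}

afterEvaluate {
    project.tasks.withType(Test) {
        dependsOn fetchRobolectricDependencies
        systemProperties.put('robolectric.offline', 'true')
        systemProperties.put('robolectric.dependency.dir', "$buildDir/robolectric-dependencies")
    }
}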
I had this issue too, and found the cause to be org.robolectric.RobolectricTestRunner creating an org.robolectric.MavenCentral object, which declares a Maven repository using an Internet URL (Robolectric 2.3-release). Offline builds will not be able to access that URL.
In my case I'm required to use a Maven repository proxy, so I replaced the URL pointing to http://oss.sonatype.org with that of my local Maven repository proxy. That meant subclassing RobolectricTestRunner as org.robolectric.MyRobolectricTestRunner, creating a custom MavenCentral object for it to use, and overriding the methods where RobolectricTestRunner references its private MAVEN_CENTRAL object.
The source code for RobolectricTestRunner and MavenCentral are available on Robolectric's Github page.
I used Robolectric version 3.0, and the dependency jars were downloaded from my repository instead of Sonatype.
I'm getting an error when running SBT, which I don't know where it originates from:
[info] Set current project to root (in build file:/home/dcs/.sbt/plugins/)
[warn] Potentially incompatible versions specified:
[warn] org.scala-tools.sbt: 0.10.1, 0.10.0
The JAR file for the sbt launcher is version 0.10.1. The error happens even outside projects, such as when running screpl.
How do I fix it?
Do you have a build.sbt or a /home/dcs/.sbt/build.properties that sets a particular sbt.version?
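A build.properties, if present, pins the launcher version with a single property, e.g.:
sbt.version=0.10.1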
If not, you may have a global plugin installed that requires 0.10.0 as mentioned at https://groups.google.com/forum/#!topic/simple-build-tool/YoXd0Tp_cjo/discussion. The solution there was to wipe the global .sbt directory (your /home/dcs/.sbt).
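In other words, something like:
rm -rf /home/dcs/.sbt
sbt recreates the directory with defaults the next time it runs.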