Plugin changing Maven coordinates - Artifactory

I'm trying to create a Maven build that builds an app and deploys it to Artifactory (cloud), running from Bitbucket Pipelines (cloud).
When I build locally, the artifact is stored correctly in the local repository. When the pipeline runs, the repository name gets prepended to the Maven group ID, and the build then fails, I believe because it can't find the artifact after storing it. The artifact does make it into Artifactory, just with the wrong coordinates.
What I'd expect the coordinates to look like:
<groupId>edu.dkist</groupId>
<artifactId>artifactory-hello</artifactId>
<version>1.0-20180319.211356-1</version>
But what I'm getting is
<groupId>dkistdc-snapshots.edu.dkist</groupId>
<artifactId>artifactory-hello</artifactId>
<version>1.0-20180319.211356-1</version>
dkistdc-snapshots is the name of the repository.
I'm pretty much following the provided examples, but the builds are failing. Here's a hello-world app that reproduces the problem.
The POM file I'm using is:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>edu.dkist</groupId>
<artifactId>artifactory-hello</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<distributionManagement>
<repository>
<id>central</id>
<name>dkistdc-releases</name>
<url>https://dkistdc.jfrog.io/dkistdc/java/dkistdc-releases</url>
</repository>
<snapshotRepository>
<id>snapshots</id>
<name>dkistdc-snapshots</name>
<url>https://dkistdc.jfrog.io/dkistdc/java/dkistdc-snapshots</url>
</snapshotRepository>
</distributionManagement>
<pluginRepositories>
<pluginRepository>
<id>jcenter</id>
<url>https://jcenter.bintray.com</url>
</pluginRepository>
</pluginRepositories>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>properties-maven-plugin</artifactId>
<version>1.0-alpha-2</version>
<executions>
<execution>
<phase>generate-resources</phase>
<goals>
<goal>write-project-properties</goal>
</goals>
<configuration>
<outputFile>
${project.build.outputDirectory}/dkist-producer.properties
</outputFile>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<version>2.6.1</version>
<inherited>false</inherited>
<executions>
<execution>
<id>build-info</id>
<goals>
<goal>publish</goal>
</goals>
<configuration>
<deployProperties>
<groupId>${project.groupId}</groupId>
<artifactId>${project.artifactId}</artifactId>
<version>${project.version}</version>
</deployProperties>
<artifactory>
<includeEnvVars>true</includeEnvVars>
<timeoutSec>60</timeoutSec>
<propertiesFile>publish.properties</propertiesFile>
</artifactory>
<publisher>
<contextUrl>${url}</contextUrl>
<username>${username}</username>
<password>${password}</password>
<excludePatterns>*-tests.jar</excludePatterns>
<repoKey>dkistdc-releases</repoKey>
<snapshotRepoKey>dkistdc-snapshots</snapshotRepoKey>
</publisher>
<buildInfo>
<buildName>artifactory-hello</buildName>
<buildNumber>${buildnumber}</buildNumber>
<buildUrl>http://dkist.nso.edu</buildUrl>
</buildInfo>
<licenses>
<autoDiscover>true</autoDiscover>
<includePublishedArtifacts>false</includePublishedArtifacts>
<runChecks>true</runChecks>
<scopes>compile,runtime</scopes>
</licenses>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
The Bitbucket Pipelines script I'm using is:
image: maven:3.3-jdk-8

pipelines:
  default:
    - step:
        caches:
          - maven
        script:
          - chmod 777 deploy-to-artifactory.bash
          - ./deploy-to-artifactory.bash
        deployment: test
And the deploy-to-artifactory.bash script is:
#!/bin/bash -ex
if [ -z "$buildNumber" ]; then
buildNumber=`date +%s`
fi
mvn install -DskipTests=true
mvn test
mvn deploy -Dusername=${ARTIFACTORY_USERNAME} -Dpassword=${ARTIFACTORY_PASSWORD} -Durl=${ARTIFACTORY_CONTEXT_URL}
The errors I'm seeing in the logs are
[INFO] Artifactory Build Info Recorder: Saving Build Info to
'/opt/atlassian/pipelines/agent/build/target/build-info.json'
[INFO] Deploying artifact: https://dkistdc.jfrog.io/dkistdc/java/dkistdc-snapshots/edu/dkist/artifactory-hello/1.0-SNAPSHOT/artifactory-hello-1.0-SNAPSHOT.jar
[INFO] Response received:
[ERROR] Failed while reading the response from: PUT https://dkistdc.jfrog.io/dkistdc/java/dkistdc-snapshots/edu/dkist/artifactory-hello/1.0-SNAPSHOT/artifactory-hello-1.0-SNAPSHOT.jar.sha1;groupId=edu.dkist;artifactId=artifactory-hello;build.timestamp=1521494036708;build.name=dkist-producer;build.number=1521494036708;version=1.0-SNAPSHOT HTTP/1.1
java.io.EOFException: No content to map to Object due to end of input
at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2775)
at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2691)
at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1286)
at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:1337)
at org.jfrog.build.client.ArtifactoryHttpClient.execute(ArtifactoryHttpClient.java:209)
at org.jfrog.build.client.ArtifactoryHttpClient.upload(ArtifactoryHttpClient.java:195)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.uploadChecksums(ArtifactoryBuildInfoClient.java:638)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.deployArtifact(ArtifactoryBuildInfoClient.java:285)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deployArtifacts(BuildDeploymentHelper.java:275)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:98)
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:170)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:64)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:42)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:137)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] Response received:
[ERROR] Failed while reading the response from: PUT https://dkistdc.jfrog.io/dkistdc/java/dkistdc-snapshots/edu/dkist/artifactory-hello/1.0-SNAPSHOT/artifactory-hello-1.0-SNAPSHOT.jar.md5;groupId=edu.dkist;artifactId=artifactory-hello;build.timestamp=1521494036708;build.name=dkist-producer;build.number=1521494036708;version=1.0-SNAPSHOT HTTP/1.1
java.io.EOFException: No content to map to Object due to end of input
at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2775)
at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2691)
at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1286)
at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:1337)
at org.jfrog.build.client.ArtifactoryHttpClient.execute(ArtifactoryHttpClient.java:209)
at org.jfrog.build.client.ArtifactoryHttpClient.upload(ArtifactoryHttpClient.java:195)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.uploadChecksums(ArtifactoryBuildInfoClient.java:653)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.deployArtifact(ArtifactoryBuildInfoClient.java:285)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deployArtifacts(BuildDeploymentHelper.java:275)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:98)
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:170)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:64)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:42)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:137)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] Deploying artifact: https://dkistdc.jfrog.io/dkistdc/java/dkistdc-snapshots/edu/dkist/artifactory-hello/1.0-SNAPSHOT/artifactory-hello-1.0-SNAPSHOT.pom
[INFO] Response received:
[ERROR] Failed while reading the response from: PUT https://dkistdc.jfrog.io/dkistdc/java/dkistdc-snapshots/edu/dkist/artifactory-hello/1.0-SNAPSHOT/artifactory-hello-1.0-SNAPSHOT.pom.sha1;groupId=edu.dkist;artifactId=artifactory-hello;build.timestamp=1521494036708;build.name=dkist-producer;build.number=1521494036708;version=1.0-SNAPSHOT HTTP/1.1
java.io.EOFException: No content to map to Object due to end of input
at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2775)
at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2691)
at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1286)
at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:1337)
at org.jfrog.build.client.ArtifactoryHttpClient.execute(ArtifactoryHttpClient.java:209)
at org.jfrog.build.client.ArtifactoryHttpClient.upload(ArtifactoryHttpClient.java:195)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.uploadChecksums(ArtifactoryBuildInfoClient.java:638)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.deployArtifact(ArtifactoryBuildInfoClient.java:285)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deployArtifacts(BuildDeploymentHelper.java:275)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:98)
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:170)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:64)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:42)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:137)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] Response received:
[ERROR] Failed while reading the response from: PUT https://dkistdc.jfrog.io/dkistdc/java/dkistdc-snapshots/edu/dkist/artifactory-hello/1.0-SNAPSHOT/artifactory-hello-1.0-SNAPSHOT.pom.md5;groupId=edu.dkist;artifactId=artifactory-hello;build.timestamp=1521494036708;build.name=dkist-producer;build.number=1521494036708;version=1.0-SNAPSHOT HTTP/1.1
java.io.EOFException: No content to map to Object due to end of input
at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2775)
at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2691)
at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1286)
at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:1337)
at org.jfrog.build.client.ArtifactoryHttpClient.execute(ArtifactoryHttpClient.java:209)
at org.jfrog.build.client.ArtifactoryHttpClient.upload(ArtifactoryHttpClient.java:195)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.uploadChecksums(ArtifactoryBuildInfoClient.java:653)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.deployArtifact(ArtifactoryBuildInfoClient.java:285)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deployArtifacts(BuildDeploymentHelper.java:275)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:98)
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:170)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:64)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:42)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:137)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] Artifactory Build Info Recorder: Deploying build info ...
[ERROR] Could not build the build-info object.
org.jfrog.build.util.VersionException: There is either an incompatible or no instance of Artifactory at the provided URL.
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.verifyCompatibleArtifactoryVersion(ArtifactoryBuildInfoClient.java:304)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.buildInfoToJsonString(ArtifactoryBuildInfoClient.java:515)
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.sendBuildInfo(ArtifactoryBuildInfoClient.java:194)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:111)
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:170)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:64)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:42)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:137)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[ERROR] org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded() listener has failed:
java.lang.RuntimeException: Error occurred while publishing Build Info to Artifactory.
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:114)
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:170)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:64)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:42)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:137)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: java.io.IOException: Could not publish build-info: There is either an incompatible or no instance of Artifactory at the provided URL.
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.sendBuildInfo(ArtifactoryBuildInfoClient.java:197)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:111)
... 18 more
[ERROR] Internal error: java.lang.RuntimeException: org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded() listener has failed: Error occurred while publishing Build Info to Artifactory. Could not publish build-info: There is either an incompatible or no instance of Artifactory at the provided URL. -> [Help 1]
org.apache.maven.InternalErrorException: Internal error: java.lang.RuntimeException: org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded() listener has failed:
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:121)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: java.lang.RuntimeException: org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded() listener has failed:
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:179)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:64)
at org.apache.maven.lifecycle.internal.DefaultExecutionEventCatapult.fire(DefaultExecutionEventCatapult.java:42)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:137)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
... 11 more
Caused by: java.lang.RuntimeException: Error occurred while publishing Build Info to Artifactory.
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:114)
at org.jfrog.build.extractor.maven.BuildInfoRecorder.sessionEnded(BuildInfoRecorder.java:170)
... 17 more
Caused by: java.io.IOException: Could not publish build-info: There is either an incompatible or no instance of Artifactory at the provided URL.
at org.jfrog.build.extractor.clientConfiguration.client.ArtifactoryBuildInfoClient.sendBuildInfo(ArtifactoryBuildInfoClient.java:197)
at org.jfrog.build.extractor.maven.BuildDeploymentHelper.deploy(BuildDeploymentHelper.java:111)
... 18 more
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/InternalErrorException
Any ideas where I'm going wrong?

Related

Context initialization failed Alfresco

I have Alfresco Community Edition 5.2.
The project has been cloned from a GitHub repository, and after that I cannot run it.
I am getting the following error in the catalina.out file:
2023-02-02 13:16:49,623 ERROR [web.context.ContextLoader] [localhost-startStop-1] Context initialization failed
org.alfresco.error.AlfrescoRuntimeException: 01020049 Not all patches could be applied
at org.alfresco.repo.admin.patch.PatchExecuter.applyOutstandingPatches(PatchExecuter.java:118)
at org.alfresco.repo.admin.patch.PatchExecuter$1.doWork(PatchExecuter.java:131)
at org.alfresco.repo.admin.patch.PatchExecuter$1.doWork(PatchExecuter.java:1)
at org.alfresco.repo.security.authentication.AuthenticationUtil.runAs(AuthenticationUtil.java:555)
at org.alfresco.repo.admin.patch.PatchExecuter.onBootstrap(PatchExecuter.java:135)
at org.springframework.extensions.surf.util.AbstractLifecycleBean.onApplicationEvent(AbstractLifecycleBean.java:56)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEventInternal(SafeApplicationEventMulticaster.java:214)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEvent(SafeApplicationEventMulticaster.java:185)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:334)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:954)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:410)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:306)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:112)
at org.alfresco.web.app.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:70)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4939)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5434)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Feb 02, 2023 1:16:49 PM org.apache.catalina.core.StandardContext listenerStart
SEVERE: Exception sending context initialized event to listener instance of class org.alfresco.web.app.ContextLoaderListener
org.alfresco.error.AlfrescoRuntimeException: 01020049 Not all patches could be applied
at org.alfresco.repo.admin.patch.PatchExecuter.applyOutstandingPatches(PatchExecuter.java:118)
at org.alfresco.repo.admin.patch.PatchExecuter$1.doWork(PatchExecuter.java:131)
at org.alfresco.repo.admin.patch.PatchExecuter$1.doWork(PatchExecuter.java:1)
at org.alfresco.repo.security.authentication.AuthenticationUtil.runAs(AuthenticationUtil.java:555)
at org.alfresco.repo.admin.patch.PatchExecuter.onBootstrap(PatchExecuter.java:135)
at org.springframework.extensions.surf.util.AbstractLifecycleBean.onApplicationEvent(AbstractLifecycleBean.java:56)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEventInternal(SafeApplicationEventMulticaster.java:214)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEvent(SafeApplicationEventMulticaster.java:185)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:334)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:954)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:410)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:306)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:112)
at org.alfresco.web.app.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:70)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4939)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5434)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
and this one also:
2023-02-02 13:16:48,818 ERROR [admin.patch.PatchExecuter] [localhost-startStop-1] 01020048 org.alfresco.error.AlfrescoRuntimeException: 01020047 Bootstrap failed
at org.alfresco.repo.importer.ImporterBootstrap.bootstrap(ImporterBootstrap.java:367)
at com.prodyna.adama.gas.patch.SiteReloadedPatch.applyInternalImpl(SiteReloadedPatch.java:388)
at com.prodyna.adama.gas.patch.SiteReloadedPatch.applyInternal(SiteReloadedPatch.java:269)
at org.alfresco.repo.admin.patch.AbstractPatch$1.execute(AbstractPatch.java:455)
at org.alfresco.repo.admin.patch.AbstractPatch$1.execute(AbstractPatch.java:1)
at org.alfresco.repo.transaction.RetryingTransactionHelper.doInTransaction(RetryingTransactionHelper.java:464)
at org.alfresco.repo.admin.patch.AbstractPatch.applyWithTxns(AbstractPatch.java:462)
at org.alfresco.repo.admin.patch.AbstractPatch.access$0(AbstractPatch.java:442)
at org.alfresco.repo.admin.patch.AbstractPatch$4.doWork(AbstractPatch.java:620)
at org.alfresco.repo.admin.patch.AbstractPatch$4.doWork(AbstractPatch.java:1)
at org.alfresco.repo.security.authentication.AuthenticationUtil.runAs(AuthenticationUtil.java:555)
at org.alfresco.repo.admin.patch.AbstractPatch.apply(AbstractPatch.java:624)
at org.alfresco.repo.admin.patch.AbstractPatch.apply(AbstractPatch.java:586)
at org.alfresco.repo.admin.patch.PatchServiceImpl$PatchWork.applyPatch(PatchServiceImpl.java:564)
at org.alfresco.repo.admin.patch.PatchServiceImpl$PatchWork.execute(PatchServiceImpl.java:477)
at org.alfresco.repo.admin.patch.PatchServiceImpl.applyPatch(PatchServiceImpl.java:332)
at org.alfresco.repo.admin.patch.PatchServiceImpl.applyPatchAndDependencies(PatchServiceImpl.java:309)
at org.alfresco.repo.admin.patch.PatchServiceImpl.applyOutstandingPatches(PatchServiceImpl.java:198)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:96)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:260)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:94)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at com.sun.proxy.$Proxy142.applyOutstandingPatches(Unknown Source)
at org.alfresco.repo.admin.patch.PatchExecuter.applyOutstandingPatches(PatchExecuter.java:83)
at org.alfresco.repo.admin.patch.PatchExecuter$1.doWork(PatchExecuter.java:131)
at org.alfresco.repo.admin.patch.PatchExecuter$1.doWork(PatchExecuter.java:1)
at org.alfresco.repo.security.authentication.AuthenticationUtil.runAs(AuthenticationUtil.java:555)
at org.alfresco.repo.admin.patch.PatchExecuter.onBootstrap(PatchExecuter.java:135)
at org.springframework.extensions.surf.util.AbstractLifecycleBean.onApplicationEvent(AbstractLifecycleBean.java:56)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEventInternal(SafeApplicationEventMulticaster.java:214)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEvent(SafeApplicationEventMulticaster.java:185)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:334)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:954)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:410)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:306)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:112)
at org.alfresco.web.app.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:70)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4939)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5434)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.alfresco.service.cmr.view.ImporterException: Failed to import package at line 487; column 75 due to error: Namespace URI http://www.adama.com/model/adamaStudyNotification/1.0 has not been defined in the Repository dictionary
at org.alfresco.repo.importer.view.ViewParser.parse(ViewParser.java:201)
at org.alfresco.repo.importer.ImporterComponent.parserImport(ImporterComponent.java:430)
at org.alfresco.repo.importer.ImporterComponent.importView(ImporterComponent.java:279)
at org.alfresco.repo.importer.ImporterBootstrap.doImport(ImporterBootstrap.java:485)
at org.alfresco.repo.importer.ImporterBootstrap.access$0(ImporterBootstrap.java:374)
at org.alfresco.repo.importer.ImporterBootstrap$1$1.execute(ImporterBootstrap.java:356)
at org.alfresco.repo.transaction.RetryingTransactionHelper.doInTransaction(RetryingTransactionHelper.java:464)
at org.alfresco.repo.importer.ImporterBootstrap$1.doWork(ImporterBootstrap.java:360)
at org.alfresco.repo.security.authentication.AuthenticationUtil.runAs(AuthenticationUtil.java:555)
at org.alfresco.repo.importer.ImporterBootstrap.bootstrap(ImporterBootstrap.java:363)
... 54 more
Caused by: org.alfresco.service.cmr.view.ImporterException: Namespace URI http://www.adama.com/model/adamaStudyNotification/1.0 has not been defined in the Repository dictionary
at org.alfresco.repo.importer.view.ViewParser.getName(ViewParser.java:995)
at org.alfresco.repo.importer.view.ViewParser.processStartElement(ViewParser.java:226)
at org.alfresco.repo.importer.view.ViewParser.parse(ViewParser.java:183)
... 63 more
A while ago I added new sites to the Alfresco project and bootstrapped them, but now I do not know what the problem could be.
It looks like you have a problem with customized code:
com.prodyna.adama.gas.patch.SiteReloadedPatch.applyInternalImpl
is not an official Alfresco patch.
Failed to import package at line 487; column 75 due to error: Namespace URI http://www.adama.com/model/adamaStudyNotification/1.0 has not been defined in the Repository dictionary
seems to be at least one of your issues: the referenced model http://www.adama.com/model/adamaStudyNotification/1.0 is not defined. You may have forgotten to deploy a dependent module that contains the model XML.
If your intention is to run Alfresco without that module, you would still need to deploy at least the document model; otherwise Alfresco will not be able to read the metadata of existing documents that have the types or aspects defined in that model.
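For reference, custom content models are normally registered through a dictionary bootstrap bean in the module's Spring context. The snippet below is only a minimal sketch; the bean id, module folder, and model file name are hypothetical and need to match your actual module layout:
<!-- module-context.xml of the module that ships the model (names are placeholders) -->
<bean id="adama.dictionaryBootstrap"
      parent="dictionaryModelBootstrap"
      depends-on="dictionaryBootstrap">
  <property name="models">
    <list>
      <!-- this model XML must declare the namespace
           http://www.adama.com/model/adamaStudyNotification/1.0 -->
      <value>alfresco/module/adama-gas/model/adamaStudyNotificationModel.xml</value>
    </list>
  </property>
</bean>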

UnsupportedClassVersionError occurs when executing the jar

I have tried to package my project into a jar using the default Spring Boot plugin. That is, my build tag looks as simple as this:
<build>
  <plugins>
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
    </plugin>
  </plugins>
</build>
As expected, I get my .jar in the target folder (by executing mvn install). But when I try to execute the jar using
java -jar <filename>.jar
the problem arises. Here is the stack trace:
> C:\Users\Misha\IdeaProjects\discoveery-service\target>java -jar discoveery-service-0.0.1-SNAPSHOT.jar
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/example/discoveeryservice/DiscoveryServiceApplication has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.springframework.boot.loader.LaunchedURLClassLoader.loadClass(LaunchedURLClassLoader.java:151)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:46)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:107)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:88)
What is this? I have not run into this problem before.
Solved it: the only thing I changed is the Java version Maven packages for, from 11 to 8.
<properties>
  <java.version>8</java.version>
</properties>
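For context, class file version 55.0 corresponds to Java 11 and 52.0 to Java 8, so the alternative fix is simply to run the jar with a Java 11 (or newer) runtime instead of downgrading. If a project does not inherit from the Spring Boot parent POM (where java.version is mapped for you), the equivalent way to pin the compiler level is with the standard Maven compiler properties, roughly like this:
<properties>
  <!-- equivalent to <java.version>8</java.version> under the Spring Boot parent -->
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
</properties>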

msf4j JWTAccessTokenBuilder throws ClassNotFoundException

I am running WSO2 IS version 5.7 and tried to implement a TokenGenerator based on the msf4j JWTAccessTokenBuilder.
My identity.xml includes
<IdentityOAuthTokenGenerator>com.wso2.jwt.token.builder.JWTAccessTokenBuilder</IdentityOAuthTokenGenerator>
<AccessTokenValueGenerator>org.wso2.carbon.identity.oauth.tokenvaluegenerator.SHA256Generator</AccessTokenValueGenerator>
When I log in, an exception is thrown:
[2019-05-16 18:27:18,163] ERROR {org.apache.catalina.core.StandardWrapperValve} - Servlet.service() for servlet [OAuth2Endpoints] in context with path [/oauth2] threw exception
java.lang.RuntimeException: org.apache.cxf.interceptor.Fault: com/nimbusds/jwt/ReadOnlyJWTClaimsSet
at org.apache.cxf.interceptor.AbstractFaultChainInitiatorObserver.onMessage(AbstractFaultChainInitiatorObserver.java:116)
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:336)
at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)
at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:249)
at org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:248)
...
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.cxf.interceptor.Fault: com/nimbusds/jwt/ReadOnlyJWTClaimsSet
at org.apache.cxf.service.invoker.AbstractInvoker.createFault(AbstractInvoker.java:170)
at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:136)
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:204)
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:101)
at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:58)
at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:94)
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:272)
... 49 more
Caused by: java.lang.NoClassDefFoundError: com/nimbusds/jwt/ReadOnlyJWTClaimsSet
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
...
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.cxf.service.invoker.AbstractInvoker.performInvocation(AbstractInvoker.java:188)
at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:104)
... 54 more
Caused by: java.lang.ClassNotFoundException: com.nimbusds.jwt.ReadOnlyJWTClaimsSet cannot be found by JWTAccessTokenBuilder_2.7.4.SNAPSHOT
Can anybody give me a hint?
As per the error com.nimbusds.jwt.ReadOnlyJWTClaimsSet cannot be found by JWTAccessTokenBuilder_2.7.4.SNAPSHOT, this is caused by OSGi class loading.
In the maven-bundle-plugin configuration in your POM, make sure the Import-Package instruction includes com.nimbusds.jwt with the correct version range.
Alternatively, the quick fix is to add <DynamicImport-Package>*</DynamicImport-Package>. A sketch of both options follows.
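For illustration only, a maven-bundle-plugin configuration along these lines might look like the snippet below; the version range is an assumption and should be aligned with the nimbus-jose-jwt bundle that ships with your WSO2 IS release:
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- placeholder version range; match it to the nimbus-jose-jwt version in the IS runtime -->
      <Import-Package>
        com.nimbusds.jwt;version="[2.26,8)",
        *
      </Import-Package>
      <!-- or, as the quick fix, resolve missing packages dynamically at runtime: -->
      <!-- <DynamicImport-Package>*</DynamicImport-Package> -->
    </instructions>
  </configuration>
</plugin>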
I found the problem:
I was using an old version of org.wso2.carbon.identity.inbound.auth.oauth2.
I updated the pom.xml to:
<dependency>
  <groupId>org.wso2.carbon.identity.inbound.auth.oauth2</groupId>
  <artifactId>org.wso2.carbon.identity.oauth</artifactId>
  <version>6.0.172</version>
  <scope>provided</scope>
</dependency>
The dependency version is from https://mvnrepository.com, and I had to update my code to use the version of com.nimbusds.jwt used by WSO2 IS.
Finally, in the service provider settings you must select the Token Issuer.

How to execute Spark 2 action using Oozie 4.3 in AWS EMR

I'm using AWS EMR 5.7.0 with Oozie 4.3.0 and Spark 2.1.1.
I have a simple Spark program written in Scala. It works fine when executed from the shell using spark-submit.
But when I try to execute this program using an Oozie Spark action, I run into errors.
Job.properties:
nameNode=hdfs://ip-xx-xx-xx-xx.ec2.internal:8020
jobTracker=ip-xx-xx-xx-xx.ec2.internal:8032
master=local
oozie.use.system.libpath=true
oozie.wf.application.path=hdfs://ip-xx-xx-xx-xx.ec2.internal:8020/test-artifacts/
oozie.action.sharelib.for.spark = spark2
Workflow.xml:
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="Test program">
<start to="spark-node" />
<action name="spark-node">
<spark xmlns="uri:oozie:spark-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<master>${master}</master>
<name>Spark on Oozie - test job</name>
<class>TestPackage.TestObj</class>
<jar>/home/hadoop/oozie-test.jar</jar>
</spark>
<ok to="end" />
<error to="fail" />
</action>
<kill name="fail">
<message>Workflow failed, error message]</message>
</kill>
<end name="end" />
</workflow-app>
workflow.xml is kept in HDFS and job.properties is on the master node.
When the Oozie job is executed with the command "oozie job -oozie http://ip-xx-xx-xx-xx.ec2.internal:11000/oozie -config job.properties -run", a MapReduce launcher job starts, but it fails with errors without ever starting the Spark job.
1) For Spark master=yarn-cluster and mode=cluster, I get the following exception:
Log file: /mnt/yarn/usercache/hadoop/appcache/application_1502719828530_0011/container_1502719828530_0011_01_000001/spark-oozie-job_1502719828530_0011.log not present. Therefore no Hadoop job IDs found.
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, null
java.lang.NullPointerException
at java.io.File.<init>(File.java:277)
at org.apache.spark.deploy.yarn.Client.addDistributedUri$1(Client.scala:416)
at org.apache.spark.deploy.yarn.Client.org$apache$spark$deploy$yarn$Client$$distribute$1(Client.scala:454)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$11$$anonfun$apply$6.apply(Client.scala:580)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$11$$anonfun$apply$6.apply(Client.scala:579)
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$11.apply(Client.scala:579)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$11.apply(Client.scala:578)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:578)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:814)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1091)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1150)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:340)
at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:259)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:60)
at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:80)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:455)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:344)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2) For master=yarn, master=local[*], master=local[1], or master=local with mode=client, I get the error below:
Log file: /mnt/yarn/usercache/hadoop/appcache/application_1502719828530_0013/container_1502719828530_0013_01_000001/spark-oozie-job_1502719828530_0013.log not present. Therefore no Hadoop job IDs found.
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, No FileSystem for scheme: org.apache.spark
java.io.IOException: No FileSystem for scheme: org.apache.spark
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2708)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2715)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:93)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2751)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2733)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:377)
at org.apache.spark.deploy.SparkSubmit$.downloadFile(SparkSubmit.scala:865)
at org.apache.spark.deploy.SparkSubmit$$anonfun$downloadFileList$2.apply(SparkSubmit.scala:850)
at org.apache.spark.deploy.SparkSubmit$$anonfun$downloadFileList$2.apply(SparkSubmit.scala:850)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
at org.apache.spark.deploy.SparkSubmit$.downloadFileList(SparkSubmit.scala:850)
at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$2.apply(SparkSubmit.scala:317)
at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$2.apply(SparkSubmit.scala:317)
at scala.Option.map(Option.scala:146)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:317)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:340)
at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:259)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:60)
at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:80)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:455)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:344)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
===============================
From https://issues.apache.org/jira/plugins/servlet/mobile#issue/OOZIE-2767, it seems the spark2 action is not yet supported by Oozie.
But based on https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_spark-component-guide/content/ch_oozie-spark-action.html, there seem to be workarounds.
Along with the Hortonworks link, I've also followed all the steps mentioned at https://aws.amazon.com/blogs/big-data/use-apache-oozie-workflows-to-automate-apache-spark-jobs-and-more-on-amazon-emr/
But no luck so far.
I couldn't find any documentation that confirms whether Oozie + Spark 2 is supported or not.
If this has worked for anyone, please provide detailed steps on how to get Oozie + Spark 2 to work on AWS EMR.

JavaFX listen to hotkeys

Does anyone know how I can listen to global hotkeys in JavaFX?
I found a library for plain Java, but I can't get it to work with JavaFX.
It works in a normal Java project, but when I use it in a JavaFX project I get the following error:
jfx-project-run:
Executing D:\Administrator\Documents\NetBeansProjects\MPDClient\dist\run1477960237\MPDClient.jar using platform C:\Program Files\Java\jdk1.7.0_51\jre/bin/java
Exception in Application start method
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.javafx.main.Main.launchApp(Main.java:698)
at com.javafx.main.Main.main(Main.java:871)
Caused by: java.lang.RuntimeException: Exception in Application start method
at com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:403)
at com.sun.javafx.application.LauncherImpl.access$000(LauncherImpl.java:47)
at com.sun.javafx.application.LauncherImpl$1.run(LauncherImpl.java:115)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.NoClassDefFoundError: com/sun/jna/Platform
at com.tulskiy.keymaster.common.Provider.getCurrentProvider(Provider.java:52)
at mpdclient.FXMLDocumentController.initialize(FXMLDocumentController.java:173)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2193)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2069)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2830)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2809)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2795)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2782)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2771)
at mpdclient.MPDClient.start(MPDClient.java:26)
at com.sun.javafx.application.LauncherImpl$5.run(LauncherImpl.java:319)
at com.sun.javafx.application.PlatformImpl$5.run(PlatformImpl.java:219)
at com.sun.javafx.application.PlatformImpl$4$1.run(PlatformImpl.java:182)
at com.sun.javafx.application.PlatformImpl$4$1.run(PlatformImpl.java:179)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.application.PlatformImpl$4.run(PlatformImpl.java:179)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:76)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.access$100(WinApplication.java:17)
at com.sun.glass.ui.win.WinApplication$3$1.run(WinApplication.java:67)
... 1 more
Caused by: java.lang.ClassNotFoundException: com.sun.jna.Platform
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 21 more
Link to the library
Thanks in advance.
You need to add the JNA library to your dependencies:
<dependency>
  <groupId>net.java.dev.jna</groupId>
  <artifactId>jna</artifactId>
  <version>4.1.0</version>
</dependency>
Add this to your pom.xml.
