How can I use HPROF for the JARs present in Karaf's deploy folder? For standalone JARs, HPROF is used as follows:
java -agentlib:hprof[=options] -jar MyApplication.jar
But how can I provide the HPROF option for the JARs inside Karaf, given that those JARs are started by Karaf through their blueprints?
Set DEFAULT_JAVA_OPTS=%JAVA_MODE% -Xrunhprof:cpu=samples,depth=10,thread=y,file=hprof.txt in Karaf's startup script. The dump file will then be created when the JVM exits.
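A minimal sketch of where this goes, assuming a stock Windows bin\karaf.bat (on Unix, set JAVA_OPTS in bin/setenv instead). Note that cpu=samples and cpu=times are mutually exclusive, so pick only one:

rem bin\karaf.bat -- append the HPROF agent to the default JVM options
set DEFAULT_JAVA_OPTS=%JAVA_MODE% -Xrunhprof:cpu=samples,depth=10,thread=y,file=hprof.txt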
We are using Corda Version 4 for our application.
We understand that the command gradlew.bat deployNodes creates the following JARs:
CorDapp (contracts, states, flows)
Corda platform
Dependencies
Whenever any change is made to the contract/states/flows code, we have to run gradlew.bat deployNodes again. Because of this, the "Corda platform" and "Dependencies" JARs always get recreated, which increases development time.
Does the Corda platform provide an alternative way to create only the "CorDapp" JAR file and not the remaining ones?
You can use the following command to generate only the JAR files:
./gradlew build
This will generate the JAR files in your build/libs folder.
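If the goal is just to refresh the CorDapp in already-deployed nodes, a common shortcut is to build only the CorDapp JAR and copy it into each node's cordapps folder. A sketch, assuming the default deployNodes output layout and hypothetical artifact/node names:

./gradlew jar
cp build/libs/my-cordapp-0.1.jar build/nodes/PartyA/cordapps/

Restart the node afterwards so it picks up the new JAR.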
I am new to Spring Batch and want to run a job from the command line using the CommandLineJobRunner class, so I copied the generated JAR file to my Desktop and ran the following command:
java -cp spring-batch-example.jar org.springframework.batch.core.launch.support.CommandLineJobRunner classpath:/jobs/file-import-job.xml simpleFileImportJob
which gives this error: "Could not find or load main class org.springframework.batch.core.launch.support.CommandLineJobRunner".
I think I need to fix the classpath, but I don't know how to do it.
You need to add the Spring Batch JARs to the classpath too, something like:
java -cp spring-batch-example.jar:lib/* org.springframework.batch.core.launch.support.CommandLineJobRunner classpath:/jobs/file-import-job.xml simpleFileImportJob
where lib contains the Spring Batch JARs and their dependencies. Note that on Windows you need to use ';' instead of ':' to separate classpath entries.
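For example, the equivalent Windows command would be:

java -cp "spring-batch-example.jar;lib\*" org.springframework.batch.core.launch.support.CommandLineJobRunner classpath:/jobs/file-import-job.xml simpleFileImportJob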
I recommend using the Maven Shade plugin (or a similar plugin) to create an uber-jar, or using Spring Boot, which will do this for you. In both cases, you would be able to run your job with:
java -jar spring-batch-example.jar
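For illustration, a minimal Shade plugin configuration might look like this (a sketch; the plugin version is an assumption, and the Main-Class is set to CommandLineJobRunner so the uber-jar is directly runnable):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.4.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- make CommandLineJobRunner the Main-Class of the uber-jar -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>org.springframework.batch.core.launch.support.CommandLineJobRunner</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>

With that in place, java -jar spring-batch-example.jar classpath:/jobs/file-import-job.xml simpleFileImportJob runs the job without a separate -cp.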
I'm trying to run a Spark streaming job on the DC/OS platform and I've got an issue with Kafka packages. When I try to include the Kafka library and its dependencies (a JAR file downloaded from Maven, added to Artifactory, and read from there) using the --jars option as follows:
dcos spark run --submit-args="--jars https://../../../spark-streaming_2.11-2.2.1.jar --conf spark.executor.memory=2g --py-files=https://../../../libs.zip,https://../../../test.py etc"
it seems that libs.zip and test.py are read correctly, but the .jar file is omitted.
Any idea why? Is there any workaround for this kind of issue?
Thanks in advance for any help!
I'm not sure why the dcos spark run command doesn't support the --jars option, but you can use the spark.mesos.uris property to download artifacts into the working directory of the Spark driver and executors.
I'm not sure how your Python-based Spark job is going to use the JARs, but you may need to set the spark.executor.extraClassPath and spark.driver.extraClassPath configuration properties as well.
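A sketch of the combined submit args (spark.mesos.uris, spark.executor.extraClassPath and spark.driver.extraClassPath are standard Spark properties; the elided URLs are the ones from the question, and the JAR is referenced by bare file name because spark.mesos.uris downloads it into the sandbox working directory):

dcos spark run --submit-args="--conf spark.mesos.uris=https://../../../spark-streaming_2.11-2.2.1.jar --conf spark.executor.extraClassPath=spark-streaming_2.11-2.2.1.jar --conf spark.driver.extraClassPath=spark-streaming_2.11-2.2.1.jar --conf spark.executor.memory=2g --py-files=https://../../../libs.zip https://../../../test.py"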
I am running JBoss Fuse 6.2.0.
I built a small Camel application that just writes to the log every 5 seconds.
I built it and installed the SNAPSHOT bundle jar in my local Maven repository.
In the Karaf console I did the following:
fabric:profile-create --parent feature-camel logdemo
fabric:profile-edit --bundle mvn:com.company.project/logdemo logdemo
fabric:container-create-child --profile logdemo root child1
The Camel application now worked as intended.
I then made a small change to the application, rebuilt it and installed the new SNAPSHOT bundle jar in my local Maven repo.
In the Karaf console I then did the following to get Karaf to load the new jar:
fabric:profile-refresh logdemo
But the loaded application is still the old version.
How do I get Karaf to look for the updated JAR in my local Maven repo? It seems like it has some internal cache it looks in instead.
Note: We're not using Maven to build the application, so all answers about using Maven plugins like the fabric8 plugin will be rejected.
You should use the fabric:watch * command for that. It will update all containers that run a SNAPSHOT version of an artifact that has been updated in the local Maven repository. If you want only a specific container to watch for updates, use dev:watch * in the shell of that container.
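For example, from the root container's console:

fabric:watch *

or, to watch from just one container, connect to it first (fabric:container-connect is the usual way to open a shell on a child container, using the container name from the question):

fabric:container-connect child1
dev:watch *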
See http://fabric8.io/gitbook/developer.html
I have my Java project in an Hg (Mercurial) repository. How can I use this code, or convert it to a Git repo, to be able to deploy on cloudControl?
Is there an option for me to deploy a .war file instead of uploading and compiling the entire file structure? If so, would it matter whether the WAR file was built using a Tomcat server or a GlassFish server?
Your help in answering these questions would be much appreciated. Thanks in advance.
Best regards.
If you want to keep all Mercurial branches, tags, and history, use fast-export. Otherwise, just create a Git repository from scratch:
git init; git add .; git commit -am "Initial commit"
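If you do want to keep the history, the hg-fast-export script is typically used like this (a sketch; the repository paths are placeholders):

git init my-project-git
cd my-project-git
hg-fast-export.sh -r /path/to/hg-repo
git checkout HEAD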
Deploying a prebuilt WAR file is not the recommended way. You should push to the repository (via git or cctrlapp) and let the cloudControl platform take care of the build process. That said, you can still use Maven to download the WAR file. You can find some examples here.
Apart from this, you have to provide an embedded Jetty or Tomcat runner and specify a start command in the Procfile:
Jetty:
web: java -jar target/dependency/jetty-runner.jar --port $PORT target/YOUR_WAR.war
Tomcat:
web: java -jar target/dependency/webapp-runner.jar --port $PORT target/YOUR_WAR.war
Keep in mind that the WAR file should be built independently of any application server.
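For completeness, the runner JAR referenced in the Procfile is typically copied to target/dependency with the Maven dependency plugin; a sketch for jetty-runner (the version shown is an assumption):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>copy</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <!-- copied to target/dependency/jetty-runner.jar by default -->
                    <artifactItem>
                        <groupId>org.eclipse.jetty</groupId>
                        <artifactId>jetty-runner</artifactId>
                        <version>9.2.11.v20150529</version>
                        <destFileName>jetty-runner.jar</destFileName>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>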