Linux script to restart a jar if it crashes

Is it possible to create a script that restarts a jar application if it crashes?
One possible solution would be a Linux service that starts the application automatically when the system reboots or the application crashes.
Are there other possibilities, for example a plain script?

#!/bin/bash
# restart the jar whenever it exits
while true; do
    java -jar myjar.jar
    sleep 5   # pause so we don't spin if myjar always fails immediately
done
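
For the service route mentioned in the question, a minimal systemd unit is one common approach on systemd-based distributions. This is only a sketch; the jar path, java path, and service name are assumptions:

[Unit]
Description=myjar service
After=network.target

[Service]
ExecStart=/usr/bin/java -jar /opt/myapp/myjar.jar
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target

Saved as /etc/systemd/system/myjar.service and enabled with sudo systemctl enable --now myjar.service, the Restart=always line gives the same crash-restart behaviour as the loop above.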

Related

Run a jar file in the background on a server after closing the PuTTY session

I have tried to run a Spring Boot jar file using PuTTY, but the problem is that after the PuTTY session was closed the service stopped.
Then I started the jar with the following command, and it works fine:
nohup java -jar /web/server.jar
You should avoid using nohup, as it merely disassociates the process from your terminal. Instead, use the following command to run your process as a service.
sudo ln -s /path/to/your-spring-boot-app.jar /etc/init.d/your-spring-boot-app
This command creates a symbolic link to your jar file, which you can then run as a service with sudo service your-spring-boot-app start. Note that this only works if the jar was built as a fully executable Spring Boot jar. Console output will be written to /var/log/your-spring-boot-app.log.
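For example, the full sequence might look like this (the jar path matches the nohup command above; everything else is illustrative):

sudo ln -s /web/server.jar /etc/init.d/your-spring-boot-app
sudo service your-spring-boot-app start
tail -f /var/log/your-spring-boot-app.log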
Moreover, you can configure application.properties to write console logs to a location of your choice using logging.path=path-to-your-log-directory or logging.file=path-to-your-log-file.txt. It may be worth noting that logging.file takes priority over logging.path.
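A minimal application.properties along those lines (the property names are as above; the paths are placeholders):

# write the log to a specific file; if both properties are set, logging.file wins
logging.file=/var/log/your-spring-boot-app/app.log
# or just name a directory and Spring Boot writes spring.log inside it
# logging.path=/var/log/your-spring-boot-app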

Oozie executing hadoop commands in shell action as yarn

Environment : Hortonworks Sandbox HDP 2.2.4
Issue: unable to run the hadoop commands in shell scripts as the root user. The Oozie job is triggered as root, but when hadoop fs or any MapReduce command is executed, it runs as the yarn user. Since yarn does not have access to some of the file system, the shell script fails. What changes do I need to make so the hadoop commands run as root?
It is expected behaviour to see yarn whenever shell actions are invoked in Oozie; the yarn user is the one that runs shell actions. One thing you can do is grant yarn access permissions on the file system.
This is more of a shell script question than an Oozie question. In theory, an Oozie job runs as the user who submits it. In a Kerberized environment, that is whoever signed in with a keytab/password.
Once the job is running on the Hadoop cluster, you can use "sudo" within your shell script to change the user a command runs as. In that case you also want to make sure the yarn user is allowed to sudo to the commands you want to execute.
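A sketch of a sudoers entry along those lines (the binary path is an assumption; edit via visudo):

yarn ALL=(root) NOPASSWD: /usr/bin/hadoop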
Alternatively, add the property below to the workflow:
HADOOP_USER_NAME=${wf:user()}
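In a shell action that would look roughly like this (the action name and script name are placeholders):

<action name="shell-node">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>myscript.sh</exec>
        <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
        <file>myscript.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
</action>

With this, the Hadoop commands in the script run under the name of the submitting user instead of yarn.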

Qt GUI application not starting automatically on startup in Ubuntu 14.04

I have two Qt applications: a non-GUI one called "App1" and a GUI one called "App2". I need "App1" to start when my Ubuntu 14.04 machine boots.
"App1" runs a shell script called "myshfile.sh", and in that script I start "App2" with /opt/myprojectname/App2 &.
To make this happen I made a script called "myupstart.sh" containing /opt/myprojectname/App1 &, copied it to /etc/init.d/, and gave it +x permission so that "App1" starts at boot.
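In outline, the two scripts look like this (reconstructed from the description above, only to make the setup concrete):

#!/bin/sh
# /etc/init.d/myupstart.sh - run at boot, launches the non-GUI app
/opt/myprojectname/App1 &

#!/bin/sh
# myshfile.sh - run by App1, launches the GUI app
/opt/myprojectname/App2 &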
When I restart the machine, "App1" (the Qt non-GUI app) starts automatically and runs "myshfile.sh" as expected. Up to this point everything works fine; the problem starts here.
As mentioned above, "myshfile.sh" starts "App2" with /opt/myprojectname/App2 &, but "App2" (the Qt GUI app) does not start.
When I simply run /opt/myprojectname/App1 from a terminal, everything works: it calls "myshfile.sh", and "myshfile.sh" also starts "App2".
So what I found is that when everything is started manually from a terminal it all works, but via the script /etc/init.d/myupstart.sh only the Qt non-GUI application starts; the Qt GUI application does not start on startup.
Please suggest where I am going wrong.
Thanks.

How to get the daemon Rserve running as worker dyno on Heroku

This question is an obscure problem - sorry for the length. I'm trying to deploy an app to Heroku. The app runs Rserve - a daemon for the R language, used for running statistical reports. In principle this should be no more difficult than getting any daemon, such as memcached, to run on Heroku.
On Mac OS X I just start the daemon on the command line and forget it - everything works fine. I'm interfacing with Rserve from node.js, using https://github.com/albertosantini/node-rio (not a factor here, though).
But in deploying to Heroku I'm not having much luck. I'm using a multipack of R and node. Installation runs fine, all build steps exit okay, and R starts fine.
Now comes the job of starting the Rserve daemon on the worker dyno.
My Procfile looks like this:
web: node server.js
worker: R CMD Rserve --no-save
When I run it, I get the following in the logs:
Rserv started in daemon mode.
heroku[worker.1]: State changed from starting to crashed
The Rserve() config docs are here: http://www.rforge.net/Rserve/doc.html. I'm no expert at configuring it, but perhaps there is something in there that I should be doing for it to work in this environment?
An oddity is that you can run this without error from the Heroku run console, but (see below) it does not actually seem to be running when I try to access it from node.js:
heroku run R CMD Rserve
[Previously saved workspace restored]
Rserv started in daemon mode.
>
In node.js (heroku run node), I try testing it thus:
var rio = require('rio');
rio.evaluate("pi / 2 * 2");
which gives the error "Rserve call failed".
This leads me to think something is fundamentally wrong with what I am trying to do or how I am trying to do it.
Rserve runs as a daemon by default: the launching process forks and exits, Heroku sees the dyno's main process die, and marks the dyno crashed. So use a script to execute it so that it runs "in process".
E.g.
# example R script for starting Rserve in-process
require('Rserve')
# Heroku supplies the port to listen on in the PORT environment variable
port <- as.integer(Sys.getenv('PORT'))
# run Rserve in process instead of daemonizing
run.Rserve(debug = FALSE, port, args = NULL, config.file = "rserve.conf")
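The referenced rserve.conf can stay minimal; for example (the directive is from the Rserve docs linked above, whether you need it is an assumption):

# rserve.conf - allow non-local connections
remote enable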
And then your Procfile will have an entry as follows:
rserve: R -f rserve.r --gui-none --no-save
So I tried a dozen ways to get it started on a worker dyno, but all of them would crash. I never got to the bottom of all the environment issues - I am not very expert at Unix. However... I did get it to work by spawning a child process that runs Rserve at the end of my server.js initialization script on the web dyno. It works.
var childProcess = require('child_process');
childProcess.exec('R CMD Rserve --no-save', function (error, stdout, stderr) {});
My plan is to implement it this way in the worker process and use Web Workers to communicate between the separate environments.

netsh mbn show interfaces results in command not found on Win7 64Bit

I'm trying to run "netsh mbn show interfaces" from a .bat or .jar file on a Windows 7 64-bit system, but every time I run the file it results in "The following command was not found: mbn show interfaces".
When I run that same command in a cmd.exe prompt, the result is correct and as expected.
When we run netsh /? we see "mbn" among the available commands. When we print that same output from a .bat or .jar, the "mbn" command is missing from netsh's available commands.
Anybody know what's happening?
We know there are two netsh.exe files, one in System32 and one in SysWOW64.
All help is appreciated.
We solved the problem:
When we ran "netsh mbn show interfaces" by hand, cmd was running as a 64-bit process.
When the command is run from an application that is 32-bit, cmd runs as a 32-bit process, and the mbn context is not available in the 32-bit netsh.
On 64-bit Windows there is a behind-the-scenes mechanism: file system redirection.
Meaning: when a 32-bit process asks for %windir%\System32, it is transparently redirected to %windir%\SysWOW64 and gets the 32-bit equivalent of the executable.
The workaround is to use a C# program (or anything else that lets you override the file system redirection):
// P/Invoke declarations; requires using System.Runtime.InteropServices;
[DllImport("kernel32.dll", SetLastError = true)]
static extern bool Wow64DisableWow64FsRedirection(ref IntPtr ptr);
[DllImport("kernel32.dll", SetLastError = true)]
static extern bool Wow64RevertWow64FsRedirection(IntPtr ptr);

IntPtr ptr = IntPtr.Zero;
Wow64DisableWow64FsRedirection(ref ptr);
// -- your process information here --
Wow64RevertWow64FsRedirection(ptr); // always revert the redirection
and that solved it!
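For completeness, a 32-bit process on 64-bit Windows can also reach the 64-bit netsh.exe directly through the Sysnative alias - a different workaround than the one above, available on Windows Vista and later - e.g. from a .bat file:

%windir%\Sysnative\netsh.exe mbn show interfaces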
