I was wondering whether R has a way (keyboard shortcut, command, or some other alternative) to kill the current processing thread. For instance, using RStudio as the IDE, you get a little stop button to kill the current processing thread(s), but most of the time it ends up in a prompt to end the session and restart the environment. Suggestions?
Ctrl + C <- interrupts the current computation without exiting the R session
and not
Ctrl + Z <- which suspends the R process and drops you out of the R session
I am writing an R script on a Linux system using RStudio. At some point in the script, I need to make a system call to a command that will download a few thousand files listed in a text file:
down.command <- paste0("parallel --gnu -a links.txt wget")
system(down.command)
However, this command takes a while to run (a couple of hours), and the R prompt stays locked while the command runs. I would like to keep using R while the command runs in the background.
I tried to use nohup like this:
down.command <- paste0("nohup parallel --gnu -a links.txt wget > ~/down.log 2>&1")
system(down.command)
but the R prompt still gets "locked", waiting for the command to finish.
Is there any way to circumvent this? Is there a way to submit system commands from R and keep them running in the background?
Using ‘processx’, here’s how to create a new process that redirects both stdout and stderr to the same file:
args = c('--gnu', '-a', 'links.txt', 'wget')
p = processx::process$new('parallel', args, stdout = '~/down.log', stderr = '2>&1')
This launches the process and immediately returns control to the R session. You can then interact with the running process via the p object: notably, you can send signals to it, query its status (e.g. is_alive()), and synchronously wait for its completion (optionally with a timeout after which it gets killed):
p$wait()
result = p$get_exit_status()
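For instance, a minimal sketch of a timed wait (the one-hour limit is just an illustration; processx expects the timeout in milliseconds):
if (p$is_alive()) {
  p$wait(timeout = 60 * 60 * 1000)  # wait at most one hour
  if (p$is_alive()) p$kill()        # still running after the timeout? kill it
}
result = p$get_exit_status()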
Based on the comment by @KonradRudolph, I became aware of the processx R package, which very smartly deals with system process submissions from within R.
All I had to do was:
library(processx)
down.command <- c("parallel","--gnu", "-a", "links.txt", "wget", ">", "~/down.log", "2>&1")
processx::process$new("nohup", down.comm, cleanup=FALSE)
As simple as that, and very effective.
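If you want to check on the downloads later, you can also keep the handle that process$new() returns (a sketch; the variable name p is just illustrative):
p <- processx::process$new("nohup", down.command, cleanup = FALSE)
p$is_alive()   # TRUE while the downloads are still running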
I have the following situation: In my R script I start a third-party program with system2. The program is called lots of times, and unfortunately it is not very stable and crashes sometimes. If this happens, control is not returned to R until I kill the program via Task Manager manually.
What I would like to do: If the program has not returned control after 10 minutes, kill it automatically.
I could of course wrap the program in C++, Java or similar, implement this functionality in the wrapper, and call the wrapper from R. Quite possibly I could also utilize Rcpp.
However, I wonder if there is any way to achieve this in R directly?
Btw: I am on Windows 7.
Thanks for any hints!
If you are on a Unix-like system, you can add the Unix command timeout to your system call. Example:
# system command that times out
> exitcode = system('timeout 1 sleep 20')
> exitcode
[1] 124
# system command that does not time out
> exitcode = system('timeout 2 sleep 1')
> exitcode
[1] 0
system() returns the exit status of the process, so you can check whether it is 0 (OK) or 124 (timed out).
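Applied to the original question, a rough sketch might look like this (the program name and its input file are hypothetical placeholders; 600 seconds = 10 minutes):
# kill the third-party program if it has not returned after 10 minutes
exitcode = system('timeout 600 ./unstable_program input.dat')
if (exitcode == 124) {
  message('Program timed out after 10 minutes and was killed')
} else if (exitcode != 0) {
  message('Program exited with error status ', exitcode)
}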
I am running a long job using a cluster of computers. On occasion, the process is interrupted and I have to restart it manually. There is considerable downtime when the interruptions occur overnight. I was wondering if there is a way to run a supervisor script in Julia that monitors whether the job is running in another instance of Julia. It would restart the process if it is interrupted and would terminate once the job is finished. Unfortunately, I do not know exactly how to check that the process is running or how to restart the process. Here is the rough idea I have:
state = true
while state == true
    # check every minute
    sleep(60)
    data = readcsv("outputfile.csv")
    # read file to check if process is finished
    if size(data, 1) < N
        # some function to check if the process is running
        if isrunning() == true
            # Do nothing. Keep running.
        else
            # some function to spawn new instance of julia
            # run the code
            include("myscript.jl")
        end
    else
        # Job finished, exit while loop
        state = false
    end
end
Right tool for the right job.
Use your command-line shell.
If something is terminated prematurely, it will return an error status code.
E.g. in Bash:
until julia myscript.jl; do
    echo "Failed/Interrupted. Restarting in 5s. Press Ctrl-C now to interrupt."
    sleep 5
done
Because Julia itself is perfectly usable as a command-line runner, you could instead do, in Julia:
while true
    try
        run(`julia myscript.jl`)  # Run a separate process
        break
    catch
        println("Failed/Interrupted. Restarting in 5s. Press Ctrl-C now to interrupt.")
        sleep(5)
    end
end
Let's suppose there is a simple R script with only one statement:
q()
Using the R Script plugin in Pentaho Kettle/Spoon, executing the above R script causes Spoon/Kettle to crash.
How can we stop Kettle/Spoon from crashing abnormally with the above statement in our R script?
Kettle should instead stop executing the script and execution control should return to Kettle.
Try using return(value) instead of q(), so that Kettle can handle the value returned from the R script and continue the normal Kettle row flow.
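A minimal sketch of the idea (the exact return conventions depend on the R Script plugin, so treat this only as an illustration):
# q()       # terminates the embedded R engine and brings Spoon/Kettle down with it
return(0)   # return a value instead, so control goes back to the Kettle row flow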
I'm using RStudio on Windows. There is no red octagon for me to click on. I've tried pressing ESC, Ctrl + C, and Ctrl + Z in the console, but none of those worked.
When running code, the red octagon only shows while R is actually busy working something out. So while it is just stepping quickly through your written code (reading data, naming things, etc.), the octagon will not show.
Pressing ESC will work, unless RStudio is frozen.
Good luck!
You could also try interrupting/terminating the R session via the menu:
Session -> Interrupt R
or
Session -> Terminate R...
or
Session -> Restart R (Ctrl+Shift+F10)
If nothing else helps, open the Windows command line and kill the rsession.exe process:
> tasklist | grep rsession
# the output should be something like this:
rsession.exe 7260 Console 1 166,144 K
# The number (7260 in this case; will probably be different) is the process id (PID),
# and you can use it to kill the process:
> taskkill /F /PID 7260
Caution: this will forcefully stop the R session, and RStudio will probably have to restart (this is what happens on my machine, at least).
Using Ctrl + Alt + Delete opens Task Manager, where you can terminate the R session and then restart it to continue working.