Run a shell script from within an R script on a Windows machine

I have an R script in which I call a shell script using the system() command, with paste() used to pass arguments to the shell script (on a Unix machine). I would now like to execute the same R script on a Windows machine, and am struggling to get it working.
Here are the steps I followed.
R code
source('C:\\Users\\xxxx\\Documents\\R\\R-3.5.2\\ms\\ms\\MS_Config.R')
# Check the NULL/zero-length cases first so the string comparison is never
# reached with an empty value; || short-circuits, | does not.
if (is.null(git_version) || length(git_version) == 0 || git_version == "") {
  print('ERROR: EXECUTION STOPPED !!!')
  print('PLEASE SPECIFY GITHUB TAG_ID')
  stop()
}
print("test4444")
print(enable_data_pull)
print(getwd())
system(paste('C:\\Users\\xxxx\\Documents\\R\\R-3.5.2\\ms\\ms\\MS_ALLM_Parallel_Runner.sh -c', num_cores,
             '-s', snapshot_dt,
             '-p', local_storage_path,
             '-t', tag,
             '-g', git_version,
             '-y', enable_data_pull))
print("after shell script execution")
I tried the following, but did not succeed:
Installed Cygwin and called the R script from the Cygwin terminal (the PATH variable is updated to include R and its binaries):
Rscript "C:\Users\xxxx\Documents\R\R-3.5.2\ms\ms\MS_Model_Kickoff.R"
Below is the error message I see when the R script attempts to run the shell script:
'CreateProcess' failed to run 'C:\Users\xxxx\DOCUME~1\R\R-35~1.2\ms\ms\CONRM_~1.SH -c 25 -s 201811 -p C:\Users\xxxx\Documents\Test -t Analytical -g verModelRefit2.2.2 -y N'
What does the above error mean, and how do I fix it so that I can execute the shell script from within the R script on a Windows machine?
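For context, 'CreateProcess' is the Windows API call that R's system() uses to launch programs, and it cannot run a .sh file directly: unlike Unix, Windows does not read the shebang line, so the interpreter has to be named explicitly. A minimal, untested sketch of the usual workaround, assuming Cygwin's bash is on PATH (the path and variable names are taken from the question):
script <- 'C:/Users/xxxx/Documents/R/R-3.5.2/ms/ms/MS_ALLM_Parallel_Runner.sh'
system(paste('bash', shQuote(script),   # name the interpreter; Windows won't read #!
             '-c', num_cores,
             '-s', snapshot_dt,
             '-p', shQuote(local_storage_path),
             '-t', tag,
             '-g', git_version,
             '-y', enable_data_pull))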

Related

Start tcsh in a specific directory

Does tcsh support launching itself in a remote directory via an argument?
The setup I am dealing with does not allow me to chdir to the remote directory before invoking tcsh, and I'd like to avoid having to create a .sh file for this workflow.
Here are the available arguments I see for v6.19:
> tcsh --help
tcsh 6.19.00 (Astron) 2015-05-21 (x86_64-unknown-Linux) options wide,nls,dl,al,kan,rh,color,filec
-b file batch mode, read and execute commands from 'file'
-c command run 'command' from next argument
-d load directory stack from '~/.cshdirs'
-Dname[=value] define environment variable `name' to `value' (DomainOS only)
-e exit on any error
-f start faster by ignoring the start-up file
-F use fork() instead of vfork() when spawning (ConvexOS only)
-i interactive, even when input is not from a terminal
-l act as a login shell, must be the only option specified
-m load the start-up file, whether or not owned by effective user
-n file no execute mode, just check syntax of the following `file'
-q accept SIGQUIT for running under a debugger
-s read commands from standard input
-t read one line from standard input
-v echo commands after history substitution
-V like -v but including commands read from the start-up file
-x echo commands immediately before execution
-X like -x but including commands read from the start-up file
--help print this message and exit
--version print the version shell variable and exit
This works, but is suboptimal because it launches two instances of tcsh:
tcsh -c 'cd /tmp && tcsh'
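An untested refinement of the same idea: have the outer shell replace itself with the final interactive shell via tcsh's exec builtin, so only one instance is left running:
tcsh -c 'cd /tmp && exec tcsh'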

UNIX commands from R via shell function

I need to issue Unix commands from an R session. I'm on Windows Server 2012 R2 using RStudio 1.1.383 and R 3.4.3.
The shell() function looks to be the right one for me but when I specify the path to my bash shell (from Git for Windows install) the command fails with error code 127.
shell_path <- "C:\\Program Files\\Git\\git-bash.exe"
shell("ls -a", shell = shell_path)
## running command 'C:\Program Files\Git\git-bash.exe /c ls -a' had status 127
## 'ls -a' execution failed with error code 127
Pretty sure my shell path is correct.
What am I doing wrong?
EDIT: for clarity I would like to pass any number of UNIX commands, I am just using ls -a for an example.
EDIT:
After some playing about (2018-03-09):
shell(cmd = "ls -a", shell = '"C:/Program Files/Git/bin/bash.exe"', intern = TRUE, flag = "-c")
The correct location of my bash.exe was at .../bin/bash.exe. This uses shell with intern = TRUE to return the output as an R object. Note the use of single quote marks around the shell path.
EDIT: 2018-03-09 21:40:46 UT
In RStudio we can also call bash using knitr and setting chunk options:
library(knitr)
```{bash my_bash_chunk, engine.path="C:\\Program Files\\Git\\bin\\bash.exe"}
# Using a call to unix shell
ls -a
```
Two things stand out here. Bash will return exit code 127 if a command is not found; you should try running the fully qualified command name.
I also see that your shell is being run with a /c flag. According to the documentation, the flag argument specifies "the switch to run a command under the shell" and it defaults to /c, but "if the shell is bash or tcsh or sh the default is changed to '-c'." Obviously this isn't happening for git-bash.exe.
Try these changes out:
shell_path <- "C:\\Program Files\\Git\\git-bash.exe"
shell("/bin/ls -a", shell = shell_path, flag = "-c")
Not on Windows, so can't be sure this will work.
Perhaps you need to use shQuote?
shell( paste("ls -a ", shQuote( shell_path) ) )
(Untested. I'm not on Windows. But do read ?shQuote.)
If you just want to do ls -a, you can use the below commands:
shell("'ls -a'", shell="C:\\Git\\bin\\sh.exe")
#or
shell('C:\\Git\\bin\\sh.exe -c "ls -a"')
Let us know if the space in "Program Files" is causing problems.
And if you require login before you can call your command,
shell('C:\\Git\\bin\\sh.exe --login -c "ls -a"')
But if you are looking at performing git commands from R, the git2r package by rOpenSci might suit your needs.
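A minimal sketch of the git2r route (untested; the repository path is a placeholder):
library(git2r)
repo <- repository("C:/path/to/your/repo")  # placeholder path
status(repo)           # like `git status`, no shell involved
commits(repo, n = 5)   # the five most recent commits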

Passing SLURM batch command line arguments to R

I'm trying to run a SLURM sbatch command with various parameters that I can read in an R script. When using the PBS system, I used to write qsub -v param1=x,param2=y (plus other system parameters like the memory requirements etc. and the script name to be read by PBS) and then read a value in the R script with x = Sys.getenv('param1').
Now I tried
sbatch run.sh --export=basePath='a'
With run.sh:
#!/bin/bash
cd $SLURM_SUBMIT_DIR
echo $PWD
module load R/common/3.3.3
R CMD BATCH --quiet --no-restore --no-save runDo.R output.txt
And runDo.R:
base.path = Sys.getenv('basePath')
print(base.path)
The script is running but the argument value is not assigned to base.path variable (it prints an empty string).
The --export parameter has to be passed to sbatch, not to the run.sh script.
It should be like this:
sbatch --export=basePath='a' run.sh
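By analogy with the qsub -v example above, several variables can be exported in one comma-separated flag and read back individually (an untested sketch with illustrative names):
sbatch --export=param1=x,param2=y run.sh
and then inside the R script:
x <- Sys.getenv('param1')
y <- Sys.getenv('param2')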

I created my shell script and I am trying to access data in a NAS-mounted directory, but when I execute the script I get an error.

The script:
/bin/bash
START_TIME=($date +%s)
echo "calling environment variable"
and when I try to run the above script I get this error:
calling environment variable
/bin/bash 22-1-2016 line 1 : command not found
You should start your script with a shebang, therefore:
/bin/bash
should be
#!/bin/bash
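As an aside (not the cause of the error above): START_TIME=($date +%s) is not valid bash command substitution; the dollar sign goes before the parenthesis:
START_TIME=$(date +%s)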

How to run multiple R scripts simultaneously?

I would like to run all the R scripts (script1.R, script2.R, ...) stored in a directory (~/Sims), and I would like each script to run in a separate terminal. The OS I'm using is OS X 10.9.5.
I used a bash script with the following commands:
#!/bin/bash
FILES=~/Sims/*.R
for f in $FILES
do
xterm -e bash -c "R --vanilla < $f; exec bash" &
done
I would like to find an alternative to xterm (given that on OS X it requires installing X11, which I can't do on some machines) that is part of OS X (like the Terminal app).
I would also like not to exit the R environment at the end of each R script.
This will mimic your xterm configuration but use new Terminal.app sessions instead:
for f in *.R
do
osascript -e "tell app \"Terminal\" to do script \"R --vanilla < /FULL/PATH/TO/${f}\""
done
As far as keeping the R session alive, I'm not sure that's possible.
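As an untested refinement of the loop above: globbing with the directory prefix from the question expands ~ and yields absolute paths, so the /FULL/PATH/TO placeholder can be dropped (assuming no spaces in the filenames):
for f in ~/Sims/*.R
do
osascript -e "tell app \"Terminal\" to do script \"R --vanilla < ${f}\""
done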
