Automating R to run scripts

I'm basically looking for a way to run R scripts automatically, exactly as if I had copied and pasted them into the console. I've tried the taskscheduleR package, but it just seems to write output to a log file in the directory, which isn't the same as running the code inside the RStudio application.
An example might be: say I want to get the last closing prices of 5 stocks each night. I'd like to open the script in RStudio afterwards and have the variables already there, with all of the code living in the script file.
Any thoughts?
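For concreteness, the kind of script being described might look like the following sketch; the quantmod package, the ticker symbols, and the file name are illustrative assumptions, not part of the question.

# Minimal sketch of a nightly job: fetch the latest close for a few stocks
# and append the result to a CSV (quantmod, tickers and file name assumed).
library(quantmod)

tickers <- c("AAPL", "MSFT", "GOOG", "AMZN", "TSLA")
last_close <- sapply(tickers, function(sym) {
  px <- getSymbols(sym, src = "yahoo", auto.assign = FALSE)
  as.numeric(tail(Cl(px), 1))   # Cl() extracts the Close column
})

result <- data.frame(date = Sys.Date(), ticker = tickers, close = last_close)
write.table(result, file = "closing_prices.csv", sep = ",",
            row.names = FALSE, col.names = !file.exists("closing_prices.csv"),
            append = file.exists("closing_prices.csv"))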

I would suggest the built-in Task Scheduler application if you are using Windows.
Create a task that runs a batch (.bat) file. That batch file has only one line, which executes the R script you want. Set the task to run each night (or at whatever time you want).
I am not that well-versed in Linux and macOS, but here's what I know:
Linux has cron. Add a job to your crontab with your preferred timing and have it execute your script, e.g. '/path/to/bin/Rscript /path/to/script.r'.
macOS has Automator + iCal (for scheduling). It also has crontab, like Linux.

Related

Running several R scripts at once using .bat file with CMD BATCH?

I want to use a Windows .bat file to run several R scripts and schedule it with the Windows Task Scheduler.
However, when I test it through the Task Scheduler, everything appears to execute, but the files the R scripts should create are never created. When I double-click the .bat file itself, it works fine. My goal is for several scripts to run overnight without me having to run each one manually, one by one.
If I add a second line to the .bat file to run a second script, will that line execute only after the first is complete? If not, is there a way to delay the second until the first finishes? For instance, my .bat file looks like this:
"C:\Program Files\R\R-3.4.2\bin\x64\R.exe" CMD BATCH C:\Users\gma\Desktop\R_Task\script1.R
"C:\Program Files\R\R-3.4.2\bin\x64\R.exe" CMD BATCH C:\Users\gma\Desktop\R_Task\script2.R
I used the answer provided by @Gautam (R taskscheduleR not executing script) to get this far.
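One hedged guess, which matches the working-directory issue discussed further down this page: when the Task Scheduler runs the .bat file, the working directory may differ from the one you get when double-clicking it, so files written with relative paths end up somewhere unexpected. Writing outputs with absolute paths inside each script avoids that; the folder below reuses the one from the question and the object name is hypothetical.

# Inside script1.R: write to an absolute path so the output location does not
# depend on the scheduler's working directory ('results' is a hypothetical data frame).
write.csv(results, file = "C:/Users/gma/Desktop/R_Task/output1.csv",
          row.names = FALSE)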

Close and reopen + run an R script from itself on Mac

I would like to write an R script that can close, reopen, and run itself.
This is needed for an API query I am trying to make, which seems to require going through these steps once every hour in order to make additional requests. I tried the source() function to simply re-run my script from itself every hour, but with this the API keeps rejecting additional requests; it seems that actually closing and reopening the program is necessary.
I also tried the system() command, as described here, to actually open R and execute the script, but I was not able to figure out how to implement this in a Mac environment.
Would you have any suggestions on how to do this?
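For reference, a minimal sketch of the system() approach on macOS, assuming R is installed so that Rscript is on the PATH; the script path is hypothetical and the hourly wait is only for illustration.

# At the end of the script: wait an hour, relaunch this script in a brand-new
# R process, then quit the current session so each run starts fresh.
Sys.sleep(3600)
system("Rscript '/Users/me/path/to/this_script.R'", wait = FALSE)
quit(save = "no")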
The usual way to run a script every x amount of time on a Unix system is a cron job. I'm not familiar with macOS, but apparently it works just as it does on Linux.
Open the job list to edit it (you can of course use another editor instead of nano):
env EDITOR=nano crontab -e
Add a line/job to the file. A job can run any command-line command, and you can use Rscript to run an R script:
0 * * * * Rscript "/path/to/your/script.R"
Exit and save. The script should now run every hour.
If you want to change the timing, check out crontab.guru. Check out this answer if the cron job reports that Rscript is missing.

Problems scheduling an R Script and saving data to the wd

I've just set up an R script to run on my Windows machine. I'm trying to have it save a data frame to the working directory (which I know from getwd()).
I can see from the Task Scheduler that the script must be running, since it saves the last run time. However, when I check the working directory for new time stamps on the data frames I'm trying to save, they haven't updated. (I save over them each time, or at least that's what I'd like to do; I manually saved them there to start with.)
I'm using this in the scheduler:
"C:\Program Files\R\R-2.13.1\bin\R.exe" CMD BATCH --vanilla --slave "C:\my projects\my_script.R"
That appears to be working, but can anyone offer a reason why the script doesn't seem to be saving my new data frame to the working directory? I'm using this command to save it:
write.table(m23,file="m23.csv",sep=",",row.names=F)
So data frame m23 should get updated every day in the working directory when the scheduler calls the script at 6 am?
Paul.
Are you sure you know what the current working directory is when the scheduler runs the script? My guess is it might not be what you think it is. I would look at the answers to this question: Rscript: Determine path of the executing script, especially the suggestion about using commandArgs() to figure out where you are. Or you can explicitly set the working directory in your script with setwd().
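A minimal sketch of the setwd() suggestion, using the folder that appears in the scheduler command above (adjust it to whatever getwd() reports in RStudio):

# Pin the working directory at the top of my_script.R so write.table()
# saves m23.csv where you expect, regardless of where the scheduler starts R.
setwd("C:/my projects")
write.table(m23, file = "m23.csv", sep = ",", row.names = FALSE)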

batch process for R gui

I have created a batch file to launch R scripts in Rterm.exe. This works well for regular weekly tasks. PBWeeklyMeetingScriptV3.R is the R script run by Rterm (fed in via the < redirection).
set R_TERM="C:\Program Files\R\R-2.14.0\bin\x64\Rterm.exe"
%R_TERM% --slave --no-restore --no-save --args 20120401 20110403 01-apr-12 03-apr-11 < PBWeeklyMeetingScriptV3.R > PBWeeklyMeetingScriptV3.batch 2> error.txt
I've tried to modify this to launch the R GUI instead of the background process, as I'd like to inspect and potentially manipulate the data.
If I change my batch file to:
set R_TERM="C:\Program Files\R\R-2.14.0\bin\x64\Rgui.exe"
the batch file launches the R GUI but doesn't start the script. Is there a way to launch the script too?
Alternatively, is there a way to save/load the workspace image to access the variables that are created in the script?
You can save and load workspaces using save.image() and load(). I do this all the time when scripting to pass data sets between two separate script files, tied together using Python or bash. At the end of each R script, just add:
save.image("Your_image_name.RData")
The image will be the workspace that existed when the command was run (so, if it's the last command in the file, it's the workspace right before the script exits). We also use this at my job to create "snapshots" of input and output data, so we can reproduce the research later. (We use a simple naming convention that records the time of the run, and label the files with that.)
I'm not sure about launching the GUI and having it run a specific script; I don't think that's a feature you'll find in R, simply because the whole point of running a batch file is usually to avoid the GUI. But hopefully you can just save the image to disk, and then look at it or pass it to other programs as needed. Hope that helps!
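A minimal sketch of that round trip between two script files (the file name is hypothetical):

# At the end of script1.R: snapshot the entire workspace to disk.
save.image("weekly_run.RData")

# At the start of script2.R, or in an interactive GUI session: restore it.
load("weekly_run.RData")
ls()   # the objects created by script1.R are now available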

Schedule R script using cron

I am trying to schedule my R script using cron, but it is not working. It seems R cannot find my packages when run from cron. Can anyone help me? Thanks.
The following is my bash script:
# source my profile
. /home/winie/.profile
# script.R will load packages
R CMD BATCH /home/script.R
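One hedged workaround for the "cannot find packages" symptom is to add your user library to the search path at the top of script.R itself, since cron starts jobs with a much smaller environment than a login shell; the library path and the package name below are hypothetical.

# Make the user library visible before loading packages; cron starts jobs
# with a minimal environment, so R may not see the same library paths as
# your interactive session does.
.libPaths(c("/home/winie/R/library", .libPaths()))
library(data.table)   # hypothetical package that the script needs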
Consider these tips:
Use Rscript (or littler) rather than R CMD BATCH
Make sure the cron job is running as you
Make sure the script runs by itself
Test it a few times in verbose mode
My box is running the somewhat visible CRANberries via a cron job calling an R script (which I execute via littler, but Rscript should work just as well). For this, the entry in /etc/crontab on my Ubuntu server is
# every few hours, run cranberries
16 */3 * * * edd cd /home/edd/cranberries && ./cranberries.r
so at sixteen minutes past every third hour, a shell command is run under my user id. It changes into the working directory and calls the R script (which has its executable bit set, etc.).
Looking at this, I could actually just run the script directly and have a setwd() command in it....
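A minimal sketch of how such an executable script might start (the shebang assumes Rscript here rather than littler, and the paths are taken from the crontab entry above):

#!/usr/bin/env Rscript

# Running the file directly from cron works once it has execute permission
# (chmod +x cranberries.r); setwd() pins the project directory so relative
# paths inside the script keep working.
setwd("/home/edd/cranberries")
# ... rest of the script ...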
