I have some scripts I'd like to run each morning at 6am. These scripts write some graphical output to a PDF file: foo.pdf
I'd like my system (let's say Win 7, >= R 2.13) to email me these PDFs once the system has finished running the scripts.
Which is the best package - and most robust way of setting it up - to have these reports emailed to me directly via attachment from R?
Are there any 'cool' extensions to this (like sink()-ing report text output into the body of the email)?
Thanks in advance for any advice.
You can harness the power of a package that can handle emails, coupled with a cron job. On Windows 7, I've achieved something akin to this using Windows Task Scheduler: you set it to run a particular script at a specified time.
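For the emailing step, one option (my example, not prescribed above) is the mailR package; a minimal sketch, where the SMTP host, credentials, addresses, and file path are placeholders you would replace with your own:
library(mailR)  # mailR needs Java/rJava installed
send.mail(
  from = "me@example.com",
  to = "me@example.com",
  subject = "Daily report",
  body = "The overnight scripts have finished; foo.pdf is attached.",
  smtp = list(host.name = "smtp.example.com", port = 465,
              user.name = "me@example.com", passwd = "secret", ssl = TRUE),
  authenticate = TRUE,
  attach.files = "C:/filepath/foo.pdf",  # placeholder path to the generated report
  send = TRUE
)
sendmailR or blastula would be alternatives if setting up rJava is a pain.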
I have a script running daily and had lots of problems getting it to run. Take a look at Roman's link for the attachment and R code first; this answer covers the non-R part on Windows 7.
I had problems running the R script directly from Windows Task Scheduler, so I scheduled a batch file to run every day as follows:
@echo on
REM Run the R script in batch mode; console output goes to a .Rout file
"C:\Rpath\R-2.15.1\bin\i386\Rcmd.exe" BATCH "C:\filepath\filetorun.R"
That's about as simple as it gets; Quick-R was a useful starting point.
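The filetorun.R that the batch file calls could itself do the whole job, including the sink() trick from the question for capturing text output to use as the email body; a hypothetical sketch (all paths and file names are made up):
log_file <- "C:/filepath/report_log.txt"
sink(log_file)                        # divert console output to a log file
source("C:/filepath/make_reports.R")  # placeholder for the scripts that build foo.pdf
sink()                                # restore normal console output
# then reuse a send.mail() call like the one sketched earlier, with
# body = paste(readLines(log_file), collapse = "\n")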
Depending on your computer's settings, you might have to fiddle with the Task Scheduler options. If it's an always-on server (and you know what you're doing), you shouldn't have too many issues. If you have to log off, use a password to log in, or access a shared drive, you'll have to do some of the following. Also, I don't know whether admin rights are a necessity or not.
Open Task Scheduler, make a new task, and open its properties window.
Under General, check the user account, select "Run whether user is logged on or not", and UNcheck "Do not store password." This will allow your script to run if you're logged off (I don't think it works when the machine is locked). When you click OK, it will ask for your password.
Basic setup: The Trigger is "On a schedule" and Advanced is Enabled. Under Actions, select "Start a program" with the Program/script as the .bat file.
Under Conditions, uncheck "Start the task only if the computer is idle" and check "Wake the computer to run this task." Under Settings, check "Allow task to be run on demand," check "If the running task does not end when requested, force it to stop," and at the bottom select "Stop the existing instance." These options might be necessary, though I'm not as sure about these.
Another gotcha: if your company has you change passwords every once in a while, open and re-save the task after each change so that it asks for your password again. Enter the new one, or else the task can't log in and won't run your script.
Related
I am used to using R in RStudio. For a new project, I have to use R on the command line, because the data storage and analysis are only allowed to be on a specific server that I connect to using ssh. This server doesn't have rstudio-server to support remote RStudio sessions.
The project involves an extremely large dataset, and some pre-written code to load/format the data that I have been told to run using "source()" before I do anything else. This takes several minutes to run and load the data each time.
What would a good workflow be for something like this? Editing my code in a .R file, saving it, and then running it would mean waiting several minutes for the data to load each time. But just running R in an interactive session would make it hard to keep track of what I am doing and repeat things if necessary.
Is there some command-line equivalent to RStudio where you can have an interactive session but be editing/saving a file of your code as you go?
Sounds like Jupyter might be your friend here.
The R kernel works great.
You can use it on a remote server either by exposing an open port (and setting up Jupyter login credentials)
or via port forwarding over SSH.
It is a lot like an interactive REPL, except it holds state.
And you can go back and rerun cells.
(Of course, state can be dangerous for reproducibility.)
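If you go this route, registering the R kernel is done from inside R; a minimal sketch, assuming Jupyter itself is already installed on the server:
install.packages("IRkernel")        # the package that provides the R kernel
IRkernel::installspec(user = TRUE)  # register the kernel for the current user only
After that, SSH port forwarding as described above lets you open the notebook in a local browser.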
In RStudio you can open the built-in terminal and ssh to your remote server from there, even if the server doesn't run the (expensive) commercial RStudio Server platform. You can then send commands from the script editor straight into that ssh session with the default shortcut key. This lets you keep using RStudio, track what you're doing in an R script, and still execute interactively on the remote machine.
I am currently running R on a Microsoft Azure instance (Ubuntu virtual machine) using RStudio as my IDE, to which I connect simply through my browser. I am trying to run some commands that take quite some time to complete from within RStudio and figured that I could simply close my tab with RStudio open and the process would keep running. However, when I try to reconnect to see how the process is doing, the page keeps loading but I am unable to see RStudio.
I have a few questions regarding running RStudio on a server:
First, am I correct in thinking that I can close my tab and keep the process running?
Second, is it normal behaviour that I am unable to connect to the server while the process is running?
Third, am I going about this the correct way or are there better ways?
Yes, you can close your tab and keep it running.
RStudio Server waits on updates from the R process to update the UI. This means that if you have a long-running computation, your tab may not fully reload until it's finished. You may also have seen this in the middle of a session: when R is busy, you can have problems saving scripts that are open in the editor pane.
Logging out in the middle of a computation should be safe, but be aware that RStudio will save your workspace and shut R down after a period of inactivity. It then reloads everything when you log back in. But this only extends to objects in memory; if you have any files saved in your temp directory, they'll have disappeared when you come back. They're probably still on the disk, but since your new R session has a new temp directory, you'll have to do a manual search for them.
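A habit that helps here (my addition, not part of the original answer): have the long-running code write its results to a persistent location rather than relying on the suspended workspace or on tempdir(). A small sketch with hypothetical paths:
result <- {
  Sys.sleep(5)         # stand-in for the real long-running computation
  summary(rnorm(1e6))
}
saveRDS(result, "~/project/long_job_result.rds")   # home directory, not tempdir()
# later, possibly in a fresh session:
result <- readRDS("~/project/long_job_result.rds")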
So I wrote my father a neat little R script that pulls financial indicators on stocks, and outputs the info to a csv...
I would like to have it set up so that the script runs automatically once a day, skipping the weekends if possible. I looked around for a while online and it seems as though the Mac "Automator" app is what I'm looking for.
However, after reading many guides and posts (like this one https://www.r-bloggers.com/how-to-source-an-r-script-automatically-on-a-mac-using-automator-and-ical/) I cannot get it to work...
In trying to replicate what that post does, I get an error saying that the first path is a directory, while the latter returns things like "cat: Rscript: No such file or directory".
So I was wondering if anyone could recommend either any good free software that will allow me to do what I would like, or how to run an R script from the /bin/bash shell
EDIT:
The suggested solution isn't really answering my problem. The issue is making this as easy as possible for my dad to run, so that he doesn't have to do anything himself, specifically not use the terminal. Ideally I could just schedule a task that repeats every morning, but the cronR package requires a daemon, and the others are just command-line tools.
I had a similar experience.
I created an Automator Calendar Alarm, added an "Execute AppleScript" action, and used the following code:
on run {input, parameters}
    try
        tell application "R"
            activate
            with timeout of 90000 seconds
                cmd "source(\"Dropbox/RScripts/CV19/liibre_coronabr.R\")"
            end timeout
        end tell
    end try
    return input
end run
When you save it, just choose the date and time for it to run and select the option to repeat every day.
That's it!
I am a fresher at a software company and a beginner in Unix. For tomorrow's CR (regarding server patching), I need to stop and start a server. I have been given a doc which says I need to go into the directory .../bin and execute the command ./xxx.server start to start the server, but it does not give the step for stopping the server. My question is: will the command ./xxx.server stop work to stop the server?
xxx.server is present inside the bin directory of the machine.
Yes, you can stop it with the stop parameter. You can look inside the script with a listing command like less, more, or cat, for example:
more ./xxx.server
Then you should find entries for start and stop (and normally also status).
xxx.server should be a simple script, usually a shell script, sometimes with a little WLST.
Normally Oracle distributes its own start/stop scripts, like startWebLogic.sh (to start the admin server) or startManagedWebLogic.sh (to start a managed server), but most people write their own start/stop scripts. So, if you want to do it right, you should inspect the xxx.server script and/or ask for documentation.
My initial tests have shown that Robot won't work without an active, visible desktop. For example, while a scheduled task (or executed command from the continuous integration server) may be able to start robot as a command-line process, Robot will actually fail to execute the recorded script.
Logging into the build machine to allow it an "active desktop" is not an acceptable solution.
Am I missing something? Is it possible to run a pre-recorded Rational Robot script on a continuous integration server in a manner that doesn't require the machine to be physically logged into?
Unfortunately, Robot does require that you are logged on to the machine and that the desktop is not locked.
So, no, you are not missing something.
Depending on your situation, though, you may be able to work around the issue. Can you clarify what type of application you are trying to test? If it is a web app, or a client app that is easily installed/copied, you might be able to have Robot run on a VMware image rather than directly on the build server itself.
You can run Rational Robot from the command line, so you should be able to set up a scheduled task to run a .BAT file to do this for you. The command is something like:
[path to Rational Robot]\rtrobo [script file] /user "user name" /project [project file] /play /build "build name" /nolog /close
The Robot documentation will have other arguments you can pass in, depending on your situation.
If a simple scheduled task doesn't work, then you can try setting up a STAF (http://staf.sourceforge.net/index.php) environment and creating a job to run this.
Good luck :)