How to source file.R without output

Is it possible to source a file without printing all the charts etc.? (I already tried echo=F.)
In my case I call the png("filename%03d.png") device early in the script. It is not too big a hassle to comment this out, but all the charts take a lot of time to render. (The specific file I am working with now uses base graphics, but mostly I'll be using ggplot2, which makes the issue more important: ggplot2 is excellent, but the current implementation is not the fastest.)
Thanks

It's not a problem for ggplot2 or lattice graphics - you always have to explicitly print them when they are called in non-interactive settings (like from within a script).
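For instance, a minimal sketch (the object name p and the use of mtcars are just for illustration): a sourced script that only builds the plot object draws nothing until the object is explicitly printed.
library(ggplot2)
p <- ggplot(mtcars, aes(wt, mpg)) + geom_point()   # builds the plot object; nothing is drawn
print(p)                                           # this line is what actually renders the plot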

Good practice for coding R means wrapping as much of your code as possible into functions. (See, e.g., Chapter 5 of The R Inferno (PDF).) If you place your plotting code inside a function, it need not be displayed when you source it. Compare the following.
File foo.r contains
plot(1:10)
When you call source('foo.r'), the plot is shown.
File bar.r contains
bar <- function() plot(1:20)
When you call source('bar.r'), the plot is not shown. You can display it at your convenience by typing bar() at the command prompt.

Perhaps this might be of some help...
"A package that provides a null graphics device; includes a vignette, "devNull", that documents how to create a new graphics device as an add on package. "
from http://developer.r-project.org/
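If your version of R supports it, a simpler trick in the same spirit (an aside, not part of the quoted package) is to open a null pdf device before sourcing, so the drawing calls still run but nothing is displayed or written:
pdf(NULL)          # null device: no window, no file
source("foo.r")    # base-graphics plots go nowhere
dev.off()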

It's not the best-sounding solution, but if you will be running this script often like this, you could declare a boolean indicating whether graphics are required (graphics_required=TRUE) and enclose all your plot commands in if/then clauses based on that boolean. Then you only need to change the boolean to change the behaviour of the script.
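For example, a sketch of that pattern (the flag name and plotting calls are made up for illustration):
graphics_required <- FALSE    # flip to TRUE when you do want the charts
if (graphics_required) {
  png("filename%03d.png")
  plot(1:10)
  dev.off()
}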

Related

Knitr - Define Latex commands that call R functions using Sexpr

I'm basically wondering how to define a new LaTeX command that allows nesting \Sexpr and some other R function, where the LaTeX argument is an R object.
By fortunate happenstance, the idea is partly captured by the \newcommand structure given below:
\newcommand{\SomeLatexCommand}[1]{\Sexpr{"#1"}}
Here the argument is indeed shown, albeit as a string. With this in mind, I was hoping for the following command:
\newcommand{\SweetLatexCommand}[1]{\Sexpr{SomeRFunction(get("#1"))}}
However, once nested inside an R function, #1 is not read as a placeholder for the LaTeX argument, but instead as an existing R variable.
Is there a way to make the last command work? Or else, are there other neat ways to define LaTeX commands which in turn can call on any R function through R objects?
Good day,
No, you can't do that. The problem is the way knitr works:
R runs the knit() function (or some other knitr function). That function looks through the source for code chunks and \Sexpr calls, executes them, and replaces them with the requested output, producing a .tex file.
Then LaTeX processes that .tex file. R is no longer involved.
Since \newcommand is a LaTeX command, it is only handled in the final stage, after all R evaluation is done.
There may be a way in knitr to specify another "macro" that works the way \Sexpr works, but I don't think there's a way to have several of them.
So what you should do is write multiple functions in R, and call those to do what you want, as \Sexpr{fn1(...)}, \Sexpr{fn2(...)}, etc.
I suppose if you were really determined, you could add an extra preprocessing stage at the beginning that goes through your .Rnw file, replaces every string that looks like \SweetLatexCommand{blah} with \Sexpr{SomeRFunction(get("blah"))}, and then calls knit(), but that seems like way too much work.
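If you did want to try that preprocessing route, here is a minimal sketch (the file names are made up, and the regular expression assumes the macro argument contains no braces):
library(knitr)
txt <- readLines("report.Rnw")                         # hypothetical input file
txt <- gsub("\\\\SweetLatexCommand\\{([^}]*)\\}",
            "\\\\Sexpr{SomeRFunction(get(\"\\1\"))}",  # expand the macro by hand
            txt)
writeLines(txt, "report_expanded.Rnw")
knit("report_expanded.Rnw")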

R effects plot displays OK on screen but generated pdf contains no pages

I am using the effects package in R to derive (and plot) the effects of a complicated linear model.
When I use allEffects(linearModel) I can see the results on screen and save them to a pdf file as usual. Since the model has many terms, that output is not useful as it stands.
Therefore I use effect(someTerm, linearModel) to focus on the terms of interest, and the results on screen are what I need. However, when saving them to a pdf file, the file contains no useful output (though it does take up 3.5Kb of space). There is no error message at the R console.
To ease understanding, I created a simple data set and a script for generating the effects plots, in the same way that I tried with the "real" model.
factorData.csv
A,B,C,y
a1,b1,c1,3
a1,b2,c1,4
a2,b1,c1,5
a2,b2,c1,6
a1,b1,c1,2
a1,b1,c2,3.5
a1,b2,c2,4
a2,b1,c2,5.1
a2,b2,c2,6.2
plotEffect.r
require(effects)
dataFile <- '/tmp/factorData.csv'
effectABfile <- '/tmp/effect_AB.pdf'
effectABCfile <- '/tmp/effect_ABC.pdf'
allEffectFile <- '/tmp/lm_eff.pdf'
df <- read.csv(file=dataFile,header=TRUE,sep=',')
linearModel <- lm(y~A*B*C,data=df)
lm_eff <- allEffects(linearModel)
pdf(file=effectABfile)
plot(effect('A:B',linearModel))
dev.off()
pdf(file=allEffectFile)
plot(lm_eff)
dev.off()
pdf(file=effectABCfile)
plot(Effect(c('A','B','C'),linearModel))
dev.off()
As you can see, I tried allEffects, effect and Effect; allEffects is the only one that works for me. Please note that the script assumes that the data is placed in /tmp/factorData.csv - you might need to change the path of course. I also randomised the order in which the plots are generated, with no effect.
I had a look elsewhere on Stack Overflow and "saving plots to pdfs fails" was the closest match, but the advice there (to issue dev.off() after each plot to 'close' the pdf file) is something I already do, as seen in plotEffect.r.
I tried this script on 2 machines, each running Lubuntu 14.04.1 64-bit with R version 3.0.2 and the latest effects package installed within R using install.packages. The results were the same.
I would be very grateful for advice on how to fix the problem of saving (instances of) this plot type to a pdf.
Fix/workaround
As suggested by @Roland in the comments, if you wish to save grid plots (such as the output of the effects plots here) to pdf files, it is more reliable to issue the plotting commands manually rather than from within a script. This does not appear to be as much of an issue for base graphics or even ggplot2 graphics, where I have never needed this workaround. Thanks to @Roland for suggesting this fix!
@Roland also added that Sys.sleep() might help in scripts. Although it did not in my case, I discovered that it was possible to paste several such plotting commands into the console and R would run them as a batch, saving the plots to pdf correctly. Therefore, I believe much of the benefit of running the script can be recovered by taking these steps:
1. Use R to create the text representation of the pdf(), plot() and dev.off() triad of commands (a sketch follows below).
2. Have the main script write this plotting-command text (specific to each plot) to a file.
3. Open the plotting-command file in a text editor.
4. Copy the entire contents of the file and paste it into the R console.
Optionally, you may wish to use the command line in steps 3 and 4; "How can I load a file's contents into the clipboard?" has useful advice.
This two-stage procedure is a reasonable workaround, but arguably there should be no need for it.
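For steps 1 and 2, a minimal sketch of how the main script could write out one such triad, reusing the objects defined in plotEffect.r above (the output path is made up for illustration):
cmds <- c(
  sprintf("pdf(file = '%s')", effectABfile),
  "plot(effect('A:B', linearModel))",
  "dev.off()"
)
writeLines(cmds, "/tmp/plot_commands.R")   # paste this file's contents into the R console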

plotting data in R from matlab

I was wondering if it is possible to work between MATLAB and R to plot some data. I have a MATLAB script which generates a text file. I was wondering whether it is possible to open R from within MATLAB, plot the data from this text file, and then return to MATLAB.
For example, suppose I save a text file called test.txt in the path 'E:\' and define the path of R, which in my case is:
pathR = 'C:\Program Files\R\R-2.14.1\bin\R';
Is it possible to run an R script saved as test1.R (in the same directory as test.txt) from MATLAB?
If you're working with Windows (from the path it looks like you are), you can use the MATLAB R-link from the File Exchange to pass data from Matlab to R, execute commands there, and retrieve output.
I don't use R, so this is not something I have done, but I see no reason why you shouldn't be able to use the system function to call R from a MATLAB session. Look in the product documentation under the section Run External Commands, Scripts, and Programs for this and related approaches.
There are some platform-specific peculiarities to be aware of, and you may have to wrestle a little with what is returned (though since you plan to have R create a plot, which is likely to be a side effect rather than a return value, you may not). As ever, this is covered quite well in the product's documentation.
After using R(D)COM and Matlab R-link for a while, I do not recommend it. The COM interface has trouble parsing many commands and it is difficult to debug the code. I recommend using a system command from Matlab as described in the R Wiki. This also avoids having to install RAndFriends.
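For what it is worth, a minimal sketch of what test1.R might contain when driven this way (the two-column layout of test.txt is an assumption, since the question does not describe the file):
# test1.R: read the file written by MATLAB and save a plot to disk
dat <- read.table("E:/test.txt", header = FALSE)
png("E:/test_plot.png")
plot(dat[[1]], dat[[2]], xlab = "column 1", ylab = "column 2")
dev.off()
Run non-interactively (for example via Rscript test1.R, launched with MATLAB's system function), this writes the plot to disk and returns control to MATLAB.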

issues of wrapping up a set of workable R commands into a function

I generate a dendrogram using a collection of R commands. It works just fine and saves the generated dendrogram into a PDF file. To improve efficiency, I wrapped these commands in a function, changing nothing else. However, the pdf is now just a blank file without any graphical content. Please let me know what's wrong with my function definition. Thanks.
myplot <- function(inputcsv, outputfile){
  library(ggdendro)
  library(ggplot2)
  x <- read.csv(inputcsv, header = TRUE)
  d <- as.dist(x, diag = FALSE, upper = FALSE)
  hc <- hclust(d, "ave")
  dhc <- as.dendrogram(hc)
  ddata <- dendro_data(dhc, type = "rectangle")
  ddata$labels$text <- gsub("\\.", " ", ddata$labels$text)
  ggplot(segment(ddata)) + geom_segment(aes(x = x0, y = y0, xend = x1, yend = y1))
  pdf(outputfile, width = 30, height = 35)
  last_plot()
  dev.off()
}
Wrap your ggplot call in a print() call (this is the same issue the R FAQ describes for lattice/trellis graphics).
ggplot and friends return an object, and the plotting only happens when the object is printed. When you do this on the command line, the printing happens automatically; when you stick it in a script or function, you have to do it yourself.
The debate on whether this is a good idea or a dumb thing that just generates questions like this one continues...
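Concretely, a sketch of that fix applied to the function above (the segment column names x0, y0, x1, y1 are kept from the question; newer ggdendro versions name them x, y, xend and yend):
myplot <- function(inputcsv, outputfile){
  library(ggdendro)
  library(ggplot2)
  x <- read.csv(inputcsv, header = TRUE)
  d <- as.dist(x, diag = FALSE, upper = FALSE)
  hc <- hclust(d, "ave")
  ddata <- dendro_data(as.dendrogram(hc), type = "rectangle")
  ddata$labels$text <- gsub("\\.", " ", ddata$labels$text)
  p <- ggplot(segment(ddata)) +
    geom_segment(aes(x = x0, y = y0, xend = x1, yend = y1))
  pdf(outputfile, width = 30, height = 35)
  print(p)   # the explicit print() is what draws the plot on the pdf device
  dev.off()
}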

Sweave syntax highlighting in output

Has anyone managed to get color syntax-highlighting working in the output of Sweave documents? I've been able to customize the output style by adding boxes, etc. in the Sweave.sty file as follows:
\DefineVerbatimEnvironment{Sinput}{Verbatim}{fontseries=bc,frame=single}
\DefineVerbatimEnvironment{Soutput}{Verbatim}{frame=leftline}
\DefineVerbatimEnvironment{Scode}{Verbatim}{fontseries=bc}
And I can get the minted package to do syntax highlighting of verbatim-code blocks in my document like so:
\begin{minted}{perl}
use Foo::Bar;
...
\end{minted}
but I'm not sure how to combine the two for R input sections. I tried the following:
\DefineVerbatimEnvironment{Sinput}{minted}{r}
\DefineVerbatimEnvironment{Scode}{minted}{r}
Any suggestions?
Yes, have a look at some of the vignettes for Rcpp, for example (to pick just one) the Rcpp-FAQ pdf.
We use the highlight package by Romain, which itself can farm out to the highlight binary by Andre Simon. It makes everything a little more involved (Makefiles for the vignettes, etc.), but we get colourful output from R and C/C++ code, which makes it worth it.
I have a solution that has worked for me; I have not tried it on any other systems, though, so things may not work out of the box for you. I've posted some code at https://gist.github.com/797478 that is a set of modified Rweave driver functions which use minted blocks instead of verbatim blocks.
To use this driver, just specify it when calling the Sweave function with the driver=RweaveLatexMinted() option.
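In other words, something along these lines (a sketch, assuming the driver functions from that gist have been sourced into your session; the file names are made up):
source("RweaveLatexMinted.R")                         # local copy of the gist's driver code
Sweave("document.Rnw", driver = RweaveLatexMinted())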
Here's how I've ended up solving it, starting from @daroczig's suggestion.
\usepackage{minted}
\renewenvironment{Sinput}{\minted[frame=single]{r}}{\endminted}
\DefineVerbatimEnvironment{Soutput}{Verbatim}{frame=leftline}
\DefineVerbatimEnvironment{Scode}{Verbatim}{}
While I was at it, I needed to get caching working because I'm using large data sets and one chunk was taking around 3 minutes to complete. So I wrote this zsh shell function to process an .Rnw file with caching:
function sweaveCache() {
  Rscript -e "library(cacheSweave); setCacheDir(getwd()); Sweave('$1.Rnw', driver = cacheSweaveDriver)" &&
    pdflatex --shell-escape $1.tex &&
    open $1.pdf
}
Now I just do sweaveCache myFile and I get the result opened in Preview (on OS X).
This topic on tex.StackExchange might be interesting for you, as it suggests loading the SweaveListingUtils package in R for an easy solution.
