I'm new to R and RStudio and I'm working on a small project. On one hand I have an .R file with the code I want to execute, and on the other hand I have an .Rmd file that I should use to report my work, including the results of executing the code in the other file.
How can I access the results and/or functions from the .R file inside the .Rmd file?
Thank you,
By default, your .Rmd file uses the directory it is saved in as its working directory. You can use all of R's standard functions inside the .Rmd file, including source() to run a .R file. So if your files are in the same directory, you can include source("your_r_file.R") to run the .R file. If they are in different directories, you can use relative or absolute file paths (though you should try to avoid absolute file paths in case the .Rmd file is ever run on a different computer).
If you are using RStudio, I would strongly recommend using the "Projects" feature and the here package. The readme for the here package is quite good for explaining its benefits.
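For example, a minimal sketch of that setup (the folder and file names are placeholders, assuming the .Rproj file sits at the project root, the script in a "scripts/" folder, and the .Rmd in a "report/" folder):
```{r}
# here() builds the path from the project root (the folder containing the
# .Rproj file), so this works no matter where the .Rmd itself is saved
library(here)
source(here("scripts", "your_r_file.R"))
```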
Source the R file at the top of your .Rmd file, like
```{r}
source("file-name.R")
```
and the functions/objects in that R file will be available.
Hoping for some help - I have been banging my head for a few hours and I can't find a solution. I am trying to optimise a workflow which involves sourcing a script from a Quarto file (I imagine this would be the same for an R markdown file). If I run the actual script using here() to load csv files, it works perfectly and sets the root directory as:
here() starts at /Users/Jobs/2023/project_x
Which is where the R project file is located.
But if I source that same script from within a Quarto file, it sets the root directory to one level up from the folder containing the .qmd file, which prevents the csv files from being read:
here() starts at /Users/Jobs/2023/project_x/Analysis/code
The .qmd file is located at:
here() starts at /Users/Jobs/2023/project_x/Analysis/code/rmd
Is this expected behaviour and can I get around it?
Using:
here::set_here("/Users/Jobs/2023/project_x")
in a .qmd code chunk seems to work and keeps all here() paths in the sourced script intact, but I don't think it's ideal, as I still need to specify an absolute path in the first instance.
Open to other suggestions...
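One suggestion, offered only as a hedged sketch rather than something from the thread above: declare the .qmd file's own location relative to the project root with here::i_am() in its first chunk, which re-anchors here() at the project root without hard-coding an absolute path.
```{r}
# assumes the .qmd lives in Analysis/code/rmd/ as described above;
# "report.qmd" is a placeholder file name
here::i_am("Analysis/code/rmd/report.qmd")
here()  # should now start at /Users/Jobs/2023/project_x
```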
I am creating a question in r-exams that contains a graph made in TikZ, more specifically https://texample.net/tikz/examples/the-3dplot-package/. For it to work correctly, the 3dplot.sty file needs to be placed in a specific folder. In which folder should I include this file?
Error message in RStudio: "! LaTeX Error: File `3dplot.sty' not found."
If you only need it for one project, simply place the .sty file in the same folder as your .rmd file. The current working folder is normally the first place LaTeX searches for packages, before looking in your personal texmf folder or your TeX distribution.
I would strongly recommend installing this in the texmf tree of your LaTeX installation. Then it is always found, no matter where you compile a LaTeX file.
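If you are unsure where your personal texmf tree lives, a quick way to check from R (assuming a TeX Live / MacTeX installation; MiKTeX organizes this differently) is:
```{r}
# print the location of the personal texmf tree on TeX Live / MacTeX
system("kpsewhich -var-value TEXMFHOME")
# 3dplot.sty would then typically go into <TEXMFHOME>/tex/latex/3dplot/
```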
Alternatively, you can also specify it using the header argument in include_tikz() with the full absolute file path:
include_tikz(..., header = "\\usepackage{/full/path/to/3dplot}")
Note that the .sty suffix is not to be included, even when using the full file path.
I am building a PDF from several RMarkdown files using Bookdown in Rstudio. In the process, Bookdown first compiles the different Rmd files into one big Rmd file (_main.Rmd), and then knits it into a PDF. If no errors are thrown, this temporary _main.Rmd file is usually deleted. I would like to keep the _main.Rmd file even when the file has successfully knitted into a PDF.
Is there any option I can add to _bookdown.yml to achieve this?
I just added this feature in the development version of bookdown; you may try:
remotes::install_github('rstudio/bookdown')
and set delete_merged_file: false in the config file _bookdown.yml.
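So the relevant entry in _bookdown.yml would simply be:
```yaml
# keep the merged _main.Rmd even after a successful build
delete_merged_file: false
```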
I have two projects in R. I've moved an .Rmd document from project 1 to project 2.
When I use the .Rmd file that I moved to project 2 to try to read in some data, I get the following error message:
cannot open file '"mydata.csv"': No such file or directory
Error in file(file, "rt") : cannot open the connection
This kind of error usually suggests to me that it's a working directory issue; however, when I run getwd() in the console, the correct working directory is listed and it points to where the csv is stored. I've also run getwd() within the rmd doc and again the wd is correct.
Does anyone else have this experience of moving one .Rmd file to another project and then it not working in the new project?
The code in the .Rmd file that I am trying to run is:
Data <- read.csv("mydata.csv", stringsAsFactors = T)
The data is definitely within the project, has the correct name, and is a csv.
Has anyone else seen this issue when moving an RMarkdown document into another project before?
Thanks
This may not be the answer, but rmarkdown and knitr intentionally don't respect setwd(): the code in each block is run from the directory holding the .rmd file. So, if you've moved your .rmd file but are then using setwd() to change to the directory holding the data, that does not persist across code chunks.
If this is the issue, then one solution is to use the knitr options to set the root.dir to the data location:
knitr::opts_knit$set(root.dir = '/path/to/the/data')
See here.
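In an .Rmd this usually goes in an early setup chunk so that it takes effect for all later chunks, roughly like this (the path is just the placeholder from above):
```{r setup, include=FALSE}
# make knitr evaluate all subsequent chunks from the data folder
knitr::opts_knit$set(root.dir = '/path/to/the/data')
```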
Maybe not relevant but it seems to be the most likely explanation for what's happening here:
The project shouldn't really interfere with your code here. When you open the project, it sets your working directory to the root location of the project. However, this shouldn't matter in this case, since RMarkdown files automatically set the working directory to the location where the RMarkdown file is saved. You can see this by running getwd() once in the Console and once from the RMarkdown file via "Run Current Chunk".
The behavior is the same when the file is knitted. So unless "mydata.csv" is in the same directory as the RMarkdown file, the code above won't work.
There are two workarounds: you can use relative or absolute paths to navigate to your data file. In a project I prefer relative paths. Let's say the rmd file is in a folder called "scripts" and your data file is in a folder called "data" and both are in the same project folder. Then this should work:
Data <- read.csv("../data/mydata.csv", stringsAsFactors = TRUE)
The other option, which I do not recommend, is to set the working directory in the rmd file via:
knitr::opts_knit$set(root.dir = '../data/')
The reason I wouldn't do that is that the working directory is only changed when knitting the document; when using the rmd file interactively, you still have a different working directory (the location of the rmd file).
This is a great application of the here package for dealing with these types of issues.
here (https://github.com/jennybc/here_here) looks upward from the Rmd file (and all R files) for the .Rproj file and then uses that as an "anchor" for building relative references to other files. If, for instance, you had the following project structure:
- data
  |-- mydata.csv
- src
  |-- 00-import.R
  |-- 01-make-graphs.R
  |-- 02-do-analysis.R
- report
  |-- report.Rmd
- yourproject.Rproj
and you wanted to use mydata.csv in your report.Rmd, you could do the following:
library(here)
dat <- read.csv(here("data", "mydata.csv"))
here will then convert this path to "~/Users/../data/mydata.csv" for you. Note that you have to be working inside the RStudio Project (so that the .Rproj file can be found) for this use case.
here is such a great package and solves a lot of these issues.
As explained by Yihui Xie in this post, when one uses the Compile PDF button of the RStudio IDE to produce a PDF from a .Rnw file, knit() uses the globalenv() of a new R session. Is there a way to make this new R session use the packrat libraries of my project (even the version of knitr included in my packrat libraries) instead of my personal user library, to ensure a maximum level of reproducibility? I guess the new R session would have to be linked to the project itself, but I don't know how to do this efficiently.
I know I could directly use the knit() function instead of the Compile PDF button and, that way, knit() would use my current globalenv(), but I don't like this solution since it's less reproducible.
I think I solved the problem myself, but I want to share it so others can confirm I'm right and possibly help improve my solution.
My specific problem is that my .Rnw file is in a sub-directory of my whole project. When the Compile PDF button creates a new R session, it is created in this sub-directory, thus not finding the .Rprofile file that would initialize packrat. I think the easiest solution would be to create a .Rprofile file in my subdirectory which contains
temp <- getwd()           # remember the sub-directory the session started in
setwd("..")               # move up to the project root, where packrat/ lives
source("packrat/init.R")  # initialize packrat for this session
setwd(temp)               # return to the original sub-directory
rm(temp)
I have to change the working directory to the project level before source("packrat/init.R") because that file itself refers to the project directory...
Can anybody see a better solution?
P.,
I don't know if this solution works even for the knitr package, but I am 99% sure it works for all other packages, as it seems to for me.
(I believe) I have a very similar problem. I have my project folder, but my working directory has always been the subfolder of my project folder where my .rnw file is located.
The link to Yihui Xie's answer was very helpful.
Originally I wanted a project folder such as:
project-a/
  working/
    data/
      datas.csv
    analysis/
      library.R
      rscripts.R
    rnw/
      report.rnw
      child/
        preamble.rnw
    packrat/
But I'm not sure if that is possible with packrat when my R library() calls are not in the working directory and packrat cannot parse the .rnw file (I call the library.R file from a chunk using source() in my .rnw file). A few notes:
- I wanted to use a .Rproj file to open the project and have project-a/working as the working directory.
- If that were the case, packrat could find the library.R script.
- But the .rnw file still defaults to its own working directory when compiling.
- I thought an .Rprofile with knitr::opts_knit$set(root.dir = "..") would work, but I don't think it works for LaTeX commands like \input, which default back to the directory containing the .rnw file.
- I thought this was insufficient anyway, because then you have two working directories: one for your R chunks and one for your LaTeX!
Since the .rnw file always sets the working directory, I put my library.R script in the same directory as my .rnw file, which creates the packrat folder in project-a/working/rnw. I am 99% sure this works, because when I created the packrat folder in project-a/working/rnw WITHOUT relocating the library.R file, I received an error that no packages could be found and I could not compile the .rnw file.
project-a/
  working/
    data/
      datas.csv
    analysis/
      rscripts.R
    rnw/
      report.rnw
      library.R
      packrat/
      child/
        preamble.rnw
Again, unless I am overlooking something or misunderstanding which packages are being used, this seems to have worked for me. Disclaimer here that I am relatively new to packrat.