Developing an R package: jpeg/png files under inst/doc are not appearing

I am currently updating a package and am hitting a wall on this issue.
I haven't been able to find any documentation explaining why the png files under inst/doc are not appearing in the library folder when I run the following:
install(package_path)
using the devtools package, where package_path is the path to the package's source folder.
I do not have an .Rinstignore file, and, just in case, I have made sure that my .Rbuildignore does not contain any patterns that match png files.
Any help or direction towards documentation over this matter would be appreciated.
To help clarify:
Below (on the left) you can see the package within the library directory, containing just the html files; this is the result of running the install() command on the package I am maintaining. On the right is the source directory, which contains the png files as well. These png files are not being transferred over.
There is no additional code involved on my part.
[screenshot: the installed library directory (left, html files only) alongside the package source directory (right, which also contains the png files)]
Now, what is strange is that if I move the image files out of the inst directory (for example, putting them in the root directory of the package), they do get copied over to the library.

This is extremely old now, but I came across this thread because I had the same issue.
It seems to be something to do with the name of the sub-directory inside inst: inst/doc is where built vignettes end up, so build tools treat it specially. I had the same problem with .png files in inst/doc/img, which would not copy over on package build, but after renaming the directory to inst/markdown/img everything copies fine.
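As a quick sanity check (a sketch; the package name mypackage and the file figure.png below are hypothetical), you can list what actually reached the installed package and resolve files from a non-reserved inst/ subdirectory instead:
# after devtools::install(), see which files made it into the installed doc/ directory
list.files(system.file("doc", package = "mypackage"))
# images kept in e.g. inst/images/ are installed to images/ and can be located with
system.file("images", "figure.png", package = "mypackage")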

Related

How to get here() to work with R markdown/Quarto

Hoping for some help - I have been banging my head for a few hours and I can't find a solution. I am trying to optimise a workflow which involves sourcing a script from a Quarto file (I imagine this would be the same for an R markdown file). If I run the actual script using here() to load csv files, it works perfectly and sets the root directory as:
here() starts at /Users/Jobs/2023/project_x
Which is where the R project file is located.
But if I source that same script from within a Quarto file, it sets the root directory to one level up from the folder containing the .qmd file, which prevents the csv files from being read:
here() starts at /Users/Jobs/2023/project_x/Analysis/code
The .qmd file itself is located at:
/Users/Jobs/2023/project_x/Analysis/code/rmd
Is this expected behaviour and can I get around it?
Using:
here::set_here("/Users/Jobs/2023/project_x")
in a .qmd code chunk seems to work and keeps all here() paths in the sourced script intact, but I don't think it's ideal, as I still need to specify an absolute path in the first instance.
Open to other suggestions...
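One alternative that avoids hard-coding an absolute path is here::i_am(), which declares where the current file sits relative to the project root and re-anchors here() accordingly (a sketch; the .qmd and script file names below are hypothetical):
# first chunk of the .qmd file: declare this document's location relative to the project root
here::i_am("Analysis/code/rmd/report.qmd")
# here() now resolves from /Users/Jobs/2023/project_x again, both here and in sourced scripts
source(here::here("Analysis", "code", "my_script.R"))
i_am() also errors if the file is not found at that relative path, which guards against rendering the document from the wrong location.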

How to add external data folder into developing R package? [duplicate]

In the documentation, R suggests that raw data files (not Rdata nor Rda) should be placed in inst/extdata/
From the first paragraph in: http://cran.r-project.org/doc/manuals/R-exts.html#Data-in-packages
The data subdirectory is for data files, either to be made available
via lazy-loading or for loading using data(). (The choice is made by
the ‘LazyData’ field in the DESCRIPTION file: the default is not to do
so.) It should not be used for other data files needed by the package,
and the convention has grown up to use directory inst/extdata for such
files.
So, I have moved all of my raw data into this folder, but when I build and reload the package and then try to access the data in a function with (for example):
read.csv(file=paste(path.package("my_package"),"/inst/extdata/my_raw_data.csv",sep=""))
# .path.package is now path.package in R 3.0+
I get the "cannot open file" error.
However, it does look like there is a folder called /extdata in the package directory with the files in it (post-build and install). What's happening to the /inst folder?
Does everything in the /inst folder get pushed into the / of the package?
More useful than building the path yourself with paste() or file.path() is system.file(). Once your package is installed, you can grab your file like so:
fpath <- system.file("extdata", "my_raw_data.csv", package="my_package")
fpath will now hold the absolute path to the file on disk.
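Putting the two together (a sketch, reusing the hypothetical package and file names from the question):
# locate the installed copy of the raw data and read it
fpath <- system.file("extdata", "my_raw_data.csv", package = "my_package")
stopifnot(nzchar(fpath))          # system.file() returns "" when the file is not found
my_raw_data <- read.csv(fpath)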
You were both very close and essentially had this. A formal reference from 'Writing R Extensions' is:
1.1.3 Package subdirectories
[...]
The contents of the inst subdirectory will be copied recursively
to the installation directory. Subdirectories of inst should not
interfere with those used by R (currently, R, data, demo,
exec, libs, man, help, html and Meta, and earlier versions
used latex, R-ex). The copying of the inst happens after src
is built so its Makefile can create files to be installed. Prior to
R 2.12.2, the files were installed on POSIX platforms with the permissions in the package sources, so care should be taken to ensure
these are not too restrictive: R CMD build will make suitable
adjustments. To exclude files from being installed, one can specify a
list of exclude patterns in file .Rinstignore in the top-level
source directory. These patterns should be Perl-like regular
expressions (see the help for regexp in R for the precise details),
one per line, to be matched against the file and directory paths,
e.g. doc/.*[.]png$ will exclude all PNG files in inst/doc based on
the (lower-case) extension.
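To answer the "What's happening to the /inst folder?" part directly: everything under inst/ is promoted to the top level of the installed package, which is easy to confirm once the package is installed (a sketch, using the same hypothetical package name as the question):
# list the root of the installed package
list.files(system.file(package = "my_package"))
# typically shows entries like "DESCRIPTION", "Meta", "R", "help", "html" and, here, "extdata"
# there is no "inst" entry: its contents were copied up to the top level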

Using packrat libraries with knitr and the rstudio compile PDF button

As explained by Yihui Xie in this post, when one uses the Compile PDF button of the RStudio IDE to produce a PDF from a .Rnw file, knit() uses the globalenv() of a new R session. Is there a way to make this new R session use the packrat libraries of my project (even the version of knitr included in my packrat libraries) instead of my personal user library, to ensure a maximum level of reproducibility? I guess the new R session would have to be tied to the project itself, but I don't know how to do this efficiently.
I know I could directly use the knit() function instead of the Compile PDF button and, that way, knit() would use my current globalenv(), but I don't like this solution since it's less reproducible.
I think I have figured out the problem myself, but I want to share it with others who could confirm I'm right and possibly help improve my solution.
My specific problem is that my .Rnw file is in a sub-directory of my whole project. When the Compile PDF button creates a new R session, that session starts in this sub-directory and therefore never finds the .Rprofile file that would initialize packrat. I think the easiest solution is to create a .Rprofile file in my sub-directory which contains:
temp <- getwd()              # remember the sub-directory we started in
setwd("..")                  # move up to the project root, where the packrat/ folder lives
source("packrat/init.R")     # initialize the packrat private library for this session
setwd(temp)                  # restore the original working directory
rm(temp)
I have to change the working directory to the project level before calling source("packrat/init.R") because the script itself refers to the project directory...
Anybody can see a better solution?
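To confirm that the packrat library is actually picked up by the session the Compile PDF button spawns, one simple check is to print the active library paths from that same .Rprofile (a sketch; if initialization worked, the packrat private library should appear first):
# optional check: inspect the library search path of the spawned session
message("Library paths: ", paste(.libPaths(), collapse = "\n"))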
P.,
I don't know if this solution works even for the knitr package itself, but I am 99% sure it works for all other packages, as it seems to for me.
(I believe) I have a very similar problem. I have my project folder, but my working directory has always been the sub-folder where my .rnw file is located, a subdirectory of my project folder.
The link to Yihui Xie's answer was very helpful.
Originally I wanted a project folder such as:
project-a/
  working/
    data/
      datas.csv
    analysis/
      library.R
      rscripts.R
    rnw/
      report.rnw
      child/
        preamble.rnw
    packrat/
But I'm not sure if that is possible with packrat when my R library() calls are not in the working directory and packrat cannot parse the .rnw file (I call the library.R file from a chunk using source() in my .rnw file). A few notes:
I wanted to use a .Rproj file to open the project and have project-a/working as the working directory
If this was true then packrat can find the library.R script
But the .rnw file still defaults to its own working directory when compiling
I thought an .Rprofile with knitr::opts_knit$set(root.dir = "..") would work, but I don't think it works for LaTeX commands like \input; those default back to the directory containing the .rnw file
I thought this was insufficient because then you have two working directories, one for your R chunks and one for your LaTeX!
Since the .rnw file always sets the working directory, I put my library.R script in the same directory as my .rnw file, which creates the packrat folder in project-a/working/rnw. I am 99% sure this works because when I created the packrat folder in project-a/working/rnw WITHOUT relocating the library.R file, I got an error that no packages could be found and could not compile the .rnw file.
project-a/
  working/
    data/
      datas.csv
    analysis/
      rscripts.R
    rnw/
      report.rnw
      library.R
      packrat/
      child/
        preamble.rnw
Again, unless I am overlooking something or misunderstanding which packages are being used, this seems to have worked for me. Disclaimer here that I am relatively new to packrat.

Build script not working with HTML5Boilerplate new download

My first problem was that, though the documentation warned the JDK was required, and though I pointed the setup at the JDK's bin directory, tools.jar was being searched for in the JRE folder. This made no sense, but I copied the tools.jar file over and got past that problem. The next problem was the build script failing because it could not find a main.css file. I'm on a Windows 7 machine, and this is what I did to attempt the build:
Downloaded WinAnt v7 and installed it, specifying the jdk1.7.0_04/bin folder when asked for a Java directory.
Downloaded and unpacked a brand new package from HTML5Boilerplate, keeping the extra comments and such.
Downloaded the build project, unpacked it, and dropped its contents into a build folder at the root of the HTML5Boilerplate folder.
Opened a command prompt, navigated to the build directory, and ran the ant command.
The only thing I could think of that was causing the JDK/JRE problem was that this is a 64-bit system. That's just a guess, but the copied file worked OK for now.
This process performs some of the work without complaint, creating intermediate and publish directories, but then fails out, saying that it can't find a main.css file to copy. I want to stress that I didn't make any modifications at all to the files, so I'm confused as to why the build script can't find a file I didn't remove or rename. In the config/default.properties file of the build folder, on lines 74 and 80, it hard-codes main.js and main.css as file names used. I'm not sure if those are supposed to be dynamically generated, or if they must be manually created and included in the project for the build script to run. If so, why doesn't the default structure downloaded from the website have them? If they're dynamically created, I need advice on what is going wrong.
I'd really like to get this up and running so I can get started using HTML5Boilerplate, but I'm a little lost here.
Edit:
After renaming the styles.css file to main.css, the build completed correctly, but the resulting files aren't correct. I read that the script would update the html file references to css and javascript files, but it didn't. For instance, I ended up with e68668b.css after the script ran, but the html file still referenced styles.css. Same for the javascript file. Help!
I found the problem. The build script is now a separate project, which I downloaded from github. I downloaded the HTML5Boilerplate zip file from the HTML5Boilerplate website, which unfortunately still has the old folder structure. I went to github and downloaded the HTML5Boilerplate template there, and that made the difference.
The HTML5Boilerplate website's link points to GitHub's 3.0.2 version
The GitHub link points to version 3.0.2-69
And that's all she wrote. The names of some files changed, as well as some of the folder structure, between these two versions, and the build script I downloaded referenced the newer structure.
