testthat .Rbuildignore + external file (NOTE) - r

I'm building a package that uses testthat for tests; the tests require an external file which, as recommended, lives in tests/testthat/my-file.
However, R CMD check produces
Found the following hidden files and directories:
tests/testthat/my-file
This is reported as a NOTE (Status: 1 NOTE).
If I add my-file to .Rbuildignore (devtools::use_build_ignore("/tests/testthat/my-file")), then the file is, well, ignored during the check, so all tests fail and the package cannot be built.
How can I solve this issue? I understand that a NOTE is passable but I would like to get rid of it nonetheless.

The preferred way (according to Hadley) to load API credentials is via environment variables. If you are sharing the credentials with your package, you can just set them in an .onLoad function that will be run when the package namespace is loaded. If you just want to be able to run tests locally using those credentials but not share them, then add them to the global Renviron.site file (or, less conveniently, to an .Renviron file in your working directory). Then you can delete this file from your package structure (or just .Rbuildignore it) and make the tests conditional on the presence of the environment variable, with something like:
if (!identical(Sys.getenv("MY_ENV_VAR"), "")) {
  testthat::test_check("package")  # test_check() is the usual entry point in tests/testthat.R
}
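A small helper along these lines keeps the condition readable ("MY_ENV_VAR" is just a placeholder for whatever variable holds your credentials):

```r
# Returns TRUE when the credential variable is set and non-empty.
# "MY_ENV_VAR" is a placeholder; substitute your own variable name.
have_creds <- function(var = "MY_ENV_VAR") {
  !identical(Sys.getenv(var), "")
}

# In tests/testthat.R, run the suite only when credentials exist:
# if (have_creds()) testthat::test_check("package")
```

This way the check machine (or CRAN) silently skips the API tests, while your own machine, with the variable set in Renviron.site, runs them.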


How to run R projects / use their relative paths from the terminal without setwd() or cd

I'm kinda lost on that one:
I have set up an R project, let's call it "Test Project.Rproj". The beauty of R projects is the possibility to use relative paths (relative to the .Rproj file). My project consists of a "main.R" script, which is saved on the same level as the .Rproj file.
Additionally I have a directory called 'Output', where I want my plots and exported data to be saved. My "main.R" file looks like the following:
my_df <- data.frame(A = 1:10, B = 11:20)
my_df |>
  writexl::write_xlsx(here::here("Output",
                                 paste0("my_df_",
                                        stringr::str_replace_all(as.character(Sys.time()), ":", ""),
                                        ".xlsx")))
My final goal is to automate the execution of the 'main.R' file using the Windows Task Scheduler. But in order to do so, I have to be able to run the script from the terminal. The problem here is the working directory. When opening an R project, all paths are relative to the .Rproj file. But in the terminal the current working directory is <C:\Users\my_name>. Of course I could manually set the working directory via cd "path\to\my\project". But I would like to avoid that.
My current call for the execution of the main.R file in the terminal is the following:
"C:\Program Files\R\R-4.1.0\bin\Rscript" -e "source('C:/Users/my_name/path/to/my/project/main.R')"
My two ideas for a solution are the following, but I am happy for other suggestions as well.
In order to replicate the usual use of a project: is there a way to execute the .Rproj file from the terminal, creating an environment similar to RStudio, where all the relative paths work when executing scripts from the project afterwards?
There are two packages addressing the problem of relative paths: rprojroot and here, where the former is the basis for the latter. I am pretty sure that here does not provide the needed functionality. I tried adding here::i_am("main.R") to my main.R file, but the project root directory still is not found when executing in the terminal from a working directory outside the project.
For rprojroot to work, I think it is also necessary to have your current working directory somewhere within the project. But this package offers a lot of functionality, so I am not sure whether I am overlooking something.
So I would be happy about any help. Maybe it is impossible and I have to change the working directory manually - then I would be glad to know that as well.
Some links I used in my research:
https://www.tidyverse.org/blog/2017/12/workflow-vs-script/
https://malco.io/2018/11/05/why-should-i-use-the-here-package-when-i-m-already-using-projects/
http://jenrichmond.rbind.io/post/how-to-use-the-here-package/
Thanks a lot!
Edit: My current implementation is an additional R script, where I manually set the working directory via setwd() and source the main.R file. However, it is always suggested to avoid setwd(), which is why this whole question exists.
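One sketch of a workaround (not from the original post; the helper name is made up): Rscript passes its own invocation path in commandArgs(), so a script can locate its own directory and build paths relative to that, regardless of which working directory the terminal or Task Scheduler happens to use.

```r
# Hypothetical helper: derive the directory of the currently running
# script from the "--file=" argument that Rscript supplies.
script_dir <- function(args = commandArgs(trailingOnly = FALSE)) {
  file_arg <- grep("^--file=", args, value = TRUE)
  if (length(file_arg) == 0) {
    return(getwd())  # fallback for interactive sessions
  }
  dirname(sub("^--file=", "", file_arg[1]))
}

# At the top of main.R one could then write, for example:
# out_dir <- file.path(script_dir(), "Output")
```

With this, the Rscript call from the Task Scheduler works without any cd or setwd(), since all output paths are anchored to the script's own location.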

Include and use precompiled .dll / .so in R package

I would like to create an R package that contains a precompiled .dll/.so file. It is unclear to me where to put the file in the package structure (e.g. in the folder inst?) and how to load it -- e.g. what lines do I need to add to other files that allow the .dll or .so to be loaded and functions contained in it to be used.
In particular, I would like to see some examples for the use of dyn.load(), .C() and library.dynam().
In a normal script that is not a package, I would load the dll via
dyn.load("path/to/my_dll.dll")
and then call specific functions contained in that .dll by using
.C("dll_func", input)
However, this seems to be different when trying to convert my script into a package.
Also, do I need .onLoad and how do I use it correctly?
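For orientation, here is a sketch of the conventional pattern, assuming a shared library named my_dll that ends up in the installed package's libs/ directory (all names are placeholders, not from the question):

```r
# R/zzz.R -- load the shared library when the package namespace loads.
# library.dynam() searches the installed package's libs/ directory,
# so the .dll/.so must be installed there; dropping it in inst/ alone
# will not place it under libs/.
.onLoad <- function(libname, pkgname) {
  library.dynam("my_dll", pkgname, libname)
}

# A C routine "dll_func" from that library would then be called as:
# result <- .C("dll_func", as.double(input))
```

Note the difference from a plain script: inside a package you use library.dynam() (which resolves the installed location for you) rather than dyn.load() with a hard-coded path.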

testInstalledPackage in custom libs folder

Question: How do I make tools::testInstalledPackage work when I have a custom lib path defined in Rprofile.site?
I'm not an experienced unix user, so I might be wrong about this. There's a line in tools::testInstalledPackage (shown below) which I suppose runs R in vanilla mode, i.e. my custom lib path in Rprofile.site does not get added in.
cmd <- paste(shQuote(file.path(R.home("bin"), "R")),
             "CMD BATCH --vanilla --no-timing", Ropts, shQuote(Rfile),
             shQuote(failfile))
In this case, when I try to test the package zoo, tools::testInstalledPackage returns a zoo-Ex.Rout.fail file with an error saying that there is no package called 'zoo' which makes sense as vanilla R does not contain my custom Lib folder.
Is there a way to use tools::testInstalledPackage to test my packages in my custom folder? Or do I have to copy the folder into the default file path?
p.s. my current workaround is to create a new function without the --vanilla and attach it to the tools namespace, but I don't think it's a very elegant solution.
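A less invasive sketch worth trying (an assumption, not verified against every R version): --vanilla skips Rprofile.site and .Renviron, but the child R process still inherits ordinary environment variables from the session that launches it, so setting R_LIBS before the call should make the custom library visible without patching tools.

```r
# Point R_LIBS at the custom library before invoking the test runner;
# the path below is a placeholder for the actual custom lib folder.
Sys.setenv(R_LIBS = "/path/to/custom/lib")
# tools::testInstalledPackage("zoo")
```

If that works, it avoids both copying packages into the default library and modifying the tools namespace.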

Load file implicitly from Path

I am trying to get my head around programming with multiple modules (in different files). I don't want to load the files explicitly with include in the right order.
I am using the Atom IDE as my development platform, so I don't run julia explicitly.
When I am just using importall Datastructures (where Datastructures is the name of the module), julia complains:
LoadError: ArgumentError: Module Datastructures not found in current path.
Run `Pkg.add("Datastructures")` to install the Datastructures package.
while loading F:\dev\ai\Interpreter.jl, in expression starting on line 8
There are two ways to build a package or module in julia:
1) Use the tools in PkgDev. You can get them with Pkg.add("PkgDev"); using PkgDev. Now you can use PkgDev.generate("MyPackageName", "MIT") (or whatever license you prefer) to build your package folder. By default, julia will build this folder in the same directory as all your other external packages. On Linux, this would be ~/.julia/v0.6/ (or whatever version you are running). Also by default, this folder will be on the julia path, so you can just type using MyPackageName at the REPL to load it.
Note that julia essentially loads the package by looking for the file ~/.julia/v0.6/MyPackageName/src/MyPackageName.jl and then running it. If your module consists of multiple files, you should have all of them in the ~/.julia/v0.6/MyPackageName/src/ directory, and then have a line of code in the MyPackageName.jl file that says include("MyOtherFileOfCode.jl").
2) If you don't want to keep your package in ~/.julia/v0.6/ for some reason, or you don't want to build your package using PkgDev.generate(), you can of course just set the files up yourself.
Let's assume you want MyPackageName to be stored in the ~/MyCode directory. First, create the directory ~/MyCode/MyPackageName/. Within this directory, I strongly recommend using the same structure that julia and github use, i.e. store all your code in a directory called ~/MyCode/MyPackageName/src/.
At a minimum, you will need a file in this directory called ~/MyCode/MyPackageName/src/MyPackageName.jl (just like in the method above). This file should begin with module MyPackageName and finish with end. Then, put whatever you want in-between (including include calls to other files in the src directory if you wish).
The final step is to make sure that julia can find MyPackageName. To do this, you will need ~/MyCode to be on the julia path. To do this, use: push!(LOAD_PATH, "~/MyCode") or push!(LOAD_PATH, "~/MyCode/MyPackageName").
Maybe you don't want to have to run this command every time you want to access MyPackageName. No problem, you just need to add this line to your .juliarc.jl file, which is automatically run every time you start julia. On Linux, your .juliarc.jl file should be in your home directory, i.e. ~/.juliarc.jl. If it isn't there, you can just create it and put whatever code you want in there. If you're on a different OS, you'll have to google where to put your .juliarc.jl.
This answer turned out longer than I planned...

R package development: possible to create subfolders within the /R directory?

I'm trying to create an R package. I've used roxygen and devtools to help create all the necessary files, and it's working.
Among others I have the folders /man, /R and /tests. Now I would like to create some subfolders in the /R directory, but once I do this and move any scripts inside, I get an Error in namespaceExport(ns, exports) when trying to rebuild the package.
Can I only have script files directly within the /R directory, and is there any solution to this other than putting the script files in other folders one level up (such as old scripts that one may use in the future)?
Thanks
