Testing external Excel output with the R testthat package

I'm trying to test a function of mine with the testthat package. The function is supposed to create an Excel file, and I want to test whether that Excel file exists and contains the things it's supposed to contain. I tried running the function to create a simple mockup Excel file and then reading the output back in, but that doesn't seem to do anything; I'm guessing it's because of the localized testing environment. I also suspect that writing code outside of the tests is a bad idea, since testthat gave me a warning message about it.
Is there a way to test an external output with testthat that I don't know about? I'm new to unit testing and this package, so any help would be appreciated.

I actually solved this myself by accident: code run inside test_that() DOES produce output, just in the tests/testthat folder. I could read the file from there and then test it.
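For anyone who lands here, a minimal sketch of such a test, assuming the readxl package for reading the file back (write_report() is a made-up stand-in for the function under test):

library(testthat)

test_that("the Excel report is created with the expected contents", {
  # write into a temporary file so the test leaves nothing behind
  out <- file.path(tempdir(), "report.xlsx")
  write_report(out)  # hypothetical function that creates the Excel file
  expect_true(file.exists(out))
  # read the output back in and check its structure
  result <- readxl::read_excel(out)
  expect_equal(names(result), c("id", "value"))
})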

Related

Is there an R function to make a copy of all the source code used to generate an analysis?

I have a file run_experiment.rmd which performs an analysis on data using a bunch of .r scripts in another folder.
Every analysis is saved into its own timestamped folder. I save the outputs of the analysis, the inputs used, and if possible I would also like to save the code used to generate the analysis (including the contents of both the .rmd file and the .r files).
The reason for this is that if I make changes to the way my analyses are run, then re-running the analysis with the new, updated files will give different results. If possible, I would like to keep legacy versions of the code so that I can always, if need be, re-run the original analysis.
Have you considered using a git repository to commit your code and output each time you update or run it? I think this is the optimal solution for what you are describing. Each commit has a timestamp associated with it, so you can roll back to a previous version when needed.
The best way to do this is to put all of those scripts into an R package, and in your Rmd file, print sessionInfo() to record the package version used.
You should change the version number of the package each time you make a non-trivial change to it (or even better, with every change).
Then when you want to reproduce the analysis, you use the sessionInfo() listing to work out which version of R and of the packages to install, and you'll get the same environment.
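For example, a couple of lines like these at the end of run_experiment.rmd could save the session details alongside the other outputs; this is just a sketch, with out_dir standing in for the timestamped results folder:

# record the R and package versions next to the analysis outputs
writeLines(capture.output(sessionInfo()),
           file.path(out_dir, "sessionInfo.txt"))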
There are packages to help with this (pak and renv, maybe others), but I haven't used them, so I can't give details or recommendations.

Missing command in an R package

So to get to the point: I need to use an R package called machuruku. To get familiar with the package I used the dataset provided in the original paper (https://academic.oup.com/sysbio/article/70/5/1033/6171196). While trying to run the code for the simulation I get an error message saying that the command "machu.simulation" doesn't exist. Any of you have any idea why that's happening? Am I missing a package?
I downloaded the dataset zip file, dove into the second nested zip file Guillory_and_Brown_simulation-validation.zip, then into its file code_simulation-validation.R, and noticed that this source file uses machu.simulation several times before defining the function, which starts at line 519.
Suggestions:
Grab lines 519 through the end, save them into a different file, source that new file, then try to run the code at the beginning of the original file again (a rough sketch follows below).
Complain (not quietly?) to the authors; the fact that they think this is reproducible means they might have missed something else, too.
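A rough sketch of the first suggestion (the new file name is made up):

# split off the function definitions (line 519 onward) and source them first
src <- readLines("code_simulation-validation.R")
writeLines(src[519:length(src)], "machu_functions.R")
source("machu_functions.R")
# machu.simulation is now defined, so the code at the top of the
# original script should run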

R Source file location problem while using testthat

I'm trying to set up the testthat unit test framework and having some trouble getting the source file locations right.
My package folder structure is like below:
.\R\abc.R
.\R\def.R
.\tests\testthat\test_01.R
In my test case file test_01.R, I need to import abc.R. I managed to get this working by specifying a relative path like below:
'../../R/abc.R'
Now the abc.R file can be sourced successfully from my test cases. However, it failed at the step where abc.R tries to source def.R. I think this is because the working directory is set to ./tests/testthat by testthat.
The fix I can think of is to add the relative path '../../R/' to def.R as well, but this looks like a terrible solution to me, as it will break when I run abc.R directly. There are also a lot more files like abc.R and def.R in my package.
Is there a more graceful way to handle this?
Sorry if this is a straightforward question as I'm still new to R.
Inside ./tests/ there should be a file named testthat.R
Within this file you can add 3 lines:
library(testthat)
library(yourLibraryName)
test_check("yourLibraryName")
Of course replace "yourLibraryName" with the name of your package.
Then all the functions exported by your package will be loaded and tests will be able to use them.
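With that file in place, the whole suite can be run from the package root, for example via devtools:

library(devtools)
test()  # runs everything under tests/testthat/, including test_01.R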

C file does not work in my own R package?

I built my own package in R and created all my functions. Everything worked very well. Then I wanted to include a .c file in my package.
I followed the structure in this link: compiled code. Once I had done that, my package stopped working and I cannot use it anymore.
I tried to fix it more than once but nothing happened. Then I built another package and loaded my functions into it (I had saved a copy of my files).
Now I would like to start again, but I don't want to lose my functions again. Any ideas?
Try writing your C files first and making sure that they work! Then build your package following the structure here.
Follow the steps one by one and you will be fine. The package setup will create the src folder for you, along with all your other files.

How to call R script from another R script, both in same package?

I'm building a package that uses two main functions. One of the functions, in model.R, requires a special type of simulation from sim.R and a way to set up the results in a table from table.R.
In a sharable package, how do I call both the sim.R and table.R files from within model.R? I've tried source("sim.R") and source("R/sim.R"), but those calls don't work from within the package. Any ideas?
Should I just copy and paste the code from sim.R and table.R into the model.R script instead?
Edit:
I have all the scripts in the R directory, and the DESCRIPTION and NAMESPACE files are all set. I just have multiple scripts in the R directory: ~R/ has premodel.R, model.R, sim.R, and table.R. I need the model.R script to use functions from both sim.R and table.R, which are located in the same directory in the package (e.g. ~R/).
To elaborate on joran's point, when you build a package you don't need to source functions.
For example, imagine I want to make a package named TEST. I begin by generating a directory (i.e. folder) named TEST. Within TEST I create another folder named R, and in that folder I include all the R scripts containing the different functions in the package.
At a minimum you also need to include DESCRIPTION and NAMESPACE files. A man directory (for help files) and a tests directory (for unit tests) are also nice to include.
Making a package is pretty easy. Here is a blog with a straightforward introduction: http://hilaryparker.com/2014/04/29/writing-an-r-package-from-scratch/
As others have pointed out, you don't have to source R files in a package. The package loading mechanism will take care of loading the namespace and making all exported functions available. So usually you don't have to worry about any of this.
There are exceptions however. If you have multiple files with R code situations can arise where the order in which these files are processed matters. Often it doesn't matter or the default order used by R happens to be fine. If you find that there are some dependencies within your package that aren't resolved properly you may be faced with a situation where a custom processing order for the R files is required. The DESCRIPTION file offers the optional Collate field for this purpose. Simply list all your R files in the order they should be processed to satisfy the dependencies.
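For example, assuming model.R needs definitions from sim.R and table.R at load time (file names taken from the question; the exact order depends on your actual dependencies), the field could look like this:

Collate:
    'sim.R'
    'table.R'
    'premodel.R'
    'model.R'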
If all your files are in the R directory, every function will be in memory after you do a package build or devtools::load_all().
You may have issues if you have code in those files that is not inside a function, though.
R loads files in alphabetical order.
Usually, this is not a problem, because functions are evaluated when they are called for execution, not at loading time (i.e. a function can refer to another function not yet defined, even in the same file).
But if you have code outside a function in model.R, this code will be executed immediately when the file is loaded, and your package build will usually fail with
ERROR: lazy loading failed for package 'yourPackageName'
If this is the case, wrap the loose code of model.R in a function so you can call it later, when the package, and any external libraries it needs, have fully loaded.
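A minimal sketch of the fix (run_simulation() is a made-up stand-in for whatever the loose code calls):

# model.R -- top-level code like this runs at build/load time and fails:
# results <- run_simulation(n = 100)

# wrapped in a function, it only runs when the user calls it, after the
# package and its dependencies have fully loaded:
run_default <- function() {
  run_simulation(n = 100)
}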
If this piece of code is there to initialize some value, consider using use_data() to have R take care of loading the data into the environment for you.
If this piece of code is just interactive code written while testing and developing the package itself, you should consider putting it elsewhere, or wrap it in a function anyway.
If you really need that code to be executed at loading time, or really have a dependency to resolve, then you must add the Collate line to the DESCRIPTION file, as already stated by Peter Humburg, to force the order in which R loads the files.
Roxygen2 can help you here: put this before your code
#' @include sim.R table.R
NULL
call roxygenize(), and the Collate line will be generated for you in the DESCRIPTION file.
But even doing that, external libraries you depend on may not yet be loaded by the package, leading to failure again at build time.
In conclusion, you'd better not leave code outside functions in a .R file if it's located inside a package.
Since you're building a package, the reason you're having trouble accessing the other functions in your /R directory is that you first need to run:
library(devtools)
document()
from within the working directory of your package. Now each function in your package should be accessible to every other function. Then, to finish up, do:
build()
install()
although it should be noted that a simple document() call will already be sufficient to solve your problem.
Make your functions global by defining them with <<- instead of <- and they will become available to any other script running in that environment.
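For what it's worth, that looks like the sketch below when scripts are source()d into one session (simulate_data is a made-up name); note that it will not help inside a built package, whose namespace is sealed after loading:

# when sourced at the top level, <<- assigns into the global environment
simulate_data <<- function(n) rnorm(n)
# any script sourced later in the same session can call simulate_data()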
