R: package depends on locally installed package

I am working on two separate, but related packages.
Among a bunch of other functions, flowerR contains the following generic function, show_match:
show_match <- function(x, ...) {
  UseMethod("show_match")
}

#' @export
#' @rdname show_match
show_match.flowerClass <- function(x, ...) {
  # do a bunch of stuff and generate some output
}
Package flowerR is installed locally on my machine, in "C:/my_R_packages"; it loads fine into R (library(flowerR)) and works fine by itself.
In package treeR I want to build on this generic method for a second class:
#' @export
#' @rdname show_match
show_match.treeClass <- function(x, ...) {
  # do a bunch of stuff and generate some output
}
In order for R to be able to find the generic function, I included
Depends: flowerR
in the treeR DESCRIPTION file. However, when I check treeR (using devtools::check), I receive the error: Error : package 'flowerR' required by 'treeR' could not be found. This boggles me, because R has no problem attaching flowerR when using library(flowerR).
How can I make this work?
BTW, what does work fine is:
library(flowerR)
class(x) # [1] "treeData"
show_match(x) # works fine; with flowerR loaded, the method in treeR runs perfectly
So, when flowerR is loaded into R interactively, all works well. The question is how to make this happen when flowerR is not attached by hand.
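One thing worth checking (an assumption on my part, not stated in the question): devtools::check runs R CMD check in a fresh subprocess, so a library path that exists only in the interactive session (for example, added via .Rprofile) may not be visible there. A quick diagnostic sketch:
.libPaths()  # is "C:/my_R_packages" listed?
# If it is only added interactively, setting R_LIBS in ~/.Renviron makes
# subprocesses inherit it as well, e.g.:
# R_LIBS=C:/my_R_packages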

pkgdown fails parsing Rd files when examples are added

For some reason pkgdown is failing to parse one of the .Rd files in my package. It fails when I add examples to the roxygen2 documentation, using either the @examples tag or the @example inst/example/add.R alternative. I minimized my function to two arguments to make it more "reproducible" and still get the same error. Please find below the error message, the .Rd file generated by devtools::document(), and the roxygen2 documentation of the function. As you can see, I am using a very simple example that should run with no problems. One more thing: when I run devtools::check(), all my examples pass, so I don't understand why pkgdown is failing.
Error message
Reading 'man/merge.Rd'
Error : Failed to parse Rd in merge.Rd
i unused argument (output_handler = evaluate::new_output_handler(value = pkgdown_print))
Error: callr subprocess failed: Failed to parse Rd in merge.Rd
i unused argument (output_handler = evaluate::new_output_handler(value = pkgdown_print))
.Rd file
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/merge.R
\name{merge}
\alias{merge}
\title{Merge two tables}
\usage{
merge(x, y)
}
\arguments{
\item{x}{data frame: referred to \emph{left} in R terminology, or \emph{master} in
Stata terminology.}
\item{y}{data frame: referred to \emph{right} in R terminology, or \emph{using} in
Stata terminology.}
}
\value{
a data.table joining x and y.
}
\description{
This is the main and, basically, the only function in joyn.
}
\examples{
x <- c(1, 2)
}
roxygen2 documentation
#' Merge two tables
#'
#' This is the main and, basically, the only function in joyn.
#'
#' @param x data frame: referred to *left* in R terminology, or *master* in
#' Stata terminology.
#' @param y data frame: referred to *right* in R terminology, or *using* in
#' Stata terminology.
#' @return a data.table joining x and y.
#' @export
#' @import data.table
#'
#' @examples
#' x <- c(1, 2)
This error comes from downlit::evaluate_and_highlight (unfortunately that is not reported in the output), and can be fixed by installing the development version of downlit:
library(devtools)
install_github('r-lib/downlit')
This makes sense only if you also use the development version of pkgdown; the stable pkgdown (version 1.6.1) runs just fine with the stable downlit. Of course, the development version of any package can break at any time, but until it does, this is fine.
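To confirm which combination you are actually running, packageVersion() is a quick check (nothing package-specific is assumed here):
packageVersion("pkgdown")  # 1.6.1 is the stable release mentioned above
packageVersion("downlit")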

R package data not available when importing in another package

I have one package, testing, with a data object test_data saved in the data folder under the file name test_data.RData.
testing contains one function, hello(), that uses this data object:
#' hello
#'
#' @return Prints hello "your_name"
#' @export
#'
#' @examples
#' hello()
hello <- function(your_name = "") {
  print(paste("test_data has", nrow(test_data), "rows"))
  print(sprintf("Hello %s!", your_name))
}
The following code works fine:
require(testing)
testing::hello()
[1] "test_data has 32 rows"
[1] "Hello !"
but this fails:
testing::hello()
Error in nrow(test_data) : object 'test_data' not found
Actually, I do not use it directly, but from another package, testingtop, which imports this function:
#' Title
#'
#' @export
#' @importFrom testing hello
hello2 <- function() {
  hello()
}
I have testing in the Imports section of DESCRIPTION, and this fails:
require(testingtop)
testingtop::hello2()
Error in nrow(test_data) : object 'test_data' not found
If I put it in Depends, it works if I load the package with library(); otherwise it still fails:
> library(testingtop)
Loading required package: testing
> testingtop::hello2()
[1] "test_data has 32 rows"
[1] "Hello !"
Restarting R session...
> testingtop::hello2()
Error in nrow(test_data) : object 'test_data' not found
If it were a function instead of a data object, Imports would be fine. Why is it different with a data object, so that I need to load the imported package? Did I miss something? And is it related to LazyData and LazyLoad?
Probably a duplicate of this question
So I think I've found the solution in the documentation of the data function (?data):
Use of data within a function without an envir argument has the almost always undesirable side-effect of putting an object in the user's workspace (and indeed, of replacing any object of that name already there). It would almost always be better to put the object in the current evaluation environment by data(..., envir = environment()). However, two alternatives are usually preferable, both described in the ‘Writing R Extensions’ manual.
For sets of data, set up a package to use lazy-loading of data.
For objects which are system data, for example lookup tables used in calculations within the function, use a file ‘R/sysdata.rda’ in the package sources or create the objects by R code at package installation time.
A sometimes important distinction is that the second approach places objects in the namespace but the first does not. So if it is important that the function sees mytable as an object from the package, it is system data and the second approach should be used.
Putting the data in the internal data file (R/sysdata.rda) made my function hello2() see it:
> testingtop::hello2()
[1] "test_data has 32 rows"
[1] "Hello !"
To supplement Benoit's answer: I had essentially this problem, but when using my package data as a default for a function parameter. In that case there's a third solution in the ?data help file: "In the unusual case that a package uses a lazy-loaded dataset as a default argument to a function, that needs to be specified by ::, e.g., survival::survexp.us."
This third approach solved it for me. (But I found it thanks to Benoit's link.)
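Applied to the hello() example above, that third approach would look roughly like this (a sketch, not the original poster's code):
# refer to the lazy-loaded dataset via :: in the default argument
hello <- function(your_name = "", dat = testing::test_data) {
  print(paste("test_data has", nrow(dat), "rows"))
  print(sprintf("Hello %s!", your_name))
}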

Function imported from dependency not found, requires library(dependency)

I am trying to create an R package that uses functions from another package (gamlss.tr).
The function I need from the dependency is gamlss.dist::TF (gamlss.dist is loaded alongside gamlss.tr), but it is referenced in my code as simply TF within a call to gamlss.tr::gen.trun.
When I load gamlss.tr manually with library(), this works. However, when I rely on the functions of the dependency automatically being imported by my package through @import, I get an "object not found" error as soon as TF is accessed.
My attempt to be more explicit and reference the function I need as gamlss.dist::TF resulted in a different error ("unexpected '::'").
Any tips on how to use this function in my package would be much appreciated!
The code below reproduces the problem if incorporated into a clean R package (as done in this .zip), built and loaded with document("/path/to/package"):
#' @import gamlss gamlss.tr gamlss.dist
NULL

#' Use GAMLSS
#'
#' Generate a truncated distribution and use it.
#' @export
use_gamlss <- function() {
  print("gen.trun():")
  gamlss.tr::gen.trun(par = 0, family = TF)
  # Error in inherits(object, "gamlss.family") : object 'TF' not found
  # gamlss.tr::gen.trun(par = 0, family = gamlss.dist::TF)
  # Error in parse(text = fname) : <text>:1:1: unexpected '::'
  y <- rTFtr(1000, mu = 10, sigma = 5, nu = 5)
  print("trun():")
  truncated_dist <- gamlss.tr::trun(par = 0, family = TF, local = TRUE)
  model <- gamlss(y ~ 1, family = truncated_dist)
  print(model)
}
use_gamlss() will only start working once a user calls library(gamlss.tr).
This is due to bad design of gamlss.tr, in particular the trun.x functions (they take character vectors instead of family objects, and they evaluate everything in the function environment instead of the calling environment).
To work around this, you have to make sure that gamlss.dist is on the search path when gamlss.tr functions execute. (This is why @import-ing it in your package does not help: it would need to be @import-ed in gamlss.tr.)
This can be achieved by adding it to the Depends: field of your package.
If you want to avoid that attaching your package also attaches gamlss.dist, you could instead add the following at the top of use_gamlss():
nsname <- "gamlss.dist"
attname <- paste0("package:", nsname)
if (!(attname %in% search())) {
  attachNamespace(nsname)                           # attach for this call
  on.exit(detach(attname, character.only = TRUE))   # clean up on exit
}
This would temporarily attach gamlss.dist if it is not attached already.
You can read more on namespaces in R in Hadley Wickham's "Advanced R".

Export error on overwriting primitives with S3 in R package

I'm trying to create an S3 generic in my package called dimnames. This is a primitive in R, but there should be an S3 generic of the same name in my package.
I've got the following file, dimnames.r:
#' S3 overwriting primitive
#'
#' @param x object
#' @export
dimnames = function(x) {
  UseMethod("dimnames")
}

#' title
#'
#' @export
dimnames.data.frame = function(x) {
  dimnames.default(x)
}

#' title
#'
#' @export
dimnames.list = function(x) {
  lapply(x, dimnames)
}

#' title
#'
#' @export
dimnames.default = function(x) {
  message("in S3 method")
  base::dimnames(x)
}
I then create a package from it (in R=3.3.2):
> package.skeleton("rpkg", code_files="dimnames.r")
> setwd("rpkg")
> devtools::document() # version 1.12.0
And then check the package
R CMD build rpkg
R CMD check rpkg_1.0.tar.gz
I get the following output (among other messages):
Warning: declared S3 method 'dimnames.default' not found
Warning: declared S3 method 'dimnames.list' not found
Loading the package and checking its contents, dimnames.data.frame is exported while dimnames.default and dimnames.list are not. This does not make sense to me. As far as I understand, I declared the exports correctly. Also, the NAMESPACE file looks good to me:
S3method(dimnames,data.frame)
S3method(dimnames,default)
S3method(dimnames,list)
export(dimnames)
Why does this not work, and how to fix it?
(Bonus points for: why do I need #' title in the S3 implementations when they should not be needed with roxygen=5.0.1?)
S3 methods are only exported if it is desired that the user be able to access them directly. If they are always to be invoked via the generic then there is no need to export them.
The problem with R CMD check is likely due to defining your own generic for dimnames. Normally one just defines methods and leverages the primitive generic already in R. Remove the dimnames generic from dimnames.r.
There should be no problem in adding methods for new classes but you may have problems trying to override the functionality of dimnames for existing classes that R's dimnames handles itself.
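A methods-only sketch of dimnames.r (myclass is a made-up class for illustration; base R's dimnames is already an internal generic, so registering methods is enough):
#' dimnames method for myclass
#' @export
dimnames.myclass = function(x) {
  message("in S3 method")
  NextMethod()  # fall through to the default dimnames behaviour
}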

Testing package exports using 'testthat'

Question: How can I get testthat to run in an environment that loads my package, rather than inherits from my package?
Background: The testthat package runs tests "in an environment that inherits from the package's namespace environment" [see the docs for test_check]. This means it doesn't make sure I've done my exports correctly, and that's bitten me several times.
For example, I have the following code in my package:
##' The foo() method
##' @param x object
##' @export
foo <- function(x)
  UseMethod('foo')

##' @rdname foo
foo.data.frame <- function(x) {
  message("foo data.frame")
}

##' @rdname foo
foo.default <- function(x) {
  message("foo default")
}
And the following in my tests:
x <- 5:13
foo(x)
That tests just fine. But if a user installs the package, they'll get this error:
Error in UseMethod("foo") :
no applicable method for 'foo' applied to an object of class "c('integer', 'numeric')"
The solution is to add @export declarations for the two methods, but it's a bummer that the tests didn't catch that.
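Concretely, the fixed declarations would look like this (the same code as above, plus the missing tags):
##' @rdname foo
##' @export
foo.data.frame <- function(x) {
  message("foo data.frame")
}

##' @rdname foo
##' @export
foo.default <- function(x) {
  message("foo default")
}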
I would much prefer to run all my tests from the point of view of a user, because I tend to screw up my exporting sometimes. Perhaps an option could be added to testthat:::run_tests that selects which behavior is desired?
Use test_dir. I don't use test_check for this exact reason. In "tests/run-tests.R" (the file name doesn't matter; it just needs to be in that directory and end in ".R"), write:
library(testthat)
library(<my package>) # insert actual package name here
test_dir('testthat') # assuming your tests are in "tests/testthat"
Then, in order to run your tests:
setwd("<pkg dir>/tests")
source("run-tests.R")
Or from the command line:
cd <pkg-dir>/tests
Rscript run-tests.R
Or do R CMD build and R CMD check to run the tests that way.
The setwd() is not strictly necessary if your tests don't care about the working directory. However, if they do, doing so replicates the working directory set by R CMD check.
