S4 class [ (subset) inheritance with additional arguments

This is an extension of "Using callNextMethod() within accessor function in R".
Update 2017-03-25: to illustrate that this only fails when the methods are loaded, but not when they are in a built package, I created a dummy package: https://github.com/zkamvar/inheritest#readme
Basic problem:
I have a class bar that inherits from another class foo, and both of them take additional arguments in their [ methods. The method for foo works consistently, but the method for bar fails after the first use.
Error and Traceback:
Error in callNextMethod(x, i, j, ..., drop): bad object found as method (class "function")
4: stop(gettextf("bad object found as method (class %s)", dQuote(class(method))),
domain = NA)
3: callNextMethod(x, i, j, ..., drop) at #9
2: .local(x, i, j, ..., drop = drop)
1: BAR["x"]
Further details:
I have a package that implements a class that depends on a class from another package. When the packages are built, everything works fine, but when my package is simply loaded (using devtools::load_all(".")), I get the behavior below.
Minimum Working Example:
foo <- setClass("foo", representation(x = "numeric", y = "numeric"))
bar <- setClass("bar", representation(distance = "numeric"), contains = "foo")
setMethod(f = "[", signature = signature(x = "foo", i = "ANY", j = "ANY", drop = "ANY"),
definition = function(x, i, j, ..., foo = TRUE, drop = FALSE) {
if (foo)
message("FOOOOOOO")
if (i == "x") {
return(x#x)
} else {
if (i == "y") {
return(x#y)
}
}
})
#> [1] "["
setMethod(f = "[", signature = signature(x = "bar", i = "ANY", j = "ANY", drop = "ANY"),
definition = function(x, i, j, ..., bar = TRUE, drop = FALSE) {
if (bar)
message("BAAAAAAR")
if (i == "distance") {
return(x#distance)
} else {
callNextMethod(x, i, j, ..., drop)
}
})
#> [1] "["
FOO <- new("foo", x = 1, y = 4)
BAR <- new("bar", x = 1, y = 4, distance = 3)
FOO["x"]
#> FOOOOOOO
#> [1] 1
BAR["x"]
#> BAAAAAAR
#> FOOOOOOO
#> [1] 1
FOO["x"]
#> FOOOOOOO
#> [1] 1
BAR["distance"]
#> BAAAAAAR
#> [1] 3
BAR["x"] # fails
#> BAAAAAAR
#> Error in callNextMethod(x, i, j, ..., drop): bad object found as method (class "function")
BAR["x", foo = FALSE]
#> BAAAAAAR
#> [1] 1
Note: when I passed this through reprex, the first and last calls to BAR resulted in errors as well, but I am showing what I experience in an interactive session. I am using R version 3.3.3.

This is because callNextMethod() is not smart enough to handle methods on primitives with augmented formals. I've fixed it and will commit to trunk soon.

Here's a partial answer: the problem is specific to "[". Here is some working code that replaces the '[' method with a 'bat' method; it works fine for me:
foo <- setClass("foo", representation(x = "numeric", y = "numeric"))
bar <- setClass("bar", representation(distance = "numeric"), contains = "foo")
bat <- function (x, i, j, ..., drop = FALSE) message('in bat')
setGeneric('bat')
setMethod(f = "bat", signature = signature(x = "foo"),
definition = function(x, i, j, ..., foo = TRUE, drop = FALSE) {
if (foo)
message("FOOOOOOO")
if (i == "x") {
return(x#x)
} else {
if (i == "y") {
return(x#y)
}
}
})
#> [1] "["
setMethod(f = "bat", signature = signature(x = "bar"),
definition = function(x, i, j, ..., bar = TRUE, drop = FALSE) {
if (bar)
message("BAAAAAAR")
if (i == "distance") {
return(x#distance)
} else {
callNextMethod(x, i, j, ..., drop)
}
})
FOO <- new("foo", x = 1, y = 4)
BAR <- new("bar", x = 1, y = 4, distance = 3)
bat(FOO, 'x')
bat(BAR, 'distance')
bat(BAR, 'x')
And now, running them again:
bat(FOO, 'x')
FOOOOOOO
[1] 1
bat(BAR, 'x')
BAAAAAAR
FOOOOOOO
[1] 1
bat(BAR, 'distance')
BAAAAAAR
[1] 3
bat(BAR, 'x')
BAAAAAAR
FOOOOOOO
[1] 1
So I think this has something to do with the interaction between S4 dispatch and ['s own dispatching... and solutions? I have none, except to avoid S4 like the plague it seems to be. Maybe R-devel can help. It's possible this is a genuine R bug, given that the code only breaks for [.

The issue likely has to do with the fact that [ is a primitive, and primitives are handled differently in S4. Digging into callNextMethod() shows that the call stack isn't analyzed correctly when the method has different arguments from the generic for that primitive function. If you drop the argument bar from the method definition, dispatching works correctly.
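A minimal sketch of that dropped-argument variant (an untested illustration, assuming the classes and the foo method from the question are already defined): the bar method keeps exactly the generic's formals and reads the extra flag from ... instead of declaring it.
setMethod(f = "[", signature = signature(x = "bar", i = "ANY", j = "ANY", drop = "ANY"),
          definition = function(x, i, j, ..., drop = FALSE) {
            # read the extra flag from ... rather than adding it to the formals
            dots <- list(...)
            bar <- if (is.null(dots$bar)) TRUE else dots$bar
            if (bar)
              message("BAAAAAAR")
            if (i == "distance") {
              return(x@distance)
            } else {
              callNextMethod(x, i, j, ..., drop = drop)
            }
          })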
That said, there is another workaround that doesn't require choosing another function name: add an extra function as.foo and re-call the generic after converting to a foo object:
setGeneric("as.foo", function(x)standardGeneric("as.foo"))
setMethod("as.foo", signature = "bar",
function(x)
new("foo", x = x#x, y = x#y))
setMethod(f = "[", signature = signature(x = "bar", i = "ANY", j = "ANY", drop = "ANY"),
definition = function(x, i, j, ..., bar = TRUE, drop = FALSE) {
if (bar)
message("BAAAAAAR")
if (i == "distance") {
return(x#distance)
} else {
x <- as.foo(x)
callGeneric()
}
}
)
This way you circumvent the hiccup in dispatching, and all the code that used to fail now works:
FOO["x"]
#> FOOOOOOO
#> [1] 1
BAR["x"]
#> BAAAAAAR
#> FOOOOOOO
#> [1] 1
BAR["distance"]
#> BAAAAAAR
#> [1] 3
BAR["x"]
#> BAAAAAAR
#> FOOOOOOO
#> [1] 1
BAR["x", foo = FALSE]
#> BAAAAAAR
#> [1] 1

Related

Let a function within a function extract arguments it expects from the ellipses and disregard all others?

Advanced R gives a simple example of "dot dot dot" and how to pass named arguments through to a function within a function:
i01 <- function(y, z) {
  list(y = y, z = z)
}
i02 <- function(x, ...) {
  i01(...)
}
str(i02(x = 1, y = 2, z = 3))
# List of 2
# $ y: num 2
# $ z: num 3
This approach only works when the inner function accepts all the arguments provided via the ellipses. For example, adding an argument that the inner function doesn't use causes an error:
i01 <- function(y, z) {
  list(y = y, z = z)
}
i02 <- function(x, ...) {
  i01(...)
}
# Errors since i01() receives j, which it doesn't know how to handle
str(i02(x = 1, y = 2, z = 3, j = 4))
# Error in i01(...) : unused argument (j = 4)
I found a way around this that gets very long-winded in real-world use (e.g., imagine it with more than a dozen arguments):
i01 <- function(y, z) {
  list(y = y, z = z)
}
i02 <- function(x, ...) {
  ellip <- list(...)
  i01(
    if (!is.null(ellip$y)) { ellip$y },
    if (!is.null(ellip$z)) { ellip$z }
  )
}
str(i02(x = 1, y = 2, z = 3, j = 4))
# List of 2
# $ y: num 2
# $ z: num 3
Question
Is there an elegant way to tell a function within a function to i) look at ... and ii) extract everything it recognises and ignore everything else?
I wouldn't call this elegant, but here's one way to do it:
i02 <- function(x, ...) {
  i01args <- list(...)[c("y", "z")]
  do.call(i01, i01args)
}
This assumes that you will always include both y and z in the .... If they are optional, you could do something like
i02 <- function(x, ...) {
  allargs <- list(...)
  i01args <- allargs[which(names(allargs) %in% c("y", "z"))]
  do.call(i01, i01args)
}
This version loses unnamed arguments.
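For instance (a hypothetical call against the definitions above), an argument intended for y but passed without a name is silently dropped before do.call() ever sees it:
# the unnamed 2 has no name in list(...), so the name filter discards it and
# i01() ends up being called with z only, failing because y is missing
i02(x = 1, 2, z = 3)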
Generally speaking I try to avoid this kind of construction. You sometimes need it when you want to pass optional arguments to two different functions, e.g. when i02 calls both i01 and i03. But then it relies on the argument names of those two functions being completely distinct: if both i01 and i03 have an argument named w, how would you know which of them should receive it? A better construction is something like
i02 <- function(x, i01args = list(), i03args = list()) {
  do.call(i01, i01args)
  do.call(i03, i03args)
}
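A hypothetical usage of that construction, with a made-up i03 purely for illustration:
i03 <- function(w) w * 10   # invented helper, just to show the pattern
i02(x = 1, i01args = list(y = 2, z = 3), i03args = list(w = 4))
# returns 40, the value of the final do.call(i03, i03args)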
Here are three approaches. The last one is a generalization of the second.
Either pull y and z out of ... by naming them explicitly, as in io2a; or leave the arguments unmodified and manipulate the call, as in io2b; or, if io2 doesn't know the arguments of i01 and must find them out dynamically, construct the call as in io2c.
Note that the use of list(...) in the question has the disadvantage that it evaluates all the arguments, including the ones that are not used, whereas the approach here does not. Look at the lm source code for another example of this.
io2a <- function(x, y, z, ...) i01(y, z)
io2a(x = 1, y = 2, z = 3, j = 4) |> str()
## List of 2
## $ y: num 2
## $ z: num 3
io2b <- function(x, ...) {
  mf <- match.call()
  m <- match(c("y", "z"), names(mf), 0L)
  mf <- mf[c(1L, m)]
  mf[[1L]] <- as.name("i01")
  eval.parent(mf)
}
io2b(x = 1, y = 2, z = 3, j = 4) |> str()
## List of 2
## $ y: num 2
## $ z: num 3
# same except we dynamically determine y and z
io2c <- function(x, ...) {
  mf <- match.call()
  nms <- intersect(names(mf), names(formals(i01)))
  m <- match(nms, names(mf), 0L)
  mf <- mf[c(1L, m)]
  mf[[1L]] <- as.name("i01")
  eval.parent(mf)
}
io2c(x = 1, y = 2, z = 3, j = 4) |> str()
## List of 2
## $ y: num 2
## $ z: num 3
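To see the lazy-evaluation point in action (a hypothetical check reusing io2b from above), an unused argument whose evaluation would fail is simply never touched:
# j never gets evaluated because only the matched y and z land in the rebuilt call;
# the list(...)-based version from the question would force it and error instead
io2b(x = 1, y = 2, z = 3, j = stop("never evaluated")) |> str()
## List of 2
## $ y: num 2
## $ z: num 3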

S3: Modify default argument before calling NextMethod()

In an S3 generic function, I'd like to modify a function argument before calling NextMethod(). As a starting point, I looked through #44 of Henrik Bengtsson's "Wishlist for R". The following snippet is taken from there and corresponds to his suggestion on how to modify an argument before calling NextMethod().
x <- structure(NA, class = "A")
expected <- list(x = x, a = 3)
foo <- function(x, a) UseMethod("foo")
foo.A <- function(x, a) {
  a <- a + 1
  NextMethod()
}
foo.default <- function(x, a) {
  list(x = x, a = a)
}
identical(foo(x, a = 2), expected)
#> [1] TRUE
identical(foo(x, 2), expected)
#> [1] TRUE
Now what has me stumped is the following behavior where the argument to be modified has a default value.
bar <- function(x, a) UseMethod("bar")
bar.A <- function(x, a = 2) {
  a <- a + 1
  NextMethod()
}
bar.default <- function(x, a = 2) {
  list(x = x, a = a)
}
identical(bar(x, a = 2), expected)
#> [1] TRUE
identical(bar(x, 2), expected)
#> [1] TRUE
identical(bar(x), expected)
#> [1] FALSE
Can someone help me understand what is happening here? Any ideas on how to make the default argument case work (apart from an explicit call of bar.default())?
I'm not sure how realistic this set-up is, but the problem with it is that calling bar(x) means that you are calling bar.A(x), then (via NextMethod()) you are calling bar.default(x), rather than bar.default(x, a = 3) as you might expect.
The way around this is to pass a explicitly as a parameter in NextMethod. The issue you will then have is that if the user doesn't name the second parameter, bar.default will throw an error because it is given three parameters instead of two (x, 2 and a = 3). You can get around this by including a ... parameter in bar.default so that unnamed parameters are ignored.
x <- structure(NA, class = "A")
expected <- list(x = x, a = 3)
bar <- function(x, ...) UseMethod("bar")
bar.A <- function(x, a = 2) {
  a <- a + 1
  NextMethod("bar", x, a = a)
}
bar.default <- function(x, ..., a = 2) {
  list(x = x, a = a)
}
identical(bar(x, a = 2), expected)
#> [1] TRUE
identical(bar(x, 2), expected)
#> [1] TRUE
identical(bar(x), expected)
#> [1] TRUE
Created on 2020-04-02 by the reprex package (v0.3.0)

"unused arguments" error when using a method

This is really a mystery for me. I have defined my method like this (for class "graf"):
addStatistics <- function(x) UseMethod("addStatistics")
addStatistics.graf <- function(x, stat_name = NULL, value = NULL) {
  if (stat_name == "env_coef_delta_mnll") {
    x$env_coef_delta_mnll <- value
  }
  x
}
I am calling the method like this, and getting an error:
addStatistics(m, "env_coef_delta_mnll", 0)
#Error in addStatistics(m, "env_coef_delta_mnll", 0) :
# unused arguments ("env_coef_delta_mnll", 0)
Why doesn't the method accept those supplied arguments, and why does it say they are "unused"?
Here is a way of solving the problem. Apparently you are creating a setter function, so I will change the generic a bit.
`addStatistics<-` <- function(x, ...) UseMethod("addStatistics<-")
`addStatistics<-.graf` <- function(x, stat_name = NULL, value = NULL) {
  if (stat_name == "env_coef_delta_mnll") {
    x$env_coef_delta_mnll <- value
  }
  x
}
as.graf <- function(x) {
  class(x) <- "graf"
  x
}
x <- as.graf(list())
addStatistics(x, "env_coef_delta_mnll") <- 1234
x
#$env_coef_delta_mnll
#[1] 1234
#
#attr(,"class")
#[1] "graf"
@GGrothendieck beat me to the punch, but here's a reprex to prove it:
addStatistics <- function(...) UseMethod("addStatistics")
addStatistics.graf <- function(x, stat_name, value) {
  if (!missing(stat_name)) {
    if (stat_name == "env_coef_delta_mnll") {
      x$env_coef_delta_mnll <- value
    }
  }
  x
}
m <- list(env_coef_delta_mnll = 3)
class(m) <- "graf"
addStatistics(m, stat_name = "env_coef_delta_mnll", 4)
#> $env_coef_delta_mnll
#> [1] 4
#>
#> attr(,"class")
#> [1] "graf"
Created on 2020-02-20 by the reprex package (v0.3.0)
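For what it's worth, the error in the original question comes from the generic itself: addStatistics was defined as function(x) UseMethod("addStatistics"), so the extra arguments are rejected before dispatch ever reaches addStatistics.graf. A minimal sketch of that one-line fix, keeping the question's method unchanged (untested beyond this toy case):
# adding ... to the generic lets extra arguments flow through to the methods
addStatistics <- function(x, ...) UseMethod("addStatistics")
addStatistics(m, "env_coef_delta_mnll", 0)  # now dispatches without the "unused arguments" error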

Specifying names of arguments with a function in R

What I want to do is the following: write a function, let's call it 'function_creator', that takes an argument name plus some additional arguments, and returns a function whose argument is named after the value I passed as name to 'function_creator'.
The following code-snippet illustrates how I would like function_creator to behave:
f <- function_creater(name = "y", y_min = 2, y_max = 3)
f
function(y) {
  y >= 2 && y <= 3
}
How do I do this in R? I guess that something like sys.call() might be helpful, but I don't really know how to proceed from there.
Using rlang::new_function() you can do this. It also uses exprs() from rlang to create an argument with no default value (i.e., a named list with nothing in it). The body of the function is put through substitute() to swap in the values for the variable names.
library(rlang)
function_creater <- function(name, y_min, y_max) {
  new_args <- setNames(exprs(temp_name = ), name)
  new_body <- substitute((y >= y_min && y <= y_max),
                         list(y = sym(name), y_min = y_min, y_max = y_max))
  new_function(new_args, new_body)
}
Testing:
> f <- function_creater(name = "y", y_min = 2, y_max = 3)
> f
function (y)
(y >= 2 && y <= 3)
<environment: 0x000000001af5f518>
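A quick sanity check of the generated closure (hypothetical calls in the same session):
> f(y = 2.5)
[1] TRUE
> f(4)
[1] FALSE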
There are better ways to do this, but there's also the poor man's method: create a string with code and parse/eval it. I'm using glue for readability, but the same thing could be done with paste.
library(glue)
function_creator <- function(name = 'y', y_min = 2, y_max = 3){
  code_string <-
    glue('
    function({name}){{
      {name} >= {y_min} && {name} <= {y_max}
    }}
    ')
  eval(parse(text = code_string))
}
f <- function_creator(name = 'bob', y_min = 10, y_max = 20)
f(bob = 11) # TRUE
f(bob = 8) # FALSE
Note: with glue, expressions inside {} are evaluated and the result replaces the {}, e.g. {name} is replaced with 'y' within the string. Because of this substitution mechanism, literal { and } characters need an extra { or } to escape them.
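A tiny illustration of that escaping rule (a hypothetical one-liner):
glue('function({name}) {{ ... }}', name = 'y')
#> function(y) { ... }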
Or, using the method of @Adam:
library(rlang)
function_creator <- function(name, y_min, y_max){
  new_function(pairlist2(name = ),
               expr((!!sym(name) >= !!y_min && !!sym(name) <= !!y_max)))
}
function_creator(name = "y", y_min = 2, y_max = 3)
#> function (name)
#> (y >= 2 && y <= 3)
#> <environment: 0x7f8d0eeda028>
Created on 2021-12-05 by the reprex package (v2.0.1)

List of quosures as input of a set of functions

This question refers to "Programming with dplyr"
I want to slice the ... argument of a function and use each element as an argument for a corresponding function.
foo <- function(...){
  <some code>
}
should, for example, evaluate foo(x, y, z) in this form:
list(bar(~x), bar(~y), bar(~z))
so that x, y, z remain quoted till they get evaluated in bar.
I tried this:
foo <- function(...){
  arguments <- quos(...)
  out <- map(arguments, ~bar(UQ(.)))
  out
}
I have two intentions:
Learn better how tidyeval/rlang works and when to use it.
Turn future::futureOf() into a function that gets me more than one future at once.
This approach might be overly complicated, because I don't fully understand the underlying concepts of tidyeval yet.
You don't really need any packages for this. match.call can be used.
foo <- function(..., envir = parent.frame()) {
  cl <- match.call()
  cl$envir <- NULL
  cl[[1L]] <- as.name("bar")
  lapply(seq_along(cl)[-1], function(i) eval(cl[c(1L, i)], envir))
}
# test
bar <- function(...) match.call()
foo(x = 1, y = 2, z = 3)
giving:
[[1]]
bar(x = 1)
[[2]]
bar(y = 2)
[[3]]
bar(z = 3)
Another test
bar <- function(...) ..1^2
foo(x = 1, y = 2, z = 3)
giving:
[[1]]
[1] 1
[[2]]
[1] 4
[[3]]
[1] 9
