Get the values of "..." inside R functions

In R you can pass parameters on to other functions using "...", e.g.
myfun <- function(..., cex=0.7, mytext=1:10){
plot(...)
text(...,label=mytext)
}
Is there a way to get the values of "..." inside the function?
Suppose you do
myfun(runif(10), runif(10), cex=0.3, label=1:10)
You will get an error because cex is set twice in text(), so it would be nice to extract x and y from "..." for use in text(). (I know par("usr") can be used for getting x and y from the plotting device.)

There are different ways:
The simplest method is to use
dots = list(...)
which returns a named list of the arguments. However, this evaluates all arguments and although this is usually what you want, sometimes it isn’t. To get an unevaluated list of the arguments, use
dots = match.call(expand.dots = FALSE)$...
To illustrate the difference, consider
f = function (...) list(...)
g = function (...) match.call(expand.dots = FALSE)$...
Calling each shows the difference:
f(2 + 2)
# [[1]]
# [1] 4
g(2 + 2)
# [[1]]
# 2 + 2
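Applied back to the original question, a minimal sketch (assuming the first two elements of "..." are the x and y coordinates; this variant of myfun is illustrative, not from the original post) captures the dots with list(...) and passes the pieces to text() explicitly, so cex is only supplied once:
myfun <- function(..., cex = 0.7, mytext = 1:10) {
  dots <- list(...)
  plot(...)
  # assume dots[[1]] and dots[[2]] are the x and y coordinates
  text(x = dots[[1]], y = dots[[2]], labels = mytext, cex = cex)
}
myfun(runif(10), runif(10), cex = 0.3)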


Nested use of call_modify

I am trying to create a call to a function f whose first argument is a call to another function (for which I've chosen dbinom as an example). The call to dbinom (passed on to f) does not include values for all the arguments as these should be finalised within f, and the completed call is returned by f. Here is my failed minimal attempt:
library(rlang)
f <- function(a_call) {
call_modify(a_call, x=1)
}
a_call <- call2(dbinom, size=1, prob=0.5)
y <- call2(f, a_call)
The output for y is:
(function(a_call) {
call_modify(a_call, x=1)
})((function (x, size, prob, log = FALSE)
.Call(C_dbinom, x, size, prob, log))(size = 1, prob = 0.5))
This call will first call a_call without any arguments, and then pass the result on to f.
If I evaluate y, it errors because dbinom's first argument is missing.
A similar-but-related construct:
> call2(call2(dbinom, x=1, size=1, prob=0.5))
((function (x, size, prob, log = FALSE)
.Call(C_dbinom, x, size, prob, log))(x = 1, size = 1, prob = 0.5))()
I get the sense there is something 'not even wrong' with what I'm trying here, and nesting a call modification is best done another way.
It seems that what you are trying to do is handled more naturally by purrr::partial(), which fills in one or more arguments of a function:
f <- function( a_fun ) {purrr::partial( a_fun, x=1 )}
a_fun <- purrr::partial( dbinom, size=1, prob=0.5 )
y <- f(a_fun)
# y(...) is now effectively dbinom(x = 1, size = 1, prob = 0.5, ...)
y() # 0.5
y(log=TRUE) # -0.6931472
The great thing about partial() is that it can be naturally chained with the %>% pipe:
z <- partial(dbinom, size=1) %>% partial(prob=0.5) %>% partial(x=1)
z(log=TRUE) # -0.6931472
If I understand correctly what you're trying to do, then maybe this works better:
f <- function(a_call) {
  call_modify(call_standardise(call2(ensym(a_call)), caller_env()), x = 1)
}
You can use it with either a bare function name or a character string:
f(print)
# print(x = 1)
f("print")
# print(x = 1)
eval(f(print))
# 1
Or with more indirection:
a_call <- expr(print)
eval(call2(f, a_call))
# print(x = 1)
eval(expr(f(!!a_call)))
# print(x = 1)
Since we do a bit of non-standard evaluation here, things get a bit tricky. call_standardise needs to be able to find the function you specify, and it's very probable that it will be found in the environment that calls f, and not necessarily in the environment that calls call_standardise, which would be f's execution environment in this case. That's why caller_env() is explicitly specified when calling call_standardise even though that's the default for the latter's env: default arguments are evaluated in the function's execution environment, whereas explicit arguments are evaluated in the caller's environment.
Here's a contrived-looking example for this problem:
f2 <- function(a_call) {
  call_modify(call_standardise(call2(ensym(a_call))), x = 1)
}
e <- new.env()
e$foo <- function(x) { x + 1 }
with(e, f(foo))
# foo(x = 1)
with(e, f2(foo))
# Error in eval_bare(node_car(expr), env) : object 'foo' not found
However, if you were to develop a package that provides f, the example is no longer contrived: f would live in your package's environment, and other packages could call it for functions that are only available in their respective namespaces. For more specifics and depictions, check this reference, and maybe try drawing the call tree for my example.
call2 constructs a call from the callable object (its first argument) and its evaluated ... arguments. For example, the command below prints "y" to the console because the second argument passed to call2 is evaluated,
> A <- call2(print, x=print('y'))
[1] "y"
and constructs a call to print which takes x="y" as its argument (not x=print("y")):
> A
(function (x, ...)
UseMethod("print"))(x = "y")
In order to get around a_call being evaluated and then passed (to f) in the constructed call, it can be quoted, e.g.
f <- function(a_call) {
call_modify(a_call, x=1)
}
a_call <- call2(dbinom, size=1, prob=0.5)
y <- call2(f, quote(a_call))
Now:
> y
(function(a_call) {
call_modify(a_call, x=1)
})(a_call)
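A quick check (continuing with the objects defined above): evaluating y now calls f on the stored call, which returns the call with x = 1 added, and that modified call can itself be evaluated:
mod_call <- eval(y)   # f(a_call) -> call_modify(a_call, x = 1)
eval(mod_call)
# [1] 0.5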

Non-standard evaluation in a user-defined function with lapply or with in R

I wrote a wrapper around ftable because I need to compute flat tables with frequency and percentage for many variables. As ftable method for class "formula" uses non-standard evaluation, the wrapper relies on do.call and match.call to allow the use of the subset argument of ftable (more details in my previous question).
mytable <- function(...) {
do.call(what = ftable,
args = as.list(x = match.call()[-1]))
# etc
}
However, I cannot use this wrapper with either lapply or with():
# example 1: error with "lapply"
lapply(X = warpbreaks[c("breaks",
"wool",
"tension")],
FUN = mytable,
row.vars = 1)
Error in (function (x, ...) : object 'X' not found
# example 2: error with "with"
with(data = warpbreaks[warpbreaks$tension == "L", ],
expr = mytable(wool))
Error in (function (x, ...) : object 'wool' not found
These errors seem to be due to match.call not being evaluated in the right environment.
As this question is closely linked to my previous one, here is a sum up of my problems:
The wrapper with do.call and match.call cannot be used with lapply or with.
The wrapper without do.call and match.call cannot use the subset argument of ftable.
And a sum up of my questions:
How can I write a wrapper that both allows the use of the subset argument of ftable and can be used with lapply and with()? I have ideas for avoiding lapply and with, but I am looking to understand and correct these errors to improve my knowledge of R.
Is the error with lapply related to the following note from ?lapply?
For historical reasons, the calls created by lapply are unevaluated,
and code has been written (e.g., bquote) that relies on this. This
means that the recorded call is always of the form FUN(X[[i]], ...),
with i replaced by the current (integer or double) index. This is not
normally a problem, but it can be if FUN uses sys.call or match.call
or if it is a primitive function that makes use of the call. This
means that it is often safer to call primitive functions with a
wrapper, so that e.g. lapply(ll, function(x) is.numeric(x)) is
required to ensure that method dispatch for is.numeric occurs
correctly.
The problem with using match.call with lapply is that match.call returns the literal call that was passed into it, without any interpretation. To see what's going on, let's make a simpler function which shows exactly how your function is interpreting the arguments passed into it:
match_call_fun <- function(...) {
call = as.list(match.call()[-1])
print(call)
}
When we call it directly, match.call correctly gets the arguments and puts them in a list that we can use with do.call:
match_call_fun(iris['Species'], 9)
[[1]]
iris["Species"]
[[2]]
[1] 9
But watch what happens when we use lapply (I've only included the output of the internal print statement):
lapply('Species', function(x) match_call_fun(iris[x], 9))
[[1]]
iris[x]
[[2]]
[1] 9
Since match.call gets the literal arguments passed to it, it receives iris[x], not the properly interpreted iris['Species'] that we want. When we pass those arguments into ftable with do.call, it looks for an object x in the current environment, and then returns an error when it can't find it. We need to evaluate iris[x] in the environment where x has its intended value.
As you've seen, adding envir = parent.frame() fixes the problem. This is because that argument tells do.call to evaluate iris[x] in the parent frame, which is the anonymous function in lapply where x has its proper meaning. To see this in action, let's make another simple function that uses do.call to print ls from 3 different environment levels:
z <- function(...) {
print(do.call(ls, list()))
print(do.call(ls, list(), envir = parent.frame()))
print(do.call(ls, list(), envir = parent.frame(2)))
}
When we call z() from the global environment, we see the empty environment inside the function, then the Global Environment:
z()
character(0) # Interior function environment
[1] "match_call_fun" "y" "z" # GlobalEnv
[1] "match_call_fun" "y" "z" # GlobalEnv
But when we call from within lapply, we see that one level of parent.frame up is the anonymous function in lapply:
lapply(1, z)
character(0) # Interior function environment
[1] "FUN" "i" "X" # lapply
[1] "match_call_fun" "y" "z" # GlobalEnv
So, by adding envir = parent.frame(), do.call knows to evaluate iris[x] in the lapply environment where it knows that x is actually 'Species', and it evaluates correctly.
mytable_envir <- function(...) {
tab <- do.call(what = ftable,
args = as.list(match.call()[-1]),
envir = parent.frame())
prop <- prop.table(x = tab,
margin = 2) * 100
bind <- cbind(as.matrix(x = tab),
as.matrix(x = prop))
margin <- addmargins(A = bind,
margin = 1)
round(x = margin,
digits = 1)
}
# This works!
lapply(X = c("breaks","wool","tension"),
FUN = function(x) mytable_envir(warpbreaks[x],row.vars = 1))
As for why adding envir = parent.frame() makes a difference even though it appears to be the default: I'm not 100% sure, but my guess is that when the default argument is used, parent.frame is evaluated inside do.call, returning the environment in which do.call is run. What we're doing, however, is evaluating parent.frame outside do.call, which means it returns one level higher than the default version.
Here's a test function that takes parent.frame() as a default value:
fun <- function(y=parent.frame()) {
print(y)
print(parent.frame())
print(parent.frame(2))
print(parent.frame(3))
}
Now look at what happens when we call it from within lapply both with and without passing in parent.frame() as an argument:
lapply(1, function(y) fun())
<environment: 0x12c5bc1b0> # y argument
<environment: 0x12c5bc1b0> # parent.frame called inside
<environment: 0x12c5bc760> # 1 level up = lapply
<environment: R_GlobalEnv> # 2 levels up = globalEnv
lapply(1, function(y) fun(y = parent.frame()))
<environment: 0x104931358> # y argument
<environment: 0x104930da8> # parent.frame called inside
<environment: 0x104931358> # 1 level up = lapply
<environment: R_GlobalEnv> # 2 levels up = globalEnv
In the first example, the value of y is the same as what you get when you call parent.frame() inside the function. In the second example, the value of y is the same as the environment one level up (inside lapply). So, while they look the same, they're actually doing different things: in the first example, parent.frame is evaluated inside the function when it sees that there is no y= argument; in the second, parent.frame is evaluated in the lapply anonymous function first, before calling fun, and is then passed into it.
As you only want to pass all the arguments on to ftable, you do not need do.call().
mytable <- function(...) {
tab <- ftable(...)
prop <- prop.table(x = tab,
margin = 2) * 100
bind <- cbind(as.matrix(x = tab),
as.matrix(x = prop))
margin <- addmargins(A = bind,
margin = 1)
return(round(x = margin,
digits = 1))
}
The following lapply creates a table for every variable separately; I don't know if that is what you want.
lapply(X = c("breaks",
"wool",
"tension"),
FUN = function(x) mytable(warpbreaks[x],
row.vars = 1))
If you want all 3 variables in 1 table:
warpbreaks$newVar <- LETTERS[3:4]
lapply(X = cbind("c(\"breaks\", \"wool\", \"tension\")",
"c(\"newVar\", \"tension\",\"wool\")"),
FUN = function(X)
eval(parse(text=paste("mytable(warpbreaks[,",X,"],
row.vars = 1)")))
)
Thanks to this issue, the wrapper became:
# function 1
mytable <- function(...) {
do.call(what = ftable,
args = as.list(x = match.call()[-1]),
envir = parent.frame())
# etc
}
Or:
# function 2
mytable <- function(...) {
mc <- match.call()
mc[[1]] <- quote(expr = ftable)
eval.parent(expr = mc)
# etc
}
I can now use the subset argument of ftable, and use the wrapper in lapply:
lapply(X = warpbreaks[c("wool",
"tension")],
FUN = function(x) mytable(formula = x ~ breaks,
data = warpbreaks,
subset = breaks < 15))
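The with() example from the question works now as well, because the ftable call is evaluated in the caller's environment, which with() constructs from the data frame:
with(data = warpbreaks[warpbreaks$tension == "L", ],
     expr = mytable(wool))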
However I do not understand why I have to supply envir = parent.frame() to do.call as it is a default argument.
More importantly, these methods do not resolve another issue: I cannot use the subset argument of ftable with mapply.

R: S3 Method dispatch depending on arguments

I have a generic function foo that I want to call three different ways depending on the arguments given to it.
foo <- function(...) UseMethod("foo")
#default
foo.default <- function(x, y, ...) {
#does some magic
print("this is the default method")
}
#formula
foo.formula <- function(formula, data = list(), ...) {
print("this is the formula method")
}
#data.frame
foo.data.frame <- function(data, x, y, ...) {
print("this is the data.frame method")
}
In the following I show how I expect the method dispatch to work; the actual outputs are presented under each call.
mydata <- data.frame(x=c(1,2,3,4),y=c(5,6,7,8))
#ways to call default function
foo(x = mydata$x, y = mydata$y)
#[1] "this is the default method"
#ways to call formula
foo(formula = mydata$x~mydata$y)
#[1] "this is the formula method"
foo(formula = x~y, data = mydata)
#[1] "this is the formula method"
foo(data = mydata, formula = x~y) #ERROR
#[1] "this is the data.frame method"
#ways to call data.frame method
foo(data = mydata, x = x, y = y)
#[1] "this is the data.frame method"
foo(x = x, y = y, data = mydata) #ERROR
#Error in foo(x = x, y = y, data = mydata) : object 'x' not found
From what I can tell, the method used depends on the class of the first argument. Essentially, I would like the method dispatch to depend on the arguments passed to the generic function foo, not just on the first argument.
I would like the dispatch to have the following priority:
1. If the formula argument is present, the formula method is used (the data argument should be optional here).
2. Otherwise, if the data argument is present, use the data.frame method (which requires the x and y arguments).
3. Otherwise, foo expects the x and y arguments or it will fail.
Note
I would like to avoid defining the generic function foo as follows
foo <- function(formula, data,...) UseMethod("foo")
while this would fix all my issues (I believe all except the last case), this will cause a devtools::check() warning because some of the S3 methods will not have the same arguments as the generic function and will no longer be consistent (specifically foo.default and foo.data.frame). And I wouldn't like to include the missing arguments because those methods have no use for them.
As Thomas has pointed out, this is not the standard behavior for S3 classes. If you really want to stick to S3, however, you could write your functions so as to "mimic" UseMethod, even though it won't be pretty and is probably not what you want to do. Nevertheless, here is an idea that is based on capturing all arguments first, and then checking for the presence of your "preferred" argument type:
Get some objects first:
a <- 1; class(a) <- "Americano"
b <- 2; class(b) <- "Espresso"
Let the function in question capture all arguments with dots, and then check for the presence of an argument type in order of your preference:
drink <- function(...){
  dots <- list(...)
  if(any(sapply(dots, function(cup) class(cup) == "Americano"))){
    drink.Americano(...)
  } else { # you can add more checks here to get a hierarchy
    # try to find an appropriate method first if one exists,
    # using the first element of the arguments as usual
    tryCatch(get(paste0("drink.", class(dots[[1]])))(...),
             # if no appropriate method is found, try the default method:
             error = function(e) drink.default(...))
  }
}
drink.Americano <- function(...) print("Hmm, gimme more!")
drink.Espresso <- function(...) print("Tripple, please!")
drink.default <- function(...) print("Any caffeine in there?")
drink(a) # "Americano", dispatch hard-coded.
# [1] "Hmm, gimme more!"
drink(b) # "Espresso", not hard-coded, but correct dispatch anyway
# [1] "Tripple, please!"
drink("sthelse") # Dispatches to default method
# [1] "Any caffeine in there?"
drink(a,b,"c")
# [1] "Hmm, gimme more!"
drink(b,"c", a)
# [1] "Hmm, gimme more!"

Pass arguments in nested function to update default arguments

I have nested functions and wish to pass arguments to the deepest function. That deepest function will already have default arguments, so I will be updating those argument values.
My MWE uses plot(), but in reality I'm working with png(), which has default height and width arguments.
Any suggestions?
f1 <- function(...){ f2(...)}
f2 <- function(...){ f3(...)}
f3 <- function(...){ plot(xlab="hello1", ...)}
#this works
f1(x=1:10,y=rnorm(10),type='b')
# I want to update the default xlab value, but it fails:
f1(x=1:10,y=rnorm(10),type='b', xlab='hello2')
In your f3(), "hello1" is not a default value for xlab in the list of function's formal arguments. It is instead the supplied value in the function body, so there's no way to override it:
f3 <- function(...){ plot(xlab="hello1", ...)}
I suspect you meant instead to do something like this.
f1 <- function(...){ f2(...)}
f2 <- function(...){ f3(...)}
f3 <- function(..., xlab="hello1") plot(..., xlab=xlab)
## Then check that it works
par(mfcol=c(1,2))
f1(x=1:10,y=rnorm(10),type='b')
f1(x=1:10,y=rnorm(10),type='b', xlab='hello2')
(Do notice that the formal argument xlab must follow the ... argument here, so that it can only be matched exactly (and not by partial matching). Otherwise, in the absence of an argument named xlab, it'll get matched by an argument named x, potentially (and actually here) causing you a lot of grief.)
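To see the grief that the parenthetical warns about, here is a hypothetical variant (f3_bad is not from the original post) with xlab placed before the dots, so that a supplied x= argument partially matches it:
f3_bad <- function(xlab = "hello1", ...) plot(..., xlab = xlab)
# The x vector is now swallowed by xlab, so plot() never receives x:
# f3_bad(x = 1:10, y = rnorm(10), type = 'b')  # errors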
My usual approach for modifying arguments in ... is as follows:
f1 = function(...) {
dots = list(...)
if (!('ylab' %in% names(dots))) {
dots$ylab = 'hello'
}
do.call(plot, dots)
}
# check results
f1(x = 1:10, y = rnorm(10))
f1(x = 1:10, y = rnorm(10), ylab = 'hi')
What happens here is that ... is captured in a list called dots. Next, R checks if this list dots contains any information about ylab. If there is no information, we set it to a specified value. If there is information, we do nothing. Last, do.call(a, b) basically stands for "execute function a with arguments b".
Edit:
This works better with multiple default arguments (and probably also better in general).
f1 = function(...) {
# capture ... in a list
dots = list(...)
# default arguments with their values
def.vals = list(bty = 'n', xlab = 'hello', las = 1)
# find elements in dots by names of def.vals. store those that are NULL
ind = unlist(lapply(dots[names(def.vals)], is.null))
# fill empty elements with default values
dots[names(def.vals)[ind]] = def.vals[ind]
# do plot
do.call(plot, dots)
}
f1(x = 1:10, y = rnorm(10), ylab = 'hi', bty = 'l')

Default argument in R function (formal argument matched by multiple actual arguments)

Simple question, I hope. I want to write a plotting function that has a default value for the y-axis label if the user doesn't specify. I'd also like to allow the ... argument for other plotting parameters, and allow the user to set ylab manually. But I can't figure out how to do this.
# simple scatterplot function with a default ylab
scatter <- function(x,y, ...) {
plot(x, y, ylab="Default y-axis label", ...)
}
# generate data
x <- rnorm(100)
y <- x+rnorm(100)
# use the default
scatter(x,y)
# here I want to use my own label, but I get an error!
scatter(x, y, ylab="New y-axis label")
The error I get is:
Error in plot.default(x, y, ylab = "Default y-axis label", ...) :
formal argument "ylab" matched by multiple actual arguments
I understand the problem, but I don't know the best way to fix it. Thanks for the help!
EDIT: I realize I can do something like
scatter <- function(x,y,ylab = "Default y-axis label", ...) {
plot(x, y, ylab= ylab, ...)
}
...but if I'm writing a package to submit to CRAN, and I have lots of default options I'd like to fiddle with, I don't want to have to document all these standard plotting arguments just because they now appear in my function definition.
Try doing this instead:
scatter <- function(x,y,ylab = "Default y-axis label", ...) {
plot(x, y, ylab= ylab, ...)
}
Expanding slightly on Arun's answer, this is a sketch of one route to take if you have many arguments:
def_args <- list(ylab = "Default Label",xlab = "Default Label")
scatter <- function(x,y, ...) {
cl <- as.list(match.call())[-1L]
do.call("plot",c(cl,def_args[!names(def_args) %in% names(cl)]))
}
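A quick usage sketch, reusing the x and y generated earlier in the question:
scatter(x, y)                             # both labels come from def_args
scatter(x, y, ylab = "New y-axis label")  # user value wins; xlab still defaulted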
Some thought would be needed to decide how you want to handle partial matching of arguments (if at all); e.g., perhaps something like this:
scatter <- function(x,y, ...) {
cl <- as.list(match.call())[-1L]
names(cl) <- match.arg(names(cl),
names(formals(plot.default)),several.ok = TRUE)
do.call("plot",c(cl,def_args[!names(def_args) %in% names(cl)]))
}
would handle partial matching of arguments.
One way using match.call to check if ylab has been specified as an argument:
scatter <- function(x,y, ...) {
mcall = as.list(match.call())[-1L]
if (!"ylab" %in% names(mcall))
plot(x, y, ylab="Default y-axis label", ...)
else plot(x, y, ...)
}
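Checking it with the data from the question:
scatter(x, y)                             # uses the default label
scatter(x, y, ylab = "New y-axis label")  # no more matching error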
As mentioned in the comments, list(...) is a nicer way to expand just the dots arguments than getting all the formal arguments with match.call.
You might also try using pmatch instead of %in% for partial matching of arguments.
I use a function to build an argument list. In my case, I do not care about partially matching argument names, which is good because this won't support it.
# Create a list of input arguments.
# Allow arguments to be specified multiple times, first definition wins.
# The resulting list is intended to be passed to do.call().
make.args <- function(..., PRE.ARGS=list(), POST.ARGS=list()) {
a <- list()
l <- c(PRE.ARGS, list(...), POST.ARGS)
for (name in unique(names(l))) {
a[[name]] <- l[[name]] # First occurrence will be found.
}
return(a)
}
An example of its use:
library(lattice)
plot.rate <- function(col, cond=NULL, ...) {
col <- paste(col, collapse=' + ')
f <- paste(col, '~ Rate')
if (!is.null(cond)) {
cond <- paste(cond, collapse=' + ')
f <- paste(f, cond, sep='|')
}
arg.list <- make.args(...
, x = as.formula(f)
, main=col
, grid=TRUE
, scales=list(x=list(alternating=1) # bottom(/left)
, y=list(alternating=3)) # both
, xlab='x RTM'
)
do.call(xyplot, arg.list)
}
