For the moment, at least, this is an exercise in learning for me, so the actual functions or their complexity is not the issue. Suppose I write a function whose argument list includes some input variables and a function name, passed as a string. This function then calculates some variables internally and "decides" how to feed them to the function name I've passed in.
For nonprimitive functions, I can do the following (for this example, assume none of my funcname functions have any arguments other than, at most, (x, y, z); if they did, I'd have to write some code to search names(formals(get(funcname))) for matching names so as not to delete the other arguments):
foo <- function (a,b,funcname) {
x <- 2*a
y <- a+3*b
z <- -b
formals(get(funcname)) <- list(x=x, y=y, z=z)
bar <- get(funcname)()
return(bar)
}
And the nice thing is, the function funcname will execute without error even if it doesn't use x, y or z (so long as there are no other args that lack defaults).
The problem with "primitive" functions is that I don't know of any way to find or modify their formals. Other than writing a wrapper, e.g. foosin <- function(x) sin(x), is there a way to set up my foo function to work with both primitive and nonprimitive function names as input arguments?
formals(args(FUN)) can be used to get the formals of a primitive function.
You could add an if statement to your existing function.
> formals(sum)
# NULL
> foo2 <- function(x) {
if(is.primitive(x)) formals(args(x)) else formals(x)
## formals(if(is.primitive(x)) args(x) else x) is another option
}
> foo2(sum)
# $...
#
#
# $na.rm
# [1] FALSE
#
> foo2(with)
# $data
#
#
# $expr
#
#
# $...
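For what it's worth, here is one possible way (a sketch of mine, not part of the answer above) to fold that check back into the original foo(): skip formals<- entirely, keep only the computed values whose names appear in the target function's formals, and pass them with do.call(). formals(args(f)) supplies those names for primitives and closures alike, which sidesteps the fact that a primitive's formals cannot be reassigned.
foo <- function(a, b, funcname) {
  x <- 2 * a
  y <- a + 3 * b
  z <- -b
  f <- get(funcname)
  # formal argument names; args() makes this work for primitives too
  argnames <- names(formals(args(f)))
  supplied <- list(x = x, y = y, z = z)
  # keep only the values the target function actually has arguments for
  do.call(f, supplied[names(supplied) %in% argnames])
}
foo(1, 2, "sin")   # calls sin(x = 2), i.e. sin(2)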
Building on Richard S' response, I ended up doing the following. Posted just in case anyone else ever tries to do things as weird as I do.
EDIT: I think more type-checking needs to be done. It's possible that coleqn could be the name of a non-function object, in which case get(coleqn) will return some data. Probably I need to add an if(is.function(rab)) check right after the if(!is.null(rab)). (Of course, given that I wrote the function for my own needs, if I was stupid enough to pass an object, I deserve what I get :-) ).
# "coleqn" is the input argument, which is a string that could be either a function
# name or an expression.
rab <- tryCatch(get(coleqn), error = function(x) NULL)
# oops, rab can easily be neither NULL nor a closure. Damn.
if (!is.null(rab)) {
    # I believe this means it must be a function.
    # Thanks to Richard Scriven of SO for this fix to handle primitives;
    # we are not allowed to redefine a primitive's formals.
    qq <- list(x = x, y = y, z = z)
    # match up the actual formals names by building a list of valid
    # arguments to pass to do.call
    argnames <- names(formals(args(coleqn)))
    # position of each supplied value within the function's formals (NA if absent)
    argk <- match(names(qq), argnames)
    arglist <- list()
    for (j in seq_along(qq))
        if (!is.na(argk[j])) arglist[[names(qq)[j]]] <- qq[[j]]
    colvar <- do.call(coleqn, arglist)
} else {
    # the input is just an expression (string), not a function
    colvar <- eval(parse(text = coleqn))
}
The result, colvar, is an object generated either by the expression or by the function, using variables internal to the main function (which is not shown in this snippet).
When working with packages like openxlsx, I often find myself writing repetitive code, such as defining the wb and sheet arguments with the same values.
To respect the DRY principle, I would like to define one variable that contains multiple arguments. Then, when I call a function, I should be able to provide said variable to define multiple arguments.
Example:
foo <- list(a=1,b=2,c=3)
bar <- function(a,b,c,d) {
return(a+b+c+d)
}
bar(foo, d=4) # should return 10
How should the foo() function be defined to achieve this?
Apparently you are just looking for do.call, which allows you to create and evaluate a call from a function and a list of arguments.
do.call(bar, c(foo, d = 4))
#[1] 10
How should the foo() function be defined to achieve this?
You've got it slightly backwards. Rather than trying to wrangle the output of foo into something that bar can accept, write foo so that it takes input in a form that is convenient to you. That is, create a wrapper function that provides all the boilerplate arguments that bar requires, without you having to specify them manually.
Example:
bar <- function(a, b, c, d) {
return(a+b+c+d)
}
call_bar <- function(d=4) {
bar(1, 2, 3, d)
}
call_bar(42) # shorter than writing bar(1, 2, 3, 42)
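If some of the boilerplate values occasionally need to be overridden too, one possible middle ground (my own sketch, combining the wrapper idea with do.call(); call_bar is the same illustrative name as above) is to keep the fixed values in a list inside the wrapper and merge any overrides in with modifyList():
bar <- function(a, b, c, d) a + b + c + d
call_bar <- function(...) {
  fixed <- list(a = 1, b = 2, c = 3)          # the boilerplate arguments
  do.call(bar, modifyList(fixed, list(...)))  # anything in ... overrides or adds
}
call_bar(d = 4)           # 10
call_bar(a = 10, d = 4)   # 19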
I discovered a solution using rlang::exec.
First, we must have a function to structure the dots:
getDots <- function(...) {
  # capture the expressions supplied through ..., evaluate each one, and return
  # the results (sapply() leaves them as a list when they differ in length/type)
  out <- sapply(as.list(match.call())[-1], function(x) eval(parse(text = deparse(x))))
  return(out)
}
Then we must have a function that executes our chosen function, feeding in our static parameters as a list (a, b, and c), in addition to d.
library(magrittr)  # provides the %>% pipe used below

execute <- function(FUN, ...) {
  # evaluate the dots, flatten one level so a list argument is spliced into
  # its components, then splice everything into a call to FUN
  dots <-
    getDots(...) %>%
    rlang::flatten()
  out <- rlang::exec(FUN, !!!dots)
  return(out)
}
Then calling execute(bar, abc, d = 4), where abc is the list of fixed arguments (the foo list from the question), returns 10, as it should.
Alternatively, we can write bar %>% execute(abc, d=4).
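Put together, a minimal end-to-end run might look like the following; abc is simply the question's foo list under the name this answer uses, and magrittr must be attached for %>%:
bar <- function(a, b, c, d) a + b + c + d
abc <- list(a = 1, b = 2, c = 3)   # the fixed arguments (the question's foo)
execute(bar, abc, d = 4)
# [1] 10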
Let me give you an example!
How to get two or more return values from a function
Method 1: Use global variables. Changes made to global variables inside the function remain in effect after the call, so changing the values of several global variables inside the function body is equivalent to returning multiple values.
Method 2: Pass an array name as a parameter. Any changes you make to the array's contents (sorting it, adding to or subtracting from its elements, and so on) are still visible in the caller after the function returns, so this also hands back a set of values.
Method 3: Use pointer variables. The principle is the same as Method 2, because an array name is just the address of the array's first element. Not much more to say.
Method 4: If you know C++, you can use reference parameters.
You can try these four methods; I just think the problem is somewhat similar, so I am offering them in the hope that they help!
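These four methods are really C/C++ advice; since the surrounding thread is about R, it may help to add that the idiomatic R counterpart is to return the values in a named list, and the closest analogue of Method 1 is superassignment with <<-. A small sketch of both (stats_of and add_to_total are made-up names, purely for illustration):
# idiomatic R: bundle the results into a named list and unpack in the caller
stats_of <- function(x) {
  list(mean = mean(x), sd = sd(x))
}
res <- stats_of(c(1, 2, 3, 4))
res$mean
res$sd
# rough analogue of "Method 1": write to a variable outside the function
total <- 0
add_to_total <- function(x) {
  total <<- total + x   # modifies `total` in the enclosing/global environment
  invisible(NULL)
}
add_to_total(5)
total   # now 5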
Note: This is separate from, though perhaps similar to, the
deparse-substitute trick
of obtaining the name of a passed argument.
Consider the following situation: I have some function to be called, and the return value
is to be assigned to some variable, say x.
Inside the function, how can I capture that the name to be assigned to the
returned value is x, upon calling and assigning the function?
For example:
nameCapture <- function() {
# arbitrary code
captureVarName()
}
x <- nameCapture()
x
## should return some reference to the name "x"
What in R most closely approximates the captureVarName() referenced in the example?
My intuition was that there would be something in the call stack to do with
assign(), where x would be an argument and could be extracted, but
sys.call() yielded nothing of the sort; does it then occur internally, and if
so, what is a sensible way to obtain something like captureVarName()?
My notion is that it would act in a similar manner to how the following works, though without the assign() function, using the <- operator instead:
nameCapture <- function() sys.call(1)[[2]]
assign("x", nameCapture())
x
# [1] "x"
I am not sure if it is possible, but I would like to be able to grab the default argument values of a function and test them, along with the code inside my functions, without having to remove the commas (this is especially useful when there are many arguments).
In effect, I want to be able to have commas when sending arguments into the function but not have those commas if I copy and paste the arguments and run them by themselves.
For example:
foo=function(
x=1,
y=2,
z=3
) {
bar(x,y,z)
}
Now to test pieces of the function outside of the code block, copy and paste
x=1,
y=2,
z=3
bar(x,y,z)
But this gives an error because there is a comma after x=1
Perhaps I am not asking the right question. If this is strange, what is the preferred method for debugging functions?
Please note: I just posted a nearly identical question for Julia.
If you want to programmatically get at the arguments of a function and their default values, you can use formals:
fxn = function(a, b, d = 2) {}
formals(fxn)
# $a
#
#
# $b
#
#
# $d
# [1] 2
I suppose if you wanted to store the default value of every argument to your function into a variable of that name (which is what you're asking about in the OP), then you could do this with a for loop:
info <- formals(fxn)
for (varname in names(info)) {
assign(varname, info[[varname]])
}
a
# Error: argument "a" is missing, with no default
b
# Error: argument "b" is missing, with no default
d
# [1] 2
So for arguments without default values, you'll need to provide a value after running this code (as you would expect). For arguments with defaults, they'll now be set.
I'm looking for a simple function to speed up my ability to write and debug R functions. Consider the following blocks of code:
# Part A:
myfun = function(a, b = 5, out = "hello"){
if(a>b) print(out)
return(a-b)
}
# Part B:
b = 5
out = "hello"
# Part C:
do.args = function(f){
#intialize the arguments of myfun in the parent environment
???
}
The function myfun is a trivial example of a bigger problem: I often have a complicated function with many arguments. To efficiently write and debug such a function, I find it useful to initialize the arguments of the function and 'step through' it line by line. Initializing the arguments, as in Part B above, is somewhat of a hassle when there are lots of arguments, and I would prefer to have a function, as in Part C, that takes only the string "myfun" as its argument and produces the same effect as running Part B in the current environment.
This only works for functions where all the arguments have default values. In other words, myfun has to have a default value for a defined in the function.
some.func <- function(infunc){
forms <- formals(infunc)
for(i in 1:length(forms)){
assign(names(forms)[i],forms[[i]],envir=globalenv())
}
}
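For instance, with the myfun from the question (where a has no default), this first version behaves roughly like this: b and out are assigned in the global environment, while a ends up bound to the empty symbol and errors when you try to use it.
myfun <- function(a, b = 5, out = "hello") {
  if (a > b) print(out)
  return(a - b)
}
some.func(myfun)
b    # [1] 5
out  # [1] "hello"
a    # Error: argument "a" is missing, with no default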
You could add a check to deal with the variables that do not have default values, but it may not work in all examples. In this example I set all the missing variables to NA, and you could change that choice. Note: assigning the missing variables to NULL will not work.
some.func <- function(infunc){
forms <- formals(infunc)
for(i in 1:length(forms)){
if(class(forms[[i]])=="name") forms[[i]] <- NA
assign(names(forms)[i],forms[[i]],envir=globalenv())
}
}
You could also adjust the function to simply skip the missing variables by using next after the if statement, rather than setting them to NA or some other value, as in the next example:
some.func <- function(infunc){
forms <- formals(infunc)
for(i in 1:length(forms)){
if(class(forms[[i]])=="name") next
assign(names(forms)[i],forms[[i]],envir=globalenv())
}
}
If you want to reassign formal arguments there is a formals<- function. By default the environment in which it does the assignment is the same as the one in which the function was created, but that could be changed. See ?formals and ?alist.
formals(myfun) <- alist(a=,b=4, out="not awake")
myfun
#------------------
function (a, b = 4, out = "not awake")
{
    if (a > b)
        print(out)
    return(a - b)
}
You need to use alist with the argument of the form a= if you want the default to be missing.
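As a quick check of that point (my own small illustration, not from the original answer): list() evaluates its arguments, so an empty a= entry is an error there, while alist() leaves its arguments unevaluated and keeps a as the empty symbol that marks a missing default.
try(list(a = , b = 4))    # errors: list() tries to evaluate the empty argument
str(alist(a = , b = 4))   # keeps `a` as the empty (missing) symbol and `b` as 4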
In R, the idiomatic way to call another function without evaluating the parameters you give it is apparently as follows:
Call <- match.call(expand.dots = TRUE)
# Modify parameters here as needed and set unneeded ones to NULL.
Call[[1L]] <- as.name("name.of.function.to.be.called.here")
eval.parent(Call)
However, when I put a namespaced name (e.g. utils::write.csv) in the as.name() call, I get an error:
"could not find function "utils::write.csv"
What is the proper way of using this R idiom to call a namespaced function?
Here is a solution using do.call(), which both constructs and evaluates the function call.
Like the approach you started with, this one uses the fact that R calls are lists in which: (a) the first element is the name of a function; and (b) all following elements are arguments to that function.
j <- function(x, file) {
Call <- match.call(expand.dots = TRUE)
arglist <- as.list(Call)[-1]
do.call(utils::write.csv, arglist)
}
dat <- data.frame(x=1:10, y=rnorm(10))
j(dat, file="outfilename.csv")
EDIT: FWIW, here's an example from plot.formula in base R, which uses a construct similar to the one above:
{
m <- match.call(expand.dots = FALSE)
eframe <- parent.frame()
. . .
. . .
m <- as.list(m)
m[[1L]] <- stats::model.frame.default
m <- as.call(c(m, list(na.action = NULL)))
mf <- eval(m, eframe)
. . .
. . .
}
The function uses the do.call() construct later on. Going a bit deeper into the weeds, my reading is that in the snippet shown here, it instead uses several steps mostly because of the need to add na.action=NULL to the list of arguments.
In any case, it looks like the do.call() options is as close to canonical as could be desired.
As @Josh O'Brien answered, do.call is much more straightforward to use.
The first argument to do.call can be either a function name or an actual function.
The function name can NOT contain the namespace qualifier. The :: part is actually a function that takes the names on both sides and finds the corresponding function, so it must be evaluated separately to work.
So, with do.call, you need something like:
# ...Stuff from Josh's answer goes here
# And then:
do.call(utils::write.csv, arglist)
And with eval:
Call <- match.call(expand.dots = TRUE)
# Modify parameters here as needed and set unneeded ones to NULL.
Call[[1L]] <- utils::write.csv
eval.parent(Call)
Note the lack of quotes around the function name. That evaluates to the function closure.
Another way of getting the function from a namespace-qualified name:
eval(parse(text="utils::write.csv"))
Again, the :: function is called, and it correctly finds the function.
Another more manual way is to extract the namespace name & function name and then do the lookup yourself:
x <- strsplit("utils::write.csv", "::")[[1]]
get(x[2], asNamespace(x[1]))
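A small illustration of why the original as.name() attempt failed (my addition, not part of the answer): a qualified name is not a single symbol but an unevaluated call to the :: function, so it has to be evaluated before it can serve as the function in a constructed call.
e <- quote(utils::write.csv)
class(e)     # "call" -- a call to `::`, not a name/symbol
as.list(e)   # `::`, utils, write.csv
as.name("utils::write.csv")   # by contrast, a single symbol whose name merely
                              # contains "::" -- no function by that name exists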