R: pass function in as a variable

I am working on a project to profile function outputs, so I need to pass a function in as an argument in R. To clarify: I have a varying number of models, and I am not looking for assistance on setting up the models, just on passing the model function names into the scoring function.
This works for a direct call, but I want to make it more generic for building out the module. Here is a brief example:
#create a test function:
model1 = function(y, X){
  fit = lm(y ~ X)
  output = data.frame(resid = fit$residuals)
}
#score function:
score = function(y, X, model){
  y = as.matrix(y)
  X = as.matrix(X)
  fitModel = model(y, X)
  yhat = y - fitModel$resid  # the column is named "resid"; $residual would return NULL
  output = data.frame(yhat = yhat)
}
I can call this code with valid y and X matrices:
df <- data.frame(x = rnorm(5), y = runif(5))
scoreModel1 = score(df$y, df$x, model1)
But what I am looking for is a way to list all of the models, loop through them, and/or call the score function generically. For instance:
models = c("model1")
scoreModel1 = score(df$y,df$x,models[1])
The error that I get with the above code is
Error in score(y, X, model) :
could not find function "model"
I have played around with as.function(), and with listing and unlisting the args, but nothing works. For instance, all of the following arguments produce the same error as above:
models = c(model1)
models = list(model1)
models = list("model1")
Thank you in advance for your help.

For anyone arriving here from Google wondering how to pass a function as an argument, here's a great example:
randomise <- function(f) f(runif(1e3))
randomise(mean)
#> [1] 0.5029048
randomise(sum)
#> [1] 504.245
It's from Hadley Wickham's book, Advanced R.

Your list objects can simply be the functions themselves. Maybe you can get some use out of this structure, or else take Roland's advice and pass formulas. Richiemorrisroe's answer is probably cleaner.
fun1 <- function(x, y){
  x + y
}
fun2 <- function(x, y){
  x^y
}
fun3 <- function(x, y){
  x * y
}
models <- list(fun1 = fun1, fun2 = fun2, fun3 = fun3)
models[["fun1"]](1,2)
[1] 3
models[[1]](1,2)
[1] 3
lapply(models, function(FUN) FUN(x = 1, y = 2))
$fun1
[1] 3
$fun2
[1] 1
$fun3
[1] 2
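Applied to the question, the same structure works with score(): store the model functions themselves (not their names) in a named list and loop over them. A minimal sketch:
models <- list(model1 = model1)
scores <- lapply(models, function(m) score(df$y, df$x, m))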

match.fun is your friend. It is what apply, tapply, et al. use for the same purpose. Note that if you need to pass arguments to the model-fitting functions, then you will either need to bundle all of these up into a function, like so: function(x) sum(x == 0, na.rm = TRUE), or else supply them as a list and use do.call, like so: do.call(myfunc, funcargs).
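As a minimal sketch against the question's code: adding a match.fun() call at the top of score() lets it accept either a function or the name of one.
score = function(y, X, model){
  model = match.fun(model)  # resolves the string "model1" to the function
  y = as.matrix(y)
  X = as.matrix(X)
  fitModel = model(y, X)
  yhat = y - fitModel$resid
  output = data.frame(yhat = yhat)
}
models = c("model1")
scoreModel1 = score(df$y, df$x, models[1])  # the string now works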
Hope this helps.

Another response:
models = list(model1)
scoreModel1 = score(df$y,df$x,models[[1]])
Example to pass function in as variable:
f_add <- function(x, y){ x + y }
f_subtract <- function(x, y){ x - y }
f_multi <- function(x, y){ x * y }
operation <- function(FUN, x, y){ FUN(x, y) }
operation(f_add, 9, 2)
#> [1] 11
operation(f_subtract, 17, 5)
#> [1] 12
operation(f_multi, 6, 8)
#> [1] 48
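And to loop through a list of such functions, as the question asks, a minimal sketch:
ops <- list(add = f_add, subtract = f_subtract, multi = f_multi)
lapply(ops, function(f) operation(f, 6, 8))
#> $add
#> [1] 14
#>
#> $subtract
#> [1] -2
#>
#> $multi
#> [1] 48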
good luck

Related

Convenient way to delegate function calls

I have a series of similar functions that all need to extract some values from a data frame. Something like this:
foo_1 <- function(data, ...) {
  x <- data$x
  y <- data$y
  # some preparatory code common to all foo_X functions
  # .. do some boring stuff with x and y
  # pack and process the result into 'ret'
  return(ret)
}
These functions are then provided as arguments to some other function (let us call it "the master function"; I cannot modify the master function).
However, I wish I could avoid re-writing the same preparatory code in each of these functions. For example, I don't want to use data$x instead of assigning it to x and using x, because that makes the boring stuff hard to read. Presently, I need to write x <- data$x (etc.) in every one of the foo_1, foo_2, ... functions, which is annoying and clutters the code. Also, the packing and processing is common to all the foo_N functions. Other preparatory code includes scaling of variables or regularization of IDs.
What would be an elegant and terse way of doing this?
One possibility is to attach() the data frame (or use with(), as Hong suggested in the answer below), but I don't know what other variables would then end up in my namespace: attaching data can mask other variables I use in foo_1. Also, the foo_N functions should preferably be called with explicit parameters, so it is easier to see what they need and what they are doing.
Next possibility I was thinking of was a construction like this:
foo_generator <- function(number) {
  tocall <- switch(number, foo_1, foo_2, foo_3) # etc.; a numeric switch selects by position
  function(data, ...) {
    x <- data$x
    y <- data$y
    ret <- tocall(x, y, ...)
    # process and pack into ret
    return(ret)
  }
}
foo_1 <- function(x, y, ...) {
  # do some boring stuff
}
Then I can use foo_generator(1) instead of foo_1 as the argument for the master function.
Is there a better or more elegant way? I feel like I am overlooking something obvious here.
You might be overthinking it. You say that the code dealing with preparation and packing is common to all foo_n functions. I assume, then, that # .. do some boring stuff with x and y is where each function differs. If that's the case, then just create a single prep_and_pack function which takes a function as a parameter, and pass in foo_1, foo_2, etc. For example:
prep_and_pack <- function(data, func){
  x <- data$x
  y <- data$y
  # preparatory code here
  xy_output <- func(x, y) # do stuff with x and y
  # code to pack and process into "ret"
  return(ret)
}
Now you can create your foo_n functions that do different things with x and y:
foo_1 <- function(x, y) {
  # .. do some boring stuff with x and y
}
foo_2 <- function(x, y) {
  # .. do some boring stuff with x and y
}
foo_3 <- function(x, y) {
  # .. do some boring stuff with x and y
}
Finally, you can pass multiple calls to prep_and_pack into your master function, where foo_1 etc. are passed in via the func argument:
master_func(prep_and_pack(data = df, func = foo_1),
            prep_and_pack(data = df, func = foo_2),
            prep_and_pack(data = df, func = foo_3))
You could also use switch in prep_and_pack and/or forgo the foo_n functions completely in favor of if-else conditionals to deal with the various cases, but I think the above keeps things nice and clean.
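A minimal runnable version of this pattern, with hypothetical stand-ins for the boring stuff (x + y) and the packing step:
prep_and_pack <- function(data, func){
  x <- data$x
  y <- data$y
  ret <- data.frame(result = func(x, y))  # stand-in "pack" step
  return(ret)
}
foo_1 <- function(x, y) x + y
prep_and_pack(data.frame(x = 1:2, y = 3:4), foo_1)
#>   result
#> 1      4
#> 2      6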
The requirements still seem a bit vague to me, but if your code is so similar that you can simply wrap it around a helper function like tocall in your example, and your input is in a list-like structure (like a data frame, which is just a list of columns), then just write all your foo_* functions to take the "spliced" parameters, as in your proposed solution, and use do.call:
foo_1 <- function(x, y) {
  x + y
}
foo_2 <- function(x, y) {
  x - y
}
data <- list(x = 1:2, y = 3:4)
do.call(foo_1, data)
# [1] 4 6
do.call(foo_2, data)
# [1] -2 -2
I'm not sure the following is a good idea. It reminds me a bit of programming with macros. I don't think I would do this: you'd need to document it carefully, because it is unexpected, confusing, and not self-explanatory.
If you want to reuse the same code in different functions, it might be an option to create it as an unevaluated call and evaluate that call in the different functions:
prepcode <- quote({
  x <- data$x
  y <- data$y
})
foo_1 <- function(data, ...) {
  eval(prepcode)
  # some preparatory code common to all foo_X functions
  # .. do some boring stuff with x and y
  # pack and process the result into 'ret'
  return(list(x, y))
}
L <- list(x = 1, y = "a")
foo_1(L)
#[[1]]
#[1] 1
#
#[[2]]
#[1] "a"
It might be better to then pass prepcode as an argument to foo_1, to make sure there won't be any scoping issues.
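A sketch of that variant, reusing the prepcode object from above:
foo_1 <- function(data, prep = prepcode, ...) {
  eval(prep)  # evaluates in foo_1's frame, where 'data' is visible
  return(list(x, y))
}
foo_1(L)
#[[1]]
#[1] 1
#
#[[2]]
#[1] "a"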
Use with inside the function:
foo_1 <- function(data, ...) {
  with(data, {
    # .. in here, x and y refer to data$x and data$y
  })
}
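A concrete sketch of this; note that assignments made inside with() do not escape it, so the result should be the value of the block:
foo_1 <- function(data, ...) {
  ret <- with(data, {
    x + y  # x and y refer to data$x and data$y
  })
  ret
}
foo_1(data.frame(x = 1:2, y = 3:4))
# [1] 4 6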
I'm not sure I understand fully, but can't you simply use a function for all the common stuff, and then unpack its results into the foo_N functions using list2env? For example:
prepFun <- function(data, ...) {
  x <- data$x
  y <- data$y
  # process and pack x, y, etc. into a named list called ret,
  # so you know exactly what the elements of ret are called
  return(ret)
}
foo_1 <- function(data, ...) {
  # Get prepFun to do the prepping, then use list2env to bring the results
  # into foo_1's environment. You know which elements are named, so there
  # should be no confusion about what does or does not get masked.
  prepResult <- prepFun(data, ...)
  list2env(prepResult, envir = environment())
  # .. do some boring stuff with x and y
  # pack and process the result into 'ret'
  return(ret)
}
Hope this is what you're looking for!
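A runnable toy version of this pattern, with a hypothetical prep step standing in for the real preparatory code:
prepFun <- function(data) {
  list(x = data$x * 2, y = data$y)  # stand-in prep work returning a named list
}
foo_1 <- function(data) {
  list2env(prepFun(data), envir = environment())
  x + y
}
foo_1(data.frame(x = 1:2, y = 3:4))
# [1] 5 8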
I think defining a function factory for this task is a bit overkill and confusing. You can define a general function and use purrr::partial() on it when passing it to your master function.
Something like:
foo <- function(data, ..., number, foo_funs) {
  tocall <- foo_funs[[number]]
  with(data[c("x", "y")], tocall(x, y, ...))
  # process and pack into ret
  return(ret)
}
foo_1 <- function(x, y, ...) {
  # do some boring stuff
}
foo_funs <- list(foo_1, foo_2) # etc.
Then call master_fun(fun = purrr::partial(foo, number = 1), ...).
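A self-contained sketch of the partial() idea, with a trivial stand-in body and assuming the master function simply calls the function it is given on a data frame:
library(purrr)
foo_1 <- function(x, y, ...) x + y
foo_funs <- list(foo_1)
foo <- function(data, ..., number, foo_funs) {
  tocall <- foo_funs[[number]]
  with(data[c("x", "y")], tocall(x, y, ...))
}
f <- partial(foo, number = 1, foo_funs = foo_funs)
f(data.frame(x = 1:2, y = 3:4))
# [1] 4 6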
Another possibility is to use list2env, which saves the components of a list into a specified environment:
foo_1 <- function(data){
  list2env(data, envir = environment())
  x + y
}
foo_1(data.frame(x = 1:2, y = 3:4))
See also this question.

How to use lapply or a family of the apply function for calling a function within a function in R?

I have a parent function (i.e., hrat) that calls a helper function (i.e., drat) within it. I would like to apply this function over a certain vector. I am providing code below to demonstrate my logic; it produces the error message shown after it.
Code:
drat <- function(y){
  x <- y * 5
  return(x)
}
hrat <- function(z, j, drat){
  y <- z + 1
  w <- drat(y) + j
  return(w)
}
z <- c(1:5)
j <- 4
result <- lapply(z,j, function(x) hrat(x, drat(x)))
ERROR MESSAGE:
Error in get(as.character(FUN), mode = "function", envir = envir) :
object 'j' of mode 'function' was not found
Any help will be appreciated. Thank you
To avoid confusion, it is better to use an anonymous function call, passing j and drat through explicitly:
lapply(z, function(x) hrat(x, j, drat))
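With the question's z and j, this gives:
result <- lapply(z, function(x) hrat(x, j, drat))
unlist(result)
# [1] 14 19 24 29 34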

Can I avoid the `eval(parse())` defining a function with `polynomial()` in R?

I want to avoid using parse() in a function definition that contains a polynomial().
My polynomial is this:
library(polynom)
polynomial(c(1, 2))
# 1 + 2*x
I want to create a function which uses this polynomial expression as in:
my.function <- function(x) magic(polynomial(c(1, 2)))
where for magic(), I have tried various combinations of expression(), formula(), eval(), as.character(), etc... but nothing seems to work.
My only working solution is using eval(parse()):
eval(parse(text = paste0('poly_function <- function(x) ', polynomial(c(1, 2)))))
poly_function(x = 10)
# 21
Is there a better way to do what I want? Can I avoid the eval(parse())?
Like you, I thought that the polynomial function was returning an R expression, but we were both wrong. Reading the help index for package:polynom would have helped us both:
pol <- polynomial(c(1, 2))
str(pol)
#Class 'polynomial' num [1:2] 1 2
help(package = polynom)
So user20650 is correct and:
> poly_function <- as.function(pol)
> poly_function(10)
[1] 21
So this is how the authors (Venables, Hornik, Maechler) do it:
> getAnywhere(as.function.polynomial)
A single object matching ‘as.function.polynomial’ was found
It was found in the following places
registered S3 method for as.function from namespace polynom
namespace:polynom
with value
function (x, ...)
{
    a <- rev(coef(x))
    w <- as.name("w")
    v <- as.name("x")
    ex <- call("{", call("<-", w, 0))
    for (i in seq_along(a)) {
        ex[[i + 2]] <- call("<-", w, call("+", a[1], call("*", v, w)))
        a <- a[-1]
    }
    ex[[length(ex) + 1]] <- w
    f <- function(x) NULL
    body(f) <- ex
    f
}
<environment: namespace:polynom>
(The generated body evaluates the polynomial by Horner's scheme: it starts with w <- 0 and repeatedly computes w <- a_i + x * w, working from the highest coefficient down.) Since you mention in your comments that getAnywhere is new to you, it might also help to review the "run-up" to using it. If you type a function name at the console prompt, you get the code, in this case:
> as.function
function (x, ...)
UseMethod("as.function")
<bytecode: 0x7f978bff5fc8>
<environment: namespace:base>
Which is rather unhelpful until you follow it up with:
> methods(as.function)
[1] as.function.default as.function.polynomial*
see '?methods' for accessing help and source code
The asterisk at the end of the polynomial version tells you that the code is not "exported", i.e. not available at the console just by typing its name. So you need to pry it out of a loaded namespace with getAnywhere.
It seems like you could easily write your own function too:
poly_function = function(x, p){
  sum(sapply(seq_along(p), function(i) p[i] * x^(i - 1)))
}
# As 42- mentioned in comment to this answer,
# it appears that p can be either a vector or a polynomial
pol = polynomial(c(1, 2))
poly_function(x = 10, p = pol)
#[1] 21
#OR
poly_function(x = 10, p = c(1,2))
#[1] 21
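If memory serves, the polynom package also ships a predict() method for polynomial objects, which evaluates the polynomial directly; treat this as an assumption to verify:
pol <- polynomial(c(1, 2))
predict(pol, 10)  # assumed predict.polynomial method from polynom
#[1] 21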

Dynamically creating functions and expressions

I am currently dealing with a problem. I am working on a package for some specific distributions where, among other things, I would like to create a function that fits a mixture to some data. For this I would like to use, for example, the fitdistr function. The problem is that I don't know in advance which distributions, weights, and number of components the mixture will be composed of. Hence I need a function that dynamically creates the density function of some specified mixture so the fitdistr function can use it. For example, if the user calls:
fitmix(data, dist = c(norm, chisq), params = list(c(mean = 0, sd = 3), df = 2), weights = c(0.5, 0.5))
to use the ML method, the code needs to create a density function
function(x, mean, sd, df) 0.5*dnorm(x, mean, sd) + 0.5*dchisq(x, df)
so it can call optim or fitdistr.
An obvious solution is to use a lot of paste + eval + parse, but I don't think that is the most elegant approach. A nice solution is probably hiding somewhere in non-standard evaluation and expression manipulation, but I don't have enough skill in that area.
P.S. the params can be used as starting values for the optimizer.
Building expressions is relatively straightforward in R with functions like as.call and bquote, plus the fact that functions are first-class objects in R. Building functions with dynamic signatures is a bit trickier. Here's a pass at some functions that might help:
to_params <- function(l) {
  z <- as.list(l)
  setNames(lapply(names(z), function(x) bquote(args[[.(x)]])), names(z))
}
add_exprs <- function(...) {
  x <- list(...)
  Reduce(function(a, b) bquote(.(a) + .(b)), x)
}
get_densities <- function(f) {
  lapply(paste0("d", f), as.name)
}
weight_expr <- function(w, e) {
  bquote(.(w) * .(e))
}
add_params <- function(x, p) {
  as.call(c(as.list(x), p))
}
call_with_x <- function(fn) {
  as.call(list(fn, quote(x)))
}
fitmix <- function(data, dist, params, weights) {
  fb <- Reduce(add_exprs, Map(function(d, p, w) {
    weight_expr(w, add_params(call_with_x(d), to_params(p)))
  }, get_densities(dist), params, weights))
  f <- function(x, args) {}
  body(f) <- fb
  f
}
Note that I changed the types of some of your parameters. The distributions should be strings. The parameters should be a list of named vectors. It would work with a call like this:
ff <- fitmix(data, dist = c("norm", "chisq"), params = list(c(mean = 0, sd = 3), c(df = 2)),
             weights = c(0.5, 0.5))
It returns a function that takes an x and a list of named arguments. You could call it like
ff(0, list(mean=3, sd=2, df=2))
# [1] 0.2823794
which returns the same value as
x <- 0
0.5 * dnorm(x, mean = 3, sd = 2) + 0.5 * dchisq(x, df = 2)
# [1] 0.2823794

Anonymous passing of variables from current environment to subfunction calls

The function testfun1, defined below, does what I want it to do. (For the reasoning behind all this, see the background info below the code example.) The question I wanted to ask you is why what I tried in testfun2 doesn't work. To me, both appear to be doing exactly the same thing. As shown by the print in testfun2, the evaluation of the helper function inside testfun2 takes place in the correct environment, yet the variables from the main function environment get magically passed to the helper function in testfun1 but not in testfun2. Does anyone know why?
helpfun <- function(){
  x <- x^2 + y^2
}
testfun1 <- function(x, y){
  xy <- x * y
  environment(helpfun) <- sys.frame(sys.nframe())
  x <- eval(as.call(c(as.symbol("helpfun"))))
  return(list(x = x, xy = xy))
}
testfun1(x = 2,y = 1:3)
## works as intended
eval.here <- function(fun){
  environment(fun) <- parent.frame()
  print(environment(fun))
  eval(as.call(c(as.symbol(fun))))
}
testfun2 <- function(x, y){
  print(sys.frame(sys.nframe()))
  xy <- x * y
  x <- eval.here("helpfun")
  return(list(x = x, xy = xy))
}
testfun2(x = 2,y = 1:3)
## helpfun can't find variable 'x' despite having the same environment as in testfun1...
Background info: I have a large R code base in which I want to call helper functions inside my main function. They alter variables of the main function's environment. The purpose of all this is mainly to unclutter my code. (The main function is currently over 2000 lines, with many calls to various helper functions which are themselves 40-150 lines long...)
Note that the number of arguments to my helper functions is very high, so the traditional explicit passing of function arguments ("helpfun(arg1 = arg1, arg2 = arg2, ..., arg50 = arg50)") would be cumbersome and wouldn't yield the uncluttering of the code that I am aiming for. Therefore, I need to pass the variables from the parent frame to the helper functions anonymously.
Use this instead. (In your eval.here, environment<- was applied to the character string "helpfun" rather than to the function object, so the helpfun that eval() later looked up in the global environment was never modified; get() fetches the actual function first.)
eval.here <- function(fun){
  fun <- get(fun)
  environment(fun) <- parent.frame()
  print(environment(fun))
  fun()
}
Result:
> testfun2(x = 2,y = 1:3)
<environment: 0x0000000013da47a8>
<environment: 0x0000000013da47a8>
$x
[1] 5 8 13
$xy
[1] 2 4 6
