I am facing a problem with the apply function passing on arguments to a function when they are not needed. I understand that apply doesn't know what to do with the optional arguments and just passes them on to the function.
But anyhow, here is what I would like to do:
First I want to specify a list of functions that I would like to use.
functions <- list(length, sum)
Then I would like to create a function which applies these specified functions to a data set.
myFunc <- function(data, functions) {
for (i in 1:length(functions)) print(apply(X=data, MARGIN=2, FUN=functions[[i]]))
}
This works fine.
data <- cbind(rnorm(100), rnorm(100))
myFunc(data, functions)
[1] 100 100
[1] -0.5758939 -5.1311173
But I would also like to use additional arguments for some functions, e.g.
power <- function(x, p) x^p
This doesn't work the way I want it to. If I modify myFunc to:
myFunc <- function(data, functions, ...) {
for (i in 1:length(functions)) print(apply(X=data, MARGIN=2, FUN=functions[[i]], ...))
}
redefine functions as
functions <- list(length, sum, power)
and then try my function I get
myFunc(data, functions, p=2)
Error in FUN(newX[, i], ...) :
2 arguments passed to 'length' which requires 1
How may I solve this issue?
Sorry for the wall of text. Thank you!
You can use Curry from the functional package to fix the parameter you want, put the curried function in the list of functions you want to apply, and finally iterate over this list of functions:
library(functional)
power <- function(x, p) x^p
funcs = list(length, sum, Curry(power, p=2), Curry(power, p=3))
lapply(funcs, function(f) apply(data, 2, f))
With your code you can use:
functions <- list(length, sum, Curry(power, p=2))
myFunc(data, functions)
I'd advocate using Colonel's Curry approach, but if you want to stick to base R you can always:
funcs <- list(length, sum, function(x) power(x, 2))
which is roughly what Curry ends up doing
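For completeness, a minimal base R sketch of the whole example (no extra packages; power and data as in the question):
power <- function(x, p) x^p
data <- cbind(rnorm(100), rnorm(100))
funcs <- list(length, sum, function(x) power(x, 2), function(x) power(x, 3))
lapply(funcs, function(f) apply(data, 2, f))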
One option is to pass the parameters in a list, with one element holding the arguments needed for each function. You can combine those parameters with the ones needed for apply using c and then use do.call to call apply. Something like this; I also collect all the output in a list here rather than using print, but your usage may vary.
power <- function(x, p) x^p
myFunc <- function(data, functions, parameters) {
lapply(seq_along(functions), function(i) {
p0 <- list(X=data, MARGIN=2, FUN=functions[[i]])
do.call(apply, c(p0, parameters[[i]]))
})
}
d <- matrix(1:6, nrow=2)
functions <- list(length, sum, power)
parameters <- list(NULL, NULL, p=3)
myFunc(d, functions, parameters)
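For reference, with the small matrix d above the result is a list with one element per function; a sketch of the expected output:
[[1]]
[1] 2 2 2

[[2]]
[1]  3  7 11

[[3]]
     [,1] [,2] [,3]
[1,]    1   27  125
[2,]    8   64  216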
You can use the lazyeval package:
library(lazyeval)
my_evaluate <- function(data, expressions, ...) {
lapply(expressions, function(e) {
apply(data, MARGIN=2, FUN=function(x) {
lazy_eval(e, c(list(x=x), list(...)))
})
})
}
And use it like this:
my_expressions <- lazy_dots(sum = sum(x), sumpow = sum(x^p), length_k = length(x)*k )
data <- cbind(rnorm(100), rnorm(100))
my_evaluate(data, my_expressions, p = 2, k = 2)
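To see what lazy_eval does with a single expression (a small illustration, not part of the original answer):
e <- lazy_dots(sum(x^p))[[1]]
lazy_eval(e, list(x = 1:3, p = 2))
# [1] 14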
I am trying to write a function that will apply a user-specified binary operator (e.g. < ) to a raster object. To do so is fairly simple. For example:
selection <- raster::overlay(x = data, fun = function(x) {return(x < 2)})
My issue is that this code would be running within a function, with which I would like to specify both the binary operator and the criteria value (which is 2 in the example above) as variables. For example:
my.func <- function(data, binary_operator, value){
selection <- raster::overlay(x=data, fun=function(x) {x binary_operator value})  # pseudocode
return(selection)
}
I have tried to construct the function as a call without success.
my.func <- function(data, binary_operator, value){
selection <- raster::overlay(x=data, fun=function(x) {call(sprintf("x %s %s", binary_operator, value))})
return(selection)
}
Is there a way to construct the call of the second function using variables in the first function?
Thanks for your help.
Write your code like this:
my.func <- function(data, binary_operator, value){
selection <- raster::overlay(x=data, fun=function(x) binary_operator(x, value))
return(selection)
}
You need to call this as
my.func(data, `<`, 2)
(with backticks for quotes). If you want to allow "<" for the operator, you could use do.call:
my.func <- function(data, binary_operator, value){
selection <- raster::overlay(x=data, fun=function(x)
do.call(binary_operator, list(x, value)))
return(selection)
}
This will work with either form of argument.
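For a quick check, here is a minimal usage sketch of the do.call version, assuming a small RasterLayer built from a matrix (the object r is just illustrative):
library(raster)
r <- raster(matrix(1:9, nrow = 3))   # tiny example layer
my.func(r, `<`, 2)    # operator passed as a function
my.func(r, "<", 2)    # operator passed as a character string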
The example is probably simpler than the real case, but for the example you gave, it would be more direct to do:
selection <- data < 2
I am currently dealing with a problem. I am working on a package for some specific distributions where, among other things, I would like to create a function that will fit a mixture to some data. For this I would like to use, for example, the fitdistr function. The problem is that I don't know in advance which distributions, weights, and number of components the mixture will be composed of. Hence I need a function that will dynamically create a density function for the specified mixture so that fitdistr can use it. For example, if the user calls:
fitmix(data,dist=c(norm,chisq),params=list(c(mean=0,sd=3),df=2),weights=c(0.5,0.5))
then, to use the ML method, the code needs to create a density function
function(x,mean,sd,df) 0.5*dnorm(x,mean,sd)+0.5*dchisq(x,df)
so it can call optim or fitdistr.
An obvious solution is to use a lot of paste+eval+parse, but I don't think that is the most elegant approach. A nicer solution is probably hiding somewhere in non-standard evaluation and expression manipulation, but I don't have enough skill in that area.
P.S. the params can be used as starting values for the optimizer.
Building expressions is relatively straightforward in R with functions like as.call and bquote, and the fact that functions are first-class objects in R helps. Building functions with dynamic signatures is a bit trickier. Here's a pass at some functions that might help:
# Turn a named vector of parameters into expressions like args[["mean"]]
to_params <- function(l) {
  z <- as.list(l)
  setNames(lapply(names(z), function(x) bquote(args[[.(x)]])), names(z))
}
# Combine several expressions into one sum: e1 + e2 + ...
add_exprs <- function(...) {
  x <- list(...)
  Reduce(function(a, b) bquote(.(a) + .(b)), x)
}
# Map distribution names to density function names, e.g. "norm" -> dnorm
get_densities <- function(f) {
  lapply(paste0("d", f), as.name)
}
# Multiply an expression by its mixture weight
weight_expr <- function(w, e) {
  bquote(.(w) * .(e))
}
# Append the parameter expressions to an existing call
add_params <- function(x, p) {
  as.call(c(as.list(x), p))
}
# Build a call of the form dnorm(x)
call_with_x <- function(fn) {
  as.call(list(fn, quote(x)))
}
fitmix <- function(data, dist, params, weights) {
  # Build the weighted sum of density calls as an (unevaluated) expression
  fb <- Reduce(add_exprs, Map(function(d, p, w) {
    weight_expr(w, add_params(call_with_x(d), to_params(p)))
  }, get_densities(dist), params, weights))
  # Create an empty function and install the expression as its body
  f <- function(x, args) {}
  body(f) <- fb
  f
}
Note that I changed the types of some of your parameters. The distributions should be strings. The parameters should be a list of named vectors. It would work with a call like this:
ff <- fitmix(data, dist=c("norm","chisq"), params=list(c(mean=0,sd=3),c(df=2)),
weights=c(0.5,0.5))
It returns a function that takes an x and a list of named arguments. You could call it like
ff(0, list(mean=3, sd=2, df=2))
# [1] 0.2823794
which returns the same value as
x <- 0
0.5 * dnorm(x, mean = 3, sd = 2) + 0.5 * dchisq(x, df = 2)
# [1] 0.2823794
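As a rough sketch of the final step the question mentions (not part of the answer above), the generated density can be wrapped in a negative log-likelihood and handed to optim; the names of the starting values must match the parameters of the generated density, and the finiteness guard simply keeps the optimizer out of invalid regions:
nll <- function(par, data, dens) {
  d <- dens(data, as.list(par))
  if (any(!is.finite(d)) || any(d <= 0)) return(1e10)  # penalize invalid parameter values
  -sum(log(d))
}
set.seed(1)
x <- c(rnorm(100, mean = 3, sd = 2), rchisq(100, df = 2))  # hypothetical data
fit <- optim(c(mean = 0, sd = 1, df = 1), nll, data = x, dens = ff)
fit$par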
I would like to write a wrapper function for two functions that take optional arguments.
Here is an example of a function fun to wrap funA and funB
funA <- function(x = 1, y = 1) return(x+y)
funB <- function(z = c(1, 1)) return(sum(z))
fun <- function(x, y, z)
I would like fun to return x+y if x and y are provided, and sum(z) if a vector z is provided.
I have tried to see how the lm function takes such optional arguments, but it is not clear exactly how, e.g., match.call is being used here.
After finding related questions (e.g. How to use R's ellipsis feature when writing your own function? and using substitute to get argument names), I have come up with a workable solution.
My solution has just been to use
fun <- function(...){
  inputs <- list(...)
  if (all(c("x", "y") %in% names(inputs))) {
    ans <- funA(inputs$x, inputs$y)
  } else if ("z" %in% names(inputs)) {
    ans <- funB(inputs$z)
  }
  ans
}
Is there a better way?
Note: Perhaps this question can be closed as a duplicate, but hopefully it can serve a purpose in guiding other users to a good solution: it would have been helpful to have expanded my search to variously include ellipsis and substitute, in addition to match.call.
Use missing. This returns funA(x, y) if both x and y are provided, returns funB(z) if they are not but z is provided, and returns NULL if none of them is provided:
fun <- function(x, y, z) {
  if (!missing(x) && !missing(y)) {
    funA(x, y)
  }
  else if (!missing(z)) {
    funB(z)
  }
}
This seems to answer your question as stated but note that the default arguments in funA and funB are never used so perhaps you really wanted something different?
Note that the fun provided in the question only works if the arguments are named, whereas the fun here works even if they are provided positionally.
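A quick check, with funA and funB as defined in the question:
fun(1, 2)      # both x and y supplied (positionally), so funA is used
[1] 3
fun(z = 1:10)  # only z supplied, so funB is used
[1] 55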
I would do something like this using match.call. It is similar to your solution but more robust:
fun <- function(...){
arg <- as.list(match.call())[-1]
f <- ifelse(length(arg)>1,"funA","funB")
do.call(f,arg)
}
fun(x=1,y=2) ## or fun(1,2) no need to give named arguments
[1] 3
fun(z=1:10) ## or fun(1:10)
[1] 55
This is not really a problem, but I'm wondering if there is a more elegant solution:
Let's say I have a vector vec <- rlnorm(10) and I want to apply a non-vectorized function to it, e.g. exp (ignore for the moment that it is vectorized). I can do
sapply( vec, exp )
But when the function I want to apply is nested, the expression immediately becomes less simple:
sapply( vec, function(x) exp( sqrt(x) ) )
This happens to me all the time with the apply and plyr families of functions.
So my question is: is there, in general, an elegant way to nest (or pipe) functions without explicitly defining an (anonymous) function function(x){...}? Something like
# notrun
sapply( vec, sqrt | exp )
or similar.
See the examples for ?Reduce:
## Iterative function application:
Funcall <- function(f, ...) f(...)
## Compute log(exp(acos(cos(0))))
Reduce(Funcall, list(log, exp, acos, cos), 0, right = TRUE)
Here's a more bare-bones implementation with a slightly different interface:
Compose <- function(x, ...)
{
lst <- list(...)
for(i in rev(seq_along(lst)))
x <- lst[[i]](x)
x
}
sapply(0, Compose, log, exp, acos, cos)
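Applied to the vec example from the question (note that this Compose applies the functions from right to left, so they are listed outermost first):
sapply(vec, Compose, exp, sqrt)   # same as sapply(vec, function(x) exp(sqrt(x)))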
The package functional includes a Compose function.
library(functional)
id <- Compose(exp, log)
id(2) # 2
Its implementation is simple enough to include in your source, if, say, you don't need the rest of the stuff in the functional package.
R> Compose
function (...)
{
fs <- list(...)
if (!all(sapply(fs, is.function)))
stop("Argument is not a function")
function(...) Reduce(function(x, f) f(x), fs, ...)
}
<environment: namespace:functional>
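Note that functional's Compose applies the functions from left to right, so the original example becomes:
sapply(vec, Compose(sqrt, exp))   # exp(sqrt(x)) for each element of vec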
I wrote this nifty function to apply a function for every combination of vectorized arguments:
require(plyr)
require(ggplot2)
### eapply accepts a function and a call to expand.grid,
### where the columns created by expand.grid must correspond to arguments of fun;
### each row created by expand.grid will be passed to fun independently.
### Arguments:
###   fun  either a function or a non-empty character string naming the function to be called
###   ...  vectors, factors, or a list containing these
### Value:
###   a data frame
### Details:
###   At this time, elements of ... must be at least partially named to match the args of fun;
###   positional matching does not work.
### From the ddply documentation page:
###   The most unambiguous behaviour is achieved when fun returns a data frame - in that case
###   pieces will be combined with rbind.fill. If fun returns an atomic vector of fixed length,
###   it will be rbinded together and converted to a data frame. Any other values will result
###   in an error.
eapply <- function(fun,...){
if(!is.character(fun)) fun <- as.character(substitute(fun))
adply(
expand.grid(...),
1,
function(x,fun) do.call(fun,x),
fun
)
}
##example use:
m <- function(n,visit.cost){
if(n*visit.cost < 250){
c("total.cost"=n*visit.cost)
}else{
c("total.cost"=250 + (n*visit.cost-250)*.25)
}
}
d <- eapply(m, n=1:30, visit.cost=c(40,60,80,100))
ggplot(d,aes(x=n,y=total.cost,color=as.factor(visit.cost),group=visit.cost)) + geom_line()
How can I rewrite the function such that the arguments passed to expand.grid need not be named:
d <- eapply(m, 1:30, c(40,60,80,100))
Alternatively, are there any existing functions that have similar functionality?
Not the most elegant but this works. Most importantly, it allows you to pass variables to expand.grid without naming them.
eeyore <- function(fun, ...){
  if(!is.character(fun)) fun <- as.character(substitute(fun))
  f <- match.fun(fun)
  # capture the unevaluated ... expressions, evaluate them, and build the grid
  args <- as.list(substitute(list(...)))[-1]
  foo <- expand.grid(llply(args, eval))
  # apply f row-wise over the grid (assumes f takes two arguments)
  foo$F <- apply(foo, 1, function(x) { f(x[[1]], x[[2]]) })
  foo
}
d <- eeyore(m, 1:30, c(40,60,80,100))
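One practical note (an observation, not part of the original answer): because the arguments are unnamed, expand.grid labels the grid columns Var1 and Var2 and the computed column is F, so the earlier ggplot call would need its aesthetics renamed accordingly, e.g.:
ggplot(d, aes(x = Var1, y = F, color = as.factor(Var2), group = Var2)) + geom_line()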