In Python you can do:
>>> val = [1, 2, 3]
>>> def f(a, b, c):
...     return a + b + c
...
>>> f(*val)
6
But is there an R equivalent to passing a list/vector to a function and having it unpack the list/vector as the arguments to the function?
val <- c(1, 2, 3)
f <- function(a, b, c) {
  a + b + c
}
# f(*val)   # R has no * unpacking; what is the equivalent?
Base R
do.call
In R it is do.call. The first argument is the function (or a character string giving the name of the function), and the second argument is a list whose components will be passed as the individual arguments to the function. No packages are used.
val <- c(1, 2, 3)
f <- function(a, b, c) a+b+c
do.call("f", as.list(val))
## [1] 6
Reduce
Another approach is to curry f: create a new function with the first argument fixed, and do so repeatedly using Reduce to handle each successive argument. No packages are used.
Reduce(function(f, x) function(...) f(x, ...), val, init = f)()
## [1] 6
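To see what this Reduce call builds up, here is a rough manual expansion of the same idea (g1, g2 and g3 are just illustrative names, not part of the answer above):
g1 <- function(...) f(1, ...)   # fix the first argument
g2 <- function(...) g1(2, ...)  # fix the second argument
g3 <- function(...) g2(3, ...)  # fix the third argument
g3()                            # equivalent to f(1, 2, 3)
## [1] 6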
purrr package
invoke
The purrr package has invoke, which essentially calls do.call but also converts its second argument to a list if it is not one already:
library(purrr)
invoke(f, val)
## [1] 6
lift
purrr also has lift, which converts a function that takes individual arguments into a new function that takes a list or vector. It, too, wraps do.call:
lift(f)(val)
## [1] 6
partial
purrr also has partial, which curries a function: it fixes the first argument and returns a new function that takes only the remaining arguments. Using reduce (also from purrr) to apply such currying repeatedly:
reduce(val, partial, .init = f)()
## [1] 6
functional package
Curry
Curry from the functional package can also be used to fix the first argument. Combined with Reduce from base R to apply Curry repeatedly, it gives the same result. Note that Curry uses do.call internally.
library(functional)
Reduce(Curry, init = f, val)()
## [1] 6
Another option is to lift the domain of the function using lift() from the purrr package. Your f takes its arguments individually (what purrr's lift family calls "dots", the d in lift_dv); lift_dv() changes its signature to accept a single vector instead:
f2 <- purrr::lift_dv(f)
f2(val)
## [1] 6
# Or as a one-liner
purrr::lift_dv(f)(val)
Related
When trying to create a list of similar functions using lapply, I find that all the functions in the list are identical and equal to what the final element should be.
Consider the following:
pow <- function(x,y) x^y
pl <- lapply(1:3,function(y) function(x) pow(x,y))
pl
[[1]]
function (x)
pow(x, y)
<environment: 0x09ccd5f8>
[[2]]
function (x)
pow(x, y)
<environment: 0x09ccd6bc>
[[3]]
function (x)
pow(x, y)
<environment: 0x09ccd780>
When you try to evaluate these functions you get identical results:
pl[[1]](2)
[1] 8
pl[[2]](2)
[1] 8
pl[[3]](2)
[1] 8
What is going on here, and how can I get the result I desire (the correct functions in the list)?
R passes promises, not the values themselves. The promise is forced when it is first evaluated, not when it is passed, and by that time the index has changed if one uses the code in the question. The code can be written as follows to force the promise at the time the outer anonymous function is called and to make it clear to the reader:
pl <- lapply(1:3, function(y) { force(y); function(x) pow(x,y) } )
Update: as of R 3.2.0 this workaround is no longer needed!
The corresponding line in the change log reads:
Higher order functions such as the apply functions and Reduce() now
force arguments to the functions they apply in order to eliminate
undesirable interactions between lazy evaluation and variable capture
in closures.
And indeed:
pow <- function(x,y) x^y
pl <- lapply(1:3,function(y) function(x) pow(x,y))
pl[[1]](2)
# [1] 2
pl[[2]](2)
# [1] 4
pl[[3]](2)
# [1] 8
How would one implement in R the function apply.func(func, arg.list), which takes an arbitrary function func and a suitable list arg.list as arguments, and returns the result of calling func with the arguments contained in arg.list? E.g.
apply.func(foo, list(x="A", y=1, z=TRUE))
is equivalent to
foo(x="A", y=1, z=TRUE)
Thanks!
P.S. FWIW, the Python equivalent of apply.func would be something like
def apply_func(func, arg_list):
    return func(*arg_list)
or
def apply_func(func, kwarg_dict):
    return func(**kwarg_dict)
or some variant thereof.
I think do.call is what you're looking for. You can read about it via ?do.call.
The classic example of how folks use do.call is to rbind data frames or matrices together:
d1 <- data.frame(x = 1:5,y = letters[1:5])
d2 <- data.frame(x = 6:10,y = letters[6:10])
do.call(rbind,list(d1,d2))
Here's another fairly trivial example using sum:
do.call(sum,list(1:5,runif(10)))
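Since your arg.list example uses named components, note that do.call also matches names in the list to parameter names, much like Python's **kwargs. A minimal sketch (foo here is just an illustrative stand-in for the function in the question):
foo <- function(x, y, z) paste(x, y, z)
do.call(foo, list(x = "A", y = 1, z = TRUE))
## [1] "A 1 TRUE"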
R allows functions to be passed as arguments to other functions. This means you can define apply.func as follows (where f is a function and ... stands for all the other arguments):
apply.func <- function(f, ...) f(...)
You can then use apply.func with any function for which the arguments make sense:
apply.func(paste, 1, 2, 3)
[1] "1 2 3"
apply.func(sum, 1, 2, 3)
[1] 6
However, note that the following may not produce the results you expected, since mean takes a vector as an argument:
apply.func(mean, 1, 2, 3)
[1] 1
Note that there is also a base R function called do.call which effectively does the same thing.
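For the mean example above, wrapping the values in a single vector gives the expected result with either approach:
apply.func(mean, c(1, 2, 3))
## [1] 2
do.call(mean, list(c(1, 2, 3)))
## [1] 2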
I can create a compose operator in R:
`%c%` = function(x,y)function(...)x(y(...))
To be used like this:
> numericNull = is.null %c% numeric
> numericNull(myVec)
[1] TRUE FALSE
but I would like to know if there is an official set of functions to do this kind of thing and other operations such as currying in R. Largely this is to reduce the number of brackets, function keywords etc in my code.
My curry function:
> curry <- function(...) {
+   z1 <- z0 <- substitute(...)
+   z1[1] <- call("list")
+   function(...) {
+     do.call(as.character(z0[[1]]),
+             as.list(c(eval(z1), list(...))))
+   }
+ }
> p = curry(paste(collapse=""))
> p(letters[1:10])
[1] "abcdefghij"
This is especially nice for e.g. aggregate:
> df = data.frame(l=sample(1:3,10,rep=TRUE), t=letters[1:10])
> aggregate(df$t,df["l"],curry(paste(collapse="")) %c% toupper)
l x
1 1 ADG
2 2 BCH
3 3 EFIJ
Which I find much more elegant and editable than:
> aggregate(df$t, df["l"], function(x)paste(collapse="",toupper(x)))
l x
1 1 ADG
2 2 BCH
3 3 EFIJ
Basically I want to know - has this already been done for R?
Both of these functions actually exist in the roxygen package (see the source code here) from Peter Danenberg (originally based on Byron Ellis's solution on R-help):
Curry <- function(FUN, ...) {
  .orig <- list(...)
  function(...) do.call(FUN, c(.orig, list(...)))
}

Compose <- function(...) {
  fs <- list(...)
  function(...) Reduce(function(x, f) f(x),
                       fs,
                       ...)
}
Note the usage of the Reduce function, which can be very helpful when trying to do functional programming in R. See ?Reduce for more details (which also covers other functions such as Map and Filter).
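For reference, a few minimal base-R illustrations of those helpers (not part of the original answer):
Reduce(`+`, 1:5)                       # fold a vector: 15
Filter(function(x) x %% 2 == 0, 1:10)  # keep the even numbers: 2 4 6 8 10
Map(`+`, 1:3, 4:6)                     # element-wise: list(5, 7, 9)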
And your example of Curry (slightly different in this usage):
> library(roxygen)
> p <- Curry(paste, collapse="")
> p(letters[1:10])
[1] "abcdefghij"
Here's an example to show the utility of Compose (applying three different functions to letters):
> Compose(function(x) x[length(x):1], Curry(paste, collapse=""), toupper)(letters)
[1] "ZYXWVUTSRQPONMLKJIHGFEDCBA"
And your final example would work like this:
> aggregate(df[,"t"], df["l"], Compose(Curry(paste, collapse=""), toupper))
l x
1 1 ABG
2 2 DEFH
3 3 CIJ
Lastly, here's a way to do the same thing with plyr (could also easily be done with by or aggregate as already shown):
> library(plyr)
> ddply(df, .(l), function(df) paste(toupper(df[,"t"]), collapse=""))
l V1
1 1 ABG
2 2 DEFH
3 3 CIJ
The standard place for functional programming in R is now the functional package.
From the library:
functional: Curry, Compose, and other higher-order functions
Example:
library(functional)
newfunc <- Curry(oldfunc,x=5)
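As a concrete sketch, the earlier paste/toupper example translates directly; I am assuming functional's Compose, like the roxygen version above, applies its functions left to right:
p <- Curry(paste, collapse = "")
Compose(toupper, p)(letters[1:5])
## [1] "ABCDE"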
CRAN:
https://cran.r-project.org/web/packages/functional/index.html
PS: This package supersedes the roxygen-based implementation mentioned above.
There is a function called Curry in the roxygen package.
Found via this conversation on the R Mail Archive.
A more complex approach is required if you want the 'names' of the variables to pass through accurately.
For example, if you do plot(rnorm(1000), rnorm(1000)) then you will get nice labels on your x- and y-axes. Another example of this is data.frame:
> data.frame( rnorm(5), rnorm(5), first=rpois(5,1), second=rbinom(5,1,0.5) )
rnorm.5. rnorm.5..1 first second
1 0.1964190 -0.2949770 0 0
2 0.4750665 0.8849750 1 0
3 -0.7829424 0.4174636 2 0
4 1.6551403 1.3547863 0 1
5 1.4044107 -0.4216046 0 0
Note that the data.frame has assigned useful names to the columns.
Some implementations of Curry may not do this properly, leading to unreadable column names and plot labels. Instead, I now use something like this:
Curry <- function(FUN, ...) {
  .orig <- match.call()
  .orig[[1]] <- NULL  # Remove first item, which matches Curry
  .orig[[1]] <- NULL  # Remove another item, which matches FUN
  function(...) {
    .inner <- match.call()
    .inner[[1]] <- NULL  # Remove first item, which matches the inner function
    do.call(FUN, c(.orig, .inner), envir = parent.frame())
  }
}
This is quite complex, but I think it's correct. match.call will catch all args, fully remembering what expressions defined the args (this is necessary for nice labels). The problem is that it catches too many args -- not just the ... but also the FUN. It also remembers the name of the function that's being called (Curry).
Therefore, we want to delete these first two entries in .orig so that .orig really just corresponds to the ... arguments. That's why we do .orig[[1]] <- NULL twice: each deletion removes an entry and shifts everything else to the left.
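As a rough illustration of what match.call() captures (g is just a hypothetical helper for this sketch):
g <- function(FUN, ...) match.call()
g(data.frame, rnorm(5), first = rpois(5, 1))
## g(FUN = data.frame, rnorm(5), first = rpois(5, 1))
Element [[1]] is the symbol g and element [[2]] is the matched FUN argument, which is why the definition above deletes the first two entries of .orig.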
This completes the definition, and we can now do the following to get exactly the same result as above:
Curry(data.frame, rnorm(5), rnorm(5))(first = rpois(5, 1), second = rbinom(5, 1, 0.5))
A final note on envir=parent.frame(). I used this to ensure that there won't be a problem if you have external variables called '.inner' or '.orig'. Now, all variables are evaluated in the place where the curry is called.
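To see the difference this makes, here is a hedged side-by-side sketch; SimpleCurry is just the plain list(...)-based Curry from the earlier answer, renamed to avoid clashing with the definition above, and the exact names depend on the random draws:
SimpleCurry <- function(FUN, ...) {
  .orig <- list(...)
  function(...) do.call(FUN, c(.orig, list(...)))
}
SimpleCurry(data.frame, rnorm(5))(first = rpois(5, 1))
## the first column gets an unreadable machine-generated name (the deparsed
## numeric values, something like "c.0.196..."), because the evaluated values
## rather than the expressions reach data.frame
Curry(data.frame, rnorm(5))(first = rpois(5, 1))
## the first column is named "rnorm.5.", just as with a direct data.frame() call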
In the purrr package there is now a function partial.
If you are already using the purrr package from tidyverse, then purrr::partial is a natural choice to curry functions. From the description of purrr::partial:
# Partial is designed to replace the use of anonymous functions for
# filling in function arguments. Instead of:
compact1 <- function(x) discard(x, is.null)
# we can write:
compact2 <- partial(discard, .p = is.null)
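A quick usage sketch for the partially applied function above (assuming purrr and its discard() are attached):
library(purrr)
compact2(list(1, NULL, 2))
## a two-element list containing 1 and 2; the NULL has been dropped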