I'm trying to use the snow and snowfall packages, specifically the sfSapply() function to extract data from multiple raster files. It looks something like this:
queue <- list(rast1, rast2, rast3)
sfInit(parallel=TRUE, cpus=3)
sfLibrary(raster)
sfLibrary(rgdal)
sfLibrary(sp)
a <- sfSapply(queue, extract, sp=TRUE, fun=mean, y=tracts)
sfStop()
The fun=mean argument is intended for the extract() function (from the raster package). However, sfSapply() also has a fun argument of its own: the function to apply, which here is extract(), supplied as the second positional argument. As a result, the named fun=mean is matched to sfSapply()'s fun rather than being passed on to extract().
How can I specify a fun argument for the passed function and not have it confused with the fun argument expected by sfSapply()?
One workaround is to create a custom extract function that has these arguments built in:
sfRasterExtract <- function(raster_obj){
  extract(raster_obj, sp=TRUE, fun=mean, y=tracts)
}
Make sure to call sfExportAll() after sfInit() so the function is exported to all worker instances, then run
a <- sfSapply(queue, sfRasterExtract)
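An alternative sketch, assuming tracts has been exported to the workers (for example with sfExport("tracts") or sfExportAll()), is to wrap extract() in an anonymous function so that fun=mean never reaches sfSapply() itself:
a <- sfSapply(queue, function(r) extract(r, y=tracts, fun=mean, sp=TRUE))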
I have a script main.R in which I create the variable inv_cov_mat. Later I source metrics.R and use a function defined there to calculate values (I use it as a kind of inter-script closure). I get the error "object 'inv_cov_mat' not found". My code:
main.R:
knn <- function(...)
{
  # some code
  source("./source/metrics.R")
  if (metric == "mahalanobis")
    inv_cov_mat <- solve(cov(training_set))
  # other code
  # calculate the distance, in the given metric, between the current vector
  # and every row vector of the training set matrix
  distances <- apply(training_set, 1, metric, vec2=curr_vec) # error
}
metrics.R:
mahalanobis <- function(vec1, vec2)
{
  diff <- vec1 - vec2
  sqrt(t(diff) %*% inv_cov_mat %*% diff)
}
I've found a simple, if not elegant, answer: make inv_cov_mat a global variable instead of defining it inside the knn function. Then functions from other scripts can see it.
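A minimal sketch of that workaround, assuming training_set is available at the top level:
# main.R, at the top level (global environment), before calling knn()
inv_cov_mat <- solve(cov(training_set))
# functions defined in metrics.R can now find inv_cov_mat via the global environment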
It's not entirely clear what you want, but if I understand you correctly, you have a character string identifying the metric you want to use and a function with the same name. In that case you can use get to retrieve the function by its name.
metric <- "mahalanobis"
metric.fun <- get(metric)
distances <- apply(training_set, 1, metric.fun, vec2=curr_vec)
That said, there are probably better ways to organize your code that would avoid this problem entirely, e.g. create a named list of functions for accessing metrics.
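For example, a minimal sketch of the named-list idea (the metric implementations here are illustrative, not taken from the question):
# a named list mapping metric names to functions
metrics <- list(
  euclidean = function(vec1, vec2) sqrt(sum((vec1 - vec2)^2)),
  manhattan = function(vec1, vec2) sum(abs(vec1 - vec2))
)
metric.fun <- metrics[["euclidean"]]
metric.fun(c(1, 2), c(4, 6)) # 5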
EDIT: regarding the issue of inv_cov_mat, either pass it as an argument to your metric function, or use get inside that function with the envir argument (for example envir = parent.frame()) to look the variable up in the calling environment. Passing the variable as an argument to your metric function is definitely the better and cleaner approach.
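A minimal sketch of the argument-passing approach, keeping the names from the question (not tested against the full knn code):
# metrics.R: take the inverse covariance matrix as an explicit argument
mahalanobis <- function(vec1, vec2, inv_cov_mat)
{
  diff <- vec1 - vec2
  sqrt(t(diff) %*% inv_cov_mat %*% diff)
}
# main.R: forward it through apply()'s ... argument
distances <- apply(training_set, 1, mahalanobis, vec2=curr_vec, inv_cov_mat=inv_cov_mat)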
I have made a function that takes another function as an argument. The argument function in turn takes as its argument an object (in the example, a vector) that is supplied by the original function. It has been challenging to make this call in the right way. Below are three approaches I tried after reading Programming with dplyr.
Only option three works. I would like to know whether this is in fact the best way to evaluate a function within a function.
library(dplyr);library(rlang)
#Function that will be passed as an argument
EvaluateThis1 <- quo(mean(vector))
EvaluateThis2 <- ~mean(vector)
EvaluateThis3 <- quo(mean)
#First function that will recieve a function as an argument
MyFunc <- function(vector, TheFunction){
  print(TheFunction)
  eval_tidy(TheFunction)
}
#Second function that will recieve a function as an argument
MyFunc2 <- function(vector, TheFunction){
  print(TheFunction)
  quo(UQ(TheFunction)(vector)) %>%
    eval_tidy
}
#Option 1
#This is evaluating vector in the global environment where
#EvaluateThis1 was captured
MyFunc(1:4, EvaluateThis1)
#Option 2
#I don't know what is going on here
MyFunc(1:4, EvaluateThis2)
MyFunc2(1:4, EvaluateThis2)
#Option 3
#I think this unquotes the function, splices in the argument, then
#requotes before evaluating.
MyFunc2(1:4, EvaluateThis3)
My questions are:
1. Is option 3 the best/simplest way to perform this evaluation?
2. What exactly is happening in each of these options?
Edit
After reading @Rui Barradas's very clear and concise answer, I realised that I am actually trying to do something like the code below, which I didn't manage to make work with Rui's method but solved by setting the environment:
OtherStuff <- c(10, NA)
EvaluateThis4 <- quo(mean(c(vector, OtherStuff), na.rm = TRUE))
MyFunc3 <- function(vector, TheFunction){
  # uses the capture environment, which doesn't contain the object vector
  print(get_env(TheFunction))
  # reset the environment of TheFunction to the current environment, where vector exists
  TheFunction <- set_env(TheFunction, get_env())
  print(get_env(TheFunction))
  print(TheFunction)
  TheFunction %>%
    eval_tidy
}
MyFunc3(1:4, EvaluateThis4)
The function is evaluated within the current environment, not the capture environment. Because there is no object OtherStuff in that environment, the parent environments are searched, and OtherStuff is found in the global environment.
I will try to answer question 1.
I believe that the best and simplest way to perform this kind of evaluation is to do without any sort of fancy evaluation techniques; calling the function directly usually works. Using your example, try the following.
EvaluateThis4 <- mean # simple
MyFunc4 <- function(vector, TheFunction){
  print(TheFunction)
  TheFunction(vector) # just call it with the appropriate argument(s)
}
MyFunc4(1:4, EvaluateThis4)
function (x, ...)
UseMethod("mean")
<bytecode: 0x000000000489efb0>
<environment: namespace:base>
[1] 2.5
There are examples of this in base R. For instance approxfun and ecdf both return functions that you can use directly in your code to perform subsequent calculations. That's why I've defined EvaluateThis4 like that.
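For example, ecdf() returns a function that can be called directly:
Fn <- ecdf(c(1, 2, 3, 4))
Fn(2.5) # 0.5: the proportion of values <= 2.5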
As for functions that take other functions as arguments, there are the optimization functions and, of course, the *apply family, by and ave.
As for question 2, I must admit to my complete ignorance.
I am trying to use the package Deriv to compute symbolic derivatives of a function that depends on one or two variables and a vector of parameters. However, I always obtain the error:
Error in FUN(X[[i]], ...) : Could not retrieve body of '[()'
I have tried:
test_fun <- function(x,y,par){x*par[1]+y*par[2]}
Deriv(test_fun,"x",par=c(2,2))
which yields the above error. So does
par <- c(2,2)
test_fun <- function(x,y,par){x*par[1]+y*par[2]}
Deriv(test_fun,"x")
Obviously
test_fun <- function(x,y,par){x*2+y*2}
Deriv(test_fun,"x")
works as intended, but is not what I want.
Reading the documentation for the Deriv package, it seems that directly passing additional arguments to the function is not supported. Is there any other way to achieve the desired result?
Updated answer (9/16):
There are two options below for passing the parameters through Deriv. Also note that Deriv will retain curly braces from the original function, so either define your functions without them or call eval() afterwards.
Option 1. Define the parameter values within the call to Deriv.
test_fun <- function(x,y,par){x*par[1]+y*par[2]}
eval(Deriv(test_fun(par=c(2,2)),'x'))
# [1] 2
Option 2. Define the parameter values as a function.
test_fun <- function(x,y,par){x*par[1]+y*par[2]}
tpar <- function() c(2,2)
eval(Deriv(test_fun(par=tpar()), "x"))
# [1] 2
I am using the package Matrix to create a sparse matrix inside a function that is contained in my personal R package, say
myfun <- function(x, ...){
  d <- Matrix(0, 5, 5, sparse = TRUE)
  d[seq(1, 25, by = x)] <- 1
  t(d)
}
( class(d) is "dgCMatrix" )
If I declare the function in the global environment and run it from the console, everything works. However, when it is called from the package, I get
Error in t.default(D) : argument is not a matrix
Essentially, R calls the t() function from the base package instead of the version provided by the Matrix package, which has the methods to handle dgCMatrix objects.
I tried explicitly calling library(Matrix) from within the function, but it does not help.
I had the same problem with another function, colSums(), and I solved it using Matrix::colSums() instead. However I would like to find a more general and practical remedy, instead of specifying the environment for every function.
I stress that if I define the function in the global environment, R correctly finds the Matrix versions of these functions.
Any idea on the source of the problem, or on how I can force R to use these functions from Matrix?
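For what it's worth, a common remedy in packaged code (a sketch only, assuming the function lives in a regular package with a DESCRIPTION and NAMESPACE) is to list Matrix under Imports in DESCRIPTION and import it in the package's NAMESPACE, so the Matrix generics and methods are visible from the package's own namespace:
# NAMESPACE
import(Matrix)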
I'd like to assign a function to variable, using a string. The get() function in the base package does almost exactly what I want. For example,
valueReadFromFile <- "median"
ds <- data.frame(X=rnorm(10), Y=rnorm(10))
dynamicFunction <- get(valueReadFromFile)
dynamicFunction(ds$X) #Returns the variable's median.
However, I want to qualify the function with its package, so that I don't have to worry about (a) loading the function's package with library(), or (b) calling the wrong function in a different package.
Is there a robust, programmatic way I can qualify a function's name with its package using get() (or some similar function)? The following code doesn't work, presumably because get() doesn't know how to interpret the package name before the ::.
require(scales) #This package has functions called `alpha()` and `rescale()`
require(psych) #This package also has functions called `alpha()` and `rescale()`
dynamicFunction1 <- get("scales::alpha")
dynamicFunction2 <- get("psych::alpha")
Try this:
dynamicFunction1 <- get("alpha", envir=as.environment("package:scales"))
dynamicFunction2 <- get("alpha", as.environment("package:psych"))
A matter of terminology: I would not call dynamicFunction1 a "variable" but rather a "name". There is no formal class of object called "variable", but I usually see that term used for data objects, whereas "names" are language objects.
You can also call :: directly with character values, or use getExportedValue(), which :: uses internally. For example:
dynamicFunction1 <- `::`('scales', 'alpha')
dynamicFunction2 <- `::`('psych', 'alpha')
or
dynamicFunction1 <- getExportedValue('scales', 'alpha')
dynamicFunction2 <- getExportedValue('psych', 'alpha')
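A further sketch, in case the string read from the file may or may not be package-qualified; the helper name getFunctionByName is made up for illustration:
getFunctionByName <- function(qualified) {
  parts <- strsplit(qualified, "::", fixed = TRUE)[[1]]
  if (length(parts) == 2) getExportedValue(parts[1], parts[2])  # "pkg::fun"
  else get(qualified, mode = "function")                        # bare function name
}
dynamicFunction1 <- getFunctionByName("scales::alpha")
dynamicFunction3 <- getFunctionByName("median")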