How do I combine tryCatch and UseMethod?

I'm trying to write some S3 methods, and I'd like them to share common error handling code. This seemed like the obvious way to me:
myMethod <- function(x) {
  tryCatch(UseMethod("myMethod", x), error = function(e) paste("Caught:", e))
}
myMethod.default <- function(x) print("Default.")
But it doesn't work, since UseMethod doesn't like being wrapped in tryCatch:
myMethod(0)
[1] "Caught: Error in UseMethod(\"myMethod\", x): 'UseMethod' used in an inappropriate fashion\n"
Does anyone have any advice on where to go from here?

Just wrap it: UseMethod needs to be called directly from the body of a function, so give it a small function of its own.
myMethod <- function(x) {
  fn <- function() UseMethod("myMethod", x)
  tryCatch(fn(), error = function(e) paste("Caught:", e))
}
myMethod.default <- function(x) print("Default.")
myMethod(structure('1', class='default'))
# [1] "Default."

Related

Is there a way to make match.call + eval combination work when called from a function?

I am using a package that has 2 functions which ultimately look like the following:
pkgFun1 <- function(group) {
  call <- match.call()
  pkgFun2(call)
}
pkgFun2 <- function(call) {
  eval(call$group)
}
If I just call pkgFun1(group = 2), it works fine. But I want to call it from a function:
myFun <- function(x) {
  pkgFun1(group = x)
}
myFun(x = 2)
## Error in eval(call$group) : object 'x' not found
Is there any way to avoid this error, if I can't modify the package functions, but only myFun?
There are similar questions, such as "Issue with match.call" or "Non-standard evaluation in a user-defined function with lapply or with in R", but my particular issue is that I can't modify the part of the code containing the eval call.
It's pkgFun2 that is wrong, so I think you're out of luck without some weird contortions. It needs to pass the appropriate environment to eval(); if you can't modify it, then you can't fix it.
This hack might appear to work, but in real life it doesn't:
pkgFun1 <- function(group) {
  call <- match.call()
  f <- pkgFun2
  environment(f) <- parent.frame()
  f(call)
}
With this, you're calling a copy of pkgFun2 modified so its environment is appropriate to evaluate the call. It works in the test case, but will cause you untold grief in the future, because everything that is not local in pkgFun2 will be searched for in the wrong place. For example,
myFun <- function(x) {
  eval <- function(...) print("Gotcha!")
  pkgFun1(group = x)
}
myFun(x = 2)
# [1] "Gotcha!"
Best is to fix pkgFun2. Here's one fix:
pkgFun1 <- function(group) {
  call <- match.call()
  pkgFun2(call, parent.frame())
}
pkgFun2 <- function(call, envir) {
  eval(call$group, envir = envir)
}
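With that change, the wrapper from the question works as expected (quick check):
myFun <- function(x) {
  pkgFun1(group = x)
}
myFun(x = 2)
## [1] 2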
Edited to add: Actually, there is another, less weird hack that should work with your original pkgFun1 and pkgFun2: force the evaluation of x in myFun, so that pkgFun1 never sees the expression x. For example,
myFun <- function(x) {
  do.call("pkgFun1", list(group = x))
}
If you do this, then after myFun(2) the variable call inside pkgFun1 will be pkgFun1(group = 2), and you won't get the error about x.
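A quick check, using the unmodified pkgFun1()/pkgFun2() from the question together with this do.call() version of myFun():
myFun(x = 2)
## [1] 2
## match.call() inside pkgFun1 captured pkgFun1(group = 2), so call$group
## is the value 2 rather than the symbol x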

Avoid argument duplication when passing through (...)

Consider the function
f <- function(x, X) mean(c(x,X))
How can I automatically (by manipulating f()) change the signature of f() so that it can be used with lapply(), i.e., without triggering the following error?
lapply(X=list(1), FUN=f, X=1)
Error in lapply(X = list(1), FUN = f, X = 1) :
formal argument "X" matched by multiple actual arguments
The approach I have used so far is to remove all arguments from f(), assign them into an environment, and evaluate f() in that environment.
integrateArgs <- function(f, args) {
  form <- formals(f)
  if (!is.null(form))
    for (i in seq_along(form)) assign(names(form)[i], form[[i]])
  if (!is.null(args))
    for (i in seq_along(args)) assign(names(args)[i], args[[i]])
  ff <- function() {
  }
  parent.env(environment(ff)) <- parent.env(environment(f))
  body(ff) <- body(f)
  if (any(names(form) == "..."))
    formals(ff) <- form[names(form) == "..."]
  ff
}
fnew <- integrateArgs(f, list(x=1, X=4))
lapply(list(fnew), function(x) x())
[[1]]
[1] 2.5
However, that approach leads to the following error if f() is a function from another R package that calls compiled code.
fnew2 <- integrateArgs(dnorm, list(x=1, mean=4))
lapply(list(fnew2), function(x) x())
Error in x() (from #1) : object 'C_dnorm' not found
Are there better solutions?
As suggested in a comment by MrFlick, one solution is
library(purrr)
integrateArgs <- function(f, args) {
  do.call(partial, c(list(f), args))
}
fnew2 <- integrateArgs(dnorm, list(x=1, mean=4))
lapply(list(fnew2), function(x) x())
[[1]]
[1] 0.004431848
The following similar approach does not require the package purrr:
integrateArgs <- function(f, args) {
  do.call(function(f, ...) {
    eval(call("function", NULL,
              substitute(f(...))), envir = environment(f))
  },
  c(f = list(f), args))
}
fnew2 <- integrateArgs(dnorm, list(x=1, mean=4))
lapply(list(fnew2), function(x) x())
[[1]]
[1] 0.004431848
A similar approach is now used in optimParallel version 0.7-4 to execute functions in parallel using parallel::parLapply(): https://cran.r-project.org/package=optimParallel
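For comparison, a plain base-R closure can do the same pre-binding without purrr; integrateArgsBase is a hypothetical name, and this is only a sketch:
integrateArgsBase <- function(f, args) {
  force(f); force(args)  # bind f and args now rather than lazily
  function(...) do.call(f, c(args, list(...)))
}
fnew3 <- integrateArgsBase(dnorm, list(x = 1, mean = 4))
lapply(list(fnew3), function(x) x())
## [[1]]
## [1] 0.004431848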

curve3d can't find local function "fn"

I'm trying to use the curve3d function in the emdbook package to create a contour plot of a function defined locally inside another function, as shown in the following minimal example:
library(emdbook)
testcurve3d <- function(a) {
  fn <- function(x, y) {
    x*y*a
  }
  curve3d(fn(x, y))
}
Unexpectedly, this generates the error
> testcurve3d(2)
Error in fn(x, y) : could not find function "fn"
whereas the same idea works fine with the more basic curve() function:
testcurve <- function(a) {
  fn <- function(x) {
    x*a
  }
  curve(fn(x))
}
testcurve(2)
The question is how curve3d can be rewritten such that it behaves as expected.
You can temporarily attach the function environment to the search path to get it to work:
testcurve3d <- function(a) {
  fn <- function(x, y) {
    x*y*a
  }
  e <- environment()
  attach(e)
  curve3d(fn(x, y))
  detach(e)
}
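If you keep this approach, a slightly safer variant of the same hack (a sketch) registers the detach with on.exit(), so the environment is removed from the search path even if curve3d() fails:
testcurve3d <- function(a) {
  fn <- function(x, y) {
    x*y*a
  }
  e <- environment()
  attach(e)
  on.exit(detach(e), add = TRUE)  # detach even on error
  curve3d(fn(x, y))
}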
Analysis
The problem comes from this line in curve3d:
eval(expr, envir = env, enclos = parent.frame(2))
At this point, we appear to be 10 frames deep, and fn is defined in parent.frame(8). So you can edit the line in curve3d to use that, but I'm not sure how robust this is. Perhaps parent.frame(sys.nframe()-2) might be more robust, but as ?sys.parent warns there can be some strange things going on:
Strictly, sys.parent and parent.frame refer to the context of the parent interpreted function. So internal functions (which may or may not set contexts and so may or may not appear on the call stack) may not be counted, and S3 methods can also do surprising things.
Beware of the effect of lazy evaluation: these two functions look at the call stack at the time they are evaluated, not at the time they are called. Passing calls to them as function arguments is unlikely to be a good idea.
The eval/parse solution bypasses some worries about variable scope: it passes the values of both the variable and the function directly, as opposed to passing the variable or function names.
library(emdbook)
testcurve3d <- function(a) {
  fn <- eval(parse(text = paste0(
    "function(x, y) {",
    "x*y*", a,
    "}"
  )))
  eval(parse(text = paste0(
    "curve3d(", deparse(fn)[3], ")"
  )))
}
testcurve3d(2)
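As a rough illustration of why this works (assuming default deparse() behaviour; exact spacing may differ), the parsed function already has a baked in as a literal, and deparse(fn)[3] is just its body line:
fn <- eval(parse(text = "function(x, y) {x*y*2}"))
deparse(fn)[3]
## [1] "    x * y * 2"
## so the second eval(parse(...)) effectively runs curve3d(x * y * 2)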
I have found another solution that I do not like very much, but maybe it will help you.
You can create the function fn as a call object and evaluate it inside curve3d:
fn <- quote((function(x, y) {x*y*a})(x, y))
eval(call("curve3d", fn))
Inside another function the same problem remains: a must be in the global environment. But this can be fixed with substitute.
Example:
testcurve3d <- function(a) {
  fn <- substitute((function(x, y) {
    c <- cos(a*pi*x)
    s <- sin(a*pi*y/3)
    return(c + s)
  })(x, y), list(a = a))
  eval(call("curve3d", fn, zlab = "fn"))
}
par(mfrow = c(1, 2))
testcurve3d(2)
testcurve3d(5)

R: eval parse function call not accessing correct environments

I'm trying to read a function call as a string and evaluate this function within another function. I'm using eval(parse(text = )) to evaluate the string. The function I'm calling in the string doesn't seem to have access to the environment in which it is nested. In the code below, my "isgreater" function finds the object y, defined in the global environment, but can't find the object x, defined within the function. Does anybody know why, and how to get around this? I have already tried adding the argument envir = .GlobalEnv to both of my evals, to no avail.
str <- "isgreater(y)"
isgreater <- function(y) {
return(eval(y > x))
}
y <- 4
test <- function() {
x <- 3
return(eval(parse(text = str)))
}
test()
Error:
Error in eval(y > x) : object 'x' not found
Thanks to @MrFlick and @r2evans for their useful and thought-provoking comments. As for a solution, I've found that this code works. x must be passed into the function and cannot be a default value. In the code below, my function generates a list of results, with the x variable being changed within the function. If anyone knows why this is, I would love to know.
str <- "isgreater(y, x)"
isgreater <- function(y, x) {
return(eval(y > x))
}
y <- 50
test <- function() {
list <- list()
for(i in 1:100) {
x <- i
bool <- eval(parse(text = str))
list <- append(list, bool)
}
return(list)
}
test()
After considering the points made by #r2evans, I have elected to change my approach to the problem so that I do not arrive at this string-parsing step. Thanks a lot, everyone.
I offer the following code, not as a solution, but rather as an insight into how R "works". The code does things that are quite dangerous and should only be examined for its demonstration of how to assert a value for x. Unfortunately, that assertion does destroy the x-value of 3 inside the isgreater-function:
str <- "isgreater(y)"
isgreater <- function(y) {
return(eval( y > x ))
}
y <- 4
test <- function() {
environment(isgreater)$x <- 5
return(eval(parse(text = str) ))
}
test()
#[1] FALSE
The environment<- function is used in the R6 programming paradigm. Take a look at ?R6 if you are interested in working with a more object-oriented set of structures and syntax. (I will note that when I first ran your code, there was an object named x in my workspace and some of my efforts were able to succeed to the extent of not throwing an error, but they were finding that length-10000 vector and filling up my console with logical results until I escaped the console. Yet another argument for passing both x and y to isgreater.)
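One detail worth spelling out: because isgreater was defined at top level, environment(isgreater) is the global environment, so the assignment above is really just a global x <- 5 (quick check):
environment(isgreater)
## <environment: R_GlobalEnv>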

R allow error in lapply

Related to this question: I wanted to build a simple lapply variant that returns NULL when an error occurs. My first thought was to do something like
lapply_with_error <- function(X, FUN, ...) {
  lapply(X, tryCatch({FUN}, error = function(e) NULL))
}
tmpfun <- function(x) {
  if (x == 9) {
    stop("There is something strange in the neiborhood")
  } else {
    paste0("This is number", x)
  }
}
tmp <- lapply_with_error(1:10, tmpfun)
But tryCatch does not capture the error it seems. Any ideas?
You need to provide lapply with a function. In your version, tryCatch({FUN}, error = ...) is evaluated once, up front: {FUN} just returns the function object without calling it, so the individual FUN(x) calls are never wrapped and any error still propagates. Wrap each call instead:
lapply_with_error <- function(X, FUN, ...) {
  # wrap every individual call to FUN, forwarding any extra arguments from ...
  lapply(X, function(x) tryCatch(FUN(x, ...),
                                 error = function(e) NULL))
}
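With that change, the example from the question runs through; element 9 becomes NULL and the others keep their values (sketch of the expected output):
tmp <- lapply_with_error(1:10, tmpfun)
tmp[[9]]
## NULL
tmp[[10]]
## [1] "This is number10"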
