Overloading R function - is this right?

consumeSingleRequest <- function(api_key, URL, columnNames, globalParam="", ...)
consumeSingleRequest <- function(api_key, URL, columnNames, valuesList, globalParam="")
I am trying to overload a function like this, where the first version takes in multiple lists via ... and combines them into one list of lists. However, I don't seem to be able to skip passing in globalParam and pass in only the multiple lists through the ...
Does anyone know how to do that?
I've heard S3 methods could be used for that? Does anyone know how?

R doesn't support overloading functions. It does support function calls with a variable number of arguments: you can declare a function with any number of arguments, but supply only a subset of them when actually calling it. Take the vector function as an example:
> vector
function (mode = "logical", length = 0L)
.Internal(vector(mode, length))
<bytecode: 0x103b89070>
<environment: namespace:base>
It takes up to two parameters, but it can be called with none or a subset of them (in which case the default values are used):
> vector()
logical(0)
> vector(mode='numeric')
numeric(0)
So you only need a second declaration:
consumeSingleRequest <- function(api_key, URL, columnNames, valuesList, globalParam="")
and supply just the needed parameters when actually calling the function:
consumeSingleRequest(api_key=..., valuesList=...)
P.S. A good explanation can be found in the Advanced R book.
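One way to get exactly the behaviour the question asks for is to declare ... before globalParam in the formals: arguments after ... can only be matched by name, so globalParam keeps its default while any number of unnamed lists flow into the dots. A minimal sketch (the body is hypothetical; a real function would build its request from these pieces):

```r
# Placing ... before globalParam means globalParam can only be set by name,
# so all unnamed trailing arguments are collected into ...
consumeSingleRequest <- function(api_key, URL, columnNames, ...,
                                 globalParam = "") {
  valuesList <- list(...)   # combine the lists passed via ... into one list of lists
  list(columnNames = columnNames, globalParam = globalParam,
       valuesList = valuesList)
}

res <- consumeSingleRequest("key", "http://example.com", c("a", "b"),
                            list(1, 2), list(3, 4))
length(res$valuesList)   # 2 -- both lists were captured in ...
res$globalParam          # "" -- the default was used
```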

Related

Deparse, substitute with three-dots arguments

Let's consider a typical deparse(substitute()) R call:
f1 <- function(u, x, y) {
  print(deparse(substitute(x)))
}
varU='vu'
varX='vx'
varY='vy'
f1(u=varU,x=varX,y=varY)
That results in
[1] "varX"
which is what we expect and we want.
Then, comes the trouble, I try to get a similar behaviour using the ... arguments i.e.
f2 <- function(...)
{ l <- list(...)
x=l$x
print(deparse(substitute(x))) ### this cannot work but I would like something like that
}
That, not surprisingly, does not work:
f2(u=varU,x=varX,y=varY)
[1] "\"vx\"" ### wrong ! I would like "varX"
I tried to get the expected behaviour using different combinations of solutions, but none gives me what I expect, and it seems that I am still not clear enough on lazy evaluation to figure out the how-to myself in a reasonable amount of time.
You can get the list of all unevaluated arguments by doing
match.call(expand.dots = FALSE)$...
Or, if you only have dot arguments, via
as.list(match.call()[-1L])
This will give you a named list, similarly to list(...), but in its unevaluated form (similarly to what substitute does on a single argument).
An alternative is using rlang::quos(...) if you’re willing to use the {rlang} package, which returns a similar result in a slightly different form.
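Putting the match.call approach together, a sketch of a fixed f2 (defining its own varX so the example is self-contained):

```r
f2 <- function(...) {
  # the unevaluated argument expressions, as a named list
  args <- match.call(expand.dots = FALSE)$...
  deparse(args$x)
}

varX <- "vx"
f2(u = "vu", x = varX, y = "vy")   # "varX"
```

Because match.call captures the call before the promises are forced, args$x is the symbol varX rather than its value "vx".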

Is there a way to figure out the return type of a function?

I think this is a simple question.
In many languages, you have to declare the return type before you write a function;
however, I didn't find anything like that for R.
The only way I can do it right now is to make a call and use str(), mode(), or class() to check the returned value,
but if the function takes a long time to run, I can't use this approach.
Is there a simple way to know the return type of a function even before I call it?
By the way, I can find some return types by typing ?function_name, but many help pages don't mention the return type at all.
OK, here is why knowing this info could be so useful:
1. I need the return type to decide how to deal with the return value. In a simple case, if I don't know whether the return type is a list or a data frame, I don't have the information to decide which function to use next. Sometimes you don't know whether you got an S3 or an S4 object, so you don't know whether to use @ or $ to access it.
2. Suppose two functions in two packages do the same thing, but one returns a connection and the other an html object. If I knew the return types, I could easily pick the function that fits my case. Sometimes you only have a limited number of chances to connect to some places, and you would waste several of them just checking the return type.
In short, no...
R is a dynamically typed language, and many of its functions return different types depending on the parameters passed; in my opinion this is one of R's strong points: many functions accept many types and return many types. A quick example:
mode(sapply(vector(mode="list", 10) ,function (x) return ('a')))
[1] "character"
mode(sapply(vector(mode="list", 10) ,function (x) return (1)))
[1] "numeric"
Here sapply returns a "character" or a "numeric" type depending on the function applied to each of the elements. Overall you just have to get used to the language, and if nothing else works, experiment on small tasks first.
In R, there are a number of ways to handle multiple return types from a function.
Let's say you have a function f that calls g, which can return various types. Your first option is with an explicit type check:
f <- function(x)
{
...
y <- g(x)
if(is.data.frame(y))
{
# process result as a data frame
}
else if(is.list(y))
{
# process y as a list
# this must go after is.data.frame, because a data frame is also a list
}
...
}
Now f will check the result of g when it returns, and then call the appropriate code to handle it.
This is fine if you only have a small number of possible types to choose from. Once the number of types becomes large, it's better to use something more systematic: R's object frameworks. The simplest framework is S3, so let's look at that.
f <- function(x)
{
y <- g(x)
f_result(y)
}
# this is the f_result _generic_: it dispatches individual methods based on the class of y
f_result <- function(y)
{
UseMethod("f_result")
}
# f_result _method_ for data frames
f_result.data.frame <- function(y)
{
# process result as a data frame
}
# f_result method for lists
f_result.list <- function(y)
{
# process result as a list
}

Call Arguments of Function inside Function / R language

I have a function:
func <- function (x)
{
arguments <- match.call()
return(arguments)
}
1) If I call my function with specifying argument in the call:
func("value")
I get:
func(x = "value")
2) If I call my function by passing a variable:
my_variable <-"value"
func(my_variable)
I get:
func(x = my_variable)
Why is the first and the second result different?
Can I somehow get in the second call "func(x = "value")"?
I think my problem is that the environment inside a function simply doesn't contain values if they were passed as variables; it contains only the names of the variables for further lookup. Is there a way to follow such a reference and get the value from inside a function?
In R, when you pass my_variable as formal argument x into a function, the value of my_variable will only be retrieved when the function tries to read x (if it does not use x, my_variable will not be read at all). The same applies when you pass more complicated arguments, such as func(x = compute_my_variable()) -- the call to compute_my_variable will take place when func tries to read x (this is referred to as lazy evaluation).
Given lazy evaluation, what you are trying to do is not well defined because of side effects - in which order would you like to evaluate the arguments? Which arguments would you like to evaluate at all? (note a function can just take an expression for its argument using substitute, but not evaluate it). As a side effect, compute_my_variable could modify something that would impact the result of another argument of func. This can happen even when you only passed variables and constants as arguments (function func could modify some of the variables that will be later read, or even reading a variable such as my_variable could trigger code that would modify some of the variables that will be read later, e.g. with active bindings or delayed assignment).
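A small demonstration of this lazy evaluation (compute_my_variable and its side-effect counter are hypothetical illustrations):

```r
side_effect_count <- 0
compute_my_variable <- function() {
  side_effect_count <<- side_effect_count + 1   # observable side effect
  42
}

func <- function(x) {
  before <- side_effect_count  # x has NOT been evaluated yet at this point
  val <- x                     # reading x forces the promise now
  c(before = before, after = side_effect_count, val = val)
}

func(compute_my_variable())  # before = 0, after = 1, val = 42
```

The side effect fires only when func reads x, not at the call site.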
So, if all you want to do is to log how a function was called, you can use sys.call (or match.call but that indeed expands argument names, etc). If you wanted a more complete stacktrace, you can use e.g. traceback(1).
If for some reason you really wanted values of all arguments, say as if they were all read in the order of match.call, which is the order in which they are declared, you can do it using eval (returns them as list):
lapply(as.list(match.call())[-1], eval)
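For instance, wrapped in a hypothetical function (passing envir = parent.frame() makes eval look the values up in the caller's frame, which is where the argument expressions are meaningful):

```r
show_args <- function(x, y) {
  cl <- as.list(match.call())[-1]          # unevaluated argument expressions
  lapply(cl, eval, envir = parent.frame()) # force each one in the caller's frame
}

a <- 10
show_args(a, a + 1)   # list(x = 10, y = 11)
</imports>
```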
Can't you simply return paste('func(x =', x, ')')?

How to force dispatch to an internal generic in R?

I have a class 'myClass' in R that is essentially a list. It has an assignment operator which is going to do some things and then should assign the value using the regular list assignment operator
`$<-.myClass` <- function(x, i, value) {
  # do some pre-processing stuff
  # make the assignment using the default list assignment
  x[[i]] <- value
  x
}
But I can't actually use x[[i]]<-value as it will dispatch to the already existing [[<-.myClass method.
In similar S3 dispatching cases, I've been able use UseMethod or specifically call [[<-.list, or [[<-.default but those don't seem to exist because $<- and [[<- are primitive generics, right? And I'm sure I'll be sent to a special R hell if I try to call .Primitive("$<-"). What is the correct way to dispatch the assignment to the default assignment method?
It doesn't look like there is a particularly elegant way to do this. The data.frame method for $<- looks like this:
`$<-.data.frame` <- function (x, name, value) {
cl <- oldClass(x)
class(x) <- NULL
x[[name]] <- value
class(x) <- cl
x
}
(with error checking code omitted)
This should only create one copy of x, because class<- modifies in place, and so does the default method for [[<-.
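Transplanted to the myClass method from the question, the same unset/restore trick looks like this (a sketch; the pre-processing is left as a comment):

```r
`$<-.myClass` <- function(x, name, value) {
  # do some pre-processing stuff here
  cl <- oldClass(x)
  class(x) <- NULL      # unset the class so [[<- uses the default method
  x[[name]] <- value
  class(x) <- cl        # restore the class
  x
}

obj <- structure(list(), class = "myClass")
obj$a <- 1
obj$a        # 1
class(obj)   # "myClass"
```

Because the class is NULL during the assignment, x[[name]] <- value cannot dispatch back to a [[<-.myClass method, avoiding the infinite-recursion trap.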

Can an R function access its own name?

Can you write a function that prints out its own name?
(without hard-coding it in, obviously)
You sure can.
fun <- function(x, y, z) deparse(match.call()[[1]])
fun(1,2,3)
# [1] "fun"
You can, but in case you want this because you are calling the function recursively, see ?Recall, which is robust to name changes and avoids the need to process the name at all.
Recall package:base R Documentation
Recursive Calling
Description:
‘Recall’ is used as a placeholder for the name of the function in
which it is called. It allows the definition of recursive
functions which still work after being renamed, see example below.
As you've seen in the other great answers here, the answer seems to be "yes"...
However, the correct answer is actually "yes, but not always". What you actually get is the name (or expression!) that was used to call the function.
First, using sys.call is probably the most direct way of finding the name, but then you need to coerce it into a string. deparse is more robust for that.
myfunc <- function(x, y=42) deparse(sys.call()[[1]])
myfunc (3) # "myfunc"
...but you can call a function in many ways:
lapply(1:2, myfunc) # "FUN"
Map(myfunc, 1:2) # (the whole function definition!)
x<-myfunc; x(3) # "x"
get("myfunc")(3) # "get(\"myfunc\")"
The basic issue is that a function doesn't have a name - it's just that you typically assign the function to a variable name. Not that you have to - you can have anonymous functions - or assign many variable names to the same function (the x case above).
