Generic print function for S3 class

I have an S3 class, and I'm trying to work out how to set up a print function for it.
This part's good:
print.webglobe <- function(wg, ...){
  "it worked!"
}
But, if I run devtools::check() on it, I get the following ominous message:
checking S3 generic/method consistency ... WARNING
print:
  function(x, ...)
print.webglobe:
  function(wg, ...)
I tried adding the additional code:
print <- function(wg, ...){
  UseMethod("webglobe", wg)
}
But, with this present, print.webglobe() never seems to be accessed and my S3 class just prints as a list of some sort.
How can I set this up correctly?

Change the wg to x. The formal arguments of a method have to match those of the generic, because arguments from the generic call are passed on to the method by name; with the first argument named wg instead of x, print() does not behave the way you would expect.
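A minimal sketch of the corrected method (the example object below is hypothetical; base R already provides the print generic, so there is no need to define print yourself):

print.webglobe <- function(x, ...) {
  # the first argument must be called x to match the generic print(x, ...)
  cat("it worked!\n")
  invisible(x)
}

wg <- structure(list(), class = "webglobe")  # hypothetical webglobe object
print(wg)                                    # dispatches to print.webglobe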

Related

Changing imported R function globally

I want to globally add a parameter to a function after importing it, so that future calls always use that parameter.
In this case, I want to add the function parameter in_schema("abc") to the function tbl from dplyr.
Normally, I would take the source code, modify the function's parameters, save it and source it. But in this case, I am already failing to get a proper source file:
getAnywhere("tbl.DBIConnection")
A single object matching 'tbl.DBIConnection' was found
It was found in the following places
  registered S3 method for tbl from namespace dplyr
  namespace:dplyr
with value
function (src, from, ...)
{
    check_dbplyr()
    tbl(dbplyr::src_dbi(src, auto_disconnect = FALSE), from = from,
        ...)
}
How could I modify the tbl function (in a script file) so that future calls always use a certain schema, like so:
tbl(connection, table, in_schema("abc"))
without having to provide the in_schema parameter every time?
Don't copy and modify the function; that's messy. Do something like this instead:
tbl_abc <- function(src, from, ...){
  tbl(src, in_schema("abc", from), ...)
}
By the way, tbl(connection, table, in_schema("abc")) is improper syntax: in_schema("abc") needs a second argument, and as written it would be passed to the ..., which tbl.DBIConnection() does not use.
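For example, a usage sketch with the tbl_abc() wrapper above (con is assumed to be an existing DBIConnection and "orders" a hypothetical table in the abc schema; neither comes from the question):

library(dplyr)
library(dbplyr)   # provides in_schema()

# con is an existing DBIConnection (assumption)
orders <- tbl_abc(con, "orders")    # lazily refers to abc.orders
orders %>% head()                   # the query runs in the database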

Call Arguments of Function inside Function / R language

I have a function:
func <- function (x)
{
  arguments <- match.call()
  return(arguments)
}
1) If I call my function, specifying the argument value directly in the call:
func("value")
I get:
func(x = "value")
2) If I call my function by passing a variable:
my_variable <-"value"
func(my_variable)
I get:
func(x = my_variable)
Why are the first and second results different?
Can I somehow get func(x = "value") in the second call as well?
I think my problem is that the environment inside a function simply doesn't contain values when they were passed via variables; it only contains the variable names for later lookup. Is there a way to follow such a reference and get the value from inside the function?
In R, when you pass my_variable as formal argument x into a function, the value of my_variable will only be retrieved when the function tries to read x (if it does not use x, my_variable will not be read at all). The same applies when you pass more complicated arguments, such as func(x = compute_my_variable()) -- the call to compute_my_variable will take place when func tries to read x (this is referred to as lazy evaluation).
Given lazy evaluation, what you are trying to do is not well defined because of side effects: in which order would you like to evaluate the arguments? Which arguments would you like to evaluate at all? (Note that a function can take an expression for its argument using substitute, without evaluating it.) As a side effect, compute_my_variable could modify something that would impact the result of another argument of func. This can happen even when you only pass variables and constants as arguments: func could modify some of the variables that will be read later, or even reading a variable such as my_variable could trigger code that modifies variables read later (e.g. with active bindings or delayed assignment).
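A minimal sketch of lazy evaluation in action (the message() calls are only there to show when evaluation happens):

f <- function(x) {
  message("function body has started")
  x   # the promise for x is forced only here
}

f({ message("argument is being evaluated"); 42 })
# function body has started
# argument is being evaluated
# [1] 42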
So, if all you want to do is log how a function was called, you can use sys.call (or match.call, though that indeed expands argument names, etc.). If you want a more complete stack trace, you can use e.g. traceback(1).
If for some reason you really want the values of all arguments, say as if they were all read in the order of match.call (which is the order in which they are declared), you can do it using eval (this returns them as a list):
lapply(as.list(match.call())[-1], eval)
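For instance, a sketch that evaluates each captured argument in the caller's frame, so a symbol such as my_variable is resolved to its value:

func <- function(x) {
  caller <- parent.frame()
  # evaluate every argument expression in the environment of the caller
  lapply(as.list(match.call())[-1], eval, envir = caller)
}

my_variable <- "value"
func(my_variable)
# $x
# [1] "value"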
Can't you simply return(paste('func(x =', x, ')'))?

Inherit R function argument from global variable

I am trying to create a function where the default argument is given by a variable that only exists in the environment temporarily, e.g.:
arg <- 1:10
test <- function(x = arg[3]) { 2 * x }
> test()
[1] 6
The above works fine, as long as arg still exists when test() is called. However, if I remove arg:
> rm(arg)
> test()
Error in test() : object 'arg' not found
Is there a way such that the default argument is taken as 3, even when arg ceases to exist? I have a feeling the correct answer involves some mixture of eval, quote and/or substitute, but I can't seem to find the correct incantation.
The proper way to do it in my opinion would be:
test <- function(x = 3) { 2 * x }
and then call it with an argument:
arg <- 1:10
test(arg[3])
This way the default value is 3; you pass whatever argument you wish at runtime, and if you call test() without an argument, it uses the default.
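Putting the two calls side by side:

test <- function(x = 3) { 2 * x }
arg <- 1:10
test(arg[3])   # explicit argument: 6
test()         # no argument, default x = 3 is used: 6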
The post above got me on the right track. Using formals:
arg <- 1:10
test <- function(x) { x * 2 }
# the default for x becomes the current value of arg[3], i.e. 3
formals(test)$x <- eval(arg[3])
rm(arg)
test()
[1] 6
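The default is now the value 3 itself, not the expression arg[3], which is why removing arg no longer matters:

formals(test)$x
# [1] 3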
And that is what I was looking to achieve.

Why can't I call the methods method on a Perl 6 ClassHOW object?

I can call ^methods on an object and list the method names I can call:
my $object = 'Camelia';
my @object_methods = $object.^methods;
@object_methods.map( { .gist } ).sort.join("\n").say;
^methods returns a list, which I store in @object_methods; later I transform that list of method thingys by calling gist on each one to get its human-sensible form.
But the ^ in ^methods is an implied .HOW, so, as shown at the end of the object documentation, this should work too:
my $object = 'Camelia';
my @object_methods = $object.HOW.methods;
But, I get an error:
Too few positionals passed; expected 2 arguments but got 1
in any methods at gen/moar/m-Metamodel.nqp line 490
in block <unit> at...
And, for what it's worth, this is an awful error message for a language that's trying to be person-friendly about that sort of thing. The file m-Metamodel.nqp isn't part of my perl6 installation. It's not even something I can google because, as the path suggests, it's something that a compilation generates. And, that compilation depends on the version.
A regular method call via . passes the invocant as implicit first argument to the method. A meta-method call via .^ passes two arguments: the meta-object as invocant, and the instance as first positional argument.
For example
$obj.^can('sqrt')
is syntactic sugar for
$obj.HOW.can($obj, 'sqrt')
In your example, this would read
my @object_methods = $object.HOW.methods($object);

Overloading R function - is this right?

consumeSingleRequest <- function(api_key, URL, columnNames, globalParam="", ...)
consumeSingleRequest <- function(api_key, URL, columnNames, valuesList, globalParam="")
I am trying to overload a function like this; the first version takes in multiple lists and combines them into one list of lists. However, I don't seem to be able to skip passing in globalParam and pass in only the multiple lists via ...
Does anyone know how to do that?
I've heard S3 methods could be used for that? Does anyone know how?
R doesn't support the concept of overloading functions; it supports function calls with a variable number of arguments. So you can declare a function with any number of arguments, but supply only a subset of those when actually calling it. Take the vector function as an example:
> vector
function (mode = "logical", length = 0L)
.Internal(vector(mode, length))
<bytecode: 0x103b89070>
<environment: namespace:base>
It supports up to 2 parameters, but can be called with none or only a subset of them (in that case default values are used):
> vector()
logical(0)
> vector(mode='numeric')
numeric(0)
So you only need the second declaration:
consumeSingleRequest <- function(api_key, URL, columnNames, valuesList, globalParam="")
and then just supply the needed parameters when actually calling the function:
consumeSingleRequest(api_key = ..., valuesList = ...)
P.S. A good explanation can be found in the Advanced R book.
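A minimal sketch along these lines (the argument names come from the question; the body is only a placeholder):

consumeSingleRequest <- function(api_key, URL, columnNames, valuesList, globalParam = "") {
  # placeholder body: just echo what was received
  list(columnNames = columnNames, valuesList = valuesList, globalParam = globalParam)
}

# globalParam is skipped and falls back to its default "";
# several lists are combined into one list of lists by the caller:
consumeSingleRequest(api_key     = "key",
                     URL         = "http://example.com",
                     columnNames = c("a", "b"),
                     valuesList  = list(list(1, 2), list(3, 4)))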
