@import for package variable definition

Is the following a correct way to use the roxygen @import directive when defining package variables? I suspect not, because trying to use the db function yields Error in connection_quote_identifier(conn@ptr, x) : Invalid connection.
The connection should be valid: I can run the same code outside the package (after calling library(dplyr); library(RPostgres)) and it works.
#' @import RPostgres
cxn <- dbConnect(Postgres(), service = 'plasticemission')

#' @import dplyr
#' @export
db <- function(name) {
  tbl(cxn, name)
}

As an extension of my comment, consider changing it to:
cxn <- NULL

#' @param name character, name of the table
#' @param con database (DBI) connection object
#' @import DBI
#' @import RPostgres
#' @import dplyr
#' @export
db <- function(name, con = NULL) {
  if (is.null(con)) {
    # lazily create and cache the package-level connection on first use
    if (is.null(cxn)) {
      cxn <<- DBI::dbConnect(RPostgres::Postgres(), service = 'plasticemission')
    }
    con <- cxn
  }
  dplyr::tbl(con, name)
}
This way, you do not attempt to instantiate the connection when the package is loaded, only when the function is first called. I also provided an override-able argument con in the call, in case you ever need multiple, temporary, or non-standard connections.
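For illustration, a possible call sequence (the table name "events" is hypothetical):

```lang-r
events <- db("events")   # first call: creates and caches the connection
more <- db("events")     # later calls reuse the cached cxn

# or supply your own temporary connection explicitly
tmp <- DBI::dbConnect(RPostgres::Postgres(), service = 'plasticemission')
events2 <- db("events", con = tmp)
DBI::dbDisconnect(tmp)
```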
If you really do want it to be connected as soon as you load the package, consider using .onLoad (or .onAttach), with something like:
# it might be better defined here, not in your other function, but once is enough
cxn <- NULL

.onLoad <- function(libname, pkgname) {
  if (getOption("mypackage.autocxn", default = FALSE)) {
    if (is.null(cxn)) {
      cxn <<- DBI::dbConnect(RPostgres::Postgres(), service = 'plasticemission')
    }
  }
}
In this case, I made auto-connect an explicit opt-in by requiring a global option, "mypackage.autocxn" (here a logical). The name of the option is completely arbitrary, and you can do all sorts of flexible things with it, such as:
.onLoad <- function(libname, pkgname) {
  if (length(opts <- getOption("mypackage.autocxn", default = NULL)) &&
      is.null(cxn)) {
    cxn <<- do.call(DBI::dbConnect, c(list(drv = RPostgres::Postgres()), opts))
  }
}
And somewhere in your personal setup (perhaps your ~/.Rprofile), you can set
```lang-r
options(mypackage.autocxn = list(service = 'plasticemission'))
```
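With either variant, a hypothetical session then looks like:

```lang-r
options(mypackage.autocxn = TRUE)   # or the list form shown above
library(mypackage)                  # .onLoad() creates the connection
db("mytable")                       # reuses the cached cxn ("mytable" is made up)
```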

Related

R Package passing function through to new function

I'm creating a new package as a learning exercise. I've selected a few functions that serve the purpose I need and want to bundle them together into a single new package, to which I can then apply gWidgets to make them GUI-driven.
Documentation on how to pass existing functions through has been pretty sparse, and I am a novice at this. Any assistance would be appreciated.
I've added the necessary imports to my DESCRIPTION using usethis::use_package(), updated the NAMESPACE (using roxygen2), and created the .R files using a Stack Overflow answer as a framework. A sample .R file looks like this:
#' ODBC list drivers
#'
#' @export
odbcListDrivers <- function() {
  odbc::odbcListDrivers()
}
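Calling the wrapper behaves exactly like calling the odbc function directly:

```lang-r
odbcListDrivers()   # returns a data frame describing the installed ODBC drivers
```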
This works. But when it comes to the point where I need to pass information through a more advanced function:
#'
#' @export
#'
DBconnect <- function() {
  DBI::dbConnect()
}
I get an unused argument error when I try to run the test code.
con <- DBconnect(RMySQL::MySQL(),
                 dbname = "sakila",
                 host = "192.168.50.71",
                 port = 49153,
                 user = "root",
                 password = "Camplovers01")
Error in DBconnect(RMySQL::MySQL(), dbname = "sakila", host = "192.168.50.71", :
unused arguments (dbname = "sakila", host = "192.168.50.71", port = 49153, user = "root", password = "Camplovers01")
The question, then, is: when I wrap a function like the one above, how can I make sure I pass the correct arguments through? The function I am trying to pass through is DBI::dbConnect.
OK, asked and then answered by myself. The answer is in the error: unused arguments. I need to place the arguments from the source function in the signature of the function I create; these then pass through to the original function and return the correct response. odbcListDrivers worked because it had no arguments to pass or expect.
Example
Old .R file
#'
#' @export
#'
DBconnect <- function() {
  DBI::dbConnect()
}
This of course fails, which was the reason for my question.
New .R file
#'
#' @export
#'
DBconnect <- function(dsn = NULL,
                      ...,
                      timezone = "UTC",
                      timezone_out = "UTC",
                      encoding = "",
                      bigint = c("integer64", "integer", "numeric", "character"),
                      timeout = Inf,
                      driver = NULL,
                      server = NULL,
                      database = NULL,
                      uid = NULL,
                      pwd = NULL,
                      dbms.name = NULL,
                      .connection_string = NULL) {
  # pass the arguments through to the original function
  # (this signature mirrors odbc's dbConnect() method, so forward to it)
  DBI::dbConnect(odbc::odbc(),
                 dsn = dsn,
                 ...,
                 timezone = timezone,
                 timezone_out = timezone_out,
                 encoding = encoding,
                 bigint = match.arg(bigint),
                 timeout = timeout,
                 driver = driver,
                 server = server,
                 database = database,
                 uid = uid,
                 pwd = pwd,
                 dbms.name = dbms.name,
                 .connection_string = .connection_string)
}
Finding the arguments of the function was a simple matter of reviewing the .R file in the GitHub repo for the package in question.
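For what it's worth, if you do not need to expose the full argument list, a thinner wrapper that forwards everything through ... achieves the same pass-through. A sketch (here the driver is passed explicitly, which matches the test call above):

```lang-r
#'
#' @export
#'
DBconnect <- function(drv, ...) {
  # forward the driver and all named arguments straight to DBI::dbConnect()
  DBI::dbConnect(drv, ...)
}

con <- DBconnect(RMySQL::MySQL(),
                 dbname = "sakila",
                 host = "192.168.50.71",
                 port = 49153,
                 user = "root",
                 password = "Camplovers01")
```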

Allowing user to change R package global variable

I want to make an R package called my_package with the following behavior:
library(my_package)
my_package::get_username() # should throw error "no username set"
my_package::set_username("john doe")
my_package::get_username() # should print "john doe"
I'm not sure how to do this. I tried it with the following inside an R file test.R, and using source('test.R') it works. But I don't know what the proper way to do this would be when creating a package.
pkg_env <- new.env()
pkg_env$username <- NULL

set_username <- function(username) {
  pkg_env$username <- username
}

get_username <- function() {
  if (is.null(pkg_env$username)) {
    stop("no username set")
  } else {
    print(pkg_env$username)
  }
}
The easiest way of doing what you want is to use options. For example, devtools lets you set a variety of options using this technique - see package?devtools.
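A minimal sketch of the options-based approach (the option name "my_package.username" is an assumption):

```lang-r
set_username <- function(username) {
  options(my_package.username = username)
}

get_username <- function() {
  username <- getOption("my_package.username", default = NULL)
  if (is.null(username)) stop("no username set")
  username
}
```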
If you want to use environments, then you need to create the environment when the package is loaded, using the .onLoad function.
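A sketch of that approach, reusing the pkg_env name from the question:

```lang-r
pkg_env <- NULL

.onLoad <- function(libname, pkgname) {
  # create the package state environment when the package is loaded
  pkg_env <<- new.env(parent = emptyenv())
  pkg_env$username <- NULL
}
```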

How to integrate new R6Class functions stored in independent files into an existing R package?

I'm working on an existing program that uses a system of R6Class modules. An example module is stored in a file son.R and looks like:
#' @importFrom R6 R6Class
son_class <- R6Class("son", inherit = mother_class,
  private = list( ... Some private elements ... ),
  public = list(
    initialize = function(x, y, z) {
      ...Some code...
      super$initialize(x, y)
    },
    calculate = function(x, y, z) {
      ... More Code ...
      calc_son(x, y, z)
    }
  )
)

#' @inheritParams ...
#' @return ...
#' @template ...
#' @examples ...
#' @export
son <- function(x = "son", y, z) {
  son_class$new(x, y, z)
}
The authors of the package say that the way to create new modules is by creating new R6 classes that inherit from mother_class. Thus I created a daughter.R file that looks almost the same, just changing son to daughter, but when I try to compile, I get the following error:
==> R CMD INSTALL --no-multiarch --with-keep.source mypackage
Error in .install_package_code_files(, instdir = ".") :
  missing files in 'path/to/mypackage/R' in the 'Collate': daughter.R
ERROR: unable to collate and parse R files for package ‘mypackage’
What might be the source of this error? I am following the authors' instructions verbatim.
I found a way to do it (sort of). I added to the NAMESPACE a line saying
export(daughter)
and, in the DESCRIPTION file, added the file under the Collate field:
Collate:
    'daughter.R'
After compiling the package, everything seems OK and the function is fully integrated, except that it lacks documentation. I am 100% sure that altering DESCRIPTION and NAMESPACE by hand is not standard practice, and I will create a follow-up question regarding that:
Roxygen2: "Error in loadNamespace(name) : there is no package called ‘testthat’"?
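For context, the standard alternative to hand-editing those files is to let roxygen2 regenerate them by running devtools::document(). A sketch, assuming the package already uses roxygen2 and that son.R (or whichever file defines mother_class) must be collated first:

```lang-r
#' @include son.R
#' @export
daughter <- function(x = "daughter", y, z) {
  daughter_class$new(x, y, z)
}
```

The @include tag makes roxygen2 place son.R before daughter.R in the Collate field, and @export regenerates the export(daughter) line in the NAMESPACE.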

Patch base::library with wrapper in R

Inside an R package, I'm trying to patch the base::library() function to specifically set the position of loaded packages in the search path. I have defined several environments (all named env:<something>) and want to make sure that libraries are placed below these environments in the search path.
# wrap around the library() function
library = function(..., pos = NULL) {
  print("NEW LIBRARY FUNCTION!")
  if (is.null(pos)) {
    pos <- grep("env:", search())
    pos <- if (length(pos) == 0) 2 else max(pos) + 1
  }
  base::library(..., pos = pos)
}
When I assign this function in the console, everything runs fine:
> library(stats)
[1] "NEW LIBRARY FUNCTION!"
> eval(parse(text = "library(stats)"))
[1] "NEW LIBRARY FUNCTION!"
> eval(parse(text = "library(stats)"), envir = globalenv())
[1] "NEW LIBRARY FUNCTION!"
When I define the above wrapper function inside my package, build it, and load it in a new R session, the following executes as expected:
> library(mypackage)
> mypackage:::library(stats)
[1] "NEW LIBRARY FUNCTION!"
But, when using eval() with the envir argument inside a function in mypackage, my new definition of library() is not retrieved:
# functions defined in mypackage
testlibrary1 = function(...) {
  library(...)
}

testlibrary2 = function(code) {
  eval(parse(text = code))
}

testlibrary3 = function(code) {
  eval(parse(text = code), envir = globalenv())
}
In console, I get the following results:
> mypackage:::testlibrary1(stats)
[1] "NEW LIBRARY FUNCTION!"
> mypackage:::testlibrary2("library(stats)")
[1] "NEW LIBRARY FUNCTION!"
> mypackage:::testlibrary3("library(stats)")
>
The last function, testlibrary3(), did not use the new wrapper function.
I want all functions that call library() inside mypackage to use my wrapper function. Can somebody help me out?
I guess the problem is the following, but since your question does not include a fully reproducible example (e.g., by uploading the package somewhere), it is difficult to tell.
As long as your library function is not exported from your package via the NAMESPACE, it is not visible from the global environment. Consequently, the only library function available to eval() there is base::library().
Note that while your function resides in the namespace of the package, the calling environment for mypackage:::testlibraryX() is still the global environment, where your library function is not available. Try to export it and see if that helps.
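A sketch of that suggestion, assuming roxygen2 generates the NAMESPACE:

```lang-r
#' Wrapper around base::library() that loads packages below the "env:" entries
#' @export
library <- function(..., pos = NULL) {
  print("NEW LIBRARY FUNCTION!")
  if (is.null(pos)) {
    pos <- grep("env:", search())
    pos <- if (length(pos) == 0) 2 else max(pos) + 1
  }
  base::library(..., pos = pos)
}
```

Once the function is exported and the package attached, package:mypackage sits ahead of package:base on the search path, so the eval() in the global environment resolves library to the wrapper (R will print a masking note when the package is attached).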

How to pass object in nested functions?

I'm trying to override save() in R so that it creates any missing directories before saving an object. I'm having trouble passing an object through one function to another via the ellipsis (...) argument.
My example:
save <- function(..., file) {  # overridden save()
  target.dir <- dirname(file)  # extract the target directory
  if (!file.exists(target.dir)) {
    # create the target directory if it doesn't exist
    dir.create(target.dir, showWarnings = TRUE, recursive = TRUE)
  }
  base::save(..., file = file.path(target.dir, basename(file)))
}
fun1 <- function(obj) {
  obj1 <- obj + 1
  save(obj1, file = "~/test/obj.RData")
}

fun1(obj = 1)
The code above results in this error:
Error in base::save(..., file = file.path(target.dir, basename(file))) :
object ‘obj1’ not found
I realize that the problem is that the object 'obj1' doesn't exist inside my custom save() function, but I haven't yet figured out how to pass it from fun1 to base::save.
I have tried:
base::save(parent.frame()$...,file=file.path(target.dir,basename(file)))
and:
base::save(list=list(...),file=file.path(target.dir,basename(file)))
with no success.
Any suggestions?
You need to pass the parent frame's environment to base::save():
save <- function(..., file) {  # overridden save()
  target.dir <- dirname(file)  # extract the target directory
  if (!file.exists(target.dir)) {
    # create the target directory if it doesn't exist
    dir.create(target.dir, showWarnings = TRUE, recursive = TRUE)
  }
  base::save(..., file = file.path(target.dir, basename(file)), envir = parent.frame())
}
Note the envir parameter added to the base::save() call.
fun1 <- function(obj) {
  obj1 <- obj + 1
  save(obj1, file = "~/test/obj.RData")
}
In addition, use '=' to specify parameter names:
fun1(obj = 1)
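A quick check of the combined behavior (the path is illustrative):

```lang-r
fun1(obj = 1)             # creates ~/test/ if needed, then saves obj1
load("~/test/obj.RData")  # restores obj1 into the workspace
obj1
#> [1] 2
```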
