R Package passing function through to new function

I'm creating a new package as a learning exercise. I've selected a few functions that serve the purpose I need and want to bundle them together into a single new package that I can then hook up to gWidgets to make them GUI driven.
Documentation on how to pass existing functions through has been pretty sparse, and I am a novice at this. Any assistance would be appreciated.
I've added the necessary imports to my DESCRIPTION using usethis::use_package(), updated the NAMESPACE (using roxygen2), and created the .R files using a Stack Overflow answer as a framework. A sample .R file looks like this:
#' ODBC list drivers
#'
#' @export
odbcListDrivers <- function() {
  odbc::odbcListDrivers()
}
This works.
But when it comes to the point where I need to pass arguments through to a more advanced function:
#'
#' @export
#'
DBconnect <- function() {
  DBI::dbConnect()
}
I get an unused argument error when I try to run the test code:
con <- DBconnect(RMySQL::MySQL(),
                 dbname = "sakila",
                 host = "192.168.50.71",
                 port = 49153,
                 user = "root",
                 password = "Camplovers01")
Error in DBconnect(RMySQL::MySQL(), dbname = "sakila", host = "192.168.50.71", :
unused arguments (dbname = "sakila", host = "192.168.50.71", port = 49153, user = "root", password = "Camplovers01")
The question, then: when I wrap a function like the one above, how can I make sure I pass the correct arguments through to it?

OK, asked and then answered by myself. The answer is in the error: unused arguments. I need to declare the arguments of the source function in the function I create; these then pass through to the original function and return the correct response. odbcListDrivers worked because it had no arguments to pass or expect.
Example
Old .R file:
#'
#' @export
#'
DBconnect <- function() {
  DBI::dbConnect()
}
This of course fails, which was the reason for my question.
New .R file:
#'
#' @export
#'
DBconnect <- function(dsn = NULL,
                      ...,
                      timezone = "UTC",
                      timezone_out = "UTC",
                      encoding = "",
                      bigint = c("integer64", "integer", "numeric", "character"),
                      timeout = Inf,
                      driver = NULL,
                      server = NULL,
                      database = NULL,
                      uid = NULL,
                      pwd = NULL,
                      dbms.name = NULL,
                      .connection_string = NULL) {
  # the argument list mirrors odbc's dbConnect() method,
  # so forward everything on to it
  DBI::dbConnect(odbc::odbc(),
                 dsn = dsn,
                 ...,
                 timezone = timezone,
                 timezone_out = timezone_out,
                 encoding = encoding,
                 bigint = bigint,
                 timeout = timeout,
                 driver = driver,
                 server = server,
                 database = database,
                 uid = uid,
                 pwd = pwd,
                 dbms.name = dbms.name,
                 .connection_string = .connection_string)
}
Finding the arguments of the function was a simple matter of reviewing the .R file in the GitHub repo for the package in question.
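An alternative worth noting (my addition, not part of the original answer): if you don't need the full argument list in the wrapper's signature, accepting a driver plus ... and forwarding everything keeps the wrapper from going stale when the wrapped function gains arguments. A minimal sketch:
# Thin wrapper: accept any DBI driver and forward all remaining
# arguments untouched to DBI::dbConnect()
DBconnect <- function(drv, ...) {
  DBI::dbConnect(drv, ...)
}

# usage then mirrors DBI::dbConnect() directly:
# con <- DBconnect(RMySQL::MySQL(), dbname = "sakila",
#                  host = "192.168.50.71", port = 49153,
#                  user = "root", password = "Camplovers01")
The trade-off is that the individual arguments no longer show up in the signature or in autocompletion, which is exactly what the explicit list above buys you.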

Related

@import for package variable definition

Is the following a correct way to use the roxygen @import directive when defining package variables? I suspect not, because trying to use the db function yields Error in connection_quote_identifier(conn@ptr, x) : Invalid connection.
The connection should be valid: I can run the same code outside the package (after calling library(dplyr); library(RPostgres)) and it works.
#' @import RPostgres
cxn <- dbConnect(Postgres(), service = 'plasticemission')

#' @import dplyr
#' @export
db <- function(name) {
  tbl(cxn, name)
}
Extending my comment, consider changing it to:
cxn <- NULL

#' @param name character, name of the table
#' @param con database (DBI) connection object
#' @import DBI
#' @import RPostgres
#' @import dplyr
#' @export
db <- function(name, con = NULL) {
  if (is.null(con)) {
    if (is.null(cxn)) {
      # first use: create the shared connection and cache it
      cxn <<- DBI::dbConnect(RPostgres::Postgres(), service = 'plasticemission')
    }
    con <- cxn
  }
  dplyr::tbl(con, name)
}
This way, you do not attempt to create the connection when the package is loaded, only when the function is first called. I provided an override-able connection argument (con) in case you ever need multiple, temporary, or non-standard connections.
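One caveat worth adding (my note, not from the original answer): in an installed package the namespace is locked after loading, so the cxn <<- assignment inside db() will fail at runtime with "cannot change value of locked binding". A common workaround is to keep mutable state in a package-local environment; a minimal sketch, reusing the names above:
# environments stay mutable even after the namespace is locked
.state <- new.env(parent = emptyenv())

db <- function(name, con = NULL) {
  if (is.null(con)) {
    if (is.null(.state$cxn)) {
      # first use: create and cache the shared connection
      .state$cxn <- DBI::dbConnect(RPostgres::Postgres(),
                                   service = 'plasticemission')
    }
    con <- .state$cxn
  }
  dplyr::tbl(con, name)
}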
If you really do want it to connect as soon as you load the package, consider using .onLoad (or .onAttach), with something like:
# it might be better defined here than in your other function, but once is enough
cxn <- NULL

.onLoad <- function(libname, pkgname) {
  if (getOption("mypackage.autocxn", default = FALSE)) {
    if (is.null(cxn)) {
      cxn <<- DBI::dbConnect(RPostgres::Postgres(), service = 'plasticemission')
    }
  }
}
In this case, I made it an explicit requirement for auto-connect that a global option "mypackage.autocxn" be set (here as a logical). The name of the option is completely arbitrary, and you can do all sorts of flexible things here, such as:
.onLoad <- function(libname, pkgname) {
  if (length(opts <- getOption("mypackage.autocxn", default = NULL)) &&
      is.null(cxn)) {
    cxn <<- do.call(DBI::dbConnect, c(list(drv = RPostgres::Postgres()), opts))
  }
}
And somewhere in your personal setup (perhaps your ~/.Rprofile), you can set
options(mypackage.autocxn = list(service = 'plasticemission'))

What is the standard way to set default paths for function arguments in R packages?

Setting default values for R function arguments is straightforward, e.g.
myfunction <- function(x, k = 42, c = 1) {
  result <- x * x + k - c
  return(result)
}
Here, by default k = 42 and c = 1, while x is a required argument.
I'm creating an R package in which I would like some arguments to default to files. (In this case, these could either be variables loaded via .rda files, or actual text or CSV files.)
To provide the path to a file in inst/extdata, the documentation (http://r-pkgs.had.co.nz/inst.html) recommends the following: to find inst/extdata/mydata.csv, you'd call
system.file("extdata", "mydata.csv", package = "mypackage")
What is the recommended way to create function arguments which default to a certain file?
I think directly linking to the files would not be the best approach, e.g.
do_something_with_data <- function(file = system.file("extdata", "mydata.csv", package = "mypackage")) {
  data.table::fread(file)
  ...
}
Another approach could be to set all such arguments to NULL and fall back to the default file if nothing else is supplied:
do_something_with_data2 <- function(file = NULL) {
  if (is.null(file)) {
    file <- system.file("extdata", "mydata.csv", package = "mypackage")
  }
  ...
}
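A small refinement on the second approach (my suggestion, not from the original post): system.file() returns "" when the file cannot be found, which surfaces later as a confusing fread() error; passing mustWork = TRUE makes the failure explicit at the lookup site:
do_something_with_data3 <- function(file = NULL) {
  if (is.null(file)) {
    # error immediately (instead of returning "") if the default file is absent
    file <- system.file("extdata", "mydata.csv",
                        package = "mypackage", mustWork = TRUE)
  }
  data.table::fread(file)
}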

Include query with R package

I have a SQL query that I would like to ship with an R package I have built, but when I try to include it in the inst or extdata or data folders inside my R package, I don't know how to get the function to reference it. An example: the query file is myQuery.sql.
runDbQuery <- function() {
  queryfile <- 'folder/myQuery.sql'
  query <- readChar(queryfile, file.info(queryfile)$size)
  require(RODBC)
  channel <- odbcConnect("mydb", uid = "uid", pwd = "pwd")
  dbResults <- sqlQuery(channel = channel, query = query, as.is = TRUE)
  close(channel)
  return(dbResults)
}
I put the .sql files I use in packages in inst/sql/ and then get the path to them in functions via:
system.file("sql/myquery.sql", package = "mypackage")

Subclass and parent class not in the same file causes error in R

Here is the example from the help page again:
mEdit <- setRefClass("mEdit",
  fields = list(data = "matrix",
                edits = "list")
)
mEdit$methods(
  initialize = function(data = matrix()) {
    .self$data <- data
  }
)
mv <- setRefClass("matrixViewer",
  fields = c("viewerDevice", "viewerFile"),
  contains = "mEdit"
)
mv$methods(
  initialize = function(file = "./matrixView.pdf", ...) {
    viewerFile <<- file
    pdf(viewerFile)
    viewerDevice <<- dev.cur()
    dev.set(dev.prev())
    callSuper(...)
  },
  finalize = function() {
    dev.off(viewerDevice)
  }
)
There is no problem here, but if I put the mv class into a different file, say mv.R, then R complains:
Loading testRefClass
Error in getClass(what, where = where) (from mv.R#1) : “mEdit” is not a defined class
You have to include the other file where mEdit is defined via source(), as mv depends on it. If you have a large project with plenty of classes and files, think about developing a package, where the folder with your R files is loaded automatically, and only the files that have changed since the last load are re-sourced.
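Within a package, file ordering is controlled by the Collate field in DESCRIPTION; if you use roxygen2, you can declare the dependency with @include and let it generate Collate for you. A sketch of how mv.R could start (assuming the parent class lives in mEdit.R):
#' @include mEdit.R
NULL

mv <- setRefClass("matrixViewer",
  fields = c("viewerDevice", "viewerFile"),
  contains = "mEdit"
)
roxygen2 then emits a Collate entry that lists mEdit.R before mv.R, so the parent class is always defined first.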

Check that connection is valid

I'm using RPostgreSQL and sqldf inside my function like this:
MyFunction <- function(Connection) {
  options(sqldf.RPostgreSQL.user = Connection[1],
          sqldf.RPostgreSQL.password = Connection[2],
          sqldf.RPostgreSQL.dbname = Connection[3],
          sqldf.RPostgreSQL.host = Connection[4],
          sqldf.RPostgreSQL.port = Connection[5])
  # ... some sqldf() stuff
}
How do I test that the connection is valid?
You can check that an existing connection is valid using isPostgresqlIdCurrent.
conn <- dbConnect("PostgreSQL", your_database_details)
isPostgresqlIdCurrent(conn)
For testing new connections, I don't think that there is a way to know if a connection is valid without trying it. (How would R know that the database exists and is available until it tries to connect?)
For most analysis purposes, just stopping on an error and fixing the login details is the best approach. So just call dbConnect and don't worry about extra check functions.
If you are creating some kind of application where you need to handle errors gracefully, a simple tryCatch wrapper should do the trick.
conn <- tryCatch(dbConnect(wherever), error = function(e) do_something)
My current design uses tryCatch:
Connection <- c('usr', 'secret', 'db', 'host', '5432')

CheckDatabase <- function(Connection) {
  require(sqldf)
  require(RPostgreSQL)
  options(sqldf.RPostgreSQL.user = Connection[1],
          sqldf.RPostgreSQL.password = Connection[2],
          sqldf.RPostgreSQL.dbname = Connection[3],
          sqldf.RPostgreSQL.host = Connection[4],
          sqldf.RPostgreSQL.port = Connection[5])
  out <- tryCatch(
    {
      sqldf("select TRUE;")
    },
    error = function(cond) {
      FALSE
    }
  )
  return(out)
}
if (!CheckDatabase(Connection)) {
  stop("Not a valid PostgreSQL connection.")
} else {
  message("PostgreSQL connection is valid.")
}
One approach is to simply try executing the code and catch any errors with a nice, informative error message. Have a look at the documentation of tryCatch to see the details of how this works.
The following blog post provides an introduction to the exception-based style of programming.
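For completeness (a newer alternative, not mentioned in the answers above): DBI now defines a generic validity check, dbIsValid(), which backends such as RPostgres implement:
con <- DBI::dbConnect(RPostgres::Postgres(), service = 'plasticemission')
DBI::dbIsValid(con)  # TRUE while the connection is open
DBI::dbDisconnect(con)
DBI::dbIsValid(con)  # FALSE once it has been closed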
