Error in dbDriver("PostgreSQL") : could not find function "dbDriver" - r

I have shiny-server set up on an Amazon Web Services instance. I am trying to get my app.R onto it, but am getting this error:
Error in dbDriver("PostgreSQL") : could not find function "dbDriver"
Calls: runApp ... sourceUTF8 -> eval -> eval -> ..stacktraceon.. -> get_query
Execution halted
I think it has to do with the library install of the DBI package, but I've tried installing it again on the instance and haven't been successful.
Not sure what to try next.
The full error output is shown above, and I can add any other information required.
Also, I can confirm that shiny-server is installed correctly because its default page loads normally.
This is how I've tried to install my packages in the instance:
sudo su - -c "R -e \"install.packages(c('shiny', 'shinythemes', 'shinycssloaders', 'dplyr', 'xlsx', 'ggplot2', 'ggthemes', 'DT', 'stringr', 'RPostgreSQL', 'tidyr', 'dbplyr', 'DBI', 'splitstackshape'), repos='http://cran.rstudio.com/')\""
dbDriver is a function in the DBI package.
This is part of what my app.R code contains:
required_packages <- c("shiny", "shinythemes", "shinycssloaders", "dplyr", "xlsx", "ggplot2", "ggthemes", "DT", "stringr",
                       "RPostgreSQL", "tidyr", "dbplyr", "DBI", "splitstackshape", "magrittr", "tidyverse", "shinyjs", "data.table", "plotly")
absent_packages <- required_packages[!(required_packages %in% installed.packages()[,"Package"])]
if(length(absent_packages)) install.packages(absent_packages)
set.seed(1)
get_query <- function(querystring){
  # create a connection
  # loads the PostgreSQL driver
  drv <- dbDriver("PostgreSQL")
  # creates a connection to the postgres database
  # note that "con" will be used later in each connection to the database
  con <- dbConnect(drv, dbname = "postgres", host = "/var/run/postgresql", port = 5432, user = "postgres", password = "pw")
  on.exit(dbDisconnect(con))
  #rstudioapi::askForPassword("Database password")
  query <- eval(parse(text = querystring))
  return(query)
}
And these are the tables and the connection info for the PostgreSQL database on the same instance.
If I add DBI:: in front of dbConnect() and dbDisconnect() and use RPostgres::Postgres() as the driver in the dbConnect() call, I get a different error.

Installing a package does not mean it is loaded into your namespace. Further, the use of dbDriver is deprecated, as shown in ?dbDriver:
These methods are deprecated, please consult the documentation of the individual backends for the construction of driver instances.
I suggest either explicitly loading DBI or using DBI:: with each call to its functions (not a bad idea anyway):
library(DBI)
get_query <- function(querystring){
  # creates a connection to the postgres database
  # note that "con" will be used later in each connection to the database
  con <- DBI::dbConnect(RPostgres::Postgres(), dbname = "postgres", host = "/var/run/postgresql", port = 5432, user = "postgres", password = "pw")
  on.exit(DBI::dbDisconnect(con))
  #rstudioapi::askForPassword("Database password")
  query <- eval(parse(text = querystring))
  return(query)
}
(Again, you don't need to do both library(DBI) and use DBI::, you choose.)
I used RPostgres::Postgres() here, but this applies also to many other drivers, including RPostgreSQL::PostgreSQL(), RSQLite::SQLite(), and odbc::odbc() (several others exist).
Further points, though I don't know what else you have going on here to be certain:
making a connection each time you call this function can get "expensive"; consider connecting outside of this function and passing in your con object; if this is a one-or-two-times thing, then you might be alright as-is;
the use of eval(parse(...)) seems wrong ... executing user-provided queries is flat-out dangerous; look up "SQL injection" if you are not familiar with it. Why not just DBI::dbGetQuery(con, querystring)? (See the sketch below for both suggestions.)
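A minimal sketch of both suggestions, assuming the connection details from the question (the table name in the example query is a hypothetical placeholder):
library(DBI)
# connect once, outside the helper, and reuse the connection
con <- DBI::dbConnect(RPostgres::Postgres(), dbname = "postgres",
                      host = "/var/run/postgresql", port = 5432,
                      user = "postgres", password = "pw")
# run a query against an existing connection instead of eval(parse(...))
get_query <- function(con, querystring){
  DBI::dbGetQuery(con, querystring)
}
res <- get_query(con, "SELECT * FROM mytable LIMIT 10")  # 'mytable' is a placeholder table name
DBI::dbDisconnect(con)  # disconnect when the app shuts down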

Related

Error with dbConnect to Snowflake via Rscript (but not R Studio)

I have successfully connected to and queried Snowflake from RStudio using an ODBC driver. When I try the code in Rgui.exe, it also works. However, in Rterm (or calling Rscript from a batch script), it does not. Rterm returns the following error:
OOB curl_easy_perform() failed: SSL peer certificate or SSH remote key was not OK
My R code is:
library(ROracle)
library(methods)
username <- keyring::key_list("blake-snowflake")[1,2]
password <- keyring::key_get("blake-snowflake", keyring::key_list("my-snowflake")[1,2])
### connect to EDW
con_snowflake <- dbConnect(
  odbc::odbc(),
  "EDW_sample",
  uid = username,
  pwd = password)
I switched from using ODBC to JDBC:
library(RJDBC)
jdbcDriver <- JDBC(driverClass = "com.snowflake.client.jdbc.SnowflakeDriver",
                   classPath = "..\\java\\snowflake-jdbc-3.7.2.jar")
con_snowflake <- dbConnect(jdbcDriver, "jdbc:snowflake://xxx.snowflakecomputing.com/",
                           keyring::key_list("my-snowflake")[1,2],
                           keyring::key_get("my-snowflake", keyring::key_list("my-snowflake")[1,2]),
                           db = "db_name", schema = "schema_name")
### read in data
query <- readr::read_file("...\\query.sql")
df <- DBI::dbGetQuery(con_snowflake, query)

R - handle error when accessing a database

I'm trying to automate data downloads from a database with RJDBC using a for loop. The database I'm using automatically closes the connection every 10 minutes, so what I want to do is catch the error, remake the connection, and continue the loop. To do this I need to capture the error somehow; the problem is that it doesn't look like a normal R error, so tryCatch and similar commands don't seem to work. I just get text on the console telling me:
Error in .jcheck() : No running JVM detected. Maybe .jinit() would help.
How do I handle this, in terms of:
if (output == ERROR) {remake connection and run dbQuery} else {run dbQuery}
Thanks for any help.
You could use the pool package to abstract away the logic of connection management.
It does exactly what you expect regarding connection management with DBI.
It should work with RJDBC, which is an implementation of DBI, but I didn't test it with this driver.
library(pool)
library(RJDBC)
conn <- dbPool(
  drv = RJDBC::JDBC(...),
  dbname = "mydb",
  host = "hostaddress",
  username = "test",
  password = "test"
)
on.exit(poolClose(conn))
dbGetQuery(conn, "select... ")
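If you would rather stay with plain RJDBC and retry manually, a rough sketch of the tryCatch pattern described in the question could look like this (make_connection() and the JDBC details are placeholders, not from the original post):
library(DBI)
make_connection <- function() {
  # placeholder: build and return your RJDBC connection here
  dbConnect(RJDBC::JDBC("driver.class.Name", "path/to/driver.jar"), "jdbc:...", "user", "pass")
}
con <- make_connection()
run_query <- function(sql) {
  tryCatch(
    dbGetQuery(con, sql),
    error = function(e) {
      # the connection was dropped: reconnect and retry once
      con <<- make_connection()
      dbGetQuery(con, sql)
    }
  )
}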

R connection to Redshift using AWS driver doesn't work but does work with Postgre driver

I am trying to establish a connection to my Redshift database after following the example provided by AWS (https://blogs.aws.amazon.com/bigdata/post/Tx1G8828SPGX3PK/Connecting-R-with-Amazon-Redshift). However, I get errors when trying to establish the connection using their recommended driver, whereas with the PostgreSQL driver I can connect to the Redshift DB.
AWS says their driver is "optimized for performance and memory management", so I would rather use it. Can someone please review my code below and let me know if they see something wrong? I suspect that I am not setting the URL up correctly, but I'm not sure what I should be using instead. Thanks in advance for any help.
#' This code attempts to establish a connection to a Redshift database. It
#' attempts to establish a connection using the suggested Redshift driver but
#' doesn't work.

## Clear up space and set working directory
# Clear variables
rm(list=ls(all=TRUE))
gc()

## Libraries for analysis
library(RJDBC)
library(RPostgreSQL)

# Create DBI driver for working with the Redshift driver directly
# download Amazon Redshift JDBC driver
download.file('http://s3.amazonaws.com/redshift-downloads/drivers/RedshiftJDBC41-1.1.9.1009.jar',
              'RedshiftJDBC41-1.1.9.1009.jar')
# connect to Amazon Redshift using the specific driver
driver_redshift <- JDBC("com.amazon.redshift.jdbc41.Driver",
                        "RedshiftJDBC41-1.1.9.1009.jar", identifier.quote="`")

## Using the PostgreSQL connection that works
# PostgreSQL driver
driver_postgre <- dbDriver("PostgreSQL")
# establish connection
conn_postgre <- dbConnect(driver_postgre, host="nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com",
                          port="5439", dbname="dev",
                          user="xxxx", password="xxxx")
# list the tables available
tables <- dbListTables(conn_postgre)

## Use the URL option to establish a connection like the example on the AWS website
# url <- "<JDBCURL>:<PORT>/<DBNAME>?user=<USER>&password=<PW>"
# url <- "jdbc:redshift://demo.ckffhmu2rolb.eu-west-1.redshift.amazonaws.com:5439/demo?user=XXX&password=XXX"  # uses the example from the AWS instructions
# url using my redshift database
url <- "jdbc:redshift://nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com:5439/dev?user=xxxx&password=xxxx"
# attempt to connect, but it gives an error
conn_redshift <- dbConnect(driver_redshift, url)
# gives the following error:
# Error in .jcall(drv@jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1], :
#   java.sql.SQLException: Error message not found: CONN_GENERAL_ERR. Can't find bundle for base name com.amazon.redshift.core.messages, locale en

## Similar to the PostgreSQL example that works, but doesn't work when using the Redshift-specific driver
# gives an error saying url is missing, but I am not sure which url to use?
conn <- dbConnect(driver_redshift, host="nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com",
                  port="5439", dbname="dev",
                  user="xxxx", password="xxxx")
# gives the following error:
# Error in .jcall("java/sql/DriverManager", "Ljava/sql/Connection;", "getConnection", :
#   argument "url" is missing, with no default
I've done it this way and it works for me:
drv <- JDBC("com.amazon.redshift.jdbc41.Driver","PathTO/RedshiftJDBC41-1.1.2.0002.jar")
conn <- dbConnect(drv,"jdbc:redshift://......redshift.amazonaws.com:5439/dev",User,PWD)
The difference I see in yours is that you don't give the full path to the Redshift jar in driver_redshift.
Hope it works.
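Applied to the question's setup, that suggestion might look something like this (the absolute path to the jar is a made-up placeholder; substitute wherever the file was actually downloaded):
drv <- JDBC("com.amazon.redshift.jdbc41.Driver",
            "/home/user/drivers/RedshiftJDBC41-1.1.9.1009.jar")
conn <- dbConnect(drv, "jdbc:redshift://nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com:5439/dev",
                  "xxxx", "xxxx")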

Connect to MSSQL using DBI

I cannot connect to MSSQL using the DBI package.
I am trying the way shown in the package itself:
m <- dbDriver("RODBC") # error
Error: could not find function "RODBC"
# open the connection using user, password, etc., as
# specified in the file \file{\$HOME/.my.cnf}
con <- dbConnect(m, dsn="data.source", uid="user", pwd="password")
Any help appreciated. Thanks
As an update to this question: RStudio have since created the odbc package (or GitHub version here) that handles ODBC connections to a number of databases through DBI. For SQL Server you use:
con <- DBI::dbConnect(odbc::odbc(),
                      driver = "SQL Server",
                      server = <serverURL>,
                      database = <databasename>,
                      uid = <username>,
                      pwd = <passwd>)
You can also set a dsn or supply a connection string.
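For example, a hedged sketch of both variants (the DSN name, server, and credentials below are placeholders):
# via a DSN configured on the machine
con <- DBI::dbConnect(odbc::odbc(), dsn = "my_mssql_dsn", uid = "user", pwd = "password")
# via a full connection string
con <- DBI::dbConnect(odbc::odbc(),
                      .connection_string = "Driver={SQL Server};Server=myserver;Database=mydb;Uid=user;Pwd=password;")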
It looks like there used to be an RODBC driver for DBI, but not any more:
http://cran.r-project.org/src/contrib/Archive/DBI.RODBC/
A bit of tweaking got this to install in R version 3, but I don't have any ODBC sources to test it on. At least m = dbDriver("RODBC") doesn't error:
> m = dbDriver("RODBC")
> m
<ODBCDriver:(29781)>
>
Suggest you ask on the R-sig-db mailing list to maybe find out what happened to this code and/or the author...
Solved.
I used the RODBC library. It has great functionality for connecting to SQL databases and running SQL queries from R.
Loading the library:
library(RODBC)
# dbDriver here is a connection string with user ID, database name, password, etc.
dbhandle <- odbcDriverConnect(dbDriver)
Running a SQL query:
sqlQuery(channel=dbhandle, query)
That's it.

R DBI / RPostgreSQL-- connection succeeds but dbListTables returns no tables

The following code connects to my PostgreSQL database successfully (or appears to, at any rate), but attempts to issue queries were met with "relation does not exist" errors, so I tried dbListTables, which doesn't return any tables at all. The database name passed to dbConnect is correct, and the tables do exist. I think the code I'm using is exactly the same as what I was using recently, which worked successfully. Any ideas?
> library(RPostgreSQL)
Loading required package: DBI
> drv <- dbDriver("PostgreSQL")
> con <- dbConnect(drv, dbname="mydb", user="user", password=password)
> dbListTables(con)
character(0)
I'm new to both R and DBI, so I'm sure I could be missing something extremely simple...any help would be appreciated.
Solved: I was right; it was something incredibly simple (and very, very stupid) on my part. I was running the script from the wrong server. The server I was running it from has an empty copy of the database I was attempting to connect to, so everything succeeded, and the empty result from dbListTables was correct. Once I switched servers (or simply specified the host on the other server), everything worked.
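As a quick sanity check in cases like this (not from the original post), you can ask PostgreSQL which database and server the connection actually points at:
# confirm which database/host the connection is really using
dbGetQuery(con, "SELECT current_database(), inet_server_addr(), inet_server_port()")
dbListTables(con)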
1. Connect to MySQL
a) Make sure MySQL is installed on your system; if not, install it.
b) Install and load the RMySQL package in R:
library(RMySQL)
drv = dbDriver("MySQL")
# make sure the driver name (and your MySQL version) is correct
con = dbConnect(drv, host="localhost", dbname="test", user="root", pass="root")
# use localhost or the server's IP address,
# plus the required database name, user name, and password
album = dbGetQuery(con, statement="select * from table")
# run the required query
dbDisconnect(con)
2. Another way to connect to a database
a) First install a database such as MySQL, Oracle, or SQL Server.
b) Install the ODBC connector for that database.
library(RODBC)
channel <- odbcConnect("test", uid="ripley", pwd="secret")
# "test" is the name of the ODBC connection (DSN), which the user has to set up manually;
# it can be found in the ODBC Data Source Administrator tool
res <- sqlFetch(channel, "table name")
# a whole table can be retrieved as a data frame
res <- sqlQuery(channel, paste("select query"))
# part of a table (with a condition) can be retrieved as a data frame
sqlSave(channel, dataframe)
# saves a data frame to the database (don't assign the result with "res <-" here)
# the user can also use:
sqlCopy()
sqlDrop()
sqlTables()
close(channel)
# always close the connection
