Keep-alive function for a PostgreSQL connection in a Shiny app?

I have a Shiny app that uses RPostgreSQL to connect to a database and query data ad hoc.
# Initialize environment to hold SQL parameters:
library(RPostgreSQL)
if (!exists('.sql')) .sql <- new.env()
.sql$cxn <- dbConnect(PostgreSQL(), host = "localhost", dbname = "testdb", user = "myuser", password = "Passw0rd!", port = 5432)
This code runs when the app initializes, but after some time, the connection is terminated by the server:
> dbGetQuery(.sql$cxn, "SELECT 1")
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
If I simply call:
.sql$cxn <- dbConnect(PostgreSQL(), host = "localhost", dbname = "testdb", user = "myuser", password = "Passw0rd!", port = 5432)
again, it creates a second connection, which I do not want:
> dbListConnections(PostgreSQL())
[[1]]
<PostgreSQLConnection:(10601,4)>
[[2]]
<PostgreSQLConnection:(10601,5)>
(Eventually, the maximum number of connections is reached and you cannot create another: https://groups.google.com/forum/#!topic/shiny-discuss/0VjQc2a6z3M)
I want to create a function that connects to my PostgreSQL database if no connection exists yet, and keeps the existing connection alive (by running SELECT 1) if one does:
getConnection <- function(.host, .user, .pw) {
  tryCatch({
    if (!exists('cxn', where = .sql)) {
      .sql$cxn <- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                            user = .user, password = .pw, port = 5432)
    } else {
      dbGetQuery(.sql$cxn, "SELECT 1")
    }
  }, warning = function(w) {
    NULL # placeholder for warnings
  }, error = function(e) {
    print(e)
    cat("Looks like PostgreSQL connection died. Let's try to reconnect...\n")
    invisible(lapply(dbListConnections(PostgreSQL()), dbDisconnect)) # close all db connections
    .sql$cxn <<- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                           user = .user, password = .pw, port = 5432)
  }, finally = {
    return(.sql$cxn)
  })
}
getConnection(.host = "localhost", .user = "myuser", .pw = "Passw0rd!")
EDIT 2016/03/12: Not sure why, but the function I wrote above doesn't seem to be working properly...
When I call it, I get:
> getConnection(.host = awsrds$host, .user = awsrds$username, .pw = awsrds$password)
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
<PostgreSQLConnection:(12495,0)>
In particular, this part dbGetQuery(.sql$cxn, "SELECT 1") returns:
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
NULL
Warning message:
In postgresqlQuickSQL(conn, statement, ...) :
Could not create executeSELECT 1
And the class of this output is NULL (as opposed to an error?).
Any ideas what I'm doing wrong? Thanks!

I believe you can solve this by including a check in your connection function. This is what I did for MySQL; it should work the same in your case. The key is wrapping the SELECT 1 test query in try() and checking whether the result is a 'try-error': on a dead connection the test can fail without ever reaching your tryCatch error handler (as your NULL result shows), whereas try() lets you inspect the outcome directly.
connectionFunction <- function() {
  if (!exists("connectionName", where = .GlobalEnv)) {
    connectionName <<- dbConnect(MySQL(), default.file = mysqlconf, dbname = "dbName")
  } else if (class(try(dbGetQuery(connectionName, "SELECT 1"), silent = TRUE)) == "try-error") {
    dbDisconnect(connectionName)
    connectionName <<- dbConnect(MySQL(), default.file = mysqlconf, dbname = "dbName")
  }
  return(connectionName)
}
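Applied to the RPostgreSQL setup from the question, a minimal sketch might look like the following. One caveat: as the question's output shows, a failed dbGetQuery() in RPostgreSQL can return NULL with a warning instead of throwing, so this sketch (assuming the .sql environment and testdb credentials from the question) treats both a try-error and a NULL result as a dead connection:
library(RPostgreSQL)

if (!exists(".sql")) .sql <- new.env()

getConnection <- function(.host, .user, .pw) {
  connect <- function() {
    dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
              user = .user, password = .pw, port = 5432)
  }
  if (!exists("cxn", where = .sql)) {
    .sql$cxn <- connect()
  } else {
    # Treat both an error and a NULL-with-warning result as a dead connection
    alive <- try(dbGetQuery(.sql$cxn, "SELECT 1"), silent = TRUE)
    if (inherits(alive, "try-error") || is.null(alive)) {
      try(dbDisconnect(.sql$cxn), silent = TRUE)  # best-effort cleanup
      .sql$cxn <- connect()
    }
  }
  .sql$cxn
}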

Related

Check different connection credentials for R in different environments during testing?

I don't know if this is a bad configuration setup, but I am testing connections to different database network settings.
While testing, I want the app to be able to run with four different setups:
1. The development one I have on a local machine.
2. A standalone Docker container in the deployment space using a Docker network.
3. In the deployment space using the host machine network.
4. Setup 2/3, but in a ShinyProxy container.
These all have marginally different connection settings, which is what I am testing out; so far the first works, but 2 and 3 do not.
Also, setups 2, 3, and 4 share the same .Renviron file, because I want the app to use the first setup that works, in order of preference. This will make handover easier for clients, as they can choose how secure to make the setup.
I am using RPostgres for the connection.
I want to use the same app code for every setup, but name the environment variables differently so that two connection options can be left available for deployment.
Here is my plan, but it seems long-winded.
First, test all the connections using the different credentials, assigning TRUE/FALSE to a variable for each. The error-testing code is adapted from https://github.com/brry/berryFunctions/blob/master/R/is.error.R.
library(DBI)  # for dbConnect()

admin_connect_error <- inherits(
  try(dbConnect(
    drv = RPostgres::Postgres(),
    dbname = Sys.getenv("ADMINDBNAME"),
    host = Sys.getenv("ADMINHOSTNAME"),
    port = Sys.getenv("ADMINPORTNAME"),
    user = Sys.getenv("ADMINUSERNAME"),
    password = Sys.getenv("ADMINPASSNAME")
  ), silent = TRUE),
  "try-error"
)

fake_connect_error <- inherits(
  try(dbConnect(
    drv = RPostgres::Postgres(),
    dbname = "db_name",
    host = "host_name",
    port = "port_name",
    user = "user_name",
    password = "pass_name"
  ), silent = TRUE),
  "try-error"
)

local_connect_error <- inherits(
  try(dbConnect(
    drv = RPostgres::Postgres(),
    dbname = Sys.getenv("USERDBNAME"),
    host = Sys.getenv("LOCHOSTNAME"),
    port = Sys.getenv("LOCPORTNAME"),
    user = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME")
  ), silent = TRUE),
  "try-error"
)

docker_connect_error <- inherits(
  try(dbConnect(
    drv = RPostgres::Postgres(),
    dbname = Sys.getenv("USERDBNAME"),
    host = Sys.getenv("DOCKHOSTNAME"),
    port = Sys.getenv("DOCKPORTNAME"),
    user = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME")
  ), silent = TRUE),
  "try-error"
)
Second, check which connection succeeded and use it to set up the dbPool for the app.
library(pool)  # for dbPool()

if (!local_connect_error) {
  print("connection on local machine as host network")
  pool <- dbPool(
    drv = RPostgres::Postgres(),
    dbname = Sys.getenv("USERDBNAME"),
    user = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME"),
    host = Sys.getenv("LOCHOSTNAME"),
    port = Sys.getenv("LOCPORTNAME")
  )
} else if (!docker_connect_error) {
  print("Connecting on docker network")
  pool <- dbPool(
    drv = RPostgres::Postgres(),
    dbname = Sys.getenv("USERDBNAME"),
    host = Sys.getenv("DOCKHOSTNAME"),
    port = Sys.getenv("DOCKPORTNAME"),
    user = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME")
  )
} else if (!fake_connect_error) {
  print("connection on fake local machine as fake host network")
  pool <- dbPool(  # dbPool here too, so `pool` is always a pool object
    drv = RPostgres::Postgres(),
    dbname = "db_name",
    host = "host_name",
    port = "port_name",
    user = "user_name",
    password = "pass_name"
  )
} else if (!admin_connect_error) {
  print("connection on admin machine as host admin")
  pool <- dbPool(
    drv = RPostgres::Postgres(),
    dbname = Sys.getenv("ADMINDBNAME"),
    host = Sys.getenv("ADMINHOSTNAME"),
    port = Sys.getenv("ADMINPORTNAME"),
    user = Sys.getenv("ADMINUSERNAME"),
    password = Sys.getenv("ADMINPASSNAME")
  )
} else {
  print("DB connection options not working - please debug")
}
This seems very longwinded. Is there a better way?
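One way to shorten this is to define the candidate credential sets once, in order of preference, and loop until one of them yields a working pool. A minimal sketch (untested against the setups above; the candidates list and its field names are my own invention, and only the local and Docker setups are shown):
library(DBI)
library(pool)

# Candidate setups, in preferential order (hypothetical structure)
candidates <- list(
  local  = list(host = Sys.getenv("LOCHOSTNAME"),  port = Sys.getenv("LOCPORTNAME")),
  docker = list(host = Sys.getenv("DOCKHOSTNAME"), port = Sys.getenv("DOCKPORTNAME"))
)

pool <- NULL
for (name in names(candidates)) {
  cred <- candidates[[name]]
  # dbPool() opens its initial connection eagerly, so a bad setup fails right here
  p <- try(dbPool(
    drv = RPostgres::Postgres(),
    dbname = Sys.getenv("USERDBNAME"),
    user = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME"),
    host = cred$host,
    port = cred$port
  ), silent = TRUE)
  if (!inherits(p, "try-error")) {
    message("Connected using setup: ", name)
    pool <- p
    break
  }
}
if (is.null(pool)) stop("DB connection options not working - please debug")
This also avoids leaking the probe connections opened in the first step, since the pool that succeeds is the one the app keeps.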

Connect to Database Using dbPool RJDBC in R

I am trying to use a pool to connect to my database in R, but I get the error:
Schema must be specified when session schema is not set
How does one specify a schema? It seems like I need to specify it inside the pool. If that's the case, what's the parameter name for a schema?
pool <- dbPool(
  drv = RJDBC::JDBC(
    "xxx",
    "dir_to_jar", "`"
  ),
  dbname = "db",
  schema = "schema",  # this didn't work
  url = url,
  user = user,
  password = password,
  SSL = 'true'
)
pool %>% tbl("schema.table")
I tried several other methods using DBI::dbConnect combined with Id and it worked:
pool <- DBI::dbConnect(
  drv = RJDBC::JDBC(
    "xxx",
    "dir_to_jar", "`"
  ),
  url = url,
  user = user,
  password = password,
  SSL = 'true'
)

# Didn't work
pool %>% tbl(dbplyr::in_schema("catalog.schema", "table"))

# Works!
s <- Id(catalog = "catalog", schema = "schema", table = "table")
df <- dbReadTable(pool, s)
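If you prefer to keep using dplyr verbs rather than dbReadTable(), newer dbplyr versions (2.3.0 and later) also provide in_catalog(), which takes the catalog, schema, and table as separate arguments instead of a single dotted string. A sketch, assuming the same pool object and names as above:
library(dplyr)
library(dbplyr)

# Each identifier is quoted separately, unlike in_schema("catalog.schema", ...)
tbl(pool, in_catalog("catalog", "schema", "table"))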

st_write cannot add to non-public schema

I am trying to write data into a non-public schema using the st_write function from the sf package.
I cannot change the following way of connecting to the DB, as it would break all other functions:
create_db_connection <- function(host, dbuser, dbpassword) {
  drv <- RPostgreSQL::PostgreSQL()
  DBuser <- dbuser
  DBhost <- host
  DBport <- "5432"
  DBpassword <- dbpassword
  db <- RPostgreSQL::dbConnect(drv, dbname = "DIFM", user = DBuser, host = DBhost,
                               port = DBport, password = DBpassword)
  return(db)
}
Using the above connection, I have tried the following:
1. sf::st_write(obj = obj_geom, dsn = db, layer = c(schema_name, "temp_geometrytable"), row.names = FALSE, append = TRUE)
2. sf::st_write(obj = obj_geom, dsn = db, DBI::Id(schema=schema_name, table = "temp_geometrytable"), row.names = FALSE, append = TRUE)
But both of these keep adding to the public schema, whereas I want to provide the schema name dynamically and write the object into a non-public schema.
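One thing that may be worth ruling out (an assumption, not something verified against this exact setup): the legacy RPostgreSQL driver is generally less reliable with sf's schema handling than RPostgres. A sketch that opens a separate, temporary RPostgres connection just for the spatial write, leaving create_db_connection() untouched (the DIFM database name and the temp_geometrytable layer follow the question):
library(sf)
library(DBI)
library(RPostgres)

write_to_schema <- function(obj_geom, schema_name, host, dbuser, dbpassword) {
  # Temporary RPostgres connection used only for this write
  con <- dbConnect(RPostgres::Postgres(), dbname = "DIFM", user = dbuser,
                   host = host, port = 5432, password = dbpassword)
  on.exit(dbDisconnect(con), add = TRUE)
  sf::st_write(obj = obj_geom, dsn = con,
               layer = DBI::Id(schema = schema_name, table = "temp_geometrytable"),
               append = TRUE)
}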

Connect to Postgres using PL/R in RStudio

I am unable to connect to Postgres in RStudio. How can I connect to Postgres from RStudio using PL/R?
Below is the code snippet I used for connecting to Postgres using R:
library(RPostgreSQL)

start_time <- Sys.time()

#************ db connection to postgres **********************#
dsn_database <- "postgres"
dsn_hostname <- "localhost"
dsn_port     <- "5432"
dsn_uid      <- "postgres"
dsn_pwd      <- "tiger"

tryCatch({
  drv <- dbDriver("PostgreSQL")
  print("Connecting to database")
  conn <- dbConnect(drv,
                    dbname = dsn_database,
                    host = dsn_hostname,
                    port = dsn_port,
                    user = dsn_uid,
                    password = dsn_pwd)
  print("Connected!")
},
error = function(cond) {
  print(cond)
  print("Unable to connect to database.")
})

#************ reading row count from postgres table ***********#
print("tgt_count")
tgt_count <- dbGetQuery(conn, "SELECT * FROM COMPANY")
print(paste0("tgt_count: ", tgt_count))

Issue with src_snowflakedb(): 'src_sql' is not an exported object

I am trying to use dplyr with a snowflake db, using the dplyr.snowflakedb package (on GitHub). I am able to install and load the libraries, then set the classpath pointing to the latest JDBC driver (snowflake-jdbc-3.0.9.jar).
# need to load RJDBC, or error 'could not find function ".jinit"' is thrown
library(RJDBC)
library(dplyr)
library(dplyr.snowflakedb)
options(dplyr.jdbc.classpath = "drivers/snowflake-jdbc-3.0.9.jar")
When trying to set up the connection object with src_snowflakedb(), I get the following error message (I removed the account details, but they are correct in the actual code):
> nike_db <- src_snowflakedb(user = "user",
                             password = "user",
                             account = "acme",
                             opts = list(warehouse = "my_wh",
                                         db = "my_db",
                                         schema = "my_schema"))
URL: jdbc:snowflake://acme.snowflakecomputing.com:443/?account=acme&warehouse=my_wh&my_db=db&schema=my_schema
Error: 'src_sql' is not an exported object from 'namespace:dplyr'
Error: 'src_sql' is not an exported object from 'namespace:dplyr'
Indeed, the current version of dplyr neither exports nor includes any src_sql() function:
> dplyr:::src_sql
Error in get(name, envir = asNamespace(pkg), inherits = FALSE) :
object 'src_sql' not found
Is there any way to fix this?
Please try the sample code below:
Sys.getenv("JAVA_HOME")
Sys.setenv(JAVA_HOME="C:\\Program Files\\Java\\jdk-1.8\\jre")
Sys.getenv("JAVA_HOME")
install.packages(c("rJava"))
install.packages(c("RJDBC", "DBI", "dplyr"))
install.packages("devtools")
devtools::install_github("snowflakedb/dplyr-snowflakedb")
library(RJDBC)
library(dplyr)
library(dplyr.snowflakedb)
options(dplyr.jdbc.classpath = "C:\\Driver\\snowflake-jdbc-3.11.1.jar")
my_db <- src_snowflakedb(user = "USERNAME",
                         password = "PASSWORD",
                         account = "test",
                         host = "test.us-east-1.snowflakecomputing.com",
                         opts = list(warehouse = "WAREHOUSE_NAME",
                                     db = "DATABASE_NAME",
                                     schema = "SCHEMA_NAME"))
tbl(my_db, "TABLE_NAME")
Note:
a) If your Snowflake account URL is like "https://test.snowflakecomputing.com", use the format below:
my_db <- src_snowflakedb(user = "USERNAME",
                         password = "PASSWORD",
                         account = "test",
                         opts = list(warehouse = "WAREHOUSE_NAME",
                                     db = "DATABASE_NAME",
                                     schema = "SCHEMA_NAME"))
b) If your Snowflake account URL is like "https://test.us-east-1.snowflakecomputing.com", use the format below:
my_db <- src_snowflakedb(user = "USERNAME",
                         password = "PASSWORD",
                         account = "test",
                         host = "test.us-east-1.snowflakecomputing.com",
                         opts = list(warehouse = "WAREHOUSE_NAME",
                                     db = "DATABASE_NAME",
                                     schema = "SCHEMA_NAME"))
I had the same issue and ended up falling back to connecting via the plain Snowflake JDBC driver; see the link here:
https://support.snowflake.net/s/article/ka131000000O5Jr/connecting-r-to-snowflake-using-the-jdbc-driver-mac-os-x
All you really need, though, is this:
result <- dbGetQuery(jdbcConnection, "select current_timestamp() as now")
print(result)
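For completeness, a minimal sketch of how that jdbcConnection can be created with RJDBC (the driver class name follows the Snowflake JDBC documentation; the account URL, jar path, user, and password are placeholders):
library(RJDBC)

# Load the Snowflake JDBC driver (adjust the jar path to your download)
jdbcDriver <- JDBC(driverClass = "net.snowflake.client.jdbc.SnowflakeDriver",
                   classPath = "drivers/snowflake-jdbc-3.0.9.jar")

# Connect; warehouse, db, and schema can be appended as URL query parameters
jdbcConnection <- dbConnect(jdbcDriver,
                            "jdbc:snowflake://acme.snowflakecomputing.com",
                            "user", "password")

result <- dbGetQuery(jdbcConnection, "select current_timestamp() as now")
print(result)
dbDisconnect(jdbcConnection)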
