I don't know if this is a bad configuration setup, but I am testing connections to the database under different network settings.
Whilst I am testing the setup I want the app to be able to run with four different setups:
1. The development one I have on a local machine.
2. In a standalone Docker container in the deployment space, using a Docker network.
3. In the deployment space, using the host machine's network.
4. Setup 2 or 3, but inside a ShinyProxy container.
These all have marginally different connection settings, and I am still testing which of them to use. The first one works, but 2 and 3 do not.
Also, setups 2, 3, and 4 share the same .Renviron file, because I want the app to try the setups in a preferential order and use the first one that works. This will make handover easier for clients, as they can choose how secure to make their deployment.
I am using RPostgres for the connection.
I want to use the same app code for every setup, but name the environment variables differently so that more than one connection option can be left available at deployment.
Here is my plan, but it seems long-winded.
First, test each connection with its credentials and assign TRUE/FALSE to a variable. The error-testing pattern is borrowed from https://github.com/brry/berryFunctions/blob/master/R/is.error.R.
library(DBI)       # dbConnect()
library(RPostgres)

# Admin credentials (env vars from .Renviron)
admin_connect_error <- inherits(
  try(dbConnect(
    drv      = RPostgres::Postgres(),
    dbname   = Sys.getenv("ADMINDBNAME"),
    host     = Sys.getenv("ADMINHOSTNAME"),
    port     = Sys.getenv("ADMINPORTNAME"),
    user     = Sys.getenv("ADMINUSERNAME"),
    password = Sys.getenv("ADMINPASSNAME")
  ), silent = TRUE),
  "try-error")

# Placeholder credentials
fake_connect_error <- inherits(
  try(dbConnect(
    drv      = RPostgres::Postgres(),
    dbname   = "db_name",
    host     = "host_name",
    port     = "port_name",
    user     = "user_name",
    password = "pass_name"
  ), silent = TRUE),
  "try-error")

# Local development machine (note: try(), not tryCatch() - tryCatch() has no
# silent argument and would not return a "try-error" object)
local_connect_error <- inherits(
  try(dbConnect(
    drv      = RPostgres::Postgres(),
    dbname   = Sys.getenv("USERDBNAME"),
    host     = Sys.getenv("LOCHOSTNAME"),
    port     = Sys.getenv("LOCPORTNAME"),
    user     = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME")
  ), silent = TRUE),
  "try-error")

# Docker network
docker_connect_error <- inherits(
  try(dbConnect(
    drv      = RPostgres::Postgres(),
    dbname   = Sys.getenv("USERDBNAME"),
    host     = Sys.getenv("DOCKHOSTNAME"),
    port     = Sys.getenv("DOCKPORTNAME"),
    user     = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME")
  ), silent = TRUE),
  "try-error")
Second, check which connection succeeded and use that to set up the dbPool for the app to use.
library(pool)   # dbPool()

if (!local_connect_error) {
  print("connection on local machine as host network")
  pool <- dbPool(
    drv      = RPostgres::Postgres(),
    dbname   = Sys.getenv("USERDBNAME"),
    user     = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME"),
    host     = Sys.getenv("LOCHOSTNAME"),
    port     = Sys.getenv("LOCPORTNAME")
  )
} else if (!docker_connect_error) {
  print("Connecting on docker network")
  pool <- dbPool(
    drv      = RPostgres::Postgres(),
    dbname   = Sys.getenv("USERDBNAME"),
    host     = Sys.getenv("DOCKHOSTNAME"),
    port     = Sys.getenv("DOCKPORTNAME"),
    user     = Sys.getenv("USERUSERNAME"),
    password = Sys.getenv("USERPASSNAME")
  )
} else if (!fake_connect_error) {
  print("connection on fake local machine as fake host network")
  pool <- dbPool(   # dbPool here too, so `pool` is always a Pool object
    drv      = RPostgres::Postgres(),
    dbname   = "db_name",
    host     = "host_name",
    port     = "port_name",
    user     = "user_name",
    password = "pass_name"
  )
} else if (!admin_connect_error) {
  print("connection on admin machine as host admin")
  pool <- dbPool(
    drv      = RPostgres::Postgres(),
    dbname   = Sys.getenv("ADMINDBNAME"),
    host     = Sys.getenv("ADMINHOSTNAME"),
    port     = Sys.getenv("ADMINPORTNAME"),
    user     = Sys.getenv("ADMINUSERNAME"),
    password = Sys.getenv("ADMINPASSNAME")
  )
} else {
  print("DB connection options not working - please debug")
}
This seems very long-winded. Is there a better way?
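One possible consolidation (a sketch, not from the original post: the candidate list, its names, and the loop are my own scaffolding around the same environment variables, and only the local and docker settings are shown - add the others in preference order):
library(DBI)
library(pool)

candidate_hosts <- list(
  local  = list(host = Sys.getenv("LOCHOSTNAME"),  port = Sys.getenv("LOCPORTNAME")),
  docker = list(host = Sys.getenv("DOCKHOSTNAME"), port = Sys.getenv("DOCKPORTNAME"))
)

pool <- NULL
for (setup in names(candidate_hosts)) {
  s <- candidate_hosts[[setup]]
  ok <- tryCatch({
    con <- dbConnect(RPostgres::Postgres(),
                     dbname   = Sys.getenv("USERDBNAME"),
                     host     = s$host,
                     port     = s$port,
                     user     = Sys.getenv("USERUSERNAME"),
                     password = Sys.getenv("USERPASSNAME"))
    dbDisconnect(con)   # close the probe; the app uses the pool created below
    TRUE
  }, error = function(e) FALSE)

  if (ok) {
    print(paste("Connecting using the", setup, "settings"))
    pool <- dbPool(RPostgres::Postgres(),
                   dbname   = Sys.getenv("USERDBNAME"),
                   host     = s$host,
                   port     = s$port,
                   user     = Sys.getenv("USERUSERNAME"),
                   password = Sys.getenv("USERPASSNAME"))
    break
  }
}
if (is.null(pool)) stop("DB connection options not working - please debug")
The probe-then-pool shape keeps a single code path for the app, while the preference order lives in one list.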
I am trying to use a pool to connect to my database in R, but I get the error:
Schema must be specified when session schema is not set
How does one specify a schema? It seems like I need to specify it inside the pool. If that's the case, what's the parameter name for a schema?
library(pool)
library(DBI)
library(dplyr)

pool <- dbPool(
  drv = RJDBC::JDBC(
    "xxx",
    "dir_to_jar", "`"
  ),
  dbname = "db",
  schema = "schema",  # this didn't work
  url = url,
  user = user,
  password = password,
  SSL = 'true'
)
pool %>% tbl("schema.table")
I tried several other methods; using DBI::dbConnect combined with Id worked:
pool <- DBI::dbConnect(
  drv = RJDBC::JDBC(
    "xxx",
    "dir_to_jar", "`"
  ),
  url = url,
  user = user,
  password = password,
  SSL = 'true'
)

# Didn't work
pool %>% tbl(dbplyr::in_schema("catalog.schema", "table"))

# Works!
s <- Id(catalog = "catalog", schema = "schema", table = "table")
df <- dbReadTable(pool, s)
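If a pool is still preferred over a plain DBI connection, the same Id approach should also go through pool's DBI wrappers (a sketch reusing the placeholder driver and credentials from above; untested):
library(pool)
library(DBI)

pool <- dbPool(
  drv = RJDBC::JDBC("xxx", "dir_to_jar", "`"),
  url = url,
  user = user,
  password = password,
  SSL = 'true'
)

# Pool objects support the DBI generics, so the Id-based read works the same way
s  <- Id(catalog = "catalog", schema = "schema", table = "table")
df <- dbReadTable(pool, s)

poolClose(pool)   # return the connections when done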
I'm able to send emails using the following code.
OutlookForSend = RDCOMClient::COMCreate("Outlook.Application")
emailToSend = OutlookForSend$CreateItem(0)
emailToSend[["subject"]] = "Subject"
emailToSend[["HTMLBody"]] = bodyToSend
emailToSend[["To"]] = "Email"
emailToSend$Send()
However, I don't have Outlook installed on the server machine, but I still need to send emails.
I'm able to achieve this in Python using the mailer package; what is the best way to achieve the same in R?
Thanks
Any SMTP client implemented in R will do the job.
Check out this one: Rmailer
From their example:
library(Rmailer)

message <- c(
  "Hey,",
  "",
  "I have a nice pic for you!",
  "",
  "Best",
  "C."
)

settings <- list(
  server   = "smtp.example.org",
  username = "user",
  password = "password"
)

## send message:
sendmail(
  from = "sender@example.org",
  to   = "receiver@example.org",
  subject = "Good news!",
  msg = message,
  smtpsettings = settings,
  attachment = "nice_pic.jpg"
)
Solved the problem using the mailR package, and it works well.
library(mailR)

send.mail(
  from = "email@company.com",
  to = "email@company.com",
  subject = subjectToSend,
  body = bodyToSend,
  html = TRUE,
  smtp = list(host.name = "smtp.company.com", port = 25),
  send = TRUE
)
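If the SMTP server requires authentication, send.mail() can also take the credentials in the smtp list (the host, port, and credentials below are placeholders, not from the original answer):
library(mailR)

send.mail(
  from = "email@company.com",
  to = "email@company.com",
  subject = subjectToSend,
  body = bodyToSend,
  html = TRUE,
  smtp = list(host.name = "smtp.company.com", port = 465,
              user.name = "smtp_user", passwd = "smtp_password", ssl = TRUE),
  authenticate = TRUE,   # needed when user.name/passwd are supplied
  send = TRUE
)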
# https://cran.r-project.org/web/packages/RPostgres/README.html
library(DBI)

# Connect to a specific postgres database i.e. Heroku
con <- dbConnect(RPostgres::Postgres(),
                 dbname = 'DATABASE_NAME',
                 host = 'HOST',   # i.e. 'ec2-54-83-201-96.compute-1.amazonaws.com'
                 port = 5432,     # or any other port specified by your DBA
                 user = 'USERNAME',
                 password = 'PASSWORD')
Trying to connect to the DB, but I get an SSL verification error because the remote DB is Aurora. Is there a parameter to pass the SSL CA root certificate?
You could do:
rt_cert <- "PATH_OF_ROOT_CERTIFICATE/root-ca.crt"
cl_cert <- "PATH_OF_ROOT_CERTIFICATE/xxx.crt"
cl_key  <- "PATH_OF_ROOT_CERTIFICATE/xxx.key"

con <- dbConnect(drv = RPostgres::Postgres(),
                 dbname = 'DATABASE_NAME',
                 host = 'HOST',   # i.e. 'ec2-54-83-201-96.compute-1.amazonaws.com'
                 port = 5432,     # or any other port specified by your DBA
                 user = 'USERNAME',
                 password = 'PASSWORD',
                 sslmode = 'require',
                 sslrootcert = rt_cert,
                 sslcert = cl_cert,
                 sslkey = cl_key)
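If only the CA bundle is available (no client certificate or key), passing just sslrootcert should be enough; a stricter sslmode additionally verifies the server certificate and host name. A sketch with the same placeholder paths as above:
con <- dbConnect(drv = RPostgres::Postgres(),
                 dbname = 'DATABASE_NAME',
                 host = 'HOST',
                 port = 5432,
                 user = 'USERNAME',
                 password = 'PASSWORD',
                 sslmode = 'verify-full',   # 'verify-ca' checks the CA chain only
                 sslrootcert = "PATH_OF_ROOT_CERTIFICATE/root-ca.crt")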
I am trying to write data into a non-public schema using the st_write function from the sf package.
I cannot change the following way I connect to the DB, as that would break all other functions:
create_db_connection <- function(host, dbuser, dbpassword){
  drv        <- RPostgreSQL::PostgreSQL()
  DBuser     <- dbuser
  DBhost     <- host
  DBport     <- "5432"
  DBpassword <- dbpassword
  db <- RPostgreSQL::dbConnect(drv, dbname = "DIFM", user = DBuser, host = DBhost,
                               port = DBport, password = DBpassword)
  return(db)
}
Using the above connection, I have tried the following:
1. sf::st_write(obj = obj_geom, dsn = db, layer = c(schema_name, "temp_geometrytable"), row.names = FALSE, append = TRUE)
2. sf::st_write(obj = obj_geom, dsn = db, DBI::Id(schema = schema_name, table = "temp_geometrytable"), row.names = FALSE, append = TRUE)
But all of these keep writing into the public schema. I want to dynamically provide a schema name and write the object into a non-public schema in the database.
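One workaround that keeps create_db_connection() unchanged (a sketch; schema_name and obj_geom are the objects from the question, and it assumes the database role is allowed to change search_path) is to point the session's search_path at the target schema before writing, then use a bare layer name:
db <- create_db_connection(host, dbuser, dbpassword)

# Make the target schema the default for this session, then write normally
DBI::dbExecute(db, paste0('SET search_path TO "', schema_name, '", public'))
sf::st_write(obj = obj_geom, dsn = db, layer = "temp_geometrytable",
             row.names = FALSE, append = TRUE)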
I have a Shiny app that uses RPostgreSQL to connect to a database and query data ad hoc.
# Initialize environment to hold SQL parameters:
library(RPostgreSQL)
if (!exists('.sql')) .sql <- new.env()
.sql$cxn <- dbConnect(PostgreSQL(), host = "localhost", dbname = "testdb", user = "myuser", password = "Passw0rd!", port = 5432)
This code runs when the app initializes, but after some time, the connection is terminated by the server:
> dbGetQuery(.sql$cxn, "SELECT 1")
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
If I simply call:
.sql$cxn <- dbConnect(PostgreSQL(), host = "localhost", dbname = "testdb", user = "myuser", password = "Passw0rd!", port = 5432)
again, it creates a second connection (and I do not want that):
> dbListConnections(PostgreSQL())
[[1]]
<PostgreSQLConnection:(10601,4)>
[[2]]
<PostgreSQLConnection:(10601,5)>
(Eventually, the maximum number of connections is reached and you cannot create another: https://groups.google.com/forum/#!topic/shiny-discuss/0VjQc2a6z3M)
I want to create a function that connects to my PostgreSQL database (if it hasn't connected already) and keeps the connection alive (by SELECTing 1) if it has:
getConnection <- function(.host, .user, .pw) {
  tryCatch({
    if (!exists('cxn', where = .sql)) {
      .sql$cxn <- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                            user = .user, password = .pw, port = 5432)
    } else {
      dbGetQuery(.sql$cxn, "SELECT 1")
    }
  }, warning = function(w) {
    NULL # placeholder for warnings
  }, error = function(e) {
    print(e)
    cat("Looks like PostgreSQL connection died. Let's try to reconnect...\n")
    invisible(lapply(dbListConnections(PostgreSQL()), dbDisconnect)) # Close all db connections
    .sql$cxn <<- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                           user = .user, password = .pw, port = 5432)
  }, finally = {
    return(.sql$cxn)
  })
}

getConnection(.host = "localhost", .user = "myuser", .pw = "Passw0rd!")
EDIT 2016/03/12: Not sure why, but the function I wrote above doesn't seem to be working properly...
When I call it, I get:
> getConnection(.host = awsrds$host, .user = awsrds$username, .pw = awsrds$password)
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
<PostgreSQLConnection:(12495,0)>
In particular, this part dbGetQuery(.sql$cxn, "SELECT 1") returns:
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
NULL
Warning message:
In postgresqlQuickSQL(conn, statement, ...) :
Could not create executeSELECT 1
And the class of this output is NULL (as opposed to an error?).
Any ideas what I'm doing wrong? Thanks!
I believe you can solve the error by including a check in your connection function. This is what I did for MySQL; it should work the same in your case. The key is checking whether try(dbGetQuery(connection, "SELECT 1")) returns a "try-error".
library(RMySQL)

connectionFunction <- function() {
  if (!exists("connectionName", where = .GlobalEnv)) {
    connectionName <<- dbConnect(MySQL(), default.file = mysqlconf, dbname = "dbName")
  } else if (class(try(dbGetQuery(connectionName, "SELECT 1"), silent = TRUE)) == "try-error") {
    # The old handle is dead: drop it and open a fresh connection
    dbDisconnect(connectionName)
    connectionName <<- dbConnect(MySQL(), default.file = mysqlconf, dbname = "dbName")
  }
  return(connectionName)
}
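For the PostgreSQL setup in the question, a possible adaptation of the same idea (a sketch; it reuses the .sql environment and placeholder credentials from above and is untested):
library(RPostgreSQL)

getConnection <- function(.host, .user, .pw) {
  alive <- exists("cxn", where = .sql) &&
    !inherits(try(dbGetQuery(.sql$cxn, "SELECT 1"), silent = TRUE), "try-error")
  if (!alive) {
    # Drop any stale handles first so connections do not accumulate
    invisible(lapply(dbListConnections(PostgreSQL()),
                     function(con) try(dbDisconnect(con), silent = TRUE)))
    .sql$cxn <- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                          user = .user, password = .pw, port = 5432)
  }
  .sql$cxn
}

getConnection(.host = "localhost", .user = "myuser", .pw = "Passw0rd!")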