R connection to PostgreSQL requiring SSL

I'm trying to connect R to a postgresql database that requires SSL.
Here's the connection string that works for me when using PSQL:
postgresql://USERNAME:PASSWORD@HOSTNAME:PORT/DBNAME?ssl=true, replacing all the uppercase strings with appropriate values.
In R, I don't know how to handle the SSL parameter. When I try
conn = dbConnect(RPostgreSQL::PostgreSQL(), host="HOST", dbname="DBNAME", user="USERNAME", password="PASSWORD", port=PORT)
I get
Error in postgresqlNewConnection(drv, ...) :
RS-DBI driver: (could not connect USERNAME@HOST on dbname "DBNAME"
)
When I try
conn = dbConnect(RPostgreSQL::PostgreSQL(), host="HOST", dbname="DBNAME", user="USERNAME", password="PASSWORD", port=PORT, ssl=TRUE)
I get
Error in postgresqlNewConnection(drv, ...) : unused argument (ssl = TRUE)
Connect to Postgres via SSL using R suggests adding extra info to the dbname parameter. I've tried dbname="DBNAME ssl=TRUE", which results in
RS-DBI driver: (could not connect (null)@HOST on dbname "(null)"
I get the same result with sslmode=allow and sslmode=require (as suggested by the above post).
The documentation for the PostgreSQL driver says, under "User Authentication", "The passed string can be empty to use all default parameters, or it can contain one or more parameter settings separated by comma. Each parameter setting is in the form parameter = "value". Spaces around the equal sign are optional." But I haven't been able to get it to accept any parameters other than the three shown in the function prototype.
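For reference, this is the shape of the call I have been trying, with the extra parameter embedded in the dbname string (sslmode is a standard libpq parameter; the leading dbname= prefix is my guess at the conninfo form the documentation describes):
# untested sketch; uppercase values are placeholders
conn = dbConnect(RPostgreSQL::PostgreSQL(),
                 host = "HOST", port = PORT,
                 user = "USERNAME", password = "PASSWORD",
                 dbname = "dbname=DBNAME sslmode=require")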
I'm out of ideas; help appreciated.

You can try this:
Create a configuration file like "configuration.yml" and add your setup:
db:
  host: "your_localhost"
  dbname: "your_database_name?ssl=true"
  user: "your_user_name"
  port: 5432
  password: "your_password"
Install these packages:
install.packages("yaml", dependencies = TRUE)
install.packages("RPostgreSQL", dependencies = TRUE)
install.packages("DBI", dependencies = TRUE)
Run:
driver <- DBI::dbDriver("PostgreSQL")
con <- do.call(RPostgreSQL:::dbConnect,
               c(drv = driver, yaml::yaml.load_file("configuration.yml")$db))
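For clarity, the do.call above just turns the YAML list into named arguments, so with the example configuration.yml it is roughly equivalent to this direct call (placeholder values taken from the file):
driver <- DBI::dbDriver("PostgreSQL")
con <- RPostgreSQL:::dbConnect(drv = driver,
                               host = "your_localhost",
                               dbname = "your_database_name?ssl=true",
                               user = "your_user_name",
                               port = 5432,
                               password = "your_password")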
Note: don't forget to append ?ssl=true to the dbname field in the configuration.yml file.
Hope this helps!

Related

RPostgreSQL - SCRAM error when trying to connect to local database

I am trying to connect to my localhost postgres DB and I get the following error.
library("RPostgreSQL")
drv <- dbDriver("PostgreSQL")
connec <- dbConnect(drv, dbname = "dbnamehere", port = 5432, user = "some_username", password = "somepassword")
Error in postgresqlNewConnection(drv, ...) :
RPosgreSQL error: could not connect admin_sci4i@localhost:5432 on dbname "website": SCRAM authentication requires libpq version 10 or above
It seems related to authentication security, but I am using a local DB. Is there anything I can do in pgAdmin 4 to avoid this error (even if it is less secure)?
Surprisingly, I managed to connect using other packages:
library("RODBC")
library("odbc")
library("RPostgres")
con <- dbCanConnect(RPostgres::Postgres(), dbname = "dbnamehere", port = 5432, user = "some_username", password = "somepassword")
con  # check that a connection can be established
con1 <- dbConnect(RPostgres::Postgres(), dbname = "dbnamehere", port = 5432, user = "some_username", password = "somepassword")
con1  # check the connection
dbListTables(con1)  # list the tables
A nice explanation and walkthrough here.

Connection reset by peer when trying to connect to Postgres via SSL using R

I'm trying to connect to a database via SSL using the code suggested here: https://github.com/ropensci/ssh/issues/13
I have listed dummy code below that shows how I establish the connection and try to query some data. The solution works great for 'smaller' queries.
However, when I try to get 'larger' data, the query fails and R gives back the following error:
System failure for: recv() from user (Connection reset by peer), followed by a fetching error: Failed to fetch row: SSL error: decryption failed or bad record mac (see the output in the code snippet below).
According to the first error message, I suppose the error occurs server-side ('reset by peer' --> What does "connection reset by peer" mean?).
Is that true, or is there a way to fix this error on the client side (in R)?
ssh::ssh_read_key(file = ssh::ssh_home("id_rsa"), password = "rsa_password")
cmd <- "session <- ssh::ssh_connect('user@host:port'); ssh::ssh_tunnel(session, port = 5432, target = '127.0.0.1:5432')"
pid <- sys::r_background(std_out = T, args = c("-e", cmd))
dbcon <- DBI::dbConnect(drv = RPostgres::Postgres(),
                        dbname = "db_name",
                        host = "127.0.0.1",
                        port = 5432,
                        user = "db_user",
                        password = "db_password",
                        base::list(sslmode = "require"),
                        service = NULL)
# example of working query
res <- DBI::dbGetQuery(conn = dbcon, statement = "SELECT * FROM small_table")
# example of non-working query (see R output below)
res <- DBI::dbGetQuery(conn = dbcon, statement = "SELECT * FROM large_table;")
## R output
# Tunneled 31897311 bytes...Error: System failure for: recv() from user (Connection reset by peer)
# Execution halted
# Error: Failed to fetch row: SSL error: decryption failed or bad record mac
# Warning message:
# Disconnecting from unused ssh session. Please use ssh_disconnect()
"Connection reset by peer" means that whatever you have tried connecting to has responded in an RST flag, meaning that they have reset the connection.

Usage of R dbWriteTable for a different Oracle owner

I am not able to write data to a table that belongs to a different owner (Oracle database) using dbWriteTable(), even though I have INSERT grants.
I have tried the option below but get an error, even though dbExistsTable() returns TRUE for the same table. Execution steps followed:
Connecting to DB:
library(RJDBC)
drv <- JDBC("oracle.jdbc.OracleDriver",
            classPath = "C:/oracle_64/product/11.2.0/client_2/jdbc/lib/ojdbc6.jar", " ")
con <- dbConnect(drv, "jdbc:oracle:thin:@//hostname:1521/oracle_sid",
                 "MASTER_OWNER", "PASSWORD")
Write Dataframe into Oracle Table:
dbWriteTable(con, "R_STG_INSERT", DF_NAME, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "TEST_OWNER")
Note: the same dbWriteTable() call works as expected when the table belongs to the same database owner.
Expected: Load the data frame into Oracle table
Actual: Error Message
Error in .local(conn, statement, ...) : execute JDBC update query
failed in dbSendUpdate (ORA-00942: table or view does not exist )
Calls: dbWriteTable ... dbWriteTable -> .local -> dbSendUpdate ->
dbSendUpdate -> .local
Execution halted
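One thing that may be worth checking (an untested sketch, not a confirmed fix): dbWriteTable() in RJDBC does not document a schema argument, so schema = "TEST_OWNER" is probably ignored and the insert targets the connecting user's own schema; qualifying the owner directly in the table name may behave differently:
# untested sketch: qualify the owner in the table name instead of a schema argument
dbWriteTable(con, "TEST_OWNER.R_STG_INSERT", DF_NAME,
             row.names = FALSE, overwrite = FALSE, append = TRUE)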

RPostgreSQL can't connect

I'm having trouble connecting my R client to redshift through the RPostgreSQL package, despite it working very easily through psql. I've tried downloading and sourcing the redshift-ssl-ca-cert.pem file, but this still doesn't seem to work. Any ideas what could be causing this? Here's my R code:
library("RPostgreSQL")
drv <- dbDriver("PostgreSQL")
host = 'host.com'
dbname = 'dbname'
port = 1234
password = 'password'
username = 'user'
redshift_cert = paste0(FILE_PATH, 'redshift-ssl-ca-cert.pem')
pg_dsn = paste0(
  'dbname=', dbname, ' ',
  'sslrootcert=', redshift_cert, ' ',
  'sslmode=verify-full'
)
con <- dbConnect(drv, dbname=pg_dsn, host=host, port=port, password=password, user=username)
and I always get this error message
Error in postgresqlNewConnection(drv, ...) :
RS-DBI driver: (could not connect user@host.com on dbname "dbname"
)
Meanwhile this command using psql works perfectly
psql 'host=host.com dbname=dbname sslmode=require port=1234 user=user password=password'
I've also tried other sslmode variations for the R code, including require and allow, but nothing works. Would appreciate any ideas, thanks!

R: How to select a table from a schema in Redshift using dplyr (function src_postgres)?

I have a connection to my db in Redshift using dplyr (function src_postgres), but because the table is under a schema I can't select it.
Trying to access it without the schema, I get an error:
requests <- tbl(my_db, "sessions")
Error in postgresqlExecStatement(conn, statement, ...) : RS-DBI driver: (could not Retrieve the result : ERROR: relation "sessions" does not exist
Trying to access it with the schema in one string, I get an error:
requests <- tbl(my_db, "analytics.sessions")
Error: Table analytics.sessions not found in database
Trying to access it with the schema as separate strings, I get an error:
requests <- tbl(my_db, c("analytics", "sessions"))
Error: length(from) not equal to 1
But in RPostgreSQL it works:
dbExistsTable(my_db$con, c("analytics", "sessions"))
[1] TRUE
As suggested by @mike_db here, I can set the schema in the search_path.
To do that, I need to set the options parameter when opening a connection with src_postgres:
my_db <- src_postgres(host="host", port="5439",
dbname = "dbname", user = "user", password = "XXXX",
options="-c search_path=analytics")
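With the search_path set this way, the unqualified table name should then resolve (a small usage sketch):
# the analytics schema no longer needs to be spelled out
requests <- tbl(my_db, "sessions")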
Or you can use something like this when selecting a table:
requests <- tbl(my_db, sql("SELECT * from analytics.sessions WHERE 0=1"))
