My company is instituting an SSL requirement soon for database connections.
I previously connected to our Vertica database via the DBI and RJDBC packages. I have tried adding an sslmode = 'require' parameter to my connection, but adding this parameter has no effect: I can still connect to the database, but the connection is not SSL.
Can anyone advise on how to enable an SSL connection with DBI? In PyCharm I merely had to set ssl to true in the driver properties.
DBI::dbConnect(
  drv = RJDBC::JDBC(
    driverClass = driver_class,
    classPath = class_path
  ),
  url = url,
  UID = user_id,
  PWD = password,
  sslmode = 'require'
)
A different SSL parameter was required. I am having connection success with the call below, which uses ssl = 'true':
DBI::dbConnect(
  drv = RJDBC::JDBC(
    driverClass = driver_class,
    classPath = class_path
  ),
  url = url,
  UID = user_id,
  PWD = password,
  ssl = 'true'
)
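An alternative that may also work (a sketch only, not verified against this setup) is to pass the SSL property as part of the JDBC URL instead of as a separate argument; vertica_host, 5433, and database_name below are placeholders, and driver_class, class_path, user_id, and password are the same variables used above:

# Assumption: the Vertica JDBC driver accepts connection properties appended
# to the URL as ?key=value pairs.
url_ssl <- "jdbc:vertica://vertica_host:5433/database_name?ssl=true"

conn <- DBI::dbConnect(
  drv = RJDBC::JDBC(
    driverClass = driver_class,
    classPath = class_path
  ),
  url = url_ssl,
  UID = user_id,
  PWD = password
)

# One way to confirm the session is actually using SSL, assuming access to the
# sessions system table (column names may vary by Vertica version):
DBI::dbGetQuery(conn, "SELECT session_id, ssl_state FROM sessions")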
I'm having difficulty making a connection to my MS SQL Server from R.
I believe I'm using the right driver, I checked what drivers I had listed and picked the same one that I'm using in DBeaver for the connection to the same database:
library(odbc); library(dplyr); library(data.table)  # %like% is provided by data.table
odbcListDrivers() %>% filter(name %like% "SQL Server" & !name %like% "Teradata")
My connection string looks as follows:
db_conn <- DBI::dbConnect(odbc::odbc(),
  Driver = "SQL Server",
  Server = "Server_address",
  Database = "Database_Name",
  UID = "myName",
  PWD = "myPWd",
  Trusted_Connection = "True",
  Port = 1433,
  ApplicationIntent = "ReadOnly"
)
I get the following error:
Error: nanodbc/nanodbc.cpp:1021: 01S00: [Microsoft][ODBC SQL Server Driver][SQL Server]The target database ('Database_Name') is in an availability group and is currently accessible for connections when the application intent is set to read only. For more information about application intent, see SQL Server Books Online. [Microsoft][ODBC SQL Server Driver]Invalid connection string attribute
I can connect to the same database using DBeaver, where I have specified applicationIntent = readOnly for the connection.
Is there some other connection string property I need to include to make a connection?
It turns out that I was using the correct connection string properties, but the server was running in cluster mode, and therefore I needed to append 'cls' to the database's host name (note the Server value in the final connection string below).
My final connection string looks as follows:
db_conn <- DBI::dbConnect(odbc::odbc(),
  Driver = "SQL Server",
  Server = "SomeDBcls.xx.yy.zz.com",
  Database = "NameOfSchema",
  UID = "TheGoat",
  PWD = rstudioapi::askForPassword("Database password"),
  Trusted_Connection = "Yes",
  Port = 1433,
  applicationIntent = "readonly"
)
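If you want to confirm that the read-only intent actually took effect, one option (a sketch, assuming the db_conn above and permission to run these queries) is to ask SQL Server which instance you were routed to and whether the database is updateable:

# @@SERVERNAME reports the instance that answered the connection, and
# DATABASEPROPERTYEX(..., 'Updateability') returns READ_ONLY on a read-only replica.
DBI::dbGetQuery(db_conn, "SELECT @@SERVERNAME AS server_name,
                                 DATABASEPROPERTYEX(DB_NAME(), 'Updateability') AS updateability")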
I have a connection to a MySQL server like this:
library(DBI)
library(RMySQL)  # provides MySQL()

mydb <- DBI::dbConnect(
  drv = MySQL(),
  host = "host",
  port = 1111,
  user = "user_1",
  password = "password",
  dbname = "database_name"
)
and then I query that table using this code:
query1 <- fetch(dbSendQuery(mydb, "select * from table_1"), n = Inf)
so the result is that I have a query1 data frame in the R environment.
Now I have other databases in SQL Server, so I'm trying to do the same. I'm establishing the connection like this:
con <- dbConnect(odbc(),
  Driver = "SQL Server",
  Server = "server",
  Database = "database_2",
  UID = "user_2",
  PWD = "password",
  Port = 2222)
and it seems to work, because the database appears in the Connections tab, but when I navigate and try to view the data, an error occurs. Besides this, I'm looking for functions that do the same as before (fetch with dbSendQuery), so that the data frames are available in the environment.
I finally solved the problem using this code:
library(DBI)
library(odbc)

con <- dbConnect(odbc(),
  Driver = "ODBC Driver 17 for SQL Server",
  Server = "server",
  Database = "database_2",
  UID = "user_2",
  PWD = "password",
  Port = 2222)
So the problem was the driver.
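As for the second part of the question, the odbc backend uses the same DBI interface, so the equivalent of fetch(dbSendQuery(...)) is dbFetch() plus dbClearResult(), or simply dbGetQuery(). A minimal sketch, assuming the con above and a placeholder table name table_2:

# One-step query: sends the statement, fetches all rows, and cleans up.
query2 <- DBI::dbGetQuery(con, "SELECT * FROM table_2")

# Equivalent longer form, closer to the original fetch()/dbSendQuery() pattern:
res    <- DBI::dbSendQuery(con, "SELECT * FROM table_2")
query2 <- DBI::dbFetch(res, n = Inf)
DBI::dbClearResult(res)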
I'm attempting to refactor older code to make use of DB pools using the pool package's dbPool function.
Historically I've been using the DBI package's dbConnect function without issue. I can successfully create a connection to my Oracle database with the below code (all credentials are faked):
conn <- DBI::dbConnect(
  ROracle::Oracle(),
  # the unnamed string is matched to ROracle's dbname argument
  # (an Easy Connect string of the form host/service_name)
  "database.abcd1234.us-east-1.rds.amazonaws.com/orcl",
  username = "username",
  password = "hunter2"
)
However, when I use the same credentials in the same development environment to attempt to create a pool like this:
pool <- pool::dbPool(
  drv = ROracle::Oracle(),
  dbname = "orcl",
  host = "database.abcd1234.us-east-1.rds.amazonaws.com",
  username = "username",
  password = "hunter2"
)
I get an error:
Error in .oci.Connect(.oci.drv(), username = username, password = password, :
ORA-12162: TNS:net service name is incorrectly specified
I've used dbPool before, but with Postgres databases instead of Oracle, and for Postgres it just worked! I'm thinking that, because my credentials work fine for dbConnect, there must be some small thing I'm missing that's needed for dbPool to work correctly too.
orcl is the service name, not the database name.
Try:
pool <- pool::dbPool(
  drv = ROracle::Oracle(),
  host = "database.abcd1234.us-east-1.rds.amazonaws.com/orcl",
  username = "username",
  password = "hunter2"
)
Or
pool <- pool::dbPool(
  drv = ROracle::Oracle(),
  sid = "orcl",
  host = "database.abcd1234.us-east-1.rds.amazonaws.com",
  username = "username",
  password = "hunter2"
)
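If neither variant works, another option (a sketch only, based on the assumption that pool::dbPool forwards its extra arguments to ROracle::dbConnect, whose connect string is taken from dbname) is to reuse the exact Easy Connect string that already works for dbConnect:

library(pool)
library(ROracle)

# Pass the working host/service_name string via dbname, where ROracle expects it.
pool <- pool::dbPool(
  drv      = ROracle::Oracle(),
  dbname   = "database.abcd1234.us-east-1.rds.amazonaws.com/orcl",
  username = "username",
  password = "hunter2"
)

# Close the pool when it is no longer needed.
pool::poolClose(pool)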
I want to connect RStudio to a PostgreSQL database. So first, following the instructions, I modified the odbcinst.ini file:
[PostgreSQL Driver]
Driver = /usr/local/lib/psqlodbcw.so
OK, so now a PostgreSQL driver appears as selectable for connections. Then I tried to make the connection:
con <- dbConnect(odbc::odbc(),
  .connection_string = "Driver={PostgreSQL Driver};",
  dbname = 'name',
  host = 'host',
  user = 'user',
  password = 'pass',
  port = 5432,
  timeout = 10)
But it returns an error:
Error: nanodbc/nanodbc.cpp:983: 00000: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/tmp/.s.PGSQL.5432"?
I have tried so far:
library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")  # driver from the RPostgreSQL package
con <- dbConnect(drv,
  dbname = 'name',
  host = 'host',
  user = 'user',
  password = 'pass',
  port = 5432)
That works, but I'm unable to navigate the database (I mean, to see the tables and connection status) in RStudio.
My question is: should I change my driver in odbcinst.ini? What would the .so file be? Any help pointing out what to do will be greatly appreciated. By the way, I'm running on a Mac, and the host is on Amazon.
Perhaps RPostgres can help you out
con <- RPostgres::dbConnect(RPostgres::Postgres(),
  dbname = 'name',
  host = 'host',
  user = 'user',
  password = 'pass',
  port = 5432)
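A quick way to check that this connection also works with the rest of DBI (a small sketch, assuming the con created above):

DBI::dbListTables(con)                     # list the tables the account can see
DBI::dbGetQuery(con, "SELECT version();")  # confirm which PostgreSQL server answered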
After some trial and error this worked for me:
con <- dbConnect(odbc::odbc(),
  Driver = "PostgreSQL Driver",
  Database = 'name',
  Server = 'host',
  UID = 'user',
  PWD = 'pass',
  Port = 5432,
  timeout = 10)
Voila, now I can see the tables in the database.
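For what it's worth, the original attempt likely failed because dbname, host, user, and password are not ODBC keywords, so the driver fell back to a local Unix-socket connection; odbc expects keywords such as Database, Server, UID, and PWD, or a full connection string. An equivalent single connection string, assuming the psqlODBC driver accepts these commonly used keywords, would be:

library(DBI)

# Sketch: the same connection expressed as one ODBC connection string.
# 'name', 'host', 'user', and 'pass' are the placeholders from the question.
con <- dbConnect(
  odbc::odbc(),
  .connection_string = paste0(
    "Driver={PostgreSQL Driver};",
    "Server=host;Port=5432;Database=name;",
    "Uid=user;Pwd=pass;"
  ),
  timeout = 10
)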
I'm trying to connect to the remote, publicly accessible EnsEMBL public MySQL server using RMySQL, but when I try to list the tables, an error occurs:
library(RMySQL)
mydb = dbConnect(MySQL(),
  user = 'anonymous',
  port = 5306,
  host = 'asiadb.ensembl.org')
dbListTables(mydb)

Error in .local(conn, statement, ...) :
  could not run statement: No database selected
Is there a way to find out the name? Or am I making a completely different mistake altogether?
You have to specify the name of the database in the dbConnect call, e.g.:
mydb = dbConnect(MySQL(),
  user = 'anonymous',
  port = 5306,
  host = 'asiadb.ensembl.org',
  db = 'homo_sapiens_core_83_38')
dbListTables(mydb)
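The question also asks how to find the database name in the first place. One way (a sketch, assuming the anonymous account is allowed to run SHOW DATABASES against the EnsEMBL mirror) is to connect without a schema and list what is available:

# Connect without selecting a schema, then filter the schema list.
lookup <- dbConnect(MySQL(),
  user = 'anonymous',
  port = 5306,
  host = 'asiadb.ensembl.org')

# All human core schemas, e.g. "homo_sapiens_core_83_38".
dbGetQuery(lookup, "SHOW DATABASES LIKE 'homo_sapiens_core%'")

dbDisconnect(lookup)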
It is weird that, with database = 'testdb', a statement executed with dbExecute in R ran without error:
db <- dbConnect(RMySQL::MySQL(),
  user = 'root',
  password = 'pwd123',
  host = 'localhost',
  database = 'testdb'
)
dbExecute(db, MySQLStatement) # Executed Without Error
But when I used dbListTables(db), it reported that no database was selected.
Changing database to db made it work as expected:
db <- dbConnect(RMySQL::MySQL(),
  user = 'root',
  password = 'pwd123',
  host = 'localhost',
  db = 'testdb'
)
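A likely explanation (an assumption based on R's partial argument matching, not something stated above): RMySQL's dbConnect method has a dbname argument, so db partially matches dbname, while database matches nothing and is silently absorbed by ..., leaving no schema selected. Using the full argument name avoids relying on partial matching:

library(DBI)

db <- dbConnect(RMySQL::MySQL(),
  user = 'root',
  password = 'pwd123',
  host = 'localhost',
  dbname = 'testdb'  # the documented argument name
)

dbListTables(db)  # should now list the tables in testdb
dbDisconnect(db)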