RMongo: Connect to MongoDB using port forwarding

I connect to a MongoDB server using port forwarding like this:
ssh -i key.pem -Nf -L 27018:xx.xxx.xxx.xxx:27017 ubuntu@xx.xxx.xxx.xxx
mongo -u user -p pass --authenticationDatabase "db" --port 27018
Then I run R to connect and query the database:
library(RMongo)
mg1 <- mongoDbConnect(dbName = 'db', host = 'xx.xxx.xxx.xxx', port = 27018)
auth <- dbAuthenticate(rmongo.object = mg1, username = 'user', password = 'pass')
This gives me an error during authentication:
Error in .jcall(rmongo.object@javaMongo, "Z", "dbAuthenticate", username, :
com.mongodb.MongoException$Network: IOException authenticating the connection
Without port forwarding, my credentials work:
library(RMongo)
mg1 <- mongoDbConnect(dbName = 'db', host = 'xx.xxx.xxx.xxx', port = 27017)
auth <- dbAuthenticate(rmongo.object = mg1, username = 'user', password = 'pass')
How do I set my port in mongoDbConnect?
Thanks!

OK, this works. There is no need to give the remote host, because the tunnel maps the remote port to localhost:
library(RMongo)
mg1 <- mongoDbConnect(dbName = 'db', host = 'localhost', port = 27018)
auth <- dbAuthenticate(rmongo.object = mg1, username = 'user', password = 'pass')
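Once authenticated through the tunnel, queries behave exactly as they would against a direct connection. A minimal sketch, assuming a hypothetical collection named `mycollection` exists in `db`:

```r
library(RMongo)

# Connect through the SSH tunnel on the forwarded local port
mg1 <- mongoDbConnect(dbName = 'db', host = 'localhost', port = 27018)
dbAuthenticate(rmongo.object = mg1, username = 'user', password = 'pass')

# Fetch up to 10 documents from the (hypothetical) collection
res <- dbGetQuery(mg1, 'mycollection', '{}', skip = 0, limit = 10)
print(res)

dbDisconnect(mg1)
```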

Related

R Oracle DB connection fails with dbPool but succeeds with dbConnect

I'm attempting to refactor older code to make use of DB pools using the pool package's dbPool function.
Historically I've been using the DBI package's dbConnect function without issue. I can successfully create a connection to my Oracle database with the below code (all credentials are faked):
conn <- DBI::dbConnect(
  ROracle::Oracle(),
  "database.abcd1234.us-east-1.rds.amazonaws.com/orcl",
  username = "username",
  password = "hunter2"
)
However, when I use the same credentials in the same development environment to attempt to create a pool like this:
pool <- pool::dbPool(
  drv = ROracle::Oracle(),
  dbname = "orcl",
  host = "database.abcd1234.us-east-1.rds.amazonaws.com",
  username = "username",
  password = "hunter2"
)
I get an error:
Error in .oci.Connect(.oci.drv(), username = username, password = password, :
ORA-12162: TNS:net service name is incorrectly specified
I've used dbPool before, but with Postgres databases instead of Oracle, and for Postgres it just worked. Since my credentials work fine with dbConnect, there must be some small thing I'm missing that's needed for dbPool to work correctly too.
orcl is the service name, not the database name.
Try:
pool <- pool::dbPool(
  drv = ROracle::Oracle(),
  host = "database.abcd1234.us-east-1.rds.amazonaws.com/orcl",
  username = "username",
  password = "hunter2"
)
Or
pool <- pool::dbPool(
  drv = ROracle::Oracle(),
  sid = "orcl",
  host = "database.abcd1234.us-east-1.rds.amazonaws.com",
  username = "username",
  password = "hunter2"
)
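Once the pool is created, it can be queried like a regular DBI connection: pool checks a connection out per query and returns it automatically. A minimal usage sketch, reusing the faked credentials from the question:

```r
library(DBI)
library(pool)

# Create the pool; host/credentials are the faked values from the question,
# with the service name appended to the host
pool <- pool::dbPool(
  drv = ROracle::Oracle(),
  host = "database.abcd1234.us-east-1.rds.amazonaws.com/orcl",
  username = "username",
  password = "hunter2"
)

# Each query transparently borrows a connection from the pool
df <- dbGetQuery(pool, "SELECT 1 FROM dual")

# Close all pooled connections when the app shuts down
poolClose(pool)
```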

SSL connection for RJDBC

My company is instituting an SSL requirement soon for database connections.
I previously connected to our Vertica database via the DBI and RJDBC packages. I tried adding an sslmode = 'require' parameter to my connection, but it has no effect: I can still connect to the database, but the connection is not over SSL.
Can anyone advise on how to enable an SSL connection with DBI? In PyCharm I merely had to set ssl to true in the driver properties.
DBI::dbConnect(
  drv = RJDBC::JDBC(
    driverClass = driver_class,
    classPath = class_path
  ),
  url = url,
  UID = user_id,
  PWD = password,
  sslmode = 'require'
)
A different SSL parameter was required. I am now connecting successfully with the call below, which uses ssl = 'true':
DBI::dbConnect(
  drv = RJDBC::JDBC(
    driverClass = driver_class,
    classPath = class_path
  ),
  url = url,
  UID = user_id,
  PWD = password,
  ssl = 'true'
)
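Since JDBC drivers read connection properties from the URL, the SSL property can also be embedded there instead of passed as a named argument. A sketch under the assumption that the Vertica JDBC driver is in use; the driver path, host, and database name are placeholders:

```r
library(DBI)
library(RJDBC)

# Hypothetical driver class/path; adjust to your installation
drv <- JDBC(
  driverClass = "com.vertica.jdbc.Driver",
  classPath = "/path/to/vertica-jdbc.jar"
)

# ssl=true is passed as a JDBC connection property in the URL itself
con <- dbConnect(
  drv,
  url = "jdbc:vertica://db.example.com:5433/mydb?ssl=true",
  user = "user_id",
  password = "pass"
)
```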

How can I connect to PostgreSQL with a new connection in RStudio

I want to connect RStudio to a PostgreSQL database. Following the instructions, I first modified the odbcinst.ini file:
[PostgreSQL Driver]
Driver = /usr/local/lib/psqlodbcw.so
Now PostgreSQL appears as a selectable driver in the Connections pane. Then I tried to make the connection:
con <- dbConnect(odbc::odbc(),
  .connection_string = "Driver={PostgreSQL Driver};",
  dbname = 'name',
  host = 'host',
  user = 'user',
  password = 'pass',
  port = 5432,
  timeout = 10)
But it returns me an error:
Error: nanodbc/nanodbc.cpp:983: 00000: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/tmp/.s.PGSQL.5432"?
I have tried so far:
drv <- dbDriver("RPostgreSQL")
con <- dbConnect(drv,
  dbname = 'name',
  host = 'host',
  user = 'user',
  password = 'pass',
  port = 5432)
That works, but I'm unable to browse the database (i.e., see the tables and connection status) in RStudio.
My question is: should I change my driver in odbcinst.ini? What would the .so file be? Any help pointing out what to do will be greatly appreciated. By the way, I'm running on a Mac, and the host is on Amazon.
Perhaps RPostgres can help you out
con <- RPostgres::dbConnect(RPostgres::Postgres(),
  dbname = 'name',
  host = 'host',
  user = 'user',
  password = 'pass',
  port = 5432)
After some trial and error this worked for me:
con <- dbConnect(odbc::odbc(),
  Driver = "PostgreSQL Driver",
  Database = 'name',
  Server = 'host',
  UID = 'user',
  PWD = 'pass',
  Port = 5432,
  timeout = 10)
Voila, now I can see the tables in the database.
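The working call succeeds because odbc expects the ODBC driver's own keyword names (Driver, Server, UID, PWD) rather than DBI-style ones (host, user). Equivalently, the parameters can be packed into a single connection string; a sketch with the same placeholder credentials:

```r
library(DBI)

# Same connection expressed as one ODBC connection string
con <- dbConnect(
  odbc::odbc(),
  .connection_string = paste0(
    "Driver={PostgreSQL Driver};",
    "Server=host;Port=5432;",
    "Database=name;Uid=user;Pwd=pass;"
  ),
  timeout = 10
)

dbListTables(con)  # browse the available tables
```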

Access Redshift tables under a schema

I am trying to access some tables on Redshift, from Shiny, that are under a schema. I can connect to a table that is not under a schema using library(RPostgreSQL), so I know this part is working:
pconn_r <- dbConnect(RPostgres::Postgres(),
  host = "xxxxxxxxxxxx",
  port = "5439",
  user = "rcd_admin",
  password = "xxxxxxx",
  dbname = "xxxx"
)
but I'm unable to access my table fr__synapps_drug__outpatient_index under the schema synapps with this command:
sql_command <- "Select cip13_code,cis_label,presentation_label,brand,period,hco_code,hco_label,hco_code_type,hco_city,bse,rem,unit from synapps.fr__synapps_drug__outpatient_index"
outpatient <- dbSendQuery(pconn_r, sql_command)
I found a solution:
myRedshift <- DBI::dbConnect(RPostgreSQL::PostgreSQL(),
  dbname = 'oip',
  host = 'xxxx',
  port = 5439,
  user = "xxxadmin",
  password = "xxx")
cis_has_bdpm <- data.frame(dbGetQuery(myRedshift, "select * from synapps.fr__synapps_drug__has_index"))
I changed the way I connect to Redshift and it works, but loading the data is very slow.
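Note that the original attempt used dbSendQuery, which only submits the statement; the rows still have to be fetched. The schema-qualified query works the same way with the original RPostgres connection, sketched here with the question's placeholder credentials:

```r
library(DBI)

con <- dbConnect(RPostgres::Postgres(),
  host = "xxxx", port = 5439,
  user = "rcd_admin", password = "xxx", dbname = "xxxx")

# dbSendQuery only submits the statement; dbFetch retrieves the rows
res <- dbSendQuery(con, "SELECT * FROM synapps.fr__synapps_drug__outpatient_index")
outpatient <- dbFetch(res)
dbClearResult(res)

# Or, submit and fetch in one step
outpatient <- dbGetQuery(con, "SELECT * FROM synapps.fr__synapps_drug__outpatient_index")
```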

No database selected with RMySQL

I'm trying to connect to the remote, publicly accessible EnsEMBL MySQL server using RMySQL, but when I try to list the tables, an error occurs:
library(RMySQL)
mydb = dbConnect(MySQL(),
  user = 'anonymous',
  port = 5306,
  host = 'asiadb.ensembl.org')
dbListTables(mydb)
Error in .local(conn, statement, ...) :
could not run statement: No database selected
Is there a way to find out the name, or am I making a completely different mistake altogether?
You have to specify the name of the database in the dbConnect call, e.g.:
mydb = dbConnect(MySQL(),
  user = 'anonymous',
  port = 5306,
  host = 'asiadb.ensembl.org',
  db = 'homo_sapiens_core_83_38')
dbListTables(mydb)
Oddly, a connection created with database = 'testdb' still ran statements via dbExecute:
db <- dbConnect(RMySQL::MySQL(),
  user = 'root',
  password = 'pwd123',
  host = 'localhost',
  database = 'testdb'
)
dbExecute(db, MySQLStatement) # Executed Without Error
But dbListTables(db) then reported that no database was selected.
Changing database to db worked as expected (presumably because db partially matches the dbname argument, while database does not):
db <- dbConnect(RMySQL::MySQL(),
  user = 'root',
  password = 'pwd123',
  host = 'localhost',
  db = 'testdb'
)
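If you connect without a database, as in the original question, another option is to select one after connecting with a USE statement. A minimal sketch against the public EnsEMBL server, using the database name from the answer above:

```r
library(RMySQL)

# Connect with no database selected
mydb <- dbConnect(MySQL(),
  user = 'anonymous',
  port = 5306,
  host = 'asiadb.ensembl.org')

# Select a database after the fact, then list its tables
dbExecute(mydb, "USE homo_sapiens_core_83_38")
dbListTables(mydb)

dbDisconnect(mydb)
```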
