Connect SQL Server with RStudio and access data in the environment

I have a connection to a MySQL server like this:
library(RMySQL)   # provides MySQL()

mydb <- DBI::dbConnect(
  drv = MySQL(),
  host = "host",
  port = 1111,
  user = "user_1",
  password = "password",
  dbname = "database_name"
)
and then I run queries against that database with code like this:
query1 <- fetch(dbSendQuery(mydb, "select * from table_1"), n = Inf)
so the result is that I have a query1 table in the R environment.
Now I have other databases in SQL Server, so I'm trying to do the same there. I establish the connection like this:
library(odbc)
con <- dbConnect(odbc(),
                 Driver = "SQL Server",
                 Server = "server",
                 Database = "database_2",
                 UID = "user_2",
                 PWD = "password",
                 Port = 2222)
and it seems to work, because the database appears in the Connections tab, but when I navigate it and try to view the data, an error occurs. Besides this, I'm looking for functions that do the same as before (fetch with dbSendQuery), so that the data frames end up in the environment.

I finally solved the problem using this code:
con <- dbConnect(odbc(),
                 Driver = "ODBC Driver 17 for SQL Server",
                 Server = "server",
                 Database = "database_2",
                 UID = "user_2",
                 PWD = "password",
                 Port = 2222)
So the problem was the driver: the generic "SQL Server" driver had to be replaced with "ODBC Driver 17 for SQL Server".
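As for getting the data frames into the environment, the DBI generic dbGetQuery does over this ODBC connection what fetch(dbSendQuery(...), n = Inf) did over RMySQL; table_1 here is just the placeholder table name reused from the question:

# one step: run the query and return the full result set as a data frame
query1 <- dbGetQuery(con, "select * from table_1")

# equivalent long form: send the query, fetch everything, then release the result
res    <- dbSendQuery(con, "select * from table_1")
query1 <- dbFetch(res, n = Inf)
dbClearResult(res)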

Related

How can I establish a connection to a DB without entering credentials every time

I'm working with several R files that need to connect to a database, but the server has a timeout, so timeout = Inf is not an option.
What I was looking for is something like this:
RMySQL Database connection
(I'm the only R user, and there are only a few users in total.)
The idea is to put everything that's needed (the credentials) in my .Rprofile, and to connect and disconnect only in the .R programs where it's necessary.
In .Rprofile:
con <- dbConnect(odbc::odbc(),
                 Driver = "{MariaDB ODBC 3.1 Driver}",
                 Server = "{host}",
                 database = "db",
                 UID = "userid",
                 PWD = "pwd",
                 Port = 1234)
and in the .R programs use something like this:
conn <- dbConnect(odbc::odbc(), group = "what i should use here?")
# using the database
tbl(conn, "table")
# more code
dbDisconnect(conn)
I was also looking at another option, pool:
In .Rprofile:
library(pool)
pool <- dbPool(odbc::odbc(),
               Driver = "{MariaDB ODBC 3.1 Driver}",
               Server = "{host}",
               database = "db",
               UID = "userid",
               PWD = "pwd",
               Port = 1234)
.Last <- function() {
  poolClose(pool)
}
But I'm not sure whether this works or whether the previous option is better.
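One possibility, sketched below under the same placeholder driver string and credentials, is to keep a small helper function in .Rprofile instead of an open connection, so each script opens and closes its own connection without repeating the credentials; open_db() is a made-up name for illustration. (Another route is to define a DSN in odbc.ini / the ODBC Data Sources dialog and connect with dbConnect(odbc::odbc(), dsn = "my_dsn").)

# In .Rprofile: store a connection factory, not a live connection (hypothetical helper name)
open_db <- function() {
  DBI::dbConnect(odbc::odbc(),
                 Driver = "{MariaDB ODBC 3.1 Driver}",
                 Server = "{host}",
                 database = "db",
                 UID = "userid",
                 PWD = "pwd",
                 Port = 1234)
}

# In each .R program: open, use, and close the connection where it is needed
conn <- open_db()
dplyr::tbl(conn, "table")
DBI::dbDisconnect(conn)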

How can I connect to PostgreSQL with a new connection in RStudio

I want to connect RStudio to a PostgreSQL database. So first, following the instructions, I modified the odbcinst.ini file:
[PostgreSQL Driver]
Driver = /usr/local/lib/psqlodbcw.so
OK, so now a PostgreSQL entry appears as selectable in the Connections pane. Then I tried to make the connection:
con <- dbConnect(odbc::odbc(),
                 .connection_string = "Driver={PostgreSQL Driver};",
                 dbname = 'name',
                 host = 'host',
                 user = 'user',
                 password = 'pass',
                 port = 5432,
                 timeout = 10)
But it returns an error:
Error: nanodbc/nanodbc.cpp:983: 00000: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/tmp/.s.PGSQL.5432"?
I have tried so far:
library(RPostgreSQL)
drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv,
                 dbname = 'name',
                 host = 'host',
                 user = 'user',
                 password = 'pass',
                 port = 5432)
That works but I'm unable to navigate the database (I mean, to see the tables and connection status) in RStudio.
My question is: should I change my driver in odbcinst.ini? What would the .so file be? Any help pointing out what to do will be greatly appreciated. By the way, I'm running on Mac, and the host is on Amazon.
Perhaps RPostgres can help you out:
con <- RPostgres::dbConnect(RPostgres::Postgres(),
                            dbname = 'name',
                            host = 'host',
                            user = 'user',
                            password = 'pass',
                            port = 5432)
After some trial and error this worked for me:
con <- dbConnect(odbc::odbc(),
                 Driver = "PostgreSQL Driver",
                 Database = 'name',
                 Server = 'host',
                 UID = 'user',
                 PWD = 'pass',
                 Port = 5432,
                 timeout = 10)
Voila, now I can see the tables in the database.
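For what it's worth, odbc assembles the named arguments into an ODBC connection string, and the PostgreSQL ODBC driver expects keywords like Server, Database, UID and PWD; dbname/host/user/password are most likely not recognized, which would explain why the first attempt fell back to the local Unix socket. A sketch of the same connection written as an explicit connection string (same placeholder values as above):

# the working connection expressed as a single ODBC connection string
con <- DBI::dbConnect(
  odbc::odbc(),
  .connection_string = "Driver={PostgreSQL Driver};Server=host;Port=5432;Database=name;UID=user;PWD=pass;"
)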

Connect with RJDBC like with odbc

I am trying to connect to my MS SQL database with RJDBC and I don't know what to put in the url argument. With odbc this was enough:
dbConnect(odbc::odbc(),
          Driver = "SQL Server",
          dsn = "MyDsn",
          uid = "User",
          pwd = "123456",
          server = "myserver123456\\myserver1",
          database = "MyDatabase")
When I swap the driver from odbc to JDBC, it fails:
dbConnect(RJDBC::JDBC(classPath = "C:/jdbc/mssql-jdbc-7.0.0.jre8.jar"),
          Driver = "SQL Server",
          dsn = "MyDsn",
          uid = "User",
          pwd = "123456",
          server = "myserver123456\\myserver1",
          database = "MyDatabase")
The error:
Error in .jcall("java/sql/DriverManager", "Ljava/sql/Connection;", "getConnection", :
  argument "url" is missing, with no default
What should I write in the url argument? How do I find out?
RJDBC uses different arguments for the dbConnect function: a driver definition and a connection URL (the piece you are missing). For example (from https://www.rforge.net/RJDBC/), to connect to a MySQL Database, your code would look like the following:
library(RJDBC)
drv <- JDBC("com.mysql.jdbc.Driver",
            "/etc/jdbc/mysql-connector-java-3.1.14-bin.jar",
            identifier.quote = "`")
conn <- dbConnect(drv, "jdbc:mysql://localhost/test", "user", "pwd")
This loads a JDBC driver for MySQL (adjust the path to the driver's JAR file as necessary) and connects to the local database "test". The connection handle conn is used for all subsequent operations.
For SQL Server, your code will look something like the following (from https://www.r-bloggers.com/connecting-to-sql-server-from-r-using-rjdbc/):
require(RJDBC)
drv <- JDBC("com.microsoft.sqlserver.jdbc.SQLServerDriver",
            "C:/jdbc/mssql-jdbc-7.0.0.jre8.jar")
conn <- dbConnect(drv, "jdbc:sqlserver://serverName", "userID", "password")
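Applied to the connection details from the question (the server, instance, database, and credentials below are the question's placeholders), the instance name goes after a backslash in the URL and the database is passed as the databaseName property, so the call would look roughly like this:

library(RJDBC)
drv <- JDBC("com.microsoft.sqlserver.jdbc.SQLServerDriver",
            "C:/jdbc/mssql-jdbc-7.0.0.jre8.jar")
# instance after the backslash (escaped in the R string), database as a URL property
conn <- dbConnect(drv,
                  "jdbc:sqlserver://myserver123456\\myserver1;databaseName=MyDatabase",
                  "User", "123456")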

Access Redshift tables under a schema

I am trying to access some tables on Redshift from Shiny/R that are under a schema. I can connect to tables that are not under a schema using library(RPostgreSQL), so I know this part is working:
pconn_r <- dbConnect(RPostgres::Postgres(),
                     host = "xxxxxxxxxxxx",
                     port = "5439",
                     user = "rcd_admin",
                     password = "xxxxxxx",
                     dbname = "xxxx")
but I'm unable to access my table fr__synapps_drug__outpatient_index under the schema synapps with this command:
sql_command <- "Select cip13_code,cis_label,presentation_label,brand,period,hco_code,hco_label,hco_code_type,hco_city,bse,rem,unit from synapps.fr__synapps_drug__outpatient_index"
outpatient <- dbSendQuery(pconn_r, sql_command)
I found a solution:
myRedshift <- DBI::dbConnect(RPostgreSQL::PostgreSQL(),
                             dbname = 'oip',
                             host = 'xxxx',
                             port = 5439,
                             user = "xxxadmin",
                             password = "xxx")
cis_has_bdpm <- data.frame(dbGetQuery(myRedshift, "select * from synapps.fr__synapps_drug__has_index"))
I changed the way I connect to Redshift and it's working, but loading the data is very slow.
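For what it's worth, the schema-qualified query in the first attempt was probably fine; dbSendQuery only sends the statement and returns a result object, so the rows still have to be fetched before you can use them. A minimal sketch using the question's pconn_r connection and sql_command:

res        <- dbSendQuery(pconn_r, sql_command)   # sends the query, returns a result object
outpatient <- dbFetch(res)                        # actually retrieves the rows
dbClearResult(res)

# or, in one step
outpatient <- dbGetQuery(pconn_r, sql_command)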

No database selected with RMySQL

I'm trying to connect to the remote, publicly accessible EnsEMBL MySQL server using RMySQL, but when I try to list the tables, an error occurs:
library(RMySQL)
mydb = dbConnect(MySQL(),
                 user = 'anonymous',
                 port = 5306,
                 host = 'asiadb.ensembl.org')
dbListTables(mydb)
Error in .local(conn, statement, ...) :
  could not run statement: No database selected
Is there a way to find out the name? Or am I making a completely different mistake altogether?
You have to specify the name of the database in the dbConnect call, e.g.:
mydb = dbConnect(MySQL(),
                 user = 'anonymous',
                 port = 5306,
                 host = 'asiadb.ensembl.org',
                 db = 'homo_sapiens_core_83_38')
dbListTables(mydb)
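If you don't yet know which database name to use, a small sketch: you can list the ones available on the server from the initial connection (the one made without a database), since SHOW DATABASES is plain MySQL SQL, and then reconnect with the one you want:

# list the databases available on the server
dbGetQuery(mydb, "SHOW DATABASES")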
It is weird that with database = 'testdb' in the dbConnect call, a statement run with dbExecute still worked:
db <- dbConnect(RMySQL::MySQL(),
                user = 'root',
                password = 'pwd123',
                host = 'localhost',
                database = 'testdb')
dbExecute(db, MySQLStatement) # Executed Without Error
But dbListTables(db) then reported that no database was selected.
Changing database to db worked as expected, presumably because RMySQL's dbConnect argument is actually dbname: db partially matches dbname, while database matches nothing and is silently ignored.
db <- dbConnect(RMySQL::MySQL(),
                user = 'root',
                password = 'pwd123',
                host = 'localhost',
                db = 'testdb')
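For reference, the argument documented by RMySQL is dbname, so spelling it out in full avoids relying on partial matching (same placeholder credentials as above):

db <- dbConnect(RMySQL::MySQL(),
                user = 'root',
                password = 'pwd123',
                host = 'localhost',
                dbname = 'testdb')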
