Issues with RS-DBI driver in R

I'm having an issue figuring out why I can't connect to a PSql DB from R. I am able to access the database from the terminal using the psql command, but when connecting through DBI and R I get the following message [with some information redacted]:
RS-DBI driver: (could not connect [username]#[database URI] on dbname "[dbname]"
The database string works fine in the terminal, and this code works fine on the machine I am porting it from. I have reinstalled the versions of the libraries that match what was on the dev machine, and am still having problems.
Any advice?
Edit:
I was able to get it working by fiddling around with the library(...) statements. It seems that changing the order of the DBI and RPostgreSQL libraries has an effect. RPostgreSQL requires DBI, but importing just RPostgreSQL still produced the could-not-connect error.
To future readers with this issue: fiddle with the order, it may help!
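For reference, a minimal sketch of the loading order that worked (the connection details below are placeholders, not values from the question):

```r
# Attach DBI first, then RPostgreSQL (the order that resolved the error here).
library(DBI)
library(RPostgreSQL)

# Placeholder connection details:
con <- dbConnect(PostgreSQL(), dbname = "mydb", host = "dbhost",
                 port = 5432, user = "user", password = "pass")
dbDisconnect(con)
```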

Just an educated guess: your psql runs on the same machine, so it uses the local connection. The DBI-based methods using the PostgreSQL library will use a network connection, so you actually have to enable that in the corresponding config file.
See, e.g., the PostgreSQL documentation on pg_hba.conf.
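For illustration, a pg_hba.conf entry of roughly this shape (the address range and auth method here are examples only) is what permits a network client to connect:

```
# TYPE  DATABASE  USER  ADDRESS          METHOD
host    all       all   192.168.1.0/24   md5
```

Note that the listen_addresses setting in postgresql.conf also has to allow the server to accept network connections.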

Related

Trouble Connecting to DB2 Database with R error 1114

I've been trying to connect RStudio to a DB2 database and have been receiving the following error
Error: nanodbc/nanodbc.cpp:950: IM003: Specified driver could not be loaded due to system error 1114:
A dynamic link library (DLL) initialization routine failed.
(IBM DB2 ODBC DRIVER - DB2COPY1, C:\PROGRA~1\IBM\SQLLIB\BIN\DB2CLIO.DLL).
I've been using this code
connection <- DBI::dbConnect(odbc::odbc(),
                             Driver = "IBM DB2 ODBC DRIVER - DB2COPY1",
                             Server = "NRDCWIP6",
                             uid = "nxxx", pwd = "Wxxx")
which has been working well for a different database (SQL Server). I'm working in Windows 10 and don't have a lot of information about the database itself, since it's managed by an IT group that's quite busy. I'm also still quite new to connecting R to databases. I do know that the platform for the DSN is 32-bit, but when I look under the User DSN tab, it is listed as 32/64 bit.
I know 1114 is a rather well known error, but I'm not sure where the problem is and I've tried numerous variations of this code. Anything will help!
Here may be the answer for this situation:
Specified driver could not be loaded due to system error 1114
https://www.ibm.com/support/pages/specified-driver-could-not-be-loaded-due-system-error-1114
Here is the key note from above:
Resolving The Problem
Launch the odbcad32.exe from the Windows/SysWOW64/ folder and ensure that you have the current driver for the database version you are connecting to, specified in the Data Source that is being used in the map
Hope this helps.
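From within R you can also check which ODBC drivers the session can actually see; if the Driver name you pass to dbConnect() doesn't match one of these entries, the connection will fail before it reaches the database. A quick check using the odbc package:

```r
library(odbc)

# List the ODBC drivers visible to this R session; the Driver value
# passed to dbConnect() must match one of the names returned here.
odbcListDrivers()
```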

Could there be a language issue with an ODBC driver in R with an Access DB?

I am using an R script which connects to a local Access database. For that, I used the 'odbc' package in R and created an ODBC driver in Windows. It works well on my machine.
The issue I have is that it can't connect to the database when the script runs on a foreign computer with language settings other than English. Both machines are running 64-bit Windows with Access and R on 64-bit. Running the following code:
library(odbc)
con <- dbConnect(odbc::odbc(), "AccessDB")
results in the following error message:
Error in connection_info(ptr) : nanodbc/nanodbc.cpp:1072:
I haven't found a solution yet; I am thinking of using another database.
I received the same error today on a setup that usually works. After downgrading the odbc-package to 1.1.6, it works fine again.
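If you need to pin the package to that version, one way to do it (assuming the remotes package is installed) is:

```r
# Install a specific older version of odbc from CRAN's archive.
remotes::install_version("odbc", version = "1.1.6")
```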

How to use psql in R? copy_to function of the "rpg" package not working

I am connecting to a PostgreSQL database and I would like to make use of psql commands (especially the \copy command) from within R.
I'm on a Windows client using ODBC drivers to connect to the database. Basically any of the major ODBC packages in R, including the "rpg" package, works to connect to the database and to read and write tables, etc.
Apart from placing regular SQL queries, the "rpg" package allows the use of psql commands. The "copy_to" function should send the psql "\copy" command to the database.
However, when running the function I get the error: "psql not found"
I also have pgAdmin III installed. Here running the \copy command is no problem at all.
Digging deeper, I found that the rpg::copy_to function first runs Sys.which("psql"), which returns "", leading to said error.
Reading this thread made me think that adding the path to the pgAdmin psql.exe would do the trick. So I added the line
psql=C:\Program Files (x86)\pgAdmin III\1.16\psql.exe
in the R environment.
Running Sys.which("psql") still returns "", while Sys.getenv() correctly shows the path to the psql.exe that I specified.
How can I make Sys.which() find psql.exe? Assuming that's the correct way to solve this issue in the first place.
I would appreciate any help!
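One thing worth knowing here: Sys.which() searches the directories listed in the PATH environment variable, not arbitrary environment variables such as a psql=... entry. A sketch of appending psql's directory to PATH for the current session (the pgAdmin path is the one from the question):

```r
# Append psql's directory to PATH for this R session only (Windows
# uses ";" as the PATH separator).
Sys.setenv(PATH = paste("C:\\Program Files (x86)\\pgAdmin III\\1.16",
                        Sys.getenv("PATH"), sep = ";"))

# Should now return the full path if psql.exe is in that folder.
Sys.which("psql")
```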

R Oracle connect via DBI::dbDriver("Oracle") throws error

I am trying a simple connection to an Oracle database via the DBI and ROracle packages, following instructions from R to Oracle Database Connectivity: Use ROracle for both Performance and Scalability.
When I test the connection via Windows7 > ODBC Data Source Administrator (32bit), the connection is successful. It uses the installed Oracle client OraClient11g_home1 which resides in C:\oracle\Client112_32. ORACLE_HOME environment variable is set to C:\oracle\Client112_32.
I am guessing it may be connected to some 32-bit/64-bit issue? But even after quite some research I did not find any solution. I also tried running the same in 32-bit R, but it fails as well. BTW, the connection via SQL Developer is also successful.
drv <- DBI::dbDriver("Oracle")
#>Error: Couldn't find driver Oracle. Looked in:
#>* global namespace
#>* in package called Oracle
#>* in package called ROracle
I've had this issue as well. I found that loading the ROracle library beforehand fixes the problem.
library("ROracle")
drv <- DBI::dbDriver("Oracle")
I don't know why though.
Building on user11227405's answer: it is actually enough to load ROracle without attaching it to the search path; library() does both:
loadNamespace("ROracle")
drv <- DBI::dbDriver("Oracle")
That might be preferred, e.g., in packages, where changing the search path should be avoided.

Redshift JDBC connection crashes on second opening in R

I am using the RJDBC package to connect to AWS Redshift from an EC2 ubuntu instance.
I can successfully connect using the JDBC() call, retrieve/insert rows and then close the connection.
However, when I re-open a second connection in the same R session, R crashes with a segmentation fault. This happens in both RStudio and console R. I'm using conda to manage the R installation.
I have tried the connection using the native redshift jar provided by Amazon and also another jar from Progess Software. I get the same effect with both drivers: first connection is fine, subsequent connections crash.
I've installed the latest JVM, v8. I had seen some other threads that suggested installing v6 as a workaround, but unfortunately that is no longer available from the Oracle site.
My gut feeling is that Java has a weird interaction with R, but I'm at a loss as to how to proceed.
OK, I solved this myself and thought I'd record in case this is useful to others.
The problem was really with rJava not re-initialising the JVM correctly.
I added the following line before opening a database connection:
rJava::.jinit(force.init = TRUE)
Now I can open and close connections without issue using RJDBC.
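Put together, a sketch of the pattern (the driver class name and jar path below are placeholders; check your JDBC jar's documentation for the exact class name):

```r
library(RJDBC)

# Placeholder driver class and jar path:
drv <- JDBC("com.amazon.redshift.jdbc42.Driver",
            "/path/to/redshift-jdbc42.jar")

# Force the JVM to (re)initialise before each connection.
rJava::.jinit(force.init = TRUE)

con <- dbConnect(drv, "jdbc:redshift://host:5439/dbname", "user", "pass")
# ... queries ...
dbDisconnect(con)
```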