Error connecting to DB2 via ODBC in R

I'm having trouble connecting to a DB2 database via ODBC. I'm on a Windows system and have configured a Data Source Name within the ODBC Data Source Administrator. When I test the connection there, I get "Connection tested successfully." I can also successfully test the connection within IBM's DB2 Configuration Assistant, using both CLI and ODBC.
I'm not able to connect within R. I've tried both the RODBC and odbc packages, with the same result. My intent is to execute a simple query to verify the connection, but when I run the following R script I get an error. Here's a simplified version of my script.
library(RODBC)

myQuery <- 'SELECT COLUMN1, COLUMN2 FROM DATABASE.TABLE FETCH FIRST 10 ROWS ONLY;'

# Open a channel to the DSN, run the query, then close all channels
cnxn <- odbcConnect('myDSN')
data <- sqlQuery(channel = cnxn, query = myQuery)
odbcCloseAll()
Here's the error that I get.
Error in sqlQuery(channel = cnxn, query = myQuery) :
first argument is not an open RODBC channel
In addition: Warning messages:
1: In RODBC::odbcDriverConnect("DSN=myDSN") :
[RODBC] ERROR: state 58031, code -1031, message [IBM][CLI Driver] SQL1031N The database directory cannot be found on the indicated file system. SQLSTATE=58031
2: In RODBC::odbcDriverConnect("DSN=myDSN") : ODBC connection failed
I've learned through experimentation that my script never gets to the point of sending the query. This error is generated at the odbcConnect command.
I don't have access to the server itself, only the database. Is there anything that I can do or try to resolve this on my own, without having to go through support?
EDIT:
I've now cataloged my database, and the test connection is successful in three places: the ODBC Data Source Administrator, the Db2 command line, and the Db2 Configuration Assistant. I know there's no issue with permissions, as I can execute queries via IBM Query Management Facility. I believe this is an issue with either my driver or my system's PATH, but I'm not sure how to track that down.
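To check which ODBC drivers and DSNs the R session itself can see (as opposed to what the ODBC Administrator shows), the odbc package has listing helpers; this is just a diagnostic sketch:
library(odbc)

# Installed ODBC drivers and their registered attributes
odbcListDrivers()

# User and system DSNs visible to this R session; 'myDSN' should appear here
odbcListDataSources()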

Taking a non-RODBC approach, the method below works for connecting R and DB2 over JDBC. Assuming you know all the connection information below, you'll need to download an IBM DB2 JDBC driver jar file and note where it lives; in this case it's in a folder on my machine called "IBM".
Note: there are two available jar files, db2jcc.jar and db2jcc4.jar. The example below uses db2jcc.jar.
library(rJava)
library(RJDBC)
library(DBI)

# Enter the values for your database connection
dsn_driver = "com.ibm.db2.jcc.DB2Driver"
dsn_database = ""      # e.g. "BLUDB"
dsn_hostname = ""      # e.g. "awh-yp-small03.services.dal.bluemix.net"
dsn_port = ""          # e.g. "50000"
dsn_protocol = "TCPIP"
dsn_uid = ""           # e.g. "dash104434"
dsn_pwd = ""           # e.g. "7dBZ39xN6$o0JiX!m"

# Point the JDBC driver at the db2jcc.jar you downloaded
jcc = JDBC("com.ibm.db2.jcc.DB2Driver", "C:/Program Files/IBM/SQLLIB/java/db2jcc.jar")

# Build the JDBC URL and open the connection
jdbc_path = paste("jdbc:db2://", dsn_hostname, ":", dsn_port, "/", dsn_database, sep = "")
conn = dbConnect(jcc, jdbc_path, user = dsn_uid, password = dsn_pwd)

# Run the query and fetch all rows of the result
query = "SELECT *
         FROM Table
         FETCH FIRST 10 ROWS ONLY"
rs = dbSendQuery(conn, query)
df = fetch(rs, -1)
df
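If you don't need the intermediate result object, dbGetQuery() sends the query and fetches the result in one step; either way it's good practice to clean up when you're done. A short sketch reusing conn, query and rs from above (I haven't exhaustively tested it against this driver):
# Send and fetch in a single call
df = dbGetQuery(conn, query)

# Clean up when finished
dbClearResult(rs)
dbDisconnect(conn)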

According to the DB2 manual:
SQL1031N The database directory cannot be found on the indicated file system.
Explanation
The system database directory or local database directory could not be found. A database has not been created or it was not cataloged correctly.
The command cannot be processed.
User response
Verify that the database is created with the correct path specification. The Catalog Database command has a path parameter which specifies the directory where the database resides.
sqlcode: -1031
sqlstate: 58031
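If cataloging keeps failing, another thing to try from R is a DSN-less connection that hands the host and database straight to the IBM CLI driver, so no local database directory lookup is needed. This is only a sketch; the driver name and all the values below are placeholders (check the exact driver name in your ODBC Administrator):
library(RODBC)

# Driver name and values are placeholders -- adjust to match your system
connString <- paste0(
  "DRIVER={IBM DB2 ODBC DRIVER};",
  "DATABASE=MYDB;",
  "HOSTNAME=myhost.example.com;",
  "PORT=50000;",
  "PROTOCOL=TCPIP;",
  "UID=myuser;",
  "PWD=mypassword;"
)
cnxn <- odbcDriverConnect(connString)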

Related

R - handle error when accessing a database

I'm trying to automate data downloads from a database using RJDBC in a for loop. The database I'm using automatically closes the connection every 10 minutes, so what I want to do is catch the error somehow, remake the connection, and continue the loop. The problem is that it doesn't seem to be a normal R error, so tryCatch and similar commands don't work. I just get text on the console telling me:
Error in .jcheck() : No running JVM detected. Maybe .jinit() would help.
How do I handle this, in terms of:
if (output == ERROR) {remake connection and run dbQuery} else {run dbQuery}
Thanks for any help.
You could use the pool package to abstract away the logic of connection management.
This does exactly what you expect regarding connection management with DBI.
It should work with RJDBC, which is an implementation of DBI, but I didn't test it with this driver.
library(pool)
library(RJDBC)

# With RJDBC the connection is described by a JDBC URL
# rather than separate dbname/host arguments
conn <- dbPool(
  drv = RJDBC::JDBC(...),
  url = "jdbc:mydriver://hostaddress/mydb",
  user = "test",
  password = "test"
)
on.exit(poolClose(conn))

dbGetQuery(conn, "select... ")
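An alternative that stays closer to the pseudocode in the question is to wrap each query in tryCatch and reconnect on failure. This is only a sketch: connect_to_db() is a hypothetical helper standing in for however you build your RJDBC connection, queries is assumed to be a list of SQL strings, and whether this catches your particular rJava error depends on how it is raised.
library(DBI)
library(RJDBC)

# Hypothetical helper: builds a fresh RJDBC connection
# (driver class, jar path and URL are placeholders)
connect_to_db <- function() {
  drv <- JDBC("com.mydb.Driver", "C:/path/to/driver.jar")
  dbConnect(drv, "jdbc:mydb://host:1234/mydb", user = "test", password = "test")
}

conn <- connect_to_db()
for (qry in queries) {
  res <- tryCatch(
    dbGetQuery(conn, qry),
    error = function(e) {
      # If the server has dropped the connection, reconnect and retry once
      conn <<- connect_to_db()
      dbGetQuery(conn, qry)
    }
  )
  # ... process res here ...
}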

Unable to query Azure SQL Server from RStudio

I am trying to query an SQL Server database deployed on Azure from RStudio.
I established a connection. I can see all of the tables and get a preview of the data from the RStudio GUI. However, when I try to query the data I get an error.
library(dplyr)
library(odbc)
con <- dbConnect(odbc(),
Driver = "SQL Server",
Server = "#.database.windows.net",
Database = "loremipsum",
UID = "loremipsum",
PWD = "loremipsum",
Port = 1433)
First query I tried:
users <- as.data.frame(sqlQuery(con, "SELECT * FROM AspNetUsers"))
The error I got:
Error in sqlQuery(con, "SELECT * FROM AspNetUsers") :
first argument is not an open RODBC channel
I also tried this query
result <- dbSendQuery(con, "SELECT * FROM AspNetUsers")
first_100 <- dbFetch(result, n = 100)
With the following error:
Error in result_fetch(res@ptr, n) :
nanodbc/nanodbc.cpp:2966: 07009: [Microsoft][ODBC SQL Server Driver]Invalid Descriptor Index
It's strange because the connection must be valid, given that I can access the data through the GUI.
Any hints?
Congratulations, you have solved the issue:
"I eventually solved the issue by switching to another library, RODBC. It seems that there's a bug in the way odbc handles character columns; see: R DBI ODBC error: nanodbc/nanodbc.cpp:3110: 07009: [Microsoft][ODBC Driver 13 for SQL Server]Invalid Descriptor Index"
I'm posting this as an answer so that it can be beneficial to other community members.
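For what it's worth, a common workaround for the Invalid Descriptor Index error with the odbc package is to put long character or binary columns at the end of the SELECT list, since the SQL Server driver needs to fetch those last. A sketch reusing con from the question; the column names are hypothetical:
# Fixed-width columns first, any (n)varchar(max) / varbinary(max) columns last
result <- dbSendQuery(con, "SELECT Id, UserName, LongProfileText FROM AspNetUsers")
first_100 <- dbFetch(result, n = 100)
dbClearResult(result)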

Connect to MSSQL using DBI

I cannot connect to MSSQL using the DBI package.
I am trying the approach shown in the package documentation itself:
m <- dbDriver("RODBC") # error
Error: could not find function "RODBC"
# open the connection using user, password, etc., as
# specified in the file $HOME/.my.cnf
con <- dbConnect(m, dsn="data.source", uid="user", pwd="password")
Any help appreciated. Thanks.
As an update to this question: RStudio has since created the odbc package (also available on GitHub), which handles ODBC connections to a number of databases through DBI. For SQL Server you use:
con <- DBI::dbConnect(odbc::odbc(),
driver = "SQL Server",
server = <serverURL>,
database = <databasename>,
uid = <username>,
pwd = <passwd>)
You can also set a dsn or supply a connection string.
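For example, with a DSN defined in the ODBC Administrator, or with a full connection string (the DSN name and all values below are placeholders):
# Connect via a DSN
con <- DBI::dbConnect(odbc::odbc(), dsn = "my_dsn", uid = "username", pwd = "passwd")

# Or supply the whole connection string
con <- DBI::dbConnect(
  odbc::odbc(),
  .connection_string = "Driver={SQL Server};Server=myserver;Database=mydb;Uid=username;Pwd=passwd;"
)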
It looks like there used to be an RODBC driver for DBI, but not any more:
http://cran.r-project.org/src/contrib/Archive/DBI.RODBC/
A bit of tweaking got this to install under R 3.x, but I don't have any ODBC sources to test it on. At least m = dbDriver("RODBC") doesn't error:
> m = dbDriver("RODBC")
> m
<ODBCDriver:(29781)>
>
I suggest you ask on the R-sig-db mailing list to find out what happened to this code and/or its author.
Solved.
I used the RODBC library. It has great functionality for connecting to SQL Server and running SQL queries from R.
Loading the library:
library(RODBC)

# connString is a connection string with the user ID, database name, password, etc.
dbhandle <- odbcDriverConnect(connString)
Running a SQL query:
sqlQuery(channel = dbhandle, query)
That's it.
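For reference, a sketch of what that connection string can look like (server, database and credentials are placeholders):
library(RODBC)

connString <- "driver={SQL Server};server=myserver;database=mydb;uid=username;pwd=passwd"
dbhandle <- odbcDriverConnect(connString)

result <- sqlQuery(channel = dbhandle, query = "SELECT TOP 10 * FROM MyTable")
odbcClose(dbhandle)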

R DBI / RPostgreSQL-- connection succeeds but dbListTables returns no tables

The following code connects to my PostgreSQL database successfully (or appears to, at any rate), but attempts to issue queries were met with "relation does not exist" errors, so I tried dbListTables, which doesn't return any tables at all. The database name passed to dbConnect is correct, and the tables do exist. As far as I can tell, the code I'm using is exactly the same as what I was using recently, which worked. Any ideas?
> library(RPostgreSQL)
Loading required package: DBI
> drv <- dbDriver("PostgreSQL")
> con <- dbConnect(drv, dbname="mydb", user="user", password=password)
> dbListTables(con)
character(0)
I'm new to both R and DBI, so I'm sure I could be missing something extremely simple...any help would be appreciated.
Solved-- I was right; it was something incredibly simple (and very, very stupid) on my part. I was running the script from the wrong server. The server I was running it from has an empty copy of the database I was attempting to connect to, so everything succeeded, and the empty result from dbListTables was correct. Once I switched servers (or simply specified the host on the other server), everything worked.
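For anyone hitting the same thing, explicitly passing the host to dbConnect looks like this (the host name and credentials are placeholders):
library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, dbname = "mydb", host = "db.example.com",
                 user = "user", password = "password")
dbListTables(con)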
1. Connect to MySQL
a) Make sure MySQL is installed on your system; if not, install it.
b) Install the RMySQL package in R.
library(RMySQL)

# Load the MySQL driver (the driver name is just "MySQL")
drv = dbDriver("MySQL")

# Use localhost (or the server's IP address) plus the required
# database name, user name and password
con = dbConnect(drv, host = "localhost", dbname = "test",
                username = "root", password = "root")

# Run the required query
album = dbGetQuery(con, statement = "select * from table")

# Close the connection when finished
dbDisconnect(con)
2. Another way: connect to the database through ODBC
a) First install any database such as MySQL, Oracle or SQL Server.
b) Install the ODBC connector/driver for that database.
library(RODBC)

# "test" is the name of the ODBC DSN, which you have to set up manually;
# you can find and create DSNs in the ODBC Administrator tool
channel <- odbcConnect("test", uid = "ripley", pwd = "secret")

# A whole table can be retrieved as a data frame
res <- sqlFetch(channel, "table name")

# Part of a table (e.g. with a WHERE condition) can be retrieved
# as a data frame with a select query
res <- sqlQuery(channel, paste("select query"))

# Save a data frame to the database (don't assign the result,
# i.e. no "res <-" here)
sqlSave(channel, dataframe)

# Other helpers you can use: sqlCopy(), sqlDrop(), sqlTables()

# Always close the connection
close(channel)
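For example, two of the helpers mentioned above can be used like this (the table name is a placeholder):
# List the tables visible on this connection
sqlTables(channel)

# Drop a table by name
sqlDrop(channel, "MyOldTable")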

Why won't RODBC upload a dataframe to SQL Server?

library(RODBC)
con <- odbcDriverConnect("driver=SQL Server; server=name")
df <- data.frame(a=1:10, b=10:1, c=11:20)
Trying to upload the dataframe:
sqlSave(con, df, tablename='[MyDatabase].[MySchema].[MyTable]', rownames=F)
>Error in sqlColumns(channel, tablename) :
‘MyDatabase.MySchema.MyTable’: table not found on channel
..alternatively creating the table first and then appending to it:
cmd <- "create table [MyDatabase].[MySchema].[MyTable] ([a] int, [b] int, [c] int)"
sqlQuery(con, cmd)
sqlSave(con, df, tablename='[MyDatabase].[MySchema].[MyTable]', rownames=F, append=T)
>Error in sqlSave(con, df, tablename = "MyTable", rownames = F, :
42S01 2714 [Microsoft][ODBC SQL Server Driver][SQL Server]There is already an object named MyDatabase.MySchema.MyTable in the database.
[RODBC] ERROR: Could not SQLExecDirect 'CREATE TABLE MyDatabase.MySchema.MyTable ("a" int, "b" int, "c" int)'
What am I doing wrong?
If I add brackets, I also get an error.
If I instead use a connection string that specifies the database (so that I'm sure I'm in the correct database rather than master) and execute sqlSave(con, df, tablename='dbo.MyTable4', rownames=F) or sqlSave(con, df, tablename='MyTable5', rownames=F), it works.
When connecting to Microsoft SQL Server, it is essential to use odbcDriverConnect instead of odbcConnect in order to run sqlSave statements etc. Only odbcDriverConnect lets you specify a particular database in the connection string.
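A sketch of that, reusing the data frame from the question (the server and database names are placeholders):
library(RODBC)

# Name the target database in the connection string itself
con <- odbcDriverConnect("driver=SQL Server; server=name; database=MyDatabase")

df <- data.frame(a = 1:10, b = 10:1, c = 11:20)

# With the database set on the connection, a plain table name works
sqlSave(con, df, tablename = "MyTable", rownames = FALSE)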
RODBC looks at the default folder for the ODBC server connection to see if you have write permissions there (even if you are going for a subdirectory). If you don't have master permissions, it fails.
I had to create two connections within R: one for reading from the master and one for writing to my temp directory.
These were set up as two server connections using my local computer's ODBC Administration (within Win7):
- One that defaults to the write-protected master server directory, which I use to pull read-only data.
- One that defaults to the server directory that I have write/drop permissions on.
In other words, your problem is solved by changing your machine's ODBC connection and having R point to the new server connection you will make (the one that defaults to your write-permissioned location).
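A sketch of what that looks like from the R side, assuming the two DSNs described above have been created in the ODBC Administrator (DSN and table names are placeholders):
library(RODBC)

# Read-only connection: DSN that defaults to the master directory
con_read  <- odbcConnect("MyReadOnlyDSN")

# Write connection: DSN that defaults to a location you can write to
con_write <- odbcConnect("MyWritableDSN")

df <- sqlQuery(con_read, "SELECT * FROM SomeTable")
sqlSave(con_write, df, tablename = "MyTable", rownames = FALSE)

close(con_read)
close(con_write)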
