RODBC error in Revolution R 64-bit on Windows XP 64-bit, connected to Oracle using a 64-bit ODBC driver through a DSN
library(RODBC)
db <- odbcConnect("oraclemiso",uid="epicedf",pwd="…")
rslts = sqlQuery(db, "select count(*) from FTRAuction")
Error in .Call(C_RODBCFetchRows, attr(channel, "handle_ptr"), max, buffsize, :
negative length vectors are not allowed
I am able to connect, but I get an error when I run a query.
The following, however, does work:
library(RODBC)
channel <- odbcConnect("OraLSH", <user>, <password>)
odbcQuery (channel, "select sysdate from dual")
sqlGetResults(channel, as.is=FALSE, errors=FALSE, max=1, buffsize=1,
nullstring=NA, na.strings="NA", believeNRows=TRUE, dec=getOption("dec"))
SYSDATE
1 2010-01-24 15:10:02
But what if I don't know the row count (max=1) beforehand?
Thanks,
Arun
believeNRows=FALSE seems to be the key. Best to use it when opening the connection:
db <- odbcConnect(dsn="testdsn", uid="testuser", pwd="testpasswd", believeNRows=FALSE )
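With believeNRows=FALSE set at connect time, the count query from the original question should run without the negative-length error. A minimal sketch, reusing the DSN, user, and table from the question (the password is a placeholder):
library(RODBC)
# believeNRows=FALSE tells RODBC not to trust the driver's SQLRowCount value
db <- odbcConnect(dsn = "oraclemiso", uid = "epicedf", pwd = "...", believeNRows = FALSE)
rslts <- sqlQuery(db, "select count(*) from FTRAuction")
odbcClose(db)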
When testing with unixODBC's isql, it reports SQLRowCount to be 4294967295 (even if there's just one row) on 64bit Linux while it reports -1 on 32 bit Linux. This is probably an optimization as it enables quicker answers. It saves the database the burden of retrieving the complete response data set immediately. E.g. there might be lots of records while only the first few hits will ever be fetched.
4294967295 is (2^32)-1, which is the maximum value for an unsigned int, but will be treated as -1 by a signed int. Thus R complains about a vector with negative length.
So I assume it's an issue about signed vs. unsigned integer (or sizeof(long) between 32 and 64 bit).
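A quick illustration of that reading, straight from R (the all-ones 32-bit pattern is -1 when read as signed and 2^32 - 1 when read as unsigned):
bitwNot(0L)   # -1: signed two's-complement reading of the all-ones pattern
2^32 - 1      # 4294967295: unsigned reading of the same bit pattern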
Setting believeNRows=FALSE solved the issue for me so I can use the same R code on both systems.
BTW: I'm using R 2.10.1, RODBC 1.3.2, unixODBC 2.3.0 with Oracle 10.2.0.4 on Linux 64 bit.
Be sure to use
export CFLAGS="-DBUILD_REAL_64_BIT_MODE -DSIZEOF_LONG=8 -fshort-wchar"
when doing configure for unixODBC as the Oracle ODBC driver expects REAL_64_BIT_MODE, not LEGACY_64_BIT_MODE.
And be aware of internationalization issues: R uses $LANG while Oracle uses $NLS_LANG.
I experienced problems with UTF8 so I use e.g.
LANG=en_US; NLS_LANG=American_America
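If you would rather set those from within R than in the shell, something like the following (using the example values above) should work, as long as it runs before the connection is opened:
# Sketch: set locale-related environment variables before connecting
Sys.setenv(LANG = "en_US", NLS_LANG = "American_America")
library(RODBC)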
The error
Error in .Call(C_RODBCFetchRows, attr(channel, "handle_ptr"), max, buffsize, :
negative length vectors are not allowed
very much looks like a 32-bit / 64-bit porting issue, so I kindly suggest you get in touch with the two commercial vendors involved to have that fixed. I prefer direct database drivers over ODBC where available, but there is no reason why it shouldn't work, as 64-bit Linux merrily plays along.
Dirk is right -- RODBC doesn't support 64-bit drivers for Oracle, at least not as of a few months ago. You may be out of luck. We had a similar issue trying to get R to access an Oracle database from a 64-bit Linux box using the following tools: 64-bit R, RODBC, unixODBC, Oracle Instant Client. I asked the R-sig-db list, including the package author (Prof. Ripley) about this, and there was no conclusive answer. I then asked Revolution if they would be willing to solve the problem, if we were to purchase licenses from them (at 5-figures/year!), and they said no.
My company is now trying to minimize use of R to areas where it is best suited. We will be using other tools (web services, JVM-based systems) to access the database, and sharing data with R only when necessary.
The underlying problem is that very few major users of R also use Oracle. R is primarily used by academics (Excel, MySQL), finance types (Postgres), and more cutting-edge analytics teams. Oracle is used by old businesses that value reliability over innovation, the exact opposite of what most R users are looking for. So this explains, in my view, why support for Oracle has fallen away.
Try max=0 and believeNRows=FALSE - that worked for me.
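For reference, those arguments belong to sqlGetResults (in RODBC, max = 0 means no row limit), so the dual example from the question becomes roughly:
# Sketch: fetch without a row limit and without trusting the driver's row count
odbcQuery(channel, "select sysdate from dual")
sqlGetResults(channel, max = 0, believeNRows = FALSE)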
I'm a noob so forgive my ignorance :-)
I'm creating a Shiny app to perform read/write operations on an existing MS Access database (.mdb) with about 20 small tables and a variety of joins on them. I may have a small number of people connecting simultaneously.
I had planned to use MariaDB (RMariaDB), but I notice that e.g. RODBC or dplyr can connect to a .mdb file directly. The app will be hosted on a remote server.
It's unlikely the largest table will exceed 5000 rows in the next 2 years. Should I be using MariaDB now or would the other 'direct' options be enough?
Many thanks in advance for your replies...
Gary
This is not an answer to 'which db shall I use?', but it is a recommendation for a starting point:
https://db.rstudio.com/
The short answer is 'ensure DBI compatibility' and the slightly longer answer is 'for Shiny apps consider using the pool package'.
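As a rough sketch of that advice (the DSN and table name below are placeholders, not taken from the question):
library(DBI)
library(odbc)
library(pool)
# One pool for the whole Shiny app; connections are checked out per query
pool <- dbPool(odbc::odbc(), dsn = "access_dsn")   # placeholder DSN
# A pool can be used anywhere a DBI connection is expected
dbGetQuery(pool, "SELECT COUNT(*) AS n FROM some_table")
poolClose(pool)   # in a Shiny app, call this from shiny::onStop()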
I'm trying to connect to an Azure SQL Server (12.00.1900) from R on a Mac, using Microsoft's unixodbc SQL Server drivers (17).
I get a connection, but instead of seeing the 12 or so tables that live in the database, dbListTables returns 442 tables, all with nonsensical names, beginning with 'Csoe', 'Ote', and ending in 'xlshm_idad'. Instead of seeing the single schema that lives in the database, I see cin_1mro__e, IFRAINSHM, and s, none of which have any tables in them.
Note that when I use an ordinary SQL visualization app, that doesn't use the MS drivers, I'm able to see the tables and their content properly.
In addition, the RSQLServer package gets a working connection and sees the tables correctly, but isn't compatible with dplyr semantics.
Can anyone help or advise? I've looked for third party SQL Server unixodbc drivers for Mac, and I can't find any.
Until I see more info from the OP, I'll leave as my answer the general recommendation to use R's odbc package. Assuming the correct drivers are installed, the connection is configured correctly in odbc.ini, and trusted_connection=yes is used in that same file, then connecting from R is as simple as:
library(odbc)
dbConn <- dbConnect(odbc(), dsn = "myDSN")
If trusted connection is not on, then you just need to pass the uid and pwd arguments.
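For example, something along these lines (placeholder credentials):
# Sketch: same DSN, but with explicit credentials instead of a trusted connection
dbConn <- dbConnect(odbc::odbc(), dsn = "myDSN", uid = "my_user", pwd = "my_password")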
Also, it may be the case, OP, that you did not install FreeTDS, so try the following (replace with the equivalent for the package manager you're using):
brew install freetds --with-unixodbc
This gives you the libtdsodbc.so driver. Make sure the DSN points to this.
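You can also check from R whether unixODBC actually sees the driver and the DSN:
# List the drivers and data sources that unixODBC reports to the odbc package
odbc::odbcListDrivers()
odbc::odbcListDataSources()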
I need to connect Tableau to HBase or Phoenix and Tableau does not support JDBC. Bummer!
I've read about the proprietary Simba driver but haven't seen any reports of people using it. I don't feel like forking over money when it's not ideal, and my employer feels the same way.
Is there another way to connect Tableau to HBase or Phoenix? How are other people doing it? I don't like the idea of using Hive to connect to HBase because one of the main reasons to go away from Hive is its atrocious performance, so I hope that's not my 'best' alternative.
On the other hand, if people have used Simba and it works well, I'm curious to hear about that.
I am the developer on Simba's Phoenix driver. Hortonworks, Cloudera, Databricks, Microsoft, Amazon, Google, etc all choose Simba's drivers for a variety of products.
e.g. http://hortonworks.com/partner/simba/,
http://www.simba.com/news/databricks-offers-simba-technologies-developed-odbc-3-8-connectivity-sql-capability-apache-spark/
Also, you need to choose either Phoenix on HBase or HBase standalone for all of your applications. The two types of drivers encode data in different binary representations. Strings and unsigned integers will be interpreted correctly, but signed integers and more complex data types will be decoded differently.
e.g. Phoenix doesn't display negative integer values correctly
So if you use Phoenix JDBC for your applications, you cannot use an HBase ODBC driver with Tableau (unless you have nothing but strings and unsigned integers in your data source). From your other postings, you do use Phoenix JDBC, so HBase ODBC is not an option for you.
The CData ODBC driver will allow you to connect Tableau to HBase (full disclosure: I work for CData Software). You can download a free Beta here. We have an article that outlines the configuration and connection, but I've copied the relevant steps here:
Create/configure a DSN from the ODBC Driver by setting the server address and port (we use the REST API, so the default port is 8080)
You should click the "Test Connection" button in the DSN Configuration wizard to ensure that you can establish a proper connection to your HBase database.
Click through the "Connect to Data" options to find "Other Database (ODBC)" and select the DSN you configured
Select CData as the database
Enter a Table name (or leave the Table field blank and click search (the magnifying glass) to see a list of Tables).
Once you have access to the tables, you can work with them exactly as you would any other table in Tableau (drag the table to the join area, manipulate Measures and Dimensions to view your data, etc.). If you have any questions, I or our Support Team will be happy to help.
I'm new to Cassandra and R. When I connect to a Cassandra database using the RCassandra package, the connection is established, but when I try to use any keyspace, R stops responding. I used the following statements:
c <- RC.connect('192.168.1.20', 9042)
RC.use(c, 'effesensors')
Please give me a brief idea about how to use RCassandra to avoid this problem.
Are you aware that you may be using a non-default port for Cassandra? If you can provide the Cassandra version and the RStudio version, I may be able to update my answer. I found this tutorial by tarkalabs useful as a checklist of steps to take before any connection is attempted.
From the tutorial:
Now connect to your database with
connect.handle <- RC.connect(host="127.0.0.1", port=9160)
Cassandra by default listens on port 9160, but you can change it according to your configuration. To show the cluster name, type into your prompt:
RC.cluster.name(connect.handle)
Just to verify that you are connected and your Cassandra instance is running try the following command:
RC.describe.keyspaces(connect.handle)
That should bring back a list of the settings in your keyspaces. If nothing returns, you are either not connected or your Cassandra instance is not properly installed.
EXAMPLE OUTPUT
$system_traces$strategy_options
replication_factor
"2"
$system_traces$cf_defs
named list()
$system_traces$durable_writes
[1] TRUE
Let me know what your results are if my answer does not work and I will update my answer. Good Luck!
Make use of RODBC instead of RCassandra. You need to install the Cassandra ODBC driver.
Thanks @D. Venkata Naresh, your suggestion of using the RODBC driver resolved my issue.
I am using R and the DataStax Cassandra community edition.
This is the link I followed to configure the ODBC driver in my windows machine.
https://www.datastax.com/dev/blog/using-the-datastax-odbc-driver-for-apache-cassandra
Then, in RStudio, these are the commands to connect to and fetch from Cassandra:
install.packages("RODBC")
library("RODBC")
# open the DSN created for the Cassandra ODBC driver
conn <- odbcConnect(<ODBC datasource name>)
# pull an entire column family / table into a data frame
dataframe <- sqlFetch(conn, <column family / table name>)
dataframe
Hope, this answer helps someone who is facing issue with RCassandra.
I read your comments above; you are using the wrong port. You should run the following command:
c <- RC.connect('192.168.1.20', 9160)
This will definitely work for you.
I want to be able to query an Exchange address book via ODBC. It seems like there ought to be a driver available for this purpose, but I can't find one.
MS Access can link the Exchange table, and I can then query the Access database via ODBC, but it is pitifully slow.
For the record, I'm not programming, so I don't need ADO connection strings or what have you. The database software I'm using (Drawbase, a space database for facilities management) needs an ODBC system data source, so I need an appropriate ODBC driver in order to create one.
There is one here:
http://www.connect-gate.com/index.php/en/why-connect-gate/sql-for-communication
It's quite fast and covers the most important SQL statements.
I've done a fair amount of ODBC work and also run Exchange and I've never heard of one. :(
CData seems to provide the driver now: https://www.cdata.com/drivers/exchange/odbc/ - not sure if this is the same as the Connect Gate driver above or not.