I am trying to connect to an Oracle database with dplyr and the dbplyr backend. However, since the upgrade of dbplyr to version 2.0.0 it no longer works, and I get the following error.
x <- tbl(con, in_schema("dm", "DM_CLT_POS_OVL_LIAB_HEDGE"))
Error in .oci.GetQuery(conn, statement, data = data, prefetch = prefetch, :
ORA-00942: table or view does not exist
In the previous version of dbplyr I could make it work with the redirection functions (see below), but they have been removed with the upgrade.
# below are required to make the translation done by dbplyr to SQL produce working Oracle SQL
sql_translate_env.OraConnection <- dbplyr:::sql_translate_env.Oracle
sql_select.OraConnection <- dbplyr:::sql_select.Oracle
sql_subquery.OraConnection <- dbplyr:::sql_subquery.Oracle
Any help would be appreciated.
I came across the same issue using the RJDBC package instead of an ODBC-driver-based approach and found the solution for it on the GitHub page of dbplyr.
I'm not sure whether it works and cannot try it myself, but you might try to replace your assignments with these:
sql_translate_env.OraConnection <- dbplyr:::sql_translation.Oracle
sql_select.OraConnection <- dbplyr:::sql_query_select.Oracle
sql_subquery.OraConnection <- dbplyr:::sql_query_wrap.Oracle
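For completeness, a sketch of how the full workaround might look in a session (this assumes dbplyr's backward-compatibility dispatch picks up methods defined on the old generic names, and that con is an existing ROracle connection as in the question):
library(dplyr)
library(dbplyr)
# Point the old method names at the internals renamed in dbplyr 2.0.0
sql_translate_env.OraConnection <- dbplyr:::sql_translation.Oracle
sql_select.OraConnection <- dbplyr:::sql_query_select.Oracle
sql_subquery.OraConnection <- dbplyr:::sql_query_wrap.Oracle
# With the methods in place, the original call should translate to working Oracle SQL
x <- tbl(con, in_schema("dm", "DM_CLT_POS_OVL_LIAB_HEDGE"))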
I've had a working R script that uses the dbWriteTable command to write to a SQL Server table. It has worked well without issue for a while ... until the last few days.
Now when I run the dbWriteTable command, I get the following warning:
Found more than one class "blob" in cache; using the first, from namespace 'blob'
Also defined by ‘jsonlite’
Interestingly enough, the table appears to write successfully.
Here is some sample code:
library("DBI")
db_test <- dbConnect(
odbc(),
driver = "SQL Server",
server = "test_server",
port = 1234,
database = "test_db"
)
dbWriteTable(
  conn = db_test,
  name = SQL("dbo.swc_test_write_table"),
  value = df_test,
  overwrite = TRUE
)
I've tried explicitly naming the package, DBI::dbWriteTable, but it throws the same warning. For reference, I'm not using the jsonlite package, but I have it installed.
Any thoughts on why this is happening?
This seems to be a bug caused by the jsonlite package in its release 1.7.3.
See the bug report: https://github.com/jeroen/jsonlite/issues/373
It has been fixed upstream, and as of now there is an updated version available; see the changelog from 1.7.3 to 1.8.0.
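If you want to verify this is the cause, a quick check along these lines should do (assuming the fix shipped with 1.8.0, as the changelog suggests):
packageVersion("jsonlite")    # 1.7.3 is the affected release
install.packages("jsonlite")  # update to >= 1.8.0, then restart R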
Same problem here; it seems to be interfering with encoding, possibly.
EDIT:
Sorry for the "me too" post. I checked around a bit more, and for me at least it is caused by loading the tidyverse library. Loading only DBI and odbc resolves the warning for me.
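In other words, a minimal session that avoids the warning might look like this (reusing the hypothetical connection details from the question):
# Load only what the database write needs; skip library(tidyverse)
library(DBI)
library(odbc)
db_test <- dbConnect(odbc(), driver = "SQL Server", server = "test_server",
                     port = 1234, database = "test_db")
dbWriteTable(db_test, SQL("dbo.swc_test_write_table"), df_test, overwrite = TRUE)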
I am using the dplyr package to connect to PostgreSQL, and my code is below:
# Connect to local PostgreSQL via dplyr
library('dplyr')
localdb <- src_postgres(dbname = 'postgres',
                        host = 'localhost',
                        port = 5432,
                        user = 'postgres',
                        password = '236236')
# Write table from R to PostgreSQL
iris <- as.data.frame(iris)
dbWriteTable(localdb$con, 'iris', iris, row.names = FALSE)
The connection is successful, but after about 5 minutes a message pops up saying "Auto-disconnecting postgres connection (4308, 1)". I am not sure where this issue comes from, and I need to deal with large data that takes more than 5 minutes to write to PostgreSQL, so I want to know how to solve this auto-disconnect issue.
I've had similar problems with src_sqlite(). The error that I got was Auto-disconnecting SQLiteConnection. Apparently, the usage of the src_* functions is now discouraged (from the documentation of tbl()):
However, modern best practice is to use tbl() directly on an DBIConnection.
Before, I was using the code below. The code itself didn't return any errors. However, after using the same db again, I would get the error Auto-disconnecting SQLiteConnection.
path <- 'C:/sql.db'
sql_data <- src_sqlite(path)
# work on the sql_data variable ('my_table' is a placeholder table name)
tbl(sql_data, 'my_table')
DBI::dbDisconnect(sql_data$con)
As already said, the usage of src_sqlite is discouraged. Using the preferred code below solved my issue.
sql_data <- DBI::dbConnect(RSQLite::SQLite(), dbname = path)
# work on the sql_data variable ('my_table' is a placeholder table name)
tbl(sql_data, 'my_table')
DBI::dbDisconnect(sql_data)
Note that the disconnect statement changed slightly and that the SQLite() arguments are in fact dbConnect() arguments.
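Applied to the PostgreSQL question above, the same pattern might look like this (a sketch that assumes the RPostgres driver; the connection details are the ones from the question):
library(dplyr)
con <- DBI::dbConnect(RPostgres::Postgres(), dbname = 'postgres',
                      host = 'localhost', port = 5432,
                      user = 'postgres', password = '236236')
DBI::dbWriteTable(con, 'iris', iris, row.names = FALSE, overwrite = TRUE)
iris_db <- tbl(con, 'iris')  # lazy table reference, no src_* auto-disconnect
DBI::dbDisconnect(con)       # disconnect explicitly when done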
I have a MySQL database (which I'm very new to) hosted on Azure, which I'm trying to write data to from RStudio using the RMySQL package.
I am receiving the following message
> dbWriteTable(searchdb,"searchrecords",data, append = TRUE)
Error in .local(conn, statement, ...) :
could not run statement: The used command is not allowed with this MySQL version
Can anybody provide a suggestion as to why this may be?
I'm working with Mac OS 10.9.2 and R version 3.0.2.
I used dbDriver() and dbConnect() to initiate the connection to my database. Next, I tried to connect to my postgres database using
c = readOGR("PG:dbname=OB", layer="geo.countries")
This does not work, and always returns a "Cannot open file" error.
I understood from https://stat.ethz.ch/pipermail/r-sig-geo/2010-January/007519.html that the reason for this is the absence of a driver for PostgreSQL, as can be seen by using the command ogrDrivers().
Can anybody help me with installing the driver, or with making this work? Any help is much appreciated!
Thanks!
In the absence of the right driver, gdal/ogr usually throws an error like
Unable to find driver PostgreSQL
First, make sure that the database and layer exist. If it's true that the driver for Postgres isn't installed, you'll have to re-install gdal. Using homebrew:
brew uninstall gdal
brew install gdal --with-postgresql
See also this question.
If you are really sure that gdal is properly installed, make sure that your dsn is fully specified (it has more parameters than dbname):
dsn="PG:dbname=DB host=HOST user=USER password=PSSWD port=5432"
Moreover, if your table contains more than one spatial column (layer), you have to specify the one you want:
layer = "DB.TABLE(YOUR_SPATIAL_COLUMN)"
It took me a while to find out, but it was obvious after calling the function
ogrListLayers()
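Putting the pieces together, the workflow might look like this (host, user, password, and the geometry column name geom are assumptions, not values from the question):
library(rgdal)
dsn <- "PG:dbname=OB host=localhost user=USER password=PSSWD port=5432"
ogrListLayers(dsn)  # inspect the available layers / spatial columns
countries <- readOGR(dsn, layer = "geo.countries(geom)")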
It works for me. Just one issue: if you have a raster column in your table, it will not be excluded (as all other layers/spatial columns are). Instead, it will be loaded into the spatialobject@data data frame as a text column. Quite annoying in the case of big rasters. I have posted a question about this.
Just updated to R 3.0 and updated all the packages, including DBI. To my surprise, a script that I often use stopped working.
I am unable to connect to a MySQL database using dbConnect. The code fails instantly, so only a few lines will reproduce the problem:
> require("RMySQL")
> m = dbDriver("MySQL")
> dbConnect(m, user = 'user', password = 'pass', dbname = 'dbname', host = 'localhost', client.flag = CLIENT_MULTI_STATEMENTS)
Error in as.integer(from) :
cannot coerce type 'S4' to vector of type 'integer'
Calls: dbConnect ... mysqlNewConnection -> isIdCurrent -> as -> asMethod
I also tried it as:
dbConnect(MySQL(), user = 'user', password = 'pass', dbname = 'dbname', host = 'localhost', client.flag = CLIENT_MULTI_STATEMENTS)
but the same problem occurs.
I also tried removing other parameters, but the same issue comes from the dbDriver.
What changed in the DBI package with the latest update? How can I fix this?
I noticed that the DBI package is orphaned, so I don't know whom to ask.
I had the same issue with R 3.0.1 on ubuntu.
Installing the latest version of the RMySQL package resolved the problem:
> install.packages("RMySQL")
Make sure to restart R after the installation.
I'm still digging into this, but I think I've identified multiple causes of the issue. At their root, they all have to do with R expecting an S4 object but getting back an integer instead. I believe these generally result from the connection failing to establish.
Why is it failing? One thing I've noticed is that if you fail to close too many of your connections (~16 open; see the maximum number of connections specified in the driver handle call), DBI won't/can't open a new connection. Make sure you are calling dbDisconnect as needed. Usually this sort of problem results in a sensible error message, but it sometimes results in the error referenced above.
If possible, access the DB through an abstraction layer, e.g. dplyr, as some of them monitor db connections and kill them if they are inactive. Whereas, AFAIK, if you open a connection inside a function and the function breaks, you have no way to close the open connection unless you returned the driver object from your initial call to dbConnect. In that case you have no choice but to restart your instance of R (possibly resetting your machine and clearing your workspace as well).
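A defensive pattern along those lines (a sketch; the credentials are placeholders) is to tie the disconnect to the function's exit so it runs even on error, and to sweep up any connections a broken call may have leaked:
library(DBI)
library(RMySQL)
query_db <- function(sql) {
  con <- dbConnect(MySQL(), user = 'user', password = 'pass',
                   dbname = 'dbname', host = 'localhost')
  on.exit(dbDisconnect(con), add = TRUE)  # closes the connection even if the query errors
  dbGetQuery(con, sql)
}
# Close any RMySQL connections still open in this session
lapply(dbListConnections(MySQL()), dbDisconnect)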
The other issue I recently encountered is that if RMySQL masks RPostgreSQL, then RPostgreSQL will fail. The reverse does not appear to be the case, but because others have mentioned RPostgreSQL in here as showing the same error message, it seemed worthy of note.
Update
The latest version of RMySQL (0.10.1) seems to have finished off RPostgreSQL: RPostgreSQL now fails to work regardless of load order. The same people working on RMySQL appear to be working on RPostgres (https://github.com/rstats-db/RPostgres), and this conflict appears to be a non-issue when using that package instead of RPostgreSQL. Specifically, use RPostgres::Postgres() in place of RPostgreSQL::PostgreSQL() when specifying the driver in dbConnect. Other packages, e.g. dplyr, currently assume RPostgreSQL, so this issue can still bite (but it seems a resolution is in the works: https://github.com/rstats-db/RMySQL/issues/28).
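For example, a minimal connection with the newer driver might look like this (the connection details are placeholders):
library(DBI)
con <- dbConnect(RPostgres::Postgres(), dbname = 'dbname', host = 'localhost',
                 user = 'user', password = 'pass')
dbDisconnect(con)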