I'm trying to automate data downloads from a database with RJDBC inside a for loop. The database I'm using automatically closes the connection every 10 minutes, so what I want to do is catch the error, remake the connection, and continue the loop. To do this I need to capture the error somehow; the problem is that it does not behave like a regular R error, so tryCatch and similar commands don't work. I just get text on the console telling me:
Error in .jcheck() : No running JVM detected. Maybe .jinit() would help.
How do I handle this, in terms of:
if (output == ERROR) {remake connection and run dbQuery} else {run dbQuery}
Thanks for any help.
You could use the pool package to abstract away the logic of connection management.
This does exactly what you expect regarding connection management with DBI.
It should work with RJDBC, which is an implementation of DBI, but I didn't test it with this driver.
library(pool)
library(RJDBC)

conn <- dbPool(
  drv = RJDBC::JDBC(...),   # driver class and classpath go here
  url = "jdbc:...",         # RJDBC connects through a JDBC URL rather than dbname/host
  user = "test",
  password = "test"
)
# close the pool when done (note: on.exit only takes effect inside a function)
on.exit(poolClose(conn))

dbGetQuery(conn, "select... ")
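If you would rather handle the reconnection yourself, a retry wrapper is a possible alternative. This is only a sketch, assuming the JVM error surfaces as a normal R condition that tryCatch can see (the question suggests it may not always); make_conn() and its driver settings are placeholders for your actual setup:

library(RJDBC)
# placeholder helper that (re)builds the JDBC connection; substitute your
# real driver class, classpath, URL and credentials
make_conn <- function() {
  drv <- JDBC("driver.class.Name", "path/to/driver.jar")
  dbConnect(drv, "jdbc:yourdb://host:port/db", "user", "password")
}
conn <- make_conn()
for (q in queries) {  # queries is a placeholder vector of SQL strings
  res <- tryCatch(
    dbGetQuery(conn, q),
    error = function(e) {
      # the connection was dropped: rebuild it and retry the query once
      conn <<- make_conn()
      dbGetQuery(conn, q)
    }
  )
  # ... process res ...
}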
I have the following code:
drv <- RPostgreSQL::PostgreSQL()
con <- DBI::dbConnect(drv, dbname = 'dbname', user = 'user',
host = 'host.name', port = 5432, password = 'password')
When I run it on the server (Ubuntu Server 16.04 with the latest updates) that runs the database, I get the following error:
Error in .valueClassTest(ans, "data.frame", "dbGetQuery") :
invalid value from generic function ‘dbGetQuery’, class “NULL”, expected “data.frame”
But when I run R from the command line with sudo, it works; when I run it from my laptop connecting to the DB on the server, it also works. So it shouldn't be a connection problem. I am thinking about a problem with access rights to some libraries/executables/configs on the system? Any help will be appreciated.
When I run dbConnect multiple times and it ends with the error, and I then run drv_info <- RPostgreSQL::dbGetInfo(drv), I still get multiple connectionIds in drv_info:
drv_info <- RPostgreSQL::dbGetInfo(drv)
> drv_info
$drvName
[1] "PostgreSQL"
$connectionIds
$connectionIds[[1]]
<PostgreSQLConnection>
$connectionIds[[2]]
<PostgreSQLConnection>
$fetch_default_rec
[1] 500
$managerId
<PostgreSQLDriver>
$length
[1] 16
$num_con
[1] 2
$counter
[1] 2
Found a source of confusion, but not necessarily the root problem. (I was assuming RPostgres, while you are using RPostgreSQL (GitHub mirror).)
If you check the source, you'll find that the method is calling postgresqlNewConnection, which includes a call to dbGetQuery. The problem you're seeing is that your call to dbConnect is failing (my guess is at line 100) and returns something unexpected, but postgresqlNewConnection continues.
I see three options for you:
try calling dbConnect(..., forceISOdate=FALSE) to bypass that one call to dbGetQuery (note that this doesn't fix the connection problem, but it at least will not give you the unexpected query error on connection attempt);
raise an issue for the package maintainers; or
switch to using RPostgres, which is still DBI-based and actively developed (it looks like RPostgreSQL has not had significant commits in the last few years; I'm not sure if that's a sign of good code stability or development stagnation).
One lesson you may take from this is that you should check the value returned from dbConnect; if is.null(con), something is wrong. (Again, this does not solve the connection problem.)
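A minimal sketch of both suggestions, reusing the connection parameters from the question (forceISOdate = FALSE is the bypass from option 1):

drv <- RPostgreSQL::PostgreSQL()
# forceISOdate = FALSE skips the dbGetQuery call made during connection setup
con <- DBI::dbConnect(drv, dbname = 'dbname', user = 'user',
                      host = 'host.name', port = 5432, password = 'password',
                      forceISOdate = FALSE)
if (is.null(con)) stop("dbConnect returned NULL; the connection itself failed")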
I am using the Cassandra CQL system in the DBeaver database tool. I want to connect Cassandra to R to read data. Unfortunately the connection takes a very long time (I waited for more than 2 hours) with the RCassandra package; it does not seem to get connected at all and is still loading. Does anyone have any idea about this?
The code is as follows:
library(RCassandra)
rc <- RC.connect(host ="********", port = 9042)
RC.login(rc, username = "*****", password = "******")
After this RC.login step, it is still loading for more than 2 hours.
I have also tried the RJDBC package as posted here: How to read data from Cassandra with R?.
library(RJDBC)
drv <- JDBC("org.apache.cassandra.cql.jdbc.CassandraDriver",
list.files("C:/Program Files/DBeaver/jre/lib",
pattern="jar$",full.names=T))
But this throws an error:
Error in .jfindClass(as.character(driverClass)[1]) : class not found
None of the answers from the above link are working for me. I am using the latest R version, 3.4.0 (2017-04-21), and a new version of DBeaver, 4.0.4.
For your first approach, which I am less familiar with, should you not have a line that sets the keyspace the connection should use?
such as:
library(RCassandra)
c <- RC.connect(host ="52.0.15.195", port = 9042)
RC.login(c, username = "*****", password = "******")
RC.use(c, "some_db")
Did you check the logs to make sure you are not getting some silent error while connecting?
For your second approach, your R program is not seeing the driver on the Java (JVM) classpath.
See this entry for help on how to fix it.
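As a sketch, the usual fix is to point JDBC() at the actual Cassandra JDBC driver jar rather than at DBeaver's jre/lib folder, which holds the Java runtime's own jars, not database drivers. The path and jar name below are assumptions for illustration:

library(RJDBC)
# classPath must contain the Cassandra JDBC driver jar you downloaded;
# the location and file name here are placeholders
drv <- JDBC("org.apache.cassandra.cql.jdbc.CassandraDriver",
            classPath = "C:/drivers/cassandra-jdbc-driver.jar")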
I am using the dplyr package to connect to PostgreSQL, and my code is as below:
# Connect to local PostgreSQL via dplyr
library('dplyr')
localdb <- src_postgres(dbname = 'postgres',
host = 'localhost',
port = 5432,
user = 'postgres',
password = '236236')
# Write table from R to PostgreSQL
iris <- as.data.frame(iris)
dbWriteTable(localdb$con, 'iris', iris, row.names = FALSE)
The connection is successful, but after about 5 minutes a message popped up saying "Auto-disconnecting postgres connection (4308, 1)". I am not sure where this issue comes from, and I need to work with large data that takes more than 5 minutes to write to PostgreSQL, so I want to know how to solve this auto-disconnecting issue.
I've had similar problems with src_sqlite(). The error that I got was Auto-disconnecting SQLiteConnection. Apparently, the usage of the src_* functions is now discouraged (from the documentation of tbl()).
However, modern best practice is to use tbl() directly on a DBIConnection.
Before I was using the code below. The code in itself didn't return any errors. However, after using the same db again, I would get the error Auto-disconnecting SQLiteConnection.
path <- 'C:/sql.db'
sql_data <- src_sqlite(path)
# work on the sql_data variable, e.g. ("table_name" is a placeholder)
tbl(sql_data, "table_name")
DBI::dbDisconnect(sql_data$con)
As already said, the usage of src_sqlite is discouraged. Using the preferred code below solved my issue.
sql_data <- DBI::dbConnect(RSQLite::SQLite(), dbname = path)
# work on the sql_data variable, e.g. ("table_name" is a placeholder)
tbl(sql_data, "table_name")
DBI::dbDisconnect(sql_data)
Note that the disconnect statement is slightly different (no $con) and that the SQLite() arguments are in fact dbConnect() arguments.
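The same pattern should apply to the PostgreSQL case in the question; here is an untested sketch using plain DBI instead of src_postgres(), with the parameters copied from the question:

library(DBI)
con <- dbConnect(RPostgreSQL::PostgreSQL(),
                 dbname = 'postgres', host = 'localhost',
                 port = 5432, user = 'postgres', password = '236236')
# write the table through the plain DBI connection, then disconnect explicitly
dbWriteTable(con, 'iris', as.data.frame(iris), row.names = FALSE)
dbDisconnect(con)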
I am trying to follow the "Getting started with MongoDB in R" page to get a database up and running. I have MongoDB on my PATH, so I am able to run mongod from the terminal and open an instance. But when I open an instance in the background and try running the following commands in R:
library(mongolite)
m <- mongo(collection = "diamonds")  # diamonds is a built-in dataset
It throws an error after that last statement saying:
Error: No suitable servers found (`serverSelectionTryOnce` set): [Failed to resolve 'localhost']
How do I enable it to find the connection I have open? Or is it something else? Thanks.
It could be that mongolite is looking in the wrong place for the local server. I solved this same problem for myself by explicitly adding the local host address in the connection call:
m <- mongo(collection = "diamonds", url = "mongodb://127.0.0.1")
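After connecting, the tutorial's example data can be inserted and queried as a quick check; something along these lines (assumes the ggplot2 package is installed, since that is where the diamonds data lives):

library(mongolite)
m <- mongo(collection = "diamonds", url = "mongodb://127.0.0.1")
m$insert(ggplot2::diamonds)  # load the example data
m$count()                    # should match the number of inserted rows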
The following code connects to my PostgreSQL database successfully (or appears to, at any rate), but attempts to issue queries were met with "relation does not exist" errors, so I tried dbListTables, which doesn't return any tables at all. The database name passed to dbConnect is correct, and the tables do exist. I think the code I'm using is exactly the same as what I was using recently, which worked successfully. Any ideas?
> library(RPostgreSQL)
Loading required package: DBI
> drv <- dbDriver("PostgreSQL")
> con <- dbConnect(drv, dbname="mydb", user="user", password=password)
> dbListTables(con)
character(0)
I'm new to both R and DBI, so I'm sure I could be missing something extremely simple...any help would be appreciated.
Solved: I was right; it was something incredibly simple (and very, very stupid) on my part. I was running the script from the wrong server. The server I was running it from has an empty copy of the database I was attempting to connect to, so everything succeeded, and the empty result from dbListTables was correct. Once I switched servers (or simply specified the host on the other server), everything worked.
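For reference, specifying the host explicitly looks like this (the host name is a placeholder):

con <- dbConnect(drv, dbname = "mydb", host = "db.example.com",
                 user = "user", password = password)
dbListTables(con)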
1. Connect to MySQL
a) Make sure MySQL is installed on your system; if it is not, install it.
b) Install the RMySQL package in R.
library(RMySQL)
# dbDriver takes the driver name only, not a version string
drv <- dbDriver("MySQL")
# use "localhost" or the server's IP address, and fill in the required
# database name, user name and password
con <- dbConnect(drv, host = "localhost", dbname = "test",
                 user = "root", password = "root")
# run the required query
album <- dbGetQuery(con, statement = "select * from table")
# close the connection when done
dbDisconnect(con)
2. Another way to connect to a database
a) First install a database such as MySQL, Oracle, or SQL Server.
b) Install the ODBC connector for that database.
library(RODBC)
# "test" is the name of the ODBC data source, which the user has to set up
# manually (it can be found in the ODBC Data Source Administrator tool)
channel <- odbcConnect("test", uid = "ripley", pwd = "secret")
# a whole table can be retrieved as a data frame
res <- sqlFetch(channel, "table name")
# with a select query, part of a table can be retrieved as a data frame
res <- sqlQuery(channel, "select query")
# save a data frame to the database (don't assign the result, i.e. no "res <-")
sqlSave(channel, dataframe)
# other useful functions: sqlCopy(), sqlDrop(), sqlTables()
# always close the connection
close(channel)