How do I fix the warning message "Closing open result set, cancelling previous query" when querying a PostgreSQL database in R?

Below is a snippet of my code that I use in R to extract IDs from a PostgreSQL database. When I run the function I get the following warning message from R:
In result_create(conn@ptr, statement) :
Closing open result set, cancelling previous query
How do I avoid this warning message without resorting to options(warn = -1) at the beginning of my code, which would only suppress the warning instead of addressing its cause?
library(DBI)  # dbConnect(), dbSendQuery() and dbFetch() are DBI generics

con <- dbConnect(RPostgres::Postgres(),
                 user = "postgres",
                 dbname = "DataBaseName",
                 password = "123456",
                 port = 5431)

get_id <- function(connection, table){
  query <- toString(paste("SELECT id FROM ", table, sep = ""))
  data_extract_query <- dbSendQuery(connection, query)
  data_extract <- dbFetch(data_extract_query)
  return(data_extract)
}
get_id(con, "users")

I found a method for solving the problem.
I found a thread on GitHub for RSQLite: https://github.com/r-dbi/RSQLite/issues/143. In that thread, they explicitly set n = -1 in the dbFetch() function.
Editing the code as follows seemed to solve my problem, and the warning message did not show up again:
data_extract <- dbFetch(data_extract_query, n = -1)
Here n is the maximum number of rows the fetch should return; setting it to -1 retrieves all pending rows. The default is already n = -1, but for some reason, in this build of R (3.6.3), the warning is still shown unless the value is passed explicitly.
Calling ?dbFetch in R shows more information on this. I have included a snippet from the R help page:
Usage

dbFetch(res, n = -1, ...)
fetch(res, n = -1, ...)

Arguments

res  An object inheriting from DBIResult, created by dbSendQuery().
n    Maximum number of records to retrieve per fetch. Use n = -1 or n = Inf
     to retrieve all pending records. Some implementations may recognize
     other special values.
...  Other arguments passed on to methods.
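For illustration, here is a small sketch (not part of the original answer) of how n can also be used to pull a large result in chunks; it reuses the con connection and the users table from the question above:

res <- dbSendQuery(con, "SELECT id FROM users")
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 500)   # pull at most 500 rows per call
  print(nrow(chunk))
}
dbClearResult(res)                 # clear the result set when finished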

This issue also comes up with other database backends if a result set is not cleared before a new query is submitted. From the docs of DBI::dbSendQuery():
Usage
dbSendQuery(conn, statement, ...)
...
Value
dbSendQuery() returns an S4 object that inherits from DBIResult. The result set can be used with dbFetch() to extract records. Once you have finished using a result, make sure to clear it with dbClearResult(). An error is raised when issuing a query over a closed or invalid connection, or if the query is not a non-NA string. An error is also raised if the syntax of the query is invalid and all query parameters are given (by passing the params argument) or the immediate argument is set to TRUE.
To get rid of the warning, the get_id() function must be modified as follows:
get_id <- function(connection, table){
  query <- toString(paste("SELECT id FROM ", table, sep = ""))
  data_extract_query <- dbSendQuery(connection, query)
  data_extract <- dbFetch(data_extract_query)
  # Here we clear whatever remains on the server
  dbClearResult(data_extract_query)
  return(data_extract)
}
See Examples section in help for more.
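As a side note (a sketch under the same assumptions as the answer above, not the author's original code), on.exit() makes the cleanup more robust, and DBI::dbGetQuery() performs the whole send/fetch/clear cycle in one call; get_id2 below is just an illustrative name:

get_id <- function(connection, table){
  query <- paste0("SELECT id FROM ", table)
  data_extract_query <- dbSendQuery(connection, query)
  # Clear the result even if dbFetch() fails part-way through
  on.exit(dbClearResult(data_extract_query))
  dbFetch(data_extract_query)
}

# Or, equivalently, let DBI handle the whole cycle:
get_id2 <- function(connection, table){
  dbGetQuery(connection, paste0("SELECT id FROM ", table))
}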

Related

I am trying to extract values and place them in a .db file to use later; however, my code has broken and I can no longer even load SQLite.

Using:
-IronPython
-AutoDesk
-Revit (PyRevit)
-Revit API
-SQLite3
My code is as follows:
import sqlite3

try:
    conn = sqlite3.connect('SQLite_Python.db')
    c = conn.cursor()
    print("connected")
    Insert_Volume = """INSERT INTO Column_Coordinates
                       (x, y, z)
                       VALUES
                       (1, 2, 3)"""
    count = c.execute(Insert_Volume)
    conn.commit()
    print("Volume values inserted", c.rowcount)
    c.close()
except sqlite3.Error as error:
    print("Failed to insert data into sqlite table", error)
finally:
    if (conn):
        conn.close()
        print("The SQLite connection is closed")
This code used to work within PyRevit, but now does not, with the following error:
Exception : System.IO.IOException: Could not add reference to assembly IronPython.SQLite
Please advise; this is one of the early steps of a large project, so it is delaying my work quite a bit.
I look forward to your reply.

RODBC channel error in global environment

I've set up a connection between R and SQL using the package RODBC and managed to connect and query the database from R.
I've created a small R function whose objective is to delete some rows (passed as a parameter) from a specific table.
Here it is (nameDB is the name of my database, and values_conversion is another function I wrote to convert data from R format to SQL format):
delete_SQL = function(data, table){
  ch = odbcConnect(nameDB, "postgres")
  names = sqlColumns(channel = ch, table, schema = "public", catalog = nameDB)$COLUMN_NAME
  for(i in 1:nrow(data)){
    sqlQuery(channel = ch,
             query = paste0("DELETE FROM public.\"", table, "\" WHERE ",
                            paste0(names, " = ", values_conversion(data[i,]), collapse = " and "), ";"),
             errors = TRUE)
  }
  odbcCloseAll()
}
Query example: "DELETE FROM public.\"lieu_protection\" WHERE lieu_id = 3 and protection_id = 1430;"
The code inside this function works fine when I execute everything directly, but when I call the function it throws an
Error in sqlQuery(channel = ch, query = paste0("DELETE FROM public.\"", :
first argument is not an open RODBC channel
I have a similar function that's getting and returning data from SQL and which is working fine, so I guess it has something to do with the delete, but the error is about the channel, so I'm quite confused.
Thanks to anyone who can help!

Error in open.connection(con, "rb") : Timeout was reached: Resolving timed out after 10000 milliseconds

So I've got a list of "Player" objects, each with an ID, called players, and I'm trying to fetch a JSON feed with information related to each ID, using jsonlite.
The URL stem is: 'https://fantasy.premierleague.com/drf/element-summary/'
I need to access every player's respective page.
I'm trying to do so as follows:
library(jsonlite)   # fromJSON() comes from jsonlite

playerDataURLStem = 'https://fantasy.premierleague.com/drf/element-summary/'
for (player in players) {
  player_data_url <- paste(playerDataURLStem, player@id, sep = "")
  player_data <- fromJSON(player_data_url)
  # DO SOME STUFF #
}
When I run it, I'm getting the error Error in open.connection(con, "rb") : Timeout was reached: Resolving timed out after 10000 milliseconds. This error is produced at a different position in my list of players each time I run the code, and when I check the webpage that is causing the error, I can't see anything wrong with it. This leads me to believe that sometimes the pages just take longer than 10000 milliseconds to respond, but using
options(timeout = x)
for some x, doesn't seem to make it wait longer for a response.
For a minimum working example, try:
playerDataURLStem = 'https://fantasy.premierleague.com/drf/element-summary/'
ids <- c(1:540)
for (id in ids) {
  player_data_url <- paste(playerDataURLStem, id, sep = "")
  player_data <- fromJSON(player_data_url)
  print(player_data$history$id[1])
}
options(timeout = 4000000) is working for me. Try increasing the value of timeout to a higher number.
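For example, a sketch applying that suggestion to the minimal example above (the timeout value is the one from this answer; the tryCatch retry is an extra assumption, not something the answer requires):

library(jsonlite)

options(timeout = 4000000)   # raise the timeout before the loop runs

playerDataURLStem <- 'https://fantasy.premierleague.com/drf/element-summary/'
for (id in 1:540) {
  player_data_url <- paste0(playerDataURLStem, id)
  # Retry once on failure, in case a single request still times out
  player_data <- tryCatch(fromJSON(player_data_url),
                          error = function(e) fromJSON(player_data_url))
  print(player_data$history$id[1])
}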

R Scripts - Use Output Message as if condition

I am using the RGoogleAnalytics library to get all the data from my Google Analytics Account into R. However, complex queries deliver 0 results.
My code looks like:
query.list <- Init(start.date = paste(c(lastmonth.startdate)),
                   end.date = paste(c(lastmonth.enddate)),
                   metrics = "ga:goalCompletionsAll",
                   dimensions = "ga:countryIsoCode,ga:yearMonth",
                   filters = "ga:goalCompletionsAll>0",
                   max.results = 10000,
                   table.id = sprintf("ga:%s", sites$profile.id[i]))

# Create the Query Builder object so that the query parameters are validated
ga.query <- QueryBuilder(query.list)

# Extract the data and store it in a data frame
ga.countriesConversions1 <- GetReportData(ga.query, token)
Everything is inside a for loop, and the script stops if one of the queries ends in 0 results, because GetReportData(ga.query, token) cannot create a data frame if there is no data.
I would like to know if there is a way to capture the message ("Your query matched 0 results. Please verify your query using the Query Feed Explorer and re-run it") that the library prints to the console, assign it to a variable, and use it as an if condition, so that I could create a dummy data.frame before the next function runs.
Assuming GetReportData() is throwing an error, you can try:
ga.countriesConversions1 <- try(GetReportData(ga.query, token), silent = TRUE)
if(inherits(ga.countriesConversions1, "try-error")) {
  warning(geterrmessage())
  ... error handling logic ...
}
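For instance, the error branch could fall back to an empty placeholder data frame so the loop can keep going; a sketch, where the column names are only a guess based on the dimensions and metric queried above:

ga.countriesConversions1 <- try(GetReportData(ga.query, token), silent = TRUE)
if (inherits(ga.countriesConversions1, "try-error")) {
  warning(geterrmessage())
  # Dummy data frame standing in for the empty result (column names assumed)
  ga.countriesConversions1 <- data.frame(countryIsoCode     = character(0),
                                         yearMonth          = character(0),
                                         goalCompletionsAll = numeric(0))
}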

db operation exception handling in R

I am using the following function to select values from a table. The table name is given as a parameter to that function. If the table does not exist, an error is generated and execution stops. I want to do something if the table is not found. Is that possible in R? Something like a try-catch exception?
library('RSQLite')

getData <- function(portfolio){
  lite <- dbDriver("SQLite", max.con = 25)
  db <- dbConnect(lite, dbname = "portfolioInfo.db")
  res <- dbSendQuery(db, paste("SELECT * from ", portfolio, " ", sep = ""))
  data <- fetch(res)
  return(data)
}
getData("table1")
One way to do this would be to check the class of the data that is returned.
I am not familiar with RSQLite, but I guess it will return a data frame if the table is found and a text error when it is not?
So a possibility would be to check whether or not an error is raised:
checkQueryResult <- function(data){
  if(class(data) == 'data.frame') cat('SQL execution worked!')
  else cat('Something went wrong, table does not exist?')
}

checkQueryResult(getData("table1"))
But maybe the RSQLite package already has error handling stuff built in?
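Alternatively, R's built-in tryCatch() gives try/catch-style handling directly around the query. A sketch using the same database file as the question; dbGetQuery() here replaces the dbSendQuery()/fetch() pair, and on.exit() makes sure the connection is always closed:

library(DBI)
library(RSQLite)

getData <- function(portfolio){
  db <- dbConnect(SQLite(), dbname = "portfolioInfo.db")
  on.exit(dbDisconnect(db))
  tryCatch(
    dbGetQuery(db, paste0("SELECT * FROM ", portfolio)),
    error = function(e){
      message("Table not found (or query failed): ", conditionMessage(e))
      NULL   # return NULL so the caller can test for failure
    }
  )
}

getData("table1")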
