I am using the following function to select values from a table. The table name is passed as a parameter to the function. If the table does not exist, an error is generated and execution stops. I want to do something when the table is not found. Is that possible in R, something like a try-catch exception?
library('RSQLite')
getData <- function(portfolio){
  lite <- dbDriver("SQLite", max.con = 25)
  db <- dbConnect(lite, dbname = "portfolioInfo.db")
  res <- dbSendQuery(db, paste("SELECT * from ", portfolio, " ", sep = ""))
  data <- fetch(res)
  return(data)
}
getData("table1")
One way to do this would be to check the class of the data that is returned.
I am not familiar with RSQLite, but I would guess it returns a data frame when the table is found and a text error when it is not?
So a possibility would be to check whether or not an error is raised:
checkQueryResult <- function(data){
  if(class(data) == 'data.frame') cat('SQL execution worked!')
  else cat('Something went wrong, table does not exist?')
}
checkQueryResult(getData("table1"))
But maybe the RSQLite package already has error handling stuff built in?
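For what it's worth, the error itself can also be caught with base R's tryCatch(), without inspecting the class of the return value. Below is a minimal sketch (my own illustration, assuming the getData() function above; the NULL fallback is just a placeholder):
result <- tryCatch(
  getData("table1"),
  error = function(e) {
    # Runs when the query fails, e.g. because the table does not exist
    message("Query failed: ", conditionMessage(e))
    NULL  # placeholder fallback; replace with whatever recovery you need
  }
)
DBI also provides dbExistsTable(db, portfolio), which can be used to test for the table before querying at all.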
Below is a snippet of the code I use in R to extract IDs from a PostgreSQL database. When I run the function I get the following warning message from R:
In result_create(conn@ptr, statement) :
  Closing open result set, cancelling previous query
How do I avoid this warning without resorting to options(warn=-1) at the beginning of my code, which would only suppress the warning instead of addressing its cause?
con <- dbConnect(RPostgres::Postgres(),
                 user = "postgres",
                 dbname = "DataBaseName",
                 password = "123456",
                 port = 5431)

get_id <- function(connection, table){
  query <- toString(paste("SELECT id FROM ", table, sep = ""))
  data_extract_query <- dbSendQuery(connection, query)
  data_extract <- dbFetch(data_extract_query)
  return(data_extract)
}
get_id(con, "users")
I found a method for solving the problem.
I found a thread on GitHub for RSQLite: https://github.com/r-dbi/RSQLite/issues/143. In that thread, they explicitly set n = -1 in the dbFetch() call.
Editing the code as follows solved my problem, and the warning message did not show up again:
data_extract <- dbFetch(data_extract_query, n = -1)
The n argument is the maximum number of rows the query should return; setting it to -1 retrieves all rows. It already defaults to n = -1, but for some reason, with this build of R (3.6.3), the warning is still shown unless n is passed explicitly.
Calling ?dbFetch in R shows more information on this. I have included a snippet from the help page:
Usage
dbFetch(res, n = -1, ...)
fetch(res, n = -1, ...)
Arguments

res   An object inheriting from DBIResult, created by dbSendQuery().

n     Maximum number of records to retrieve per fetch. Use n = -1 or n = Inf to retrieve all pending records. Some implementations may recognize other special values.

...   Other arguments passed on to methods.
This issue comes up with other database back ends as well if a result set is not cleared before submitting a new query. From the docs of DBI::dbSendQuery:
Usage
dbSendQuery(conn, statement, ...)
...
Value
dbSendQuery() returns an S4 object that inherits from DBIResult. The result set can be used with dbFetch() to extract records. Once you have finished using a result, make sure to clear it with dbClearResult(). An error is raised when issuing a query over a closed or invalid connection, or if the query is not a non-NA string. An error is also raised if the syntax of the query is invalid and all query parameters are given (by passing the params argument) or the immediate argument is set to TRUE.
To get rid of the warning the get_id() function must be modified as follows:
get_id <- function(connection, table){
  query <- toString(paste("SELECT id FROM ", table, sep = ""))
  data_extract_query <- dbSendQuery(connection, query)
  data_extract <- dbFetch(data_extract_query)
  # Here we clear whatever remains on the server
  dbClearResult(data_extract_query)
  return(data_extract)
}
See Examples section in help for more.
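As a side note (my own addition, not from the original answer): if you only ever need the full result in one go, DBI's dbGetQuery() sends the query, fetches every row and clears the result set in a single call, so there is nothing left open to trigger the warning. A sketch with the same function:
get_id <- function(connection, table){
  # dbGetQuery() = dbSendQuery() + dbFetch() + dbClearResult() in one step
  dbGetQuery(connection, paste0("SELECT id FROM ", table))
}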
Using:
- IronPython
- Autodesk
- Revit (PyRevit)
- Revit API
- SQLite3
My code is as follows:
import sqlite3

try:
    conn = sqlite3.connect('SQLite_Python.db')
    c = conn.cursor()
    print("connected")

    Insert_Volume = """INSERT INTO Column_Coordinates
                       (x, y, z)
                       VALUES
                       (1, 2, 3)"""

    count = c.execute(Insert_Volume)
    conn.commit()
    print("Volume values inserted", c.rowcount)
    c.close()
except sqlite3.Error as error:
    print("Failed to insert data into sqlite table", error)
finally:
    if (conn):
        conn.close()
        print("The SQLite connection is closed")
This code used to work within PyRevit, but now does not, with the following error:
Exception : System.IO.IOException: Could not add reference to assembly IronPython.SQLite
Please advise, this is one of the early steps of a large project and therefore is delaying my work quite a bit.
I look forward to your reply.
I've set up a connection between R and SQL using the package RODBC and managed to connect and query the database from R.
I've created a small R function whose objective is to delete some rows (passed as a parameter) from a specific table.
Here it is (nameDB is the name of my database, and values_conversion is another function I wrote to convert data from R format to SQL format):
delete_SQL = function(data, table){
  ch = odbcConnect(nameDB, "postgres")
  names = sqlColumns(channel = ch, table, schema = "public", catalog = nameDB)$COLUMN_NAME
  for(i in 1:nrow(data)){
    sqlQuery(channel = ch,
             query = paste0("DELETE FROM public.\"", table, "\" WHERE ",
                            paste0(names, " = ", values_conversion(data[i,]), collapse = " and "), ";"),
             errors = TRUE)
  }
  odbcCloseAll()
}
Example query: "DELETE FROM public.\"lieu_protection\" WHERE lieu_id = 3 and protection_id = 1430;"
The code inside this function works fine when I execute everything directly, but when I call the function it throws an
Error in sqlQuery(channel = ch, query = paste0("DELETE FROM public.\"", :
first argument is not an open RODBC channel
I have a similar function that fetches and returns data from SQL and works fine, so I assume it has something to do with the DELETE, but since the error is about the channel I'm quite confused.
Thanks to anyone who can help!
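One way to narrow this down (a hedged sketch of my own, not from the thread): RODBC's odbcConnect() returns -1 instead of an "RODBC" channel object when the connection attempt fails, which would later surface as exactly this "not an open RODBC channel" error. Validating the channel inside the function separates a connection problem from a problem with the DELETE itself; the connection arguments are the ones assumed above:
delete_SQL = function(data, table){
  ch = odbcConnect(nameDB, "postgres")
  # odbcConnect() returns -1 (not an "RODBC" object) if the connection fails
  if (!inherits(ch, "RODBC")) stop("Could not open a channel to ", nameDB)
  on.exit(odbcClose(ch))  # close only this channel instead of odbcCloseAll()
  # ... same sqlColumns()/DELETE loop as above ...
}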
I am trying to create a loop that builds data frames from SQL queries whose names are identical except for the day number. For example, the query for day 1 is named SF_2013_Day1_BaseLine and the one for day 6 is SF_2013_Day6_BaseLine. I wrote the code below, but I got an error: Error: unexpected ',' in "for(i in 3," . Any idea how I can get this code to work?
Thank you
for (i in 1,3,6,10,14,21,30) {
SF_FinalviewQ3_2013_Day[i]_BaseLine<-sqlQuery(DB,"select * from [SF_2013_Day[i]_BaseLine]");
dim(SF_Day[i]_BaseLine)}
After changing the code based on @Pgibas' advice to this:
for (i in c(1,3,6,10,14,21,30)) {
SF_FinalviewQ3_2013_Day[i]_BaseLine<-sqlQuery(DB,"select * from [SF_2013_Day[i]_BaseLine]");
dim(SF_Day[i]_BaseLine)}
I got this error: Error: unexpected input in "for(i in c(3,6)){glm_d[i]_" . What can I do to resolve the problem?
You need to resolve i within the names first:
for (i in c(1, 3, 6, 10, 14, 21, 30)) {
  # Query the table for this day, e.g. SF_2013_Day1_BaseLine
  set <- sqlQuery(DB, paste0("select * from [SF_2013_Day", i, "_BaseLine]"))
  # Build the assignment "SF_FinalviewQ3_2013_Day<i>_BaseLine <- set" and evaluate it
  eval(parse(text = paste0("SF_FinalviewQ3_2013_Day", i, "_BaseLine <- set")))
  print(dim(set))
}
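As a side note (my own addition, not from the original answer): eval(parse()) works, but assign() does the same job more directly, and collecting the results in a named list avoids creating many separate global variables. A sketch under the same assumed DB connection and table naming:
baselines <- list()
for (i in c(1, 3, 6, 10, 14, 21, 30)) {
  set <- sqlQuery(DB, paste0("select * from [SF_2013_Day", i, "_BaseLine]"))
  # Equivalent to the eval(parse()) call above
  assign(paste0("SF_FinalviewQ3_2013_Day", i, "_BaseLine"), set)
  # Or keep everything together in a list keyed by day
  baselines[[paste0("Day", i)]] <- set
}
dim(baselines[["Day6"]])  # dimensions of the day 6 data frame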
I am using the RGoogleAnalytics library to get all the data from my Google Analytics Account into R. However, complex queries deliver 0 results.
My code looks like:
query.list <- Init(start.date = paste(c(lastmonth.startdate)),
                   end.date = paste(c(lastmonth.enddate)),
                   metrics = "ga:goalCompletionsAll",
                   dimensions = "ga:countryIsoCode,ga:yearMonth",
                   filters = "ga:goalCompletionsAll>0",
                   max.results = 10000,
                   table.id = sprintf("ga:%s", sites$profile.id[i]))
# Create the Query Builder object so that the query parameters are validated
ga.query <- QueryBuilder(query.list)
# Extract the data and store it in a data-frame
ga.countriesConversions1 <- GetReportData(ga.query, token)
Everything is inside a "for" loop, and the script stops if one of the queries ends with 0 results, because GetReportData(ga.query, token) cannot create a data frame when there is no data.
I would like to know if there is a way to capture the message the library prints to the console ("Your query matched 0 results. Please verify your query using the Query Feed Explorer and re-run it"), assign it to a variable, and use it as an if condition, so that I can create a dummy data.frame before the next function runs.
Assuming GetReportData() is throwing an error, you can try:
ga.countriesConversions1 <- try(GetReportData(ga.query, token), silent=TRUE)
if(inherits(ga.countriesConversions1, "try-error")) {
  warning(geterrmessage())
  ... error handling logic ...
}
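An alternative sketch (my own illustration, not part of the original answer) uses tryCatch() so the fallback is a dummy data frame and the loop can keep going; the placeholder column names are assumptions based on the dimensions and metric requested above:
ga.countriesConversions1 <- tryCatch(
  GetReportData(ga.query, token),
  error = function(e) {
    warning("Query returned no data: ", conditionMessage(e))
    # Empty placeholder with assumed column names matching the query
    data.frame(countryIsoCode = character(0),
               yearMonth = character(0),
               goalCompletionsAll = numeric(0))
  }
)
If GetReportData() signals a warning rather than an error, a warning = handler can be added to tryCatch() in the same way.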