How to add a SQLite temp table from an R dataframe?

I have an SQLite database connection to a database file. I want to extract some data from one of the tables, do some processing in R and then create a temporary table on the same connection from the processed data. It needs to be a temp table because users may not have write access to the database, but I want to be able to query this new data alongside the data already in the database.
So, for example:
require(sqldf)
db <- dbConnect(SQLite(), "tempdb")
dbWriteTable(db, "iris", iris)
# do some processing in R:
d <- dbGetQuery(db, "SELECT Petal_Length, Petal_Width FROM iris;")
names(d) <- c("length_2", "width_2")
d <- exp(d)
and then I want to make a temporary table in the connection db from d
I know I could do:
dbWriteTable(conn = db, name = "iris_proc", value = d)
but I need it in a temp table and there doesn't seem to be an option for this in dbWriteTable.
One workaround I thought of was to add a temp table and then add columns and update them:
dbGetQuery(db, "CREATE TEMP TABLE iris_proc AS SELECT Species FROM iris;")
dbGetQuery(db, "ALTER TABLE iris_proc ADD COLUMN length_2;")
But then I can't get the data from d into the columns:
dbGetQuery(db, paste("UPDATE iris_proc SET length_2 =", paste(d$length_2, collapse = ", "), ";"))
Error in sqliteExecStatement(con, statement, bind.data) :
RS-DBI driver: (error in statement: near "4.05519996684467": syntax error)
I imagine that, even if I get this to work, it will be horribly inefficient.
I thought there might have been some way to do this with read.csv.sql, but it does not seem to work with open connection objects.

Use an in-memory database for the temporary table:
library(RSQLite)
db <- dbConnect(SQLite(), "tempdb")
dbWriteTable(db, "iris", iris)
d <- dbGetQuery(db, "SELECT Petal_Length, Petal_Width FROM iris")
d <- exp(d)
dbGetQuery(db, "attach ':memory:' as mem")
dbWriteTable(db, "mem.d", d, row.names = FALSE) # d now in mem database
dbGetQuery(db, "select * from iris limit 3")
dbGetQuery(db, "select * from mem.d limit 3")
dbGetQuery(db, "select * from sqlite_master")
dbGetQuery(db, "select * from mem.sqlite_master")

Related

Update Azure Table using R

I want to insert this data frame into my table datatable...
library(RODBC)
conn <- odbcConnect("CData Azure Source")
sqlTables(conn)
df <- data.frame(a=1:10, b=10:1, c=11:20)
values <- paste("(",df$a,",", df$b,",",df$c,")", sep="", collapse=",")
cmd <- paste("insert into datatable values ", values)
result <- sqlQuery(conn, cmd, as.is=FALSE)
print(result)
But I'm having this problem...
[1] "42000 -1 Malformed SQL Statement: Expected '('. Found: values\r\nStatement:insert into datatable values (1,10,11),(2,9,12),(3,8,13),(4,7,14),(5,6,15),(6,5,16),(7,4,17),(8,3,18),(9,2,19),(10,1,20)"
[2] "[RODBC] ERROR: Could not SQLExecDirect 'insert into datatable values (1,10,11),(2,9,12),(3,8,13),(4,7,14),(5,6,15),(6,5,16),(7,4,17),(8,3,18),(9,2,19),(10,1,20)'"
Can anybody point out where I'm wrong?
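The error suggests the CData driver does not accept the multi-row VALUES (...), (...) syntax. A hedged workaround, untested against CData and reusing conn and df from above, is to issue one single-row INSERT per data frame row:
## one INSERT per row: slower, but avoids the multi-row VALUES syntax
for (i in seq_len(nrow(df))) {
  cmd <- sprintf("insert into datatable values (%d,%d,%d)", df$a[i], df$b[i], df$c[i])
  sqlQuery(conn, cmd, as.is = FALSE)
}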

Connecting R To Teradata VOLATILE TABLE

I am using R to try to connect to a Teradata database and am running into difficulties.
The steps in the process are below
1) Create Connection
2) Create a VOLATILE TABLE
3) Load information from a data frame into the Volatile table
Here is where it fails, giving me this error message:
Error in sqlSave(conn, mydata, tablename = "TEMP", rownames = FALSE, :
first argument is not an open RODBC channel
The code is below
# Import Data From Text File and remove duplicates
mydata = read.table("Keys.txt")
mydata.unique = unique(mydata)
strSQL.TempTable = "CREATE VOLATILE TABLE TEMP………[Table Details]"
"UNIQUE PRIMARY INDEX(index)"
"ON COMMIT PRESERVE ROWS;"
# Connect To Database
conn <- tdConnect('Teradata')
# Execute Temp Table
tdQuery(strSQL.TempTable)
sqlSave(conn, mydata, tablename = "TEMP ",rownames = FALSE, append = TRUE)
Can anyone help? Is it closing the connection before I can upload the information into the table?
My mistake: I had been confusing libraries.
Basically the lines
# Connect To Database
conn <- tdConnect('Teradata')
# Execute Temp Table
tdQuery(strSQL.TempTable)
sqlSave(conn, mydata, tablename = "TEMP ",rownames = FALSE, append = TRUE)
can all be replaced by this
# Connect To Database
channel <- odbcConnect('Teradata')
# Execute Temp Table
sqlQuery(channel, paste(strSQL.TempTable))
sqlSave(channel, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
Now I'm being told I don't have access to do this, but that is another question for another forum.
Thanks
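For reference, a minimal end-to-end sketch of the corrected flow. The one-column schema is hypothetical (the real CREATE statement is elided above), and it assumes a working 'Teradata' ODBC DSN; a volatile table only lives for the session, so the same channel must both create it and load it:
library(RODBC)
channel <- odbcConnect('Teradata')
## hypothetical schema standing in for the elided [Table Details]
sqlQuery(channel, "CREATE VOLATILE TABLE TEMP (key_col VARCHAR(50))
                   UNIQUE PRIMARY INDEX(key_col)
                   ON COMMIT PRESERVE ROWS;")
sqlSave(channel, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)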

update table in postgresql database through r

How do I update data in a postgresql db through R with new data?
I've tried
dbGetQuery(con,"UPDATE table SET column1=:1,column2=:2, column3=:3
where id=:4", data=Rdata[,c("column1", "column3", "column3","id")])
I also tried with the colons replaced with $ but that didn't work either. I keep getting:
Error in postgresqlExecStatement(conn, statement, ...) :
unused argument(s)
I figured it out using:
update <- function(i) {
  drv <- dbDriver("PostgreSQL")
  con <- dbConnect(drv, dbname = "db_name", host = "localhost", port = "5432", user = "chris", password = "password")
  txt <- paste("UPDATE data SET column_one=", data$column_one[i], ", column_two=", data$column_two[i], " where id=", data$id[i])
  dbGetQuery(con, txt)
  dbDisconnect(con)
}
library(doMC)  # provides registerDoMC() and attaches foreach for %dopar%
registerDoMC()
foreach(i = 1:length(data$column_one), .inorder = FALSE, .packages = "RPostgreSQL") %dopar% {
  update(i)
}
At least RODBC has a specific function, sqlUpdate:
sqlUpdate updates the table where the rows already exist. Data frame dat should contain columns with names that map to (some of) the columns in the table.
See http://cran.r-project.org/web/packages/RODBC/RODBC.pdf
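For completeness: with the newer DBI/RPostgres stack (not RPostgreSQL), a parameterized statement avoids both the string pasting and any quoting problems; $1, $2, ... is RPostgres's placeholder syntax. A sketch, reusing the connection details from the answer above:
library(DBI)
con <- dbConnect(RPostgres::Postgres(), dbname = "db_name", host = "localhost",
                 port = 5432, user = "chris", password = "password")
## one UPDATE per row of 'data', bound from the vectors in params
dbExecute(con, "UPDATE data SET column_one = $1, column_two = $2 WHERE id = $3",
          params = list(data$column_one, data$column_two, data$id))
dbDisconnect(con)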

How to insert a dataframe into a SQL Server table?

I'm trying to upload a dataframe to a SQL Server table. I tried breaking it down into a simple SQL query string:
library(RODBC)
con <- odbcDriverConnect("driver=SQL Server; server=database")
df <- data.frame(a=1:10, b=10:1, c=11:20)
values <- paste("(",df$a,",", df$b,",",df$c,")", sep="", collapse=",")
cmd <- paste("insert into MyTable values ", values)
result <- sqlQuery(con, cmd, as.is=TRUE)
...which seems to work, but does not scale very well. Is there an easier way?
[edited] Perhaps pasting the names(df) would solve the scaling problem:
values <- paste( " df[ , c(",
paste( names(df),collapse=",") ,
")] ", collapse="" )
values
#[1] " df[ , c( a,b,c )] "
You say your code is "working"... I would have thought one would use sqlSave rather than sqlQuery if one wanted to "upload".
I would have guessed this would be more likely to do what you described:
sqlSave(con, df, tablename = "MyTable")
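If MyTable already exists (sqlSave creates it by default), something along these lines should append instead:
sqlSave(con, df, tablename = "MyTable", rownames = FALSE, append = TRUE)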
This worked for me and I found it to be simpler.
library(DBI)   # dbConnect() and dbWriteTable() generics
library(odbc)
con <- dbConnect(odbc(),
Driver = "SQL Server",
Server = "ServerName",
Database = "DBName",
UID = "UserName",
PWD = "Password")
dbWriteTable(conn = con,
name = "TableName",
value = x) ## x is any data frame
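If TableName already exists and you want to add rows rather than error, DBI's dbWriteTable also takes an append argument:
dbWriteTable(conn = con, name = "TableName", value = x, append = TRUE)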
Since INSERT INTO ... VALUES is limited to 1000 rows in SQL Server, you can use dbBulkCopy from the rsqlserver package.
dbBulkCopy is a DBI extension that interfaces with bcp, Microsoft SQL Server's command-line bulk-copy utility, to quickly bulk-copy large files into a table. For example:
url = "Server=localhost;Database=TEST_RSQLSERVER;Trusted_Connection=True;"
conn <- dbConnect('SqlServer', url = url)
## this assumes the table already exists
dbBulkCopy(conn, name = 'T_BULKCOPY', value = df, overwrite = TRUE)
dbDisconnect(conn)

How to import from SQLite database?

I have an SQLite database file exported from Scraperwiki with .sqlite file extension. How do I import it into R, presumably mapping the original database tables into separate data frames?
You could use the RSQLite package.
Some example code to store the whole data in data.frames:
library("RSQLite")
## connect to db
con <- dbConnect(drv=RSQLite::SQLite(), dbname="YOURSQLITEFILE")
## list all tables
tables <- dbListTables(con)
## exclude sqlite_sequence (contains table information)
tables <- tables[tables != "sqlite_sequence"]
lDataFrames <- vector("list", length=length(tables))
## create a data.frame for each table
for (i in seq(along = tables)) {
  lDataFrames[[i]] <- dbGetQuery(conn = con, statement = paste("SELECT * FROM '", tables[[i]], "'", sep = ""))
}
To anyone else who comes across this post, a nice way to do the loop from the top answer, using the purrr library, is:
library(purrr)  # map() comes from purrr, which must be attached first
lDataFrames <- map(tables, ~{
  dbGetQuery(conn = con, statement = paste("SELECT * FROM '", .x, "'", sep = ""))
})
This also means you don't have to do:
lDataFrames <- vector("list", length=length(tables))
Putting together sgibb's and primaj's answers, naming the tables, and adding the facility to retrieve either all tables or a specific table:
getDatabaseTables <- function(dbname = "YOURSQLITEFILE", tableName = NULL) {
  library("RSQLite")
  library("purrr")  # for map()
  con <- dbConnect(drv = RSQLite::SQLite(), dbname = dbname)  # connect to db
  on.exit(dbDisconnect(con))  # close the connection when the function exits
  tables <- dbListTables(con)  # list all table names
  if (is.null(tableName)) {
    # get all tables
    lDataFrames <- map(tables, ~{ dbGetQuery(conn = con, statement = paste("SELECT * FROM '", .x, "'", sep = "")) })
    # name tables
    names(lDataFrames) <- tables
    return(lDataFrames)
  } else {
    # get specific table
    return(dbGetQuery(conn = con, statement = paste("SELECT * FROM '", tableName, "'", sep = "")))
  }
}
# get all tables
lDataFrames <- getDatabaseTables(dbname="YOURSQLITEFILE")
# get specific table
df <- getDatabaseTables(dbname="YOURSQLITEFILE", tableName="YOURTABLE")
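As a side note, DBI also provides dbReadTable, which fetches a whole table without hand-building the SELECT; assuming an open con and the tables vector from sgibb's answer, the loop could become:
## same result as the SELECT loop, using dbReadTable
lDataFrames <- lapply(tables, function(t) dbReadTable(con, t))
names(lDataFrames) <- tables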
