Updating table in SQL Server from R

I am downloading data from EuroStat with the R code below:
library(eurostat)
library(dplyr)  # needed for the %>% pipe

tps00176 <- get_eurostat("tps00176", time_format = "num") %>%
  data.frame()
This code runs successfully, and afterwards I transfer the data to SQL Server with the code below:
library(RODBC)  # sqlSave(); uat_conn is an open RODBC connection

sqlSave(uat_conn, tps00176, tablename = "integration.tps00176", rownames = FALSE,
        varTypes = c(
          "citizen" = "varchar(5)",
          "agedef"  = "varchar(8)",
          "age"     = "varchar(6)",
          "unit"    = "varchar(3)",
          "sex"     = "varchar(2)",
          "geo"     = "varchar(3)",
          "time"    = "integer",
          "values"  = "integer"))
So far, so good. But problems arise when I want to update this data after some time: I receive a message in R that the table already exists.
Error in sqlSave(uat_conn, integration.tps00176, tablename = "integration.integration.tps00176", :
table ‘integration.integration.tps00176’ already exists
Can anybody tell me how to update this table with commands from R, instead of creating it again?
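A minimal sketch of one way around this, assuming the open RODBC connection uat_conn and the table created above: empty the existing table, then let sqlSave append the fresh download rather than create the table again.

library(RODBC)

# Remove the old rows but keep the table and its column types
# (TRUNCATE needs ALTER permission on the table; DELETE FROM works too).
sqlQuery(uat_conn, "TRUNCATE TABLE integration.tps00176")

# With append = TRUE, sqlSave inserts into the existing table
# instead of trying to create it.
sqlSave(uat_conn, tps00176, tablename = "integration.tps00176",
        rownames = FALSE, append = TRUE)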

Related

How to save to pre-existing Snowflake table from R using pool

I am using pool to handle connections to my Snowflake warehouse. I have created a connection to my database and can read data from a pre-existing table with no issues, e.g.:
library(pool)
library(dplyr)
library(dbplyr)  # for in_schema()

my_pool <- dbPool(odbc::odbc(),
                  Driver = "Snowflake",
                  Server = Sys.getenv("WH_URL"),
                  UID = Sys.getenv("WH_USER"),
                  PWD = Sys.getenv("WH_PW"),
                  Warehouse = Sys.getenv("WH_WH"),
                  Database = "MY_DB")

my_data <- tbl(my_pool, in_schema(sql("schema_name"), sql("table_name"))) %>%
  collect()
I would like to save back to a table (table_name) and I believe the best way to do this is with pool::dbWriteTable:
# Create some data to save to db
data <- data.frame("user_email" = "tim@apple.com",
                   "query_run" = "arrivals_departures",
                   "data_downloaded" = FALSE,
                   "created_at" = as.character(Sys.time()))
# Define where to save the data
table_id <- Id(database = "MY_DB", schema = "MY_SCHEMA", table = "TABLE_NAME")

# Write to database
pool::dbWriteTable(my_pool, table_id, data, append = TRUE)
However, this returns the error:
Error in new_result(connection@ptr, statement, immediate) :
  nanodbc/nanodbc.cpp:1594: 00000: SQL compilation error:
  Object 'MY_DB.MY_SCHEMA.TABLE_NAME' already exists.
I have read/write/update permissions for this database for the user specified in my_pool.
I have explored the accepted answers here and here to create the above attempt and can't figure out what I'm doing wrong. It's probably something simple that I've forgotten to do - any thoughts?
EDIT: Wondering if my issue is anything to do with: https://github.com/r-dbi/odbc/issues/480
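If it is that odbc issue, one workaround worth trying (a sketch, not a confirmed fix) is to pass the fully qualified name as literal SQL instead of an Id(), which sidesteps the existence check that can misfire across databases:

# Hypothetical workaround: qualify the table name yourself via DBI::SQL().
pool::dbWriteTable(my_pool,
                   DBI::SQL("MY_DB.MY_SCHEMA.TABLE_NAME"),
                   data,
                   append = TRUE)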

Appending data to a Snowflake table

I am trying to write an R data frame back into a Snowflake table using the code below:
table_id <- Id(database = "database", schema = "schema", table = "tablename")
dbWriteTable(conn, table_id, df, append = TRUE, overwrite = FALSE)  # df is your data frame
It works the first time, when the table is not there, but it fails to append data, throwing an "Object already exists" error. I tried using dbAppendTable as well, but it also does not work.
This worked for me:
table_id <- Id(database = "database", schema = "schema", table = "tablename")
dbWriteTable(conn, table_id, df, append = TRUE)

R dbReadTable returns no such table error

I ran this bit of code yesterday successfully, but today, I am getting a 'no such table' error. I am trying to pull data in a table (named tblLatjamInputs) from an SQLite database into R using DBI::dbReadTable(), but it is acting as though the table does not exist.
Using SQLiteStudio, and separately the command line, I can see that the table does indeed exist and that there are data in the table.
Here is the code, along with the error I'm getting.
library(DBI)
library(RSQLite)

setwd("D:/hsm/databases")
db <- dbConnect(SQLite(), conn = "lookup_and_tracking.sqlite")
tblName <- "tblLatjamInputs"
df.full <- dbReadTable(db, tblName)
Error in result_create(conn@ptr, statement) : no such table: tblLatjamInputs
I got the same error when the tblName line is changed to this: tblName <- dbQuoteIdentifier(db, "tblLatjamInputs")
dbListTables(db) returns character(0), and dbListFields(db, "lkpSpecies") (a different table in the db) returns the no such table error as well.
I checked that there are no spaces around the table name in the database. I also tried to pull data from other tables (to see if it was just an issue with this table), but I got the same error. I have tried disconnecting and reconnecting to the database multiple times, including disconnecting from the db, closing SQLiteStudio and the command line, and then reopening. I also tried closing everything, including R, reloading the project, and starting again from scratch. I also tried connecting to a different database altogether with the same results (R makes the connection, but can't seem to find any tables). I'm totally baffled because, as I mentioned, all this works fine in the command line, and I did this yesterday with the same database, table, and lines of code, and it worked fine.
Use
db <- dbConnect(SQLite(), "lookup_and_tracking.sqlite")
The problem is that the file name parameter is not named conn=; it's dbname=, and its default is "", which creates a new, empty database.
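With the connection made through the dbname argument (here passed positionally), the same session works; a minimal sketch:

library(DBI)
library(RSQLite)

db <- dbConnect(SQLite(), dbname = "lookup_and_tracking.sqlite")
dbListTables(db)                 # should now include tblLatjamInputs
df.full <- dbReadTable(db, "tblLatjamInputs")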

How to use RSQLite to save and load data in SQLite

I am learning SQLite. I have a big data frame in CSV format, which I imported into SQLite:
library(DBI)
library(RSQLite)

db <- dbConnect(SQLite(), dbname = "myDB.sqlite")
dbWriteTable(conn = db, name = "myDB", dataframe, overwrite = TRUE,
             row.names = FALSE)
After that, I saw there is a myDB.sqlite file in my directory, but with zero bytes. How can I save the data frame in SQLite so that I don't need to write the table every time? Thanks.
It should be written in your db. Like I said before, your code works for me; it's just that the RStudio file viewer doesn't automatically refresh when files have been written to.
Just to be sure that the data was written to your db, try running dbGetQuery(conn = db, "SELECT * FROM myDB"). That should return your data frame.
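To see the persistence in action, here is a minimal round trip under the same assumptions (the data frame dataframe and the file myDB.sqlite from the question): write once, disconnect, then read the table back in a later session instead of re-importing the CSV.

library(DBI)
library(RSQLite)

# First session: write the data frame and close the connection.
db <- dbConnect(SQLite(), dbname = "myDB.sqlite")
dbWriteTable(db, "myDB", dataframe, overwrite = TRUE, row.names = FALSE)
dbDisconnect(db)

# Later session: the table is still on disk, no re-import needed.
db <- dbConnect(SQLite(), dbname = "myDB.sqlite")
dataframe2 <- dbReadTable(db, "myDB")
dbDisconnect(db)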

Persistence of data frames (R objects) into a database

I have a database table, let's say Table 1. Table 1 has 10 columns, let's assume:
column1, column2, column3, column4, column5, column6, column7, column8, column9, column10
I have a data frame:
sample_frame <- data.frame(column1 = 1, column2 = 2, column3 = 3, column4 = 4)
I wish to persist the data frame, i.e. sample_frame, into my database table, i.e. Table 1. Presently I am using the ROracle package to write to the database. The code I am using is as follows:
library(ROracle)
dbWriteTable(con, name = "Table 1", value = sample_frame, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "sample_schema")
I have created the connection object using dbConnect(). As far as the integrity and null constraints of Table 1 are concerned, I have taken care of those. When I try to write into the table using dbWriteTable(), the following error is thrown:
"ORA-00947: not enough values"
Can someone correct the method I am using, or provide an alternative method of inserting selected columns (the non-nullable ones) into Table 1 while leaving the other columns empty? I am using R 2.15.3.
As I mentioned in my comment, you are getting this error because you are creating sample_frame with fewer columns than the table has... Try this (if your actual table in the database has the same column names):
sample_frame <- data.frame(column1 = 1, column2 = 2, column3 = 3, column4 = 4,
                           column5 = 5, column6 = 6, column7 = 7, column8 = 8,
                           column9 = 9, column10 = 10)
library(ROracle)
dbWriteTable(con, name = "Table 1", value = sample_frame, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "sample_schema")
Update
Considering your new requirement, I would suggest you prepare an INSERT query and run it as follows:
qry <- "..."  # your update query
dbSendQuery(con, qry)
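A sketch of that approach for inserting only the non-nullable columns, assuming those are the four supplied in sample_frame; ROracle's dbSendQuery accepts a data frame of bind values through its data argument, matched to :1, :2, ... placeholders:

library(ROracle)

# Insert only the four supplied columns; the remaining columns stay NULL,
# provided they are nullable or have defaults.
qry <- 'INSERT INTO sample_schema."Table 1" (column1, column2, column3, column4)
        VALUES (:1, :2, :3, :4)'
dbSendQuery(con, qry, data = sample_frame)
dbCommit(con)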
