Appending data to a Snowflake table in R

I am trying to write an R data frame back to a Snowflake table using the code below:
table_id <- Id(database="database", schema="schema", table="tablename")
dbWriteTable(conn, table_id, your_data_frame, append=TRUE, overwrite=FALSE)
It works the first time, when the table does not exist, but then fails to append and throws the error "Object already exists". I tried dbAppendTable as well, but it also does not work.

This worked for me: drop the overwrite=FALSE argument and pass append=TRUE only:
table_id <- Id(database="database", schema="schema", table="tablename")
dbWriteTable(conn, table_id, your_data_frame, append=TRUE)
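A more explicit pattern is to branch on whether the table already exists. This is a minimal sketch, assuming an existing DBI connection to Snowflake named `conn` and a data frame `df` whose columns match the target table (all names here are illustrative):

```r
# Sketch: create the table on first run, append on later runs.
# Assumes `conn` is a live DBI connection and `df` matches the table schema.
library(DBI)

table_id <- Id(database = "MYDB", schema = "MYSCHEMA", table = "MYTABLE")

if (dbExistsTable(conn, table_id)) {
  # Table exists: append rows only; dbAppendTable never tries to create.
  dbAppendTable(conn, table_id, df)
} else {
  # First run: create the table from the data frame.
  dbWriteTable(conn, table_id, df)
}
```

Separating the two cases avoids passing any overwrite/append combination that the driver might reject.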

Related

Updating table in SQL-Server from R

I am downloading data from Eurostat with the R code below:
library(eurostat)
library(dplyr)  # provides the %>% pipe
tps00176 <- get_eurostat("tps00176", time_format = "num") %>%
  data.frame()
This code runs successfully, and afterwards I transfer the data to SQL Server with the code below:
sqlSave(uat_conn, tps00176, tablename = "integration.tps00176", rownames = FALSE,
        varTypes = c("citizen" = "varchar(5)",
                     "agedef"  = "varchar(8)",
                     "age"     = "varchar(6)",
                     "unit"    = "varchar(3)",
                     "sex"     = "varchar(2)",
                     "geo"     = "varchar(3)",
                     "time"    = "integer",
                     "values"  = "integer"))
So far, so good. But problems arise when I want to refresh this data after some time. I receive a message in R that the table already exists:
Error in sqlSave(uat_conn, integration.tps00176, tablename = "integration.integration.tps00176", :
table ‘integration.integration.tps00176’ already exists
So can anybody tell me how to update this table from R instead of creating it?
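With RODBC there are two common ways to refresh a table on each run: drop and recreate it, or keep its definition and replace only the rows. A sketch, assuming the `uat_conn` connection and `tps00176` data frame from the question (table and column types as above):

```r
# Sketch: two ways to refresh an existing SQL Server table via RODBC.
# Assumes `uat_conn` is a live RODBC channel and `tps00176` is the data frame.
library(RODBC)

# Option 1: drop and recreate the table on each refresh.
sqlDrop(uat_conn, "integration.tps00176", errors = FALSE)
sqlSave(uat_conn, tps00176, tablename = "integration.tps00176",
        rownames = FALSE)

# Option 2: keep the table definition (and its column types) and
# replace only the rows.
sqlQuery(uat_conn, "TRUNCATE TABLE integration.tps00176")
sqlSave(uat_conn, tps00176, tablename = "integration.tps00176",
        rownames = FALSE, append = TRUE)
```

Option 2 is usually preferable once you have tuned the column types with varTypes, since dropping the table discards that definition.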

Trouble accessing the first column of a dataset in R

I am a beginner in R, working in RStudio.
After importing my dataset with:
mydata <- read.table("dataset.txt", sep = "\t", dec = ",", header = TRUE, row.names = 1)
the file is imported and I can see it in RStudio. Then I would like to look at my data:
table(column2, column3)  # this works and gives me the table
Everything looks OK. However, if I ask for the first column:
table(column1, column2)  # this doesn't work!
I receive the error message:
"Error in table(column1) : object 'ID_site' not found"
It seems the first column is not part of my dataset. Do you know why? Is there an option I should set before importing the data?
Try removing the row.names=1 argument from your read.table call. With that option, R takes the first column of your dataset and uses it as the row names of the data frame. That in turn removes the column from the data frame itself, which is why R says your "ID_site" column can't be found.
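The difference can be seen directly. A sketch, assuming a tab-separated file "dataset.txt" whose first column is ID_site (the column names below are taken from the question and are illustrative):

```r
# With row.names = 1, the first column becomes row names, not a column:
mydata <- read.table("dataset.txt", sep = "\t", dec = ",",
                     header = TRUE, row.names = 1)
"ID_site" %in% names(mydata)   # FALSE: the column is gone

# Without row.names, ID_site stays as an ordinary column:
mydata <- read.table("dataset.txt", sep = "\t", dec = ",", header = TRUE)
table(mydata$ID_site, mydata$column2)   # reference columns via mydata$
```

Note also that columns should normally be referenced as mydata$column2 (or via with(mydata, ...)) rather than by bare name, unless the data frame has been attached.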

How we can write data to a postgres DB table using R?

I need to write data to a PostgreSQL table from R. If data already exists for an ID in the table, it should be updated; otherwise the new data should be appended to the table.
I tried this using the 'RPostgreSQL' package and got this error message:
dbWriteTable(con, 'credit', credit, overwrite=TRUE, row.names=FALSE, append=TRUE)
Error in postgresqlWriteTable(conn, name, value, ...) : overwrite and append cannot both be TRUE
You cannot use overwrite and append at the same time. If you use overwrite, as follows, it will truncate the table and rewrite the data:
dbWriteTable(con, 'credit', credit, overwrite=TRUE, row.names=FALSE)
If you use append, it will add the rows, but it won't update existing results:
dbWriteTable(con, 'credit', credit, row.names=FALSE, append=TRUE)
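For true update-or-insert behaviour, dbWriteTable alone is not enough; one common pattern is to load the new rows into a staging table and then merge them with PostgreSQL's INSERT ... ON CONFLICT (available since 9.5). A sketch, assuming the RPostgres backend, a connection `con`, a data frame `credit` with a key column `id`, and a target table "credit" with a unique constraint on id (column names are illustrative):

```r
# Sketch of an upsert via a staging table (RPostgres backend assumed).
library(DBI)

# 1. Load the new rows into a temporary staging table.
dbWriteTable(con, "credit_staging", credit,
             temporary = TRUE, overwrite = TRUE, row.names = FALSE)

# 2. Merge: insert new ids, update existing ones.
dbExecute(con, "
  INSERT INTO credit (id, amount)
  SELECT id, amount FROM credit_staging
  ON CONFLICT (id)
  DO UPDATE SET amount = EXCLUDED.amount
")
```

The staging table keeps the merge as a single server-side statement, which is both atomic and much faster than row-by-row updates from R.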

How to use dbWriteTable(2) properly?

I have started to use these packages
require(RPostgreSQL)
require(sqldf)
require(caroline)
in order to move data between PostgreSQL and R.
I created a schema public in my database with some tables (one called "data") and attributes. I imported the table "data" into R as the data frame "test".
After manipulating it a bit, I tried to upload it back into the (empty) table "data" in the PostgreSQL DB. However, I always get an error I can't find a solution for:
dbWriteTable2(con, "data", test, fill.null = T, add.id=T,
row.names=FALSE, pg.update.seq=FALSE)
[1] "loading draft table to database"
[1] FALSE
Warning message:
In postgresqlWriteTable(conn, name, value, ...) :
table data exists in database: aborting assignTable
I know that the table "data" exists; that is why I use the dbWriteTable2 function, which should be able to insert the data frame into the corresponding attributes within the table "data".
When dropping the table "data" from the PostgreSQL DB and rerunning, I get:
dbWriteTable2(con, "data", test, fill.null = T, add.id=T,
row.names=FALSE, pg.update.seq=FALSE)
Error in `[.data.frame`(dbGetQuery(conn, paste("select a.attname from pg_attribute a, pg_class c, pg_tables t, pg_namespace nsp", :
undefined columns selected
To sum up: how can I upload a data frame into an (empty) existing table within a PostgreSQL DB?
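If the goal is simply to load rows into an existing, empty table, plain DBI dbWriteTable with append=TRUE avoids dbWriteTable2's draft-table machinery entirely. A sketch, assuming a PostgreSQL connection `con` and a data frame `test` whose columns match the existing table "data":

```r
# Sketch: append into an existing table with plain DBI.
# Assumes `con` is a live connection and `test` matches the table's columns.
library(DBI)

dbWriteTable(con, "data", test, append = TRUE, row.names = FALSE)

# Verify the load:
dbGetQuery(con, 'SELECT count(*) FROM "data"')
```

Because append=TRUE never tries to create the table, the "table data exists in database" abort does not arise; the column names and order in `test` must match the table definition.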

Replacing a mysql table using R code

I am trying to export data from an R data frame into a MySQL table named X using the sqlSave function of RODBC. As part of my process, I need to replace the data in table X every time. Can anyone tell me whether I can either drop the table and then recreate it with sqlSave, or whether there is a way to replace table X with the new data from within R?
Thanks!
If you need to overwrite the table, the simplest approach is to use RMySQL and dbWriteTable:
dbWriteTable(connection, "MyTable", data.frame, overwrite = TRUE, row.names = FALSE)
PS: if the table doesn't exist, it will be created on the first call.
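Put together, a full replace-on-each-run workflow looks like the following sketch (connection parameters and the data frame name `mydataframe` are placeholders):

```r
# Sketch: replace the contents of MySQL table X on every run.
library(RMySQL)

con <- dbConnect(MySQL(), host = "localhost", dbname = "mydb",
                 user = "user", password = "password")

# overwrite = TRUE drops and recreates table X with the new data;
# if X does not exist yet, it is simply created.
dbWriteTable(con, "X", mydataframe, overwrite = TRUE, row.names = FALSE)

dbDisconnect(con)
```

Note that overwrite recreates the table with column types inferred from the data frame, so any hand-tuned column definitions on X are lost; if you need to keep them, truncate the table and append instead.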
