I need to write data to a PostgreSQL table from R. If data already exist for an ID in the table, they should be updated; otherwise the new rows should be appended to the table.
I tried this using the 'RPostgreSQL' package and got this error message:
dbWriteTable(con, 'credit', credit, overwrite = TRUE, row.names = FALSE, append = TRUE)
Error in postgresqlWriteTable(conn, name, value, ...) : overwrite and append cannot both be TRUE
You cannot use overwrite and append at the same time. If you use overwrite as follows, it truncates the table and rewrites the data:
dbWriteTable(con, 'credit', credit, overwrite = TRUE, row.names = FALSE)
If you use append, it will add the rows, but it won't update existing ones:
dbWriteTable(con, 'credit', credit, row.names = FALSE, append = TRUE)
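To get true upsert behaviour (update the row if the ID exists, insert it otherwise) you have to go through SQL yourself. A minimal sketch, assuming PostgreSQL 9.5+ and that credit has a primary key id plus a value column amount (both column names are illustrative assumptions): stage the new rows in a scratch table, then merge with INSERT ... ON CONFLICT.

## Stage the incoming rows in a scratch table
dbWriteTable(con, 'credit_staging', credit, overwrite = TRUE, row.names = FALSE)
## Update rows whose id already exists, insert the rest
dbSendQuery(con, "
  INSERT INTO credit (id, amount)
  SELECT id, amount FROM credit_staging
  ON CONFLICT (id) DO UPDATE SET amount = EXCLUDED.amount;")
dbSendQuery(con, "DROP TABLE credit_staging;")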
Related
I am trying to write an R data frame back into a Snowflake table using the code below:
table_id <- Id(database = "database", schema = "schema", table = "tablename")
dbWriteTable(conn, table_id, df, append = TRUE, overwrite = FALSE)  ## df is your R data frame
It works the first time, when the table is not there, but it fails to append data and throws the error "Object already exists". I tried using dbAppendTable as well, but it also does not work.
This worked for me:
table_id <- Id(database = "database", schema = "schema", table = "tablename")
dbWriteTable(conn, table_id, df, append = TRUE)  ## df is your R data frame
I loaded some spatial data from my PostgreSQL DB into R with the help of the RPostgreSQL-package and ST_AsText:
dbGetQuery(con, "SELECT id, ST_AsText(geom) FROM table;")
After having done some analyses I want to go back the same way. My geom column is still formatted as character / WKT. Unfortunately dbWriteTable doesn't accept arguments similar to those of dbGetQuery.
What is the best way to import spatial data from R to PostgreSQL?
Up to now the only way I have found is to import the data into the DB and use ST_GeomFromText in an additional step to get my geometry data type.
I created a table on my DB using dbWriteTable() and the postGIStools package to INSERT the data (it must be a SpatialPolygonsDataFrame).
## Create an empty table on the DB (same structure, zero rows)
dbWriteTable(con, name = c("public", "<table>"), value = dataframe[-c(1:nrow(dataframe)), ])
require(postGIStools)
## INSERT INTO, writing the WKT character column (spdf is the SpatialPolygonsDataFrame)
postgis_insert(con, df = spdf, tbl = "table", geom_name = "st_astext")
## Rename the WKT column and cast it to the geometry type
dbSendQuery(con, "ALTER TABLE <table> RENAME st_astext TO geom;")
dbSendQuery(con, "ALTER TABLE <table> ALTER COLUMN geom TYPE geometry;")
I have started to use these packages
require(RPostgreSQL)
require(sqldf)
require(caroline)
in order to process data between PostgreSQL and R.
I created a schema public in my database with some tables (one is called "data") and attributes. I imported the table "data" back into R as the data frame "test".
After manipulating it a bit I tried to upload it back into the (empty) table "data" in the PostgreSQL DB. However, I always get an error which I can't find a solution for:
dbWriteTable2(con, "data", test, fill.null = TRUE, add.id = TRUE,
              row.names = FALSE, pg.update.seq = FALSE)
[1] "loading draft table to database"
[1] FALSE
Warning message:
In postgresqlWriteTable(conn, name, value, ...) :
table data exists in database: aborting assignTable
I know that the table "data" exists; that is why I use the dbWriteTable2 function, which seems to be able to insert the data frame into the corresponding attributes within the table "data".
When dropping the table "data" from the PostgreSQL DB and rerunning, I get:
dbWriteTable2(con, "data", test, fill.null = TRUE, add.id = TRUE,
              row.names = FALSE, pg.update.seq = FALSE)
Error in `[.data.frame`(dbGetQuery(conn, paste("select a.attname from pg_attribute a, pg_class c, pg_tables t, pg_namespace nsp", :
undefined columns selected
Just to sum up: how can I upload a data frame into an (empty) existing table within a PostgreSQL DB?
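A minimal sketch that sidesteps caroline entirely, assuming the columns of test match the columns of "data": plain RPostgreSQL appending writes into an existing (empty) table without trying to create it.

## append = TRUE inserts into the existing table instead of creating it
dbWriteTable(con, "data", test, append = TRUE, row.names = FALSE)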
I have a database table, let's say Table 1. Table 1 has 10 columns, let's assume:
column1,column2,column3,column4,column5,column6,column7,column8,column9,column10...
I have a data frame:
sample_frame <- data.frame(column1 = 1, column2 = 2, column3 = 3, column4 = 4)
I wish to persist the data frame, i.e. sample_frame, into my database table, i.e. Table 1.
Presently I am using the ROracle package to write to the database. The code I am using is as follows:
library(ROracle)
dbWriteTable(con, name = "Table 1", value = sample_frame, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "sample_schema")
I have created the connection object using dbConnect(). As far as the integrity and null constraints of Table 1 are concerned, I have taken care of those. When I try to write into the table using dbWriteTable(), the following error is thrown:
"ORA-00947: not enough values"
Can someone correct the method I am using or provide an alternative method of inserting selected columns (the non-nullable ones) into Table 1 while leaving the other columns empty? I am using R 2.15.3.
As I mentioned in my comment, you are getting this error because you are creating sample_frame with fewer columns than the table has... Try this (if your actual table in the database has the same column names):
sample_frame <- data.frame(column1 = 1, column2 = 2, column3 = 3, column4 = 4,
                           column5 = 5, column6 = 6, column7 = 7, column8 = 8,
                           column9 = 9, column10 = 10)
library(ROracle)
dbWriteTable(con, name = "Table 1", value = sample_frame, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "sample_schema")
Update
Considering your new requirement, I would suggest you prepare an INSERT statement yourself and use the following:
qry <- "..."  ## your INSERT statement, naming only the columns you want to fill
dbSendQuery(con, qry)
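For instance, a sketch of such a statement using ROracle bind variables (the column names repeat the assumptions above; the columns not named in the INSERT are simply left NULL):

library(ROracle)
## Insert only the columns present in sample_frame; :1 ... :4 are bind
## placeholders filled row by row from the data frame
qry <- 'INSERT INTO "Table 1" (column1, column2, column3, column4)
        VALUES (:1, :2, :3, :4)'
dbSendQuery(con, qry, data = sample_frame)
dbCommit(con)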
I am trying to export data from an R data frame into a MySQL table named X using the sqlSave function of RODBC. But, per my process, I need to replace the data in table X every time. Can anyone tell me whether, from within R, I can drop the table and then create a new one using sqlSave, or whether there is another way to replace table X with the new data?
Thanks!!!
If you need to overwrite the table, the simplest approach is to use RMySQL and dbWriteTable:
dbWriteTable(connection, "MyTable", df, overwrite = TRUE, row.names = FALSE)  ## df is your data frame
PS: if the table doesn't exist, it will be created on the first call.
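If you would rather stay with RODBC, sqlSave's safer argument is worth knowing: with safer = FALSE it is allowed to drop or empty an existing table before writing. A sketch ("channel" is an RODBC connection from odbcConnect(); the other names are placeholders):

library(RODBC)
## safer = FALSE lets sqlSave replace the contents of an existing table
sqlSave(channel, dat = mydata, tablename = "X", rownames = FALSE, safer = FALSE)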