Replacing a MySQL table using R code

I am trying to export data from an R data frame into a MySQL table named X using the sqlSave function of RODBC. As part of my process, I need to replace the data in table X every time. Can anyone let me know how I can do this from R, either by dropping the table and then creating a new one with sqlSave, or by replacing table X with the new data directly?
Thanks!!!

If you need to overwrite the table:
The simplest way is to use RMySQL and dbWriteTable (note the argument order is connection, table name, then data frame):
dbWriteTable(connection, "MyTable", data.frame, overwrite = TRUE, row.names = FALSE)
PS: if the table doesn't exist, it will be created on the first call.
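A fuller sketch of that answer, with a connection set up end to end (the host, user, password, and database names below are placeholders, not from the original answer):

```r
# Sketch: overwrite a MySQL table from an R data frame with RMySQL.
# Connection details are hypothetical placeholders.
library(RMySQL)

con <- dbConnect(MySQL(), host = "localhost", user = "user",
                 password = "password", dbname = "mydb")

df <- data.frame(id = 1:3, value = c("a", "b", "c"))

# overwrite = TRUE drops and recreates the table before writing;
# if the table does not exist yet, it is simply created.
dbWriteTable(con, "X", df, overwrite = TRUE, row.names = FALSE)

dbDisconnect(con)
```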

Related

How to insert nested values from R into Big Query

I would like to insert data into BQ via R. When I have a normal table everything is ok. The problem begins when I have to insert table which contains a map (nested/record repeated) column.
The column is defined like this:
I use bigrquery package and DBI like here:
dbWriteTable(
  con,
  "database.table",
  table,
  overwrite = FALSE,
  append = TRUE,
  row.names = FALSE
)
How should I define the customerdata column in R so it can be inserted into Big Query? I've tried a JSON string and a list, but neither worked, though I may simply have constructed the JSON or list incorrectly. :)
I know the example is not reproducible, but a reproducible one is hardly possible here, and I don't know how I would create it.
Do you have any idea how to do this?
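One possible direction (an assumption on my part, not an answer from the thread): bigrquery's lower-level bq_table_upload can upload nested/repeated data when the record column is represented as a list-column, each row holding a list of named records. The project, dataset, and customerdata structure below are hypothetical:

```r
# Sketch: uploading a data frame with a nested (RECORD REPEATED) column
# via bigrquery's bq_table_upload instead of dbWriteTable.
# Project/dataset/table names and the record fields are guesses.
library(bigrquery)

tbl <- bq_table("my-project", "my_dataset", "my_table")

df <- data.frame(id = c(1L, 2L))
# Each row of customerdata is a list of records (key/value pairs),
# which bigrquery serializes as a repeated RECORD field.
df$customerdata <- list(
  list(list(key = "plan", value = "basic")),
  list(list(key = "plan", value = "pro"),
       list(key = "region", value = "eu"))
)

bq_table_upload(tbl, values = df)
```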

Get all rows that DBI::dbWriteTable has just written

I want to use dbWriteTable() from R's DBI package to write data into a database. Usually, the respective tables are already present, so I use the argument append = TRUE. How do I find out which rows were added to the table by dbWriteTable()? Most of the tables have certain columns with UNIQUE values, so a SELECT will work (see below for a simple example). However, this is not true for all of them, or only several columns together are UNIQUE, which makes the SELECT more complicated. In addition, I would like to put the writing and querying into a function, so I would prefer a consistent approach for all cases.
I mainly need this to get the PRIMARY KEY's added by the database and to allow a user to quickly see what was added. If important, my database is PostgreSQL and I would like to use the odbc package for connection.
I have something like this in mind, however, I am looking for a more general solution:
library(DBI)
con <- dbConnect(odbc::odbc(), dsn = "database")
dbWriteTable(con,
             name = "site",
             value = data.frame(name = c("abcd", "efgh")),
             append = TRUE)
dbGetQuery(con,
           "SELECT * FROM site WHERE name IN ('abcd', 'efgh');")
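One way to get exactly the inserted rows back, including database-generated primary keys (my suggestion, not from the thread), is to bypass dbWriteTable for this case and issue an INSERT ... RETURNING statement, which PostgreSQL supports:

```r
# Sketch: insert rows and retrieve them, including generated keys,
# in a single statement via PostgreSQL's RETURNING clause.
# Uses the 'site'/'name' table from the question; 'con' is assumed
# to be an open DBI connection.
library(DBI)

insert_and_fetch <- function(con, names) {
  # dbQuoteString is vectorized and safely escapes each value.
  values <- paste0("(", dbQuoteString(con, names), ")", collapse = ", ")
  sql <- paste0("INSERT INTO site (name) VALUES ", values, " RETURNING *;")
  dbGetQuery(con, sql)
}

# added <- insert_and_fetch(con, c("abcd", "efgh"))
# 'added' then contains exactly the new rows, generated id column included.
```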

Import table from R into PostgreSQL DB

I loaded some spatial data from my PostgreSQL DB into R with the help of the RPostgreSQL-package and ST_AsText:
dbGetQuery(con, "SELECT id, ST_AsText(geom) FROM table;")
After having done some analyses I want to go back the same way. My geom column is still formatted as character / WKT. Unfortunately, dbWriteTable doesn't accept a SQL expression the way dbGetQuery does, so I can't apply ST_GeomFromText on the way in.
What is the best way to import spatial data from R to PostgreSQL?
Up to now the only way I found, is importing the data into the DB and using ST_GeomFromText in an additional step to get my geometry data type.
I created a table in my DB using dbWriteTable() and the postGIStools package to INSERT the data (it must be a SpatialPolygonsDataFrame):
## Create an empty table on the DB (same columns, zero rows)
dbWriteTable(con, name = c("public", "<table>"), value = dataframe[0, ])
require(postGIStools)
## INSERT INTO
postgis_insert(con, df = SpatialPolygonsDataFrame, tbl = "table", geom_name = "st_astext")
dbSendQuery(con, "ALTER TABLE <table> RENAME st_astext TO geom;")
dbSendQuery(con, "ALTER TABLE <table> ALTER COLUMN geom TYPE geometry;")
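As an aside (my addition, not part of the original answer): the sf package can write geometry columns to PostGIS directly, without the WKT round-trip and ALTER TABLE steps, assuming the data can be represented as an sf object. The table name and CRS below are assumptions:

```r
# Sketch: writing spatial data straight to PostGIS with sf.
# 'dataframe' is the object from the question, with its WKT text
# in the st_astext column; the CRS (4326) is a guess.
library(RPostgreSQL)
library(sf)

con <- dbConnect(PostgreSQL(), dbname = "mydb")

# Parse the WKT character column back into a real geometry column.
sf_obj <- st_as_sf(dataframe, wkt = "st_astext", crs = 4326)

# st_write creates the table with a proper geometry-typed column.
st_write(sf_obj, dsn = con, layer = "my_table")
```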

How can we write data to a Postgres DB table using R?

I need to write data to a Postgres DB table from R. If data already exists for an ID in the table, it should be updated; otherwise the new data should be appended to the table.
I tried this using the RPostgreSQL package and got this error message:
dbWriteTable(con, 'credit', credit, overwrite = TRUE, row.names = FALSE, append = TRUE)
Error in postgresqlWriteTable(conn, name, value, ...) :
  overwrite and append cannot both be TRUE
You cannot use overwrite and append at once. If you use overwrite as follows, it will truncate the table and rewrite the data:
dbWriteTable(con, 'credit', credit, overwrite = TRUE, row.names = FALSE)
If you use append, it will add the rows, but it won't update existing ones:
dbWriteTable(con, 'credit', credit, row.names = FALSE, append = TRUE)
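Neither option gives the update-or-insert behavior asked for. One way to get it (my suggestion, not part of the original answer) is to stage the new rows in a temporary table and run an INSERT ... ON CONFLICT DO UPDATE, which PostgreSQL 9.5+ supports. The key column 'id' and the updated column 'amount' are assumptions about the credit table:

```r
# Sketch: upsert from R into PostgreSQL via a staging table.
# Assumes 'credit' has a primary key named id and a column named amount.
library(RPostgreSQL)

# 1. Write the incoming rows to a throwaway staging table.
dbWriteTable(con, "credit_staging", credit,
             overwrite = TRUE, row.names = FALSE)

# 2. Merge: insert new ids, update rows whose id already exists.
dbSendQuery(con, "
  INSERT INTO credit
  SELECT * FROM credit_staging
  ON CONFLICT (id) DO UPDATE SET amount = EXCLUDED.amount;
")

# 3. Clean up.
dbSendQuery(con, "DROP TABLE credit_staging;")
```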

Persistence of data frames (R objects) into a database

I have a database table, let's say Table 1. Table 1 has 10 columns, let's assume:
column1,column2,column3,column4,column5,column6,column7,column8,column9,column10...
I have a data frame:
sample_frame <- data.frame(column1 = 1, column2 = 2, column3 = 3, column4 = 4)
I wish to persist the data frame, i.e. sample_frame, into my database table, i.e. Table 1.
Presently I am using the ROracle package to write to the database. The code I am using is as follows:
library(ROracle)
dbWriteTable(con, name = "Table 1", value = sample_frame, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "sample_schema")
I have created the connection object using dbConnect(). As far as the integrity and NOT NULL constraints of Table 1 are concerned, I have taken care of them. When I try to write to the table using dbWriteTable(), the following error is thrown:
"ORA-00947: not enough values"
Can someone correct the method I am using, or provide an alternative method of inserting selected columns (the non-nullable ones) into Table 1 while leaving the other columns empty? I am using R 2.15.3.
As I mentioned in my comment, you are getting this error because you are creating sample_frame with fewer columns than the table has. Try this (if your actual table in the database has the same column names):
sample_frame <- data.frame(column1 = 1, column2 = 2, column3 = 3, column4 = 4,
                           column5 = 5, column6 = 6, column7 = 7, column8 = 8,
                           column9 = 9, column10 = 10)
library(ROracle)
dbWriteTable(con, name = "Table 1", value = sample_frame, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "sample_schema")
Update
Considering your new requirement (inserting only some of the columns), I would suggest you prepare the INSERT statement yourself and run it with:
qry <- # your INSERT query
dbSendQuery(con, qry)
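A sketch of that approach, using ROracle's positional bind variables so only the four non-nullable columns are supplied and the rest default to NULL (the exact table and column names follow the question; treat the call details as an assumption):

```r
# Sketch: insert only selected columns into Table 1 with ROracle.
# ROracle uses :1, :2, ... bind placeholders; the 'data' argument
# binds one data frame column per placeholder, row by row.
library(ROracle)

sample_frame <- data.frame(column1 = 1, column2 = 2, column3 = 3, column4 = 4)

dbSendQuery(con,
  'INSERT INTO "Table 1" (column1, column2, column3, column4)
   VALUES (:1, :2, :3, :4)',
  data = sample_frame)

# ROracle does not autocommit by default.
dbCommit(con)
```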
