Import table from R into PostgreSQL DB

I loaded some spatial data from my PostgreSQL DB into R with the help of the RPostgreSQL package and ST_AsText:
dbGetQuery(con, "SELECT id, ST_AsText(geom) FROM table;")
After doing some analyses I want to go back the same way. My geom column is still formatted as character / WKT. Unfortunately, dbWriteTable doesn't accept arguments similar to dbGetQuery's.
What is the best way to import spatial data from R into PostgreSQL?
So far the only way I have found is importing the data into the DB and then using ST_GeomFromText in an additional step to get my geometry data type.

I created a table on my DB using dbWriteTable() and then used the postGIStools package to INSERT the data (the object must be a SpatialPolygonsDataFrame).
## Create an empty table on DB
dbWriteTable(con, name = c("public", "<table>"), value = dataframe[0, ])  # zero rows, keeps column types
require(postGIStools)
## INSERT INTO
postgis_insert(con, df = spdf, tbl = "<table>", geom_name = "st_astext")  # spdf: your SpatialPolygonsDataFrame
dbSendQuery(con, "ALTER TABLE <table> RENAME st_astext TO geom;")
dbSendQuery(con, "ALTER TABLE <table> ALTER COLUMN geom TYPE geometry;")
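For completeness, the modern route skips the WKT round-trip entirely: the sf package can write an sf object straight into a PostGIS-enabled database, creating a proper geometry column in one step. A sketch only, assuming a live PostGIS database; the connection details and table name below are placeholders:

```r
## Sketch: requires a live PostGIS database; dbname/user/table are placeholders
library(sf)
library(DBI)
library(RPostgres)

con <- dbConnect(Postgres(), dbname = "mydb", user = "me")

## Convert a SpatialPolygonsDataFrame to sf, then write it.
## st_write() creates the geometry column directly, so no
## ST_GeomFromText() / ALTER COLUMN step is needed afterwards.
polys_sf <- st_as_sf(spdf)   # spdf: your SpatialPolygonsDataFrame
st_write(polys_sf, dsn = con, layer = Id(schema = "public", table = "my_table"))

dbDisconnect(con)
```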

Related

Append two tables inside sqlite database using R

I have two very large csv files that contain the same variables. I want to combine them into one table inside a sqlite database - if possible using R.
I successfully managed to put both csv files into separate tables inside one database using inborutils::csv_to_sqlite, which imports small chunks of data at a time.
Is there a way to create a third table to which both tables are simply appended, using R (keeping in mind the limited RAM)? And if not, how else can I perform this task? Maybe via the terminal?
We assume that when the question says the two files contain the "same variables" it means the two tables have the same column names. Below we create two such test tables, BOD and BOD2, and then combine them in a create statement, producing the table both. This does the combining entirely on the SQLite side. Note that UNION removes duplicate rows while UNION ALL keeps them, so for a plain append UNION ALL is the safer choice. Finally we look at both.
library(RSQLite)
con <- dbConnect(SQLite()) # modify to refer to existing SQLite database
dbWriteTable(con, "BOD", BOD)
dbWriteTable(con, "BOD2", 10 * BOD)
dbExecute(con, "create table both as select * from BOD union all select * from BOD2")  # union all keeps duplicate rows
dbReadTable(con, "both")
dbDisconnect(con)
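If the combined table should be built up incrementally, or already exists, an INSERT INTO ... SELECT also runs entirely on the SQLite side without pulling the rows into R. A self-contained sketch with the same BOD test data:

```r
library(RSQLite)

con <- dbConnect(SQLite(), ":memory:")  # point at a file for a real database
dbWriteTable(con, "BOD", BOD)
dbWriteTable(con, "BOD2", 10 * BOD)

## Create the target table with the right columns but no rows,
## then append each source table in turn (duplicates are kept)
dbExecute(con, "CREATE TABLE both AS SELECT * FROM BOD WHERE 0")
dbExecute(con, "INSERT INTO both SELECT * FROM BOD")
dbExecute(con, "INSERT INTO both SELECT * FROM BOD2")

nrow(dbReadTable(con, "both"))  # 12: 6 rows from each source
dbDisconnect(con)
```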

Writing spatial data from R into MS SQL Server using sf::st_write()

I am creating a database for a large monitoring project. I have already set up the schema on the database and am now trying to populate the tables. I am using R's DBI package to transfer data from R to SQL Server. I have been successful in transferring all data types except for my spatial data. For the tabular data I have been using DBI::dbWriteTable(). However, from scouring other posts, it seems like loading spatial data is better done using the sf::st_write() function from the sf package. However, I am getting several errors:
In my local instance of SQL Server, I am getting an error that I am not providing a valid instance of datatype geometry. The reproducible example below will throw this error.
On my network instance of SQL Server, I am getting the error "Invalid object name 'spatial_ref_sys'". Unfortunately, I was unable to reproduce this error with example data.
N.B.: In the code below, you will need to replace the server name in the connection strings with your own local instance of SQL Server.
##Loading Necessary Packages##
library(DBI)
library(sf)
library(spData)
##Getting Example Data from R's spData package##
data("us_states")
##Creating a test database in a local instance of MS SQL Server##
con<-dbConnect(odbc::odbc(), .connection_string="driver={SQL Server Native Client 11.0};
server=localsqlserver;trusted_connection=yes")
dbSendQuery(con, "CREATE DATABASE test;")
dbDisconnect(con)
##Changing the connection string to connect directly to test database##
con2<-dbConnect(odbc::odbc(), .connection_string="driver={SQL Server Native Client 11.0};
server=localsqlserver;database=test;trusted_connection=yes")
##Writing tabular data to new table in test##
DF<-us_states_df
dbWriteTable(con2, "States", DF)
##Adding a column for spatial data##
dbSendQuery(con2, "ALTER TABLE dbo.States ADD geom geometry")
##Writing spatial data to new column##
geom_tmp<-us_states$geometry
geom<-st_transform(geom_tmp, 2992)  # EPSG code directly; the "+init=epsg:2992" syntax is deprecated in newer sf
st_write(obj=geom, dsn=con2, layer = Id(schema="dbo", table="States"), driver="MSSQLSpatial", append=TRUE)
My goal at the end of the day is simply to add the spatial data in geom to the geom column in test.dbo.states I am open to other avenues that might accomplish this. Thanks in advance for any help.
Take Care,
-Sean
After much tinkering I think I found the solution. Admittedly it's a bit of a workaround, but it isn't too ugly. Instead of trying to write a single geometry column, I wrote the entire table with the geometry serialized as text. While I could not find a way to write geometries directly into SQL Server, I found that I could write Well-Known Text (WKT). Once it was in the database I used the geometry::STGeomFromText() static method to convert from WKT to geometry. Below is the updated code:
N.B.: Change the server name in the connection strings below to your own SQL Server instance for reproducibility.
##Loading Necessary Packages##
library(DBI)
library(sf)
library(spData)
##Getting Example Data from R's spData package##
data("us_states")
##Creating a test database in a local instance of MS SQL Server##
con<-dbConnect(odbc::odbc(), .connection_string="driver={SQL Server Native Client 11.0};
server=localsqlserver;trusted_connection=yes")
dbSendQuery(con, "CREATE DATABASE test;")
dbDisconnect(con)
##Changing the connection string to connect directly to test database##
con2<-dbConnect(odbc::odbc(), .connection_string="driver={SQL Server Native Client 11.0};
server=localsqlserver;database=test;trusted_connection=yes")
##Writing tabular data to new table in test##
DF<-as.data.frame(us_states)
geom<-DF$geometry
DF[,"geom"]<-st_as_text(st_transform(geom, 2992))  # EPSG code directly; "+init=" syntax is deprecated in newer sf
##Writing table to database##
dbWriteTable(con2, Id(schema="dbo", table="States"), DF[,-7])
##Writing a SQL Statement to create new column with geometry datatype##
##Adding a column for spatial data##
dbSendQuery(con2, "ALTER TABLE dbo.States ADD geom2 geometry")
##Writing spatial data to new column##
dbSendQuery(con2, "UPDATE dbo.States
Set geom2 = geometry::STGeomFromText(geom, 2992)")
##Dropping the WKT column##
dbSendQuery(con2, "ALTER TABLE dbo.States
DROP COLUMN geom")
##View the results##
DB<-dbGetQuery(con2, "SELECT * FROM dbo.States")
DB
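To confirm the geometry survived the round trip, its WKT can be pulled back out with the geometry type's STAsText() method. A sketch against the con2 connection from the code above, so it needs the same live SQL Server instance:

```r
## Sketch: read the stored geometry back as WKT (uses con2 from above)
wkt <- dbGetQuery(con2, "SELECT geom2.STAsText() AS wkt FROM dbo.States")
substr(wkt$wkt[1], 1, 40)  # should start with POLYGON or MULTIPOLYGON
```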

How to create table in PostgreSQL using R?

I'd like to create a table in PostgreSQL using the R DBI package.
Here is a small example.
dbExecute(con, "create table data1 (var1 int not null, var2 date not null, var3 int)")
where con is the connection object.
But I got an error Failed to fetch row, plus something else that I cannot read due to a UTF-8 encoding problem.
I also tried dbSendQuery and dbGetQuery, with the same result.
How can I write code to complete this task?
One restriction applies: I know there is a dbCreateTable command which creates a table in PostgreSQL, but it uses R notation, and I want to use exact SQL notation.
Thanks in advance.
I found the way out.
All I needed was to run the following code:
RPostgres::dbSendQuery(con, "create table ...")
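More generally, DBI::dbExecute() is the usual verb for statements that return no rows (CREATE TABLE, ALTER, etc.), and it passes your SQL through verbatim; with the RPostgres backend the same call should work against PostgreSQL. A self-contained sketch, with SQLite swapped in purely so the example runs without a server:

```r
library(DBI)
library(RSQLite)  # swapped in only so this sketch runs without a server

con <- dbConnect(SQLite(), ":memory:")

## dbExecute() sends the SQL verbatim and returns the number of affected rows
dbExecute(con, "create table data1 (var1 int not null, var2 date not null, var3 int)")
dbListTables(con)  # "data1"

dbDisconnect(con)
```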

How we can write data to a postgres DB table using R?

I need to write data to a Postgres DB table using R. If data already exists for an ID in the table, it should be updated; otherwise the new data should be appended to the table.
I tried this using the 'RPostgreSQL' package and got this error message:
dbWriteTable(con, 'credit', credit, overwrite=TRUE, row.names=FALSE, append=TRUE)
Error in postgresqlWriteTable(conn, name, value, ...) : overwrite and append cannot both be TRUE
You cannot use overwrite and append at once. If you use overwrite as follows, it will truncate the table and rewrite the data:
dbWriteTable(con, 'credit', credit, overwrite=TRUE, row.names=FALSE)
If you use append, it will add the rows, but it won't update your existing results:
dbWriteTable(con, 'credit', credit, row.names=FALSE, append=TRUE)
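For the update-or-insert behaviour the question actually asks for, dbWriteTable alone is not enough; PostgreSQL 9.5+ offers INSERT ... ON CONFLICT. One common pattern is to stage the data frame in a temporary table and merge it into the target. A sketch only, assuming the RPostgres backend, a live connection con, a primary key on credit.id, and a placeholder column name amount:

```r
## Sketch: needs a live PostgreSQL (>= 9.5) connection `con` and
## a primary key on credit.id; `amount` is a placeholder column name
dbWriteTable(con, "credit_staging", credit, temporary = TRUE, row.names = FALSE)
dbExecute(con, "
  INSERT INTO credit (id, amount)
  SELECT id, amount FROM credit_staging
  ON CONFLICT (id) DO UPDATE SET amount = EXCLUDED.amount
")
```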

Replacing a mysql table using R code

I am trying to export data from an R data frame into a MySQL table named X using the sqlSave function from RODBC. As part of my process, I need to replace the data in table X every time. Can anyone let me know whether I can do this from R, either by dropping the table and creating a new one with sqlSave, or whether there is another way to replace table X with new data?
Thanks!
If you need to overwrite the table:
The simplest way is to use RMySQL and dbWriteTable.
dbWriteTable(connection, "MyTable", data.frame, overwrite = TRUE, row.names = FALSE)
PS: if the table doesn't exist, it will be created on the first call.
