How to use dbWriteTable(2) properly?

I have started to use these packages
require(RPostgreSQL)
require(sqldf)
require(caroline)
in order to move data between PostgreSQL and R.
I created a schema public in my database with some tables (one is called "data") and attributes. I imported the table "data" back into R as the data frame "test".
After manipulating it a bit I tried to upload it back into the (empty) table "data" in the PostgreSQL DB. However, I always get an error which I can't find a solution for:
dbWriteTable2(con, "data", test, fill.null = T, add.id=T,
row.names=FALSE, pg.update.seq=FALSE)
[1] "loading draft table to database"
[1] FALSE
Warning message:
In postgresqlWriteTable(conn, name, value, ...) :
table data exists in database: aborting assignTable
I know that the table "data" exists. That is why I use the dbWriteTable2 function, which seems to be able to insert the data frame into the corresponding attributes within the table "data".
When I drop the table "data" from the PostgreSQL DB and rerun, I get:
dbWriteTable2(con, "data", test, fill.null = T, add.id=T,
row.names=FALSE, pg.update.seq=FALSE)
Error in `[.data.frame`(dbGetQuery(conn, paste("select a.attname from pg_attribute a, pg_class c, pg_tables t, pg_namespace nsp", :
undefined columns selected
Just to sum up: how can I upload a data frame into an (empty) existing table within a PostgreSQL DB?
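For what it's worth, plain RPostgreSQL can append a data frame to an existing table without caroline's dbWriteTable2 — a minimal sketch, assuming con is a live connection and the columns of test match the columns of "data":

```r
library(RPostgreSQL)

# append = TRUE inserts into the existing table instead of trying to
# create it, which avoids the "table data exists in database: aborting
# assignTable" warning that the default (create) mode raises.
dbWriteTable(con, "data", test, append = TRUE, row.names = FALSE)
```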

Related

Appending data in snowflake table

I am trying to write an R data frame back into a Snowflake table using the code below:
table_id <- Id(database="database", schema="schema", table="tablename")
dbWriteTable(conn, table_id, R data frame name,append=TRUE,overwrite=FALSE)
It works the first time, when the table is not there, but fails to append data, throwing the error "Object already exists". I tried using dbAppendTable as well, but it also does not work.
This worked for me:
table_id <- Id(database="database", schema="schema", table="tablename")
dbWriteTable(conn, table_id, R data frame name, append=TRUE)

How to write an R Data Frame to a Snowflake database table

Does anyone know how to WRITE an R Data Frame to a new Snowflake database table? I have a successful Snowflake ODBC connection created in R, and can successfully query from Snowflake. The connection command is: conn <- DBI::dbConnect(odbc::odbc(), "Snowflake").
Now, I want to WRITE a data frame created in R back to Snowflake as a table. I used the following command: dbWriteTable(conn, "database.schema.tablename", R data frame name). Using this command successfully connects with Snowflake, but I get the following error message: "Error in new_result(connection@ptr, statement) : nanodbc/nanodbc.cpp:1344: 22000: Cannot perform CREATE TABLE. This session does not have a current database. Call 'USE DATABASE', or use a qualified name."
I am using a qualified database name in my "database.schema.tablename" argument in the dbWriteTable function. I don't see how to employ "USE DATABASE" in my R function. Any ideas?? Thank you!!
The API for DBI::dbWriteTable(…) requires the table name to be passed either as a literal string or as a properly quoted identifier:
dbWriteTable(conn, name, value, ...)
conn: A DBIConnection object, as returned by dbConnect().
name: A character string specifying the unquoted DBMS table name, or the result of a call to dbQuoteIdentifier().
value: a data.frame (or coercible to data.frame).
dbWriteTable(conn, "database.schema.tablename", R data frame name)
Your code above will attempt to create a table literally named "database.schema.tablename", using the database and schema context associated with the connection object.
For example, if your connection had a database DB and schema SCH set, this would have succeeded in creating a table called DB.SCH."database.schema.tablename".
To define the database, schema and table names properly, use the DBI::Id class object with the right hierarchical order:
table_id <- Id(database="database", schema="schema", table="tablename")
dbWriteTable(conn, table_id, R data frame name)
Behind the scenes, the DBI::dbWriteTable(…) function recognizes the DBI::Id class argument type for name, and converts it into a quoted identifier format via DBI::dbQuoteIdentifier(…) (as a convenience).
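A quick illustration of that convenience (the exact quoting is dialect-specific; the output shown is the typical double-quoted form, not guaranteed for every driver):

```r
library(DBI)

table_id <- Id(database = "database", schema = "schema", table = "tablename")

# dbWriteTable() does this internally when it receives an Id for `name`:
DBI::dbQuoteIdentifier(conn, table_id)
# roughly: "database"."schema"."tablename"
```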

RODBC sqlSave can not append, Access column name with white space

I cannot append a data frame to an existing table, because the column names in the Access table contain white space.
e.g.
Access table variable name: Dlr Name
data frame variable name: DlrName
I tried to replicate the Access table variable names in my df using varTypes=varTypes, but when I run the following code
sqlSave(access, Contracts_CurrMth, "T_Contracts", append=TRUE, rownames=FALSE, fast=FALSE, safer=TRUE, verbose=TRUE)
it looks like R automatically deletes the space in my df variable name. The error message is
HYS22 -1507 [Microsoft][ODBC Microsoft Access Driver] The INSERT INTO statement contains the following unknown field name: DlrName
Can anyone help me with this, please?
Some more background info:
R 32-bit, Access 2010 32-bit; both data tables have the same format.
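One workaround is to rename the data frame column so it matches the Access column name exactly, space included — a sketch, assuming "Dlr Name" is the only mismatched column:

```r
library(RODBC)

# Assign the name directly; data.frame() / make.names() would otherwise
# mangle a name containing a space.
names(Contracts_CurrMth)[names(Contracts_CurrMth) == "DlrName"] <- "Dlr Name"

# Same call as before; the column names now line up with the Access table.
sqlSave(access, Contracts_CurrMth, "T_Contracts", append = TRUE,
        rownames = FALSE, fast = FALSE, safer = TRUE, verbose = TRUE)
```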

How we can write data to a postgres DB table using R?

I need to write data to a Postgres DB table using R. If data already exists for an ID in the table, it should be updated; otherwise new rows should be appended to the table.
I tried this using the 'RPostgreSQL' package and got this error message:
dbWriteTable(con, 'credit', credit, overwrite=TRUE, row.names=FALSE, append=TRUE)
Error in postgresqlWriteTable(conn, name, value, ...) : overwrite and append cannot both be TRUE
You cannot use overwrite and append at once. If you use overwrite as follows, it will truncate the table and rewrite the data.
dbWriteTable(con, 'credit', credit,overwrite=TRUE,row.names=FALSE)
If you use append, it will add the rows, but it won't update existing ones.
dbWriteTable(con, 'credit', credit,row.names=FALSE,append=TRUE)
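If the goal is update-or-insert, dbWriteTable alone cannot do it. One common pattern is to stage the rows and merge them with INSERT ... ON CONFLICT — a sketch, assuming PostgreSQL 9.5+, a primary key id on credit, and a hypothetical amount column (adapt the column list to your table):

```r
library(RPostgreSQL)

# 1. Load the data frame into a throwaway staging table.
dbWriteTable(con, "credit_staging", credit, row.names = FALSE, overwrite = TRUE)

# 2. Merge: insert new ids, update rows whose id already exists.
dbSendQuery(con, "
  INSERT INTO credit
  SELECT * FROM credit_staging
  ON CONFLICT (id) DO UPDATE SET amount = EXCLUDED.amount;")

# 3. Clean up.
dbRemoveTable(con, "credit_staging")
```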

Persistence of data frames(R objects) into Database

I have a database table, let's say Table 1. Table 1 has 10 columns, let's assume:
column1,column2,column3,column4,column5,column6,column7,column8,column9,column10...
I have a data-frame as
sample_frame<-data.frame(column1=1,column2=2,column3=3,column4=4)
I wish to persist the data frame, i.e. sample_frame, into my database table, i.e. Table 1.
Presently I am using the ROracle package to write into the database. The code I am using is as follows:
library(ROracle)
dbWriteTable(con, name="Table 1", value=sample_frame, row.names = FALSE,
overwrite = FALSE,append = TRUE, schema ="sample_schema")
I have created the connection object using dbConnect(). As far as the integrity and null constraints of Table 1 are concerned, I have taken care of them. When I try to write into the table using dbWriteTable(), the following error is thrown:
"ORA-00947: not enough values"
Can someone correct the method I am using, or provide an alternative method of inserting selected columns (the non-nullable columns) into Table 1 while leaving the other columns empty? I am using R 2.15.3.
As I mentioned in my comment, you are getting this error because you are creating sample_frame with fewer columns than the table has. Try this (if your actual table in the database has the same column names):
sample_frame<-data.frame(column1=1,column2=2,column3=3,column4=4,
column5=5,column6=6,column7=7,column8=8,
column9=9,column10=10)
library(ROracle)
dbWriteTable(con, name="Table 1", value=sample_frame, row.names = FALSE,
overwrite = FALSE,append = TRUE, schema ="sample_schema")
Update
Considering your new requirement, I would suggest you prepare an update query and run it as follows:
qry <- "..."  # your update query
dbSendQuery(con, qry)
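A sketch of what that could look like using ROracle's bind-variable support (the :1/:2 placeholders and the choice of columns are illustrative — adapt them to your table and key):

```r
library(ROracle)

# Update one column keyed on another, binding values from the data frame.
qry <- 'UPDATE "Table 1" SET column2 = :1 WHERE column1 = :2'
dbSendQuery(con, qry,
            data = data.frame(sample_frame$column2, sample_frame$column1))
dbCommit(con)
```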
