Append rows to MySQL table from R

I have a table 'df1' in MySQL and I am now trying to append the next set of rows ('df2') to it, like so:
connection <- dbConnect(MySQL(), user = 'root', password = 'pass',
                        host = 'localhost', dbname = 'data')
dbWriteTable(connection,"df1",df2,row.names=T,append=T)
ERROR: could not run statement: Table 'df1' already exists
Please suggest any modifications to the above code. Thanks in advance.

The following works fine for me:
library(RMySQL)
library(stringi)
con <- dbConnect(MySQL(), user = 'test', password = '****',
                 host = 'localhost', dbname = 'test')
dbf1 <- data.frame(x = round(runif(500, 1, 1000), 0),
                   y = rnorm(500, mean = 100, sd = 21),
                   z = stri_rand_strings(500, 5),
                   stringsAsFactors = FALSE)
dbWriteTable(con, "test1", dbf1)
dbDisconnect(con)
and then:
library(RMySQL)
library(stringi)
con <- dbConnect(MySQL(), user = 'test', password = '****',
                 host = 'localhost', dbname = 'test')
dbf2 <- data.frame(x = round(runif(300, 10000, 11000), 0),
                   y = rnorm(300, mean = 100, sd = 21),
                   z = stri_rand_strings(300, 5),
                   stringsAsFactors = FALSE)
dbWriteTable(con, "test1", dbf2, append = TRUE)
dbGetQuery(con, "SELECT count(x) FROM test1")
dbDisconnect(con)
The query returns:
count(x)
1 800
showing that the second set of 300 rows has been appended as expected.
You need to give a working example of the error that we can run. The immediate thing that springs to mind is that using T and F as abbreviations for TRUE and FALSE is bad practice! Whilst TRUE and FALSE are reserved words in R and cannot be changed, T and F are ordinary variables and can be modified. So, without seeing your whole script, there is no guarantee that row.names=T,append=T actually means that append=TRUE!
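To see the hazard concretely, here is a minimal sketch (nothing in it touches a database):

```r
# TRUE and FALSE are reserved words and cannot be reassigned,
# but T and F are ordinary variables that any code can overwrite:
T <- FALSE          # perfectly legal
isTRUE(T)           # now FALSE
# Any later call relying on the abbreviation silently changes meaning:
# dbWriteTable(con, "df1", df2, append = T)   # append is now FALSE!
rm(T)               # remove the masking variable; base::T is visible again
isTRUE(T)           # TRUE
```

Writing append = TRUE explicitly removes this whole class of bug.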

Related

dbAppendTable() error when I try to append data to a local server

I'm just starting my journey with R, so I'm a complete newbie and I can't find anything that helps me solve this.
I have a CSV table (random integers in each column) with 9 columns. I read 8 of them and want to append them to a SQL table with 8 fields (Col1 ... 8, all ints). After loading the CSV into RStudio, it looks right and has only 8 columns:
The code I'm using is:
# Libraries
library(DBI)
library(odbc)
library(tidyverse)
# CSV files
df <- head(
  read_delim(
    "C:/Data/test.txt",
    " ",
    trim_ws = TRUE,
    skip = 1,
    skip_empty_rows = TRUE,
    col_types = cols('X7' = col_skip())
  ),
  -1
)
# Add column headers
col_headings <- c('Col1', 'Col2', 'Col3', 'Col4', 'Col5', 'Col6', 'Col7', 'Col8')
names(df) <- col_headings
# Connect to SQL Server
con <- dbConnect(odbc(), "SQL", timeout = 10)
# Append data
dbAppendTable(conn = con,
              schema = "tmp",
              name = "test",
              value = df,
              row.names = NULL)
I'm getting this error message:
> Error in result_describe_parameters(rs@ptr, fieldDetails) :
>   Query requires '8' params; '18' supplied.
I ran into this issue too. I agree with Hayward Oblad: the dbAppendTable() function appears to be finding another table of the same name, which throws the error. Our solution was to specify the name parameter as an Id() (from DBI::Id()).
So taking your example above:
# Append data
dbAppendTable(conn = con,
              name = Id(schema = "tmp", table = "test"),
              value = df,
              row.names = NULL)
Ran into this issue...
Error in result_describe_parameters(rs@ptr, fieldDetails) : Query requires '6' params; '18' supplied.
when saving to a Snowflake database, and I couldn't find any good information on the error.
It turns out there was a test schema whose tables had exactly the same names as those in the prod schema. DBI::dbAppendTable() doesn't differentiate between the schemas, so until the tables in the test schema were renamed to unique names, the params error persisted.
Hope this saves someone the 10 hours I spent trying to figure out why DBI was throwing the error.
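A quick way to check whether the same table name exists in more than one schema is to query the catalog. This is a hedged sketch: it assumes a database that exposes INFORMATION_SCHEMA (as SQL Server and Snowflake do), and 'test' stands in for your table name:

```r
library(DBI)
# List every schema containing a table named 'test'; if more than one
# row comes back, dbAppendTable() may be resolving the wrong table
dbGetQuery(con, "
  SELECT TABLE_SCHEMA, TABLE_NAME
  FROM INFORMATION_SCHEMA.TABLES
  WHERE TABLE_NAME = 'test'
")
```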
See here for more on this.
ODBC/DBI in R will not write to a table with a non-default schema in R
Add name = Id(schema = "my_schema", table = "table_name") to DBI::dbAppendTable(), or in my case to DBI::dbWriteTable().
Not sure why the function doesn't use the schema from my connection object, though; it seems redundant.

ROracle dbWriteTable affirms insertion that didnt happen

My goal is to create a table in the database and fill it with data afterwards. This is my code:
library(ROracle)
# ... "con" is the connection string, created in an earlier stage!
# 1 create example
testdata <- data.frame(A = c(1,2,3), B = c(4,5,6))
# 2 create-statement
createTable <- paste0("CREATE TABLE TestTable(",
                      paste(paste(colnames(testdata), c("integer", "integer")), collapse = ","),
                      ")")
# 3 send and execute query
dbGetQuery(con, createTable)
# 4 write example data
dbWriteTable(con, "TestTable", testdata, row.names = TRUE, append = TRUE)
I have already succeeded a few times: the table was created and filled.
Now step 4 no longer works, even though R returns TRUE after dbWriteTable() executes. The table stays empty.
I know this is a vague question, but does anyone have an idea what could be wrong here?
I found the solution to my problem. After creating the table in step 3, you have to commit! After that, the data is written into the table.
library(ROracle)
# ... "con" is the connection string, created in an earlier stage!
# 1 create example
testdata <- data.frame(A = c(1,2,3), B = c(4,5,6))
# 2 create-statement
createTable <- paste0("CREATE TABLE TestTable(",
                      paste(paste(colnames(testdata), c("integer", "integer")), collapse = ","),
                      ")")
# 3 send and execute query
dbGetQuery(con, createTable)
# NEW LINE: COMMIT!
dbCommit(con)
# 4 write example data
dbWriteTable(con, "TestTable", testdata, row.names = TRUE, append = TRUE)

Error When Insert data frame using RMySQL on R

Using R, I tried to insert a data frame. My script looks like this:
con <- dbConnect(RMySQL::MySQL(), username = "xxxxxx", password = "xxxxxx",
                 host = "127.0.0.1", dbname = "xxxxx")
dbWriteTable(conn = con, name = 'table', value = as.data.frame(df), append = TRUE, row.names = FALSE)
dbDisconnect(con)
When the script hits the dbWriteTable() line, I get this error:
Error in .local(conn, statement, ...) : could not run statement: Invalid utf8 character string: 'M'
I am not sure why this error occurred; this same script has run fine on another machine.
Please advise.
You need to create a proper connection first; only then can you insert a data frame into your DB.
For the connection to work, the username, password, host name and database name must all be correct. It is the same code, I just removed some parameters.
Try this:
mydb <- dbConnect(MySQL(), user = 'root', password = 'password', dbname = 'my_database', host = 'localhost')
For example, inserting the iris data into my_database:
data(iris)
dbWriteTable(mydb, name = 'db', value = iris)
This inserts the iris data frame under the name 'db' in my_database.
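Note that the answer above does not address the "Invalid utf8 character string" error itself. That error usually means the session's character set does not match the data being sent. A commonly suggested fix (a sketch, not verified against every MySQL setup; the credentials and table name are placeholders) is to set the connection encoding and re-encode character columns before writing:

```r
library(RMySQL)
con <- dbConnect(RMySQL::MySQL(), username = "user", password = "pass",
                 host = "127.0.0.1", dbname = "mydb")
# Ask the server to treat this session's data as UTF-8
dbSendQuery(con, "SET NAMES utf8mb4")
# Re-encode character columns on the R side as well
df[] <- lapply(df, function(col) if (is.character(col)) enc2utf8(col) else col)
dbWriteTable(conn = con, name = "table", value = df,
             append = TRUE, row.names = FALSE)
dbDisconnect(con)
```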

Connecting R To Teradata VOLATILE TABLE

I am using R to connect to a Teradata database and am running into difficulties.
The steps in the process are:
1) Create a connection
2) Create a VOLATILE TABLE
3) Load data from a data frame into the volatile table
It fails at step 3, giving me this error message:
Error in sqlSave(conn, mydata, tablename = "TEMP", rownames = FALSE, :
first argument is not an open RODBC channel
The code is below
# Import data from text file and remove duplicates
mydata = read.table("Keys.txt")
mydata.unique = unique(mydata)
strSQL.TempTable = paste("CREATE VOLATILE TABLE TEMP………[Table Details]",
                         "UNIQUE PRIMARY INDEX(index)",
                         "ON COMMIT PRESERVE ROWS;")
# Connect to database
conn <- tdConnect('Teradata')
# Execute temp-table statement
tdQuery(strSQL.TempTable)
sqlSave(conn, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
Can anyone help? Is it closing the connection before I can upload the data into the table?
My mistake, I had been confusing libraries.
Basically the lines
# Connect To Database
conn <- tdConnect('Teradata')
# Execute Temp Table
tdQuery(strSQL.TempTable)
sqlSave(conn, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
can all be replaced by this
# Connect To Database
channel <- odbcConnect('Teradata')
# Execute Temp Table
sqlQuery(channel, strSQL.TempTable)
sqlSave(channel, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
Now I'm being told I don't have access to do this, but that is another question for another forum.
Thanks

How to insert a dataframe into a SQL Server table?

I'm trying to upload a data frame to a SQL Server table. I tried breaking it down to a simple SQL query string:
library(RODBC)
con <- odbcDriverConnect("driver=SQL Server; server=database")
df <- data.frame(a = 1:10, b = 10:1, c = 11:20)
values <- paste("(", df$a, ",", df$b, ",", df$c, ")", sep = "", collapse = ",")
cmd <- paste("insert into MyTable values ", values)
result <- sqlQuery(con, cmd, as.is=TRUE)
...which seems to work but does not scale very well. Is there an easier way?
[edited] Perhaps pasting the names(df) would solve the scaling problem:
values <- paste(" df[ , c(",
                paste(names(df), collapse = ","),
                ")] ", collapse = "")
values
#[1] " df[ , c( a,b,c )] "
You say your code is "working", but I would have thought one would use sqlSave rather than sqlQuery if one wanted to "upload".
I would have guessed this would be more likely to do what you described:
sqlSave(con, df, tablename = "MyTable")
This worked for me, and I found it simpler.
library(DBI)
library(odbc)
con <- dbConnect(odbc(),
                 Driver = "SQL Server",
                 Server = "ServerName",
                 Database = "DBName",
                 UID = "UserName",
                 PWD = "Password")
dbWriteTable(conn = con,
             name = "TableName",
             value = x) ## x is any data frame
Since INSERT INTO ... VALUES is limited to 1000 rows, you can use dbBulkCopy from the rsqlserver package.
dbBulkCopy is a DBI extension that interfaces with bcp, Microsoft SQL Server's command-line utility for quickly bulk-copying large files into tables. For example:
url <- "Server=localhost;Database=TEST_RSQLSERVER;Trusted_Connection=True;"
conn <- dbConnect('SqlServer', url = url)
## I assume the table already exists
dbBulkCopy(conn, name = 'T_BULKCOPY', value = df, overwrite = TRUE)
dbDisconnect(conn)
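If installing an extra package is not an option, the question's paste-based approach can be kept under SQL Server's 1000-row VALUES limit by inserting in batches. This is a sketch built on the question's own df and connection; MyTable is assumed to already exist with matching columns:

```r
library(RODBC)
# Insert at most 1000 rows per statement (SQL Server's VALUES-list limit)
batch_size <- 1000
for (start in seq(1, nrow(df), by = batch_size)) {
  rows <- df[start:min(start + batch_size - 1, nrow(df)), ]
  values <- paste("(", rows$a, ",", rows$b, ",", rows$c, ")",
                  sep = "", collapse = ",")
  sqlQuery(con, paste("INSERT INTO MyTable VALUES", values))
}
```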
