Connecting R To Teradata VOLATILE TABLE

I am using R to try and connect to a Teradata database and am running into difficulties.
The steps in the process are below:
1) Create a connection
2) Create a VOLATILE TABLE
3) Load information from a data frame into the volatile table
Step 3 is where it fails, giving me an error message:
Error in sqlSave(conn, mydata, tablename = "TEMP", rownames = FALSE, :
first argument is not an open RODBC channel
The code is below:
# Import Data From Text File and remove duplicates
mydata = read.table("Keys.txt")
mydata.unique = unique(mydata)
# Build the CREATE VOLATILE TABLE statement as a single string
strSQL.TempTable = paste(
  "CREATE VOLATILE TABLE TEMP………[Table Details]",
  "UNIQUE PRIMARY INDEX(index)",
  "ON COMMIT PRESERVE ROWS;"
)
# Connect To Database
conn <- tdConnect('Teradata')
# Execute Temp Table
tdQuery(strSQL.TempTable)
sqlSave(conn, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
Can anyone help? Is it closing off the connection before I can upload the information into the table?

My mistake, I have been confusing libraries.
Basically, the lines
# Connect To Database
conn <- tdConnect('Teradata')
# Execute Temp Table
tdQuery(strSQL.TempTable)
sqlSave(conn, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
can all be replaced by this
# Connect To Database
channel <- odbcConnect('Teradata')
# Execute Temp Table
sqlQuery(channel, paste(strSQL.TempTable))
sqlSave(channel, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
Now I'm being told I don't have access to do this, but that is another question for another forum.
Thanks
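For reference, here is a minimal end-to-end sketch of the corrected RODBC workflow. The DSN name 'Teradata' and Keys.txt come from the question; the single key column mykey and its type are placeholders standing in for the elided [Table Details]:
# Load RODBC, read the key file and drop duplicates
library(RODBC)
mydata <- unique(read.table("Keys.txt"))
names(mydata) <- "mykey"   # placeholder; must match the column in the table definition below

# Build the volatile-table DDL as one string (column list is illustrative)
strSQL.TempTable <- paste(
  "CREATE VOLATILE TABLE TEMP (mykey VARCHAR(20))",
  "UNIQUE PRIMARY INDEX(mykey)",
  "ON COMMIT PRESERVE ROWS;"
)

# Open the ODBC channel, create the table, then load the data frame over the same session
channel <- odbcConnect("Teradata")
sqlQuery(channel, strSQL.TempTable)
sqlSave(channel, mydata, tablename = "TEMP", rownames = FALSE, append = TRUE)
odbcClose(channel)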

Related

dbWriteTable of DBI package in R adding digits to numeric values

I use the DBI package to write values to my database tables. The database is PostgreSQL.
My data looks like this: some of my values have 0 decimal digits, some have 1.
I get this data by reading a csv with the xlsx library.
I use this code to write data to my table:
DBI::dbWriteTable(conn = con,
                  name = Id(schema = "schema", table = "table"),
                  value = df,
                  append = TRUE)
But in the database I end up with this:
The column types of min_limit and max_limit in the database are numeric.
I tried to use:
DBI::dbWriteTable(conn = con,
                  name = Id(schema = "schema", table = "table"),
                  value = format(df, digits = 2),
                  append = TRUE)
But this gives me an error:
> Error while preparing parameters ERROR: column "row_names" of relation "table" does not exist
What do I need to do to write values rounded to 2 digits to the database table?
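No answer is quoted here, but one common direction (an assumption, not taken from the thread) is to round the numeric columns in the data frame instead of passing the whole frame through format(), so the columns stay numeric:
# Round every numeric column to 2 decimal places before writing
# ("schema" and "table" are the placeholder names from the question)
num_cols <- vapply(df, is.numeric, logical(1))
df[num_cols] <- lapply(df[num_cols], round, digits = 2)

DBI::dbWriteTable(conn = con,
                  name = Id(schema = "schema", table = "table"),
                  value = df,
                  append = TRUE)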

dbAppendTable() error when I try to append data to a local server

I'm just starting my journey with R, so I'm a complete newbie and I can't find anything that will help me solve this.
I have a csv table (random integers in each column) with 9 columns. I read 8 of them and want to append them to a SQL table with 8 fields (Col1 ... Col8, all ints). After loading the csv into RStudio, it looks right and only has 8 columns:
The code I'm using is:
# Libraries
library(DBI)
library(odbc)
library(tidyverse)
# CSV files
df = head(
  read_delim(
    "C:/Data/test.txt",
    " ",
    trim_ws = TRUE,
    skip = 1,
    skip_empty_rows = TRUE,
    col_types = cols('X7' = col_skip())
  ),
  -1
)
# Add column headers
col_headings <- c('Col1', 'Col2', 'Col3', 'Col4', 'Col5', 'Col6', 'Col7', 'Col8')
names(df) <- col_headings
# Connect to SQL Server
con <- dbConnect(odbc(), "SQL", timeout = 10)
# Append data
dbAppendTable(conn = con,
              schema = "tmp",
              name = "test",
              value = df,
              row.names = NULL)
I'm getting this error message:
> Error in result_describe_parameters(rs@ptr, fieldDetails) :
> Query requires '8' params; '18' supplied.
I ran into this issue also. I agree with Hayward Oblad: the dbAppendTable function appears to be finding another table of the same name and throwing the error. Our solution was to specify the name parameter as an Id() (from DBI::Id()).
So, taking your example above:
# Append data
dbAppendTable(conn = con,
              name = Id(schema = "tmp", table = "test"),
              value = df,
              row.names = NULL)
Ran into this issue...
Error in result_describe_parameters(rs@ptr, fieldDetails) : Query
requires '6' params; '18' supplied.
when saving to a Snowflake database and couldn't find any good information on the error.
It turns out that there was a test schema where the tables within the schema had exactly the same names as in the prod schema. DBI::dbAppendTable() doesn't differentiate the schemas, so until those tables in the test schema got renamed to unique table names, the params error persisted.
Hope this saves someone the 10 hours I spent trying to figure out why DBI was throwing the error.
See here for more on this: ODBC/DBI in R will not write to a table with a non-default schema in R
Add name = Id(schema = "my_schema", table = "table_name") to DBI::dbAppendTable(), or in my case it was DBI::dbWriteTable().
Not sure why the function is not using the schema from my connection object though; it seems redundant.
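For illustration, a minimal sketch of the same pattern with dbWriteTable() (the DSN "SQL", the schema "my_schema" and the table "table_name" are placeholders, and df is the data frame from the question):
library(DBI)
library(odbc)

# Open the connection (DSN name is a placeholder)
con <- dbConnect(odbc(), "SQL", timeout = 10)

# Write into an explicit, non-default schema by passing an Id()
dbWriteTable(con,
             name = Id(schema = "my_schema", table = "table_name"),
             value = df,
             append = TRUE)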

How to update a table in database using a data frame using RPostgreSQL

I have a table in a Postgres database with some columns and values. I have imported this table into local memory, performed some computation on these columns, and now have a data frame with new values. I want this updated data frame to be placed back in the database in the same table.
drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, host = "*****", port = "****",
                 dbname = "sai", user = "sai", password = "password")
saix_account_demo <- dbReadTable(con = con, name = "saix_account")
...
dbWriteTable(con, name = "saix_account", value = saix_account_demo,
             row.names = FALSE, append = TRUE)
I have performed dbWriteTable() with append = TRUE and overwrite = FALSE, but I am facing an error saying the primary key constraint is violated. I understand the problem: I was trying to insert data instead of updating it.
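No answer is quoted here; a common pattern (an assumption, not taken from the thread) is to write the recomputed rows to a staging table and then run an UPDATE ... FROM join against the target table. A minimal sketch, where the staging table name, the updated column and the key column are illustrative placeholders:
# Write the recomputed rows to a staging table
dbWriteTable(con, name = "saix_account_stage", value = saix_account_demo,
             row.names = FALSE, overwrite = TRUE)

# Update the real table from the staging table, joining on the primary key
dbGetQuery(con, "
  UPDATE saix_account AS t
  SET some_column = s.some_column
  FROM saix_account_stage AS s
  WHERE t.account_id = s.account_id;
")

# Drop the staging table when done
dbGetQuery(con, "DROP TABLE saix_account_stage;")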

Could not run statement: Lost connection to MySQL server during query

Transferring an R data frame to a MySQL (MariaDB) database table, I get the following error: Lost connection to MySQL server during query
Example data can be loaded in R with this command:
cntxt <- read.delim("http://ec.europa.eu/eurostat/estat-navtree-portlet-prod/BulkDownloadListing?sort=1&file=comext%2FCOMEXT_METADATA%2FCLASSIFICATIONS_AND_RELATIONS%2FENGLISH%2FCN.txt", header = FALSE, quote = "", stringsAsFactors = FALSE)
I use the RMySQL package to transfer the data frame to the database:
con <- RMySQL::dbConnect(RMySQL::MySQL(), dbname = "test")
RMySQL::dbWriteTable(con, "cntxt", cntxt, row.names = FALSE, overwrite = TRUE)
The database write operation works fine on my laptop for tables of any size. But on a server it returns an error. The error only appears for sufficiently large tables (above 1000 rows):
dbWriteTable() succeeds for 1000 lines of data
RMySQL::dbWriteTable(con, "cntxt", head(cntxt,1000), row.names = FALSE, overwrite = TRUE)
# [1] TRUE
dbWriteTable() fails for 2000 lines of data
RMySQL::dbWriteTable(con, "cntxt", head(cntxt,2000), row.names = FALSE, overwrite = TRUE)
# Error in .local(conn, statement, ...) :
# could not run statement: Lost connection to MySQL server during query
Based on related questions, I have checked the value of max_allowed_packet:
mysql> SHOW VARIABLES LIKE 'max_allowed_packet';
+--------------------+----------+
| Variable_name      | Value    |
+--------------------+----------+
| max_allowed_packet | 16777216 |
+--------------------+----------+
16 MB should be more than enough for 2000 rows of data.
Where is the error coming from?
There is nothing visible in the MySQL error log /var/log/mysql/error.log.
Server version: 10.1.26-MariaDB-0+deb9u1 Debian 9.1
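Incidentally, the same check can be run from the open R connection as well; a minimal sketch, assuming con is the RMySQL connection created above:
# Query the server variable through the existing connection
RMySQL::dbGetQuery(con, "SHOW VARIABLES LIKE 'max_allowed_packet'")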
Replace the RMySQL package with the newer RMariaDB package:
install.packages("RMariaDB")
Then tables over 2000 rows can be transferred again:
con <- RMariaDB::dbConnect(RMariaDB::MariaDB(), dbname="test")
cntxt <- read.delim("http://ec.europa.eu/eurostat/estat-navtree-portlet-prod/BulkDownloadListing?sort=1&file=comext%2FCOMEXT_METADATA%2FCLASSIFICATIONS_AND_RELATIONS%2FENGLISH%2FCN.txt", header = FALSE, quote = "", stringsAsFactors = FALSE)
RMariaDB::dbWriteTable(con, "cntxt", cntxt, row.names = FALSE, overwrite = TRUE)

Error When Insert data frame using RMySQL on R

Using R, I tried to insert a data frame. My script looks like this:
con <- dbConnect(RMySQL::MySQL(), username = "xxxxxx", password = "xxxxxx", host = "127.0.0.1", dbname = "xxxxx")
dbWriteTable(conn = con, name = 'table', value = as.data.frame(df), append = TRUE, row.names = FALSE)
dbDisconnect(con)
When the script hits the line below:
dbWriteTable(conn = con, name = 'table', value = as.data.frame(df), append = TRUE, row.names = FALSE)
I got an error like the one below:
Error in .local(conn, statement, ...) : could not run statement: Invalid utf8 character string: 'M'
I am not sure why this error occurred. This is part of a script that has run well on another machine.
Please advise.
You should create a proper connection first; only then can you insert a data frame into your DB.
When creating the connection, the username, password, host name, and database name must all be correct. It is the same code, only with some parameters removed.
Try this:
mydb = dbConnect(MySQL(), user='root', password='password', dbname='my_database', host='localhost')
I just inserted the iris data into my_database:
data(iris)
dbWriteTable(mydb, name = 'db', value = iris)
This writes the iris data frame to a table named db in my_database.
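The answer above focuses on the connection itself; the "Invalid utf8 character string" message may instead point at the data's encoding. As an assumption not taken from the answer, one direction is to convert the character columns to UTF-8 before writing, for example:
# Convert all character columns of df to UTF-8 before writing
# (a sketch; assumes the source data may be in a non-UTF-8 native encoding)
char_cols <- vapply(df, is.character, logical(1))
df[char_cols] <- lapply(df[char_cols], enc2utf8)

dbWriteTable(conn = con, name = 'table', value = as.data.frame(df),
             append = TRUE, row.names = FALSE)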
