Athena Connection with R

I am new to Athena and I want to connect to it from R:
library(RJDBC)

Sys.getenv()
URL <- 'https://s3.amazonaws.com/athena-downloads/drivers/AthenaJDBC42_2.0.14.jar'
fil <- basename(URL)
if (!file.exists(fil)) download.file(URL, fil)
drv <- JDBC(driverClass="com.amazonaws.athena.jdbc.AthenaDriver", fil, identifier.quote="'")
This is the error message:
Error in .jfindClass(as.character(driverClass)[1]) :
  java.lang.ClassNotFoundException
I referred to this article:
https://aws.amazon.com/blogs/big-data/running-r-on-amazon-athena/
con <- jdbcConnection <- dbConnect(drv, 'jdbc:awsathena://athena.ap-south-1.amazonaws.com:443/',
s3_staging_dir="s3://aws-athena-query-results-ap-south-1-region/",
user=("xxx"),
password=("xxx"))
I need help; I have been struggling with this for two days. I have already downloaded the JAR files and installed Java. Thanks in advance.

You are using a newer driver version. The driver is now developed by Simba, and therefore the driver class name has changed.
The driver class is now com.simba.athena.jdbc.Driver.
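With the JAR downloaded in the question, registering the driver under the new class name would look roughly like this (a minimal sketch reusing the file name from above):
library(RJDBC)

# same JAR as downloaded above; only the driver class name changes
drv <- JDBC(driverClass = "com.simba.athena.jdbc.Driver",
            classPath = "AthenaJDBC42_2.0.14.jar",
            identifier.quote = "'")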
You may also want to check out AWR.Athena, a nice R package for interacting with Athena.
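For reference, a connection through AWR.Athena might look like the sketch below; the argument names are assumptions based on the package README and can differ between versions, so check its documentation:
library(DBI)

# region and staging bucket taken from the question; argument names are assumptions
con <- dbConnect(AWR.Athena::Athena(),
                 region = "ap-south-1",
                 S3OutputLocation = "s3://aws-athena-query-results-ap-south-1-region/",
                 Schema = "default")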

If you are still having difficulty with the JDBC drivers for Athena, you could always try RAthena or noctua. These two packages instead use an AWS SDK to make the connection to AWS Athena.
RAthena uses the Python boto3 SDK (similar to pyathena), through reticulate.
noctua uses the R paws SDK.
Code Example:
library(DBI)
# connect to AWS
# using ~/.aws/credentials to store aws credentials
con <- dbConnect(RAthena::athena(),
s3_staging_dir = "s3://mybucket/")
# upload some data into aws athena
dbWriteTable(con, "iris", iris)
# query iris in aws athena
dbGetQuery(con, "select * from iris")
NOTE: noctua works exactly the same way as the code example above; the driver is simply noctua::athena() instead.
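For completeness, the equivalent noctua connection looks like this (a minimal sketch; the bucket is a placeholder):
library(DBI)

# same pattern as the RAthena example above, just a different driver
con <- dbConnect(noctua::athena(),
                 s3_staging_dir = "s3://mybucket/")
dbGetQuery(con, "select * from iris")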

Related

Error with dbConnect to Snowflake via Rscript (but not R Studio)

I have successfully connected to and queried Snowflake from RStudio using an ODBC driver. When I try the code in Rgui.exe, it also works. However, in Rterm (or when calling Rscript from a batch script), it does not. Rterm returns the following error:
OOB curl_easy_perform() failed: SSL peer certificate or SSH remote key was not OK
My R code is:
library(ROracle)
library(methods)
username <- keyring::key_list("blake-snowflake")[1,2]
password <- keyring::key_get("blake-snowflake", keyring::key_list("my-snowflake")[1,2])
### connect to EDW
con_snowflake <- dbConnect(
odbc::odbc(),
"EDW_sample",
uid=username,
pwd=password)
I worked around it by switching from ODBC to JDBC:
library(RJDBC)
jdbcDriver <- JDBC(driverClass = "com.snowflake.client.jdbc.SnowflakeDriver",
                   classPath = "..\\java\\snowflake-jdbc-3.7.2.jar")
con_snowflake <- dbConnect(jdbcDriver,
                           "jdbc:snowflake://xxx.snowflakecomputing.com/",
                           keyring::key_list("my-snowflake")[1, 2],
                           keyring::key_get("my-snowflake", keyring::key_list("my-snowflake")[1, 2]),
                           db = "db_name", schema = "schema_name")
### read in data
query = readr::read_file("...\\query.sql")
df <- DBI::dbGetQuery(con_snowflake, query)

Writing data from R to AWS Redshift db directly

Is there a way to write large datasets from RStudio to an AWS Redshift database directly? I used the following solution that I found online, but it throws an error: unused argument: tablename=".."
install.packages('devtools')
devtools::install_github("RcppCore/Rcpp")
devtools::install_github("rstats-db/DBI")
devtools::install_github("rstats-db/RPostgres")
install.packages("aws.s3", repos = c(getOption("repos"), "http://cloudyr.github.io/drat"))
devtools::install_github("sicarul/redshiftTools")
library("aws.s3") library(RPostgres) library(redshiftTools)
pconn_r <- dbConnect(RPostgres::Postgres(), dbname="db",
host='xyz.db.amazon.com', port='1234',
user='user', password='pwd',sslmode='require')
rs_replace_table(tst, dbcon=pconn_r, tableName='abc', bucket="pqr")
Please help!
You can look at this library: redshift-r.
Or connect through RJDBC (see the sketch below).
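If you go the RJDBC route, the pattern is: download the Amazon Redshift JDBC JAR, register the driver, and connect with a JDBC URL. A rough sketch using the host, port and database names from the question as placeholders:
library(DBI)
library(RJDBC)

# download the Amazon Redshift JDBC driver (same JAR used further down this page)
download.file('http://s3.amazonaws.com/redshift-downloads/drivers/RedshiftJDBC41-1.1.9.1009.jar',
              'RedshiftJDBC41-1.1.9.1009.jar')

drv <- JDBC("com.amazon.redshift.jdbc41.Driver", "RedshiftJDBC41-1.1.9.1009.jar",
            identifier.quote = "`")
conn <- dbConnect(drv, "jdbc:redshift://xyz.db.amazon.com:1234/db?user=user&password=pwd")

# note: plain JDBC inserts row by row, so this can be slow for very large tables
dbWriteTable(conn, "abc", tst)
dbGetQuery(conn, "select count(*) from abc")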

R connection to Redshift using AWS driver doesn't work but does work with PostgreSQL driver

I am trying to establish a connection to my Redshift database after following the example provided by AWS: https://blogs.aws.amazon.com/bigdata/post/Tx1G8828SPGX3PK/Connecting-R-with-Amazon-Redshift. However, I get errors when trying to establish the connection using their recommended driver, whereas with the PostgreSQL driver I can connect to the Redshift DB without problems.
AWS says their driver is "optimized for performance and memory management", so I would rather use it. Can someone please review my code below and let me know if they see something wrong? I suspect that I am not setting the URL up correctly, but I am not sure what I should be using instead. Thanks in advance for any help.
#' This code attempts to establish a connection to a Redshift database. It
#' attempts the connection using the suggested Redshift driver but doesn't
#' work.
## Clear up space and set working directory
#Clear Variables
rm(list=ls(all=TRUE))
gc()
## Libraries for analysis
library(RJDBC)
library(RPostgreSQL)
#Create DBI driver for working with redshift driver directly
# download Amazon Redshift JDBC driver
download.file('http://s3.amazonaws.com/redshift-downloads/drivers/RedshiftJDBC41-1.1.9.1009.jar',
'RedshiftJDBC41-1.1.9.1009.jar')
# connect to Amazon Redshift using specific driver
driver_redshift <- JDBC("com.amazon.redshift.jdbc41.Driver",
"RedshiftJDBC41-1.1.9.1009.jar", identifier.quote="`")
## Using postgre connection that works
#postgre driver
driver_postgre <- dbDriver("PostgreSQL")
#establish connection
conn_postgre <- dbConnect(driver_postgre, host="nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com",
port="5439",dbname="dev",
user="xxxx", password="xxxx")
#list the tables available
tables = dbListTables(conn_postgre)
## Use the URL option to establish a connection like the example on the AWS website
# url <- "<JDBCURL>:<PORT>/<DBNAME>?user=<USER>&password=<PW>"
# uses the example from the AWS instructions:
# url <- "jdbc:redshift://demo.ckffhmu2rolb.eu-west-1.redshift.amazonaws.com:5439/demo?user=XXX&password=XXX"
# url using my redshift database
url <- "jdbc:redshift://nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com:5439/dev?user=xxxx&password=xxxx"
#attempt connect but gives an error
conn_redshift <- dbConnect(driver_redshift, url)
#gives the following error:
# Error in .jcall(drv#jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1], :
# java.sql.SQLException: Error message not found: CONN_GENERAL_ERR. Can't find bundle for base name com.amazon.redshift.core.messages, locale en
## Similier to postgre example that works but doesn't work when using redshift specific driver
#gives an error saying url is missing, but I am not sure which url to use?
conn <- dbConnect(driver_redshift, host="nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com",
port="5439",dbname="dev",
user="xxxx", password="xxxx")
# gives the following error:
#Error in .jcall("java/sql/DriverManager", "Ljava/sql/Connection;", "getConnection", :
# argument "url" is missing, with no default
I've done it this way and it works for me:
drv <- JDBC("com.amazon.redshift.jdbc41.Driver", "PathTO/RedshiftJDBC41-1.1.2.0002.jar")
conn <- dbConnect(drv, "jdbc:redshift://......redshift.amazonaws.com:5439/dev", User, PWD)
The difference I see in yours is that you don't give the full path to the Redshift JAR in driver_redshift.
Hope it works.
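Applied to the code above, that would mean something like the following sketch (the JAR path is a placeholder for wherever the downloaded file actually sits):
# point JDBC() at the full path of the downloaded JAR, not just the file name
driver_redshift <- JDBC("com.amazon.redshift.jdbc41.Driver",
                        "C:/path/to/RedshiftJDBC41-1.1.9.1009.jar",
                        identifier.quote = "`")

# single-line JDBC URL with the credentials appended, as in the AWS example
url <- "jdbc:redshift://nhdev.c6htwjfdocsl.us-west-2.redshift.amazonaws.com:5439/dev?user=xxxx&password=xxxx"
conn_redshift <- dbConnect(driver_redshift, url)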

Connect to MSSQL using DBI

I cannot connect to MSSQL using the DBI package.
I am trying the way shown in the package itself:
m <- dbDriver("RODBC") # error
Error: could not find function "RODBC"
# open the connection using user, password, etc., as
# specified in the file $HOME/.my.cnf
con <- dbConnect(m, dsn="data.source", uid="user", pwd="password")
Any help is appreciated. Thanks.
As an update to this question: RStudio has since created the odbc package (also available on GitHub) that handles ODBC connections to a number of databases through DBI. For SQL Server you use:
con <- DBI::dbConnect(odbc::odbc(),
driver = "SQL Server",
server = <serverURL>,
database = <databasename>,
uid = <username>,
pwd = <passwd>)
You can also set a dsn or supply a connection string.
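For example, the DSN and connection-string forms look roughly like this (a sketch; the DSN name, server and credentials are placeholders):
library(DBI)

# using a DSN configured on the machine
con <- dbConnect(odbc::odbc(), dsn = "my_mssql_dsn", uid = "user", pwd = "password")

# or supplying a full connection string
con <- dbConnect(odbc::odbc(),
                 .connection_string = "Driver={SQL Server};Server=myserver;Database=mydb;Uid=user;Pwd=password;")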
It looks like there used to be an RODBC driver for DBI, but not any more:
http://cran.r-project.org/src/contrib/Archive/DBI.RODBC/
A bit of tweaking got it to install on a version 3 R, but I don't have any ODBC sources to test it on. Still, m = dbDriver("RODBC") doesn't error:
> m = dbDriver("RODBC")
> m
<ODBCDriver:(29781)>
>
Suggest you ask on the R-sig-db mailing list to maybe find out what happened to this code and/or the author...
Solved.
I used the RODBC library. It has great functionality for connecting to SQL Server and running SQL queries from R.
Loading the library:
library(RODBC)
# dbDriver is a connection string containing the user ID, database name, password, etc.
dbhandle <- odbcDriverConnect(dbDriver)
Running a SQL query:
sqlQuery(channel = dbhandle, query)
That's it.
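A typical connection string for odbcDriverConnect looks like this (a sketch; server, database and credentials are placeholders):
library(RODBC)

# build the connection string, open the channel, then run a query
dbDriver <- "driver={SQL Server};server=myserver;database=mydb;uid=myuser;pwd=mypassword"
dbhandle <- odbcDriverConnect(dbDriver)
sqlQuery(channel = dbhandle, query = "select top 10 * from some_table")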

Connecting R to Cassandra using RCassandra

I have an instance of Cassandra running on my localhost. For this example I have used the default configuration provided in conf\cassandra.yaml.
I tried to connect R to Cassandra using the RCassandra package.
Basically, I have just installed the RCassandra package in R and tried to connect:
library("RCassandra")
RC.connect('localhost','9160')
RC.connect('127.0.0.1','9160')
Neither of these works. Here is the error I get:
Error in RC.connect("localhost", port = "9160") :
  cannot connect to localhost:9160
Using Cassandra-cli with the same parameters works. Can you please help with this?
Thank you
Set start_rpc: true in the cassandra.yaml file.
I could not fix it directly, but I found a way to make it work: initiate a JDBC connection first and then launch RCassandra.
#Load RJDBC
library(RJDBC)
# Load in the Cassandra JDBC driver
cassdrv <- JDBC("org.apache.cassandra.cql.jdbc.CassandraDriver",
                list.files("C://Users//aab_ITSolutions//apache-cassandra-1.0.10//lib",
                           pattern = "jar$", full.names = TRUE))
#Connect to Cassandra node and Keyspace
casscon <- dbConnect(cassdrv, "jdbc:cassandra://localhost:9160/DEMO")
#Query timeseries data
res <- dbGetQuery(casscon, "select * from StockHist limit 10")
library("RCassandra")
connx = RC.connect('localhost',9160)
