Issue with src_snowflakedb(): 'src_sql' is not an exported object

I am trying to use dplyr with a Snowflake database, using the dplyr.snowflakedb package (on GitHub). I am able to install and load the libraries, then set the classpath pointing to the latest JDBC driver (snowflake-jdbc-3.0.9.jar).
# need to load RJDBC, or error 'could not find function ".jinit"' is thrown
library(RJDBC)
library(dplyr)
library(dplyr.snowflakedb)
options(dplyr.jdbc.classpath = "drivers/snowflake-jdbc-3.0.9.jar")
When trying to set up the connection object with src_snowflakedb(), I get the following error message (I removed the account details, but they are correct in the actual code):
> nike_db <- src_snowflakedb(user = "user",
                             password = "user",
                             account = "acme",
                             opts = list(warehouse = "my_wh",
                                         db = "my_db",
                                         schema = "my_schema"))
URL: jdbc:snowflake://acme.snowflakecomputing.com:443/?account=acme&warehouse=my_wh&my_db=db&schema=my_schema
Error: 'src_sql' is not an exported object from 'namespace:dplyr'
Indeed, the current version of dplyr neither exports nor includes any src_sql() function:
> dplyr:::src_sql
Error in get(name, envir = asNamespace(pkg), inherits = FALSE) :
object 'src_sql' not found
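For reference, a quick way to check which package actually provides src_sql() (a small diagnostic sketch, assuming dbplyr is installed; dplyr's database backend was split out into the separate dbplyr package around dplyr 0.7.0):
"src_sql" %in% getNamespaceExports("dplyr")   # FALSE on current dplyr releases
"src_sql" %in% getNamespaceExports("dbplyr")  # TRUE if the function now lives in dbplyr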
Is there any way to fix this?

Please try the sample code below:
Sys.getenv("JAVA_HOME")
Sys.setenv(JAVA_HOME="C:\\Program Files\\Java\\jdk-1.8\\jre")
Sys.getenv("JAVA_HOME")
install.packages(c("rJava"))
install.packages(c("RJDBC", "DBI", "dplyr"))
install.packages("devtools")
devtools::install_github("snowflakedb/dplyr-snowflakedb")
library(RJDBC)
library(dplyr)
library(dplyr.snowflakedb)
options(dplyr.jdbc.classpath = "C:\\Driver\\snowflake-jdbc-3.11.1.jar")
my_db <- src_snowflakedb(user = "USERNAME" , password = "PASSWORD", account = "test",host = 'test.us-east-1.snowflakecomputing.com',opts = list(warehouse = "WAREHOUSE_NAME",db='DATABASE_NAME',schema='SCHEMA_NAME'))
tbl(my_db, "TABLE_NAME")
Note:
a) If your Snowflake account URL is like "https://test.snowflakecomputing.com", use the format below:
my_db <- src_snowflakedb(user = "USERNAME", password = "PASSWORD", account = "test", opts = list(warehouse = "WAREHOUSE_NAME", db = "DATABASE_NAME", schema = "SCHEMA_NAME"))
b) If your Snowflake account URL is like "https://test.us-east-1.snowflakecomputing.com", use the format below, with the region included in the host argument:
my_db <- src_snowflakedb(user = "USERNAME", password = "PASSWORD", account = "test", host = "test.us-east-1.snowflakecomputing.com", opts = list(warehouse = "WAREHOUSE_NAME", db = "DATABASE_NAME", schema = "SCHEMA_NAME"))
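Once the src is created, a quick sanity check is to pull a few rows through dplyr into R (a small sketch; TABLE_NAME is the same placeholder as above):
tbl(my_db, "TABLE_NAME") %>%   # lazy reference to the remote table
  head(10) %>%                 # translated to a LIMIT 10 query on Snowflake
  collect()                    # run the query and return a local tibble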

I had the same issue and ended up resorting to a plain JDBC connection to Snowflake; you can see the link here:
https://support.snowflake.net/s/article/ka131000000O5Jr/connecting-r-to-snowflake-using-the-jdbc-driver-mac-os-x
All you really need, though, is this:
result <- dbGetQuery(jdbcConnection, "select current_timestamp() as now")
print(result)
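For completeness, here is a minimal sketch of how the jdbcConnection used above can be created with RJDBC directly, along the lines of the linked article (the jar path, account, warehouse, database, schema, and credentials are the placeholders from the question, not verified values):
library(RJDBC)
# Load the Snowflake JDBC driver class from the downloaded jar
jdbcDriver <- JDBC(driverClass = "net.snowflake.client.jdbc.SnowflakeDriver",
                   classPath = "drivers/snowflake-jdbc-3.0.9.jar")
# Open the connection; warehouse, db, and schema are passed as URL parameters
jdbcConnection <- dbConnect(jdbcDriver,
                            "jdbc:snowflake://acme.snowflakecomputing.com/?warehouse=my_wh&db=my_db&schema=my_schema",
                            "user", "password")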

Related

Connect to Database Using dbPool RJDBC in R

I am trying to use a pool to connect to my database in R, but I get the error:
Schema must be specified when session schema is not set
How does one specify a schema? It seems like I need to specify it inside the pool. If that's the case, what's the parameter name for the schema?
pool <- dbPool(
  drv = RJDBC::JDBC(
    "xxx",
    "dir_to_jar", "`"
  ),
  dbname = "db",
  schema = "schema",  # this didn't work
  url = url,
  user = user,
  password = password,
  SSL = 'true'
)
pool %>% tbl("schema.table")
I tried several other methods using DBI::dbConnect combined with Id and it worked:
pool <- DBI::dbConnect(
  drv = RJDBC::JDBC(
    "xxx",
    "dir_to_jar", "`"
  ),
  url = url,
  user = user,
  password = password,
  SSL = 'true'
)
# Didn't work
pool %>% tbl(dbplyr::in_schema("catalog.schema", "table"))
# Works!
s <- Id(catalog = "catalog", schema = "schema", table = "table")
df <- dbReadTable(pool, s)
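If dplyr-style access is still wanted, a possible alternative sketch (assuming a recent dbplyr, roughly 2.3.0 or later, which added in_catalog()) is to pass catalog, schema, and table as separate identifiers rather than one dotted string:
library(dbplyr)
# Each part is quoted as its own identifier, unlike the single "catalog.schema" string above
pool %>% tbl(in_catalog("catalog", "schema", "table"))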

st_write cannot add to non-public schema

I am trying to write data into a non-public schema using the st_write function from the sf package.
I cannot change the way I connect to the database below, as that would break all other functions:
create_db_connection <- function(host, dbuser, dbpassword){
  drv = RPostgreSQL::PostgreSQL()
  DBuser = dbuser
  DBhost = host
  DBport = "5432"
  DBpassword = dbpassword
  db = RPostgreSQL::dbConnect(drv, dbname = "DIFM", user = DBuser, host = DBhost,
                              port = DBport, password = DBpassword)
  return(db)
}
Using the above connection, I have tried the following -
1. sf::st_write(obj = obj_geom, dsn = db, layer = c(schema_name, "temp_geometrytable"), row.names = FALSE, append = TRUE)
2. sf::st_write(obj = obj_geom, dsn = db, DBI::Id(schema=schema_name, table = "temp_geometrytable"), row.names = FALSE, append = TRUE)
Both of these keep writing into the public schema, but I want to provide the schema name dynamically and add the object to a non-public schema in the database.
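One possible workaround sketch, untested here and assuming it is acceptable to change the session search_path on the existing connection: make the target schema the default for this session, so st_write() lands there without touching create_db_connection().
db <- create_db_connection(host, dbuser, dbpassword)
# Put schema_name first on the search path for this session only
DBI::dbGetQuery(db, sprintf('SET search_path TO "%s", public', schema_name))
sf::st_write(obj = obj_geom, dsn = db, layer = "temp_geometrytable",
             row.names = FALSE, append = TRUE)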

How to: New order Binance API via RStudio

I am trying to create a new order via the Binance API using RStudio.
I found the Binance Official API Docs and figured out that I should use
POST /api/v3/order (HMAC SHA256).
The following script doesn't work for me:
url = 'https://api.binance.com/api/v3/account'
GET(url,
    add_headers("X-MBX-APIKEY" = *[my API key]*),
    query = list("symbol" = "ETHBTC",
                 "side" = "BUY",
                 "type" = "MARKET",
                 "quantity" = 1,
                 "recvWindow" = 5000,
                 "timestamp" = 1499827319559,
                 "signature" = **???**),
    verbose())
Does anyone know what I'm doing wrong, how I can create an order via the Binance API from RStudio, and how I can create my signature?
library(httr)

# Ask Binance for its server time (plus a small offset) to use as the request timestamp
timestamp <-
  as.character(jsonlite::fromJSON(content(
    GET("https://api.binance.com/api/v1/time"), "text"
  ))$serverTime + 999)

query <-
  list(
    "symbol" = "VENBTC",
    "side" = "BUY",
    "type" = "MARKET",
    "quantity" = 1,
    "recvWindow" = 5000,
    "timestamp" = timestamp
  )

# Sign the query string (key=value pairs joined with "&") with your secret key
signature <-
  digest::hmac(
    key = "*[my secret key]*",
    object = paste(names(query), query, sep = "=", collapse = "&"),
    algo = "sha256"
  )

# POST to the order endpoint (not /api/v3/account) with the signature appended
url <- "https://api.binance.com/api/v3/order"
POST(
  url,
  add_headers("X-MBX-APIKEY" = "*[my API key]*"),
  query = c(query, signature = signature),
  verbose()
)
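As a follow-up sketch (assuming the result of the POST() above is captured in a variable, e.g. response <- POST(...)), the HTTP status and parsed body show whether the order was accepted or which error Binance returned:
http_status(response)$message                  # e.g. "Success: (200) OK" when accepted
jsonlite::fromJSON(content(response, "text"))  # filled order details, or the error payload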

Query Oracle DSN in RStudio

I am using RStudio with the RODBC package and the following code:
require(RODBC)
channel <- odbcConnect(dsn = "USA", uid = "AA", pwd = "***")
odbcGetInfo(channel)
This returns all the details, but when I try to run a SQL query:
test <- sqlQuery(channel, "select * from cnty", rows_at_time = 1)
it returns an error:
Error in odbcFetchRows(channel, max = max, buffsize = buffsize, nullstring = nullstring, :
negative length vectors are not allowed
The query works if I open Microsoft Access (External Data > ODBC Database > link to the data source), click Machine Data Source, and select the source, which then lets me run a select query.
I have also tried using:
debug(odbcFetchRows)
test <- sqlQuery(channel, "select * from cnty", rows_at_time = 1)
This returns
function (channel, max = 0, buffsize = 1000, nullstring = NA_character_,
    believeNRows = TRUE)
{
    if (!odbcValidChannel(channel))
        stop("first argument is not an open RODBC channel")
    .Call(C_RODBCFetchRows, attr(channel, "handle_ptr"), max,
        buffsize, as.character(nullstring), believeNRows)
}
I got this working by using:
test <- sqlQuery(channel, "select * from cnty", rows_at_time = 1, believeNRows = FALSE)
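The same flag can also be set once when the channel is opened, so every query on it ignores the driver's row-count estimate (a small sketch using the DSN from above):
channel <- odbcConnect(dsn = "USA", uid = "AA", pwd = "***", believeNRows = FALSE)
test <- sqlQuery(channel, "select * from cnty", rows_at_time = 1)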

Keep alive function for PostgreSQL connection in Shiny app?

I have a Shiny app that uses RPostgreSQL to connect to a database and query data ad hoc.
# Initialize environment to hold SQL parameters:
library(RPostgreSQL)
if (!exists('.sql')) .sql <- new.env()
.sql$cxn <- dbConnect(PostgreSQL(), host = "localhost", dbname = "testdb", user = "myuser", password = "Passw0rd!", port = 5432)
This code runs when the app initializes, but after some time, the connection is terminated by the server:
> dbGetQuery(.sql$cxn, "SELECT 1")
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
If I simply call:
.sql$cxn <- dbConnect(PostgreSQL(), host = "localhost", dbname = "testdb", user = "myuser", password = "Passw0rd!", port = 5432)
again, it creates a second connection (which I do not want):
> dbListConnections(PostgreSQL())
[[1]]
<PostgreSQLConnection:(10601,4)>
[[2]]
<PostgreSQLConnection:(10601,5)>
(Eventually, the maximum number of connections is reached and you cannot create another: https://groups.google.com/forum/#!topic/shiny-discuss/0VjQc2a6z3M)
I want to create a function that connects to my PostgreSQL database if no connection exists yet, and keeps the existing connection alive (by running SELECT 1) if it does:
getConnection <- function(.host, .user, .pw) {
  tryCatch({
    if (!exists('cxn', where = .sql)) {
      .sql$cxn <- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                            user = .user, password = .pw, port = 5432)
    } else {
      dbGetQuery(.sql$cxn, "SELECT 1")
    }
  }, warning = function(w) {
    NULL # placeholder for warnings
  }, error = function(e) {
    print(e)
    cat("Looks like PostgreSQL connection died. Let's try to reconnect...\n")
    invisible(lapply(dbListConnections(PostgreSQL()), dbDisconnect)) # Close all db connections
    .sql$cxn <<- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                           user = .user, password = .pw, port = 5432)
  }, finally = {
    return(.sql$cxn)
  })
}
getConnection(.host = "localhost", .user = "myuser", .pw = "Passw0rd!")
EDIT 2016/03/12: Not sure why, but the function I wrote above doesn't seem to be working properly...
When I call it, I get:
> getConnection(.host = awsrds$host, .user = awsrds$username, .pw = awsrds$password)
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
<PostgreSQLConnection:(12495,0)>
In particular, this part dbGetQuery(.sql$cxn, "SELECT 1") returns:
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: no connection to the server
)
NULL
Warning message:
In postgresqlQuickSQL(conn, statement, ...) :
Could not create executeSELECT 1
And the class of this output is NULL (as opposed to an error?).
Any ideas what I'm doing wrong? Thanks!
I believe you can solve the error by including a check in your connection function. This is what I did for MySQL; it should work the same way in your case. The key is checking whether the class of try(dbGetQuery(connection, "SELECT 1")) is "try-error":
connectionFunction <- function() {
  if (!exists("connectionName", where = .GlobalEnv)) {
    # No connection yet: open one
    connectionName <<- dbConnect(MySQL(), default.file = mysqlconf, dbname = "dbName")
  } else if (class(try(dbGetQuery(connectionName, "SELECT 1"))) == "try-error") {
    # The probe query failed, so the connection is dead: drop it and reconnect
    dbDisconnect(connectionName)
    connectionName <<- dbConnect(MySQL(), default.file = mysqlconf, dbname = "dbName")
  }
  return(connectionName)
}
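Adapted to the RPostgreSQL setup from the question, a sketch along the same lines might look like this (assuming the .sql environment and connection details used above; the key point is wrapping the probe query in try() and testing for "try-error"):
getConnection <- function(.host, .user, .pw) {
  # TRUE when a connection exists but the probe query fails (i.e. the server dropped it)
  probe_failed <- exists("cxn", where = .sql) &&
    inherits(try(dbGetQuery(.sql$cxn, "SELECT 1"), silent = TRUE), "try-error")
  if (!exists("cxn", where = .sql) || probe_failed) {
    if (probe_failed) try(dbDisconnect(.sql$cxn), silent = TRUE)  # drop the dead handle
    .sql$cxn <- dbConnect(PostgreSQL(), host = .host, dbname = "testdb",
                          user = .user, password = .pw, port = 5432)
  }
  .sql$cxn
}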
