Updated documentation to interface R with Fusion Tables

Can anyone provide updated documentation for interfacing R with the Fusion Tables 1.0 API?
Old package:
http://andrei.lopatenko.com/rstat/fusion-tables.R
I am trying to understand how to use an API key to make SQL calls to Fusion Tables. After successfully connecting with ClientLogin, running the code below returns "Forbidden".
library(RCurl)

ft.executestatement <- function(auth, statement) {
  url <- "https://www.googleapis.com/fusiontables/v1/query"
  params <- list(sql = statement, key = "XXXXXXX")
  connection.string <- paste("GoogleLogin auth=", auth, sep = "")
  opts <- list(httpheader = c("Authorization" = connection.string))
  result <- postForm(uri = url, .params = params, .opts = opts)
  if (length(grep("<HTML>\n<HEAD>\n<TITLE>Parse error", result, ignore.case = TRUE))) {
    stop(paste("incorrect sql statement:", statement))
  }
  return(result)
}
Any direction is much appreciated.


R Package passing function through to new function

I'm creating a new package as a learning exercise. I've selected a few functions that serve my purpose and want to bundle them together into a single new package, to which I can then apply gWidgets to make them GUI-driven.
Documentation on how to pass existing functions through has been pretty sparse, and I am a novice at this. Any assistance would be appreciated.
I've added the necessary imports to my DESCRIPTION using usethis::use_package(), updated the NAMESPACE (using roxygen2), and created the .R files using a Stack Overflow answer as a framework. A sample .R file looks like this:
#' ODBC list drivers
#'
#' @export
odbcListDrivers <- function() {
  odbc::odbcListDrivers()
}
This works.
But when it comes to the point where I need to pass information through to a more advanced function:
#'
#' @export
#'
DBconnect <- function() {
  DBI::dbConnect()
}
I get an unused argument error when I try to run the test code.
con <- DBconnect(RMySQL::MySQL(),
                 dbname = "sakila",
                 host = "192.168.50.71",
                 port = 49153,
                 user = "root",
                 password = "Camplovers01")
Error in DBconnect(RMySQL::MySQL(), dbname = "sakila", host = "192.168.50.71", :
unused arguments (dbname = "sakila", host = "192.168.50.71", port = 49153, user = "root", password = "Camplovers01")
The question, then, is: when I wrap a function like the one above, how can I make sure I pass the correct arguments through? (The function I am trying to wrap is DBI::dbConnect().)
OK, asked and then answered by myself. The answer is in the error: unused arguments. I need to declare the arguments of the source function in the function I create; these then pass through to the original function and return the correct response. odbcListDrivers worked because it had no arguments to pass or expect.
Example
Old .R file
#'
#' @export
#'
DBconnect <- function() {
  DBI::dbConnect()
}
This of course fails as was the reason for my question.
New .R file
#'
#' @export
#'
DBconnect <- function(dsn = NULL,
                      ...,
                      timezone = "UTC",
                      timezone_out = "UTC",
                      encoding = "",
                      bigint = c("integer64", "integer", "numeric", "character"),
                      timeout = Inf,
                      driver = NULL,
                      server = NULL,
                      database = NULL,
                      uid = NULL,
                      pwd = NULL,
                      dbms.name = NULL,
                      .connection_string = NULL) {
  # forward the declared arguments through to the wrapped function
  DBI::dbConnect(odbc::odbc(),
                 dsn = dsn, ...,
                 timezone = timezone, timezone_out = timezone_out,
                 encoding = encoding, bigint = bigint, timeout = timeout,
                 driver = driver, server = server, database = database,
                 uid = uid, pwd = pwd, dbms.name = dbms.name,
                 .connection_string = .connection_string)
}
Finding the arguments of the function was a simple matter of reviewing the .R file in the GitHub repo for the Package in question.
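An alternative worth noting (a sketch of the general idea, not part of the original answer) is R's `...` pass-through, which lets a wrapper forward every argument without copying the wrapped function's signature; `make_wrapper` and `wrapped_paste` are illustrative names:

```r
# Build a wrapper that forwards all of its arguments, named or
# positional, straight through to the wrapped function via `...`.
make_wrapper <- function(fn) {
  function(...) fn(...)
}

wrapped_paste <- make_wrapper(paste)
wrapped_paste("a", "b", sep = "-")  # "a-b", same as paste("a", "b", sep = "-")
```

With this pattern a connect wrapper would be `function(drv, ...) DBI::dbConnect(drv, ...)`, and arguments added upstream keep working without touching the wrapper.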

Cannot connect to todoist REST API with R

I am not very good at working with APIs "from scratch", so to speak. My issue here probably has more to do with my ignorance of RESTful APIs than with the Todoist API specifically, but I'm struggling with Todoist because all of their documentation is geared around Python, and I'm not sure why my feeble attempts are failing. Once I get connected and authenticated, I think I'll be fine.
Todoist documentation
I've tried a couple of configurations using httr::GET(). I would appreciate a little push here as I get started.
Things I've tried, where key is my API token:
library(httr)
r<-GET("https://beta.todoist.com/API/v8/", add_headers(hdr))
For hdr, I've used a variety of values:
hdr <- paste0("Authorization: Bearer", key)
as well as just my key on its own. I also tried with projects appended to the end of the URL.
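For what it's worth, one likely trip-up in the attempts above (my note, not from the thread): paste0 with no separator glues "Bearer" to the token, and add_headers() expects name = value pairs rather than a single "Header: value" string. A minimal sketch with a placeholder token:

```r
key <- "abc123"  # placeholder API token

# paste0 gives "Authorization: Bearerabc123" -- no space before the token
bad <- paste0("Authorization: Bearer", key)

# add_headers() wants just the header *value*, passed as name = value:
auth_value <- paste("Bearer", key)  # "Bearer abc123"
# httr::GET("https://api.todoist.com/rest/v1/projects",
#           httr::add_headers(Authorization = auth_value))
```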
UPDATE These are now implemented in the R package rtodoist.
I think you nearly had it, except for the URL (or maybe it has changed since then) and the header. The following works for me, replacing my_todoist_token with the API token found here.
library(jsonlite)
library(httr)
library(magrittr)  # provides %>%

projects_api_url <- "https://api.todoist.com/rest/v1/projects"

# to get the projects as a data frame
header <- add_headers(Authorization = paste("Bearer", my_todoist_token))
project_df <- GET(url = projects_api_url, header) %>%
  content("text", encoding = "UTF-8") %>%
  fromJSON(flatten = TRUE)

# to create a new project
# unfortunately there is no way to change the dot color associated with a project
header2 <- add_headers(
  Authorization = paste("Bearer", my_todoist_token),
  `Content-Type` = "application/json",
  `X-Request-Id` = uuid::UUIDgenerate())
POST(url = projects_api_url, header2,
     body = list(name = "Your New Project Name"
                 # parent = parentID
     ),
     encode = "json")

# get a project given its project id
GET(url = paste0(projects_api_url, "/", project_df$id[10]),
    header) %>%
  content("text", encoding = "UTF-8") %>%
  fromJSON(flatten = TRUE)

# update a project
POST(url = paste0(projects_api_url, "/", project_df$id[10]),
     header2, body = list(name = "IBS-AR Biometric 2019"), encode = "json")

rvest package doesn't recognize the form

I wanted to scrape some data from the following website:
http://predstecajnenagodbe.fina.hr/pn-public-web/predmet/search
but when I tried to use rvest:
library(rvest)
session <- html_session("http://predstecajnenagodbe.fina.hr/pn-public-web/predmet/search")
form <- html_form(session)
form
it doesn't find the form, even though it is there (as you can see on the page).
I have also tried the POST function from the httr package:
parameters <- list(since = "1.6.2018", until = "5.6.2018", `g-recaptcha-response` = "03AF6jDqXcBw1qmbrxWqadGqh9k8eHAzB9iPbYdnwzhEVSgCwO0Mi6DQDgckigpeMH1ikV70egOC0UppZsO7tO9hgdpEIaI04jTpG6JxGMR6wov27kEkLuVsEp1LhxZB4WFDRkDWdqcZeVN1YkiojUpje4k-swFG7tPyG2pJN86SdT290D9_0fyfrxlpfFNL2VUwE_c15vVthcBEdXIQ68V5qv7ZVooLiwrdTO2qLDLF1yUZWiu9IJoLuBWdFzJ_zdSP6fbuj5wTpfPdsYJ2n988Gcb3q2aYdn-2TVuWoQzqs1wbh7ya_Geo7_8gnDUL92l2nqTeV9CMY58fzppPPYDJcchdHFTTxadGwCGZyKC3WUSh81qiGZ5JhNDUpPnOO-MgSr5aPbA7tei7bbypHV9OOVjPGLLtqA9g")
url <- "http://predstecajnenagodbe.fina.hr/pn-public-web/predmet/search"
httr::POST(
  url,
  body = parameters,
  config = list(
    add_headers(Referer = "http://predstecajnenagodbe.fina.hr"),
    user_agent(get_header()),
    accept_encoding = get_encoding(),
    use_proxy("xxxx", port = 80,
              username = "xxx", password = "xxxx"),
    timeout(20L),
    tcp_keepalive = FALSE
  ),
  encode = "form",
  verbose()
)
but it returns some JavaScript and this message:
Please enable JavaScript to view the page content.Your support ID is:
10544975822212666004
Could you please explain why rvest doesn't recognize the form, and why the POST doesn't work either?

Don't call db_disconnector automatically in dplyr

I'm using a tbl_sql object in my Shiny app to access a database table. I've noticed that dplyr sometimes closes this connection, perhaps because the garbage collector calls db_disconnector. Is there any way to stop this? I could close the connection on the Shiny close event instead.
It seems that if you d <- src_mysql(...) (I guess that's the backend you're using, and how you're connecting to the database?), the garbage collector will only run the disconnector once d goes out of scope. Maybe it's the database that is timing out connections as a way to manage load?
One way to test this is to write your own wrapper (rather than src_mysql()) whose result does not auto-disconnect:
src_yoursql <- function(dbname, host = NULL, port = 0L, user = "root",
                        password = "", ...) {
  if (!requireNamespace("RMySQL", quietly = TRUE)) {
    stop("RMySQL package required to connect to mysql/mariadb",
         call. = FALSE)
  }
  con <- DBI::dbConnect(RMySQL::MySQL(), dbname = dbname, host = host,
                        port = port, username = user, password = password, ...)
  info <- DBI::dbGetInfo(con)
  src_sql("mysql", con, info = info)
}

d <- src_yoursql(...)
Close it manually with
DBI::dbDisconnect(d$con)
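If the goal is to control when the disconnect happens rather than leave it to the garbage collector, one pattern (a sketch of the general idea, not from the answer above) is to tie cleanup to scope exit with on.exit(), the same shape as closing a connection in a Shiny stop handler:

```r
# Open a resource, guarantee it is closed when the function exits,
# and run some work in between. `with_resource` is an illustrative name.
with_resource <- function(open, close, body) {
  res <- open()
  on.exit(close(res), add = TRUE)
  body(res)
}

log <- character()
with_resource(
  open  = function() { log <<- c(log, "open"); "fake-connection" },
  close = function(res) log <<- c(log, "close"),
  body  = function(res) log <<- c(log, "use")
)
# log is now c("open", "use", "close") -- close always runs last,
# even if body() had thrown an error
```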

Check that connection is valid

I'm using RPostgreSQL and sqldf inside my function like this:
MyFunction <- function(Connection) {
  options(sqldf.RPostgreSQL.user = Connection[1],
          sqldf.RPostgreSQL.password = Connection[2],
          sqldf.RPostgreSQL.dbname = Connection[3],
          sqldf.RPostgreSQL.host = Connection[4],
          sqldf.RPostgreSQL.port = Connection[5])
  # ... some sqldf() stuff
}
How do I test that connection is valid?
You can check that an existing connection is valid using isPostgresqlIdCurrent.
conn <- dbConnect("RPgSQL", your_database_details)
isPostgresqlIdCurrent(conn)
For testing new connections, I don't think that there is a way to know if a connection is valid without trying it. (How would R know that the database exists and is available until it tries to connect?)
For most analysis purposes, just stopping on an error and fixing the login details is the best approach. So just call dbConnect and don't worry about extra check functions.
If you are creating some kind of application where you need to handle errors gracefully, a simple tryCatch wrapper should do the trick.
conn <- tryCatch(dbConnect(wherever), error = function(e) do_something)
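The tryCatch wrapper can be sketched in isolation; connect_fn here is an illustrative stand-in for whatever dbConnect call you actually use:

```r
# Return the connection on success, FALSE on any connection error.
safe_connect <- function(connect_fn) {
  tryCatch(connect_fn(), error = function(e) FALSE)
}

safe_connect(function() stop("database unreachable"))  # FALSE
safe_connect(function() "a-live-connection")           # "a-live-connection"
```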
My current design uses tryCatch:
Connection <- c('usr', 'secret', 'db', 'host', '5432')

CheckDatabase <- function(Connection) {
  require(sqldf)
  require(RPostgreSQL)
  options(sqldf.RPostgreSQL.user = Connection[1],
          sqldf.RPostgreSQL.password = Connection[2],
          sqldf.RPostgreSQL.dbname = Connection[3],
          sqldf.RPostgreSQL.host = Connection[4],
          sqldf.RPostgreSQL.port = Connection[5])
  out <- tryCatch(
    {
      sqldf("select TRUE;")
    },
    error = function(cond) {
      FALSE
    }
  )
  return(out)
}

if (!CheckDatabase(Connection)) {
  stop("Not valid PostgreSQL connection.")
} else {
  message("PostgreSQL connection is valid.")
}
One approach is simply to try executing the code and to catch any errors with an informative error message. Have a look at the documentation of tryCatch for the details of how this works.
The following blog post provides an introduction to the exception-based style of programming.
