Query InfluxDB 2.0 from R

I can retrieve the data with this query from Node-RED, but I need to retrieve it with R.
This is as far as I've gotten:
post.1 <- httr::POST(url = paste0("http://", influx.ip, ":8086/api/v2/signin"),
                     authenticate(influx.user, influx.passwd))
# Authentication seems to work.
influx.query <- 'from(bucket: "nr_meas")
  |> range(start: -12h)'
post.2 <- httr::POST(url = paste0("http://", influx.ip, ":8086/api/v2/query"),
                     query = list(org = influx.org),
                     add_headers("Content-Type: application/json",
                                 'Accept: application/csv'),
                     body = list(q = influx.query)
                     )
content(post.2)
# $code
# [1] "invalid"
#
# $message
# [1] "failed to decode request body: invalid character '-' in numeric literal"
Saving the data from Node-RED isn't an option (it runs on a different computer).
What is the right way to get data from InfluxDB into R?

You could try the InfluxDB 2.0 R client:
library(influxdbclient)
client <- InfluxDBClient$new(
  url = paste0("http://", influx.ip, ":8086"),
  token = "my-token",
  org = influx.org
)
data <- client$query('from(bucket: "nr_meas") |> range(start: -12h) |> drop(columns: ["_start", "_stop"])')
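As for why the raw httr attempt fails: body = list(q = influx.query) is not sent as the JSON that the Content-Type header claims, so the server ends up trying to JSON-decode text containing -12h, hence the "invalid character '-'" message. If you'd rather stay with plain httr, here is a minimal sketch, assuming an API token is available ("my-token" is a placeholder, as above) and using the /api/v2/query JSON body form {"query": ..., "type": "flux"}:
library(httr)
library(jsonlite)
resp <- httr::POST(
  url = paste0("http://", influx.ip, ":8086/api/v2/query"),
  query = list(org = influx.org),
  httr::add_headers(
    Authorization = "Token my-token",   # API token instead of the signin cookie
    `Content-Type` = "application/json",
    Accept = "application/csv"
  ),
  body = jsonlite::toJSON(list(query = influx.query, type = "flux"),
                          auto_unbox = TRUE)
)
httr::content(resp, as = "text")  # annotated CSV as returned by InfluxDB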

Related

How do I fix "The query parameter [expansion] is not one of [usernames,expansions,tweet.fields,user.fields]" in twitter R Script

I am running the following code, which is taken from https://developer.twitter.com/en/docs/tutorials/getting-started-with-r-and-v2-of-the-twitter-ap with some slight changes, because it doesn't seem to work.
library(rjson)
# https://developer.twitter.com/en/docs/tutorials/getting-started-with-r-and-v2-of-the-twitter-ap
Sys.setenv(BEARER_TOKEN = "{mybearertoken}")
require(httr)
require(jsonlite)
require(dplyr)
bearer_token <- Sys.getenv("TWITTER_BEARER")
headers <- c(`Authorization` = sprintf('Bearer %s', bearer_token))
params <- list(`user.fields` = 'description',
               `expansion` = 'pinned_tweet_id')
handle <- "codewryte"
url_handle <- paste("https://api.twitter.com/2/tweets/", handle)
# url_handle <- "https://twitter.com/TwitterDev/status/1228393702244134912"
response <- httr::GET(url = url_handle,
                      httr::add_headers(.headers = headers),
                      query = params)
obj <- httr::content(response, as = "text")
x <- fromJSON(obj)
I get the following error:
$errors[[2]]$message
[1] "The query parameter [expansion] is not one of [usernames,expansions,tweet.fields,user.fields]"
$title
[1] "Invalid Request"
$detail
[1] "One or more parameters to your request was invalid."
$type
[1] "https://api.twitter.com/2/problems/invalid-request"
I also tried https://api.twitter.com/2/users/by/username/codewryte which is my user handle with the same message.
Does anyone understand what this message means and how I can fix it?
The problem was that I was sending a list where a string needed to be sent.
response <- httr::GET(url = url_handle,
                      httr::add_headers(.headers = headers),
                      query = "expansions=pinned_tweet_id&user.fields=created_at&tweet.fields=created_at")
I am not sure why the tutorial had this list.
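A side note on the error itself: the message lists expansions (plural) among the valid parameter names, so the list form from the tutorial likely works too once the key is renamed. A sketch, keeping the rest of the question's setup unchanged:
# Same request, but with the parameter name the API reports as valid
# ("expansions" rather than "expansion").
params <- list(`user.fields` = "created_at",
               `expansions` = "pinned_tweet_id",
               `tweet.fields` = "created_at")
response <- httr::GET(url = url_handle,
                      httr::add_headers(.headers = headers),
                      query = params)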

Why am I having a problem downloading patent data with "patentsview" in R

I am trying to fetch patent data with the "patentsview" package in R, but I always get an error and couldn't find the solution anywhere. Here's my code -
# Load library
library(patentsview)
# Write query
query <- with_qfuns(
  and(
    begins(cpc_subgroup_id = 'G06N'),
    gte(patent_year = 2020)
  )
)
# Create a list of fields
# get_fields(endpoint = "patents")
# Needed fields
fields <- c(
  "patent_id",
  "patent_title",
  "patent_abstract",
  "patent_date"
)
# Send an HTTP request to the PatentsView API to get the data
pv_res <- search_pv(query = query, fields = fields, all_pages = TRUE)
The output is -
Error in xheader_er_or_status(resp) : Not Found (HTTP 404).
What am I doing wrong here? And what is the solution?

RPlumber API - returning data as CSV instead of JSON - works locally on mac, but not on ubuntu-16.04

We are using RPlumber to host an API, and our developers asked that the API endpoints provide data in CSV format rather than JSON. To handle this, we have the following:
r_endpoints.R
#* @get /test-endpoint-1
testEndpoint <- function(res) {
  mydata <- data.frame(a = c(1,2,3), b = c(3,4,5))
  print('mydata')
  print(mydata)
  con <- textConnection("val", "w")
  print(paste0('con: ', con))
  write.csv(x = mydata, con, row.names = FALSE)
  close(con)
  print('res and res.body')
  print(res)
  res$body <- paste(val, collapse = "\n")
  print(res$body)
  return(res)
}

#* @get /test-endpoint-2
testEndpoint2 <- function() {
  mydata <- data.frame(a = c(1,2,3), b = c(3,4,5))
  return(mydata)
}
run_api.r
library(plumber)
pr <- plumber::plumb("r_endpoints.R")
pr$run(host = "0.0.0.0", port = 8004)
test-endpoint-2 returns the data in JSON format, whereas test-endpoint-1 returns it in CSV format. When I run the API locally on my Mac and hit the endpoints, I receive the correct output from both.
To host the API, we've installed R + the libraries + pm2 on a Linode Ubuntu 16.04 server, and installed all (I think all) of the dependencies. When we hit the endpoints as hosted on the server, we instead receive an error response (An exception occurred.).
Here are the print statements that I've added to test-endpoint-1 to help with debugging:
[1] "mydata"
a b
1 1 3
2 2 4
3 3 5
[1] "con: 3"
[1] "res and res.body"
<PlumberResponse>
Public:
body: NULL
clone: function (deep = FALSE)
headers: list
initialize: function (serializer = serializer_json())
removeCookie: function (name, path, http = FALSE, secure = FALSE, same_site = FALSE,
serializer: function (val, req, res, errorHandler)
setCookie: function (name, value, path, expiration = FALSE, http = FALSE,
setHeader: function (name, value)
status: 200
toResponse: function ()
[1] "\"a\",\"b\"\n1,3\n2,4\n3,5"
These are the correct print statements - the same that we get locally. For some reason, the server will not allow us to return in a CSV format in the same way that my local machine allows, and I have no idea why this is the case, or how to fix it.
Edit
After updating the plumber library on my local machine, I now receive the error "An exception occurred." on my local machine as well. It seems that, in the newer version of plumber, the snippet of code I use to convert the API endpoint output to a CSV:
...
con <- textConnection("val","w")
write.csv(x = mydata, con, row.names = FALSE)
close(con)
res$body <- paste(val, collapse="\n")
return(res)
no longer works.
Edit 2
Here's my own Stack Overflow post from nearly 3 years ago on how to return the data as a CSV... the approach seems to no longer work.
Edit 3
Using @serializer csv does "work", but when I hit the endpoint the data is downloaded as a CSV file onto my local machine, whereas it would be better for the data to simply be returned in CSV format from the API and not automatically downloaded to a file...
Maybe look into this for inspiration. Here I'm modifying the response's Content-Type header to text/plain; text/plain should display in the browser, I believe.
#* @get /json
#* @serializer unboxedJSON
function() {
  dostuff()
}

#* @get /csv
#* @serializer csv list(type="text/plain; charset=UTF-8")
function() {
  dostuff()
}

dostuff <- function() {
  mtcars
}
This ugly code works.
EDIT: added an enum spec for the Swagger UI.
library(plumber)

#* @get /iris
function(type, res) {
  if (type == "csv") {
    res$serializer <- serializer_csv(type = "text/plain; charset=UTF-8")
  }
  iris
}

#* @plumber
function(pr) {
  pr_set_api_spec(pr, function(spec) {
    spec$paths$`/iris`$get$parameters[[1]]$schema$enum = c("json", "csv")
    spec
  })
}
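To check the behaviour without a browser, here is a hypothetical local test from R (assuming the API is running on port 8004 as in run_api.r, and the /iris route from the snippet above):
library(httr)
resp <- httr::GET("http://localhost:8004/iris", query = list(type = "csv"))
# With type = "csv" the endpoint switches to the CSV serializer; the
# text/plain content type means the body displays inline instead of
# triggering a file download.
cat(httr::content(resp, as = "text"))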
The "An exception occurred." issue is actually from httpuv and is fixed in the latest GitHub version of the package (see https://github.com/rstudio/httpuv/pull/289). Installing httpuv from GitHub (remotes::install_github("rstudio/httpuv")) and running the API again should resolve the issue.

Using the Adwords Traffic Estimator Service with R / ‘RAdwords’ package

So I need to get traffic estimates for some keywords via Google Adwords. I see that Google has a Traffic Estimator service API: https://developers.google.com/adwords/api/docs/guides/traffic-estimator-service .
I'm using R, which has a quite comprehensive RAdwords package, https://jburkhardt.github.io/RAdwords/faq/ , but I failed miserably to find out whether it provides access to this particular Adwords API service.
So, did anyone use R to get keyword data from Google Adwords via the Traffic Estimator? Is it possible with the RAdwords package, or does it need to be done via classic scripting?
Thanks in advance.
You can use the rgoogleads package (the AdWords API that RAdwords wraps has since been retired in favour of the Google Ads API, which rgoogleads supports).
Documentation: https://selesnow.github.io/rgoogleads/docs/
See the Keywords Planning Data section: https://selesnow.github.io/rgoogleads/docs/reference/index.html#section-keywords-planing-data
There's an R package called RAdwordsPlus, built on top of RAdwords, that will do just that.
devtools::install_github("adviso/RAdwordsPlus")
library(RAdwordsPlus)

google_auth <- doAuth()  # requires user interaction the first time
api_version <- "v201809"
customer_id <- "xxx-xxx-xxxx"

# Build keyword request
k <- keyword(
  text = c("mars cruise", "cheap cruise", "cruise"),
  match.type = c("BROAD", "PHRASE", "EXACT"),
  type = "Keyword"
)

# KeywordEstimateRequest
ker <- keyword.estimate.request(
  keyword = k,
  max.cpc = 5000000,  # micro amounts: 1,000,000 micros == one currency unit
  is.negative = FALSE
)

# AdGroupEstimateRequest
aer <- adgroup.estimate.request(ker)

# Criteria for the CampaignEstimateRequest
cer_criteria <- vector("list", length = 2)
cer_criteria[[1]] <- as.criterion(id = "2826", type = "Location")  # United Kingdom
cer_criteria[[2]] <- as.criterion(id = "1000", type = "Language")  # English

# CampaignEstimateRequest
cer <- campaign.estimate.request(aer, campaign.id = NULL, criteria = cer_criteria)

# Build the request
request <- traffic.estimator.request(cer)

# Download data from the API
r <- get.service(request = request,
                 cid = customer_id,
                 auth = google_auth,
                 api.version = api_version,
                 user.agent = "r-adwordsplus-test",
                 verbose = TRUE)

Passing longitude and latitude arguments from R to read a URL in Google Maps and extract routes

I am trying to use Google Maps and R to calculate travel times per transit between an origin and a destination.
The guidelines for the search can be found at: https://developers.google.com/maps/documentation/directions/#TravelModes
When I submit the latitude and longitude of the origin and destination as literals, things work fine.
For instance, the following R code executes correctly and we obtain the distance and trip duration (the output of the search is in JSON format and is converted to an R object with fromJSON):
library(rjson)
library(gooJSON)
route <- url('http://maps.googleapis.com/maps/api/directions/json?origin=51.13854,4.384575&destination=51.13156,4.387118&region=be&sensor=false&mode=transit&departure_time=1372665319')
route_file <- file("route_file.json")
L <- readLines(route, -1)
writeLines(L, route_file)
close(route)
routesR_zone1_to_zone20 <- fromJSON(file = route_file)
routesR_zone1_to_zone20$routes[[1]][[3]][[1]]$distance$value/1000
routesR_zone1_to_zone20$routes[[1]][[3]][[1]]$duration$value/60
However, what I am really interested in is repeating this operation for thousands of origin-destination pairs. The longitude and the latitude of the origins and destinations then become variables.
For instance:
> lat_or
[1] 51.13854
> long_or
[1] 4.384575
> lat_des
[1] 51.13156
> long_des
[1] 4.387118
> route <- url('http://maps.googleapis.com/maps/api/directions/json?origin=lat_or,long_or&destination=lat_des,long_des&region=be&sensor=false&mode=transit&departure_time=1372665319')
> route_file <- file("route_file.json")
> L <- readLines(route,-1)
> writeLines(L, route_file)
> close(route)
> routesR_zone1_to_zone20 <- fromJSON( file = route_file )
> routesR_zone1_to_zone20
$routes
list()
$status
[1] "NOT_FOUND"
Thus, although the coordinates are the same as in the previous example, this time, no route is found.
I suppose the problem is that, when the URL is accessed, lat_or etc. are not "translated" into the corresponding numeric values, and that Google tries to calculate the route between the literals "lat_or,long_or" and "lat_des,long_des".
Does anyone have a suggestion on how to circumvent the problem?
Standard text processing:
lat_or <- 51.13854
long_or <- 4.384575
lat_des <- 51.13156
long_des <- 4.387118
route <- url(paste0("http://maps.googleapis.com/maps/api/directions/json?origin=",
                    lat_or, ",", long_or,
                    "&destination=", lat_des, ",", long_des,
                    "&region=be&sensor=false&mode=transit&departure_time=1372665319"))
route_file <- file("route_file.json")
L <- readLines(route, -1)
writeLines(L, route_file)
close(route)
routesR_zone1_to_zone20 <- fromJSON(file = route_file)
routesR_zone1_to_zone20$routes[[1]][[3]][[1]]$distance$value/1000
# [1] 1.161
routesR_zone1_to_zone20$routes[[1]][[3]][[1]]$duration$value/60
# [1] 11.73333
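Since the real goal is thousands of origin-destination pairs, the same string building can go into a small helper applied row by row. A sketch under the question's assumptions (same endpoint and fixed departure_time; the routes[[1]][[3]][[1]] indexing is copied from above); rate limiting and error handling are left out:
library(rjson)

transit_route <- function(lat_or, long_or, lat_des, long_des) {
  u <- paste0("http://maps.googleapis.com/maps/api/directions/json?origin=",
              lat_or, ",", long_or,
              "&destination=", lat_des, ",", long_des,
              "&region=be&sensor=false&mode=transit&departure_time=1372665319")
  con <- url(u)
  txt <- paste(readLines(con, warn = FALSE), collapse = "")
  close(con)
  res <- fromJSON(txt)
  if (res$status != "OK") return(c(km = NA, min = NA))
  leg <- res$routes[[1]][[3]][[1]]  # same indexing as the example above
  c(km = leg$distance$value / 1000, min = leg$duration$value / 60)
}

# One pair per row; mapply returns a 2-row matrix, transposed here
pairs <- data.frame(lat_or = 51.13854, long_or = 4.384575,
                    lat_des = 51.13156, long_des = 4.387118)
t(mapply(transit_route, pairs$lat_or, pairs$long_or,
         pairs$lat_des, pairs$long_des))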
