I am trying to access Google BigQuery with R, using the 'assertthat' and 'bigrquery' packages, following these instructions:
http://thinktostart.com/using-google-bigquery-with-r/#comment-22450
http://www.lunametrics.com/blog/2014/06/25/google-analytics-data-mining-bigquery-r/
The issue comes at the authentication step: I am directed to a code in the web browser, and when I paste the code into the terminal, the following error appears:
Enter authorization code:
####CODE GOES HERE#####
Error in function (type, msg, asError = TRUE) :
Could not resolve host: accounts.google.com
I think one possible cause is that we are behind a corporate firewall: we do have internet access and I can install R packages, but if I ping google.com from the terminal I get an error. I would like to know if any of you have found a solution to this kind of problem.
Thank you very much for reading this post. Any help is appreciated.
I found a solution to the issue: it was related to the corporate proxies. If I use the visitors' wifi, I can run queries.
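In case it helps anyone who cannot switch networks, here is a minimal sketch of pointing R at the corporate proxy instead; the host proxy.example.com and port 8080 are placeholders, not my real proxy settings:
library(httr)
# Hypothetical proxy host and port; replace with your company's values
set_config(use_proxy(url = "proxy.example.com", port = 8080))
# Alternatively, set the environment variables that libcurl honours
Sys.setenv(http_proxy  = "http://proxy.example.com:8080",
           https_proxy = "http://proxy.example.com:8080")
With one of these in place, the bigrquery authentication call should go through the proxy rather than trying to reach accounts.google.com directly.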
I am trying to integrate Alertmanager with Webex to receive alerts in a channel.
For that purpose I have been trying webhook_configs. I can also see webex_configs in https://prometheus.io/docs/alerting/latest/configuration/#webex_config
But when I use webex_configs, the Prometheus Operator throws the error "unmarshal errors:\n line 32: field webex_configs not found in type config.plain".
Is there any way to solve this? If anyone has tried it, could you please help me with an example?
I created a Neo4j AuraDB database where I have dumped the movie recommendation dataset. I am able to start and connect to the instance in the cloud. However, when I try to connect to the instance via its API in RStudio, using the neo4r package and the following code
movieDB <- neo4j_api$new(
  url = curl("neo4j+s://0cc45a14.production-orch-0054.neo4j.io:7687"),
  user = "neo4j",
  password = "password"
)
movieDB$ping()
I get the error message Error: Could not resolve host: 7 in my console. Also, when I try to start a graph with the following code
graph <- startGraph(
  url = curl("neo4j+s://0cc45a14.production-orch-0054.neo4j.io:7687"),
  user = "neo4j",
  password = "password"
)
I also get the following error message in my console
Error in function (type, msg, asError = TRUE) :
Could not resolve host: 8
I do not know why the error is happening, as I have previously connected to a Neo4j sandbox instance from within RStudio without any hassle. As always, I will appreciate your helpful suggestions. Thanks!
AuraDB requires a driver that supports the Bolt protocol. I do not know of any R driver that supports the protocol at the moment; they only support HTTP (and HTTP is not available yet in AuraDB).
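As a side note, the immediate "Could not resolve host: 7" message comes from wrapping the URL in curl(): that passes a connection object rather than a character string, and the connection gets coerced to its connection number. Against an instance whose HTTP endpoint is reachable (not Aura), a neo4r connection would look roughly like this sketch, assuming a hypothetical local instance on the default port:
library(neo4r)
# Pass the URL as a plain string, not a curl() connection
movieDB <- neo4j_api$new(
  url = "http://localhost:7474",
  user = "neo4j",
  password = "password"
)
movieDB$ping()  # returns 200 when the HTTP endpoint answers
That still will not reach AuraDB, for the Bolt-only reason above; it only removes the host-resolution error.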
I am fairly new to using APIs and am trying to use the Zendesk API through R with the ZendeskR package. I believe I have connected to it; however, I keep getting the following error whenever I try to query it.
Here is my code:
library(zendeskR)
library(rjson)
zendesk(username, password, url)
ticket <- getTicket('20150')
The username, password and url are all variables to which I have assigned the correct values.
The error I get when I run it is this:
Error in function (type, msg, asError = TRUE) :
error:1407742E:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version
Please help, as I am unsure what this error means or what I am doing wrong.
Thanks.
TLS v1.1 is no longer accepted by Zendesk; please use TLS v1.2.
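ZendeskR does not expose a TLS option itself, but as a rough illustration of the idea, the underlying request can be pinned to TLS 1.2 with RCurl; the subdomain and credentials below are placeholders, and the value 6 is libcurl's CURL_SSLVERSION_TLSv1_2:
library(RCurl)
# sslversion = 6 corresponds to CURL_SSLVERSION_TLSv1_2 in libcurl
getURL("https://yoursubdomain.zendesk.com/api/v2/tickets/20150.json",
       userpwd = "user@example.com:password",
       sslversion = 6L)
If the error persists even then, the libcurl/OpenSSL that R is built against is likely too old to speak TLS 1.2 at all, in which case updating those libraries (or R itself) is the real fix.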
How do I use R to do a Google Custom Search? I have the custom search engine ID and the API key. I am currently trying this:
getURL("https://www.googleapis.com/customsearch/v1?key=API_KEY&cx=ENGINE_ID&q=searchterm")
and I get the following error:
Error in function (type, msg, asError = TRUE) :
SSL certificate problem: unable to get local issuer certificate
I am, however, able to get the results in JSON when I make a GET request in the browser. Any clue what's happening?
The httr package worked!
library(httr)
query <- "https://www.googleapis.com/customsearch/v1?key=API_KEY&cx=ENGINE_ID&q=SEARCH_TERM"
content(GET(query))
set ssl.verifypeer=TRUE in getURL
getURL("https://www.googleapis.com/customsearch/v1?key=API_KEY&cx=ENGINE_ID&q=searchterm", ssl.verifypeer=TRUE)
Since RCurl no longer works for importing data into R from Google Sheets, I have been using gsheet2tbl.
This has been working well, but today I tried to download from a recently created Google Sheet and received the following error:
url2<-"https://docs.google.com/spreadsheets/d/.../edit?usp=sharing"
d <- gsheet2tbl(url2, sheetid = 0)
Error in parse.response(r, parser, encoding = encoding) :
client error: (400) Bad Request
I double checked and everything is working just fine with my previously created Google Sheets.
Does anyone have any thoughts on how I can troubleshoot this issue?
Thanks very much,
Matt
There's a new package for reading from Google Sheets: https://github.com/jennybc/googlesheets
I find that it's fantastic for this sort of work. Give it a shot...
devtools::install_github("jennybc/googlesheets")
# run this and it will ask for user authentication...
gs_ls()
gs_read(ws = "Your worksheet")