I am trying to connect to Azure Data Lake Gen2 via R. The Azure Data Lake administrator gave me read access to a given folder.
I have:
blob_service <- 'https://DataLakeName.blob.core.windows.net'
data_lake_storage <- 'https://DataLakeName.dfs.core.windows.net'
pathToFolder <- 'path/to/folder/with/access'
blob_sas_token <- 'blobksastoken1232133210'
# compose the SAS URL from the pieces above
blob_sas_url <- paste0(blob_service, '/', pathToFolder, '?', blob_sas_token)
According to the documentation for the AzureStor library, I tried:
library(AzureStor)
end_point <- blob_endpoint(endpoint = "https://DataLakeName.blob.core.windows.net/", sas = blob_sas_token)
list_blob_containers(end_point)
Unfortunately I got HTTP status code 500, which probably means an internal server error. The administrator said that the token is correct and the endpoint also works fine.
Error in process_storage_response(response,
match.arg(http_status_handler), : Internal Server Error (HTTP
500). Failed to complete Storage Services operation. Message:
InternalError Server encountered an internal error. Please try again
after some time.
Any idea what I am doing wrong, or how I can connect to ADLS in another way using the data that I have?
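One possibility worth ruling out (a sketch under the assumption that the SAS is scoped to that folder rather than to the whole account): list_blob_containers() is an account-level operation, so a folder-scoped SAS cannot satisfy it even when the token itself is valid. Addressing the container directly sidesteps that; the container name below is a hypothetical placeholder for the first segment of the path:

library(AzureStor)

end_point <- blob_endpoint("https://DataLakeName.blob.core.windows.net/",
                           sas = blob_sas_token)

# Open the container holding the folder directly instead of listing
# all containers at the account level.
cont <- blob_container(end_point, "containername")  # hypothetical container name
list_blobs(cont, dir = "path/to/folder/with/access")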
I am trying to create a simple app (in R) using GroupMe's API, which utilizes OAuth2.0. The documentation can be found here. However, I'm getting stuck on the first step of authentication/token generation for a user. See below for my code and the response I get:
library(httr)
library(glue)

access_key <- "****"
client_id <- "****"

gendpoint <- oauth_endpoint(
  authorize = glue("https://oauth.groupme.com/oauth/login_dialog?client_id={client_id}"),
  access = glue("http://localhost:1410/?access_token={access_key}")
)

gapp <- oauth_app(
  "pingme",
  key = client_id,
  secret = access_key
)

t <- oauth2.0_token(
  endpoint = gendpoint,
  app = gapp
)
The above code is sufficient to bring in the login page, which presumably allows me to enter my credentials to obtain a token. However, when I enter my credentials, I get the following error message in R:
Authentication complete.
Error in curl::curl_fetch_memory(url, handle = handle) :
Failed to connect to localhost port 1410: Connection refused
So it looks like my authentication/login credentials worked, but somewhere an error prevented me from actually generating the token.
Could someone help me with this? This is my first time using the OAuth2.0 framework so I'm very confused. Thanks in advance!
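One observation that may explain the error (inferred from the code above, so treat it as a hypothesis): the access argument of oauth_endpoint() is meant to be the server's token-exchange URL, but here it points at the localhost redirect listener. After "Authentication complete." httr tries to POST to localhost:1410 once its own listener has shut down, hence "Connection refused". GroupMe's flow appends the token directly to the registered callback URL, so under that assumption you can skip oauth2.0_token() entirely; a sketch:

library(httr)
library(glue)

client_id <- "****"

# Open GroupMe's login dialog; after logging in, the browser is redirected
# to the app's registered callback URL with ?access_token=... appended.
browseURL(glue("https://oauth.groupme.com/oauth/login_dialog?client_id={client_id}"))

# Copy the token out of the redirect URL by hand (hypothetical placeholder),
# then pass it as the 'token' query parameter on ordinary API calls.
access_token <- "token-copied-from-redirect-url"
resp <- GET("https://api.groupme.com/v3/groups", query = list(token = access_token))
content(resp)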
I am trying to migrate my RAdwords code to rgoogleads and keep bumping into the same authentication nightmare. Some context:
I use R on my local computer, but all deployments are on a remote server that runs Shiny, RStudio and other goodies.
I have managed to successfully activate the service account. I am (almost) sure about this, since I can successfully connect to Google Ads with Python.
The same code works fine with local OAuth2.0. However, when I try service account authentication I keep getting the same error. I am sure I am doing something wrong.
This is a sample of my code:
library(rgoogleads)

gads_auth(path = '../serviceAccount.json')
gads_auth(
  email = 'my_email@company.com',
  developer_token = '_my_dev_token_'
)
gads_set_login_customer_id('xxx-xxx-0524')
gads_get_accessible_customers()
And this is the response:
<error/rlang_error>
Request had insufficient authentication scopes.
Backtrace:
1. rgoogleads::gads_get_accessible_customers()
2. gargle::response_process(ans, error_message = gads_check_errors2)
8. rgoogleads:::error_message(resp)
9. rgoogleads:::gads_abort(paste(client_id, msg))
10. cli::cli_abort(message = message, ..., .envir = .envir)
Even when I try gads_has_token() it returns TRUE (!)
Any clues? I can't seem to get my head around this.
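One thing to try (a sketch under assumptions, not a verified fix): "insufficient authentication scopes" usually means the token was minted without the Google Ads scope, so building the service-account token explicitly with gargle and handing it to rgoogleads makes the scope explicit. The subject argument assumes domain-wide delegation is configured for the service account, and passing a prebuilt token to gads_auth() assumes it follows the usual gargle pattern:

library(rgoogleads)
library(gargle)

# Mint the service-account token with the Google Ads scope spelled out.
# 'subject' is the user to impersonate; drop it if domain-wide delegation
# is not set up for this service account.
sa_token <- credentials_service_account(
  path = '../serviceAccount.json',
  scopes = 'https://www.googleapis.com/auth/adwords',
  subject = 'my_email@company.com'
)

gads_auth(token = sa_token, developer_token = '_my_dev_token_')
gads_set_login_customer_id('xxx-xxx-0524')
gads_get_accessible_customers()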
I created a Neo4j AuraDB database where I have dumped the movie recommendation dataset. I am able to start and connect to the instance in the cloud. However, when I try to connect to the instance via its API in RStudio, using the neo4r package and the following code
movieDB <- neo4j_api$new(
url = curl("neo4j+s://0cc45a14.production-orch-0054.neo4j.io:7687"),
user = "neo4j",
password = "password"
)
movieDB$ping()
I get the error message Error: Could not resolve host: 7 in my console. Also, when I try to start a graph with the following code
graph <- startGraph(
url = curl("neo4j+s://0cc45a14.production-orch-0054.neo4j.io:7687"),
user = "neo4j",
password = "password"
)
I also get the following error message in my console
Error in function (type, msg, asError = TRUE) :
Could not resolve host: 8
I do not know why the error is happening, as I have previously connected to a Neo4j sandbox instance from within R Studio without any hassle. As always, I will appreciate your helpful suggestions. Thanks!
AuraDB requires a driver that supports the Bolt protocol. I do not know of any R driver that supports Bolt at the moment; they only support HTTP (and HTTP is not available yet in AuraDB).
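(As an aside, the Could not resolve host: 7 and : 8 messages most likely come from wrapping the URL in curl(), which returns a connection object rather than a string; the url argument should be a plain character string.) Until an R driver speaks Bolt, one possible workaround is to call the official Python driver from R through reticulate; a sketch, assuming the neo4j package is installed in the active Python environment:

library(reticulate)

neo4j <- import("neo4j")  # official Python Bolt driver (pip install neo4j)

driver <- neo4j$GraphDatabase$driver(
  "neo4j+s://0cc45a14.production-orch-0054.neo4j.io:7687",
  auth = tuple("neo4j", "password")
)

# Run a Cypher query over Bolt and pull the results back into R.
session <- driver$session()
result <- session$run("MATCH (m:Movie) RETURN m.title AS title LIMIT 5")
print(result$data())

session$close()
driver$close()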
I'm just starting to get my feet wet in the API waters, and I'm trying to follow along the "Send a simple request" section at this link. When executing
library(httr)

github_api <- function(path) {
  url <- modify_url("https://api.github.com", path = path)
  GET(url)
}

resp <- github_api("/repos/hadley/httr")
I get the following error message:
Error in curl::curl_fetch_memory(url, handle = handle) : schannel:
next InitializeSecurityContext failed: SEC_E_UNTRUSTED_ROOT
(0x80090325) - The certificate chain was issued by an authority that
is not trusted.
I'm getting similar error messages for most API calls I try to make on my machine, although the call
GET("http://api.open-notify.org/astros.json")
taken from this link, happily returns data without issue. Searching Google for the error message returns a lot of posts unrelated to R specifically, and I'm having trouble determining what troubleshooting steps I can take.
Update
I have tested the call on another machine with success, so there is some setting/configuration/firewall impediment on my main machine which is preventing me from making some, but not all, API calls. This may be related to this issue. Is there a way to determine the root cause here and apply a fix?
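One way to narrow down the root cause (a diagnostic sketch only, not a fix, because it disables certificate checking): if the same call succeeds with peer verification turned off, the failure is confined to the certificate chain, which typically points at a TLS-inspecting corporate proxy or firewall injecting its own root certificate:

library(httr)

# Diagnostic only -- never leave verification disabled in production code.
resp <- GET("https://api.github.com", config(ssl_verifypeer = 0L))
status_code(resp)  # a 200 here implicates the certificate chain / proxy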
I was able to solve this by using a forward proxy which allows my machine to reach sites outside my corporate firewall, as follows (I've obscured the URL and port for obvious reasons):
proxy <- use_proxy(url = "http://myproxy", port = 9999, auth = "basic")

github_api <- function(path) {
  url <- modify_url("https://api.github.com", path = path)
  GET(url, proxy)
}
resp <- github_api("/repos/hadley/httr")
Hopefully such a forward proxy exists for anyone else facing this issue.
I am using R and the bigrquery package to access BigQuery from an R session.
This works great as long as I am on my local machine.
However, when I try to access BigQuery from R on a remote server, it does not work at all.
I tried to copy the .httr-oauth file into my home directory on the server, but this does not work.
I get the error message:
Auto-refreshing stale OAuth token.
Error in refresh_oauth2.0(self$endpoint, self$app, self$credentials) :
client error: (400) Bad Request
I really have no idea where to store the necessary credentials, and unfortunately I was not able to find anything useful by googling the topic.
By default, httr (which bigrquery uses for OAuth) will look in the R session's current working directory for .httr-oauth. You can override this location with the following (perhaps putting it in your .Rprofile if you like):
options("httr_oauth_cache"="~/.httr-oauth")
But given the error message you received, it seems the location is not the issue; it might be easier to just redo the OAuth flow on the remote server to cache a new credential. To trigger a new OAuth flow on the remote server:
1. ensure the .httr-oauth file does not exist
2. restart R
3. perform one query with bigrquery
Note that if httr tries to redirect to localhost, you can force it to do an out-of-band OAuth flow with:
options(httr_oob_default = TRUE)
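Putting those pieces together, a minimal sketch of forcing a fresh out-of-band flow on the remote server (the project ID is a hypothetical placeholder, and query_exec() is the query function from bigrquery releases of that era):

# Run on the remote server after deleting any stale .httr-oauth file.
options(httr_oauth_cache = "~/.httr-oauth",  # cache the credential in $HOME
        httr_oob_default = TRUE)             # print a code instead of redirecting

library(bigrquery)

# The first query triggers the OAuth dance; paste the code from the browser.
query_exec("SELECT 1", project = "my-project-id")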