I'm trying to reverse-geocode several coordinates with the revgeocode command of the ggmap package, but when I run my code it tells me that it is not possible to connect to the API URL.
I'm trying to understand:
How do I place the API key inside the code so that the query can be made?
Why does it tell me that I have already exceeded the number of queries if I have not made any?
Below is part of the code:
ll <- cbind(Longitud, Latitud)
LL_1 <- as.matrix(ll)
DirR <- rep(0, nrow(LL_1))
for (j in 1:nrow(LL_1)) {
  DirR[j] <- revgeocode(LL_1[j, ])
}
You need to upgrade ggmap to version 2.7.903 from GitHub and register your Google Maps API key. There is a tutorial under this link.
How do I place the API key inside the code so that the query can be made?
You have to call register_google(key = "...") in every new R session before you execute any calls to the API.
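A minimal sketch of the whole flow, assuming ggmap >= 2.7.903 installed from the dkahle/ggmap GitHub repo and a billing-enabled key ("YOUR_API_KEY" is a placeholder):
devtools::install_github("dkahle/ggmap")  # one-time install of the development version
library(ggmap)
register_google(key = "YOUR_API_KEY")  # run once per R session
DirR <- rep(NA_character_, nrow(LL_1))
for (j in 1:nrow(LL_1)) {
  DirR[j] <- revgeocode(LL_1[j, ], output = "address")  # longitude first, as in the question
}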
Why does it tell me that I have already exceeded the number of queries if I have not made any?
If you do not register a billing-enabled Google Maps API key, you share the quota with all the other requests from your geographical region.
I'm trying to connect to a DB in Azure Data Explorer from R using the AzureKusto library. Following this documentation, https://github.com/Azure/AzureKusto, after calling the kusto_database_endpoint(...) function I need to open a browser page and enter the printed code manually. Is there a way to skip this manual step and do it automatically? Or are there alternatives for connecting to the ADX DB?
Thanks for the help!
Co-creator of the package here. Thank you for the question. Yes, you can use the get_kusto_token function to obtain a token and then pass it to kusto_database_endpoint as the .query_token argument. get_kusto_token supports the following authentication flows:
"authorization_code"
"device_code"
"client_credentials"
"resource_owner"
For example, if you have an AAD application service principal that has access to the Azure Data Explorer cluster, you can use its ID and secret to authenticate:
library(AzureKusto)

# authenticate using the client_credentials method: see ?AzureAuth::get_azure_token
token <- get_kusto_token("https://mycluster.kusto.windows.net",
                         tenant = "mytenant",
                         authtype = "client_credentials",
                         app = "myappid",
                         password = "myclientsecret")

# pass the token to the endpoint so no browser prompt is needed
endpoint <- kusto_database_endpoint(server = "mycluster.kusto.windows.net",
                                    database = "mydb",
                                    .query_token = token)
The help page ?AzureKusto::get_kusto_token provides more detailed information on this. Also, please note that the get_kusto_token function is a wrapper around AzureAuth::get_azure_token. The readme for the AzureAuth R package has more detailed examples of other methods of obtaining an Azure access token: https://github.com/Azure/AzureAuth
I'm trying to get data from my GA account using R. I've managed to do this with the RGoogleAnalytics package. Unfortunately I have a problem with authentication on a production server. Everything works, but I have to refresh the token each day (and that is not acceptable for me). I've read about the need for a refresh_token, but I have no idea how to obtain it using this package... What I did is:
library("RGoogleAnalytics")
client.id <- "XXX"
client.secret <- "XXX"
token <- Auth(client.id, client.secret)
save(token, file = "./auth/token")
# in a later R session, reload and validate the saved token
load("./auth/token")
ValidateToken(token)
It works only for a few hours (while the token is valid), and the next day I get the error: Error: Refresh token not available. How do I get this refresh token? The Auth function does not seem to give it to me, and after reading the whole Internet I still have no idea how to deal with it. Could you help?
The RGoogleAnalytics package gives an easy way to extract Google Analytics data, but I think you should consider other packages that are more up to date and a little easier to deal with (like the googleAnalyticsR package).
In my experience, practically all Google services require this authentication when using the API, and generally the token expires; however, it is possible to generate a 'permanent' token.
The procedure for getting this 'permanent' token is a little hard, but once learned, you will be able to apply it in a similar way to most of Google's API services!
In summary, you need to create a project in the GCP (Google Cloud) console to create an authentication key:
I recommend reading this first:
https://cloud.google.com/docs/authentication
In the documentation link below, you can find a step-by-step guide to this authentication for the RGoogleAnalytics package.
https://cran.r-project.org/web/packages/RGoogleAnalytics/RGoogleAnalytics.pdf
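If you do switch to googleAnalyticsR, a service account avoids the daily re-authentication entirely, because the JSON key does not expire the way a user token does. A minimal sketch, assuming you have created a service-account key in your GCP project, downloaded it as a JSON file, and added the service account's email as a read-only user on the GA view (the file name and view ID below are placeholders):
library(googleAuthR)
library(googleAnalyticsR)
# authenticate non-interactively with the service-account key file
gar_auth_service(json_file = "my-service-account-key.json",
                 scope = "https://www.googleapis.com/auth/analytics.readonly")
# pull sessions per day for January
df <- google_analytics(viewId = "123456",
                       date_range = c("2019-01-01", "2019-01-31"),
                       metrics = "sessions",
                       dimensions = "date")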
I am moving from Google to the HERE API geocoding service. I need to implement batch geocoding using the HERE Batch Geocoder API. So far I have been able to get almost all of the needed info from the Batch Geocoder API. However, I am not able to find a way to get time-zone info in the Batch Geocoder API response.
For example, I was able to get the TimeZone object through the HERE Forward Geocoding API by setting the query parameters 'gen=9&locationattributes=adminInfo,timeZone', but these two APIs (Geocoding and Batch Geocoder) seem to work slightly differently.
I tried various combinations of query params using this generic URL:
http://batch.geocoder.cit.api.here.com/6.2/jobs?action=run&app_code=[your-app-code]&app_id=[your-app-id]&gen=8&header=true&indelim=|&outdelim=|&outcols=displayLatitude,displayLongitude,navigationLatitude,navigationLongitude,mapViewTopLeftLatitude,mapViewTopLeftLongitude,mapViewBottomRightLatitude,mapViewBottomRightLongitude,locationLabel,houseNumber,street,district,city,county,state,postalCode,country,relevance,matchLevel,matchType,matchCode,mapReferenceId,responseAdditionalData,addressAdditionalData&addressattributes=all&locationattributes=all&responseattributes=all&maxresults=5&outputcombined=true&mailto=[yourname@domain.com]
I took it from a response here:
How and what do responseattributes return for the Here Batch Geocoder API?
However, neither in the posted URL nor in the HERE API documentation did I find a way to include time-zone information in "outcols" (I need only the TimeZone ID anyway). My logic is basically that if the params gen=9 and locationattributes=adminInfo,timeZone are set, it should work in batch geocoding (as it works in the Forward Geocoding API).
Looking at the documentation, it seems to me that time-zone info is not included in the batch geocoding response at all, which is a problem for me: since I need the TimeZone ID, after batch geocoding I would have to make a request for each entry to set it.
In short, I need to get the TimeZone info (the TimeZone ID) from the HERE Batch Geocoder API.
The Batch Geocoder service does not support time zones; only the Geocoder service supports this right now.
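As a workaround, each batch result can be re-queried individually against the forward Geocoder endpoint with the parameters already mentioned in the question (a sketch; substitute your own credentials and the address or coordinates returned by the batch job):
http://geocoder.cit.api.here.com/6.2/geocode.json?app_id=[your-app-id]&app_code=[your-app-code]&searchtext=[address-from-batch-result]&gen=9&locationattributes=adminInfo,timeZone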
I'm trying to connect to Azure Cognitive Services using the Roxford package. I got an error, probably due to a wrong endpoint (after the Oxford project was folded into Azure services, there are several region-specific endpoints).
I got the key from my personal account in the Azure Cognitive Services project:
library(Roxford)
library(plyr)
library(rjson)
facekey <- "xxx" #look it up on your subscription site
getFaceResponseURL("http://getwallpapers.com/wallpaper/full/5/6/4/1147292-new-women-faces-wallpaper-2880x1800-for-phone.jpg",key= facekey)
# I got this error:
# {"error":{"code":"Unspecified","message":"Access denied due to invalid subscription key. Make sure you are subscribed to an API you are trying to call and provide the right key."}}
How can I change the endpoint to "https://westcentralus.api.cognitive.microsoft.com/face/v1.0"?
If your Roxford lib is the one here: https://github.com/flovv/Roxford/blob/master/R/videoAnalysis_LIB.R#L182
Then you can add the region when you call the method. Cognitive Services keys are dedicated to an Azure region, so you should use the same region when you use the key. If you don't remember which region you chose when you generated the key, it's written in the overview in the Azure portal.
Then when you use getFaceResponseURL:
getFaceResponseURL <- function(img.url, key, region="westus")
Pass the region:
getFaceResponseURL("http://getwallpapers.com/wallpaper/full/5/6/4/1147292-new-women-faces-wallpaper-2880x1800-for-phone.jpg", key=facekey, region="theAzureRegionOfYourKey")
Context: We are trying to load some CSV-format data into GCP BigQuery using GCP Dataflow (Apache Beam). As part of this, we are creating the BQ tables for the first time (for each table) through the BigQueryIO API. One of the customer requirements is that the data on GCP needs to be encrypted using customer-supplied/managed encryption keys.
Problem statement: We are not able to find any way to specify "custom encryption keys" through the APIs while creating tables. The GCP documentation details how to specify custom encryption keys through the GCP BQ console, but we could not find anything about specifying them through APIs from within Dataflow code.
Code Snippet:
String tableSpec = new StringBuilder().append(PipelineConstants.PROJECT_ID).append(":")
        .append(dataValue.getKey().target_dataset).append(".").append(dataValue.getKey().target_table_name)
        .toString();
ValueProvider<String> valueProvider = StaticValueProvider.of("gs://bucket/folder/");

dataValue.getValue().apply(Count.globally()).apply(ParDo.of(new RowCount(dataValue.getKey())))
        .apply(ParDo.of(new SourceAudit(runId)));
dataValue.getValue().apply(ParDo.of(new PreProcessing(dataValue.getKey())))
        .apply(ParDo.of(new FixedToDelimited(dataValue.getKey())))
        .apply(ParDo.of(new CreateTableRow(dataValue.getKey(), runId, timeStamp)))
        .apply(BigQueryIO.writeTableRows().to(tableSpec)
                .withSchema(CreateTableRow.getSchema(dataValue.getKey()))
                .withCustomGcsTempLocation(valueProvider)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
Query: Could anybody let us know:
Whether it is possible to provide an encryption key through the Beam API?
If it's not possible with the current version, what could be a possible workaround?
Kindly let us know if additional information is required.
Customer-supplied encryption keys are a new feature; not all libraries have been updated to support them yet.
If you know the table name in advance, you can use the UI/CLI or API to create the table, then run your normal flow to load data into it. That might be a workaround for you.
https://cloud.google.com/bigquery/docs/customer-managed-encryption#create_table
API to create table: https://cloud.google.com/bigquery/docs/reference/rest/v2/tables/insert
You need to set this section on the table object:
"encryptionConfiguration": {
  "kmsKeyName": string
}
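For example, a minimal tables.insert request body with a customer-managed key might look like this (a sketch; the project, dataset, table, and key names in brackets are placeholders, and the schema is omitted):
POST https://www.googleapis.com/bigquery/v2/projects/[projectId]/datasets/[datasetId]/tables
{
  "tableReference": {
    "projectId": "[projectId]",
    "datasetId": "[datasetId]",
    "tableId": "[tableId]"
  },
  "encryptionConfiguration": {
    "kmsKeyName": "projects/[projectId]/locations/[location]/keyRings/[keyRing]/cryptoKeys/[key]"
  }
}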
More details on the table resource: https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#resource