R script to import data from Google Analytics

I am trying to connect to the Google Analytics API through a script running on RStudio Server.
I have followed steps from this tutorial:
http://www.r-bloggers.com/how-to-extract-google-analytics-data-in-r-using-rgoogleanalytics/
If I run this on localhost, it works fine, but when I try to run the script on a remote server through RStudio, the authorization step does not complete because it tries to connect to the URL on localhost, i.e.
localhost:1410/
instead of REMOTESERVERHOSTNAME:1410
I found this post, which suggests port forwarding if running through RStudio: link
But if tomorrow I want to access it from another host machine, I would not want to have to set up port forwarding first.
How can I run this script without setting up port forwarding? What other ways are there to do OAuth authentication for my R script?

One suggestion would be to use a Google Service Account. The googleAuthR package by Mark Edmondson, available through CRAN, provides functionality to perform server-side authentication in R using a Google Service Account. Another package by the same author called googleAnalyticsR, also on CRAN, integrates with googleAuthR and uses the resulting authentication token to execute queries against the Google Analytics Reporting APIs, including the latest version, 4.0.
To achieve this:
Create a service account for your Google API project.
Download the JSON file containing the private key of the service account.
Grant the service account access to Google Analytics, in the same way as you would for any other user.
Supply the location of the private key JSON file as an argument when authenticating with googleAuthR (see the example below).
The following example R script references the JSON file containing the private key and performs a basic Google Analytics reporting query. Remember to set the json_file argument to the appropriate file path and the id argument to the appropriate Google Analytics view:
library(googleAuthR)
library(googleAnalyticsR)
# Authenticate server-side with the service account's private key file
gar_auth_service(
  json_file = "API Project-xxxxxxxxxxxx.json",
  scope = "https://www.googleapis.com/auth/analytics"
)
# Run a basic reporting query against the given view and date range
google_analytics(id = "123456789", start = "2016-06-01", end = "2016-06-28")
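Since the answer mentions version 4 of the Reporting API: at the time this was written, v4 queries went through a separate google_analytics_4() function in the same package. A minimal sketch of the equivalent v4 query is below; the view ID and date range are the same placeholders as above, and the sessions metric is an assumed example:
library(googleAnalyticsR)
# Equivalent query against the v4 Reporting API (viewId is a placeholder,
# and the "sessions" metric is just an example)
google_analytics_4(
  viewId = "123456789",
  date_range = c("2016-06-01", "2016-06-28"),
  metrics = "sessions"
)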

Related

Use Google SearchConsoleR Data in Shiny Application

I have created an API key, an OAuth 2.0 client ID JSON file, and a service account email. How can I connect to Google Search Console with the searchConsoleR package automatically, without opening up the browser for authentication?
For googleAnalyticsR I did this:
googleAuthR::gar_set_client(json = here::here("credentials/client_id.json"))
googleAnalyticsR::ga_auth(
  email = "email",
  json_file = here::here("credentials/file.json")
)
But I am unable to connect to Search Console. Is there any good documentation on how to connect to Google Search Console automatically, without authenticating in the browser?
Take a look at this:
# Set the Search Console scope, then authenticate with the service account key
options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/webmasters")
gar_auth_service(json_file = "/service-account-key.json")
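For completeness, here is a minimal end-to-end sketch of the same approach with searchConsoleR. The key file path and siteURL are placeholders, and it assumes the service account's email address has been added as a user on the verified property in Search Console:
library(googleAuthR)
library(searchConsoleR)
# Request the Search Console (webmasters) scope before authenticating
options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/webmasters")
# Authenticate non-interactively with the service account key file
gar_auth_service(json_file = "/service-account-key.json")
# Example query: clicks and impressions by date for the verified property
search_analytics(
  siteURL = "https://www.example.com",
  startDate = Sys.Date() - 30,
  endDate = Sys.Date() - 3,
  dimensions = "date"
)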

How can I obtain GCP service account credentials on Google Cloud Run?

This page explains both:
Obtaining and providing service account credentials manually for developing locally, deploying on-premises, or deploying to another public cloud.
Obtaining credentials on Compute Engine, Kubernetes Engine, App Engine flexible environment, and Cloud Functions
But there is no mention of obtaining credentials on Cloud Run. I'd appreciate instructions for obtaining credentials and for setting up firebase-admin initializeApp and firebase initializeApp for authentication on Cloud Run.
The documentation suggests that you can use the default service account just like other Google Cloud products as described here. The Firebase Admin SDK should use that account when initialized with no parameters.
There are also steps described if you want to use a non-default service account, which you can simply configure in the console or provide with gcloud.
If you must provide a file that's readable at runtime, you will have to deploy an image with that file added to it. There is no short set of steps to add that file: you will have to make your Docker build copy it into a readable location, and your code will need to know where to look for it in order to load it.
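As a sketch of the non-default option, the service account can be attached at deploy time with gcloud; the service, image, and account names below are placeholders:
gcloud run deploy my-service \
  --image gcr.io/my-project/my-image \
  --service-account my-sa@my-project.iam.gserviceaccount.com
The Firebase Admin SDK initialized with no parameters should then pick up this account's credentials from the Cloud Run metadata server.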

Error 403 because Google Cloud Vision client points to wrong project

I'm trying to work through the Google Cloud Vision Python example, but I'm getting an authentication error.
This is not my only Google Cloud project, and my GOOGLE_APPLICATION_CREDENTIALS environment variable is set to the key file path for my BigQuery project. I thought I could override this by using this statement:
client = vision.ImageAnnotatorClient.from_service_account_json(key_path)
where key_path is the path of the JSON key file associated with my (Cloud Vision API-enabled) vision project. However, I'm getting the 403 error from this:
response = client.label_detection(image=image)
Apparently, even though I specified the key file path for the ImageAnnotatorClient, it still looks at my BigQuery project's credentials and spits the dummy because the Vision API is not enabled for that project.
Do I really have to change the environment variable every time I change the project?
It seems that the Cloud Vision project ID does not propagate to the Python environment from either the Cloud Console or the credentials file. I fixed the reference using the gcloud CLI:
gcloud config set project my_vision_project
The label_detection call works now.

Google Translate API authentication error

I am trying to call the Google Translate API, using the following to authenticate from my local machine: gcloud auth application-default login.
The command works successfully and I am authenticated, but when I try to call the API I get the following error message, which indicates that the request is being treated as an anonymous API call:
google.cloud.exceptions.Forbidden: 403 Daily Limit Exceeded
I ran into this issue too this week. I thought I was properly authenticated, but when I ran my code, which is C# using the Google Translate API v2 package, it gave me the same 403 "Daily Limit Exceeded" error.
I fiddled around with the CLI, made several accounts, service accounts, API keys and all, and it never worked.
https://cloud.google.com/dotnet/docs/getting-started/hello-world
This page (the .NET guide part) says you should be using the Google Cloud Platform plugin they release for Visual Studio and log in via that; I used it and it worked.
If you look at the bottom left of that page, there are guides for any other language you might be using (consider adding that as info; it helps me help you).
I would love it if it worked via the CLI alone, but as long as it works, I guess it's fine...
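For what it's worth, since the rest of this compilation is R-centric: below is a minimal sketch of the service account route from R using the googleLanguageR package (an assumption; it is not mentioned in this thread). The key file path is a placeholder, and the Cloud Translation API must be enabled on the key's project:
library(googleLanguageR)
# Authenticate with a service account key file (placeholder path)
gl_auth("/path/to/service-account-key.json")
# Translate a string into French
gl_translate("Hello, world", target = "fr")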

What is the host URL of Oracle Cloud Application Express?

I have signed up for the trial version of the Oracle Cloud database.
Where can I find the host URL that I need to use in my Java dynamic web project in Eclipse?
Thanks in advance.
Basic knowledge about the trial request and the details you need:
After you request a trial service from Oracle Cloud, you'll be provided with a few details: an identity domain, a temporary password, and your username.
Using these, you'll log in to your trial cloud account here:
https://cloud.oracle.com/sign_in
Accessing the service from Eclipse
When it comes to accessing your Database Cloud Service from Eclipse:
You need to add the Oracle Cloud plugin from the Eclipse Marketplace (Help menu -> Eclipse Marketplace).
You'll add the Oracle Cloud connection window to the interface (Window menu -> Show View -> Other).
Create a new connection, following the steps with the details you have for your account.
