Use bigrquery auth in shiny application - r

I want to create a shiny application which makes use of the bigrquery package to connect to the BigQuery API and run a query.
I use the following code to execute the query:
library(bigrquery)
project <- "PROJECT_ID" # put your project ID here
sql <- 'QUERY '
test <- query_exec(sql, project = project)
But before this there is an authentication process in the bigrquery package like:
google <- oauth_endpoint(NULL, "auth", "token",
  base_url = "https://accounts.google.com/o/oauth2")
bigqr <- oauth_app("google",
  "465736758727.apps.googleusercontent.com",
  "fJbIIyoIag0oA6p114lwsV2r")
cred <- oauth2.0_token(google, bigqr,
  scope = c(
    "https://www.googleapis.com/auth/bigquery",
    "https://www.googleapis.com/auth/cloud-platform"))
How can I integrate the auth process into my application so that
the process needs no interaction, or
the process works with a given app key and secret (where do I get them?), or
the auth process opens up in another browser window?
Regards

One suggestion I have, which is similar to an answer I provided on a question about server-side access to Google Analytics data, is to use a Google Service Account. The googleAuthR package by Mark Edmondson, available through CRAN, provides functionality to perform server-side authentication in R using a Google Service Account. Another package by the same author called bigQueryR, also on CRAN, integrates with googleAuthR and uses the resulting authentication token to execute queries to Google BigQuery.
To achieve this:
Create a service account for your Google API project.
Download the JSON file containing the private key of the service account.
Grant the service account access to your Google BigQuery project, in the same way as you would for any other user. This is done via the Google API console IAM screen where you set permissions for your project.
Supply the location of the private key JSON file as an argument when authenticating with googleAuthR (see the example below).
The following example R script, based on an example from the bigrquery package, references the JSON file containing the private key and performs a basic Google BigQuery query. Remember to set the json_file argument to the appropriate file path and the project argument to your Google BigQuery project:
library(googleAuthR)
library(bigQueryR)
gar_auth_service(
  json_file = "API Project-xxxxxxxxxxxx.json",
  scope = "https://www.googleapis.com/auth/bigquery"
)
project <- "project_id" # put your project ID here
sql <- "SELECT year, month, day, weight_pounds
        FROM [publicdata:samples.natality] LIMIT 5"
bqr_query(projectId = project, query = sql, datasetId = "samples")

Related

Authentication for BigQuery using bigrquery from an R runtime in Google Colab

I am trying to access my own data tables stored on Google BigQuery from my Google Colab notebook (with an R runtime) by running the following code:
# install.packages("bigrquery")
library("bigrquery")
bq_auth(path = "mykeyfile.json")
projectid = "work-366734"
sql <- "SELECT * FROM `Output.prepared_data`"
Running
tb <- bq_project_query(projectid, sql)
results in the following access denied error:
Access Denied: BigQuery BigQuery: Permission denied while globbing file pattern. [accessDenied]
For clarification, I already created a service account (under Google Cloud IAM & Admin), gave it the roles ‘BigQuery Admin’ and ‘BigQuery Data Owner’, and extracted the above-mentioned JSON key file ‘mykeyfile.json’ (as suggested here).
Additionally, I added the service account as a principal on the dataset (BigQuery – Sharing – Permissions – Add Principal), but the same error still shows up…
Of course, I already reset/deleted and reinitialized the runtime.
Am I missing giving additional permissions somewhere else?
Thanks!
Not sure if it is relevant, but I add it just in case: I also tried the authentication process via
bq_auth(use_oob = TRUE, cache = FALSE)
which opens an additional window, where I have to allow access (using my Google account, which is also the Data Owner) and enter an authorization code. While this step works, bq_project_query(projectid, sql) still gives the same Access Denied error.
Authorizing access to Google BigQuery using Python with the following commands works flawlessly (using the same account/credentials).
from google.colab import auth
from google.cloud import bigquery

auth.authenticate_user()
project_id = "work-366734"
client = bigquery.Client(project=project_id)
df = client.query('''
    SELECT *
    FROM `work-366734.Output.prepared_data`
''').to_dataframe()

Connect to googlesheets via shiny in R with googlesheets4

I'm trying to use an updated version of this example to connect to a private googlesheet via shiny, and deploy this app on the shinyapps.io server. The user is not required to authenticate to a google account as the app uses a specified pre-existing googlesheet.
I've followed this example (partly copied here), attempting to save the token to my shiny app:
# previous googlesheets package version:
shiny_token <- gs_auth() # authenticate w/ your desired Google identity here
saveRDS(shiny_token, "shiny_app_token.rds")
but tried to update it to googlesheets4, like this:
ss <- gs4_get("MY GOOGLE DOC URL") # do the authentication once, manually.
ss
gs4_has_token() # check that the token exists
# get token
ss_token <- gs4_token()
# save the token
save(ss_token, file = "APP PATH ... /data/tk.rdata")
Then in the app, I have placed this code outside the shinyApp() function.
load("data/tk.rdata")
googlesheets4::gs4_auth(token = ss_token, use_oob = T)
In the app, I connect to a Google doc using a hardcoded id obtained from ss$spreadsheet_id above. The app works locally.
After attempting to deploy the app to the server I get the error "...Can't get google credentials. Are you running googlesheets4 in a non-interactive session?... etc" I thought that the token would contain sufficient information for this.
I'd be grateful if anyone can point me to a guide to setting this up, and also comment on whether this approach (saving a token on shinyapps.io) is safe.
I've looked at other examples, but it seems most are for the previous version of googlesheets.
On 21-Jul-2021 googlesheets4 deprecated some of its functions when releasing v1.0.0.
I have updated volfi's answer to work with googlesheets4 v1.0.0.
It also works when deploying to shinyapps.io.
Set up non-interactive authentication
library(googlesheets4)
# Set authentication token to be stored in a folder called `.secrets`
options(gargle_oauth_cache = ".secrets")
# Authenticate manually
gs4_auth()
# If successful, the previous step stores a token file.
# Check that a file has been created with:
list.files(".secrets/")
# Check that the non-interactive authentication works by first deauthorizing:
gs4_deauth()
# Authenticate using token. If no browser opens, the authentication works.
gs4_auth(cache = ".secrets", email = "your@email.com")
Example - add data to a Google Sheet
Create a Google Sheet in Google Sheets and copy the sheet's URL.
library(googlesheets4)
gs4_auth(cache = ".secrets", email = "your@email.com")
ss <- gs4_get("https://docs.google.com/path/to/your/sheet")
sheet_append(ss, data.frame(time=Sys.time()))
If deploying your app to shinyapps.io, make sure to also deploy the files in the .secrets folder.
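For instance, a minimal sketch of an explicit deployment call that bundles the token cache (the file names are placeholders, not part of the original answer):
# Hedged sketch: list the app files plus the cached token files explicitly so
# the hidden .secrets folder is included in the bundle sent to shinyapps.io.
library(rsconnect)
deployApp(
  appDir = ".",
  appFiles = c("app.R", list.files(".secrets", full.names = TRUE))
)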
Just follow the instructions in this link:
# designate project-specific cache
options(gargle_oauth_cache = ".secrets")
# check the value of the option, if you like
gargle::gargle_oauth_cache()
# trigger auth on purpose to store a token in the specified cache
# a browser will be opened
googlesheets4::sheets_auth()
# see your token file in the cache, if you like
list.files(".secrets/")
# sheets reauth with specified token and email address
sheets_auth(
  cache = ".secrets",
  email = "youremail"
)
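Note that sheets_auth() is from the pre-1.0.0 API of googlesheets4. Under v1.0.0 and later the equivalent calls would be the following (a hedged adaptation, keeping the same cache folder and email placeholder):
library(googlesheets4)
# cache the token in the project-specific .secrets folder, as above
options(gargle_oauth_cache = ".secrets")
gs4_auth() # interactive, run once to store the token
gs4_auth(cache = ".secrets", email = "youremail") # non-interactive re-auth afterwards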
I am posting here because I started from this thread on this journey, and want to share what finally worked after many hours of having a go, reading gargle, googledrive, and googlesheets4 documentation and oh so many other posts on this issue.
I first used the googlesheets4 function gs4_auth() to obtain a credential and stored it in a .secrets folder, as described in this thread and here. This worked on my desktop and I was excited. It did not work on shinyapps.io or on my Ubuntu 18.04 instance of shiny-server that I have on an AWS EC2 instance. The error was something like this:
"Error in ... : Can't get Google credentials.Are you running googledrive in a non-interactive session? Consider: drive_deauth() to prevent the attempt to get credentials. Call drive_auth() directly with all necessary specifics."
Then I tried an approach starting from here and taking me to here.
Somehow this did work on shinyapps.io but still not on my Ubuntu shiny server.
This worked: I pursued a Google service account approach as described here. I created a project, then a service account for the project, enabled the Google Sheets API for the project, and downloaded a key as a JSON file. At the top of my app_server.R file I then used googlesheets4::gs4_auth(path = './<path to hidden JSON file folder I called .token>/.token/<JSON key file>.json'). This still did not work until the final step, which is not clearly explained almost anywhere I looked: go to the Google Sheet in question and "share" it with the client_email address from the JSON key file, giving it editor permissions in my case. This was finally well explained in this random article: https://robocorp.com/docs/development-guide/google-sheets/interacting-with-google-sheets
Finally, my app has read and write access from shiny-server on my AWS instance. I really hope someone finds this useful.
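A hedged sketch of the service account flow described above (the key path and sheet URL are placeholders):
library(googlesheets4)
# authenticate as the service account using the downloaded JSON key
gs4_auth(path = ".token/my-service-account-key.json")
# the sheet must already be shared with the client_email from the key file
# (editor permissions), otherwise reads and writes will fail
ss <- gs4_get("https://docs.google.com/spreadsheets/d/<sheet-id>")
sheet_append(ss, data.frame(time = Sys.time()))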

Authentication for Bigquery using bigrquery from an R Markdown document

I am having problems using bigrquery to connect to a GCP service account from within an R Markdown document that I knit. When I attempt from the console, authentication works fine. Both
library(bigrquery)
bq_auth()
and
library(bigrquery)
bq_auth(email="my-service-account-email@myproject.iam.gserviceaccount.com")
launch a browser with a dialog that lets me pick and authenticate using the specified account as expected. But in the R Markdown, any attempt like
options("httr_oob_default" = TRUE)
bq_auth(email="my-service-account-email@myproject.iam.gserviceaccount.com")
or even using the full list like this
bq_auth(
  email = "my-service-account-email@myproject.iam.gserviceaccount.com",
  path = NULL,
  scopes = c("https://www.googleapis.com/auth/bigquery"),
  cache = gargle::gargle_oauth_cache(),
  use_oob = gargle::gargle_oob_default(),
  token = NULL
)
leads to the error
Error: Can't get Google credentials.
Are you running bigrquery in a non-interactive session? Consider:
* Call `bq_auth()` directly with all necessary specifics.
Can anyone see what I am missing? Thanks in advance.
You can download the JSON file of your Google Cloud service account, then use it as a path that the “bq_auth” function can recognize. Here are the steps:
Google Cloud Console (console.cloud.google.com)
Navigation Menu
IAM & Admin Service
Accounts
Create Service Account (create one)
Create Key, and save to "/path/to/jsonfilename.json"
Authenticate in your R Markdown code: bigrquery::bq_auth(path = "/path/to/jsonfilename.json")
Note: you'll need to make sure to set the service account to have access to BigQuery. I set mine to "BigQuery Admin" and it worked, but that might be too broad
Borrowed this answer from Elaine See's post on Medium: https://medium.com/@elaine.yl.see/easiest-way-to-use-bigquery-in-r-8af466cd55ca
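Putting those steps together, a hedged minimal example for an R Markdown chunk (the key path and project ID are placeholders):
library(bigrquery)
# authenticate non-interactively with the service account key
bq_auth(path = "/path/to/jsonfilename.json")
# run a trivial query to confirm the credentials work
tb <- bq_project_query("your-project-id", "SELECT 1 AS ok")
bq_table_download(tb)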

How can I login to googlesheets using R and an existing JSON key?

I am trying to upload my csv files to my google drive/sheets.
Working code: a window pops up, I log in to my Google account, then the code works perfectly and the file is uploaded.
library(googledrive)
library(googlesheets4)
dff <- drive_upload('dff.csv', type = "spreadsheet")
drive_browse(dff)
I want to avoid this step, so I went to the Google Sheets API console and created a service account with owner permissions from there.
library(googledrive)
library(googlesheets4)
sheets_auth(scope = "https://www.googleapis.com/auth/drive",
            path = 'myjson.json')
drive_auth(token = sheets_token())
dff <- drive_upload('dff.csv', type = "spreadsheet")
drive_browse(dff)
And this does not work. Moreover, the drive_browse(dff) opens a browser window with a message that I have no access to the file.
How can I solve this? Maybe there are other options? Ideally I need this script to run without any logins to google at all.
Service accounts behave like any other normal account: they can own files and have permissions on them. That is, if you create a file with your personal account and attempt to access it using the service account without giving the appropriate permissions beforehand, it will result in an error.
I suggest you use the service account for both of your communications to the Sheets and Drive APIs. In order to do that, you should replace drive_auth(token = sheets_token()) with drive_auth(service_token = 'myjson.json') (in case you are using googledrive library v0.1.3) or drive_auth(path = 'myjson.json') for v1.0.0 of the library.
Bear in mind that after creating the files using the service account, if you plan on accessing them from any other account you will have to share them with that account beforehand. You can use the drive_share() function in your R code for that.
Further to that, you can consider the use of delegation in case you are using G Suite in your domain. This feature will allow the service account to act as (or "impersonate") another user of the domain. Every action that it performs will be as if the subject of the "impersonation" were executing it, anything permissions related included.
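A hedged sketch of that suggestion using googledrive v1.0.0 and the googlesheets4 v1.0.0 naming (gs4_auth() rather than sheets_auth()); the key file, CSV name, and email address are placeholders:
library(googledrive)
library(googlesheets4)
# authenticate both packages with the same service account key
gs4_auth(path = 'myjson.json')
drive_auth(path = 'myjson.json')
dff <- drive_upload('dff.csv', type = "spreadsheet")
# the service account owns the new file, so share it back to your own account
drive_share(dff, role = "writer", type = "user", emailAddress = "you@example.com")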
Solution provided by carlesgg97
Once the file is uploaded, it has to be shared to all users.
https://www.rdocumentation.org/packages/googledrive/versions/1.0.0/topics/drive_share

How to access bigquery from shiny app

I am trying to build a shiny app that needs to access my bigquery tables, it works fine locally through interactive authentication.
When I deploy the app, it does not work, giving the error:
Error: oauth_listener() needs an interactive environment.
There is a suggestion here - Authorization for accessing BigQuery from R session on server, but I don't know how to pass the .httr-oauth file to shinyapps.io
The way it was resolved is that you need to generate an access token JSON file (a service account key) in Google Cloud IAM with the relevant access to BigQuery resources.
This file should be placed in your appDir folder and referenced in code, e.g.:
With the bigrquery package:
library(bigrquery)
set_service_token('access_token.json')
tb <- bq_dataset_query(
  x = ds,
  query = sql,
  billing = '{your billing project}',
  use_legacy_sql = use.legacy.sql,
  parameters = params
)
dt <- bq_table_download(tb)
Or, with the retl package:
# set env var `BIGQUERY_ACCESS_TOKEN_PATH` to path of your token file.
devtools::install_github("madedotcom/retl")
library(retl)
library(data.table)
dt <- bqExecuteQuery(sql)
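For completeness, a hedged sketch of how ds and sql in the bigrquery example above might be defined (project and dataset names are placeholders); authentication is assumed to have already been done with the service token as shown above:
# ds identifies the dataset the query runs against; sql is an ordinary query string
ds <- bq_dataset("your-billing-project", "your_dataset")
sql <- "SELECT 1 AS ok"
tb <- bq_dataset_query(ds, query = sql, billing = "your-billing-project")
dt <- bq_table_download(tb)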
