In my R script, when I try to send an email with the code below, gmailr asks: "The gmailr package is requesting access to your Google account. Select a pre-authorised account or enter '0' to obtain a new token. Press Esc/Ctrl + C to abort."
1: email1@gmail.com
Without manually entering 1 in the console, how can my R script automatically select my pre-authorised account and send the email?
library(gmailr)
gm_auth_configure(path="C:/Users/Google Drive/email.json")
my_email_message <- gm_mime() %>%
gm_to("email1#gmail.com") %>%
gm_from("email1#gmail.com") %>%
gm_subject("My test message")
gm_send_message(my_email_message)
This is the unattended / non-interactive authentication problem. I will try to give a rundown of the process as it worked for me - and the problem, exactly like yours, went away. As the gmailr README states, you download JSON credentials, authenticate once interactively, and copy the credentials wherever you like. You can get the credentials via the Python quickstart, or even better - by simply creating a project on https://console.developers.google.com, adding the Gmail API to it, then creating OAuth credentials for a desktop app. The benefit of the latter approach is that you will know exactly where all the components are and will be able to repeat the process as many times as you want. I created a separate Google e-mail address for this purpose. You then download the OAuth "client secret" .json file into your project directory and call it credentials.json (or any other .json name you like). Then you authenticate interactively once, running the commands below from RStudio while you are in your project directory:
gm_auth_configure(path = "credentials.json")
gm_auth(email = TRUE, cache = ".secret")
A webpage will pop up with scary messages, but agree to all of them and from then on you will be using the cache. The .secret cache sub-directory you just created inside your project (you can give the cache directory whatever name you wish) is portable - you can copy it, alongside your credentials.json, over to your shiny-server. It is convenient that everything is contained in your project directory. After that you will need a few lines in your code - they should precede the call gm_send_message(your_email_prepared_with_gm_mime) - and no more interactive authentication is needed, no matter which computer you have copied your project to, as long as it has gmailr and gargle (a gmailr dependency) installed in R:
gm_auth_configure(path = "credentials.json")
options(
gargle_oauth_cache = ".secret",
gargle_oauth_email = "email_address_used_for_creds@gmail.com"
)
gm_auth(email = "email_address_used_for_creds#gmail.com")
# then compose your e-mail and send it
The last call avoids the dialogue asking which account to use, which sometimes pops up on first use.
The gmailr README explains it well; my explanation is an encouragement to read it again if you get stuck. You can also read the gmailr reference at https://gmailr.r-lib.org/index.html - it is pretty good. But my guess is that if you have followed the process here you won't even need it.
Note on the cache: the default gargle cache directory (gargle is what makes authentication for gmailr happen) is a hidden subdirectory of your home directory - so it is specific to you on that computer. However, if you set it to be a subdirectory of your R project, the whole OAuth process becomes portable. Just copy your project directory wherever you want and the OAuth credential pair - the json file and the OAuth token(s) in the cache - will follow along. Tokens are gzipped binary files created during the "authentication dance" and deposited in the cache. One address paired with one Google project gives one token. One could probably use multiple addresses and Google projects in one R project, but I have yet to see the need for that.
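If you want to check what actually landed in the cache, gargle can report on it. A minimal sketch, assuming the project-local ".secret" cache from above:
library(gargle)
# point gargle at the project-local cache (same option as in the code above)
options(gargle_oauth_cache = ".secret")
# list the raw token files sitting in the cache
list.files(".secret")
# print a summary of the cached tokens without touching them
gargle_oauth_sitrep()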
Just add the "from e-mail address" with gm_auth(email = "email1#gmail.com")
library(gmailr)
gm_auth_configure(path="C:/Users/Google Drive/email.json")
gm_auth(email = "email1#gmail.com")
my_email_message <- gm_mime() %>%
gm_to("email1#gmail.com") %>%
gm_from("email1#gmail.com") %>%
gm_subject("My test message")
gm_send_message(my_email_message)
Related
I've got googlesheets4 working in a shinyapps.io app with the following code:
gs4_auth(
email = "me#email.com",
path = NULL,
scopes = "https://www.googleapis.com/auth/drive",
cache = "path_to_cache",
use_oob = FALSE,
token = NULL)
I run this locally, which requires initial browser authentication and downloads a file of some sort.
As long as I upload that file with my app to shinyapps.io, it works (i.e. it refreshes the token whenever it needs to).
However, as I understand it, this is using googlesheets4's own Google API settings, which were set up to make it easy for everyone to use.
The disadvantage is that, since a lot of people share this API project, users sometimes (myself included) hit the usage limits and get a 429 RESOURCE_EXHAUSTED error. This is discussed here.
OK, so I've followed the instructions here and here and added the following code BEFORE the auth chunk already provided:
if (interactive()){
# Desktop Client ID
google_app <- httr::oauth_app(
"my-awesome-google-api-wrapping-package",
key = "mykey_for_desktop_app",
secret = "mysecret"
)
}else{
# Web Client ID
google_app <- httr::oauth_app(
"my-awesome-google-api-wrapping-package",
key = "mykey_for_web_app",
secret = "mysecret"
)
}
# API key
google_key <- "My-API-KEY"
gs4_auth_configure(app = google_app, api_key = google_key)
# Also configure google drive to use my API
drive_auth_configure(app = google_app, api_key = google_key)
So this seems to work locally (e.g. in RStudio) and I can see activity on my Google Cloud API dashboard.
However, whilst this works for a short period of time (e.g. 10 minutes), even when uploaded to shinyapps.io, the token auto-refresh seems to fail, because I soon get the dreaded:
"Can't get Google credentials. Are you running googlesheets4 in a non-interactive session?"
Is anyone able to point me towards what I'm doing wrong?
Again - it works fine as long as I'm not trying to use my own API settings (the second code chunk).
OK, pretty sure I've got this working...
It was the YouTube video here that really helped, and made this more clear.
All I need is a Service Account, which seems to generate a json file that I can upload with my app.
i.e. at around 1:03 the video shows the creation of this Service Account, then adding that e-mail address (of the Service Account) to the Google Sheet(s) I want to access; this means I can download (using GoogleDrive) and write (using GoogleSheets).
The crazy part is that all I need to put in my code is the following:
drive_auth(path = ".secrets/client_secret.json")
gs4_auth(path = ".secrets/client_secret.json")
i.e. those two lines (plus the downloaded json file for the Service Account) replace ALL the code I posted in my OP!
If anyone is reading this, I was struggling with the last steps of Jimbo's (excellent) answer, i.e. how to upload the local json file to shinyapps.io.
My working solution: I created a subfolder inside the shiny app folder, next to the app.r file, called "secrets", and placed the json file there. I made sure to set my working directory to the shiny app folder when testing everything locally (note: don't include the setwd() call in your shiny app code). I'm not sure if this exposes the json file somehow, but it'll have to do.
When publishing to shinyapps, I checked all boxes suggested by Rstudio to upload the whole contents of the folder (app.r file, subfolder + json file in subfolder). I used the following path in the app.r file:
drive_auth(path = "secret/clientsecret.json")
gs4_auth(path = "secret/clientsecret.json")
I'm trying to use an updated version of this example to connect to a private googlesheet via shiny, and deploy this app on the shinyapps.io server. The user is not required to authenticate to a google account as the app uses a specified pre-existing googlesheet.
I've followed this example (partly copied here), attempting to save the token to my shiny app:
# previous googlesheets package version:
shiny_token <- gs_auth() # authenticate w/ your desired Google identity here
saveRDS(shiny_token, "shiny_app_token.rds")
but tried to update it to googlesheets4, like this:
ss <- gs4_get("MY GOOGLE DOC URL") # do the authentication once, manually.
ss
gs4_has_token() # check that the token exists
# get token
ss_token <- gs4_token()
# save the token
save(ss_token, file = "APP PATH ... /data/tk.rdata")
Then in the app, I have placed this code outside the shinyApp() function.
load("data/tk.rdata")
googlesheets4::gs4_auth(token = ss_token, use_oob = T)
In the app, I connect to the Google Sheet using a hardcoded id obtained from ss$spreadsheet_id above. The app works locally.
After attempting to deploy the app to the server I get the error "...Can't get google credentials. Are you running googlesheets4 in a non-interactive session?... etc" I thought that the token would contain sufficient information for this.
I'd be grateful if anyone can point me to a guide to setting this up, and also comment on whether this approach (saving a token on the shinyapps.io) is safe?
I've looked at other examples, but it seems most are for the previous googlesheets package.
On 21-Jul-2021 googlesheets4 deprecated some of its functions when releasing v1.0.0.
I have updated volfi's answer to work with googlesheets4 v1.0.0.
It also works when deploying to shinyapps.io.
Set up non-interactive authentication
library(googlesheets4)
# Set authentication token to be stored in a folder called `.secrets`
options(gargle_oauth_cache = ".secrets")
# Authenticate manually
gs4_auth()
# If successful, the previous step stores a token file.
# Check that a file has been created with:
list.files(".secrets/")
# Check that the non-interactive authentication works by first deauthorizing:
gs4_deauth()
# Authenticate using token. If no browser opens, the authentication works.
gs4_auth(cache = ".secrets", email = "your#email.com")
Example - add data to a Google Sheet
Create a Google Sheet on Google Sheets and copy the sheet's URL.
library(googlesheets4)
gs4_auth(cache=".secrets", email="your#email.com")
ss <- gs4_get("https://docs.google.com/path/to/your/sheet")
sheet_append(ss, data.frame(time=Sys.time()))
If deploying your app to shinyapps.io make sure to deploy the file in the .secrets folder.
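One way to make sure the hidden token folder actually gets bundled is to list its files explicitly when deploying with rsconnect. A minimal sketch, assuming your app file is called app.R (adjust the file names to your project):
library(rsconnect)
# dot-folders are easy to miss in the bundle, so include the token file(s) explicitly
deployApp(
  appDir = ".",
  appFiles = c("app.R", list.files(".secrets", full.names = TRUE))
)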
Just follow the instructions in this link:
# designate project-specific cache
options(gargle_oauth_cache = ".secrets")
# check the value of the option, if you like
gargle::gargle_oauth_cache()
# trigger auth on purpose to store a token in the specified cache
# a browser will be opened
googlesheets4::sheets_auth()
# see your token file in the cache, if you like
list.files(".secrets/")
# sheets reauth with specified token and email address
sheets_auth(
cache = ".secrets",
email = "youremail"
)
I am posting here because I started this journey from this thread, and I want to share what finally worked after many hours of trial and error, reading the gargle, googledrive, and googlesheets4 documentation and oh so many other posts on this issue.
I first used the googlesheets4 method gs4_auth() to obtain a credential and stored it in a .secrets folder, as described in this thread and here. This worked on my desktop and I was excited. It did not work on shinyapps.io or on the Ubuntu 18.04 instance of shiny-server that I have on an AWS EC2 instance. The error was something like this:
"Error in ... : Can't get Google credentials.Are you running googledrive in a non-interactive session? Consider: drive_deauth() to prevent the attempt to get credentials. Call drive_auth() directly with all necessary specifics."
Then I tried an approach starting from here and taking me to here
Somehow this did work on shinyapps.io but still not on my Ubuntu shiny server.
This worked: I pursued a Google service account approach as described here. I created a project, then a service account for the project, added the Google Sheets API to the project, then downloaded a key as a JSON file. At the top of my app_server.R file I then used googlesheets4::gs4_auth(path = './<path to hidden JSON file folder I called .token>/.token/<JSON key file>.json'). This still did not work until the final step, which is not clearly explained almost anywhere I looked: go to the Google Sheet in question and "share" it with the client_email address from the JSON key file, giving it editor permissions (in my case). This was finally well explained in this random article: https://robocorp.com/docs/development-guide/google-sheets/interacting-with-google-sheets
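For reference, a minimal sketch of that flow, with a hypothetical key path and sheet URL:
library(googlesheets4)
# authenticate as the service account using the downloaded JSON key
gs4_auth(path = ".token/my-service-account-key.json")
# the sheet must already be shared (as editor) with the client_email from that JSON file
ss <- gs4_get("https://docs.google.com/spreadsheets/d/<SHEET-ID>")
read_sheet(ss)                                   # read
sheet_append(ss, data.frame(time = Sys.time()))  # write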
Finally, read and write access for my app from shiny-server on my AWS instance. I really hope someone finds this useful.
This is literally driving me mad and I've been working on it for so long that I figured I'd post my issue on here. Any help is appreciated!
The problem: I created a basic Dropbox account. I have a Shiny application and would like to use a stored data file from Dropbox within the app. I followed the steps to create an application on Dropbox and set the app to require full access to my files. I then ran the code below in R:
drop_auth(new_user = TRUE,key = "key",secret = "secret",cache = TRUE)
where key and secret are the actual key and secret for my application. A web browser is opened in Chrome with the error below.
I have looked up solutions online, however none of them provides a clear enough explanation for me to follow along (obviously I am lacking knowledge). Can someone please help me with this? Thanks!
You haven't provided much code to work on here but I'm assuming you're using rdrop2? In your code above, you've provided several arguments to drop_auth() but, as per the package's documentation, run drop_auth() with no arguments - this should open a browser window and allow you to authorise the connection:
library(rdrop2)
drop_auth()
### PINCHED FROM RDROP2 DOCUMENTATION ###
# This will launch your browser and request access to your Dropbox account. You will be prompted to log in if you aren't already logged in.
# Once completed, close your browser window and return to R to complete authentication.
# The credentials are automatically cached (you can prevent this) for future use.
# If you wish to save the tokens, for local/remote use
token <- drop_auth()
saveRDS(token, file = "token.rds")
# Then in any drop_* function, pass `dtoken = token`
# Tokens are valid until revoked.
If you have multiple Dropbox accounts, I recommend signing out of all of them prior to running the above code. When the browser window launches, it should then ask you to sign in to the account you need and complete the authentication process.
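To round this out, a small sketch of how the saved token might be used later in a non-interactive script or Shiny app, assuming a hypothetical file name on Dropbox:
library(rdrop2)
# load the token saved earlier with saveRDS()
token <- readRDS("token.rds")
# pass it explicitly to any drop_* call, e.g. reading a stored data file
df <- drop_read_csv("my_data.csv", dtoken = token)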
I am trying to upload my csv files to my Google Drive / Google Sheets.
Working code: a window pops up, I log in to my Google account, then the code works perfectly and the file is uploaded.
library(googledrive)
library(googlesheets4)
dff <- drive_upload('dff.csv', type = "spreadsheet")
drive_browse(dff)
I want to avoid this step. So I went to the Google Sheets API console and, from that window, created a service account with owner permissions.
library(googledrive)
library(googlesheets4)
sheets_auth(scope = "https://www.googleapis.com/auth/drive",
path = 'myjson.json')
drive_auth(token = sheets_token())
dff <- drive_upload('dff.csv', type = "spreadsheet")
drive_browse(dff)
And this does not work. Moreover, drive_browse(dff) opens a browser window with a message that I have no access to the file.
How can I solve this? Maybe there are other options? Ideally I need this script to run without any logins to Google at all.
Service accounts behave like any other normal account - they can own files and have permissions on them, i.e. if you create a file with your personal account and attempt to access it using the service account, without first giving the appropriate permissions, it will result in an error.
I suggest you use the service account for both of your communications with the Sheets and Drive APIs. In order to do that, you should replace drive_auth(token = sheets_token()) with drive_auth(service_token = 'myjson.json') (in case you are using googledrive library v0.1.3) or drive_auth(path = 'myjson.json') for v1.0.0 of the library.
Bear in mind that after creating the files using the service account, if you plan on accessing them from any other account you will have to share them with that account beforehand. You can use the drive_share() function in your R code for that.
Further to that, you can consider the use of delegation in case you are using G Suite in your domain. This feature will allow the service account to act as (or "impersonate") another user of the domain. Every action that it performs will be as if the subject of the "impersonation" were executing it, anything permissions related included.
Solution provided by carlesgg97
Once the file is uploaded, it has to be shared to all users.
https://www.rdocumentation.org/packages/googledrive/versions/1.0.0/topics/drive_share
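Putting the answer and this follow-up together, a minimal sketch (the personal address below is a placeholder) might look like:
library(googledrive)
# authenticate googledrive as the service account (googledrive v1.0.0 syntax)
drive_auth(path = "myjson.json")
# upload as the service account, then share the resulting file with your personal account
dff <- drive_upload("dff.csv", type = "spreadsheet")
drive_share(dff, role = "writer", type = "user", emailAddress = "you@example.com")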
Inspired by this awesome post on a Git branching model and this one on what a version bumping script actually does, I went about creating my own Git version bumping routine which resulted in a little package called bumpr.
However, I don't like the current way of handling (GitHub) HTTPS credentials. I'm using the solution stated in this post and it works great, but I don't like the fact that I need to store my credentials in plain text in this _netrc file.
So I wondered:
whether one could obfuscate console input when prompting via readline(), scan() or the like, in much the same way as when using the Git shell. See the code of /R/bump.r at line 454:
input <- readline(paste0("Password for 'https://",
git_user_email, "@github.com': "))
idx <- ifelse(grepl("\\D", input), input, NA)
if (is.na(idx)){
message("Empty password")
message("Exiting")
return(character())
}
git_https_password <- input
how RStudio manages to pop up an "Insert credentials" box when pushing to a remote Git repository, and how it obfuscates the password entry.
whether the _netrc file is something closely tied to the GitHub API or whether this works for HTTPS requests in general.
Git has a mechanism to store, cache or prompt for credentials. Please read http://git-scm.com/docs/gitcredentials.
Within a script, you can use the git credential command to access it: http://git-scm.com/docs/git-credential
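For example, a rough sketch of querying the configured credential helper from R (the host and username here are placeholders, and git credential fill may still prompt if no helper has the credential cached):
# ask git's credential machinery for credentials instead of storing them in _netrc
out <- system2(
  "git", c("credential", "fill"),
  input  = c("protocol=https", "host=github.com", "username=me", ""),
  stdout = TRUE
)
# 'out' contains key=value lines such as "username=me" and "password=..."
git_https_password <- sub("^password=", "", grep("^password=", out, value = TRUE))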