gmailr credentials randomly (?) need re-authentication

I'm using gmailr in an automatic R script to send out some emails. It's been working fine for about a month and a half, but recently it failed with the following error:
Error: Can't get Google credentials.
Are you running gmailr in a non-interactive session? Consider:
* Call `gm_auth()` directly with all necessary specifics.
Execution halted
My code, which hasn't changed, is
library(gmailr)
options(gargle_oauth_email = TRUE)
gm_auth_configure(path = "data/credentials.json")
gm_auth(email = TRUE, cache = ".secret")
and is run non-interactively. (There is only one token in the .secret folder.) When I then ran it interactively, it "did the dance" and opened up the authentication prompt in the browser, which I confirmed, and now everything is running fine again.
The problem is that I don't understand why the credentials suddenly required re-authentication or how I could prevent the script failing like this in the future.
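One hedged idea for preventing this (a sketch, not a confirmed fix): pin the exact account instead of passing email = TRUE, so gargle never has to guess which cached token applies. The address below is a placeholder for whichever account the token was minted for.

library(gmailr)

# Hypothetical hardened setup: a pinned email makes non-interactive
# token lookup deterministic; the cache folder matches the question.
gm_auth_configure(path = "data/credentials.json")
gm_auth(email = "me@example.com", cache = ".secret")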

You can try cleaning the cache in the gargle folder and then creating a new one. It worked for me when I had a similar problem: gm_auth function with gargle_oauth_cache stop working
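A minimal sketch of that cache-cleaning suggestion, assuming the project-local ".secret" cache from the question (gargle's default cache location is reported by gargle::gargle_oauth_cache()):

library(gmailr)

# Remove the stale cached token(s), then mint a fresh one by running
# the auth flow once interactively.
unlink(".secret", recursive = TRUE)
gm_auth_configure(path = "data/credentials.json")
gm_auth(email = TRUE, cache = ".secret")  # run this once interactively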

Related

Authenticating Google Cloud Storage in R Studio

I know a similar question has been asked (link), but the response didn't work for me.
TLDR: I keep running into errors when trying to authenticate Google Cloud Storage in RStudio. I'm not sure what is going wrong and would love advice.
I have downloaded both the GCS_AUTH_FILE (created a service account with service admin privileges and downloaded the key associated with the service account) and the GAR_CLIENT_WEB_JSON (created an OAuth 2.0 Client ID and downloaded the associated JSON file).
I've tried authenticating my Google Cloud Storage in several ways and hit different errors.
Way 1 - automatic setup:
gcs_setup()
Then I select any one of the options and get the error: Error in if (file.exists(local_file)) { : argument is of length zero. That error happens no matter which of the three options I select.
Way 2 - basic, following manual setup instructions from the package:
Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket",
"GCS_AUTH_FILE" = "/fullpath/to/service-auth.json")
gcs_auth()
In this case, GCS_AUTH_FILE is the file that I mentioned at the beginning of this post, and the GCS_DEFAULT_BUCKET is the name of the bucket. When I run the first line, it seems to be working (nothing goes awry and it runs just fine), but when I run gcs_auth() I get taken to a web browser page that states:
"Authorization Error
Error 400: invalid_request
Missing required parameter: client_id"
Way 3: Following the method from the post that I linked above
This way involves manually setting the .Renviron file with the GCS_AUTH_FILE and GAR_CLIENT_WEB_JSON locations and then running gar_auth(). And yet again, I get the exact same error as in Way 2.
Any ideas about what could be going wrong? Thanks for your help. I wasn't sure how to put in totally reproducible code in this case, so if there is a way I should do that, please let me know.
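For what it's worth, the "Missing required parameter: client_id" page usually means the browser flow was started without an OAuth client registered. Below is a hedged sketch of two possible fixes, assuming the googleAuthR/googleCloudStorageR stack (file paths are placeholders, and Option B assumes a googleCloudStorageR version where gcs_auth() accepts json_file):

library(googleAuthR)
library(googleCloudStorageR)

# Option A: register the OAuth client before authenticating, so the
# browser request carries a client_id.
gar_set_client(json = "/fullpath/to/client-web.json",
               scopes = "https://www.googleapis.com/auth/devstorage.full_control")
gcs_auth()

# Option B: authenticate directly with the service-account key, which
# never opens a browser at all.
gcs_auth(json_file = "/fullpath/to/service-auth.json")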

WordPress wp_safe_remote_get() when offline

Your code must keep executing after wp_safe_remote_get() fails while the site is offline; by default, an unchecked failure from wp_safe_remote_get() stops code execution. The error occurs not when we try to download the content, but when we use the returned variable afterwards. Before using the downloaded content, you need to check the variable for errors with the is_wp_error() function.

ERROR: [_parse_http_data] invalid HTTP method in shiny app

When I load my Docker Shiny app's domain name in the browser, it crashes (greys out) and I get this error: "ERROR: [_parse_http_data] invalid HTTP method".
I have developed a web application that consists of a Shiny app (with a login feature connected to an RMySQL database), a website, and a MariaDB database. I put them together in a docker-compose file and tested it on my local computer, where it works fine. I then deployed them to a Kubernetes cluster on GCE, which was also successful. I used Cloudflare to install an SSL certificate for the Shiny app domain (i.e. trnddaapp.com). Now when I load the Shiny app domain in the browser, it appends the https and loads the app successfully, but after about a minute it crashes (greys out). I loaded the Shiny app's external IP with http and this doesn't crash.
The closest solution I have come to is https://github.com/rstudio/shiny-server/issues/392, but there doesn't seem to be any other solution to my problem. I would be grateful if anyone could help me resolve this problem.
This is the error message I get when I check with kubectl logs [app pod name]:
ERROR: [_parse_http_data] invalid HTTP method
ERROR: [_parse_http_data] invalid HTTP method
ERROR: [_parse_http_data] invalid HTTP method
I expect the app not to crash when the Shiny app domain (trnddaapp.com) is loaded over https.
Let's start with an analysis of the error message. It says:
[_parse_http_data]
So we know that your app is receiving something, but it doesn't understand what it is (it may be malformed HTTP/1.0 or HTTP/1.1, or even binary data). Then we have an
invalid HTTP method
Now we are sure it is not an HTTP/1.x call but a stream of (unrecognized) data.
We now know it is not the instance, since it "deploys" and "delivers" the service; something inside it is just breaking.
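To make the "binary data" possibility concrete (speculation on my part, but consistent with the Cloudflare SSL setup described in the question): if TLS-encrypted traffic reaches a plain-HTTP listener, the first bytes are a handshake record rather than an ASCII method token, which is exactly what this parser error looks like.

# A TLS ClientHello starts with raw bytes 0x16 0x03 ..., while an
# HTTP/1.x request must start with an ASCII method token such as "GET".
tls_start  <- as.raw(c(0x16, 0x03, 0x01))
http_start <- charToRaw("GET")
print(tls_start)   # 16 03 01 -- not a valid method token
print(http_start)  # 47 45 54 -- "GET"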
There are a few things that may be happening. Since it runs on your local machine (where I am assuming it has access to more resources, especially memory), it may be an issue of resource allocation: once run in a container, it could be exhausting its allocated resources and breaking (perhaps a library that is called at run time uses a chunk of memory?). But we won't be sure unless we can debug it inside a container. So, could you add a debug library that records your requests, to see whether it parses all of them, at what point it stops, and why? I know a person from RStudio created an httpuv build that logs every request; it can be installed as:
devtools::install_github('rstudio/httpuv#wch-print-req')
After that, maybe share the output so we can see why the application is behaving like that and killing its own service.
I really thank you in advance; hopefully with those logs we will be able to shed more light on this matter.
Thanks once again!
-JP

R / Shiny promises and futures not working with httr

I am working on a Shiny app that connects to Comscore using their API. Any attempt at executing POST commands inside futures/promises fails with the cryptic error:
Warning: Error in curl::curl_fetch_memory: Bulk data encryption algorithm failed in selected cipher suite.
This happens with any POST attempt, not only when/if I try to call Comscore's servers. As an example of a simple, harmless and uncomplicated POST request that fails, here is one:
library(httr)
library(future)
rubbish <- future(POST('https://appsilon.com/an-example-of-how-to-use-the-new-r-promises-package/'))
print(value(rubbish))
But everything works fine if I do not use futures/promises.
The problem I want to solve is that currently we have an app that works fine in a single user environment, but it must be upgraded for a multiuser scenario to be served by a dedicated Shiny Server machine. The app makes several such calls in a row (from a few dozen to some hundreds), taking from 5 to 15 minutes.
The code is running inside an observeEvent block, triggered by a button clicked by the user when he has configured the request to be submitted.
My actual code is longer, and there are other lines both before and after the POST command in order to prepare the request and then process the received answer.
I have verified that all lines before the POST command get executed, so the problem seems to be there, in the attempt to do a POST connecting to the outside world from inside a promise.
I am using RStudio Server 1.1.453 along with R 3.5.0 on a RHEL server.
Package versions are:
shiny: 1.1.0
httr: 1.3.1
future: 1.9.0
promises: 1.0.1
Thanks in advance,
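A hedged workaround sketch (my assumption, not a verified diagnosis): if the cipher-suite error comes from TLS/OpenSSL state being inherited across a forked process, evaluating the request in a multisession worker, which starts as a brand-new R process, avoids sharing that state. The URL is a placeholder.

library(httr)
library(future)

# Multisession workers are fresh R sessions, so no TLS/OpenSSL state
# leaks in from the parent (unlike forked multicore workers).
plan(multisession)

f <- future({
  POST("https://httpbin.org/post", body = list(x = 1), encode = "json")
})
print(status_code(value(f)))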

R Googlesheets package and gs_auth() function - Non interactive auto-refresh stale Token issue

I am running an R script that reads a Google Sheet every hour, using the googlesheets package and the gs_auth() function. Initially I store the token from running gs_auth() with interactive authentication. From then on, the code just reads that saved token and runs gs_auth(token = ...) for authentication. Here is the sample code.
## One time execution to save token ##
token <- gs_auth(cache = TRUE)
saveRDS(token, file = "/myfilepath/googlesheets_token.rds")
## reading the saved token everytime the R script runs ##
gs_auth(token = "/myfilepath/googlesheets_token.rds")
This works fine for a couple of hours and then gives me this error:
Auto-refreshing stale OAuth token.
Error in function_list[[k]](value) : Unauthorized (HTTP 401).
In addition: Warning message:
Unable to refresh token
Each time this happens, I run the two-line one-time code and store a fresh token; this again runs for a couple of hours and then gives me the same error.
I have also used cache = FALSE instead of TRUE, though I don't have a clear idea of which one to use or what its purpose is. That didn't work either.
I also tried refreshing the token each time it is read from the local directory, before using it in gs_auth():
t <- readRDS(file = "/myfilepath/googlesheets_token.rds")
t$refresh()
gs_auth(token = t)
Is there any way I could resolve this problem and make the authentication work every time without an interactive session?
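For comparison, a hedged sketch of the same flow on the newer googlesheets4/gargle stack, where a cached token is found and auto-refreshed on every run. This assumes migrating off the superseded googlesheets package is an option; the email, cache path, and sheet ID are placeholders.

library(googlesheets4)

# First run (interactive): does the browser dance once and caches the
# token under .secrets/. Every later run (non-interactive): the same
# call silently reuses and refreshes that cached token.
gs4_auth(email = "me@example.com", cache = ".secrets")

dat <- read_sheet("SHEET_ID_PLACEHOLDER")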
