I built a simple Shiny app that downloads tweets from a particular account and displays some simple statistics and graphs (sentiment analysis, word clouds, etc.). I used the rtweet package, and I would like to publish the app at https://www.shinyapps.io/. It works as intended locally, using a Twitter auth token saved in my global environment.
How should I safely authorize my app when publishing it online? Hardcoding my API keys into the script feels like a terrible idea.
You could use library(secret) and add your API key to a vault. In your Shiny application you then add a field where your private key must be provided, and with this key you can retrieve the API key from the vault.
Alternatively, you can add a field in your app where the API key is entered directly.
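A rough sketch of the vault approach, assuming the secret package's create_vault()/add_user()/add_secret()/get_secret() interface (the vault path, email and key names are placeholders):
library(secret)

#one-off setup, done locally (not in the deployed app)
vault <- "vault"                      #vault directory shipped with the app
create_vault(vault)
add_user("me@example.com", local_key(), vault = vault)
add_secret("twitter_api_key", "MY-REAL-API-KEY",
           users = "me@example.com", vault = vault)

#later, e.g. in server.R: get_secret() needs the matching private key,
#which is what the extra input field in the app would supply
api_key <- get_secret("twitter_api_key", key = local_key(), vault = vault)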
I found the answer I needed by using these two sets of instructions together:
https://docs.ropensci.org/rtweet/articles/auth.html#save
How to pass environment variables to shinyapps
This allowed me to publish the app to shinyapps.io without hardcoding any secret information into the app. Instead I used the functions rtweet::rtweet_app and rtweet::auth_as like this at the top of the server.R file:
app <- rtweet::rtweet_app(bearer_token = Sys.getenv("MY_BEARER_TOKEN"))
rtweet::auth_as(app)
The part saying Sys.getenv("MY_BEARER_TOKEN") retrieves the token from an environment variable that you store according to recipe 2 above (the bearer token that you need to put in that .Renviron file is obtained from your app project on the Twitter developer platform). The only thing to note about the recipe in link 2 is that you should not store the .Renviron file locally on your computer, but inside the app that you publish to shinyapps.io (as commented by the user Erik Iverson: "This worked for me after creating a copy of my .Renviron file in the root directory of my Shiny application").
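For illustration, the .Renviron file placed in the app's root directory (and deployed with the app) only needs a single line; the value below is a placeholder for your own bearer token from the Twitter developer portal:
MY_BEARER_TOKEN=your-bearer-token-here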
Related
I'm trying to deploy my app that has the Stripe payments extension installed. Everything works fine on the emulator (I am not using Stripe for the moment), but when I try to deploy, everything looks OK except the deploy fails with this message:
Error: firestore-stripe-payments: Found
'projects/1234/secrets/ext-firestore-stripe-payments-STRIPE_API_KEY/versions/1'
for secret param STRIPE_API_KEY, but this instance was previously
using a different secret
projects/1234/secrets/firestore-stripe-payments-STRIPE_API_KEY.
Changing secrets is not supported. If you want to change the value of
this secret, use a new version of
projects/1234/secrets/firestore-stripe-payments-STRIPE_API_KEY. You
can create a new version at
https://console.cloud.google.com/security/secret-manager?project=1234
So I go to the link, without knowing how my emulator API key compares to the production one (I haven't activated the Stripe account). There I find two keys: one is firestore-stripe-payments-STRIPE_API_KEY and the other is ext-firestore-stripe-payments-STRIPE_API_KEY. When I click to create a new version, it asks for a file containing a secret.
So my main problem is: do I need to create a new version of both secrets, or only one? My second problem is: where do I find those keys, and does the test API key work on deployment?
Am I missing something here? Am I on the right path?
I'm trying to run my Cloud Functions locally using the guide below:
https://firebase.google.com/docs/functions/local-emulator
I'd like to be able to use the Admin SDK in my local functions. I've downloaded the JSON admin keys from the Service Accounts pane of the Google Cloud Console, and the guide says to add them using
export GOOGLE_APPLICATION_CREDENTIALS="path/to/key.json"
I generated the keys using
PROJECTNAME@appspot.gserviceaccount.com, which has App Engine default service account credentials,
NOT
firebase-adminsdk-CODE@PROJECTNAME.iam.gserviceaccount.com, which has firebase-adminsdk credentials.
What I tried
I saved the file to a separate folder and provided the path relative to root. Then I executed this command in Terminal while in my functions folder. It didn't give me any response; Terminal just went to the next line.
export GOOGLE_APPLICATION_CREDENTIALS="/Users/[user]/Documents/[PROJECT]/Service_Account/file_name.json"
Questions:
Did I download/use the right JSON credentials?
Is there a certain place I need to save that .json file, or can it be anywhere on my system?
Does that path need to be from root? Or relative to my functions folder?
Where do I need to execute this command?
Should it provide some sort of response that it worked? How do we know if it does?
I have been trying to create a connection between Google Cloud Storage and an RStudio Server (the one I spun up in Google Cloud), so that I can access the files in R to run some analysis on them.
I have found three different ways to do it on the web, but I haven't found much clarity around them so far:
Access the file by using the public URL specific to the file [This is not an option for me]
Mount the Google Cloud Storage bucket as a disk on the RStudio server and access it like any other files on the server [I saw someone post about this method but could not find any guides or materials showing how it is done]
Use the googleCloudStorageR package to get full access to the Cloud Storage bucket.
Option 3 looks like the pretty standard way to do it, but I get the following error when I run the gcs_auth() command:
Error in gar_auto_auth(required_scopes, new_user = new_user, no_auto = no_auto, :
  Cannot authenticate - options(googleAuthR.scopes.selected) needs to be set to include
  https://www.googleapis.com/auth/devstorage.full_control or
  https://www.googleapis.com/auth/devstorage.read_write or
  https://www.googleapis.com/auth/cloud-platform
The guide on how to connect using this package is found at
https://github.com/cloudyr/googleCloudStorageR
but it says it requires a service-auth.json file to set the environment variables, plus various other keys and secret keys, without really specifying what these are.
If someone could help me understand how this is actually set up, or point me to a good guide on setting up the environment, I would be very grateful.
Thank you.
Before using any Google Cloud services you have to attach your card.
So, assuming you have created the account: go to the Console, create a Project if you have not created one yet, then in the sidebar find APIs & Services > Credentials.
Then,
1) Create a Service Account key and save the file as JSON - you can only download it once.
2) Create an OAuth 2.0 client ID - give the name of the app, select the type as web application, and download the JSON file.
Now for Storage, go to the sidebar, find Storage and click on it.
Create a bucket and give the bucket a name.
I have added a single image to the bucket; you can add one too for the code below.
Let's look at how to download this image from Storage; for other things you can follow the link that you have given.
First, create an environment file named .Renviron and save it in the working directory, so the JSON files are picked up automatically.
In the .Renviron file, reference the two downloaded JSON files like this:
GCS_AUTH_FILE="serviceaccount.json"
GAR_CLIENT_WEB_JSON="Oauthclient.json"
#R part
library(googleCloudStorageR)
library(googleAuthR)

#set the scope before authenticating
gar_set_client(scopes = c("https://www.googleapis.com/auth/devstorage.read_write",
                          "https://www.googleapis.com/auth/cloud-platform"))
gcs_auth() # authenticate (picks up the service account file from GCS_AUTH_FILE)

gcs_get_bucket("your_bucket_name")    #name of the bucket that you have created
gcs_global_bucket("your_bucket_name") #set it as the global bucket
gcs_get_global_bucket()               #check that your bucket is set as global; it should print your bucket name

objects <- gcs_list_objects()         #list the objects in the bucket
names(objects)
gcs_get_object(objects$name[[1]], saveToDisk = "abc.jpeg") #save the first object to disk
Note: if the JSON files don't get loaded, restart the session using .rs.restartR()
and check with
Sys.getenv("GCS_AUTH_FILE")
Sys.getenv("GAR_CLIENT_WEB_JSON")
#they should show the files
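If you also want to write a file back to the bucket, here is a minimal sketch using the same authenticated session and global bucket (the file and object names are just examples):
#upload a local file to the global bucket set above
gcs_upload("abc.jpeg", name = "copies/abc.jpeg")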
You probably want the FUSE adaptor - this will allow you to mount your GCS bucket as a directory on your server.
Install gcsfuse on the R server.
Create a mnt directory.
Run gcsfuse your-bucket /path/to/mnt
Be aware though that read/write performance isn't great via FUSE.
Full documentation
https://cloud.google.com/storage/docs/gcs-fuse
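Once mounted, the bucket can be read from R like any local directory. A small sketch, assuming the mount point above and an illustrative file name:
#the mounted bucket now behaves like an ordinary directory
list.files("/path/to/mnt")
df <- read.csv("/path/to/mnt/mydata.csv")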
I recently started building my first Firebase app, and I'm unsure how to create test users.
For non-user test data, I can keep a testdata.json file in my codebase and import it via the Firebase Console, but there doesn't seem to be an equivalent mechanism for users.
I'm aware that the latest version of firebase-tools (v3.2.0, released 4 days ago) added an auth:import command, but when I checked the docs, I saw that it expected password hashes to be pre-generated, which is not something I know how or want to do manually.
If there were an equivalent auth:export command that generated a file appropriate for feeding to auth:import, then I could use the Firebase Console to manually create a few users, export them to a file, and check it into my codebase (just like testdata.json), but there is no such command.
Even then, the fact that the Firebase Console doesn't let you set basic profile attributes (like displayName) on users is yet another obstacle...
There are three ways to create email/password users:
through the API
through the Firebase Console
by importing them with the Firebase CLI
For your use-case all three of these sound equally applicable. If you're having trouble getting one working, edit your question to include the minimal steps that reproduce the problem. If you'd like to request a fourth way, I recommend filing a feature request.
For anyone following along with this post, it appears the Firebase CLI added this export feature:
firebase auth:export users.json
However, it's not clear to me how user password hashes are preserved for auth:import.
The easiest way to add test users is with JSON + the Firebase console:
Go to your project -> Develop -> Database
Click the menu -> Export JSON to see your DB structure
Add new users (or other data) to this JSON
Go to your project -> Develop -> Database and delete your database
Click the menu -> Import JSON and paste your new JSON file
Now your DB contains test users!
I have a Shiny app making use of the googlesheets package, which requires the user to authenticate for writing to Google Sheets (despite the Sheet being public and "published to the web"). This is done graphically in a browser and works when the Shiny app is run locally. However, when deployed on shinyapps.io, the authentication call crashes the app and logs an error:
Warning: Error in : oauth_listener() needs an interactive environment.
So, what options are there? I'm thinking it would be a bad idea to upload my own .httr-oauth file or token to shinyapps.io... Any workarounds?
In case anyone else has a problem with this, the new {googlesheets4} provides a solution:
https://googlesheets4.tidyverse.org/articles/articles/auth.html
If you don’t need to access private Sheets, use gs4_deauth() to
indicate there is no need for a token. This puts googlesheets4 into a
de-authorized mode.
(an example script is also included)
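For example, reading a public Sheet without any credentials should look roughly like this (the Sheet URL is a placeholder):
library(googlesheets4)
gs4_deauth() # no token needed for public Sheets
read_sheet("https://docs.google.com/spreadsheets/d/your-sheet-id")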