Firestore Run Functions Locally with Admin - firebase

I'm trying to run my Cloud Functions locally using the guide below:
https://firebase.google.com/docs/functions/local-emulator
I'd like to be able to use the Admin SDK in my local functions. I've downloaded the JSON admin keys from the Service Accounts pane of the Google Cloud Console, and the guide says to register them using
export GOOGLE_APPLICATION_CREDENTIALS="path/to/key.json"
I generated the keys using
the PROJECTNAME@appspot.gserviceaccount.com account that has
App Engine default service account credentials,
NOT
firebase-adminsdk-CODE@PROJECTNAME.iam.gserviceaccount.com with firebase-adminsdk credentials.
What I tried
I saved the key to a separate folder and provided the absolute path from root. I executed this command in Terminal while in my functions folder. It didn't give me any response; it just went to the next line in Terminal.
export GOOGLE_APPLICATION_CREDENTIALS="/Users/[user]/Documents/[PROJECT]/Service_Account/file_name.json"
Questions:
Did I download/use the right JSON credentials?
Is there a certain place I need to save that .json file? Or can it be anywhere on my system?
Does that path need to be from root? Or relative to my functions folder?
Where do I need to execute this command?
Should it provide some sort of response that it worked? How do we know if it does?
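As an aside on the last two questions: export succeeds silently, so no output is expected. One quick way to confirm the variable is visible to Node processes started from the same terminal session is a throwaway script like the following (the file name check.js is just an example):

// check.js - run with: node check.js
// prints the exported path, or "not set" if the variable is missing
console.log(process.env.GOOGLE_APPLICATION_CREDENTIALS || "not set");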

Related

Google Secret Manager Permissions For Local Emulating of Functions

I've given the service account for the functions the necessary permissions ('Secret Manager Secret Accessor') and when deployed, the firebase functions are able to access the secrets without any problems.
However, when using firebase serve or firebase emulators:start --only functions in local development, I'm getting the following error:
Unhandled error Error: 7 PERMISSION_DENIED: Permission 'secretmanager.versions.access' denied for resource
I've found in the documentation that export GOOGLE_APPLICATION_CREDENTIALS=pathtoserviceaccount.json needs to be entered in the terminal, though this did not work for me either.
I would be thankful for all pointers. Cheers.
I've found the answer myself:
When the functions are emulated locally, they are not run as the App Engine default service account by default; this needs to be set up as well.
So I had to follow this tutorial https://firebase.google.com/docs/functions/local-shell
The App Engine default service account needs a key, which can be created in the Service Accounts settings in the Google Cloud console, and then
I had to enter
export GOOGLE_APPLICATION_CREDENTIALS="path/to/key.json"
in the terminal. Then, by running firebase emulators:start, the emulated functions also got permission to access Secret Manager.
So while I was on the right track, I was exporting the wrong service account key, not the one that was allowed to access Secret Manager.
In order to access Secret Manager from your Firebase application running in the local emulator, you need to add the role
"Secret Manager Secret Accessor" to YOUR account used to authenticate with Firebase.
You can verify which account that is by running firebase login in the local CLI.
If you're already logged in, it should respond with Already logged in as [email address].
This email address is the principal account you need to add the role to.
As you've mentioned in your question, the "firebase-adminsdk" service account's permissions are used in the production deployment, but not locally, unless you specify it with export GOOGLE_APPLICATION_CREDENTIALS="path/to/key.json".
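As a minimal sketch of how the Admin SDK picks this up (assuming a standard Node.js functions setup; none of this code is from the question):

// index.js - with GOOGLE_APPLICATION_CREDENTIALS exported, initializeApp()
// with no arguments falls back to Application Default Credentials,
// i.e. the exported service account key file.
const admin = require("firebase-admin");
admin.initializeApp();
const db = admin.firestore(); // now authenticated as that service account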
"[...] you can override secrets values by setting up a .secret.local file. This makes it easy for you to test your functions locally, especially if you don't have access to the secret value."
https://firebase.google.com/docs/functions/config-env#secrets_and_credentials_in_the_emulator
Step by step:
Make sure you have the latest version of firebase-tools installed, as this feature is relatively new.
Create a file named .secret.local in the root of your Firebase project (alongside the .firebaserc and firebase.json files).
Add your secrets to the file, formatted the same way as a regular .env file, e.g.
MY_SECRET_1=foo
MY_SECRET_2=bar
Run the emulator: firebase emulators:start
Within your Firebase functions, access the secrets on the process object, e.g. process.env.MY_SECRET_1 (see the sketch after this list).
Note: as far as I can tell, the secrets are only available inside the block scope of a function handler; you can't access them in the root scope of your functions JS code (if somebody finds a way to do that, please comment here as I'd love to know too).
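A minimal sketch of those last two points, assuming the v1 firebase-functions API (the function name demo and the secret name are just examples):

const functions = require("firebase-functions");

// console.log(process.env.MY_SECRET_1); // root scope: undefined in the emulator

exports.demo = functions
  .runWith({ secrets: ["MY_SECRET_1"] }) // declare which secrets the function uses
  .https.onRequest((req, res) => {
    // inside the handler, the emulator injects the value from .secret.local
    res.send(`MY_SECRET_1 is ${process.env.MY_SECRET_1 ? "set" : "missing"}`);
  });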
I had the same problem and tried the solution from pureth's answer of adding local overrides, but it didn't work. What did work for me was to create the .secret.local file in the functions directory, not in the project root.
My project structure is as follows:
/
|- .firebaserc
|- firebase.json
|- package.json
|- /* ... */
|- functions/
   |- .secret.local
   |- package.json
   |- /* ... */
So the .secret.local file needs to be placed in the directory where your functions reside and not where the .firebaserc file is.
Also, please note that the file name starts with a dot.
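For reference, the "functions directory" here is whatever the functions source in firebase.json points at; a typical default looks like this (a sketch, assuming the standard layout):

{
  "functions": {
    "source": "functions"
  }
}

So .secret.local belongs next to the package.json inside that source directory.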

How can I programmatically update environment config file for firebase functions?

I need to programmatically update some parameters that I keep in the config file for my Firebase functions, so that the parameter values are updated and the appropriate functions are redeployed without me needing to manually deploy them again.
I have tried Google Cloud Build and Cloud Run. I can redeploy a function, but I can't seem to be able to update the config.json before deploying the function.
It's not possible to change any of the content or configuration after deployment without deploying again. If you require dynamically modifiable configuration, you're going to have to provide that yourself, via a database query or something similar that the function can do at runtime to see if there is an update.
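A minimal sketch of that runtime-lookup approach, using Firestore (the document path settings/runtime and the function name worker are hypothetical):

const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

// Read the current parameters on each invocation instead of baking them
// into deployed config, so they can change without a redeploy.
async function getRuntimeConfig() {
  const snap = await admin.firestore().doc("settings/runtime").get();
  return snap.exists ? snap.data() : {};
}

exports.worker = functions.https.onRequest(async (req, res) => {
  const cfg = await getRuntimeConfig();
  res.json(cfg); // use cfg values in place of the old config.json parameters
});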

Accessing files from Google cloud storage in RStudio

I have been trying to create a connection between Google Cloud Storage and an RStudio server (the one I spun up in Google Cloud), so that I can access the files in R to run some analysis on them.
I have found three different ways to do it on the web, but I haven't seen much clarity around these ways so far.
Access the file by using the public URL specific to the file [this is not an option for me].
Mount the Google Cloud Storage bucket as a disk in the RStudio server and access it like any other file in the server [I saw someone post about this method but could not find any guides or materials that show how it's done].
Use the googleCloudStorageR package to get full access to the Cloud Storage bucket.
Option 3 looks like the standard way to do it, but I get the following error when I run the gcs_auth() command:
Error in gar_auto_auth(required_scopes, new_user = new_user, no_auto = no_auto, :
Cannot authenticate - options(googleAuthR.scopes.selected) needs to be set to include
https://www.googleapis.com/auth/devstorage.full_control or
https://www.googleapis.com/auth/devstorage.read_write or
https://www.googleapis.com/auth/cloud-platform
The guide on how to connect using this package is found at
https://github.com/cloudyr/googleCloudStorageR
but it says it requires a service-auth.json file to set the environment variables, along with various other keys and secret keys, without really specifying what these are.
If someone could help me know how this is actually setup, or point me to a nice guide on setting the environment up, I would be very much grateful.
Thank you.
Before using any Google Cloud services you have to attach a payment card.
So, I am assuming you have created the account. After creating the account, go to the Console; if you have not created a project yet, create one, then in the sidebar find APIs & Services > Credentials.
Then,
1) Create a service account key and save the file as JSON; you can only download it once.
2) Create an OAuth 2.0 client ID: give the app a name, select Web application as the type, and download the JSON file.
Now, for Storage, go to the sidebar, find Storage, and click on it.
Create a bucket and give it a name.
I have added a single image to the bucket; you can do the same for the purposes of the code below.
Let's look at how to download this image from Storage; for other operations you can follow the link you provided.
First create an environment file named .Renviron in your working directory, so the JSON files are picked up automatically.
In the .Renviron file, reference the two downloaded JSON files like this:
GCS_AUTH_FILE="serviceaccount.json"
GAR_CLIENT_WEB_JSON="Oauthclient.json"
# R part
library(googleCloudStorageR)
library(googleAuthR)
# set the client and scopes BEFORE authenticating, otherwise you get the
# "scopes.selected needs to be set" error from the question
gar_set_client(scopes = c("https://www.googleapis.com/auth/devstorage.read_write",
                          "https://www.googleapis.com/auth/cloud-platform"))
gcs_auth() # authenticate (picks up GCS_AUTH_FILE from .Renviron)
gcs_get_bucket("your_bucket_name")    # name of the bucket that you have created
gcs_global_bucket("your_bucket_name") # set it as the global bucket
gcs_get_global_bucket() # check that your bucket is set as global; it should print your bucket name
objects <- gcs_list_objects() # list the objects in the bucket
names(objects)
gcs_get_object(objects$name[[1]], saveToDisk = "abc.jpeg") # save the data to disk
Note: if the JSON files don't get loaded, restart the session using .rs.restartR()
and check using
Sys.getenv("GCS_AUTH_FILE")
Sys.getenv("GAR_CLIENT_WEB_JSON")
# both should print the file paths
You probably want the FUSE adaptor - this will allow you to mount your GCS bucket as a directory on your server.
Install gcsfuse on the R server.
Create a mnt directory.
Run gcsfuse your-bucket /path/to/mnt
Be aware though that read/write performance isn't great via FUSE.
Full documentation:
https://cloud.google.com/storage/docs/gcs-fuse

Is it possible to retrieve a document from Cloud Storage from linux shell script?

I am currently working on a personal application where I upload documents/pictures/videos from my phone to Cloud Storage. Meanwhile, my computer at home is constantly running a shell script waiting for a new document to be uploaded to Cloud Storage; after it finds an uploaded file, it downloads it, does some work on it, and then deletes it.
I can figure out how to upload and connect my application to Firebase, but I am not sure if it's possible for a shell script to do the remaining work.
Should I look into some other service to do this, or another method?
Thank you for your help!
You can use a command line program called gsutil to upload and download files from a Cloud Storage bucket. This should be easy to use from a shell script.
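gsutil in a cron-driven shell script is the simplest route. If you'd rather stay in code, here is a rough polling sketch in Node.js using the @google-cloud/storage client (the bucket name, interval, and downloads folder are assumptions, not from the answer):

const { Storage } = require("@google-cloud/storage");
const bucket = new Storage().bucket("my-app.appspot.com");

async function poll() {
  const [files] = await bucket.getFiles();
  for (const file of files) {
    // assumes ./downloads exists and object names contain no slashes
    await file.download({ destination: `./downloads/${file.name}` });
    // ...do some work on the local copy here...
    await file.delete(); // remove the processed object from the bucket
  }
}
setInterval(() => poll().catch(console.error), 5000); // check every 5 seconds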

Firebase Storage - How to setup a backup

Can someone please advise how to set up a backup for files in Firebase Storage? I am able to make a backup of the database but am not sure how to set up a regular backup for the files (I have images) in Firebase Storage.
How to make local backup of Firebase Storage
There is no built-in method via Firebase. However, since Firebase uses Google Cloud Storage behind the scenes for Firebase Storage, it's possible to use the gsutil tool.
Prerequisites
Make sure Python (2.7.9+) is installed on your machine: python -V
Go to the Google Cloud SDK page and follow the directions to download and install the Google Cloud SDK on your OS.
Steps
At the end of the Google Cloud SDK installation you should have run gcloud init. This will ask you to select your project and authenticate you. Since Firebase uses Google Cloud Platform behind the scenes, your Firebase project should be available as a choice.
In order for gsutil to download the files that were uploaded with Firebase permissions, you need to give your account Firebase privileges. Go to the IAM page and select the email address you signed in with during gcloud init. In the list of available permissions, select Firebase Rules System from the Other category.
Get your Google Storage URL from the Firebase Storage page in the dashboard (towards the top). It should look something like this: gs://<bucket_name>
On the command line on your local machine, navigate to the folder you want to back up to locally. Make sure you are in the folder you want, as the following command will download all files right into the current folder.
Run the gsutil command: gsutil -m cp -R gs://<bucket_name> .
-m enables multithreading for faster downloads if you have many files.
cp is the copy command
-R is recursive. If enabled it will download all files and folders in the specified tree.
You're done! This will run for some time depending on the size of your storage.
This can also be used to make a copy (backup) to another Google Cloud Storage bucket, or to AWS, etc.
Use Google Cloud Transfer Service.
Select your current project
Create Transfer Job
Select source (storage bucket url)
Select destination (click browse and create new bucket)
Use created bucket URL as destination
Configure transfer settings (This is where you can schedule how often the backup runs.)
Click "Create"
If you follow the wizard in the link it will guide you through pretty easily.
There is no built-in backup feature in Cloud Storage for Firebase.
But since it is built on top of Google Cloud Storage, any backup solution for GCS can work for Firebase too. Typically this will involve creating a separate bucket that serves as the backup target for the regular bucket where you store/read files.
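For a Firebase-flavored variant of that, here is a sketch of a scheduled function that copies everything into a backup bucket (the bucket names and schedule are assumptions; Transfer Service, above, is the more robust option):

const functions = require("firebase-functions");
const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

// Once a day, copy every object from the live bucket into the backup bucket.
exports.nightlyBackup = functions.pubsub
  .schedule("every 24 hours")
  .onRun(async () => {
    const [files] = await storage.bucket("my-app.appspot.com").getFiles();
    await Promise.all(
      files.map((f) => f.copy(storage.bucket("my-app-backup").file(f.name)))
    );
  });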
