Using Google Cloud Storage in Google Cloud Build - firebase

According to the documentation, Google Cloud Build can build from Google Cloud Storage or from a repository. But I'm having a hard time finding documentation on how to use files I upload to Google Cloud Storage in the build steps. My intention is for a website where my Jekyll source code is in the repository and my images are in Google Cloud Storage; I'd like to add the images to the Jekyll output and then upload everything to Firebase with these steps.
Thanks for any guidance provided!

For a step in Cloud Build to copy files, you would grant your Cloud Build service account IAM permission to read the files in the bucket (probably something like roles/storage.objectViewer).
Once the service account used by Cloud Build can read the files, you would create a step that pulls the files in before your step that deploys to Firebase.
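For reference, granting that role could look something like this (a sketch; the member assumes the default Cloud Build service account, where PROJECT_NUMBER is your project's number):

# Allow the Cloud Build service account to read objects
gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member='serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com' \
    --role='roles/storage.objectViewer'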
A rough example:
steps:
# _MY_BUCKET is a user-defined substitution (these must start with an underscore)
- name: 'gcr.io/cloud-builders/gsutil'
  args: ['cp', 'gs://${_MY_BUCKET}/*.jpg', '.']
- name: 'gcr.io/$PROJECT_ID/firebase'
  args: ['deploy', '--project=$PROJECT_ID', '--only=hosting']

The Firebase CLI deploy command uses Google Cloud Storage under the hood: it packages your local files, uploads them to Cloud Storage, and the build process takes the files from there.
I don't think they thought about the process of 'merging' your repo files with files from Cloud Storage.
But you can use the steps configuration for Cloud Build and add one step that downloads the assets from Cloud Storage with the gsutil CLI, as in the sketch below.
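Putting both answers together for the Jekyll case, a fuller cloudbuild.yaml could look roughly like this (an untested sketch: my-site-images is a hypothetical bucket, and it assumes Jekyll writes its output to _site):

steps:
# Build the Jekyll site from the repo source
- name: 'ruby:3.2'
  entrypoint: 'bash'
  args: ['-c', 'gem install jekyll bundler && jekyll build && mkdir -p _site/images']
# Merge the images from Cloud Storage into the generated output
- name: 'gcr.io/cloud-builders/gsutil'
  args: ['-m', 'cp', '-r', 'gs://my-site-images/*', '_site/images/']
# Deploy the merged output to Firebase Hosting
- name: 'gcr.io/$PROJECT_ID/firebase'
  args: ['deploy', '--project=$PROJECT_ID', '--only=hosting']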

Related

Can I create and configure a GCP project + Firebase completely from the CLI or a script?

I am using the gcloud and firebase CLIs to explore a reproducible way to create and configure a GCP + Firebase project.
I have created the GCP project using the gcloud CLI. I then used the firebase CLI to run the command firebase init firestore.
Ultimately it ended up outputting...
Error: It looks like you haven't used Cloud Firestore in this project before. Go to https://console.firebase.google.com/project/my-project/firestore to create your Cloud Firestore database.
Is there a way I can "create my Firestore database" using a CLI tool or scripting API, instead of having to navigate to a web GUI and manually execute steps?
Unless Firebase does more behind the scenes, creating a Firestore instance could be as simple as this command from the gcloud CLI reference:
gcloud firestore databases create --region=us-central --project=my-project
The documentation for the entire create-provision cycle (from GCP's perspective) is here, including Terraform details (some commands may have been released since the docs were written).
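Chained with project creation, a fully scripted setup could look roughly like this (a sketch, assuming the gcloud and firebase CLIs are installed and authenticated; my-project is a placeholder ID):

# Create the GCP project
gcloud projects create my-project
# Attach Firebase resources to it
firebase projects:addfirebase my-project
# Provision the Firestore database
gcloud firestore databases create --region=us-central --project=my-project

After this, firebase init firestore should find an existing database instead of raising the error above.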

How do I start the Firebase Storage Emulator with production bucket data?

I'm using the Firebase Emulator to run all Firebase services. I have managed to run the emulator with a backup of my Firestore data by running:
firebase emulators:start --import ./my-directory
... but I can't find a way to do the same with my Storage data.
Firestore has an option to do import and export, while Firebase Storage doesn't have this feature yet (Storage uses upload and download). Currently, there's no native way to import Google Cloud Storage data into the Storage Emulator.
Additionally, you can study how the emulator registers objects by uploading sample files within the emulator and then running the export: you will see that the emulator requires one object and one JSON file that contains its metadata. For now, it'll be up to you to download the objects from your production bucket along with a separate JSON file containing their metadata, and then structure it for import.
There's also an open issue for this on GitHub that you can monitor as well.
Although @RJC's answer is correct, I want to share what I did.
You can add the option --export-on-exit when starting the emulator, and it will export the state of the local Firebase instance to a folder of your choice. So I downloaded all my data from the real Firebase project, uploaded everything through the emulator, then killed the emulator; everything (including storage) got exported, and now I start it with the --import option using the same folder where everything was exported.
This was not too painful since I didn't have that much stuff in my storage.
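Condensed, that workflow could look like this (a sketch; ./production-copy and ./emulator-data are placeholder folders):

# Pull the production files down locally
gsutil -m cp -r gs://my-project.appspot.com ./production-copy
# Start the emulator, re-upload the files through your app or the Emulator UI,
# then stop the emulator so its state (including Storage) gets exported
firebase emulators:start --export-on-exit ./emulator-data
# From then on, start from the exported state
firebase emulators:start --import ./emulator-data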

Deploy Firebase Functions, Rules and Indexes to multiple GCP Projects

We are attempting to deploy Firebase Functions, Rules, and Indexes to multiple projects for tenant isolation of data. We are attempting to use Google Cloud Source Repositories, but Cloud Build in each project is unable to connect to the central project's Source Repository, even though we have added the required Source Repo IAM roles to our Cloud Build service account.
What is a good solution for deploying our Firebase Functions, Rules, and Indexes from a central repository?
You can't receive events from a source repository in another project. Thereby, you can't set up a trigger on a source repository that doesn't belong to your project.
So, you can imagine this workaround to achieve what you want.
Source Project
Create a Pub/Sub topic (push-event, for example)
Configure the trigger that you want, which runs a Cloud Build
In this Cloud Build, format a JSON message with all the push data that you want (commit SHA, type of event, repo name, ...) and publish this message to the push-event topic, as in the example below
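The publishing side can be quite small; for example (a sketch, where event.json stands for whatever payload your build step assembles):

# One-time setup in the source project
gcloud pubsub topics create push-event
# Inside the source project's Cloud Build step
gcloud pubsub topics publish push-event --message="$(cat event.json)"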
Tenant Projects
Create a Cloud Function that triggers Cloud Build (more on that below)
Create a push subscription on the Pub/Sub push-event topic located in the source project (be sure that the account that runs the Terraform has the roles/pubsub.viewer and roles/pubsub.subscriber roles on the push-event topic (or on the source project))
Note: the first thing that you have to do in the Cloud Build execution is to clone the source repository, because you won't have the data automatically downloaded (get the correct source according to the branch, tag, or pull event).
Cloud Functions
I don't know your dev language, but the principle is to perform an API call to the Cloud Build API to launch the build. This API call requires the content of the cloudbuild.json. So, in the Cloud Function, you have two options:
You can clone the source repo (grant the reader permission) into the /tmp directory and then read the cloudbuild.json file to run in your Cloud Build. But that could be difficult in the case of a branch, tag, or pull context.
You can publish, in addition to the other data in the Pub/Sub message from the source project, the content of the cloudbuild.json file for the Cloud Function in the tenant project to run.
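Whatever the runtime, the Cloud Function boils down to the REST call below (a sketch shown with curl; inside a function you would use the runtime's HTTP client with a service-account token instead of gcloud):

# Submit a build to the tenant project's Cloud Build
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d @cloudbuild.json \
  "https://cloudbuild.googleapis.com/v1/projects/$PROJECT_ID/builds"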

Uploading an app directory into Firebase Storage

I am trying to set up my project in Firebase and can't seem to find a one-shot way to upload my entire directory into a Firebase Storage bucket. Any thoughts? Or is this a premium feature?
What you're calling "Firebase Storage" is actually the product Google Cloud Storage with Firebase client APIs and tools added on. You can deal with the storage bucket created by Firebase in the same way that you treat any Cloud Storage bucket. This means you can use the gsutil command line program to copy files from your machine into that bucket. This is not a premium feature.
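For example, a recursive upload of a whole directory could look like this (the default bucket name is usually <project-id>.appspot.com; adjust to yours):

# Copy a local directory tree into the bucket
gsutil -m cp -r ./my-app-directory gs://my-project.appspot.com/my-app-directory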

Firebase Storage - How to set up a backup

Can someone please advise how to set up a backup for files in Firebase Storage. I am able to make a backup of the database but am not sure how to set up a regular backup for the files (I have images) in Firebase Storage.
How to make a local backup of Firebase Storage
There is no built-in method via Firebase. However, since Firebase uses Google Cloud Storage behind the scenes for Firebase Storage, it's possible to use the gsutil tool.
Prerequisites
Make sure Python (2.7.9+) is installed on your machine: python -V
Go to the Google Cloud SDK page and follow the directions to download and install the Google Cloud SDK for your OS.
Steps
At the end of the Google Cloud SDK installation you should have run gcloud init. This will ask you to select your project and authenticate you. Since Firebase uses Google Cloud Platform behind the scenes, your Firebase project should be available as a choice.
In order for the Google Cloud utilities to download the files that were uploaded with Firebase permissions, you need to give your account Firebase privileges. Go to the IAM page and select the email address you signed into gcloud init with. In the list of available permissions, select Firebase Rules System from the Other category.
Get your Google Storage URL from the Firebase Storage page in the dashboard (towards the top). It should look something like this: gs://<bucket_name>
On the command line on your local machine, navigate to the folder you want to back up to. Make sure you are in the right folder, as the following command will download all files into the current folder.
Run the gsutil command: gsutil -m cp -R gs://<bucket_name> .
-m enables multithreading for faster downloads if you have many files.
cp is the copy command.
-R is recursive. If enabled, it will download all files and folders in the specified tree.
You're done! This will run for some time depending on the size of your storage.
This can also be used to make a copy (backup) to another Google Cloud Storage bucket, AWS, etc.
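For the bucket-to-bucket case, a mirror could look like this (gs://my-backup-bucket is a placeholder bucket you would create first):

# Mirror the Firebase bucket into a backup bucket
gsutil -m rsync -r gs://<bucket_name> gs://my-backup-bucket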
Use Google Cloud Transfer Service.
Select your current project
Create a transfer job
Select the source (your storage bucket URL)
Select the destination (click Browse and create a new bucket)
Use the created bucket's URL as the destination
Configure the transfer settings (this is where you can schedule how often the backup runs)
Click "Create"
If you follow the wizard in the link, it will guide you through pretty easily.
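Newer gcloud releases also expose the Storage Transfer Service on the command line, so the same job can probably be scripted; something along these lines (an unverified sketch, check gcloud transfer jobs create --help for the exact flags):

# Create a daily transfer job from the Firebase bucket to a backup bucket
gcloud transfer jobs create gs://<bucket_name> gs://my-backup-bucket \
    --schedule-repeats-every=1d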
There is no built-in backup feature in Cloud Storage for Firebase.
But since it is built on top of Google Cloud Storage, any backup solution for GCS can work for Firebase too. Typically this will involve creating a separate bucket that serves as the backup target for the regular bucket where you store/read files.
