Connecting Firebase Storage and Compute Engine Storage - firebase

I'm building an application that uses Google Cloud Platform; to start with, I'm using Firebase and Google Compute Engine.
From my Android Studio app I create a folder in Firebase Storage.
I want to use this folder on Google Compute Engine, so I mounted the bucket to a directory on the Compute Engine VM. However, I can't see the folder that was created by the application.
When I use the 'gsutil' command I can see the folder in my storage, but when I copied the folder into my directory, the directory was deleted. What's the problem? Please help.

Related

Using Google Cloud Storage in Google Cloud Build

According to the documentation, Google Cloud Build can build from Google Cloud Storage or a repository. But I'm having a hard time finding documentation on how to use files I upload to Google Cloud Storage in the build steps. My intention is a website where my Jekyll source code is in the repository and my images are in Google Cloud Storage; I'd like to add the images to the Jekyll output and then upload to Firebase with these steps.
Thanks for any guidance provided!
For a step in Cloud Build to copy files, you would grant your Cloud Build service account IAM permission to read the files in the bucket (probably something like roles/storage.objectViewer).
Once the service account used by Cloud Build can read the files, you would create a step to pull the files in before your step that deploys to Firebase.
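As a rough sketch, that grant could be made with gcloud (the project ID and project number below are placeholders for your own):
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:123456789012@cloudbuild.gserviceaccount.com" \
    --role="roles/storage.objectViewer"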
A rough example:
steps:
- name: 'gcr.io/cloud-builders/gsutil'
  args: ['cp', 'gs://$MY_BUCKET/*.jpg', '.']
- name: gcr.io/$PROJECT_ID/firebase
  args: ['deploy', '--project=$PROJECT_ID', '--only=hosting']
The Firebase CLI deploy command uses Google Cloud Storage: it packages your local files and uploads them to Cloud Storage, and the build process takes the files from there.
I don't think they thought about the process of 'merging' your repo files and files from Cloud Storage.
But you can use the steps configuration for Cloud Build and have one step download the assets from Cloud Storage with the gcloud CLI.

Mirror Firebase Cloud Storage Directory to a Dropbox Directory

I'm very new to using Firebase Cloud Storage. I'm currently using it to store data generated by some in-house tablet apps. We want users to be able to easily access this data through Dropbox, but we don't want to integrate the Dropbox API into our tablet applications since we're already using Firebase for all login management, etc., and don't want users to have to log in twice.
Is it possible to set up a mirror between Firebase Cloud Storage and a Dropbox account such that any file added to Firebase Cloud Storage is immediately copied to the Dropbox directory?
You can simply use Cloud Functions triggered by the Cloud Storage service on your Firebase project (they are triggered on any upload, update, or deletion of files or folders) and write the functions to call the Dropbox API and make the same change in the Dropbox directory structure.
Cost- and performance-wise I don't think it would be an efficient thing to do, unless it is really what you want.

Firestore Run Functions Locally with Admin

I'm trying to run my Cloud Functions locally using the below guide
https://firebase.google.com/docs/functions/local-emulator
I'd like to be able to use the Admin SDK in my local functions. I've downloaded JSON admin keys from the Service Accounts pane of the Google Cloud Console, and the guide says to add them using
export GOOGLE_APPLICATION_CREDENTIALS="path/to/key.json"
I generated keys using
PROJECTNAME@appspot.gserviceaccount.com, which has
App Engine default service account credentials,
NOT
firebase-adminsdk-CODE@PROJECTNAME.iam.gserviceaccount.com, which has firebase-adminsdk credentials.
What I tried
I saved it to a separate folder and provided the path relative to root, then executed this command in the terminal while in my functions folder. It didn't give me any response; it just went to the next line in the Terminal.
export GOOGLE_APPLICATION_CREDENTIALS="/Users/[user]/Documents/[PROJECT]/Service_Account/file_name.json"
Questions:
Did I download/use the right JSON credentials?
Is there a certain place I need to save that .json file? Or can it be anywhere in my system?
Does that path need to be from root? Or relative to my functions folder?
Where do I need to execute this command?
Should it provide some sort of response that it worked? How do we know if it does?
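As a minimal sketch of that setup, run in the same terminal session that starts the emulator (the path below is a placeholder):
export GOOGLE_APPLICATION_CREDENTIALS="/absolute/path/to/service-account.json"
echo "$GOOGLE_APPLICATION_CREDENTIALS"   # export prints nothing; echo confirms the variable is set
firebase emulators:start --only functions
The export only applies to that shell session, so it needs to be set before starting the emulator from the same session.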

Is it possible to retrieve a document from Cloud Storage from a Linux shell script?

I am currently working on a personal application where I upload documents/pictures/videos from my phone to Cloud Storage. Meanwhile, my computer sitting at home is constantly running a shell script waiting for a new document to be uploaded to Cloud Storage; after it finds an uploaded file, it downloads it, does some work on it, and then deletes it.
I can figure out how to upload and connect my application to Firebase, but I am not sure if it's possible for a shell script to do the remaining work.
Should I look into some other service to do this, or another method?
Thank you for your help!
You can use a command line program called gsutil to upload and download files from a Cloud Storage bucket. This should be easy to use from a shell script.
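A minimal sketch of such a script, assuming a bucket named my-app-uploads and a process_file command of your own:
#!/usr/bin/env bash
# Poll the bucket, pull down anything new, process it, then delete it remotely and locally.
BUCKET="gs://my-app-uploads"
WORKDIR="$HOME/uploads"
mkdir -p "$WORKDIR"

while true; do
  # Copy any objects currently in the bucket (gsutil prints an error when the bucket is empty; ignore it).
  gsutil -m cp "$BUCKET/*" "$WORKDIR"/ 2>/dev/null
  for f in "$WORKDIR"/*; do
    [ -e "$f" ] || continue
    process_file "$f"                       # placeholder for whatever work you do locally
    gsutil rm "$BUCKET/$(basename "$f")"    # remove the processed object from the bucket
    rm "$f"
  done
  sleep 60
done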

Firebase Storage - How to set up a backup

Can someone please advise how to set up a backup for files in Firebase Storage? I am able to make a backup of the database but am not sure how to set up a regular backup for the files (I have images) in Firebase Storage.
How to make a local backup of Firebase Storage
There is no built-in method via Firebase. However, since Firebase uses Google Cloud Storage behind the scenes for Firebase Storage, it's possible to use the gsutil tool.
Prerequisites
Make sure Python (2.7.9+) is installed on your machine: python -V
Go to the Google Cloud SDK page and follow the directions to download and install the Google Cloud SDK on your OS.
Steps
At the end of the Google Cloud SDK installation you should have run gcloud init. This will ask you to select your project and authenticate you. Since Firebase uses Google Cloud Platform behind the scenes, your Firebase project should be available as a choice.
In order for the Google Cloud utilities to download the files that were uploaded with Firebase permissions, you need to give your account Firebase privileges. Go to the IAM page and select the email address you signed into gcloud init with. In the list of available permissions you need to select Firebase Rules System from the Other category.
Get your Google Storage URL from the Firebase Storage page in the dashboard (towards the top). It should look something like this: gs://<bucket_name>
On the command line on your local machine, navigate to the folder you want to back up to. Make sure you are in the folder you want, as the following command will download all files right into the current folder.
Run the gsutil command: gsutil -m cp -R gs://<bucket_name> .
-m enables multithreading for faster downloads if you have many files.
cp is the copy command
-R is recursive. If enabled it will download all files and folders in the specified tree.
You're done! This will run for some time depending on the size of your storage.
This can also be used to make a copy (backup) to another Google Cloud Storage bucket, or to AWS, etc.
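As a sketch, a bucket-to-bucket or bucket-to-S3 copy with gsutil could look like the following, where the destination bucket names are placeholders (the S3 variant assumes AWS credentials in your boto configuration):
gsutil -m rsync -r gs://<bucket_name> gs://my-backup-bucket
gsutil -m rsync -r gs://<bucket_name> s3://my-backup-bucket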
Use Google Cloud Transfer Service.
Select your current project
Create Transfer Job
Select source (storage bucket url)
Select destination (click browse and create new bucket)
Use created bucket URL as destination
Configure transfer settings (This is where you can schedule how often the backup runs.)
Click "Create"
If you follow the wizard in the link it will guide you through pretty easily.
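If you prefer the command line, recent versions of gcloud expose the same Storage Transfer Service; a minimal sketch (bucket names are placeholders, and scheduling is easiest to configure in the console wizard):
gcloud transfer jobs create gs://<bucket_name> gs://my-backup-bucket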
There is no built-in backup feature in Cloud Storage for Firebase.
But since it is built on top of Google Cloud Storage, any backup solution for GCS can work for Firebase too. Typically this will involve creating a separate bucket that serves as the backup target for the regular bucket where you store/read files.
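If you want such a backup to run on a regular schedule from a machine you control, one minimal option is a cron entry around the same gsutil copy (the gsutil path, bucket names, and log file below are placeholders):
# m h dom mon dow  command
0 3 * * * /usr/bin/gsutil -m rsync -r gs://<bucket_name> gs://my-backup-bucket >> /var/log/firebase-storage-backup.log 2>&1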
