Transfer Firebase Storage Bucket between projects

I am attempting to copy the contents of a folder in one Firebase project's Storage bucket to the storage bucket of another Firebase project.
I have been following the firestore docs and this SO question.
Both projects have the necessary permissions to each other's service accounts.
Here is what I have done:
When attempting to transfer files from a folder in the default bucket of Project-A to the default bucket of Project-B using the cloud shell terminal, I first set the project to 'Project-A'. I then ran gcloud beta firestore export gs://[PROJECT_A_ID] --collection-ids=[FOLDER_TO_TRANSFER] --async. This succeeds and creates a folder called "2019-08-26T21:23:23_26014/" in Project-A. This folder contains some metadata.
Next, I tried beginning the import by setting the project to Project-B and running gcloud beta firestore import gs://[PROJECT_A_ID]/2019-08-26T21:23:23_26014
This completes and the logs display this message:
done: true
metadata:
  '#type': type.googleapis.com/google.firestore.admin.v1beta1.ImportDocumentsMetadata
  collectionIds:
  - [FOLDER_TO_TRANSFER]
  endTime: '2019-08-26T21:25:56.794588Z'
  inputUriPrefix: gs://[PROJECT_A_ID]/2019-08-26T21:23:23_26014
  operationState: SUCCESSFUL
  startTime: '2019-08-26T21:25:19.689430Z'
name: projects/[PROJECT_B]/databases/(default)/operations/[SOME_ID_STRING]
response:
  '#type': type.googleapis.com/google.protobuf.Empty
However, the Project-B storage bucket doesn't have any new files or folders. It looks like the import did nothing. Am I missing something?

The gcloud firestore export and import commands operate on Firestore documents, not on Cloud Storage objects, which is why your import succeeded without writing any files to Project-B's bucket. To copy the Storage objects themselves, you can create a transfer job in the GCP Console. You can specify source and destination buckets from different projects as long as you have the access permissions, and you can restrict the transfer to a folder by setting "Specify file filters":
https://console.cloud.google.com/storage/transfer
You can also use the gsutil tool, which is part of gcloud, to move or copy your objects to another bucket.
So your default buckets would be gs://[PROJECT_A_ID].appspot.com and gs://[PROJECT_B_ID].appspot.com. Let's say you wanted to copy over the contents of my_directory:
gsutil cp -r gs://[PROJECT_A_ID].appspot.com/my_directory gs://[PROJECT_B_ID].appspot.com

Related

Error : INVALID_ARGUMENT: Path already exists when scheduled firestore export job runs

I followed the official docs for scheduling firestore export via cloud function and cloud scheduler.
It works perfectly the first time, creating the necessary exports at the right location.
When I run it again, I get the following error in the cloud function:
Error: 3 INVALID_ARGUMENT: Path already exists: /red115.appspot.com/daily_backup/users.overall_export_metadata
Why doesn't it overwrite on existing data?
I followed official docs, gave the necessary roles & permissions to principal account.
I have recreated the setup from the docs you shared; I am assuming you also did the "Configure access permissions" part. I scheduled the Firebase function to run a Firestore export on all collections every 5 minutes, and even after it ran 4 times on that schedule I have not gotten the error you mentioned.
The Firebase Storage Rules you provided do not affect the Firestore export functionality; they apply only to Firebase Storage.
But if you are experiencing this error, I would recommend first checking whether an export already exists at the specified location in that bucket.
If yes: to overwrite existing data on a Firestore export, you could add a step in your Cloud Function that deletes the existing export, if one exists, before running the new export (for example with the bucket's deleteFiles() method scoped to the export prefix; note that admin.storage().bucket().delete() would remove the whole bucket, so be specific here).
OR
You could change the export path to include a timestamp or a version number, so that each export is saved to a unique location (this is what gcloud's own export does by default).
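As a sketch of that unique-path approach (the bucket name red115.appspot.com and the daily_backup folder mirror the error message above; the helper function name is made up):

```python
from datetime import datetime, timezone

def timestamped_export_prefix(bucket: str, folder: str = "daily_backup") -> str:
    """Build a unique export path so repeated scheduled runs never collide."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
    return f"gs://{bucket}/{folder}/{stamp}"

# In the Cloud Function, pass this as the export's outputUriPrefix
# instead of a fixed path that already exists.
print(timestamped_export_prefix("red115.appspot.com"))
```

Each invocation produces a fresh prefix such as gs://red115.appspot.com/daily_backup/2023-01-01T00:00:00, so "Path already exists" can no longer occur.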
If no, then there is likely a typo in the following line where you provide the bucket link: const bucket = 'gs://BUCKET_NAME';
Make sure you provide the same gs://BUCKET_NAME in functions/index.js and in gsutil iam ch serviceAccount:PROJECT_ID@appspot.gserviceaccount.com:admin gs://BUCKET_NAME
This thread also talks about a wrong gsutil path; have a look at it as well.

How to export data from Cloud Firestore to file?

I have an application programmed in Flutter and I use Firebase to collect some information sent by users.
The question is: how can I transfer this information to my computer as a file (JSON, text, etc.)?
Currently, Firestore does not support exporting existing data to a readable file, but it does have a managed export and import service that lets you dump your data into a GCS bucket. It produces the same format that Cloud Datastore uses, which means you can then import it into BigQuery.
However, the community has created a workaround for this limitation. You can use npm if you have it installed on your system. Below are the instructions to export Firestore data to a JSON file using npm.
Generate a private key file for your service account. In the Firebase console, open Settings > Service Accounts.
Click Generate New Private Key, then confirm by clicking Generate Key.
Securely store the JSON file containing the key. You may also check this documentation.
Rename the JSON file to credentials.json.
Run the command below in your console:
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
Follow the instructions prompted on your console.
You can also import data into Firestore using the command below:
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
Below is the resulting backup.json after running the package against a sample Firestore collection containing a single test document:
{"__collections__":{"test":{"Lq8u3VnOKvoFN4r03Ri1":{"test":"test","__collections__":{}}}}}
You can find more information regarding the package here.
The package mentioned by @marc in the above answer, node-firestore-import-export, has a flaw if your database is very large.
This flaw is documented here
https://github.com/jloosli/node-firestore-import-export/issues/815
For this reason I would recommend using https://github.com/benyap/firestore-backfire

How do I delete an image with cloud functions with download url?

I want to delete an image. All I have is the download url.
In Flutter I am able to get the file path from the download URL and use that path to delete the file in Cloud Storage.
Is it possible to get the file path from the download URL and use that path to delete the image from Cloud Functions?
Or is there a better, faster, or more efficient way to delete an image from Cloud Storage with only the download URL?
A Google Cloud Storage object URL has the following parts:
https://storage.cloud.google.com/[bucket_name]/[path/and/the/object/name*]?[authentication_if_needed]
*The path in Cloud Storage is "virtual"; in fact it is an integral part of the object name/identification. The Cloud Console and gsutil simulate folders for the user interface output.
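Given that anatomy, the bucket and object name can be recovered from a download URL by plain string parsing. A minimal sketch in Python, assuming the storage.cloud.google.com form shown above (Firebase's firebasestorage.googleapis.com download links encode the object name differently):

```python
from urllib.parse import unquote, urlparse

def bucket_and_object(url: str):
    """Split a storage.cloud.google.com URL into (bucket_name, object_name)."""
    path = urlparse(url).path.lstrip("/")       # 'bucket/path/to/object'
    bucket, _, object_name = path.partition("/")
    return bucket, unquote(object_name)         # undo %2F-style escaping

bucket, name = bucket_and_object(
    "https://storage.cloud.google.com/my-bucket/images/photo.png?authuser=0"
)
print(bucket, name)  # my-bucket images/photo.png
```

The resulting pair is exactly what the deletion methods below expect as [BUCKET_NAME] and [OBJECT_NAME].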
There are several methods to delete an object:
From the Cloud Console
Using the Cloud SDK command: gsutil rm gs://[BUCKET_NAME]/[OBJECT_NAME]
Using the Client Libraries, for example with Python:
from google.cloud import storage

def delete_blob(bucket_name, blob_name):
    """Deletes a blob from the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.delete()
    print('Blob {} deleted.'.format(blob_name))
Please keep in mind that the user or service account performing the operation needs the proper permissions to delete the object.

Firestore Import - no error, but no changes

I'm attempting to use the instructions here: https://firebase.google.com/docs/firestore/manage-data/export-import to a) do periodic backups of data from my production instance, and b) copy data between production/staging/dev instances.
For what it's worth, each instance is in a separate Firebase project (myapp-dev, myapp-staging and myapp-production), all are on the Blaze plan and each has a corresponding bucket in Cloud Platform (gs://myapp-backup-dev, ...-staging, ...-production).
I've successfully completed all the "Before you begin" steps. I've exported data from one instance/project (staging) into its bucket, and it *seems* that I can also import it back into that project successfully (no error message, operationState: SUCCESSFUL), but any records changed since the export don't 'restore' back to their original values.
And for what it's worth, I've also successfully copied the exported data from that bucket into another project's bucket (staging to dev), and get the same result when I am importing it into the second project (dev).
Am I doing something wrong here? Missing something?
Is the name of your collection testStuff or 'testStuff'? If it's testStuff, it seems like your export command was slightly off. You'll need to export the data again; you should get a workCompleted number this time around.
gcloud beta firestore export gs://myapp-backup-dev --collection-ids='testStuff'
gcloud beta firestore import gs://myapp-backup-dev/2018-10-15T21:38:18_36964 --collection-ids='testStuff'

Circle CI failing with Firebase Admin SDK

I have an Express API using the Firebase Admin SDK.
Currently, my application is failing CI as it cannot initialise on the test stage, due to not being able to connect to Firebase.
I have a serviceAccountKey.json file in the root of my project, which I import as follows:
import * as fbseAdmin from 'firebase-admin'

const FIREBASE_DB_URI = process.env.FIREBASE_DB_URI
const serviceAccount = require('../serviceAccountKey.json')

fbseAdmin.initializeApp({
  credential: fbseAdmin.credential.cert(serviceAccount),
  databaseURL: FIREBASE_DB_URI
})

export default fbseAdmin
This file is excluded from source control as it contains sensitive information.
The first issue I have is that when my CI build runs, the tests fail with Error: Cannot find module '../serviceAccountKey.json'
How is best to approach this? Should I mock the file? I'd prefer not to keep a mock file floating around my solution.
Secondly, I believe the app will fail to start if Firebase does not initialise correctly.
Should I setup a mock Firebase project for testing?
You will require a valid instance of your Firebase cert if you need to initialise your app.
If you prefer not to check your cert file into source control, which I agree is the best idea, you could instead set up another application in the Firebase console, strictly for testing purposes.
Using this app, you could Base64 encode the cert and set it as an environment variable in your build.
Prior to running your app in the CI container, you then simply echo the Base64 environment variable, decoding it into a file named myServiceKey.json or something similar.
steps:
  - checkout
  - run: echo $FIREBASE_SERVICE_KEY | base64 -di > ./${FIREBASE_SERVICE_ACCOUNT_CERT_NAME}
This would produce the required file for testing purposes in the root of your project.
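For illustration, the encode/decode round trip that the CI step relies on can be sketched in Python (the key contents and variable names here are made-up stand-ins for the real service account key):

```python
import base64
import json

# Stand-in for the real serviceAccountKey.json contents (hypothetical values)
key = {"type": "service_account", "project_id": "my-test-project"}
raw = json.dumps(key).encode()

# What you would store in the CI environment variable (e.g. FIREBASE_SERVICE_KEY)
encoded = base64.b64encode(raw).decode()

# What `echo $FIREBASE_SERVICE_KEY | base64 -di > myServiceKey.json` reconstructs
decoded = base64.b64decode(encoded)
assert json.loads(decoded) == key
print("round trip ok")
```

Base64 keeps the multi-line JSON safe to store as a single-line environment variable, which is why it is a common pattern for injecting certs into CI builds.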
