Exporting Firestore collection data, edit and re-upload - firebase

I created a Google Cloud account and have everything set up, but for some reason I can't figure out how to export my Firestore data to a file, edit/add to that file, and re-upload it.
I've read through this several times:
https://firebase.google.com/docs/firestore/manage-data/export-import#import_specific_collections
No code, just real confusion. I am not sure how to export a collection so that I can make bulk edits.

You can use the gcloud CLI to export to a Cloud Storage bucket:
gcloud beta firestore export gs://[BUCKET_NAME]
You can also start an export via the REST API (this, too, writes to a Cloud Storage bucket rather than a local file):
POST https://firestore.googleapis.com/v1beta1/{name=projects/*/databases/*}:exportDocuments
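A minimal sketch of calling that endpoint with curl, assuming gcloud is authenticated; the project ID and bucket name are placeholders:
curl -X POST -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json" -d '{"outputUriPrefix": "gs://[BUCKET_NAME]"}' "https://firestore.googleapis.com/v1beta1/projects/[PROJECT_ID]/databases/(default):exportDocuments"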
There is also an npm package, node-firestore-import-export, which can export to and import from a local JSON file:
firestore-export --accountCredentials path/to/credentials/file.json --backupFile /backups/myDatabase.json
firestore-import --accountCredentials path/to/credentials/file.json --backupFile /backups/myDatabase.json
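A minimal sketch of the edit-and-reupload loop with that tool, reusing only the flags shown above; the credentials and backup file paths are placeholders:
firestore-export --accountCredentials ./serviceAccount.json --backupFile ./myDatabase.json
# edit ./myDatabase.json locally, then write the changes back:
firestore-import --accountCredentials ./serviceAccount.json --backupFile ./myDatabase.json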

Related

Copy Firestore Database Data to Bigquery using Cloud Shell Terminal

Does anyone know how I can manually copy/transfer data from a Firestore database to BigQuery using the Cloud Shell terminal?
I did this in the past but I'm unable to find the documentation/video that I used. I find a lot that states that once BigQuery is connected to Firebase it should be automatic, but mine is not.
When I ran code in the Cloud Shell terminal to pull data from Firebase, the collection was copied as a table into a BigQuery dataset. Two tables, "raw_latest" and "raw_changelog", were created.
I'm not sure how to transfer another collection now.
I specifically need to transfer data from a subcollection in the Firestore database.
You can now export data from Cloud Firestore to BigQuery with a Firebase Extension. Install the extension before importing the previous data: any writes made while the export runs without the extension installed will be lost.
See: https://firebase.google.com/products/extensions/firestore-bigquery-export
Firestore also supports getting data into BigQuery via a GCS bucket: the data is exported to a Cloud Storage bucket and from there it can be loaded into BigQuery.
The gcloud commands for this are:
Export the data:
gcloud beta firestore export --collection-ids=users gs://mybucket/users
Load the backup into BigQuery:
bq load --source_format=DATASTORE_BACKUP mydataset.users gs://mybucket/users/all_namespaces/kind_users/all_namespaces_kind_users.export_metadata
Here are some links that might be helpful:
https://firebase.google.com/docs/firestore/manage-data/export-import
https://cloud.google.com/bigquery/docs/loading-data-cloud-datastore
https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md
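For the subcollection mentioned in the question, a hedged sketch of the same two steps, assuming --collection-ids also matches subcollections with that ID; the bucket, dataset, and "orders" subcollection name are placeholders:
gcloud beta firestore export --collection-ids=orders gs://mybucket/orders
bq load --source_format=DATASTORE_BACKUP mydataset.orders gs://mybucket/orders/all_namespaces/kind_orders/all_namespaces_kind_orders.export_metadata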

Firebase migration between projects - Destination Firestore database doesn't change on import command (No Error Displayed)

The import command from the bucket doesn't affect the Firestore database. I have been trying to migrate the data from one project to another. Both projects are on the Blaze plan and under the same Google account. Both buckets exist in the Google Cloud Console for their respective projects. There was no error shown in Cloud Shell. The following message was shown at the end of the import command.
metadata:
'#type': type.googleapis.com/google.firestore.admin.v1.ImportDocumentsMetadata
inputUriPrefix: gs://rit_test_migration_bucket_pk_dep/2021-05-21T10:13:32_45764
operationState: PROCESSING
startTime: '2021-05-21T10:20:15.394043Z'
name: projects/project-tiedge-test/databases/(default)/operations/AiAyNzc1NzEzMzYJGnRsdWFmZWQHEmVwb3J1ZS1zYm9qLW5pbWRhEQopEg
Source Bucket Name: rit_test_migration_bucket_pk_dev
Destination Bucket Name: rit_test_migration_bucket_pk_dep
Following Documentation: https://firebase.google.com/docs/firestore/manage-data/move-data
Steps done:
1. Created a bucket in the source project through the Google Cloud Console, then exported the database to that bucket with the following command in Cloud Shell:
gcloud firestore export gs://rit_test_migration_bucket_pk_dev --async
2. Transferred the bucket contents to a bucket in the destination project using the transfer page in the Google Cloud Console.
Ref link: https://cloud.google.com/storage/docs/moving-buckets
3. Tried to import the data from that bucket into the destination database with the following command in Cloud Shell:
gcloud firestore import gs://rit_test_migration_bucket_pk_dev/2019-03-05T20:58:23_56418 --async
The bucket name doesn't imply the project on the gcloud command line, so you'll want to be explicit with each command. E.g. gcloud --project=source_project firestore export gs://..., then gcloud --project=destination_project firestore import gs://...; the same bucket path should work for both.
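A minimal sketch of that sequence with explicit --project flags; the bucket name, export prefix, and project IDs are placeholders:
gcloud firestore export gs://[BUCKET_NAME] --project=[SOURCE_PROJECT_ID]
gcloud firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX] --project=[DESTINATION_PROJECT_ID]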

In Google Cloud Firestore, how can I export storage also at the same time as collections?

In Google Cloud Firestore we store collection data and storage data (jpeg files). Is it possible to export them together and import them together? We export like this:
gcloud beta firestore export gs://backups-projectname
...and import like this into another project:
gcloud beta firestore import gs://backups-projectname/2019-06-28T18:18:37_6038/
It does, partially, work: the collection data shows up in the new location. But the storage data doesn't. Is the storage not included in the export? Can it be included with a flag? Or is there a separate command?
Note: I'm not talking about copying the files independently with gsutil cp -r; I can already get the files into Firebase Storage with that command, e.g.:
gsutil cp -r gs://projectname.appspot.com gs://projectname-86abf.appspot.com
I want the Firestore database and file storage to be exported as a single large object and imported from that same large object. If that's not possible, what's the recommended way to accomplish this?
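Since the behavior described above suggests the database export and the storage files are handled separately, a minimal sketch is simply to chain the two commands already shown in the question; the bucket names are the question's own and stand in for your projects' buckets:
gcloud beta firestore export gs://backups-projectname
gsutil cp -r gs://projectname.appspot.com gs://projectname-86abf.appspot.com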

Is there currently a way to export cloud firestore data like we can with realtime database?

In general, the way I've been thinking about setting up a test database is creating two projects for DEV and PROD. Rather than creating a custom script, is there any known process for import/export with Cloud Firestore?
The supported export and import procedures are documented in detail. You use the gcloud command line.
Use the firestore export command to export all the documents in your
database, replacing [BUCKET_NAME] with the name of your Cloud Storage
bucket. Add the --async flag to prevent the gcloud tool from waiting
for the operation to complete.
gcloud alpha firestore export gs://[BUCKET_NAME]
Use the firestore import command to import documents from a previous
export operation.
gcloud alpha firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
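For the DEV/PROD setup the question describes, a sketch of moving data between the two projects, assuming both projects can access the same bucket; the project IDs, bucket, and export prefix are placeholders:
gcloud alpha firestore export gs://[BUCKET_NAME] --project=[DEV_PROJECT_ID]
gcloud alpha firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/ --project=[PROD_PROJECT_ID]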

gcloud command to save datastore kind to google storage

Looking for a gcloud command to save Google Cloud Datastore entities to Google Cloud Storage. Right now I'm doing all the operations manually and would like to see if this can be done through shell commands as well.
There's a command to export named kinds to GCS:
gcloud beta datastore export --kinds="KIND1,KIND2" --namespaces="NAMESPACE1,NAMESPACE2" gs://${BUCKET}
Typically, you would run:
gcloud beta datastore export --kinds="foo" --namespaces="(default)" gs://my-gcs-bucket/datastore_export/2017-11-09_12_00
The documentation can be found here:
https://cloud.google.com/datastore/docs/export-import-entities
The exported data can then be loaded into BigQuery.
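For example, a sketch of loading the kind exported above into BigQuery; the dataset name is a placeholder and the path is assumed to follow the usual .export_metadata layout under the export prefix:
bq load --source_format=DATASTORE_BACKUP mydataset.foo gs://my-gcs-bucket/datastore_export/2017-11-09_12_00/all_namespaces/kind_foo/all_namespaces_kind_foo.export_metadata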
If you don't want to load the data into BigQuery, the format isn't documented, but here are some references for how to read the entities from the exported LevelDB files:
http://varunpant.com/posts/read-gae-admin-backups-fromleveldb-format-and-export-gae-entities-using-bulkloader
http://gbayer.com/big-data/app-engine-datastore-how-to-efficiently-export-your-data/
