How to export all kinds (tables) to the local system? - google-cloud-datastore

I have a project whose database is in Cloud Datastore. Now I want to take a backup of all kinds, including all their entities, on my local system. How can this be done? I have checked the Cloud documentation, i.e.
1- https://cloud.google.com/datastore/docs/export-import-entities#exporting_entities
2- https://cloud.google.com/sdk/gcloud/reference/datastore/export
but it only describes how to export data from Cloud Datastore to Cloud Storage, not to the local system. Please let me know if anyone knows how this can be done.
Thanks!

It is not possible to get the Managed Export service to export directly to your local filesystem, so you'll need to export your entities to GCS. To use the exports locally, copy them to your local machine and then import them into the Datastore emulator.
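A rough sketch of that workflow, assuming placeholder project and bucket names and the Datastore emulator running on its default port (started, for example, with gcloud beta emulators datastore start); the emulator exposes an import endpoint that takes the path to the copied .overall_export_metadata file:
# Export all kinds from the live project to a Cloud Storage bucket
gcloud datastore export gs://my-bucket/my-export --project=my-project
# Copy the export down to the local machine
gsutil -m cp -r gs://my-bucket/my-export .
# With the emulator running (default port 8081), load the copied export through its
# import endpoint; adjust the path to the actual .overall_export_metadata file
curl -X POST localhost:8081/v1/projects/my-project:import \
  -H 'Content-Type: application/json' \
  -d '{"input_url": "/full/path/to/my-export/my-export.overall_export_metadata"}'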

I do something like this, but I had to create my own exporter and importer; see my answer to this question: https://stackoverflow.com/a/52767415/4458510
To do this I wrote a Google Dataflow job that exports selected models and saves them in Google Cloud Storage in jsonl format. Then on my local host I have an endpoint called /init/ which launches a taskqueue job to download these exports and import them.
To do this I reuse my JSON REST handler code, which can convert any model to JSON and vice versa.

Related

How to export data from Firebase Firestore emulator to actual Firebase Firestore database

Scenario:
I am working on a POC locally where I am using Firestore as my database. As it is a local setup, I am using the Firestore Emulator. Now that my POC is successful, I want to move the local database from the emulator to the actual Firestore.
Query:
Is it possible to achieve what I am trying to do?
So far I have not been able to find any relevant content on the internet about this. I did find a couple of examples demonstrating how to export data from Firestore and import it into the local emulator, but I was not able to find the vice-versa option!
Firebase does not provide any sort of tool or service to do this. Your easiest alternative will be to write a program to query the data out of the emulator and write it into your cloud hosted instance. You might find the Firebase Admin SDK helpful for writing to the cloud in a program that you run locally.

Copy Firestore Database Data to BigQuery using Cloud Shell Terminal

Does anyone know how I can manually copy/transfer data from the Firestore database to BigQuery using the Cloud Shell Terminal?
I did this in the past but I'm unable to find the documentation/video that I used. I find a lot of material stating that once BigQuery is connected to Firebase it should be automatic, but mine is not.
When I ran code in the Cloud Shell Terminal to pull data from Firebase, the collection was copied as a table into a BigQuery dataset. Two tables, "raw_latest" and "raw_changelog", were created.
I'm not sure how to transfer another collection now.
I specifically need to transfer data from a subcollection in the Firestore database.
You can now export data from Cloud Firestore to BigQuery with a Firebase Extension. To import all the previous data you will need to install the extension first, because any writes made while running the export without the extension installed will be lost.
See: https://firebase.google.com/products/extensions/firestore-bigquery-export
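For backfilling the documents that already exist, the extension's IMPORT_EXISTING_DOCUMENTS guide (linked at the end of this thread) documents an import script; a rough sketch, noting that the script prompts interactively for the project ID, collection path, and dataset/table names:
# Backfill existing documents into the extension's BigQuery table
npx @firebaseextensions/fs-bq-import-collection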
Firestore allows importing/exporting data to BigQuery using a GCS bucket. The data is exported to a Cloud Storage bucket and from there it can be loaded into BigQuery.
The gcloud and bq commands for this are:
Export the data:
gcloud beta firestore export --collection-ids=users gs://mybucket/users
Load the backup into BigQuery:
bq load --source_format=DATASTORE_BACKUP mydataset.users gs://mybucket/users/all_namespaces/kind_users/all_namespaces_kind_users.export_metadata
Here are some links that might be helpful:
https://firebase.google.com/docs/firestore/manage-data/export-import
https://cloud.google.com/bigquery/docs/loading-data-cloud-datastore
https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md
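Since the question is specifically about a subcollection: --collection-ids matches collection group IDs, so documents in subcollections with that ID are exported as well. A rough sketch, where mybucket, mydataset, and the subcollection ID reviews are placeholders:
# Export every collection (including subcollections) whose ID is "reviews"
gcloud firestore export --collection-ids=reviews gs://mybucket/reviews-export
# Load the resulting export metadata into a BigQuery table
bq load --source_format=DATASTORE_BACKUP mydataset.reviews gs://mybucket/reviews-export/all_namespaces/kind_reviews/all_namespaces_kind_reviews.export_metadata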

Is there any way to configure the firebase firestore emulator to use the data inside the production database?

Is there a way I can configure the Firestore emulator to start off with the data in my production database? I have a lot of test information in there and want to transition it to the local emulator without having to copy each and every document/collection. My initial thought was that there must be a configuration in the firebase.json file, but I'm not sure what it would be.
There is no feature to configure the Firestore emulator to read its initial data from the production database.
What you can do is export the data from your production database (through its regular API), import it into the emulator (also through its regular API), and after that use the emulator's import and export commands to get the data in and out of the emulator.
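A rough sketch of that workflow in shell commands, with the bucket and directory names as placeholders; note that the emulator's --import flag is designed for directories produced by firebase emulators:export, so whether it reads a managed export directly depends on your firebase-tools version:
# Export the production database to a Cloud Storage bucket
gcloud firestore export gs://my-bucket/firestore-export
# Copy the export to the local machine
gsutil -m cp -r gs://my-bucket/firestore-export .
# Start the emulator with the copied data (may require wrapping the files in an
# emulators:export-style directory first)
firebase emulators:start --only firestore --import=./firestore-export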

source bucket is not visible in Firebase Firestore import web tool even after giving required permission

I'm following the official docs on how to export and import Firebase Firestore data between two projects.
I'm able to export Firestore data to a bucket: https://console.cloud.google.com/firestore/export
But I don't see that bucket when I try to import in a different Firebase project: https://console.cloud.google.com/firestore/import
I gave Storage Admin permission to the destination service account, i.e. dest-proj@appspot.gserviceaccount.com, and both projects and this bucket are in the same multi-region (us-central).
I'm aware of the other way to import/export using the gcloud shell, but why is this method not working?
The console only browses buckets that exist inside your current project. If the data is coming from an outside bucket, you can simply type its entire file path in the Filename field.
It will succeed if the service account running the import has the right IAM permissions on the separate source bucket.
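If the console route still fails, the same can be done from the command line; a rough sketch, where source-bucket, the export prefix my-export, and dest-proj are placeholders, and the exact service account that performs imports for your project should be checked against the export/import docs:
# Give the importing project's service account read access to the source bucket
gsutil iam ch serviceAccount:dest-proj@appspot.gserviceaccount.com:roles/storage.objectViewer gs://source-bucket
# Run the import in the destination project, pointing at the export's metadata folder
gcloud firestore import gs://source-bucket/my-export/ --project=dest-proj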

import json files from gcloud storage into Firebase database

We manage to produce lots of large JSON files, ~600 MB+ (with one JSON record per line), and we want to import these into the Firebase database. Is there any way to do this directly using a gs:// path?
(The Firebase console UI has an import function, but it uses JSON from local files, not gs://, and it is not clear whether https://github.com/firebase/firebase-import can work from Google Storage.)
There is no direct connection to import data from a file in Google Cloud Storage into the Firebase Database. You will have to use some intermediary tool or code for that. While recommending a specific tool is off-topic on Stack Overflow, I want to point out Firebase's own streaming import tool as one option.
But you should really consider what purpose this import serves. The Firebase Database is a great tool for synchronizing data between connected clients. It is not a great tool for advanced querying or reporting on large data sets. Something like BigQuery is much more suited to such tasks.
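A rough sketch of one intermediary path, assuming the JSON-lines files have first been converted into regular JSON documents; the bucket, file names, and database URL are placeholders, and the exact firebase-import flags should be checked against its README:
# Pull the file down from Cloud Storage
gsutil cp gs://my-bucket/records.json .
# Stream it into the Realtime Database with Firebase's import tool
firebase-import --database_url https://my-project.firebaseio.com --path /records --json records.json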
