I have an application built with Flutter, and I use Firebase to collect some information sent by users.
How can I transfer this information to my computer as a file (JSON, text, etc.)?
Currently, Firestore does not support exporting existing data to a readable file, but it does offer a managed export and import service that allows you to dump your data into a Cloud Storage (GCS) bucket. It produces the same format that Cloud Datastore uses, which means you can then import it into BigQuery.
However, the community has created a workaround for this limitation. If you have npm installed on your system, you can follow the instructions below to export the Firestore data to a JSON file.
Generate a private key file for your service account. In the Firebase console, open Settings > Service Accounts.
Click Generate New Private Key, then confirm by clicking Generate Key.
Securely store the JSON file containing the key. You may also check this documentation.
Rename the JSON file to credentials.json.
Run the following command in your console:
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
Follow the prompts shown in your console.
You can also use the package to import data into Firestore with the following command:
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
Below is an example result from the package, exporting a Firestore collection named test.
backup.json:
{"__collections__":{"test":{"Lq8u3VnOKvoFN4r03Ri1":{"test":"test","__collections__":{}}}}}
You can find more information regarding the package here.
The node-firestore-import-export package mentioned by @marc in the answer above has a flaw when your database is very large.
This flaw is documented here: https://github.com/jloosli/node-firestore-import-export/issues/815
For this reason I would recommend using https://github.com/benyap/firestore-backfire instead.
I am trying to import existing data from Firestore to Algolia but cannot make it work. I installed the Firebase Algolia Extension and tried to follow the documented steps by running this:
npx firestore-algolia-search
Below are the questions that will be asked:
What is the Region? europe-west3
What is the Project Id? wishlist-88d58
What is the Algolia App Id? 15W2O5H8ZN
What is the Algolia Api Key? { unspecified parameter }
What is the Algolia Index Name? allUsers
What is the Collection Path? allUsers
What are the Fields to extract? { unspecified parameter }
What is the Transform Function? { unspecified parameter }
What is the path to the Google Application Credential File? wishlists_key.json
For the Algolia API Key I added a key.
I did not specify the Fields to extract or the Transform Function.
For the path to the Google Application Credential File, I created a private key in Firebase and placed it on my desktop as wishlists_key.json, which is where I ran the command above from.
I got a response which also contained the data, but at the beginning it said there was an error:
{"severity":"WARNING","message":"Warning, FIREBASE_CONFIG and GCLOUD_PROJECT environment variables are missing. Initializing firebase-admin will fail"}
{"location":"europa-west3","algoliaAppId":"15W205H8ZN","algoliaAPIKey":"********","algoliaIndexName":"allUsers","collectionPath":"allUsers","fields":"","transformFunction":"","projectId":"wishlist-88d58","severity":"INFO","message":"Initializing extension with configuration"}
{"severity":"INFO","message":"[ 'Sending rest of the Records to Algolia' ]"}
{"severity":"INFO","message":"[ 'Preparing to send 20 record(s) to Algolia.' ]"}
{"name":"RetryError","message":"Error when performing Algolia index","transporterStackTrace":[{"request":{"data":"{"requests":[{"action":"partialUpdateObject","body":{"signInMethod":"mail","username":"user662572"
...
The command does not finish running but gets stuck at this point.
What am I doing wrong here? How do I correctly import data from Firestore to Algolia?
Also, later I will need to import a collection with about 24k documents. Is the documented way also capable of handling that amount of documents?
I was able to make it work through Google Cloud Shell. I had to upload my Google Application Credential File to it and then run the command above again.
I still got the same warning that it would fail, but it worked anyway and all data was correctly imported.
I followed the official docs for scheduling a Firestore export via a Cloud Function and Cloud Scheduler.
It works perfectly the first time, creating the necessary exports at the right location.
When I run it again I get the following error in the Cloud Function:
Error: 3 INVALID_ARGUMENT: Path already exists: /red115.appspot.com/daily_backup/users.overall_export_metadata
Why doesn't it overwrite the existing data?
I followed the official docs and gave the necessary roles and permissions to the principal account.
I have recreated the setup from the docs you shared. I am assuming you also did the Configure access permissions part. I scheduled the Firebase function for Firestore export every 5 minutes on all collections, and even after running it 4 times on that schedule I did not get the error you mentioned.
The Firebase Storage rules you provided do not affect the Firestore export functionality; they only apply to Firebase Storage.
But if you are experiencing this error, I recommend first checking whether an export has already been created at the specified location in that bucket.
If yes: to overwrite existing data on a Firestore export, you could add a step in your Cloud Function that deletes the existing export (if one exists) before running the new export, for example via admin.storage().bucket() and a delete call on the old export files (see the sketch at the end of this answer).
OR
You could change the export path to include a timestamp or a version number, so that each export is saved to a unique location (this is the default behavior when you pass only a bucket name with no fixed path).
If no: there is probably a typo in the line where the bucket link is provided: const bucket = 'gs://BUCKET_NAME';
Make sure you provide the same gs://BUCKET_NAME in functions/index.js and in gsutil iam ch serviceAccount:PROJECT_ID@appspot.gserviceaccount.com:admin gs://BUCKET_NAME
This thread also talks about a wrong gsutil path; you can have a look at it as well.
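As a minimal sketch of the delete-then-export step, based on the scheduled-export function from the official docs: the bucket name, the daily_backup prefix, and the schedule are assumptions taken from this question, so adjust them to your setup.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const firestore = require('@google-cloud/firestore');

admin.initializeApp();
const client = new firestore.v1.FirestoreAdminClient();

// Assumptions: your export bucket and the fixed folder used in this question.
const BUCKET = 'BUCKET_NAME';
const EXPORT_PREFIX = 'daily_backup';

exports.scheduledFirestoreExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    // Delete whatever the previous run left under the fixed prefix, so that
    // reusing the same outputUriPrefix no longer fails with "Path already exists".
    await admin.storage().bucket(BUCKET).deleteFiles({ prefix: `${EXPORT_PREFIX}/` });

    const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT;
    const databaseName = client.databasePath(projectId, '(default)');

    return client.exportDocuments({
      name: databaseName,
      outputUriPrefix: `gs://${BUCKET}/${EXPORT_PREFIX}`,
      collectionIds: [], // empty array = export all collections
    });
  });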
I want to delete an image. All I have is the download URL.
In Flutter I am able to get the file path from the download URL and use that path to delete the file in Cloud Storage.
Is it possible to get the file path from the download URL and use that path to delete the image from Cloud Functions?
Or is there a better, faster, or more efficient way to delete an image from Cloud Storage with only the download URL?
A Google Cloud Storage object URL has the following parts:
https://storage.cloud.google.com/[bucket_name]/[path/and/the/object/name*]?[authentication_if_needed]
*The path in Cloud Storage is "virtual"; in fact, it is an integral part of the object name/identification. The Cloud Console and gsutil only simulate folders in the user interface output.
There are several methods to delete an object:
From the Cloud Console
Using Cloud SDK command: gsutil rm gs://[BUCKET_NAME]/[OBJECT_NAME]
Using the client libraries, for example with Python:
from google.cloud import storage

def delete_blob(bucket_name, blob_name):
    """Deletes a blob from the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.delete()
    print('Blob {} deleted.'.format(blob_name))
Please keep in mind that the user or service account performing the operation needs the proper permissions to delete the object.
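To address the Cloud Functions part of the question directly, here is a rough, hedged sketch (not part of the answer above) that derives the object path from a typical Firebase Storage download URL and deletes it with the Admin SDK. It assumes the usual download URL shape https://firebasestorage.googleapis.com/v0/b/<bucket>/o/<url-encoded path>?alt=media&token=..., and the helper names are made up for illustration.
const admin = require('firebase-admin');
admin.initializeApp();

// Hypothetical helper: pull the bucket and decoded object path out of a download URL.
function parseDownloadUrl(downloadUrl) {
  const url = new URL(downloadUrl);
  const bucket = url.pathname.split('/')[3];         // ['', 'v0', 'b', '<bucket>', 'o', ...]
  const encodedPath = url.pathname.split('/o/')[1];  // everything after "/o/" is the object name
  return { bucket, path: decodeURIComponent(encodedPath) };
}

// Hypothetical helper: delete the object the download URL points at.
async function deleteByDownloadUrl(downloadUrl) {
  const { bucket, path } = parseDownloadUrl(downloadUrl);
  await admin.storage().bucket(bucket).file(path).delete();
  console.log(`Deleted ${path} from ${bucket}`);
}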
I am developing with Firebase and have data stored in the Realtime Database. I need to share my database structure for a question here on Stack Overflow, or just take a backup before making breaking changes. How can I do this using the Firebase Console?
Data can be exported from the Firebase Realtime Database as JSON:
Log in to the Database section of the Firebase Console.
Navigate to the node you wish to export by clicking on it in the list (skip this to export all data).
Click the 3-dot overflow menu icon, at the top-right of the data panel.
Click Export JSON from the menu.
Likewise, you can import a structure in the same fashion, using Import JSON.
There is a Node.js tool called firebase-export, similar to firebase-import but not from Firebase itself, that will export JSON from the command line.
Firebase export helper utility for exporting excluded JSON from Firebase.
To install
npm install -g firebase-export
Usage example
$ firebase-export --database_url https://test.firebaseio-demo.com --firebase_secret '1234' --exclude 'settings/*, users/*/settings'
Github Repo
Note: Firebase has a REST API, so you can use any language to retrieve (export) data:
curl 'https://[PROJECT_ID].firebaseio.com/users/jack/name.json'
Here's an example curl request with filters
curl 'https://dinosaur-facts.firebaseio.com/dinosaurs.json?orderBy="height"&startAt=3&print=pretty'
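Since the REST API can be called from any language, here is a small Node sketch of the same filtered request, using the sample dinosaur-facts URL from the curl example above:
const https = require('https');

// Same endpoint and filters as the curl example above.
const url = 'https://dinosaur-facts.firebaseio.com/dinosaurs.json?orderBy="height"&startAt=3';

https.get(url, (res) => {
  let body = '';
  res.on('data', (chunk) => (body += chunk));
  res.on('end', () => console.log(JSON.parse(body)));
});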
If you have a large JSON file, it is safer to download it using Postman's Import feature, because downloading a large JSON file directly sometimes fails partway through. You just need to save the response after it arrives.
I was wondering if there are any common practices for backing up a Firebase DB. My concern is some process accidentally wiping out our database.
Thanks!
As of the time of this question, Firebase backs up all instances daily. So while keeping your own backups may still be useful, it's not essential.
To create your own backups, you can simply curl the data:
curl https://<instance>.firebaseio.com/.json?format=export
Note that for multiple gigabytes of data, this will slow things down and lock read access for a short period. It would be better in this case to chunk the backups and work with smaller portions. The shallow parameter can help here by providing a list of keys for any given path in Firebase, without having to fetch the data first.
curl https://<instance>.firebaseio.com/.json?shallow=true
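To illustrate the chunking idea above, here is a rough Node sketch (my own, not from the answer) that first lists the top-level keys with shallow=true and then downloads each child separately; <instance> is the same placeholder as in the curl examples:
const https = require('https');

const BASE = 'https://<instance>.firebaseio.com'; // placeholder, as in the curl examples

// Fetch a path and parse the JSON response.
function getJson(path) {
  return new Promise((resolve, reject) => {
    https.get(`${BASE}${path}`, (res) => {
      let body = '';
      res.on('data', (chunk) => (body += chunk));
      res.on('end', () => resolve(JSON.parse(body)));
    }).on('error', reject);
  });
}

async function chunkedBackup() {
  // shallow=true returns only the top-level keys, not the data under them.
  const keys = Object.keys(await getJson('/.json?shallow=true'));
  const backup = {};
  for (const key of keys) {
    backup[key] = await getJson(`/${encodeURIComponent(key)}.json?format=export`);
  }
  return backup;
}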
As previously mentioned, there are also several GitHub libs available for this, and incremental backups are practical with some creativity and a worker thread on the real-time SDK.
There are now "Import Data" and "Export Data" buttons on the data page of the web interface for every project, so you can now back up your data with a button click!
Just yesterday I wrote a shell script, which uses firebase-tools (npm install -g firebase-tools), so that these database dumps are included in my regular backup cron job:
#!/bin/bash
# $1 is the Firebase projectId.
# $2 is the destination directory.
# example usage: cron_firebase.sh project-12345 /home/backups/firebase
# currently being triggered by /etc/cron.hourly/firebase-hourly.cron
PROJECTID=$1
DESTINATION=$2
FIREBASE="$(which firebase)"
NOW="$(date +"%Y-%m-%d_%H%M")"
cd $DESTINATION
$FIREBASE --project $PROJECTID database:get / > ./$PROJECTID.$NOW.json
tar -pczf $PROJECTID.$NOW.tar.gz ./$PROJECTID.$NOW.json && rm ./$PROJECTID.$NOW.json
Update: in the meantime, you can automatically back up to a Google Cloud Storage bucket: go to Firebase Console -> Realtime Database -> and click the Backups tab.
It is now possible to back up and restore Cloud Firestore using the managed export and import service.
You do it as follows:
Create a Cloud Storage bucket for your project.
Set up gcloud for your project using gcloud config set project [PROJECT_ID]
EXPORT
Export all by calling
gcloud alpha firestore export gs://[BUCKET_NAME]
Or Export a specific collection using
gcloud alpha firestore export gs://[BUCKET_NAME] --collection-ids='[COLLECTION_ID_1]','[COLLECTION_ID_2]'
IMPORT
Import all by calling
gcloud alpha firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
where [BUCKET_NAME] and [EXPORT_PREFIX] point to the location of your export files. For example - gcloud alpha firestore import gs://exports-bucket/2017-05-25T23:54:39_76544/
Import a specific collection by calling:
gcloud alpha firestore import --collection-ids='[COLLECTION_ID_1]','[COLLECTION_ID_2]' gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
Full instructions are available here:
https://firebase.google.com/docs/firestore/manage-data/export-import
Just to expand on @kato's answer using curl.
I was looking for ways to run the command every night. My solution:
1) Created a Compute Engine instance (basically a VM) in Google Cloud. You might be familiar with EC2 if you come from the AWS world.
2) Wrote a simple cron job, something like this:
0 23 * * * /usr/bin/curl https://yourdatabaseurl.com/.json?format=export -o /tmp/backuptest_`date +\%d\%m\%y`.bk
I am sure there might be a simpler way to do this within the free tier itself, such as using Cloud Functions; a sketch follows.
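For example, here is a rough sketch (my own assumption, untested against your project) of doing the same nightly backup with a scheduled Cloud Function instead of a VM cron job, writing the dump to the project's default Storage bucket:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.nightlyRtdbBackup = functions.pubsub
  .schedule('0 23 * * *') // same schedule as the cron example above
  .onRun(async () => {
    // Read the whole database via the Admin SDK instead of curl.
    const snapshot = await admin.database().ref('/').once('value');
    const fileName = `rtdb_backup_${new Date().toISOString().slice(0, 10)}.json`;

    // Write the JSON dump to the project's default Storage bucket.
    await admin.storage().bucket().file(fileName)
      .save(JSON.stringify(snapshot.val()));
  });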