I am developing with Firebase and have data stored in the Realtime Database. I need to share my database structure for a question here on Stack Overflow, or just take a backup before making breaking changes. How can I do this using the Firebase Console?
Data can be exported from the Firebase Realtime Database as JSON:
Log in to the Database section of the Firebase Console.
Navigate to the node you wish to export by clicking on it in the list (skip this step to export all data).
Click the 3-dot overflow menu icon at the top-right of the data panel.
Click Export JSON from the menu.
Likewise, you can import a structure in the same fashion, using Import JSON.
There is a Node.js tool called firebase-export, similar to firebase-import but not from Firebase itself, that will export JSON from the command line.
It is an export helper utility that dumps JSON from Firebase, with support for excluding paths.
To install
npm install -g firebase-export
Usage example
$ firebase-export --database_url https://test.firebaseio-demo.com --firebase_secret '1234' --exclude 'settings/*, users/*/settings'
GitHub Repo
Note: Firebase has a REST API, so you can use any language to retrieve (export) data:
curl 'https://[PROJECT_ID].firebaseio.com/users/jack/name.json'
Here's an example curl request with filters
curl 'https://dinosaur-facts.firebaseio.com/dinosaurs.json?orderBy="height"&startAt=3&print=pretty'
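These unauthenticated reads only work where your security rules allow public access; otherwise the REST API accepts an auth parameter (a sketch — the token placeholder stands for a Firebase ID token or legacy database secret):
curl 'https://[PROJECT_ID].firebaseio.com/users/jack/name.json?auth=<YOUR_TOKEN>'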
If you have a large JSON file, it is safer to download it using Postman's Import feature, since downloading a large JSON file directly can fail partway through. You just need to click Save Response once the response has arrived.
Related
I am trying to import existing data from Firestore to Algolia but cannot make it work. I installed the Firebase Algolia Extension and tried to follow the documented steps, running this:
npx firestore-algolia-search
Below are the questions that will be asked:
What is the Region? europe-west3
What is the Project Id? wishlist-88d58
What is the Algolia App Id? 15W2O5H8ZN
What is the Algolia Api Key? { unspecified parameter }
What is the Algolia Index Name? allUsers
What is the Collection Path? allUsers
What are the Fields to extract? { unspecified parameter }
What is the Transform Function? { unspecified parameter }
What is the path to the Google Application Credential File? wishlists_key.json
For the Algolia API Key I added a key.
I did not specify the Fields to extract or a Transform Function.
For the path to the Google Application Credential File, I created a private key in Firebase and placed it on my desktop as wishlists_key.json, which is also where I ran the command above from.
I got a response which also contained the data, but it started with an error:
{"severity":"WARNING","message":"Warning, FIREBASE_CONFIG and GCLOUD_PROJECT environment variables are missing. Initializing firebase-admin will fail"}
{"location":"europa-west3","algoliaAppId":"15W205H8ZN","algoliaAPIKey":"********","algoliaIndexName":"allUsers","collectionPath":"allUsers","fields":"","transformFunction":"","projectId":"wishlist-88d58","severity":"INFO","message":"Initializing extension with configuration"}
{"severity":"INFO","message":"[ 'Sending rest of the Records to Algolia' ]"}
{"severity":"INFO","message":"[ 'Preparing to send 20 record(s) to Algolia.' ]"}
{"name":"RetryError","message":"Error when performing Algolia index","transporterStackTrace":[{"request":{"data":"{"requests":[{"action":"partialUpdateObject","body":{"signInMethod":"mail","username":"user662572"
...
The command does not finish running but gets stuck at this point.
What am I doing wrong here? How do I correctly import data from Firestore to Algolia?
Also, later I will need to import a collection with about 24k documents. Is the documented way also capable of handling that amount of documents?
I was able to make it work through Google Cloud Shell. I had to upload my Google Application Credential File to it and run the command above again.
I still got the same warning that initialization would fail, but it worked anyway and all data was imported correctly.
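In case you want to keep running it locally instead, one workaround worth trying is to set the environment variables before launching the CLI (a sketch based on the variables named in the warning; GOOGLE_APPLICATION_CREDENTIALS is the standard variable for pointing at a service account key file):
export GOOGLE_APPLICATION_CREDENTIALS=wishlists_key.json
export GCLOUD_PROJECT=wishlist-88d58
npx firestore-algolia-search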
I followed the official docs for scheduling Firestore exports via Cloud Function and Cloud Scheduler.
It works perfectly the first time, creating the necessary exports at the right location.
When I run it again, I get the following error in the Cloud Function:
Error: 3 INVALID_ARGUMENT: Path already exists: /red115.appspot.com/daily_backup/users.overall_export_metadata
Why doesn't it overwrite the existing data?
I followed the official docs and gave the necessary roles & permissions to the principal account.
I have recreated the setup from the docs you shared (I am assuming you also did the Configure access permissions part). I scheduled the Firebase function for the Firestore export to run every 5 minutes on all collections, and even after four scheduled runs I have not gotten the error you mention.
The Firebase Storage rules you provided do not affect the Firestore export functionality; they apply only to Firebase Storage.
But if you are experiencing this error, I recommend first checking whether the export has already been created at the specified location in that bucket.
If yes: to overwrite existing data on a Firestore export, you could add a step in your Cloud Function that deletes the existing export, if present, before running the new one, using the admin.storage() bucket API to remove the old export files (see the sketch after this answer).
OR
You could change the export path to include a timestamp or a version number, so that each export is saved to a unique location. (This is what the export operation does by default when you point it at the bucket root.)
If no, then there is likely a typo in the line where you provide the bucket link: const bucket = 'gs://BUCKET_NAME';
Make sure you provide the same gs://BUCKET_NAME in functions/index.js and in gsutil iam ch serviceAccount:PROJECT_ID@appspot.gserviceaccount.com:admin gs://BUCKET_NAME
This thread also talks about a wrong gsutil path; have a look at it as well.
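Here is a minimal sketch of the delete-before-export idea (assuming the functions/index.js setup from the docs; deleteFiles with a prefix option comes from the Cloud Storage Node.js client, and the bucket and path names are placeholders):
const functions = require('firebase-functions');
const firestore = require('@google-cloud/firestore');
const admin = require('firebase-admin');
admin.initializeApp();

const client = new firestore.v1.FirestoreAdminClient();
const bucket = 'gs://BUCKET_NAME'; // same BUCKET_NAME as in the gsutil iam command

exports.scheduledFirestoreExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    // Option A: remove the previous export so the fixed path can be reused.
    await admin.storage().bucket('BUCKET_NAME').deleteFiles({ prefix: 'daily_backup/' });

    const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT;
    const databaseName = client.databasePath(projectId, '(default)');

    // Option B (instead of deleting): give every run a unique folder,
    // e.g. `${bucket}/daily_backup/${new Date().toISOString()}`.
    return client.exportDocuments({
      name: databaseName,
      outputUriPrefix: `${bucket}/daily_backup`,
      collectionIds: [], // empty array = export all collections
    });
  });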
I have an application programmed in Flutter and I use Firebase to collect some information sent by users.
The question is: how can I transfer this information to my computer in the form of a file (JSON, text, etc.), like the data in this picture:
Currently, Firestore does not support exporting existing data to a readable file, but it does have a managed export and import service that allows you to dump your data into a GCS bucket. It produces the same format as Cloud Datastore uses, which means you can then import it into BigQuery.
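For reference, the managed export can be triggered from the command line (assuming the gcloud CLI is configured for your project and you have a GCS bucket):
gcloud firestore export gs://[BUCKET_NAME]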
However, the community has created a workaround for this limitation. You can use npm if you have it installed on your system. Below are the instructions to export Firestore data to a JSON file using npm.
Generate a private key file for your service account. In the Firebase console, open Settings > Service Accounts.
Click Generate New Private Key, then confirm by clicking Generate Key.
Securely store the JSON file containing the key. You may also check this documentation.
Rename the JSON file to credentials.json.
Run the command below in your console:
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
Follow the instructions prompted on your console.
You can also import data into Firestore using the command below:
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
Below are the results of using the package:
Firestore Collection:
Console:
backup.json:
{"__collections__":{"test":{"Lq8u3VnOKvoFN4r03Ri1":{"test":"test","__collections__":{}}}}}
You can find more information regarding the package here.
The package mentioned by @marc in the above answer, node-firestore-import-export, has a flaw in case your database is very large.
This flaw is documented here
https://github.com/jloosli/node-firestore-import-export/issues/815
For this reason I would recommend using https://github.com/benyap/firestore-backfire
I'm attempting to use the instructions here: https://firebase.google.com/docs/firestore/manage-data/export-import to a) do periodic backups of data from my production instance, and b) copy data between production/staging/dev instances.
For what it's worth, each instance is in a separate Firebase project (myapp-dev, myapp-staging and myapp-production), all are on the Blaze plan and each has a corresponding bucket in Cloud Platform (gs://myapp-backup-dev, ...-staging, ...-production).
I've successfully completed all the "Before you begin" steps. I've exported data from one instance/project (staging) into its bucket, and it *seems* that I can also import it back into that project successfully (no error message, operationState: SUCCESSFUL), but any records changed since the export don't 'restore' back to their original values.
And for what it's worth, I've also successfully copied the exported data from that bucket into another project's bucket (staging to dev), and I get the same result when importing it into the second project (dev).
Am I doing something wrong here? Missing something?
Is the name of your collection testStuff or 'testStuff'? If it's testStuff, it seems like your export command was slightly off. You'll need to export the data again. You should get a workCompleted number this time around.
gcloud beta firestore export gs://myapp-backup-dev --collection-ids='testStuff'
gcloud beta firestore import gs://myapp-backup-dev/2018-10-15T21:38:18_36964 --collection-ids='testStuff'
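As an aside, for the copy-between-buckets step described in the question, gsutil can copy the whole timestamped export folder (a sketch reusing the folder name from the import command above):
gsutil -m cp -r gs://myapp-backup-staging/2018-10-15T21:38:18_36964 gs://myapp-backup-dev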
I am using the REST API of Firebase Realtime Database from an AppEngine Standard project with Java. I am able to successfully put data under different locations, however I don't know how I could ensure atomic updates to different paths.
To put some data separately at a specific location I am doing:
requestFactory.buildPutRequest("dbUrl/path1/17/", new ByteArrayContent("application/json", json1.getBytes())).execute();
requestFactory.buildPutRequest("dbUrl/path2/1733455/", new ByteArrayContent("application/json", json2.getBytes())).execute();
Now, to ensure that when saving /path1/17/ a /path2/1733455/ is also saved, I've been looking into multi-path updates and batched writes (https://firebase.google.com/docs/firestore/manage-data/transactions#batched-writes — only available in Cloud Firestore?). However, I did not find whether this feature is also available for the REST API of the Firebase Realtime Database, or only through the Firebase Admin SDK.
The example here shows how to do a multi path update at two locations under the "users" node.
curl -X PATCH -d '{
"alanisawesome/nickname": "Alan The Machine",
"gracehopper/nickname": "Amazing Grace"
}' \
'https://docs-examples.firebaseio.com/rest/saving-data/users.json'
But I don't have a common upper node for path1 and path2.
I tried setting the URL to the database URL without any nodes (https://db.firebaseio.com.json) and adding the paths in the JSON object sent, but I get an error: nodename nor servname provided, or not known.
This would be possible with the Admin SDK I think, according to this blog post: https://firebase.googleblog.com/2015/09/introducing-multi-location-updates-and_86.html
Any ideas if these atomic writes can be achieved with the REST API?
Thank you!
If the updates are going to a single database, there is always a common path.
In your case you'll run the PATCH command against the root of the database:
curl -X PATCH -d '{
"path1/17": json1,
"path2/1733455": json2
}' 'https://yourdatabase.firebaseio.com/.json'
The key difference with your URL seems to be the / before .json. Without that you're trying to connect to a domain on the json TLD, which doesn't exist (yet) afaik.
Note that the documentation link you provide for Batched Updates is for Cloud Firestore, which is a completely separate database from the Firebase Realtime Database.
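Since the question issues these calls from Java with the Google HTTP Client, the same root-level PATCH might look like this (a sketch assuming the requestFactory, json1, and json2 from the question's snippet; buildPatchRequest is part of that client):
import com.google.api.client.http.ByteArrayContent;
import com.google.api.client.http.GenericUrl;
import com.google.api.client.http.HttpRequest;

// Both paths go into one JSON body, relative to the database root,
// so the two writes are applied atomically.
String body = "{\"path1/17\": " + json1 + ", \"path2/1733455\": " + json2 + "}";
HttpRequest request = requestFactory.buildPatchRequest(
    new GenericUrl("https://yourdatabase.firebaseio.com/.json"),
    new ByteArrayContent("application/json", body.getBytes()));
request.execute();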