I am trying to import existing data from Firestore to Algolia but cannot make it work. I installed the Firebase Algolia extension and tried to follow the documented steps, running this:
npx firestore-algolia-search
Below are the questions that will be asked:
What is the Region? europe-west3
What is the Project Id? wishlist-88d58
What is the Algolia App Id? 15W2O5H8ZN
What is the Algolia Api Key? { unspecified parameter }
What is the Algolia Index Name? allUsers
What is the Collection Path? allUsers
What are the Fields to extract? { unspecified parameter }
What is the Transform Function? { unspecified parameter }
What is the path to the Google Application Credential File? wishlists_key.json
For the Algolia API Key I added a key.
I did not specify the Fields to extract or a Transform Function.
For the path to the Google Application Credential File I created a private key in Firebase and placed it on my desktop as wishlists_key.json, which is where I ran the command above from.
I got a response which also contained the data, but it started with an error:
{"severity":"WARNING","message":"Warning, FIREBASE_CONFIG and GCLOUD_PROJECT environment variables are missing. Initializing firebase-admin will fail"}
{"location":"europa-west3","algoliaAppId":"15W205H8ZN","algoliaAPIKey":"********","algoliaIndexName":"allUsers","collectionPath":"allUsers","fields":"","transformFunction":"","projectId":"wishlist-88d58","severity":"INFO","message":"Initializing extension with configuration"}
{"severity":"INFO","message":"[ 'Sending rest of the Records to Algolia' ]"}
{"severity":"INFO","message":"[ 'Preparing to send 20 record(s) to Algolia.' ]"}
{"name":"RetryError","message":"Error when performing Algolia index","transporterStackTrace":[{"request":{"data":"{"requests":[{"action":"partialUpdateObject","body":{"signInMethod":"mail","username":"user662572"
...
The command does not finish running but gets stuck at this point.
What am I doing wrong here? How do I correctly import data from FireStore to Algolia?
Also, later I will need to import a collection with about 24k documents. Is the documented way also capable of handling that many documents?
I was able to make it work through Google Cloud Shell. I had to upload my Google Application Credential File to it and run the command above again.
I still got the same warning that initialization would fail, but it worked anyway and all data was imported correctly.
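For reference, a manual backfill is also possible without the extension's CLI. The sketch below is not the extension's own mechanism; it is a rough alternative that reads the collection with firebase-admin and pushes it to Algolia with the algoliasearch client, assuming the same app ID, index, and collection name as above, with placeholder credentials.

// Rough manual backfill sketch (not the extension's import command).
// Assumes GOOGLE_APPLICATION_CREDENTIALS points at wishlists_key.json
// and that YOUR_ADMIN_API_KEY is an Algolia key with write access.
const admin = require('firebase-admin');
const algoliasearch = require('algoliasearch');

admin.initializeApp();

const client = algoliasearch('15W2O5H8ZN', 'YOUR_ADMIN_API_KEY');
const index = client.initIndex('allUsers');

async function backfill() {
  const snapshot = await admin.firestore().collection('allUsers').get();
  // Use the Firestore document ID as the Algolia objectID so re-runs
  // update the same records instead of creating duplicates.
  const records = snapshot.docs.map((doc) => ({ objectID: doc.id, ...doc.data() }));
  await index.saveObjects(records);
  console.log(`Indexed ${records.length} record(s).`);
}

backfill().catch(console.error);

The v4 algoliasearch client chunks saveObjects calls into batches automatically, so a collection of roughly 24k documents should be manageable this way as well.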
Related
I followed the official docs for scheduling a Firestore export via Cloud Functions and Cloud Scheduler.
It works perfectly the first time, creating the necessary exports at the right location.
When I run it again I get the following error in the Cloud Function:
Error: 3 INVALID_ARGUMENT: Path already exists: /red115.appspot.com/daily_backup/users.overall_export_metadata
Why doesn't it overwrite the existing data?
I followed the official docs and gave the necessary roles & permissions to the principal account.
I have recreated the setup from the docs you shared. I am assuming you also did the Configure access permissions part. I scheduled the Firebase function for Firestore export every 5 minutes on all collections, and even after it ran 4 times on that schedule I did not get the error you mentioned.
The Firebase Storage rules you provided do not affect the Firestore export functionality; they only apply to Firebase Storage.
But if you are experiencing this error, I recommend you first check whether an export has already been created at the specified location in that bucket.
If yes, then to overwrite existing data on a Firestore export, you could add a step in your Cloud Function that deletes the existing export, if one exists, before running the new export. For example, use the bucket returned by admin.storage().bucket() to delete the objects under the existing export path (be specific about that path so you do not delete the whole bucket); a rough sketch is shown after this answer.
OR
You could change the export path to include a timestamp or a version number, so that each export is saved to a unique location; this is the default behavior when no fixed path is specified.
If no, then there is probably a typo in the following line where you provide the bucket link: const bucket = 'gs://BUCKET_NAME';
Make sure you provide the same gs://BUCKET_NAME in functions/index.js and in gsutil iam ch serviceAccount:PROJECT_ID@appspot.gserviceaccount.com:admin gs://BUCKET_NAME
This thread also talks about a wrong gsutil path; have a look at it as well.
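Here is a rough sketch of that first option, based on the scheduled export function from the Firebase docs. The fixed daily_backup prefix is taken from the error message above; the bucket name and schedule are placeholders you would adapt.

// Sketch: delete the previous export under a fixed prefix before exporting again,
// so the export does not fail with "Path already exists".
const functions = require('firebase-functions');
const firestore = require('@google-cloud/firestore');
const admin = require('firebase-admin');

admin.initializeApp();
const client = new firestore.v1.FirestoreAdminClient();

const bucket = 'gs://BUCKET_NAME'; // must match the bucket you granted permissions on

exports.scheduledFirestoreExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    // Remove any objects left over from the previous export at this prefix.
    await admin.storage().bucket('BUCKET_NAME').deleteFiles({ prefix: 'daily_backup/' });

    const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT;
    const databaseName = client.databasePath(projectId, '(default)');

    return client.exportDocuments({
      name: databaseName,
      outputUriPrefix: `${bucket}/daily_backup`,
      collectionIds: [], // an empty array exports all collections
    });
  });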
I tried to clone a Firestore database. I found a guide on this topic (https://qrc.ninja/2019/03/20/cloning-firestore-data/), so I tried to complete the steps in that guide.
To export the database I did the following:
gcloud config set project [PROJECT_ID]
gcloud firestore export gs://[BUCKET_NAME]
To import the database I did the following:
gcloud config set project [DESTINATION_PROJECT_ID]
gsutil acl ch -u [RIGHTS_RECIPIENT]:R gs://[BUCKET_NAME]
gcloud firestore import gs://[BUCKET_NAME]/[TIMESTAMPED_DIRECTORY]
The last step (gcloud firestore import ...) resulted in this error:
ERROR: (gcloud.firestore.import) Entity too large
I searched for this problem, but I could only find a mention of it in a cached Google result of this page: https://cloud.google.com/datastore/docs/export-import-entities
There it says:
An import operation updates entity keys and key reference properties in the import data with the project ID of the destination project. If this update increases your entity sizes, it can cause "entity is too big" or "index entries too large" errors for import operations.
To avoid either error, import into a destination project with a shorter project ID.
My project ID looks like this: XX-XXXXX-XXXXXXX. It is 16 characters long. As I need a paid plan for my project, simply testing with a shorter name won't be free.
So I would be grateful for any hints on if the ID is really the problem or if I could try something else to clone my database.
Update: I can clone the database by exporting/importing single collections. But one of my collections has over 79,000 documents. When I export this large collection and try to import it, I still get
ERROR: (gcloud.firestore.import) Entity too large
This kind of issue is usually related to entities that somehow grow over the allowed size; the problem then shows up when trying to restore a DB (from an export and then an import). The issue lies in the import, given that the export doesn't have any restrictions. The project ID shouldn't have anything to do with the issue.
One way you can check this is to import your data into BigQuery and inspect the larger entities yourself. Cloud Datastore entities should respect the limits set here, in particular the maximum entity size. The size of an entity is the sum of:
The key size
The sum of the property sizes
32 bytes
You can check the size of each entity manually, by writing a script (a rough sketch follows below), or by loading the data into BigQuery. The calculation of the entity size is defined at the link above.
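As a quick alternative to BigQuery, the following Node.js sketch flags unusually large documents. It only approximates the entity size by the JSON-serialized length of the data plus the key path and the fixed 32-byte overhead, which is not the exact Datastore formula but is usually enough to spot the offending documents; the collection name and threshold are placeholders.

// Rough size check with firebase-admin; not the exact Datastore size formula.
// Note: this loads the whole collection into memory, which is fine for a one-off check.
const admin = require('firebase-admin');

admin.initializeApp();

async function findLargeDocuments(collectionName, thresholdBytes = 900000) {
  const snapshot = await admin.firestore().collection(collectionName).get();

  snapshot.forEach((doc) => {
    const approxSize =
      Buffer.byteLength(JSON.stringify(doc.data())) + // property data
      Buffer.byteLength(doc.ref.path) +               // key size
      32;                                             // fixed overhead

    if (approxSize > thresholdBytes) {
      console.log(`${doc.ref.path}: ~${approxSize} bytes`);
    }
  });
}

findLargeDocuments('YOUR_COLLECTION');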
Additionally, you can run the command:
gcloud datastore operations describe [OPERATION_ID]
with the import operation id to get more details.
I found this Public Issue Tracker entry. As far as it mentions, this issue should be resolved by modifying the affected entities.
I'm attempting to use the instructions here: https://firebase.google.com/docs/firestore/manage-data/export-import to a) do periodic backups of data from my production instance, and b) copy data between production/staging/dev instances.
For what it's worth, each instance is in a separate Firebase project (myapp-dev, myapp-staging and myapp-production), all are on the Blaze plan and each has a corresponding bucket in Cloud Platform (gs://myapp-backup-dev, ...-staging, ...-production).
I've successfully completed all the "Before you begin" steps. I've exported data from one instance/project (staging) into its bucket, and it *seems* that I can also import it back into that project successfully (no error message, operationState: SUCCESSFUL), but any records changed since the export don't 'restore' back to their original values.
And for what it's worth, I've also successfully copied the exported data from that bucket into another project's bucket (staging to dev), and I get the same result when importing it into the second project (dev).
Am I doing something wrong here? Missing something?
Is the name of your collection testStuff or 'testStuff'? If it's testStuff, it seems like your export command was slightly off. You'll need to export the data again. You should get a workCompleted number this time around.
gcloud beta firestore export gs://myapp-backup-dev --collection-ids='testStuff'
gcloud beta firestore import gs://myapp-backup-dev/2018-10-15T21:38:18_36964 --collection-ids='testStuff'
I am developing with Firebase and have data stored in the Realtime Database. I need to share my database structure for a question here on Stack Overflow, or just take a backup before making breaking changes. How can I do this using the Firebase Console?
Data can be exported from the Firebase Realtime Database as JSON:
Log in to the Database section of the Firebase Console.
Navigate to the node you wish to export by clicking on it in the list (skip this to export all data).
Click the 3-dot overflow menu icon, at the top-right of the data panel.
Click Export JSON from the menu.
Likewise, you can import a structure in the same fashion, using Import JSON.
There is a Node.js tool called firebase-export, similar to firebase-import but not from Firebase itself, that will export JSON from the command line.
It is a Firebase export helper utility for exporting JSON from Firebase while excluding specified paths.
To install
npm install -g firebase-export
Usage example
$ firebase-export --database_url https://test.firebaseio-demo.com --firebase_secret '1234' --exclude 'settings/*, users/*/settings'
Github Repo
Note: Firebase has a REST API, so you can use any language to retrieve (export) data:
curl 'https://[PROJECT_ID].firebaseio.com/users/jack/name.json'
Here's an example curl request with filters
curl 'https://dinosaur-facts.firebaseio.com/dinosaurs.json?orderBy="height"&startAt=3&print=pretty'
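Since it is just a REST endpoint, the same export can be scripted in any language. Here is a minimal Node.js sketch, assuming the data at that path is publicly readable (or that you append an auth token) and with PROJECT_ID as a placeholder:

// Download a node of the Realtime Database as JSON and save it to a file.
const fs = require('fs');
const https = require('https');

const url = 'https://PROJECT_ID.firebaseio.com/users/jack/name.json';

https.get(url, (res) => {
  let body = '';
  res.on('data', (chunk) => (body += chunk));
  res.on('end', () => fs.writeFileSync('export.json', body));
});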
If you have a large JSON file, it is safer to download it using Postman's Import feature, because directly downloading a large JSON file sometimes fails partway through. You just need to save the response once it has fully arrived.
I'm using Firebase as a simple game server and have some settings that are relevant for both the client and the backend. I would like to keep them in Remote Config for consistency, but I'm not sure if I can access Remote Config from my Cloud Functions in a simple way (I don't consider going through the REST interface a "simple" way).
As far as I can tell there is no mention of it in the docs, so I guess it's not possible, but does anyone know for sure?
firebaser here
There is a public REST API that allows you to read and set Firebase Remote Config conditions. This API requires that you have full administrative access to the Firebase project, so it must only be used in a trusted environment (such as your development machine, a server you control, or Cloud Functions).
There is no public API to get Firebase Remote Config settings from a client environment at the moment. Sorry I don't have better news.
This is probably only included in newer versions of the firebase-admin SDK (version 8 or 9 and above, if I'm not mistaken).
// We first need to import the remoteConfig function.
import { remoteConfig } from 'firebase-admin'

// Then in your cloud function we use it to fetch our remote config values.
const remoteConfigTemplate = await remoteConfig().getTemplate().catch(e => {
  // Your error handling if fetching fails...
})
// Next it is just a matter of extracting the values, which is kinda convoluted.
// Let's say you want to extract the `game_version` field from remote config:
const gameVersion = remoteConfigTemplate.parameters.game_version.defaultValue.value
So parameters is always followed by the name of the parameter that you defined in the Firebase console's Remote Config, in this example game_version.
It's a mouthful (or typeful) but that's how you get it.
Also note that if the value is stored as a JSON string, you will need to parse it before use, commonly: JSON.parse(gameVersion).
A similar process is outlined in the Firebase docs.
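Putting the pieces together, here is a minimal self-contained sketch of a callable Cloud Function that reads the game_version parameter used in the example above (the function name is hypothetical):

// Callable function returning a Remote Config value to the caller.
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

exports.getGameVersion = functions.https.onCall(async () => {
  const template = await admin.remoteConfig().getTemplate();
  const gameVersion = template.parameters.game_version.defaultValue.value;
  // Parse here if the value was stored as a JSON string.
  return { gameVersion };
});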