I'm attempting to use the instructions here: https://firebase.google.com/docs/firestore/manage-data/export-import to a) do periodic backups of data from my production instance, and b) copy data between production/staging/dev instances.
For what it's worth, each instance is in a separate Firebase project (myapp-dev, myapp-staging and myapp-production), all are on the Blaze plan and each has a corresponding bucket in Cloud Platform (gs://myapp-backup-dev, ...-staging, ...-production).
I've successfully completed all the "Before you begin" steps. I've exported data from one instance/project (staging) into its bucket, and it *seems* that I can also import it back into that project successfully (no error message, operationState: SUCCESSFUL), but any records changed since the export don't 'restore' back to their original values.
And for what it's worth, I've also successfully copied the exported data from that bucket into another project's bucket (staging to dev), and get the same result when importing it into the second project (dev).
Am I doing something wrong here? Missing something?
Is the name of your collection testStuff or 'testStuff'? If it's testStuff, it seems like your export command was slightly off. You'll need to export the data again. You should get a workCompleted number this time around.
gcloud beta firestore export gs://myapp-backup-dev --collection-ids='testStuff'
gcloud beta firestore import gs://myapp-backup-dev/2018-10-15T21:38:18_36964 --collection-ids='testStuff'
I am trying to import existing data from Firestore to Algolia but cannot make it work. I installed the Firebase Algolia Extension and tried to follow the documented steps:
Running this:
npx firestore-algolia-search
Below are the questions that will be asked:
What is the Region? europe-west3
What is the Project Id? wishlist-88d58
What is the Algolia App Id? 15W2O5H8ZN
What is the Algolia Api Key? { unspecified parameter }
What is the Algolia Index Name? allUsers
What is the Collection Path? allUsers
What are the Fields to extract? { unspecified parameter }
What is the Transform Function? { unspecified parameter }
What is the path to the Google Application Credential File? wishlists_key.json
For the Algolia API Key, I added a key.
I did not specify Fields to extract or a Transform Function.
For the path to the Google Application Credential File, I created a private key in Firebase and placed it on my desktop as wishlists_key.json, which is also where I ran the command above from.
I got a response that also contained the data, but at the beginning it reported an error:
{"severity":"WARNING","message":"Warning, FIREBASE_CONFIG and GCLOUD_PROJECT environment variables are missing. Initializing firebase-admin will fail"}
{"location":"europa-west3","algoliaAppId":"15W205H8ZN","algoliaAPIKey":"********","algoliaIndexName":"allUsers","collectionPath":"allUsers","fields":"","transformFunction":"","projectId":"wishlist-88d58","severity":"INFO","message":"Initializing extension with configuration"}
{"severity":"INFO","message":"[ 'Sending rest of the Records to Algolia' ]"}
{"severity":"INFO","message":"[ 'Preparing to send 20 record(s) to Algolia.' ]"}
{"name":"RetryError","message":"Error when performing Algolia index","transporterStackTrace":[{"request":{"data":"{"requests":[{"action":"partialUpdateObject","body":{"signInMethod":"mail","username":"user662572"
...
The command does not finish running but gets stuck at this point.
What am I doing wrong here? How do I correctly import data from Firestore to Algolia?
Also, later I will need to import a collection with about 24k documents. Is the documented approach capable of handling that many documents?
I was able to make it work through Google Cloud Shell. I had to upload my Google Application Credential File to it and then run the command above again.
I still got the same warning that it would fail, but it worked anyway and all the data was imported correctly.
I followed the official docs for scheduling Firestore exports via a Cloud Function and Cloud Scheduler.
It works perfectly the first time, creating the necessary exports at the right location.
When I run it again, I get the following error in the Cloud Function:
Error: 3 INVALID_ARGUMENT: Path already exists: /red115.appspot.com/daily_backup/users.overall_export_metadata
Why doesn't it overwrite the existing data?
I followed the official docs and gave the necessary roles & permissions to the principal account.
I have recreated the setup from the docs you shared. I am assuming you also did the Configure access permissions part. I scheduled the Firebase function for Firestore export every 5 minutes on all collections, and even after it ran 4 times on schedule I did not get the error you mentioned.
The Firebase Storage rules you provided do not affect the Firestore export functionality; they only apply to Firebase Storage.
But if you are experiencing this error, I recommend first checking whether an export already exists at the specified location in that bucket.
If yes: to overwrite existing data on a Firestore export, you could add a step in your Cloud Function that deletes the existing export (if one exists) before running the new export, using something along the lines of admin.storage().bucket() (be specific about what you delete); see the sketch below.
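A hedged sketch of that cleanup step. Note that it swaps in Bucket.deleteFiles(), which removes the objects under a prefix, rather than bucket().delete(), which would delete the bucket itself; it also assumes firebase-admin is initialized, and BUCKET_NAME and the daily_backup/ prefix are placeholders.
import * as admin from 'firebase-admin';

// Delete every object under the export prefix so the next export can reuse the same path.
async function clearPreviousExport(): Promise<void> {
  await admin.storage().bucket('BUCKET_NAME').deleteFiles({ prefix: 'daily_backup/' });
}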
OR
Or you could change the export path to include a timestamp or a version number, so that each export is saved to a unique location (this mirrors the default behavior of gcloud firestore export); see the sketch below.
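A minimal sketch of that approach, closely following the official scheduled-export sample but appending a timestamp to the output prefix; BUCKET_NAME, the daily_backup prefix, and the schedule are placeholders.
import * as functions from 'firebase-functions';
import * as firestore from '@google-cloud/firestore';

const client = new firestore.v1.FirestoreAdminClient();
const bucket = 'gs://BUCKET_NAME';

export const scheduledFirestoreExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT;
    const databaseName = client.databasePath(projectId!, '(default)');
    // A timestamped prefix means every run writes to a fresh location,
    // so "Path already exists" cannot occur.
    const outputUriPrefix = `${bucket}/daily_backup/${new Date().toISOString()}`;
    await client.exportDocuments({
      name: databaseName,
      outputUriPrefix: outputUriPrefix,
      collectionIds: [] // an empty array exports all collections
    });
  });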
If no: then there is probably a typo in the line where you provide the bucket link: const bucket = 'gs://BUCKET_NAME';
Make sure you provide the same gs://BUCKET_NAME in functions/index.js and in gsutil iam ch serviceAccount:PROJECT_ID@appspot.gserviceaccount.com:admin gs://BUCKET_NAME
This thread also talks about a wrong gsutil path; you may want to have a look at it as well.
I tried to clone a firestore database. I found a guide on this topic (https://qrc.ninja/2019/03/20/cloning-firestore-data/) so I tried to complete the steps in this guide.
To export the database I did the following:
gcloud config set project [PROJECT_ID]
gcloud firestore export gs://[BUCKET_NAME]
To import the database I did the following:
gcloud config set project [DESTINATION_PROJECT_ID]
gsutil acl ch -u [RIGHTS_RECIPIENT]:R gs://[BUCKET_NAME]
gcloud firestore import gs://[BUCKET_NAME]/[TIMESTAMPED_DIRECTORY]
The last step (gcloud firestore import ...) resulted in this error:
ERROR: (gcloud.firestore.import) Entity too large
I searched for this problem, but I could only find a mention of it in a cached Google result of this page: https://cloud.google.com/datastore/docs/export-import-entities
There it says:
An import operation updates entity keys and key reference properties in the import data with the project ID of the destination project. If this update increases your entity sizes, it can cause "entity is too big" or "index entries too large" errors for import operations.
To avoid either error, import into a destination project with a shorter project ID.
My project ID looks like this: XX-XXXXX-XXXXXXX. It is 16 characters long. As I need a paid plan for my project, simply testing with a shorter name won't be free.
So I would be grateful for any hints on whether the ID is really the problem, or whether there is something else I could try to clone my database.
Update: I can clone the database by exporting/importing single collections. But one of my collections has over 79,000 documents. When I export this large collection and try to import it, I still get
ERROR: (gcloud.firestore.import) Entity too large
This kind of issue is usually related to entities that somehow grow over the allowed size, so that problems arise when trying to restore a DB (from an export and then an import). The issue lies in the import, given that the export doesn't have any restrictions. The project ID shouldn't have anything to do with the issue.
One way you can check this is to import your data into BigQuery and inspect the larger entities yourself. The Cloud Datastore entities should respect the limits set here, in particular the maximum size of an entity. The size of an entity is the sum of:
The key size
The sum of the property sizes
32 bytes
You can check the size of each entity manually, by writing a script (see the sketch below), or by loading the data into BigQuery. How the size of an entity is calculated is defined at the URL linked above.
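A rough sketch of the script option using the Admin SDK; the JSON-serialized length is only an approximation of the real entity size, it assumes documents hold plain data (reference fields may need custom handling), and the collection name and threshold are placeholders.
import * as admin from 'firebase-admin';

admin.initializeApp();

// Print the approximate size of every document in a collection that exceeds a threshold.
async function findLargeDocuments(collectionPath: string, thresholdBytes: number): Promise<void> {
  const snapshot = await admin.firestore().collection(collectionPath).get();
  for (const doc of snapshot.docs) {
    const approxSize = Buffer.byteLength(JSON.stringify(doc.data()), 'utf8');
    if (approxSize > thresholdBytes) {
      console.log(`${doc.ref.path}: ~${approxSize} bytes`);
    }
  }
}

findLargeDocuments('users', 900000).catch(console.error); // placeholder collection and a threshold near the 1 MiB limit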
Additionally, you can run the command:
gcloud datastore operations describe [OPERATION_ID]
with the import operation id to get more details.
I found this public issue tracker. According to it, this issue should be resolved by modifying the affected entities.
I am attempting to copy the contents of a folder in one Firebase project's Storage bucket to the storage bucket of another Firebase project.
I have been following the Firestore docs and this SO question.
Both projects have the necessary permissions to the other's service accounts.
Here is what I have done:
When attempting to transfer files from a folder in the default bucket of Project-A to the default bucket of Project-B using the cloud shell terminal, I first set the project to 'Project-A'. I then ran gcloud beta firestore export gs://[PROJECT_A_ID] --collection-ids=[FOLDER_TO_TRANSFER] --async. This succeeds and creates a folder called "2019-08-26T21:23:23_26014/" in Project-A. This folder contains some metadata.
Next, I tried beginning the import by setting the project to Project-B and running gcloud beta firestore import gs://[PROJECT_A_ID]/2019-08-26T21:23:23_26014
This completes and the logs display this message:
done: true
metadata:
  '@type': type.googleapis.com/google.firestore.admin.v1beta1.ImportDocumentsMetadata
  collectionIds:
  - [FOLDER_TO_TRANSFER]
  endTime: '2019-08-26T21:25:56.794588Z'
  inputUriPrefix: gs://[PROJECT_A_ID]/2019-08-26T21:23:23_26014
  operationState: SUCCESSFUL
  startTime: '2019-08-26T21:25:19.689430Z'
name: projects/[PROJECT_B]/databases/(default)/operations/[SOME_ID_STRING]
response:
  '@type': type.googleapis.com/google.protobuf.Empty
However, the Project-B storage bucket doesn't have any new files or folders. It looks like the import did nothing. Am I missing something?
You can create a transfer job in the GCP Console. You can specify source/destination buckets from different projects as long as you have access permissions. You can specify the folder by setting "Specify file filters":
https://console.cloud.google.com/storage/transfer
You can also use the gsutil tool, which is part of gcloud, to move or copy your objects to another bucket.
So your default buckets would be gs://[PROJECT_A_ID].appspot.com and gs://[PROJECT_B_ID].appspot.com. Let's say you wanted to copy over the contents of my_directory:
gsutil cp -r gs://[PROJECT_A_ID].appspot.com/my_directory gs://[PROJECT_B_ID].appspot.com
I am trying to write a function that updates a node in the database and then creates a directory in the default storage bucket.
admin.database().ref('messages').push({ original: original })
.then(() => {
//looking for something like this
//functions.storage.object().mkdir('myFolder');
})
Function samples from the Firebase docs use const gcs = require('@google-cloud/storage')(); but I am having a hard time importing this package using TypeScript.
Importing it this way does not work; instead of having access to gcs.bucket(...) I have access to gcs.Bucket:
import * as gcs from '@google-cloud/storage';
I am looking for ways to get this import working, or other approaches I can use in TypeScript.
Thanks.
Google Cloud Storage does not have a concept of "folders" -- as a pure object store, you can store a file with any arbitrary key. Various UIs (including the Firebase console) look for slashes in the object names to provide a virtual structure, but no actual structure exists.
On the import issue -- as of version 5.2.0 of the Admin SDK, you can just do:
admin.storage().bucket()
to get a reference to the Cloud Storage bucket.
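For example, a minimal TypeScript sketch under those assumptions (firebase-admin 5.2.0+ initialized in a Cloud Functions environment); the myFolder/.placeholder object name is purely illustrative.
import * as admin from 'firebase-admin';

admin.initializeApp();

// There are no real folders in Cloud Storage; uploading an object whose name
// contains a slash makes the console display a virtual "folder".
export async function createVirtualFolder(): Promise<void> {
  const bucket = admin.storage().bucket();
  await bucket.file('myFolder/.placeholder').save('');
}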