We produce lots of large JSON files (~600 MB+, with one JSON record per line) and we want to import them into the Firebase database. Is there any way to do this directly from a gs:// path?
(The Firebase console UI has an import function, but it works with local JSON files, not gs:// paths, and it is not clear from https://github.com/firebase/firebase-import whether that tool can read from Google Cloud Storage.)
There is no direct way to import data from a file in Google Cloud Storage into the Firebase Database. You will have to use some intermediary tool or code for that. While recommending a specific tool is off-topic on Stack Overflow, I want to point out Firebase's own streaming import tool as one option.
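For example, an intermediary script can stream the file straight out of Cloud Storage and write each line as it arrives, which avoids loading 600 MB into memory. A minimal Node.js sketch, assuming firebase-admin and @google-cloud/storage are installed and using placeholder bucket, object, and database paths:

```js
const admin = require('firebase-admin');
const {Storage} = require('@google-cloud/storage');
const readline = require('readline');

// Uses application-default credentials; the database URL is a placeholder.
admin.initializeApp({databaseURL: 'https://your-project.firebaseio.com'});

async function importFromGcs() {
  const stream = new Storage()
    .bucket('your-bucket')           // placeholder bucket name
    .file('exports/records.json')    // placeholder object path
    .createReadStream();

  // Read the object line by line, since each line is one JSON record.
  const lines = readline.createInterface({input: stream});
  const ref = admin.database().ref('/records'); // placeholder target path

  for await (const line of lines) {
    if (!line.trim()) continue;
    await ref.push(JSON.parse(line)); // one write per record; see note below
  }
}

importFromGcs().then(() => process.exit(0));
```

Pushing one record per write is slow for files this size; grouping many lines into a single multi-path update() call would cut the number of round trips considerably.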
But you should really consider what purpose this import serves. The Firebase Database is a great tool for synchronizing data between connected clients. It is not a great tool for advanced querying or reporting on large data sets. Something like BigQuery is much more suited to such tasks.
Related
I would like to get all the data out of my Realtime Database (200 GB) to migrate it to a different database (PostgreSQL).
I considered:
creating a script with the Admin SDK that requests all the entries (see the sketch below), but this would be really slow and would push the database to its peak load (meaning it would slow down for every client);
using the "Export JSON" function of the console, but that is not possible because the size (200 GB) is too large.
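To be concrete, the script in option 1 would look something like this minimal sketch (the /data root and page size are placeholders), paging with orderByKey()/startAt() so that no single request tries to pull 200 GB at once:

```js
const admin = require('firebase-admin');
admin.initializeApp({databaseURL: 'https://your-project.firebaseio.com'});

async function exportAll(handlePage) {
  const ref = admin.database().ref('/data'); // placeholder root path
  let lastKey = null;

  while (true) {
    // startAt() is inclusive, so fetch one extra record and drop the overlap.
    let query = ref.orderByKey().limitToFirst(1000);
    if (lastKey !== null) {
      query = ref.orderByKey().startAt(lastKey).limitToFirst(1001);
    }

    const snap = await query.once('value');
    const records = [];
    snap.forEach((child) => {
      if (child.key !== lastKey) {
        records.push({key: child.key, value: child.val()});
      }
    });
    if (records.length === 0) break;

    await handlePage(records); // e.g. INSERT the rows into PostgreSQL here
    lastKey = records[records.length - 1].key;
  }
}
```

This keeps each request small, though it still means a lot of round trips at 200 GB.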
For now I have made a manual backup from the console and hopefully I will receive a huge ZIP file that I will then have to process with a script.
Is there any smarter solution?
I have a project whose database is in Cloud Datastore. I want to take a full backup, including all its entities, onto my local system. How can this be done? I have checked the Cloud documentation, i.e.
1- https://cloud.google.com/datastore/docs/export-import-entities#exporting_entities
2- https://cloud.google.com/sdk/gcloud/reference/datastore/export
but these describe how to export data from Cloud Datastore to Cloud Storage, not to a local system. Please let me know if anyone knows how this can be done.
Thanks!
It is not possible to get the Managed Export service to export directly to your local filesystem, so you'll need to export your entities to GCS. To use the exports locally, copy them to your machine and then import them into the Datastore emulator.
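For the export step, a minimal sketch with the Node.js Datastore Admin client (project ID and bucket are placeholders; the long-running operation is awaited before reading the output URL):

```js
const {v1} = require('@google-cloud/datastore');

async function exportToGcs() {
  const client = new v1.DatastoreAdminClient();
  const [operation] = await client.exportEntities({
    projectId: 'your-project',                   // placeholder project ID
    outputUrlPrefix: 'gs://your-bucket/exports', // placeholder bucket path
  });
  // exportEntities returns a long-running operation; wait for it to finish.
  const [response] = await operation.promise();
  console.log('Export written to', response.outputUrl);
}

exportToGcs();
```

From there you can copy the export down with gsutil and, per the emulator docs, POST the path of the overall_export_metadata file to the emulator's /v1/projects/{project-id}:import endpoint to load it.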
I do something like this, but I had to create my own exporter and importer; see my answer to this question: https://stackoverflow.com/a/52767415/4458510
To do this I wrote a Google Dataflow job that exports selected models and saves them in Google Cloud Storage in JSONL format. Then on my local host I have an endpoint called /init/ which launches a task queue job to download these exports and import them.
To do this I reuse my JSON REST handler code, which is able to convert any model to JSON and vice versa.
My Current Plan:
I'm currently creating an iOS app that will access and change Java/Python files stored in Google Cloud Storage. Once a change is confirmed, the app will talk to App Engine, which will have a Compute Engine VM receive the files and compile them. Once compiled, the result is returned to the iOS app.
Is there a better or easier way to achieve this? Should I use Firebase or Google Cloud Functions? Would they be of any help?
Currently, I'm lost on how to design this and how to send requests between so many platforms.
It also depends on what type of processing you are doing to the files in Cloud Storage. Ideally you want to avoid as many "hops" between services as possible. You could do everything via Cloud Functions listening on GCS triggers; you can think of Cloud Functions as a pseudo App Engine backend for quick request handling.
For example, you can use Cloud Functions to respond to events from Cloud Storage or Firebase Storage and process files immediately after upload.
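A minimal sketch of such a trigger with the first-generation firebase-functions API (the processing step itself is left as a placeholder):

```js
const functions = require('firebase-functions');

exports.processUpload = functions.storage.object().onFinalize(async (object) => {
  // Runs once for every file finalized (uploaded) in the default bucket.
  console.log(`New file: ${object.name} (${object.contentType})`);
  // ...download the file with @google-cloud/storage and compile/process it here...
});
```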
If you are already using Firebase, it is better to stay within its ecosystem as much as possible. If you are doing bigger or more intensive data processing, you might want to look at different options.
With more information and current pain points, we may be able to offer more insight.
I'm using the Firebase Realtime Database to store a large set of public transport information for my mobile apps.
Last week I tried to move everything to Firestore, which suits my database structure a lot better thanks to its query system.
Unlike the Realtime Database, I cannot import data from JSON files, which has forced me to use batched write operations to save the initial large block of data.
Doing so, I exceeded the quota limit in a few seconds. Is there a way to avoid the quota limits when importing the initial data for the project?
For example, in the Realtime Database I was able to import a large set of data directly from the Firebase panel by uploading a JSON file from my PC.
There is currently no API or other service specifically for bulk-importing data into Cloud Firestore. That means you must use the regular API to import the data, and it unfortunately also means there is currently no way to bypass the quota limits while importing a data set.
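The best you can do is pace your own loader so it stays under the limits. A minimal sketch with the Admin SDK, using batched writes of at most 500 operations each; the collection name, input array, and delay are placeholders:

```js
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

async function bulkImport(records) {
  // Firestore batches are capped at 500 writes each.
  for (let i = 0; i < records.length; i += 500) {
    const batch = db.batch();
    for (const record of records.slice(i, i + 500)) {
      batch.set(db.collection('transport').doc(), record); // auto-generated IDs
    }
    await batch.commit();
    // Pause between batches to stay under sustained-write limits (tune as needed).
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
}
```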
I am thinking of porting my server-side Java code to Firebase, basically replacing my dynamic web app (built with Jersey and MongoDB) with Firebase Hosting and Cloud Functions.
I have implemented sendWelcomeEmail and sendByeEmail with the user.created and user.deleted events.
But now I want to do something more complicated.
Based on a POST request that comes from the mobile app, I want to extract the JSON data and then update the database.
So I created a JS file with plenty of functions, but now I am not sure it will actually work.
Is this the best way to implement this workflow?
The workflow goes like this:
An image is taken on the Android device => information is extracted from the image => the JSON data is uploaded to the server (Firebase Hosting) => a function is executed in response to the POST request => the data is extracted => it is saved to the Firebase Database.
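For the function step, this is roughly what I have in mind (a minimal sketch; the /images path and payload shape are placeholders):

```js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.uploadImageData = functions.https.onRequest(async (req, res) => {
  if (req.method !== 'POST') {
    return res.status(405).send('Only POST is allowed');
  }
  const data = req.body; // the JSON body is parsed automatically
  const ref = await admin.database().ref('/images').push(data);
  res.status(200).json({key: ref.key});
});
```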
Let me know if this sounds OK, or whether I need to implement another workflow.
Thanks
Eran
The whole idea of Firebase is that your app talks directly to backend services (such as the database, or cloud storage), and you only write server-side code (with cloud functions) for functionality that Firebase doesn't provide a client-side API for.
Why don't you simply let the Android client write directly to the Firebase Database?