Being on the Blaze plan, what are the cost implications when it comes to Firestore triggers?
Do I assume correctly that a trigger by itself doesn't generate any extra reads from the database?
I still have a feeling that triggers are not free from the perspective of CPU/memory consumption. And from that perspective, do I understand correctly that Firestore triggers can be treated just like any other Firebase Function?
The Firestore document that triggers your Cloud Function is included in the context. There is no document read or bandwidth charge for accessing this document.
You will be charged for the invocation and CPU/memory usage of the Cloud Function itself, as well as for any additional Firestore access you perform inside your Function's code.
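For example, a minimal sketch of both halves of this, assuming the firebase-functions v1 API and a hypothetical users collection and config/settings document:

    const functions = require('firebase-functions');
    const admin = require('firebase-admin');
    admin.initializeApp();

    exports.onUserWrite = functions.firestore
      .document('users/{userId}')
      .onWrite(async (change, context) => {
        // Free: the triggering document arrives with the event; no read is charged.
        const newData = change.after.exists ? change.after.data() : null;

        // Billed: any additional Firestore access inside the function counts as
        // normal document reads/writes, on top of the invocation and CPU/memory cost.
        const settings = await admin.firestore().doc('config/settings').get();
        console.log(context.params.userId, newData, settings.data());
      });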
According to the Firebase docs on Firestore triggers: "With Cloud Functions, you can handle events in Cloud Firestore with no need to update client code."
So Firestore triggers can only be implemented via Cloud Functions. For Cloud Functions there are costs for running a function (https://firebase.google.com/pricing), and the costs of listening to changes (snapshots) or reading/writing to Firestore will be added on top. Depending on what you are planning to do, you may also have to account for the network traffic generated between Firestore and the Cloud Functions, but this depends on several variables.
So yes, you are correct.
In both Firestore security rules and Firebase Realtime Database rules, it is possible to check a pre-existing document or node at a specific location:
Firestore example: get(/databases/$(database)/documents/collection/mydoc).data
Realtime Database example: root.child('collection').child('mydoc').val()
My question is:
Is it possible to cross-reference these rules between Firestore and the Realtime Database? I.e., validate the presence of a Firestore document from a Realtime Database rule and/or vice versa, possibly avoiding Cloud Functions?
Is it possible to cross-reference these (security) rules from Firestore to the Realtime Database?
No, it is not possible to "cross-reference" the different Security Rules between services.
Note that it is the same with Cloud Storage and Firestore or Cloud Storage and the RTDB.
As you said, you may use a Cloud Function for that, and there are several approaches here:
Use a Cloud Function to read and write data to e.g. Firestore, and in the Cloud Function check the existence of the node in the RTDB. Writing to/reading from a DB through a Cloud Function may have some drawbacks, see this article.
Use a Cloud Function to mirror the two database structures, e.g. create a new Firestore doc in the collection named collection whenever a new RTDB node is created at the child('collection').child('mydoc') location, as sketched after this list. With this second approach you can still use security rules and the client SDKs when writing to and reading from Firestore.
(The same logic applies if you invert the databases in the examples above.)
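For example, a minimal sketch of the second (mirroring) approach, assuming the firebase-functions v1 API and the collection/node names from the examples above:

    const functions = require('firebase-functions');
    const admin = require('firebase-admin');
    admin.initializeApp();

    // Mirror every new RTDB node under /collection into a Firestore doc with the
    // same ID, so Firestore security rules can then check its existence locally.
    // Assumes the node's value is an object, since Firestore docs hold field maps.
    exports.mirrorToFirestore = functions.database
      .ref('/collection/{docId}')
      .onCreate((snapshot, context) => {
        return admin.firestore()
          .collection('collection')
          .doc(context.params.docId)
          .set(snapshot.val());
      });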
Going through this blog, I could see that my app would be better off using Cloud Functions for Firebase with Firestore rather than accessing Firestore directly via the client-side SDK.
I can implement the Firestore READ operation using get(), the WRITE operation using set() or update(), and DELETE using delete(). All these one-shot operations are fine.
Is it possible to implement addSnapshotListener to fetch real-time updates? If yes, how?
While it technically is possible to use addSnapshotListener, there are not a lot of use-cases where this makes sense.
Cloud Functions are short-lived, with a maximum runtime of 9 minutes, so they are not well suited for scenarios where you need to listen to the database for longer periods.
Typically, you'll instead want to define a separate Cloud Function that responds to the change you'd otherwise be listening for, as sketched below.
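For example, instead of attaching a listener inside a function, you would declare a trigger (a minimal sketch, assuming the firebase-functions v1 API and a hypothetical items collection):

    const functions = require('firebase-functions');

    // Runs once per matching change; no long-lived listener is needed.
    exports.onItemUpdate = functions.firestore
      .document('items/{itemId}')
      .onUpdate((change, context) => {
        const before = change.before.data();
        const after = change.after.data();
        console.log(`Item ${context.params.itemId} changed`, { before, after });
        return null;
      });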
Realtime listeners are not compatible with the way Cloud Functions works. Functions can only run for a limited amount of time and are shut down after that timeout expires. Database listeners are indefinite, and will continue listening until they are removed. If you add a listener inside a function, the system will still shut it down after the function timeout expires.
It's unclear to me why you need a listener in a Cloud Function, but it's almost certainly not the right thing to do, given the way functions work.
I was looking at a few ways to export data out of Firestore without using the managed export (an expensive operation in the long term, as it doesn't support incremental backups) for use in BigQuery and Data Studio.
1) Using Google Pub/Sub.
This will probably require one function to write to Pub/Sub and then another, triggered by Pub/Sub, to write to BQ.
2) Using Cloud Functions to trigger from an onCreate event and write directly to a BigQuery dataset and table.
(This uses table.insert; a sketch follows below.)
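For reference, a minimal sketch of option 2, assuming the @google-cloud/bigquery client and hypothetical collection, dataset, and table names:

    const functions = require('firebase-functions');
    const { BigQuery } = require('@google-cloud/bigquery');
    const bigquery = new BigQuery();

    // Stream each new Firestore document straight into BigQuery.
    // Assumes the document's fields match the destination table schema.
    exports.exportToBigQuery = functions.firestore
      .document('orders/{orderId}')
      .onCreate((snapshot, context) => {
        const row = { id: context.params.orderId, ...snapshot.data() };
        return bigquery.dataset('firestore_export').table('orders').insert([row]);
      });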
What would the advantage of using Pub/Sub be, other than that it would appear to cost more in the long term?
Or is there another way I am unaware of to do this?
I'm new at this. Any advice and the pros and cons of the above scenarios are much appreciated.
The official solution is here.
In the case of using a Cloud Function triggered by an onCreate event, what will you create: a file on Cloud Storage, or a Firestore document?
I think that in the case of using Cloud Functions you should use a Pub/Sub trigger.
I recommend an asynchronous architecture like Pub/Sub, because re-running is easy and the scope of influence is limited (see the sketch after the list of samples below).
The samples I developed are here. I'm using Cloud Scheduler, not cron.yaml. The cost of Cloud Scheduler is here.
(If you want) Export Firebase Authentication users to a Cloud Firestore collection, using Firestore, Cloud Functions (Pub/Sub), and Cloud Scheduler.
Export all Cloud Firestore collections, or specified collections, to Cloud Storage, using App Engine and Cloud Scheduler.
Export specified Cloud Firestore collections to BigQuery (as partitioned tables), using App Engine and Cloud Scheduler.
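As for the Pub/Sub stage itself, here is a minimal sketch, assuming a hypothetical firestore-export topic (the BigQuery insert would be the same table.insert call as in the direct sketch above):

    const functions = require('firebase-functions');
    const { PubSub } = require('@google-cloud/pubsub');
    const pubsub = new PubSub();

    // Stage 1: publish each new document to a topic instead of writing to BQ directly.
    exports.publishOnCreate = functions.firestore
      .document('orders/{orderId}')
      .onCreate((snapshot, context) => {
        const payload = { id: context.params.orderId, ...snapshot.data() };
        return pubsub.topic('firestore-export').publishMessage({ json: payload });
      });

    // Stage 2: consume the topic and insert into BigQuery. Re-running is easy:
    // just re-publish the messages, which is the main advantage of this design.
    exports.consumeExport = functions.pubsub
      .topic('firestore-export')
      .onPublish((message) => {
        const row = message.json;
        // ...insert row into BigQuery here, as in the direct sketch above.
        return null;
      });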
tl;dr: I need to have Google Cloud Functions on my own backend.
I'm writing an application which uses Firebase (especially Firestore) as a transport layer between my own backend, written in Node.js, and client applications.
Sometimes I need to catch some events from the client on the backend, but I want to avoid performing HTTP requests directly to my backend (because I would need to handle offline status, among other problems). It is better to make some changes in Firestore documents, catch those changes on my backend, and perform some business logic.
For now, this can be solved with Cloud Functions, but that solution is not acceptable because of the delay between an event and the function invocation, and the lack of invocation ordering.
Yet another solution, which is currently used in my project, is to make some changes to a Firestore document and add an extra document, called an "event", to another collection. On the server side, using the firebase-admin SDK, I subscribe to that "events" collection and get realtime updates from it (sketched below).
This works great, but looks overcomplicated. Is there any way to subscribe from my backend to all updates of all Firestore documents? The ideal solution would be to subscribe to updates the way it's done in Cloud Functions: https://firebase.google.com/docs/functions/firestore-events?authuser=0
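For context, a minimal sketch of that "events" collection listener, using the Admin SDK (collection and field names are illustrative):

    const admin = require('firebase-admin');
    admin.initializeApp();

    // Long-lived process on the backend: listen to the "events" collection
    // and handle each newly added event document.
    const unsubscribe = admin.firestore()
      .collection('events')
      .onSnapshot((querySnapshot) => {
        querySnapshot.docChanges().forEach((change) => {
          if (change.type === 'added') {
            console.log('New event:', change.doc.id, change.doc.data());
            // ...run business logic, then optionally delete the event doc.
          }
        });
      });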
The client and server SDKs don't have this capability. Cloud Functions is really your only way to get notified of changes in Firestore that match a wildcard pattern.
I'm building a Firebase app and plan to use the Realtime Database where I need real-time updates. However, most of the application data is more traditional.
Now that Functions is a thing, how do I also leverage either Datastore or Cloud SQL? Can anyone point me to specific documentation or examples of how to read/write with either of those services from a function?
Neither Cloud Datastore nor Cloud SQL supports Cloud Functions triggers yet, which means you aren't yet able to trigger Cloud Functions based on their events the way you can with the Firebase Realtime Database.
Fortunately, once a Cloud Function has been triggered (for example via HTTP), you can still read and write from Datastore and SQL as you would from any other Node.js code. Here is documentation for Cloud Datastore, and here it is for Cloud SQL.
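For example, a minimal sketch of reading and writing Datastore from an HTTP-triggered function, assuming the @google-cloud/datastore client and a hypothetical Task kind:

    const functions = require('firebase-functions');
    const { Datastore } = require('@google-cloud/datastore');
    const datastore = new Datastore();

    // HTTP-triggered function that writes a Datastore entity and reads it back.
    exports.saveTask = functions.https.onRequest(async (req, res) => {
      const key = datastore.key(['Task', 'sample-task']);
      await datastore.save({ key, data: { description: req.query.text || 'hello' } });
      const [task] = await datastore.get(key);
      res.json(task);
    });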
Finally, if you're adventurous and might like to provide early feedback on upcoming integrations like Datastore, fill out this form!