I am using many Cloud Function triggers and the Admin SDK to do multi-path updates. I don't want to do too many multi-path updates on the client, because that would make my Firestore security rules very complex, and the rules also have document-access call limits. So I decided to use Cloud Functions for most of the denormalization work.
Here is how one of my functions works:
1. The Cloud Function is triggered at profiles/{userId}.
2. It uses .get() to load the paths that need the multi-path update from profilesPaths/{userId}.
3. It sets writeBatch.update() on each of those paths.
4. It calls writeBatch.commit().
And I think there is a problem here. Cloud Functions are asynchronous, right? So suppose the function is at step 3, and at that same moment a client deletes one of the update paths from the profilesPaths/{userId} document the function already loaded in step 2. The document the function is working from is now no longer the latest version. Can this happen? Or should I use transactions to lock those documents?
Yes, Cloud Functions run asynchronously, and possibly in parallel. You should be using transactions to make sure that updates stay consistent across all the clients that are trying to modify the same documents concurrently.
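As a minimal sketch (assuming profilesPaths/{userId} keeps the fan-out targets in an array field named paths, and that the profile data itself is what gets copied; both are placeholders), the flow above could look like this as a transaction. Firestore transactions re-run the read if a concurrent write invalidates it before the commit, which closes exactly the race described in the question:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.fanOutProfile = functions.firestore
  .document('profiles/{userId}')
  .onWrite(async (change, context) => {
    if (!change.after.exists) return null; // profile deleted, nothing to fan out
    const db = admin.firestore();
    const pathsRef = db.doc(`profilesPaths/${context.params.userId}`);

    await db.runTransaction(async (tx) => {
      // All reads happen first; if this document changes concurrently
      // (e.g. a client deletes a path), the transaction retries.
      const snap = await tx.get(pathsRef);
      const paths = snap.exists ? snap.data().paths || [] : [];
      for (const path of paths) {
        tx.update(db.doc(path), { profile: change.after.data() });
      }
    });
    return null;
  });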
We are using Firebase Functions with a few different HTTP functions.
One of the functions runs via a manual trigger from our website. It then pulls in a lot of data from an external resource and saves it into our Firestore database. Our function resources are Node.js 10, 1 GB of memory, and a 540-second timeout.
However, when we have large datasets that we need to pull in, e.g. 5,000-10,000 records to write to the database, we start running into issues. On large datasets we receive the error:
8 RESOURCE_EXHAUSTED: Bandwidth exhausted
The full error in the Firebase Functions Health Dashboard logs looks like this:
Error: 8 RESOURCE_EXHAUSTED: Bandwidth exhausted
    at Object.callErrorFromStatus (/workspace/node_modules/@grpc/grpc-js/build/src/call.js:31:26)
    at Object.onReceiveStatus (/workspace/node_modules/@grpc/grpc-js/build/src/client.js:176:52)
    at Object.onReceiveStatus (/workspace/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:342:141)
    at Object.onReceiveStatus (/workspace/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:305:181)
    at Http2CallStream.outputStatus (/workspace/node_modules/@grpc/grpc-js/build/src/call-stream.js:117:74)
    at Http2CallStream.maybeOutputStatus (/workspace/node_modules/@grpc/grpc-js/build/src/call-stream.js:156:22)
    at Http2CallStream.endCall (/workspace/node_modules/@grpc/grpc-js/build/src/call-stream.js:142:18)
    at ClientHttp2Stream.stream.on (/workspace/node_modules/@grpc/grpc-js/build/src/call-stream.js:420:22)
    at ClientHttp2Stream.emit (events.js:198:13)
    at ClientHttp2Stream.EventEmitter.emit (domain.js:466:23)
Our Firebase project is on the Blaze plan and is also connected to an active billing account on GCP.
Upon inspection in GCP, it seems we are NOT exceeding our writes-per-minute quota, as previously thought; however, we are exceeding our Cloud Build limit. We are also using batched writes when we save data to Firestore from within the function, which also seems to reduce the number of database writes.
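For illustration, a minimal sketch of that batching pattern (the collection name and record source are placeholders; 500 is Firestore's per-batch operation limit):

const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

async function saveRecords(records) {
  // Commit in chunks of 500 operations, Firestore's per-batch limit.
  for (let i = 0; i < records.length; i += 500) {
    const batch = db.batch();
    records.slice(i, i + 500).forEach((record) => {
      batch.set(db.collection('records').doc(), record);
    });
    await batch.commit();
  }
}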
We don't use Cloud Build ourselves, so I assume Firebase Functions uses Cloud Build behind the scenes to run the functions, but I can't find any documentation on the matter. We also have a few Firestore database functions that run when documents are created. I'm not sure whether those use Cloud Build behind the scenes or not.
Any idea why this would happen? Whenever it does, our function gets terminated with that error, which causes us to import only half of our data. The import works flawlessly with smaller amounts of data.
Cloud Build is used during the deployment of Cloud Functions. If you check this documentation, you can see that:
Deployments work by uploading an archive containing your function's source code to a Google Cloud Storage bucket. Once the source code has been uploaded, Cloud Build automatically builds your code into a container image and pushes that image to Container Registry. Cloud Functions uses that image to create the container that executes your function.
This by itself is not enough to justify the charges you are seeing, but if you check the container image documentation it says:
Because the entire build process takes place within the context of your project, the project is subject to the pricing of the included resources:
For Cloud Build pricing, see the Pricing page. This process uses the default instance size of Cloud Build, as these instances are pre-warmed and are available more quickly. Cloud Build does provide a free tier: please review the pricing document for further details.
So with that information in mind, I would make an educated guess that your website is triggering the HTTP function often enough for Cloud Functions to scale this particular function up with new instances of it, which triggers a build process for the container that hosts the function and shows up as a Cloud Build charge. So to keep doing what you are doing, you are going to have to increase your Cloud Build quota to meet the demand from your website.
There was a Firestore trigger that was triggering on new records of the same type I was importing.
So in short, I was creating thousands of records in a collection, and for every one of those the Firestore trigger (function) ran. What I did not know at the time is that this created a new build process in the background for each Firestore trigger that ran, which is not documented anywhere.
I'm sure these are common scenarios, but after some hours of research I couldn't really find what the common practice is. Maybe someone with more experience in Firebase can point me in the right direction.
I have two particular scenarios:
1. Code that runs once
Example 1: adding new data to all users in Firestore, which is needed for a new feature
Example 2: starting to duplicate data into existing documents
I currently write the code in a Cloud Function, run it on a Firestore event (onUpdate of a "hidden" document), and then immediately delete the function if everything goes well.
I also increase the timeout and memory for this function, as the idea is to potentially update millions of documents.
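For reference, that resource configuration looks roughly like this in the v1 functions API (the "hidden" trigger document's path is a placeholder):

const functions = require('firebase-functions');

exports.oneOffMigration = functions
  .runWith({ timeoutSeconds: 540, memory: '1GB' })
  .firestore.document('internal/migrationTrigger')
  .onUpdate(async (change, context) => {
    // ...one-off migration code that may touch millions of documents...
  });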
2. Manually trigger a function from the Firebase console (or command line)
Example: give a user admin privileges in the app (a function that sets custom claims and Firestore data). We don't have time to implement a back office, so doing this from the Firebase web portal/console, specifying the user ID, would be ideal.
My current solution is to use an HTTPS function and run it from the GCP portal (on the function's "Testing" tab, which lets me pass a JSON payload). BUT the function can be triggered publicly, which I don't really like...
What are the common practices for these scenarios?
To expand on my comment: if you want to create a Node script to run one-off code, you just write your JS code like you would for any Cloud Function, but run it immediately. Something like this:
const admin = require('firebase-admin');

// Picks up the service-account credentials referenced by the
// GOOGLE_APPLICATION_CREDENTIALS environment variable (see below).
admin.initializeApp();

const db = admin.firestore();

db.collection('users')
  .where('myField', '==', true)
  .get()
  .then((querySnapshot) => {
    querySnapshot.docs.forEach((doc) => {
      // update doc here, e.g.:
      // doc.ref.update({ myField: false });
    });
  })
  .catch(console.error);
If you save this as script.js and execute it with node script.js, you'll get an error pointing you towards downloading a JSON file with credentials. Follow those instructions, run the script again, and you're now running your own code against Firestore from the command line.
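For example (the path to the downloaded service-account JSON is a placeholder):

export GOOGLE_APPLICATION_CREDENTIALS="./serviceAccount.json"
node script.js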
For administrative type operations, you're better off just running them on your desktop or some other server you control. Cloud Functions is not well suited for long running operations, or things that must just happen once on demand.
Case 1 really should be managed by a standalone program or script that you can monitor by running it on your desktop.
Case 2 can be done a number of ways, such as building your own admin web site. But you might find it easiest to mirror the contents of a document to custom claims using a Firestore trigger. Read this: https://medium.com/firebase-developers/patterns-for-security-with-firebase-supercharged-custom-claims-with-firestore-and-cloud-functions-bb8f46b24e11
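As an illustration of that mirroring pattern (assuming an admins/{uid} collection whose documents hold the claims to apply; the collection name and claim shape are placeholders):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Mirror the admins/{uid} document into that user's custom claims.
exports.mirrorClaims = functions.firestore
  .document('admins/{uid}')
  .onWrite((change, context) => {
    const claims = change.after.exists ? change.after.data() : {};
    return admin.auth().setCustomUserClaims(context.params.uid, claims);
  });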
Does anyone know if there is an easy way to trigger a function every time I re-deploy some function to Firebase?
I have a specific Firebase function which I define inside GCP (this way, when I do "firebase deploy" it doesn't re-deploy, uninstall, or touch my current function in any form), but sometimes I might update this function manually on GCP, and I would like to trigger an inner function of its code every time that happens... is it possible?
ex:
exports.decrementAction = (req, res) => {
  /* do stuff */
  res.status(200).send("ok");
};

function auxiliary() {
  // to be called on re-deploy
}
Unfortunately, there isn't an easy way to trigger a function from within the code that is being redeployed. Since that code is only being deployed at that moment, this can't be done automatically.
The alternative would be to keep this function separate from the "root" function at deployment time and use triggers to run this other Cloud Function when the first one is redeployed. This way, it would be possible to run it based on the deployment of the other.
You can get more information on the triggers available for Cloud Functions here: Calling Cloud Functions. With them, you should be able to configure the timing of the execution.
Besides that, it might be worth raising a Feature Request with Google to verify the possibility of adding this in a future release.
Let me know if this information clarifies things!
I think there is a way.
With Pub/Sub you can capture logs from Stackdriver (docs). Those services allow you to store only the logs related to the deployment of a Cloud Function.
The store could be, for instance, Cloud Firestore. As you may know, there is a trigger available for Cloud Firestore events.
Finally, every time an event log related to a function's deployment is generated, it will be stored and will trigger a function attached to that event. In that function, you can parse or filter the logs.
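A sketch of that pipeline, assuming a Logging sink that exports Cloud Functions deployment audit logs to a Pub/Sub topic (the sink name, topic name, and log filter are placeholders to adapt). For simplicity, this sketch triggers directly off the Pub/Sub topic instead of staging the entries through Firestore first:

// Created once outside the code, for example:
//   gcloud logging sinks create function-deployments \
//     pubsub.googleapis.com/projects/PROJECT_ID/topics/function-deployments \
//     --log-filter='resource.type="cloud_function" AND protoPayload.methodName:"UpdateFunction"'

const functions = require('firebase-functions');

exports.onFunctionDeployed = functions.pubsub
  .topic('function-deployments')
  .onPublish((message) => {
    const entry = message.json; // the exported Stackdriver log entry
    console.log('Redeployed:', entry.resource.labels.function_name);
    // run the auxiliary logic here
  });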
I came across a unique use case where the following feature would come in very useful.
In essence, I have components that are listening for specific changes in a document. My security rules are set up so that reads are allowed but all writes are disabled, since database updates are only possible through a Cloud Function. So I was researching the docs to find out whether it is possible to do something like this:
/**
 * This update should only happen locally / offline, in order to update the
 * state of the components that are listening to changes in this document.
 */
myDocumentRef.update({ name: 'newname' });

/**
 * Then a Cloud Function is called that performs validation and, if the data
 * is valid, updates the document with the new data (the same data we set
 * offline above). Once that is done, the listening components receive the
 * new data. If unsuccessful, we would set the document back to the old data
 * (again offline) and show an error.
 */
await myCloudFunction();
Hence my question: is it possible to perform that update (and only that update) locally / offline?
This is basically an "optimistic update" flow, utilizing Firebase as a global store of sorts.
Firestore does not provide a concept of "local only" updates. All writes will be synchronized with the server at the earliest convenience. If a write eventually fails, the promise returned by the API call (if still in memory) will be rejected. If the sync happens after the app restarts, the promise is lost, and you have no way of knowing if the write succeeded. Local writes that eventually fail will be reverted in the local cache.
What you will have to do instead is write your data in such a way that the local listener can tell the stage of the write, possibly by writing metadata into the document to indicate that stage, cooperating with the Cloud Function to keep it up to date.
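For example, a sketch of what the listener side of that could look like (the status field and its 'confirmed' value are placeholders the Cloud Function would maintain; the render helper is hypothetical):

myDocumentRef.onSnapshot({ includeMetadataChanges: true }, (snapshot) => {
  // True while the optimistic local write has not reached the server yet.
  const pending = snapshot.metadata.hasPendingWrites;
  // Stamped by the Cloud Function once validation succeeds (placeholder field).
  const confirmed = snapshot.get('status') === 'confirmed';
  updateComponentState(snapshot.data(), { pending, confirmed });
});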
I have successfully used onnotificationclick to handle actions, and so far the only thing I can do is clients.openWindow().
I tried using importScripts() to include the Firestore JS SDK and write something to Firestore; it doesn't work.
I tried using importScripts() to include the Functions JS SDK and invoke functions as described at https://firebase.google.com/docs/functions/callable; that doesn't work either.
Chrome throws "Illegal invocation". I believe this is because the JS SDK uses something that isn't available in a service worker (e.g. XHR)?
Can I use fetch() to interact with the Firestore REST endpoint or invoke Cloud Functions?
FYI, I am trying to implement a Mute or Archive action that requires invoking some Cloud Function or updating the Firestore DB.
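For what it's worth, a hedged sketch of the fetch() route from inside the service worker, using the callable functions wire protocol (a plain JSON POST with the payload under a "data" key); the function name, region, project ID, and action name are placeholders:

self.addEventListener('notificationclick', (event) => {
  event.notification.close();
  if (event.action === 'mute') {
    event.waitUntil(
      fetch('https://us-central1-my-project.cloudfunctions.net/muteThread', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // Callable functions expect the argument wrapped in a "data" field.
        body: JSON.stringify({ data: { threadId: event.notification.tag } }),
      })
    );
  }
});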