Is there any way to manually trigger a scheduled function and/or a Firestore trigger function? I have two scenarios I need to solve:
A cloud function that is listening to a Firestore document (onCreate) didn't fire - it failed on 3 of about 1,000 invocations, so I need to manually trigger it for these 3 documents. Is this possible (to manually trigger this function)?
I have a scheduled function that runs hourly, but it threw an error because a field in the Firestore document was a map when the code expected an array. Is there any way I can manually run the scheduled function once rather than waiting an hour for it to run again?
-- Firebase console
-- Functions
-- "..." at the right side of the cron job function
-- "View in Cloud Scheduler"
-- "Run now" at the right side of the function
You can run a scheduled function locally via firebase-tools. Starting the shell, e.g. npm run build && firebase functions:shell, will allow you to invoke a scheduled function such as:
export const parseGarminHealthAPIActivityQueue = functions.region('europe-west2').runWith({
  timeoutSeconds: TIMEOUT_IN_SECONDS,
  memory: MEMORY
}).pubsub.schedule('every 10 minutes').onRun(async (context) => {
  await parseQueueItems(ServiceNames.GarminHealthAPI);
});
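Once the shell is up, you can then invoke the function by name from the prompt, passing an empty object as the Pub/Sub payload (the exact invocation can vary with the firebase-tools version):

firebase > parseGarminHealthAPIActivityQueue({})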
It's not possible to manually trigger a function from the Firebase console. Your best bet is to use the methods shown in the Cloud documentation, which involve using gcloud's call command or the Cloud console's Testing tab. Neither of these is very easy, as you will have to construct the function's JSON payload manually.
If I may make a suggestion - if your functions are failing due to errors, you should consider enabling retry on your functions, and making sure that your functions only generate errors for situations that should be retried. Depending on manual invocation in the event of a failure will not scale very well - errors should be handled by code as much as possible.
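As a sketch of that retry suggestion: in the Firebase SDK for Cloud Functions, a background function can opt into retries via the failurePolicy runtime option (the document path and helper below are made up for illustration):

const functions = require('firebase-functions');

exports.onOrderCreated = functions
  .runWith({ failurePolicy: true }) // retry if the function throws or returns a rejected promise
  .firestore.document('orders/{orderId}')
  .onCreate(async (snapshot, context) => {
    const order = snapshot.data();
    // Only throw for transient problems you actually want retried;
    // catch and log permanent errors so the retry loop doesn't spin for the whole retry window.
    await processOrder(order); // hypothetical helper
  });

With retries enabled, it becomes especially important that permanent failures are caught and logged rather than re-thrown.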
Usually functions are deployed via the CLI by calling firebase deploy --only functions:my_function.
Is it possible to deploy functions programmatically (hence dynamically)?
In my use case I would like to re-schedule a Pub/Sub function to run after a specific amount of time, relative to the current execution time, rather than at a regular interval.
The same way as setTimeout would work (rather than a setInterval), but without having a process running and waiting to call the function.
What would be the drawbacks?
What would be alternative ways to achieve a similar result with what Firebase provides?
You already deploy that Cloud Function programmatically by issuing a command.
Generally, both repeated and delayed execution are available:
a) Cloud Scheduler crontab receives scheduled Pub/Sub events:
exports.cronjob = functions.pubsub.schedule('0 */12 * * *').onRun(async context => {
...
});
b) Cloud Tasks may be better suited to scheduling a run at a specific time; see the sketch below.
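For the setTimeout-like case, a minimal sketch using Cloud Tasks might look like this (the project, queue and target URL are assumptions; it enqueues a single HTTP call scheduled relative to "now"):

const { CloudTasksClient } = require('@google-cloud/tasks');

async function scheduleOneShot(delaySeconds, payload) {
  const client = new CloudTasksClient();
  // Assumed project, region and queue names.
  const parent = client.queuePath('my-project', 'europe-west2', 'my-queue');

  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url: 'https://europe-west2-my-project.cloudfunctions.net/myHttpFunction', // assumed target function
      headers: { 'Content-Type': 'application/json' },
      body: Buffer.from(JSON.stringify(payload)).toString('base64'),
    },
    // Run once, delaySeconds from now - the setTimeout-like part.
    scheduleTime: { seconds: Math.floor(Date.now() / 1000) + delaySeconds },
  };

  const [response] = await client.createTask({ parent, task });
  return response.name;
}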
It's really bizarre that Firebase doesn't seem to work quite like a typical Express app. Whatever I write in Express and copy-paste into Firebase Functions typically throws an error. There is one I can't figure out on my own, though.
This endpoint is designed to start a function and live long enough to finish an even longer task. The request is a webhook (send docs, we will transform them and, when it's done, ping you at another specified webhook). A very simplified example below:
router.post('/', (req, res) => {
  try {
    generateZipWithDocuments(data) // on purpose it's not async so request can return freely
    res.sendStatus(201)
  } catch (error) {
    res.send({ error })
  }
})
On my local machine it works (both as a pure Express app and in locally emulated Firebase Functions), but in the cloud it has problems, and even though I put in a cavalcade of console.log() calls, I don't get much information. No error from Firebase.
If generateZipWithDocuments() is not asynchronous res.sendStatus() will be immediately executed after it, and the Cloud Function will be terminated (and the job done by generateZipWithDocuments() will not be completed). See the doc here for more details.
You have two possibilities:
You make the call asynchronous and wait until its job is complete before sending the response. You would typically use async/await for that. Note that the maximum execution time for a Cloud Function is 9 minutes.
You delegate the long-running job to another Cloud Function and then send the response. To delegate the job to another Cloud Function, you should use Pub/Sub. See Pub/Sub triggers, the sample quickstart, and this SO thread for more details on how to implement that. In the Pub/Sub-triggered function, when the job is done, you can inform the user via an email, a notification, the update of a Firestore document on which you have set a listener, etc. If generateZipWithDocuments() takes a long time, this is clearly the most user-friendly option. A rough sketch of both options follows.
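A minimal sketch, assuming generateZipWithDocuments() returns a promise (the topic name and exported function names are illustrative, not from the original code):

// Option 1: await the work before responding (fine if it finishes well within 9 minutes).
router.post('/', async (req, res) => {
  try {
    await generateZipWithDocuments(req.body);
    res.sendStatus(201);
  } catch (error) {
    res.status(500).send({ error: error.message });
  }
});

// Option 2: the HTTP function only enqueues the job and responds immediately;
// a Pub/Sub-triggered function does the heavy lifting.
const functions = require('firebase-functions');
const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

exports.requestZip = functions.https.onRequest(async (req, res) => {
  await pubsub.topic('generate-zip').publishMessage({ json: { docs: req.body.docs } });
  res.sendStatus(201);
});

exports.generateZip = functions.pubsub.topic('generate-zip').onPublish(async (message) => {
  await generateZipWithDocuments(message.json.docs);
  // Notify the caller here: call their webhook, send an email, update a Firestore doc, ...
});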
Does anyone know if there is an easy way to trigger a function every time I re-deploy some function to Firebase?
I have a specific Firebase function which I define inside GCP (this way, when I run "firebase deploy", it doesn't re-deploy, uninstall, or touch my current function in any form),
but sometimes I might update this function manually on GCP, and I would like to trigger an inner function of its code every time that happens... is it possible?
ex:
exports.decrementAction = (req, res) => {/*do stuff*/res.status(200).send("ok")};
function auxiliary(){
//to be called on re-deploy
}
Unfortunately, there isn't an easy way to trigger a function from within the code that is being redeployed. Since that code is only being deployed at that moment, this can't be done automatically.
The alternative would be to keep this function separate from the "root" function at deploy time and use triggers to run this other Cloud Function when the first one is redeployed. That way it would be possible to run it based on the deployment of the other.
You can get more information on the triggers available for Cloud Functions here: Calling Cloud Functions. With them, you should be able to configure the timing for the execution.
Besides that, it might be worth raising a Feature Request for Google to consider adding this in a future release.
Let me know if this information helps!
I think there is a way.
With Pub/Sub you can capture logs from Stackdriver (docs). Those services allow you to store only the logs related to the deployment of a Cloud Function.
The store could be, for instance, Cloud Firestore. As you may know, there is a trigger available for Cloud Firestore events.
Finally, every time an event log related to a function's deployment is generated, it will be stored and will trigger a function attached to that event. In that function you can parse or filter the logs.
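A minimal sketch of that idea, assuming a Cloud Logging (Stackdriver) sink already routes Cloud Functions deployment audit logs to a Pub/Sub topic (the topic name and the log fields checked below are assumptions):

const functions = require('firebase-functions');

// Assumes a log sink publishes Cloud Functions audit log entries to this topic.
exports.onFunctionDeployed = functions.pubsub
  .topic('function-deployments')
  .onPublish((message) => {
    const entry = message.json; // the sink delivers each log entry as JSON
    const methodName = entry.protoPayload && entry.protoPayload.methodName;
    if (methodName && /CreateFunction|UpdateFunction/.test(methodName)) {
      // run the auxiliary() logic from the question here
    }
    return null;
  });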
In our Firebase application there is a list with lots of items in the Realtime Database. Every create, update and delete operation on a single item is processed by a Firebase Cloud Function with an onWrite trigger (in the simplest case this function just counts items). But sometimes there is a need for a bulk operation on the items without individual processing. Let's say we want to remove all items and reset the counters in a single transaction.
Earlier this worked just fine: due to the limit of 1,000 on the number of Cloud Functions triggered by a single write (https://firebase.google.com/docs/database/usage/limits), no functions were triggered at all, which was the desired outcome.
Now, without any change to the application code, we get an error:
Error: TOO_MANY_TRIGGERS: This request would cause too many functions to be triggered.
The same error appears in the client application, the Admin API, and even when importing JSON using the web interface. The only option that works for us is processing the items in batches, but that is not transactional and takes up to tens of minutes instead of milliseconds as before.
What options do we have to bypass this error? Ideally this would be some switch to skip function triggering when the limit is exceeded.
For anybody reading this question post-2018, there is now an option to disable strict enforcement for trigger limits.
Strict validation is enabled by default for write operations that trigger events. Any write operations that trigger more than 1000 Cloud Functions or a single event greater than 1 MB in size will fail and return an error reporting the limit that was hit. This might mean that some Cloud Functions aren't triggered at all if they fail the pre-validation.
If you're performing a larger write operation (for example, deleting your entire database), you might want to disable this validation, as the errors themselves might block the operation.
To turn off strictTriggerValidation, follow these steps:
Get your Database secret from the Service accounts tab of your Project settings in the Firebase console.
Run the following CURL request from your command line:
curl -X PUT -d "false" https://NAMESPACE.firebaseio.com/.settings/strictTriggerValidation/.json?auth=SECRET
See here for the docs: https://firebase.google.com/docs/database/usage/limits
There is currently no way to prevent triggers from running in special circumstances. The only way around this is to undeploy all your triggers, perform your updates, then deploy all your triggers again.
I would encourage you to file a feature request for this.
I just got this error message in an older, Flutter project that I hadn't touched in quite some time.
[firebase_database/unknown] TOO_MANY_TRIGGERS: This request would cause too many functions to be triggered.
It turned out that here it was caused by the fact that my Cloud Functions were still set to use Node v8, which was retired in early 2021.
Upgrading the Cloud Functions to use Node v12 (no other changes needed) removed the error message for me.
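For reference, the functions runtime is picked up from the engines field in functions/package.json (the exact version to use depends on what is currently supported); bumping it there and redeploying is usually all that's needed:

{
  "engines": {
    "node": "12"
  }
}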
Turning off strictTriggerValidation solved my issue.
If you are using the Firebase CLI, you can follow these steps.
Turn off strictTriggerValidation for the entire project:
macOS:
sudo firebase database:settings:set strictTriggerValidation false --project *my_project_id*
If you need to turn it off for a particular instance:
macOS:
sudo firebase database:settings:set strictTriggerValidation false --project *my_project_id* --instance *my_instance_name*
Check instances:
sudo firebase database:instances:list --project *my_project_id*
Note: Windows users, please try without sudo.
For your reference:
Limitations: https://firebase.google.com/docs/database/usage/limits
Firebase CLI Commands: https://firebaseopensource.com/projects/firebase/firebase-tools/
Is there a way to disable a Cloud Function for Firebase through the Firebase dashboard?
I deployed a Cloud Function with a bug which caused an infinite loop: the function was triggered, updated the data, and was then triggered again. I discovered the error quickly, but I had to fix the code and redeploy the entire project to get the function to stop triggering.
Even though I deployed the new function, the deployment took some time and the function was triggered hundreds of times (which actually caused others to be triggered hundreds of times).
I'd like to be able to disable a function immediately when this happens, but I don't see any options in the dashboard or through the Firebase CLI.
If you view Cloud Functions in the Cloud Console, you can delete them individually from there: https://console.cloud.google.com/functions
What if I don't want to delete the function, as I want to keep the usage history, logs, health, etc.?
This workaround is long-winded, but it does the trick:
Disable the function:
comment out the code in the function in your index.js
deploy just that function:
firebase deploy --only functions:functionName
Enable the function:
uncomment the code
redeploy just the function with the command above
Unfortunately Firebase has only a delete option and no disable option :(
A thing that I'm doing which isn't particularly neat, but does the job, is to just add a node in the database. I have a weekly script, and I don't want my Cloud Functions to run while it's running, so at the top of each function I read that node and, if the script is running, I just return early. Not ideal, but it saves me having to comment out and redeploy every time. Something like the sketch below.
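A minimal sketch of that guard, assuming the flag lives at /flags/weeklyScriptRunning (the path and function names are made up):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.onItemWrite = functions.database
  .ref('/items/{itemId}')
  .onWrite(async (change, context) => {
    // Bail out early while the weekly bulk script is running.
    const flag = await admin.database().ref('/flags/weeklyScriptRunning').once('value');
    if (flag.val() === true) {
      return null;
    }
    // ... normal processing ...
  });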
For me the fastest way is to edit the function code directly in the Google Cloud Console editor. In the case of an HTTP function, add something like this at the beginning of the handler:
res.status(500).send('The function is disabled');
return;
I use a solution similar to Red Baron's. I have a Firestore collection of booleans (one for each function) and I check that boolean at the beginning of my function to determine whether it's allowed to run. The function will indeed be called, but it won't do anything if that boolean is set to false. It's not a perfect solution because it doesn't completely disable the function, but at least it retains the log history.