I have a function that fires on an onUpdate trigger from my Cloud Firestore database.
The function is not being called at all when I change my database.
I did not deploy the function using the Firebase CLI; instead, I deployed it using the GCP Console.
Here is the function:
exports.NotificationListener = functions
    .firestore
    .document('users/{userId}')
    .onUpdate((change, context) => {
        const userId = context.params.userId.toString();
        const eventId = context.eventId;
        console.log('EventId:' + eventId);
        console.log('Change in:' + userId);
        return 200; // a background trigger only needs to return a value or promise
    });
Here is the deployment information from the GCP console (showing the trigger):
Finally, here is the Cloud Firestore schema:
I want to monitor any changes to any "USER" in the collection "/users", hence I am using "users/{userId}".
Why is this function not being called when I change the database?
EDIT 1
A little information about my environment:
I have my entire project core in a TypeScript file. I have over 40 HTTPS-triggered functions that are currently online.
I add a new function to my TS file, then run npm run build to compile and get the JS file.
Finally, I go to the Google Cloud Console, create a function, choose "ZIP Upload", and upload the compiled JS file (along with the required JSON files for the database URL, authentication, etc.).
This approach works perfectly fine, at least for HTTP-triggered functions.
I repeated the same steps as above for the onUpdate trigger, except that instead of choosing an HTTP trigger, I chose a Cloud Firestore trigger. The trigger information can be found above in the screenshot.
onUpdate is not being fired on DB changes.
EDIT 2
My event-triggered function NotificationListener is showing up in the Firebase console functions list along with my other 40 HTTPS functions. But it is not being called.
@doug-stevenson, your answer seems to have disappeared, I am not sure why.
Anyway, I found the reason why it wasn't working.
My Firebase database was in project "Placeholder 1" and my GCP functions were in project "Placeholder 2".
Now, I was able to update the "Placeholder 1" DB from the GCP functions (in "Placeholder 2") using the firebase-functions API, because I set the databaseURL to "Placeholder 1".
But just setting the databaseURL to the desired database doesn't work if you want to LISTEN to the database for changes. You actually need to have the function in the same project, otherwise it is not able to subscribe and listen for events.
I think it's a little inconsistent that you can read/write to a DB from a different project, but to listen for events, the function needs to be in the same project.
Or maybe I am missing something fundamental that caused this confusion for me.
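For reference, here is a minimal sketch of the cross-project read/write setup described above (not the project's actual code; the project IDs, key filename, and document path are placeholders):

const admin = require('firebase-admin');

// Service account key from the "Placeholder 1" project (placeholder filename).
admin.initializeApp({
    credential: admin.credential.cert(require('./placeholder-1-service-account.json')),
    databaseURL: 'https://placeholder-1.firebaseio.com'
});

// Reads and writes work cross-project like this; Firestore *triggers* do not.
admin.firestore().doc('users/someUser').get()
    .then(snap => console.log(snap.data()));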
Related
I am trying to deploy a Cloud Function, written in Python and deployed with the Cloud Console, which is triggered whenever a document is added to a subcollection.
I have specified the path as
users/{userID}/contactDump/{dumpID}
which should mean that whenever a new document is added to the contactDump subcollection for any user, the function should trigger.
According to Google's documentation, this is a valid path structure. Their own documentation has the following:
users/{username}/addresses/{addressId}: valid trigger. Monitors all address documents.
But whenever I deploy the function (which happens fine), the path gets changed to just {dumpID}.
N.B. the same happens if I try to deploy a function triggered on users/{userID}; after deployment the path gets changed to just {userID}.
This issue only happens in the Cloud Console, and it's easy to reproduce by creating a new function with a Firestore trigger with the following setup:
Once the function is deployed, the trigger section shows that the path is not the one added during the function trigger creation; the one shown is just {addressId}.
After getting this result, it made me think that this is the same issue you are facing: although we are using different event types, the document path seems to be changed and only shows the last part of the path we entered in the first place, in this case {addressId}.
Since you said the function was working fine, I wanted to inspect the Cloud Function's data by performing a GET request using the API Explorer from the Cloud Functions documentation, passing the details of the created Cloud Function in the request parameters.
After executing the API call, the response will contain the eventTrigger with the full trigger path.
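Assuming the v1 API, the request looks roughly like this (project ID, region, and function name are placeholders):

curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://cloudfunctions.googleapis.com/v1/projects/PROJECT_ID/locations/REGION/functions/FUNCTION_NAME"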
In the example before, the result is:
{
  "eventTrigger": {
    "eventType": "providers/cloud.firestore/eventTypes/document.create",
    "resource": "projects/<PROJECT_ID>/databases/(default)/documents/users/{username}/addresses/{addressId}",
    "service": "firestore.googleapis.com",
    "failurePolicy": {}
  }
}
The “eventTrigger” from the response contains the original path, users/{username}/addresses/{addressId}, not the one displayed in the console.
This makes me think that the shortened path we see in the Cloud Console doesn't mean the path has been changed; apparently the console only shows the part after the last "/" of the path, even though the path stored in the Cloud Function is the full one.
Since this looks like an issue with how the console displays the path, I've raised this public issue.
If you need to recover the original path, you can use the API, or the gcloud tool:
gcloud functions describe FUNCTION_ID --format 'value(eventTrigger.resource)'
I have a Firebase project with over 10 cloud functions. Today I added 3 more, but all three of the new ones are throwing an UNAUTHENTICATED error without the calls ever reaching the functions (there's no record of a function call in the logs). I haven't made any changes to my environment or Firebase project since adding the previous functions.
I have tried redeploying all of the functions in my project, the three newest ones continue to fail while the previous ones work fine. I have also verified that I am using Node 8, since there are some similar issues reported stemming from using Node 10. I am not sure what else to try since the issue only applies to the new functions.
The new functions all have a similar signature:
exports.createGroup = functions.https.onCall((data, context) => {
    // Firestore access
});
They are being called like this:
const create = functions().httpsCallable('createGroup');
return create({ group: oGroup }).then(result => {
    // Do something
}).catch(err => {
    console.log(err.message); // UNAUTHENTICATED
});
Node Version: 8.16.2
Firebase Tools Version: 7.6.1
2023 UPDATE
1. Go to the Google Cloud Console.
2. Click the checkbox next to the function to which you want to grant access.
3. Click Permissions at the top of the screen. The Permissions panel opens.
4. Click Add principal.
5. In the New principals field, type allUsers.
6. Select the role Cloud Functions > Cloud Functions Invoker from the Select a role drop-down menu.
7. Click Save.
It must be solved in your GCP Console (not Firebase). Just follow these steps:
Go to your GCP Console and log in.
On the top menu, select the corresponding Firebase project.
On the left menu, go to Cloud Functions.
Click the checkbox of your function (not the name of the function).
Once selected, on the right menu select "Add member".
Under "New member", type allUsers.
On the role selection bar, select Cloud Functions -> Cloud Functions Invoker.
Click "Save" and then "Allow public access" on the pop-up warning.
And you're good to go!
Via the Google Cloud docs:
As of January 15, 2020, HTTP functions require authentication by default. You can specify whether a function allows unauthenticated invocation at or after deployment.
The solution was to utilize the Google Cloud console (NOT the Firebase console) to add the allUsers permission to the newly created functions.
https://cloud.google.com/functions/docs/securing/managing-access-iam#allowing_unauthenticated_function_invocation
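If you prefer the command line, the same IAM binding can be added with gcloud (a sketch assuming a 1st-gen function; the function name and region are placeholders):

gcloud functions add-iam-policy-binding FUNCTION_NAME \
    --region=REGION \
    --member=allUsers \
    --role=roles/cloudfunctions.invoker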
I ran into this error when deploying several functions; there was an issue during the upload where it seems the deployment got corrupted, perhaps network-related, I'm not sure.
First I tried to just redeploy the functions, but that was not enough; the error continued.
After deleting the offending Cloud Functions in the Firebase console and then doing a fresh, successful deploy for them, the "FirebaseFunctionsException UNAUTHENTICATED" error went away.
From the documentation for 2nd gen Cloud Functions:
Go to the Google Cloud console
Click the linked name of the function to which you want to grant access.
Click the Powered By Cloud Run link in the top right corner of the Function details overview page.
Click Trigger and select Allow unauthenticated invocations.
Click Save.
Took me forever to find this after trying to add allUsers and getting "Principals of type allUsers and allAuthenticatedUsers cannot be added to this resource".
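Alternatively, since a 2nd-gen function is backed by a Cloud Run service, the same binding should be achievable with gcloud (a sketch; the service name and region are placeholders, and the service name usually matches the function name in lowercase):

gcloud run services add-iam-policy-binding SERVICE_NAME \
    --region=REGION \
    --member=allUsers \
    --role=roles/run.invoker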
I'm sure these are common scenarios, but after researching for some hours, I couldn't really find what the common practice is. Maybe someone with more experience in Firebase can point me in the right direction.
I have two particular scenarios:
1. Code that runs once
Example 1: adding new data to all users in firestore, which is needed for a new feature
Example 2: start duplicating data into existing documents
I currently write the code in a Cloud Function, run it on a Firestore event (onUpdate of a "hidden" document), and then immediately delete the function if everything goes well.
I also increase the timeout and memory for this function, as the idea is to potentially update millions of documents.
2. Manually trigger a function from the firebase console (or command line)
Example: give a user admin privileges in the app (a function that sets custom claims and Firestore data). We don't have time to implement a back office, so doing this from the Firebase web portal/console would be ideal, specifying the user ID.
My current solution is to use an HTTPS function and run it from the GCP portal (on the function's "Testing" tab, which lets you pass a JSON payload). BUT the function can be triggered publicly, which I don't really like...
What are the common practices for these scenarios?
To expand on my comment: if you want to create a Node script to run one-off code, you just write your JS code like for any Cloud Function, but then run it immediately. Something like this:
const admin = require('firebase-admin');

// Picks up the credentials pointed to by GOOGLE_APPLICATION_CREDENTIALS
admin.initializeApp();
const db = admin.firestore();

db.collection('users')
    .where('myField', '==', true)
    .get()
    .then(querySnapshot => {
        querySnapshot.docs.forEach(doc => {
            // update doc
        });
    });
If you save this as script.js and execute it with node script.js, you'll be pointed towards downloading a JSON file with credentials. Follow the instructions, and you can then run the script again; now you're running your own code against Firestore from the command line.
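Concretely, the flow looks something like this (the key filename is a placeholder):

# Download a service account key (Project settings > Service accounts in the Firebase console),
# point the environment variable at it, and run the script:
export GOOGLE_APPLICATION_CREDENTIALS="./serviceAccountKey.json"
node script.js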
For administrative type operations, you're better off just running them on your desktop or some other server you control. Cloud Functions is not well suited for long running operations, or things that must just happen once on demand.
Case 1 really should be managed by a standalone program or script that you can monitor by running it on your desktop.
Case 2 can be done a number of ways, such as building your own admin web site. But you might find it easiest to mirror the contents of a document to custom claims using a Firestore trigger. Read this: https://medium.com/firebase-developers/patterns-for-security-with-firebase-supercharged-custom-claims-with-firestore-and-cloud-functions-bb8f46b24e11
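The pattern from that article boils down to something like the following sketch (not the article's exact code; the userClaims collection name is an assumption):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Mirror whatever is written at userClaims/{uid} into that user's custom claims.
exports.mirrorClaims = functions.firestore
    .document('userClaims/{uid}')
    .onWrite((change, context) => {
        const claims = change.after.exists ? change.after.data() : {};
        return admin.auth().setCustomUserClaims(context.params.uid, claims);
    });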
Does anyone know if there is an easy way to trigger a function every time I re-deploy some function to Firebase?
I have a specific Firebase function which I define inside GCP (this way, when I do "firebase deploy", it doesn't re-deploy, uninstall, or touch my current function in any form),
but sometimes I might update this function manually on GCP, and I would like to trigger an inner function of its code every time that happens... is it possible?
ex:
exports.decrementAction = (req, res) => {
    /* do stuff */
    res.status(200).send("ok");
};

function auxiliary() {
    // to be called on re-deploy
}
Unfortunately, there isn't an easy way to trigger a function from within the code that is being redeployed. Since that code is only being deployed at that moment, this can't be done automatically.
The alternative would be to keep this function separate from the "root" function at deploy time and use triggers to run this other Cloud Function when the first one is redeployed. This way, it would be possible to run it based on the deployment of the other.
You can get more information on the triggers available for Cloud Functions here: Calling Cloud Functions. With them, you should be able to configure the timing of the execution.
Besides that, it might be worth raising a feature request for Google to verify the possibility of adding this in future releases.
Let me know if this information clarified things!
I think there is a way.
With Pub/Sub you can capture logs from Stackdriver (docs). Those services allow you to store only the logs related to the deployment of a Cloud Function.
The store could be, for instance, Cloud Firestore. As you may know, there is a trigger available for Cloud Firestore events.
Finally, every time an event log related to a function's deployment is generated, it will be stored and will trigger a function attached to that event. In that function, you can parse or filter the logs.
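A sketch of the log-routing step, assuming a 1st-gen function and a Pub/Sub destination (the sink name, topic, and filter are illustrative, and the exact audit-log method name may differ in your setup):

gcloud logging sinks create function-deploys \
    pubsub.googleapis.com/projects/PROJECT_ID/topics/function-deploys \
    --log-filter='resource.type="cloud_function" AND protoPayload.methodName:"UpdateFunction"'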
I was hoping to trigger a Pub/Sub function (using functions.pubsub / onPublish) whenever a new Pub/Sub message is sent to a topic/subscription in a third-party project, i.e. cross-project.
After some research and experimentation, I found that TopicBuilder throws an error if the topic name contains a / and that it defaults to "projects/" + process.env.GCLOUD_PROJECT + "/topics/" + topic (https://github.com/firebase/firebase-functions/blob/master/src/providers/pubsub.ts).
I also found a post on Stack Overflow that says that "Firebase provides a (relatively thin) wrapper around Google Cloud Functions"
(What is the difference between Cloud Function and Firebase Functions?)
This led me to look into Google Cloud Functions. While I was able to create a subscription in a project I own to a topic in a third-party project (after changing permissions in IAM), I could not find a way to associate a function with the topic. Nor was I successful in associating a function with a topic and subscription in a third-party project. In the console I only see the topics in my own project, and I had no success using gcloud.
Has anyone had any success in using a function across projects and, if so, how did you achieve this? Is there a documentation URL you could provide? If a function can't be triggered by a message to a topic and subscription in a third-party project, can you think of a way that I could ingest third-party Pub/Sub data?
As Pub/Sub fees are billed to the project that contains the subscription I would prefer that the subscription resides in the third-party project with the topic.
Thank you
Google Cloud Functions currently does not allow a function to listen to a resource in another project. For Cloud Pub/Sub triggers specifically, you could get around this by deploying an HTTP function and adding a Pub/Sub push subscription on the topic that you want to fire that cross-project function.
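For instance, the push subscription could be created along these lines (a sketch; the project IDs, topic, and function URL are placeholders, and you need the appropriate permissions on the third-party topic):

gcloud pubsub subscriptions create cross-project-push \
    --topic=projects/OTHER_PROJECT/topics/THEIR_TOPIC \
    --push-endpoint=https://REGION-MY_PROJECT.cloudfunctions.net/myHttpFunction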
A Google Cloud Function can't be triggered by a subscription to a topic of another project (since you can't subscribe to another project's topic).
But a Google Cloud Function can publish to a topic of another project (and then subscribers of this topic will be triggered).
I solved it by establishing a Google Cloud Function in the original project which listens to the original topic and reacts by publishing to a new topic in the other project. For that, the service account (...@appspot.gserviceaccount.com) of this function "in the middle" needs to be authorized on the new topic (console.cloud.google.com/cloudpubsub/topic/detail/...?project=...), i.e. add a principal with the role "Pub/Sub Publisher".
import json
import os

from google.cloud import pubsub_v1

# https://cloud.google.com/functions/docs/calling/pubsub?hl=en#publishing_a_message_from_within_a_function

# Instantiates a Pub/Sub client
publisher = pubsub_v1.PublisherClient()


def notify(event, context):
    project_id = os.environ.get('project_id')
    topic_name = os.environ.get('topic_name')

    # References an existing topic
    topic_path = publisher.topic_path(project_id, topic_name)

    message_json = json.dumps({
        'data': {'message': 'here would be the message'},  # or pass along the message from event/context
    })
    message_bytes = message_json.encode('utf-8')

    # Publishes a message
    try:
        publish_future = publisher.publish(topic_path, data=message_bytes)
        publish_future.result()  # Verify the publish succeeded
        return 'Message published.'
    except Exception as e:
        print(e)
        return (e, 500)
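The deployment of this middle function could look roughly like this (a sketch; the runtime, topic, and env var values are placeholders):

gcloud functions deploy notify \
    --runtime=python39 \
    --trigger-topic=ORIGINAL_TOPIC \
    --set-env-vars=project_id=OTHER_PROJECT,topic_name=NEW_TOPIC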
Google Cloud Endpoints can be an easier solution for adding auth to the HTTP function.