Google Cloud Tasks not calling HTTP requests - Firebase

I have created a Google Cloud Task, but the queue keeps retrying and the target function is never invoked.
This is the log from the cloud console.
attemptResponseLog: {
  attemptDuration: "0.133874s"
  dispatchCount: "19"
  maxAttempts: 0
  responseCount: "0"
  retryTime: "2020-06-21T21:20:18.518655Z"
  scheduleTime: "2020-06-21T21:20:15.718098Z"
  status: "UNAVAILABLE"
  targetAddress: "POST some url"
  targetType: "HTTP"
}

I ran into this same error, and I must say that the documentation is not clear enough.
WARNING: there seems to be some latency before role changes are taken into account, especially the Service Account User one.
I ran multiple tests and tried to keep the rights as minimal as possible, so I would remove a right, test again, see it still work (great, that right isn't necessary!), then come back later and find the whole thing broken.
Here is my setup:
I use Cloud Scheduler to trigger a Cloud Function every 15 minutes by posting a message to a Pub/Sub topic.
That Cloud Function builds a list of tasks to compute stats on MySQL and creates the tasks.
Another Cloud Function runs SQL queries to get the stats and stores the results in Firestore.
I use Cloud Tasks so that the load on MySQL does not get too heavy.
Below, I use functional names to make it easy to understand.
TaskCreatorCloudFunction runs as TaskCreatorServiceAccount.
TaskCreatorServiceAccount requires:
- the "Cloud Tasks Enqueuer" role #1
- being a Service Account User on CloudTaskComputeStatsServiceAccount (see below) #2
- the roles needed to do its job (read SQL to build the list of tasks to create, write logs, access Secret Manager, listen to Pub/Sub since it is triggered by Cloud Scheduler via Pub/Sub)
TaskImplementationCloudFunction (HTTP) runs as TaskImplementationServiceAccount.
TaskImplementationServiceAccount has no Cloud Tasks-specific role, only the ones needed to do its job (read SQL, write logs, access Secret Manager, write to Firestore).
The task queue is named "compute-stats-on-mysql".
I've created a dedicated service account named CloudTaskComputeStatsServiceAccount #3
CloudTaskComputeStatsServiceAccount has the specific rights for the whole thing to work:
- Cloud Functions Invoker #4
- added as a Service Account User on TaskImplementationServiceAccount #5
To do the last one in the console (script version below), you need to:
- go to IAM -> Service Accounts
- check TaskImplementationServiceAccount
- in the upper right corner, click "Show Info Panel" if it's not already displayed
- click "Add Member"
- paste the full name of CloudTaskComputeStatsServiceAccount
- choose "Service Account User" as the role
- Save
You can edit this in the console, but it's better to script it.
gcloud tasks queues create compute-stats-on-mysql \
--max-dispatches-per-second=10 \
--max-concurrent-dispatches=15 \
--max-attempts=2 \
--min-backoff=1s
#3
gcloud iam service-accounts create CloudTaskComputeStatsServiceAccount \
  --description="Service Account for the cloud task compute-stats-on-mysql" \
  --display-name="Service Account for the cloud task compute-stats-on-mysql"
#4
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member serviceAccount:CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com \
  --role "roles/cloudfunctions.invoker"
#1
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member serviceAccount:TaskCreatorServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com \
  --role "roles/cloudtasks.enqueuer"
#5
gcloud iam service-accounts add-iam-policy-binding TaskImplementationServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com \
  --member="serviceAccount:CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role "roles/iam.serviceAccountUser"
#2
gcloud iam service-accounts add-iam-policy-binding CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com \
  --member="serviceAccount:TaskCreatorServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountUser"
When creating the task, you use CloudTaskComputeStatsServiceAccount in the oidcToken:
const body = Buffer.from(JSON.stringify(data)).toString('base64');
const task = {
  httpRequest: {
    httpMethod: 'POST',
    url,
    oidcToken: {
      serviceAccountEmail: `CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com`,
    },
    headers: {
      'Content-Type': 'application/json',
    },
    body,
  },
};
My understanding is that when you run
const [response] = await cloudTasksClient.createTask({parent, task});
the Cloud Function (Task Creator) needs to create the task and act as "CloudTaskComputeStatsServiceAccount",
and "CloudTaskComputeStatsServiceAccount" needs to have Cloud Functions Invoker and act as the target Cloud Function.

Indeed, it's not a service account issue: it's the OIDC token audience that is missing. It seems that for Cloud Functions this is required. I found two references... you can recreate this problem from the CLI by omitting this argument to gcloud tasks create-http-task:
--oidc-token-audience=OIDC_TOKEN_AUDIENCE
The audience to be used when generating an OpenID Connect token to be
included in the request sent to the target when executing the task.
If not specified, the URI specified in the target will be used
The second reference that popped up, in Ruby, shows the audience field:
https://googleapis.dev/ruby/google-cloud-tasks/latest/Google/Cloud/Tasks/V2/OidcToken.html
With google-cloud-tasks 1.5.0 (Ruby), the task object looks like this, where url_oidc is just the URL of the Cloud Function (i.e. the trigger URL, with no URL parameters):
# Construct the request body.
task = {
  'http_request': { # Specify the type of request.
    'http_method': 'GET',
    'url': url_final, # The full url path that the task will be sent to.
    'oidc_token': {
      'service_account_email': service_account_email,
      'audience': url_oidc
    }
  }
}
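
For comparison, here is a minimal sketch of the same fix with the Python client (google-cloud-tasks 2.x). This is an illustration only; PROJECT_ID, REGION, QUEUE_NAME, url_final, url_oidc and service_account_email are placeholders in the spirit of the examples above:

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path(PROJECT_ID, REGION, QUEUE_NAME)

task = {
    'http_request': {
        'http_method': tasks_v2.HttpMethod.GET,
        'url': url_final,  # the full URL the task will be sent to
        'oidc_token': {
            'service_account_email': service_account_email,
            'audience': url_oidc,  # the bare trigger URL, without parameters
        },
    }
}

response = client.create_task(request={'parent': parent, 'task': task})

The point is the same in every client: set the audience to the Cloud Function's trigger URL, stripped of any query parameters.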

Related

gcp cloud function queue

I have written a GCP Cloud Function with an HTTP trigger ("REST API") that invokes a Cloud Function with a Firebase Realtime Database trigger (onCreate).
The Cloud Function with a Firebase Realtime Database trigger performs REST calls to other services based on received data from the REST API.
I have noticed that the called services sometimes return HTTP 429 (too many requests), since my REST API does not limit how many calls can be received.
The REST API has security measures in place to prevent unauthorised calls to invoke the Cloud Function with a Firebase Realtime Database trigger (onCreate). I do not wish to limit the amount of calls to my API, but rather place all requests in a queue and process them in sequence.
It is important that all calls are processed as promptly as possible. I do not wish to process transactions in 60 second intervals.
In my current solution, every call to the HTTP REST API immediately triggers the GCP Cloud Function via a Firebase Realtime Database insert (onCreate event).
What I would like is to have a queue between my REST API and the Firebase Realtime Database insert (onCreate event) to ensure that only one GCP Cloud Function instance executes at a time.
What is the best way to achieve this functionality?
Kind regards /K
EDIT:
Might Maximum instances be a solution here?
https://cloud.google.com/functions/docs/configuring/max-instances
You can trigger Cloud Functions using Cloud Tasks. Here's an example I use, whereby emails are placed in the Cloud Tasks queue I created and are then processed by the task runner one after another:
import { CloudTasksClient, protos } from '@google-cloud/tasks';

// define and use the following for your task:
// CLOUD_FUNCTION_URL
// TASK_HTTP_METHOD
// TASK_CONTENT_TYPE
// SERVICE_ACCOUNT_EMAIL
// PROJECT_ID
// REGION
// QUEUE_NAME

const client = new CloudTasksClient({ projectId: PROJECT_ID });

/**
 * Build the Cloud Task.
 * In the below example we take a POST body and
 * stringify it before converting to base64.
 */
const convertedPayload = JSON.stringify(payload);
const body = Buffer.from(convertedPayload).toString('base64');

const task: protos.google.cloud.tasks.v2.ITask = {
  httpRequest: {
    url: CLOUD_FUNCTION_URL,
    httpMethod: TASK_HTTP_METHOD,
    body,
    headers: {
      'Content-Type': TASK_CONTENT_TYPE,
    },
    oidcToken: {
      serviceAccountEmail: SERVICE_ACCOUNT_EMAIL,
    },
  },
  scheduleTime: {
    seconds: Math.floor(Date.now() / 1000), // <--- start the task now (in unix time)
  },
};

return client.createTask({
  parent: client.queuePath(PROJECT_ID, REGION, QUEUE_NAME),
  task,
});
You'll also need to configure some IAM permissions for your development and app IAM users, such as Cloud Functions Invoker and Cloud Functions Viewer.
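
Note that the one-at-a-time processing asked about in the question comes from the queue configuration rather than from the task itself. As a hedged sketch (using the Python client for Cloud Tasks here; PROJECT_ID, REGION and QUEUE_NAME are placeholders), a queue that dispatches at most one task at a time can be created like this:

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = f'projects/{PROJECT_ID}/locations/{REGION}'

# max_concurrent_dispatches=1 makes the queue hand tasks to the
# target function strictly one at a time.
queue = {
    'name': f'{parent}/queues/{QUEUE_NAME}',
    'rate_limits': {'max_concurrent_dispatches': 1},
}

client.create_queue(request={'parent': parent, 'queue': queue})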

Deploying Cloud Firestore Trigger function from GCP Console

I have a function that is fired by an onUpdate trigger from my Cloud Firestore database.
The function is not being called at all when I change my database.
I did not deploy the function using the Firebase CLI; instead, I deployed it from the GCP Console.
Here is the function:
exports.NotificationListener = functions
  .firestore
  .document('users/{userId}')
  .onUpdate((change, context) => {
    const userId = context.params.userId.toString();
    const eventId = context.eventId;
    console.log('EventId:' + eventId);
    console.log('Change in:' + userId);
    return 200;
  });
Here is the deployment information from the GCP console (screenshot showing the trigger omitted).
Finally, here is the Cloud Firestore schema (screenshot omitted).
I want to monitor any changes to any user in the collection "users", hence I am using "users/{userId}".
Why is this function not being called when I change the database?
EDIT 1
A little information about my environment:
I have my entire project core in a TypeScript file. I have over 40 HTTPS triggered functions that are currently online.
I add a new function to my TS file, then run npm run build to compile and get the JS file.
Finally, I go to the Google Cloud Console, create a function, choose "ZIP Upload", and upload the compiled JS file (along with the required JSON files for the database URL, authentication, etc.).
This approach works perfectly fine, at least for HTTP-triggered functions.
I repeated the same steps for the onUpdate trigger, except that instead of an HTTP trigger I chose a Cloud Firestore trigger. The trigger information can be found above in the screenshot.
onUpdate is not being fired on DB changes.
EDIT 2
My event-triggered function NotificationListener shows up in the Firebase console functions list along with my other 40 HTTPS functions, but it is not being called.
@doug-stevenson, your answer seems to have disappeared, I am not sure why.
Anyway, I found the reason why it wasn't working.
My Firebase database was in project "Placeholder 1" and my GCP functions were in project "Placeholder 2".
Now, I was able to update the "Placeholder 1" DB from the GCP functions (in "Placeholder 2") using the firebase-functions API, because I set the database URL to "Placeholder 1".
But just setting the database URL to the desired database doesn't work if you want to LISTEN to that database for changes. You actually need to have the function in the same project, otherwise it is not able to subscribe and listen for events.
I think it's a little inconsistent that you can read/write to a DB from a different project, but to listen for events the function needs to be in the same project.
Or maybe I am missing something fundamental that caused this confusion for me.

Google Cloud Functions Cron Job Not Working

I am trying to set up a scheduled function in Firebase Cloud Functions. As a simple test, I have tried to recreate the sample shown on the documentation page:
const functions = require('firebase-functions')

exports.scheduledFunction = functions.pubsub
  .schedule('every 5 minutes')
  .onRun(context => {
    console.log('This will be run every 5 minutes!')
    return null
  })
However, when I run firebase serve --only functions, I get the following error:
function ignored because the pubsub emulator does not exist or is not running.
Any idea why I get this message and how I can fix it?
From the documentation on Firebase's local emulator:
The Firebase CLI includes a Cloud Functions emulator which can emulate the following function types:
HTTPS functions
Callable functions
Cloud Firestore functions
So the local Firebase emulator doesn't currently support pubsub, and the error message seems to confirm that: for the moment, you can't run pubsub-triggered Cloud Functions locally.
A feature request for adding PubSub support to the emulator was filed. You might want to read up (and possibly comment) there, as the direction taken may or may not match with your needs.
The local shell does support invoking pubsub functions. That is of course quite different, but might be useful as a workaround for the moment.
For what it is worth, you need to enable the pubsub emulator in firebase.json. Add this to your emulators block:
{
  "emulators": {
    "pubsub": {
      "port": 8085
    }
  }
}
Even then, it only creates the definition. The emulator doesn't support running the function on a schedule.
To simulate that behavior, I define an HTTP trigger from which I manually send a message to the topic. For a scheduled function, the topic is firebase-schedule-<functionName>; in your case it will be firebase-schedule-scheduledFunction.
Sample code looks like:
import * as functions from 'firebase-functions'
import { PubSub } from '@google-cloud/pubsub'

const pubsub = new PubSub()

export const triggerWork = functions.https.onRequest(async (request, response) => {
  await pubsub.topic('firebase-schedule-scheduledFunction').publishJSON({})
  response.send('Ok')
})
Then on the command line, I trigger the HTTP function on a schedule.
while [ 1 ]; do
  wget -o /dev/null -O /dev/null http://localhost:5001/path/to/function/triggerWork
  sleep 300
done

Authenticate Google Composer HTTP call task with IAP-protected app

I have a setup with an App Engine REST application and a Google Composer/Airflow DAG that has a task which is supposed to fetch data from one of the app's endpoints. The app is protected by IAP. I have added the service account under which Airflow runs to the "IAP-secured Web App User" list, but each time the step executes, the response to the HTTP call is the Google Sign-In page. Any idea if an additional step is needed?
The code for my DAG step:
import requests
from airflow.operators.python_operator import PythonOperator

def get_data():
    r = requests.get(url="<url-to-my-app-endpoint>")
    print('stuff:')
    print(r.status_code)
    print(r.content)
    return 1

# ...

python_fetch_data = PythonOperator(
    task_id='python_fetch_data',
    python_callable=get_data,
    dag=dag,
    depends_on_past=True,
    priority_weight=2
)
https://cloud.google.com/iap/docs/authentication-howto#authenticating_from_a_service_account explains how to extend your DAG code so that it sends credentials to the IAP-protected API backend.
A bit of background: Since Composer is built on top of GCP, your Composer deployment has a unique service account identity that it's running as. You can add that service account to the IAP access list for your endpoint.
I don't know if the Composer UI makes it easy to see the "email" address for your service account, but if you add the code above and decode the token it generates, that will show it.
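
For illustration, here is a minimal sketch of that documented approach applied to the DAG task above. It assumes the google-auth library is available in the Composer environment; IAP_CLIENT_ID (the OAuth client ID used by IAP for the app) and the endpoint URL are placeholders:

import requests
from google.auth.transport.requests import Request
from google.oauth2 import id_token

IAP_CLIENT_ID = '<oauth-client-id-of-the-iap-protected-app>'  # placeholder

def get_data():
    # Fetch an OIDC identity token for the environment's service account,
    # with the IAP OAuth client ID as the audience.
    token = id_token.fetch_id_token(Request(), IAP_CLIENT_ID)
    # IAP lets the request through when the token is sent as a Bearer token.
    r = requests.get(
        url='<url-to-my-app-endpoint>',  # placeholder
        headers={'Authorization': 'Bearer {}'.format(token)},
    )
    print(r.status_code)
    return 1

Decoding the resulting token (e.g. on jwt.io) also shows the service account's email claim, as mentioned above.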

Can Cloud Functions for Firebase be used across projects?

I was hoping to trigger a Pub/Sub function (using functions.pubsub / onPublish) whenever a new Pub/Sub message is sent to a topic/subscription in a third-party project, i.e. cross-project.
After some research and experimentation I found that TopicBuilder throws an error if the topic name contains a / and it defaults to "projects/" + process.env.GCLOUD_PROJECT + "/topics/" + topic (https://github.com/firebase/firebase-functions/blob/master/src/providers/pubsub.ts).
I also found a post in Stack Overflow that says that "Firebase provides a (relatively thin) wrapper around Google Cloud Functions"
(What is the difference between Cloud Function and Firebase Functions?)
This led me to look into Google Cloud Functions. Whilst I was able to create a subscription in a project I own to a topic in a third-party project (after changing permissions in IAM), I could not find a way to associate a function with the topic. Nor was I successful in associating a function with a topic and subscription in a third-party project. In the console I only see the topics in my project, and I had no success using gcloud.
Has anyone had any success in using a function across projects and, if so, how did you achieve this, and is there a documentation URL you could provide? If a function can't be triggered by a message to a topic and subscription in a third-party project, can you think of a way that I could ingest third-party Pub/Sub data?
As Pub/Sub fees are billed to the project that contains the subscription I would prefer that the subscription resides in the third-party project with the topic.
Thank you
Google Cloud Functions currently does not allow a function to listen to a resource in another project. For Cloud Pub/Sub triggers specifically, you can get around this by deploying an HTTP function and adding a Pub/Sub push subscription to the topic that should fire that cross-project function, as in the sketch below.
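As an illustration, here is a hedged sketch of that workaround with the Python Pub/Sub client: the topic stays in the third-party project, while the push subscription (and therefore the billing for it) lives in your project and pushes straight to the HTTP-triggered function. All project, topic, and endpoint names are placeholders:

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()

# The topic belongs to the third-party project; the subscription is yours.
topic_path = 'projects/third-party-project/topics/their-topic'  # placeholder
subscription_path = subscriber.subscription_path(
    'your-project', 'cross-project-push'  # placeholders
)

# Push deliveries invoke the HTTP-triggered Cloud Function directly.
push_config = pubsub_v1.types.PushConfig(
    push_endpoint='https://REGION-your-project.cloudfunctions.net/yourHttpFunction'  # placeholder
)

subscriber.create_subscription(
    request={
        'name': subscription_path,
        'topic': topic_path,
        'push_config': push_config,
    }
)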
A Google Cloud Function can't be triggered by a subscription to a topic of another project (since you can't subscribe to another project's topic).
But a Google Cloud Function can publish to a topic of another project (and then subscribers of this topic will be triggered).
I solved it by establishing a Google Cloud Function in the original project which listens to the original topic and reacts by publishing to a new topic in the other project. For that, the service account of this function "in the middle" (...@appspot.gserviceaccount.com) needs to be authorized on the new topic (console.cloud.google.com/cloudpubsub/topic/detail/...?project=...), i.e. add a principal with the role "Pub/Sub Publisher".
import base64
import json
import os

from google.cloud import pubsub_v1

# https://cloud.google.com/functions/docs/calling/pubsub?hl=en#publishing_a_message_from_within_a_function

# Instantiates a Pub/Sub client
publisher = pubsub_v1.PublisherClient()

def notify(event, context):
    project_id = os.environ.get('project_id')
    topic_name = os.environ.get('topic_name')

    # References an existing topic
    topic_path = publisher.topic_path(project_id, topic_name)

    message_json = json.dumps({
        'data': {'message': 'here would be the message'},  # or you can pass the message of event/context
    })
    message_bytes = message_json.encode('utf-8')

    # Publishes a message
    try:
        publish_future = publisher.publish(topic_path, data=message_bytes)
        publish_future.result()  # Verify the publish succeeded
        return 'Message published.'
    except Exception as e:
        print(e)
        return (e, 500)
Google Cloud Endpoints can be an easier solution for adding auth to the HTTP function.
