I've been following this blog post from Amazon (Scenario 3: Triggering a Lambda function from an Amazon S3 bucket notification in another account) about authorizing Lambda functions for various uses. I would like to set up a Lambda function to accept SNS messages from external accounts (external to the account containing the Lambda function).
https://aws.amazon.com/blogs/compute/easy-authorization-of-aws-lambda-functions/
I was expecting to add the permission to invoke the function remotely as follows:
$ aws lambda add-permission \
--function-name MyFunction \
--region us-west-2 \
--statement-id Id-123 \
--action "lambda:InvokeFunction" \
--principal sns.amazonaws.com \
--source-arn arn:aws:sns:::<topic name> \
--source-account <account number> \
--profile adminuser
I then attempted to go to my SNS topic and set Lambda as the endpoint, typing in the remote ARN of the Lambda function in the first account. This doesn't work so well, as the endpoint expects an ARN for a function in the same account...
Plan B:
Try creating the subscription via the CLI to circumvent the limitation in the console...
aws sns --profile adminuser \
--region us-west-2 subscribe \
--topic-arn arn:aws:sns:us-west-2:<account #>:<topic name> \
--protocol lambda \
--notification-endpoint arn:aws:lambda:us-west-2:<account id>:function:<lambda function name>
Response:
A client error (AuthorizationError) occurred when calling the Subscribe operation: The account <account id> is not the owner of the lambda function arn:aws:lambda:us-west-2:<account id>:function:<function name>
Has anyone been able to invoke a Lambda function from a "remote" SNS topic in another account? I'm a little stumped as to where I may have gone wrong... Based on the note in the blog post, I fully expected a remote SNS topic to work:
Note: Amazon SNS (Simple Notification Service) events sent to Lambda works the same way, with “sns.amazonaws.com” replacing “s3.amazonaws.com” as the principal.
You can, if the provider account authorizes the consumer account that owns the Lambda function to subscribe to the SNS topic. This can be done under "Edit topic policy" on the topic's page.
Here's a summary of the steps to allow a Lambda function to listen to an SNS topic from an external account:
1. The consumer account creates the Lambda function.
2. The consumer account adds the event source to the Lambda function in the AWS console by specifying the provider's SNS topic ARN (don't worry about error messages here).
3. The provider account grants SNS subscription permissions to an IAM user in the consumer account (done via the "Edit topic policy" mentioned above; see the policy sketch below the example command).
4. The consumer uses the IAM user from step 3 to add the subscription to the provider's topic using the AWS CLI.
Example command that worked for me previously for step 4:
aws sns subscribe --topic-arn <provider_sns_arn> --protocol lambda --notification-endpoint <consumer_lambda_arn> --profile consumer-IAM-account
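For step 3, a hedged sketch of what the added topic-policy statement could look like; the account IDs, region, and topic name are placeholders, not values from the original question:
{
  "Sid": "AllowConsumerSubscribe",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::<consumer account id>:root"
  },
  "Action": ["sns:Subscribe", "sns:Receive"],
  "Resource": "arn:aws:sns:us-west-2:<provider account id>:<topic name>"
}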
I had a similar requirement today. In summary, there are 3 steps. Let's assume 111111111 is the producer account that owns the SNS topic and 2222222222 is the consumer account that owns the Lambda function.
1. Allow the Lambda function's account to subscribe to the topic:
aws sns --profile SNS_Owner_Profile add-permission \
--topic-arn "arn:aws:sns:us-east-1:111111111:your-sns-top" \
--label "AllowCrossAccountSns" \
--aws-account-id "2222222222" \
--action-name "Receive" "Subscribe"
2. Allow the topic to invoke the Lambda function:
aws lambda --profile Lambda_Owner_Profile add-permission \
--function-name "your-lambda-function" \
--statement-id "allowCrossAccountSNS" \
--principal "sns.amazonaws.com" \
--action "lambda:InvokeFunction" \
--source-arn "arn:aws:sns:us-east-1:111111111:your-sns-top"
3. Subscribe the Lambda function to the topic:
aws sns --profile Lambda_Owner_Profile subscribe \
--topic-arn "arn:aws:sns:us-east-1:111111111:your-sns-top" \
--protocol "lambda" \
--notification-endpoint "arn:aws:lambda:us-east-1:2222222222:function:your-lambda-function"
In the AWS Lambda Developer Guide there is a tutorial that uses AWS CLI commands to set up an invocation of a Lambda function from an SNS topic that belongs to another account.
The procedure is quite similar to the one in the accepted answer. The subscription doesn't have to be confirmed; it was ready for testing right after the aws sns subscribe command.
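For testing, a minimal handler that just logs what SNS delivers is enough. This is a sketch in Python assuming the standard SNS event shape, not code from the tutorial:
def lambda_handler(event, context):
    # SNS puts each published message at event["Records"][i]["Sns"]["Message"]
    for record in event.get("Records", []):
        print(record["Sns"]["Message"])
    return "ok"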
I ran into the same problem. The error occurs because you are calling the SNS Subscribe operation from the account that owns the SNS topic. While this seems logical and is how you would normally do it, AWS expects the opposite for cross-account access: you have to call Subscribe from the account that owns the Lambda function.
Related
I am using the Airflow EcsOperator to run AWS ECS tasks. As part of this, I am using a custom Fluent Bit container that is set up to send the container logs to CloudWatch and AWS OpenSearch. Logging to both destinations works fine. However, I noticed that the CloudWatch log streams are generated in the format {awslogs_stream_prefix}-{ecs_task_id}. The braces are added just to show the two parts separately; the actual prefix is of the form "ecsprime-generator-container-firelens-977be157d3be4614a84537cd081152d7", where the string starting with 977 is the task ID. Unfortunately, the Airflow code that reads CloudWatch logs expects the log stream name to be in the format {awslogs_stream_prefix}/{ecs_task_id}. Because of this, I am not able to have the Airflow EcsOperator display the corresponding CloudWatch logs.
Are there any workarounds to address this?
I have created a Google Cloud Task, but the queue keeps retrying and the function is not getting invoked either.
This is the log from the cloud console.
attemptResponseLog: {
attemptDuration: "0.133874s"
dispatchCount: "19"
maxAttempts: 0
responseCount: "0"
retryTime: "2020-06-21T21:20:18.518655Z"
scheduleTime: "2020-06-21T21:20:15.718098Z"
status: "UNAVAILABLE"
targetAddress: "POST some url"
targetType: "HTTP"
}
I ran into this same error, and I must say that the documentation is not clear enough.
WARNING: I feel there's a bit of latency before the roles are taken into account, especially the ServiceAccountUser one.
I ran multiple tests and tried to keep the rights as minimal as possible, so I would remove one... run some tests, it works... great, that right isn't necessary... then come back later, and the thing is broken.
Here is my setup:
I use Cloud Scheduler to trigger a Cloud Function every 15 minutes by posting a message on a queue.
The Cloud Function builds a list of tasks to compute stats on MySQL and creates the tasks.
Another Cloud Function runs SQL queries to get the stats and stores the results in Firestore.
I use Cloud Tasks so that the load on MySQL is not too heavy.
Below, I use functional names to make it easy to understand.
TaskCreatorCloudFunction running with TaskCreatorServiceAccount
TaskCreatorServiceAccount requires:
the "Cloud Tasks Enqueuer" role #1
being a ServiceAccountUser on the CloudTaskComputeStatsServiceAccount (see below) #2
the roles needed to do the job (read SQL to get the list of tasks to create, write logs, access Secret Manager, listen to Pub/Sub as it's triggered by Cloud Scheduler via Pub/Sub)
TaskImplementationCloudFunction (HTTP) running with TaskImplementationServiceAccount
TaskImplementationServiceAccount has no specific role for Cloud Tasks, only the ones needed to do the job (read SQL, write logs, access Secret Manager, write to Firestore)
The TaskQueue is named "compute-stats-on-mysql".
I've created a dedicated service account named CloudTaskComputeStatsServiceAccount #3
CloudTaskComputeStatsServiceAccount has the specific rights for the whole thing to work:
the "Cloud Functions Invoker" role #4
being added as a ServiceAccountUser on TaskImplementationServiceAccount #5
To do the last one in the console (script version below), you need to
go to IAM -> Service Accounts
check the TaskImplementationServiceAccount
In the upper right corner, click "Show Info Panel" if it's not already displayed
click the Add Member
Paste the full name of the CloudTaskComputeStatsServiceAccount
Choose Service Account User as role
Save
You can edit this in the console, but it's better to script it.
gcloud tasks queues create compute-stats-on-mysql \
--max-dispatches-per-second=10 \
--max-concurrent-dispatches=15 \
--max-attempts=2 \
--min-backoff=1s
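To confirm the queue settings, you can read them back (a verification step I'm adding here, not part of the original answer):
gcloud tasks queues describe compute-stats-on-mysql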
#3
gcloud iam service-accounts create CloudTaskComputeStatsServiceAccount --description="Service Account for the cloud task compute-stats-on-mysql" --display-name="Service Account for the cloud task compute-stats-on-mysql"
#4
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member serviceAccount:CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com --role "roles/cloudfunctions.invoker"
#1
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member serviceAccount:TaskCreatorServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com --role "roles/cloudtasks.enqueuer"
#5
gcloud iam service-accounts add-iam-policy-binding TaskImplementationServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com --member="serviceAccount:CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com" --role "roles/iam.serviceAccountUser"
#2
gcloud iam service-accounts add-iam-policy-binding CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com --member="serviceAccount:TaskCreatorServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com" --role=roles/iam.serviceAccountUser
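To double-check binding #5, you can read the service account's policy back; this verification step is my addition, not part of the original write-up:
gcloud iam service-accounts get-iam-policy TaskImplementationServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com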
When creating the task, you use the CloudTaskComputeStatsServiceAccount in the oidcToken:
const body = Buffer.from(JSON.stringify(data)).toString('base64');
const task = {
  httpRequest: {
    httpMethod: 'POST',
    url,
    oidcToken: {
      // Template literal so ${PROJECT_ID} is actually interpolated
      serviceAccountEmail: `CloudTaskComputeStatsServiceAccount@${PROJECT_ID}.iam.gserviceaccount.com`,
    },
    headers: {
      'Content-Type': 'application/json',
    },
    body,
  },
};
My understanding is that when you run
const [response] = await cloudTasksClient.createTask({parent, task});
the Cloud Function (task creator) needs to create tasks and to act as the CloudTaskComputeStatsServiceAccount,
and the CloudTaskComputeStatsServiceAccount needs to have the Cloud Functions Invoker role and to act as the target Cloud Function's service account.
Indeed, it's not a service account issue; it's the OIDC token audience that's missing. It seems that for Cloud Functions this is required. I found two references... you can recreate this problem with the OIDC token in the CLI by omitting this argument to gcloud tasks create-http-task:
--oidc-token-audience=OIDC_TOKEN_AUDIENCE
The audience to be used when generating an OpenID Connect token to be
included in the request sent to the target when executing the task.
If not specified, the URI specified in the target will be used.
The second reference that popped up, in Ruby, shows the audience field:
https://googleapis.dev/ruby/google-cloud-tasks/latest/Google/Cloud/Tasks/V2/OidcToken.html
Using google-cloud-tasks 1.5.0, the task object looks like this, where url_oidc is just the URL of the Cloud Function (i.e. the trigger URL, with no URL parameters):
# Construct the request body.
task = {
    'http_request': {  # Specify the type of request.
        'http_method': 'GET',
        'url': url_final,  # The full url path that the task will be sent to.
        'oidc_token': {
            'service_account_email': service_account_email,
            'audience': url_oidc
        }
    }
}
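For comparison, the CLI equivalent with the audience set explicitly; the task name, queue, URL, and service account below are placeholders:
gcloud tasks create-http-task my-task \
  --queue=compute-stats-on-mysql \
  --url="https://REGION-PROJECT_ID.cloudfunctions.net/my-function" \
  --oidc-service-account-email="my-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --oidc-token-audience="https://REGION-PROJECT_ID.cloudfunctions.net/my-function"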
I am trying to run workflows using pmcmd from a .ksh script, but I am facing the following error:
Failed to authenticate login. [UM_10205] [UM_10205] Failed to authenticate the user [user_name] that belongs to the security domain [e-directory]. For more information, see the domain logs.
Disconnecting from Integration Service.
I have tried logging into Informatica and running the workflows through Workflow Manager, and they run fine. I cannot find the reason for the authentication error; if it were an authentication issue, it should not allow me to log in to Informatica either.
pmcmd startworkflow -sv intg_ser -d Domain_name -usd e-directory -u user -pv encrypted_passwd -f app_CCMLB_FPERX -wait wf_4100_CCML_CAP_PKG_EXT
I have a setup where I have an App Engine REST application and a Google Composer / Airflow DAG that has a task where it is supposed to fetch data from one of the endpoints of the app. The app is protected by IAP. I have added the service account under which Airflow runs to the "IAP-secured Web App User" list; however, each time the step executes, the response to the HTTP call is the Google sign-in page. Any idea if an additional step is needed?
The code for my DAG step:
import requests

def get_data():
    r = requests.get(url="<url-to-my-app-endpoint>")
    print('stuff:')
    print(r.status_code)
    print(r.content)
    return 1

# ...
python_fetch_data = PythonOperator(
    task_id='python_fetch_data',
    python_callable=get_data,
    dag=dag,
    depends_on_past=True,
    priority_weight=2
)
https://cloud.google.com/iap/docs/authentication-howto#authenticating_from_a_service_account explains how to extend your DAG code so that it sends credentials to the IAP-protected API backend.
A bit of background: since Composer is built on top of GCP, your Composer deployment has a unique service account identity that it runs as. You can add that service account to the IAP access list for your endpoint.
I don't know if the Composer UI makes it easy to see the "email" address of your service account, but if you add the code above and decode the token it generates, it will show it.
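A minimal sketch of what the authenticated call can look like, based on the approach in the linked doc; the IAP OAuth client ID and the endpoint URL are placeholders you'd need to fill in:
import requests
from google.auth.transport.requests import Request
from google.oauth2 import id_token

IAP_CLIENT_ID = "<iap-oauth-client-id>"  # placeholder: the OAuth client ID used by IAP

def get_data():
    # Mint an OIDC token for the IAP audience using the environment's
    # default service account (Composer's service account in this case).
    token = id_token.fetch_id_token(Request(), IAP_CLIENT_ID)
    r = requests.get(
        url="<url-to-my-app-endpoint>",
        headers={"Authorization": f"Bearer {token}"},
    )
    print(r.status_code)
    return 1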
I was hoping to trigger a Pub/Sub function (using functions.pubsub / onPublish) whenever a new Pub/Sub message is sent to a topic/subscription in a third-party project, i.e. cross-project.
After some research and experimentation I found that TopicBuilder throws an error if the topic name contains a / and it defaults to "projects/" + process.env.GCLOUD_PROJECT + "/topics/" + topic (https://github.com/firebase/firebase-functions/blob/master/src/providers/pubsub.ts).
I also found a post in Stack Overflow that says that "Firebase provides a (relatively thin) wrapper around Google Cloud Functions"
(What is the difference between Cloud Function and Firebase Functions?)
This led me to look into Google Cloud Functions. Whilst I was able to create a subscription in a project I own to a topic in a third-party project (after changing permissions in IAM), I could not find a way to associate a function with the topic. Nor was I successful in associating a function with a topic and subscription in a third-party project. In the console I only see the topics in my project, and I had no success using gcloud.
Has anyone had any success in using a function across projects and, if so, how did you achieve this and is there a documentation URL you could provide? If a function can't be triggered by a message to a topic and subscription in a third-party project can you think of a way that I could ingest third-party Pub/Sub data?
As Pub/Sub fees are billed to the project that contains the subscription I would prefer that the subscription resides in the third-party project with the topic.
Thank you
Google Cloud Functions currently does not allow a function to listen to a resource in another project. For Cloud Pub/Sub triggers specifically, you can get around this by deploying an HTTP function and adding a Pub/Sub push subscription to the topic that should fire that cross-project function.
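A sketch of that workaround with gcloud; the subscription lives in your project while pointing at the third-party topic (names are placeholders, and the topic owner must have granted your account subscribe permission):
gcloud pubsub subscriptions create my-cross-project-sub \
  --topic=projects/<third-party-project>/topics/<their-topic> \
  --push-endpoint="https://REGION-PROJECT_ID.cloudfunctions.net/my-http-function"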
A Google Cloud Function can't be triggered by a subscription to a topic of another project (since you can't subscribe to another project's topic).
But a Google Cloud Function can publish to a topic of another project (and then subscribers of this topic will be triggered).
I solved it by creating a Google Cloud Function in the original project that listens to the original topic and reacts by publishing to a new topic in the other project. For this to work, the service account (...@appspot.gserviceaccount.com) of this function "in the middle" needs to be authorized on the new topic (console.cloud.google.com/cloudpubsub/topic/detail/...?project=...), i.e. added as a principal with the role "Pub/Sub Publisher".
import base64
import json
import os

from google.cloud import pubsub_v1

# https://cloud.google.com/functions/docs/calling/pubsub?hl=en#publishing_a_message_from_within_a_function

# Instantiates a Pub/Sub client
publisher = pubsub_v1.PublisherClient()

def notify(event, context):
    project_id = os.environ.get('project_id')
    topic_name = os.environ.get('topic_name')

    # References an existing topic in the other project
    topic_path = publisher.topic_path(project_id, topic_name)

    message_json = json.dumps({
        'data': {'message': 'here would be the message'},  # or you can pass the message from event/context
    })
    message_bytes = message_json.encode('utf-8')

    # Publishes a message
    try:
        publish_future = publisher.publish(topic_path, data=message_bytes)
        publish_future.result()  # Verify the publish succeeded
        return 'Message published.'
    except Exception as e:
        print(e)
        return (e, 500)
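A sketch of how this "middle" function can be deployed with its trigger and environment variables; the runtime, topic names, and project ID are placeholders:
gcloud functions deploy notify \
  --runtime=python39 \
  --trigger-topic=<original-topic> \
  --set-env-vars=project_id=<other-project-id>,topic_name=<new-topic-name>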
Google Cloud Endpoints can be an easier solution for adding auth to the HTTP function.