Set policy and role in AWS to connect API Gateway and DynamoDB

I am trying to stream data from AWS API Gateway to DynamoDB (directly, without something like Lambda in between). I have looked at several tutorials, such as [1] and [2], which describe exactly this scenario. All of them assume that the right policies and roles are in place. Normally I play and hack around until I get a working proof of concept and then rebuild a proper model, but with access rights I want to make sure I understand what I am doing. For [2], I also found a Stack Overflow question at [3] from somebody with the same problem that got solved, but it's not clear exactly how. I also looked at [4], which describes API Gateway with Lambda.
Here is my guess:
1. Create a policy that allows calling from the API Gateway. "AmazonAPIGatewayInvokeFullAccess" fits the name, but might not be necessary, and is overkill with too much access.
2. Create a policy that allows access to DynamoDB. Here, "AmazonDynamoDBFullAccess" might be appropriate, even though it might be overkill (too much access), and might only work from the Management Console.
3. Create a role that has those two policies attached. Here I run into trouble: when I click "create role" and select "AWS service", I cannot find the correct "service that will use this role" that has the policies described above behind it. For example, when clicking DynamoDB, I get the following use cases, none of which seem to relate to the DynamoDB full access policy:
   - Amazon DynamoDB Accelerator (DAX) - DynamoDB access
   - DynamoDB - Global Tables
   - DynamoDB Accelerator (DAX) - Cluster management
My main question is: How do I set the right minimal set of roles and policies to connect AWS API Gateway to DynamoDB (read and write), as described in [1]?
[1] https://sanderknape.com/2017/10/creating-a-serverless-api-using-aws-api-gateway-and-dynamodb/
[2] https://aws.amazon.com/blogs/compute/using-amazon-api-gateway-as-a-proxy-for-dynamodb/
[3] API Gateway does not have permission to assume the provided role DynamoDB
[4] https://docs.aws.amazon.com/apigateway/latest/developerguide/permissions.html

What you need to do is create an IAM service role that API Gateway is allowed to assume. You can easily do this through the UI: when you create a new role, "Service Role" is selected by default, and below the "Choose the service that will use this role" header you can select API Gateway.
A role is a container of permissions that can be assumed by a certain entity (in our case, an API Gateway API resource). Your role needs permissions for it to be of any use; you add these permissions by attaching policies to the role. This is explained in more depth here: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html
Be sure to read the AWS Service Role part. You say that you need to "Create a policy that allows calling from the API Gateway", but this is incorrect: you need to create a role that can be assumed by API Gateway.
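For reference, "can be assumed by API Gateway" is expressed in the role's trust policy; picking API Gateway as the service in the console generates one like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "apigateway.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}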
In your case, you'll want specific DynamoDB permissions for your role. Following the least-privilege principle as you mention, you should only add the specific actions for the specific DynamoDB table. The list of possible permissions can be found here: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/api-permissions-reference.html
Let's say you only want API Gateway to get items from a specific table. Your policy might look something like this then:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "dynamodb:GetItem",
            "Resource": "arn:aws:dynamodb:eu-west-1:[aws_account_id]:table/[table_name]"
        }
    ]
}
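Once the role exists, you supply its ARN on the API Gateway integration request (the console labels the field "Execution role"), so that API Gateway assumes the role whenever it calls DynamoDB on behalf of your API.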
Hope this helps!

This more recent tutorial by ankana likhita sri priya starts off with the IAM setup (policy, role, etc.) in high detail, with screenshots: https://medium.com/@likhita507/using-api-gateway-to-get-data-from-dynamo-db-using-without-using-aws-lambda-e51434a4f5a0

Related

How to get the role assignments of a resource through Resource Graph API?

I want to use the Azure Resource Graph API to get the role assignments of a resource (who are owners, contributors, etc.). That is, I want to create a query that finds the role assignments for a specific resource id that I provide. I've been going through the documentation, but I haven't found any way to get this information.
The only thing I found was this question from a couple of years ago, where it is mentioned as something that could be done somehow ("query the RBAC of each one of those resources").
Could anyone point me to how this could be done? Or is it not possible to do in Resource Graph API, and I need to use the Management API or something else?
I searched through the "Azure Resource Graph table and resource type reference" and the "Advanced Resource Graph query samples", but didn't find an answer.
I tried to reproduce this in my environment and got the results below.
I created an Azure AD application and added the required API permissions.
I generated an access token by using the below parameters:

POST https://login.microsoftonline.com/TenantID/oauth2/v2.0/token

client_id: xxxxxx-xxx-xxx-xxxx-xxxxxxxx
client_secret: ClientSecret
scope: https://management.azure.com//.default
grant_type: client_credentials
To list the Role assignments in the subscription scope, I used the below query:
GET https://management.azure.com/subscriptions/subscriptionId/providers/Microsoft.Authorization/roleAssignments?api-version=2022-04-01
Based on your requirement, you can change the scope and add a filter to get the role assignments you need. Refer to the MS doc below:
List Azure role assignments using the REST API - Azure RBAC
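Since you want the role assignments of one specific resource, you can pass the full resource ID as the scope in the same call, for example:

GET https://management.azure.com/subscriptions/subscriptionId/resourceGroups/RGName/providers/Providername/ResourceType/Resource/providers/Microsoft.Authorization/roleAssignments?api-version=2022-04-01

Adding the $filter=atScope() query parameter restricts the output to assignments that apply at that scope, excluding ones made at subscopes.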
Currently it is not feasible to retrieve the role assignments via Azure Resource Graph. Alternatively, you can make use of Azure PowerShell or Azure CLI.
Get-AzRoleAssignment -Scope "/subscriptions/SubscriptionId/resourcegroups/RGName/providers/Providername/ResourceType/Resource"
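The Azure CLI equivalent would be along these lines:

az role assignment list --scope "/subscriptions/SubscriptionId/resourcegroups/RGName/providers/Providername/ResourceType/Resource"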

How to add Azure custom Policy for Azure Data Factory to only use Azure Key Vault during the Linked Service Creation?

How do I add an Azure custom policy so that Azure Data Factory linked services must use Azure Key Vault for fetching data store credentials, instead of credentials being put directly into the ADF linked service? Please suggest ARM or PowerShell methods for the policy implementation.
As of yesterday, the Data Factory Azure Policy integration is available, which means you can now find some built-in policies that can be assigned to ADF.
One of those is exactly what you're asking for. You can find more information here.
Edit: Based on your comment, I'm editing this answer with the info you want. When it comes to custom policies, it's pretty much up to you to come up with them and create what fits your needs. In your particular case, I've created one policy that does what you want, please see here.
This policy will audit your data factory linked services and check if they're using a self-hosted integration runtime. Currently, that check is only done for a few types of integration runtimes (if you look at the policy, you can see 5 of them) which means that if you want to check more types of linked services, you'll need to add them to the list of allowed values and select them when assigning the policy definition.
Bear in mind that for some linked service types, such as Key Vault, that check won't make sense, since that service can't use a self-hosted IR.
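If you end up writing your own definition rather than reusing that one, the overall shape is just an if/then rule over the linked-service resource type. A stripped-down skeleton follows; the condition here is deliberately minimal (it audits every linked service), and which linked-service properties you can actually test depends on the policy aliases the Microsoft.DataFactory provider exposes, so treat this as a starting point, not a finished policy:

{
    "mode": "All",
    "policyRule": {
        "if": {
            "field": "type",
            "equals": "Microsoft.DataFactory/factories/linkedservices"
        },
        "then": {
            "effect": "audit"
        }
    }
}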

Firebase Cloud Functions Secure HTTPS Endpoints with API key

I've looked at a few places, including this post and the Firebase panel.
Is there no way to use these APIs to secure these endpoints using an API key you create per client who uses your cloud functions?
I'm able to block everyone by putting a restriction on the browser key, but I would like to create a new API key and use that as a way to authenticate my endpoint for various clients.
Creating a new API key and using that as a parameter in my query doesn't work (I don't know if I'm doing something wrong).
Is there a way to do this?
Option 1: handle authentication within the function
https://github.com/firebase/functions-samples/tree/master/authorized-https-endpoint
Adapt the above to use clients/keys stored in Firestore
Option 2: Use an API Gateway
Google Cloud Endpoints (no direct support for functions yet, need to implement a proxy)
Apigee (higher cost, perhaps more than you need)
Azure API Management (lower entry cost + easy to implement as a facade for services hosted outside Azure)
...and there are more.
The above gateways are probably best for your use case: the first two would let you keep everything within Google, albeit with more complexity/cost -- hopefully Endpoints will get support for functions soon. Azure would mean having part of your architecture outside Google, but it looks like an easy way to achieve what you're after (an API key per client for your Google Cloud / Firebase functions).
Here's a good walkthrough of implementing Azure API Management:
https://koukia.ca/a-microservices-implementation-journey-part-4-9c19a16385e9
There is no built-in way to achieve what you are after; as far as Firebase and GCP are concerned, your clients are your specific business problem.
One way you could tackle this (with the little information that is provided):
You need somewhere to store a list of clients + their API keys (I would use Firestore)
For the endpoints you want to secure with a client-specific API key, include a check that the header exists and that its value exists in your Firestore client records (see the sketch below)
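A minimal sketch of that check in a Firebase HTTPS function (the clients collection name and the x-api-key header are made-up for illustration; adjust both to your setup):

import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp();

export const secureEndpoint = functions.https.onRequest(async (req, res) => {
  // Hypothetical header name; use whatever convention you hand to clients.
  const apiKey = req.get('x-api-key');
  if (!apiKey) {
    res.status(401).send('Missing API key');
    return;
  }
  // One Firestore read per request -- see the budget consideration below.
  const client = await admin.firestore().collection('clients').doc(apiKey).get();
  if (!client.exists) {
    res.status(403).send('Unknown API key');
    return;
  }
  res.send(`Hello, ${client.data()!.name}`);
});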
Considerations:
Depending on your expected traffic load and the number of Firestore reads you'll be adding, you might want to double-check that this kind of solution will work for your budget.
Is the API-key type solution the only option you can go for? You could probably get pretty far using https://github.com/firebase/firebaseui-web and doing user checks in your function, with no extra DB read required. If you go down this path, most of the user signup / email / account creation logic is ready to go.
https://firebase.google.com/docs/auth/web/password-auth#before_you_begin
Curious to see what some other firebase users suggest.

Explicitly allow usage of production API

I'm exploring the WSO2 API Manager platform for use in an Open API project. The idea is that we forbid self-registration in the Store and create users ourselves. But we also want to give them only the Sandbox API as a starting point and then explicitly allow particular users to consume the Production API. I haven't found any information on this. Is it possible? If yes, where should I look?
You can restrict token generation for the Production endpoints by using Workflows. Follow the documentation [1].
You could configure ProductionApplicationGeneration to use ApplicationRegistrationWSWorkflowExecutor and SandboxApplicationGeneration to use ApplicationRegistrationSimpleWorkflowExecutor.
With this approach if the subscriber tried to generate a token for production endpoints, it will trigger a human task, which needs to be approved from the Admin Portal.
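Roughly, that means the following in workflow-extensions.xml (under /_system/governance/apimgt/applicationdata/ in the registry). The element names and default endpoints below are from memory of the AM 2.1.0 docs in [1], so verify them against your version:

<WorkFlowExtensions>
    <!-- Production key generation goes through human approval -->
    <ProductionApplicationRegistration executor="org.wso2.carbon.apimgt.impl.workflow.ApplicationRegistrationWSWorkflowExecutor">
        <Property name="serviceEndpoint">http://localhost:9765/services/ApplicationRegistrationWorkFlowProcess/</Property>
        <Property name="username">admin</Property>
        <Property name="password">admin</Property>
        <Property name="callbackURL">https://localhost:8243/services/WorkflowCallbackService</Property>
    </ProductionApplicationRegistration>
    <!-- Sandbox key generation stays auto-approved -->
    <SandboxApplicationRegistration executor="org.wso2.carbon.apimgt.impl.workflow.ApplicationRegistrationSimpleWorkflowExecutor"/>
</WorkFlowExtensions>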
For your requirement, you could write a custom workflow extension which restricts this by role or user name. For more information on writing a custom workflow extension, please follow [2].
[1] https://docs.wso2.com/display/AM210/Adding+an+Application+Registration+Workflow
[2] https://docs.wso2.com/display/AM210/Customizing+a+Workflow+Extension

Get access to fusion tables

I'd like to show some map layers on my webpage, so I decided to give this Google service a try. As the data is collected in a database on my server, I chose to use a service account as explained here, and then use the generated private key in my PHP script.
Everything works fine when creating a table and inserting some test values. I get the table ID and I'm able to play with it from my script. The problem is that I don't know how to access these tables from the web browser. In my API console the usage stats are shown fine, but when logging in to Google Drive with my account I don't see any tables there.
Where am I supposed to access them, if that is possible at all? Do either the apps.googleusercontent.com or developer.gserviceaccount.com accounts play any role in logging into some other service to get access through the web?
I also have an associated API key, but when trying to query a table I get a 401 error.
Any hint? I'm feeling a bit lost now. Thanks.
You are using a Service Account right?
So when you create a table with this account, this account will be the table owner. No one else has permission to see this table.
When you access the Fusion Tables web interface with your Personal Account, you will only see tables that you created with your Personal Account.
If you wish to inspect the tables created with your Service Account, you have to use the Google Drive API with your Service Account credentials to give access permission to your Personal Account.
Also if you wish to make your table (or any other type of document) public, you need to use this Google Drive API again.
See more about the topic here:
https://developers.google.com/drive/v2/reference/permissions/insert
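Concretely, granting your Personal Account access to a table owned by the Service Account is a single permissions insert with the Drive v2 API, authorized with the Service Account's token. Something like this (the email is whichever account should gain access):

POST https://www.googleapis.com/drive/v2/files/[table_id]/permissions
Authorization: Bearer [service_account_access_token]
Content-Type: application/json

{
    "role": "reader",
    "type": "user",
    "value": "your.personal.account@gmail.com"
}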
Tip: if you want to achieve something on behalf of your Service Account that you only need once (so there is no need to implement logic for it in your webapp), I'd seriously advise you to consider using the OAuth2 Playground. You can set your Service Account credentials in the "Settings" and issue authorized requests on behalf of your Service Account. Very useful tool, no coding needed.
