How to add an Azure custom policy for Azure Data Factory to only use Azure Key Vault during linked service creation?

How do I add an Azure custom policy so that Azure Data Factory can only use Azure Key Vault for fetching data store credentials during linked service creation, instead of credentials being entered directly in the ADF linked service? Please suggest ARM or PowerShell methods for implementing the policy.

As of yesterday, the Data Factory Azure Policy integration is available, which means you can now find some built-in policies that can be assigned to ADF.
One of those does exactly what you're asking for. You can find more information here.
Edit: Based on your comment, I'm editing this answer with the info you want. When it comes to custom policies, it's pretty much up to you to come up with them and create what fits your needs. In your particular case, I've created one policy that does what you want, please see here.
This policy will audit your data factory linked services and check if they're using a self-hosted integration runtime. Currently, that check is only done for a few linked service types (if you look at the policy, you can see 5 of them), which means that if you want to check more types of linked services, you'll need to add them to the list of allowed values and select them when assigning the policy definition.
Bear in mind that for some linked service types, such as Key Vault, that check won't make sense, since that service can't use a self-hosted IR.
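For reference, a custom definition for a case like this follows the standard Azure Policy JSON structure shown in the minimal sketch below. The resource type is real, but the alias in the second condition is an illustrative assumption; list the aliases that actually exist for your API version with Get-AzPolicyAlias -NamespaceMatch 'Microsoft.DataFactory' before relying on it.

{
  "mode": "All",
  "parameters": {
    "listOfLinkedServiceTypes": {
      "type": "Array",
      "metadata": { "displayName": "Linked service types to audit" }
    }
  },
  "policyRule": {
    "if": {
      "allOf": [
        {
          "field": "type",
          "equals": "Microsoft.DataFactory/factories/linkedservices"
        },
        {
          "field": "Microsoft.DataFactory/factories/linkedservices/properties.type",
          "in": "[parameters('listOfLinkedServiceTypes')]"
        }
      ]
    },
    "then": { "effect": "audit" }
  }
}

For the PowerShell side of the question, a definition like this can be created with New-AzPolicyDefinition -Name 'audit-adf-linked-services' -Policy 'policy.json' and then assigned with New-AzPolicyAssignment.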

Related

How to get the role assignments of a resource through Resource Graph API?

I want to use the Azure Resource Graph API to get the role assignments of a resource (who are owners, contributors, etc.). That is, I want to create a query that finds the role assignments for a specific resource id that I provide. I've been going through the documentation, but I haven't found any way to get this information.
The only thing I found was this question from a couple of years ago, where it is mentioned as something that could be done somehow ("query the RBAC of each one of those resources").
Could anyone point me to how this could be done? Or is it not possible in the Resource Graph API, and do I need to use the Management API or something else?
I searched through the Azure Resource Graph table and resource type reference and the advanced Resource Graph query samples, but didn't find an answer.
I tried to reproduce the same in my environment and got the results below.
I created an Azure AD application and added API permissions.
I generated an access token by using the parameters below:
https://login.microsoftonline.com/TenantID/oauth2/v2.0/token
client_id:xxxxxx-xxx-xxx-xxxx-xxxxxxxx
client_secret:ClientSecret
scope:https://management.azure.com//.default
grant_type:client_credentials
To list the Role assignments in the subscription scope, I used the below query:
GET https://management.azure.com/subscriptions/subscriptionId/providers/Microsoft.Authorization/roleAssignments?api-version=2022-04-01
Based on your requirement, you can change the scope and add a filter to get the role assignments. Refer to the MS Doc below:
List Azure role assignments using the REST API - Azure RBAC
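For a single resource (the scenario in the question), the same endpoint can be called at the resource scope, and $filter=atScope() limits the results to assignments that apply at or above that scope:
GET https://management.azure.com/subscriptions/subscriptionId/resourceGroups/RGName/providers/Providername/ResourceType/Resource/providers/Microsoft.Authorization/roleAssignments?api-version=2022-04-01&$filter=atScope()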
Currently, it is not feasible to retrieve role assignments via Azure Resource Graph. Alternatively, you can make use of Azure PowerShell or the Azure CLI.
Get-AzRoleAssignment -Scope "/subscriptions/SubscriptionId/resourcegroups/RGName/providers/Providername/ResourceType/Resource"
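The Azure CLI equivalent takes the same scope string:
az role assignment list --scope "/subscriptions/SubscriptionId/resourcegroups/RGName/providers/Providername/ResourceType/Resource"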

Set policy and role in AWS to connect API Gateway and DynamoDB

I am trying to stream data from the AWS API Gateway to DynamoDB in AWS (directly, without something like Lambda). I have looked at several tutorials, such as [1] and [2], which describe exactly this scenario. All of these assume that the right policies and roles are in place. Normally, I play and hack around till I get a working proof of concept, after which I rebuild a proper model, but with access rights I want to make sure I understand what I am doing. For [2], I also found a Stack Overflow question at [3] from somebody with the same problem that got solved, but I'm not sure exactly how. I also looked at [4], describing API Gateway with Lambda.
Here is my guess:
1. Create a policy that allows calling from the API Gateway. "AmazonAPIGatewayInvokeFullAccess" fits the name, but might not be necessary and is overkill with too much access.
2. Create a policy that allows access to DynamoDB. Here, "AmazonDynamoDBFullAccess" might be appropriate, even though it might be overkill (too much access), and might only work from the Management Console.
3. Create a role that has those two policies attached. Here, I run into the trouble that when I click create role and select AWS service, I cannot find the correct "service that will use this role" that has the policies I described above behind it. For example, when clicking DynamoDB, I get the following "use-cases", none of which seem to relate to the DynamoDB full access policy:
- Amazon DynamoDB Accelerator (DAX) - DynamoDB access
- DynamoDB - Global Tables
- DynamoDB Accelerator (DAX) - Cluster management
My main question is: How do I set the right minimal set of roles and policies to connect AWS API Gateway to DynamoDB (read and write), as described in [1]?
[1] https://sanderknape.com/2017/10/creating-a-serverless-api-using-aws-api-gateway-and-dynamodb/
[2] https://aws.amazon.com/blogs/compute/using-amazon-api-gateway-as-a-proxy-for-dynamodb/
[3] API Gateway does not have permission to assume the provided role DynamoDB
[4] https://docs.aws.amazon.com/apigateway/latest/developerguide/permissions.html
What you need to do is create an IAM Service Role that allows API Gateway to assume this role. You can easily do this through the UI. When you create a new role, the "Service Role" is selected by default and below the "Choose the service that will use this role" header, you can select API Gateway.
A role is a container of permissions that can be assumed by a certain entity (in our case, an API Gateway API resource). Your role needs "permissions" for the role to have any use. You add these permissions by adding policies to your role. This is explained more in depth here: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html
Be sure to read the AWS Service Role part. You say that you need to "Create a policy that allows calling from the API Gateway" but this is incorrect: you need to create a role that can be assumed by API Gateway.
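For illustration, the trust relationship document on such a role, i.e. the part that lets API Gateway assume it, looks like this (the console generates it for you when you pick API Gateway as the service that will use the role):

{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "apigateway.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}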
In your case, you'll want specific DynamoDB permissions for your role. Following the least-privilege principle as you mention, you should only add the specific actions for the specific DynamoDB table. The list of possible permissions can be found here: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/api-permissions-reference.html
Let's say you only want API Gateway to get items from a specific table. Your policy might look something like this then:
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": "dynamodb:GetItem",
    "Resource": "arn:aws:dynamodb:eu-west-1:[aws_account_id]:table/[table_name]"
  }]
}
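Attach a policy with this statement to the role, then reference the role's ARN as the execution role (credentials) of the API Gateway integration request; that is what actually authorizes the method to call DynamoDB.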
Hope this helps!
This recent tutorial by ankana likhita sri priya starts off with IAM (policy, role, etc.) in high detail, with screenshots: https://medium.com/@likhita507/using-api-gateway-to-get-data-from-dynamo-db-using-without-using-aws-lambda-e51434a4f5a0

Explicitly allow usage of production API

I'm exploring the WSO2 API Manager platform for use in an Open API project. The idea is that we forbid registration in the Store and create users ourselves, but we also want to give them only the Sandbox API as a starting point and then explicitly allow particular users to consume the Production API. I haven't found any information. Is it possible? If yes, where should I look?
You can restrict the token generation for the Production endpoints by using workflows. Follow the documentation [1].
You could configure ProductionApplicationRegistration to use ApplicationRegistrationWSWorkflowExecutor and SandboxApplicationRegistration to use ApplicationRegistrationSimpleWorkflowExecutor (a sketch of this configuration follows the links below).
With this approach, if the subscriber tries to generate a token for production endpoints, it will trigger a human task, which needs to be approved from the Admin Portal.
For your requirement, you could write a custom workflow extension which allows restriction by role or user name. For more information on writing a custom workflow extension, please follow [2].
[1] https://docs.wso2.com/display/AM210/Adding+an+Application+Registration+Workflow
[2] https://docs.wso2.com/display/AM210/Customizing+a+Workflow+Extension
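For reference, the executors are switched in the workflow-extensions.xml registry resource. The sketch below follows the AM 2.x documentation; the endpoint and credential values are placeholders to adapt, and the element names should be verified against your product version.

<WorkFlowExtensions>
    <!-- Production key generation goes through a human approval (WS) workflow -->
    <ProductionApplicationRegistration executor="org.wso2.carbon.apimgt.impl.workflow.ApplicationRegistrationWSWorkflowExecutor">
        <Property name="serviceEndpoint">http://localhost:9765/services/ApplicationRegistrationWorkFlowProcess/</Property>
        <Property name="username">admin</Property>
        <Property name="password">admin</Property>
        <Property name="callbackURL">https://localhost:8243/services/WorkflowCallbackService</Property>
    </ProductionApplicationRegistration>
    <!-- Sandbox key generation is approved automatically -->
    <SandboxApplicationRegistration executor="org.wso2.carbon.apimgt.impl.workflow.ApplicationRegistrationSimpleWorkflowExecutor"/>
</WorkFlowExtensions>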
Thanks and Regards

Separate APIM Stores in internal and DMZ network

We'd like to create separate APIM stores in our internal network and DMZ. I've been going through the documentation, and I've seen you can publish to multiple stores (https://docs.wso2.com/display/AM200/Publish+to+Multiple+External+API+Stores), but this is not exactly what I'm looking for, since you need to visit the "main" store to subscribe to an API.
I'd like to have the option, from a single publisher instance, to check off to which stores an API must be published, much like the way you can decide to which API gateways you publish your APIs.
Any thoughts or help on this would be great.
Thanks,
Danny
Once an API is published in the Publisher, the API artifacts are stored in the registry, which is shared between the Store and the Publisher. The API Store gets artifacts from this registry and displays them. So:
- When creating APIs, use tags to differentiate artifacts, e.g. tag them DMZ or Internal.
- Modify the Store to fetch and display artifacts based on those tags.

REST APIs and Pega

I can see that you have some expertise with REST APIs and Pega. I would like to know: if we expose a web call using REST APIs to Pega, will we get all the custom rules, or do we need to replicate the rules?
Regards,
Sudhanshu
If you are consuming the REST API and will be using Pega as the REST client, you do not have to create the rules manually. There is an accelerator ("wizard") that will create the rules for you, based on an example request/response for the REST API.
REST means "Representational State Transfer". To create a REST integration, Pega provides a wizard; using that wizard, we can create it. Please visit the link below for more details:
https://myknowpega.blogspot.com/2019/04/pega-81-application-development.html
Since your question is not entirely clear, let me cover both options (i.e., connector and service). Also, I am assuming you are using the latest version of Pega.
Integrating Pega with an external service (via Connect-REST)
This can be achieved using the integration wizard by navigating as specified below: Configure -> Integration -> Connectors -> Create REST Integration
Step-by-step documentation is available here:
https://docs.pega.com/data-management-and-integration/84/creating-rest-integration
Pega Exposed as a service to external systems via Service-REST
It is a bit of a manual process, involving the creation of service packages and Service-REST rules, and configuring the methods (GET/POST/PUT/PATCH/DELETE) and their corresponding responses.
Documentation is available at the link below.
https://docs.pega.com/data-management-and-integration/84/service-rest-rules
