Active Directory integration with Corda

Hyperledger Fabric allows you to integrate LDAP with the Fabric CA, which lets an organization reuse its existing user access management for Fabric applications. Is there any such functionality in R3 Corda for integrating an existing LDAP with Corda's authentication layer?

You can achieve that in an indirect way.
Typically, your node.conf defines the RPC users and their privileges as a static list:
rpcUsers = [
    {
        username = exampleUser
        password = examplePass
        permissions = [
            "ALL"
        ]
    },
    ...
]
But Corda has another option: using a database as the source of credentials. With this approach:
Your node.conf provides role names instead of user names.
Your database must define several tables: users, user_roles, and roles_permissions.
This way you can add/remove users or grant/revoke privileges directly in the database, without modifying your node.conf or restarting your node.
I'm sure that with this approach you can build an integration that has your LDAP update those tables.
You can find more details on this approach here.
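For reference, the database-backed variant replaces the static rpcUsers list with a security block along these lines (a sketch based on the Corda docs; the JDBC connection details and the password-encryption setting depend on your database):

```hocon
security = {
    authService = {
        dataSource = {
            type = "DB"
            passwordEncryption = "SHIRO_1_CRYPT"
            connection = {
                jdbcUrl = "jdbc:mysql://localhost:3306/rpc_security"
                username = "node_rpc"
                password = "changeit"
                driverClassName = "com.mysql.jdbc.Driver"
            }
        }
    }
}
```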

The question seems ambiguous, but here goes.
If you are asking about RPC connection credentials, then I'd probably follow Adel's suggestion of using a database as the intermediary, in short:
node.conf dataSource --points to--> RDBMS --syncs with--> LDAP
You'd have to set up an RDBMS/LDAP sync mechanism; if JPA is relevant to you as an ORM, you can probably reuse the entities predefined here by yours truly.
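As a sketch of what such a sync job does (the table names and schema here are illustrative, not the exact schema Corda expects; an in-memory SQLite database and a hard-coded list stand in for the real RDBMS and the LDAP search results):

```python
import sqlite3

# Hypothetical LDAP search results; in practice these would come from an
# LDAP client library querying your directory.
ldap_entries = [
    {"uid": "alice", "password_hash": "h1", "roles": ["reader"]},
    {"uid": "bob", "password_hash": "h2", "roles": ["reader", "admin"]},
]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (username TEXT PRIMARY KEY, password TEXT);
    CREATE TABLE user_roles (username TEXT, role_name TEXT);
""")

def sync(conn, entries):
    # Replace the credential tables with the current directory contents.
    conn.execute("DELETE FROM users")
    conn.execute("DELETE FROM user_roles")
    for e in entries:
        conn.execute("INSERT INTO users VALUES (?, ?)",
                     (e["uid"], e["password_hash"]))
        for role in e["roles"]:
            conn.execute("INSERT INTO user_roles VALUES (?, ?)",
                         (e["uid"], role))
    conn.commit()

sync(conn, ldap_entries)
```

Running the sync periodically (or from an LDAP change listener) keeps the node's credential tables in step with the directory, with no node restart required.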
If it's about app-level user auth, have a look at Corda Accounts. You should be able to map between account IDs and (R)DNs using a UUID originating on either side.
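A minimal illustration of that mapping idea (purely hypothetical names; the real lookup would live wherever you keep account metadata):

```python
import uuid

# Hypothetical two-way mapping between Corda account IDs and LDAP DNs,
# keyed by a shared UUID minted on either side.
account_by_uuid = {}
dn_by_uuid = {}

def link(shared_id: uuid.UUID, account_name: str, dn: str) -> None:
    account_by_uuid[shared_id] = account_name
    dn_by_uuid[shared_id] = dn

shared_id = uuid.uuid4()
link(shared_id, "alice-account", "uid=alice,ou=people,dc=example,dc=com")

# Given the shared UUID, resolve either side of the mapping.
print(account_by_uuid[shared_id])  # alice-account
print(dn_by_uuid[shared_id])       # uid=alice,ou=people,dc=example,dc=com
```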

Related

How to add Azure custom Policy for Azure Data Factory to only use Azure Key Vault during the Linked Service Creation?

How to add Azure custom Policy for Azure Data Factory to only use Azure Key Vault during the Linked Service Creation for fetching the Data Store Credentials instead of credentials being put up directly in ADF Linked Service. Please suggest ARM or PowerShell methods for the policy implementation.
As of yesterday, the Data Factory Azure Policy integration is available, which means you can now find some built-in policies that can be assigned to ADF.
One of those is exactly what you're asking for. You can find more information here.
Edit: Based on your comment, I'm editing this answer with the info you want. When it comes to custom policies, it's pretty much up to you to come up with them and create what fits your needs. In your particular case, I've created one policy that does what you want, please see here.
This policy will audit your data factory linked services and check if they're using a self-hosted integration runtime. Currently, that check is only done for a few types of integration runtimes (if you look at the policy, you can see 5 of them) which means that if you want to check more types of linked services, you'll need to add them to the list of allowed values and select them when assigning the policy definition.
Bear in mind that for some linked service types, such as Key Vault, that check won't make sense, since that service can't use a self-hosted IR.
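For a rough idea of the shape of such a custom definition, here is a skeleton that simply audits every Data Factory linked service (the real policy adds conditions on the linked service's integration runtime reference; treat this as a starting point only):

```json
{
  "mode": "All",
  "policyRule": {
    "if": {
      "field": "type",
      "equals": "Microsoft.DataFactory/factories/linkedservices"
    },
    "then": {
      "effect": "audit"
    }
  }
}
```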

Set someone else as cluster admin via kql syntax

I created a Kusto cluster and database under one of my accounts on one Azure subscription, but now I want to grant cluster admin permissions to another of my accounts that is not part of this subscription.
I need to do this via a KQL command, or some other way where I can manually specify which users become admins.
Is there such a thing as Cluster Admin permissions?
I added my other account as an admin to one of the databases in my cluster using
.add database DatabaseName admins ('aaduser=username@email.com')
but I cannot seem to do the same on a cluster level. How can I do this?
Cluster admin isn't a role you can add principals to.
You're likely looking for the All databases admin role: https://learn.microsoft.com/en-us/azure/data-explorer/kusto/management/access-control/role-based-authorization
You can add principals to that role via the Azure portal, or programmatically as explained here (note: there's a dropdown for C#, python, and an ARM template): https://learn.microsoft.com/en-us/azure/data-explorer/cluster-principal-python
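If you go the ARM route, a cluster principal assignment resource looks roughly like this (property values are placeholders; check the current apiVersion for Microsoft.Kusto before using):

```json
{
  "type": "Microsoft.Kusto/clusters/principalAssignments",
  "apiVersion": "2022-02-01",
  "name": "[concat(parameters('clusterName'), '/', parameters('principalAssignmentName'))]",
  "properties": {
    "principalId": "user@example.com",
    "principalType": "User",
    "role": "AllDatabasesAdmin",
    "tenantId": "[parameters('tenantId')]"
  }
}
```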

Set policy and role in AWS to connect API Gateway and DynamoDB

I am trying to stream data from the AWS API Gateway to DynamoDB in AWS (directly, without something like Lambda). I have looked at several tutorials, such as [1] and [2], which describe exactly this scenario. All of these assume that the right policies and roles are in place. Normally, I play and hack around until I get a working proof of concept, after which I rebuild a proper model, but with access rights I want to make sure I understand what I am doing. For [2], I also found a Stack Overflow question at [3] from somebody with the same problem that got solved, but I'm not sure exactly how. I also looked at [4], which describes API Gateway with Lambda.
Here is my guess:
1. Create a policy that allows calling from the API Gateway. "AmazonAPIGatewayInvokeFullAccess" fits the name, but might not be necessary and grants more access than needed.
2. Create a policy that allows access to DynamoDB. Here, "AmazonDynamoDBFullAccess" might be appropriate, even though it might be overkill (too much access) and might only work from the Management Console.
3. Create a role that has those two policies attached. Here I run into trouble: when I click "Create role" and select "AWS service", I cannot find the correct "service that will use this role" matching the policies above. For example, when clicking DynamoDB, I get the following use cases, none of which seem to relate to the DynamoDB full-access policy:
   - Amazon DynamoDB Accelerator (DAX) - DynamoDB access
   - DynamoDB - Global Tables
   - DynamoDB Accelerator (DAX) - Cluster management
My main question is: How do I set the right minimal set of roles and policies to connect AWS API Gateway to DynamoDB (read and write), as described in [1]?
[1] https://sanderknape.com/2017/10/creating-a-serverless-api-using-aws-api-gateway-and-dynamodb/
[2] https://aws.amazon.com/blogs/compute/using-amazon-api-gateway-as-a-proxy-for-dynamodb/
[3] API Gateway does not have permission to assume the provided role DynamoDB
[4] https://docs.aws.amazon.com/apigateway/latest/developerguide/permissions.html
What you need to do is create an IAM Service Role that allows API Gateway to assume this role. You can easily do this through the UI. When you create a new role, the "Service Role" is selected by default and below the "Choose the service that will use this role" header, you can select API Gateway.
A role is a container of permissions that can be assumed by a certain entity (in our case, an API Gateway API resource). Your role needs permissions to be of any use; you add these permissions by attaching policies to the role. This is explained more in depth here: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html
Be sure to read the AWS service role part. You say that you need to "Create a policy that allows calling from the API Gateway", but this is incorrect: you need to create a role that can be assumed by API Gateway.
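Concretely, the role's trust policy (the part that lets API Gateway assume it, which the console generates for you when you pick API Gateway as the service) looks like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "apigateway.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```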
In your case, you'll want specific DynamoDB permissions for your role. Following the least-privilege principle as you mention, you should only add the specific actions for the specific DynamoDB table. The list of possible permissions can be found here: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/api-permissions-reference.html
Let's say you only want API Gateway to get items from a specific table. Your policy might then look something like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "dynamodb:GetItem",
            "Resource": "arn:aws:dynamodb:eu-west-1:[aws_account_id]:table/[table_name]"
        }
    ]
}
Hope this helps!
This recent tutorial by Ankana Likhita Sri Priya starts off with detailed screenshots of the IAM setup (policy, role, etc.): https://medium.com/@likhita507/using-api-gateway-to-get-data-from-dynamo-db-using-without-using-aws-lambda-e51434a4f5a0

Explicitly allow usage of production API

I'm exploring the WSO2 API Manager platform for use in an Open API project. The idea is that we forbid registration in the Store and create users ourselves. But we also want to give them only the Sandbox API as a starting point and then explicitly allow particular users to consume the Production API. I haven't found any information on this. Is it possible? If yes, where should I look?
You can restrict the token generation for the Production endpoints by using Workflows. Follow the documentation[1].
You could configure ProductionApplicationGeneration to use ApplicationRegistrationWSWorkflowExecutor and SandboxApplicationGeneration to use ApplicationRegistrationSimpleWorkflowExecutor.
With this approach, if a subscriber tries to generate a token for the production endpoints, it triggers a human task, which needs to be approved from the Admin Portal.
For your requirement, you could write a custom workflow extension that restricts by role or user name. For more information on writing a custom workflow extension, please follow [2].
[1] https://docs.wso2.com/display/AM210/Adding+an+Application+Registration+Workflow
[2] https://docs.wso2.com/display/AM210/Customizing+a+Workflow+Extension
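In workflow-extensions.xml, that split would look something like this (a sketch; element names and executor classes are as I recall them from the WSO2 docs, and the WS executor also needs service endpoint/credential properties that are omitted here):

```xml
<WorkFlowExtensions>
    <ProductionApplicationRegistration
        executor="org.wso2.carbon.apimgt.impl.workflow.ApplicationRegistrationWSWorkflowExecutor">
        <!-- serviceEndpoint, username, password properties go here -->
    </ProductionApplicationRegistration>
    <SandboxApplicationRegistration
        executor="org.wso2.carbon.apimgt.impl.workflow.ApplicationRegistrationSimpleWorkflowExecutor"/>
</WorkFlowExtensions>
```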

Search engine bundle with dynamic DBAL connection

I have a SF2 application using multiple database connections. There is a database for each user account, so their number isn't static.
I use a DBAL connection factory to access a database depending on which user is logged in.
I need to integrate a search engine bundle that works well with this type of configuration.
It should be able to :
generate search indexes per database
manage indexes for databases that have identical structures (same table and column names)
do search requests only in the database related to the logged user.
I know it's a custom configuration and not very common, but I hope somebody can help me find at least the start of a solution here. Thanks.
I have already checked the Elastica and Solr bundles, but I don't know how to configure them for this without modifying their code.
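One possible direction, sketched here in Python for brevity (purely illustrative and not tied to the Elastica or Solr bundle APIs): derive the index name from the logged-in user, so that structurally identical databases each get their own index, and scope every query to that one index.

```python
def index_for_user(user_id: str) -> str:
    # One index per user database; identical structure, distinct name,
    # mirroring the per-user DBAL connection factory.
    return f"app_index_{user_id}"

def search(client, user_id: str, query: str):
    # Scope the request to the logged-in user's index only. `client` is a
    # placeholder for whatever search client the chosen bundle exposes.
    return client.search(index=index_for_user(user_id), query=query)

print(index_for_user("42"))  # app_index_42
```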