On-prem ASP.NET Framework web app with Azure Key Vault

We're in the process of trying to secure our application secrets in our internal ASP.NET Framework web applications. The initial plan offered to me was to use Azure Key Vault. I began development work using my Visual Studio Enterprise subscription, and that seems to work fine, locally.
We've created a second Key Vault in our company's production environment, and again, I can use it locally, because my own AAD account has access to the vault. However, in this project (a 4.7.2 Web Forms web application), I don't see any means of specifying the Access Policy principal that we've created for the application.
My google-fu is failing me: is there any documentation that explains how to do this? Is this scenario -- an on-prem ASP.NET Framework app, outside of the Azure environment, accessing Key Vault for configuration values -- even possible?
Thanks.
UPDATE: I was unable to find a solution that would allow me to use the Access Policy principal from within the "Add Connected Service" dialog. I'm somewhat surprised it's not in there, or is hidden enough to elude me. So I ended up writing my own Key Vault Secret-Reader function, similar to the marked answer. Hope this helps someone...

In this scenario, your option is to use a service principal to access the Key Vault. Please follow the steps below; my sample gets a secret from the Key Vault.
1. Register an application with Azure AD and create a service principal.
2. Get the values for signing in and create a new application secret.
3. Navigate to the Key Vault in the portal -> Access policies -> add the appropriate secret permissions for the service principal.
4. Then use the code below, replacing <client-id>, <tenant-id>, and <client-secret> with the values obtained above.
using System;
using Microsoft.Azure.KeyVault;
using Microsoft.Azure.Services.AppAuthentication;

namespace test1
{
    class Program
    {
        static void Main(string[] args)
        {
            // Authenticate as the service principal via a connection string.
            var azureServiceTokenProvider = new AzureServiceTokenProvider("RunAs=App;AppId=<client-id>;TenantId=<tenant-id>;AppKey=<client-secret>");
            var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));

            // GetSecretAsync returns a SecretBundle; .Value holds the secret itself.
            var secret = kv.GetSecretAsync("https://keyvaultname.vault.azure.net/", "mySecret123").GetAwaiter().GetResult();
            Console.WriteLine(secret.Value);
        }
    }
}

Related

Data Protection using Entity Framework Core

So I have followed Microsoft's official guide (https://learn.microsoft.com/el-gr/aspnet/core/security/data-protection/implementation/key-storage-providers?view=aspnetcore-2.2&tabs=visual-studio) for encrypting data and storing it in the database using Entity Framework Core, but I can't make it work across multiple machines. I used the Entity Framework Core implementation because the guide says "With this package, keys can be shared across multiple instances of a web app." The app works perfectly when using the deployed version, for example xyz.com, but it doesn't let me work with the same data from localhost. Will this be a problem later, when my virtual machine is maxed out and I want to add another one? If so, how can I make it work on both the deployed site and different machines? There is no tutorial which implements this; I have searched everywhere. Thank you very much.
services.AddDataProtection()
    .UseCryptographicAlgorithms(
        new AuthenticatedEncryptorConfiguration()
        {
            EncryptionAlgorithm = EncryptionAlgorithm.AES_256_CBC,
            ValidationAlgorithm = ValidationAlgorithm.HMACSHA256,
        })
    .PersistKeysToDbContext<DataContext>();
Update 12-6-2019
So I followed Microsoft's documentation (https://learn.microsoft.com/en-us/aspnet/core/security/data-protection/implementation/key-encryption-at-rest?view=aspnetcore-2.2) and it states:
"If the app is spread across multiple machines, it may be convenient to distribute a shared X.509 certificate across the machines and configure the hosted apps to use the certificate for encryption of keys at rest"
I generated an X.509 certificate using this tutorial:
https://www.youtube.com/watch?v=1xtBkukWiek
My updated code:
services.AddDataProtection()
    .UseCryptographicAlgorithms(
        new AuthenticatedEncryptorConfiguration()
        {
            EncryptionAlgorithm = EncryptionAlgorithm.AES_256_CBC,
            ValidationAlgorithm = ValidationAlgorithm.HMACSHA256,
        })
    .ProtectKeysWithCertificate(new X509Certificate2("wibit-test-cert.pfx", "password"))
    .PersistKeysToDbContext<DataContext>();
When testing on my local machine it works fine, but when I deploy it, I get this error:
error: "The system cannot find the file specified"
I have tried several ways to fix it, including _hostingEnvironment.ContentRootPath and WebRootPath. Both of these, and the path I use in the updated code, work on my machine but not in the deployed app.
Any clues?
I finally fixed it!
The problem was that I didn't set the application name:
.SetApplicationName("myapp")
And I changed the path of the certificate to this:
.ProtectKeysWithCertificate(new X509Certificate2(Path.Combine(_hostingEnvironment.ContentRootPath, "wibit-test-cert.pfx"), "password"))
It may also be a permission problem: when I hosted the app on A2Hosting it couldn't find the specified file (wibit-test-cert.pfx), but when I deployed to GCP it worked!
Now I can encrypt and decrypt data using the same database with different apps.
So my final code is this:
services.AddDataProtection()
    .UseCryptographicAlgorithms(
        new AuthenticatedEncryptorConfiguration()
        {
            EncryptionAlgorithm = EncryptionAlgorithm.AES_256_CBC,
            ValidationAlgorithm = ValidationAlgorithm.HMACSHA256,
        })
    .SetApplicationName("myapp")
    .ProtectKeysWithCertificate(new X509Certificate2(Path.Combine(_hostingEnvironment.ContentRootPath, "wibit-test-cert.pfx"), "password"))
    .PersistKeysToDbContext<DataContext>();
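To show why SetApplicationName matters, here is a hypothetical consumer class (the type name and the "PersonalData.v1" purpose string are placeholders, not from the original post). Any app that should decrypt the same data must use the same application name, the same key store (the shared DbContext), the same certificate, and the same purpose string:

using Microsoft.AspNetCore.DataProtection;

// Hypothetical consumer: round-tripping across apps only works when both apps
// resolve keys from the same store and use the same application name and purpose.
public class PersonalDataProtector
{
    private readonly IDataProtector _protector;

    public PersonalDataProtector(IDataProtectionProvider provider)
    {
        // The purpose string is part of the key derivation; it must match on both sides.
        _protector = provider.CreateProtector("PersonalData.v1");
    }

    public string Protect(string plaintext) => _protector.Protect(plaintext);

    public string Unprotect(string ciphertext) => _protector.Unprotect(ciphertext);
}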

Not able to connect to Azure Key Vault when using Service Identity

I am trying to retrieve secrets from Azure Key Vault using a Managed Service Identity in an ASP.NET 4.6.2 web application. I am using the code as outlined in this article. Locally, things work fine, though that's because it uses my identity. When I deploy the application to Azure, I get an exception when keyVaultClient.GetSecretAsync(keyUrl) is called.
As best as I can tell everything is configured correctly. I created a user-assigned identity so it could be reused, and made sure that identity had Get access to secrets and keys in the Key Vault access policy.
The exception is an AzureServiceTokenProviderException. It is verbose and outlines how it tried four methods to authenticate. The information I'm concerned about is when it tries to use Managed Service Identity:
Tried to get token using Managed Service Identity. Access token could
not be acquired. MSI ResponseCode: BadRequest, Response:
I checked application insights and saw that it tried to make the following connection with a 400 result error:
http://127.0.0.1:41340/MSI/token/?resource=https://vault.azure.net&api-version=2017-09-01
There are two things interesting about this:
Why is it trying to connect to a localhost address? This seems wrong.
Could this be getting a 400 back because the resource parameter isn't escaped?
In the MsiAccessTokenProvider source, it only uses that form of an address when the environment variables MSI_ENDPOINT and MSI_SECRET are set. They are not set in application settings, but I can see them in the debug console when I output environment variables.
At this point I don't know what to do. The examples online all make it seem like magic, but if I'm right about the source of the problem then there's some obscure automated setting that needs fixing.
For completeness, here is all of my relevant code:
public class ServiceIdentityKeyVaultUtil : IDisposable
{
    private readonly AzureServiceTokenProvider azureServiceTokenProvider;
    private readonly Uri baseSecretsUri;
    private readonly KeyVaultClient keyVaultClient;

    public ServiceIdentityKeyVaultUtil(string baseKeyVaultUrl)
    {
        baseSecretsUri = new Uri(new Uri(baseKeyVaultUrl, UriKind.Absolute), "secrets/");
        azureServiceTokenProvider = new AzureServiceTokenProvider();
        keyVaultClient = new KeyVaultClient(
            new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
    }

    public async Task<string> GetSecretAsync(string key, CancellationToken cancellationToken = new CancellationToken())
    {
        var keyUrl = new Uri(baseSecretsUri, key).ToString();
        try
        {
            var secret = await keyVaultClient.GetSecretAsync(keyUrl, cancellationToken);
            return secret.Value;
        }
        catch (Exception ex)
        {
            /** rethrows error with extra details */
            throw;
        }
    }

    /** IDisposable support */
}
UPDATE #2 (I erased update #1)
I created a completely new app and a new service instance and was able to recreate the error. However, in all instances I was using a user-assigned identity. If I remove that and use a system-assigned identity, then it works just fine.
I don't know why these would behave differently. Does anybody have an insight? I would prefer the user-assigned one.
One of the key differences of a user-assigned identity is that you can assign it to multiple services. It exists as a separate asset in Azure, whereas a system identity is bound to the lifecycle of the service to which it is paired.
From the docs:
A system-assigned managed identity is enabled directly on an Azure service instance. When the identity is enabled, Azure creates an identity for the instance in the Azure AD tenant that's trusted by the subscription of the instance. After the identity is created, the credentials are provisioned onto the instance. The lifecycle of a system-assigned identity is directly tied to the Azure service instance that it's enabled on. If the instance is deleted, Azure automatically cleans up the credentials and the identity in Azure AD.
A user-assigned managed identity is created as a standalone Azure resource. Through a create process, Azure creates an identity in the Azure AD tenant that's trusted by the subscription in use. After the identity is created, the identity can be assigned to one or more Azure service instances. The lifecycle of a user-assigned identity is managed separately from the lifecycle of the Azure service instances to which it's assigned.
User-assigned identities are still in preview for App Services; see the documentation here. It may still be in private preview (i.e. Microsoft has to explicitly enable it on your subscription), it may not be available in the region you have selected, or it could be a defect.
To use a user-assigned identity, the HTTP call to get a token must include the identity's ID; otherwise it will attempt to use a system-assigned identity.
Why is it trying to connect to a localhost address? This seems wrong.
Because the MSI endpoint is local to App Service, only accessible from within the instance.
Could this be getting a 400 back because the resource parameter isn't escaped?
Yes, but I don't think that was the reason here.
In the MsiAccessTokenProvider source, it only uses that form of an address when the environment variables MSI_ENDPOINT and MSI_SECRET are set. They are not set in application settings, but I can see them in the debug console when I output environment variables.
These are added by App Service invisibly, not added to app settings.
As for how to use the user-assigned identity, I couldn't see a way to do that with the AppAuthentication library. You could make the HTTP call manually in Azure: https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/how-to-use-vm-token#get-a-token-using-http.
Then you've got to take care of caching yourself, though! Managed identity endpoints can't handle a lot of queries at one time :)
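To illustrate, here is a minimal sketch (not the AppAuthentication library's API) of that manual call against the local App Service MSI endpoint, passing the user-assigned identity's client ID. The class and method names are made up; MSI_ENDPOINT and MSI_SECRET are the environment variables App Service injects:

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class UserAssignedMsiToken
{
    // Sketch: calls the local App Service MSI endpoint (api-version 2017-09-01).
    // "clientId" identifies the user-assigned identity; without the clientid
    // parameter, the endpoint falls back to the system-assigned identity.
    public static async Task<string> GetTokenJsonAsync(string resource, string clientId)
    {
        var endpoint = Environment.GetEnvironmentVariable("MSI_ENDPOINT");
        var secret = Environment.GetEnvironmentVariable("MSI_SECRET");
        var url = $"{endpoint}?resource={Uri.EscapeDataString(resource)}" +
                  $"&api-version=2017-09-01&clientid={clientId}";

        using (var client = new HttpClient())
        {
            var request = new HttpRequestMessage(HttpMethod.Get, url);
            request.Headers.Add("Secret", secret);
            var response = await client.SendAsync(request);
            response.EnsureSuccessStatusCode();
            // JSON containing access_token, expires_on, etc. - cache it yourself.
            return await response.Content.ReadAsStringAsync();
        }
    }
}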

ASP.Net Identity in Azure Cloud

I have a problem with ASP.NET Identity used in an application scaled to two instances. It seems that we have a problem with checking passwords (checking hashes?) or verifying generated tokens (for example: password reset). When the application runs on one instance, everything seems to be fine. It's weird, because I read that using ASP.NET Identity in the Azure cloud should be safe - it should use the same machine key on both instances.
Our user manager use token provider created below:
public DataProtectorTokenProvider<ExternalUser> Create(string purpose = "GeneralPurpose")
{
    var provider = new DpapiDataProtectionProvider(_appName);
    var result = new DataProtectorTokenProvider<ExternalUser>(provider.Create(purpose));
    return result;
}
What is more, our user manager is a singleton, but I don't think that should make a difference.
Any idea why it doesn't work on two instances in Azure? I will appreciate any help or advice.
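For what it's worth, DpapiDataProtectionProvider protects tokens with machine-scoped DPAPI, so a token minted on one instance generally can't be verified on another. A commonly suggested alternative - a sketch, assuming both instances share the same <machineKey> configuration, and not taken from this thread - is a protector built on System.Web's MachineKey:

using Microsoft.Owin.Security.DataProtection;
using System.Web.Security;

// Sketch: delegates protection to the ASP.NET <machineKey>, which is identical
// on every instance that shares the same machineKey configuration.
public class MachineKeyDataProtector : IDataProtector
{
    private readonly string[] _purposes;

    public MachineKeyDataProtector(params string[] purposes)
    {
        _purposes = purposes;
    }

    public byte[] Protect(byte[] userData)
    {
        return MachineKey.Protect(userData, _purposes);
    }

    public byte[] Unprotect(byte[] protectedData)
    {
        return MachineKey.Unprotect(protectedData, _purposes);
    }
}

// Hypothetical usage in the factory above:
// new DataProtectorTokenProvider<ExternalUser>(new MachineKeyDataProtector(_appName, purpose))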

Hub inheritance in SignalR in ASP.NET Framework

I'm using SignalR to build a real-time website.
I have 2 hubs:
NotificationHubCore
NotificationHub (inherits NotificationHubCore)
My solution includes 2 small projects: Domain & Web.
I put NotificationHubCore in Domain and NotificationHub in Web.
Now, in the web project, I want to access NotificationHubCore by using:
GlobalHost.ConnectionManager.GetHubContext<NotificationHubCore>();
It always returns null to me.
My question is: how can I access NotificationHubCore through NotificationHub?
I've tried:
var notificationHub = new NotificationHub();
GlobalHost.DependencyResolver.Register(typeof(NotificationHubCore), () => notificationHub);
But that didn't work.
Can anyone help me please?
Thank you,
You may use SQL Server to distribute messages across a SignalR application that is deployed in two separate instances.
Create a new database for the backplane to use. You can give the database any name. You don't need to create any tables in the database; the backplane will create the necessary tables.
Refer to this article, section "Scaleout with SQL Server", for further details.
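As a sketch of what that looks like (the connection string and database name are placeholders), the backplane is wired up in the OWIN Startup class via the Microsoft.AspNet.SignalR.SqlServer package:

using Microsoft.AspNet.SignalR;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Requires the Microsoft.AspNet.SignalR.SqlServer NuGet package.
        // Point this at the empty backplane database; SignalR creates its own tables.
        var connectionString = "Server=.;Database=SignalRBackplane;Integrated Security=True;";
        GlobalHost.DependencyResolver.UseSqlServer(connectionString);
        app.MapSignalR();
    }
}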

Programming a Web Portal for Microsoft Dynamics CRM

I'm working on a web portal for customers that will connect to Microsoft Dynamics. I don't want to make Dynamics CRM directly an internet-facing deployment (IFD), so I'd like to use a separate database that the web interface interacts with, and then use web services to move the data between the web portal database and Dynamics CRM.
I'm just looking for thoughts on whether this is the best way to proceed, and whether there are any good code examples, etc. that I can look at for implementing this?
I saw Microsoft has a Customer Portal, but it looks like it requires (at a cursory glance) an IFD deployment - which I don't want.
First, after creating your ASP.NET project (WebForms or MVC 3), add the following references:
Microsoft.Crm.Sdk.Proxy
Microsoft.Xrm.Sdk
System.Runtime.Serialization
System.ServiceModel
In your code-behind, create a class and add the following code:
private IOrganizationService GetCrmService(string userName, string password, string domain, Uri serviceUri)
{
    var credentials = new ClientCredentials();
    credentials.Windows.ClientCredential = new System.Net.NetworkCredential(userName, password, domain);
    //credentials.UserName.UserName = userName; // uncomment in case you want to impersonate
    //credentials.UserName.Password = password;
    var deviceCredentials = new ClientCredentials();

    // Note: don't wrap the proxy in a "using" block when returning it - the proxy
    // would be disposed before the caller could use it. The caller should dispose it.
    var serviceProxy = new OrganizationServiceProxy(serviceUri, null, credentials, deviceCredentials);
    serviceProxy.ServiceConfiguration.CurrentServiceEndpoint.Behaviors.Add(new ProxyTypesBehavior());
    return serviceProxy;
}
If you want to retrieve multiple records:
string fetch = @"My Fetch goes here";
EntityCollection records = GetCrmService(userName, password, domain, serviceUri).RetrieveMultiple(new FetchExpression(fetch));
I highly recommend downloading the SDK or checking this.
You'll find many samples and walkthroughs which will help you build good portals.
I think it's a good strategy because:
It allows you to asynchronously put the data entered on the website into the CRM. This decoupling ensures neither the CRM nor the website becomes the other's bottleneck.
Only the intermediate service layer is internet-facing, so you'll be in control of what CRM information would be disclosed/open for alteration if this service layer is compromised.
The architecture you're after is reminiscent of the way the CRM Asynchronous Service works (asynchronous plugins and workflows work this way):
A job is put in a queue (table) in the CRM DB.
A scheduled service awakes every x seconds and fetches the latest y records from the queue table.
The service performs each job and writes the result (success, error message log) back to the queue table's records.
So the thing that is probably hardest is writing a good scheduled service that never throws an exception (but always digests it) and properly logs the results back to the DB.
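To make that pattern concrete, here is a minimal sketch of such a polling loop. The queue repository and job types are hypothetical placeholders, not part of the CRM SDK:

using System;
using System.Collections.Generic;
using System.Threading;

// Hypothetical shape of a queue job and its repository; adapt to your schema.
public class QueueJob { public Guid Id; public string Payload; }

public interface IQueueRepository
{
    IEnumerable<QueueJob> FetchPending(int batchSize);
    void MarkSucceeded(Guid id);
    void MarkFailed(Guid id, string errorLog);
}

public class QueuePoller
{
    private readonly IQueueRepository _repository;
    private readonly Action<QueueJob> _process;

    public QueuePoller(IQueueRepository repository, Action<QueueJob> process)
    {
        _repository = repository;
        _process = process;
    }

    public void Run(CancellationToken token, int batchSize = 10, int pollSeconds = 5)
    {
        while (!token.IsCancellationRequested)
        {
            foreach (var job in _repository.FetchPending(batchSize))
            {
                try
                {
                    _process(job);
                    _repository.MarkSucceeded(job.Id);
                }
                catch (Exception ex)
                {
                    // Digest the exception - never let it escape the loop - and
                    // log the result back to the queue table.
                    _repository.MarkFailed(job.Id, ex.ToString());
                }
            }
            Thread.Sleep(TimeSpan.FromSeconds(pollSeconds));
        }
    }
}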
To learn more about the Dynamics CRM's "Asynchronous Service Architecture", refer to the following: http://msdn.microsoft.com/en-us/library/gg334554.aspx
It looks like a good approach.
It will improve the performance of both the portal and CRM.
Keep in mind the data shown on the portal is NEARLY real-time, i.e. it is NOT real-time.
Throughout development, you'd better keep checking that there is not TOO MUCH async processing keeping the CRM server busy all the time.
I don't think the accelerators/portals REQUIRE CRM to be an IFD instance; I guess only the portal part needs to be internet-facing (of course, to make it usable for the purpose!).
Anwar is right, the SDK is a good launchpad for such research.
The Customer Portal does not require an IFD deployment. And if you do not like the Customer Portal, you can always use the SDK extensions for portal development (Microsoft.Xrm.Client.dll & Microsoft.Xrm.Portal.dll and the PortalBase solution), which are all included in the SDK.
There is a great resource on how to build a portal using the SDK portal extensions:
Dynamics CRM 2011 Portal Development
