My task is simple: to share cookies among servers in a farm.
I know that the old way using the machine key doesn't work in ASP.NET Core, but there is the Data Protection API.
However, I cannot store keys in a shared folder (which is the default built-in behavior of Data Protection).
Is there any way to store the key data in configs (like in old ASP.NET)?
You can store the keys in a file-system folder, but there are of course security concerns in doing so; however, the concerns are similar to those of storing a key in web.config.
What is different is that there is not just one key, as with the machine key: keys in the Data Protection API expire after a given period, and new keys are created automatically when needed. If you encrypt something with the Data Protection API and persist it, in a database for example, you may need to decrypt it later using the expired keys.
This example stores the keys in a folder named dp_keys within the web app's main folder.
string pathToCryptoKeys = Path.Combine(environment.ContentRootPath, "dp_keys");

services.AddDataProtection()
    .PersistKeysToFileSystem(new System.IO.DirectoryInfo(pathToCryptoKeys));
Note that storing the keys in the file system is not the recommended approach. There is a PowerShell script in the docs that enables storing keys per application pool in the system registry. For Azure, there is Azure Key Vault for storing the data protection keys.
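For the Azure route mentioned above, a minimal sketch of wiring the key ring to Key Vault might look like the following. This assumes the Azure.Extensions.AspNetCore.DataProtection.Blobs, Azure.Extensions.AspNetCore.DataProtection.Keys, and Azure.Identity NuGet packages; the storage account, container, vault, and key names are placeholders, not values from the original question.

```csharp
// Sketch only: persist the key ring as a blob and protect it with a
// Key Vault key. All URIs below are hypothetical placeholders.
using System;
using Azure.Identity;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDataProtection()
            // Shared, durable location for the key ring XML.
            .PersistKeysToAzureBlobStorage(
                new Uri("https://myaccount.blob.core.windows.net/dp-keys/keys.xml"))
            // Encrypt the keys at rest with a Key Vault key.
            .ProtectKeysWithAzureKeyVault(
                new Uri("https://myvault.vault.azure.net/keys/dataprotection"),
                new DefaultAzureCredential());
    }
}
```
Because the key ring itself is encrypted with the vault key, the blob location does not need to be locked down as tightly as a plain file-system key folder.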
Related
I have an MVC5 app which is acting as a single sign on portal by creating a cookie which can be read by other MVC apps. This works fine because I have aligned the machine keys and set the cookie domain.
I now need to add in a .NET Core app which needs to be able to read the auth cookie too.
I understand that the machine key is no longer stored in the web.config in .NET Core apps; instead, keys are stored in a directory on the file system.
I followed Microsoft's tutorial on how to do this, but it doesn't seem to explain how the keys should be stored.
Does anybody know how to actually store the machine key value so that the DataProtectionProvider can access it?
Thanks
So I was under the wrong impression that you add the key files yourself to the file system location passed to DataProtectionProvider.Create.
It turns out that the cookie encryption key is placed in that directory automatically by the app when it is used. So I'm guessing that by pointing both apps to the same location (maybe with a bit of fiddling), they should be able to share the auth cookie.
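To make the "point both apps at the same location" idea concrete, a hedged sketch of the ASP.NET Core side might look like this. The share path and application name are placeholders I've introduced; the key point is that both values must be identical in every app that should read the cookie, because keys are isolated per application name by default.

```csharp
// Sketch: run the equivalent registration in BOTH apps that share the cookie.
using System.IO;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDataProtection()
            // Same key ring for every app in the farm (placeholder path).
            .PersistKeysToFileSystem(new DirectoryInfo(@"\\server\share\dp-keys"))
            // Apps with different application names cannot read each other's
            // payloads even with a shared key ring, so this must match too.
            .SetApplicationName("SharedCookieApp");
    }
}
```
The cookie name and domain also need to line up across the apps, just as they did with the aligned machine keys in the MVC5 setup.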
For our portal development, we have decided to use Apigee to expose the web service to the portal. Currently I am storing the API key and API URL in the project's properties file. Can anyone help with some pointers on how else I can save the API key, apart from the properties file?
Any pointers will be helpful in this case.
Regards
Aswathy
Typically the API key will be persisted by the API consumer, usually an app of some kind. In the case of mobile apps, each of them has an API key or client ID that is saved inside the app, usually in some kind of secure data store. For other kinds of API consumers, such as web apps, the API key may be persisted within a secure vault or a database that has some encryption features.
I assume your web portal app resides on a secure machine inside your enterprise and that this machine is access-restricted. If this is the case, bare-minimum security is taken care of.
However, if the key is a high-privilege key and you can access APIs with the key alone (i.e., without a secret), it is not advisable to keep it in plain text.
You can:
1. Encrypt and store it in the config file and decrypt it at runtime.
2. Encrypt and store it in a database or other secure storage that you use for storing credentials.
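A minimal sketch of option 1, using .NET's built-in AES support: encrypt the key once (e.g. at deployment), keep only the ciphertext in the config file, and decrypt at runtime. The key name, sample value, and storage format below are all illustrative assumptions; in practice the AES key itself would come from a vault or an OS-protected store rather than being generated inline.

```csharp
// Round-trip sketch: encrypt an API key for storage, then decrypt at runtime.
using System;
using System.Security.Cryptography;
using System.Text;

class ApiKeyProtection
{
    static void Main()
    {
        string apiKey = "sample-api-key-123";             // secret to protect (placeholder)
        byte[] key = RandomNumberGenerator.GetBytes(32);  // stand-in for a managed AES-256 key

        using var aes = Aes.Create();
        aes.Key = key;

        // Encrypt once and store "IV:ciphertext" in the config file.
        byte[] plain = Encoding.UTF8.GetBytes(apiKey);
        byte[] cipher = aes.CreateEncryptor().TransformFinalBlock(plain, 0, plain.Length);
        string stored = Convert.ToBase64String(aes.IV) + ":" + Convert.ToBase64String(cipher);

        // At runtime, parse the stored value and decrypt.
        string[] parts = stored.Split(':');
        aes.IV = Convert.FromBase64String(parts[0]);
        byte[] data = Convert.FromBase64String(parts[1]);
        byte[] decrypted = aes.CreateDecryptor().TransformFinalBlock(data, 0, data.Length);

        Console.WriteLine(Encoding.UTF8.GetString(decrypted)); // prints the original key
    }
}
```
Note this only moves the problem from protecting the API key to protecting the AES key, which is why the second option (a credential store with its own access controls) is usually preferable for high-privilege keys.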
I am trying to make an ASP.NET MVC Azure-hosted website role that allows third-party authentication.
The client might not want me to add SQL server since it costs money, and I can cut back since all of my data comes from CRM. My problem is that DotNetOpenAuth, the library that supports the authentication, appears to require some database tables for storage. However I do not want to use any storage since I want to put all my data (and the auth token) in CRM.
1) If I don't use DB to persist the token, then is it a good idea to use encrypted cookie/server-side sessions? What do I have to modify?
2) Are the in-memory, non-DB sessions from 1) scalable in Azure?
3) Is there any way to make DotNetOpenAuth (WebSecurity class) work without relying on a db?
DotNetOpenAuth doesn't require a database; you provide your own implementations of a few interfaces, and none of them requires a database.
Tokens are not persisted. Instead, tokens contain username and scopes encrypted with a private key of the authentication server. This way all you need is the public key in the resource server to decrypt the token (the ICryptoKeyStore interface).
You could as well persist keys in the filesystem or elsewhere.
I'm building an asp.net application that will later be ported to azure.
For the moment, I have all the business tables in one database and a separate database that I use for membership; it's basically the default database that the login control generates.
In the business database, I have a table that contains user profile data and one field is TheUserID (which is an int) and another field that's called TheUserMembership (a string), which will contain the user ID that's generated by the asp.net user management tool.
Once the user logs in, I store TheUserID in the session and the whole app works with the int as the identifier.
Is this a good way to do it? Will this port to azure?
You should be using: http://msdn.microsoft.com/en-us/library/yh26yfzy.aspx
And then yes it will port to Azure and SQL Azure.
Things to watch out for with Azure
Local storage (disk storage): since you have multiple instances, local storage doesn't work, as you will never be able to tell which instance it is on.
Session state must be out of proc, for the same reason as above.
There are many other little things here or there about Azure but those would be the 2 biggest to watch out for when moving across.
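The out-of-proc session point can be sketched in code. The original question concerns classic ASP.NET (where this was a web.config session-state provider change), so the following is the ASP.NET Core analogue, assuming the Microsoft.Extensions.Caching.StackExchangeRedis package; the connection string is a placeholder.

```csharp
// Sketch: back session state with a shared cache so that any instance
// in the farm can serve any request.
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Distributed cache shared by all instances (placeholder endpoint).
        services.AddStackExchangeRedisCache(options =>
            options.Configuration = "my-redis.redis.cache.windows.net:6380");

        // Session middleware stores its payload in the distributed cache.
        services.AddSession();
    }
}
```
With in-proc session, a request routed to a different instance simply would not see the session data, which is exactly the failure mode the answer warns about.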
The user ID is a GUID and you can find it in the users table. You should not store the user ID in the session, since the cookie can be stolen. For Azure, one principal requirement is that all tables have a primary key.
I was wondering if anyone had successfully used DPAPI with a user store in a web farm environment?
Because our application was recently converted from ASP.NET 1.1 to 2.0, we're using a custom wrapper which directly calls the CryptUnprotect methods. But this should be the same as the ProtectedData method available in the 2.0 framework.
Because we are operating in a web farm environment, we can't guarantee that the machine that did the encryption is going to be the one decrypting it. (Also because machine failures shouldn't destroy our encrypted data).
So what we have is a serviced component that runs in a service under a particular user account on each one of our web boxes. This user is set up to have a roaming profile, as per the recommendation.
The problem we have is that info encrypted on one machine cannot be decrypted on another; it fails with the Win32 error:
'Key not valid for use in specified state'.
I suspect that this is because I've made a mistake by having the encryption service running as the user on multiple machines, hence keeping the user logged in on more than one machine at the same time.
If this is the problem, how are other using DPAPI with the User Store in a web farm environment?
In a web farm environment, rather than using DPAPI to encrypt/decrypt your data directly, you would instead use it to encrypt the key that you later use to decrypt your protected data.
You would "install" the key onto each server as part of the deployment process. The installation script would need to run under the AppPool's identity, and could store the encrypted key either in an app.config file or in the registry.
The encrypted data itself could be stored in a central repository / database, so that it can be accessed by all servers in the farm. To decrypt the data, the web app would retrieve the encrypted key from where it was installed, use DPAPI to decrypt it, then use the result to decrypt data that comes from the central repository.
The downside is that the cleartext key might exist on the local disk for a short time during the initial install process, where it might be exposed to operations staff. You could add an extra layer of encryption, such as with the web.config machineKey, if that's a concern.
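A Windows-only sketch of the key-wrapping pattern described above, assuming the System.Security.Cryptography.ProtectedData package: DPAPI wraps the shared data key at install time, and the app unwraps it at runtime. The file path and the split into an installer and a loader are illustrative assumptions.

```csharp
// Sketch: DPAPI protects the key, not the data. Run InstallKey once per
// server under the AppPool identity; call LoadKey at runtime.
using System.IO;
using System.Security.Cryptography;

class KeyInstaller
{
    // Deployment step: wrap the shared data key for this machine/identity.
    static void InstallKey(byte[] dataKey, string path)
    {
        byte[] wrapped = ProtectedData.Protect(
            dataKey, null, DataProtectionScope.CurrentUser);
        File.WriteAllBytes(path, wrapped);
    }

    // Runtime step: unwrap the key, then use it to decrypt data pulled
    // from the central repository.
    static byte[] LoadKey(string path)
    {
        byte[] wrapped = File.ReadAllBytes(path);
        return ProtectedData.Unprotect(
            wrapped, null, DataProtectionScope.CurrentUser);
    }
}
```
Because each server wraps the same data key under its own DPAPI store, no roaming profile is needed and the ciphertext in the central repository can be read by every machine in the farm.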
The Microsoft poster is wrong.
http://support.microsoft.com/default.aspx?scid=kb;en-us;309408#6
"For DPAPI to work correctly when it uses roaming profiles, the domain user must only be logged on to a single computer in the domain. If the user wants to log on to a different computer that is in the domain, the user must log off the first computer before the user logs on to the second computer. If the user is logged on to multiple computers at the same time, it is likely that DPAPI will not be able to decrypt existing encrypted data correctly."
It appears that DPAPI will not work in a farm setting. I think this is a rather large oversight on Microsoft's part and makes DPAPI almost useless for most enterprise applications.
I just saw this. There is a way you can make this work: make sure the machines in the farm are in a domain, and use a domain account to encrypt and decrypt the data (i.e., run the application under the domain account).
You cannot use DPAPI in the manner you want with local accounts because the key material is not exchanged between servers.
hope that helps!
Twelve years later . . . you can try using CNG DPAPI, which was meant to work in cloud environments that may or may not be load-balanced. From that link (in case it gets taken down):
Microsoft introduced the data protection application programming interface (DPAPI) in Windows 2000. The API consists of two functions, CryptProtectData and CryptUnprotectData. DPAPI is part of CryptoAPI and was intended for developers who knew very little about using cryptography. The two functions could be used to encrypt and decrypt static data on a single computer.

Cloud computing, however, often requires that content encrypted on one computer be decrypted on another. Therefore, beginning with Windows 8, Microsoft extended the idea of using a relatively straightforward API to encompass cloud scenarios. This new API, called DPAPI-NG, enables you to securely share secrets (keys, passwords, key material) and messages by protecting them to a set of principals that can be used to unprotect them on different computers after proper authentication and authorization.
In .NET Core this looks like:

public void ConfigureServices(IServiceCollection services)
{
    services.AddDataProtection()
        .ProtectKeysWithDpapiNG();
}