ASP.NET app cannot open RSA key container (NTE_BAD_KEYSET error)

Having difficulty making CryptAcquireContext work for a .NET app. I've created the key, but it's throwing error NTE_BAD_KEYSET.
The documentation suggests that the trouble is with permissions, but I'm not quite sure who to give permission to. So far I have tried:
SYSTEM
NETWORK
NETWORK SERVICE
ASPNET
Administrators
Cryptographic Operators
No dice.
Any ideas?
This is a machine key.
UPDATE: Gave access to EVERYONE and that seems to have worked. However, it seems like a bad security practice. The question stands: who should actually be getting access?
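For anyone landing here later: in practice the account that needs access is whatever identity w3wp.exe runs the site as (ASPNET or NETWORK SERVICE on older setups, or IIS APPPOOL\&lt;pool name&gt; with the newer application pool identities), which is a much narrower grant than EVERYONE. Below is a minimal sketch of granting only that identity access to a machine-level container from code; the container name and account are placeholders, not taken from the question.

    using System;
    using System.Security.AccessControl;
    using System.Security.Cryptography;

    class GrantKeyContainerAccess
    {
        static void Main()
        {
            var csp = new CspParameters
            {
                KeyContainerName = "MyKeyContainer",           // placeholder container name
                Flags = CspProviderFlags.UseMachineKeyStore,    // machine key, as in the question
                CryptoKeySecurity = new CryptoKeySecurity()
            };

            // Allow only the worker-process identity to use the key (instead of Everyone),
            // and keep Administrators able to manage/export it.
            csp.CryptoKeySecurity.AddAccessRule(new CryptoKeyAccessRule(
                @"IIS APPPOOL\MyAppPool",                       // placeholder identity
                CryptoKeyRights.GenericRead,
                AccessControlType.Allow));
            csp.CryptoKeySecurity.AddAccessRule(new CryptoKeyAccessRule(
                @"BUILTIN\Administrators",
                CryptoKeyRights.FullControl,
                AccessControlType.Allow));

            // Creating the container with these parameters applies the ACL; for a container
            // that already exists, aspnet_regiis -pa (or editing the key file's ACL directly)
            // is the usual way to add an account.
            using (var rsa = new RSACryptoServiceProvider(csp))
            {
                Console.WriteLine("Container '" + csp.KeyContainerName + "' opened.");
            }
        }
    }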

Related

Is allowing the AppPool local activation permission System Wide in dcomcnfg a big security risk?

I've recently been trying to use the IIS AppPool identity instead of Network Service or Local System.
As such I came across the ugly error
The machine-default permission settings do not grant Local Activation
permission for the COM Server application with CLSID
{6E46607A-7347-471B-A98C-BC9E49B07248} and APPID Unavailable to the
user IIS APPPOOL\MyAppPool SID
(S-1-5-82-476059244-1685105758-59475158-1390954050-72429515) from
address LocalHost (Using LRPC) running in the application container
Unavailable SID (Unavailable). This security permission can be
modified using the Component Services administrative tool.
As you may notice, my APPID was missing from this error; I searched the registry and found out which component it was (also by debugging).
It's a VC++ out-of-process OLE/COM server which processes requests from our web server. (Yay, the 1990s called.) I'm not entirely sure why this involves DCOM; there's nothing 'distributed' about it by design. Maybe it's more by accident, or an artefact of VS2008's default MFC/OLE server templates?
On using the power of Google, I followed the typical route of changing the dcomcnfg setting for this component to allow my IIS AppPool\MyAppPool user the local activation permission (I tried them all actually!), and confirmed that w3wp.exe is running as the same identity.
I also made sure that this exe was readable/executable by that user.
However the error still persisted.
Only by setting the same permissions machine-wide (via the My Computer node, instead of the individual component node), did the component load properly. This feels like a big security risk. Is it?
In the failing case, I tried using process monitor to spot any registry keys or file access problems, or to identify what other components might require access. But nothing reared its head.
Given that setting the DCOM permission system-wide fixes the problem, it does feel to me that there's another DCOM component or service that needs permissions set, but I can't find out which.
So
a) Is there a way to further diagnose this problem? Sniff out decisions being made by DCOM? Is there a central DCOM broker that needed the permissions set also? Debugging/Process Monitor doesn't seem to help.
b) Is it ok to set the AppPool local activation policy machine wide?
Many thanks to anyone who helps me make the right decision.
Q1. Is it bad practice to give your App Pool account local DCOM activation permissions, computer-wide?
A1: Yes, it's bad. According to the book Secure DCOM Best Practices:
It isn’t a good idea to loosen these permissions from the default values.
Q2: Why is the component still failing?
A2: This was a combination of problems:
Process monitor DID pick up an issue that my AppPool identity was not able to read the registry key HKEY_CLASSES_ROOT\WOW6432Node\AppID\{C33D7656-D310-4684-9482-A486787E4E3B}. Enabling read permission for my AppPool identity got me one step further.
The event log message about an Unavailable AppID was a clue: there was no AppID (REG_SZ) entry for the class being requested, so the security settings were not being picked up. I needed to ensure the following key existed: HKEY_CLASSES_ROOT\WOW6432Node\CLSID\{6E46607A-7347-471B-A98C-BC9E49B07248}, with a String (REG_SZ) value named AppID set to {C33D7656-D310-4684-9482-A486787E4E3B}.
As per the MSDN documentation on AppID, and "What is AppID".
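Not from the original answer, but for illustration, here is a small sketch of checking (and, if missing, writing) that CLSID-to-AppID mapping from code. Run it elevated; the GUIDs are the ones from the error above.

    using System;
    using Microsoft.Win32;

    class EnsureAppIdMapping
    {
        // GUIDs from the error message and the fix described above.
        const string Clsid = "{6E46607A-7347-471B-A98C-BC9E49B07248}";
        const string AppId = "{C33D7656-D310-4684-9482-A486787E4E3B}";

        static void Main()
        {
            // 32-bit COM server on 64-bit Windows, hence WOW6432Node.
            string keyPath = @"WOW6432Node\CLSID\" + Clsid;

            using (var key = Registry.ClassesRoot.OpenSubKey(keyPath, writable: true))
            {
                if (key == null)
                {
                    Console.WriteLine("CLSID key not found: " + keyPath);
                    return;
                }

                var current = key.GetValue("AppID") as string;
                if (!AppId.Equals(current, StringComparison.OrdinalIgnoreCase))
                {
                    key.SetValue("AppID", AppId, RegistryValueKind.String);
                    Console.WriteLine("AppID value written.");
                }
            }
        }
    }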

System.Security.Cryptography.CryptographicException: Not enough storage is available to process this command

Our asp.net app was working fine, then the DBA decided to encrypt the db password in the web.config. Now I'm getting this error:
System.Security.Cryptography.CryptographicException: Not enough storage is available to process this command.
There is only one other article on SO that has this error listed and the user resorted to a refactor instead of identifying a solution.
The weird thing is that we have plenty of space (RAM, HDD, etc.). Even weirder, three of the people on my team don't have this problem (with the exact same URL). Another guy had it yesterday, but it works today.
I'm worried about when we move this to prod, especially if this needs some kind of incremental storage or permissions for EACH user.
Edit: The other error that seems to show up is:
"Failed to decrypt using provider 'RsaProtectedConfigurationProvider'"
It turns out that this is a generic error message that happens whenever the server has trouble decrypting with RSA. Not very helpful, because it is misleading at worst and very vague at best.
In our case, the error was only happening for me because our dev servers are load-balanced (which I didn't know till today). The encryption key was generated on one machine (server1) and installed on both servers. When I got load-balanced onto server2, I saw this error (as would anyone else on server2).
The solution is to export the private key from server1 and install it onto server2.
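Assuming the default RsaProtectedConfigurationProvider container (NetFrameworkConfigurationKey) and an IIS app pool identity, the export/import looks roughly like this; the file path and app pool name are placeholders.
On server1, export the container including the private key:
aspnet_regiis -px "NetFrameworkConfigurationKey" "C:\keys.xml" -pri
On server2, import it and grant the worker-process account access:
aspnet_regiis -pi "NetFrameworkConfigurationKey" "C:\keys.xml"
aspnet_regiis -pa "NetFrameworkConfigurationKey" "IIS APPPOOL\MyAppPool"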

How to set up a secure password-protected connection between R and a server

I edited this question to clarify why I asked it again (I had weak Google-Fu and found these rather old, pretty-much-duplicate questions (1, 2, 3) only after posting).
Approaches to accessing password-protected resources that I've seen in the wild:
Plaintext storage in script (might often end up being shared, or in a Dropbox)
Plaintext storage in a config script
You can do password = readline("Password: ") but of course the password ends up in plaintext in the console (and thus in console logs etc.), so might as well store it in a plaintext config file.
I found this little trick to avoid displaying the password in the Terminal, but running system("stty -echo") on OS X Mavericks leads to the error stty: stdin isn't a terminal, so I guess it wouldn't be particularly portable.
Using tcltk. Has the unfortunate effect of making Rstudio crash and being difficult to install.
keychain. It's not on CRAN, so I don't think I can use this as a first-line approach; I'd also like a bit more detail about where and how passwords are stored on various systems (i.e. will it end up in plaintext on Windows?).
Access tokens, OAuth etc. seem to have similar problems.
I don't know of any R packages that use PGP for connections. It would probably also be a bit difficult for newbie users.
I'm not asking for myself mainly, but I want to provide somewhat sensible defaults for nontechnical users who might store plaintext passwords enabling access to sensitive data in their Dropbox.
Unlike others who asked similar questions, I could also change the server-side of things if I had a better approach.
Are there best-practice approaches that I'm currently missing? My focus on interactive sessions is because I assume that's how most nontechnical types use R, but of course it would be nice if it worked during e.g. knitr report generation too.
Some suggestions for solving your problem securely. These approaches apply to all programming languages.
Establish a secure connection to your resource outside of R, such as an SSL tunnel.
If you need a secure password in R to establish a secure connection, you can read it from a secure config file and remove the password variable as soon as you no longer need it. A secure config file is a config file that is not part of your code repository (Git, SVN, ...). You have to manage your secrets independently of your code; that means separating your code from your secrets. One simple way is to put your private, secure secret in your private, secure user home directory. You have then delegated the security problem to your operating system: your secret is now as safe as your OS and your home directory. Please check the permissions on your home directory and enable file-system encryption if it is off. Note that this is how Maven handles passwords.
You get more security if you encrypt your password/secret config file; then you have a second line of defense.
For most applications, point 2 is enough.
Note: make sure your secret is not deployed with your code. You need a separate way to manage and deploy your secrets to production systems.
Note: make sure that if your program crashes, your secret is not left in memory.
Note: always use strong encryption algorithms. Don't implement your own security algorithm; that is a high-complexity task. Use standard implementations of strong encryption algorithms instead.

How to keep multiple connectionString passwords safe, separate, and easy to deploy?

I know there are plenty of questions here already about this topic (I've read through as many as I could find), but I haven't yet been able to figure out how best to satisfy my particular criteria. Here are the goals:
The ASP.NET application will run on a few different web servers, including localhost workstations for development. This means encrypting web.config using a machine key is out. Each "type" or environment of web server (dev, test, prod) has its own corresponding database (dev, test, prod). We want to separate these connection strings so that a developer working on the "dev" code is not able to see any "prod" connection string passwords, nor allow these production passwords to ever get deployed to the wrong server or committed to SVN.
The application should be able to decide which connection string to attempt to use based on the server name (using a switch statement; a sketch follows this list of goals). For example, "localhost" and "dev.example.com" should know to use the DevDatabaseConnectionString, "test.example.com" will use the TestDatabaseConnectionString, and "www.example.com" will use the ProdDatabaseConnectionString. The reason for this is to limit the chance for any deployment accidents, where the wrong type of web server connects to the wrong database.
Ideally, the exact same executables and web.config should be able to run on any of these environments, without needing to tailor or configure each environment separately every time that we deploy (something that seems like it would be easy to forget/mess up one day during a deployment, which is why we moved away from having just one connection string that has to be changed on each target). Deployment is currently accomplished via FTP. Update: Using "build events" and revising our deployment procedures is probably not a bad idea.
We will not have command-line access to the production web server. This means using aspnet_regiis.exe to encrypt the web.config is out. Update: We can do this programmatically so this point is moot.
We would prefer to not have to recompile the application whenever a password changes, so using web.config (or db.config or whatever) seems to make the most sense.
A developer should not be able to get to the production database password. If a developer checks the source code out onto their localhost laptop (which would determine that it should be using the DevDatabaseConnectionString, remember?) and the laptop gets lost or stolen, it should not be possible to get at the other connection strings. Thus, having a single RSA private key to decrypt all three passwords cannot be considered. (Contrary to #3 above, it does seem like we'd need to have three separate key files if we went this route; these could be installed once per machine, and should the wrong key file get deployed to the wrong server, the worst that should happen is that the app can't decrypt anything---and not allow the wrong host to access the wrong database!)
UPDATE/ADDENDUM: The app has several separate web-facing components to it: a classic ASMX Web Services project, an ASPX Web Forms app, and a newer MVC app. In order to not go mad having the same connection string configured in each of these separate projects for each separate environment, it would be nice to have this only appear in one place. (Probably in our DAL class library or in a single linked config file.)
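To make the host-based selection in the goals above concrete, here is a minimal sketch (my illustration, not the poster's code); the host names and connection-string names are the examples from the question.

    using System;
    using System.Configuration;

    static class ConnectionStringSelector
    {
        // Map the current host name to the connection string that environment should use.
        public static string NameForHost(string host)
        {
            switch (host.ToLowerInvariant())
            {
                case "localhost":
                case "dev.example.com":
                    return "DevDatabaseConnectionString";
                case "test.example.com":
                    return "TestDatabaseConnectionString";
                case "www.example.com":
                    return "ProdDatabaseConnectionString";
                default:
                    // Refuse to guess, so an unknown host can never hit the wrong database.
                    throw new InvalidOperationException("Unrecognized host: " + host);
            }
        }

        // Typical usage from a request: pass HttpContext.Current.Request.Url.Host.
        public static string ForHost(string host)
        {
            return ConfigurationManager.ConnectionStrings[NameForHost(host)].ConnectionString;
        }
    }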
I know this is probably a subjective question (asking for a "best" way to do something), but given the criteria I've mentioned, I'm hoping that a single best answer will indeed arise.
Thank you!
Integrated authentication/windows authentication is a good option. No passwords, at least none that need be stored in the web.config. In fact, it's the option I prefer unless admins have explicitly taken it away from me.
Personally, for anything that varies by machine (which isn't just connection strings) I put in an external reference from the web.config using this technique: http://www.devx.com/vb2themax/Tip/18880
When I throw code over the fence to the production server admin, he gets a new web.config, but doesn't get the external file; he uses the one he had earlier.
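For concreteness, the usual shape of that technique (a sketch; the linked tip may differ in details) is to point the section at an external file that stays on each server and out of source control:

    <!-- web.config (checked into SVN, contains no secrets) -->
    <connectionStrings configSource="ConnectionStrings.config" />

    <!-- ConnectionStrings.config (lives only on the server, never committed) -->
    <connectionStrings>
      <add name="ProdDatabaseConnectionString" connectionString="..." providerName="System.Data.SqlClient" />
    </connectionStrings>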
You can have multiple web servers with the same encryption key. You would do this in machine.config; just ensure each key is the same.
One common practice is to store the first connection string encrypted somewhere on the machine, such as the registry. After the server connects using that string, it then retrieves all other connection strings, which are managed in the database (also encrypted). That way connection strings can be dynamically generated based on authorization requirements (requestor, application being used, etc.). For example, the same tables can be accessed with different rights depending on context and users/groups.
I believe this scenario addresses all (or most?) of your points.
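The answer above doesn't prescribe a mechanism for that first encrypted string; one possible shape of the bootstrap step (my sketch, not part of the answer) is to protect a registry value with the Windows DPAPI machine store. The registry path and value name are placeholders.

    using System;
    using System.Security.Cryptography;   // ProtectedData (add a reference to System.Security)
    using System.Text;
    using Microsoft.Win32;

    static class BootstrapConnectionString
    {
        const string KeyPath = @"SOFTWARE\MyCompany\MyApp";       // placeholder location
        const string ValueName = "BootstrapConnectionString";

        // Run once at deployment time to store the encrypted bootstrap string.
        public static void Store(string connectionString)
        {
            byte[] cipher = ProtectedData.Protect(
                Encoding.UTF8.GetBytes(connectionString),
                null,                                             // no extra entropy
                DataProtectionScope.LocalMachine);
            using (var key = Registry.LocalMachine.CreateSubKey(KeyPath))
            {
                key.SetValue(ValueName, cipher, RegistryValueKind.Binary);
            }
        }

        // Called at application start-up; the remaining connection strings
        // are then read from the database over this connection.
        public static string Load()
        {
            using (var key = Registry.LocalMachine.OpenSubKey(KeyPath))
            {
                var cipher = (byte[])key.GetValue(ValueName);
                byte[] plain = ProtectedData.Unprotect(cipher, null, DataProtectionScope.LocalMachine);
                return Encoding.UTF8.GetString(plain);
            }
        }
    }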
(First, Wow, I think 2 or 3 "quick paragraphs" turned out a little longer than I'd thought! Here I go...)
I've come to the conclusion (perhaps you'll disagree with me on this) that the ability to "protect" the web.config whilst on the server (or by using aspnet_regiis) has only limited benefit, and is perhaps not such a good thing, as it may give a false sense of security. My theory is that if someone is able to obtain access to the filesystem in order to read this web.config in the first place, then they also probably have access to create their own simple ASPX file which can "unprotect" it and reveal its secrets to them. But if unauthorized people are trouncing around in your filesystem—well… then you have bigger problems at hand, so my whole concern is now moot! 1
I also realize that there isn’t a foolproof way to securely hide passwords within a DLL either, as they can eventually be disassembled and discovered, perhaps by using something like ILDASM. 2 An additional measure of security-by-obscurity can be obtained by obfuscating and encrypting your binaries, such as by using Dotfuscator, but this isn’t to be considered “secure.” And again, if someone has read access (and likely write access too) to your binaries and filesystem, you’ve got bigger problems at hand, methinks.
To address the concerns I mentioned about not wanting the passwords to live on developer laptops or in SVN: solving this through a separate “.config” file that does not live in SVN is (now!) the blindingly obvious choice. Web.config can live happily in source control, while just the secret parts do not. However---and this is why I’m following up on my own question with such a long response---there are still a few extra steps I’ve taken to try and make this if not any more secure, then at least a little bit more obscure.
Connection strings we want to try to keep secret (those other than the development passwords) won’t ever live as plain text in any files. These are now encrypted first with a secret (symmetric) key---using, of course, the new ridiculous Encryptinator(TM)! utility built just for this purpose---before they get placed in a copy of a “db.config” file. The db.config is then just uploaded only to its respective server. The secret key is compiled directly into the DAL’s dll, which itself would then (ideally!) be further obfuscated and encrypted with something like Dotfuscator. This will hopefully keep out any casual curiosity at the least.
I’m not going to worry much at all about the symmetric "DbKey" living in the DLLs or SVN or on developer laptops. It’s the passwords themselves I’ll keep out. We do still need to have a “db.config” file in the project in order to develop and debug, but it has all fake passwords in it except for development ones. Actual servers have actual copies with just their own proper secrets. The db.config file is typically reverted (using SVN) to a safe state and never stored with real secrets in our subversion repository.
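For anyone curious what such a utility boils down to, here is a minimal sketch of the idea (my own, assuming AES; not the actual Encryptinator). The passphrase and salt stand in for the compiled-in "DbKey".

    using System;
    using System.Security.Cryptography;
    using System.Text;

    static class DbConfigCrypto
    {
        // Placeholder for the compiled-in secret ("DbKey") described above.
        static readonly byte[] Key = new Rfc2898DeriveBytes(
            "placeholder-passphrase", Encoding.UTF8.GetBytes("placeholder-salt"), 10000).GetBytes(32);

        public static string Encrypt(string plainText)
        {
            using (var aes = Aes.Create())
            {
                aes.Key = Key;
                aes.GenerateIV();
                byte[] plain = Encoding.UTF8.GetBytes(plainText);
                using (var enc = aes.CreateEncryptor())
                {
                    byte[] cipher = enc.TransformFinalBlock(plain, 0, plain.Length);
                    byte[] output = new byte[aes.IV.Length + cipher.Length];
                    Buffer.BlockCopy(aes.IV, 0, output, 0, aes.IV.Length);            // prepend IV
                    Buffer.BlockCopy(cipher, 0, output, aes.IV.Length, cipher.Length);
                    return Convert.ToBase64String(output);
                }
            }
        }

        public static string Decrypt(string base64)
        {
            byte[] input = Convert.FromBase64String(base64);
            using (var aes = Aes.Create())
            {
                aes.Key = Key;
                byte[] iv = new byte[aes.BlockSize / 8];                               // 16 bytes for AES
                Buffer.BlockCopy(input, 0, iv, 0, iv.Length);
                aes.IV = iv;
                using (var dec = aes.CreateDecryptor())
                {
                    byte[] plain = dec.TransformFinalBlock(input, iv.Length, input.Length - iv.Length);
                    return Encoding.UTF8.GetString(plain);
                }
            }
        }
    }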
With all this said, I know it’s not a perfect solution (does one exist?), and one that does still require a post-it note with some deployment reminders on it, but it does seem like enough of an extra layer of hassle that might very well keep out all but the most clever and determined attackers. I’ve had to resign myself to "good-enough" security which isn’t perfect, but does let me get back to work after feeling alright about having given it the ol’ "College Try!"
1. Per my comment on June 15 here: http://www.dotnetcurry.com/ShowArticle.aspx?ID=185 (let me know if I'm off-base!), and some more good commentary here: "Encrypting connection strings so other devs can't decrypt, but app still has access", here: "Is encrypting web.config pointless?", and here: "Encrypting web.config using Protected Configuration pointless?"
2. Good discussion and food for thought on a different subject but very-related concepts here: Securely store a password in program code? - what really hit home is the Pidgin FAQ linked from the selected answer: If someone has your program, they can get to its secrets.

Encrypt / Decrypt in asp.net using RsaProtectedConfigurationProvider

The encryption worked properly, but now I am getting an error that says "RsaProtectedConfigurationProvider Bad Data". When I checked, I came to know that we need to run the command:
aspnet_regiis -pa "NetFrameworkConfigurationKey" "NT AUTHORITY\NETWORK SERVICE"
My question is: if I run this command in my production environment, will it affect any other websites that are hosted on the same server? Since it's an update to the machine.config file, will there be any chance that some other things will be affected?
It won't (should not) because that command does not modify the machine.config file, but an ACL that controls which accounts have access to the key container. You can read more here:
http://msdn.microsoft.com/en-us/library/yxw286t2.aspx
This is a sentence from the article
"By default, RSA key containers are tightly protected by NTFS access control lists (ACLs) on the server where they are installed. This improves the security of the encrypted information by restricting who can access the encryption key."
As for your specific error, I just worked with web.config file encryption a few days back, and I recall receiving the Bad data error at one point. After a couple of times of repeating the setup steps, I was able to make it work, but I can't confirm which step made it work. My guesses for your case are:
You imported the wrong key file (the exported XML from the original container) into the container on that machine.
The data value on the config was messed with.
The account that is trying to decrypt the config file does not have privileges to that key container. In that case, the command that you ask about is the one to give access to a given account.
You could be referencing a different key container in your configProtectedData section. I hadn't thought of this, and I'm not sure if you would get that specific Bad Data error, but it's a thought.
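On that last point: the key container an encrypted section uses is whatever the provider entry in configProtectedData names. A typical entry looks roughly like this (the provider name here is just an example, and the assembly version depends on your .NET version):

    <configProtectedData>
      <providers>
        <add name="MyRsaProvider"
             type="System.Configuration.RsaProtectedConfigurationProvider, System.Configuration, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
             keyContainerName="NetFrameworkConfigurationKey"
             useMachineContainer="true" />
      </providers>
    </configProtectedData>

Whatever keyContainerName that entry points at is the container the aspnet_regiis -pa command has to grant the account access to.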
Hope you solved it after all. Even though the question is old, I thought the answer might help someone.
