I am consuming a service, and the provider has given me a certificate for it. I have installed the certificate on LocalMachine, and with the following code I attach it to the web request that I post to the web service.
X509Certificate cert = null;
string ResponseXml = string.Empty;

// Represents an X.509 store, which is a physical store
// where certificates are persisted and managed
X509Store certStore = new X509Store(StoreName.My, StoreLocation.LocalMachine);
certStore.Open(OpenFlags.ReadOnly);
X509Certificate2Collection results =
    certStore.Certificates.Find(X509FindType.FindBySubjectDistinguishedName,
                                Constants.CertificateName, false);
certStore.Close();

if (results != null && results.Count > 0)
{
    cert = results[0];
}
else
{
    ErrorMessage = "Certificate not found";
    return ErrorMessage;
}

webClient.TransportSettings.ClientCertificates.Add(cert);
This works perfectly when I run the code under Cassini (the ASP.NET Development Server), but when I host it in IIS 7.0 the response is a 403 Forbidden error.
Please suggest.
You should maybe try this:
winhttpcertcfg -g -c LOCAL_MACHINE\MY -s (MyCertificate) -a ASPNET
As it turns out, the user who installs the certificate is automatically granted access to the private key. I guess in your case that would be you, so it works in the dev environment. When the web front end comes along, you are no longer the user; ASPNET is.
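A quick way to confirm this is the cause is to check, from code running under the application pool / ASPNET identity, whether the certificate's private key can actually be read. This is only a diagnostic sketch; it reuses Constants.CertificateName from the question and assumes the System.Diagnostics, System.Security.Cryptography and System.Security.Cryptography.X509Certificates namespaces are imported:
// Diagnostic sketch: run under the web application's identity to see whether
// the private key of the installed certificate is readable by that account.
X509Store diagStore = new X509Store(StoreName.My, StoreLocation.LocalMachine);
diagStore.Open(OpenFlags.ReadOnly);
try
{
    X509Certificate2Collection found = diagStore.Certificates.Find(
        X509FindType.FindBySubjectDistinguishedName, Constants.CertificateName, false);

    if (found.Count == 0)
    {
        Trace.WriteLine("Certificate not found in LocalMachine\\My.");
    }
    else
    {
        try
        {
            // Reading PrivateKey throws a CryptographicException when the
            // current identity has no read permission on the key container.
            bool hasKey = found[0].PrivateKey != null;
            Trace.WriteLine("Private key is accessible: " + hasKey);
        }
        catch (CryptographicException ex)
        {
            Trace.WriteLine("Private key is NOT accessible: " + ex.Message);
        }
    }
}
finally
{
    diagStore.Close();
}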
I have IdentityServer3, and I'm trying to run their original samples: WebHost (minimal) as the server and Console Client Credentials Flow using Certificate as the client, because I want to verify that the client can authenticate against IdS3 using an X509 thumbprint instead of a shared secret to get an access token.
The problem I'm having is that I'm getting an error response: invalid_client.
Apparently this is because IdS3 doesn't receive the certificate on the incoming request, so it considers the token request invalid (I tested this by adding a custom SecretParser and checking the environment parameter: there is no ssl.ClientCertificate value, which is the one X509CertificateSecretParser uses to parse it).
I'm just running both projects in two instances of Visual Studio under IIS Express, without modifying anything else in the projects. Is there anything I'm missing here? What else do I need to set up to make this work?
The first thing you need to do is to enable client certificates in IIS Express.
You do this by editing this file:
.vs\config\applicationhost.config
Change
<access sslFlags="None" />
to
<access sslFlags="Ssl, SslNegotiateCert" />
Now IIS Express supports client certificates, but it also checks whether the certificate is trusted.
The sample certificate, Client.pfx, will not work out of the box.
You can either let Windows trust the issuer of this certificate (not recommended) or load an existing certificate from the certificate store with code like this:
X509Store store = new X509Store(StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);

string thumb = "<thumbprint>";
X509Certificate2Collection cers =
    store.Certificates.Find(X509FindType.FindByThumbprint, thumb, false);

X509Certificate2 cert = null;
if (cers.Count > 0)
{
    cert = cers[0];
}
store.Close();
You will also need to put the thumbprint of this certificate into the ClientSecret property in the client list on the Identity Server.
This is the sample code you will need to change:
new Client
{
    ClientName = "Client Credentials Flow Client",
    Enabled = true,
    ClientId = "clientcredentials.client",
    Flow = Flows.ClientCredentials,
    ClientSecrets = new List<Secret>
    {
        new Secret("secret".Sha256()),
        new Secret
        {
            Value = "<your thumbprint here>",
            Type = Constants.SecretTypes.X509CertificateThumbprint,
            Description = "Client Certificate"
        },
    },
    AllowedScopes = new List<string>
    {
        "read",
        "write"
    },
    Claims = new List<Claim>
    {
        new Claim("location", "datacenter")
    }
},
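For the client side, note that loading the certificate is not enough; it also has to be presented on the TLS connection when calling the token endpoint. The sample's own TokenClient wiring differs slightly, so treat the following as a minimal sketch: the token endpoint URL and the scope names are assumptions based on the sample defaults, and `cert` is the certificate loaded from the store as shown above.
// Sketch: client-credentials token request with the client certificate
// attached at the TLS level (WebRequestHandler lives in System.Net.Http.WebRequest).
var handler = new WebRequestHandler();
handler.ClientCertificates.Add(cert); // certificate loaded by thumbprint above

using (var client = new HttpClient(handler))
{
    var form = new FormUrlEncodedContent(new Dictionary<string, string>
    {
        { "grant_type", "client_credentials" },
        { "client_id", "clientcredentials.client" },
        { "scope", "read write" } // assumed scopes from the sample client above
    });

    // Assumed IdentityServer3 sample base address; adjust to your IIS Express port.
    var tokenResponse = client.PostAsync("https://localhost:44333/core/connect/token", form).Result;

    Console.WriteLine(tokenResponse.StatusCode);
    Console.WriteLine(tokenResponse.Content.ReadAsStringAsync().Result);
}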
I want to be able to map SSL client certificates to ASP.NET Identity users. I would like IIS to do as much of the work as possible (negotiating the client certificate and perhaps validating that it is signed by a trusted CA), but I don't want IIS to map the certificate to a Windows user. The client certificate is passed through to ASP.NET, where it is inspected and mapped to an ASP.NET Identity user, which is turned into a ClaimsPrincipal.
So far, the only way I have been able to get IIS to pass the client certificate through to ASP.NET is to enable iisClientCertificateMappingAuthentication and set up a many-to-one mapping to a Windows account (which is then never used for anything else.) Is there any way to get IIS to negotiate and pass the certificate through without this configuration step?
You do not have to use the iisClientCertificateMappingAuthentication. The client certificate is accessible in the HttpContext.
var clientCert = HttpContext.Request.ClientCertificate;
Either enable RequireClientCertificate for the entire site or use a separate login-with-client-certificate page.
Below is one way of doing this in ASP.NET MVC. Hopefully you can use parts of it to fit your exact situation.
First make sure you are allowed to set the SslFlags in web.config by turning on feature delegation.
Make the site accept (but not require) client certificates.
Set the path to the login-with-client-certificate page where client certificates will be required; in this case a User controller with a CertificateSignIn action.
Create a login controller (pseudo-code)
[OutputCache(NoStore = true, Duration = 0, VaryByParam = "*")]
[AllowAnonymous()]
public ActionResult CertificateSignIn()
{
    // Get certificate
    var clientCert = HttpContext.Request.ClientCertificate;

    // Validate certificate
    if (!clientCert.IsPresent || !clientCert.IsValid)
    {
        ViewBag.LoginFailedMessage = "The client certificate was not present or did not pass validation";
        return View("Index");
    }

    // Call your "custom" ClientCertificate --> User mapping method.
    string userId;
    bool myCertificateMappingParsingResult = Helper.MyCertificateMapping(clientCert, out userId);

    if (!myCertificateMappingParsingResult)
    {
        ViewBag.LoginFailedMessage = "Your client certificate did not map correctly";
    }
    else
    {
        // Use a custom Membership provider. Password is not needed!
        if (Membership.ValidateUser(userId, null))
        {
            // Create the authentication ticket
            FormsAuthentication.SetAuthCookie(userId, false);
            Response.Redirect("~/");
        }
        else
        {
            ViewBag.LoginFailedMessage = "Login failed!";
        }
    }

    return View("Index");
}
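Helper.MyCertificateMapping above is your own code; the Helper class below is purely hypothetical and only illustrates one way to map a client certificate to a user id by its thumbprint (swap the in-memory dictionary for your real user store or ASP.NET Identity lookup):
public static class Helper
{
    // Hypothetical lookup table; replace with a database / ASP.NET Identity query.
    private static readonly Dictionary<string, string> ThumbprintToUserId =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { "<thumbprint of a known client certificate>", "someUserId" }
        };

    public static bool MyCertificateMapping(HttpClientCertificate clientCert, out string userId)
    {
        // Wrap the raw certificate bytes to get a normalized thumbprint.
        var cert = new X509Certificate2(clientCert.Certificate);
        return ThumbprintToUserId.TryGetValue(cert.Thumbprint, out userId);
    }
}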
This is a complicated issue, so bear with me.
Scenario: using an ASHX proxy to relay requests to an ArcGIS server.
I am trying to use ASP.NET impersonation so that the proxy uses the logged-in ASP.NET user's credentials when sending requests to the ArcGIS server.
Issue: the proxy's request to the ArcGIS server is refused with a 401, even though I know the impersonated accounts (sean.ryan-B and sean.ryan) do have access.
There are 4 machines:
1. the machine hosting the proxy page, where I am logged in as sean.ryan-B
2. a test machine, where I am logged in as sean.ryan-B
3. my laptop, where I am logged in as sean.ryan
4. the ArcGIS server.
All 4 machines are on the same domain.
web.config:
<authentication mode="Windows"/>
<identity impersonate="true" /> <!-- userName="EUROPE\sean.ryan-B" password="xxx" -->
<authorization>
<deny users="?"/>
</authorization>
Test 1. Opening a test page in the same web app as the proxy, via the proxy:
http://myHost.com/sean/ProxyAsp.Net/ArcGisProxy.ashx?http://myHost.com/sean/ProxyAsp.Net
[ok on all boxes 1-3]
This looks fine; impersonation appears to be working, since with impersonation OFF, WindowsIdentity.GetCurrent().Name is the app pool account, and with impersonation ON it is EUROPE\sean.ryan or EUROPE\sean.ryan-B.
Test 2. Opening an image hosted on the same IIS server (but a different site), via the proxy:
http://myHost.com/sean/ProxyAsp.Net/ArcGisProxy.ashx?http://myHost.com:10400/sites/CaSPER/SiteAssets/CaSPER.jpg
[ok on boxes 1-3]
Test 3. Opening the ArcGIS map URL via the proxy:
http://myHost.com/sean/ProxyAsp.Net/ArcGisProxy.ashx?http://mapserver1.com/ArcGIS/rest/services/Global/2D_BaseMap_SurfaceGeology/MapServer?f=json&callback=dojo.io.script.jsonp_dojoIoScript1._jsonpCallback
[fails on boxes 2 and 3 but succeeds on the proxy host (box 1)!]
Code for the ASHX code-behind:
public partial class ArcGisProxy : IHttpHandler, IReadOnlySessionState // ASHX implements IReadOnlySessionState in order to be able to read from session
{
    public void ProcessRequest(HttpContext context)
    {
        try
        {
            HttpResponse response = context.Response;

            // Get the URL requested by the client (take the entire querystring at once
            // to handle the case of the URL itself containing querystring parameters)
            string uri = context.Request.Url.Query;
            uri = uri.Substring(1); // the Substring(1) is to skip the ?, in order to get the request URL.

            System.Net.HttpWebRequest req = (System.Net.HttpWebRequest)WebRequest.Create(uri);
            {
                req.Credentials = CredentialCache.DefaultCredentials; // this works on the local box, with the -B account. This is the account the web browser is running under (rather than the account logged into CaSPER with, as the ASHX has a separate server session).
                req.ImpersonationLevel = TokenImpersonationLevel.Impersonation;
            }

            // To turn off caching: req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);
            req.Method = context.Request.HttpMethod;
            req.ServicePoint.Expect100Continue = false;
            req.Referer = context.Request.Headers["referer"];

            // Set body of request for POST requests
            req.Method = "GET";

            // Send the request to the server
            System.Net.WebResponse serverResponse = null;
            try
            {
                serverResponse = req.GetResponse();
            }
            catch (System.Net.WebException webExc)
            {
                //logger.Log(GetMyUrl(), webExc, context.Request);
                response.StatusCode = 500;
                response.StatusDescription = webExc.Status.ToString();
                response.Write(webExc.ToString());
                response.Write(webExc.Response);
                response.Write("Username = " + context.User.Identity.Name + " " + context.User.Identity.IsAuthenticated + " " + context.User.Identity.AuthenticationType);
                response.End();
                return;
            }

            // Set up the response to the client
            ....
            ......
            response.End();
        }
        catch (Exception ex)
        {
            throw;
        }
    }

    public bool IsReusable
    {
        get
        {
            return false;
        }
    }
}
Note: with either of the following changes, the proxy request to the map server DOES succeed:
a) setting the identity element in web.config with an explicit userName and password for the sean.ryan-B account,
-OR-
b) setting the App Pool account to sean.ryan-B and turning OFF impersonation in web.config.
However, these changes are not acceptable for Production.
The problem seems to be that ASP.NET impersonation works well enough for the test page and the image hosted on the same IIS server (tests 1 and 2), but NOT well enough for the map server.
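One thing worth checking here (a diagnostic sketch, not part of the original code): inside ProcessRequest, log which identity and impersonation level the handler actually runs with. On .NET 4 or later WindowsIdentity exposes ImpersonationLevel, and a token at the Impersonation level can generally be used against local resources but cannot be forwarded to a second machine, which would match tests 1 and 2 passing while test 3 fails.
// Diagnostic: what identity and impersonation level is the ASHX really running under?
// (Requires System.Security.Principal; ImpersonationLevel needs .NET 4 or later.)
WindowsIdentity current = WindowsIdentity.GetCurrent();
context.Response.Write(string.Format(
    "Identity: {0}, AuthenticationType: {1}, ImpersonationLevel: {2}",
    current.Name,
    current.AuthenticationType,
    current.ImpersonationLevel));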
As far as I know, the ArcGIS map server uses Negotiate and then Kerberos authentication.
With Wireshark I monitored a successful proxy request and found that, after the 401, the proxy sends a GET with an Authorization header using SPNEGO (Kerberos).
Has anyone had a similar issue with an ArcGIS proxy?
My theory is that the impersonation on box 1 'works better' because the browser is running on the same box as the proxy.
Could the ArcGIS Server (or the IIS site it is using) be restricted to prevent it from accepting impersonation?
Any suggestions welcome ...
P.S. I had a hard time getting this post through; I had to format most of it as code because Stack Overflow kept detecting it as source code!
I have a web role in Azure that has to connect to an SSL-secured external web service. When the application tries to connect to the web service, it's giving an error:
Could not establish trust relationship for the SSL/TLS secure channel
with authority 'certname.organization.org'.
The certificate that it needs has been uploaded to Azure as a service certificate, but for some reason it doesn't seem to be properly referencing it or using it.
Any thoughts on how to fix this?
That sounds like your service client in Azure isn't happy with the SSL certificate of the external service you're calling - do you have control of that service?
You can test this by using the following to ignore SSL errors from your client in Azure:
ServicePointManager.ServerCertificateValidationCallback =
(obj, certificate, chain, errors) => true;
I've seen this problem intermittently as well. In my case it turned out that the network connection used to fetch one of the root certificates would sometimes time out; on subsequent requests it would work again.
I ended up writing a custom callback that lets the particular certificate I was interested in pass despite the errors, without affecting validation of other certificates. Below is my code for that. As you can probably tell, I'm trying to hit the Android Cloud-to-Device Messaging endpoint and work around problems with the wildcard certificate that Google uses, but it should be generalizable. It also includes all the logging I used to diagnose the particular error. Even if you don't want to force validation of the certificate, the logging code could help you decide how to proceed.
private static readonly Uri PUSH_URI = new Uri("https://android.apis.google.com/c2dm/send", UriKind.Absolute);
/**
//The following function needs to be wired up in code somewhere else, like this:
ServicePointManager.ServerCertificateValidationCallback += ValidateDodgyGoogleCertificate;
**/
/// <summary>
/// Validates the SSL server certificate. Note this is process-wide code.
/// Wrote a custom one because the certificate used for Google's push endpoint is not for the correct domain. Go Google.
/// </summary>
/// <param name="sender">either a host name string, or an object derived from WebRequest</param>
/// <param name="cert">The certificate used to authenticate the remote party.</param>
/// <param name="chain">The chain of certificate authorities associated with the remote certificate.</param>
/// <param name="sslPolicyErrors">One or more errors associated with the remote certificate.</param>
/// <returns>
/// Returns a boolean value that determines whether the specified
/// certificate is accepted for authentication; true to accept or false to
/// reject.
/// </returns>
private static bool ValidateDodgyGoogleCertificate(object sender, X509Certificate cert, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    if (sslPolicyErrors == SslPolicyErrors.None)
    {
        // Good certificate.
        return true;
    }

    string hostName = sender as string;
    if (hostName == null)
    {
        WebRequest senderRequest = sender as WebRequest;
        if (senderRequest != null)
        {
            hostName = senderRequest.RequestUri.Host;
        }
    }

    // We want to get past the Google name mismatch, but not allow any other errors
    if (sslPolicyErrors != SslPolicyErrors.RemoteCertificateNameMismatch)
    {
        StringBuilder sb = new StringBuilder();
        sb.AppendFormat("Rejecting remote server SSL certificate from host \"{0}\" issued to Subject \"{1}\" due to errors: {2}", hostName, cert.Subject, sslPolicyErrors);

        // Log chain details when the chain-errors flag is set (bitwise AND tests the flag).
        if ((sslPolicyErrors & SslPolicyErrors.RemoteCertificateChainErrors) != SslPolicyErrors.None)
        {
            sb.AppendLine();
            sb.AppendLine("Chain status errors:");
            foreach (var chainStatusItem in chain.ChainStatus)
            {
                sb.AppendFormat("Chain Item Status: {0} StatusInfo: {1}", chainStatusItem.Status, chainStatusItem.StatusInformation);
                sb.AppendLine();
            }
        }

        log.Info(sb.ToString());
        return false;
    }

    if (PUSH_URI.Host.Equals(hostName, StringComparison.InvariantCultureIgnoreCase))
    {
        return true;
    }

    log.Info("Rejecting remote server SSL certificate from host \"{0}\" issued to Subject \"{1}\" due to errors: {2}", hostName, cert.Subject, sslPolicyErrors);
    return false;
}
Ignoring SSL errors is one thing you can do.
But if it works on your machine, and it doesn't work on your instances it might also be that the certificate chain is incomplete on the instances. You'll need to open the certificate on your machine, go to Certification Path and export each certificate in the path.
Then, add these certificates to your project and have a startup task (.bat or .cmd file) add them to the trusted root CA:
REM Install certificates.
certutil -addstore -enterprise -f -v root Startup\Certificates\someROOTca.cer
certutil -addstore -enterprise -f -v root Startup\Certificates\otherROOTca.cer
I added the .cer file to the root of my project, set it to "Copy Always", and used the following command to make Azure connect to a server that uses a self-signed SSL certificate:
REM Install certificates.
certutil -addstore -enterprise -f -v root startsodev.cer
I worked on a sample application integrating OpenID into ASP.NET Web Forms. It works fine when hosted locally on my machine. However, when I uploaded the application to a live server, it started giving "Login Failed".
You can try a sample here: http://samples.bhaidar.net/openidsso
Any ideas?
Here is the source code that fails to process the OpenID response:
private void HandleOpenIdProviderResponse()
{
    // Define a new instance of the OpenIdRelyingParty class
    using (var openid = new OpenIdRelyingParty())
    {
        // Get the authentication response from the OpenID Provider; the
        // IAuthenticationResponse instance is used to retrieve the response from the OP
        var response = openid.GetResponse();

        // No authentication request was sent
        if (response == null) return;

        switch (response.Status)
        {
            // The user was authenticated
            case AuthenticationStatus.Authenticated:
                // This is where you would look for any OpenID extension responses included
                // in the authentication assertion.
                var fetchResponse = response.GetExtension<FetchResponse>();

                // Store the "Queried Fields"
                Session["FetchResponse"] = fetchResponse;

                // Use FormsAuthentication to tell ASP.NET that the user is now logged in,
                // with the OpenID Claimed Identifier as their username.
                FormsAuthentication.RedirectFromLoginPage(response.ClaimedIdentifier, false);
                break;

            // The user cancelled the OpenID dance
            case AuthenticationStatus.Canceled:
                this.loginCanceledLabel.Visible = true;
                break;

            // Authentication failed
            case AuthenticationStatus.Failed:
                this.loginFailedLabel.Visible = true;
                break;
        }
    }
}
As Andrew suggested, check the exception. In my case, my production server's time & date were off and it wouldn't authenticate because the ticket expired.
Turn on logging on your live server and inspect the logs for additional diagnostics. It's most likely a firewall or permissions problem on your server that is preventing outbound HTTP requests.
You may also find it useful to look at the IAuthenticationResponse.Exception property for clues when authentication fails.
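In the sample's switch statement that means logging the exception in the Failed branch; a minimal sketch, reusing the loginFailedLabel from the question and writing the details to the trace log for diagnosis:
// Authentication failed: surface the underlying exception for diagnostics.
case AuthenticationStatus.Failed:
    this.loginFailedLabel.Visible = true;
    if (response.Exception != null)
    {
        System.Diagnostics.Trace.TraceError(
            "OpenID authentication failed: " + response.Exception);
    }
    break;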