Can I code HTTP requests to work around SSL warnings? - http

I have code that makes a server-side call to another web service. In many situations this is over SSL. We have problems where this fails whenever the stars are not completely aligned. I realize that the certificate must be trusted, and the internal certificate authorities (CAs) likely have to be manually imported into the trusted root certificate store on the servers. However, there are other cases as well. Today I ran into a problem where, if I use a browser to hit the URL, I get a warning stating that:
The name on the security certificate is invalid or does not match the name of the site
This is just a warning that you can acknowledge with the browser and tell it to continue to the site.
However, it seems to prevent my code from working. Here is a condensed version of my code:
HttpWebRequest request = WebRequest.Create(url) as HttpWebRequest;
request.Credentials = new NetworkCredential(user, password);
HttpWebResponse response = request.GetResponse() as HttpWebResponse;
Looking at HttpWebRequest, I don't see anything obvious that would help me out, and I don't currently have the luxury of trying things out in my development environment. Is there a simple change I can make to fix this? Are there other similar gotchas I can prevent this way as well?

Kirk, have you looked into using ServerCertificateValidationCallback and simply returning true (as a test), then augmenting the error checking for the special cases? You can also check out "What is the best way of handling non-validating SSL certificates in C#" for some other info on checking certificates.
Working solution:
HttpWebRequest request = WebRequest.Create(url) as HttpWebRequest;
request.Credentials = new NetworkCredential(user, password);
// Accept every certificate -- fine as a test, but this disables validation entirely.
ServicePointManager.ServerCertificateValidationCallback =
    delegate(object certsender, X509Certificate cert, X509Chain chain, System.Net.Security.SslPolicyErrors error)
    {
        return true;
    };
HttpWebResponse response = request.GetResponse() as HttpWebResponse;
See also the SslPolicyErrors enumeration for the error values to examine in the delegate.
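If unconditionally returning true is too permissive outside of testing, one way to do the "augmented error checking" mentioned above is to tolerate only the name-mismatch case from the question and keep rejecting everything else. A minimal sketch (the policy choice here is an assumption, not part of the original answer):
// Sketch: accept valid certificates, tolerate only a name mismatch,
// and keep rejecting every other SSL policy error.
ServicePointManager.ServerCertificateValidationCallback =
    delegate(object sender, X509Certificate cert, X509Chain chain, System.Net.Security.SslPolicyErrors errors)
    {
        if (errors == System.Net.Security.SslPolicyErrors.None)
            return true; // certificate validated cleanly

        // The warning from the question: certificate name does not match the site.
        return errors == System.Net.Security.SslPolicyErrors.RemoteCertificateNameMismatch;
    };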

Related

How to do Preemptive authentication using Java 11 HTTP client?

I am trying to use the Java 11 HTTP Client against an authenticated service, using Basic Authentication.
The authentication succeeds, but it makes an additional round-trip to the server before it understands that it should send the authentication data.
I have searched the documentation and code; at some point it internally uses some kind of cache, but I am unable to set the cache value.
Here is my client code:
HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("http://someurl.com"))
        .build();
HttpClient client = HttpClient.newBuilder()
        .authenticator(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("user", "pass".toCharArray());
            }
        })
        .build();
HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
What I expected was that I could somehow tell the client to preemptively send the authentication data, not only when the server requests it.
The HttpClient behaves in the same way as HttpURLConnection as far as preemptive authentication is concerned: for basic authentication it will preemptively send the credentials if it finds them in its cache. However, the cache is only populated after the first successful request (more exactly, after the response headers indicating that authentication was successful have been parsed).
If this is not satisfactory for you, one possibility is to handle authentication directly in your code by preemptively inserting the Authorization header into your request and not setting any Authenticator.
Thanks to @daniel, this is the solution I came up with: adding the header to the HttpRequest and removing the Authenticator.
String encodedAuth = Base64.getEncoder()
.encodeToString(("user" + ":" + "pass").getBytes(StandardCharsets.UTF_8));
HttpRequest request = HttpRequest.newBuilder()
.uri(URI.create("http://someurl.com"))
.header("Authorization", "Basic " + encodedAuth)
.build();
HttpClient client = HttpClient.newHttpClient();
HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
I wish the client had some other way to be told to send authentication data preemptively rather than manually creating the header, but this way it works.

An API service I use is disabling SSL 3.0 because of the POODLE exploit. If I use HttpClient and HttpRequestMessage do I need to change my code?

Say I do typical stuff like this:
HttpRequestMessage requestMessage = new HttpRequestMessage();
requestMessage.RequestUri = new Uri("https://api.site.com/");
HttpClient httpClient = new HttpClient();
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
Task<HttpResponseMessage> responseMessage = httpClient.SendAsync(requestMessage);
And the API service tells me they're turning off SSL 3.0 because of the POODLE exploit, do I have to do anything to my code to make sure it will use TLS? Does anything need to happen with the machine making the request?
I'm aware that there's a significant amount of ignorance baked into the question, but I think a lot of developers just want this question answered.
I faced the same issue with Facebook and Twitter yesterday, and with LinkedIn starting today. Adding the following line of code before every web request worked for me.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
It was quite confusing because, after restarting the application, it worked for 10-15 minutes before the first error. The following error was logged:
System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.
I tried a PowerShell script to disable SSL and enable TLS on the server, using http://www.hass.de/content/setup-your-iis-ssl-perfect-forward-secrecy-and-tls-12
It did not work, and I finally had to change the code to make it work.
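For what it's worth, ServicePointManager.SecurityProtocol is a static, process-wide setting, so it only needs to be assigned once at startup rather than before every request. A minimal sketch, assuming .NET Framework 4.5 or later where the Tls11 and Tls12 values exist:
// Set once at application startup (Main, Global.asax Application_Start, etc.).
// The values are flags, so several protocols can be allowed at the same time.
ServicePointManager.SecurityProtocol =
    SecurityProtocolType.Tls12
    | SecurityProtocolType.Tls11
    | SecurityProtocolType.Tls;   // TLS 1.0 kept only as a last-resort fallback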

Why is Web Request content truncated

I have a web service which is doing some screen scraping of an ASPX website.
I can get it to log in successfully, but when I then submit a request, it returns a server error. When I check it out with Fiddler, it shows that the content (the query string) is being truncated, so not all of it is submitted. The content is quite long, over 3,600 characters. (Not my choice; it's just the way the website was created and what it expects.)
HttpWebRequest webRequest = WebRequest.Create(REQUESTUSAGE) as HttpWebRequest;
webRequest.CookieContainer = this.Cookies;
webRequest.Method = "POST";
webRequest.ContentType = "application/x-www-form-urlencoded";
StreamWriter requestWriter = new StreamWriter(webRequest.GetRequestStream());
requestWriter.Write(GetPostDataForRequest());
WebResponse response = null;
try
{
response = webRequest.GetResponse();
}
catch (Exception ex)
{}
The GetPostDataForRequest returns the content, but like I said, Fiddler shows it is missing the last 600 chars or so for no apparent reason. The debugger shows the string is returned as expected, but somehow it does not get written correctly.
So how do I get it to submit the full string?
OK, I got this resolved. I was not closing the requestWriter.
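For anyone hitting the same thing: a StreamWriter buffers its output, so the tail of the POST body is lost unless the writer is flushed or closed before the response is read. A minimal sketch of the fix, using the same code as the question but with the writer wrapped in a using block:
HttpWebRequest webRequest = WebRequest.Create(REQUESTUSAGE) as HttpWebRequest;
webRequest.CookieContainer = this.Cookies;
webRequest.Method = "POST";
webRequest.ContentType = "application/x-www-form-urlencoded";
// Disposing the writer flushes its buffer into the request stream,
// so the full body is sent.
using (StreamWriter requestWriter = new StreamWriter(webRequest.GetRequestStream()))
{
    requestWriter.Write(GetPostDataForRequest());
}
WebResponse response = webRequest.GetResponse();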
There are usually limits to the request size - take a look at maximum length of HTTP GET request?
It appears that you are running into a browser issue, not a server issue. Can you issue the request with a command-line tool (e.g. wget) to verify that it is not a problem with the server?
You can also try a different browser, which may have different limits.

strange cache issue with HttpWebRequest

We have a web app that calls a URL of an outside search provider to load search results. Each time our page is hit, it can pass in a current page number, which is included in the URL sent to the third party. I've noticed that while they report 45 pages of results, if I go to one of the pages that includes their results and then try to navigate to another page that has additional results from them, the same results from the first page are loaded.
I tried setting up my HttpWebRequest object to disable caching, but nothing I've tried seems to work. And since the URL changes each time due to the page number, I wouldn't expect it to be a cache issue at all. But here's where it gets interesting.
If I copy the URL that I'm retrieving in code and paste it into Chrome, it loads the correct results. If I then refresh the web app page, it too now loads the results for that page. This makes no sense to me. The code is running locally, but since it's running within ASP.NET it isn't using Chrome to create the web request, so why does this happen?
Here's the code I have that calls the url and returns the result.
public static string FetchPage(string url)
{
//Specify the encoding
Encoding enc = System.Text.Encoding.GetEncoding(1252);
HttpWebRequest.DefaultCachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.Default);
//Create the http request
var request = (HttpWebRequest)WebRequest.Create(url);
request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.NoCacheNoStore);
//Create the http response from the http request
var httpWebResponse = (HttpWebResponse)request.GetResponse();
//Get the response stream
var responseStream = new StreamReader(httpWebResponse.GetResponseStream(), enc);
//Read the response stream
string htmlStream = responseStream.ReadToEnd();
//Close the web response
httpWebResponse.Close();
//Close the response stream
responseStream.Close();
return htmlStream;
}
Try the following lines to set no-cache on both the HttpWebRequest and the WebResponse.
For Request
Request.Headers.Set(HttpRequestHeader.CacheControl, "max-age=0, no-cache, no-store");
Request.CachePolicy = new System.Net.Cache.RequestCachePolicy(System.Net.Cache.RequestCacheLevel.NoCacheNoStore);
For Response
Response.Headers.Add(HttpRequestHeader.CacheControl, "no-cache");
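Adapted to the FetchPage method above, the request side would look roughly like this (a sketch; note that adding a header to an HttpWebResponse you have already received has no effect on the server, so only the request side applies here):
var request = (HttpWebRequest)WebRequest.Create(url);
// Ask the server and any intermediate proxies not to serve a cached copy.
request.Headers.Set(HttpRequestHeader.CacheControl, "max-age=0, no-cache, no-store");
// Also bypass the local WinINet/.NET cache for this request.
request.CachePolicy = new System.Net.Cache.RequestCachePolicy(
    System.Net.Cache.RequestCacheLevel.NoCacheNoStore);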

Authenticated HttpWebRequest with redirection, persisting credentials?

My ASP.NET 2.0 app creates an HttpWebRequest to a site within a company's intranet, which uses NTLM authentication. The credentials passed are for a service account, which is authenticated on the domain successfully (the security log confirms this).
Some abbreviated code follows:
HttpWebRequest req = WebRequest.Create(queryUrl) as HttpWebRequest;
NetworkCredential cred = new NetworkCredential(username, pwd, domain);
req.Credentials = cred;
HttpWebResponse response = req.GetResponse() as HttpWebResponse;
As part of the request, there are a couple of redirections (within the same domain) to the final response - which is handled OK on my dev machine (Windows 2k)
When this request is created from my deployment environment (Windows 2k3), I get a 401 Unauthorized error returned from the site, seemingly after the first redirect code is returned (301 Moved), and my request object attempts to follow the redirect.
So basically, does anyone know of any issues surrounding authenticated HttpWebRequests that follow redirections?
PS - The obvious workaround is to simply request the page redirected to, but the admins in charge of the intranet site want to monitor my app's usage by redirecting me through a specific page.
For HttpWebRequest to reuse credentials across redirects you need to use a credential cache.
If you just assign a NetworkCredential object, it will only be used on the first request.
Here is an example:
HttpWebRequest req = WebRequest.Create(queryUrl) as HttpWebRequest;
NetworkCredential cred = new NetworkCredential(username, pwd, domain);
var cache = new CredentialCache {{queryUrl, "Ntlm", cred}};
req.Credentials = cache;
HttpWebResponse response = req.GetResponse() as HttpWebResponse;
It's going to depend on how your auth scheme works. The NetworkCredential is only going to help for the NTLM part of it. I suspect that the site you are trying to access also uses forms authentication. If that is the case, when you log in you should get an auth cookie, and you will need to include it in subsequent requests, e.g. after a redirect. The HttpWebRequest object has a CookieContainer property you can use to hold the cookie. It might be a good idea to use Fiddler or Firebug to see what comes across when you browse normally.
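A minimal sketch of that cookie-persistence idea (assuming the site does issue an auth cookie; cookies set at any step of the redirect chain are then replayed automatically on the follow-up requests):
// Cookies received at any step (including auto-followed redirects) are stored
// here and sent automatically on subsequent requests to the same domain.
CookieContainer cookies = new CookieContainer();
HttpWebRequest req = WebRequest.Create(queryUrl) as HttpWebRequest;
req.CookieContainer = cookies;
req.Credentials = new NetworkCredential(username, pwd, domain);
HttpWebResponse response = req.GetResponse() as HttpWebResponse;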
If you are using NTLM, this is the classic two-hop problem. It works on your dev machine because the client and the server are on the same box, so the credentials are passed at most once (to the redirect's final target machine, I'm guessing).
When you deploy to your prod environment, there are three machines involved. The client browser passes credentials to server1, then server1 tries to pass the credentials to server2, which is not allowed. One workaround is to implement Kerberos authentication (a stricter protocol), which allows server1 to pass credentials on to server2.