WCF - 403 until I restart the AppPool - asp.net

I have a rather curious (a.k.a. annoying) issue.
I've set up a WCF web service repository on my local workstation (still under development).
I've published the project under the local IIS with its own dedicated application pool: .NET v4.0.30319, Integrated pipeline mode.
Some of the web services in this solution make REST calls to other external HTTPS web services (over the Internet).
After, let's say, a day or even less, all the REST calls made by the published solution start failing with a 403 Forbidden error.
If I restart the application pool under which this solution is running, all these 403 errors are gone - for another couple of hours or a day at most, of course - and this is the part that's driving me crazy!
Without any change to the code, or anything else, just restarting the application pool makes the web service calls work again.
Also, while I'm getting the 403 errors, if I run the project in debug mode (VS + IIS Express), the calls to the external web services work perfectly (no 403).
I could provide code samples, but as stated above, I don't really think this is a code-related issue, because once I restart the app pool everything starts working again...
Has anyone ever encountered such an annoying issue?
Please help!
Below are the application pool's settings.
Later edit:
Perhaps it's worth mentioning that for the REST calls to the external web services, I use the same generic method. Here's the code:
public static string GetRestCall(string requestUri, string contentType, string method, string postData,
    bool removeServerCertificateValidationCallBack = true, bool byPassCachePolicy = true,
    bool keepAliveRequest = true, string userAgent = null, string acceptHeaderString = null,
    string referer = null, CookieContainer cookieContainer = null, int timeOutRequest = 60000)
{
    try
    {
        if (removeServerCertificateValidationCallBack)
        {
            // Note: this accepts any server certificate.
            ServicePointManager.ServerCertificateValidationCallback += (se, cert, chain, sslerror) =>
            {
                return true;
            };
        }
        //ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;

        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(requestUri);
        req.KeepAlive = keepAliveRequest;

        if (byPassCachePolicy)
        {
            RequestCachePolicy cachePolicy = new RequestCachePolicy(RequestCacheLevel.BypassCache);
            req.CachePolicy = cachePolicy;
            req.Expect = null;
        }

        if (!string.IsNullOrEmpty(method))
            req.Method = method;
        if (!string.IsNullOrEmpty(acceptHeaderString))
            req.Accept = acceptHeaderString;
        if (!string.IsNullOrEmpty(referer))
            req.Referer = referer;
        if (!string.IsNullOrEmpty(contentType))
            req.ContentType = contentType;

        if (!string.IsNullOrEmpty(userAgent))
        {
            req.UserAgent = userAgent;
        }
        else
        {
            req.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36";
        }

        if (cookieContainer != null)
            req.CookieContainer = cookieContainer;

        req.Timeout = timeOutRequest;

        // Write the request body.
        using (Stream stm = req.GetRequestStream())
        {
            using (StreamWriter stmw = new StreamWriter(stm))
            {
                stmw.Write(postData);
            }
        }

        // Read the response body as a string.
        using (WebResponse response = req.GetResponse())
        {
            using (StreamReader streamReader = new StreamReader(response.GetResponseStream()))
            {
                return streamReader.ReadToEnd();
            }
        }
    }
    catch
    {
        throw;
    }
}
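One thing worth pointing out about this helper (it may or may not be related to the 403s): ServicePointManager.ServerCertificateValidationCallback is a process-wide static delegate, and the code above attaches a new handler with += on every call, so the invocation list keeps growing for the lifetime of the app pool. If the "accept anything" behaviour is wanted at all, a minimal sketch of registering it once (assuming you control the hosting class; this is not the original code) could look like this:

// Hypothetical one-time setup, e.g. in Global.asax Application_Start or a static constructor.
// WARNING: returning true unconditionally disables certificate validation entirely and
// should only be used for development/diagnostics.
private static bool _callbackRegistered;
private static readonly object _callbackLock = new object();

private static void EnsureValidationCallbackRegistered()
{
    lock (_callbackLock)
    {
        if (_callbackRegistered) return;
        ServicePointManager.ServerCertificateValidationCallback +=
            (sender, certificate, chain, sslPolicyErrors) => true;
        _callbackRegistered = true;
    }
}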

Related

IIS removes "Content-Length" from Header

We use a WebMethod on our IIS web service so that users can download a file.
Our client runs into an exception when connected to one of our customers' web services, because the key "Content-Length" cannot be found in the headers (KeyNotFoundException). The method works for all other customers.
The customer installed a fresh copy of Windows Server 2016, and one of my colleagues then installed the IIS roles and features. We double- and triple-checked: the configuration is the same as on our internal web service and, as far as we know, the same as on all the web services other customers run.
After debugging and searching the internet for the past two days, I found out that instead of the "Content-Length" header, a "Transfer-Encoding" header is sent with the value "chunked".
It seems that this only occurs when we call the method via POST, but I'm not completely sure about that.
What we have tried so far:
Disabling chunked encoding with this script: cscript adsutil.vbs set
/W3SVC/AspEnableChunkedEncoding "FALSE"
Disabling chunked-encoding via appcmd: appcmd set config /section:asp /enableChunkedEncoding:False
Setting system.webServer/asp/enableChunkedEncoding to false via the IIS configuration manager of both the server AND the site.
We restarted the whole machine after each of these steps.
IIS Webmethod Code (shortened):
[WebMethod]
public void Download(string uniqueID)
{
    Byte[] data;
    var path = GetPath(uniqueID);
    data = File.ReadAllBytes(path);
    if (data != null)
    {
        string sExtension = Path.GetExtension(path);
        string MimeType = GetMIMEType(sExtension);
        HttpContext.Current.Response.ClearContent();
        HttpContext.Current.Response.ClearHeaders();
        HttpContext.Current.Response.ContentType = "application/octet-stream";
        HttpContext.Current.Response.AddHeader("Content-Disposition", "attachment;filename=" + path.Replace(" ", "_"));
        HttpContext.Current.Response.AddHeader("Content-Type", MimeType);
        HttpContext.Current.Response.AddHeader("Content-Length", data.Length.ToString());
        HttpContext.Current.Response.BinaryWrite(data);
        HttpContext.Current.Response.Flush();
        HttpContext.Current.Response.SuppressContent = true;
        HttpContext.Current.ApplicationInstance.CompleteRequest();
    }
}
Client Code (shortened, written in Xamarin.Android as an example, same error occurs on iOS)
Stream stream = null;
Java.IO.DataOutputStream dos = null;
var urlConnection = CreateUrlConnection();
urlConnection.DoInput = true;   // Allow Inputs
urlConnection.DoOutput = true;  // Allow Outputs
urlConnection.RequestMethod = "POST";
urlConnection.SetRequestProperty("Content-Type", "application/x-www-form-urlencoded");
dos = new Java.IO.DataOutputStream(urlConnection.OutputStream);
string encodedParameters = "";
bool first = true;
foreach (Parameter param in parameters)
{
    if (first)
        first = false;
    else
        encodedParameters += "&";
    encodedParameters += HttpUtility.UrlEncode(param.Name);
    encodedParameters += "=";
    encodedParameters += HttpUtility.UrlEncode(param.Value);
}
dos.WriteBytes(encodedParameters);
dos.Flush();
dos.Close();
stream = urlConnection.InputStream;
var header = urlConnection.HeaderFields;
var bytesToRead = int.Parse(header["Content-Length"][0]); // Exception gets thrown here
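For what it's worth, a "Transfer-Encoding: chunked" response simply carries no Content-Length header, so any client that indexes the header dictionary by that key will throw. A defensive sketch on the client side (reusing the urlConnection from the snippet above, which already exposes InputStream as a System.IO.Stream) is to read the stream to its end instead of pre-sizing a buffer from Content-Length:

// Sketch: read the response without relying on Content-Length, so it works for both
// plain "Content-Length" responses and "Transfer-Encoding: chunked" responses.
// `urlConnection` is assumed to be the same connection object as in the snippet above.
byte[] responseBytes;
using (var input = urlConnection.InputStream)
using (var buffer = new System.IO.MemoryStream())
{
    input.CopyTo(buffer);              // copies until end-of-stream, no length needed
    responseBytes = buffer.ToArray();
}
Console.WriteLine($"Downloaded {responseBytes.Length} bytes");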

HttpClient and WebClient requests / responses don't work for intranet with DefaultCredentials

Problem: I am unable to get content from a company-internal web page using HttpClient or WebClient. I am able to get the content by accessing the URL directly in the browser, however.
Details: .NET Core 3.1 Razor Pages, IIS 10, Windows Authentication.
I have a website http://myintranet/Editor/Bib/4343 where a user can press a button to generate a static page. Behind the scenes, it attempts to read a stream from http://myintranet/Editor/NewBib/4343/true and create a static HTML page from it.
When clicking the button, the response is always IIS 10.0 Detailed Error - 401.1 - Unauthorized etc.
However, when I access the web page directly in the browser, it opens just fine (note that the first time I access the site, the browser prompts me for my username and password; after that, it remembers the credentials).
Also note that when running it from localhost through Visual Studio, everything works fine and the static page downloads properly too.
Here is my code:
Version 1:
public IActionResult OnPostGenerateStaticPage()
{
    try
    {
        HttpClient client = new HttpClient(new HttpClientHandler() { UseDefaultCredentials = true, PreAuthenticate = true });
        HttpResponseMessage response = client.GetAsync(Url.PageLink().Replace("Bib", "NewBib") + "/true").Result;
        var indexPageContents = response.Content.ReadAsStreamAsync().Result;
        var cd = new System.Net.Mime.ContentDisposition
        {
            FileName = BibNumber + ".htm",
            Inline = false,
        };
        Response.Headers.Add("Content-Disposition", cd.ToString());
        return File(indexPageContents, "text/html");
    }
    catch (IOException)
    {
        return RedirectToPage("./Bib");
    }
}
Version 2:
public IActionResult OnPostGenerateStaticPage()
{
    try
    {
        WebClient client = new WebClient { UseDefaultCredentials = true };
        string desiredUrl = Url.PageLink().Replace("Bib", "NewBib") + "/true";
        var indexPageContents = client.OpenRead(desiredUrl);
        var cd = new System.Net.Mime.ContentDisposition
        {
            FileName = BibNumber + ".htm",
            Inline = false,
        };
        Response.Headers.Add("Content-Disposition", cd.ToString());
        return File(indexPageContents, "text/html");
    }
    catch (IOException)
    {
        return RedirectToPage("./Bib");
    }
}
Another thing: I have asked the web server admin to check that NTLM is listed above Negotiate in the authentication providers for this website, and it is. Also, Anonymous and Basic authentication are disabled and Windows Authentication is enabled.
Not sure where to go from here...
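One way to narrow this down (a diagnostic sketch only, not a confirmed fix): with UseDefaultCredentials the server-to-itself request runs as the app pool identity, which is a different situation from the browser sending your own credentials, and NTLM over a second hop or loopback-check settings can then produce a 401.1. Calling the internal URL with an explicit test credential makes it easier to see whether the failure follows the identity or the hop itself. The account name, password, domain, and helper name below are placeholders, not from the original code:

// Diagnostic sketch (hypothetical helper): call the internal URL with an explicit
// test credential instead of the app pool identity.
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class AuthProbe
{
    public static async Task ProbeAsync()
    {
        var handler = new HttpClientHandler
        {
            // Placeholder credentials - replace with a real test account, or use
            // CredentialCache.DefaultNetworkCredentials for the app pool identity.
            Credentials = new NetworkCredential("testUser", "testPassword", "MYDOMAIN"),
            PreAuthenticate = true
        };

        using (var client = new HttpClient(handler))
        {
            var response = await client.GetAsync("http://myintranet/Editor/NewBib/4343/true");
            Console.WriteLine($"Status: {(int)response.StatusCode} {response.ReasonPhrase}");
        }
    }
}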

.net core 2.0 proxy requests always result in http 407 (Proxy Authentication Required)

I'm trying to make HTTP requests via a WebProxy in a .NET Core 2.0 web application. The code I've got works fine in .NET Framework, so I know (believe) it's not an environmental issue. I've also tried making the request using both HttpWebRequest and HttpClient, but both mechanisms always result in a 407 (Proxy Authentication Required) HTTP error in .NET Core. It's as if in .NET Core the credentials I'm supplying are always being ignored.
Here is the code I've been using:
public void Test()
{
    var cred = new NetworkCredential("xxxxx", "yyyyyy");
    var proxyURI = new Uri("http://xxx.xxx.xxx.xxx:80");
    var destinationURI = new Uri("http://www.bbc.co.uk");
    WebProxy proxy = new WebProxy(proxyURI, false) { UseDefaultCredentials = false, Credentials = cred };
    MakeProxyRequestViaHttpWebRequest(proxy, destinationURI);
    MakeProxyRequestViaHttpClient(proxy, destinationURI);
}

private void MakeProxyRequestViaHttpClient(WebProxy proxy, Uri destination)
{
    HttpClientHandler handler = new HttpClientHandler()
    {
        Proxy = proxy,
        UseProxy = true,
        PreAuthenticate = true,
        UseDefaultCredentials = false
    };
    HttpClient client = new HttpClient(handler);
    HttpResponseMessage response = client.GetAsync(destination).Result;
    if (response.IsSuccessStatusCode)
    {
        HttpContent content = response.Content;
        string htmlData = content.ReadAsStringAsync().Result;
    }
    else
    {
        HttpStatusCode code = response.StatusCode;
    }
}

private void MakeProxyRequestViaHttpWebRequest(WebProxy proxy, Uri destination)
{
    HttpWebRequest req = HttpWebRequest.Create(destination) as HttpWebRequest;
    req.UseDefaultCredentials = false;
    req.Proxy = proxy;
    req.PreAuthenticate = true;
    using (WebResponse response = req.GetResponse())
    {
        using (StreamReader responseStream = new StreamReader(response.GetResponseStream()))
        {
            string htmlData = responseStream.ReadToEnd();
        }
    }
}
I've tried the following in .NET Core, but the result is always 407:
Run the code in a console app
Implement IWebProxy and use that as the proxy
Set default values for other properties on WebProxy, HttpClient, etc. (removed from the example above because it works fine on .NET Standard)
I've run out of ideas and things to try. I have the following questions:
Does the code need to be different between .NET Core and .NET Framework?
Are there additional things that need to go into appsettings.json (i.e. the config that would have gone into web.config)?
Is there any additional configuration code required in Startup.cs?
Should I be looking to use an external library?
How would I troubleshoot what the issue is? Fiddler doesn't seem to be helping, but then I haven't looked too hard at configuring it.
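One more thing that may be worth ruling out (a hedged suggestion, not a confirmed .NET Core 2.0 fix): HttpClientHandler also exposes a DefaultProxyCredentials property, and the credential can be tied to the proxy URI with an explicit authentication scheme via CredentialCache. The "Basic" scheme below is an assumption; the proxy may require "NTLM" or "Negotiate" instead, and the placeholder values mirror the ones in the question.

// Sketch: supply the proxy credential both on the proxy itself (bound to an explicit
// scheme via CredentialCache) and through HttpClientHandler.DefaultProxyCredentials.
var proxyUri = new Uri("http://xxx.xxx.xxx.xxx:80");
var cred = new NetworkCredential("xxxxx", "yyyyyy");

var credCache = new CredentialCache();
credCache.Add(proxyUri, "Basic", cred);   // change the scheme to match the proxy

var handler = new HttpClientHandler
{
    Proxy = new WebProxy(proxyUri, false) { Credentials = credCache },
    UseProxy = true,
    DefaultProxyCredentials = cred
};

using (var client = new HttpClient(handler))
{
    var response = client.GetAsync(new Uri("http://www.bbc.co.uk")).Result;
    Console.WriteLine(response.StatusCode);
}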

Keep TCP port open using httpclient in C#

I am a newbie to async programming and am trying to use HttpClient to fire bulk URL requests for page content.
Here is my attempt:
private async void ProcessUrlAsyncWithHttp(HttpClient httpClient, string purl)
{
    Stopwatch sw = new Stopwatch();
    sw.Start();
    HttpResponseMessage response = null;
    try
    {
        Interlocked.Increment(ref _activeRequestsCount);
        var request = new HttpRequestMessage()
        {
            RequestUri = new Uri(purl),
            Method = HttpMethod.Get,
        };
        request.Headers.TryAddWithoutValidation("User-Agent", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36");
        request.Headers.TryAddWithoutValidation("Accept", "text/html,*.*");
        request.Headers.TryAddWithoutValidation("Connection", "Keep-Alive");
        request.Headers.TryAddWithoutValidation("Accept-Encoding", "gzip, deflate, sdch");
        request.Headers.TryAddWithoutValidation("Accept-Language", "en-US,en;q=0.8");
        response = await httpClient.SendAsync(request).ConfigureAwait(false);
        string html = await response.Content.ReadAsStringAsync().ConfigureAwait(false);
        response.Dispose();
        if (IsCaptcha(html)) throw new Exception("Captcha was returned");
        request.Dispose();
        Interlocked.Increment(ref _successfulCalls);
    }
    catch (HttpRequestException hex)
    {
        Console.WriteLine("http:" + hex.Message);
        Interlocked.Increment(ref _failedCalls);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.GetType().AssemblyQualifiedName + " " + ex.Message);
        Interlocked.Increment(ref _failedCalls);
    }
    finally
    {
        Interlocked.Decrement(ref _activeRequestsCount);
        Interlocked.Decrement(ref _itemsLeft);
        if (response != null) response.Dispose();
        if (httpClient != null) httpClient.Dispose();
        sw.Stop();
        DateTime currentTime = DateTime.UtcNow;
        TimeSpan elapsedTillNow = (currentTime - _overallStartTime).Duration();
        Console.WriteLine("Left:" + _itemsLeft + ", Current execution:" + sw.ElapsedMilliseconds + " (ms), Average execution:" + Math.Round((elapsedTillNow.TotalMilliseconds / (_totalItems - _itemsLeft)), 0) + " (ms)");
        lock (_syncLock)
        {
            if (_itemsLeft == 0)
            {
                _overallEndTime = DateTime.UtcNow;
                this.DisplayTestResults();
            }
        }
    }
}
As you can see, I am passing an HttpClient to the function and it gets destroyed every time a URL is downloaded. I know this is overkill and ideally we should be reusing the HttpClient. But since I can't use a single HttpClient with different proxies per URL (the handler has to be passed to the HttpClient constructor and cannot be changed afterwards, so a fresh proxy can't be set without recreating the HttpClient object), I needed to use this approach.
At the caller side, I have a pretty basic code:
public async void TestAsyncWithHttp()
{
    ServicePointManager.DefaultConnectionLimit = 10;
    //ServicePointManager.UseNagleAlgorithm = false;
    List<string> urlList = SetUpURLList();
    urlList = urlList.GetRange(1, 50);
    _itemsLeft = urlList.Count();
    _totalItems = _itemsLeft;
    List<string> proxies = new List<string>();
    proxies.Add("124.161.94.8:80");
    proxies.Add("183.207.228.8:80");
    proxies.Add("202.29.97.5:3128");
    proxies.Add("210.75.14.158:80");
    proxies.Add("203.100.80.81:8080");
    proxies.Add("218.207.172.236:80");
    proxies.Add("218.59.144.120:81");
    proxies.Add("218.59.144.95:80");
    proxies.Add("218.28.35.234:8080");
    proxies.Add("222.88.236.236:83");
    Random rnd = new Random();
    foreach (string url in urlList)
    {
        int ind = rnd.Next(0, proxies.Count); // upper bound of Next is exclusive
        var httpClientHandler = new HttpClientHandler
        {
            Proxy = new WebProxy(proxies.ElementAt(ind), false),
            UseProxy = true
        };
        HttpClient httpClient = new HttpClient(httpClientHandler);
        //HttpClient httpClient = new HttpClient();
        httpClient.Timeout = TimeSpan.FromMinutes(2);
        ProcessUrlAsyncWithHttp(httpClient, url);
    }
}
Question is:
1) Why does the TCP port get closed after each request? I wanted to open at most the connection-limit number of ports and reuse them across calls. E.g. in the example above I can have 10 concurrent connections, so I wanted this to open 10 TCP ports and the remaining 40 requests would then reuse those 10 ports in tandem. This is the normal behaviour expected from HttpWebRequest; I have working HttpWebRequest code that shows this port reuse and can post it on demand for anyone who wants to have a look. So it's kind of weird that HttpClient does not mimic this behaviour, even though it is built on HttpWebRequest.
2) How do we set auto-redirect to false for such calls?
3) I intend to use this function for many calls - say around 50K. Is there anything wrong in the way the code is written that might need correcting?
4) Let's assume that I somehow manage to use a single HttpClient object instead of one object per request. How do I read the cookies for each of these individual requests, and alter them if necessary, bearing in mind that I have a single HttpClient instance for the whole set of URL requests?
Tks
Kallol
In my experience (I once had a similar problem with TCP port congestion, because ports were always getting closed when I was hitting a server with around 6000 connections a minute), it suffices to reuse the HttpClientHandler objects, which actually manage the connection pooling, and to recreate the HttpClient objects for each request (using the constructor that takes an HttpClientHandler parameter).
Hope this helps.
Matthias
Have you tried putting the HttpClient code in a class and creating 10 such classes, each with its own HttpClient?
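Following the suggestion above about reusing the handler, here is a minimal sketch of that pattern (one long-lived HttpClientHandler per proxy, reused across requests; the handler also carries the redirect setting and cookie container asked about in questions 2 and 4). The field and method names are illustrative, not from the original code, and it assumes one fixed proxy per handler:

// Sketch: one long-lived handler; HttpClient instances can be created per request over
// the shared handler as long as disposeHandler is false, so the underlying connections
// (and TCP ports) are pooled and reused.
private static readonly CookieContainer _cookies = new CookieContainer();

private static readonly HttpClientHandler _sharedHandler = new HttpClientHandler
{
    Proxy = new WebProxy("124.161.94.8:80", false),
    UseProxy = true,
    AllowAutoRedirect = false,    // question 2: disable auto-redirect on the handler
    UseCookies = true,
    CookieContainer = _cookies    // question 4: all requests share (and update) this jar
};

private static async Task<string> FetchAsync(string url)
{
    // disposeHandler: false keeps the handler (and its connection pool) alive after
    // the HttpClient is disposed, so the next call reuses the same connections.
    using (var client = new HttpClient(_sharedHandler, disposeHandler: false))
    {
        return await client.GetStringAsync(url).ConfigureAwait(false);
    }
}

Cookies set by the server end up in _cookies and can be read or altered between calls with _cookies.GetCookies(new Uri(url)).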

confusion about Certificates

I have a WCF REST web service hosted in IIS. It works over HTTPS; I generated a certificate in IIS and bound HTTPS to a port.
I exported the .cer through the IE browser. I created a test application, and regardless of whether I add a client certificate, add no certificate, or even add a wrong certificate, the connection takes place and I get a correct response. I am wondering how the message was decrypted if no certificate was sent.
Either the destination is not secured or I misunderstand the whole thing.
Also:
The error I have from the callback "CheckValidationResult()" is either
CertCN_NO_MATCH = 0x800B010F
or
"Unknown Certificate Problem", where certificateProblem (the parameter of CheckValidationResult) is 0 for this case.
What is the CertCN_NO_MATCH error, and what is CN?
See code below.
ServicePointManager.CertificatePolicy = new CertPolicy();
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(String.Format("https://{0}/uri", ip));
//request.ClientCertificates.Add(new X509Certificate("D:\\ThePubKey.cer"));
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
using (StreamWriter stream = new StreamWriter(request.GetRequestStream()))
{
    stream.Write("RequestType=CheckStatus&ReportType=Fulfillment&ReportID=5");
}
using (StreamReader stream = new StreamReader(request.GetResponse().GetResponseStream()))
{
    Response.ContentType = "text/xml";
    Response.Output.Write(stream.ReadToEnd());
    Response.End();
}

class CertPolicy : ICertificatePolicy
{
    public enum CertificateProblem : uint
    {
        CertEXPIRED = 0x800B0101,
        CertVALIDITYPERIODNESTING = 0x800B0102,
        CertROLE = 0x800B0103,
        CertPATHLENCONST = 0x800B0104,
        CertCRITICAL = 0x800B0105,
        CertPURPOSE = 0x800B0106,
        CertISSUERCHAINING = 0x800B0107,
        CertMALFORMED = 0x800B0108,
        CertUNTRUSTEDROOT = 0x800B0109,
        CertCHAINING = 0x800B010A,
        CertREVOKED = 0x800B010C,
        CertUNTRUSTEDTESTROOT = 0x800B010D,
        CertREVOCATION_FAILURE = 0x800B010E,
        CertCN_NO_MATCH = 0x800B010F,
        CertWRONG_USAGE = 0x800B0110,
        CertUNTRUSTEDCA = 0x800B0112
    }

    public bool CheckValidationResult(ServicePoint srvPoint, X509Certificate certificate, WebRequest request, int certificateProblem)
    {
        // You can do your own certificate checking.
        // You can obtain the error values from WinError.h.
        // Return true so that any certificate will work with this sample.
        String error = "";
        using (StringWriter writer = new StringWriter())
        {
            writer.WriteLine("Certificate Problem with accessing " + request.RequestUri);
            writer.Write("Problem code 0x{0:X8},", (int)certificateProblem);
            writer.WriteLine(GetProblemMessage((CertificateProblem)certificateProblem));
            error = writer.ToString();
        }
        return true;
    }

    private String GetProblemMessage(CertificateProblem Problem)
    {
        String ProblemMessage = "";
        CertificateProblem problemList = new CertificateProblem();
        String ProblemCodeName = Enum.GetName(problemList.GetType(), Problem);
        if (ProblemCodeName != null)
            ProblemMessage = ProblemMessage + "-Certificateproblem:" + ProblemCodeName;
        else
            ProblemMessage = "Unknown Certificate Problem";
        return ProblemMessage;
    }
}
I've just replied to this similar question (in Java).
CN is the "Common Name". It ought to be the hostname of the server to which you're connecting (unless it's in the subject alternative name). I guess from your code sample that you're using the IP address directly. In this case, the CN should be that IP address (it tends to be better to use a hostname rather than an IP address). See RFC 2818 (sec 3.1) for the specifications.
Note that the CN or subject alternative name is from the point of view of the client, so if you connect to https://some.example.com/, then the name in the cert should be some.example.com, if you connect to https://localhost/, then the name in the cert should be localhost, even if some.example.com and localhost may be the same server effectively.
(I guess that by default, IIS might generate a certificate for the external name, but you'd have to look at the certificate to know; this should be visible in the certificate properties somewhere.)
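As a side note (not part of the original answer): ICertificatePolicy, used in the question's code, has long been deprecated; the equivalent hook on current .NET Framework versions is ServicePointManager.ServerCertificateValidationCallback, where a CN/hostname mismatch surfaces as SslPolicyErrors.RemoteCertificateNameMismatch. A sketch of a validation callback that logs the mismatch instead of silently accepting everything:

using System;
using System.Net;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;

// Sketch: the modern replacement for ICertificatePolicy. A CN/SAN mismatch shows up
// as SslPolicyErrors.RemoteCertificateNameMismatch (the CertCN_NO_MATCH case above).
ServicePointManager.ServerCertificateValidationCallback =
    (object sender, X509Certificate certificate, X509Chain chain, SslPolicyErrors errors) =>
    {
        if ((errors & SslPolicyErrors.RemoteCertificateNameMismatch) != 0)
            Console.WriteLine("Certificate CN/SAN does not match the requested host name.");
        if ((errors & SslPolicyErrors.RemoteCertificateChainErrors) != 0)
            Console.WriteLine("Certificate chain problems: " + chain?.ChainStatus?.Length);
        // Returning (errors == SslPolicyErrors.None) keeps normal validation;
        // returning true unconditionally would accept any certificate (unsafe).
        return errors == SslPolicyErrors.None;
    };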
