Securing HTTP headers - ASP.NET

I have a website on a production server that is supposed to be very secure, so I want to lock down its HTTP headers so that no unwanted information is leaked.
I have searched the net about securing HTTP headers and so far found that we can remove unwanted information such as:
Server: Microsoft-IIS/7.5
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
I have found solutions for X-AspNet-Version and X-Powered-By:
1. For X-AspNet-Version I added the code below in the system.web section:
<httpRuntime enableVersionHeader="false"/>
2. For X-Powered-By I added a customHeaders remove entry in the system.webServer section.
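The usual web.config form of that removal is along these lines:
<system.webServer>
    <httpProtocol>
        <customHeaders>
            <remove name="X-Powered-By" />
        </customHeaders>
    </httpProtocol>
</system.webServer>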
But the code for removing the Server header is not working. :(
The code I am using for it is below: I added a class named CustomHeaderModule, and inside that class the code is as follows.
/// <summary>
/// Summary description for CustomHeaderModule
/// </summary>
public class CustomHeaderModule : IHttpModule
{
    public void Dispose()
    {
        throw new NotImplementedException();
    }

    public void Init(HttpApplication context)
    {
        context.PostReleaseRequestState += PostReleaseRequestState;
    }

    void PostReleaseRequestState(object sender, EventArgs e)
    {
        //HttpContext.Current.Response.Headers.Remove("Server");
        // Or you can set something funny
        HttpContext.Current.Response.Headers.Set("Server", "CERN httpd");
    }
}
and then registered it in web.config under the system.webServer section:
<modules runAllManagedModulesForAllRequests="true">
    <add name="CustomHeaderModule" type="CustomHeaderModule" />
</modules>
Now this code is not working: I am still seeing the Server header in Chrome.
How can I fix this? And apart from these three settings, is there anything else I should do to secure the headers further?

Considering your problem, what I would suggest is to use ASafaWeb to test your website.
Second, read these articles from Troy Hunt and Paul Bouwer:
Shhh… don’t let your response headers talk too loudly
Clickjack attack – the hidden threat right in front of you
ASafaWeb, Excessive Headers and Windows Azure
Following these articles, you will eventually end up having a look at NWebSec.

Sorry if this doesn't answer your question directly, but I wouldn't really bother removing those headers. Someone can easily find out what server you are using by looking at the HTML on the browser side.
If I look at the source code and see things like __VIEWSTATE, I'll immediately know this is ASP.NET, and if I dig a little deeper I'll probably be able to figure out the version too.
What I'd suggest is that you focus on standard security and risk procedures, such as making sure you are not open to SQL injection, validating everything on the server side, making sure you have backups in place and ready to be restored within minutes, adding an additional layer of authentication if needed, and keeping the server current on security updates.
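On the SQL injection point, the standard guard is a parameterized query; here is a minimal illustration (connectionString and userSuppliedEmail are placeholder names):
using System.Data.SqlClient;

// User input is bound as a parameter, never concatenated into the SQL text.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT Id, Name FROM Users WHERE Email = @email", connection))
{
    command.Parameters.AddWithValue("@email", userSuppliedEmail);
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        // read rows as needed
    }
}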

I have found one solution which works on IIS but not locally, and I am okay with that: Removing/Hiding/Disabling excessive HTTP response headers in Azure/IIS7 without UrlScan.
Anyway, apart from these three settings, is there any other way I can make the HTTP headers more secure?
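One approach along those lines, for reference, is to blank out the Server header just before the response headers are sent; a rough sketch, assuming the IIS7+ integrated pipeline, in Global.asax.cs:
protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
{
    // Response.Headers is only available in the integrated pipeline.
    HttpContext.Current.Response.Headers.Remove("Server");
}
Note that this only covers requests that reach the managed pipeline; static files served directly by IIS keep their Server header unless runAllManagedModulesForAllRequests is enabled.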

Related

Service Call on Page Generates Zone Security Error

This error is received on an AJAX call:
WebSocket Error: SECURITY_ERR, Cross zone connection not allowed
A 500 error code is also returned. Upon further testing I am able to get other responses which don't seem to be related to the error. See below for where the reported error appears.
From this Angular call:
$http.post("Status.aspx/GetDataAsync", {})
    .then(function (response) { $scope.theData = response.data; },
          function (response) { $scope.result = "Error!"; });
when attempting to call a page's code-behind WebMethod, which (tellingly) also makes a call out to a REST web service.
[System.Web.Services.WebMethod]
public static async Task<string> GetDataAsync()
{
    var httpresult = await (new HttpClient()).GetAsync("{Internal site rest service Url}");
    return await httpresult.Content.ReadAsStringAsync();
}
This reminds me of an old Silverlight cross-domain issue call...but I digress.
Question
Can the zone issue be resolved or does one have to call rest services directly?
Attempt: the use of CORS (see Enabling Cross-Origin Requests (CORS)) at the method level, such as
[System.Web.Services.WebMethod]
[EnableCors("AllowAllOrigins", "AllowHeaders", "AllowAllMethods")]
with no luck.
Error
This is where the error is found, in the F12 tools on Edge (and IE). Chrome does not report the issue.
You will need to set the Content Security Policy (CSP) of your page so that your {Internal site rest service Url} is not blocked by the browser.
Here are links, combined, that helped me solve this issue:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/connect-src
http://caniuse.com/#feat=contentsecuritypolicy
For MVC, I placed the following in the web.config:
<httpProtocol>
    <customHeaders>
        <add name="Content-Security-Policy" value="connect-src 'self' wss://localhost:7717" />
    </customHeaders>
</httpProtocol>
I was then able to create a websocket connection to wss://localhost:7717, but wasn't able to connect to any other port.
Browsers block these kinds of requests to prevent cross-zone script attacks, so your page needs to declare the policy, which is done through the HTTP header. I'm working on a similar issue myself; I'm missing something, but I think this is the right direction. I'd welcome others' input because this isn't quite solving the issue, though it's closer.
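If other endpoints need to be reachable as well, each one has to be listed in connect-src. The value below is only an illustration; internal-api.example.com is a made-up host:
<httpProtocol>
    <customHeaders>
        <add name="Content-Security-Policy"
             value="connect-src 'self' wss://localhost:7717 https://internal-api.example.com" />
    </customHeaders>
</httpProtocol>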

Preventing upload of large files in ASP.NET 4.0

We'd like to restrict the maximum upload file size on our web site. We've already set the appropriate limits in our web.config. The problem we're encountering is that if a really large file (1 GB, for example) is uploaded, the entire file is uploaded before a server-side error is generated, and the type of error differs depending on how large the file is.
Is there a way to detect the size of a pending file upload before the actual upload takes place?
Here are the relevant web.config settings, which restrict requests to 12 MB:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <system.web>
        <httpRuntime maxRequestLength="12288"/>
    </system.web>
    <system.webServer>
        <security>
            <requestFiltering>
                <requestLimits maxAllowedContentLength="12582912"/>
            </requestFiltering>
        </security>
    </system.webServer>
</configuration>
I've tried creating an HTTP module so I could intercept a request early in the request lifecycle, but the uploads seem to take place even before the BeginRequest event of HttpApplication:
public class UploadModule : IHttpModule
{
    private const int MaxUploadSize = 12582912;

    public void Init(HttpApplication context)
    {
        context.BeginRequest += handleBeginRequest;
    }

    public void Dispose()
    {
    }

    private void handleBeginRequest(object sender, EventArgs e)
    {
        // The upload takes place before this method gets called.
        var app = sender as HttpApplication;
        if (app.Request.Files.OfType<HttpPostedFile>()
               .Any(f => f.ContentLength > MaxUploadSize))
        {
            app.Response.StatusCode = 413;
            app.Response.StatusDescription = "Request Entity Too Large";
            app.Response.End();
            app.CompleteRequest();
        }
    }
}
Update:
I know that client-side technologies like Flash can detect file sizes before upload, but we need a server-side workaround because we want to target platforms with no Flash/Java/ActiveX/Silverlight support. I believe IIS or ASP.NET has a bug that allows large files to be uploaded despite the limits, so I've filed a bug here.
Would an ISAPI extension give me more control over request processing than HTTP modules and handlers, such as allowing me to abort an upload if the Content-Length header is seen to be larger than the allowed limit?
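For what it's worth, a Content-Length variant of the check (reading only the request header instead of Request.Files) would look roughly like the sketch below. Whether IIS stops reading the body at that point is exactly what I'm unsure about, so treat it as an illustration of the idea rather than a verified fix.
public class ContentLengthCheckModule : IHttpModule
{
    private const int MaxUploadSize = 12582912; // 12 MB, matching the config above

    public void Init(HttpApplication context)
    {
        context.BeginRequest += (sender, e) =>
        {
            var app = (HttpApplication)sender;
            // ContentLength reflects the Content-Length request header,
            // so it can be inspected before the body is consumed.
            if (app.Request.ContentLength > MaxUploadSize)
            {
                app.Response.StatusCode = 413;
                app.Response.StatusDescription = "Request Entity Too Large";
                app.CompleteRequest();
            }
        };
    }

    public void Dispose() { }
}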
Update 2:
Sigh. Microsoft has closed the bug I filed as a duplicate but has provided no additional information. Hopefully they didn't just drop the ball on this.
Update 3:
Hooray! According to Microsoft:
This bug is being resolved as it has been ported over to the IIS product team. The IIS team has since fixed the bug, which will be included in future release of Windows.
The problem is that the upload happens all at once in a single HTTP POST request, so you can only detect it after it's done.
If you want more control over this, you could try Flash-based upload widgets, which offer this and more. Check out this link: http://www.ajaxline.com/10-most-interesting-upload-widgets
Microsoft has responded on their Microsoft Connect site with the following:
This bug is being resolved as it has been ported over to the IIS product team. The IIS team has since fixed the bug, which will be included in future release of Windows.
If you are requesting a fix for the current OS, a QFE request must be opened. Please let me know if this is the route that you want to take. Please note that opening a QFE request does not necessarily mean that it would be approved.
So I guess we have to wait for the next version of IIS for the fix (unless a QFE request is fulfilled, whatever that is).
Is there a way to detect the size of a pending file upload before the actual upload takes place?
No. That would require access to the file size on the client. Allowing a web server direct access to files on the client would be a bit dangerous.
Your best bet is to place a line of text stating the maximum allowed file size.
OR you could create some sort of ActiveX control, java applet, etc so that you're not dependent on browser restrictions. Then you have to convince your users to install it. Probably not the best solution.
Well.... Depends how low-level you want to get.
Create a service app that acts as a proxy for IIS. (All incoming port 80 socket requests go to the service.) Have the service pass everything it receives to IIS (with the website listening on a different port or IP), but monitor the total request size as it's received.
When the size from a given connection exceeds your desired limit, close the connection. Return a redirect to an error page if you want to be polite.
Silly, but it'll let you monitor data in transit without waiting for IIS to hand over the request.

asp.net webservice handling gzip compressed request

I have an ASP.NET .asmx webservice written to handle requests from a third-party tool. The third-party tool makes an HTTP POST request to the webservice to get user information. I'm using IIS7.
Running Fiddler with "Remove All Encodings" checked, I can see the webservice call and everything functions properly. If I uncheck "Remove All Encodings", the webservice call fails with a 400 Bad Request. The difference I see is that the header "Content-Encoding: gzip" is being removed by Fiddler and the content is being decompressed.
So, when the Content-Encoding header is removed and the content is decompressed, my webservice functions perfectly. When the header is present and the content is compressed, the webservice fails.
How can I either:
Configure my webservice to tell the client that it won't accept compressed requests (and hope that the third party tool respects that)
Decompress the content early in the asp.net handling
Modify my webservice to work with compressed data
Update: To be clear, I don't need to configure gzip encoding in the Response, I need to deal with a Request TO my webservice that is gzip encoded.
Update 2: The third-party tool is the Salesforce.com Outlook plugin. So, I don't have access to modify it and it is used by many other companies without trouble. It's got to be something I'm doing (or not doing)
Update 3: I found one post here that says that IIS does not support incoming POST requests with compressed data; it only supports compressed responses. Can this still be true?
The simplest technique is to create an HttpModule that replaces the request filter. It is more reusable and avoids relying on Global.asax. There is also no need to create a new decompress stream class, as GZipStream is ready for that. Here is the full code, which also removes the Content-Encoding: gzip header that is no longer needed:
public class GZipRequestDecompressingModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += (sender, e) =>
        {
            var request = (sender as HttpApplication).Request;
            string contentEncoding = request.Headers["Content-Encoding"];
            if (string.Equals(contentEncoding, "gzip",
                StringComparison.OrdinalIgnoreCase))
            {
                request.Filter = new GZipStream(request.Filter,
                    CompressionMode.Decompress);
                request.Headers.Remove("Content-Encoding");
            }
        };
    }

    public void Dispose()
    {
    }
}
To activate this module, add the following section into your web.config:
<system.webServer>
    <modules runAllManagedModulesForAllRequests="true">
        <add name="AnyUniqueName"
             type="YourNamespace.GZipRequestDecompressingModule, YourAssembly"
             preCondition="integratedMode" />
    </modules>
</system.webServer>
Since the 3rd-party service is just sending you a POST, I do not think it is possible to tell them not to send it compressed.
You could try to override GetWebRequest and decompress it on the way in:
public partial class MyWebService : System.Web.Services.Protocols.SoapHttpClientProtocol
{
    protected override WebRequest GetWebRequest(Uri uri)
    {
        var request = (HttpWebRequest)base.GetWebRequest(uri);
        request.AutomaticDecompression = System.Net.DecompressionMethods.GZip;
        return request;
    }
}
GZIP compression is a function of the server.
If you're using IIS6, consult this link.
If you're using IIS7, you could use ISAPI_Rewrite to disable gzip. See this link.
That said, because gzip is a function of IIS, you really shouldn't need to do anything "special" to get it to work with a web service (IIS should handle decompressing and compressing requests). Hopefully this info will get you further down the road to troubleshooting and resolving the issue.
I am not sure that IIS supports decompressing incoming requests, so this might have to be done further down the pipe.
Shiraz's answer has the potential of working and it would be the first thing I would try.
If that doesn't work, you might consider switching your server-side .asmx service to WCF, which, while a bit more difficult to set up, also gives you more flexibility.
On the WCF side there are two things I can suggest. The first is quite easy to implement and is based on setting the WebRequest object used by WCF to automatically accept compression. You can find the details here. This one is the WCF equivalent to the solution proposed by Shiraz.
The second is more complicated, since it involves creating Custom Message Encoders, but if none of the above methods work, this should solve the problem. Creating a message compression encoder is described here. You might also want to check the answer in here which presents a sample config for the message encoder.
Please let me know if this helped or if you need more help.
I've found a partial answer here.
class DecompressStream : Stream
{
    ...
    public override int Read(byte[] buffer, int offset, int count)
    {
        GZipStream test = new GZipStream(_sink, CompressionMode.Decompress);
        int c = test.Read(buffer, offset, count);
        return c;
    }
    ...
}
I can then specify the filter on the request object like this:
void Application_BeginRequest(object sender, EventArgs e)
{
    string contentEncoding = Request.Headers["Content-Encoding"];
    Stream prevCompressedStream = Request.Filter;
    if (contentEncoding == null || contentEncoding.Length == 0)
        return;
    contentEncoding = contentEncoding.ToLower();
    if (contentEncoding.Contains("gzip"))
    {
        Request.Filter = new DecompressStream(Request.Filter);
    }
}
I say partial answer because even though I can now process the incoming request, the response is getting a "Content-Encoding: gzip" header even though the response is not encoded. I can verify in Fiddler that the content is not encoded.
If I do encode the response, the client for the webservice fails. It seems that even though it is sending "Accept-Encoding: gzip", it does not in fact accept gzip compressed response. I can verify in Fiddler that the response is compressed and Fiddler will decompress it successfully.
So, now I'm stuck trying to get a stray "Content-Encoding: gzip" header removed from the response. I've removed all references I can find to compression from the application, the web.config, and IIS.
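One further thing worth trying, purely as a sketch (it assumes the stray header is added inside the managed pipeline and that the site runs in the IIS7 integrated pipeline), is stripping the header just before it goes out, in Global.asax:
protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
{
    var response = HttpContext.Current.Response;
    // If nothing actually compressed the body, the advertised encoding is wrong; drop it.
    if (string.Equals(response.Headers["Content-Encoding"], "gzip", StringComparison.OrdinalIgnoreCase))
    {
        response.Headers.Remove("Content-Encoding");
    }
}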

stopping ZmEu attacks with ASP.NET MVC

Recently my ELMAH exception logs have been full of attempts from people using that damn ZmEu security tool against my server.
For those thinking "what the hell is ZmEu?", here is an explanation:
"ZmEu appears to be a security tool used for discovering security holes in version 2.x.x of PHPMyAdmin, a web based MySQL database manager. The tool appears to have originated from somewhere in Eastern Europe. Like what seems to happen to all black hat security tools, it made its way to China, where it has been used ever since for non stop brute force attacks against web servers all over the world."
Here's a great link about this annoying attack: http://www.philriesch.com/articles/2010/07/getting-a-little-sick-of-zmeu/
I'm using .NET, so they aren't going to find PHPMyAdmin on my server, but the fact that my logs are full of ZmEu attacks is becoming tiresome.
The link above provides a great fix using .htaccess, but I'm using IIS 7.5, not Apache.
I have an ASP.NET MVC 2 site, so I'm using the Global.asax file to create my routes.
Here is the .htaccess suggestion:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_URI} !^/path/to/your/abusefile.php
RewriteCond %{HTTP_USER_AGENT} (.*)ZmEu(.*)
RewriteRule .* http://www.yourdomain.com/path/to/your/abusefile.php [R=301,L]
</IfModule>
My question is: is there anything I can add to the Global.asax file that does the same thing?
An alternative answer to my other one ... this one specifically stops Elmah from logging the 404 errors generated by ZmEu, while leaving the rest of your sites behaviour unchanged. This might be a bit less conspicuous than returning messages straight to the hackers.
You can control what sorts of things Elmah logs in various ways; one way is adding this to the Global.asax:
void ErrorLog_Filtering(object sender, ExceptionFilterEventArgs e)
{
    if (e.Exception.GetBaseException() is HttpException)
    {
        HttpException httpEx = (HttpException)e.Exception.GetBaseException();
        if (httpEx.GetHttpCode() == 404)
        {
            // UserAgent can be null if the client sends no User-Agent header at all.
            if (Request.UserAgent != null && Request.UserAgent.Contains("ZmEu"))
            {
                // stop Elmah from logging it
                e.Dismiss();
                // log it somewhere else
                logger.InfoFormat("ZmEu request detected from IP {0} at address {1}", Request.UserHostAddress, Request.Url);
            }
        }
    }
}
For this event to fire, you'll need to reference the Elmah DLL from your project, and add a using Elmah; to the top of your Global.asax.cs.
The line starting logger.InfoFormat assumes you are using log4net. If not, change it to something else.
The ZmEu attacks were annoying me too, so I looked into this. It can be done with an HttpModule.
Add the following class to your project:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Security.Principal;
//using log4net;

namespace YourProject
{
    public class UserAgentBlockModule : IHttpModule
    {
        //private static readonly ILog logger = LogManager.GetLogger(typeof(UserAgentBlockModule));

        public void Init(HttpApplication context)
        {
            context.BeginRequest += new EventHandler(context_BeginRequest);
        }

        void context_BeginRequest(object sender, EventArgs e)
        {
            HttpApplication application = (HttpApplication)sender;
            HttpRequest request = application.Request;
            // UserAgent can be null when a client sends no User-Agent header at all.
            if (request.UserAgent != null && request.UserAgent.Contains("ZmEu"))
            {
                //logger.InfoFormat("ZmEu attack detected from IP {0}, aiming for url {1}", request.UserHostAddress, request.Url.ToString());
                HttpContext.Current.Server.Transfer("RickRoll.htm");
            }
        }

        public void Dispose()
        {
            // nothing to dispose
        }
    }
}
and then add the following line to web.config
<httpModules>
...
<add name="UserAgentBlockFilter" type="YourProject.UserAgentBlockModule, YourProject" />
</httpModules>
... and then add a suitable htm page to your project so there's somewhere to redirect them to.
Note that if you're using log4net you can comment in the log4net lines in the code to log the occasions when the filter kicks in.
This module has worked for me in testing (when I send the right userAgent values to it). I haven't tested it on a real server yet. But it should do the trick.
Although, as I said in the comments above, something tells me that returning 404 errors might be a less conspicuous response than letting the hackers know that you're aware of them. Some of them might see something like this as a challenge. But then, I'm not an expert on hacker psychology, so who knows.
Whenever I get a ZmEu or phpMyAdmin or forgotten_password I redirect the query to:
<meta http-equiv='refresh' content='0;url=http://www.ripe.net$uri' />
[or apnic or arin]. I'm hoping the admins at ripe.net don't like getting hacked.
On IIS 6.0 you can also try this...
Set your website in IIS to use host headers. Then create a web site in IIS, using the same IP address, but with no host header definition. (I labeled mine "Rogue Site" because some rogue once deliberately set the DNS for his domain to resolve to my popular government site; I'm not sure why.) Anyway, using host headers on multiple sites is a good practice, and having a site defined for the case when no host header is included is a way to catch visitors who don't have your domain name in the HTTP request.
On the site with no host header, create a home page that returns a response header status of "HTTP 410 Gone". Or you can redirect them elsewhere.
Any bots that try to visit your server by IP address rather than domain name will resolve to this site and get the "410 Gone" error.
I also use Microsoft's URLScan, and modified the URLScan.ini file to exclude the user agent string "ZmEu".
If you are using IIS 7.x, you can use Request Filtering to block the requests:
Scan Headers: User-Agent
Deny Strings: ZmEu
To test whether it works, start Chrome with the parameter --user-agent="ZmEu".
This way ASP.NET is never invoked at all, which saves you some CPU and memory.
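Roughly, the web.config form of that rule is as follows (IIS 7.5 or later; double-check the schema against your IIS version):
<system.webServer>
    <security>
        <requestFiltering>
            <filteringRules>
                <filteringRule name="BlockZmEu" scanUrl="false" scanQueryString="false">
                    <scanHeaders>
                        <add requestHeader="User-Agent" />
                    </scanHeaders>
                    <denyStrings>
                        <add string="ZmEu" />
                    </denyStrings>
                </filteringRule>
            </filteringRules>
        </requestFiltering>
    </security>
</system.webServer>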
I added this pattern in Microsoft URL Rewrite Module:
^$|EasouSpider|Add Catalog|PaperLiBot|Spiceworks|ZumBot|RU_Bot|Wget|Java/1.7.0_25|Slurp|FunWebProducts|80legs|Aboundex|AcoiRobot|Acoon Robot|AhrefsBot|aihit|AlkalineBOT|AnzwersCrawl|Arachnoidea|ArchitextSpider|archive|Autonomy Spider|Baiduspider|BecomeBot|benderthewebrobot|BlackWidow|Bork-edition|Bot mailto:craftbot#yahoo.com|botje|catchbot|changedetection|Charlotte|ChinaClaw|commoncrawl|ConveraCrawler|Covario|crawler|curl|Custo|data mining development project|DigExt|DISCo|discobot|discoveryengine|DOC|DoCoMo|DotBot|Download Demon|Download Ninja|eCatch|EirGrabber|EmailSiphon|EmailWolf|eurobot|Exabot|Express WebPictures|ExtractorPro|EyeNetIE|Ezooms|Fetch|Fetch API|filterdb|findfiles|findlinks|FlashGet|flightdeckreports|FollowSite Bot|Gaisbot|genieBot|GetRight|GetWeb!|gigablast|Gigabot|Go-Ahead-Got-It|Go!Zilla|GrabNet|Grafula|GT::WWW|hailoo|heritrix|HMView|houxou|HTTP::Lite|HTTrack|ia_archiver|IBM EVV|id-search|IDBot|Image Stripper|Image Sucker|Indy Library|InterGET|Internet Ninja|internetmemory|ISC Systems iRc Search 2.1|JetCar|JOC Web Spider|k2spider|larbin|larbin|LeechFTP|libghttp|libwww|libwww-perl|linko|LinkWalker|lwp-trivial|Mass Downloader|metadatalabs|MFC_Tear_Sample|Microsoft URL Control|MIDown tool|Missigua|Missigua Locator|Mister PiX|MJ12bot|MOREnet|MSIECrawler|msnbot|naver|Navroad|NearSite|Net Vampire|NetAnts|NetSpider|NetZIP|NextGenSearchBot|NPBot|Nutch|Octopus|Offline Explorer|Offline Navigator|omni-explorer|PageGrabber|panscient|panscient.com|Papa Foto|pavuk|pcBrowser|PECL::HTTP|PHP/|PHPCrawl|picsearch|pipl|pmoz|PredictYourBabySearchToolbar|RealDownload|Referrer Karma|ReGet|reverseget|rogerbot|ScoutJet|SearchBot|seexie|seoprofiler|Servage Robot|SeznamBot|shopwiki|sindice|sistrix|SiteSnagger|SiteSnagger|smart.apnoti.com|SmartDownload|Snoopy|Sosospider|spbot|suggybot|SuperBot|SuperHTTP|SuperPagesUrlVerifyBot|Surfbot|SurveyBot|SurveyBot|swebot|Synapse|Tagoobot|tAkeOut|Teleport|Teleport Pro|TeleportPro|TweetmemeBot|TwengaBot|twiceler|UbiCrawler|uptimerobot|URI::Fetch|urllib|User-Agent|VoidEYE|VoilaBot|WBSearchBot|Web Image Collector|Web Sucker|WebAuto|WebCopier|WebCopier|WebFetch|WebGo IS|WebLeacher|WebReaper|WebSauger|Website eXtractor|Website Quester|WebStripper|WebStripper|WebWhacker|WebZIP|WebZIP|Wells Search II|WEP Search|Widow|winHTTP|WWWOFFLE|Xaldon WebSpider|Xenu|yacybot|yandex|YandexBot|YandexImages|yBot|YesupBot|YodaoBot|yolinkBot|youdao|Zao|Zealbot|Zeus|ZyBORG|Zmeu
The top listed one, "^$", is the regex for an empty string. I do not allow bots to access the pages unless they identify with a user-agent; I found that most often the only things hitting these applications without a user agent were security tools gone rogue.
When blocking bots, I advise you to be very specific. Simply using a generic word like "fire" could also match "firefox". You can adjust the regex to fix that, but I found it much simpler to be more specific, and that has the added benefit of being more informative to the next person to touch the setting.
Additionally, you will see I have a rule for Java/1.7.0_25; in this case it happened to be a bot using that version of Java to slam my servers. Be careful blocking language-specific user agents like this: some platforms such as ColdFusion run on the JVM and use the language's user agent in web requests to localhost to assemble things like PDFs. JRuby, Groovy, or Scala may do similar things, though I have not tested them.
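For reference, the URL Rewrite rule wrapping a pattern like that looks roughly like this (the pattern is abbreviated here; substitute the full list above):
<system.webServer>
    <rewrite>
        <rules>
            <rule name="BlockBadBots" stopProcessing="true">
                <match url=".*" />
                <conditions>
                    <add input="{HTTP_USER_AGENT}" pattern="^$|ZmEu|Wget|libwww-perl" />
                </conditions>
                <action type="CustomResponse" statusCode="403"
                        statusReason="Forbidden" statusDescription="Blocked user agent" />
            </rule>
        </rules>
    </rewrite>
</system.webServer>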
Set your server up properly and don't worry about the attackers. :)
All they do is try some basic possibilities to see if you've overlooked an obvious pitfall.
There's no point filtering out this one hacker who is nice enough to sign his work for you.
If you take a closer look at your log files, you'll see there are many bots doing this all the time.

How to send status code "500" for generic error page in IIS?

I am using a generic error page using ASP.NET's <customErrors> directive.
<customErrors mode="On" defaultRedirect="500.html" redirectMode="ResponseRewrite">
</customErrors>
Problem: when an error occurs, this page does not return HTTP status 500. It comes back as 200, so link checkers and spiders do not see that there is any problem.
How can I send the 500 HTTP status along with the static 500.html page?
Requirements:
I must use redirectMode="ResponseRewrite"
I can't use a dynamic page, only static .html.
The MSDN documentation for the customErrors element states that it is implemented by System.Web.Configuration.CustomErrorsSection. If we use Red Gate's .NET Reflector to analyze that class, we can see where that setting is used in the Framework.
It is used by System.Web.UI.Page.HandleError and System.Web.HttpResponse.ReportRuntimeError.
Both of these end up calling System.Web.HttpResponse.RedirectToErrorPage. (The name of this method is confusing: it is important to note that RedirectToErrorPage takes the redirectMode setting as a parameter, so it is called even if you are using ResponseRewrite and no redirection actually happens.)
The relevant part of the RedirectToErrorPage method is:
if (redirectMode == CustomErrorsRedirectMode.ResponseRewrite)
{
this.Context.Server.Execute(url);
}
There doesn't appear to be any way to set the response code in the error handling: at the end of the day it's just a plain Server.Execute. It therefore seems unavoidable that you would need to write code to achieve the HTTP response you want.
Can you re-examine why you want to use a plain .html file? This seems a sensible choice for error handling, because you don't want to go through all the overhead of a .aspx page when that might cause another error to occur.
But perhaps there is some middle ground which will be just as robust as a .html file?
For example, you could make a precompiled HttpHandler, register it to the URL /500.error, and then make 500.error your defaultRedirect page. (This would be similar to how ScriptResource.axd works.) If you precompile your module into a DLL (as opposed to on-the-fly compilation from a plain old .axd file), you may find it is just as robust in the face of error conditions. If you encounter an error where not even this will work, then a static .html file probably won't work either -- keep in mind that the customErrors directive still relies on .NET running under the hood, and still uses the StaticFileHandler to serve your .html file.
Alternatively you could consider a reverse proxy in front of your IIS application which would serve a friendly 500 page even in the face of catastrophic failure of the application pool. This would be more work to set up, but would be even more robust than customErrors, e.g. if your web.config becomes corrupted, even customErrors won't work.
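A minimal sketch of that precompiled-handler idea (the class name, the /500.error path, and ~/500.html are placeholders of my choosing):
using System.Web;

public class ErrorPageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Serve the static markup, but with an honest status code.
        context.Response.StatusCode = 500;
        context.Response.StatusDescription = "Internal Server Error";
        context.Response.ContentType = "text/html";
        context.Response.WriteFile(context.Server.MapPath("~/500.html"));
    }
}
Registered under <handlers> for the 500.error path and used as the defaultRedirect, this keeps the error page precompiled while still returning a genuine 500.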
In your Global.asax file, add the following code:
protected void Application_EndRequest(object sender, EventArgs e)
{
    if (Request.Url.AbsolutePath.EndsWith("500.html"))
        Response.StatusCode = 500;
}
OK, I have a solution; the only way I can get this to work is by bypassing the customErrors section in the web configuration file.
So, an example default.aspx code-behind:
public partial class Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        throw new Exception("boom");
    }
}
Then in the global.asax file:
protected void Application_Error(object sender, EventArgs e)
{
    // Clear the error to take control of the process
    Server.ClearError();
    Response.WriteFile(Server.MapPath("500.html"));
    Response.StatusCode = 500;
    Response.StatusDescription = "Internal Server Error";
}
You can probably write a better handling mechanism for the 500.html part, but I think this should do what you are trying to achieve.
I have only tested this with the Visual Studio Cassini web server; however, I see no reason why it shouldn't work in IIS 6.
Try configuring your custom errors section like this:
<customErrors mode="On" redirectMode="ResponseRewrite">
    <error statusCode="500" redirect="500.aspx" />
</customErrors>
In your 500.aspx file, set the response code in Page_Load:
Response.StatusCode = 500;
Response.StatusDescription = "Internal Server Error";
This could be done with an ISAPI filter. It would have to be written in C, but it can modify the response status for the .html request so that the browser gets a 500 along with your custom HTML page.
If you insist on changing the core HTTP signaling, then you either have to give in on some of your requirements or be prepared to write your own web server. Yes, you have to run an IIS7 integrated app pool, and you may still have to accept a rewrite to an active page, because you are trying to fake that your server is down and the server is not designed to fake its own failure. So if it gives you one slim way to do it, your options are either to take it or to develop your own, non-HTTP-compliant server that scrambles response codes at will.
