Something faster than HttpHandlers? - asp.net

What is the fastest way to execute a method on an ASP.NET website?
The scenario is pretty simple: I have a method which should be executed when a web page is hit. Nothing else happens on the page; the only rendered output is a "done" message. I want the processing to be as fast as possible.
Every single hit is unique, so caching is not an option.
My plan is to use an HttpHandler and configure it in web.config (mypage.ashx) rather than a regular .aspx page. This should reduce the overhead significantly.
So my question is really: Is there a faster way to accomplish this than using HttpHandlers?

Depending on what you're doing, I wouldn't expect to see a lot of improvement over just using an HttpHandler. I'd start by just writing the HttpHandler and seeing how it performs. If you need it to be faster, try looking more closely at the things you're actually doing while processing the request and seeing what can be optimized. For example, if you're doing any logging to a database, try writing to a local database instead of across a network. If it's still not fast enough, then maybe look into writing something lower level. Until that point though, I'd stick with whatever's easiest for you to write.
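To give an idea of what that looks like, a bare-bones handler is just a class implementing IHttpHandler; a minimal sketch, where DoneHandler and its "done" output are placeholders for whatever your processing is:
using System.Web;

public class DoneHandler : IHttpHandler
{
    // Returning true lets ASP.NET reuse one instance across requests.
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // ... the unique per-hit processing goes here ...

        context.Response.ContentType = "text/plain";
        context.Response.Write("done");
    }
}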
For reference, I've written an ad server in ASP.NET (using HttpHandlers) that can serve an ad (including targeting and logging the impression to a local database) in 0-15ms under load. I thought I was doing quite a bit of processing - but that's a pretty good response time IMHO.
Update after several months:
If you clear all the HttpModules that are included by default, this will remove a fair amount of overhead. By default, the following HttpModules are included in every site via the machine-level web.config file:
OutputCache
Session (for session state)
WindowsAuthentication
FormsAuthentication
PassportAuthentication
RoleManager
UrlAuthorization
FileAuthorization
AnonymousIdentification
Profile
ErrorHandler
ServiceModel
Like I said above, my ad server doesn't use any of these, so I've just done this in that app's web.config:
<httpModules>
  <clear />
</httpModules>
If you need some of those, but not all, you can remove the ones you don't need:
<httpModules>
  <remove name="PassportAuthentication" />
  <remove name="Session" />
</httpModules>
ASP.NET MVC Note: ASP.NET MVC requires the session state module unless you do something specific to work around it. See this question for more information: How can I disable session state in ASP.NET MVC?
Update for IIS7: Unfortunately, things aren't quite as simple in IIS7. In the integrated pipeline, managed modules are configured under <system.webServer><modules> rather than <system.web><httpModules>, so the removals have to be done there.
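A minimal sketch of the IIS7 (integrated pipeline) equivalent, assuming you only want to drop the Session and FormsAuthentication modules; note that on some servers the modules section may be locked, in which case it has to be unlocked before these removals take effect:
<system.webServer>
  <modules>
    <remove name="Session" />
    <remove name="FormsAuthentication" />
  </modules>
</system.webServer>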

I'm not sure what your exact scenario is, but if all your page is doing is processing some data, you don't really need an .aspx page or an HTTP handler at all. You could write an ASMX web service or a WCF service to do what you need, and this would most likely involve less overhead. The WCF service doesn't even have to be hosted in ASP.NET: you can host it from a Windows service or console app and call it on the same machine using named pipes. This would probably reduce the overhead of calling the data-processing code significantly.
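For the self-hosting route, here is a minimal sketch of a WCF service exposed over named pipes from a console app; the IProcessor contract and the net.pipe address are illustrative names, not anything prescribed:
using System;
using System.ServiceModel;

[ServiceContract]
public interface IProcessor
{
    [OperationContract]
    string Process(string input);
}

public class Processor : IProcessor
{
    public string Process(string input)
    {
        // ... the actual per-hit processing goes here ...
        return "done";
    }
}

class Program
{
    static void Main()
    {
        // Host the service over named pipes (same machine, no HTTP stack involved).
        using (var host = new ServiceHost(typeof(Processor), new Uri("net.pipe://localhost/processor")))
        {
            host.AddServiceEndpoint(typeof(IProcessor), new NetNamedPipeBinding(), "");
            host.Open();
            Console.WriteLine("Processor service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}
The website (or any other process on the same machine) can then call it through a ChannelFactory<IProcessor> over NetNamedPipeBinding pointed at the same net.pipe address.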

If you really have to use ASP.NET, you can also just hook into the AuthorizeRequest step, intercept the request from there, do your processing, and write your "done" response directly.
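A minimal sketch of that approach, assuming an IHttpModule registered in web.config (FastProcessingModule is just an illustrative name):
using System.Web;

public class FastProcessingModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.AuthorizeRequest += (sender, e) =>
        {
            HttpContext context = ((HttpApplication)sender).Context;

            // ... do the per-hit work here ...

            context.Response.ContentType = "text/plain";
            context.Response.Write("done");

            // Skip the rest of the pipeline and jump straight to EndRequest.
            context.ApplicationInstance.CompleteRequest();
        };
    }

    public void Dispose() { }
}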

Related

Proper way to log in an ASP.NET Website running in Azure

I've read several articles about logging in an Azure Website but I cannot make a final decision. My big question is whether or not to use a 3rd-party library for logging.
I see that the built-in diagnostics functionality is quite good. I could use it to create debug logs like "Xyz with ID [x] was created by User [abc]..." and store these along with trace messages in Azure table storage, but I'm not sure that this will serve me long enough. On the other hand, using both the built-in tracing and, for example, NLog feels like a little overkill.
Any experiences/suggestions on this topic?
You can use NLog or a similar library and add a sink that writes to System.Diagnostics.Trace.
This way you can enjoy both worlds: on one hand it's easy to direct these logs to file/blob/table storage and change the log level on the fly from the portal (it will not restart your website), and on the other you keep the benefits of those third-party libraries.
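A minimal sketch of wiring NLog up to System.Diagnostics.Trace programmatically; the target name and layout here are only illustrative, and the same thing can be done in NLog.config with a trace target:
using NLog;
using NLog.Config;
using NLog.Targets;

public static class LoggingSetup
{
    public static void Configure()
    {
        var config = new LoggingConfiguration();

        // Everything NLog receives is forwarded to System.Diagnostics.Trace,
        // which Azure Websites can then route to file/blob/table storage.
        var traceTarget = new TraceTarget { Layout = "${level}|${logger}|${message}" };
        config.AddTarget("azureTrace", traceTarget);
        config.LoggingRules.Add(new LoggingRule("*", LogLevel.Info, traceTarget));

        LogManager.Configuration = config;
    }
}
Call LoggingSetup.Configure() once at startup (e.g. in Application_Start) and then log through NLog as usual, for example LogManager.GetCurrentClassLogger().Info("Xyz with ID {0} was created", id).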
Azure Application Insights is another great way to do logging in Azure Websites. See here.
There is also ELMAH, which you can pull into your site via NuGet.
Choosing the right logging approach depends a lot on what exactly you want from your logging. For ASP.NET, it always helps if the log is request-based rather than just a plain text file containing all events. There are many different logging frameworks available, but I want to highlight one that is an out-of-the-box feature and works really well if you want to measure the performance of a request and keep your tracing organized per request: the old Trace.Write in ASP.NET. With the following configuration in your web.config, your ASP.NET trace events will be logged to FREB (Failed Request Tracing) whenever FREB tracing is enabled.
<system.web>
  <trace enabled="true" pageOutput="false" requestLimit="10000" writeToDiagnosticsTrace="true" />
</system.web>
<system.diagnostics>
  <sharedListeners>
    <add name="System.Net.IISETW" type="System.Web.IisTraceListener, System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
  </sharedListeners>
  <trace autoflush="true" />
</system.diagnostics>
This way, when you enable failed request tracing, your custom events will start showing up in the FREB traces and you can see how much time is spent between each event. This gives you a good way of identifying where the time goes during request execution.
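Emitting the custom events is then just the classic TraceContext API; for example (the category and message text are illustrative):
// Anywhere with access to the current request, e.g. in a page or handler:
HttpContext.Current.Trace.Write("Checkout", "Starting payment gateway call");
// ... do the work ...
HttpContext.Current.Trace.Write("Checkout", "Payment gateway call finished");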
Hope this helps!

Can custom IHttpHandler revert to default request handling?

I'm writing a custom HttpHandler to process web requests for a web framework I'm writing, but I'm trying to find a way to programmatically "ignore" the request if no URL route is matched. By "ignore" I mean: if no predefined route matches the incoming request URL, fall back to the standard request processing you would get in a raw ASP.NET web application.
The only way I can find that actually works so far is to remove the custom http handler for a specific path, e.g.:
<location path="Test">
<system.webServer>
<handlers>
<remove name="DefaultHandler"/>
</handlers>
</system.webServer>
</location>
I'm not massively satisfied with this solution and would like to implement something akin to MVC's IgnoreRoute("..."). Digging through the source, though, is a bit of a thankless task and I can't see where it's actually doing it.
So ideally I'd like to know if it's possible to somehow exit the custom http handler and let the application handle it in a default manner or find out how MVC does this.
Anyone have any ideas?
Thanks.
I don't think you can do it in an HttpHandler.
Consider using an HttpModule instead. This is also how MVC routing works.
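A rough sketch of the module approach, assuming a hypothetical MyRouteTable class that does your framework's route matching (HttpContext.RemapHandler requires .NET 4 and the integrated pipeline):
using System.Web;

public class FrameworkRoutingModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PostResolveRequestCache += (sender, e) =>
        {
            HttpContext context = ((HttpApplication)sender).Context;

            IHttpHandler handler;
            if (MyRouteTable.TryMatch(context.Request.AppRelativeCurrentExecutionFilePath, out handler))
            {
                // A route matched: hand the request to the framework's handler.
                context.RemapHandler(handler);
            }
            // No match: do nothing here, and the normal ASP.NET handler mapping applies.
        };
    }

    public void Dispose() { }
}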

To Increase Request Timeout only on particular Web Page

Is it possible to increase the request timeout for just one particular web page? I am working on ASP.NET 4.0 and I need one particular page to have a longer request timeout, since it is responsible for initiating a long-running process. Thanks.
Use Web.config:
<location path="Page.aspx">
<system.web>
<httpRuntime executionTimeout="180"/>
</system.web>
</location>
This is an old thread, but it should be emphasized that updating the executionTimeout of a page or the entire machine must also be accompanied by the compilation debug flag being set to "false", otherwise the timeout element is ignored.
Also, depending on whether or not you are using AJAX update panels, you may have to look at the AsyncPostBackTimeout property on the ScriptManager itself, depending on how your timeout is manifesting itself. AJAX postback timeouts show up as error messages logged to the JavaScript console and tend to manifest themselves as AJAX operations "dying on the vine", depending on how you are handling things.
The debug="false" bit is probably what is afflicting the gentleman above who was having issues on his Amazon Server, but not locally.
Some googling will also reveal that some folks have noticed that localhost handles things differently as well, so you may need to experiment around that.

ASP.NET MVC3 Publish settings in web.config

I have published an ASP.NET MVC3 site. It runs great. However, looking back at my web.config file, I was not sure if some of the values I used are correct for publishing versus for developing. These configurations are in the <system.web> section.
...
<system.web>
  <httpRuntime requestValidationMode="2.0" executionTimeout="200" maxRequestLength="20000000"/>
  <compilation debug="true" targetFramework="4.0">
...
I read here ( http://msdn.microsoft.com/en-us/library/e1f13641.aspx ) that using debug=true in compilation will disregard the executionTimeout of 200 and use a default value of 110. This seems to be the case, and the site is set up to allow very large numbers of files to be uploaded all at once. However, with only 110 seconds, not much can be uploaded.
My question is this: Is debug="false" the correct setting for publishing a live site? In addition, is requestValidationMode="2.0" still safe to use considering ASP.NET is now on version 4 (soon to be 4.5)?
requestValidationMode="2.0" does not refer to the framework version and can stay as it is.
Set debug="false" and you are fine.
requestValidationMode... As far as I'm aware, this has to be set to 2.0 if you want special characters (<, >, % etc.) in request data to pass ASP.NET's request validation at all. requestValidationMode="2.0" means "only enforce validation on pages (i.e. .aspx), rather than on every request" (the stricter per-request behaviour was introduced in 4.0). That allows ASP.NET MVC to take over the validation, and hence also lets you turn it off for specific requests.
Is it safe? It is, if you've made sure that any actions or controllers that have [ValidateInput(false)] applied or models with [AllowHtml] have been properly secured against attacks. Imran Baloch has a full explanation here.
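To make the distinction concrete, here is a minimal sketch of the two attributes; the controller, model, and action names are purely illustrative:
using System.Web.Mvc;

public class CommentModel
{
    // Allow markup in just this one field; everything else is still validated.
    [AllowHtml]
    public string Body { get; set; }

    public string Author { get; set; }
}

public class CommentsController : Controller
{
    // Or turn validation off for a whole action; be sure to HTML-encode on output.
    [HttpPost]
    [ValidateInput(false)]
    public ActionResult Create(CommentModel model)
    {
        // ... persist the data, encoding it before it is ever rendered ...
        return RedirectToAction("Index");
    }
}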
And yes, debug should be "false" for several reasons, including performance and memory usage. Also, debug="true" changes the default cache policy for static files to never cache the files in the browser, meaning tons of redundant requests for scripts, CSS etc.
As for the image upload, other than the suggestions given, check in Event Viewer that it's not actually the application pool recycling for one reason or another, rather than an execution timeout.

Neither HttpHandler nor HttpApplication is getting called for /

I have an IHttpHandler registered like this:
<httpHandlers>
  <add verb="*" path="*" type="MindTouch.Dream.Http.HttpHandler, mindtouch.core"/>
</httpHandlers>
Which catches /foo, /foo/bar, etc. just fine, but on / the Visual Studio built-in server does not hit either the HttpApplication or my handler.
That's the way to do it. Your web server/site will have a setting which specifies the default document to serve for a directory. If that document is not present or the setting is not configured, the web server will attempt to serve the directory listing (which should be turned off for security), a security error if the listing is not available, or nothing at all.
So in your case, before the default document existed, a request for "/" never actually reached the application.
I fixed it and I think I recall this being an ancient ASP.NET issue:
I created a file called Default.htm, which ASP.NET will try to resolve the / path to. Since there is now a real path to resolve to, the HttpApplication gets called, incidentally with a path of /default.htm.
Is there a less hacky solution to this? I would gladly accept a different answer than my own :)
