Does an HttpModule perform as badly as RAMMFAR? - asp.net

I know that we're supposed to be careful about using RAMMFAR (runAllManagedModulesForAllRequests), such as the advice in this Hanselman post. The reason is that it sends all requests to every managed module.
But I recently had to create an HttpModule, and I noticed that it gets all requests anyway (because that's what a module is for), and now I'm wondering if there's any performance difference between setting RAMMFAR=true, and simply having a managed module, if that module is going to get all requests.
To put this another way, are managed modules considered harmful to performance? If I test the url and ignore requests I don't care about, will that hurt my scalability at all?
Edit: By all requests, I mean that I see requests for static content, such as css, js, and jpg files that are on disk.
Edit: The module is registered like this:
<modules>
<add name="MyModule" type="MyNamespace.MyModule, MyAssembly"/>
</modules>

There are a few questions in your post. I'll try to address them individually.
Does having a dummy (no-op) managed module impact performance / throughput?
Yes, from the perspective that IIS and ASP.NET need to coordinate the native <-> managed code transition, and this transition incurs some overhead. For the overwhelming majority of applications this overhead is dwarfed by the actual application logic. The types of applications where this tends to show up in profiles are sites which are serving tens of thousands or hundreds of thousands of requests per second. At that point we generally recommend paying very close attention to which modules are included in the application and trimming them down as much as possible.
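For concreteness, a no-op managed module here means something like the following minimal sketch (the class name and the choice of BeginRequest are illustrative): the handler body is empty, yet every request that reaches it still pays for the native <-> managed transition.
using System;
using System.Web;

// Minimal no-op module: subscribes to BeginRequest but does nothing with it.
public class NoOpModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        // Intentionally empty handler; the only cost is the managed transition itself.
        context.BeginRequest += (sender, e) => { };
    }

    public void Dispose() { }
}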
Why is my module running for static files?
Because you don't have a managedHandler precondition on the module. If this precondition is present on the module declaration, this module will only run if the request is destined for a managed endpoint (.aspx, .axd, extensionless, and so on). If this precondition is not present, the module always runs.
To specify the managedHandler precondition:
<modules>
<add name="..." type="..." preCondition="managedHandler" />
</modules>
Note: if you're on IIS 7.0 or 7.5, you might need to install the patch http://support.microsoft.com/kb/980368 to get IIS to see extensionless URLs as "managed" endpoints.
What does RAMMFAR actually do?
In a nutshell, it ignores the managedHandler precondition specified in module registrations. It does what its name implies: it runs all managed modules for all requests, even if those requests weren't destined for a managed endpoint.
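Back to the original question ("if I test the url and ignore requests I don't care about"): filtering early inside the module keeps the per-request work trivial, although the native <-> managed transition still happens. A rough sketch of such an early exit, mirroring the registration in the question (the extension list is just an example, not an exhaustive filter):
using System;
using System.IO;
using System.Web;

public class MyModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        string ext = Path.GetExtension(app.Request.Path);

        // Bail out cheaply for static content this module doesn't care about.
        if (ext.Equals(".css", StringComparison.OrdinalIgnoreCase) ||
            ext.Equals(".js", StringComparison.OrdinalIgnoreCase) ||
            ext.Equals(".jpg", StringComparison.OrdinalIgnoreCase))
        {
            return;
        }

        // ... logic that only applies to the requests this module cares about ...
    }

    public void Dispose() { }
}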

Related

Why does a ModSecurity OWASP rule block .axd files?

I've been going over WAF findings in an ASP.NET application. WAF is ModSecurity with OWASP CRS. One of the findings is:
URL file extension is restricted by policy, Rule ID 920440
and it fired on the files WebResource.axd and ScriptResource.axd.
I did some research. I found that these files are HTTP handlers, and that they are embedded in assemblies. I found the rule in question - it's a simple one: it just checks the file extension and blocks the request based on that. The .axd extension just happens to be one of those listed.
As I understand it, these files might be connected with the use of AJAX (I might be wrong on this one). However, I couldn't find any explanation online of why they are blacklisted by OWASP. The only piece of information that might give a clue was this question.
Why are .axd files blacklisted? Are they deprecated? Can they be listed as exceptions to the rule, or do they introduce some actual risk? Finally, how can you modify an ASP.NET application so it doesn't need these files?
Maybe too late to reply, but there are several security issues related to .axd files, including the Oracle padding attack (CVE-2010-3332) and the Telerik remote code execution (CVE-2019-18935), among others.

Can I see a log of IIS loading (or not loading) a .net httpmodule?

I have a site that uses a .net security module to secure certain areas of the website. It's not working, the pages that should be password protected are not. Other than that, the site doesn't throw any errors.
I don't have access to the code, and the module doesn't seem to log anything.
Is there an IIS or .Net log of loading/calling httpModules? I feel like it's not loading/calling it, and it's just not telling me.
My web.config has this snippet, which loads the module:
<httpModules>
<add name="MyApp.SecurityModule" type="MyApp.Host.Security.WebForms.SecurityModule" />
</httpModules>
You can use the Modules property in the global.asax class to inspect the HTTP modules that are loaded and associated with the application - use any request event such as BeginRequest or PostMapRequestHandler.
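A minimal sketch of that idea (the event choice and Debug output are just examples; any logging sink will do):
// In Global.asax.cs (the Global class derives from HttpApplication)
protected void Application_BeginRequest(object sender, EventArgs e)
{
    foreach (string name in this.Modules.AllKeys)
    {
        // Dump each loaded module's registration name and its concrete type.
        System.Diagnostics.Debug.WriteLine(
            name + " -> " + this.Modules[name].GetType().FullName);
    }
}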
However, I don't think this would be the problem - if you have configured the module in web.config then it will be loaded, and if there were any issues, ASP.NET would report an error. Most probably, the http module's code works in a certain way or needs certain pre-conditions that are not being met.
Use tools such as Reflector or ILSpy to inspect and de-compile your application assemblies - check the code for the said type (MyApp.Host.Security.WebForms.SecurityModule). Frankly speaking, in the absence of documentation or commented code, it would be the only way to figure out how exactly your module is supposed to run (and then troubleshoot accordingly).
Have you thought about using something like Glimpse: http://getglimpse.com/
It may help you debug the issue a little better.

How to get Umbraco to handle requests for non .aspx files (IIS integrated pipeline mode)?

I'm trying to get the 301 URL Tracker package for Umbraco to work to my likings.
My goal is to be able to map the old URLs (from another CMS) to the new Umbraco URLs. In my specific situation, the old site is PHP based and therefore uses the .php file extension (http://example.net/test.php -> http://example.net/test/) - but it could be any non-.aspx extension (asp, png and so on). The problem is that Umbraco is not handling requests for .php files. It works perfectly for .aspx and directories (extensionless URLs).
I have tried various things for getting this to work. Before I go any further, I should note that the Application Pool is in integrated mode and .NET 4.0.
I kind of got it to work by defining a custom error in the web.config:
<customErrors defaultRedirect="not-exists.aspx" />
This triggers the handlers defined in NotFoundHandlers in the Umbraco config file 404handlers.config, but it has the side effect of returning a 302 Found header before the 301 URL Tracker kicks in and handles the 301 redirect. And that is just a big SEO "no no".
I then tried to explicitly create an HTTP handler module for .php files. I successfully got the System.Web.UI.PageHandlerFactory module to handle the request for the .php file, but this does not invoke any of the NotFoundHandlers in Umbraco.
As I understand the integrated pipeline in IIS 7, all the modules registered should try to handle the request (http://stackoverflow.com/questions/3765317/how-to-catch-non-aspx-files-with-a-http-module). But perhaps somebody can enlighten me on this subject?
Others are also experiencing difficulties in getting this configuration to work: http://our.umbraco.org/projects/developer-tools/301-moved-permanently/feedback/7271-when-the-old-pages-are-not-from-umbraco
What am I missing in getting Umbraco to handle requests for non-.aspx files in integrated pipeline mode?
If you are already running under Integrated Pipeline mode then the included UrlRewriting.net module should pick up requests automatically.
Simply add:
<add name="phptoaspx"
virtualUrl="^~/(.*).php"
rewriteUrlParameter="ExcludeFromClientQueryString"
destinationUrl="/$1.aspx"
ignoreCase="true" />
to your /config/UrlRewriting.config file, and all should be well.
P.S. You should not be using a customError handler to handle SEO 301/302'ing content. This can be a massive headache in terms of maintainability - please trust me on this, I tried this once when I was a junior .NET dev!
I am not familiar with Umbraco, but I believe this is what you are looking for: http://blogs.iis.net/ruslany/archive/2008/09/30/wildcard-script-mapping-and-iis-7-integrated-pipeline.aspx
Of course you'll have to add your own rewrite rules... so this only gets you half way.

w3wp has handle on download files

I set up IIS to handle .exe files with the ASP.NET 2.0 ISAPI filter, in order to enable dynamic URL replacement.
For this I set up the extension in IIS and added the following line to web.config. It works fine so far.
<add path="*.exe" verb="*" type="System.Web.StaticFileHandler" />
The problem is that from that point on, the w3wp process has several handles open on these files, most likely because someone is downloading them at the moment.
Is there a way to tell IIS/ASP.NET not to put an exclusive handle on the file? I want to be able to replace the files even while the site is running. This works as long as ASP.NET does not handle these files.
I don't think there is a way to do this. You are telling IIS to handle this file type, so the server assumes a certain exclusivity. This behavior is intended, in my opinion, because it prevents corrupted files in a production environment.
Maybe it helps to limit the requests to some selected HTTP verbs (POST/GET), but I don't see any other options.
And the other question is: why do you want to replace a file that is currently being downloaded by a user? It will corrupt their download, forcing them to start all over again.

How can I get gzip compression in IIS7 working?

I have installed Static and dynamic compression for IIS7, as well as setting the two web.config values at my application Virtual Folder level. As I understand it, I don't need to enable compression at the server, or site level anymore, and I can manage it on a per folder basis using my web.config file.
I have two settings in my .config file that I have set to customize gzip for my app:
<httpCompression dynamicCompressionDisableCpuUsage="90"
dynamicCompressionEnableCpuUsage="0">
<scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
<dynamicTypes>
<remove mimeType="*/*"/>
<add mimeType="*/*" enabled="true" />
</dynamicTypes>
</httpCompression>
<urlCompression doDynamicCompression="true"
dynamicCompressionBeforeCache="true" />
However, when I run the application, I can clearly see that gzip is not used, because my page sizes are the same. I am also using YSlow for FireFox, which also confirms that my pages are not being gziped.
What am I missing here? In IIS6 it was a simple matter of specifying the file types, and setting the compression level between 0-10. I don't see the need documented to specify the file types or compression level, since the defaults seem to cover the file types, and I'm not seeing the level anywhere.
There was a thread on forums.iis.net about this during the IIS 7 beta. It turned out the guy didn't have the modules installed, but it sounds like you've ruled that out in your opening sentence.
Microsoft's key advice for him was to enable failed request tracing to find out what was going wrong. This is possibly one of the most under-appreciated features of IIS7, but certainly one of the most powerful.
Open IIS Manager.
Go to your site, and on the actions pane (the very far right), click 'Failed Request Tracing...' under the 'Configure' section.
Click 'enable'.
Then, in the features view, click 'Failed request tracing rules'. Click add, next, enter 200 for the status code, next, click finish.
If you don't see "Failed Request Tracing" in the actions pane, you'll need to add the feature to the server - either using the "Add Role Services" wizard (Health and Diagnostics\Tracing) or through the Web Platform Installer (Products\Server\IIS: Tracing), and then close and re-open IIS Manager.
Next, rerun your test. This will generate some log info for us to examine.
Look in c:\inetpub\logs\FailedReqLogFiles\w3svcx. You will see a bunch of files named fr000xx.xml. Open up any one of them in your browser. (By the way, if you copy these files anywhere, make sure freb.xsl is there. Also, don't delete freb.xsl - if you do, just delete the whole directory or copy it from another location, as IIS only creates it once per folder.)
Click the 'request details' tab and select 'complete request trace'. Search the page for 'compress' - you should find it in several areas; once for static content, and once for dynamic content.
If you don't find either of them, IIS isn't configured correctly. If you do find them, you should see them followed by a compression_success and a compression_do. Success is self explanatory; the 'do' indicates what it did - in my case, it showed "OriginalSize 1462784 CompressedSize 179482"
Since yours isn't working, hopefully you will see something different that helps you solve the problem.
Make sure you turn this off when you're done by disabling failed request tracing in the actions pane for your website.
We had a similar problem, and it turns out that IIS7 does some dynamic CPU-based throttling here:
http://www.iis.net/ConfigReference/system.webServer/httpCompression
dynamicCompressionDisableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization at which dynamic compression will be disabled.
Note: This attribute acts as an upper CPU limit at which dynamic compression is turned off. When CPU utilization falls below the value specified in the dynamicCompressionEnableCpuUsage attribute, dynamic compression will be reenabled.
The default value is 90.
dynamicCompressionEnableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization below which dynamic compression will be enabled. The value must be between 0 and 100. Average CPU utilization is calculated every 30 seconds.
Note: This attribute acts as a lower CPU limit below which dynamic compression is turned on. When CPU utilization rises above the value specified in the dynamicCompressionDisableCpuUsage attribute, dynamic compression will be disabled.
The default value is 50.
Note the defaults -- if your IIS7 hits 90% CPU usage, it will disable all dynamic gzipped content until CPU usage dips back below 50%!
Also, some great recommendations and benchmarks here on the real CPU cost of GZIP.
http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx
Long story short, unless you regularly have dynamic pages well in excess of 200kb, it's a non-issue.
Following the excellent advice of JohnW, I too enabled logging to find the culprit, though the reason for the failure turned out to be different:
STATIC_COMPRESSION_NOT_SUCCESS
Reason 14
Reason NOT_FREQUENTLY_HIT
In short, it appears that if you don't hit a page frequently enough, IIS7 will not deem it worthy of compressing, which seems a little odd to me. Nonetheless, it makes sense in this case because I was just trying to test it on a local machine.
According to this page, the default appears to be that a page has to be hit 2 times within 10 seconds to be a "frequent hit". If you really want to, you can override the default in applicationHost.config (%systemroot%\System32\inetsrv\config). At least for me it's a locked attribute, so you won't be able to override it in your own web.config.
<serverRuntime frequentHitThreshold="1" />
Also, I note now that SO already had this answer here: In IIS7, gzipped files do not stay that way.
I solved my problem by installing dynamic compression via Add/Remove Programs.
In the system.webServer section of your Web.config file, add the following lines:
<staticContent>
<remove fileExtension=".js" />
<mimeMap fileExtension=".js" mimeType="application/x-javascript" />
</staticContent>
The compression scheme in IIS7 is enabled by default, but it maps only a single JavaScript MIME type for compression, application/x-javascript. Adding the lines above tells IIS to give all your .js files that MIME type, which in turn makes the compression work.
Turn on static compression. Dynamic compression is for dynamic pages like .asp, .php, .aspx, etc.
Here's a link to the IIS config reference for compression:
For me it turned out to be the setting
noCompressionForProxies
as we are on a proxy here... I took myself off the proxy and voila, compression.
