Why does the ModSecurity OWASP rule block .axd files? - asp.net

I've been going over WAF findings in an ASP.NET application. The WAF is ModSecurity with the OWASP CRS. One of the findings is:
URL file extension is restricted by policy, Rule ID 920440
and it fired on the files WebResource.axd and ScriptResource.axd.
I did some research and found that these files are HTTP handlers whose content is embedded in assemblies. I found the rule itself - it's a simple one: it just checks the file extension and blocks the request based on that. The .axd extension just happens to be one of those listed.
As I understand it, these files may be connected with AJAX (I might be wrong on this one). However, I didn't manage to find any reason or explanation on the internet for why they are blacklisted by the OWASP CRS. The only piece of information that might give a clue was this question.
Why are .axd files blacklisted? Are they deprecated? Can they be listed as exceptions to the rule, or do they introduce some actual risk? Finally, how can you modify an ASP.NET application so that it doesn't need these files?

Maybe it's too late to reply, but there are several security issues related to .axd files, including the ASP.NET padding oracle attack (CVE-2010-3332) and the Telerik remote code execution vulnerability (CVE-2019-18935), among others.
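On the exception question: if you have audited your application and still need these handlers, the CRS supports per-application rule exclusions. A minimal sketch, assuming ModSecurity 2.x with OWASP CRS 3.x - the rule id 1000 and the file placement are assumptions to adapt to your setup:

# Skip rule 920440 only for the two ASP.NET handler endpoints, keeping
# the extension blacklist intact for everything else. Load this before
# the CRS rules (e.g. in REQUEST-900-EXCLUSION-RULES-BEFORE-CRS.conf).
SecRule REQUEST_FILENAME "@rx /(?:Web|Script)Resource\.axd$" \
    "id:1000,phase:1,t:none,pass,nolog,ctl:ruleRemoveById=920440"

Note that whitelisting these endpoints only makes sense once the application is patched against the CVEs above.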

Related

Loading src files once per session in asp.net

I have way too many pages in the application that basically load the same set of xml and js files for client-side interaction and validation. So, I have about a dozen lines like this one <script type="text/javascript" src="JS/CreateMR.js"></script> or like this one <xml id="DefaultDataIslands" src="../XMLData/DataIslands.xml">.
These same files are included in every page, and as such the browser sends requests to read them every time. It takes about 900ms just to load these files.
I am trying to find a way to load them on just the login page, and then use the browser's cached copy as the source. Is it possible to do so? If yes, how and where should I start?
P.S. A link to a tutorial will work too, as I currently have no knowledge about this.
Edit:
I can't cache the whole page, because the pages are generated at runtime based on the different possible view modes. I can only cache the js and xml files. Caching everything might be a problem.
Anyway, I am reading through the suggested articles to figure out how to do it, so I may not be able to accept an answer right away while I finish reading and try to implement it in one page.
Edit:
Turns out caching is already enabled; it is just that my server is acting strangely. (Screenshots omitted: one timing trace with cache, one without.)
As you can see, with the cache it is actually taking more time to process some of the requests. I have no idea what the problem is, but I guess I should go to the server Stack Exchange to figure it out.
As for the actual problem, it turns out I don't have to do anything to enable caching of xml and js files. I had no idea browsers automatically cache js files without any specific tag.
Totally possible and in fact recommended.
Browsers cache content that has been sent down with appropriate HTTP caching headers and will not request it again until the cache has expired. This will make your pages faster and more responsive, and your server's load much lighter.
Here is a good read to get you started.
Here is the ASP.NET MVC caching guide. It focuses on caching content returned from controllers.
Here is a read about caching static content on IIS with ASP.NET MVC.
Basically, you want to use the browser's caching mechanism to cache the src files after the first request.
If you're using the F12 tools in your browser to debug network requests, make sure the disable-cache option is unchecked. Otherwise, the browser is forced to ignore cached files.
Make sure your server sends and respects cache headers - it should return HTTP status 304 Not Modified after the first request to a static file.
Take a look at ASP.NET bundling and minification - if you have, for example, multiple js source files, you can bundle them into one file that will be cached on the first request.
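To illustrate the bundling tip, a minimal sketch assuming the Microsoft ASP.NET Web Optimization package; the bundle name is invented, and the file path is borrowed from the question above:

// App_Start/BundleConfig.cs
using System.Web.Optimization;

public static class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // One minified, cacheable response instead of a dozen <script> tags.
        bundles.Add(new ScriptBundle("~/bundles/common").Include(
            "~/JS/CreateMR.js"));

        // Bundle and minify even while <compilation debug="true"> is set.
        BundleTable.EnableOptimizations = true;
    }
}

Call BundleConfig.RegisterBundles(BundleTable.Bundles) from Application_Start, then render the bundle in your pages with Scripts.Render("~/bundles/common").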
Additionally, if you use external js libraries, you could load them from a CDN instead of your server - this will both offload your server and let the user's browser use a cached copy of the script (meaning that if some other page the user has visited used the same script, the browser should already have it cached).
One approach is caching static files via IIS by adding a <clientCache> element to the web.config file. The <clientCache> element of the <staticContent> element specifies cache-related HTTP headers that IIS 7 and later sends to web clients, which control how web clients and proxy servers will cache the content that IIS 7 and later returns.
How to configure static content cache per folder and extension in IIS7?
Client Cache
For more info on client-side caching, read this part of the Ultra-Fast ASP.NET 4.5 book:
Browser Cache and Caching Static Content
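A minimal sketch of that <clientCache> configuration; the seven-day max-age is an arbitrary example value:

<!-- web.config: have IIS 7+ send Cache-Control: max-age for static files -->
<configuration>
  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge"
                   cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>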
Another approach is caching portions of the page.
If you are using Web Forms:
Caching Portions of an ASP.NET Page
And if you are using MVC, use donut hole caching:
ASP.NET MVC Extensible Donut Caching
Donut Caching and Donut Hole Caching with Asp.Net MVC
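For the Web Forms case, fragment caching boils down to an OutputCache directive in a user control; a minimal sketch, with an arbitrary duration and an invented control name:

<%-- MyFragment.ascx: this control's rendered HTML is cached for 60 seconds
     and reused across requests, while the rest of the page stays dynamic. --%>
<%@ Control Language="C#" %>
<%@ OutputCache Duration="60" VaryByParam="none" %>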
The browser has to ask the server whether the file has been modified since it was put in the cache, hence the HTTP status code 304. Read more at https://httpstatuses.com/304.
As this is ASP.NET, please first make sure you are running it with
<compilation debug="false"/>
as enabling debugging has some side effects, which include:
"All client-javascript libraries and static images that are deployed via
WebResources.axd will be continually downloaded by clients on each page
view request and not cached locally within the browser."
Read more at https://blogs.msdn.microsoft.com/prashant_upadhyay/2011/07/14/why-debugfalse-in-asp-net-applications-in-production-environment/
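On production servers you can also force this machine-wide, so a stray debug="true" in any web.config is ignored; a sketch (this goes in machine.config, not web.config, and affects every application on the box):

<!-- machine.config, inside <system.web>: overrides debug="true" everywhere -->
<deployment retail="true" />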

Manually added WebResource.axd - Any security implications?

I added an empty WebResource.axd file manually in the production environment (IIS7 and Windows Server 2008) after Chrome and Firefox were giving 404 errors for WebResource.axd. And now it is working fine, even with parameters. I am not sure why ASP.NET is generating this script reference in the final rendered result (there was no problem with VS2008), but now it is working. I want to know: are there any security implications beyond what is normally associated with WebResource.axd, given that it is in the root directory?
Thanks
I also had this issue recently; after a code change, WebResource.axd stopped working for my site. Basically, it exposes dynamic resources over HTTP. For a longer explanation, visit:
Just where is webresource.axd?
Typically, what happens is that one of the dynamic resources being requested has failed, but the resource name is encrypted. This blog post will help you decrypt the name; use it together with the query string of your failing webresource.axd?___ request to figure out where the error is coming from.
Telerik webresource troubleshooting

How can I detect if an ISAPI rewrite has occurred

I've inherited an old system after starting a new job, none of the previous developers work here any more and none of them documented all that much. Fun times.
The system uses an old, defunct CMS, and I've just finished a large ordeal whereby I could not for the life of me figure out how routing worked (the client wanted a URL changed). It later turned out that the previous developers had been using a completely separate program called "Helicon ISAPI Rewrite" and had been doing all of the site's URL management from there.
My question is: How could I have figured this out more quickly (e.g. are there external tools I could have used or logs I don't know about that would have revealed how this routing was working)?
I spent a whole afternoon picking through 10 years worth of code when the answer wasn't even in there! Right now I'm feeling that I had no chance of figuring that out quickly but I'm wondering if I'm missing something.
I think I understand what you're asking: how to discover the rewriting in the first place. Tony's answer is right if you knew about ISAPI_Rewrite up front, but hindsight is 20/20. I'm a big fan of ISAPI_Rewrite and Helicon Ape, so I might have suspected it. However, if the rewriting was being done by IIS7's .NET web.config, I wouldn't have looked there (although I guess web.config should be a place to start for anything IIS-screwy). With a legacy CMS or something like WordPress, I wouldn't know where to start, so I would probably start with the code like you did.
I suppose the real starting point is the top of IIS, before the request even gets to the web code.
Looking around in IIS7, I see Handler Mappings, with a whole bunch of entries in there intercepting various requests. These could all "do things" to the request before it hits the website. For example, I see Microsoft's ExtensionlessUrlHandler..., which gave us trouble as a breaking change when upgrading to .NET 4.0. We had to dig around for this, wondering what was putting eurl.axd into our URLs.
IIS6 has an ISAPI Filters tab in the website properties. Mine has ISAPI_Rewrite and ASP.NET_2.0.... in it. There's also an HTTP Headers tab with MIME types, which can be a culprit for diverting requests.
Knowing this now, perhaps a list of all rewriting software would be handy. Just search the system for any of them installed - that might be the fastest way to get a first clue.
And actually, if you spent an afternoon in 10 years of code, that's not too bad! So you may not have had a chance of figuring this out quickly - any legacy system is going to have buried secrets.
If it's ISAPI_Rewrite v3, you can enable logging in httpd.conf in the ISAPI_Rewrite installation folder by adding the following lines:
RewriteLogLevel 9
LogLevel debug
Then, after you make some test requests, rewrite.log and error.log files will appear in the ISAPI_Rewrite installation folder. error.log shows general issues, while rewrite.log shows whether and how the rules are applied and what the resulting URL is.

ASP.NET Friendly URLs

In my research, I found two ways to do them.
Both required modifications to the Application_BeginRequest procedure in Global.asax, where you run your code to do the actual URL mapping (mine was done with a database view that contained all the friendly URLs and their mapped 'real' URLs; see the sketch after the list below). Now the trick is to get your requests to run through the .NET engine without an .aspx extension. The two ways I found are:
Run everything through the .NET engine with a wildcard application extension mapping.
Create a custom aspx error page and tell IIS to send 404s to it.
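As referenced above, a minimal sketch of the Global.asax mapping step; UrlMappings.LookupRealUrl is a hypothetical placeholder for your own data access against the friendly-URL database view:

// Global.asax.cs
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // e.g. "~/products/widgets" as requested by the browser
        string friendly = Request.AppRelativeCurrentExecutionFilePath;

        string realUrl = UrlMappings.LookupRealUrl(friendly);
        if (realUrl != null)
        {
            // Server-side rewrite: the browser keeps the friendly URL,
            // but the request is handled by the mapped 'real' .aspx page.
            Context.RewritePath(realUrl);
        }
    }
}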
Now here's my question:
Is there any reason one of these is better than the other?
When playing around on my dev server, the first thing I noticed about #1 was that it botched FrontPage extensions - not a huge deal, but that's how I'm used to connecting to my sites. Another issue I have with #1 is that even though my hosting company is lenient with me (as I'm their biggest client) and will consider doing things such as this, they are wary of any security risks it might present.
#2 works great, but I just have this feeling it's not as efficient as #1. Am I just being delusional?
Thanks
I've used #2 in the past too.
It's more efficient because, unlike the wildcard mapping, the ASP.NET engine doesn't need to process requests for all the additional resources like image files, static HTML, CSS, JavaScript, etc.
Alternatively, if you don't mind an .aspx extension in your URLs, you could use http://myweb/app/idx.aspx/products/1 - that works fine.
Having said that, the real solution is using IIS 7, where the ASP.NET runtime is a fully fledged part of the IIS HTTP module stack.
If you have the latest version of IIS, there is a rewrite module for it - see here. If not, there are free third-party binaries you can use with older IIS (i.e. version 6) - I have used one that reads the rewrite rules from an .ini file and supports regular expressions, but I can't remember its name, sorry (it's possibly this). I'd recommend this over cheaping out with the 404 page.
You have to map all requests through the ASP.NET engine. IIS processes requests based on the file extension; by default it only routes the .aspx, .ashx, etc. extensions that are meant to be processed by ASP.NET, because running everything through the engine adds overhead to request processing.
I wrote how to do it with IIS 6 a while back, http://professionalaspnet.com/archive/2007/07/27/Configure-IIS-for-Wildcard-Extensions-in-ASP.NET.aspx.
You are right in doing your mapping from the database rather than with RegEx rewriting like what is used out of the box in MVC. Regex rewriting more or less forces you to put the primary key in the URL, and it does not have a good way to map characters that are not allowed in URLs, like '.
Have you checked out the ASP.NET MVC Framework? Using that framework, all your URLs are automatically mapped to controllers, which can perform any desired action (including redirecting to other URLs or controllers). You can also set up custom routes with custom parameters. If you haven't seen it yet, it may be worth a look.

Should I embed CSS/JavaScript files in a web application?

I've recently started embedding JavaScript and CSS files into our common library DLLs to make deployment and versioning a lot simpler. I was just wondering if there is any reason one might want to do the same thing with a web application, or if it's always best to just leave them as regular files in the web application, and only use embedded resources for shared components?
Would there be any advantage to embedding them?
I had to make this same decision once. The reason I chose to embed my JavaScript/CSS resources into my DLL was to prevent tampering of these files (by curious end users who've purchased my web application) once the application's deployed.
I doubt and question the validity of Easement's comment about how browsers download JavaScript files. I'm pretty sure that the embedded JavaScript/CSS files are recreated temporarily by ASP.NET before the page is sent to the browser, in order for the browser to be able to download and use them. I'm curious about this and I'm going to run my own tests. I'll let you know how it goes....
-Frinny
Of course, anyone who knew what they were doing could use .NET Reflector on the assembly and extract the JS or CSS. But that would be a heck of a lot more work than just using something like Firebug to get at this information. A regular end user is unlikely to have the desire to go to all of this trouble just to mess with the resources. Anyone who's interested in this type of thing is likely to be a malicious user, not the end user. And you probably have a lot of other security problems if a user is able to run a tool like Reflector on your DLL, because by that point your server has already been compromised. Security was not the factor in my decision for embedding the resources.
The point was to keep users from doing something silly with these resources, like deleting them thinking they aren't needed, or otherwise tampering with them.
It's also a lot easier to package the application for deployment purposes because there are fewer files involved.
It's true that the DLL (class library) used by the pages is bigger, but this does not make the pages any bigger. ASP.NET generates the content that needs to be sent down to the client (the browser); there is no more content being sent to the client than what is needed for the page to work. I do not see how the class library helping to serve these pages would have any effect on the size of the data being sent between the client and the server.
However, Rjlopes has a point: it might be true that the browser is not able to cache embedded JavaScript/CSS resources. I'll have to check it out, but I suspect that Rjlopes is correct: the JavaScript/CSS files would have to be downloaded each time a full-page postback is made to the server. If this proves to be true, that performance hit should be a factor in your decision.
I still haven't been able to test the performance differences between using embedded resources, .resx files, and single files because I've been busy with my own endeavors. Hopefully I'll get to it later today, because I am very curious about this and the browser-caching point Rjlopes has raised.
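For reference, this is roughly what the embedding side looks like: a sketch assuming a class library where app.js has its Build Action set to Embedded Resource; the assembly, namespace, and type names are invented for illustration:

// AssemblyInfo.cs in the class library: register the embedded file so
// that WebResource.axd is allowed to serve it.
[assembly: System.Web.UI.WebResource("MyCompany.Lib.Scripts.app.js", "text/javascript")]

// In a page or control: resolve the WebResource.axd URL and emit the include.
string url = Page.ClientScript.GetWebResourceUrl(
    typeof(MyCompany.Lib.SomeControl), "MyCompany.Lib.Scripts.app.js");
Page.ClientScript.RegisterClientScriptInclude(this.GetType(), "appJs", url);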
Reason for embedding: browsers don't download JavaScript files in parallel, so you have a blocking condition until each file is downloaded.
Reason against embedding: You may not need all of the JavaScript code. So you could be increasing the bandwidth/processing unnecessarily.
Regarding the browser cache: as far as I've noticed, the response for WebResource.axd says "304 Not Modified", so I guess they have been taken from the cache.
Quoting the earlier answers:
"I had to make this same decision once. The reason I chose to embed my JavaScript/CSS resources into my DLL was to prevent tampering of these files (by curious end users who've purchased my web application) once the application's deployed."
"Reason against embedding: You may not need all of the JavaScript code. So you could be increasing the bandwidth/processing unnecessarily."
You know that if somebody wants to tamper with your JS or CSS, they just have to open the assembly with Reflector, go to the resources, and edit what they want (it probably takes a lot more work if the assemblies are signed).
If you embed the js and css in the page, you make the page bigger (more KB to download on each request) and the browser can't cache the JS and CSS for later requests. The good news is that you have fewer requests (at least two fewer if you are like me and combine multiple js and css files into one each), plus external JavaScript files have the problem of being downloaded serially.
