Unable to get useUnsafeHeaderParsing to work - ASP.NET

I have a server that sits behind an Incapsula Web App Firewall, which alters the headers sent to IIS. When I perform a specific request, I get the following error from IIS: "The server committed a protocol violation. Section=ResponseHeader Detail=CR must be followed by LF". This behavior is also described in: http://www.dragonblogger.com/fix-live-writer-protocol-violation-error-cr-lf/
According to this page http://msdn.microsoft.com/en-us/library/65ha8tzh%28v=vs.80%29.aspx I should be able to accept these headers by setting useUnsafeHeaderParsing to true, so I tried adding this to the web.config in the virtual directory that handles the specific request:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<system.net>
<settings>
<httpWebRequest useUnsafeHeaderParsing="true" />
</settings>
</system.net>
</configuration>
After restarting IIS it still does not work. I also tried adding this to c:\windows\Microsoft.NET\framework\config\web.config, but that did not work either.
Does anyone have any idea what I am missing?
Thanks!

When trying to connect Azure Application Insights to one of our websites (also "protected" by Incapsula) we encountered the same response header violation. Setting useUnsafeHeaderParsing to true didn't work for us either.
Consulting the Incapsula support desk helped us out: it seems that Incapsula adds an irregular cookie to the response to classify the visitor as a bot or a human, and that irregular cookie can be the cause of the response header violation. Incapsula also offers JavaScript-based classification, so we had them disable the irregular cookie and enable JavaScript classification instead.
This is how we solved the violation error.

How do I ensure that X-HTTP-Method headers are ignored?

I'm currently applying security fixes for a vulnerability which was found by a third party software. This is the issue (Often Misused: HTTP Method Override vulnerability).
The request from the software was similar to:
POST /Home/ViewProfile HTTP/1.1
Referer: https://somesite.com/Home/ViewProfile?qrystr=blahblah
[...]
X-HTTP-METHOD: PUT
X-HTTP-Method-Override: PUT
X-METHOD-OVERRIDE: PUT
[...]
And the response was:
HTTP/1.1 200 OK
[...]
The web application is not a RESTful API; it's just an ASP.NET MVC site which only has GET and POST actions.
I have a few questions:
Is this a false positive given the type of app?
By default, does ASP.NET do anything with the X-HTTP-Method, X-HTTP-Method-Override, or X-METHOD-OVERRIDE headers if not explicitly told to, such as in this example?
Regarding the first linked issue above, what is the best way to achieve the recommended remediations, if they're necessary/applicable in my case:
"Ensure that only the required headers are allowed, and that the allowed headers are properly configured."
and
"Ensure that no workarounds are implemented to bypass security measures implemented by user-agents, frameworks, or web servers."
Another thing to note is that I don't have access to modify IIS settings, but I can modify the Web.Config.
I had the same problem with a scan from my security team. What I did was limit the size of those request headers to zero (0) in the web.config. The server then returns an "HTTP Error 431.0 - Request Header Fields Too Large", effectively blocking the overrides.
<system.webServer>
  ...
  <security>
    <requestFiltering>
      <requestLimits>
        <headerLimits>
          <add header="X-Http-Method-Override" sizeLimit="0" />
          <add header="X-Method-Override" sizeLimit="0" />
          <add header="X-HTTP-Method" sizeLimit="0" />
        </headerLimits>
      </requestLimits>
      ...
    </requestFiltering>
  </security>
  ...
</system.webServer>
However, I haven't checked yet whether this effectively cancels the alert from the security scanner. I suspect it might still show, but I'm ready to report it back as a false positive, since the server is blocking all calls with those headers. I'll let you know as soon as I get a response from the security team.
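If you'd also like a defense in application code (for example, to return an explicit 405 Method Not Allowed instead of a 431, and without touching IIS settings), here is a minimal Global.asax sketch, assuming the three override header names from the scan:

using System;
using System.Web;

public class Global : HttpApplication
{
    // The override headers reported by the scan.
    private static readonly string[] OverrideHeaders =
    {
        "X-HTTP-Method", "X-HTTP-Method-Override", "X-Method-Override"
    };

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        foreach (string header in OverrideHeaders)
        {
            if (Request.Headers[header] != null)
            {
                Response.StatusCode = 405; // Method Not Allowed
                Response.SuppressContent = true;
                CompleteRequest();         // short-circuit the rest of the pipeline
                return;
            }
        }
    }
}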

RequestFiltering not working for MS-DOS device name paths

I'm trying to resolve a PCI scan failure we recently had, which states:
Microsoft ASP.NET MS-DOS Device Name DoS
Synopsis: A framework used by the remote web server has a denial of service vulnerability.
Impact: The web server running on the remote host appears to be using Microsoft ASP.NET, and may be affected by a denial of service vulnerability. Requesting a URL containing an MS-DOS device name can cause the web server to become temporarily unresponsive.
In a nutshell, when we visit a URL on our app such as /AUX/.aspx, we get a 500 error.
I'm using RequestFiltering to filter these requests out and return 404s instead, without the server trying to process the request.
An excerpt of my web.config is below:
<system.webServer>
  <security>
    <requestFiltering>
      <denyUrlSequences>
        <add sequence="/AUX/.aspx" />
      </denyUrlSequences>
    </requestFiltering>
  </security>
</system.webServer>
However, this isn't working; it's still returning a 500 where I would expect a 404.
If I add the following catch-all URL to the denyUrlSequences, then the whole site produces the expected 404.
<add sequence="/" />
It's worth mentioning that the application in question is an MVC app running on IIS 7.5 (Windows 2008 R2).
Just had to solve this problem.
My solution was to disable the .NET Error Pages and enable the IIS Error Pages.
When you move the custom error handling from the higher .NET level to the lower IIS level, the HTTP response code changes from 500 to 404.
PCI Test Passed :-)
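For what it's worth, here is a hedged guess at the web.config equivalent of that IIS Manager change (".NET Error Pages" maps to customErrors and "Error Pages" maps to httpErrors; your exact attribute values may differ):

<configuration>
  <system.web>
    <!-- ".NET Error Pages" feature in IIS Manager -->
    <customErrors mode="Off" />
  </system.web>
  <system.webServer>
    <!-- "Error Pages" feature; Replace lets the IIS-level error page win -->
    <httpErrors errorMode="Custom" existingResponse="Replace" />
  </system.webServer>
</configuration>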
I struggled with this for quite some time myself. I think the 500 response code is correct for MS-DOS names in the URL, and you do not need to add anything to request filtering.
You'll notice that you get a 500 error if you request any of the MS-DOS device names (https://support.microsoft.com/en-us/kb/74496) without changing your configuration at all. However, if you add a RequestFiltering denySequence for something else, like "foo", then you will see the 404.5 error when browsing to /foo.
If you add relaxedUrlToFileSystemMapping="true" to the httpRuntime element along with your request filtering denySequence entries, then you will get the 404.5 for MS-DOS names.
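For reference, here is a minimal web.config sketch combining that httpRuntime attribute with the denyUrlSequences entry from the question (treat it as a starting point rather than a verified fix):

<configuration>
  <system.web>
    <httpRuntime relaxedUrlToFileSystemMapping="true" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <denyUrlSequences>
          <add sequence="/AUX/.aspx" />
        </denyUrlSequences>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>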
But disabling the default ASP.NET configuration just so you can get something other than a 500 response for a URL with an MS-DOS name is a ridiculous request from a PCI compliance check.

Access-Control-Allow-Origin header vs cross domain policy

So I'm reading up on these and am a little confused. I'm using an iframe of a site on another domain, and I get “No 'Access-Control-Allow-Origin' header is present on the requested resource.” Reading up on this, it seems I can just set the header in the web.config. However, I want to allow multiple specific domains, not just the wildcard "*". I was also reading up on the cross domain policy, which involves creating an XML file. Is this by any means related, or are these two completely different things?
This XML policy:
<?xml version="1.0"?>
<cross-domain-policy>
<allow-access-from domain="domain1.com"/>
<allow-access-from domain="domain2.com"/>
</cross-domain-policy>
vs. this in the web.config:
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Origin" value="site1.com" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
They are two different mechanisms: the cross-domain policy file is read by browser plugins such as Flash and Silverlight, while CORS is what browsers themselves use for cross-origin Ajax. CORS works by adding a special header to responses from a server to the client. If a response contains the Access-Control-Allow-Origin header, and if the browser supports CORS, then there is a chance you can load the resource directly with Ajax, with no need for a proxy.
When you set the Access-Control-Allow-Origin value to “site1.com”, only scripts that originate from http://site1.com are allowed to load resources; any other domain trying to use Ajax to load them will get the standard security error. In this way, site owners can limit which domains are allowed to load their resources with CORS.
Alternatively, site owners can grant wide-open access with the always-ready-to-party asterisk:
Access-Control-Allow-Origin: *
Now, any site that wants to load a resource directly using Ajax can do so without getting the browser security error. It's a very helpful technique for modern apps that often load data using JavaScript, and hopefully more modern web APIs will start to support CORS.
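Since the header only accepts a single origin (or the wildcard), the usual way to support a whitelist of specific domains is to echo the request's Origin header back when it matches. Here is a minimal Global.asax sketch, assuming the two domains from the question's policy file:

using System;
using System.Collections.Generic;
using System.Web;

public class Global : HttpApplication
{
    // Hypothetical whitelist, taken from the question's policy file.
    private static readonly HashSet<string> AllowedOrigins =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            "http://domain1.com",
            "http://domain2.com"
        };

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Browsers send the Origin header on cross-origin requests;
        // echo it back only when it is on the whitelist.
        string origin = Request.Headers["Origin"];
        if (origin != null && AllowedOrigins.Contains(origin))
        {
            Response.AddHeader("Access-Control-Allow-Origin", origin);
        }
    }
}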

Secure Cookie Issue: Cookies only secure sometimes

I am trying to secure the cookies returned from my ASP.NET application.
I set requireSSL="true" in my web.config, but it looks like the cookies are only secure sometimes. When I check the request in Firebug or Chrome dev tools, the cookie is sometimes marked secure (usually on my first visit to the page; on subsequent visits it is not).
Screen shot of Chrome dev tools: http://i.imgur.com/jII0KDI.png
Does anyone have an idea why this might be happening?
Thanks for the help!
Web.Config Settings
<system.web>
  <httpCookies httpOnlyCookies="true" requireSSL="true" />
</system.web>
It could well be working.
Chrome dev tools only shows cookies as HTTP Only and Secure in the response, not the request, so your setup might be working. This may be a bug in Chrome dev tools, or it may simply be showing what is present in the request: the Secure and HttpOnly attributes are not included in an actual HTTP request, only the cookie's value is sent to the server. Either way, I think it should show N/A in these columns to make clear that they do not apply to HTTP requests.
To verify that your cookie has been set correctly you could try the Edit This Cookie extension. This will indicate for each cookie whether it has the Secure or HTTP Only attributes applied.
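If you want to rule out configuration inheritance problems, you can also set the flags explicitly on a cookie in code; a minimal sketch (the cookie name and value are placeholders):

// e.g. in a controller action or page code-behind
var cookie = new System.Web.HttpCookie("MyAppCookie", "some-value")
{
    Secure = true,   // only transmitted over HTTPS
    HttpOnly = true  // not readable from client-side JavaScript
};
Response.Cookies.Add(cookie);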
Can you please elaborate on what you mean by "secure"? If the SSL-only flag is set, then of course all cookies will be sent over an encrypted connection only, but that doesn't prevent you from seeing them in a debugger.

HttpWebRequestError: The server committed a protocol violation. Section=ResponseHeader Detail=CR must be followed by LF

I have created a sample ASP.NET application and am trying to scrape data from my server using HttpWebRequest, but sometimes I get the above error. I have done some searching on Google, and everything says you should add "<httpWebRequest useUnsafeHeaderParsing="true" />" to the web.config.
This property has the word "UNSAFE" in it, so I am worried about it and can't add it to my site configuration. Is there any other option for reading the response from my scrape URL? Please let me know how this is possible without "<httpWebRequest useUnsafeHeaderParsing="true" />".
Thanks in advance,
Laxmilal Menaria
This is certainly a server problem: the server is not following the HTTP specification, and the .NET client is flagging this as a potential problem. "Unsafe" is, I think, somewhat of a misnomer; there's not really a big security issue here, only non-compliance with the RFC, which is bad but unfortunately not rare.
So, as you found in Google, the way to get around the problem is to apply the following configuration change:
<configuration>
  <system.net>
    <settings>
      <httpWebRequest useUnsafeHeaderParsing="true" />
    </settings>
  </system.net>
</configuration>
One way to debug this (and to make sure it is the protocol violation that is causing the problem) is to use Fiddler (an HTTP web proxy) and see if the same error occurs. If it doesn't (i.e. Fiddler handled the issue for you), then you should be able to fix it using the useUnsafeHeaderParsing flag.
If you are looking for a way to set this value programmatically, see the examples here: http://o2platform.wordpress.com/2010/10/20/dealing-with-the-server-committed-a-protocol-violation-sectionresponsestatusline/
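The examples at that link boil down to flipping the internal switch via reflection. A commonly circulated sketch is below; note that it relies on the internal type SettingsSectionInternal, an implementation detail of the .NET Framework that could change between versions:

using System;
using System.Net.Configuration;
using System.Reflection;

public static class UnsafeHeaderParsing
{
    // Returns true if the internal switch was found and enabled.
    public static bool TryEnable()
    {
        Assembly netAssembly = Assembly.GetAssembly(typeof(SettingsSection));
        if (netAssembly == null) return false;

        Type settingsType = netAssembly.GetType("System.Net.Configuration.SettingsSectionInternal");
        if (settingsType == null) return false;

        object section = settingsType.InvokeMember("Section",
            BindingFlags.Static | BindingFlags.GetProperty | BindingFlags.NonPublic,
            null, null, new object[0]);
        if (section == null) return false;

        FieldInfo field = settingsType.GetField("useUnsafeHeaderParsing",
            BindingFlags.NonPublic | BindingFlags.Instance);
        if (field == null) return false;

        field.SetValue(section, true);
        return true;
    }
}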
Another possibility: when doing a POST, the server responds with a 100 Continue in an incorrect way.
This solved the problem for me:
request.ServicePoint.Expect100Continue = false;
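If you don't have easy access to the individual request object, the same switch also exists process-wide; a sketch (set it once at startup, e.g. in Application_Start):

// Disables "Expect: 100-continue" for all subsequent HttpWebRequests.
System.Net.ServicePointManager.Expect100Continue = false;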
You may want to check the content of your headers. Try adding the following Accept header, as suggested by this link:
Accept: text/html, application/xhtml+xml, */*
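For example, a hypothetical scraping request with that header set (the URL is a placeholder):

using System.IO;
using System.Net;

var request = (HttpWebRequest)WebRequest.Create("http://example.com/page-to-scrape");
request.Accept = "text/html, application/xhtml+xml, */*";

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd(); // the scraped page body
}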
