ASP.NET and 400 Bad Request Error

Sometimes I get a 400 Bad Request error for different websites written in ASP.NET.
The only solution I know of is to clear the cookies for that site. It seems that the cause of the problem is the _utmz and _utma cookies, which belong to Google Analytics. The problem occurs most often in Mozilla Firefox, sometimes in Chrome and Safari, and never in IE. The error appears at random.
What I found:
From Stefan on the ASP.NET team:
http://forums.asp.net/p/1431148/3221542.aspx
In current versions of ASP.NET, URLs containing characters like the colon character will be rejected as a potential security threat. The historical reason for this is that the underlying NTFS file system supports alternate resource streams that can be accessed with names like "yourfile.txt:hiddendata.txt". Blocking the colon character from URLs prevents poorly written applications from accidentally working with alternate resource streams.
There is also a limitation in current versions of ASP.NET that incoming URLs need to map to the NTFS file system for purposes of determining managed configuration data.
In ASP.NET 4 these limitations can be optionally removed. However, these changes are in the Beta 2 version of ASP.NET 4; they are not in Beta 1. We tried out the URL listed earlier in this forum post and confirmed that with our internal builds of ASP.NET 4 you can use that style of URL and process it without any 400 error.
Is this a problem with the ASP.NET runtime, with Firefox's cookie management, or with the Google Analytics code? What solutions do you know of?

The issue is with the way that Firefox handles special characters in cookies. It doesn't have anything to do with ASP.NET; you will see the error whatever server-side language you are using.
The issue seems to occur most often with Google Analytics campaign sources. You should simply try to keep the values in these to alphanumeric characters.
I have personally had to fix issues where people used an apostrophe and an em dash in their query strings. As a result, I tell people to avoid hyphens and apostrophes altogether. If you aren't the one including them, you can't be sure someone didn't copy in a special character the browser can't handle.
This forum thread has a suggestion for clearing the bad cookies on the 400 page so that users can access the site properly.
There is already another question with the same problem on Stack Overflow.
You need to make sure that you are not including any special characters before you set the campaign sources etc. Sources that are created dynamically, through CMS newsletters etc., can include bad characters if they are not formatted aggressively. We have had trouble with people pasting MS Office characters into titles and links, which breaks cookies and stops the page from being served.
There is no solution to the problem other than making sure that you aren't sending bad data to GA and causing cookies to be corrupted.
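As a sketch of the kind of defensive formatting described above (the helper and its whitelist are illustrative, not part of any GA API), a CMS could scrub the utm_* values server-side before the links ever go out:

    using System;
    using System.Linq;

    static class CampaignLinkBuilder
    {
        // Drop anything that is not a letter or digit, so the utm_* values
        // can never smuggle characters that corrupt the __utmz/__utma
        // cookies Google Analytics builds from them.
        static string Sanitize(string value)
        {
            var clean = new string(value.Where(char.IsLetterOrDigit).ToArray());
            return clean.Length > 0 ? clean.ToLowerInvariant() : "unknown";
        }

        public static string Build(string baseUrl, string source, string medium, string campaign)
        {
            return string.Format("{0}?utm_source={1}&utm_medium={2}&utm_campaign={3}",
                baseUrl, Sanitize(source), Sanitize(medium), Sanitize(campaign));
        }
    }

For example, an em dash pasted from MS Office into a source named "News—letter" is silently dropped, yielding utm_source=newsletter instead of a cookie-corrupting value.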

Weird URL bypassing routing: parenthesis, capital letter, parenthesis, anything, two closing parentheses

I stumbled upon some strange behaviour today on a website at work. Our SEO consultant wanted some strange-looking links taken out of Google's index, a seemingly straightforward task. But it turned out to be very difficult.
The website was a .NET MVC 5.2.3 application. We looked at routing, our own libraries, etc. Nothing strange. After a while we gave up and tried to simply redirect requests to these URLs by setting up a rule in web.config. It turns out these URLs are unmatchable! Somehow, under the right conditions, the critical part of the URL seems to evade matching rules as well as routing later on in the MVC application.
We narrowed the mystical URLs down to the format (T(anything)), where T can be any capital letter and anything can be, eh, anything. This is placed at the beginning of the URL as if it were a directory. In regex: \([A-Z]\([a-zA-Z0-9]*\)\)
I've tested and found the same behaviour on:
.net MVC5 sites
.net MVC3 sites
.net Web Forms sites
http://asp.net
http://stackoverflow.com
Some examples from stackoverflow.com:
Bypasses routing: https://stackoverflow.com/(K(jonas))/questions
Routes normally (404): https://stackoverflow.com/jonas/questions
Bypasses routing: https://stackoverflow.com/(G(hello))/users/1049710/jonas-%C3%84ppelgran
Routes normally (404): https://stackoverflow.com/gandhello/users/1049710/jonas-Äppelgran
It doesn't seem to affect the whole web, so it shouldn't be a browser or HTTP issue. Some examples:
Routes normally (404): http://php.net/(T(testing))/downloads
Routes normally (404): https://www.iana.org/(T(testing))/domains/reserved
Can anybody explain what is going on?
And what can I do to prevent these URLs from bypassing routing?
Apparently this is a feature called a "cookieless session" in ASP.NET. See the "Cookieless SessionIDs" section in the MSDN docs.
The basic idea is that instead of storing the session id (if session state is enabled) in a cookie, it's now embedded in the URL.
We (Stack Overflow) disable session state entirely (by setting sessionState mode to off). As far as I know, the end result is that any time a URL matching the session ID format is used, that information is simply discarded.
None of the links leading to us in Google include it either, which makes me think that your site may be configured to actually generate session IDs in URLs? Short of disabling the feature, there's probably not much you can do here. Although, see "Regenerating Expired Session Identifiers" on the MSDN page I linked above to see how to at least prevent accidental session sharing if that's not already done.
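For reference, a minimal web.config sketch of the two mitigations mentioned: turning session state off entirely (as we do), or forcing cookie-based session IDs so they are never embedded in URLs:

    <!-- web.config (sketch) -->
    <configuration>
      <system.web>
        <!-- No session state at all: a (T(...))-style URL segment is
             recognized and then simply discarded. -->
        <sessionState mode="Off" />
        <!-- Or keep sessions but never put the ID in the URL: -->
        <!-- <sessionState mode="InProc" cookieless="UseCookies" /> -->
      </system.web>
    </configuration>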

ASP.NET MVC: some users get old scripts despite our use of bundles

We have an ASP.NET MVC 5 application. We use bundles with optimization enabled by default. But we have heard several times from users that they get errors which we think are caused by old versions of our scripts. Their browsers somehow take scripts from cache, despite the fact that we have edited the script files and the bundles should be updated. The worst part of the problem is that we can't reproduce it; we don't know how. We have already tried making test changes to scripts, like adding some "console.log('test')" lines, to see if the browser takes the cached version, but everything was fine: the hash at the end of <script src="....?v='hash'"> changed and the browser took the newest version on the first try. I should mention that our site is a single-page application; I don't know, maybe that's somehow related to the problem.
Have you faced this kind of problem?
There's not enough information here to give a definitive answer. The bundler detects changes in files and will regenerate the bundle along with the link to that bundle, which will include an updated query string param. Since the query string is part of the URI, it's considered a totally different resource at this point, and the browser should fetch it again, because there is technically no cache available. The only logical reason this would not occur is if the HTML with the link to the bundle is not being updated. This can happen if you're using OutputCache or otherwise caching the HTML document. It can also happen if the client's browser is aggressively caching the HTML document. Unfortunately, there's not much you can do about that, as the client browser ultimately has control over what is or is not cached and for how long.
That said, given that this is a single page app, it's very possible that it's also including a cache manifest. This manifest will very often include the HTML file itself, and the browser will not refetch any file in the manifest unless the manifest itself is updated.
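To make the mechanism described in this answer concrete, here is a minimal sketch of a standard System.Web.Optimization setup (bundle name and file paths are illustrative):

    using System.Web.Optimization;

    public class BundleConfig
    {
        public static void RegisterBundles(BundleCollection bundles)
        {
            // Any change to a file in the bundle changes its content hash,
            // so the rendered URL (~/bundles/app?v=<hash>) is a brand new
            // URI that no browser can already have cached.
            bundles.Add(new ScriptBundle("~/bundles/app").Include(
                "~/Scripts/app/main.js",
                "~/Scripts/app/helpers.js"));

            // Bundle and fingerprint even when <compilation debug="true" />.
            BundleTable.EnableOptimizations = true;
        }
    }

In a Razor layout, @Scripts.Render("~/bundles/app") then emits <script src="/bundles/app?v=...hash..."></script>; only a stale copy of the HTML page containing that tag can pin a client to an old hash, which is why caching of the document itself is the prime suspect.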

Safe to allow double escaping on IIS

I have a site that is using IIS 7.5; it is also using ISAPI_Rewrite. I would like to redirect some URLs that will be passed by a piece of desktop software one of our business partners uses. Basically they have an app that can be used to look up products; they can click on a product in the app and be taken to the supplier's web site.
The problem is that the app, which is used by many customers and cannot be changed to accommodate me, uses a "+" to eliminate spaces in some of the URL parameters. This is causing problems because I am trying to use ISAPI_Rewrite to rewrite the URL based on the part number. The "+" sign causes IIS to return a "not found" error.
According to everything I have read this can be fixed by allowing double escaping in the web.config file. Everyone seems pretty cavalier about allowing this, but I see MS considers it a security issue.
So what exactly is the danger of allowing double escaping? We have a classic ASP site running on this server, so I don't want to risk any exposure of the site or DB by using this. OTOH, if I could feel confident allowing it, it would be a really easy way to solve a problem.
That's a complex question, and I don't have a definitive answer for you, but I can tell you that in the last 4 years of my experience with IIS and double escaping, our customers have been enabling double escaping without any further problems.
(Member of HeliconTech Support Team)
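For reference, the switch in question is the allowDoubleEscaping attribute of IIS request filtering; a minimal web.config sketch:

    <!-- web.config (sketch): accept URLs, such as ones containing "+",
         that request filtering would otherwise reject as doubly escaped.
         The trade-off: your application must not decode values twice or
         trust decoded input, e.g. %252e%252e must never become "..". -->
    <configuration>
      <system.webServer>
        <security>
          <requestFiltering allowDoubleEscaping="true" />
        </security>
      </system.webServer>
    </configuration>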

ASP.NET Friendly URLs

In my research, I found two ways to do this.
Both required modifications to the Application_BeginRequest procedure in Global.asax, where you run your code to do the actual URL mapping (mine was done with a database view that contained all the friendly URLs and their mapped 'real' URLs; the mapping itself is sketched after the list below). Now the trick is to get your requests to run through the .NET engine without an .aspx extension. The two ways I found are:
Run everything through the .NET engine with a wildcard application extension mapping.
Create a custom aspx error page and tell IIS to send 404's to it.
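Either way, the mapping itself lives in Application_BeginRequest. A minimal sketch, where FriendlyUrlMap.Lookup is a hypothetical stand-in for the database-view lookup:

    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Hypothetical lookup against the database view of
            // friendly-URL -> real-URL mappings.
            string realUrl = FriendlyUrlMap.Lookup(Request.Path);

            if (realUrl != null)
            {
                // Server-side rewrite: the visitor keeps seeing the friendly
                // URL while the mapped .aspx page actually executes.
                Context.RewritePath(realUrl);
            }
        }
    }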
Now here's my question:
Is there any reason one of these are better to do than the other?
When playing around on my dev server, the first thing I noticed about #1 was that it botched FrontPage extensions. Not a huge deal, but that's how I'm used to connecting to my sites. Another issue I have with #1 is that even though my hosting company is lenient with me (as I'm their biggest client) and will consider doing things such as this, they are wary of any security risks it might present.
#2 works great, but I just have this feeling it's not as efficient as #1. Am I just being delusional?
Thanks
I've used #2 in the past too.
It's more efficient because, unlike with wildcard mapping, the ASP.NET engine doesn't need to process requests for all the additional resources like image files, static HTML, CSS, JavaScript, etc.
Alternatively, if you don't mind an .aspx extension in your URLs, you could use http://myweb/app/idx.aspx/products/1 - that works fine.
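That style works because ASP.NET hands the trailing segments to the page through Request.PathInfo; a minimal code-behind sketch (class, page, and segment names are illustrative):

    using System;
    using System.Web.UI;

    // Code-behind for idx.aspx: for /app/idx.aspx/products/1,
    // Request.PathInfo is "/products/1".
    public partial class Idx : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            string[] segments = Request.PathInfo.Trim('/').Split('/');
            if (segments.Length == 2 && segments[0] == "products")
            {
                int productId = int.Parse(segments[1]);
                // ... look up and render the requested product ...
            }
        }
    }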
Having said that, the real solution is using IIS 7, where the ASP.NET runtime is a fully fledged part of the IIS HTTP module stack.
If you have the latest version of IIS there is a rewrite module for it (the URL Rewrite module). If not, there are free third-party binaries you can use with older IIS (e.g. version 6); I have used one that reads its rewrite rules from an .ini file and supports regular expressions, but I can't remember its name, sorry. I'd recommend this over cheaping out with the 404 page.
You have to map all requests through the ASP.NET engine. The way IIS decides how to process a request is by its file extension; by default it only hands the .aspx, .ashx, etc. extensions, which are meant to be processed by ASP.NET, to the engine. The reason is that mapping everything through ASP.NET adds overhead to the processing of each request.
I wrote how to do it with IIS 6 a while back, http://professionalaspnet.com/archive/2007/07/27/Configure-IIS-for-Wildcard-Extensions-in-ASP.NET.aspx.
You are right in doing your mapping from the database rather than with regex rewriting, like what is used out of the box in MVC. Regex rewriting more or less forces you to put the primary key in the URL, and it does not have a good way to map characters that are not allowed in URLs, like '.
Have you checked out the ASP.NET MVC Framework? Using that framework, all your URLs are automatically mapped to controllers, which can perform any desired action (including redirecting to other URLs or controllers). You can also set up custom routes with custom parameters. If you haven't seen it yet, it may be worth a look.

Using Url.RouteUrl to Redirect to URL with #, %, and so on

In a web application I'm working on, we'd like to be able to show information about resources at a given path. The path is entirely virtual—it only exists in the application—so we don't really have a problem with users setting virtual paths that are "weird" by normal file system standards.
The issue: we have a route that reads something similar to
/Files/{*path}
and we attempt to redirect with
Url.RouteUrl("File", new { path = somePath })
This usually works, but it fails if somePath contains & or #, among others. In those cases, I'm stuck. I can't UrlEncode(somePath) at this point, because RouteUrl does its own URL encoding, but I can't leave the characters as-is either, because then they're treated improperly (the octothorpe doesn't get passed to the routing data, and the ampersand confuses IIS). Is there a sane way around this? Or do I basically just need to implement my own routes via string interpolation?
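One workaround sketch, assuming you control both the link generation and the receiving action (the token scheme and helper names are purely illustrative): apply your own reversible escaping before calling RouteUrl and undo it in the controller, so & and # never appear literally in the path.

    using System;

    static class PathToken
    {
        // Hypothetical reversible escaping for the characters that routing
        // and IIS mishandle. "!" is escaped first (and unescaped last) so
        // the scheme stays unambiguous when round-tripping.
        public static string Escape(string path)
        {
            return path.Replace("!", "!21")
                       .Replace("&", "!26")
                       .Replace("#", "!23");
        }

        public static string Unescape(string escaped)
        {
            return escaped.Replace("!26", "&")
                          .Replace("!23", "#")
                          .Replace("!21", "!");
        }
    }

    // Generating a link:  Url.RouteUrl("File", new { path = PathToken.Escape(somePath) })
    // In the action:      string realPath = PathToken.Unescape(path);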
I've been playing around with your example, and I cannot find any problems using RouteUrl with special characters as you describe, at least not in my test environments. The RouteUrl method encodes the URLs correctly, and the controller gets the value in decoded form without any problems or deformities.
I've tested this out in IIS 7 as well as the VS 2008 development web server.
Can you provide more complete sample code?
The best answer I've found so far, though it only works on IIS7, is to follow the instructions at http://dirk.net/2008/06/09/ampersand-the-request-url-in-iis7/ to edit the registry and change some of IIS7's default behavior. This is unacceptable for us, since we're making an application that will be installed on end users' machines; and at any rate, even if we weren't, there's the simple fact that IIS6 and IIS5 don't respond to this setting. Any ideas for earlier versions of IIS, or ways to override this behavior programmatically in IIS7, would be wonderful.
