Disable GZIP compression for IE6 clients - asp.net

We need to conditionally disable GZIP compression on a few pages of a larger site when the user's browser is IE6 (compression hangs the browser for five minutes). The server is IIS7 and has compression for static content turned on - we want that compression left working when the user agent is not Mozilla/4.0. Does anyone have an ASPX code sample?
Alternatively, code to conditionally redirect to the same page on another site (we could create another virtual site with compression disabled), but it would need to pass all parameters (GET/POST).

Try intercepting the browser's request so that it stops claiming support for gzip when the request is from IE5/IE6. I believe ISAPI Rewrite is available for IIS.
Take note: this does not require you to keep separate gzipped and non-gzipped pages. It is probably a better approach than your proposal, since it cuts the problem off at its source.
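Since the question asks for a code sample, here is a minimal sketch of that interception done in managed code rather than ISAPI: an IHttpModule that strips the Accept-Encoding header for IE5/IE6 user agents, so IIS7 never compresses those responses. It assumes the IIS7 integrated pipeline (request headers are read-only in classic mode), and the class name is made up:

using System;
using System.Web;

public class DisableGzipForIe6Module : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        HttpRequest request = ((HttpApplication)sender).Request;
        string ua = request.UserAgent ?? string.Empty;

        // IE6 identifies itself as "Mozilla/4.0 (compatible; MSIE 6.0; ...)"
        if (ua.Contains("MSIE 6") || ua.Contains("MSIE 5"))
        {
            // With no Accept-Encoding header, IIS serves the response
            // uncompressed; compression stays on for every other browser.
            request.Headers.Remove("Accept-Encoding");
        }
    }

    public void Dispose() { }
}

Register the module under <system.webServer><modules> in web.config and IIS's own static/dynamic compression keeps working for everyone else.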

Related

How does HSTS preload work on the backend

I know that HSTS (http to https) will work from the very first request if my site is registered in the preload list.
On the other hand, I am also declaring preload in the HSTS header on my web server.
If I access my site for the very first time over http, which one takes effect first?
I mean, will the browser consult the preload list first, or contact the web server first?
You need to submit your site to the browsers' preload list. The vendors will then verify that you are issuing the preload header (to prevent bad actors from submitting sites whose owners don't want them on the preload list), and include it in the built-in list in a future release.
Some browsers also regularly scan or crawl websites looking for preload headers, to find sites to include. I believe this is done less often, though, and it's better to explicitly submit your site.
After the site is included in the browser's preload list, any request for the http:// version will automatically be converted to https://. This happens before you send the request, so before you ever receive an HSTS header in a response.
That's the point of preloading - to protect you before you even make a single request.
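For reference, a minimal sketch of emitting a preload-eligible header from an ASP.NET Global.asax (the preload list expects a long max-age, includeSubDomains, and the preload token; the one-year value here is just that commonly required minimum):

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Only meaningful on https responses; browsers ignore HSTS sent over http.
    if (Request.IsSecureConnection)
    {
        Response.AppendHeader("Strict-Transport-Security",
            "max-age=31536000; includeSubDomains; preload");
    }
}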
Personally, I'm not a fan of preload. Hardcoding a list of sites into a browser has obvious scaling issues but, more importantly, you're taking a risk with something you can't change back without waiting months or possibly years for browser vendors to pick up the reverted setting and remove your site from the list. I personally believe preload is overkill for most sites.

Control caching and CDN on Cloudflare when SSL is forced "on" via a page rule

This question is specifically about page rules in Cloudflare, which allow you to specify wildcard URL patterns for your site and handle each pattern differently.
One of the options is "Force SSL" - in effect, any request that matches the rule's pattern will be forced down the https:// path, whether that's Flexible SSL or otherwise.
The problem with choosing this option is that all the other options covering CDN behavior, cache time, etc. disappear.
This raises some obvious issues to which I've found no clear answer:
If Cloudflare serves a https:// resource, does it still cache static resources?
How do I control the nature of the resources cached? In other words, the settings equivalent to "Simple" caching, and "Aggressive" caching.
Is there any ability to set options such as cache expiry, time that they reside on edge servers before expiration, etc?
Is it possible to set "Cache Everything" when serving requests over https://? It certainly exists on the http:// equivalent.
I would like Cloudflare to redirect my visitors from http:// to https:// automatically, as opposed to doing it in my app, because the various apps on my domain (WordPress included) have various quirks that make configuring each one both tedious and error-prone.
You can add another rule for caching over https - the first rule would divert all http traffic to https, with another rule right after it to handle the https traffic.
"If Cloudflare serves a https:// resource, does it still cache static resources?"
Yes. It doesn't matter whether it is http:// or https://.
See: What CloudFlare caches by default
"How do I control the nature of the resources cached? In other words, the settings equivalent to "Simple" caching, and "Aggressive" caching."
By using those settings in your performance settings.
"Is it possible to set "Cache Everything" when serving requests over https://? It certainly exists on the http:// equivalent."
I would actually recommend against using "Cache Everything", really. While the option is available, you could have issues with users that have to sign in, etc.
"Is there any ability to set options such as cache expiry, time that they reside on edge servers before expiration, etc?"
You can set a browser cache TTL in your performance settings; we should also honor the Expires headers you have set on your server.
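To make sure the edge honors your origin's caching intent, here is a sketch of setting explicit cache headers from ASP.NET (the seven-day lifetime is an arbitrary example):

// Mark the response publicly cacheable with an explicit lifetime,
// so an intermediary cache such as Cloudflare's edge can honor it.
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.UtcNow.AddDays(7));
Response.Cache.SetMaxAge(TimeSpan.FromDays(7));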

Serving images from one domain for multiple websites

We have nearly 13 domains within our company and we would like to serve images from one application in order to leverage caching.
For example, we will have c1.example.com and we will put all of our product images under this application. But here I have some doubts:
1- How can I force client browsers to cache the images and not request them again?
2- When I reference those images in my application, I will use the following HTML markup:
<img src="http://c1.example.com/core/img1.png" />
But this causes a problem when I run the website under https: the browser warns about the page. I should have used https://c1.example.com/core/img1.png when running my apps under https. What should I do here? Should I always use https? Or is there a way to switch automatically?
I will run my apps under IIS 7.
Yes, you need to serve all resources over https when the HTML page is served over https. That's the whole point of using https.
If the hrefs are hardcoded in the HTML, one solution could be to use a Response Filter that parses all content sent to the client and replaces http with https when necessary. A simple regular expression should do the trick; there are plenty of articles out there about how these filters work.
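A minimal sketch of that response-filter idea, assuming the whole response is buffered before the final Flush (the hostname and regex are illustrative, matching the asker's c1.example.com):

using System.IO;
using System.Text;
using System.Text.RegularExpressions;

public class HttpsRewriteFilter : MemoryStream
{
    private readonly Stream _output;

    public HttpsRewriteFilter(Stream output)
    {
        _output = output;
    }

    public override void Flush()
    {
        // Upgrade hardcoded http:// references to the shared image host.
        string html = Encoding.UTF8.GetString(ToArray());
        html = Regex.Replace(html, @"http://c1\.example\.com",
                             "https://c1.example.com", RegexOptions.IgnoreCase);

        byte[] bytes = Encoding.UTF8.GetBytes(html);
        _output.Write(bytes, 0, bytes.Length);
        _output.Flush();
    }
}

It would be installed only for secure requests, e.g. in Global.asax:

if (Request.IsSecureConnection)
    Response.Filter = new HttpsRewriteFilter(Response.Filter);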
Regarding caching, you need to send the correct cache headers and an ETag. There are several questions and answers on this on SO, like this one: IIS7 Cache-Control.
You need to use HTTP headers to tell the browser how to cache. It should work by default (assuming you have no query string in your URLs), but if not, here's a Knowledge Base article about the Cache-Control header:
http://support.microsoft.com/kb/247404
I really don't know much about IIS, so I'm not sure if there are any other potential pitfalls. Note that browsers may still send HEAD requests sometimes.
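For IIS7 specifically, static-content cache headers are usually configured in the image application's web.config rather than in code; a sketch, with an arbitrary seven-day max-age:

<configuration>
  <system.webServer>
    <staticContent>
      <!-- Emit Cache-Control: max-age=604800 for static files -->
      <clientCache cacheControlMode="UseMaxAge"
                   cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>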
I'd recommend you set up the image server so that HTTP and HTTPS are interchangeable, then just serve HTTPS URLs in response to HTTPS requests.

Avoiding cookies while requesting static content

I just ran an audit of one of my web application's pages (built using ASP.NET and running on the development server) using Google Chrome's developer tools. One particular warning caught my eye:
Serve static content from a cookieless domain (5)!
I would like to know whether it is possible to avoid cookies for these kinds of requests. I also notice that there are no cookies on the requests for JavaScript files. Is it possible to avoid cookies in the headers for the CSS and image files as well? And why did the browser attach cookies for CSS and images but not for JavaScript files?
Cookies are "attached" to a domain and a path. If you set cookies for a path above your files, they'll be sent with any request for those files.
The warning message itself tells you how to fix this - use another domain for your static content. Or a subdomain, as long as you make sure you keep your main domain cookieless in that case.
The easiest thing to do is to follow the exact suggestion in the warning message you pasted in (serve your static assets from a completely different hostname on which you don't set cookies). But in modern browsers you now also have the option of setting the crossorigin="anonymous" attribute on the relevant elements, which will prevent cookies from being sent for the matching requests. You will need to combine this with returning an Access-Control-Allow-Origin: YOUR-ORIGIN-HERE.com header in your static asset responses.
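For illustration, the markup side of that approach might look like this (static.example.com is a placeholder for your asset host):

<!-- crossorigin="anonymous" makes the browser fetch the asset in CORS
     mode without credentials, i.e. without sending cookies -->
<link rel="stylesheet" href="https://static.example.com/site.css" crossorigin="anonymous">
<script src="https://static.example.com/app.js" crossorigin="anonymous"></script>

The asset responses then need the matching Access-Control-Allow-Origin header described above, or the browser will refuse to use them.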

IHttpModule to switch between HTTP and HTTPS in ASP.NET

I'm working on a web site which contains sections that need to be secured by SSL.
I have the site configured so that it runs fine when it's always in SSL; I see the SSL padlock in IE7/IE8/Firefox/Safari/Chrome.
To implement the SSL switching, I created a class that implements IHttpModule and wired up HttpApplication.PreRequestHandlerExecute.
I go through some custom logic to determine whether or not my request should use SSL, and then I redirect. I have to deal with two scenarios:
Currently in SSL and request doesn't require SSL
Currently not in SSL but request requires SSL
I end up doing the following (where ctx is HttpContext.Current and pathAndQuery is ctx.Request.Url.PathAndQuery):
// SSL required and current connection is not SSL
if (requestRequiresSSL && !ctx.Request.IsSecureConnection)
    ctx.Response.Redirect("https://www.myurl.com" + pathAndQuery);

// SSL not required but current connection is SSL
if (!requestRequiresSSL && ctx.Request.IsSecureConnection)
    ctx.Response.Redirect("http://www.myurl.com" + pathAndQuery);
The switching back and forth now works fine. However, when I go into SSL mode, Firefox and IE8 warn me that my request isn't entirely encrypted.
It looks like my module is short-circuiting my request somehow; I would appreciate any thoughts.
I suspect that when you determine which resources require encryption and which do not, you are not including the images, or some headers and footers, or even CSS files if you use any.
Since you always drop SSL for such content, it can happen that part of the page (the main HTML) requires SSL, but the subsequent request for an image on that page does not.
The browser is warning you that some parts of the page were not delivered using SSL.
I would check whether the request is for HTML, and only then drop SSL if needed. Otherwise, keep things the way they are (most probably images and the like are referenced with relative paths rather than a full-blown URL).
I.e., if you have:
<html>
<body>
Some content...
<img src="images/someimage.jpg">
</body>
</html>
and you request this page using SSL, but your evaluation of requestRequiresSSL does not count the images as secured resources, the image will be requested over http, not https, and you will see the warning.
Make sure, when you request a resource and evaluate requestRequiresSSL, to check the referrer and whether the request is for an image:
// SSL not required but current connection is SSL
if (!requestRequiresSSL && ctx.Request.IsSecureConnection && !isHtmlContent)
ctx.Response.Redirect("http://www.myurl.com" + pathAndQuery);
Just figure out how to determine isHtmlContent (if you serve images from a disk location rather than from a database, etc., just check the resource's file extension: .aspx, .asmx, .ashx, .html, etc.), as sketched below.
That way, if the connection is encrypted but the resource itself is not HTML and is not marked for "encryption", you will not drop the encryption.
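A hypothetical helper for that check, going by file extension as suggested (the name IsHtmlContent and the extension list are illustrative):

using System;
using System.IO;
using System.Web;

static class ContentTypeHelper
{
    // Extensions that represent "page" requests; everything else
    // (images, CSS, scripts) keeps whatever scheme the browser used.
    private static readonly string[] PageExtensions =
        { ".aspx", ".asmx", ".ashx", ".html", ".htm" };

    public static bool IsHtmlContent(HttpContext ctx)
    {
        string ext = Path.GetExtension(ctx.Request.Url.AbsolutePath);
        return Array.Exists(PageExtensions,
            e => e.Equals(ext, StringComparison.OrdinalIgnoreCase));
    }
}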
I highly recommend using this (free / open source) component to do what you're trying to do:
http://www.codeproject.com/KB/web-security/WebPageSecurity_v2.aspx
Any content that is not normally handled by .NET (such as regular HTML and most graphics files) will not invoke the HttpModule, because it doesn't go through the .NET pipeline.
Your best bet is to just handle this at the IIS level. See the following for info on how to configure your server.
http://www.jameskovacs.com/blog/HowToAutoRedirectToASSLsecuredSiteInIIS.aspx
I highly recommend this product:
http://www.e2xpert.com/web/Http-Https-Switch.aspx
It is professional and easy to use, and it comes with a powerful configuration tool with which a single click can finish the entire configuration for you.
Just use SSL throughout your site, for all pages and for all images/scripts/stylesheets. That just makes everything oh-so-simple. IE and Firefox will no longer complain, you will no longer have crazy modules trying to guess whether any given request should be redirected, etc.
For the average user it's nearly impossible to make an informed decision when the only thing Firefox vaguely tells them is, "Parts of the page you are viewing were not encrypted before being transmitted over the Internet." This is about as helpful as the "something's wrong" engine light, and in fact it tells them only after their information has been transferred.
At the least, this message should be accompanied by a list of the URLs, the type of content (images, JavaScript, CSS), and what it means to the user. BTW, I get this message when using GMail.
Until that happens, as others have stated, your code should work once you determine the unsecured elements. You can then use Firebug (http://getfirebug.com) to check the content being delivered over the connection.
