Avoiding cookies while requesting static content - asp.net

I just did an audit of one of my web application's pages (built using ASP.NET and running on the development server) with Google Chrome's developer tools. One particular warning caught my eye:
Serve static content from a cookieless domain (5)!
I would like to know whether it is possible to avoid cookies for this kind of request. I see that no cookies are sent with requests for JavaScript files. Is it possible to avoid cookies in the headers for CSS and image files as well? And why did the browser attach cookies for CSS and images, but not for JavaScript files?

Cookies are "attached" to a domain and a path. If you set cookies for a path above your files, they'll be sent with any request for those files.
The warning message itself tells you how to fix this: use another domain for your static content. A subdomain works too, as long as you make sure you keep your main domain cookieless in that case.

The easiest thing to do is to follow the exact suggestion in the warning message you pasted in (serve your static assets from a completely different hostname on which you don't set cookies). But in modern browsers you also have the option of setting the crossorigin="anonymous" attribute on the relevant elements, which prevents cookies from being sent with the matching requests. You will need to combine this with returning an Access-Control-Allow-Origin header naming your page's origin in the static asset responses.
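As a rough sketch (the hostnames here are placeholders), the markup and the matching response header would look like this:

<!-- requests made in anonymous CORS mode omit cookies and other credentials -->
<link rel="stylesheet" href="https://static.example.com/site.css" crossorigin="anonymous">
<script src="https://static.example.com/app.js" crossorigin="anonymous"></script>

and the static host has to answer with:

Access-Control-Allow-Origin: https://www.example.com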

Related

Securing HTTP referer

I develop software which stores files in directories with random names to prevent unauthorized users from downloading them.
The first thing we need is to store the files on a separate top-level domain (to prevent cookie theft).
The second danger is the HTTP Referer header, which may reveal the name of the secret directory.
My experiments with the Chrome browser show that the HTTP referer is sent only when I click a link inside my (secret) file. So the trouble is limited to files which may contain links (in Chrome, HTML and PDF). Can I rely on this behavior (the referer not being sent if the next page is opened not from a link in the current (secret) page but by some other method, such as entering the URL directly) for all browsers?
So the problem is limited to HTML and PDF files. But that alone is not a complete security solution.
I suspect that we can fully solve this problem by adding Content-Disposition: attachment when serving all our secret files. Will it prevent the HTTP referer from being sent?
Also note that I am going to use HTTPS so that a man-in-the-middle is not able to download our secret files.
You can use the Referrer-Policy header to try to control referer behaviour. Please note that this relies on clients implementing it.
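For example, on IIS7+ you can attach the header to every response via web.config (a sketch; choose the policy value that fits your case):

<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Referrer-Policy" value="no-referrer" />
    </customHeaders>
  </httpProtocol>
</system.webServer>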
Instead of trying to conceal the file location, may I suggest you implement proper authentication and authorization handling?
I agree that Referrer-Policy is your best first step, but as DaSourcerer notes, it is not universally implemented on browsers you may support.
A fully server-side solution is as follows:
User connects to .../<secret>
Server generates a one-time token and redirects to .../<token>
Server provides the document and invalidates the token
Now the referer will point to .../<token>, which is no longer valid. This has usability trade-offs, however:
Reloading the page will not work (though you may be able to address this with a cookie or session)
Users cannot share the URL from the URL bar, since it's technically invalid (in some cases that could be a minor benefit)
You may be able to get the same basic benefits without the usability trade-offs by doing the same thing with an IFRAME rather than redirecting. I'm not certain how IFRAME influences Referer.
This entire solution is basically just Referer masking done proactively. If you can rewrite the links in the document, then you could instead use Referer masking on the way out. (i.e. rewrite all the links so that they point to https://yoursite.com/redirect/....) Since you mention PDF, I'm assuming that this would be challenging (or that you otherwise do not want to rewrite the document).
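A minimal ASP.NET sketch of this scheme; TokenStore is a hypothetical one-time-token store (e.g. backed by a cache or database), not a real API:

using System.Web;

public class SecretFileHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        string name = ctx.Request.QueryString["name"];
        string path;
        if (TokenStore.IsSecretName(name)) // hypothetical: is this a secret directory name?
        {
            // steps 1-2: mint a one-time token and redirect to it,
            // so the Referer can only ever expose the (dead) token
            ctx.Response.Redirect("secret.ashx?name=" + TokenStore.Create(name));
        }
        else if (TokenStore.TryRedeem(name, out path))
        {
            // step 3: serve the document and invalidate the token
            ctx.Response.ContentType = "application/octet-stream";
            ctx.Response.TransmitFile(path);
        }
        else
        {
            ctx.Response.StatusCode = 404; // unknown or already-used token
        }
    }

    public bool IsReusable { get { return true; } }
}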

How do I make CSS and JS not submit cookies with each request?

CSS and JS files carry cookies with each request. Is there any trick to stop cookies and unneeded request and response headers from being sent with these requests? There shouldn't be any need to carry them for JS and CSS files.
I am using IIS 7 and ASP.NET 4.0.
You'll need to use a different domain; typically websites will set up something like static.yourwebsite.com.
As long as the cookie's domain is set to only the original domain, it won't be sent to static.yourwebsite.com.
static.yourwebsite.com can of course point to the same site/server, but to the browser it's a different domain.
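In ASP.NET terms, that means scoping your cookies to the www host explicitly (a sketch; names and values are placeholders):

HttpCookie auth = new HttpCookie("auth", "some-value");
// scope to the www host only; ".yourwebsite.com" would cover every subdomain
auth.Domain = "www.yourwebsite.com";
Response.Cookies.Add(auth);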

serving images from one domain for multiple websites

We have nearly 13 domains within our company and we would like to serve images from one application in order to leverage caching.
For example, we will have c1.example.com and we will put all of our product images under this application. But here I have some doubts:
1- How can I force client browsers to cache the images and not request them again?
2- When I reference those images in my application, I will use the following HTML markup:
<img src="http://c1.example.com/core/img1.png" />
But this causes a problem when I run the website under HTTPS: the browser warns about the page. It should have used https://c1.example.com/core/img1.png when my apps run under HTTPS. What should I do here? Should I always use HTTPS? Or is there a way to switch automatically?
I will run my apps under IIS 7.
Yes, you need to serve all resources over HTTPS when the HTML page is served over HTTPS. That's the whole point of using HTTPS.
If the hrefs are hardcoded in the HTML, one solution could be to use a Response Filter that parses all content sent to the client and replaces http with https where necessary. A simple regular expression should do the trick. There are plenty of articles out there about how these filters work.
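A rough sketch of such a filter (naive on purpose: it assumes a URL is never split across two Write calls, so a production version should buffer the whole response before rewriting; the host name is a placeholder):

using System.IO;
using System.Text;
using System.Text.RegularExpressions;

public class HttpsRewriteFilter : MemoryStream
{
    private readonly Stream _inner;

    public HttpsRewriteFilter(Stream inner)
    {
        _inner = inner;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        // rewrite hardcoded http:// links to the image host into https://
        string chunk = Encoding.UTF8.GetString(buffer, offset, count);
        chunk = Regex.Replace(chunk, @"http://c1\.example\.com", "https://c1.example.com");
        byte[] rewritten = Encoding.UTF8.GetBytes(chunk);
        _inner.Write(rewritten, 0, rewritten.Length);
    }

    public override void Flush()
    {
        _inner.Flush();
    }
}

It would be installed per request, e.g. in Application_BeginRequest, with Response.Filter = new HttpsRewriteFilter(Response.Filter);, and only when Request.IsSecureConnection is true.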
For caching, you need to send the correct cache headers and an ETag. There are several questions and answers on this on SO, like this one: IIS7 Cache-Control
You need to use HTTP headers to tell the browser how to cache. It should work by default (assuming you have no query string in your URLs) but if not, here's a knowledge base article about the cache-control header:
http://support.microsoft.com/kb/247404
I really don't know much about IIS, so I'm not sure if there are any other potential pitfalls. Note that browsers may still send HEAD requests sometimes.
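For reference, on IIS 7 (which the question mentions) a minimal web.config sketch that sends a far-future Cache-Control header for static files looks like this (IIS emits ETags for static content by default):

<system.webServer>
  <staticContent>
    <!-- sends Cache-Control: max-age=31536000 (one year) -->
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
  </staticContent>
</system.webServer>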
I'd recommend you set up the image server so that HTTP and HTTPS are interchangeable, then just serve HTTPS URLs in response to HTTPS requests.

How to respect "Serve static content from a cookieless domain" page speed rule in IIS6?

To create a cookieless site (or subdomain, which is a very common best practice) in IIS6/IIS7/IIS7.5 is simple: you need to tell the website not to use cookies :) Which, in IIS terms, means not to use a session.
This can be achieved in IIS6/IIS7 in two ways:
Modifying the Web.config file (my personal recommendation)
Using the IIS Manager GUI to find the setting and change it.
IMPORTANT
Before you do any testing, you absolutely must clear all cookies (or at least all cookies for the domain you are testing), otherwise they will keep getting passed along even after you have done all the steps.
1. Via Config File
You need to set the session state to off.
<system.web>
  <sessionState cookieName="What_ever" mode="Off" />
</system.web>
NOTE: Please note that the cookieless attribute (true|false) does NOT mean 'send cookies / do not send cookies'. It controls whether sessions are used with or without cookies; if set to true, a session GUID is passed in the URL instead.
2. Via GUI
Hope this helps (I assume you know how to test whether cookies are or are not being sent...).
What this means is that your content needs to come from a domain that has no cookies attached to it. StackOverflow.com is an example of a site that does this. You will notice that all SO's static content comes from a domain called sstatic.net.
http://sstatic.net/stackoverflow/all.css
http://sstatic.net/js/master.js
This is so that the client and the server don't have to waste resources on actually parsing and handling cookie data. The good news is, you can use a sub-domain, assuming that you scope your cookies correctly (set them on the www host, not on the bare domain).
Yahoo Best Practices for Speeding Up Your Web Site
Use Cookie-free Domains for Components
When the browser makes a request for a static image and sends cookies together with the request, the server doesn't have any use for those cookies. So they only create network traffic for no good reason. You should make sure static components are requested with cookie-free requests. Create a subdomain and host all your static components there. If your domain is www.example.org, you can host your static components on static.example.org. However, if you've already set cookies on the top-level domain example.org as opposed to www.example.org, then all the requests to static.example.org will include those cookies. In this case, you can buy a whole new domain, host your static components there, and keep this domain cookie-free. Yahoo! uses yimg.com, YouTube uses ytimg.com, Amazon uses images-amazon.com and so on.
Another benefit of hosting static components on a cookie-free domain is that some proxies might refuse to cache the components that are requested with cookies. On a related note, if you wonder if you should use example.org or www.example.org for your home page, consider the cookie impact. Omitting www leaves you no choice but to write cookies to *.example.org, so for performance reasons it's best to use the www subdomain and write the cookies to that subdomain.
Create a subdomain (for example, static.example.com) and store all static content (images, CSS, JS) there.

IHTTPModule to switch between HTTP and HTTPS in ASP.NET

I'm working on a web site which contains sections that need to be secured by SSL.
I have the site configured so that it runs fine when it's always in SSL; I see the SSL padlock in IE7/IE8/Firefox/Safari/Chrome.
To implement the SSL switching, I created a class that implements IHttpModule and wired up HttpApplication.PreRequestHandlerExecute.
I go through some custom logic to determine whether or not my request should use SSL, and then I redirect. I have to deal with two scenarios:
Currently in SSL and request doesn't require SSL
Currently not in SSL but request requires SSL
I end up doing the following (where ctx is HttpContext.Current and pathAndQuery is ctx.Request.Url.PathAndQuery):
// SSL required and current connection is not SSL
if (requestRequiresSSL && !ctx.Request.IsSecureConnection)
    ctx.Response.Redirect("https://www.myurl.com" + pathAndQuery);
// SSL not required but current connection is SSL
if (!requestRequiresSSL && ctx.Request.IsSecureConnection)
    ctx.Response.Redirect("http://www.myurl.com" + pathAndQuery);
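For context, the module wiring described above looks roughly like this (a sketch; the SSL-decision logic is elided):

using System;
using System.Web;

public class SslSwitchModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PreRequestHandlerExecute += OnPreRequestHandlerExecute;
    }

    private void OnPreRequestHandlerExecute(object sender, EventArgs e)
    {
        HttpContext ctx = ((HttpApplication)sender).Context;
        string pathAndQuery = ctx.Request.Url.PathAndQuery;
        bool requestRequiresSSL = RequiresSsl(ctx.Request);
        // ... the two redirects shown above go here ...
    }

    private static bool RequiresSsl(HttpRequest request)
    {
        return false; // placeholder for the custom decision logic
    }

    public void Dispose() { }
}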
The switching back and forth now works fine. However, when I go into SSL mode, Firefox and IE8 warn me that my request isn't entirely encrypted.
It looks like my module is short circuiting my request somehow, would appreciate any thoughts.
I would suspect that when you determine which resources require encryption and which do not, you do not include the images, or some headers and footers, or even the CSS files, if you use any.
Since you always drop SSL for such content, it may happen that part of the page (the main HTML) requires SSL, but the subsequent request for an image on that page does not.
The browser is warning you that some parts of the page were not delivered using SSL.
I would check whether the request is for HTML, and only then drop SSL if needed. Otherwise, keep it the way it is (most probably images and the like are referenced with relative paths rather than a full-blown URL).
I.e., if you have:
<html>
<body>
Some content...
<img src="images/someimage.jpg">
</body>
</html>
and you request this page using SSL, but your evaluation of requestRequiresSSL does not take the images into account as secured resources, the image will be fetched over http, not https, and you will see the warning.
Make sure that when you receive a request and evaluate requestRequiresSSL, you also check whether the resource is an image (or other non-HTML content):
// SSL not required but current connection is SSL
if (!requestRequiresSSL && ctx.Request.IsSecureConnection && !isHtmlContent)
    ctx.Response.Redirect("http://www.myurl.com" + pathAndQuery);
Just figure out how to determine isHtmlContent (if you serve images from a disk location rather than from a database, etc., just check the resource's file extension: .aspx, .asmx, .ashx, .html, etc.).
That way, if the connection is encrypted but the resource itself is not HTML and is not marked for "encryption", you will not drop the encryption.
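A hypothetical helper along those lines (the extension list is illustrative; adjust it to what your site actually serves):

using System.IO;
using System.Web;

static class SslHelpers
{
    public static bool IsHtmlContent(HttpRequest request)
    {
        // treat extensionless requests and page-like extensions as HTML
        string ext = Path.GetExtension(request.Url.AbsolutePath).ToLowerInvariant();
        return ext == "" || ext == ".aspx" || ext == ".asmx"
            || ext == ".ashx" || ext == ".html" || ext == ".htm";
    }
}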
I highly recommend using this (free / open source) component to do what you're trying:
http://www.codeproject.com/KB/web-security/WebPageSecurity_v2.aspx
Any content that is not normally handled by .NET (such as regular HTML and most graphics files) will not trigger the HttpModule, because it doesn't go through .NET at all.
Your best bet is to just handle this at the IIS level. See the following for info on how to configure your server.
http://www.jameskovacs.com/blog/HowToAutoRedirectToASSLsecuredSiteInIIS.aspx
I highly recommend this product:
http://www.e2xpert.com/web/Http-Https-Switch.aspx
It is professional and easy to use, and it comes with a powerful configuration tool with which a single click can finish the entire configuration for you.
Just use SSL throughout your site, for all pages and for all images/scripts/stylesheets. That just makes everything oh-so-simple. IE and Firefox will no longer complain, you will no longer have crazy modules trying to guess whether any given request should be redirected, etc.
For the average user it's nearly impossible to make an informed decision when the only thing Firefox vaguely tells them is, "Parts of the page you are viewing were not encrypted before being transmitted over the Internet." This is about as helpful as the "something's wrong" engine light, and in fact it tells them only after their information has been transferred.
At the least, this message should be accompanied by a list giving the URLs, the type of content (images, JavaScript, CSS) and what it means to the user. BTW, I get this message when using GMail.
Until that happens, as others have stated, your code should work once you determine the unsecured elements. You can use Firebug (http://getfirebug.com) to check the content being delivered over the connection.
