I'm not sure whether this is a server issue, or whether I'm failing to understand how HTTP caching really works.
I have an ASP.NET MVC application running on IIS7. A lot of the site is static content, including many CSS, JavaScript and image files.
For these files I want the browser to cache them for at least a day - our .css, .js, .gif and .png files rarely change.
My web.config goes like this:
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge"
                 cacheControlMaxAge="1.00:00:00" />
  </staticContent>
</system.webServer>
The problem I'm getting is that the browsers (tested in Chrome, IE8 and Firefox) don't seem to be caching the files as I'd expect. I'm using the default settings ("check for newer versions of stored pages: automatically" in IE).
On first visit the content downloads as expected:
HTTP/1.1 200 OK
Cache-Control: max-age=86400
Content-Type: image/gif
Last-Modified: Fri, 07 Aug 2009 09:55:15 GMT
Accept-Ranges: bytes
ETag: "3efeb2294517ca1:0"
Server: Microsoft-IIS/7.0
X-Powered-By: ASP.NET
Date: Mon, 07 Jun 2010 14:29:16 GMT
Content-Length: 918
<content>
I think that the Cache-Control: max-age=86400 should tell the browser not to request the resource again for a day.
Ok, so now the page is reloaded and the browser requests the image again. This time it gets an empty response with these headers:
HTTP/1.1 304 Not Modified
Cache-Control: max-age=86400
Last-Modified: Fri, 07 Aug 2009 09:55:15 GMT
Accept-Ranges: bytes
ETag: "3efeb2294517ca1:0"
Server: Microsoft-IIS/7.0
X-Powered-By: ASP.NET
Date: Mon, 07 Jun 2010 14:30:32 GMT
So it looks like the browser has sent the ETag back (as a unique id for the resource), and the server's come back with a 304 Not Modified - telling the browser that it can use the previously downloaded file.
It seems to me that would be correct for many caching situations, but here I don't want the extra round trip. I don't care if the image gets out of date when the file on the server changes.
There are a lot of these files (even with sprite-maps and the like) and many of our clients have very slow networks. Each round trip to ping for that 304 status takes about a tenth to a fifth of a second. Many clients are also on IE6, which allows only 2 concurrent HTTP connections. The net result is that our application appears to be very slow for these clients, with every page taking an extra couple of seconds to check that the static content hasn't changed.
What response header am I missing that would cause the browser to aggressively cache the files?
How would I set this in a .Net web.config for IIS7?
Am I misunderstanding how HTTP caching works in the first place?
You need to use the Expires directive; otherwise the browser will always check to see whether the content has been updated.
If a cached entry has a valid expiration date the browser can reuse the content without having to contact the server at all when a page or site is revisited. This greatly reduces the number of network round trips for frequently visited pages. For example, the Google logo is set to expire in 2038 and will only be downloaded on your first visit to google.com or if you have emptied your browser cache. If they ever want to change the image they can use a different image file name or path.
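As a sketch of that renaming technique (these file names are made up), the markup just points at a versioned path, and you bump the name whenever the content changes:

  <!-- hypothetical versioned names; change them when the file content changes -->
  <link rel="stylesheet" href="/css/site.v2.css" />
  <img src="/img/logo.20100607.gif" alt="logo" />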
To change this in IIS7, do the following. It's easiest to manage if you keep static content in specific directories.
Log onto the server
Open IIS Manager (Start -> Administrative Tools -> IIS Manager)
Expand the server node
Expand the sites node
Open the site and navigate to the directory you want to change
Open the IIS HTTP Response Headers section
Click "Set Common Headers" in the Actions pane on the right
Set "Expire Web Content" as your app requires.
Use the Expires header instead of Cache-Control. It tells the browser, the first time the content is served, to use the copy in its cache until the expiry date; there will be no cross-checking for changes in the file until that date.
Add the header in your web.config's system.webServer section like so:
<system.webServer>
  <staticContent>
    <clientCache httpExpires="Sun, 29 Mar 2020 00:00:00 GMT"
                 cacheControlMode="UseExpires" />
  </staticContent>
</system.webServer>
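One caveat: httpExpires is a fixed calendar date, so once that date passes the content will be treated as stale until the config is updated. cacheControlMode="UseMaxAge" (as in the question) expresses a rolling window instead.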
Short answer: remove the ETag and use the Expires header.
You should check out Yahoo's 35 performance best practices, and more specifically:
Configure ETags on your server (removing them is a good choice)
Add an Expires header for static resources
For each rule, they usually cover Apache and IIS web server configurations.
Edit: okay, it looks like there is no simple way to remove ETags in IIS, besides installing some 3rd-party software...
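That said, if installing Microsoft's free URL Rewrite extension counts, an outbound rule can blank the header. A sketch, assuming the module is installed (the rule name is mine):

<system.webServer>
  <rewrite>
    <outboundRules>
      <rule name="Remove ETag">
        <match serverVariable="RESPONSE_ETag" pattern=".+" />
        <action type="Rewrite" value="" />
      </rule>
    </outboundRules>
  </rewrite>
</system.webServer>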
I am observing that a CSS file is not getting cached in the Chrome browser. My application is built with Angular CLI, and all the required Cache-Control headers and an Expires header are set to 10 minutes (max-age=600):
Accept-Ranges:bytes
Cache-Control:max-age=600
Content-Encoding:gzip
Content-Type:text/css
Date:Wed, 13 Sep 2017 05:11:17 GMT
ETag:W/"441246-1505278984000"
Expires:Wed, 13 Sep 2017 05:21:18 GMT
Last-Modified:Wed, 13 Sep 2017 05:03:04 GMT
Server:Apache-Coyote/1.1
Transfer-Encoding:chunked
Vary:Accept-Encoding
JS files served with the same response headers are cached as expected, and the CSS file is cached in Mozilla Firefox as well.
I searched through related posts, and a few of the suggestions were:
Resources served over HTTPS with self-signed certificates are sometimes not cached by Chrome if there is any SSL error. But in my case all the other files, such as .js and .png files, come over the same channel and are cached.
Could Transfer-Encoding: chunked be causing problems with caching in Chrome? It works fine in Firefox, though.
gzip compression doesn't work well with Chrome: https://github.com/expressjs/compression/issues/64
Any pointers/suggestions?
It seems that Chrome does not cache a resource file if it has the Transfer-Encoding: chunked response header. This header was being set even though the resource file was small. I think the header is set automatically depending on the HTTP server configuration, which could be based on the size of the file, etc.
Since I do not have control over the server configuration, I ended up setting the response header Transfer-Encoding: identity. With this header the HTTP server does not modify the encoding further and includes a Content-Length header as well. With a Content-Length header in the response, Chrome has a clear signal that the resource file can be cached.
I am having the same issue on one web site, but when deploying the same web app to another web site (on the same IIS server), everything gets cached.
Recycling the application pool did not help.
Only after stopping and starting IIS, both web sites now cache CSS and JS files in Chrome. Beats me, but you could give it a try.
Not sure if this is your case but, I had the same problem and the problem disappeared when I started to use the proper SSL certificate for the domain.
I'm trying to test some caching configuration. I want my page to stay in the cache for 1 minute before a request reaches the server again.
Using this simple test.asp page that has a link to itself:
<% Option Explicit %>
<%
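' Note: Response.Expires is expressed in minutes, while max-age is in
' seconds, so both lines below say "fresh for one minute".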
Response.Expires = 1
Response.CacheControl = "private, max-age=60"
%>
<html>
<head><title>test</title></head>
<body>
<% =Now() %>
<br />
<a href="test.asp">test</a>
</body>
</html>
This works perfectly on my development computer at http://localhost/test.asp (clicking the link does not refresh the printed datetime for 1 minute).
However, it does not have the desired effect when I put the page on the production server. Only a few seconds after clicking the link I get a new datetime, meaning the request reached the web server.
I used the Chrome dev tools and saw these response headers:
HTTP/1.1 200 OK
Cache-Control: private, max-age=60
Content-Type: text/html
Expires: Tue, 12 May 2015 19:16:52 GMT
Last-Modified: Tue, 12 May 2015 19:10:00 GMT
Server: Microsoft-IIS/7.5
X-Powered-By: ASP.NET
Date: Tue, 12 May 2015 19:15:55 GMT
Content-Length: 205
Can anyone help explain why it does not work on the prod server?
Update:
I tried with Chrome, Firefox and IE, and also with 2 pages, test.asp and test2.asp, each having a link to the other, and got exactly the same problem: after 8-12 seconds the page refreshes instead of waiting 60 seconds.
To follow up on my comment, it looks like you might be looking at caching your dynamic asp pages on the server, not on the client. Caching on the client doesn't really do you much good because modern browsers / proxies will still request the item when it's an HTML document. Caching static resources that don't change, such as images, css and js, should work, and depending on the cache headers you push out, the browser will respect those.
To get your pages to cache on the server (meaning IIS doesn't have to re-generate the page), here is how you do it.
Web.config
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <caching>
      <profiles>
        <add extension=".asp" policy="CacheForTimePeriod" kernelCachePolicy="DontCache" duration="00:01:00" />
      </profiles>
    </caching>
  </system.webServer>
</configuration>
You can place the web.config in a specific directory to cache only that directory's contents. You can also vary the caching using query-string params or certain request headers, as sketched below.
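For instance, a sketch of varying the cache by query string (varyByQueryString is a documented attribute of the caching profile element; the value * varies on every parameter):

<caching>
  <profiles>
    <add extension=".asp" policy="CacheForTimePeriod" kernelCachePolicy="DontCache"
         duration="00:01:00" varyByQueryString="*" />
  </profiles>
</caching>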
When I visit chesseng.herokuapp.com I get a response header that looks like
Cache-Control:private
Connection:keep-alive
Content-Encoding:gzip
Content-Type:text/css
Date:Tue, 16 Oct 2012 06:37:53 GMT
Last-Modified:Tue, 16 Oct 2012 03:13:38 GMT
Status:200 OK
transfer-encoding:chunked
Vary:Accept-Encoding
X-Rack-Cache:miss
and then I refresh the page and get
Cache-Control:private
Connection:keep-alive
Date:Tue, 16 Oct 2012 06:20:49 GMT
Status:304 Not Modified
X-Rack-Cache:miss
so it seems like caching is working. If that works for caching, then what is the point of Expires and Cache-Control: max-age? To add to the confusion, when I test the page at https://developers.google.com/speed/pagespeed/insights/ it tells me to "Leverage browser caching".
Cache-Control: private
Indicates that all or part of the response message is intended for a single user and MUST NOT be cached by a shared cache, such as a proxy server.
From RFC2616 section 14.9.1
To answer your question about why caching is working, even though the web-server didn't include the headers:
Expires: [a date]
Cache-Control: max-age=[seconds]
The server kindly asked any intermediate proxies to not cache the contents (i.e. the item should only be cached in a private cache, i.e. only on your own local machine):
Cache-Control: private
But the server forgot to include any sort of caching hints:
they forgot to include Expires (so the browser knows to use the cached copy until that date)
they forgot to include Max-Age (so the browser knows how long the cached item is good for)
they forgot to include ETag (so the browser can do a conditional request)
But they did include a Last-Modified date in the response:
Last-Modified: Tue, 16 Oct 2012 03:13:38 GMT
Because the browser knows the date the file was modified, it can perform a conditional request. It will ask the server for the file, but instruct the server to only send the file if it has been modified since 2012/10/16 3:13:38:
GET / HTTP/1.1
If-Modified-Since: Tue, 16 Oct 2012 03:13:38 GMT
The server receives the request, realizes that the client has the most recent version already. Rather than sending the client 200 OK, followed by the contents of the page, it instead tells you that your cached version is good:
304 Not Modified
Your browser did have to suffer the round-trip delay of sending a request to the server, and waiting for the response, but it did save having to re-download the static content.
Why Max-Age? Why Expires?
Because Last-Modified sucks.
Not everything on the server has a date associated with it. If I'm building a page on the fly, there is no date associated with it - it's now. But I'm perfectly willing to let the user cache the homepage for 15 seconds:
200 OK
Cache-Control: max-age=15
If the user hammers F5, they'll keep getting the cached version for 15 seconds. If it's a corporate proxy, then all 67,198 users hitting the same page in the same 15-second window will all get the same contents - all served from the nearby cache. Performance win for everyone.
The virtue of adding Cache-Control: max-age is that the browser doesn't even have to perform a "conditional" request.
if you specified only Last-Modified, the browser has to perform an If-Modified-Since request, and watch for a 304 Not Modified response
if you specified max-age, the browser won't even have to suffer the network round-trip; the content will come right out of the caches.
The difference between "Cache-Control: max-age" and "Expires"
Expires is a legacy (c. 1998) equivalent of the modern Cache-Control: max-age header:
Expires: you specify a date (yuck)
max-age: you specify seconds (goodness)
And if both are specified, then the browser uses max-age:
200 OK
Cache-Control: max-age=60
Expires: Tue, 03 Apr 2018 19:28:37 GMT
Any web-site written after 1998 should not use Expires anymore, and instead use max-age.
What is ETag?
ETag is similar to Last-Modified, except that it doesn't have to be a date - it just has to be a something.
If I'm pulling a list of products out of a database, the server can send the last rowversion as an ETag, rather than a date:
200 OK
ETag: "247986"
My ETag can be the SHA1 hash of a static resource (e.g. image, js, css, font), or of the cached rendered page (i.e. this is what the Mozilla MDN wiki does; they hash the final markup):
200 OK
ETag: "33a64df551425fcc55e4d42a148795d9f25f89d4"
And exactly like in the case of a conditional request based on Last-Modified:
GET / HTTP/1.1
If-Modified-Since: Tue, 16 Oct 2012 03:13:38 GMT
304 Not Modified
I can perform a conditional request based on the ETag:
GET / HTTP/1.1
If-None-Match: "33a64df551425fcc55e4d42a148795d9f25f89d4"
304 Not Modified
An ETag is superior to Last-Modified because it works for things besides files, or things that have a notion of date. It just is.
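For illustration, a hash-based ETag like the SHA-1 example above could be produced like this (a C# sketch; the helper name is mine):

using System;
using System.Security.Cryptography;

static class ETagHelper
{
    // Hash the response bytes and quote the result, since ETags are quoted strings.
    public static string Compute(byte[] content)
    {
        using (var sha1 = SHA1.Create())
        {
            byte[] hash = sha1.ComputeHash(content);
            return "\"" + BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant() + "\"";
        }
    }
}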
RFC 2616, section 14.9.1:
Indicates that all or part of the response message is intended for a single user and MUST NOT be cached by a shared cache...A private (non-shared) cache MAY cache the response.
Browsers could use this information. Of course, the current "user" may mean many things: OS user, a browser user (e.g. Chrome's profiles), etc. It's not specified.
For me, a more concrete example of Cache-Control: private is that proxy servers (which typically have many users) won't cache it. It is meant for the end user, and no one else.
FYI, the RFC makes clear that this does not provide security. It is about showing the correct content, not securing content.
This usage of the word private only controls where the response may be cached, and cannot ensure the privacy of the message content.
The Expires entity-header field gives the date/time after which the response is considered stale. The Cache-Control: max-age directive gives the age, in seconds, beyond which the response is considered stale.
Although these header fields give the client a mechanism to decide whether to send a request to the server, there are situations where the client sends a request simply because the cached response's age exceeds the max-age value, even though the resource may never have changed. Does that mean the server must send the whole resource again?
To solve this problem, HTTP/1.1 provides the Last-Modified header. The server gives the client the last-modified date of the response. When the client needs the resource again, it sends an If-Modified-Since header field to the server. If this date is earlier than the resource's modification date, the server sends the resource to the client with a 200 code. Otherwise it returns a 304 code, which means the client can use the copy it has cached.
I have the following action method:
[HttpGet, Authorize, OutputCache(Duration = 60, VaryByHeader = "Cookie", Location = OutputCacheLocation.Any)]
public ActionResult Index()
But when I make a request, these are the headers issued:
Cache-Control: private, max-age=60, s-maxage=0
Content-Type: text/html; charset=utf-8
Content-Encoding: gzip
Expires: Fri, 22 Jun 2012 09:56:32 GMT
Last-Modified: Fri, 22 Jun 2012 09:55:32 GMT
Vary: Accept-Encoding
Why isn't it including the Cookie header in the Vary?
I've tried many variations of the OutputCache settings but to no avail :(
I hate IIS almost as much as I hate ASP.NET.
http://blogs.msdn.com/b/chaun/archive/2009/10/01/iis-compression-overwrites-the-vary-header-average-rating-0-ratings.aspx
So the answer is in the URL linked by Andrew Bullock; its page title says it all: "IIS compression overwrites the Vary header".
But since December 2013 a hotfix is available (found via Andrew Bullock's link). As usual, this hotfix may be included in regular patches, so make sure you still need it before applying it.
And as it is "best practice" to explicitly supply information from links rather than only supplying the link (in case the page goes down), here is an abstract:
IIS dynamic compression overwrites the Vary header. This seems to apply to IIS from version 5 through version 8 (combining information from the MSDN blog and the MSDN KB). Either disable IIS dynamic compression, or try applying the hotfix available at http://support.microsoft.com/kb/2877816/en-us if needed. (This issue may be fixed by regular patches on the Windows 8.1/2012 R2 series. The hotfix is available starting from IIS 7; personally tested on Win7 SP1, not yet on my servers.)
In my specific case, I also had another issue: asp.net was not emitting any Vary header (compression enabled or not) with the outputCache location set to Client. Setting it to Downstream "solved" this additional issue. But of course, this has the side-effect of changing Cache-Control from private to public. (Now cacheable on proxies instead of just in the client browser.)
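For illustration, the Downstream workaround described above would look something like this on the action from the question (a sketch; note the side-effect that the response becomes publicly cacheable, which is risky combined with [Authorize]):

[HttpGet, Authorize, OutputCache(Duration = 60, VaryByHeader = "Cookie", Location = OutputCacheLocation.Downstream)]
public ActionResult Index()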
This is just making me angry. I can't figure out why all the resources in my page are being requested EVERY single time.
E.g. my site.css returns the following headers (using fiddler):
HTTP/1.1 200 OK
Server: ASP.NET Development Server/9.0.0.0
Date: Mon, 29 Nov 2010 17:36:21 GMT
X-AspNet-Version: 2.0.50727
Content-Length: 9093
Cache-Control: public, max-age=2592000
Expires: Wed, 29 Dec 2010 17:36:21 GMT
Last-Modified: Mon, 08 Nov 2010 17:20:16 GMT
Content-Type: text/css
Connection: Close
But every time I hit refresh I see all the resources (css, js, images) getting re-requested. I have control over the headers returned for any and all of these resources, but I haven't figured it out yet.
I have even debugged my ASP.NET app and the HttpModule is definitely being asked for the resources again.
Can someone give me an idea of what to do?
EDIT:
Ok, I removed must-revalidate, proxy-revalidate from the headers and that is getting me closer to where I want to be; now it still requests my css/js files when I press back.
Is there anything more I can do to avoid this?
The following links might be of help to you.
Differences in reload behavior between FF and IE
http://blog.httpwatch.com/2008/10/15/two-important-differences-between-firefox-and-ie-caching/
In a nutshell, your caching behavior is going to be determined by the headers and the browser you are using.
What browser are you using for testing? The back button is also handled differently.
Back Button (Browser Behavior)
And, finally, a breakdown of f5/ctrl f5, click, shift click, etc behavior between browsers:
What requests do browsers' "F5" and "Ctrl + F5" refreshes generate?
If you are handling the requests in your own module - which seems to be the case - and the request contains an If-Modified-Since header, you can use that to determine whether to respond with a 200 and send the whole resource again, or just send a 304 and skip sending the js/css/etc. contents (see the sketch below).
Other than that, expect browsers to re-query resources on hitting F5 / Refresh. It is just that you may skip sending the whole js/css/etc and return a 304 if everything is OK.
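As a rough sketch of that If-Modified-Since check (assuming a custom IHttpModule is what serves these files; the class name and wiring here are mine, not your actual module):

using System;
using System.IO;
using System.Web;

public class ConditionalGetModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            string path = ctx.Request.PhysicalPath;
            if (!File.Exists(path))
                return;

            string ims = ctx.Request.Headers["If-Modified-Since"];
            DateTime since;
            if (ims != null && DateTime.TryParse(ims, out since))
            {
                // HTTP dates only have one-second resolution, so allow a
                // one-second tolerance when comparing file times.
                DateTime lastWrite = File.GetLastWriteTimeUtc(path);
                if (lastWrite.AddSeconds(-1) <= since.ToUniversalTime())
                {
                    ctx.Response.StatusCode = 304;              // Not Modified
                    ctx.Response.SuppressContent = true;        // headers only, no body
                    ctx.ApplicationInstance.CompleteRequest();  // skip the rest of the pipeline
                }
            }
        };
    }

    public void Dispose() { }
}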
Other than that, @Chris's answer covers pretty much everything else.
What are the request headers you see when you hit back?