I am trying to improve the performance of my website by adding cache headers for static content.
So far, I can get the content to cache in Chrome and Internet Explorer, but not Firefox.
Here are the caching-related headers I'm supplying:
Cache-Control:private, max-age=1800
ETag:"809067e0179acb1:0"
Expires:Mon, 20 Dec 2010 21:35:10 GMT
(NOTE: ETag and Expires are variable; Expires is 30 minutes in the future)
I verified the behavior using Fiddler 2. Chrome and IE7 do not request the images, CSS, and JS after the first request, while Firefox requests them every time.
Is there any header I should supply to make Firefox cache these?
UPDATE 2010.12.22
I noticed the same behavior on most websites, including www.yahoo.com. Is there a way to force Firefox to cache?
This used to happen with SSL content, but it should no longer be the case: Firefox changed the way it caches with the resolution of Gecko bug 531801.
Now SSL content is cached to disk regardless of the Cache-Control header.
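Before that fix, Firefox would only write HTTPS responses to its disk cache when they were explicitly marked as cacheable, so the usual workaround was to mark the response public instead of private (an illustration based on the headers above, not a captured trace):
Cache-Control: public, max-age=1800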
Related
At the moment I have two sites: A, which serves the content and has the CORS header enabled, and B, in which I want to embed that content using AJAX Include Script.
Everything works great when the page is not compressed. When W3 Total Cache is enabled, I get XMLHttpRequest Exception 101.
Weird behavior:
When I navigate to the page where the content should be, then purge site A's page cache and refresh site B, everything loads fine. When I clear the browser cache and refresh, I get XMLHttpRequest Exception 101 again. It's the same in Chrome, Firefox and Safari (both desktop and mobile).
What's going wrong when compression is enabled?
P.S. I've tried setting up CORS via both PHP and Apache. It makes no difference.
Could it be that when you turn compression on, the response has the Content-Encoding header in it? This header would need to be added to the CORS Access-Control-Allow-Headers response header.
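If the header is managed in Apache, which the question says was one of the approaches tried, the suggestion above might look like this with mod_headers (a sketch only; the origin value is a placeholder, and whether Content-Encoding really needs to be listed is exactly what this answer is guessing at):
Header set Access-Control-Allow-Origin "http://site-b.example"
Header set Access-Control-Allow-Headers "Content-Encoding"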
OK, so I get the rule that browsers should not confuse the history store with the web cache: clicking Back should not send a request to the server. I also get that browser manufacturers have the poetic licence to break this rule.
What I don't get is the following (please stay with me here)...
OK: Browsing our web site over HTTP, the history buttons did not send requests to the server. Great! Behaviour as expected.
NOK: Browsing history on the same site over HTTPS, Chrome fared well, but IE9/10 and Firefox did not: they would send the request for the HTML page to the server and then correctly use the store for the static files. Why the difference?
After a little head scratching and testing, I found that the presence of the Pragma: no-cache header in the responses we were sending was responsible for this behaviour. After removing the header, which should never have been there in the first place, IE and Firefox fared well when using the history buttons over HTTPS: no more requests sent.
Now, how can the presence of a header that modern browsers should ignore, and that is only meant for requests, cause this strange issue in browser history navigation?
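In case it helps anyone who finds the same unwanted header in their responses: on Apache, assuming mod_headers is available (an assumption; the question does not say what was injecting the header), it can be stripped with a single directive:
Header unset Pragma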
I have a web application that contains a few hundred small images, and is performing quite badly on load.
To combat this, I would like to cache static files in the browser.
Using a servlet filter on Tomcat 7, I now set the Expires header correctly on static files, and I can see that it is returned to Chrome:
Accept-Ranges:bytes
Cache-Control:max-age=3600
Content-Length:40284
Content-Type:text/css
Date:Sat, 14 Apr 2012 09:37:04 GMT
ETag:W/"40284-1333964814000"
Expires:Sat, 14 Apr 2012 10:37:05 GMT
Last-Modified:Mon, 09 Apr 2012 09:46:54 GMT
Server:Apache-Coyote/1.1
However, I notice that Chrome still does a round trip to the server for each static resource on reloads, sending an If-Modified-Since header and getting a correct 304 Not Modified response from Tomcat.
Is there any way to make Chrome avoid these 100+ requests to the server until the expiry has genuinely passed?
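For reference, the question doesn't include the filter itself; a minimal sketch of that kind of filter on Servlet 3.0/Tomcat 7 might look like the following (the class name and URL pattern are made up for illustration):
import java.io.IOException;
import javax.servlet.*;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletResponse;

// Hypothetical filter: stamps one hour of freshness on everything under /static/*
@WebFilter("/static/*")
public class StaticCacheFilter implements Filter {
    private static final long MAX_AGE_SECONDS = 3600;

    @Override
    public void init(FilterConfig config) {}

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse response = (HttpServletResponse) res;
        // Matches the Cache-Control/Expires pair shown in the question's headers.
        response.setHeader("Cache-Control", "max-age=" + MAX_AGE_SECONDS);
        response.setDateHeader("Expires", System.currentTimeMillis() + MAX_AGE_SECONDS * 1000);
        chain.doFilter(req, res);
    }

    @Override
    public void destroy() {}
}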
There are three ways of loading a page:
1. Entering the URL in the address bar and pressing Enter, which is equivalent to navigating from a hyperlink (the default browsing behaviour). This honours the Expires header: the browser first checks whether its cached copy of the static content is still valid, and if the Expires time is in the future it loads the content directly from the cache, making no request to the server at all. If the cached content is invalid, it makes a request to the server.
2. Pressing F5 to refresh the page. This sends a conditional request (with If-Modified-Since / If-None-Match) to the server to verify whether the content has changed. If it has changed you get a 200 response; if not, a 304. In both cases the image is not shown on the page until a response is received from the server.
3. Pressing Ctrl+F5, which forcefully clears the cache and reloads all the images, without spending any time verifying whether they have changed via conditional headers.
I guess the behaviour you are expecting is the first kind, so the only thing you should be looking at is the way you are loading the page. Normally people are not going to press F5 or Ctrl+F5, so your static content will not be re-validated on every visit.
In short, just remember: reload the page by pressing Enter in the address bar and the browser will honour the headers you have provided. This is not specific to Chrome; it is standard HTTP caching behaviour.
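To make the three cases concrete, here is the kind of request each one typically produces (illustrative values, not traces captured from the question's site):
Address bar + Enter, cache still fresh: no request is sent at all
F5: Cache-Control: max-age=0 plus If-Modified-Since/If-None-Match validators, typically answered with 304 Not Modified
Ctrl+F5: Cache-Control: no-cache with no validators, answered with a full 200 OK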
Be careful when you are testing. I noticed that in Chrome version 20, if I hit F5 to reload the page, I see new requests in the network panel.
However, if I place the cursor in the address bar after the current page URL and hit Enter, the resources whose headers were set to cache are served from the cache.
Also a good read:
http://betterexplained.com/articles/how-to-optimize-your-site-with-http-caching/
Assuming you have ruled out the various gotchas that have already been suggested, I found that Google Chrome can ignore the Cache-Control directive unless it includes public, and that public has to come first. For example:
Cache-Control: public, max-age=3600
In my experiments I also removed ETags from the server response, so that could be a factor, but I didn't go back and check.
Background story:
I have a web portal in .NET 3.5 on an IIS 6 web server. Currently there is a page that is given a value and, based on that value, looks up a PDF file on a web service and displays the result to the user in another browser tab. This is done with the following code.
context.Response.ClearContent();
context.Response.ClearHeaders();
context.Response.Clear();
context.Response.AddHeader("Accept-Header", pdfStream.Length.ToString());
context.Response.ContentType = "application/pdf";
context.Response.BinaryWrite(pdfStream.ToArray());
context.Response.Flush();
This works and has worked for years. However, we recently got a report that one particular client was being served the same PDF every time until they cleared their temporary internet files.
I thought, oh cool, this is an easy one: I will just add cache headers to the response so it is never cached. So I added the following:
context.Response.Cache.SetCacheability(HttpCacheability.NoCache);//IE set to not cache
context.Response.Cache.SetNoStore();//Firefox/Chrome not to cache
context.Response.Cache.SetExpires(DateTime.UtcNow); //for safe measure expire it immediately
After a quick test I got exactly what I was expecting in the response headers:
Cache-Control: no-cache, no-store
Pragma: no-cache
Expires: -1
The Problem:
So this went live. Everything seemed cool on day one. The day after, bam: everyone started getting white screens and no PDF displayed. After further investigation, I found it affected only IE 6, 7 and 8; Chrome was fine, Firefox fine, Safari fine, even IE9 fine. Without knowing why this happened, I reverted my change and deployed it, and everything started working again.
I have searched all over trying to find out why my caching headers seem to confuse IE 6-8, to no avail. Has anyone experienced this type of issue with IE 6-8? Is there something I am missing? Thanks for any insight.
I found the solution. Here is what tipped me off. Here is a link
Basically, IE8 and lower have issues with the Cache-Control header when it contains no-cache or no-store. I was able to work around the problem by allowing private caching only, with a max-age so short that the response expires almost immediately.
//IE 8 and lower have an issue with the "Cache-Control: no-cache" and "Cache-Control: no-store" headers.
//The workaround is to allow private caching only, but expire it immediately.
if ((Request.Browser.Browser.ToLower() == "ie") && (Request.Browser.MajorVersion < 9))
{
context.Response.Cache.SetCacheability(HttpCacheability.Private);
context.Response.Cache.SetMaxAge(TimeSpan.FromMilliseconds(1));
}
else
{
context.Response.Cache.SetCacheability(HttpCacheability.NoCache);//IE set to not cache
context.Response.Cache.SetNoStore();//Firefox/Chrome not to cache
context.Response.Cache.SetExpires(DateTime.UtcNow); //for safe measure expire it immediately
}
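For what it's worth, the workaround branch above should produce a Cache-Control header along these lines (an expectation rather than a captured trace; ASP.NET renders the one-millisecond max-age as zero whole seconds):
Cache-Control: private, max-age=0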
Hi, I'm using @font-face on a website and I'm experiencing delayed rendering of the text, presumably due to the font being loaded on every page. I understand the client has to download the font once to display it properly, but on every page?
Is there a way I can force the browser to cache this file? Or is there another alternative to speed up the font's loading time? (Is this question more appropriate for Server Fault?)
Thanks in advance. Worst case, I'll live with the delay, so I don't need any "take off @font-face" answers... ;)
Additional Information:
I've tested this in both Safari (4) and Firefox (3.5 RC1), on both Mac and Windows (XP and 7)
All the browsers I've tested are set up to allow caching (it's on by default)
The URL is not dynamic; it's simply "/fonts/font.otf"
The font URL is correct, as the page loads the font and displays it correctly, albeit more slowly than normal
Update:
It looks like the request header information is the culprit. How would I change the max-age property of the request header? TIA.
Request headers:
Cache-Control:max-age=0
If-Modified-Since:Wed, 24 Jun 2009 03:46:28 GMT
If-None-Match:W/"484d9f2-a5ac-46d10ff2ebcc0"
Referer:http://testurl.com/
User-Agent:Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6; en-us) AppleWebKit/530.13 (KHTML, like Gecko) Version/4.0 Safari/530.15
Response headers:
Connection:Keep-Alive
Date:Thu, 25 Jun 2009 02:21:31 GMT
Etag:"484d9f2-a5ac-46d10ff2ebcc0"
Keep-Alive:timeout=10, max=29
Server:Apache/2.2.11 (Unix) mod_ssl/2.2.11 OpenSSL/0.9.8i DAV/2 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
You can never force a browser to cache something, only encourage it. I can think of no reason why a font file with the correct expires headers wouldn't be cached, which brings us to:
It's a browser bug (you don't say which browser)
Your cache control headers are missing or wrong
Your browser is configured not to cache anything (do images cache?)
Your font URL is dynamic, so the browser thinks each request is for a different resource
The font file is actually missing, or the URL is misspelt
The delay is NOT caused by the font download (you did say you presume this is the issue)
I think more information is in order.
EDIT: Setting cache control is a server- and language-specific thing; look at mod_expires for information on caching in Apache.
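For example, a minimal mod_expires setup for the font in this question might look like the following (a sketch assuming an Apache server like the one in the response headers above; the one-month lifetime is an arbitrary choice, and matching by file extension sidesteps whatever MIME type the server sends for .otf):
<IfModule mod_expires.c>
    ExpiresActive On
    <FilesMatch "\.otf$">
        ExpiresDefault "access plus 1 month"
    </FilesMatch>
</IfModule>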
Are you sure your font files are cacheable? Just like other static content, they should have far-future expiry dates, and their headers should be configured to allow them to be cached. If you are hosting your fonts on a server farm, you will want to make sure your ETag header is normalized across all the servers in the farm; otherwise subsequent requests for the font may force it to be re-downloaded from an alternative server even though the same data was already downloaded from another server.
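On Apache, one common way to do that normalization (an Apache-specific sketch, not something this answer specifies) is to drop the inode component, which differs from machine to machine, from generated ETags:
FileETag MTime Size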