I just read that files that have an Expires header should not be requested again until they expire.
While testing some caching behaviour, I wondered why files still show a size and consume time under "Content Download" in the Chrome dev tools, even though they have max-age set and should be loaded from cache without any request being sent.
Any explanations?
Quoting from this answer:
Chrome appears to be ignoring your Cache-Control settings if you're reloading in the same tab. If you copy the URL to a new tab and load it there, Chrome will respect the cache control tags and reuse the contents from the cache.
I have static content (icons, etc.) served via ASP.NET.
Every response gets caching added to it, like this:
Response.Cache.SetExpires(Now.AddMinutes(30))
Response.Cache.SetValidUntilExpires(True)
When I browse from my office, everything is fine.
When one of the users browses from home, the icons are not cached, which makes browsing very slow.
I have a log that shows the incoming requests, and the requests from this user have this header
"Cache-Control":"no-cache, no-store"
I don't know if that's the issue, and if so, how to solve it. Or could something else be wrong?
Also, after setting the cache expiration, it seems that the Response.Headers are not affected. I don't see the caching info in the headers.
This is the header string. Not a word about caching.
{Server=Microsoft-IIS%2f10.0&HitID=9&X-AspNetMvc-Version=5.2}
Why are my Cache settings being ignored?
Please check your IIS cache settings. The static file settings may not be related to the Response.Cache.SetExpires() method.
You can also set Cache-Control in IIS. For how to set Cache-Control, you can refer to this link.
Cache-Control
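For illustration, a rough sketch of that IIS setting in web.config could look like this (the two-day max-age is just an example value, not taken from your setup):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- UseMaxAge makes IIS emit Cache-Control: max-age=... on static files -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="2.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>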
I know this question has been asked several times, but I am still not clear about the concept. After reading many blogs and answers on SO, this is what I got:
Expiry headers are used when you don't even want the client (and proxies/caches) to make a request to the server. With an ETag, the client will check with the server for an update; but with expiry headers, the client knows when the file expires and when to check for an update, and until then it (browsers and proxies/caches) won't bother the server about updates.
So basically it says that if we use an Expires/max-age header, the browser will not even check with the server for an updated file. So I decided to test it locally.
I created a simple HTML file that includes 2 JS files and 1 image file. In IIS, I set the Expires header to 2 days for the image folder. As per my understanding, after the image file has been fetched from the server once, the next request should not go to the server to check whether the image has been modified.
But what I see is that each time I refresh the page, a request is sent to the server and the server returns a 304 Not Modified status. As per the specs/blogs I read, no request should be sent to the server at all.
Someone please explain.
From what you have described,
it is clear that the ETag works as expected: the server responds with 304 Not Modified to a request carrying an If-None-Match field with the ETag value.
So the browser will load the image from cache instead of downloading a new image from the server, saving bandwidth and time.
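The exchange looks roughly like this (URL and ETag value are illustrative):

GET /images/photo.png HTTP/1.1
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"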
It seems that caching is disabled in your browser. That's why a new request is sent before the cache expiration; otherwise a request wouldn't have been sent in the first place.
Here is a wonderful article that explains how to detect programmatically whether caching is disabled in the browser.
Here is another wonderful article that explains caching and ETag in depth.
Note:
Generally speaking, if you are using multiple servers behind a load balancer to host your website, then a simple ETag configuration is likely to cost bandwidth rather than save it: by default each server generates its own ETag (often from server-specific data), so a cached ETag validated against a different server will not match, the validation check will always say the browser cache is invalid, and the ETag serves no purpose.
The important part is what you said: I refresh the page. In this case the browser is trying to provide you with fresh content, so it has no option but to contact the server and revalidate all resources. (There is a Cache-Control extension, immutable, which prevents this behaviour, but it is not widely used or implemented.)
If you want to see your browser respect the cache, you have to use a "standard page entry" instead of a reload: either follow a link to the page, or open another tab, type the page URL into the URL bar, and press Enter.
Caches respect the expiration time, so if the document is not stale it is returned from the cache. If it has expired, the ETag is used to validate the resource (and after validation it is possible that the resource is still unmodified, giving a 304 response).
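For completeness, the immutable extension mentioned above looks like this on a long-lived asset (the one-year max-age is just an example):

Cache-Control: public, max-age=31536000, immutable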
At the moment I have 2 sites: A, which has the content and the CORS header enabled, and B, in which I want to embed that content using AJAX Include Script.
Everything works great when the page is not compressed. When W3 Total Cache is enabled, I get XMLHttpRequest Exception 101.
Weird behavior:
When I navigate to the page where the content should be, then purge site A's page cache and refresh site B, everything loads up fine. When I clear the browser cache and refresh: XMLHttpRequest Exception 101 again. It's the same for Chrome, Firefox and Safari (both desktop and mobile).
What's going wrong when compression is enabled?
P.S. I've tried setting up CORS via both PHP and Apache. It makes no difference.
Could it be that when you turn compression on, the response has the Content-Encoding header in it? This header would need to be added to the CORS Access-Control-Allow-Headers response header.
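If that is the cause, the response would need headers along these lines (the origin value is illustrative):

Access-Control-Allow-Origin: http://site-b.example
Access-Control-Allow-Headers: Content-Encoding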
I have a web application that contains a few hundred small images, and is performing quite badly on load.
To combat this, I would like to cache static files in the browser.
Using a servlet filter on Tomcat 7, I now set the expires header correctly on static files, and can see that this is returned to Chrome:
Accept-Ranges:bytes
Cache-Control:max-age=3600
Content-Length:40284
Content-Type:text/css
Date:Sat, 14 Apr 2012 09:37:04 GMT
ETag:W/"40284-1333964814000"
Expires:Sat, 14 Apr 2012 10:37:05 GMT
Last-Modified:Mon, 09 Apr 2012 09:46:54 GMT
Server:Apache-Coyote/1.1
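For context, a servlet filter along these lines would produce headers like those above; this is a minimal sketch (class name and max-age value are illustrative, not my exact filter, and it still needs a web.xml mapping to the static paths):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// Minimal sketch: stamps Cache-Control and Expires on every response it filters.
public class StaticCacheFilter implements Filter {
    private static final long MAX_AGE_SECONDS = 3600; // one hour, matching max-age=3600 above

    public void init(FilterConfig config) {
    }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse response = (HttpServletResponse) res;
        response.setHeader("Cache-Control", "max-age=" + MAX_AGE_SECONDS);
        // Expires is an absolute date; setDateHeader formats the timestamp as an HTTP-date.
        response.setDateHeader("Expires", System.currentTimeMillis() + MAX_AGE_SECONDS * 1000);
        chain.doFilter(req, res);
    }

    public void destroy() {
    }
}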
However, I notice that Chrome is still doing a round trip to the server for each static resource on reloads, sending an If-Modified-Since header and getting a correct 304 Not Modified response from Tomcat.
Is there any way to make Chrome avoid these 100+ requests to the server until the expiry has genuinely passed?
There are 3 ways of loading a page -
1. Putting the URL in the address bar and pressing Enter, which is equivalent to navigating from a hyperlink (the default browsing behaviour). This honours the Expires headers: the browser first checks whether the cached static content is still valid, and if the Expires time is in the future it loads the content directly from the cache. In this case the browser makes no request at all to the server. If the cached content is invalid, it makes a request to the server.
2. Pressing F5 to refresh the page. This sends a conditional (If-Modified-Since) request to the server to verify whether the content has changed, as sketched below. If it has changed you get a 200 response; if not, a 304 response. In both cases the image is not loaded on the page until a response is received from the server.
3. Pressing Ctrl+F5, which forcefully clears the cache and reloads all the images. It does not spend time verifying via headers whether the images have changed; every static item on the page is fetched again.
I guess the behaviour you are expecting is the first kind. The only thing you should be looking at is the way you are loading the page. Normally people are not going to press F5 or Ctrl+F5, so your static content will not be re-validated every time.
In short, just remember: load the page by pressing Enter in the address bar instead. The browser will honour the headers that you have provided. This is not specific to Chrome; it is standard HTTP caching behaviour.
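In header terms, the F5 case (point 2 above) is a conditional exchange along these lines (URL and date are illustrative):

GET /static/icon.png HTTP/1.1
If-Modified-Since: Mon, 09 Apr 2012 09:46:54 GMT

HTTP/1.1 304 Not Modified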
Be careful when you are testing. I noticed that in Chrome version 20, if I hit F5 to reload the page, I see new requests in the network panel.
However, if I place the cursor in the address bar after the current page URL and hit Enter, the resources whose headers were set to cache are served from the cache.
Also a good read:
http://betterexplained.com/articles/how-to-optimize-your-site-with-http-caching/
Assuming you have ruled out the various gotchas that have already been suggested, I found that Google Chrome can ignore the Cache-Control directive unless it includes public, and public has to come first. For example:
Cache-Control: public, max-age=3600
In my experiments I also removed ETags from the server response, so that could be a factor, but I didn't go back and check.
I send back an image with the following HTTP response header:
Cache-Control: private,max-age=86400
My understanding is that the browser should not even ask for this file for 24 hours (86,400 = 60s * 60m * 24h).
What I'm seeing on subsequent requests is that it still asks for the file, but gets back a "304 Not Modified." This is good, but I want to remove even that request/response.
What header is required to prevent the browser from even bothering to ask for the file, and just have it blindly use the file it has in local cache, until that file expires?
It all really depends on how you're testing this. On Firefox 3.6 and IE8, clicking a link and then a link that moves you back to the first page will use the cache correctly with max-age. Hitting the Return key in the URL field shows the same behavior.
However, hitting F5 will ask again for the file but allows 304 responses.
Hitting Ctrl+F5 will always ask again for the file, with Cache-Control and Pragma set to no-cache, forcing a 200 response.
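For reference, that forced reload sends request headers like:

Cache-Control: no-cache
Pragma: no-cache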
This simply can't be done reliably in HTML < 5.
You could use client-side storage in HTML5, or a browser extension such as Gears, to provide this functionality.