Cache-Control not working when hitting refresh in the browser

I'm trying to implement cache control in my application. I've set up a Tomcat filter for all fonts that sets max-age=120.
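The filter is essentially just this (a sketch; the class name is illustrative, and it is mapped to the font URLs in web.xml):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class FontCacheFilter implements Filter {

    @Override
    public void init(FilterConfig config) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        // Allow the browser to reuse the font for two minutes.
        ((HttpServletResponse) res).setHeader("Cache-Control", "max-age=120");
        chain.doFilter(req, res);
    }

    @Override
    public void destroy() { }
}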
When I request a font for the first time with the cache cleared, the request/response looks like the following:
and as you can see I have the max-age in the response. Now I would expect that if I hit refresh the browser won't send the HTTP request again, but instead this is what happens:
As you can see the second request has a
cache-control: max-age=0
value and the response is returned from the server cache. What I'm trying to achieve is to stop the browser from making the call at all.
Am I doing something wrong?
Thanks

Hitting refresh has semantics that are dependent upon the browser you're using, but often it will make a conditional request to make sure the user is seeing a fresh response (because they wanted to refresh).
If you want to check cache operation, try navigating to the page, rather than hitting refresh.
OTOH if you don't want refresh to behave like this -- and you really mean it -- Mozilla is prototyping Cache-Control: immutable to do this (but it's early days, and it's Firefox-only for the moment).
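For example, combined with the max-age above, the response would carry:

Cache-Control: max-age=120, immutable

With that, a refresh reuses the stored response instead of revalidating it, in browsers that support the extension.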

Related

Client with ETag always performs conditional GET

I am working on a new API. With ETags enabled, Chrome always conditionally GETs a resource with a long cache lifetime, that is, Chrome no longer uses the cached version without first confirming the ETag.
Is this how ETags are supposed to work with GET?
If so, and considering that I don't need them for update race-checking, I will remove the ETags and alleviate the extra work my server is doing.
Edit
I would have expected Chrome to use the cached item according to the max-age until it expires and then use the conditional GET to update its cached-item.
As I write this, it has now occurred to me that it would not be able to 'update its cached-item' since the 304 contains no max-age to use to extend the expiry time.
So I guess with ETags enabled, a client will always conditionally GET unless there's a good reason to use the cached version, such as no network.
Thus, it appears that ETags harm server performance when no concurrency control is needed.
Edit
I think I've already guessed the answer (above) but here's the question in a nutshell:
When ETags are enabled and the Max-Age is far in the future, should User Agents always use conditional GET instead of using the previously cached, fresh response?
How you load your page in the browser might be relevant here:
- If you press Ctrl and reload the page using the refresh button, that will cause an unconditional reload of your resources, with 200s returned.
- If you just reload using the refresh button (or an equivalent key like F5), conditional requests will be sent and 304s will be returned for static resources.
- If you press Enter in the address box, add the page to your bookmarks and load it from there, or visit the page from a hyperlink, the cache max-age should work as expected.
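To make the conditional case concrete, a refresh-triggered revalidation looks roughly like this on the wire (the ETag value is invented):

GET /static/app.css HTTP/1.1
Host: example.com
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"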
With ETags enabled, Chrome always conditionally GETs a resource with a long cache lifetime, [...] Is this how ETags are supposed to work with GET?
The answer is in the RFC:
10.3.5 304 Not Modified
If the client has performed a conditional GET request and access is
allowed, but the document has not been modified, the server SHOULD
respond with this status code.
So, yes.
If this is not the answer you expect, you might want to include some actual traffic in your question that shows the order of request-response pairs that you expect and that you actually get.
considering that I don't need them for update race-checking
If you're not going to use conditional requests (If-Match and the like), you may indeed omit the ETag calculation and processing.
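For reference, the work you would be dropping is roughly the following per-request validation (a sketch; computeEtag stands in for whatever hashing or versioning you actually do):

import java.util.Arrays;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

class EtagSupport {

    // Any stable hash of the representation works; this computation is
    // part of the per-request cost that disappears without ETags.
    static String computeEtag(byte[] body) {
        return "\"" + Integer.toHexString(Arrays.hashCode(body)) + "\"";
    }

    // Returns true if the caller should write the body, false if a 304 was sent.
    static boolean validate(HttpServletRequest request, HttpServletResponse response,
                            byte[] body) {
        String etag = computeEtag(body);
        if (etag.equals(request.getHeader("If-None-Match"))) {
            response.setStatus(HttpServletResponse.SC_NOT_MODIFIED); // 304, empty body
            return false;
        }
        response.setHeader("ETag", etag);
        return true;
    }
}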

In what scenario could an AJAX request not have the cookies set by the page which fired the AJAX?

Some small percentage of the time, we see a flow like this, deduced from server logs (I have not been able to reproduce this case with any browser):
1. At time A, the client hits our page:
- with no cookies
- gets back a response with a Set-Cookie HTTP response header that gives them a session id of B
- the body has JS that fires an AJAX request to /ajax/foo
2. At time A + 1 second, the client hits us with the AJAX request to /ajax/foo:
- the referrer is set to the page in step 1 that fired the AJAX, as expected
- with no cookies - why?
- gets back a response with a Set-Cookie header that gives them a session id of C (expected, since they didn't send us a cookie)
3. At some time slightly later, all of the client's requests send either session id B or C, so the problem is not that the browser has cookies turned off.
This seems to be essentially a race condition -- the main page request and the AJAX request come in together very close in time, both with no cookies, and there is a race to set the cookie. One wins and one loses.
What is puzzling to me is how this could happen. My assumption is that by the time the browser has read enough of the response to know that it needs to fire an AJAX request, it has already received the HTTP response headers, and thus the Set-Cookie response header. So it seems to me that the client should always send back the cookie that we set in the page that fired the AJAX request. I just don't see how this could happen unless the browser is not promptly processing the Set-Cookie response.
Like I said, I can't reproduce this in Firefox, Safari, or Chrome, but we do see it several times a day.
There is a new feature in Google Chrome that could cause this misbehavior: prerendering.
Prerendering is an experimental feature in Chrome (versions 13 and up)
that can take hints from a site’s author to speed up the browsing
experience of users. A site author includes an element in HTML that
instructs Chrome to fetch and render an additional page in advance of
the user actually clicking on it.
Even if you do not proactively trigger prerendering yourself, it is
still possible that another site will instruct Chrome to prerender
your site. If your page is being prerendered, it may or may not ever
be shown to the user (depending on if the user clicks the link). In
the vast majority of cases, you shouldn’t have to do anything special
to handle your page being prerendered—it should just work.
For more information read: http://code.google.com/chrome/whitepapers/prerender.html
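The hint itself is a single link element in the referring page, for example (URL illustrative):

<link rel="prerender" href="http://example.com/next-page.html">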
Edit:
You could trigger prerender on your page with: http://prerender-test.appspot.com/
a) Does the cookie have an expiration time?
b) If so, have you tried to reproduce it by setting the computer's clock back or forward by more than the TTL of the cookie? (I mean the clock of the computer running the browser, obviously; not the server running the app ... which should be a separate computer whose clock is set accurately.)
I've seen this as well; it seems to be triggered by users with screwed up system clocks. The cookie was sent with an expiration date that, from the browser's perspective, was already past.
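That fits the mechanics of cookie expiry: Set-Cookie carries an absolute date, so a browser whose clock is already past that date discards the cookie immediately instead of sending it back. An illustrative header:

Set-Cookie: sessionid=B; Expires=Wed, 15 Feb 2012 10:00:00 GMT; Path=/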

Detecting User pressing back button without Javascript?

I know there are ways to tell if a user has pressed the back button using JavaScript, but is there a way to tell without resorting to JavaScript? Is there a solution that involves just looking at referrer URLs, perhaps?
Without JavaScript, no.
The problem is that the back button doesn't guarantee you a server hit. The browser can serve the page from its client-side cache, and even when it does hit the server (reloading the page), the request looks like the initial hit, not like it came from the page you were just on. The back button doesn't add referrer information to the request; it just goes back to the last thing you did without sending details of where you just were.
You need to handle this client side.
Yes, it is possible. There are two parts to this:
First, every URL should have a unique token in it. On the server side, you keep track of the current and past tokens. When a request comes in, if its token matches a past token, the back button was hit. If it matches the current token, process the request normally. Otherwise, fail the request.
Second, the page could have been cached, in which case your server may never see the request. So you have to take steps to defeat the browser cache. Setting the following HTTP headers should force the browser to make a request:
Cache-Control: no-cache
Pragma: no-cache
Expires: Thu, 01 Jan 1970 00:00:00 GMT
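Putting the two parts together, here is a minimal sketch for a servlet container (the token parameter and session attribute names are illustrative):

import java.io.IOException;
import java.util.HashSet;
import java.util.Set;
import java.util.UUID;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class BackButtonFilter implements Filter {

    @Override
    public void init(FilterConfig config) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;
        HttpSession session = request.getSession();

        // Part two: defeat the browser cache so Back always produces a server hit.
        response.setHeader("Cache-Control", "no-cache");
        response.setHeader("Pragma", "no-cache");
        response.setDateHeader("Expires", 0L);

        // Part one: track the current and past tokens in the session.
        // (A real implementation would need to synchronize access to these.)
        @SuppressWarnings("unchecked")
        Set<String> pastTokens = (Set<String>) session.getAttribute("pastTokens");
        if (pastTokens == null) {
            pastTokens = new HashSet<>();
            session.setAttribute("pastTokens", pastTokens);
        }
        String currentToken = (String) session.getAttribute("currentToken");
        String submitted = request.getParameter("token");

        if (submitted != null && pastTokens.contains(submitted)) {
            // A token we already consumed: the user navigated back.
            response.sendError(HttpServletResponse.SC_CONFLICT, "Back button detected");
            return;
        }
        if (submitted != null && !submitted.equals(currentToken)) {
            response.sendError(HttpServletResponse.SC_BAD_REQUEST, "Unknown token");
            return;
        }

        // Rotate: retire the current token and issue a fresh one, which the
        // rendered page must embed in every link and form it emits.
        if (currentToken != null) {
            pastTokens.add(currentToken);
        }
        session.setAttribute("currentToken", UUID.randomUUID().toString());

        chain.doFilter(req, res);
    }

    @Override
    public void destroy() { }
}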
Just because it is possible doesn't mean you should do it, though. The back button is an essential part of the web, and breaking it is poor usability.

How do you cache a file client-side such that the browser stops even bothering to request it again?

I send back an image with the following HTTP response header:
Cache-Control: private,max-age=86400
My understanding is that the browser should not even ask for this file for 24 hours (86,400 = 60s * 60m * 24h).
What I'm seeing on subsequent requests is that it still asks for the file, but gets back a "304 Not Modified." This is good, but I want to remove even that request/response.
What header is required to prevent the browser from even bothering to ask for the file, and just have it blindly use the file it has in local cache, until that file expires?
It all really depends on how you're testing this. On Firefox 3.6 and IE8, clicking a link and then another link that moves you back to the first page will use the cache correctly with max-age. Hitting the Return key again in the URL field shows the same behavior.
However, hitting F5 will ask for the file again, although conditionally, so 304 responses are allowed.
Hitting Ctrl+F5 will always ask again for the file, with Cache-Control and Pragma set to no-cache, forcing a 200 response.
This simply can't be done reliably in HTML < 5.
You could use client-side storage in HTML5, or a browser extension such as Gears, to provide this functionality.
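For the record, the HTML5 route at the time of writing means an application cache manifest, roughly like this (file names illustrative). In the page:

<html manifest="site.appcache">

And in site.appcache:

CACHE MANIFEST
fonts/myfont.woff
images/logo.png

Files listed in the manifest are served from the local application cache without any network request until the manifest itself changes.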

Far future expire header and HTTP 304

I'm trying to optimize the loading time of a website. One of the things I've done is set a far-future Expires header for static content so that it is cached (as described by Yahoo). However, even though the resources are cached, the browser still sends a request and gets back a 304 (Not Modified) response for each of them.
I realize the 304 response is very small and probably has minimal performance effect, but is there a way to make it such that the browser will no longer send the request at all and just always use the cache for that resource?
You may want to try turning off ETags if you are sending both ETags and Expires. Some people suggest turning ETags off entirely, especially if you have a load balancer.
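If your static content is served by Apache, that is typically done with (FileETag is a core directive; Header unset requires mod_headers):

FileETag None
Header unset ETag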
Also, note that when you press Reload on your page, Firefox WILL recheck all the resources; these will come back with 304s. If you press Shift+Reload, it will re-request all the resources without sending the ETags, so everything comes back as a full 200. So don't use the refresh/reload button to test your Last-Modified/ETag settings.
