Cache problem in ASP.NET

I'm seeing an issue where some static pages are being served from the browser cache, which is not desired. To prevent caching, I'm setting
<clientCache cacheControlMode="DisableCache" />
in the relevant <location> tag in web.config.
If I open the page in Firebug (in the Net tab), I see that the response headers include Cache-Control: no-cache, which is correct, but the response status is 304 Not Modified! Isn't that a contradiction? How can I get it to stop caching (i.e. always send a 200 with content)?

According to the RFC (http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.1, section 14.9.1), Cache-Control: no-cache tells the browser not to use the cached contents without first revalidating with the server. That's why you're seeing the 304s. I think you're looking for Cache-Control: no-store.
I'm not sure if you can send no-store via a configuration file. You can, however, set the header explicitly in the response:
Response.Cache.SetNoStore();
EDIT: (by OP)
What I was looking for was:
<clientCache cacheControlCustom="no-store" />
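For completeness, here is roughly how that directive sits in web.config; this is an untested sketch, and the path in the <location> tag is a placeholder for your own page:

```xml
<configuration>
  <location path="static-page.aspx">
    <system.webServer>
      <staticContent>
        <clientCache cacheControlMode="DisableCache"
                     cacheControlCustom="no-store" />
      </staticContent>
    </system.webServer>
  </location>
</configuration>
```

With cacheControlCustom set, the custom value is added to the Cache-Control header the server emits, so the browser should stop answering from cache entirely rather than revalidating with a 304.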

Related

How can user set no-cache on browser requests?

I understand, to some degree, the HTTP(S) response Cache-Control headers and the associated controls for caching, but what about the request Cache-Control headers? How does a user control his own request headers? If users are using a normal browser, they have no ability to manually tweak any request parameters beyond those the URL itself indirectly generates.
How is request Cache-Control even a thing? Is it only intended for programmatically generated (curl, wget, JavaScript) HTTP(S) requests, or for interaction between caches and origins?
Most browsers don't give users much fine-grained cache control. They'll let you clear the local cache, which is a purely local operation. Many will also let you request a page with caching disabled; see Force browser to refresh css, javascript, etc for details.
To give a specific example, in Firefox requesting a page will send headers:
GET /... HTTP/1.1
...
However, if I use 'Reload current page', the request will include cache-control headers to request uncached data from upstream:
GET /... HTTP/1.1
...
Cache-Control: max-age=0
...
Similarly for a resource on that page referenced through <img src...>.
GET /... HTTP/1.1
...
Accept: image/webp,*/*
...
Cache-Control: max-age=0
As you suggest, this isn't fine-grained control; I'm not aware of any browsers that allow anything as complex as choosing the max-age for regular browsing.
However, it is a good example of the general cache-control header interacting with the browser's user-facing functionality.
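To illustrate the "programmatically generated" case from the question: any HTTP client can set the same request header a browser reload sends. A minimal Python sketch, where the URL is a placeholder and the request is only constructed, never actually sent:

```python
import urllib.request

# Placeholder URL; we only build the request object here, we never send it.
req = urllib.request.Request(
    "http://example.com/page",
    headers={"Cache-Control": "max-age=0"},
)

# urllib normalizes stored header names to "Capitalized" form.
print(req.get_header("Cache-control"))  # max-age=0
```

Any intermediate cache receiving this request is told that only a response with age 0 is acceptable, which in practice forces revalidation with the origin, exactly like the browser's reload behavior shown above.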

Browsers seem to ignore response cache-control instruction

I'm trying to test some caching configuration; I want my page to stay in the cache for 1 minute before the request reaches the server again.
I'm using this simple test.asp page that has a link to itself:
<% Option Explicit %>
<%
Response.Expires = 1
Response.CacheControl = "private, max-age=60"
%>
<html>
<head><title>test</title></head>
<body>
<% =Now() %>
<br />
test
</body>
</html>
This works perfectly on my development computer (http://localhost/test.asp): clicking the link does not refresh the printed datetime for 1 minute.
However, it does not have the desired effect when I put the page on the production server. After only a few seconds of clicking the link I get a new datetime, meaning the request reached the web server.
I use Chrome dev tool and see these response headers:
HTTP/1.1 200 OK
Cache-Control: private, max-age=60
Content-Type: text/html
Expires: Tue, 12 May 2015 19:16:52 GMT
Last-Modified: Tue, 12 May 2015 19:10:00 GMT
Server: Microsoft-IIS/7.5
X-Powered-By: ASP.NET
Date: Tue, 12 May 2015 19:15:55 GMT
Content-Length: 205
Can anyone help explain why it does not work on the prod server?
Update:
I tried with Chrome, Firefox and IE, and also with 2 pages, test.asp and test2.asp, each having a link to the other, and got exactly the same problem: after 8-12 seconds, the page refreshes instead of waiting 60 seconds.
To follow up on my comment: it looks like you might want to cache your dynamic ASP pages on the server, not on the client. Caching on the client doesn't really do you much good here, because modern browsers and proxies will still request the item when it's an HTML document. Caching static resources which don't change, such as images, CSS and JS, should work, and depending on the cache headers you push out the browser will respect those.
To get your pages to cache on the server (meaning IIS doesn't have to re-generate the page), here is how you do it:
Web.config
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <caching>
      <profiles>
        <add extension=".asp" policy="CacheForTimePeriod" kernelCachePolicy="DontCache" duration="00:01:00" />
      </profiles>
    </caching>
  </system.webServer>
</configuration>
You can place the web.config in a specific directory to cache only that directory's contents, and you can also vary (or break) the caching by query-string parameters or certain request headers.
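For example, assuming you want separate cache entries per query string and per Accept-Language header, the profile can be extended like this (a sketch; adjust the attribute values to your needs):

```xml
<caching>
  <profiles>
    <add extension=".asp"
         policy="CacheForTimePeriod"
         kernelCachePolicy="DontCache"
         duration="00:01:00"
         varyByQueryString="*"
         varyByHeaders="Accept-Language" />
  </profiles>
</caching>
```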

When do browsers refresh a stale cache entry?

I've been trying to refresh my understanding of HTTP/1.1 caching - something I find I have to do every once in a while, as there seems to be too many possible combinations for my brain to remember reliably.
I took it as a given that the Expires or max-age cache-control directives were used by browsers as hints, not ironclad laws: if a cache entry is stale (older than its max-age), the browser MAY revalidate it.
A colleague and I had a bit of a row about this and he forced me to read the RFC, which I felt was a bit harsh but proved him entirely right: if I understand it correctly, clients are not allowed to use a cache entry that they know to be stale.
In other words: if a document specifies a max-age cache header, and no other directive such as must-revalidate affects caching behaviour, and that document's cache entry becomes stale, the browser will always re-validate it.
This is entirely clear and logical, and to put that argument to rest, I set out to confirm it empirically by having a server serve a test JavaScript file with the following headers:
< HTTP/1.1 200 OK
< Content-Type: application/x-javascript; charset=UTF-8
< Cache-Control: public, max-age=30
< Date: Thu, 30 Jan 2014 22:11:28 GMT
< Accept-Ranges: bytes
< Server: testServer/1.0
< Vary: Accept-Encoding
< Transfer-Encoding: chunked
This JavaScript file is included by an HTML page hosted on the same server through a <script> element in the document's <head>. The HTML page is served with the following cache control header:
Cache-Control: no-cache, must-revalidate, no-store, max-age=0
After loading the HTML page, I clicked through to a different page, waited 5 minutes for good measure and clicked on a link back to the original page, monitoring all HTTP requests - and the test file was never requested. This was reproduced consistently with both Firefox and Safari at their latest versions.
This got a bit long-winded, but the gist of my question is: my tests seem to show that mainstream browsers do not respect the RFC and will not revalidate stale cache entries. Did I misinterpret the RFC? Just as likely, did I botch my tests, and can someone prove them wrong? Or do browsers really not respect the max-age cache directive?
After much digging and help from #CodeCaster, the canonical answer to this question is that browsers do appear to respect the RFC: stale cache entries are always re-validated, except in the very specific case that they were accessed, directly or indirectly, through the browser's history. In this case, section 13.13 of the RFC applies:
History mechanisms and caches are different. In particular history mechanisms SHOULD NOT try to show a semantically transparent view of the current state of a resource. Rather, a history mechanism is meant to show exactly what the user saw at the time when the resource was retrieved.
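The freshness rule being tested here can be sketched in a few lines of Python; this is a simplified model of the RFC 2616 rules, not a full implementation, and the history-navigation exemption quoted above is deliberately left out:

```python
def parse_cache_control(header):
    """Split a Cache-Control header value into a directive dict."""
    directives = {}
    for part in header.split(","):
        name, _, value = part.strip().partition("=")
        directives[name.lower()] = value or True
    return directives

def must_revalidate_now(cache_control, age_seconds):
    """Decide whether a cached response must be revalidated before reuse.

    Simplified RFC 2616 freshness model: no-cache and no-store always
    force a trip to the origin; otherwise the entry is reusable while
    its current age is below max-age, and stale after that.
    """
    directives = parse_cache_control(cache_control)
    if "no-cache" in directives or "no-store" in directives:
        return True
    max_age = int(directives.get("max-age", 0))
    return age_seconds >= max_age

# The test JavaScript file from the question: fresh for 30 s, stale after.
print(must_revalidate_now("public, max-age=30", 10))  # False
print(must_revalidate_now("public, max-age=30", 40))  # True
```

Under this model the observed behavior in the tests below is correct: within 30 seconds the cached file is fresh and reusable without any request; after 30 seconds it is stale and must be revalidated, unless the history mechanism short-circuits the cache entirely.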
I cannot reproduce your issue. I have tried to, using ASP.NET MVC, with the following code:
public ActionResult Index()
{
    Response.AddHeader("Cache-Control",
        "no-cache, must-revalidate, no-store, max-age=0");
    return View();
}

public ActionResult JavaScript()
{
    Response.AddHeader("Cache-Control", "public, max-age=30");
    return View();
}

public ActionResult Page2()
{
    return View();
}
The first action returns the Index page (relevant headers only):
HTTP/1.1 200 OK
Cache-Control: no-cache, no-store, must-revalidate, max-age=0
Pragma: no-cache
Expires: -1
Vary: Accept-Encoding
Content-Length: 182
<html>
<head>
<script src="/Home/JavaScript" type="text/javascript"></script>
</head>
<body>
Page 2
</body>
</html>
The JavaScript is returned like this:
HTTP/1.1 200 OK
Cache-Control: public, max-age=30
Vary: Accept-Encoding
Content-Length: 24
document.write('Foo');
And Page2 merely contains a link back to Home.
Now, using Internet Explorer 11, Chrome 32 or Firefox 26, I see the following behavior:
Upon the first request to /, the Index document is requested as well as the JavaScript file, and they are returned as shown above, as verified with Fiddler.
When I click the "Page 2" link, only Page 2 is requested, because it doesn't contain anything but a link back to Page 1.
Now when I click the "Home" link from Page 2 within 30 seconds, the JS file is not requested again in any browser.
When I wait a while (> 30 seconds) on Page 2 and then click the "Home" link, the JS file is requested in all three browsers.
However, when I click the Back button from Page 2 in any browser, after any period of time (either greater or less than 30 seconds), the Index file is always requested again, but the JS file never is.
It requires a refresh (F5) or navigating away and back by clicking "Page 2" followed by "Home" again to make the browser perform a new request for the JS file after it became stale.

This is How Google Bot Fetched My Site

This is how a Google bot views my site -
HTTP/1.1 302 Found
Connection: close
Pragma: no-cache
cache-control: no-cache
Location: /LageX/
What does it mean? Is it good or bad? Thanks.
It's bad.
The above indicates that the content of your site is temporarily available at another location. Unless you have a good reason to set up a temporary (302) redirect, you should either move your content to where it is expected or set up a permanent (301) redirect.
The Location: header, which is expected to hold the URI where the content is available, is itself invalid, because its value is expected to be an absolute URI, something like http://domain.com/LageX/.

HTTP caching confusion

I'm not sure whether this is a server issue, or whether I'm failing to understand how HTTP caching really works.
I have an ASP MVC application running on IIS7. There's a lot of static content as part of the site including lots of CSS, Javascript and image files.
For these files I want the browser to cache them for at least a day - our .css, .js, .gif and .png files rarely change.
My web.config goes like this:
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge"
                 cacheControlMaxAge="1.00:00:00" />
  </staticContent>
</system.webServer>
The problem I'm getting is that the browsers (tested Chrome, IE8 and Firefox) don't seem to be caching the files as I'd expect. I've got the default settings ("check for newer versions of stored pages automatically" in IE).
On first visit the content downloads as expected
HTTP/1.1 200 OK
Cache-Control: max-age=86400
Content-Type: image/gif
Last-Modified: Fri, 07 Aug 2009 09:55:15 GMT
Accept-Ranges: bytes
ETag: "3efeb2294517ca1:0"
Server: Microsoft-IIS/7.0
X-Powered-By: ASP.NET
Date: Mon, 07 Jun 2010 14:29:16 GMT
Content-Length: 918
<content>
I think that the Cache-Control: max-age=86400 should tell the browser not to request the page again for a day.
Ok, so now the page is reloaded and the browser requests the image again. This time it gets an empty response with these headers:
HTTP/1.1 304 Not Modified
Cache-Control: max-age=86400
Last-Modified: Fri, 07 Aug 2009 09:55:15 GMT
Accept-Ranges: bytes
ETag: "3efeb2294517ca1:0"
Server: Microsoft-IIS/7.0
X-Powered-By: ASP.NET
Date: Mon, 07 Jun 2010 14:30:32 GMT
So it looks like the browser has sent the ETag back (as a unique id for the resource), and the server's come back with a 304 Not Modified - telling the browser that it can use the previously downloaded file.
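The server-side half of that exchange can be sketched like this; it is a simplified model, not IIS's actual code, and the ETag value is taken from the headers above:

```python
def respond_to_conditional_get(current_etag, if_none_match):
    """Return the status code for a GET carrying an If-None-Match header.

    If the client's validator still matches the entity on the server,
    the server answers 304 with headers only; otherwise it sends the
    full 200 response with the body.
    """
    if if_none_match is not None and if_none_match == current_etag:
        return 304  # Not Modified: browser reuses its cached copy
    return 200      # full response, body included

etag = '"3efeb2294517ca1:0"'
print(respond_to_conditional_get(etag, None))   # 200 (first visit)
print(respond_to_conditional_get(etag, etag))   # 304 (revalidation)
```

The 304 saves bandwidth because no body is sent, but it does not save the round trip itself, which is exactly the cost described below.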
It seems to me that would be correct for many caching situations, but here I don't want the extra round trip. I don't care if the image gets out of date when the file on the server changes.
There are a lot of these files (even with sprite-maps and the like) and many of our clients have very slow networks. Each round trip to ping for that 304 status is taking about a 10th to a 5th of a second. Many also have IE6 which only has 2 HTTP connections at a time. The net result is that our application appears to be very slow for these clients with every page taking an extra couple of seconds to check that the static content hasn't changed.
What response header am I missing that would cause the browser to aggressively cache the files?
How would I set this in a .Net web.config for IIS7?
Am I misunderstanding how HTTP caching works in the first place?
You need to use the Expires directive; otherwise the browser will always check to see whether the content has been updated.
If a cached entry has a valid expiration date the browser can reuse the content without having to contact the server at all when a page or site is revisited. This greatly reduces the number of network round trips for frequently visited pages. For example, the Google logo is set to expire in 2038 and will only be downloaded on your first visit to google.com or if you have emptied your browser cache. If they ever want to change the image they can use a different image file name or path.
To change in IIS7 use following. This is easiest to manage if you keep static content in specific directories.
Log onto the server
Open IIS Manager (Start -> Administrative Tools -> IIS Manager)
Expand the server node
Expand the sites node
Open the site and navigate to the directory you want to change
Open the IIS HTTP Response Headers section
Click Set Common Headers on the task pane on the right
Set "Expire Web Content" as your app requires.
Use the Expires header instead of Cache-Control. It tells the browser to serve the content from its cache, with no cross-checking with the server for changes in the file, until the expiry date.
Add the header in your web.config's system.webServer section like so:
<system.webServer>
  <staticContent>
    <clientCache httpExpires="Sun, 29 Mar 2020 00:00:00 GMT"
                 cacheControlMode="UseExpires" />
  </staticContent>
</system.webServer>
Short answer: remove the ETag and use the Expires header.
You should check out the 35 Yahoo Performance best practices, and more specifically:
Configure ETags on your server (removing them is a good choice)
Add Expire header for static resources
For each rule, they usually cover Apache and IIS web server configurations.
Edit: okay, it looks like there is no simple way to remove ETags in IIS, short of installing some third-party software...