How long until a cached CSS file gets updated in the browser?

How long until a cached CSS file gets updated in the browser if I don't do anything special?
I googled this but haven't found a clear answer. I know I can use file.css?v=1 to force the browser to load the updated version, or I can use the browser's hard-reload feature. But what if I don't do any of these? So far the browser always loads the old cached version.
Without a hard reload or any server-side setup, how long until a local browser updates its cached CSS file? Will the cached version stay there forever (unless the cache fills up and space has to be reclaimed)?

Browsers generally follow the IETF spec for HTTP caching, introduced with the HTTP/1.1 spec. But they all vary when the content is served without an HTTP Cache-Control header: in that case each browser falls back to its own heuristic freshness lifetime, so there is no single answer to "how long". Ultimately you can't rely on the updated file being loaded by the client unless you either use a URL cache-buster, as you mentioned, or serve your content with proper Cache-Control headers.
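As a rough illustration of the two cases (dates and values hypothetical): with no Cache-Control header, many browsers treat a response as fresh for roughly 10% of the time since its Last-Modified date, whereas an explicit header removes the guesswork:

HTTP/1.1 200 OK
Date: Mon, 09 Sep 2013 12:00:00 GMT
Last-Modified: Fri, 30 Aug 2013 12:00:00 GMT
(no Cache-Control: heuristically fresh for ~10% of the 10 days since modification, i.e. about 1 day)

HTTP/1.1 200 OK
Cache-Control: max-age=86400
(explicit: fresh for exactly one day, revalidated after that)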

Related

Browser doesn't cache images despite cache header when html is not cached

I'm caching our generated images using the HTTP "Cache-Control" header; however, when I don't cache the HTML file (which contains those img tags), using "no-cache", I see further requests sent to the server (as I add, remove, and re-add those tags). Caching the HTML file results in cached images (and no further requests).
The only similar case I could find is this.
Any lead/link would be appreciated.
Browser: Version 32.0.1661.0 canary Aura
P.S. I'd very much prefer to keep the HTML file uncached.
Eugene Olshenbaum answered on Twitter: "close developer tools, when it is open, chrome ignores headers :))"
The cache was "disabled" while the dev tools were open. Not sure why I didn't see any calls to the server if the HTML file was cached.

What is the function of adding request attributes when linking to css files?

In the head section of the sample HTML file from the html5boilerplate project, I notice this:
<link rel="stylesheet" href="css/style.css?v=2">
Note the v=2 request variable. I also notice that this is never done for JavaScript files.
What is the actual function of doing this?
The ?v=2 is likely there to prevent the browser from reading from its cache. It's used when loading dynamic content from a static file, like so:
changingListOfStuff.txt?randomUselessPropertyToTrickBrowser=123456789
This forces the browser to use this exact file, not a cached version of changingListOfStuff.txt previously downloaded and stored by the browser. Caching speeds up loading time, but might provide an older version of the file if it changes rapidly.
Read more about caching here: http://en.wikipedia.org/wiki/Web_cache
This just encodes a version into the URL, which forces a fresh request to the server. For performance, CSS is normally served with headers that let the browser reuse its cached copy on later visits. But whenever the CSS is modified, typically as part of a version release, the browser should make a new request, and that only happens when the URL changes. So v=2 probably means a new version has shipped and the stylesheet should be fetched fresh from the server.
This is called cache busting; you can read about it here too:
http://manikandanc.blogspot.com/2005/11/cache-busting-with-javascript.html
This keeps the client from getting the old version out of the browser cache. When you change the JavaScript or CSS, an end client who has already visited your website may otherwise keep getting the JavaScript from his cache.
You can increment the version number whenever you deploy the files to production, so that clients get the latest files.

Caching Typekit CSS

I'm using TypeKit to provide fonts for a site I'm developing. The page loads slowly (more than a second), and it turns out this is because the fonts are downloaded on every request. It's beyond me that a service like this doesn't have ETags configured so clients cache the fonts... but I digress. Until TypeKit fixes this I'm temporarily hosting the CSS locally.
Anyone had this issue with TypeKit? How did you work around it? Perhaps I'm wrong?
According to a posting on their getsatisfaction.com account, they have at least some caching in place:
One thing to note is that although the fonts are served with an Expires header, they're also served with an Etag. The browser is required to make a request after 5 minutes, but will normally use the Etag to generate a 304 (Not Modified) response - meaning, the fonts aren't actually downloaded again.
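That revalidation round trip looks roughly like this (illustrative values, not Typekit's actual headers):

GET /typekit/fonts.css HTTP/1.1
Host: fonts.example.com
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"

The 304 response carries no body, so after the first five minutes only headers cross the wire.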
Can you check what happens using Firebug?

Custom webserver caching

I'm working with a custom webserver on an embedded system and having some problems correctly setting my HTTP Headers for caching.
Our webserver is generating all dynamic content as XML and we're using semi-static XSL files to display it with some dynamic JSON requests thrown in for good measure along with semi-static images. I say "semi-static" because the problems occur when we need to do a firmware update which might change the XSL and image files.
Here's what needs to be done: cache the XSL and image files, and do not cache the XML and JSON responses. I have full control over the HTTP response and am currently doing the following (sketched in header form after this list):
Using ETags with the XSL and image files, using the modified time and size to generate the ETag
Setting Cache-Control: no-cache on the XML and JSON responses
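In header terms, my responses currently look something like this (sketch; the ETag value is illustrative):

HTTP/1.1 200 OK
ETag: "1378720000-4096"
(XSL and image files; ETag built from modified time and size)

HTTP/1.1 200 OK
Cache-Control: no-cache
(XML and JSON responses)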
As I said, everything works dandy until a firmware update, after which stale XSL and image files are sometimes still served from cache. I've seen it work fine with the latest versions of Firefox and Safari but have had some problems with IE.
I know one solution would be to simply rename the XSL and image files after each version (e.g. logo-v1.1.png, logo-v1.2.png) and set the Expires header to a date far in the future, but this would be difficult with the XSL files and I'd like to avoid it.
Note: There is a clock on the unit, but it requires the user to set it and might not be 100% reliable, which may be what's causing my caching issues when using ETags.
What's the best practice that I should employ? I'd like to avoid as many webserver requests as possible but invalidating old XSL and image files after a software update is the #1 priority.
Are we working on the same project? I went down a lot of dead ends figuring out the best way to handle this.
I set my .html and my .shtml files (dynamic JSON data) to expire immediately. ("Cache-Control: no-cache\r\nExpires: -1\r\n")
Everything else is set to expire in 10 years. ("Cache-Control: max-age=290304000\r\n")
My makefile runs a perl script over all the .html files and identifies what you call "semi-static" content (images, JavaScript, CSS). The script then computes an MD5 checksum of each such file and appends it to the file reference:
<script type="text/Javascript" src="js/all.js?7f26be24ed2d05e7d0b844351e3a49b1">
Everything after the question mark is ignored by the server, but the browser caches against the full URL, so it won't reuse a cached copy unless everything between the quotes matches.
I use all.js and all.css because everything's combined and minified using the same script.
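For what it's worth, here is a rough Python sketch of that checksum step (the original is a perl script; the web root, file extensions, and regex here are illustrative, not the actual implementation):

import hashlib
import re
from pathlib import Path

WWW = Path("www")  # hypothetical web root

def md5sum(path: Path) -> str:
    # Hash the file contents so the query string changes only when the file does.
    return hashlib.md5(path.read_bytes()).hexdigest()

def bust(html: str) -> str:
    # Rewrite src/href attributes that point at local js/css/image files,
    # appending ?<md5> so browsers treat each new build as a new URL.
    def repl(match):
        attr, name = match.group(1), match.group(2)
        target = WWW / name
        if target.is_file():
            return '%s="%s?%s"' % (attr, name, md5sum(target))
        return match.group(0)  # leave external or missing files untouched
    return re.sub(r'(src|href)="([^"?]+\.(?:js|css|png|gif|jpg))"', repl, html)

for page in WWW.glob("*.html"):
    page.write_text(bust(page.read_text()))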
Out of curiosity, what embedded webserver are you using?
Try Cache-Control: no-store. no-cache tells the client that the response can be stored; it just generally can't be reused without revalidating against the origin server (a stale copy may still be used if the origin can't be contacted).
BTW, setting an ETag alone won't make the response cacheable; you should also set Cache-Control: max-age=nnn.
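Putting both points together, the responses could look something like this (values illustrative):

HTTP/1.1 200 OK
Cache-Control: max-age=3600
ETag: "1378720000-4096"
(XSL/images: cached for an hour, then revalidated cheaply via the ETag)

HTTP/1.1 200 OK
Cache-Control: no-store
(XML/JSON: never stored)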
You can check how your responses will be treated with http://redbot.org/

Browser Caching in ASP.NET application

Any suggestions on how to do browser caching within an ASP.NET application? I've found a few different methods online but wasn't sure which would be best. Specifically, I'd like to cache my CSS and JS files. They do change, but usually only once a month at most.
Another technique is to store your static images, CSS and JS on another server (such as a CDN) which has the Expires header set properly. The advantage of this is two-fold:
The expires header will encourage browsers and proxies to cache these static files
The CDN will offload the serving of static files from your server.
Using another domain name for your static content also makes browsers download faster, because serving resources from four or five different hostnames increases the parallelization of downloads.
If the CDN is configured properly and uses a cookieless domain, then you don't have unnecessary cookies going back and forth.
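For example (hostname hypothetical):

<link rel="stylesheet" href="http://static.example-cdn.com/css/site.css">
<script src="http://static.example-cdn.com/js/site.js"></script>

Since no cookies are ever set on static.example-cdn.com, requests for these assets carry no cookie bytes in either direction.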
It's worth bearing in mind that even without Cache-Control or Expires headers most browsers will cache content such as JS and CSS. What should happen, though, is that the browser requests the resource every time it's needed but typically gets a "304 Not Modified" response, and then uses the cached item. This can still be quite costly, since it's a round trip to the server, but the resource itself isn't re-sent, so the bytes transferred are limited.
Left with no specific caching instructions, IE will by default use its own heuristics to decide whether it should even bother re-requesting an item it has cached, despite never being explicitly told that it may cache the resource. The heuristics are based on the Last-Modified date of the resource: the older it is, the less likely it is to have changed by now, goes the typical reasoning. Very woolly.
Frankly, if you want to make a site performant you need control over such cache settings. If you don't have access to those settings, I wouldn't worry about performance; just inform the sponsor that the site may not perform well because they haven't provided a platform that lets you deliver it.
Your best bet is to set an Expires header in IIS on the folders whose content you want cached. This will tell most modern browsers and proxies to cache the static content. In IIS 6:
Right-click on the folder (for example CSS or JS) you want the browser to cache.
Click Properties.
Go to the HTTP Headers tab.
Check "Enable content expiration".
Set some long period for expiration, like "Expire after 90 days".
Yahoo Developer's Blog talks about this technique.
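With "Expire after 90 days" set, IIS should then serve those folders with a header along these lines (90 days expressed in seconds):

Cache-Control: max-age=7776000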
Unless you configure IIS to give ASP.NET control of js/css/image requests, it won't see them by default; hence your best plan (for long-term maintainability) is either to deliberately tweak the response headers at your firewall/traffic manager/server, or (better, and what most of the world does at this point) to version your files in the path, i.e.:
Instead of writing this in your mark-up:
http://www.foo.com/cachingmakesmesad.css
Use this:
http://www.foo.com/cachingmakesmesad.css?v1
..and change the version number when you need to effectively clear the cache. If that's every time, then you could even append a GUID or datestamp instead, but I can't think of any occasion where I'd really want to do that.
I thought your question was anti-cache but re-reading it I see I wasted a good answer there :P
Long story short, browsers are normally very aggressively pro-caching for "simple" resources, so you shouldn't have to worry about this; but if you did want to do something about it, you would need access to a firewall/traffic manager/IIS for the reasons above (ASP.NET won't be given a chance by default).
However... there is no way you can absolutely force caching, nor should you. What is and isn't cached is rightfully the end-user's decision; all you can do is strongly request.
In .NET you can set up your JavaScript, CSS and images as embedded resources.
.NET will then handle the file expiration for you.
The downside to this approach is that you have to do a new build for each set of changes (this might be an upside, depending on your deployment and workflow).
You could also use ETags, but from what I understand they don't always work well if you have a mix of IIS and Apache web servers hosting your images (or if you plan to switch in the future).
You can also just make sure the file date is newer and let the server handle it for you, but you've got to make sure the server is configured correctly.
You can cache static content by adding the following to web.config:
<system.webServer>
  <staticContent>
    <clientCache httpExpires="Tue, 12 Apr 2016 00:00:00 GMT" cacheControlMode="UseExpires" />
  </staticContent>
</system.webServer>
See the clientCache documentation for more details.
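Note that a fixed httpExpires date eventually passes; if you'd rather have a rolling window, the same element supports a max-age mode, e.g.:

<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
  </staticContent>
</system.webServer>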
