When we are developing new sites, or testing changes that involve CSS, after the new code is committed and someone goes to check the changes, they always see a cached version of the old CSS. This is causing a lot of problems in testing because people are never sure whether they have the latest CSS on screen (I know Shift+clicking Refresh clears this cache, but I can't expect end users to know to do this). What are my possible solutions?
If you're serving your CSS from static files (or anything for which the query string doesn't matter), try varying that query string to ensure that the browser makes a fresh request; it will think it's pulling a completely different resource. So have, for example:
"styles.css?token=1234" in the CSS reference in your markup and change the value of "token" on each CSS check-in
In your development environment, set the Expires header much lower. In your Production environment, set it higher, and then set it low about a week before you do your release.
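If you happen to be on ASP.NET (the question doesn't say what stack this is), one way to vary the Expires header between environments from code is sketched below. This is only a rough outline: the durations and the isDevelopment flag are placeholders, and you'd call it from whatever serves the stylesheet.

using System;
using System.Web;

public static class CacheHeaders
{
    // Minimal sketch: the durations and the isDevelopment flag are placeholders.
    public static void SetCssExpiry(HttpResponse response, bool isDevelopment)
    {
        if (isDevelopment)
        {
            // Development/testing: expire almost immediately so testers always see fresh CSS
            response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(1));
        }
        else
        {
            // Production: long expiry; drop it in the week before a release
            response.Cache.SetExpires(DateTime.UtcNow.AddDays(30));
        }
        response.Cache.SetCacheability(HttpCacheability.Public);
    }
}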
It's not a great solution, but I've gotten around this before at the page level by adding a querystring to the end of the call to the CSS file:
<link href="/css/global.css?id=3939" type="text/css" rel="stylesheet" />
Then I'd randomize the id value so that it always loads a different value on page load. Then I'd take this code out before pushing to production. I suppose you could also pull the value from a config file, so that it only has to be loaded once per commit.
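For example (just a sketch; the "CssVersion" appSettings key and the helper name are made up here), the value could live in web.config and get appended to the link:

using System.Configuration;

public static class CssVersion
{
    // Sketch: read the version token from <appSettings>, so it only changes
    // when you bump the "CssVersion" value at commit time (the key name is
    // hypothetical, pick whatever you like).
    private static readonly string Token =
        ConfigurationManager.AppSettings["CssVersion"] ?? "1";

    // Builds e.g. "/css/global.css?id=42" for use in the <link> tag.
    public static string Link(string path)
    {
        return path + "?id=" + Token;
    }
}

The markup would then reference it as <link href="<%= CssVersion.Link("/css/global.css") %>" type="text/css" rel="stylesheet" />.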
Similar (slightly more detailed) answers were given for the JavaScript version of this question, which has the same problem and solution:
Help with aggressive JavaScript caching
Related
I can see this: site.com/assets/css/screen.css?954d46d92760d5bf200649149cf28ab453c16e2b. What is this random alphanumeric value after the question mark? I don't think it's a value that actually gets used, so what is it about?
Edit: also, on refreshing the page, the alphanumeric value stays the same.
It is there to prevent the browser from caching the CSS. When a CSS file is requested, some browsers, specifically Internet Explorer, will keep a local copy of it.
When a request is given to a server as:
site.com/assets/css/screen.css?skdjhfk
site.com/assets/css/screen.css?5sd4f65
site.com/assets/css/screen.css?w4rtwgf
site.com/assets/css/screen.css?helloWd
The server at site.com sees only:
site.com/assets/css/screen.css
And gives the latest version. But when the HTML page asks the browser to fetch the CSS as site.com/assets/css/screen.css, the browser only fetches it from the site.com server the first time. The content may well have changed by the time the next request is made, yet the browser keeps serving its cached copy. So programmers generally append ?and-some-random-text, which is called a query string. This forces the browser to get a new copy from the server.
Some more detailed explanation:
It is a well-known problem that IE caches too much HTML, even when sending a Cache-Control: no-cache or Last-Modified header with every page.

This behaviour is really troublesome when working with query strings to get dynamic information, as IE considers it to be the same page (i.e. http://example.com/?id=10) and serves the cached version.

I've solved it by adding either a random number or a time string to the query string (as others have done), like this: http://example.com/?id=10&t=2009-08-06_13:12:56, which I just ignore server-side.

Is there a better option? Is there another, cleaner way to accomplish this? I'm aware that POST isn't cached, but it is semantically correct to use GET here.
Reference: Random Querystring to avoid IE caching
It's about making changes to the design (CSS files and images) of a website which is already online and in use. I wonder what the best practice is to make sure that visitors see the changes without clearing their browser's cache manually. Things that came to mind:
change the meta tag - dismissed because I do not want the site to ALWAYS be loaded from the server
include the CSS file with a parameter (like a timestamp) after making a change
change the names of included images so that they are reloaded - this also means changing the names in the files where the images are referenced
?
What else could force loading from the server? Did I forget any advantages/disadvantages?
Possible duplicate of this post: How to control web page caching, across all browsers?
My favoured solution is to append a random number when you reference the file, e.g.
css/styles.css?628454548
images/sprite.gif?8356484894
You could use JavaScript/PHP or whatever to set those random numbers every time the page is sent to the browser.
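In ASP.NET terms that could be as simple as the sketch below. Note the trade-off: a value that changes on every page load means the browser re-downloads the file every time, not just when it actually changes.

using System;

public static class CacheBuster
{
    // Sketch: returns a different value on every page load, for use as
    // "css/styles.css?" + CacheBuster.Next() in the rendered markup.
    public static string Next()
    {
        return DateTime.UtcNow.Ticks.ToString();
    }
}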
With web pages, the client/browser decides whether it updates a file like an image, .css, or .js, or whether it takes it from the cache.
In the case of an .aspx page, it is the server that decides.
Sure, at the IIS level, or by using some HttpModule techniques, I can change the response headers to tell the client whether and for how long a file should be cached.
Now, I do have a website where the .aspx goes hand-in-hand with a corresponding .js. So, perhaps I have some jQuery code in the .js which accesses an element in the .aspx. If I remove that element from the .aspx I would also adapt the .js. If the user goes to my page he will get the new .aspx but he might still get the old .js, leading to funny effects.
My site uses lots of scripts and lots of images. For performance reasons I configured in the IIS that those files "never" expire.
Now, from time to time a file DOES change and I want to make sure that users get the updated files.
In the beginning I helped myself by renaming the files. So, I had SkriptV1.js and SkriptV2.js and so on. That's about the worst option since the repository history is broken and I need to adapt both the references and the file name.
Now, I improved here and change only the references by using Skript.js?v=1 or Skript.js?v=2.
That forces the client to refresh the files. It works fine, but still I have to adapt the references.
Now, there is a further improvement here like this:
<script type='text/javascript' src='../scripts/<%# GetScriptLastModified("MyScript.js") %>'></script>
So, the "GetScriptLastModified" will append the ?v= parameter like this:
protected string GetScriptLastModified(string FileName)
{
    // Resolve the physical path of the script under the application base directory
    string File4Info = System.Threading.Thread.GetDomain().BaseDirectory + @"scripts\" + FileName;
    System.IO.FileInfo fileInfo = new System.IO.FileInfo(File4Info);

    // Append a version token derived from the file's last write time
    return FileName + "?v=" + fileInfo.LastWriteTime.GetHashCode().ToString();
}
So, the rendered .js-Link would look like this to the client:
<script type='text/javascript' src='/scripts/GamesCharts.js?v=1377815076'></script>
The link will change every time I upload a new version, so I can be sure that the user immediately gets the new script or image when I change it.
Now, two questions:
a) Is there a more elegant way to achieve this?
b) If not: does anyone have a guess how big the performance overhead on the server would be? There can easily be 50 versioned elements on one page, so for one .aspx the GetScriptLastModified would be invoked 50 times.
Looking forward to a discussion :)
There are a few different answers to this question.
First of all, if you have files which only change every once in a while, set the Expires and Cache-Control headers to expire in one year. Only if the files truly never expire should you say that they never expire. You're seeing the issues with saying "never expire" right now.
Also, if you are having performance issues on your site from serving up lots of images and JavaScript, the commonly accepted solution is to use a CDN (Content Delivery Network). There are many different providers and I'm sure that you can find one that meets your budget. You'll also save money in the long run as the CDN will offload a great deal of I/O and CPU time from IIS. It's astounding how big of a difference it can make.
Lastly, one way to make sure that users are getting the latest for your files which almost never change is to implement some sort of versioning scheme in your assets URLs to make cache busting happen. There are many different ways to do this, but one (very naive) way to do it is to have a version number that increases every time you deploy to your site.
E.g. all your asset URLs will look like /static/123/img/dog_and_pony.jpg
Then, next time you deploy to your site, you increase the version number so that it's "124". You would need some way to keep track of the version, dynamically injecting it into asset URLs, as well as making sure that the version number changes every time you deploy. The idea being that anything referencing this asset should automatically know the new version number.
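As a very rough sketch of that idea (the Assets helper and the "DeployVersion" appSettings key are made up here, and reading the number from config is only one of many ways to track it):

using System.Configuration;

public static class Assets
{
    // Sketch: resolve the deploy version once per application start, so 50
    // asset references on a page cost nothing extra per request.
    // "DeployVersion" is a hypothetical key you bump on every deploy.
    private static readonly string Version =
        ConfigurationManager.AppSettings["DeployVersion"] ?? "0";

    // Turns "img/dog_and_pony.jpg" into "/static/123/img/dog_and_pony.jpg".
    public static string StaticUrl(string relativePath)
    {
        return "/static/" + Version + "/" + relativePath.TrimStart('/');
    }
}

The web server (or a rewrite rule) would still need to map /static/<version>/... back to the same physical files; only the URL the browser sees changes.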
In terms of performance, it's an admirable goal to never need the user to refresh or have to download the same thing twice. But sometimes it's just a lot less hassle, and if users are only refreshing everything periodically, that's probably okay for most websites.
Hope this helps.
I understand there isn't a way to interrogate a user's IE settings directly due to security reasons, but is there a way to derive this answer with some other mechanism? I would like to stop a user from using my site if the setting "Check for newer versions of stored pages" is set to "Never". Any suggestions?
Is there a way I could test for this using JavaScript? An example of what I am trying to accomplish is this: while it is not possible to check IE settings to see if you are running a popup blocker, there is a way to "test" for a popup blocker via JavaScript. I am looking for something similar, but for the cache setting, not the popup blocker.
Append something to the querystring that is unlikely to have been used before (a datestring, a random number).
That will make IE 6 request the page again; as far as IE is concerned, it's another page.
So if you have http://somepage.com/dontcachethis.html, you'd replace this with something like http://somepage.com/dontcachethis.html?please=23143425, where you assign please a random number.
Add the following to your web pages:
meta http-equiv="Pragma" content="no-cache"
meta http-equiv="Expires" content="0"
This forces the user's browser to reload the pages every time, regardless of the browser's cache settings.
I've not implemented this yet, but this is my plan:
Write out a date onto the page as an attribute of some HTML element (this should be written server-side... writing it with JavaScript will defeat the purpose)
Using JavaScript, read that date. If it is older than a certain amount of time, I can assume they have not loaded a fresh copy of the page. If that is the case, continue on...
If I've detected that they have an old version of a page, I could do a couple things:
I could warn them of this problem. I could show them the setting that they need to change.
I could redirect them to the same page but add something onto the querystring so it looks like a different page to the browser. If I do this, I would add something in my javascript that knows about this potential redirect loop.
Whenever we make changes to the CSS, it generally takes 24 hours for those changes to show up on my site. I have tried clearing the server cache and the browser cache, but that doesn't help either. Is there any other way to make the CSS changes show up immediately after updating?
It happens in all the browsers... When I check it in the browser, I can access my CSS file via two paths. E.g. I store my CSS in a folder named "Cssfolder" and my CSS file is, say, 135.css.
So when I access the two paths, Cssfolder/135.css and cssfolder/135.css, one of the paths shows me the latest CSS whereas the other one shows me the old CSS. Notice the "C" is capital in one path and lowercase in the other.
Thanks.
I've found this to be a pretty common problem in a lot of my projects. I would suggest two things...
If it's just an app that you are working on, you can use the CSS Cachebuster during development.
Following the idea behind the Cachebuster, I have found that adding the timestamp of the CSS file as a query string on the CSS link will often help tell the browser that the file is different... something like... whatever.css?12212009035543
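Something along these lines (a sketch, assuming ASP.NET; the helper name is made up, and the format string just mimics the whatever.css?12212009035543 example above):

using System;
using System.IO;
using System.Web;

public static class CssStamp
{
    // Sketch: append the file's last-write time, so the query string only
    // changes when the CSS file itself changes (unlike a random value).
    public static string Stamped(string virtualPath)
    {
        string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
        DateTime lastWrite = File.GetLastWriteTime(physicalPath);
        return virtualPath + "?" + lastWrite.ToString("MMddyyyyHHmmss");
    }
}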
You might want to use a monitoring tool, like Live HTTP Headers for Firefox, to see the requests and responses to and from the server. This usually solves a lot of problems for me. Take a look at the "Expires" headers and conditional requests (like "If-Modified-Since"). This said, take a look at the server and client local times and timezones - it might be that they differ significantly and conditional GET requests only "seem to be" handled correctly, because of future or otherwise mangled timestamps.
You can force the current CSS to load directly from the server by appending a random unique value to the URL, like http://example.com/Cssfolder/135.css?983274928374 and http://example.com/cssfolder/135.css?08973249827. There's no way that this would ever get cached unless you use the same random value twice.
This way you learn where to look further for the solution to your problem: At the server, the ISP/a proxy or your browser.
You really need to see whether this is server side or client side. If the server is still serving the old CSS then clearly you've got no chance on the client side.
I've occasionally seen cases where I've had to open the CSS file directly in the browser, and then the next time I've been to the real page, it's used that new CSS. Usually just hitting refresh does it.
Do you have any web caches like Akamai involved anywhere?
If you try to go to the CSS page from a computer which has never seen the old version, which version does it show?
EDIT: Changed answer to reflect edits in question.
I have been dealing with this issue in the past, and ended up writing an HttpModule to deal with it.
It's pretty simple: it just finds all script/CSS links in the head tag (they now need to have runat=server) and appends the assembly version number to the link, in the same way as Tim K describes. This way I'm sure my clients always fetch the newest CSS/scripts when my app is updated in production, and I never have to deal with this issue again.
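The version-appending part of that idea might look roughly like this (a sketch only; the control-walking/HttpModule plumbing the answer describes is not shown here):

using System.Reflection;

public static class AssemblyVersionToken
{
    // Sketch: the assembly version changes on each build/deploy (assuming an
    // auto-incrementing AssemblyVersion), so it works as a cache-buster that
    // is computed once per application start.
    private static readonly string Version =
        Assembly.GetExecutingAssembly().GetName().Version.ToString();

    // Turns "/scripts/app.js" into something like "/scripts/app.js?v=1.0.4830.123".
    public static string WithVersion(string url)
    {
        return url + "?v=" + Version;
    }
}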
Maybe Internet Service Provider cache, as in this case?
I was perplexed by this issue then someone said Ctrl+F5. Worked for me :)
When I am developing and I need to be sure that I am seeing changes as I work, I stick the CSS in the page itself, i.e.
<style type="text/css">
/* your css */
</style>
Or you could constantly change the name of the css file itself, not very useful in a production environment, but perhaps okay while developing.
I know it doesn't solve the problem, but for developing it is okay.