ASP.NET: How to force a reload of a static web file

When building web pages, the client/browser decides whether to fetch an updated copy of a static file like an image, .css, or .js, or to serve it from its cache.
In the case of an .aspx page, it is the server that decides.
Sure, at the IIS level, or with HttpModule techniques, I can change the response headers to tell the client whether and for how long a file should be cached.
Now, I have a website where each .aspx goes hand in hand with a corresponding .js. So, perhaps I have some jQuery code in the .js which accesses an element in the .aspx. If I remove that element from the .aspx, I would also adapt the .js. If the user visits my page he will get the new .aspx, but he might still get the old .js, leading to odd effects.
My site uses lots of scripts and lots of images. For performance reasons I configured IIS so that those files "never" expire.
Now, from time to time a file DOES change, and I want to make sure that users get the updated files.
In the beginning I helped myself by renaming the files. So, I had SkriptV1.js, SkriptV2.js, and so on. That's about the worst option, since it breaks the repository history and I need to adapt both the file name and the references.
Later I improved on this and changed only the references, using Skript.js?v=1 or Skript.js?v=2.
That forces the client to refresh the file. It works fine, but I still have to adapt the references.
Now, there is a further improvement, like this:
<script type='text/javascript' src='../scripts/<%# GetScriptLastModified("MyScript.js") %>'></script>
So, the "GetScriptLastModified" will append the ?v= parameter like this:
protected string GetScriptLastModified(string FileName)
{
    // Build the physical path to the script below the application's base directory.
    string File4Info = System.Threading.Thread.GetDomain().BaseDirectory + @"scripts\" + FileName;
    System.IO.FileInfo fileInfo = new System.IO.FileInfo(File4Info);
    // Append the file's last-write time as a cache-busting version token.
    return FileName + "?v=" + fileInfo.LastWriteTime.GetHashCode().ToString();
}
So, the rendered .js link would look like this to the client:
<script type='text/javascript' src='/scripts/GamesCharts.js?v=1377815076'></script>
The link will change every time I upload a new version, so I can be sure that the user immediately gets the new script or image when I change it.
Now, two questions:
a) Is there a more elegant way to achieve this?
b) If not: Does someone have a guess how big the performance overhead on the server would be? There can easily be 50 versioned elements on one page, so for one .aspx request GetScriptLastModified would be invoked 50 times. (A memoized variant is sketched below.)
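To keep that overhead down, the lookup could be memoized per file name; a sketch (assuming the app domain recycles on deployment, which resets the cache exactly when files can change):

private static readonly System.Collections.Concurrent.ConcurrentDictionary<string, string> VersionCache =
    new System.Collections.Concurrent.ConcurrentDictionary<string, string>();

protected string GetScriptLastModifiedCached(string FileName)
{
    // Touch the file system only once per file name; later calls are a dictionary lookup.
    // The cache lives until the app domain recycles, e.g. on deployment.
    return VersionCache.GetOrAdd(FileName, key => GetScriptLastModified(key));
}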
Looking forward to a discussion :)

There are a few different answers to this question.
First of all, if you have files which only change every once in a while, set the Expires and Cache-Control headers to expire in one year. Only if the files truly never expire should you say that they never expire. You're seeing the issues with saying "never expire" right now.
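For responses that go through the ASP.NET pipeline (static files served directly by IIS are configured in IIS itself), a minimal sketch of setting those headers in code:

// Far-future caching headers for a response served by ASP.NET.
Response.Cache.SetCacheability(System.Web.HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.UtcNow.AddYears(1));
Response.Cache.SetMaxAge(TimeSpan.FromDays(365));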
Also, if you are having performance issues on your site from serving up lots of images and JavaScript, the commonly accepted solution is to use a CDN (Content Delivery Network). There are many different providers and I'm sure that you can find one that meets your budget. You'll also save money in the long run as the CDN will offload a great deal of I/O and CPU time from IIS. It's astounding how big of a difference it can make.
Lastly, one way to make sure that users get the latest version of files which almost never change is to implement some sort of versioning scheme in your asset URLs to enable cache busting. There are many different ways to do this, but one (very naive) way is to have a version number that increases every time you deploy your site.
E.g. all your asset URLs will look like /static/123/img/dog_and_pony.jpg
Then, the next time you deploy your site, you increase the version number so that it's "124". You would need some way to keep track of the version, dynamically inject it into asset URLs, and make sure that the version number changes every time you deploy. The idea is that anything referencing an asset automatically picks up the new version number; a sketch of this follows below.
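A minimal ASP.NET sketch of that idea, assuming the deploy version is kept in web.config under a hypothetical "StaticVersion" key that is bumped on each deploy:

public static class StaticUrl
{
    // "StaticVersion" is an assumed appSettings key, e.g. <add key="StaticVersion" value="123" />.
    private static readonly string Version =
        System.Configuration.ConfigurationManager.AppSettings["StaticVersion"] ?? "0";

    public static string For(string relativePath)
    {
        // e.g. For("img/dog_and_pony.jpg") -> "/static/123/img/dog_and_pony.jpg"
        return "/static/" + Version + "/" + relativePath.TrimStart('/');
    }
}

On the server, a rewrite rule or route would then map /static/<version>/... back to the same physical files (or you keep real per-version directories, as the grunt question below describes).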
In terms of performance, it's an admirable goal that the user should never need to refresh or download the same thing twice. But sometimes it's just a lot less hassle to accept imperfect caching, and if users only re-download everything periodically, that's probably okay for most websites.
Hope this helps.

Related

Programming grunt-rev/grunt-usemin to use commit-id

The default behavior of grunt-rev is to evaluate the given resource and put a hash of the content in the path, so /images/sprites/rewards.png becomes /images/sprites/f936712a.rewards.png
I don't want that. I've got several revisions being served simultaneously and I want to be able to delete old revs at will, so I'd like to rename it to /2013-06-10-e75607/images/sprites/rewards.png (where e75607 is the commit-id for the whole revision, nothing to do with the individual file).
Is this a possibility with grunt-rev and grunt-usemin? Is there an equivalent tool that would do it?
Edit
Several people have asked me why not use the hash of each file. Let me explain:
In old-fashioned websites, in response to pretty much any user input, the browser reloads, the back-end generates a brand new page from the input, and the HTML is sent to the browser, which displays the page. It's a slow, CPU- and bandwidth-intensive process, but it does have one advantage: all the assets that are ever loaded are loaded together, within a few seconds.
In a more modern website, the page that is loaded describes the whole application. When the user makes some input, the JavaScript on the page renders new DOM elements and loads new assets as needed. The page as a whole is rarely or never reloaded. The site is enormously more responsive as a consequence (and much easier to develop, more secure, cheaper to run, and so on), but it has the corresponding disadvantage: an asset might be loaded hours or days after the page itself.
Say you visit the site at 10 am, when the site is running version cd1d0906. At 10:30, the site is upgraded to version 4b571377. At 11 am, you press a button that opens a popup rendered with an image called sprite.png. Obviously, you need the cd1d0906 version of sprite.png -- not the 4b571377 version.
Therefore, a well-maintained site will continue to offer old versions of all assets for several days after the versions have changed over. The easiest way to do that is to keep all the assets in a directory that is named after the version.
The complaint that this "unnecessarily" discards cache entries for unchanged files is rather unconvincing. Most deployed assets are CSS files, JS files, and sprites, all of which are compilations of many smaller files. It's a rare deployment that doesn't change at least one CSS file, one JS file, and one sprited image. The cache is rarely valuable after a version change.

Best way to let browsers refresh from cache on a live website?

It's about making design changes (css files and images) on a website which is already online and in use. I wonder what the best practice is to make sure that visitors see the changes without clearing their browser's cache manually. Things that came to my mind:
change the meta tags - dismissed because I do not want the site to ALWAYS be loaded from the server
include the css file with a parameter (like a timestamp) after making a change
change the names of included images so that they are reloaded - which also means changing the names in the files where the images are referenced
?
What else could force loading from the server? Did I forget any advantages/disadvantages?
Possible duplicate of this post: How to control web page caching, across all browsers?
My favoured solution is to append a random number to the file URL, e.g.
css/styles.css?628454548
images/sprite.gif?8356484894
You could use JavaScript/PHP or whatever to set those random numbers every time the page is served to the browser.
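In keeping with the ASP.NET flavour of this page, a minimal C# sketch of that idea (note the trade-off: a fresh token on every page view means the browser re-downloads the file every time):

protected string CacheBusted(string path)
{
    // A new token on every render forces the browser to re-fetch the file.
    return path + "?" + DateTime.UtcNow.Ticks.ToString();
}

Used in markup as <link href="<%= CacheBusted("css/styles.css") %>" rel="stylesheet" type="text/css" />.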

Caching static content

I'm trying to understand what's the best Cache-Control value to set for static content (images, css, javascript). The issue is that my JavaScript/CSS is still very much in development, and whenever I make a change I want people to see it immediately (they shouldn't have to clear their cache).
What's the best way to go about this? Should I add a ?version=1000202210 after each static request so the browser knows it's new?
Yes, a long expiration date plus fingerprinting gives you maximal browser caching and, at the same time, the flexibility to propagate changes immediately. Google PageSpeed has a good explanation. You can add the fingerprint either in the query string or in the path of the assets. It doesn't really matter how you do it, as long as the URL changes when you want the resource to be fetched again.
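A minimal content-fingerprinting sketch in C# (the helper name is an assumption, and in production the hash would normally be cached per file rather than recomputed on every request):

using System;
using System.IO;
using System.Security.Cryptography;
using System.Web;

public static class Fingerprint
{
    public static string Tag(string virtualPath)
    {
        // Hash the file's content: the token (and thus the URL) changes
        // only when the content changes, so caches are busted exactly on deploy.
        string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
        using (MD5 md5 = MD5.Create())
        using (FileStream stream = File.OpenRead(physicalPath))
        {
            byte[] hash = md5.ComputeHash(stream);
            string token = BitConverter.ToString(hash).Replace("-", "").Substring(0, 8);
            return virtualPath + "?v=" + token;
        }
    }
}

Used as <script src="<%= Fingerprint.Tag("/scripts/GamesCharts.js") %>" type="text/javascript"></script>.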

CSS changes not reflecting on site

Whenever we make changes to the CSS, it generally takes 24 hours for those changes to show up on my site. I have tried clearing the server cache and the browser cache, but that doesn't help either. Is there any other way to make CSS changes show up immediately after updating?
It happens in all browsers... When I check in the browser, I can access my css file via two paths. E.g., I store my css in a folder named "Cssfolder" and the css is named 135.css.
So when I access the two paths, Cssfolder/135.css and cssfolder/135.css, one of them shows me the latest css whereas the other one shows the old css. Notice the "C" is capital in one path and lowercase in the other.
Thanks.
I've found this to be a pretty common problem in a lot of my projects. I would suggest two things...
If it's just an app that you are working on you can use the CSS Cachebuster during development.
Following the idea behind the Cachebuster I have found that often adding the timestamp of the CSS file as a query string off of the CSS link will help in telling the browser that the file is different... something like... whatever.css?12212009035543
You might want to use a monitoring tool, like Live HTTP Headers for Firefox, to see the requests and responses to and from the server. This usually solves a lot of problems for me. Take a look at the "Expires" headers and conditional requests (like "If-Modified-Since"). That said, also take a look at the server's and client's local times and timezones - it might be that they differ significantly and conditional GET requests only "seem to be" handled correctly because of future or otherwise mangled timestamps.
You can force the current css to load directly from the server by appending a random unique value to the url, like http://example.com/Cssfolder/135.css?983274928374 and http://example.com/cssfolder/135.css?08973249827. There's no way that this would ever get cached unless you use the same random value twice.
This way you learn where to look further for the solution to your problem: At the server, the ISP/a proxy or your browser.
You really need to see whether this is server side or client side. If the server is still serving the old CSS then clearly you've got no chance on the client side.
I've occasionally seen cases where I've had to open the CSS file directly in the browser, and then the next time I went to the real page, it used that new CSS. Usually just hitting refresh does it.
Do you have any web caches like Akamai involved anywhere?
If you try to go to the CSS page from a computer which has never seen the old version, which version does it show?
EDIT: Changed answer to reflect edits in question.
I have been dealing with this issue in the past, and ended up writing an HttpModule to deal with it.
It's pretty simple: it just finds all script/css links in the head tag (they now need to have runat=server) and appends the assembly version number to the link, in the same way as Tim K describes. This way I'm sure my clients always fetch the newest css/scripts when my app is updated in production, and I never have to deal with this issue again. A rough illustration of the idea follows below.
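The answer describes an HttpModule; as a rough illustration of the same idea (not the author's actual code), the version-appending can also be done from a base page that rewrites the stylesheet links in a runat="server" head:

using System;
using System.Web.UI;
using System.Web.UI.HtmlControls;

public class VersionedPage : Page
{
    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);
        if (Header == null) return; // requires <head runat="server">

        // The assembly version changes on each production deploy.
        string version = typeof(VersionedPage).Assembly.GetName().Version.ToString();

        foreach (Control control in Header.Controls)
        {
            // <link rel="stylesheet" ... runat="server"> elements show up as HtmlLink.
            HtmlLink link = control as HtmlLink;
            if (link != null && !String.IsNullOrEmpty(link.Href))
            {
                link.Href += (link.Href.Contains("?") ? "&" : "?") + "v=" + version;
            }
        }
    }
}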
Maybe Internet Service Provider cache, as in this case?
I was perplexed by this issue until someone said Ctrl+F5. Worked for me :)
When I am developing and need to be sure that I am seeing changes as I work, I stick the css in the page itself, i.e.
<style type="text/css">
/* your css */
</style>
Or you could constantly change the name of the css file itself, not very useful in a production environment, but perhaps okay while developing.
I know it doesn't solve the problem, but for developing it is okay.

IIS CSS Caching

When we are developing new sites, or testing CSS changes on existing ones, people who go to check the changes after new code is committed always see a cached version of the old css. This is causing a lot of problems in testing, because people are never sure whether they have the latest css on screen (I know Shift-clicking refresh clears this cache, but I can't expect end users to know to do this). What are my possible solutions?
If you're serving your CSS from static files (or anything for which the query string doesn't matter), try varying the query string to ensure that the browser makes a fresh request, as it will think it's pulling a completely different resource. For example, have:
"styles.css?token=1234" in the CSS reference in your markup and change the value of "token" on each CSS check-in
In your development environment, set the Expires header much lower. In your Production environment, set it higher, and then set it low about a week before you do your release.
It's not a great solution, but I've gotten around this before at the page level by adding a query string to the end of the call to the CSS file:
<link href="/css/global.css?id=3939" type="text/css" rel="stylesheet" />
Then I'd randomize the id value so that it always loads a different value on page load, and take this code out before pushing to production. I suppose you could also pull the value from a config file, so that it only has to be changed once per commit; a sketch of that variant is below.
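A sketch of the config-file variant (the "CssToken" key is an assumption; bump its value on each check-in, which also recycles the application):

// web.config: <appSettings><add key="CssToken" value="3939" /></appSettings>
public static class CssToken
{
    // Read once per application lifetime; editing web.config restarts the app.
    public static readonly string Value =
        System.Configuration.ConfigurationManager.AppSettings["CssToken"] ?? "0";
}

Used as <link href="/css/global.css?id=<%= CssToken.Value %>" type="text/css" rel="stylesheet" />.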
Similar (slightly more detailed) answers were given for the JavaScript version of this question, which has the same problem/solution:
Help with aggressive JavaScript caching
