Website needs force refresh after deploy - asp.net

After deploying a new version of a website, the browser keeps loading everything from its cache of the old page until a forced refresh is done. Images are stale, cookies are stale, and some AJAX parts do not work.
How should I proceed to serve users the latest version of the page after a deploy?
The site is an ASP.NET application running on IIS 7+.

You can append a query-string variable to each of your resources that changes with each deploy. For example, you can reference your stylesheet as:
styles.css?id=1
with the id changing on each deploy.
This forces the browser to download the new version, since it cannot find that URL in its cache.
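To automate this in ASP.NET, a minimal sketch (the StaticUrl helper name is hypothetical) derives the token from the deployed assembly's version, so it changes with every build:

// Hypothetical helper: appends the current assembly version as a
// cache-busting query string, so resource URLs change on every deploy.
public static class StaticUrl
{
    private static readonly string Version =
        typeof(StaticUrl).Assembly.GetName().Version.ToString();

    public static string Versioned(string path)
    {
        return path + "?v=" + Version;
    }
}

Then reference it in a page as <link rel="stylesheet" href="<%= StaticUrl.Versioned("/styles.css") %>" />.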

For ASP.NET you can use the Cache-Control and Expires headers. You can also set up similar headers in IIS 7 for your images. If you have any custom cookies, you can expire them manually.
I have not tried it, but it looks like you can do an even better job of bulk-setting cache control in IIS 7. See this thread and this link. At that point you are only left with unsetting any custom cookies you have (which you won't be able to control with HTTP cache-control settings).
I don't know of any method to "unset everything all at once" easily.
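A minimal sketch of expiring a custom cookie manually (the cookie name is hypothetical); a server cannot delete a client-side cookie directly, so it re-sends it with a past expiration date:

// Send the cookie back already expired so the browser discards it.
var stale = new HttpCookie("MyAppSettings") { Expires = DateTime.UtcNow.AddDays(-1) };
Response.Cookies.Add(stale);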

You can use HTTP headers to control your clients' caches.
I'll just leave this here for you: http://support.microsoft.com/kb/234067

Related

Loading src files once per session in asp.net

I have way too many pages in the application that load the same set of XML and JS files for client-side interaction and validation. So I have about a dozen lines like this one: <script type="text/javascript" src="JS/CreateMR.js"></script> or like this one: <xml id="DefaultDataIslands" src="../XMLData/DataIslands.xml">.
These same files are included in every page, so the browser sends a request to read them every time. It takes about 900 ms just to load these files.
I am trying to find a way to load them on just the login page, and then use that temp file as the source. Is it possible to do so? If yes, how and where should I start?
P.S. A link to a tutorial will work too, as I currently have no knowledge about this.
Edit:
I can't cache the whole page, because the pages are generated at runtime based on the different possible view modes. I can only cache the JS and XML files. Caching everything might be a problem.
Anyway, I am reading through the suggested articles to figure out how to do it, so I may not be able to accept an answer right away while I finish reading and try to implement it on one page.
Edit:
Turns out caching is already enabled; it is just that my server is acting strangely. Check the screenshots below.
[Screenshot: with cache]
[Screenshot: without cache]
As you can see, with the cache it is actually taking more time to process some of the requests. I have no idea what that problem is, but I guess I should go to the server Stack Exchange to figure it out.
As for the actual problem, it turns out I don't have to do anything to enable caching of XML and JS files. I had no idea browsers automatically cache JS files without any specific tag.
Totally possible, and in fact recommended.
Browsers cache content that has been sent down with appropriate HTTP caching headers and will not request it again until the cache has expired. This makes your pages faster and more responsive and your server's load much lighter.
Here is a good read to get you started.
Here is the ASP.NET MVC caching guide. It focuses on caching content returned from controllers.
Here is a read about caching static content on IIS with ASP.NET MVC.
Basically, you want to use the browser's caching mechanism to cache the src files after the first request.
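As a rough sketch of what those headers look like from ASP.NET code (the one-hour lifetime is an example value, not a recommendation):

// Mark the response as publicly cacheable for one hour.
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
Response.Cache.SetMaxAge(TimeSpan.FromHours(1));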
If you're using the F12 tools in your browser to debug network requests, make sure the "Disable cache" option is unchecked. Otherwise, it forces the browser to ignore cached files.
Make sure your server sends and respects cache headers: it should return HTTP status 304 Not Modified after the first request to a static file.
Take a look at ASP.NET bundling and minification: if you have, for example, multiple JS source files, you can bundle them into one file that is cached on the first request.
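A minimal sketch of such a bundle, assuming the System.Web.Optimization package; the bundle path and the second file name are hypothetical, the first comes from the question:

// App_Start/BundleConfig.cs
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Serve one bundled, minified script instead of a dozen separate requests.
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/JS/CreateMR.js",      // from the question
            "~/JS/Validation.js"));  // hypothetical second file
    }
}

Pages then render it with Scripts.Render("~/bundles/site"); the generated URL includes a content hash, so clients only re-download when the content actually changes.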
Additionally, if you use external JS libraries, you can load them from a CDN instead of your server. This both offloads your server and lets the user's browser use a cached copy of the script (meaning: if some other page the user has visited used the same script from that CDN, the browser should already have it cached).
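For example (a sketch; the jQuery version and the local fallback path are placeholders):

<script src="https://ajax.aspnetcdn.com/ajax/jQuery/jquery-3.6.0.min.js"></script>
<script>
    // Fall back to a local copy if the CDN was unreachable.
    window.jQuery || document.write('<script src="/JS/jquery-3.6.0.min.js"><\/script>');
</script>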
One approach is caching static files via IIS by adding a <clientCache> element in the web.config file. The <clientCache> element of the <staticContent> element specifies cache-related HTTP headers that IIS 7 and later sends to web clients, which control how web clients and proxy servers will cache the content that IIS 7 and later returns.
How to configure static content cache per folder and extension in IIS7?
Client Cache
For more info on client-side caching, read this part of the Ultra-Fast ASP.NET 4.5 book:
Browser Cache and Caching Static Content
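A minimal web.config sketch of the <clientCache> approach described above (the 7-day max-age is an example value; tune it to your deployment cadence):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- Ask clients to cache static content for 7 days (d.hh:mm:ss). -->
      <clientCache cacheControlMode="UseMaxAge"
                   cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>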
Another approach is caching portions of the page.
If you are using Web Forms:
Caching Portions of an ASP.NET Page
And if you are using MVC, use donut hole caching:
ASP.NET MVC Extensible Donut Caching
Donut Caching and Donut Hole Caching with Asp.Net MVC
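For the Web Forms case, partial caching boils down to an OutputCache directive on a user control; a minimal sketch (the control name is hypothetical):

<%-- MyPortion.ascx: cache this control's rendered output for 60 seconds. --%>
<%@ Control Language="C#" ClassName="MyPortion" %>
<%@ OutputCache Duration="60" VaryByParam="None" %>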
The browser has to ask the server whether the file has been modified since it was cached, hence the HTTP status code 304. Read more at https://httpstatuses.com/304.
As this is ASP.NET, please make sure you are first running it with
<compilation debug="false"/>
as enabling debugging has some side effects, including:
"All client-javascript libraries and static images that are deployed via
WebResources.axd will be continually downloaded by clients on each page
view request and not cached locally within the browser."
Read more at https://blogs.msdn.microsoft.com/prashant_upadhyay/2011/07/14/why-debugfalse-in-asp-net-applications-in-production-environment/
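For reference, the switch lives under system.web in web.config:

<configuration>
  <system.web>
    <!-- debug="false" allows WebResource.axd output to be cached by browsers. -->
    <compilation debug="false" />
  </system.web>
</configuration>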

IIS 6 content expiry and image caching

I have deployed a new favicon in production, but it does not seem to change even after IIS is restarted or content expiry is set to immediate.
Even after multiple refreshes it does not seem to load.
Is there any other way to refresh this data?
Using meta keywords or HttpCacheability.NoCache cannot be done, as I believe these have to be added to the .NET code or .aspx pages and will not affect actual image caching.
Could you please suggest something?

How to invalidate browser cache using just configuration in the webserver?

For a long time I've been updating ASP.NET pages on the server and have never found the correct way to make changes visible in files like CSS and images.
I know that if I append something to the URL, the browser will treat the file as a different one:
<img src="/images/myLogo.png?v=1"/>
or perhaps changing its name:
<img src="/images/myLogo.v1.png"/>
Unfortunately this does not look like the correct way. In cases where I'm using App_Themes, the files in that folder are automatically injected into the page in a way I can't easily change the URL.
So my question is:
When I'm publishing the ASP.NET application on the server, what is the correct way to signal to IIS (which then notifies the browser) that a file has changed? Is it not automatic? Should I change some configuration in IIS, or perhaps add some "decoration" in the code?
I've already tried many questions here on SO, like "ASP.NET - Invalidate browser cache", "How to refresh the browser cache of an image?", "Handle cached images? How to get the browser to show the new version?", and even "What is an elegant way to force browsers to reload cached CSS/JS files?", but none of them takes any approach other than handling it manually in code instead of through IIS or ASP.NET configuration.
The closest I could find is "Asking browsers to cache our images (ASP.NET/IIS)", where they set an expiration, but not based on whether the files were updated. Instead they cache the files for a fixed number of days or hours, so the files are re-downloaded on that schedule even when no changes were made.
I want to know whether IIS or ASP.NET offers something for this: automatically telling the browser that a file has changed. Is it possible/built in?
The options you have to update browser-side cached items are:
1. Change the file name.
2. Add a URL parameter.
3. Place it in the cache for a limited time (e.g. a couple of hours).
4. Compare the date-time of creation (Last-Modified).
5. Signal with an ETag.
With the first two you avoid one server call per item; with the third, the item is simply loaded again after some time.
With the others you still make one call to the server to check whether the item needs to be loaded again.
So you cannot have it all here; there is no single correct way, and you need to choose what is best for you and what you can do. The fastest from the client's perspective are options (1) and (2).
The direct answer to your question is to use an ETag, or a date-time comparison of the file creation, but that way you lose a call to the server; you only save the size of what travels back.
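A minimal sketch (not production code) of the ETag approach in an ASP.NET IHttpHandler; the class name and file path are hypothetical, and you would still have to map the handler to the static path in web.config:

using System.IO;
using System.Web;

public class LogoHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string path = context.Server.MapPath("~/images/myLogo.png");
        // Derive the validator from the file's last write time: it changes on deploy.
        string etag = "\"" + File.GetLastWriteTimeUtc(path).Ticks.ToString("x") + "\"";

        if (context.Request.Headers["If-None-Match"] == etag)
        {
            // The client's copy is current: answer 304 with no body.
            context.Response.StatusCode = 304;
            context.Response.SuppressContent = true;
            return;
        }

        context.Response.AppendHeader("ETag", etag);
        context.Response.ContentType = "image/png";
        context.Response.WriteFile(path);
    }
}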
Some more links:
http eTag
How do I support ETags in ASP.NET MVC?
Configuring ETags with Http module in asp.net
How to control web page caching, across all browsers?
Jquery getScript caching
and you can find even more.

How to make ASP.Net recompile a page on changes?

I created an aspx page and viewed it in Firefox and Chrome and it worked correctly, running the C# code. But when I make changes to the page (including deleting everything and serving up a blank page), both browsers continue to show the original compiled aspx page!
It appears that ASP.Net (the web server) is not recompiling despite changes to the aspx file. The only way to get it to recompile is to change web.config and then restart the web server!
I even added the following code, but it still loads the original page:
<script runat="server">
void Page_Load(object sender, EventArgs e)
{
    // Randomize the ETag and disable caching at every layer we can reach.
    Random rd = new Random();
    Response.AddHeader("ETag", rd.Next(1111111, 9999999).ToString());
    Response.AddHeader("Pragma", "no-cache");
    Response.CacheControl = "no-cache";
    Response.Cache.SetNoStore();
    Response.Expires = -1;
}
</script>
TEST I DID TO RULE OUT BROWSER CACHING:
Created an aspx page and loaded it in Firefox only (not in Chrome)
Changed the aspx file
Loaded the aspx again in Firefox but got no changes
Loaded it (for the first time ever) in Chrome and it still showed the old version!
Using Apache and Mono, not IIS
This appears to be a Mono+Apache on Linux issue. It doesn't see changes to pages that have been compiled. The only workarounds are:
Restart the Apache web server (this causes it to see the pages as changed); it only takes about 2 seconds
Delete the temporary files in "/tmp/www-data-temp-aspnet-0/" (this can be a bit buggy, so #1 is the better choice)
Check whether your web application is a Web Application project or a Web Site project. If it is a Web Application project, you will need to compile each time you change something, whereas Web Site projects let changes be reflected without compilation. Also, you can use Ctrl+F5 in browsers to get a non-cached copy of pages. Hope this helps.
Read details here
Short answer: it is the browser's fault, and it is expected behavior by design. Force a cache refresh in the browser (Ctrl+F5 in IE).
Correction: this is a Mono/Apache stack, not IIS, so manual restarts can be the only workaround. In IIS the stale effects are naturally cleared between periods of inactivity, when the server kills idle processes. In Mono there may or may not be the same cleanup schedule, so the process lifecycle and configs are the first place to look for a fix.
The behavior of not recompiling is caused by the complex identity of requested pages. That identity includes URL, timestamp, and session. If you try to refresh the page without closing the browser, the ASP server may serve a stale copy of the older compiled page, because it tries to keep the served page consistent with the existing session, viewstate, and perhaps even client-side scripts that exist on the client between partial updates. Browsers are also designed for slowly changing internet pages: they store copies in a cache and track the age of those copies to skip unnecessary network trips. Otherwise the internet would be 10 times slower.
Other note: the slowest files to push through the ASP server are CSS files.

Chrome returns "Bad Request - Request Too Long" when navigating to local IIS Express

I have a web application that runs perfectly fine when I use the Visual Studio 2010 development server (Cassini). However, when I try to use IIS Express to host the site, Chrome just displays a "Bad Request - Request Too Long" error. The IIS Express site does display in other browsers (Firefox and IE9), so I'm kind of confused. The error occurs in Chrome when I try to request pages in my application or even basic resources like an image, so I don't think it is an issue with URL rewriting or routing.
Just to see if the problem was somehow a result of my site's code, I created a new MVC3 website and tried running that. This worked in the VS development server, but once again produced the "Bad Request" error when running under IIS Express.
I am about to start testing the site using some mobile devices so I need to get this running under IIS. Any suggestions would be greatly appreciated.
EDIT:
The root url of the site (http://localhost:50650/) is being requested using GET. I am currently using Chrome v12.0.742.112.
I get this all the time ONLY in Chrome and I have to clear browsing data to fix it.
Wrench > Tools > Clear Browsing Data
Check the following:
Clear browsing history
Clear download history
Empty the cache
Delete cookies and other site data
Then click "Clear Browsing Data" button and refresh your page.
UPDATE:
I figured out that it has to do with writing too many cookies to the browser and that if you just close all instances of Chrome, the error goes away for a while. To prevent it, you'll need to clear out your cookies programmatically.
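A hedged sketch of the programmatic part: a server cannot delete client cookies directly, so it re-sends each cookie from the request with a past expiration date:

// Expire every cookie the request carried by sending each one back already expired.
foreach (string name in Request.Cookies.AllKeys)
{
    var expired = new HttpCookie(name) { Expires = DateTime.UtcNow.AddDays(-1) };
    Response.Cookies.Add(expired);
}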
Instead of clearing all the cookies, just do the following:
Right-click the lock in the address bar area
Under cookies there is a link saying how many cookies are used
Click that link
Remove all cookies in there (or just the troublesome ones, if you can identify them)
Problem gone
This error is caused by a corrupt cookie for the website you are trying to view, so to clear it all you need to do is clear the bad cookie(s) for that website.
In Chrome, go to...
chrome://settings/cookies
(Or manually go to Settings->Advanced Settings->Privacy->Content->All Cookies and Site data)
From there, you can search for cookies that match the site you are having problems on. Finally, click "remove all" for the matching cookies.
The problem is usually that the site in question has accumulated too many cookies or created cookies which are too large, making the HTTP headers swell beyond the allowed maximum.
One-time work-around
As has been mentioned, you can go to Settings|Advanced|Content Settings|All Cookies and Site Data, search for the site in question, and delete the cookies using the X button on the right. This reduces the header size of the HTTP request when contacting the site.
Long-term work-around
In addition to removing them one-time, however, you can prevent further problems with heavy cookie sites by going to Settings|Advanced|Content Settings|Manage Exceptions, and add the base site url (e.g. "msdn.microsoft.*" without the quotes) and select Behavior as "Clear on Exit". You might have to login more often to these sites, but this should prevent the problem.
I encountered this problem when using Azure AD B2C login from an ASP.NET web app. In Firefox you can do a similar thing to delete the related cookies, and the problem is gone for a while: click the HTTPS (i) lock icon, select the ">" button on the right, select More Information, select the Security tab, click View Cookies, and click Remove All. Done for a while.
If the above methods didn't work, then enter
chrome://settings/resetProfileSettings
and click on Reset Settings.
This will reset your startup page, new tab page, search engine, and pinned tabs. It will also disable all extensions and clear temporary data like cookies. Your bookmarks, history and saved passwords will not be cleared.
