How to verify ASP.NET MVC OutputCache works on server? - asp.net

I'm fixing a bug with ASP.NET OutputCache and it's driving me insane.
We want caching on the server, but it does not appear to work (it did a while ago, in an older version of our app, but we discovered the bug by accident recently).
Locally, I just cannot get the caching to work on the server-side. Using this attribute:
[OutputCache(CacheProfile = "MyProfile", Location = OutputCacheLocation.Server)] // doesn't work
Now, based on a few things I've read by googling around, here is possibly relevant information:
Output caching is enabled in IIS (localhost)
I DO use an AuthorizeAttribute (a custom one with inheritance). I've debugged towards this specifically, and I'm 95% confident this is not the cause.
I've fiddled around with various VaryByParams values, nothing works.
Caching does work client-side.
I've opened a perfmon session and added some counters from the Web Service Cache group. All I see is that there are cached URLs, but the cache is missed.
The bigger problem/bug we were facing is that OutputCache was not working at all. We were able to fix that by specifying VaryByParams="" (an empty string). That did it for the client, but it still doesn't work server-side.
I'm actually checking whether it works or not by placing a debug breakpoint in the action that should be cached. It gets hit every time, which should mean the cache is not hit.

From http://www.asp.net/mvc/overview/older-versions-1/controllers-and-routing/improving-performance-with-output-caching-cs:
There is no guarantee that content will be cached for the amount of time that you specify. When memory resources become low, the cache starts evicting content automatically.
If the available memory resources on your server are low enough during your testing, the cache will evict content immediately. It may even refuse to put your content in the cache at all.
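Either way, one way to observe whether the server cache is being hit, without relying on debugger breakpoints, is to return a timestamp from the cached action and compare responses across requests. Here is a minimal sketch, assuming a cache profile named "MyProfile" exists in web.config; the controller and action names are only for illustration:
using System;
using System.Web.Mvc;
using System.Web.UI;

public class CacheProbeController : Controller
{
    // If the server cache is hit, repeated requests (even from different
    // browsers, or with client caching disabled) return the same timestamp
    // until the cached entry expires.
    [OutputCache(CacheProfile = "MyProfile", Location = OutputCacheLocation.Server, VaryByParam = "none")]
    public ContentResult Stamp()
    {
        return Content(DateTime.UtcNow.ToString("O"));
    }
}
If the timestamp changes on every request, the action is executing each time and the server-side cache is not being used.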

Related

ASP.NET MVC: some users get old scripts, despite the fact that we use bundles

We have an ASP.NET MVC 5 application. We use bundles with optimization enabled by default. But we have heard several times from users that they get errors which we think are caused by old versions of our scripts. Their browsers somehow take scripts from cache, despite the fact that we have edited those script files and the bundles should be updated. The worst part of the problem is that we can't imitate or recreate it; we don't know how. We have already tried making test changes to scripts, like adding a "console.log('test')" line, to see if the browser takes the cached version, but everything was fine: the hash at the end of <script src="....?v='hash'"> changed and the browser took the newest version on the first try. I should mention that our site is a single page application; I don't know, maybe that is somehow related to the problem.
Have you faced this kind of problem?
There's not enough information here to give a definitive answer. The bundler detects changes in files and will regenerate the bundle along with the link to that bundle, which will include an updated query string param. Since the query string is part of the URI, it's considered a totally different resource at this point, and the browser should fetch it again, because there is technically no cache available. The only logical reason this would not occur is if the HTML with the link to the bundle is not being updated. This can happen if you're using OutputCache or otherwise caching the HTML document. It can also happen if the client's browser is aggressively caching the HTML document. Unfortunately, there's not much you can do about that, as the client browser ultimately has control over what is or is not cached and for how long.
That said, given that this is a single page app, it's very possible that it's also including a cache manifest. This manifest will very often include the HTML file itself, and the browser will not refetch any file in the manifest unless the manifest itself is updated.
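For reference, a minimal sketch of a System.Web.Optimization bundle registration; the bundle name and file paths are illustrative. When any included file changes, the v= hash emitted by @Scripts.Render changes with it, so users receiving stale scripts usually means the HTML that references the bundle is itself coming from a cache.
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Rendered in a view with @Scripts.Render("~/bundles/app"), which emits
        // <script src="/bundles/app?v=<hash>"></script> when optimizations are on.
        bundles.Add(new ScriptBundle("~/bundles/app").Include(
            "~/Scripts/app/main.js",
            "~/Scripts/app/helpers.js"));

        BundleTable.EnableOptimizations = true;
    }
}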

How to invalidate browser cache using just configuration in the webserver?

For a long time I've been updating ASP.NET pages on the server and have never found the correct way to make changes visible for files like CSS and images.
I know that if I append something to the URL, the browser will think the file is a different one:
<img src="/images/myLogo.png?v=1"/>
or perhaps changing its name:
<img src="/images/myLogo.v1.png"/>
Unfortunately, that does not seem like the correct way. In my case I'm using App_Themes, and the files in that folder are automatically injected into the page in a way that I can't easily change the URL.
So my question is:
When I'm publishing the ASP.NET application on the server, what is the correct way to signal to IIS (so it notifies the browser) that a file has changed? Is it not automatic? Should I change some configuration in IIS, or perhaps add some "decoration" in the code?
I've already tried many questions here on SO, like "ASP.NET - Invalidate browser cache", "How to refresh the browser cache of an image?", "Handle cached images? How to get the browser to show the new version?", and even "What is an elegant way to force browsers to reload cached CSS/JS files?", but none of them take any approach other than handling it manually in the code instead of through IIS or ASP.NET configuration.
The closest I could find is "Asking browsers to cache our images (ASP.NET/IIS)", where they set expirations, but not based on the fact that the files were updated. Instead they used days or hours to cache those files, so they would be re-downloaded even when no changes were made.
I want to know if IIS or ASP.NET offers something related to this: automatically telling the browser that a file has changed. Is it possible/built in?
The options you have to update browser-side cached items are:
Change the file name
Add a URL parameter
Place it in the cache for a limited time (e.g. for a couple of hours)
Compare the date-time of creation.
Signal with an ETag.
With the first two you avoid one server call for each item, but the third option loads it again after some time.
With the others you have to make one call to the server to see if it needs to be loaded again.
So you cannot have everything here; there is no single correct way, and you need to choose what is best for you and what you can do. The fastest from the client's perspective are options (1) and (2).
The direct answer to your question is to use an ETag, or a date-time comparison of the file's creation, but that way you still pay for a call to the server; you only save the size of what travels back.
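As an illustration of the ETag/date-compare option, here is a minimal sketch of conditional-request handling for a single file in a custom ASP.NET handler; the file path and handler name are made up, and IIS already does the equivalent for static content it serves itself:
using System;
using System.IO;
using System.Web;

public class CachedImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string path = context.Server.MapPath("~/images/myLogo.png");
        string etag = "\"" + File.GetLastWriteTimeUtc(path).Ticks.ToString("x") + "\"";

        // The browser still makes one request, but if its copy is current we
        // answer 304 and save sending the response body.
        if (context.Request.Headers["If-None-Match"] == etag)
        {
            context.Response.StatusCode = 304;
            return;
        }

        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetETag(etag);
        context.Response.ContentType = "image/png";
        context.Response.TransmitFile(path);
    }
}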
Some more links:
http eTag
How do I support ETags in ASP.NET MVC?
Configuring ETags with Http module in asp.net
How to control web page caching, across all browsers?
Jquery getScript caching
and you can find even more.

How to make ASP.NET recompile a page on changes?

I created an aspx page and viewed it in Firefox and Chrome and it worked correctly, running the C# code. But when I make changes to the page (including deleting everything and serving up a blank page), both browsers continue to show the original compiled aspx page!
It appears that ASP.Net (the web server) is not recompiling despite changes to the aspx file. The only way to get it to recompile is to change web.config and then restart the web server!
I even added the following code, but it still loads the original page:
<script runat="server">
// Emit headers that tell the browser not to cache this page at all.
void Page_Load(object sender, EventArgs e)
{
    Random rd = new Random();
    Response.AddHeader("ETag", rd.Next(1111111, 9999999).ToString());
    Response.AddHeader("Pragma", "no-cache");
    Response.CacheControl = "no-cache";
    Response.Cache.SetNoStore();
    Response.Expires = -1;
}
</script>
TEST I DID TO RULE OUT BROWSER CACHING:
Created an aspx page and loaded it in Firefox only (not in Chrome)
Changed the aspx file
Loaded the aspx again in Firefox but got no changes
Loaded it (for the first time ever) in Chrome and it still showed the old version!
Using Apache and Mono, not IIS
This appears to be a Mono+Apache on Linux issue. It doesn't see changes to pages that have been compiled. The only workarounds are:
Restart Apache webserver (this causes it to see them as changed) - only takes about 2 seconds
Delete the temporary files in "/tmp/www-data-temp-aspnet-0/" (This can be a bit buggy so #1's a better choice)
Check whether your web application is a Web Application project or a Web Site project. If it is a Web Application project, you will need to compile each time you change something, whereas Web Site projects allow changes to be reflected without compilation. Also, you can use Ctrl+F5 in browsers to get a non-cached copy of pages. Hope this helps.
Read details here
Short answer: it is the browser's fault, and it is expected behavior by design. Force a cache refresh in the browser (Ctrl-F5 in IE).
Correction: when it is a Mono/Apache stack rather than IIS, manual restarts can be the only workaround. In IIS the stale effects are naturally cleared during periods of inactivity, when the server kills idle processes. In Mono there may or may not be the same cleaning schedule, so the process lifecycle and configs are the first place to look for a fix.
The behavior of not recompiling is caused by the complex identity of requested pages. That identity includes the URL, a timestamp, and the session. If you try to refresh a page without closing the browser, the ASP.NET server may serve a stale copy of the older compiled page, because the server tries to keep the served page consistent with the existing session, viewstate, and perhaps even client-side scripts that exist on the client between partial updates. Browsers are also designed to cope with slowly changing pages by storing copies in cache and tracking the age of those copies to skip unnecessary network trips; otherwise the internet would be ten times slower.
Another note: the slowest files to push through the ASP.NET server are CSS files.

Priming the asp.net output cache

Is there a way to programmatically prime the asp.net output cache? I've investigated the caching API and can't seem to find an obvious way to do this. Has anyone tried something like this? If so, what method did you use?
I gave some thought to this last year and ended up concluding that it was not that important for my case, but if it's important for your website, all you have to do is simply call the web pages from somewhere like the Application_Start event (after all other code has run). But you shouldn't stop there!
The cache will eventually expire, and to avoid that you should set up some way to cache the pages again before any client requests them.
Make the output cache dependent on some other object in the cache and set an expiration callback.
Thus, when that cache object expires, so do your pages, and you should then make HTTP requests to the pages you want to re-cache, and so on.
I'm answering this question, but the amount of effort and the question marks still in my mind lead me to advise against going through with this...
UPDATE
The only kind of dependency you may set on the output cache is a SQL dependency. Use it if you want, but if you need to make your output cache depend on some other business object, this can get very difficult. You could set up a database object, make the dependency on that, and expire it yourself using some kind of timer.
Man, the longer I write, the more solutions and difficulties I find! I can't write a book for something that is not worth your precious time. Believe me, the usefulness of this will be nearly zero.
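For what it's worth, here is a rough, untested sketch of the expiration-callback idea described above: a token cache entry whose removal callback re-requests the page just before the output cache would expire. The key name, URL, and timings are illustrative.
using System;
using System.Net;
using System.Web;
using System.Web.Caching;

public static class CachePrimer
{
    private const string Key = "PrimePagesToken";

    public static void Start()
    {
        // A token entry whose removal tells us the cached pages are about to go stale.
        HttpRuntime.Cache.Insert(
            Key,
            DateTime.Now,
            null,
            DateTime.Now.AddMinutes(18),      // shorter than a 20-minute output cache
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnTokenRemoved);
    }

    private static void OnTokenRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        try
        {
            using (var client = new WebClient())
            {
                // Re-request the page so a fresh copy lands in the output cache.
                client.DownloadString("http://localhost/Products/Index");
            }
        }
        finally
        {
            Start(); // schedule the next priming round
        }
    }
}
You would call CachePrimer.Start() from Application_Start, after the rest of the startup code has run.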
Priming the cache is, as others have suggested, as easy as requesting the pages you want cached. Of course, if you do this programmatically it will only request the HTML and not all the linked resources (CSS, JavaScript, images...), which is a good thing for avoiding wasted bandwidth.
For many websites, the cached items that carry the biggest performance penalties are common to many or all pages. For example, a navigation system on a large CMS or storefront may query the database and do a bunch of rendering work which can then be cached for all pages. Also, a big part of the initial load in ASP.NET happens when the website is first accessed and loaded into memory. Both of these issues can be addressed by calling even a single page on your site, but there is nothing stopping you from making a list of URLs and calling each one periodically.
If your cache policy is set for a 20-minute timeout, maybe request each page once every 17-18 minutes, as in the sketch below.
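A minimal sketch of that loop, assuming it runs in a small console app or Windows service; the URLs and the 17-minute interval are illustrative:
using System;
using System.Net;
using System.Threading;

class CacheWarmer
{
    static readonly string[] Urls =
    {
        "http://www.example.com/",
        "http://www.example.com/Products",
        "http://www.example.com/About"
    };

    static void Main()
    {
        while (true)
        {
            foreach (var url in Urls)
            {
                try
                {
                    using (var client = new WebClient())
                    {
                        // Only fetches the HTML, not the linked CSS/JS/images.
                        client.DownloadString(url);
                    }
                }
                catch (WebException)
                {
                    // Log and continue; one failed URL shouldn't stop the sweep.
                }
            }
            Thread.Sleep(TimeSpan.FromMinutes(17)); // just under the 20-minute cache timeout
        }
    }
}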
Here are some resources with source code to help you get started:
Good Simple Primer on requesting web URL in C#
Website Monitoring Windows Service
Asynchronous Website Monitor
As I mentioned before, you can easily extend these to "foreach" over an array or list of URLs to be requested.

ASP.NET AJAX Load Balancing Issues

This would be a question for anyone who has code in the App_Code folder and uses a hardware load balancer. It's true the hardware load balancer could be set to sticky sessions to solve the issue, but in a perfect world I would like the feature turned off.
When there are files in the App_Code folder and the site is not pre-compiled, IIS will generate random file names for them.
server1 "/ajax/SomeControl, App_Code.tjazq3hb.ashx"
server2 "/ajax/SomeControl, App_Code.wzp3akyu.ashx"
So when a user posts the page and gets transferred to the other server, nothing works.
Does anyone have a solution for this? I could change to a pre-compiled web-site, but we would lose the ability for our QA department to just promote the changed files.
Do you have the <machineKey> node on both servers set to the same value?
You can override the machine.config file in web.config to set this. This needs to match otherwise you can get strange situations like this.
Does your load balancer support sticky sessions? With this on, the balancer will route the same IP to the same server over and over within a certain time window. This way, all requests (AJAX or otherwise) from one client would always hit the same server in the cluster/farm.
Ok, first things first... the machineKey thing is true. That should absolutely be set to the same value on all of the load-balanced machines. I don't remember everything it affects, but do it anyway.
Second, go ahead and precompile the site. You can actually still push out new versions; whenever the .cs file for a page changes, that page gets recompiled. What gets tricky is the App_Code files, which get compiled into a single DLL. However, if a change is made in there, you can upload the new DLL and again everything should be fine.
To make things even easier, enable the "Use fixed naming and single page assemblies" option. This will ensure things have the same name on each compilation, so you just test and then replace the changed .dll files.
All of that said, you shouldn't be having an issue as is. The request goes to IIS, which just serves up the page and compiles as needed. If the code-behind is different on each machine, it really shouldn't matter: the code is the same, and each machine will reference its own code. The actual request/postback doesn't know or care about any of that. Everything I said above should help simplify things, but it should be working anyway... so it's probably a machineKey issue.
You could move whatever is in your app_code to an external class library if your QA dept can promote that entire library. I think you are stuck with sticky sessions if you can't find a convenient or tolerable way to switch to a pre-compiled site.
If it's a hardware load balancer, you shouldn't have an issue, because all it knows is the request URL; the server compiles the requested page and serves it.
The only issue I can think of that you might have is with session and view state.
It's true the hardware load balancer could be set to sticky sessions to solve the issue, but in a perfect world, I would like the feature turned off.
It appears that the <machineKey> is only for ViewState encryption. It doesn't affect the file names for auto-compiled assemblies.
I think the ASP.NET model has quite a bit of dependency on encryption and machine-specific storage, so I am not sure it works to avoid sticky IPs for sessions.
I don't know about ASP.NET AJAX (I use the MonoRail NJS approach instead), but session state could be an issue for you.
You have to make sure session state is serializable, and don't use in-memory session. You probably need to run the ASP.NET Session State Server to make sure the whole frontend farm uses the same session storage. In that case the session has to be perfectly serializable (that's why storing no objects in session is preferred; you should always use an ID, and I bet MS kept this limitation in mind when developing the AJAX library).
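To illustrate the "store an ID, not an object" point, a minimal sketch; the cart identifier is hypothetical, and the real object would be reloaded from the database on whichever server handles the request:
using System;
using System.Web;

// Keeping the session payload to simple, serializable values (here a Guid)
// lets it round-trip cleanly through StateServer or SQLServer session modes.
public static class CartSessionHelper
{
    public static void SetCurrentCartId(HttpSessionStateBase session, Guid cartId)
    {
        session["CartId"] = cartId;
    }

    public static Guid? GetCurrentCartId(HttpSessionStateBase session)
    {
        object value = session["CartId"];
        return value == null ? (Guid?)null : (Guid)value;
    }
}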
