Caching the profiles from SqlProfileProvider -- ProfileManager.GetAllProfiles result - asp.net

I'm using the SqlProfileProvider on one of my websites, and on one page I need to fetch the whole list of profiles (it is an intranet site).
The method I use is ProfileManager.GetAllProfiles(). The problem is that its performance is really bad, and it slows down the website considerably.
Therefore, I was thinking of caching the result of the method call in the Application scope as a DataTable (so I could filter/search on it as well).
My problem is that I have several servers running this web app, and I would like the caches to stay in sync. I started using memcached but was put off by some problems (hence going back to thinking about caching in the Application scope).
So, here are my questions:
Would it be efficient to store the DataTable containing the profiles in the Application object? Or, is it possible to store objects in the Cache and have them available for all clients/browsers?
Is it possible to add a (SQL) Cache Dependency to this cache?

You could cache the portions of the page that depend on the list of profiles by putting them in a user control and marking it as cacheable. A SqlCacheDependency expiration policy can be defined as well. As for the cache location, every web server in the farm will have its own copy in memory, but cache expiration will make sure that each copy does not drift out of sync with the data in the DB.
Page or fragment caching is the most effective caching technique because, unlike caching your model (a DataTable or whatever), you also avoid paying the price of HTML rendering.
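If you do want to cache the DataTable itself with a SQL dependency, here is a minimal sketch in C#. It assumes SQL cache notifications have been enabled for the database and table (via aspnet_regsql.exe) and that a <sqlCacheDependency> database entry named "ProfilesDb" exists in web.config; the cache key and entry names are illustrative.

using System;
using System.Data;
using System.Web;
using System.Web.Caching;
using System.Web.Profile;

public static class ProfileCache
{
    public static DataTable GetProfiles(HttpContext context)
    {
        // Shared across all clients: the Cache object is application-wide.
        var table = context.Cache["AllProfiles"] as DataTable;
        if (table == null)
        {
            table = BuildProfileTable(
                ProfileManager.GetAllProfiles(ProfileAuthenticationOption.All));

            // The entry is evicted automatically when the monitored table changes.
            context.Cache.Insert(
                "AllProfiles",
                table,
                new SqlCacheDependency("ProfilesDb", "aspnet_Profile"),
                Cache.NoAbsoluteExpiration,
                Cache.NoSlidingExpiration);
        }
        return table;
    }

    private static DataTable BuildProfileTable(ProfileInfoCollection profiles)
    {
        var table = new DataTable("Profiles");
        table.Columns.Add("UserName", typeof(string));
        table.Columns.Add("LastActivityDate", typeof(DateTime));
        foreach (ProfileInfo p in profiles)
            table.Rows.Add(p.UserName, p.LastActivityDate);
        return table;
    }
}

Note that each server in the farm still builds its own copy; the dependency only guarantees that no copy outlives a change to the underlying table.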

Related

Storing large amounts of data locally

I'm experienced with ASP.NET but some of my knowledge is a little shaky. I'm creating an application that uses services to get large portions of data. I then want to filter it with LINQ to what I need. This data rarely changes, but I know there's too much of it to fit in Session.
How can I store large amounts of rarely-changing data in memory? Would Application variables be suitable for this?
Instead of caching the source data, consider using HTTP caching via ASP.NET Output Cache or memcached to store the rendered output itself. ASP.NET OutputCache can be tuned to work on specific ASP.NET resources and there are many ways to invalidate the cache explicitly if need be. See the following MSDN resources for more information.
ASP.NET Caching Overview
Output Cache Configuration
OutputCache Attribute for ASP.NET MVC
You can store large amounts of fairly static data:
In the HttpContext.Cache
In a private static field of a C# class. Wrap the field with a public get property that initializes it on first access. The static field will need to be initialized when the app first starts, and any time the app domain is recycled by IIS (see the sketch after the pros/cons below).
HttpContext.Cache
Pros
Configurable cache-expiration policy.
Designed specifically for the ASP.NET environment.
Cons
Might be evicted from the cache if ASP.NET determines it is under-utilized or memory is needed.
Static Field
Pros
Complete user control over cache expiration.
Cons
Any cache expiration must be explicitly programmed; there is no built-in expiration support.
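A minimal sketch of the static-field option, assuming .NET 4's Lazy<T> for thread-safe one-time initialization; GetDataFromServices() is a placeholder for your real service call:

using System;
using System.Collections.Generic;

public static class ReferenceData
{
    // Rebuilt automatically on first access after an app-domain recycle.
    private static Lazy<List<Customer>> _customers =
        new Lazy<List<Customer>>(LoadCustomers);

    public static List<Customer> Customers
    {
        get { return _customers.Value; }
    }

    // Explicit "expiration": swap in a new Lazy so the next read reloads.
    public static void Refresh()
    {
        _customers = new Lazy<List<Customer>>(LoadCustomers);
    }

    private static List<Customer> LoadCustomers()
    {
        return GetDataFromServices();
    }

    private static List<Customer> GetDataFromServices()
    {
        return new List<Customer>(); // stub for the real service call
    }
}

public class Customer { public string Name { get; set; } }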

Is Caching in C# the right approach for me?

I've tried to read up on Caching in ASP.NET and still have a few questions.
When using a SQL Cache Dependency, I know that you can specify which tables will be monitored, but if a change happens to any one of those tables, does it reset the entire cache? I understand that I don't want to cache tables that will have frequent changes, but we could end up with a good handful of cached tables, and even if each table only gets a few updates a day, that could turn into 50-ish resets of the cache daily (within an 8-hour window).
I would be creating and maintaining this cache via a GAC DLL. A large number of different applications would be accessing that GAC at any one time. Does each application maintain its own copy of the cache or is it just stored in one global location (or possibly per app pool)?
Is there a physical location on the server where I can see how much space the Cache is currently consuming? This would be extremely pertinent if each application maintains its own Cache as that could end up taking large amounts of disk space.
Is there some way to physically force the cache to rebuild itself? I could see my boss assuming that the cache was at fault for a particular issue, and I'd need to be able to rule that out at the most basic level. Not "changing a record and saying that SHOULD rebuild the cache" but rather "doing [Action X] and KNOWING that whatever was in the cache is now gone".
Thanks in advance for your answers and time.
SqlCacheDependency only monitors tables in the old-style SQL 2000 approach, which relies on triggers and polling. The SQL 2005+ method monitors changes at the row level, and uses Service Broker. At the level of the Cache object, changes will invalidate just the Cache entries associated with the given SqlCacheDependency (not the entire cache).
Each application has a separate copy of the Cache. If you have many apps sharing the same data, you might consider creating a separate "caching server," and have your apps get their data from there, using WCF -- basically add another tier to your app.
You can look at a couple of cache-related performance counters, but if your concern is disk space, then there's nothing to worry about, since the ASP.NET cache is stored entirely in RAM. In addition, if RAM gets too full, one feature of the cache is that it will let go of old/infrequently referenced objects to make room for new objects.
The easiest way to force the cache to be dropped is to simply recycle your application or AppPool (which happens once a day or so by default anyway). If you want something more targeted, you would need to write some code to forcibly remove certain items from the cache, either using Cache.Remove() or using linked dependencies.
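If you want that kind of guaranteed, targeted flush without recycling, one known pattern is to link every entry to a shared "master" key through a CacheDependency; removing the master key then evicts all linked entries in one call. A minimal sketch (key names are illustrative):

using System;
using System.Web;
using System.Web.Caching;

public static class FlushableCache
{
    private const string MasterKey = "cache-master";

    public static void Add(string key, object value)
    {
        var cache = HttpRuntime.Cache;

        // Make sure the master key exists before linking to it.
        if (cache[MasterKey] == null)
            cache.Insert(MasterKey, DateTime.UtcNow);

        // Link the entry to the master key (a "linked dependency").
        cache.Insert(
            key,
            value,
            new CacheDependency(null, new[] { MasterKey }),
            Cache.NoAbsoluteExpiration,
            Cache.NoSlidingExpiration);
    }

    // "[Action X]": after this call you KNOW the linked entries are gone.
    public static void FlushAll()
    {
        HttpRuntime.Cache.Remove(MasterKey);
    }
}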
Off the top of my head:
Only the cache entries that depend on that table will be invalidated.
Each web application has its own cache.
The cache is stored in memory. See the question "How to determine total size of ASP.Net cache?" regarding cache size.
This may help: http://bit.ly/vsqNDl

Synchronizing local cache with external application

I have two separate web applications:
The "admin" application where data is created and updated
The "public" application where data is displayed.
The information displayed on the "public" changes infrequently, so I want to cache it.
What I'm looking for is the "simplest possible thing" to update the cache on the public site when a change is made in the admin site.
To throw in some complexity, the application is running on Windows Azure. This rules out file and sql cache dependencies (at least the built in ones).
I am running both applications on a single web role instance.
I've considered using Memcached for this purpose, but since I'm not really after a distributed cache, and its performance is not as good as an in-memory cache (System.Runtime.Caching), I want to try and avoid it.
I've also considered using NServiceBus (or the Azure equivalent) but again, this seems overkill just to send a notification to clear the cache.
What I'm thinking (maybe a little hacky, but simple):
Have a controller action on the public site that clears the in-memory cache. I'm not bothered about clearing specific cached items; the data doesn't change enough for me to worry about that. When the "admin" application makes a change, it makes an HttpWebRequest to the clear-cache action on the public site.
Since the database is the only shared resource between the two applications, just add a table holding the datetime of the last update. The public site would run a query on every request and compare the database's last-update datetime to one held in memory; if they don't match, clear the cache.
Any other recommendations or problems with the above options? The key thing here is simple and high performance.
Option 1, where you have a controller action to clear the cache, won't work if you have more than one instance; if you know you have one and only one instance, it should work just fine.
Option 2, where you have a table that stores the last update time, works fine for multiple instances but incurs the cost of a SQL database query per request, and for a heavily loaded site this can be an issue.
Probably fastest and simplest is to use option 2 but store the last update time in table storage rather than a SQL database. Reads to table storage are very fast -- under the covers it's a simple HTTP GET.
Having a public controller that you can call to tell the site to clear its cache will work as long as you only have one instance of the main site. As soon as you add a second instance, as calls go through the load balancer, your one call will only go to one instance.
If you're not concerned about how soon the update makes it from the admin site to the main site, the best-performing and easiest (but not the cheapest) solution is to use the Azure AppFabric Cache, configured to use a local (in-memory) cache with a short-ish timeout (say 10 minutes).
The first time your client tries to access an item, this is what happens:
Look for the item in local cache
It's not there, so look for the item in the distributed cache
It's not there either so load the item from persistent storage
Add the item to the cache with a long-ish time to live (48 hours is the default I think)
Return the item
Steps 1 and 2 are taken care of for you by the library; the other bits you need to write. Any subsequent calls in the next X minutes will return the item from the in-memory cache. After X minutes it falls out of the local cache; the next call loads it from the distributed cache back into the local cache, and you can carry on.
All your admin app needs to do is update the database and then remove the item from the distributed cache. The next time the item falls out of the local cache on the client, it will simply reload the data from the database.
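A minimal sketch of that two-level lookup. IDistributedCache here is a hypothetical stand-in for the AppFabric client API; only MemoryCache is a real .NET type, and the timeouts are illustrative:

using System;
using System.Runtime.Caching;

public interface IDistributedCache
{
    object Get(string key);
    void Put(string key, object value, TimeSpan timeToLive);
}

public class TwoLevelCache
{
    private readonly MemoryCache _local = MemoryCache.Default;
    private readonly IDistributedCache _distributed;
    private readonly Func<string, object> _loadFromStorage;

    public TwoLevelCache(IDistributedCache distributed,
                         Func<string, object> loadFromStorage)
    {
        _distributed = distributed;
        _loadFromStorage = loadFromStorage;
    }

    public object Get(string key)
    {
        // 1. Look in the local (in-memory) cache.
        var item = _local.Get(key);
        if (item != null) return item;

        // 2. Not there, so look in the distributed cache.
        item = _distributed.Get(key);
        if (item == null)
        {
            // 3. Not there either, so load from persistent storage and
            // 4. add it to the distributed cache with a long-ish TTL.
            item = _loadFromStorage(key);
            _distributed.Put(key, item, TimeSpan.FromHours(48));
        }

        // Refill the local cache with a short-ish timeout.
        _local.Set(key, item, DateTimeOffset.UtcNow.AddMinutes(10));
        return item; // 5. return the item
    }
}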
If you like this idea but don't want the expense of using the caching service, you could do something very similar with your database idea. Keep the cached data in a static variable and just check for updates every x minutes rather than with every request.
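A minimal sketch of that cheaper variant; GetLastUpdatedFromDb() and LoadDataFromDb() are placeholders for your own queries:

using System;

public static class PollingCache
{
    private static readonly object _sync = new object();
    private static object _data;
    private static DateTime _dataVersion;
    private static DateTime _nextCheckUtc;

    public static object GetData()
    {
        lock (_sync)
        {
            // Check the shared timestamp at most once per interval,
            // not on every request.
            if (DateTime.UtcNow >= _nextCheckUtc)
            {
                _nextCheckUtc = DateTime.UtcNow.AddMinutes(5);
                var latest = GetLastUpdatedFromDb();
                if (_data == null || latest > _dataVersion)
                {
                    _data = LoadDataFromDb();
                    _dataVersion = latest;
                }
            }
            return _data;
        }
    }

    private static DateTime GetLastUpdatedFromDb() { return DateTime.UtcNow; } // stub
    private static object LoadDataFromDb() { return new object(); } // stub
}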
In the end I used Azure Blobs as cache dependencies. I created a file change monitor to poll for changes to the files (full details at http://ben.onfabrik.com/posts/monitoring-files-in-azure-blob-storage).
When a change is made in the admin application I update the blob. When the file change monitor detects the change we clear the local cache.

Cache data returned by stored procedures?

I have an Asp.Net MVC 3 site. The following is the call stack
Web page/jQuery: $(document).Ready(.... Ajax calls... render the page...)
=> MVC Control methods
=> Entity framework 4.1
=> mapped stored procedures (SQL Server 2008)
Question:
Where is the best place to implement cache?
How do I let the page know that the underlying SQL Server tables have been updated?
Not sure about the "best" way to do it, but one way would be to have an MVC controller action that calls the database to check whether the data has been updated (you can do it by timestamp).
The resulting function will then retrieve the data either from the cache or from the server.
http://davidwalsh.name/cache-ajax
The one interesting thing to note, however, is that you should make sure the call to find out whether you can use cached content is faster than not caching the content at all.
Try to add caching as close to the source as possible. This way more of your app could gain benefits from the improved speed.
If you control the code that modifies the underlying tables, you could invalidate the cache from there. You could also place a short timeout on your cache: if it's a heavily used query, caching it for even a second can increase speed many-fold. Make sure to test the performance gain so that you can tweak the timeouts.
For question #2, you may want to look into Query Notifications. Setting everything up is a bit complicated, but that will enable you to do things such as caching until the data in your database has been updated.
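A minimal sketch of a Query Notifications subscription via SqlDependency, assuming Service Broker is enabled on the database and the query meets the notification rules (two-part table names, explicit column list); the connection string and table are illustrative:

using System;
using System.Data.SqlClient;

class QueryNotificationDemo
{
    const string ConnStr =
        "Data Source=.;Initial Catalog=Shop;Integrated Security=true";

    static void Main()
    {
        SqlDependency.Start(ConnStr); // once per app domain

        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            "SELECT CustomerId, Name FROM dbo.Customers", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += (sender, e) =>
            {
                // Fires once when the result set changes; subscriptions are
                // single-shot, so re-query and re-subscribe here.
                Console.WriteLine("Data changed: {0}", e.Info);
            };

            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read()) { /* populate the cache */ }
            }
        }

        Console.ReadLine();
        SqlDependency.Stop(ConnStr);
    }
}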
One way is to cache rendered views for some specified time.
Let's say that you have a page that is not updated often. Instead of hitting the database on every visit, you can store the rendered view in the cache. This is achieved using output caching - http://www.asp.net/mvc/tutorials/improving-performance-with-output-caching-cs.
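A minimal sketch of output caching an MVC 3 action; the duration and VaryByParam values are illustrative:

using System.Web.Mvc;

public class ProductsController : Controller
{
    // The rendered HTML is cached for 60 seconds per distinct id.
    [OutputCache(Duration = 60, VaryByParam = "id")]
    public ActionResult Details(int id)
    {
        var model = LoadProduct(id); // placeholder for the EF/stored-proc call
        return View(model);
    }

    private object LoadProduct(int id) { return new object(); } // stub
}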
Another way is to cache the data itself.
Here again you can cache it for some specified time. In ASP.NET (MVC) this is achieved using the Cache object - http://msdn.microsoft.com/en-us/library/aa478965.aspx.
The Cache object lets you specify how long data is to be cached when you put it in. For example:
Cache.Insert("key",
myTimeSensitiveData,
null,
DateTime.Now.AddMinutes(1),
TimeSpan.Zero);
Or you can cache until it is 'invalidated'.
Say you have GetCustomers and UpdateCustomer methods. In GetCustomers you check whether the data is in the Cache. If not, you hit the database, put the result in the cache, and return it. It stays in the cache until someone calls UpdateCustomer. In that method you write the modified customer to the database and invalidate the data stored in the Cache; you can just remove it. That way, when GetCustomers is called again, it hits the database and repopulates the Cache. But remember that the Cache has global scope and is accessible from many threads at the same time, so you will need some synchronization code around access to it.
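A minimal sketch of that pattern, with a lock around cache access; the repository calls are placeholders:

using System.Collections.Generic;
using System.Web;

public class CustomerService
{
    private const string CacheKey = "customers";
    private static readonly object _sync = new object();

    public List<Customer> GetCustomers()
    {
        lock (_sync)
        {
            var customers = HttpRuntime.Cache[CacheKey] as List<Customer>;
            if (customers == null)
            {
                customers = QueryCustomersFromDb();
                HttpRuntime.Cache.Insert(CacheKey, customers);
            }
            return customers;
        }
    }

    public void UpdateCustomer(Customer customer)
    {
        SaveCustomerToDb(customer);
        lock (_sync)
        {
            // Invalidate: the next GetCustomers call repopulates from the DB.
            HttpRuntime.Cache.Remove(CacheKey);
        }
    }

    private List<Customer> QueryCustomersFromDb() { return new List<Customer>(); } // stub
    private void SaveCustomerToDb(Customer customer) { } // stub
}

public class Customer { public int Id { get; set; } public string Name { get; set; } }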

Custom caching in ASP.NET

I want to cache custom data in an ASP.NET application. I am putting lots of data into it, such as List<object> collections and other objects.
Is there a best practice for this? If I use static data and w3wp.exe dies or gets recycled, the cache will need to be filled again.
The database is also being updated by other applications, so a thread would be needed to make sure the cache holds the latest data.
Update 1:
Just found this, which probably helps me:
http://www.codeproject.com/KB/web-cache/cachemanagementinaspnet.aspx?fid=229034&df=90&mpp=25&noise=3&sort=Position&view=Quick&select=2818135#xx2818135xx
Update 2:
I am using DotNetNuke as the application ( :( ). I have enabled persistent caching and now the whole application feels sluggish.
For example, a MultiView takes about 3 seconds to swap views...
Update 3:
Strategies for Caching on the Web?
Related to this, I am using the DotNetNuke caching method, which in turn uses the ASP.NET Cache object; it also has file-based caching.
I have a helper:
CachingProvider.Instance().Add( _
(label & "|") + key, _
newObject, _
Nothing, _
Cache.NoAbsoluteExpiration, _
Cache.NoSlidingExpiration, _
CacheItemPriority.NotRemovable, _
Nothing)
which adds the objects to the cache. Is this correct? I want to keep things cached as long as possible. I have a thread that runs every x minutes to update the cache, but I have noticed the cache is getting emptied (I check for a "CacheFilled" object in the cache).
As a test I've told the worker process not to recycle, etc., but still it seems to clear out the cache. I have also changed the DotNetNuke settings from "heavy" to "light" but think that is for module caching.
You are looking for either out of process caching or a distributed caching system of some sort, based upon your requirements. I recommend distributed caching, because it is very scalable and is dedicated to caching. Someone else had recommended Velocity, which we have been evaluating and thoroughly enjoying. We have written several caching providers that we can interchange while we are evaluating different distributed caching systems without having to rebuild. This will come in handy when we are load testing the various systems as part of the final evaluation.
In the past, our legacy application cached a random assortment of items. There were DataTables, DataViews, Hashtables, Arrays, etc., and there was no logic to what was used at any given time. We have started to move to caching just our domain-object collections (the objects are POCOs). Using generic collections is nice because we know that everything is stored the same way. It is very simple to run LINQ operations on them, and if we need a specialized "view" to be stored, the system is efficient enough that we can store a specific collection of objects.
We also have put an abstraction layer in place that pretty much brokers calls between either the DAL or the caching model. Calls through this layer will check for a cache miss or cache hit. If there is a hit, it will return from the cache. If there is a miss, and the call should be cached, it will attempt to cache the data after retrieving it. The immediate benefit of this system is that in the event of a hardware or software failure on the machines dedicated to caching, we are still able to retrieve data from the database without having a true outage. Of course, the site will perform slower in this case.
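A minimal sketch of such a broker, under the assumption that the cache-client calls are injected as delegates (placeholders for whatever caching system is plugged in):

using System;
using System.Collections.Generic;

public class CachedRepository<T>
{
    private readonly Func<string, IList<T>> _loadFromDal;
    private readonly Func<string, IList<T>> _cacheGet;
    private readonly Action<string, IList<T>> _cachePut;

    public CachedRepository(Func<string, IList<T>> loadFromDal,
                            Func<string, IList<T>> cacheGet,
                            Action<string, IList<T>> cachePut)
    {
        _loadFromDal = loadFromDal;
        _cacheGet = cacheGet;
        _cachePut = cachePut;
    }

    public IList<T> Get(string key)
    {
        IList<T> items = null;
        try
        {
            items = _cacheGet(key); // cache hit?
        }
        catch (Exception)
        {
            // Caching tier unreachable: degrade to the database, no outage.
        }

        if (items == null)
        {
            items = _loadFromDal(key); // cache miss (or cache down)
            try { _cachePut(key, items); } catch (Exception) { /* best effort */ }
        }
        return items;
    }
}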
Another thing to consider with distributed caching systems is that since they are out of process, you can have multiple applications use the same cache. There are some interesting possibilities there, involving sharing data between applications, real-time manipulation of data, etc.
Also have a look at the MS Enterprise Library Caching Application Block, which allows you to write custom expiration policies, custom stores, etc.
http://msdn.microsoft.com/en-us/library/cc309502.aspx
You can also check "Velocity" which is available at
http://code.msdn.microsoft.com/velocity
This will be useful if you wish to scale your application across servers...
There are lots of articles about the Cache object in ASP.NET and how to make it use SqlDependencies and other types of cache expirations. No need to write your own. And using the Cache is recommended over session or any of the other collections people used to cram lots of data into.
Cache and Session can lead to sluggish behaviour, but sometimes they're the right solution: the rule of the right tool for the right job applies.
Personally, I've often created collections in pseudo-static singletons for the kind of role you describe (typically to avoid I/O overheads, like storing a compiled XslTransform), but it's very important to keep in mind that this kind of cache is fragile. Design it to (a) file-watch or otherwise monitor whatever it's supposed to cache, where appropriate, and (b) recreate/populate itself on use: it should expect to get flushed frequently.
Essentially I recommend it as a performance crutch, but don't rely on it for anything requiring real persistence.
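A minimal sketch of that kind of fragile, self-repopulating cache, using a file dependency so edits to the source file flush the entry; the path is illustrative:

using System.Web;
using System.Web.Caching;
using System.Xml.Xsl;

public static class TransformCache
{
    private const string Key = "report-transform";
    private static readonly string Path =
        HttpRuntime.AppDomainAppPath + @"App_Data\report.xslt";

    public static XslCompiledTransform Get()
    {
        var transform = HttpRuntime.Cache[Key] as XslCompiledTransform;
        if (transform == null)
        {
            // Recreate on every miss; expect to be flushed frequently.
            transform = new XslCompiledTransform();
            transform.Load(Path);

            // File watch: editing the .xslt evicts the entry.
            HttpRuntime.Cache.Insert(Key, transform, new CacheDependency(Path));
        }
        return transform;
    }
}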