I am a learner. I am learning caching in ASP.NET. There are three types of caching in ASP.NET.
1. Page output caching.
2. Partial output caching.
3. Data caching.
In page output caching, the entire rendered output of a page is saved in the cache, so the page does not have to re-execute on every request.
In partial output caching, we can apply caching rules to different parts of a page.
But I did not understand data caching.
Could anyone please explain data caching to me?
Thanks in advance.
In simple terms, data caching is storing data in memory for quick access. Typically, information that is costly to obtain (in terms of performance) is stored in the cache. One of the more common items stored in a cache in a Web application environment is frequently displayed database values; by caching such information, rather than relying on repeated database calls, the demand on the Web server's and database server's system resources is decreased and the Web application's scalability is increased. As Microsoft eloquently puts it, "Caching is a technique widely used in computing to increase performance by keeping frequently accessed or expensive data in memory. In the context of a Web application, caching is used to retain pages or data across HTTP requests and reuse them without the expense of recreating them."
Read more: .NET Data Caching
It is about caching application data (using the Cache class): keeping certain objects (values) in memory so they can be reused across requests.
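As a rough illustration, here is a minimal sketch of data caching with the Cache class (the "TopProducts" key and the GetTopProductsFromDatabase helper are made up for the example):

    using System;
    using System.Data;
    using System.Web;
    using System.Web.Caching;

    public static class ProductCache
    {
        public static DataTable GetTopProducts()
        {
            // Try the cache first; HttpRuntime.Cache is the same Cache object
            // exposed as Page.Cache / HttpContext.Cache.
            var products = HttpRuntime.Cache["TopProducts"] as DataTable;

            if (products == null)
            {
                // Cache miss: do the expensive work once...
                products = GetTopProductsFromDatabase();

                // ...and keep the result for 10 minutes.
                HttpRuntime.Cache.Insert("TopProducts", products, null,
                    DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
            }

            return products;
        }

        private static DataTable GetTopProductsFromDatabase()
        {
            // Placeholder for the real (expensive) database query.
            return new DataTable("TopProducts");
        }
    }

Every request after the first one within the 10-minute window is served from memory instead of hitting the database.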
Related
I'm experienced with ASP.NET but some of my knowledge is a little shaky. I'm creating an application that uses services to get large portions of data. I then want to filter it with LINQ to what I need. This data rarely changes, but I know there's too much of it to fit in Session.
How can I store rarely-changing, large amounts of data in memory? Would Application variables be suitable for this?
Instead of caching the source data, consider using HTTP caching via ASP.NET Output Cache or memcached to store the rendered output itself. ASP.NET OutputCache can be tuned to work on specific ASP.NET resources and there are many ways to invalidate the cache explicitly if need be. See the following MSDN resources for more information.
ASP.NET Caching Overview
Output Cache Configuration
OutputCache Attribute for ASP.NET MVC
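For illustration, a minimal sketch of the MVC OutputCache attribute mentioned above (the controller, action, and service names are assumptions):

    using System.Web.Mvc;

    public class ProductsController : Controller
    {
        // Cache the rendered HTML of this action for 10 minutes,
        // keeping a separate copy per "id" value.
        [OutputCache(Duration = 600, VaryByParam = "id")]
        public ActionResult Details(int id)
        {
            var model = LoadProductFromService(id); // assumed expensive service call
            return View(model);
        }

        private object LoadProductFromService(int id)
        {
            // Placeholder for the real service call and LINQ filtering.
            return new { Id = id };
        }
    }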
You can store large amounts of fairly static data:
In the HttpContext.Cache
In a private static field of a C# class. Wrap the field with a public get property that initializes it on first access (a short sketch follows after the pros and cons below). The static field will need to be initialized when the app first starts, and any time the app domain is recycled by IIS.
HttpContext.Cache
Pros
Configurable cache expiration policy.
Designed specifically for the ASP.NET environment
Cons
Might be evicted from the cache if ASP.NET determines it is under-utilized or memory is scarce.
Static Field
Pros
Complete user control over cache expiration.
Cons
Any cache expiration must be explicitly programmed. No built-in expiration support.
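Here is the sketch of the static-field option referred to above (the ReferenceData type and the LoadCountriesFromService helper are made-up names):

    using System;
    using System.Collections.Generic;

    public static class ReferenceData
    {
        // Loaded once per app domain, on first access; the data lives until
        // IIS recycles the application (no automatic expiration).
        private static readonly Lazy<IList<string>> _countries =
            new Lazy<IList<string>>(LoadCountriesFromService);

        public static IList<string> Countries
        {
            get { return _countries.Value; }
        }

        private static IList<string> LoadCountriesFromService()
        {
            // Placeholder for the real (expensive) service call.
            return new List<string> { "placeholder" };
        }
    }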
I have an ASP.NET application that I'm moving to Azure. In the application, there's a query that joins 9 tables to produce a user record. Each record is then serialized to JSON and sent back and forth with the client. To improve query performance, the first time the 9-table join runs and the record is serialized to JSON, the resulting string is saved to a table called JsonUserCache. The table only has 2 columns: JsonUserRecordID (which is unique) and JsonRecord. Each time a user record is requested by the client, the JsonUserCache table is queried first to avoid having to run the query with the 9 joins. When the user logs off, the records they created in the JsonUserCache are deleted.
The JsonUserCache table lives in SQL Server. I could simply leave everything as is, but I'm wondering if there's a better way. I'm thinking about creating a simple dictionary that will store the key/values and putting that dictionary in AppFabric. I'm also considering a NoSQL provider, if there's an option for Azure, or whether I should just stick to a dictionary in AppFabric. Or is there another alternative?
Thanks for your suggestions.
"There are only two hard problems in Computer Science: cache invalidation and naming things."
Phil Karlton
You are clearly talking about a cache, and as a general principle you should not persist cached data (in SQL or anywhere else), because you then have the problem of expiring the cache and having to do the deletes (as you currently are). If you insist on storing your result somewhere and don't mind the clearing up afterwards, then look at putting it in an Azure blob - this is easily accessible from the browser and doesn't require that the request be handled by your own application.
To implement it as a traditional cache, look at these options.
Use out-of-the-box ASP.NET caching, where you cache in memory on the web role. This means that your join will be re-run on every instance the user lands on, but depending on the number of instances and the duration of the average session, this may be the simplest to implement.
Use AppFabric Cache. This is an extra API to learn and has additional costs which may get quite high if you have lots of unique visitors.
Use a specialised distributed cache such as Memcached. This has the added cost/hassle of having to run it all yourself, but gives you lots of flexibility in the long run.
Edit: All are RAM based. Using ASP.NET caching is simpler to implement and is faster at retrieving data from the cache because it is on the same machine - BUT it requires the cache to be populated on each instance of the web role (i.e. it is not distributed). AppFabric caching is distributed but is also a bit slower (network latency) and, depending on what you mean by scalable, AppFabric caching currently behaves a bit erratically at scale - so make sure you run tests. If you want scalable, feature-rich distributed caching, and it is a big part of your application, go and put in Memcached.
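For illustration, a minimal sketch of the first option (in-process ASP.NET caching of the serialized record); the key format, the sliding window, and the BuildJsonUserRecord helper are assumptions:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class JsonUserRecordCache
    {
        public static string GetJsonRecord(int jsonUserRecordId)
        {
            string key = "JsonUserRecord:" + jsonUserRecordId;
            var json = HttpRuntime.Cache[key] as string;

            if (json == null)
            {
                // The expensive 9-table join plus JSON serialization.
                json = BuildJsonUserRecord(jsonUserRecordId);

                // A sliding window roughly replaces the explicit delete-on-logoff:
                // the entry simply disappears after 20 idle minutes.
                HttpRuntime.Cache.Insert(key, json, null,
                    Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
            }

            return json;
        }

        private static string BuildJsonUserRecord(int jsonUserRecordId)
        {
            // Placeholder for the real query and serialization.
            return "{}";
        }
    }

Keep in mind that each web role instance fills its own copy, so the join may still run once per instance a given user lands on.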
I have been reading that lots of people use Redis or another key-value store/NoSQL solution as a distributed cache for their website.
Maybe I'm not understanding completely, but it seems a solution like this only works for shared data. For example, if I have a website that requires a user to log in, and the queries they generate return data specific to only that user (in my case, banking/asset information) that can't be cached for all users, this type of solution doesn't work.
Unfortunately, the database is shared across all our applications, and when it gets bogged down, the website gets bogged down as well. Since each user has gigabytes of information, I obviously can't cache all of it, and each web page queries completely different information.
Is there some caching strategy that I can employ for this type of scenario?
A distributed cache like Velocity doesn't require that the data it stores be limited to "shared" data. But you do have to read the data from your DB and store it in the cache, which takes time.
A few alternatives:
Partition your data, so it's spread out among several DB servers
Add as much RAM as you can to each DB server, to allow SQL Server to cache what it can
There are many variations to the partitioning theme....
Is your web app load balanced? There are caching options at the web tier as well -- the ASP.NET object cache is a good place to start.
It's possible that your web clients are requesting the same data more than once (for a given user). So caching could give a benefit in that case.
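If it helps, here is a minimal sketch of caching user-specific data by putting the user in the cache key (the AssetSummary type, key format, and query helper are made up and not tied to any particular cache product):

    using System;
    using System.Web;
    using System.Web.Caching;

    public class AssetSummary { /* fields omitted for brevity */ }

    public static class AssetSummaryCache
    {
        public static AssetSummary GetAssetSummary(string userName)
        {
            // Including the user name in the key makes per-user data cacheable.
            string key = "AssetSummary:" + userName;
            var summary = HttpRuntime.Cache[key] as AssetSummary;

            if (summary == null)
            {
                summary = LoadAssetSummaryFromDatabase(userName); // expensive per-user query

                // A short lifetime keeps banking data reasonably fresh while still
                // absorbing repeated requests from the same user.
                HttpRuntime.Cache.Insert(key, summary, null,
                    DateTime.UtcNow.AddMinutes(2), Cache.NoSlidingExpiration);
            }

            return summary;
        }

        private static AssetSummary LoadAssetSummaryFromDatabase(string userName)
        {
            return new AssetSummary(); // placeholder for the real query
        }
    }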
But before you go implementing a huge caching solution, you really need to look at the queries that are particularly slow or executed a huge number of times and see if you can optimize them in any way.
Then look at upgrading your DB machine.
I read a nice article about the performance issues MySpace had during their period of huge growth.
You can find the article here.
One quote from the article that stands out:
The addition of the cache servers is "something we should have done from the beginning, but we were growing too fast and didn't have time to sit down and do it," Benedetto adds.
If the problem is in your database server, think about partitioning your data and making use of a database farm to spread the load. Also think about SSDs! They can really speed up your database access.
Depending on how dynamic your data is, you could consider using fragment caching. This caches the HTML of the page rather than the data, so if the volume of data is prohibitive to cache, this might work for you.
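A minimal sketch of fragment caching via a cacheable user control (the control name and the RenderedAt label are hypothetical; in a real project the label would be declared in the .ascx markup):

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    // [PartialCaching] caches this user control's rendered HTML for
    // 120 seconds, independently of the rest of the page.
    [PartialCaching(120)]
    public partial class ProductSummary : UserControl
    {
        // Normally declared by the .ascx designer file; shown here so the sketch stands alone.
        protected Label RenderedAt = new Label();

        protected void Page_Load(object sender, EventArgs e)
        {
            // Runs only when the cached fragment has expired; otherwise the
            // previously rendered HTML is served and this code is skipped.
            RenderedAt.Text = DateTime.Now.ToString("T");
        }
    }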
I'm using the SqlProfileProvider on one of my websites and in one page I need to fetch the whole list of profiles (it is an intranet).
The method that I use is the ProfileManager.GetAllProfiles(). The problem is that its performance is really bad and it slows down the website considerably.
Therefore, I was thinking of caching the result of the method call in the Application scope as a DataTable (so I could filter/search on it as well).
My problem is that I have several servers running this webapp, and I would like the cache to be in sync. I started using memcached but I was put off by some problems (hence going back to thinking in caching in the Application scope).
So, here are my questions:
Would it be efficient to store the DataTable containing the profiles in the Application object? Or, is it possible to store objects in the Cache and have them available for all clients/browsers?
Is it possible to add a (SQL) Cache Dependency to this cache?
You could cache the portions of the web page that depend on the list of profiles by putting them in a user control and marking it as cacheable. A SqlCacheDependency cache expiration policy could be defined as well. As for the cache location, every web server in the farm will have its own version in memory, but using cache expiration will make sure that this version is not out of sync with the data in the DB.
Page or fragment caching is the most effective caching technique because, unlike caching your model (a DataTable or whatever), you don't pay the price of HTML rendering.
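If you do decide to cache the DataTable itself, a rough sketch with a SqlCacheDependency might look like this (the "ProfileDb" entry name, the table name, and the LoadAllProfiles helper are assumptions; the database and table first have to be enabled for notifications, e.g. with aspnet_regsql):

    using System.Data;
    using System.Web;
    using System.Web.Caching;

    public static class ProfileListCache
    {
        public static DataTable GetAllProfiles()
        {
            var profiles = HttpRuntime.Cache["AllProfiles"] as DataTable;

            if (profiles == null)
            {
                // Built from ProfileManager.GetAllProfiles() and converted to a DataTable.
                profiles = LoadAllProfiles();

                // "ProfileDb" must match an entry under <sqlCacheDependency> in web.config.
                var dependency = new SqlCacheDependency("ProfileDb", "aspnet_Profile");
                HttpRuntime.Cache.Insert("AllProfiles", profiles, dependency);
            }

            return profiles;
        }

        private static DataTable LoadAllProfiles()
        {
            return new DataTable("Profiles"); // placeholder for the real load/conversion
        }
    }

Each server in the farm still holds its own copy, but the dependency drops each copy when the underlying table changes, which keeps them from going stale.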
I have to port a smaller Windows Forms application (a product configurator) to an ASP.NET app which will be used on a large company's website. Demand should be moderate because it's for a specialized product line.
I don't have access to a database and using XML is a requirement from their web developers.
There are roughly 30 different products with roughly 300 different possible configurations stored in the XML files, plus linked questions/answers that lead to a product recommendation, and some production options. The app is available in 6 languages.
How would you solve the 'data access' layer, if you can call it that? I thought of reading/deserializing the XML files into their objects, storing them in ASP.NET's cache if they're not there already, and then reading from the cache on subsequent requests. But that would mean all the objects live in memory day and night.
Is that even necessary, or smart, performance-wise? As I said before, the app is not that big and the XML files are not that large. Could I just create some Repository class that reads the XML files whenever an object is requested (i.e. 'Product Details' or 'Next question') and returns it that way, and drive memory consumption down?
The whole approach seems to assume a single server. First consider whether this is appropriate, as you mentioned a "large company's website", which raises a red flag for me. If you need the site to scale, you will end up having more than a single server, which rules out relying on a simple local file.
If you are constrained to using that, analyze what data is more appropriate to keep in the cache (it does not change often, it's long-lived, the same info is requested several times). Try to keep the cached data separated from the non-cached data, which will reduce the amount of info in the more dynamic files. If you expect big amounts of information, consider splitting the files in a way appropriate to your domain.
I use the Cache whenever I can. I cache objects upon their first request. If memory is of any concern, I set an expiration policy. And whether it is or not, when short on memory, the framework will unload cached items anyway.
Since it is per application and not per user, it makes sense to have it, especially if the relative footprint is small.
If you have to expand to multiple servers later, you can access the same file over the network or modify the DA layer to retrieve data by any other means (services, DB, etc.). The caching code will stay the same and performance will be virtually unaffected.
If you set a dependency, the objects will always stay current.
I'm for it.
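A rough sketch of that approach (the ProductCatalog type, file path, and cache key are made up; the CacheDependency on the XML file is what keeps the cached objects current):

    using System.IO;
    using System.Web;
    using System.Web.Caching;
    using System.Web.Hosting;
    using System.Xml.Serialization;

    public class ProductCatalog { /* products, questions, answers omitted */ }

    public static class CatalogRepository
    {
        public static ProductCatalog GetCatalog()
        {
            var catalog = HttpRuntime.Cache["ProductCatalog"] as ProductCatalog;

            if (catalog == null)
            {
                string path = HostingEnvironment.MapPath("~/App_Data/products.xml");

                // Deserialize the XML only on a cache miss.
                using (var stream = File.OpenRead(path))
                {
                    var serializer = new XmlSerializer(typeof(ProductCatalog));
                    catalog = (ProductCatalog)serializer.Deserialize(stream);
                }

                // The file dependency drops the entry whenever products.xml is edited.
                HttpRuntime.Cache.Insert("ProductCatalog", catalog, new CacheDependency(path));
            }

            return catalog;
        }
    }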
Using the cache, and setting an appropriate expiration policy as advised by others, is a sound approach. I'd suggest you look at using LINQ to XML as the basis for your data access code, as it is so much easier to use than traditional methods of querying XML. You can find a decent introduction here.
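To give a flavour of it, a minimal LINQ to XML sketch (the element and attribute names "product", "id", and "name" are assumptions about your files):

    using System.Linq;
    using System.Xml.Linq;

    public static class ProductQueries
    {
        public static string GetProductName(string path, string productId)
        {
            XDocument doc = XDocument.Load(path);

            // Find the <product> element whose id attribute matches and return its <name>.
            return doc.Descendants("product")
                      .Where(p => (string)p.Attribute("id") == productId)
                      .Select(p => (string)p.Element("name"))
                      .FirstOrDefault();
        }
    }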