Advantages and disadvantages of using caching in an asp.net application? - asp.net

What are the advantages and disadvantages of using caching in an asp.net application?

Answers will vary based on Environments and Technology.
Advantages
Reduce load on Web Services / Database (see the cache-aside sketch after this list)
Increase performance
Reliability (assuming a database-backed cache: if a server goes down, the cache contents survive in the database, so no time is wasted repopulating an in-memory cache from scratch)
Disadvantages
Could run into issues syncing caches
Increased Maintenance
Scalability Issues
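As a minimal sketch of the "reduce load" advantage above, here is the classic cache-aside pattern using the built-in ASP.NET cache; the Product type, the cache key and the LoadProductsFromDatabase call are made up for illustration:

    using System;
    using System.Collections.Generic;
    using System.Web;
    using System.Web.Caching;

    public class ProductService
    {
        // Cache-aside: serve from the ASP.NET cache when possible and hit the
        // database only on a miss, then keep the result for 10 minutes.
        public List<Product> GetProducts()
        {
            var products = HttpRuntime.Cache["Products"] as List<Product>;
            if (products == null)
            {
                products = LoadProductsFromDatabase();   // the expensive call we want to avoid
                HttpRuntime.Cache.Insert(
                    "Products",
                    products,
                    null,                                 // no cache dependency
                    DateTime.UtcNow.AddMinutes(10),       // absolute expiration
                    Cache.NoSlidingExpiration);
            }
            return products;
        }

        // Hypothetical data-access call standing in for the real query.
        private List<Product> LoadProductsFromDatabase()
        {
            return new List<Product>();
        }
    }

    public class Product { }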
With great power comes great responsibility ;). We ran into an issue where we decided to use HttpContext.Cache (bad idea) in a distributed application. Early in the project someone decided to just throw it in there, and we didn't hit issues until we went live. Whenever it comes to caching you need to look at the big picture: ask yourself, do we have enough data, enough users, or a performance requirement that warrants implementing caching?
If you answer yes then you are probably going to need a farm of servers so choose your caching provider wisely.
With all that being said, Microsoft has a newer cache API, AppFabric Caching (formerly "Velocity"), that you could leverage, and it handles the distribution and syncing of the cache auto-magically.
AppFabric caching supports time-out eviction and even built-in notification eviction, so as your data changes the cache server takes note of it, and periodically the cache client checks in with the server and gets a list of items it needs to sync.
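A minimal sketch of the AppFabric client API, assuming the cache client assemblies are installed, the cache hosts are configured in web.config, and a cache named "default" exists; the key and the userProfile value are made up:

    using System;
    using Microsoft.ApplicationServer.Caching;   // AppFabric cache client

    public class AppFabricCacheExample
    {
        public void CacheUserProfile(object userProfile)
        {
            // Reads the cache-host settings from configuration and connects
            // to the named cache "default".
            var factory = new DataCacheFactory();
            DataCache cache = factory.GetCache("default");

            // Time-out eviction: the item is evicted about 10 minutes after Put.
            cache.Put("user:42:profile", userProfile, TimeSpan.FromMinutes(10));

            // Later, possibly from a different web server in the farm:
            object profile = cache.Get("user:42:profile");
        }
    }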

http://msdn.microsoft.com/en-us/library/xsbfdd8c%28VS.71%29.aspx
Advantage: performance
Disadvantage: new data is not displayed immediately

Related

Is it worth adding an Azure Cache for Redis instance Output Cache provider to a single instance Web App?

I have a single instance ASP.NET MVC website running on Azure. I'm working on improving its speed.
ASP.NET Output Caching was added for faster page loads, with good results.
I was reading up on the possibility of using an Azure Redis instance as the Output Cache.
My thinking is:
The default Output Cache is best for single-instance apps; it should be the fastest because it runs on the same machine
Azure Redis Cache would most likely be slower, since it would add an extra cache lookup roundtrip between the Web App and the Redis instance
Is that correct?
Correct. Given that all of your requests are being processed within the same application, it's sufficient to use in-memory caching.
Azure Redis Cache would be beneficial if you had multiple processes which all wanted to share the same cache, e.g. if your website was running across multiple containers.
It depends on what you are trying to achieve. An in-memory cache will be quicker than Redis, but say you restart your app: the cache then needs to be repopulated. In cases where you are caching large reference data, that can be a real overhead. You can use a combination of in-memory and Redis in such a case, which also acts as a fail-safe if something goes wrong.
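A rough sketch of that combined approach, assuming the StackExchange.Redis client, a placeholder Azure Redis connection string, and an arbitrary five-minute local lifetime:

    using System;
    using System.Runtime.Caching;
    using StackExchange.Redis;

    public class TwoLevelCache
    {
        private static readonly MemoryCache Local = MemoryCache.Default;
        private static readonly ConnectionMultiplexer Redis =
            ConnectionMultiplexer.Connect("mycache.redis.cache.windows.net:6380,password=...,ssl=True");

        public string Get(string key)
        {
            // 1. In-process cache: fastest, but emptied by an app restart or recycle.
            var value = Local.Get(key) as string;
            if (value != null)
                return value;

            // 2. Redis: an extra network round trip, but shared and durable
            //    across restarts (and across instances if you scale out later).
            value = Redis.GetDatabase().StringGet(key);
            if (value != null)
                Local.Set(key, value, DateTimeOffset.UtcNow.AddMinutes(5));

            return value;
        }
    }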

ASP.NET Session limit best practice

We're running a PaaS ASP.NET application in an Azure App Service with 3 instances and managing session data outproc in a SQL Server database.
The application is live and we've noticed a large amount of session data for some users when following certain paths, e.g. some users have session data upwards of 500k (for a simple site visit with no login the average session is around the 750 - 3000 mark, which is what I'd expect).
500k sounds excessive, but I was wondering what is normal in large enterprise applications these days and what the cons of holding so much data in session are.
My initial thoughts would be,
No effect on Web App CPU (possibly a decrease, in fact) because we're not constantly doing queries,
No effect on Web App memory because we're running outproc,
Large spikes in DTU on the SQL Server session database when garbage collection runs,
Application may be a bit slower because it takes longer to read and write session data between requests,
May not be ideal for users with poor internet connections,
Possible increase in memory leaks if objects aren't scoped correctly.
Does my reasoning make sense or have I missed something?
Any thoughts and advice would be appreciated,
Many thanks.
I totally agree with your reasoning behind using out-proc session management in Azure App Service instances. Using in-proc sessions in the cloud is a strict no: the reason to host in the cloud is to have high availability, which is achieved by having a distributed environment.
From your question I assume that speed is a concern, as it is for most web applications. To overcome this, you might think of using Azure Redis Cache.
Here is the documentation for configuring session management using Azure Redis Cache: https://learn.microsoft.com/en-us/azure/redis-cache/cache-aspnet-session-state-provider
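As a rough way to find which code paths push sessions toward the 500k mark, you can measure the serialized size of a value before putting it in session. This is only an approximation and assumes the value is [Serializable]; the helper name is made up, but the SQL Server provider does use BinaryFormatter for non-primitive session values:

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    public static class SessionSizeCheck
    {
        // Returns the approximate number of bytes the value will occupy
        // once serialized into the session database.
        public static long ApproximateSize(object sessionValue)
        {
            using (var stream = new MemoryStream())
            {
                new BinaryFormatter().Serialize(stream, sessionValue);
                return stream.Length;
            }
        }
    }

    // e.g. log a warning before adding anything unexpectedly large to Session:
    // if (SessionSizeCheck.ApproximateSize(basket) > 100 * 1024) { /* log it */ }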

AppFabric vs asp.net cache with sqldependency performance

I'm working on a plan to increase the performance and scalability of a web app by caching a user database for a WCF web service. The goals are to increase performance by accessing this data in-process rather than making a round trip to the database server, and to increase scalability of the service by reducing the load on the database server, thus allowing more web servers to be added to increase scale.
In researching AppFabric, I really don't see the value in my situation, because it seems like for the most part I'm just replacing a round trip to the database with a round trip to a cache cluster (which seems like it might even have more overhead than the DB in order to keep nodes in sync).
For the performance question, it seems like using the asp.net cache (in process) would be much faster than a round trip to the cache cluster, even though the data is in memory on those servers, and even if some of it is cached locally (I believe that would still be out of process from the web app).
For the scalability issue, it also seems easier to add identical web servers to a web farm (each caching the user data in process) rather than manage a cache cluster separately, which adds complexity.
With that said, could someone explain why I would choose one approach over the other, given my stated goals? If you recommend the AppFabric approach, can you explain how the performance would be better than storing data in the asp.net cache in process?
Thanks!
You are right that the AppFabric cache is stored out of process.
When a request comes in for an AppFabric cache item, there is first a lookup to find which node holds the item, then a WCF net.tcp call to get it. Therefore, it will be slower than asp.net caching. But there are times when AppFabric caching is better:
You do not lose the cache when the application pool is recycled.
If you have 100 web servers, you need to get the data from the database once, not 100 times.
If you are running the Enterprise Edition of Windows you do not lose the cache if a machine goes down.
I found this topic on codeproject. Hope it can answer your question
You should consider NCache as another option. NCache is an extremely fast in-memory distributed cache which reduces the performance bottlenecks associated with the database and enhances the scalability of the app.
As far as use of the asp.net cache is concerned, you should keep its limitations in mind as well. It is good for small web farms only; when the number of servers grows, the asp.net cache may end up with performance and scalability issues due to its in-process nature. In a larger web farm you need an in-memory distributed cache. Read this for reference.

How can I use caching to improve performance?

My scenario is: WebApp -> WCF Service -> EDMX -> Oracle DB
When I want to bind the grid I fetch records from the Oracle DB through the EDMX, i.e. a LINQ query. But this degrades performance, as multiple layers sit between the WebApp and the Oracle DB. Can I use a caching mechanism to improve performance? As far as I know the cache is shared across the whole application, so if I update the cache another user might receive wrong information. Can we use caching per user? Or is there any other way to improve the performance of the application?
Yes, you can definitely use caching techniques to improve performance. Generally speaking, caching is “application wide” (or it should be) and the same data is available to all users, but this really depends on your scenario and implementation. I don't see how adding the extra caching layer would degrade performance; it's a sound architecture and well worth the extra complexity.
ASP.NET Caching has a concept of "cache dependencies" which is a method to notify the caching mechanism that the underlying source has changed, and the cached data should be flushed and reloaded on the next request. ASP.NET has a built-in cache dependency for SQL Server, and a quick Google search revealed there’s probably also something you can use with Oracle.
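For the SQL Server case mentioned above, a minimal cache-dependency sketch might look like this; it assumes the database and table have been enabled for SQL cache notifications (e.g. via aspnet_regsql) and that "MyDatabase" matches an entry under <sqlCacheDependency> in web.config, and all names are made up:

    using System;
    using System.Web;
    using System.Web.Caching;

    public class ProductCache
    {
        public void CacheProducts(object productList)
        {
            // Evict the cached item whenever the Products table changes.
            var dependency = new SqlCacheDependency("MyDatabase", "Products");

            HttpRuntime.Cache.Insert(
                "ProductList",
                productList,
                dependency,
                Cache.NoAbsoluteExpiration,
                Cache.NoSlidingExpiration);
        }
    }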
As Jakob mentioned, application-wide caching is a great way to improve performance. Generally user context-agnostic data (eg reference data) can be cached at the application level.
You can also cache user-context data by storing it in the user's session when they log in. The data is then cached for the duration of that user's session (HttpContext.Session).
Session data can be configured to be stored in the web application's process memory, in a state server (the ASP.NET State Service, a separate Windows service) or in a SQL Server database, depending on the architecture and infrastructure.
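In code, that split might look roughly like the following; the keys and the countryList/recentSearches values are purely illustrative:

    using System;
    using System.Web;
    using System.Web.Caching;

    public class CachingScopes
    {
        public void Demo(object countryList, object recentSearches)
        {
            // Application-wide: one copy shared by every user (reference data).
            HttpRuntime.Cache.Insert(
                "Countries", countryList, null,
                DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);

            // Per-user: kept only in the current user's session, so refreshing
            // it cannot show another user the wrong information.
            HttpContext.Current.Session["RecentSearches"] = recentSearches;
            object searches = HttpContext.Current.Session["RecentSearches"];
        }
    }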

Which .net architecture should I implement for 10,000 concurrent users for web application [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 3 years ago.
I need to create a web application for task delegation, monitoring and report generation for an organization.
I am thinking ASP.NET with MVC should be the architecture to support so many concurrent users. Is there any other, better architecture that I should be looking at?
What kind of server configuration will be required to host this application?
How do I test with so many users connected before I launch this application? Are there any free/economical testing tools available?
Thanks in advance.
anil
The choice of MVC versus WebForms has little/nothing to do with the ability of the app to handle load. Your problems will be reading/writing to the DB, and that doesn't change no matter which of the two you choose.
ideas for improving ability to handle load:
first and foremost: the absolute minimum is two servers, a web server and a DB server; they should NEVER run on the same box.
DB:
Efficient queries towards the DB, indexes in the DB, denormalizing tables that are hit a lot, CACHE, CACHE CACHE, running the DB in a cluster, oh, and did I mention CACHING?
Processing:
if you need heavy processing, do this in web services that can run on separate machines from the web servers so that you can scale out (buy more servers and put them behind a load balancer if needed)
WEB:
avoid the need for server affinity (so that it doesn't matter which web server serves a given user at any given time); this means using the DB or StateServer to store sessions and syncing the MachineKey across the servers.
the decision to use MVC or not has NO impact on the ability to handle 10k concurrent users; however, it's a HUGE benefit to use MVC if you want the site to be unit-testable
remember: Applications are either testable or detestable, your choice
Cache Cache Cache Cache :-) a smart caching policy will make even one server go a long way. Aside from that, you will need to find out where your bottleneck will be. If your application is database heavy, then you will need to consider scaling your database, either by clustering or sharding. If you expect your web server to be the bottleneck (for example if you are doing a lot of processing, like image processing), then you can put a load balancer in front to distribute requests among N servers in your web farm.
For a setup this large I would highly recommend using a distributed memory caching provider layered above your database. I would also really suggest using an ORM that has built-in support for that memory cache, like NHibernate, since in an application of this scale your biggest bottleneck will definitely be your database.
You will most likely need a web farm for this scenario. Even if a single server is strong enough currently, at some point in the near future you will most likely outgrow a single box, which is why it's important to architect the distributed cache in first, so you can grow your farm out and not have to re-architect your entire system. A rough sketch of that idea follows.
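One hedged way to "architect it in first" is to hide the cache behind a small interface so the provider can be swapped without touching application code; everything here is illustrative, with MemoryCache standing in for the single-server case:

    using System;
    using System.Runtime.Caching;

    public interface ICacheProvider
    {
        T Get<T>(string key) where T : class;
        void Set<T>(string key, T value, TimeSpan timeToLive) where T : class;
    }

    // Single-server implementation backed by the in-process MemoryCache.
    public class InMemoryCacheProvider : ICacheProvider
    {
        private static readonly MemoryCache Cache = MemoryCache.Default;

        public T Get<T>(string key) where T : class
        {
            return Cache.Get(key) as T;
        }

        public void Set<T>(string key, T value, TimeSpan timeToLive) where T : class
        {
            Cache.Set(key, value, DateTimeOffset.UtcNow.Add(timeToLive));
        }
    }

    // A Redis/AppFabric/NCache-backed provider would implement the same
    // interface, so scaling out to a farm does not force a rewrite.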
