Unwanted ASP.NET MVC3 server-side caching

I'm working on an MVC3 app and I've come across an issue with objects being cached unintentionally.
My code is creating objects from calls to a separate custom business logic dll.
This business logic dll gets data from a database.
After I change data in the database, I'm still seeing the old data, even after closing my browser and re-running the application. It's not a browser caching issue because I can see it when I'm debugging in the development environment.
In development, if I stop the asp.net development server, then re-run the app, I get the new data.
In IIS, if I restart the website, I get the new data.
Any idea why asp.net is caching and re-using these objects, even after they have gone out of scope?
The business logic dll does have some caching built into it, so maybe that's the main issue. In that case, I guess the question is whether there is some way I can tell asp.net to wipe out the objects once the session is over.

There's no caching by default in ASP.NET MVC3, at least no caching of data. Make sure your IIS settings are correct and you don't accidentally use the OutputCacheAttribute.
As for caching in the business layer: I've seen at least three caching-related problems in the last two days. Keep in mind: caching is tricky, and so are static variables. If it's not necessary, don't do it. Caching is extremely powerful, but it's also dangerous. That is also true for the aforementioned OutputCacheAttribute.
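For reference, this is roughly what accidental output caching looks like in MVC3; the controller, action, and business-layer call are hypothetical names:

```csharp
// Hypothetical MVC3 controller showing how OutputCacheAttribute causes
// server-side caching of action results (sometimes applied via a base
// controller and forgotten about).
using System.Web.Mvc;
using System.Web.UI;

public class ProductsController : Controller
{
    // Caches the rendered result on the server for 5 minutes; every
    // request during that window sees the same (possibly stale) data.
    [OutputCache(Duration = 300, VaryByParam = "none",
                 Location = OutputCacheLocation.Server)]
    public ActionResult Index()
    {
        var products = BusinessLayer.GetProducts(); // hypothetical BLL call
        return View(products);
    }
}
```

If an attribute like this is present (directly, on a base controller, or via a global filter), removing it or reducing the Duration is the first thing to check.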

It sounds to me like you're creating your data context statically, rather than creating a new one and destroying it after every request. This is a bad thing to do for a lot of reasons.
When you say the business layer has "some caching", what does that mean? How are you caching?
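To illustrate the suspected problem, here is a minimal sketch contrasting a static data context with a per-request one; MyDataContext and the controller are hypothetical names:

```csharp
// Sketch: create the data context per request and dispose it, instead of
// holding a static instance that outlives requests and keeps serving the
// first data it ever loaded.
using System.Linq;
using System.Web.Mvc;

public class ProductsController : Controller
{
    // BAD: shared across all requests for the application's lifetime.
    // private static MyDataContext db = new MyDataContext();

    public ActionResult Index()
    {
        // GOOD: one context per request, disposed when the request ends,
        // so every request sees current database state.
        using (var db = new MyDataContext())
        {
            var products = db.Products.ToList();
            return View(products);
        }
    }
}
```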

Related

Is caching using Application slower or more problematic than using Global.asax.cs static variable?

We have a Webforms application that stores a bunch of settings and terminology mappings (several hundred) that are used throughout the application.
http://www.dotnetperls.com/global-variables-aspnet makes these assertions:
The Application[] collection .... may be slower and harder to deal with.
the Application[] object ...is inefficient in ASP.NET.
Is this recommended? Yes, and not just by the writer of this article. It is noted in Professional ASP.NET by Apress and many sites on the Internet. It works well.
So I am wondering if these statements are true. Can anyone elaborate on why using Application is slower or what kind of problems can crop up if you use Application? I am sort of assuming that any problems or slowdowns might only surface under production loads, so that is why I am asking for real world experience, rather than just benchmarking myself.
I am aware that there are many alternatives to caching (HttpRuntime.Cache, memcached, etc) but specifically I want to know if I need to go back and rewrite my legacy code that uses Application[]. Specifically if in any way I am incurring a performance penalty I would want to get rid of that.
How are you saving these settings? I would recommend the web.config.
If you're using the web.config to store these settings (if they're application variables that's a solid place to start), then no need for Application variables.
I try to steer clear of the Application level variables because they are way more expensive than Session variables.
Also, variables in the web.config / app.config files can change without having to change code and/or recompile your project.
The Application class (global variables) only exists in ASP.NET to help with backwards compatibility with classic ASP; you could say it's deprecated.
Another thing you could look into would be caching your settings so you're not always reading from disk.
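As a rough sketch of that suggestion: settings can live in web.config and be read through ConfigurationManager, with the parsed value held in a static field so it's only parsed once per app domain. The key name is illustrative:

```csharp
// Sketch: read an app setting from web.config once and cache the parsed
// value. ConfigurationManager itself caches the file contents, so this
// mainly saves repeated string parsing on every request.
//
// <appSettings>
//   <add key="MaxPageSize" value="50" />
// </appSettings>
using System.Configuration;

public static class AppSettings
{
    // Parsed once, on first use of the class; falls back to a default
    // if the key is missing.
    private static readonly int maxPageSize =
        int.Parse(ConfigurationManager.AppSettings["MaxPageSize"] ?? "50");

    public static int MaxPageSize { get { return maxPageSize; } }
}
```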

SQL Server State for large asp.net application and any advantages of writing own Custom Store Provider

Background
We have a large ASP.NET application that stores a lot of data in session, such as DataSets and DataTables.
We want to support web farms for this application, so we want to save the session state in SQL Server.
I am successfully storing all the required data in SQL Server and getting all the data back fine as well.
Our supported databases are SQL Server 2005 through SQL Server 2008.
We have to store DataTables and DataSets in session, even though we know it is going to be a bit expensive.
Question
I want to know from other developers whether there is any advantage to using a custom session-state store provider to store the data (any help in debugging, error finding, future-proofing, etc.).
Or should I just change the web.config and make all the classes serializable to make it work?
Is there any custom way to make all the related classes serializable using C# code?
Is there any better way to intervene in the process .NET uses to store session data in SQL Server (the default process after changing the web.config) and improve it, by changing one or more classes?
Thanks,
I would go with marking your business objects as [Serializable]. It should be a lot leaner than storing DataTables/DataSets.
The best way to make your classes serializable is to simply decorate them with [Serializable]; I don't think you need anything else besides that.
If you use a load balancer with sticky sessions, I would actually go with the out-of-process StateServer mode, as it should perform faster than SQLServer mode.
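A minimal sketch of the [Serializable] approach; the class and its members are hypothetical:

```csharp
// Sketch: decorating a session-stored type so the SQL Server session
// provider can serialize it. All names here are illustrative.
using System;

[Serializable]
public class OrderSummary
{
    public int OrderId;
    public decimal Total;

    // Members that cannot or should not be serialized (open connections,
    // event handlers, UI state) can be excluded with [NonSerialized].
    [NonSerialized]
    public object UiStateCache;
}

public static class Program
{
    public static void Main()
    {
        // Quick sanity check that the attribute took effect.
        Console.WriteLine(typeof(OrderSummary).IsSerializable); // prints True
    }
}
```

Note that [Serializable] is not inherited, so every class stored in session (and every type reachable from its serialized fields) needs the attribute.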
I think you can go with your point 2.
There is no magic/automated way to change all your classes to be serializable; you either use the attribute way or the interface way, but some classes could need fixes or changes depending on what the property types are.
Apart from that, everything should work smoothly once everything is serializable, and yes, you touch the web.config and all should work.
If some of those objects are not user-specific but can be shared among users, a possible alternative could be AppFabric: if you configure a cache cluster (which could consist of multiple machines), you can then save objects in that cache. But as I said before, this depends on your application: are those objects absolutely user-specific, so they MUST be in the session and not in a shared cache? Have a look at this answer: AppFabric vs System.Runtime.Caching

Sharing Data Between Two Web Applications in ASP.NET

I have a web application (MainApplication) where many of the pages contain a custom Web Control that looks for some content in a cache. If it can't find any data within the cache, then it goes out to a database for the content. After retrieving the content, the Control displays the content on the page.
There is a web application (CMS) in a subdirectory within the aforementioned web application. Users use this CMS to update the content pulled in by the MainApplication.
When a user updates some content using the CMS, I need the CMS to clear the relevant portion of the cache used by the MainApplication. The problem is that, as two different web applications, they can't simply interact with the same static cache object.
The ideal solution would be to somehow share an instance of a cache object between both web applications.
Failing that, what would be the best (performance-wise) way of communicating between the two web applications? Obviously, writing/reading to a database would defeat the purpose. I was thinking about a flat file?
Update
Thank you all for your help. Your wonderful answers actually gave me the right search terms to discover that this was a duplicate question (sorry!): Cache invalidation between two web applications
We had the exact same setup in a previous project I worked on, where we had one ASP.NET Web Application (with MCMS backing) and another ASP.NET Web Application to display data.
Completely different servers (same domain, though).
However, when an "editor" updated content in the CMS application, the UI was automatically refreshed.
How? Glad you asked.
We stored the content in SQL Server, and used Replication. :)
The "frontend" Web Application would read the data from the database (which was replicated by the CMS system).
Now - we don't cache this data, because in the database, we actually stored the markup (the HTML) for the control. Therefore we dynamically re-rendered the HTML.
Why is that "defeating the purpose"?
You can't get one application to "invalidate" the cache on another application.
If you're going down this path, you need to consider a distributed caching engine (e.g. Velocity).
One option that comes to my mind in such scenario is using Velocity distributed cache mechanism. Do read about it and give it a try if possible http://msdn.microsoft.com/en-us/magazine/dd861287.aspx
In ASP.NET there is the notion of Cache Dependency. You can have a look here: http://www.codeproject.com/KB/web-cache/CachingDependencies.aspx or http://www.devx.com/dotnet/Article/27865/0/page/5.
There is also the Enterprise Library Caching Block available here that adds some feature to the standard stuff: http://msdn.microsoft.com/en-us/library/ff649093.aspx
Now, if you're running on .NET 4, there is a new System.Runtime.Caching namespace that you should definitely use: http://msdn.microsoft.com/en-us/library/system.runtime.caching.aspx
This article here "Caching in ASP.NET with the SqlCacheDependency Class" is quite interesting: http://msdn.microsoft.com/en-us/library/ms178604.aspx
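Tying those links together, a sketch of SqlCacheDependency-based invalidation might look like the following. The connection-string name, table name, and loader method are assumptions, and the database and table must first be enabled for notifications (e.g. `aspnet_regsql.exe -ed -et -t Content`):

```csharp
// Sketch: cache content with a SQL dependency so that when the CMS
// application updates the Content table, the cached entry in the main
// application is evicted automatically.
using System.Web;
using System.Web.Caching;

public static class ContentCache
{
    public static string GetContent(string key)
    {
        var cached = HttpRuntime.Cache[key] as string;
        if (cached != null)
            return cached;

        string content = LoadContentFromDatabase(key); // hypothetical loader
        // "MainDb" must match a <sqlCacheDependency> database entry
        // configured in web.config.
        var dependency = new SqlCacheDependency("MainDb", "Content");
        HttpRuntime.Cache.Insert(key, content, dependency);
        return content;
    }

    private static string LoadContentFromDatabase(string key)
    {
        return "..."; // placeholder
    }
}
```

Because the dependency watches the database rather than the other application's memory, it sidesteps the "two applications can't share a static cache" problem entirely.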

Concurrency ASP.NET best-practices worst-practices

In which cases do you need to watch out for concurrency problems (and use lock, for instance) in ASP.NET?
Are there 'best practices' around on this topic?
Documentation?
Examples?
'Worst practices'... or things you've seen that can cause a disaster?
I'm curious about, for instance, singletons (even though they are considered bad practice; don't start a discussion on this), static functions (do you need to watch out here?), ...
Since ASP.NET is a web framework and is mainly stateless, there are very few concurrency concerns that need to be addressed.
The only thing that I have ever had to deal with is managing application cache but this is easily done with a cache-management type that wraps the .NET caching mechanisms.
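A minimal sketch of such a cache-management wrapper, assuming nothing beyond a dictionary and a lock (all names are illustrative):

```csharp
// Sketch: a thread-safe cache wrapper that locks around the
// check-then-insert sequence, so two concurrent requests can't both miss
// and each run the (possibly expensive) factory.
using System;
using System.Collections.Generic;

public static class SafeCache
{
    private static readonly Dictionary<string, object> store =
        new Dictionary<string, object>();
    private static readonly object sync = new object();

    public static T GetOrAdd<T>(string key, Func<T> factory)
    {
        lock (sync)
        {
            object value;
            if (!store.TryGetValue(key, out value))
            {
                value = factory();   // runs at most once per key
                store.Add(key, value);
            }
            return (T)value;
        }
    }
}
```

The trade-off is that the lock is held while the factory runs; ConcurrentDictionary (or double-checked locking per key) relaxes that if load times matter.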
One huge problem that caused us a lot of grief was using Modules vs. Classes in our main Web Service. This was before we really knew what we were doing and has since been fixed.
The big problem with using modules is that, by default, any module-level variables are shared across every request in the ASP.NET worker process. We pass in multiple datasets, manipulate them, and then return them to the client. Because we were using modules, the variables holding these datasets were getting corrupted by multiple calls occurring at the same time.
This was not caught in testing and was difficult to reproduce until we figured out how to properly load-test our web services. It took something like 10-20 requests per second before we could reproduce it accurately.
In the end, we just changed all the modules to classes and then used those classes instead of calls to the modules. This cleared up the concurrency issue, as each instantiated class had its own copy of the dataset in memory.
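The module-vs-class difference maps to static versus instance fields. This small illustrative example (all names hypothetical) shows why instance state is safe where shared state is not:

```csharp
// A VB.NET module compiles to shared (static) state, like SharedData
// below: one copy for the whole worker process, visible to every request.
// Instance fields belong to each instantiated object, so concurrent
// requests can't stomp on each other's data.
using System;

public class RequestHandler
{
    public static string SharedData;   // one copy per worker process
    public string InstanceData;        // one copy per instantiated handler
}

public static class Program
{
    public static void Main()
    {
        var a = new RequestHandler { InstanceData = "request A" };
        var b = new RequestHandler { InstanceData = "request B" };
        RequestHandler.SharedData = "set by request B";

        Console.WriteLine(a.InstanceData);            // prints request A
        Console.WriteLine(RequestHandler.SharedData); // prints set by request B
    }
}
```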

Code reusability - App_Code or BIN or UserControls?

I recently had a discussion on another forum with another developer, and the topic was code reuse in ASP.NET. The stated scenario was that he needs to update code frequently on production servers during uptime, and this results in Session state getting reset for all users. He is avoiding putting shared code or classes into the App_Code folder, or precompiled DLLs into the Bin folder, because any update will also reset Session state.
The solution he has come up with is to put his shared code into UserControls and reference them wherever required. This enables him to update only the UserControl files, which are recompiled dynamically on the next request without forcing a Session restart. Note that the UserControls are not intended to have any UI; they probably only house some business logic.
I tried to convince him against this because it felt intrinsically wrong to me - but I could not provide any hard facts to support my claim that this was a very bad way of doing things. The only thing I could think of is that it violates the principle of separation of business logic from UI. Am I grossly mistaken or are there concrete reasons why this should not be done? Links or examples would be helpful.
Note: Using out-of-process Session state is not an option at present, nor have they been able to decide on scheduled downtimes. Also, since this is a site under active development, they don't seem to be using any sort of professional deployment model yet.
Thanks in advance.
Edit: Additionally, it would be helpful if someone could clarify exactly why the Session restarts in the above mentioned cases.
It does seem like an unusual approach, and persistent session is the obvious answer. Assuming the reasons not to use persistent session are legitimate, sometimes you just have to go with whatever works. I'd make a point of clearly documenting the unusual use of UserControls in the source files and live with it.
To answer the "why does session get reset" edit: with in-process session, all the session data is held in memory as part of your application. Various changes to the web site (web.config edits, among others I don't recall off the top of my head) cause the application to restart, wiping out all current state in your application. Persisting to SQL Server or to the out-of-process session state server would allow the application to restart and lose its in-memory state without affecting the session data.
It sounds like the main problem is that he's updating production code too frequently. Other than that, UserControls seem like a perfectly reasonable place to put business logic, especially if you have a good naming convention for them or can put them in a common folder.
May I ask why out-of-process session state isn't an option, really?
Since this guy seems to put in so much effort to get around this "problem", wouldn't he be better off looking at better solutions? Out-of-process session state is the only good solution.
I'll agree with Dennis; there are really no issues moving from InProc to the state server. Not sure what your dev/deployment platforms are, but they should include a session state service: start that up, change your web.config, and the problem is solved.
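For reference, a sketch of the web.config change being described; the host, port, and timeout values are illustrative defaults:

```xml
<!-- Sketch: switch session from InProc (wiped on app restart) to the
     out-of-process ASP.NET State Service. -->
<configuration>
  <system.web>
    <sessionState mode="StateServer"
                  stateConnectionString="tcpip=localhost:42424"
                  timeout="20" />
  </system.web>
</configuration>
```

With this in place, objects stored in Session must be serializable, which is the same constraint discussed in the SQL Server session question above.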
It's a clever (and ugly) solution to a common problem.
The main problem is the architecture of such a system. The code that needs to be updated can be put in a separate service outside his web app; his code-behind can then call these services, and the services can be updated when needed without affecting the web app.
Every base has been covered already, but I really hate bad practices like this. If the guy can't simply change to a state server to fix the problem that he has, then he doesn't really deserve the help. What would happen if he put his class in the root folder of the project and compiled it independently? Either way, I would think this guy is a bad developer for not thinking about scalability, and not planning for downtime. What I'm guessing is he doesn't have a development environment available. Tsk tsk tsk.
As an answer to your question, as stated by everyone else, put the code in a user control, and document well.
