Concurrency in ASP.NET: best practices and worst practices

In which cases do you need to watch out for concurrency problems (and use a lock, for instance) in ASP.NET?
Are there any 'best practices' on this topic?
Documentation?
Examples?
'Worst practices', or things you've seen that can cause a disaster?
I'm curious about, for instance, singletons (even though they are considered bad practice - don't start a discussion on this), static functions (do you need to watch out here?), ...

Since ASP.NET is a web framework and is mostly stateless, there are very few concurrency concerns that need to be addressed.
The only thing I have ever had to deal with is managing the application cache, but this is easily handled with a cache-management type that wraps the .NET caching mechanisms.
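A minimal sketch of such a wrapper (the CacheManager name and GetOrAdd shape are my own illustration, not from the original answer), built on System.Runtime.Caching:

    using System;
    using System.Runtime.Caching;

    public static class CacheManager
    {
        private static readonly MemoryCache Cache = MemoryCache.Default;
        private static readonly object SyncRoot = new object();

        // Returns the cached value for a key, or builds and caches it.
        public static T GetOrAdd<T>(string key, Func<T> factory, TimeSpan ttl) where T : class
        {
            var cached = Cache.Get(key) as T;
            if (cached != null) return cached;

            lock (SyncRoot)                        // avoid concurrent factory runs
            {
                cached = Cache.Get(key) as T;      // double-check inside the lock
                if (cached != null) return cached;

                var value = factory();
                Cache.Set(key, value, DateTimeOffset.UtcNow.Add(ttl));
                return value;
            }
        }
    }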

One huge problem that caused us a lot of grief was using Modules instead of Classes in our main web service. This was before we really knew what we were doing, and it has since been fixed.
The big problem with using modules is that, by default, any module-level variable is shared across every request handled by the ASP.NET worker process. We pass in multiple datasets, manipulate them, and return them to the client. Because we were using modules, the variables holding these datasets were being corrupted by multiple calls occurring at the same time.
This was not caught in testing and was difficult to reproduce until we figured out how to properly load-test our web services. It took something like 10-20 requests per second before we could reproduce it reliably.
In the end we changed all the modules to classes and used instances of those classes instead of calls to the modules. This cleared up the concurrency issue, since each instantiated class had its own copy of the dataset in memory.
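To make the failure mode concrete: a VB.NET Module compiles to what is effectively a static class, so module-level variables are static fields shared by every request. A hypothetical C# rendering of the before and after (OrderModule and OrderProcessor are invented names):

    using System.Data;

    // Before: a VB.NET Module is effectively a static class, so this one
    // DataSet is shared by every request and gets corrupted under load.
    public static class OrderModule
    {
        public static DataSet Current;

        public static void Process(DataSet ds)
        {
            Current = ds;          // request B can overwrite request A here
            // ... manipulate Current ...
        }
    }

    // After: a class instantiated per request; each instance owns its data.
    public class OrderProcessor
    {
        private readonly DataSet _current;

        public OrderProcessor(DataSet ds) { _current = ds; }

        public void Process()
        {
            // ... manipulate _current; nothing is shared across requests ...
        }
    }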

How secure are singletons in ASP.NET?

Are singletons in ASP.NET shared between users/sessions? And if they are, are there any safety considerations? Think serialization/deserialization vulnerabilities, thread safety, etc.
Is a singleton the way to go for storing settings from the database that are the same for all users?
Hand crafting the anti-pattern called "Singleton" in C# code is a really bad idea in general, ASP.NET or not.
The singleton lifetime that is supported in the dependency injection framework is a good idea if it does what you need.
I would advise you to use it only for read-only data, like settings, though. You no longer have an application running on a single desktop machine as of old: your application might be recycled on the fly, or stretched across multiple nodes in a server farm. So suddenly your "singleton" is only a singleton if there is a single instance of your program running. Building your application so that this becomes a problem (i.e. the framework supports multiple instances, but your own code is built to fail if you actually run more than one) is not a smart way to go about this.
So, to summarize: singleton lifetime in your dependency injection container? Might be okay; it depends on your use case. An actual "Singleton" pattern in your code? Bad. Very bad. It tells me you don't actually do any unit testing, and that there's no plan to take this application beyond a few thousand hobbyist users who don't care if your app is down every time you deploy.
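For illustration, here is roughly what the endorsed singleton lifetime looks like with a built-in dependency injection container; ISettingsProvider, CachedSettingsProvider, and the LoadFromDb() call are names invented for this sketch:

    using System.Collections.Generic;

    public interface ISettingsProvider
    {
        string Get(string key);
    }

    public sealed class CachedSettingsProvider : ISettingsProvider
    {
        private readonly IReadOnlyDictionary<string, string> _settings;

        // Read-only after construction, so one instance is safe to share.
        public CachedSettingsProvider(IReadOnlyDictionary<string, string> settings)
        {
            _settings = settings;
        }

        public string Get(string key) { return _settings[key]; }
    }

    // In the composition root: one shared, immutable instance per process.
    // On a farm, each node still builds its own copy, as the answer warns.
    // services.AddSingleton<ISettingsProvider>(new CachedSettingsProvider(LoadFromDb()));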

Lock convoys in ASP.NET website

TL;DR:
I refactored for performance and the website got slower. I ran the Concurrency Visualizer, and the graph looks like the lock convoys described on MSDN.
Context
I’m helping refactor an ASP.NET website, switching user controls from performing business logic on datasets to performing presentation logic on business objects, and also reducing the number of database calls made from the user controls.
The issue
We noticed a significant performance drop (hangs/blocking) after introducing changes that we thought would be performance improvements in multiple areas.
We’re using Lean Sentry to monitor our websites’ performance. According to the hang diagnostics, the thread pool was running out of threads and (according to the descriptions on the diagnostics page) when GC runs, it stops more threads from being created. The GC Heap and Gen 0 were consuming a lot of memory (~ 9 GB), according to the memory diagnostics.
What have I done so far?
I used the memory profiler in Visual Studio and identified issues with our excessive DataAdapter and DataTable usage. Memory consumption dropped to 3 GB, but that only helped with the GC blocking. The site is still slower than it was before we introduced the changes, and I still see blocking under high load caused by functions like CompilationLock.GetLock() and BuildManager.GetBuildResultFromCacheInternal(). Googling them didn’t return anything useful.
This is a website that uses JIT compilation. I had assumed that the issue with CompilationLock might be because of JIT compiling and wanted to run the website precompiled, but one of our global Utilities classes caused ambiguity with some other Utilities class/namespace that I don’t know of. I’ve found out that there is a Microsoft.Build.Utilities namespace, but it’s not referenced in our website and I can’t reproduce the ambiguity in my own environment when I reference Microsoft.Build myself, so I couldn’t get the website running on precompiled mode on the staging server to test this theory.
I made additional changes to memory allocation and the number of database calls, using Visual Studio’s memory allocation and instrumentation profilers as a measure, but I didn’t notice any improvement in performance.
I used a concurrency profiler to gather more information on thread utilization. I haven’t used this tool before, so I’m not sure about my interpretations here. There are multiple threads in each handle and in one handle I’m seeing 42% contention. I see that the DataAdapter.Fill and SqlHelper.ExecuteReader methods show up most when it’s set to “Show Just My Code” and WaitForSingleObjectExImplementation shows up most when it’s set to “Show All Code”.
I came across an SO question about ASP.NET website performance issues and set EnableSessionState="ReadOnly" on each page, but I didn’t notice a difference from this change, either.
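For reference, that change is the page-level directive shown below (file and class names are invented); ReadOnly lets a page read Session without taking the exclusive per-session lock that serializes concurrent requests from the same session:

    <%@ Page Language="vb" EnableSessionState="ReadOnly" CodeBehind="Product.aspx.vb" Inherits="MySite.Product" %>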
Concurrency Visualizer and the Common Patterns for Poorly-Behaved Multithreaded Applications article helped me identify the issue. My graph doesn’t look like serial execution, but I see 80-90% synchronization, as shown in the Lock Convoys graph. I checked out an SO question on debugging lock convoys, too.
Testing approach
I’m using Screaming Frog to crawl the website in order to reproduce the issues, taking requests per second and response times in both Screaming Frog and Lean Sentry as performance measures. It might not be the best approach, but the difference is noticeable and reproducible, and it’s pretty much all I have at this point.
Architecture of the website
The website was originally coded in VB.NET for .NET Framework 1.0 about 10 years ago and upgraded to .NET Framework 4.6.1 by fixing some compatibility issues. There have been no architectural changes so far. There is a shared SqlHelper class, a collection of shared data-access functions such as ExecuteDataset or ExecuteDatareader that return a DataSet, a DataReader, or a String value. These functions read the connection-string information from the web.config file and create new SqlConnection, SqlDataAdapter, SqlDataReader, and SqlCommand objects to perform the database operations. The Data Access Layer that consumes this shared class consists of classes for each module (shopping cart, category, product, etc.) that are instantiated in each user control and consist of functions that map to stored procedures in the database.
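A rough sketch of the SqlHelper pattern as described (the "Main" connection-string name and the exact signature are assumptions):

    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;

    public static class SqlHelper
    {
        private static readonly string ConnStr =
            ConfigurationManager.ConnectionStrings["Main"].ConnectionString;

        public static DataSet ExecuteDataset(string storedProc, params SqlParameter[] args)
        {
            // A new connection per call is concurrency-safe (connection pooling
            // does the heavy lifting) but multiplies database round-trips.
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand(storedProc, conn))
            using (var adapter = new SqlDataAdapter(cmd))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddRange(args);
                var ds = new DataSet();
                adapter.Fill(ds);      // the call that dominates the contention profile
                return ds;
            }
        }
    }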
The refactoring
We introduced some new objects that are instantiated either inside the page load of the related user control or inside the OnItemDataBound event of repeaters, and then attached to child user controls’ public properties, which were refactored to use the object. However, some child user controls still need multiple data tables, so we decided to store one of the data tables in one of the objects and pass it to the related user controls by assigning it to their public properties.
I suspect we hurt performance by introducing these objects. Even though database calls and memory consumption seem to be reduced, I’m wondering if the objects are causing the threads to be synchronized all the time.
(The Concurrency Visualizer graphs from before and after the refactoring were attached as images; they are not reproduced here.)
Will you help me identify the issue?
Your problem is rather complex. I think you have two basic options for resolving the performance issues introduced by the refactoring:
1. Revert the code to a point where all or much of the refactoring hadn’t yet been done and performance was better than what you are currently experiencing. Then proceed gradually with the addition of new classes for performance improvements. If a change does not improve performance, undo it and try something else.
2. Replace some or all of the newly added classes with versions that support the same interfaces but lack the performance overhead (a sketch follows this answer). Do this selectively to isolate where the performance issues exist. Perhaps the website has tapped into an unknown performance bug that was not triggered by prior implementations of the added classes.
I would favor option 1, though it may seem counterproductive. It is a bit like punting in U.S. football: sure, it is nice to just drive down the field, but sometimes the dominant strategy is to punt, get the ball back, and try to score on another drive.
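As a sketch of option 2, under the assumption that the new classes sit behind interfaces: a stand-in that honors the same interface but does no real work, so you can tell whether the real class is the bottleneck (IProductData and StubProductData are invented names):

    using System.Data;

    public interface IProductData
    {
        DataTable GetProducts(int categoryId);
    }

    // Stand-in with the same interface but no database work and no locks;
    // if the site is fast with this plugged in, the real class is suspect.
    public sealed class StubProductData : IProductData
    {
        private static readonly DataTable Empty = new DataTable("Products");

        public DataTable GetProducts(int categoryId)
        {
            return Empty;      // read-only placeholder, never mutated
        }
    }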

Make a .Net DLL Thread-safe for Web App Consumption?

I've written a class in VB.NET that is consumed in an ASP.NET web application running on IIS 7. I use .NET Framework 4.0. The class performs a REST request to an online service and retrieves an XML response containing strongly typed data.
This class also performs caching using a SQL Server database. The class is compiled to a DLL and referenced by the web application. It works very well, and now I need to know how to make the class thread-safe.
I have no experience with making code 'thread-safe', and I don't know where to begin in determining whether or not it is. I'm assuming, because I didn't pay attention to this during development, that it is NOT thread-safe, and that since the web application will be used by many users at the same time, this must be paid attention to.
Can anyone point me to how to test for thread-safety? Are there any resources online that will give me some ideas? Are there any rules of thumb that will tip me off as to where my main concerns are?
The easiest thing to look out for is the use of "static" (C#) or "Shared" (VB.NET) variables. If these variables can be modified during the lifetime of the application, you will likely run into threading issues, which very often show up as "random-looking" problems.
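A minimal illustration of that hazard, with hypothetical names; the increment is a read-modify-write, so concurrent requests can lose updates unless access is synchronized:

    public static class RequestCounter
    {
        private static int _count;      // shared across ALL requests
        private static readonly object _gate = new object();

        // Unsafe: two requests can read the same value and both write count+1.
        public static int UnsafeIncrement()
        {
            return ++_count;
        }

        // Safe: serialize access to the shared field.
        public static int SafeIncrement()
        {
            lock (_gate)       // or simply Interlocked.Increment(ref _count)
            {
                return ++_count;
            }
        }
    }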
I would also be concerned about how you are doing the caching in your database, as multiple .NET threads hitting SQL Server for the cache could cause issues as well, depending on how it's designed.
Bottom line: you will likely need to learn more about threading if you want to be sure this won't have issues. Probably the best book I have ever read covering simple through very complex C# topics is C# 4.0 in a Nutshell; take a look at that book, especially the threading chapters. (Seriously, read the whole thing, though.) If you get through it and have a good understanding of the concepts mentioned, you should be fine.

Unwanted ASP.Net MVC3 server side caching

I'm working on an MVC3 app and I've come across an issue with objects being cached unintentionally.
My code is creating objects from calls to a separate custom business logic dll.
This business logic dll gets data from a database.
After I change data in the database, I'm still seeing the old data, even after closing my browser and re-running the application. It's not a browser caching issue because I can see it when I'm debugging in the development environment.
In development, if I stop the asp.net development server, then re-run the app, I get the new data.
In IIS, if I restart the website, I get the new data.
Any idea why ASP.NET is caching and re-using these objects, even after they have gone out of scope?
The business logic dll does have some caching built into it, so maybe that's the main issue. In that case, I guess the question is whether there is some way I can tell asp.net to wipe out the objects once the session is over.
There's no caching by default in ASP.NET MVC3, at least no caching of data. Make sure your IIS settings are correct and you don't accidentally use the OutputCacheAttribute.
As for caching in the business layer: I've seen at least three caching-related problems in the last two days. Keep in mind that caching is tricky, and so are static variables. If it's not necessary, don't do it. Caching is extremely powerful, but it's also dangerous. The same goes for the aforementioned OutputCacheAttribute.
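For concreteness, this is the attribute being warned about; on an MVC3 action it serves the same rendered output for the whole cache duration, which looks exactly like stale objects (ProductController is a made-up example):

    using System.Web.Mvc;

    public class ProductController : Controller
    {
        // Cached output is served for 5 minutes regardless of database changes,
        // which looks exactly like "old objects" surviving out of scope.
        [OutputCache(Duration = 300, VaryByParam = "none")]
        public ActionResult Details(int id)
        {
            return View();
        }
    }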
It sounds to me like you're creating your data context statically, rather than creating a new one and destroying it after every request. This is a bad thing to do for a lot of reasons.
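A sketch of the difference, using hypothetical EF-style types (MyDataContext and Product exist only to make the example compile):

    using System;
    using System.Linq;

    public class Product { public int Id; }

    public class MyDataContext : IDisposable
    {
        public IQueryable<Product> Products
        {
            get { return Enumerable.Empty<Product>().AsQueryable(); }
        }

        public void Dispose() { }
    }

    public class ProductService
    {
        // BAD: a context held in a static field lives for the whole app domain,
        // so its cached entities never refresh:
        // private static readonly MyDataContext Db = new MyDataContext();

        public Product GetProduct(int id)
        {
            using (var db = new MyDataContext())      // GOOD: fresh context per call
            {
                return db.Products.SingleOrDefault(p => p.Id == id);
            }
        }
    }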
When you say the business layer has "some caching", what does that mean? How are you caching?

Is caching using Application slower or more problematic than using Global.asax.cs static variable?

We have a Webforms application that stores a bunch of settings and terminology mappings (several hundred) that are used throughout the application.
http://www.dotnetperls.com/global-variables-aspnet makes these assertions:
"The Application[] collection ... may be slower and harder to deal with."
"The Application[] object ... is inefficient in ASP.NET."
"Is this recommended? Yes, and not just by the writer of this article. It is noted in Professional ASP.NET by Apress and many sites on the Internet. It works well."
So I am wondering whether these statements are true. Can anyone elaborate on why using Application is slower, or on what kinds of problems can crop up if you use it? I am assuming that any problems or slowdowns might only surface under production load, which is why I am asking for real-world experience rather than just benchmarking it myself.
I am aware that there are many alternatives for caching (HttpRuntime.Cache, memcached, etc.), but specifically I want to know whether I need to go back and rewrite my legacy code that uses Application[]. In particular, if I am incurring a performance penalty in any way, I want to get rid of it.
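For context, here is roughly what the legacy Application[] pattern looks like in a page's code-behind (VisitCount is illustrative). Writes are supposed to be bracketed by Lock/UnLock, which serializes every writer application-wide; that lock is one concrete reason Application[] can be slower under load:

    using System.Web.UI;

    public class VisitCounterPage : Page
    {
        protected void UpdateVisitCount()
        {
            Application.Lock();        // serializes every writer, app-wide
            try
            {
                int count = (Application["VisitCount"] as int?) ?? 0;
                Application["VisitCount"] = count + 1;
            }
            finally
            {
                Application.UnLock();
            }
        }
    }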
How are you storing these settings? I would recommend the web.config.
If you're using the web.config to store these settings (for application-wide values, that's a solid place to start), then there's no need for Application variables.
I try to steer clear of Application-level variables because they are far more expensive than Session variables.
Also, variables in the web.config / app.config files can be changed without having to modify code and/or recompile your project.
The Application class (global variables) only exists in ASP.NET for backwards compatibility with classic ASP; you could say it's deprecated.
Another thing you could look into would be caching your settings so you're not always reading from disk.
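One way to implement that suggestion, sketched with invented names. (Note that the runtime already caches the parsed web.config in memory, so in practice this mostly saves repeated lookups rather than literal disk reads.)

    using System;
    using System.Collections.Generic;
    using System.Configuration;

    public static class AppSettingsCache
    {
        private static readonly Lazy<Dictionary<string, string>> Settings =
            new Lazy<Dictionary<string, string>>(Load);

        private static Dictionary<string, string> Load()
        {
            // Built once on first use (Lazy<T> is thread-safe by default),
            // then treated as read-only.
            var result = new Dictionary<string, string>();
            foreach (string key in ConfigurationManager.AppSettings.AllKeys)
            {
                result[key] = ConfigurationManager.AppSettings[key];
            }
            return result;
        }

        public static string Get(string key)
        {
            return Settings.Value[key];
        }
    }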
