Does ASP.NET on IIS use a thread per request?

Benefits of reducing threadcount
It is fashionable these days to try to improve the scalability of web servers by reducing the number of threads handling requests, to avoid the unnecessary context switching that comes from running tens or hundreds of threads. This matters especially when a handler is waiting on some application event for a 'long polling' (aka COMET) request - it is simply rude to consume all the resources of an entire thread while doing nothing with it.
Node.js as an extreme example
Node.js is a rather extreme example of reducing thread count: handlers must give up the single thread that services all HTTP requests whenever they wait on any event (even I/O from a local disk!). This way, a CPU core is kept busy doing useful computational work instead of 'wasting' instructions pausing and resuming many different threads.
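The same "give up the thread while waiting" idea can be sketched in modern C# terms (a sketch, not part of the original question; `File.ReadAllBytesAsync` assumes a modern .NET runtime):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class Program
{
    // While the await is pending, no thread is parked on the read; the
    // continuation is scheduled only when the OS signals I/O completion.
    static async Task<int> CountBytesAsync(string path)
    {
        byte[] data = await File.ReadAllBytesAsync(path); // thread released here
        return data.Length;
    }

    static async Task Main()
    {
        string tmp = Path.GetTempFileName();
        await File.WriteAllTextAsync(tmp, "hello");
        Console.WriteLine(await CountBytesAsync(tmp)); // prints 5
        File.Delete(tmp);
    }
}
```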
But API support is needed
However, arranging for resumption of the handler once the waited-for event has completed typically requires explicit support in the handler API. For example, there is a part of the Java Servlet Specification which deals specifically with 'asynchronous' responses.
Wherefore, ASP.NET?
So, does ASP.NET have a similar API for asynchronous responses (aka. continuations, aka. (a bit loosely) 'reactor pattern')? Is there authoritative documentation which directly addresses this matter and its interaction with IIS? What nomenclature does ASP.NET / IIS use to refer to these properties?
I am coming to IIS/.NET shortly, and don't want to write any code that's unnecessarily 'thread-per-request' if I can help it. I'll be using C# by preference.
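ASP.NET's nomenclature for this is "asynchronous pages" (WebForms) and "asynchronous handlers/controllers" elsewhere. A minimal WebForms sketch, assuming .NET 4.5's Task support; the page must be declared with `<%@ Page Async="true" %>`, and `ResultLabel` and the URL are placeholders:

```csharp
// Code-behind sketch for a page declared with <%@ Page Async="true" %>.
// RegisterAsyncTask hands the wait back to ASP.NET, so the worker thread
// returns to the pool while the I/O is outstanding.
using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.UI;

public partial class AsyncDemo : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        RegisterAsyncTask(new PageAsyncTask(FetchAsync));
    }

    private async Task FetchAsync()
    {
        using (var client = new HttpClient())
        {
            // The thread is released during the await; resumed on completion.
            string body = await client.GetStringAsync("https://example.com/");
            ResultLabel.Text = body.Length.ToString(); // hypothetical control
        }
    }
}
```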

Related

Why is a .NET request waiting for the other?

As far as my empirically gathered knowledge suggests, .NET WebForms probably uses a queue of requests: once the first request has been properly handled, the next one moves to the head of the queue, and so on. This behaviour recently led to a misunderstanding where we thought a feature was very slow, when in fact some other features that always ran before it were the slow ones. That was only a minor misunderstanding, but I can imagine more serious problems, for instance a longer request blocking the other requests. I have not yet found the time to test this across multiple sessions to see whether this queue is session-level, but I think it should be, if I am even right about its existence. Hence my question: why do later requests wait for earlier requests to be processed in .NET WebForms projects?
Probably Session.
Requests from the same session that use session state don't run concurrently. This means that applications can use session state without needing to worry about race conditions.
There is no blocking for calls from different sessions. Nor is there blocking for calls from the same client that have session state disabled or read-only.
See the MSDN description of the PagesEnableSessionState Enumeration
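For WebForms this is the `<%@ Page EnableSessionState="ReadOnly" %>` directive; in MVC the equivalent is an attribute on the controller. A sketch (`ReportsController` is a hypothetical name):

```csharp
// Opt a controller out of the exclusive session lock, so its requests can
// run concurrently with other requests from the same session.
using System.Web.Mvc;
using System.Web.SessionState;

[SessionState(SessionStateBehavior.ReadOnly)]
public class ReportsController : Controller
{
    public ActionResult Index()
    {
        // Session can be read here, but writes will not be persisted.
        return Content("ok");
    }
}
```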

WebAPI Lifecycle/Request Queue

I have an AngularJS app that calls WebAPI. If I log the time I initiate a request (in my Angular controller) and the time OnActionExecuting runs (in an action filter in my WebAPI controller), I sometimes notice a ~2 second gap. I'm assuming nothing else runs before this filter, and that the gap is due to requests being blocked/queued. The reason I assume this is that if I remove all my other data calls, I do not see the gap.
How many parallel requests can WebAPI handle at once? I tried looking at the ASP.NET performance monitors but couldn't find where to see this data. Can someone shed some light on this?
There's no straight answer to this, but the shortest one is ...
There is no limit to this in WebApi; the limits come from what your server can handle and how efficient the code you have it run is.
...
But since you asked, let's consider some basic things we can assume about our server and our application ...
1. Concurrent connections
A typical server is known for issues like "c10k" ... https://en.wikipedia.org/wiki/C10k_problem ... so that puts a hard limit on the number of concurrent connections.
Assuming each WebApi call is made from, say, some AJAX call on a web page, that gives us a limit of around 10k connections before things get evil.
2. Dependency-related overheads
If we then consider the complexity of the code in question, you may have a bottleneck in things like SQL queries. I have often written WebApi controllers whose business logic runs 10+ db queries; the overhead here may be your problem.
3. Feed-in overhead
What about network bandwidth to the server?
Let's assume we are streaming 1MB of data per call; it won't take long to choke a 1Gb/s ethernet line with messages that size.
4. Processing overhead
Assuming you wrote an API that does complex calculations (e.g. mesh generation for complex 3D data), you could easily choke your CPU for some time on each request.
5. Timeouts
Assuming the server could accept your request and the request was made asynchronously, the biggest issue then is: how long are you prepared to wait for the response? If that window is short, it limits how much work the server can get through before each request needs its answer.
...
So as you can see, this is by no means an exhaustive list, but it outlines the complexity of the question you asked. That said, I would argue that WebApi (the framework) imposes no limits of its own; what is possible really comes down to the infrastructure around it.
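That said, ASP.NET on IIS 7+ (integrated mode) does expose tunable concurrency knobs of its own. A sketch of the relevant settings in aspnet.config; the values shown are approximately the .NET 4.x defaults, so verify them for your version before relying on them:

```xml
<!-- aspnet.config: per-application-pool ASP.NET concurrency settings.
     Values below are approximate .NET 4.x defaults, shown for illustration. -->
<configuration>
  <system.web>
    <applicationPool
        maxConcurrentRequestsPerCPU="5000"
        maxConcurrentThreadsPerCPU="0"
        requestQueueLimit="5000" />
  </system.web>
</configuration>
```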

Is there any reason *not* to implement asynchronous ASP.NET web pages in every application?

With regard to the asynchronous ASP.NET web pages article on MSDN:
The advantages are obvious with long-running pages or high server load. So, given projects where you think demand may be high somewhere down the track, when usage grows, is there any reason NOT to implement async ASP.NET in every web application as a standard? Are there any disadvantages to the approach?
Secondary question: are there any real-world studies/examples of where the advantages start to appear, in different web app situations? Or is it just a matter of suck it and see?
From your own link:
Only I/O-bound operations are good candidates for becoming async action methods on an asynchronous controller class. An I/O-bound operation is an operation that doesn’t depend on the local CPU for completion. When an I/O-bound operation is active, the CPU just waits for data to be processed (that is, downloaded) from external storage (a database or a remote service). I/O-bound operations are in contrast to CPU-bound operations, where the completion of a task depends on the activity of the CPU.
Async pages are not free; they come at a price. They are generally good when your page is making an external call to a service or performing some long-running, non-CPU-bound operation. Otherwise, you are likely to thrash the CPU, leaving you with a worse situation than you had before going async.
The idea is to use async when you would be eating up a thread from your application's thread pool doing non-CPU intensive work (waiting for a response from a long-running service). That way, your application can continue processing requests and doesn't start queuing new ones, slowly draining the responsiveness from your app.
Here is another link with information on when/when not to use async pages.
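The pattern being described, in MVC terms, can be sketched as an async action that awaits an I/O-bound call (assuming MVC 4+ and .NET 4.5; the controller name and URL are placeholders):

```csharp
// An I/O-bound action written asynchronously: the worker thread goes back
// to the pool while the remote call is in flight, instead of blocking.
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Mvc;

public class QuotesController : Controller
{
    private static readonly HttpClient Client = new HttpClient();

    public async Task<ActionResult> Latest()
    {
        // Hypothetical service URL; the await releases the request thread.
        string json = await Client.GetStringAsync("https://example.com/api/quotes");
        return Content(json, "application/json");
    }
}
```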
Edit
As for what counts as "long running," you're faced with the crummy answer of "it depends." To figure it out, you would need to profile your application and see how many of your "long running" requests cause subsequent requests to be queued, rather than processed, by IIS. The decision comes down to whether the costly toll of context switching is less than the return you get for paying it. If your bottleneck is a certain page or service that holds off incoming requests, it is probably a good idea to start thinking about async work. But you might also simply be doing too much work in the request, a "code smell" suggesting you need to refactor your code.
In the end, it depends.
Here is an excerpt from MSDN.
In general, use asynchronous pipelines when the following conditions are true:
The operations are network-bound or I/O-bound instead of CPU-bound.
Testing shows that the blocking operations are a bottleneck in site performance and that IIS can service more requests by using asynchronous action methods for these blocking calls.
Parallelism is more important than simplicity of code.
You want to provide a mechanism that lets users cancel a long-running request.
While the link is about MVC, the idea holds true for other flavors of ASP.NET, too.

ASP.NET, asynchronous, when to use it?

I'm somewhat vague about when to use asynchronous operations in ASP.NET. I understand that whenever I make a call to external web services, such as the Twitter APIs, I should use asynchronous operations so the CLR threads can be freed to service other requests, which makes sense.
I once read an excellent blog which mentioned that if your operation uses the CPU efficiently, you shouldn't make it asynchronous, because of the context-switch penalty; however, if the operation is long and spends most of its time waiting, the context switch is worth it.
What about a page that uses an Ajax call to a local web service, which in turn performs a database operation (taking around 3 seconds) and returns JSON, after which the page itself, using jQuery, renders it for another second, for a total of 4 seconds?
What about a traditional WebForm whose Page_Load makes the same database call (3 seconds) and then takes another 3 seconds to render? For instance, a big forum post with 1000 comments?
My general impression is: shouldn't IIS be designed so that EVERY operation is asynchronous by default, in the background, so that everything is non-blocking without the context-switch penalty? Isn't that the idea of node.js? And if you have static pages with no wait operations, only then would you specifically write a synchronous page? Basically the reverse of what happens now?
Thanks a lot.
The only benefit of asynchronous requests is that it frees up worker threads. So you only notice it if you run out of worker threads. By default, there are 100 threads per CPU. So you won't notice the performance improvement unless you have more than 100 requests per 4 seconds per CPU.
The drawback of asynchronous requests is that it makes your code harder to understand and maintain.
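The pool limits the answer refers to can be inspected at runtime; a minimal sketch (the actual numbers vary by runtime version, machine, and configuration, so no specific output should be assumed):

```csharp
using System;
using System.Threading;

class Program
{
    static void Main()
    {
        // Query the CLR thread pool's configured and currently free limits.
        ThreadPool.GetMaxThreads(out int workers, out int ioThreads);
        ThreadPool.GetAvailableThreads(out int freeWorkers, out int freeIo);
        Console.WriteLine($"max workers={workers}, io completion={ioThreads}");
        Console.WriteLine($"free workers={freeWorkers}, io completion={freeIo}");
    }
}
```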

Multithreading in asp.net

What kind of multi-threading issues do you have to be careful for in asp.net?
It's risky to spawn threads from the code-behind of an ASP.NET page, because the worker process will get recycled occasionally and your thread will die.
If you need to kick off long-running processes as a result of user actions on web pages, your best bet is to drop a message off in MSMQ and have a separate background service monitoring the queue. The service can take as long as it wants to accomplish the task, and the web page is finished with its work almost immediately. You could accomplish the same thing with an async call to a web method, but don't rely on getting the response when the web method is finished working. From code-behind, it needs to be a quick fire-and-forget.
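The drop-off side of that pattern can be sketched as follows (the queue path and payload are placeholders; `System.Messaging` is Windows-only):

```csharp
// Fire-and-forget from the page: enqueue the work and return immediately;
// a separate Windows service drains the queue at its own pace.
using System.Messaging;

public static class WorkQueue
{
    private const string Path = @".\Private$\LongRunningWork"; // placeholder

    public static void Enqueue(string payload)
    {
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);

        using (var queue = new MessageQueue(Path))
        {
            queue.Send(payload); // returns as soon as the message is queued
        }
    }
}
```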
One thing to watch out for is things that expire (I think HttpContext does). If you use them in "fire and forget" operations, remember that if the ASP.NET cleanup code runs before your operation is done, you will suddenly be unable to access certain information.
If this is for a web service, you should definitely consider thread pooling. Too many threads will bring your application to a grinding halt because they will eventually start competing for CPU time.
Is this for file or network IO? If so, you should also consider using asynchronous IO. It can be a bit more of a pain to program, but you don't have to worry about spawning off too many threads at once.
Programmatic Caching is one area which immediately comes to my mind. It is a great feature which needs to be used carefully. Since it is shared across requests, you have to put locks around it before updating it.
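A double-checked locking sketch of that advice (the key name and loader are placeholders; this is one common way to guard a shared cache update, not the only one):

```csharp
// Shared cache updated under a lock so concurrent requests can't interleave
// the check-then-insert and both run the expensive load.
using System.Web;

public static class PriceCache
{
    private static readonly object Sync = new object();

    public static decimal Get(string symbol)
    {
        string key = "price:" + symbol; // hypothetical key scheme
        object cached = HttpRuntime.Cache[key];
        if (cached == null)
        {
            lock (Sync)
            {
                cached = HttpRuntime.Cache[key]; // re-check inside the lock
                if (cached == null)
                {
                    cached = LoadFromDatabase(symbol);
                    HttpRuntime.Cache.Insert(key, cached);
                }
            }
        }
        return (decimal)cached;
    }

    private static decimal LoadFromDatabase(string symbol) => 0m; // stub loader
}
```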
Another place I would check is any code accessing filesystem like writing to log files. If one request has a read-write lock on a file, other concurrent requests will error out if not handled properly.
Isn't there a limit of 25 total threads in the IIS configuration? At least in IIS 6, I believe. If you exceed that limit, interesting things (read: loooooooong response times) may happen.
Depending on what you need, as far as multithreading is concerned, have you thought of spawning requests from the client? It's safe to spawn requests using AJAX and then act on the results in a callback. Or use a service as a backgrounding mechanism, which runs every X minutes and processes things in the background that way.
