ASP.NET async/await part 2

I have a variation of the benefits-of-async/await-on-ASP.NET from this question.
My understanding is that asynchrony is not the same thing as parallelism. So, on a web server, I wonder about how much benefit async/await brings to ASP.NET pages.
Isn't IIS+ASP.NET already really good at allocating threads for requests, and if one page is busy waiting for a resource the server will just switch to processing another request that has work to do?
There are a limited number of threads in the pool for ASP.NET to use - does async use them any more effectively?
As Mr. Skeet pointed out in answering the question above, we're not talking about blocking a UI thread. We're already multi-threaded and the web response can't be completed until all the request's tasks are done, async or not, right?
I guess what it boils down to is this:
Is there any benefit to an async read of a resource (say a file or DB request) in an ASP.NET page vs. blocking on it?

if one page is busy waiting for a resource the server will just switch to processing another request that has work to do?
I don't think so. I would be very surprised if this were the case. It's theoretically possible, but very complex.
There are a limited number of threads in the pool for ASP.NET to use - does async use them any more effectively?
Yes, because when you await something, the thread for that request is immediately returned to the pool.
We're already multi-threaded and the web response can't be completed until all the request's tasks are done, async or not, right?
That is correct. async in a server scenario is all about removing pressure on the thread pool.
Is there any benefit to an async read of a resource (say a file or DB request) in an ASP.NET page vs. blocking on it?
Absolutely!
If you block on a file/service call/db request, then that thread is used for the duration of that operation. If you await a file/service call/db request, then that thread is immediately returned to the thread pool.
One (really cool!) result of this is that you can have a request in progress, and while it's (a)waiting some operation, there are no threads servicing that request! Zero-threaded concurrency, if you will.
When the operation completes, the method resumes after the await - on a (possibly different) thread from the thread pool.
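To make that concrete, here's a minimal sketch of a blocking read versus an awaited read in an MVC controller (the controller name and file path are illustrative, not from the question):

using System.IO;
using System.Threading.Tasks;
using System.Web.Mvc;

public class ReportController : Controller
{
    // Blocking: the request thread is held for the entire duration of the read.
    public ActionResult Blocking()
    {
        string text = System.IO.File.ReadAllText(Server.MapPath("~/App_Data/report.txt"));
        return Content(text);
    }

    // Awaiting: the request thread goes back to the pool at the await,
    // and a (possibly different) pool thread resumes the method when the read completes.
    public async Task<ActionResult> NonBlocking()
    {
        using (var reader = new StreamReader(Server.MapPath("~/App_Data/report.txt")))
        {
            string text = await reader.ReadToEndAsync();
            return Content(text);
        }
    }
}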
In conclusion: async scales better than threads, so there is definitely a benefit on the server side.
More info: my own intro to async post and this awesome video.

Related

Async. programming in .Net Core

I was reading the Microsoft documentation, specifically the async programming article, and I didn't understand the section that explains how the server's threads work when using async code.
because it (the server) uses async and await, each of its threads is freed up when the I/O-bound work starts, rather than when it finishes.
Could anyone explain what it means for the threads to be freed up when the I/O starts?
Here is the article : https://learn.microsoft.com/en-us/dotnet/standard/async-in-depth
When ASP.NET gets an HTTP request, it takes a thread from the thread pool and uses that to execute the handler for that request (e.g., a specific controller action).
For synchronous actions, the thread stays assigned to that HTTP request until the action completes. For asynchronous actions, the await in the action method may cause the thread to return an incomplete task to the ASP.NET runtime. In this case, ASP.NET will free up the thread to handle other requests while the I/O is in flight.
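As a hedged sketch of the asynchronous case (the connection string, table, and controller name are placeholders), an ASP.NET Core action awaiting a database query might look like this:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Data.SqlClient;

public class EmployeesController : Controller
{
    public async Task<IActionResult> Index()
    {
        var names = new List<string>();

        using (var connection = new SqlConnection("<connection string>"))
        using (var command = new SqlCommand("SELECT Name FROM Employees", connection))
        {
            // At each await the request thread is returned to the pool
            // while the database work is in flight.
            await connection.OpenAsync();
            using (var reader = await command.ExecuteReaderAsync())
            {
                while (await reader.ReadAsync())
                {
                    names.Add(reader.GetString(0));
                }
            }
        }

        return View(names);
    }
}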
For further reading, see the difference between synchronous and asynchronous request handling and how asynchronous work doesn't require a thread at all times.
When your application makes a call to an external resource such as a database or an HTTP service via HttpClient, the thread that initiated the call has to wait. Until it gets a response, it sits idle.
In the asynchronous approach, the thread gets released as soon as the app makes an external call.
Here is an article about how it happens:
https://medium.com/@karol.rossa/asynchronous-programming-73b4f1988cc6
And a performance comparison between the async and sync approaches:
https://medium.com/@karol.rossa/asynchronous-performance-1be01a71925d
Here's an analogy for you: have you ever ordered at a restaurant with a large group and had someone not be ready to order when the waiter came to them? Did they bring in a different waiter to wait for him or did the waiter just come back to him after he took other people's orders?
The fact that the waiter is allowed to come back to him later means that he's freed up immediately after calling on him rather than having to wait around until he's ready.
Asynchronous I/O works the same way. When you do a web service call, for example, the slowest part (from the perspective of the client at least) is waiting for the result to come back: most of the delay is introduced by the network (and the other server), during which time the client thread would otherwise have nothing to do but wait. Async allows the client to do other things in the background.
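For example, with HttpClient (the URL and helper method are illustrative), the caller can start the call, keep doing useful work, and only wait when it actually needs the result:

using System.Net.Http;
using System.Threading.Tasks;

public class WeatherClient
{
    private static readonly HttpClient Http = new HttpClient();

    public async Task<string> GetForecastAsync()
    {
        // Start the request; no thread is blocked during the network round trip.
        Task<string> pending = Http.GetStringAsync("https://example.com/api/forecast");

        DoOtherWorkInTheMeantime(); // placeholder for whatever else the caller can do

        // Asynchronously wait only for the slow part: the network and the remote server.
        return await pending;
    }

    private void DoOtherWorkInTheMeantime()
    {
        // ...
    }
}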

async and await in an asp.net core mvc web project

I understand the reason for async and await in a WPF application: it keeps the main UI thread from being blocked, so the user can keep interacting with the UI while an I/O operation is pending.
What I do not understand is why we see the same concept in ASP.NET Core MVC 2.0.
Action methods marked [HttpGet] or [HttpPost] have to wait for the result anyway before rendering the page...
Thanks
It frees up the thread to service other requests to your controllers rather than blocking on whatever async operations you are doing. This answer has a pretty good analogy to help explain:
Think of it like getting on a bus, there's five people waiting to get on, the first gets on, pays and sits down (the driver serviced their request), you get on (the driver is servicing your request) but you can't find your money; as you fumble in your pockets the driver gives up on you and gets the next two people on (servicing their requests), when you find your money the driver starts dealing with you again (completing your request) - the fifth person has to wait until you are done but the third and fourth people got served while you were half way through getting served. This means that the driver is the one and only thread from the pool and the passengers are the requests.
Without an async controller, the passengers behind you would have to wait ages while you looked for your money, meanwhile the bus driver would be doing no work.
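In controller terms, a minimal sketch might look like the following, where the Task.Delay stands in for a real I/O-bound call such as a database query:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class OrdersController : Controller
{
    [HttpGet]
    public async Task<IActionResult> Index()
    {
        // While this await is pending, the request thread is back in the pool,
        // so the "driver" can serve the passengers queued behind this request.
        await Task.Delay(2000); // stand-in for a real asynchronous I/O call

        return View();
    }
}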

ASP.NET and async - how it works?

I know this is a common question, but I've read a kiloton of articles and still feel confused. And now I think that it would have been better not to read them at all )).
So, how ASP.NET works (only about threads):
an HTTP request is served by a thread from the thread pool.
while the request is being processed, that thread is busy, because the request is processed on exactly that thread.
when request processing finishes, the thread returns to the thread pool and the server sends a response.
Is this described behaviour right ?
What really happens when I start new task inside ASP.NET MVC controller?
public ActionResult Index()
{
    var task1 = Task.Factory.StartNew(() => DoSomeHeavyWork());
    return View();
}

private static async Task DoSomeHeavyWork()
{
    await Task.Delay(5000);
}
the controller action starts to execute on the thread that processes the current request - T1.
the thread pool allocates another thread (T2) for task1.
task1 starts "immediately" on T2.
the View result returns "immediately".
ASP.NET does some work, the server sends a response, T1 returns to the thread pool, T2 is still alive.
after some time, when DoSomeHeavyWork has finished, T2 returns to the thread pool.
Is it correct ?
Now let's look at an async action
public async Task<ActionResult> Index()
{
    await DoSomeHeavyWork();
    return View();
}
I understand the difference from the previous code sample, but not the process. In this example the behaviour is the following:
the action starts to execute on the thread that processes the current request - T1.
DoSomeHeavyWork "immediately" returns a task, let's call it "task1" too.
T1 returns to the thread pool.
after DoSomeHeavyWork finishes, the Index action continues to execute.
after the Index action finishes executing, the server will send a response.
Please explain what happening between points 2 and 5, the questions are:
is DoSomeHeavyWork processed inside task1, or where (where is it "awaited")? I think this is a key question.
which thread will continue to process the request after await - any new one from the thread pool, right ?
the request causes a thread to be allocated from the thread pool, but the response will not be sent until DoSomeHeavyWorkAsync has finished, and it doesn't matter on which thread that method executes. In other words, for a single request with a single concrete task (DoSomeHeavyWork) there is no benefit to using async. Is that correct?
if the previous statement is correct, then I don't understand how async can improve performance for multiple requests with the same single task. Let me try to explain. Let's assume the thread pool has 50 threads available to handle requests. A single request is processed by at least one thread from the thread pool; if the request starts other threads, all of them are also taken from the thread pool. For example, a request takes one thread for itself and starts 5 different tasks in parallel, waiting for all of them; the thread pool then has 50 - 1 - 5 = 44 free threads to handle incoming requests. So this is parallelism - we can improve performance for a single request, but we reduce the number of requests that can be processed. So, regarding request processing in ASP.NET, I suppose that only a task that somehow starts an IO completion thread can achieve the goal of async (TAP). But how does the IO completion thread call back a thread pool thread in this case?
Is this described behaviour right ?
Yes.
Is it correct ?
Yes.
is DoSomeHeavyWork processed inside task1, or where (where is it "awaited")? I think this is a key question.
From the current code, DoSomeHeavyWork will asynchronously wait for Task.Delay to complete. This happens on a thread-pool thread and doesn't spin up any new threads, but there is no guarantee that it will be the same thread the request started on.
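You can observe the "not guaranteed to be the same thread" part by logging the managed thread id around the await; here is a small sketch based on the question's DoSomeHeavyWork (the logging is mine, not part of the original code):

using System;
using System.Diagnostics;
using System.Threading.Tasks;

public static class HeavyWork
{
    public static async Task DoSomeHeavyWork()
    {
        Debug.WriteLine($"Before await: thread {Environment.CurrentManagedThreadId}");

        await Task.Delay(5000);

        // In ASP.NET this resumes on an arbitrary thread-pool thread,
        // so the id below may or may not match the one above.
        Debug.WriteLine($"After await: thread {Environment.CurrentManagedThreadId}");
    }
}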
which thread will continue to process the request after await?
Because we're talking about ASP.NET, that will be an arbitrary thread-pool thread, with the HttpContext marshaled onto it. If this were a WinForms or WPF app, you'd be back on the UI thread right after the await, provided you didn't use ConfigureAwait(false).
the request causes a thread to be allocated from the thread pool, but the response will not be sent until DoSomeHeavyWorkAsync has finished, and it doesn't matter on which thread that method executes. In other words, for a single request with a single concrete task (DoSomeHeavyWork) there is no benefit to using async. Is that correct?
In this particular case, you won't see the benefits of async. async shines when you have concurrent requests hitting the server and a lot of them are doing IO-bound work. When using async while hitting the database, for example, you gain by freeing the thread-pool thread for the time the query executes, allowing that thread to process other requests in the meantime.
But how does the IO completion thread call back a thread pool thread in this case?
You have to separate parallelism and concurrency. If you need computational power for doing CPU-bound work in parallel, async isn't the tool that will make it happen. On the other hand, if you have lots of concurrent IO-bound operations, like hitting a database for CRUD operations, you can benefit from async by freeing the thread while the IO operation is executing. That's the major key point of async.
The thread pool has a dedicated pool of IO completion threads, as well as worker threads, which you can view by invoking ThreadPool.GetAvailableThreads. When you use IO-bound operations, the thread that runs the callback is usually an IO completion thread, not a worker thread. They come from different pools.
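For example, a small console sketch shows the two pools reported separately:

using System;
using System.Threading;

public static class Program
{
    public static void Main()
    {
        ThreadPool.GetAvailableThreads(out int workerThreads, out int completionPortThreads);
        ThreadPool.GetMaxThreads(out int maxWorkerThreads, out int maxCompletionPortThreads);

        Console.WriteLine($"Worker threads available:        {workerThreads} / {maxWorkerThreads}");
        Console.WriteLine($"IO completion threads available: {completionPortThreads} / {maxCompletionPortThreads}");
    }
}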

Web API 2 - are all REST requests asynchronous?

Do I need to do anything to make all requests asynchronous or are they automatically handled that way?
I ran some tests and it appears that each request comes in on its own thread, but I figure better to ask as I might have tested wrong.
Update: (I have a bad habit of not explaining fully - sorry.) Here's my concern. A client browser makes a REST request to my server at http://data.domain/com/employee_database/?query=state:Colorado. That comes in to the appropriate method in the controller. That method queries the database and returns an object which is then turned into a JSON structure and returned to the calling app.
Now let's say 10,000 clients all make a similar query to the same server. So I have 10,000 requests coming in at once. Will my controller method be called simultaneously in 10,000 distinct threads? Or must the first request return before the second request is called?
I'm not asking about the code in my handler method having asynchronous components. For my case the request becomes a single SQL query so the code has nothing that can be handled asynchronously. And until I get the requested data, I can't return from the method.
No, REST is not async by default; the requests are handled synchronously. However, your web server (IIS) has a max-threads setting that limits how many requests can be worked on at the same time, and it maintains a queue of received requests. So a request goes into the queue and, if a thread is available, it gets executed; otherwise the request waits in the IIS queue until a thread is available.
I think you should be using async IO operations, such as database calls, in your case. Yes, in Web API every request gets its own thread, but threads can run out if there are many concurrent requests. Threads also use memory, so if your API gets hit by too many requests it can put pressure on your system.
The benefit of using async over sync is that you use your system resources wisely. Instead of blocking the thread while it waits for the database call to complete, as the sync implementation does, async frees the thread to handle more requests or whatever else needs a thread. Once the IO (database) call completes, another thread picks it up from there and continues the work. Async will also make your API more responsive under load if your IO operations take long to complete.
To be honest, your question is not very clear. If you are making an HTTP GET using HttpClient, say with the GetAsync method, the request is fired and you can do whatever you want on your thread until the response comes back. So that request is asynchronous.

If you are asking about the server side that handles the request (assuming it is ASP.NET Web API), then whether it is asynchronous or not depends on how you implemented your Web API. If your action method does three things, say 1, 2, and 3, one after the other synchronously in blocking mode, the same thread services the whole request. On the other hand, say #2 above is a call to a web service over HTTP. If you use HttpClient and make that call asynchronously, you can get into a situation where one request is serviced by more than one thread. For that to happen, you should have made the HTTP call from your action method asynchronously and used the async keyword. In that case, when you await inside the action method, your action method execution returns, the thread servicing your request is free to service some other request, and ultimately, when the response is available, the same or some other thread continues from where it left off. Long, boring answer, perhaps, but it is difficult to explain just by typing. I hope you get some clarity.
UPDATE:
Your action method will execute in parallel on 10,000 threads (ideally). I say ideally because a CLR thread pool with 10,000 threads is not typical and probably impractical as well. There are physical limits as well as limits imposed by the framework, but I guess the answer to your question is that the requests will be serviced in parallel. The correct term here is 'parallel', not 'async'.
Whether it is sync or async is your choice; you choose by the way you write your action. If you return a Task and also use async IO under the hood, it is async. In other cases it is synchronous.
Don't feel tempted to slap async on your action and use Task.Run. That is async-over-sync (a known anti-pattern). It must be truly async all the way down to the OS kernel.
No framework can make sync IO automatically async, so it cannot happen under the hood. Async IO is callback-based, which is a fundamental change in the programming model.
This does not answer what you should do of course. That would be a new question.
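To illustrate the distinction, here is a sketch (the class and method names are illustrative, not a prescription for your action):

using System.IO;
using System.Threading.Tasks;

public class ReportReader
{
    // Async-over-sync (the anti-pattern): the blocking read simply moves to
    // another thread-pool thread, so no thread is actually freed while the I/O runs.
    public Task<string> ReadReportFakeAsync(string path) =>
        Task.Run(() => File.ReadAllText(path));

    // Truly async: useAsync: true opens the file for overlapped I/O,
    // so no thread-pool thread is held while the read is in flight.
    public async Task<string> ReadReportRealAsync(string path)
    {
        using (var stream = new FileStream(
            path, FileMode.Open, FileAccess.Read, FileShare.Read,
            bufferSize: 4096, useAsync: true))
        using (var reader = new StreamReader(stream))
        {
            return await reader.ReadToEndAsync();
        }
    }
}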

Asp.net SynchronizationContext locks HttpApplication for async continuations?

This comment by Stephen Cleary says this:
AspNetSynchronizationContext is the strangest implementation. It treats Post as synchronous rather than asynchronous and uses a lock to execute its delegates one at a time.
Similarly, the article that he wrote on synchronization contexts and linked to in that comment suggests:
Conceptually, the context of AspNetSynchronizationContext is complex. During the lifetime of an asynchronous page, the context starts with just one thread from the ASP.NET thread pool. After the asynchronous requests have started, the context doesn’t include any threads. As the asynchronous requests complete, the thread pool threads executing their completion routines enter the context. These may be the same threads that initiated the requests but more likely would be whatever threads happen to be free at the time the operations complete.
If multiple operations complete at once for the same application, AspNetSynchronizationContext will ensure that they execute one at a time. They may execute on any thread, but that thread will have the identity and culture of the original page.
Digging in Reflector seems to validate this, as it takes a lock on the HttpApplication while invoking any callback.
Locking the app object seems like scary stuff. So my first question: Does that mean that today, all asynchronous completions for the entire app execute one at a time, even ones that originated from separate requests on separate threads with separate HttpContexts? Wouldn't this be a huge bottleneck for any apps that make 100% use of async pages (or async controllers in MVC)? If not, why not? What am I missing?
Also, in .NET 4.5, it looks like there's a new AspNetSynchronizationContext, and the old one is renamed LegacyAspNetSynchronizationContext and only used if the new app setting UseTaskFriendlySynchronizationContext is not set. So question #2: Does the new implementation change this behavior? Otherwise, I imagine with the new async/await support marshaling completions through the synchronization context, this kind of bottleneck would be noticed much more frequently going forward.
The answer to this forum post (linked from SO answer here) suggests that something fundamentally changed here, but I want to be clear on what that is and what behaviors have improved, since we have a .NET 4 MVC 3 app which is pretty much 100% async action methods making web service calls.
Let me answer your first question. In your assumption you didn't consider the fact that separate ASP.NET requests are processed by different HttpApplication objects. HttpApplication objects are stored in a pool. Once you request a page, an application object is retrieved from the pool and belongs to the request until its completion. So, my answer to your question:
all asynchronous completions for the entire app execute one at a time, even ones that originated from separate requests on separate threads with separate HttpContexts
is: No, they don't
Separate requests are processed by separate HttpApplication objects, so a locked HttpApplication affects only a single request.
A synchronization context is a powerful thing that helps developers synchronize access to shared (within the scope of a request) resources. That is why all callbacks are executed under the lock. The synchronization context is at the heart of the event-based synchronization pattern.
