Async GUI using WebForms and .NET 4.5 await/async - asp.net

I've been looking for a simple example of building an async interface using ASP.NET WebForms, where each awaited method renders its result as soon as it completes.
This is one of the examples I've been looking at: How and When to use `async` and `await`. The implementation I have in mind would look something like this:
protected async void button1_Click(object sender, EventArgs e)
{
    // Render when done
    textBox1.Text += await WaitAsynchronouslyAsync(RandomNumber(2000, 4000));
    // Render when done
    textBox1.Text += await WaitAsynchronouslyAsync(RandomNumber(100, 1000));
}

public async Task<string> WaitAsynchronouslyAsync(int delay)
{
    await Task.Delay(delay);
    return string.Concat(delay, "; ");
}

private int RandomNumber(int min, int max)
{
    Random random = new Random();
    return random.Next(min, max);
}
However, this always renders only when everything is done, and both results appear at the same time. In the example above the desired result would be for the second call to WaitAsynchronouslyAsync to render before the first call, since it will always have the shorter delay.
Or is it even possible using WebForms? I do know how to do this in JavaScript using Web APIs, WebSockets and whatnot, and that's not the solution I'm after at the moment.

As I describe on my blog, async does not change the HTTP protocol.
HTTP provides you with one response for each request. So, when an HTTP request arrives, it must execute your page to completion before sending the response.
In the ASP.NET world, await does not yield to the client/browser. Instead, it yields to the ASP.NET runtime. ASP.NET will not send the response until it sees that your processing is all done.
If you want to dynamically update a page (or partially render one), then you'll need to do it yourself using an appropriate technology (SignalR, UpdatePanel, etc).
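To make that concrete, here is a minimal sketch of the WebForms-sanctioned way to run async work in a page (the page member names are assumptions for illustration, taken from the question): the awaited work frees the request thread, but the browser still gets exactly one response after the whole page lifecycle completes.

// In the .aspx markup the page must opt in with: <%@ Page ... Async="true" %>
protected void Page_Load(object sender, EventArgs e)
{
    RegisterAsyncTask(new PageAsyncTask(async () =>
    {
        // Some asynchronous operation; while it is pending, the request thread is free,
        // but no partial HTML is sent to the browser.
        await Task.Delay(2000);
        textBox1.Text = "done";   // still rendered as part of the single response
    }));
}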

When you use await in the manner that you did, execution flow is sequential: the first awaited operation runs, and only after it finishes does the second one execute.
If you want them executed concurrently, you can initiate both operations and use Task.WhenAny and assign the value of whichever task finishes first:
Task<string> slowerTask = WaitAsynchronouslyAsync(RandomNumber(2000, 4000));
Task<string> fasterTask = WaitAsynchronouslyAsync(RandomNumber(100, 1000));
List<Task<string>> tasks = new List<Task<string>> { slowerTask, fasterTask };

while (tasks.Count > 0)
{
    Task<string> finishedTask = await Task.WhenAny(tasks);
    tasks.Remove(finishedTask);
    textBox1.Text += await finishedTask;
}


async / await: If it doesn't use threads, what is it doing to run processing at the same time?

I am doing a little research to better understand C#'s async / await.
I found a web site with the following code, which shows how much slower synchronous processing is compared to async / await:
public IActionResult Index()
{
    Stopwatch watch = new Stopwatch();
    watch.Start();

    ContentManagement service = new ContentManagement();
    var content = service.GetContent();
    var count = service.GetCount();
    var name = service.GetName();

    watch.Stop();
    ViewBag.WatchMilliseconds = watch.ElapsedMilliseconds;
    return View();
}

[HttpGet]
public async Task<ActionResult> IndexAsync()
{
    Stopwatch watch = new Stopwatch();
    watch.Start();

    ContentManagement service = new ContentManagement();
    var contentTask = service.GetContentAsync();
    var countTask = service.GetCountAsync();
    var nameTask = service.GetNameAsync();

    var content = await contentTask;
    var count = await countTask;
    var name = await nameTask;

    watch.Stop();
    ViewBag.WatchMilliseconds = watch.ElapsedMilliseconds;
    return View("Index");
}

public class ContentManagement
{
    public string GetContent()
    {
        Thread.Sleep(2000);
        return "content";
    }

    public int GetCount()
    {
        Thread.Sleep(5000);
        return 4;
    }

    public string GetName()
    {
        Thread.Sleep(3000);
        return "Matthew";
    }

    public async Task<string> GetContentAsync()
    {
        await Task.Delay(2000);
        return "content";
    }

    public async Task<int> GetCountAsync()
    {
        await Task.Delay(5000);
        return 4;
    }

    public async Task<string> GetNameAsync()
    {
        await Task.Delay(3000);
        return "Matthew";
    }
}
I understand the above code at a high level and why it performs faster.
What I don't understand is if threads are not being used, how is the processing running at the same time?
I have read in a couple of places that async / await does not create new threads to do the processing. So, what is async / await doing to allow processing to happen at the same time? The three await Task.Delay are running in parallel, correct? If it is not creating 3 threads, what is it doing?
I just want to understand what is happening at a high level.
Let me know.
Thanks in advance.
if threads are not being used, how is the processing running at the same time?
Threads let you parallelize computations on the same system. When communications or other I/O are involved, there is a different system with which your code communicates. When you initiate the task, the other system starts doing work. This happens in parallel to your system, which is free to do whatever else it needs to do until you await the task.
The three await Task.Delay are running in parallel, correct?
They are not exactly running, they are sleeping in parallel. Sleeping takes very little resources. That's why they appear to be "running" in parallel.
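To see this concretely, here is a tiny self-contained sketch showing that three delays started together and then awaited take roughly as long as the longest one, with no thread blocked while they are pending:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

static class SleepingConcurrentlyDemo
{
    static async Task MeasureAsync()
    {
        var watch = Stopwatch.StartNew();

        // Start all three delays before awaiting any of them.
        Task a = Task.Delay(2000);
        Task b = Task.Delay(5000);
        Task c = Task.Delay(3000);

        await Task.WhenAll(a, b, c);

        watch.Stop();
        Console.WriteLine(watch.ElapsedMilliseconds);   // prints roughly 5000, not 10000
    }
}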
What I don't understand is if threads are not being used, how is the processing running at the same time?
You can think of it as an event firing when the operation is complete, as opposed to a thread being blocked until the operation is complete.
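As a rough illustration of that "event firing" idea (a sketch, not how any particular .NET operation is actually implemented), a task can be completed by a callback via TaskCompletionSource, so nothing blocks while waiting:

using System;
using System.Threading.Tasks;

class EventToTaskSketch
{
    // Hypothetical event source standing in for "the operation is complete".
    public event Action<string> DataReceived;

    // Returns a task that completes when the event fires; no thread is blocked while waiting.
    public Task<string> WaitForDataAsync()
    {
        var tcs = new TaskCompletionSource<string>();
        DataReceived += data => tcs.TrySetResult(data);   // the callback completes the task
        return tcs.Task;
    }

    // Raising the event "completes" any pending awaits.
    public void OnDataReceived(string data) => DataReceived?.Invoke(data);
}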
I have read in a couple of places that async / await does not create new threads to do the processing.
async and await do not; that is true. For more about how async and await work, see my intro post.
So, what is async / await doing to allow processing to happen at the same time?
One of the primary use cases of async/await is for I/O-based code. I have a long blog post that goes into the details of how asynchronous I/O does not require threads.
The three await Task.Delay are running in parallel, correct?
I prefer to use the term "concurrently", just to avoid confusion with Parallel and Parallel LINQ, both of which were created for CPU-bound parallelism and do not work as generally expected with async/await. So, I would say that both parallelism and asynchrony are forms of concurrency, and this is an example of asynchronous concurrency.
(That said, using the term "parallel" is certainly in concord with the common usage of the term).
If it is not creating 3 threads, what is it doing?
Task.Delay is not an I/O-based operation, but it is very similar to one. It uses timers underneath, so it's completely different than Thread.Sleep.
Thread.Sleep will block a thread - I believe it does go all the way to an OS Sleep call, which causes the OS to place the thread in a wait state until its sleep time is expired.
Task.Delay acts more like an I/O operation. So, it sets up a timer that fires off an event when the time expires. Timers are managed by the OS itself - as time proceeds forward (clock ticks on the CPU), the OS will notify the timer when it has completed. It's a bit more complex than that (for efficiency, .NET will coalesce managed timers), but that's the general idea.
So, the point is that there is no dedicated thread for each Task.Delay that is blocked.
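To make that concrete, a delay can be sketched as a one-shot timer whose callback completes a TaskCompletionSource; this is only an illustration of the general idea, not the real Task.Delay implementation:

using System;
using System.Threading;
using System.Threading.Tasks;

static class DelaySketch
{
    // Completes after the given time without dedicating a thread to the wait.
    public static Task DelayWithTimer(TimeSpan dueTime)
    {
        var tcs = new TaskCompletionSource<object>();
        Timer timer = null;
        timer = new Timer(_ =>
        {
            timer.Dispose();            // one-shot timer
            tcs.TrySetResult(null);     // the timer callback completes the task
        }, null, dueTime, Timeout.InfiniteTimeSpan);
        return tcs.Task;
    }
}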

How to abort old request processing when new request arrives on ASP.NET MVC 5?

I have a form with hundreds of checkboxes and dropdown menus (the values of many of them are coupled together). In the action there is an updating mechanism that updates an object in Session. This object does all validation and coupling of values; for example, if the user types 50% in one input field, we might add 3 new SelectListItem entries to a dropdown.
Everything works fine, but if the user starts clicking checkboxes very quickly (which is the normal case in our scenario), the controller gets multiple POSTs while it is still processing previous ones. Fortunately we are only interested in the last POST, so we need a way to abort/cancel ongoing requests when a newer request from the same form comes in.
What I tried:
1- Blocking the client side from making multiple posts while the server is still working on a previous one. This is not desirable because it causes noticeable pauses on the browser side.
2- There are several solutions for blocking multiple postbacks by using hash codes or AntiForgeryToken. But they don't do what I need; I need to abort the ongoing thread in favor of the new request, not block the incoming request.
3- I tried to extend the pipeline by adding two message handlers (one before the action and another after executing the action) to keep a hash code (or AntiForgeryToken), but the problem is still there: even though I can detect that there is an ongoing thread working on the same request, I have no way to abort that thread or mark the older request as complete.
Any thoughts?
The only thing you can do is throttle the requests client-side. Basically, you need to set a timeout when a checkbox is clicked. You can let that initial request go through, but any further requests are queued (or actually dropped after the first queued request, in your scenario) and don't run until the timeout clears.
There's no way to abort a request server-side. Each request is handled independently; the server has no inherent knowledge of anything that's happened before or since. The server has multiple threads fielding requests and will simply process them as fast as it can. There's no guaranteed order in which the requests are processed or in which responses are sent out: the first request could be the third one that receives a response, simply due to how the processing of each request goes.
You are trying to implement transactional functionality (i.e. counting only the last request) over an asynchronous technology. This is a design flaw.
Since you refuse to block on the client side, you have no method by which to control which requests process first, OR to correctly process the outcome again on the client-side.
You might actually run into this scenario:
Client sends Request A
Server starts processing Request A
Client sends Request B
Server starts processing Request B
Server returns results of Request B, and client changes accordingly
Server returns results of Request A, and client changes accordingly (and undoes prior changes resulting from Request B)
Blocking is the only way you can ensure the correct order.
Thanks for your help @xavier-j.
After playing around with this, I wrote the following. Hope it's useful for someone who needs the same thing.
First you need to add this ActionFilter:
public class KeepLastRequestAttribute : ActionFilterAttribute
{
    public string HashCode { get; set; }

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        base.OnActionExecuting(filterContext);

        Dictionary<string, CancellationTokenSource> clt;
        if (filterContext.HttpContext.Application["CancellationTokensDictionary"] != null)
        {
            clt = (Dictionary<string, CancellationTokenSource>)filterContext.HttpContext.Application["CancellationTokensDictionary"];
        }
        else
        {
            clt = new Dictionary<string, CancellationTokenSource>();
        }

        if (filterContext.HttpContext.Request.Form["__RequestVerificationToken"] != null)
        {
            HashCode = filterContext.HttpContext.Request.Form["__RequestVerificationToken"];
        }

        CancellationTokenSource oldCt = null;
        clt.TryGetValue(HashCode, out oldCt);

        CancellationTokenSource ct = new CancellationTokenSource();
        if (oldCt != null)
        {
            oldCt.Cancel();
            clt[HashCode] = ct;
        }
        else
        {
            clt.Add(HashCode, ct);
        }

        filterContext.HttpContext.Application["CancellationTokensDictionary"] = clt;
        filterContext.Controller.ViewBag.CancellationToken = ct;
    }

    public override void OnResultExecuted(ResultExecutedContext filterContext)
    {
        base.OnResultExecuted(filterContext);

        if (filterContext.Controller.ViewBag.ThreadHasBeenCanceld == null && filterContext.HttpContext.Application["CancellationTokensDictionary"] != null)
        {
            lock (filterContext.HttpContext.Application["CancellationTokensDictionary"])
            {
                Dictionary<string, CancellationTokenSource> clt = (Dictionary<string, CancellationTokenSource>)filterContext.HttpContext.Application["CancellationTokensDictionary"];
                clt.Remove(HashCode);
                filterContext.HttpContext.Application["CancellationTokensDictionary"] = clt;
            }
        }
    }
}
I am using the AntiForgeryToken here as the key; you can add your own custom hash code to get more control.
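One caveat about the filter above: the dictionary kept in Application state is read and mutated by concurrent requests, so if that becomes an issue, a thread-safe variant could keep the token sources in a ConcurrentDictionary instead. A minimal sketch of that bookkeeping (the names are mine, not part of the original code):

using System.Collections.Concurrent;
using System.Threading;

static class RequestTokenRegistry
{
    // Hypothetical thread-safe replacement for the Application-state dictionary above.
    private static readonly ConcurrentDictionary<string, CancellationTokenSource> Tokens =
        new ConcurrentDictionary<string, CancellationTokenSource>();

    // Registers a fresh token source for this form key and cancels the previous one, if any.
    public static CancellationTokenSource Register(string hashCode)
    {
        var fresh = new CancellationTokenSource();
        Tokens.AddOrUpdate(
            hashCode,
            fresh,
            (key, previous) => { previous.Cancel(); return fresh; });
        return fresh;
    }

    // Removes the entry when the request finishes normally.
    public static void Complete(string hashCode)
    {
        CancellationTokenSource removed;
        Tokens.TryRemove(hashCode, out removed);
    }
}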
In the controller you will have something like this
[HttpPost]
[KeepLastRequest]
public async Task<ActionResult> DoSlowJob(CancellationToken ct)
{
    CancellationTokenSource ctv = ViewBag.CancellationToken;
    CancellationTokenSource nct = CancellationTokenSource.CreateLinkedTokenSource(ct, ctv.Token, Response.ClientDisconnectedToken);

    var mt = Task.Run(() =>
    {
        SlowJob(nct.Token);
    }, nct.Token);

    await mt;
    return null;
}

private void SlowJob(CancellationToken ct)
{
    for (int i = 0; i < 10; i++)
    {
        Thread.Sleep(200);
        if (ct.IsCancellationRequested)
        {
            this.ViewBag.ThreadHasBeenCanceld = true;
            System.Diagnostics.Debug.WriteLine("cancelled!!!");
            break;
        }
        System.Diagnostics.Debug.WriteLine("doing job " + (i + 1));
    }
    System.Diagnostics.Debug.WriteLine("job done");
    return;
}
And finally, in your JavaScript you need to abort ongoing requests; otherwise the browser blocks new requests.
var currentRequest = null;

var onSomethingChanged = function () {
    if (currentRequest != null) {
        currentRequest.abort();
    }

    var fullData = $('#my-heavy-form :input').serializeArray();
    currentRequest = $.post('/MyController/DoSlowJob', fullData).done(function (data) {
        // Do whatever you want with returned data
    }).fail(function (f) {
        console.log(f);
    });

    currentRequest.always(function () {
        currentRequest = null;
    });
};

Parallel httprequest in UWP app

I'm creating an app that needs to do parallel HTTP requests; I'm using HttpClient for this.
I'm looping over the URLs, and for each URL I start a new Task to do the request.
After the loop I wait until every task finishes.
However, when I check the calls being made with Fiddler I see that the requests are being made synchronously: it's not a bunch of requests going out at once, but one by one.
I've searched for a solution and found that other people have experienced this too, but not with UWP. The solution was to increase the DefaultConnectionLimit on the ServicePointManager.
The problem is that ServicePointManager does not exist for UWP. I've looked in the API's and I thought I could set the DefaultConnectionLimit on HttpClientHandler, but no.
So I have a few Questions.
Is DefaultConnectionLimit still a property that can be set somewhere?
If so, where do I set it?
If not, how do I increase the connection limit?
Is there still a connection limit in UWP?
this is my code:
var requests = new List<Task>();
var client = GetHttpClient();

foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew((x) =>
    {
        ((Show)x).NextEpisode = GetEpisodeAsync(((Show)x).NextEpisodeUri, client).Result;
    }, show));
}

await Task.WhenAll(requests.ToArray());
and this is the request:
public async Task<Episode> GetEpisodeAsync(string nextEpisodeUri, HttpClient client)
{
    try
    {
        if (String.IsNullOrWhiteSpace(nextEpisodeUri)) return null;

        HttpResponseMessage content = await client.GetAsync(nextEpisodeUri);
        if (content.IsSuccessStatusCode)
        {
            return JsonConvert.DeserializeObject<EpisodeWrapper>(await content.Content.ReadAsStringAsync()).Episode;
        }
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
    }

    return null;
}
OK, I have the solution. I do need to use async/await inside the task. The problem was that I was using StartNew instead of Run, but I have to use StartNew because I'm passing along state.
With StartNew, the inner task is not awaited unless you call Unwrap, i.e. Task.Factory.StartNew(.....).Unwrap(). This way Task.WhenAll() will wait until the inner task is complete.
When you use Task.Run() you don't have to do this.
Task.Run vs Task.StartNew
The stackoverflow answer
var requests = new List<Task>();
var client = GetHttpClient();

foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew(async (x) =>
    {
        ((Show)x).NextEpisode = await GetEpisodeAsync(((Show)x).NextEpisodeUri, client);
    }, show)
    .Unwrap());
}

Task.WaitAll(requests.ToArray());
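As an aside, here is a sketch of the Task.Run alternative mentioned above: if the show is captured in a closure instead of being passed as the state argument, Task.Run accepts the async lambda and unwraps the inner task automatically, so no Unwrap call is needed (GetHttpClient and GetEpisodeAsync are the same methods as in the question):

var requests = new List<Task>();
var client = GetHttpClient();

foreach (var show in shows)
{
    var current = show;                 // capture the loop variable for the closure
    requests.Add(Task.Run(async () =>
    {
        current.NextEpisode = await GetEpisodeAsync(current.NextEpisodeUri, client);
    }));
}

await Task.WhenAll(requests);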
I think an easier way to solve this is not to "manually" start tasks, but instead to use LINQ with an async delegate to query the episodes and then set them afterwards.
You basically make it a two step process:
Get all next episodes
Set them in the for each
This also has the benefit of decoupling your querying code from the side effect of setting the show.
var shows = Enumerable.Range(0, 10).Select(x => new Show());
var client = new HttpClient();

(Show, Episode)[] nextEpisodes = await Task.WhenAll(shows
    .Select(async show =>
        (show, await GetEpisodeAsync(show.NextEpisodeUri, client))));

foreach ((Show Show, Episode Episode) tuple in nextEpisodes)
{
    tuple.Show.NextEpisode = tuple.Episode;
}
Note that I am using the new tuple syntax of C# 7. Change to the old tuple syntax accordingly if it is not available.

Scatter / Gather using Rebus

I have a requirement to batch a number of web service calls on receipt of a single message appearing in an (MSMQ) queue.
Is "sagas" the way to go?
The interaction with the 3rd party web service is further complicated because I need to call it once, then poll subsequently for an acknowledgement with a correlation id, returned in the reply to the initial call to the web service.
Yes, sagas could be the way to coordinate the process of initiating the call, polling until the operation has ended, and then doing something else when all the work is done.
If you don't care too much about accidentally making the web service call more than once, you can easily use Rebus' async capabilities to implement the polling - I am currently in the process of building something that basically does this:
public async Task<SomeReply> Handle(SomeMessage message)
{
    var response = await _client.Get<SomeResponse>("https://someurl");
    var pollUrl = response.PollUrl;
    var resultUrl = response.ResultUrl;

    while (true)
    {
        var result = await _client.Get<PollResult>(pollUrl);

        if (result.Status == PollStatus.Processing)
        {
            await Task.Delay(TimeSpan.FromSeconds(2));
            continue;
        }

        if (result.Status == PollStatus.Done)
        {
            var finalResult = await _client.Get<FinalResult>(resultUrl);
            // returning the reply here for brevity; in a real handler you would typically send it via the bus
            return new SomeReply(finalResult);
        }

        throw new Exception($"Unexpected status while polling {pollUrl}: {result.Status}");
    }
}
thus taking advantage of async/await to poll the external web service while it is processing, while consuming minimal resources on our end.

Some questions concerning the combination of ADO.NET, Dapper QueryAsync and Glimpse.ADO

I have been experimenting with a lightweight solution for handling my business logic. It consists of a vanilla ADO.NET connection that is extended with Dapper and monitored by Glimpse.ADO. The use case for this setup will be a web application that has to process a handful of queries asynchronously per request. Below is a simple implementation of my setup in an MVC controller.
public class CatsAndDogsController : Controller
{
    public async Task<ActionResult> Index()
    {
        var fetchCatsTask = FetchCats(42);
        var fetchDogsTask = FetchDogs(true);
        await Task.WhenAll(fetchCatsTask, fetchDogsTask);

        ViewBag.Cats = fetchCatsTask.Result;
        ViewBag.Dogs = fetchDogsTask.Result;
        return View();
    }

    public async Task<IEnumerable<Cat>> FetchCats(int breedId)
    {
        IEnumerable<Cat> result = null;
        using (var connection = CreateAdoConnection())
        {
            await connection.OpenAsync();
            result = await connection.QueryAsync<Cat>("SELECT * FROM Cat WHERE BreedId = @bid;", new { bid = breedId });
            connection.Close();
        }
        return result;
    }

    public async Task<IEnumerable<Dog>> FetchDogs(bool isMale)
    {
        IEnumerable<Dog> result = null;
        using (var connection = CreateAdoConnection())
        {
            await connection.OpenAsync();
            result = await connection.QueryAsync<Dog>("SELECT * FROM Dog WHERE IsMale = @im;", new { im = isMale });
            connection.Close();
        }
        return result;
    }

    public System.Data.Common.DbConnection CreateAdoConnection()
    {
        var sqlClientProviderFactory = System.Data.Common.DbProviderFactories.GetFactory("System.Data.SqlClient");
        var dbConnection = sqlClientProviderFactory.CreateConnection();
        dbConnection.ConnectionString = "SomeConnectionStringToAwesomeData";
        return dbConnection;
    }
}
I have some questions concerning the creation of the connection in the CreateAdoConnection() method. I assume the following is happening behind the scenes.
The call to sqlClientProviderFactory.CreateConnection() returns an instance of System.Data.SqlClient.SqlConnection passed as a System.Data.Common.DbConnection. At this point Glimpse.ADO.AlternateType.GlimpseDbProviderFactory kicks in and wraps this connection in an instance of Glimpse.Ado.AlternateType.GlimpseDbConnection, which is also passed as a System.Data.Common.DbConnection. Finally, this connection is indirectly extended by the Dapper library with its query methods, among them the QueryAsync<>() method used to fetch the cats and dogs.
The questions:
Is the above assumption correct?
If I use Dapper's async methods with this connection - or create a System.Data.Common.DbCommand with this connection's CreateCommand() method and use its async methods - will those calls internally always end up using the vanilla async implementations of these methods as Microsoft has written them for System.Data.SqlClient.SqlConnection and System.Data.SqlClient.SqlCommand? And not some other implementations of these methods that are actually blocking?
How much perf do I lose with this setup compared to just returning a new System.Data.SqlClient.SqlConnection directly? (So, without the Glimpse.ADO wrapper)
Any suggestions on improving this setup?
Yes, pretty much. GlimpseDbProviderFactory wraps/decorates/proxies all the registered factories. We then pass any calls we get through to the factory we wrap (in this case SQL Server). In the case of CreateConnection() we ask the inner factory to create a connection; when we get that connection, we wrap it and then return it to the originating caller.
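To illustrate that wrap-and-delegate idea (a hypothetical sketch with made-up names; Glimpse's real types are more involved), a decorating factory and connection simply forward every call to the inner objects:

using System.Data;
using System.Data.Common;
using System.Threading;
using System.Threading.Tasks;

public class WrappingDbProviderFactory : DbProviderFactory
{
    private readonly DbProviderFactory _inner;

    public WrappingDbProviderFactory(DbProviderFactory inner) { _inner = inner; }

    public override DbConnection CreateConnection()
    {
        // Delegate to the wrapped factory, then decorate the connection it returns.
        return new WrappingDbConnection(_inner.CreateConnection());
    }
}

public class WrappingDbConnection : DbConnection
{
    private readonly DbConnection _inner;

    public WrappingDbConnection(DbConnection inner) { _inner = inner; }

    // Every member simply forwards to the wrapped connection, which is why the
    // overhead is limited to an extra frame or two on the call stack.
    public override string ConnectionString { get { return _inner.ConnectionString; } set { _inner.ConnectionString = value; } }
    public override string Database { get { return _inner.Database; } }
    public override string DataSource { get { return _inner.DataSource; } }
    public override string ServerVersion { get { return _inner.ServerVersion; } }
    public override ConnectionState State { get { return _inner.State; } }
    public override void ChangeDatabase(string databaseName) { _inner.ChangeDatabase(databaseName); }
    public override void Close() { _inner.Close(); }
    public override void Open() { _inner.Open(); }
    // Forwarding the async method (rather than falling back to a blocking Open) is what keeps the async chain asynchronous.
    public override Task OpenAsync(CancellationToken cancellationToken) { return _inner.OpenAsync(cancellationToken); }
    protected override DbTransaction BeginDbTransaction(IsolationLevel isolationLevel) { return _inner.BeginTransaction(isolationLevel); }
    protected override DbCommand CreateDbCommand() { return _inner.CreateCommand(); }
}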
Yes. Glimpse doesn't turn what was an async request into a blocking request. We preserve the async chain all the way through. If you are interested, the code in question is here.
Very little. In essence, using a decorator pattern like this adds only one or two frames to the call stack. Compared to most operations performed during the request lifecycle, the time spent observing what's happening here is extremely minimal.
What you have looks great. The only suggestion is to maybe use this code to build the factory. That way you can shift your connection string, etc. to the web.config.
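For reference, a minimal sketch of what building the connection from web.config could look like; the connection string name "AwesomeData" is an assumption for illustration:

using System.Configuration;
using System.Data.Common;

public static class ConnectionFactory
{
    // Reads both the connection string and the provider name from web.config,
    // so swapping providers (or letting Glimpse wrap them) needs no code change.
    public static DbConnection CreateConnection()
    {
        ConnectionStringSettings settings = ConfigurationManager.ConnectionStrings["AwesomeData"];
        DbProviderFactory factory = DbProviderFactories.GetFactory(settings.ProviderName);

        DbConnection connection = factory.CreateConnection();
        connection.ConnectionString = settings.ConnectionString;
        return connection;
    }
}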
