I have a web site which makes frequent requests to an external web service, and I'd like these calls to be async and parallel to avoid blocking and to speed up the site a bit. Basically, I have 8 widgets, each of which has to make its own web call(s).
For some reason, only the first 3 or so of them truly load asynchronously; after that the threads don't free up in time, and the rest of the widgets load sequentially. If I could get 3 of them to load in parallel, then 3 more in parallel, then the last 2 in parallel, I'd be happy. So the issue is really that the threads aren't freeing up in time.
I'm guessing the answer has to do with some IIS configuration. I'm testing on a non-server OS, so maybe that's part of it.
Edit for @Jon Skeet:
I'm using reflection to invoke the web calls like this:
output = methodInfo.Invoke(webservice, parameters);
The widget actions (which eventually call the web service) are invoked via a jQuery $.each() loop and the .load function (maybe this causes a bottleneck?). The widget actions are set up as async methods in an async controller.
Here is the code for one of the async methods (they are all set up like this):
public void MarketTradeWidgetAsync()
{
    AsyncManager.OutstandingOperations.Increment();

    //a bunch of market trade logic
    //this eventually calls the web service
    PlanUISetting uiSettingMarketQuotesConfig = WebSettingsProviderManager.Provider.GetMarketQuotes(System.Configuration.ConfigurationManager.AppSettings["Theme"], SessionValues<String>.GlobalPlanID, SessionValues<String>.ParticipantID, "MARKETQUOTES");

    AsyncManager.OutstandingOperations.Decrement();
}
public ActionResult MarketTradeWidgetCompleted(MarketTradeTool markettradetool)
{
    if (Session.IsNewSession)
        return PartialView("../Error/AjaxSessionExpired");
    else
    {
        ViewData["MarketData"] = markettradetool;
        return PartialView(markettradetool);
    }
}
And, like I said, these methods are called via jQuery. My thinking is that since the action methods are async, they should give control back to the jQuery as soon as they are called, right?
Setting SessionState = "ReadOnly" for the page at hand fixed this issue. Evidently session locking was the problem.
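In ASP.NET MVC this can be expressed with the SessionState attribute on the controller. A minimal sketch (the controller name is hypothetical, and the attribute requires MVC 3 or later):

```csharp
using System.Web.Mvc;
using System.Web.SessionState;

// Marking session access as read-only avoids the exclusive session lock
// that otherwise serializes concurrent requests from the same session.
[SessionState(SessionStateBehavior.ReadOnly)]
public class WidgetController : AsyncController
{
    // async widget actions go here
}
```

With the default (read-write) behavior, ASP.NET holds an exclusive lock on the session for the duration of each request, which is exactly the "threads not freeing up" symptom described above.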
I would like to demonstrate the classic completion-handler logic with the Task library (not with the raw Thread object). So I would like to create a Task instance, run it (on the default thread pool, or any other way), and attach a completion handler that runs on completion (probably on the Task's thread) with access to the result.
(I do know now we can use async await, also know about marshaling, also know about ASP.NET threads and contexts, please do not explain those.)
I would like to demonstrate the classic completion handler logic with Task library.
I really don't think this is very useful. It sounds like you're describing continuation-passing style, which was never widely adopted among .NET libraries. There was one HTTP library using it, but other than that I did not see much adoption of that pattern in .NET at all. JavaScript (particularly server-side JS), on the other hand, did use CPS quite a bit.
That said, if you really want to do it, you can:
public static void Run<T>(Func<T> action, Action<Exception> onError, Action<T> onComplete)
{
    Task.Run(action).ContinueWith(
        t =>
        {
            T result;
            try
            {
                result = t.Result;
            }
            catch (Exception ex)
            {
                onError(ex);
                return;
            }
            onComplete(result);
        },
        CancellationToken.None,
        TaskContinuationOptions.ExecuteSynchronously,
        TaskScheduler.Default);
}
I am working on an ASP .NET MVC 5 application that requires me to use the Task objects that were introduced in .NET 4.0. I am browsing through a few links that give an overview of Task objects. However, I could use a bit of help to check if I am going in the right direction.
Here is the stub that is generated by Visual Studio:
public Task<MyAppUser> FindByNameAsync(string userName) {
    throw new System.NotImplementedException();
}
I have written a method called mySearch() that searches through a list. I could use this function for my implementation:
public Task<MyAppUser> FindByNameAsync(string userName) {
    MyAppUser val = mySearch(userName);
    return Task<MyAppUser>.FromResult<MyAppUser>(val);
}
While this may work, I am thinking I am not really utilizing the Task paradigm properly. Perhaps I can write the code as follows:
public Task<MyAppUser> FindByNameAsync(string userName) {
    return Task<MyAppUser>.Factory.StartNew(() => mySearch(userName));
}
As I understand, I am simply returning a delegate as a Task object which the ASP.NET engine will execute as needed.
Am I using the Task paradigm correctly?
Don't ever return a new Task from an XXXAsync method - that's almost the worst thing you can do. In your case, using Task.FromResult is probably the best option (if you are indeed forced to use the XXXAsync methods and if you really don't have asynchronous I/O for the search method). In a web application, it's better to do the whole thing synchronously rather than appearing asynchronous while still taking up a different thread.
The reasoning is simple - asynchronous methods are a great way to conserve resources. Asynchronous I/O doesn't require a thread, so you can afford to reuse the current thread for other work until the data is actually ready. In ASP.NET, the callback will be posted back to a ThreadPool thread, so you've managed to increase your throughput essentially for free.
If you fake the asynchronous method by using Task.FromResult, it's true that this is lost. However, unlike in WinForms or WPF, you're not freezing the GUI, so there's no point in masking the lacking asynchronicity by spawning a new thread.
When you do the faking by using TaskFactory.StartNew or Task.Run, you're only making things worse, essentially - it's true that you release the original thread as with proper async I/O, but you also claim a new thread from the ThreadPool - so you're still blocking one thread, you just added a bunch of extra work for the plumbing.
@Luaan's answer is quite good. I just want to expand on a couple of principles for using async on ASP.NET.
1) Use synchronous method signatures for synchronous work.
I'm not sure why VS is generating an asynchronous stub. Since your mySearch just "searches through a list" (a synchronous operation), then your method should look like this instead:
public MyAppUser FindByName(string userName) {
    return mySearch(userName);
}
2) Use async/await for asynchronous work (i.e., anything doing I/O). Do not use Task.Run or (even worse) Task.Factory.StartNew to fake asynchronous work within a request context.
For example, if you needed to search in a database (I/O), then that would be naturally asynchronous, and you should use the asynchronous APIs (e.g., EF6 has asynchronous queries):
public Task<MyAppUser> FindByNameAsync(string userName) {
    return dbContext.Users.Where(x => x.Name == userName).FirstAsync();
}
If you're planning to have asynchronous APIs but for now you're just doing test/stub code, then you should use FromResult:
public Task<MyAppUser> FindByNameAsync(string userName) {
    return Task.FromResult(mySearch(userName));
}
I've got quite a lot of code on my site that looks like this;
Item item;
if (Cache["foo"] != null)
{
    item = (Item)Cache["foo"];
}
else
{
    item = database.getItemFromDatabase();
    Cache.Insert(item, "foo", null, DateTime.Now.AddDays(1), ...
}
One such instance of this has a rather expensive getItemFromDatabase method (which is the main reason it's cached). The problem I have is that with every release or restart of the application, the cache is cleared and then an army of users come online and hit the above code, which kills our database server.
What is the typical method of dealing with these sorts of scenarios?
You could hook into the Application_Start event in the global.asax file and call a method that loads the results of the expensive database calls on a separate thread when the application starts.
It may also be worth using a specialised class for accessing these values that uses a locking pattern, to avoid multiple database calls when the initial value is null.
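A minimal, framework-free sketch of that locking pattern (the names `CachedLoader` and `GetOrLoad` are illustrative, and the string payload stands in for the `Item` loaded from the database): the first caller to miss the cache performs the expensive load, while concurrent callers for the same key block on the same `Lazy<T>`, so the database is hit only once.

```csharp
using System;
using System.Collections.Concurrent;

public static class CachedLoader
{
    private static readonly ConcurrentDictionary<string, Lazy<string>> cache =
        new ConcurrentDictionary<string, Lazy<string>>();

    public static string GetOrLoad(string key, Func<string> load)
    {
        // GetOrAdd may construct more than one Lazy under a race, but
        // Lazy<T> (with its default ExecutionAndPublication mode) guarantees
        // the load delegate runs at most once per stored instance.
        var lazy = cache.GetOrAdd(key, _ => new Lazy<string>(load));
        return lazy.Value;
    }
}
```

In the real site the value factory would wrap `database.getItemFromDatabase()`, and an expiry policy would still be layered on top (e.g. by keeping the result in `Cache` as before).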
I am developing an application in GWT as my Bachelor's Thesis and I am fairly new to this. I have researched asynchronous callbacks on the internet. What I want to do is this: I want to handle the login of a user and display different data if they are an admin or a plain user.
My call looks like this:
serverCall.isAdmin(new AsyncCallback<Boolean>() {
    public void onFailure(Throwable caught) {
        //display error
    }

    public void onSuccess(Boolean admin) {
        if (!admin) {
            //do something
        }
        else {
            //do something else
        }
    }
});
Now, the code examples I have seen handle the data directly in the "do something" part. When I discussed this with the person supervising me, I had the idea that I could fire an event upon success and load the page accordingly when that event is fired. Is this a good idea, or should I stick with loading everything in the inner function? What confuses me about async callbacks is that I can only use final variables inside the onSuccess function, so I would rather not handle things in there. Insight would be appreciated.
Thanks!
The restriction to final variables inside an inner class is just standard Java: an anonymous inner class captures a copy of each local variable it references, and requiring those variables to be final guarantees the copy can never diverge from the original. (Fields of the enclosing class are accessed through a reference rather than copied, which is why they don't need to be final.) There is a great discussion of this topic elsewhere.
When I use the AsyncCallback I do exactly what you suggested, I fire an event though GWT's EventBus. This allows several different parts of my application to respond when a user does log in.
Let's say that, theoretically, I have a page / controller action in my website that does some very heavy stuff. It takes about 10 seconds to complete its operation.
Now, I use .NET's output cache mechanism to cache it for 15 minutes (for example, I use [OutputCache(Duration = 900)]). What happens if, after 15 minutes, the cache has expired and 100 users request the page again within the 10 seconds that it takes to do the heavy processing?
1. The heavy stuff is done only the first time, and there is some locking mechanism so that the other 99 users will get the cached result, or
2. The heavy stuff is done 100 times (and the server is crippled, as it can take up to 100 * 10 seconds)
Easy question maybe, but I'm not 100% sure. I hope it is number one, though :-)
Thanks!
Well, it depends upon how you have IIS configured. If you have less than 100 worker threads (let's say, 50), then the "heavy stuff" is done 50 times, crippling your server, and then the remaining 50 requests will be served from cache.
But no, there is no "locking mechanism" on a cached action result; that would be counterproductive, for the most part.
Edit: I believe this to be true, but Nick's tests say otherwise, and I don't have time to test now. Try it yourself! The rest of the answer is not dependent on the above, though, and I think it's more important.
Generally speaking, however, no web request, cached or otherwise, should take 10 seconds to return. If I were in your shoes, I would look at somehow pre-computing the hard part of the request. You can still cache the action result if you want to cache the HTML, but it sounds like your problem is somewhat bigger than that.
You might also want to consider asynchronous controllers. Finally, note that although IIS and ASP.NET MVC will not lock on this heavy computation, you could. If you use asynchronous controllers combined with a lock on the computation, then you would effectively get the behavior you're asking for. I can't really say whether that's the best solution without knowing more about what you're doing.
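A rough sketch of the lock-on-the-computation idea (all names here are hypothetical; this shows the shape of the idea, not a drop-in implementation):

```csharp
using System.Web.Mvc;

public class ReportController : Controller
{
    private static readonly object computeLock = new object();
    private static string cachedResult;

    [OutputCache(Duration = 900)]
    public ActionResult Heavy()
    {
        // Only one request performs the expensive work; concurrent
        // requests wait on the lock and then reuse the computed result.
        lock (computeLock)
        {
            if (cachedResult == null)
            {
                cachedResult = DoHeavyComputation(); // hypothetical 10-second operation
            }
        }
        return Content(cachedResult);
    }

    private static string DoHeavyComputation()
    {
        return "...";
    }
}
```

Note that a static lock like this serializes all requests to the action while the computation runs, so it trades throughput for protecting the database; invalidation of `cachedResult` would also need to be handled when the underlying data changes.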
It seems to lock here, doing a simple test:
<%@ OutputCache Duration="10" VaryByParam="*" %>

protected void Page_Load(object sender, EventArgs e)
{
    System.Threading.Thread.Sleep(new Random().Next(1000, 30000));
}
The first request hits a breakpoint there, even though it's left sleeping... no other request hits the breakpoint in the Page_Load method... they wait for the first one to complete, and its result is returned to everyone who requested the page.
Note: this was simpler to test in a webforms scenario, but given this is a shared aspect of the frameworks, you can do the same test in MVC with the same result.
Here's an alternative way to test:
<asp:Literal ID="litCount" runat="server" />

public static int Count = 0;

protected void Page_Load(object sender, EventArgs e)
{
    litCount.Text = Count++.ToString();
    System.Threading.Thread.Sleep(10000);
}
All pages queued up while the first request goes to sleep will have the same count output.
Old question, but I ran into this problem and did some investigation.
Example code:
public static int Count;

[OutputCache(Duration = 20, VaryByParam = "*")]
public ActionResult Test()
{
    System.Threading.Thread.Sleep(4000);
    return Content((Count++).ToString());
}
Run it in one browser, and it seems to lock and wait.
Run it in different browsers (I tested in IE and firefox) and the requests are not put on hold.
So the "correct" behaviour has more to do with which browser you are using than the function in IIS.
Edit: To clarify - No lock. The server gets hit by all requests that manage to get in before the first result is cached, possibly resulting in a hard hit on the server for heavy requests. (Or if you call an external system, that system could be brought down if your server serves many requests...)
I made a small test that might help. What I believe I've discovered is that the uncached requests do not block, and each request that comes in while the cache is expired and before the task has completed ALSO triggers that task.
For example, the code below takes about 6-9 seconds on my system using Cassini. If you send two requests, approximately 2 seconds apart (i.e. two browser tabs), both will receive unique results. The last request to finish is also the response that gets cached for subsequent requests.
// CachedController.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;

namespace HttpCacheTest.Controllers
{
    public class CachedController : Controller
    {
        //
        // GET: /Cached/
        [OutputCache(Duration = 20, VaryByParam = "*")]
        public ActionResult Index()
        {
            var start = DateTime.Now;
            var i = Int32.MaxValue;
            while (i > 0)
            {
                i--;
            }
            var end = DateTime.Now;

            return Content(end.Subtract(start).ToString());
        }
    }
}
You should check this information here:
"You have a single client making multiple concurrent requests to the server. The default behavior is that these requests will be serialized;"
So, if concurrent requests from a single client are serialized, the subsequent requests will use the cache. That explains the behavior seen in some of the answers above (@mats-nilsson and @nick-craver).
The context you showed us is multiple users, who will hit your server at the same time; the server will stay busy until it has completed at least one request and created the output cache, which is then used for the next requests. So if you want to serialize multiple users requesting the same resource, we need to understand how the serialized requests work for a single user. Is that what you want?