I'm not talking about asynchronous pages or asynchronous handlers; I just want to know whether I should be afraid of any side effects when I invoke an asynchronous method that will finish after the page is done rendering.
Example: each time a user logs in, I have to launch a heavy, time-consuming SQL operation, but the user doesn't need to know the result of that operation, so I can execute the query using BeginExecuteNonQuery without passing any callback, and finish rendering the page.
My concern is: what happens if the HTTP call ends (because the page has been served) while whatever I've executed asynchronously is still running? Is ASP.NET or IIS going to cut, destroy, or void anything?
Cheers.
That operation will run, even when the request has finished. However, please note that the ASP.NET host aggressively kills threads. When IIS has any reason to unload or recycle the AppDomain, your background thread will be killed. Unloading happens in several situations: for instance, when no new requests have come in for a certain period of time, when too many exceptions are thrown by the application within a certain period of time, or when memory pressure gets too high.
If you need a guarantee that the operation will finish, I think there are three things you can do:
Speed up the operation so that it can run synchronously, or
Move that heavy operation to a Windows Service and let that execute it, or
Hook into the HostingEnvironment.RegisterObject method (as Phil Haack explains here) (demands full trust) to prevent the AppDomain from going down while that thread is running; a rough sketch follows below.
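For reference, here is a rough sketch of that pattern, assuming a hypothetical worker class of your own (the class and method names are illustrative, not part of any framework):

using System;
using System.Threading;
using System.Web.Hosting;

// Illustrative worker: ASP.NET calls Stop() and gives it a chance to finish
// before tearing down the AppDomain.
public class BackgroundWorkRunner : IRegisteredObject
{
    private readonly object _lock = new object();
    private bool _shuttingDown;

    public BackgroundWorkRunner()
    {
        HostingEnvironment.RegisterObject(this);
    }

    public void Run(Action work)
    {
        ThreadPool.QueueUserWorkItem(_ =>
        {
            lock (_lock)
            {
                if (_shuttingDown) return;   // host is going down; skip new work
                work();                      // Stop() blocks until this releases the lock
            }
        });
    }

    // Called by ASP.NET before the AppDomain is unloaded.
    public void Stop(bool immediate)
    {
        lock (_lock) { _shuttingDown = true; }
        HostingEnvironment.UnregisterObject(this);
    }
}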
If you have a callback registered, the process will come back to notify the callback; otherwise it will still complete the job. AFAIK, neither ASP.NET nor IIS will cut, destroy, or void anything, as the execution was already ordered and it has to complete.
Related
For logging purposes of an ASP.NET web application, I keep some state information in a static class. These fields are marked [ThreadStatic] so every thread has its own copy of the field. The logging methods are called from the HttpApplication event methods:
Application_BeginRequest (request start, initialise state)
Application_AcquireRequestState (session and user are known)
Application_EndRequest (request end, clean up)
I can now observe that, under certain circumstances, a page request is processed in different threads. The BeginRequest event runs on thread 18 while the following events run on thread 4. Of course my thread-static data is then not available and an error occurs.
Most of the time this works just fine and every request is processed in a single thread only. But when I request a page that takes ~5 seconds to load and click on another link after 1-2 seconds, both requests run in parallel. The first is completed on thread 24 (where it also started) after 5 seconds, while the other starts on thread 18; after the first request has completed, the second continues to run on thread 4.
Trying it with 3 overlapping long requests, it's pure chaos. I can even watch two requests start on the same thread and each later continue on a different thread. There doesn't seem to be any relationship between a request and a thread.
How can it be that a request is changing threads? It's losing all of its state if it decides to move on to another thread. And every description I can find says that it all happens in a single thread.
ASP.NET 4.0 on IIS 7, Windows Server 2008 R2, x64.
Alternative: if I cannot rely on requests being processed in only a single thread from start to end, what would be the best place to store small amounts of per-request data (currently an integer and a class) that is very fast to access? Preferably one that also works without a reference to System.Web (my code targets the client profile as well). I know about HttpContext.Current.Items[key], but it's looked up somewhere deep in the remoting assembly and involves a dictionary, which seems a lot slower than a thread-static field.
ASP.NET is thread agile, and a request can be processed on more than one thread (but not on more than one at a time). Because of this you really can't use ThreadStatics in ASP.NET. However, you can safely use the HttpContext.Items dictionary to store things that need to be scoped to a single request.
To allow your code to work outside the context of an ASP.NET application, you could create a wrapper that swaps HttpContext / CallContext, depending on which environment the code is in. Here is an example of such a wrapper.
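A rough sketch of such a wrapper, under the assumption that per-request values go into HttpContext.Items inside ASP.NET and into the logical call context elsewhere (the class name is made up):

using System.Runtime.Remoting.Messaging;
using System.Web;

public static class RequestScopedStore
{
    public static void Set(string key, object value)
    {
        if (HttpContext.Current != null)
            HttpContext.Current.Items[key] = value;   // scoped to the current request
        else
            CallContext.LogicalSetData(key, value);   // scoped to the logical call context
    }

    public static object Get(string key)
    {
        return HttpContext.Current != null
            ? HttpContext.Current.Items[key]
            : CallContext.LogicalGetData(key);
    }
}

Note that this particular sketch still compiles against System.Web; hiding it behind an interface would be needed for the client-profile scenario mentioned in the question.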
I am migrating an app written in ASP.NET 1.1. There is a process on one page which can take 5 minutes, processing data in SQL and letting the user know when it's complete.
To get around the HTTP page timeout, the process runs asynchronously and the page refreshes every 5 seconds checking for completion. It's very simple. Here is the problem: I use a session variable as a semaphore to signal process completion.
This is not working now, as I cannot read the semaphore set in the async process. The async process can read the session from the calling routine, but cannot write back to it.
First, is there a way to get the async process to write to a session variable that can be read by another process? This is probably not the best approach today, but getting the app working is my biggest priority.
Second, if I rewrite it, what approach should be used? This is an ASP.NET Web Forms app, not MVC.
Use client callbacks: they allow you to invoke a server-side operation from your client and get a return value from the server, so there is no session to manage any more:
http://msdn.microsoft.com/en-us/library/ms178210(v=vs.80).aspx
I'd like to start using asynchronous processing in IIS. Edit: I'm talking about using the Task Parallel Library.
For example, on certain page loads I want to log a bunch of crap, send an email, update some tables, etc. But I don't want to make the user wait for me to log all that crap.
So normally what I do is I have a static Queue that I push the log info onto, and then I have a cron job that calls a special page every 10 minutes whose OnLoad flushes out the queue. This works, but it's kind of clunky to set up, especially when you want to log 50 things. I'd rather do this:
Task.Factory.StartNew(() => Log(theStuff));
However I'm terrified of running tasks in IIS because one slip up and your entire website goes down.
So now I have
SafeTask.FireAndForget(() => Log(theStuff));
This wraps the delegate in some try/catch and passes it to Task.Factory.StartNew. So if someone changes something that affects something else that generates an exception somewhere else that accidentally gets thrown on the task thread, we get a notification instead of a crashed website. Also, the error notification inside the catch is itself inside its own try/catch, and the catch for that also has a try/catch that tries to log in a different way.
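A minimal sketch of such a wrapper, assuming hypothetical Notify and FallbackLog methods for the error reporting described above:

using System;
using System.Threading.Tasks;

public static class SafeTask
{
    public static void FireAndForget(Action work)
    {
        Task.Factory.StartNew(() =>
        {
            try
            {
                work();
            }
            catch (Exception ex)
            {
                try
                {
                    Notify(ex);                      // primary error notification
                }
                catch (Exception notifyEx)
                {
                    try { FallbackLog(notifyEx); }   // last-ditch logging
                    catch { /* nothing left to do */ }
                }
            }
        });
    }

    private static void Notify(Exception ex) { /* e-mail, event log, etc. */ }
    private static void FallbackLog(Exception ex) { /* e.g. write to a local file */ }
}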
Now that I can safely run stuff asynchronously in IIS, what other things do I need to worry about before I can start using my SafeTask class?
Every request in IIS and .NET is processed on one thread by default. That thread comes from the CLR thread pool serving the application pool's worker process. Existing threads are reused, so you can't really rely on them for thread state unless you clear or set it on every request. You can configure the size of this thread pool, using a formula from MSDN, in machine.config or even your web.config.
Now, every async function call is put on a different thread. This includes async web service calls, async page functions, async delegates, etc. That thread also comes from the thread pool, thus reducing the number of threads available for IIS to service new requests.
Most likely, your application will work just fine while using async function calls. If you are worried, or you have a lot of async tasks, you may want to create your own thread pool or look at SmartThreadPool on CodePlex.
Hope this helps.
Consider using the page's OnUnload event. Read about it here: http://msdn.microsoft.com/en-us/library/ms178472.aspx
This event fires after the content is sent to the user (so the user isn't blocked while you do work), and should completely satisfy your requirement without introducing additional threads.
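A minimal sketch of that approach inside a page's code-behind (Log and theStuff stand in for whatever work and data you have):

protected override void OnUnload(EventArgs e)
{
    base.OnUnload(e);

    // Runs after rendering has completed; the Response can no longer be
    // changed here, so only do side work (logging, e-mail, table updates).
    Log(theStuff);   // placeholder for your own method and data
}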
Specific to your question, you should be concerned about thread pool exhaustion only if your load and performance testing suggests you're running up against thread limits. If you're not then what you propose is certainly reasonable.
I have an ASP.NET application that needs to have some work performed by another machine. To do this I leave a message on a queue visible to both machines. When the work is done, a message is left on a second queue.
I need the ASP.Net application to check the second queue periodically to see if any of the tasks are complete.
Where is the best place to put such a loop? Global.asax?
I remember reading somewhere that you can get a function called after an interval. Would that be suitable?
To run periodic tasks in ASP.NET, I've found two acceptable approaches:
Spawn a thread during Application_Start in global.asax and, in a while loop, (1) do the work, then (2) sleep the thread for an interval.
Again in Application_Start, insert a dummy item into the ASP.NET cache that expires after a certain interval, and give that cache item a removal callback. In that callback, do the work and insert the cache item back the same way (a sketch of this follows below).
In both approaches, you need to make sure your work keeps running even if an error occurs. You may place restore code in Session_Start and BeginRequest to check that your thread or cache item is still there, and renew it if something has happened to it.
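A rough sketch of the cache-item approach (the key name and interval are arbitrary); call Start() once from Application_Start:

using System;
using System.Web;
using System.Web.Caching;

public static class CacheTimer
{
    private const string Key = "PeriodicWorkItem";   // arbitrary key name

    public static void Start()
    {
        HttpRuntime.Cache.Insert(
            Key,
            DateTime.UtcNow,                         // dummy value
            null,
            DateTime.UtcNow.AddMinutes(10),          // the "tick" interval
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        try
        {
            DoWork();                                // the periodic work
        }
        finally
        {
            Start();                                 // re-insert so the cycle continues
        }
    }

    private static void DoWork() { /* check the queue, send mail, etc. */ }
}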
I assume that this is done on a regular basis, and that some other process puts the items on the queue?
If that is the case, you might put something in Global.asax that, on application start, creates a separate thread that simply monitors the queue; you could use a timer to have that thread sleep for X seconds, then check for results.
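One way to express that in Global.asax, using a System.Threading.Timer instead of a dedicated sleeping thread (the polling method is a placeholder):

using System;
using System.Threading;

public class Global : System.Web.HttpApplication
{
    // Keep a static reference so the timer is not garbage collected.
    private static Timer _resultPoller;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Check the result queue every 30 seconds on a thread-pool thread.
        _resultPoller = new Timer(_ => CheckResultQueue(), null,
                                  TimeSpan.Zero, TimeSpan.FromSeconds(30));
    }

    private static void CheckResultQueue()
    {
        // Placeholder: read completion messages from the second queue
        // and record which tasks have finished.
    }
}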
Is it possible to use a BackgroundWorker thread in ASP.NET 2.0 for the following scenario, so that the user at the browser's end does not have to wait a long time?
Scenario
The browser requests a page, say SendEmails.aspx
SendEmails.aspx page creates a BackgroundWorker thread, and supplies the thread with enough context to create and send emails.
The browser receives the response from the ComposeAndSendEmails.aspx, saying that emails are being sent.
Meanwhile, the background thread is engaged in a process of creating and sending emails which could take some considerable time to complete.
My main concern is about keeping the BackgroundWorker thread running, trying to send, say, 50 emails, while the ASP.NET worker-process thread-pool thread is long gone.
If you don't want to use the AJAX libraries, or the e-mail processing is REALLY long and would time out a standard AJAX request, you can use an asynchronous postback approach that was the "old hack" in the .NET 1.1 days.
Essentially what you do is have your submit button begin the e-mail processing in an asynchronous state, while the user is taken to an intermediate page. The benefit to this is that you can have your intermediate page refresh as much as needed, without worrying about hitting the standard timeouts.
When your background process is complete, it will put a little "done" flag in the database/application variable/whatever. When your intermediate page does a refresh of itself, it detects this flag and automatically redirects the user to the "done" page.
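A sketch of the intermediate page's check, assuming the background work sets a flag in application state (the flag name and page names are made up):

protected void Page_Load(object sender, EventArgs e)
{
    // "EmailJobDone" is a placeholder flag set by the background process.
    if (Application["EmailJobDone"] != null)
    {
        Response.Redirect("Done.aspx");
    }
    else
    {
        // Ask the browser to reload this page in 5 seconds.
        Response.AppendHeader("Refresh", "5");
    }
}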
Again, AJAX makes all of this moot, but if for some reason you have a very intensive or time-consuming process that has to be done over the web, this solution will work for you. I found a nice tutorial on it here, and there are plenty more out there.
I had to use a process like this when we were working on a "web check-in" type application that was interfacing with a third party application and their import API was hideously slow.
EDIT: GAH! Curse you Guzlar and your god-like typing abilities 8^D.
You shouldn't do any threading from ASP.NET pages. Any thread that is long running is in danger of being killed when the worker process recycles. You can't predict when this will happen. Any long-running processes need to be handled by a windows service. You can kick off these processes by dropping a message in MSMQ, for example.
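For example, dropping a message on a private MSMQ queue that the Windows service listens to (the queue path and payload are illustrative):

using System.Messaging;

public static class EmailJobQueue
{
    // Illustrative local private queue; the Windows service reads from the same path.
    private const string Path = @".\Private$\EmailJobs";

    public static void Enqueue(string recipientAddress)
    {
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);

        using (var queue = new MessageQueue(Path))
        {
            queue.Send(recipientAddress, "SendEmails");
        }
    }
}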
ThreadPool.QueueUserWorkItem(delegateThatSendsEmails)
or use the SendAsync method on System.Net.Mail.SmtpClient.
You want to put the email-sending code on another thread, because then the page returns to the user immediately and the work just processes, no matter how long it takes.
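A sketch of the SendAsync variant (the addresses are placeholders; the SmtpClient picks up its settings from the mailSettings section of web.config):

using System.Net.Mail;

void SendWithoutBlocking()
{
    var message = new MailMessage("noreply@example.com", "user@example.com",
                                  "Subject", "Body");
    var client = new SmtpClient();            // configured via web.config mailSettings

    client.SendCompleted += (sender, e) =>
    {
        // e.Error holds any exception; log it here rather than letting it vanish.
        message.Dispose();
    };

    client.SendAsync(message, null);           // returns immediately
}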
It is possible. Once you start a new thread asynchronously from the page, the page request will proceed and send the page back to the user. The async thread will continue to run on the server but will no longer have access to the session.
If you have to show task progress, consider some Ajax techniques.
What you need to use for this scenario is Asynchronous Pages, a feature that was added in ASP.NET 2.0
Asynchronous pages offer a neat solution to the problems caused by I/O-bound requests. Page processing begins on a thread-pool thread, but that thread is returned to the thread pool once an asynchronous I/O operation begins in response to a signal from ASP.NET. When the operation completes, ASP.NET grabs another thread from the thread pool and finishes processing the request. Scalability increases because thread-pool threads are used more efficiently. Threads that would otherwise be stuck waiting for I/O to complete can now be used to service other requests. The direct beneficiaries are requests that don't perform lengthy I/O operations and can therefore get in and out of the pipeline quickly. Long waits to get into the pipeline have a disproportionately negative impact on the performance of such requests.
http://msdn.microsoft.com/en-us/magazine/cc163725.aspx
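For illustration, a rough sketch of an asynchronous page that hands an I/O-bound operation off to a PageAsyncTask; the connection string, SQL, and class name are placeholders, and the @ Page directive must include Async="true":

using System;
using System.Data.SqlClient;
using System.Web.UI;

public partial class SendEmails : Page
{
    private SqlCommand _command;

    protected void Page_Load(object sender, EventArgs e)
    {
        RegisterAsyncTask(new PageAsyncTask(BeginWork, EndWork, TimeoutWork, null));
    }

    private IAsyncResult BeginWork(object sender, EventArgs e, AsyncCallback cb, object state)
    {
        // "Asynchronous Processing=true" is required for Begin/EndExecuteNonQuery.
        var connection = new SqlConnection("...;Asynchronous Processing=true");
        connection.Open();
        _command = new SqlCommand("EXEC dbo.HeavyWork", connection);   // placeholder SQL
        return _command.BeginExecuteNonQuery(cb, state);
    }

    private void EndWork(IAsyncResult ar)
    {
        _command.EndExecuteNonQuery(ar);
        _command.Connection.Dispose();
    }

    private void TimeoutWork(IAsyncResult ar)
    {
        _command.Connection.Dispose();
    }
}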
If you want to use multithreading in your ASP.NET page, you can use a simple threading model like this:
protected void Page_Load(object sender, EventArgs e)
{
    // Start the long-running work on a separate thread and return immediately.
    System.Threading.Thread _thread =
        new System.Threading.Thread(new System.Threading.ThreadStart(Activity_DoWork));
    _thread.Start();
}

void Activity_DoWork()
{
    // Do some things...
}
This approach works correctly with ASP.NET pages: the page finishes rendering without waiting for the thread to complete, whereas a page using a BackgroundWorker will not finish until the BackgroundWorker has finished.
5 years later, but the problems are the same… If you want to perform fire-and-forget operations from your application and forget about all the difficulties related to background job processing in ASP.NET applications, you can use http://hangfire.io.
It does not lose your jobs when the process is recycled, because it uses persistent storage to keep information about background jobs.
It automatically retries background jobs that were aborted or failed due to a transient exception (such as SMTP server connectivity errors).
It allows you to easily debug background jobs through the integrated web interface.
It is very easy to install/configure/use HangFire.
There is also a tutorial, Sending Mail in Background with ASP.NET MVC, on using HangFire with Postal.
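For example, a fire-and-forget job with HangFire is a one-liner (the class and method here are placeholders for your own code):

using Hangfire;

public class AccountController
{
    public void AfterLogin(int userId)
    {
        // The job is persisted and executed by HangFire outside the request.
        BackgroundJob.Enqueue(() => EmailSender.SendWelcomeEmail(userId));
    }
}

public static class EmailSender
{
    public static void SendWelcomeEmail(int userId) { /* placeholder */ }
}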