Passing HTTP authenticated principal onto another worker thread - asp.net

We have a web front end on our business layer server.
Certain pages in our web application instantiate very long running tasks (could be up to 10+ minutes). These requests are handled as follows:
(on the HTTP request thread)
- we make a connection to the business server;
- we create a new thread to make the long running call, passing in the connection object.
The HTTP request then completes, passing a handle back to the browser, and the browser periodically polls the web server to get updates on the long running task's progress.
All requests to the business server are authenticated - the connection's user principal must have permission to call the method on the business server.
This mechanism works fine as long as our web application is running in Classic mode.
When we run in Integrated Pipeline mode, we get ObjectDisposedExceptions when the browser polls:
System.ObjectDisposedException: Safe handle has been closed
   at System.StubHelpers.StubHelpers.SafeHandleC2NHelper(Object pThis, IntPtr CleanupWorkList)
   at Microsoft.Win32.Win32Native.GetTokenInformation(SafeTokenHandle TokenHandle, UInt32 TokenInformationClass, SafeLocalAllocHandle TokenInformation, UInt32 TokenInformationLength, ref UInt32 ReturnLength)
   at System.Security.Principal.WindowsIdentity.GetTokenInformation(SafeTokenHandle tokenHandle, TokenInformationClass tokenInformationClass, ref UInt32 dwLength)
   at System.Security.Principal.WindowsIdentity.get_User()
   at System.Security.Principal.WindowsIdentity.GetName()
   at System.Security.Principal.WindowsIdentity.get_Name()
The problem appears to be that the Windows principal used to make the connection is disposed when the original request ends (which is understandable - in fact I am surprised that the code worked at all!).
As a way around this problem, I was wondering: is it possible to either create a duplicate of the HTTP request principal and use that to create the connection (disposing of it when the long running task completes), or to impersonate the HTTP request principal on the worker thread even after the original principal is disposed?
Update
(My comment under Aliostad's question was incorrect: the test page did fail. I managed to confuse myself sufficiently that I wrote my test page so that it did not exercise the same code path as the real (faulting) code. Nevermind!)
I have written a "workaround" for this problem: -
I am in the fortunate position of knowing what roles/groups the business server logic will be querying for before the call to the business server is made. So my workaround is to create a new generic principal based upon the request's principal's membership of these roles. The long running task is run using the generic principal.
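For illustration, here is a minimal sketch of that workaround (the role names, the task method and the way the principal reaches the worker thread are all hypothetical; assumes the usual System.Linq, System.Security.Principal, System.Threading and System.Web usings):

// Roles we know the business server logic will query for (hypothetical names).
string[] requiredRoles = { "TaskRunners", "ReportViewers" };
IPrincipal requestPrincipal = HttpContext.Current.User;

// Capture role membership while the request principal is still usable.
string[] heldRoles = requiredRoles.Where(requestPrincipal.IsInRole).ToArray();

// Build a principal whose lifetime is independent of the request's token.
var detachedPrincipal = new GenericPrincipal(
    new GenericIdentity(requestPrincipal.Identity.Name), heldRoles);

// Run the long running task under the detached principal.
var worker = new Thread(() =>
{
    Thread.CurrentPrincipal = detachedPrincipal;
    RunLongRunningTask(); // hypothetical
});
worker.Start();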
I am not 100% happy with this workaround because it is very much a "hack" - i.e. I can see that it would easily fall down if some logic did the (eminently sensible) check of verifying that the principal's identity is authenticated.
So I would still very much appreciate any help / insight into this issue.
Thanks

OK, here is my take on this.
First of all, if you create a thread, all of the current thread's security context will be copied to the new thread - by default. This operation is heavy but much needed (as you can imagine, most things would not work without it). In case you need to prevent it because you do not need the context copied, there is a way to do it, and it is explained in Richter's CLR via C#. Luckily, he has shared this very bit of the book here; basically you call a static method to prevent the context from being flowed:
ExecutionContext.SuppressFlow();
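For illustration, a minimal sketch of suppressing the flow around a thread start:

using (ExecutionContext.SuppressFlow())
{
    // This thread will not inherit the caller's ExecutionContext
    // (including its security context).
    new Thread(DoWork).Start();
}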
I cannot see this being called in WCF, although using Reflector I found a single use of it here:
[SecuritySafeCritical]
private IAsyncResult BeginGetContext(bool startListening)
{
    Exception exception;
    do
    {
        exception = null;
        try
        {
            try
            {
                if (ExecutionContext.IsFlowSuppressed())
                {
                    return this.listener.BeginGetContext(this.onGetContext, null);
                }
                using (ExecutionContext.SuppressFlow())
                {
                    return this.listener.BeginGetContext(this.onGetContext, null);
                }
            }
            // .... the rest
Interestingly enough, this is used in 3 places, one of them in SharedHttpTransportManager.
Now all this might look like we have found the issue and it is a bug, but I very much doubt it.
My hunch is that a process recycle is happening in between and the context is lost. The way to prove or disprove this would be to use perfmon to record all process recycles and find out whether one happened in between.
My solution is basically - and you might not like it! - to simply insert an item into a queue (MSMQ or a simple database queue) and have a Windows service read it. With an operation this important, I would never trust IIS to carry it through to the finish.
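For illustration, a minimal sketch of that hand-off, assuming an existing MSMQ queue named .\Private$\LongRunningTasks, a hypothetical TaskRequest DTO, and a reference to System.Messaging:

// In the web application: enqueue and return immediately.
void EnqueueTask(TaskRequest request)
{
    using (var queue = new MessageQueue(@".\Private$\LongRunningTasks"))
    {
        queue.Send(request, "long running task");
    }
}

// In the Windows service: block until work arrives, then run it.
void Poll()
{
    using (var queue = new MessageQueue(@".\Private$\LongRunningTasks"))
    {
        queue.Formatter = new XmlMessageFormatter(new[] { typeof(TaskRequest) });
        Message message = queue.Receive(); // blocks until a message arrives
        var request = (TaskRequest)message.Body;
        // ... carry out the task under the service account
    }
}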
Hope this is useful to you.

Related

Asp.net web api + entity framework: multiple requests cause data conflict

I'm developing an app with VS2013, using EF 6.02 and Web API 2. I'm using the ASP.NET SPA template and creating a RESTful API against an Entity Framework data source backed by SQL Server. (In development, this resides on the SQL Server local instance.)
I've got two API methods so far (one that just reads data, one that writes data), and I'm testing them by calling them from JavaScript. When I only call a single method in my script, either one works perfectly. But if I call both from script (without waiting for either's callback to fire), I get bad results and different exceptions in the debugger. Some exceptions state that the save can't be completed because there are pending transactions. Another stated something about a conflict with other threads. And sometimes the read operation fails with a null reference exception when trying to read a result set.
"New transaction is not allowed because there are other threads running in the session."
This makes me question if I'm correctly getting a new DBContext per request. My code for this looks like:
static Startup()
{
    context = new Data.SqlServer.AppDbContext();
    ...
}
and then whenever instantiating a unit of work, I access Startup.context.
I've tried to implement the unit of work pattern, and each request shares a single UOW object which has a single DBContext object.
My question: Do I have additional responsibility to ensure that web requests "play nicely" with each other? I hope that this is a problem that others have already dealt with. Perhaps the errors that I'm seeing are legitimate in the sense that if one user's data is being touched, it is temporarily in an invalid state and if other requests come in at that exact moment, they indeed will fail (and I should code anticipating these failures). I guess that even if each request has its own DBContext, they still share the same underlying SQL data source so perhaps that's causing issues.
I can try to put together a testcase, but I get differing behavior depending on where I put breakpoints and how long I spend on them, reaffirming to me that this is timing related.
Thanks for any help or suggestions...
-Ben
Your problem is where you are setting your context. The Startup method runs when the entire application starts, so every request will use the same context. This is a per-application setup, not a per-request one. As to why you are getting the errors: Entity Framework is NOT thread-safe, and since IIS spawns many threads to handle concurrent requests, your single context is being used across multiple threads.
As for a solution, you can look into:
- Dependency Injection frameworks (such as Ninject or Unity);
- placing a using statement in your UnitOfWork classes:
  using (var context = new Data.SqlServer.AppDbContext()) { /* do stuff */ }
- or a class that gets the context for the current request and stores it in HttpContext.Items (using a unique key so you can retrieve it easily in another class), so that the same context is reused within a request. Something like this:
public AppDbContext GetDbContext()
{
    var httpContext = HttpContext.Current;
    if (httpContext == null) return new AppDbContext();

    const string contextTypeKey = "AppDbContext";
    if (httpContext.Items[contextTypeKey] == null)
    {
        httpContext.Items.Add(contextTypeKey, new AppDbContext());
    }
    return httpContext.Items[contextTypeKey] as AppDbContext;
}
To use the above method, simply call var context = GetDbContext();
Note
We use all of the above methods, but this note applies specifically to the third one. It seems to work well, with two caveats. First, do not create the context in a using statement, because disposing it there makes it unavailable to other classes for the rest of the request. Second, make sure Application_EndRequest actually disposes of it. We saw these little buggers hanging around in memory after the request ended, causing a huge spike in memory usage.
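For illustration, the cleanup could look something like this in Global.asax (a minimal sketch, assuming the same "AppDbContext" key used by GetDbContext()):

protected void Application_EndRequest(object sender, EventArgs e)
{
    const string contextTypeKey = "AppDbContext";
    var context = HttpContext.Current.Items[contextTypeKey] as AppDbContext;
    if (context != null)
    {
        context.Dispose(); // release the connection and tracked entities
        HttpContext.Current.Items.Remove(contextTypeKey);
    }
}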

handling XMLHttpRequest abort on asp.net

I use an asynchronous XMLHttpRequest to call a function in an ASP.NET web service.
When I call the abort method on the XMLHttpRequest after the server has received the request and is processing it, the server continues processing the request.
Is there a way to stop the request processing on the server?
Generally speaking, no, you can't stop the request being processed by the server once it has started. After all, how would the server know when a request has been aborted?
It's like if you navigated to a web page but browsed to another one before the first one had loaded. That initial request will, at least to some extent (any client-side work will of course not take place), be fulfilled.
If you do wish to stop a long-running operation on the server, the service that is being invoked will need to be architected such that it can support being interrupted. Some pseudocode:
void MyLongRunningMethod(string opId, Args args)
{
    var work = GetWork(args);
    foreach (var workItem in work)
    {
        DoWork(workItem);

        // Has this invocation been aborted?
        if (LookUpSet.Contains(opId))
        {
            LookUpSet.Remove(opId);
            return;
        }

        // Or: stop if the client has disconnected (note the negation).
        if (!HttpContext.Current.Response.IsClientConnected)
        {
            HttpContext.Current.Response.End();
            return;
        }
    }
}

void AbortOperation(string opId)
{
    LookUpSet.Add(opId);
}
So the idea here is that MyLongRunningMethod periodically checks to see if it has been aborted, returning if so. It is intended that opId is unique, so you could generate it based on the session Id of the client appended with the current time or something (in Javascript, new Date().getTime() will get you the number of milliseconds since the epoch).
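For example, on the server side such an id could be built like this (a Guid would work just as well):

// Unique per client session and per invocation.
string opId = Session.SessionID + "_" + DateTime.UtcNow.Ticks;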
With this sort of approach, the server must maintain state (the LookUpSet in my example), so you will need some way of doing that, such as a database or just storing it in memory. The service will also need to be architected such that calling abort does not leave things in a non-working state, which of course depends very heavily on what it does.
The other really important requirement is that the data can be split up and worked on in chunks. This is what allows the service to be interruptable.
Finally, if some operation is to be aborted, then AbortOperation must be called - simply aborting the XMLHttpRequest invocation won't help, as the operation will continue to completion.
Edit
From this question: ASP.Net: How to stop page execution when browser disconnects?
You could also check the Response.IsClientConnected property to try and determine whether the invocation had been aborted.
Generally speaking, the server isn't going to know that a client has disconnected until it attempts to send data to it. See Best practice to detect a client disconnection in .NET? and Instantly detect client disconnection from server socket.
As nick_w wrote, you can't stop the request being processed by the server once it has started. But it is possible to implement a solution which will give you the ability to cancel the server task. Dino Esposito has several great articles about how such things can be implemented:
Canceling Server Tasks with ASP.NET AJAX
And in the following articles, Dino Esposito describes how to use the SignalR library to implement polling to the server:
Build a Progress Bar with SignalR;
Long Polling and SignalR
So if you really need to cancel some task on the server, these articles can be used as a starting point for implementing the required solution.

Correctly implement background process Thread in ASP.NET

I need to execute an infinite while loop and want to initiate the execution in global.asax.
My question is: how exactly should I do it? Should I start a new Thread, use Async and Task, or something else? Inside the while loop I need to await TaskEx.Delay(5000);
How do I do this so it will not block any other processes and will not create memory leaks?
I use VS10, Async CTP 3, MVC4
EDIT:
public async void SignalRConnectionRecovery()
{
    while (true)
    {
        Clients.SetConnectionTimeStamp(DateTime.UtcNow.ToString());
        await TaskEx.Delay(5000);
    }
}
All I need to do is to run this as a singleton instance globally as long as application is available.
EDIT: SOLVED
This is the final solution in Global.asax
protected void Application_Start()
{
    Thread signalRConnectionRecovery = new Thread(SignalRConnectionRecovery);
    signalRConnectionRecovery.IsBackground = true;
    signalRConnectionRecovery.Start();
    Application["SignalRConnectionRecovery"] = signalRConnectionRecovery;
}

protected void Application_End()
{
    try
    {
        Thread signalRConnectionRecovery = (Thread)Application["SignalRConnectionRecovery"];
        if (signalRConnectionRecovery != null && signalRConnectionRecovery.IsAlive)
        {
            signalRConnectionRecovery.Abort();
        }
    }
    catch
    {
        // swallow any exception (including ThreadAbortException) during shutdown
    }
}
I found this nice article about how to use an async worker: http://www.dotnetfunda.com/articles/article613-background-processes-in-asp-net-web-applications.aspx
And this:
http://code.msdn.microsoft.com/CSASPNETBackgroundWorker-dda8d7b6
But I think for my needs this one will be perfect:
http://forums.asp.net/t/1433665.aspx/1
ASP.NET is not designed to handle this kind of requirement. If you need something to run constantly, you would be better off creating a windows service.
Update
ASP.NET is not designed for long running tasks. It's designed to respond quickly to HTTP requests. See Cyborgx37's answer or Can I use threads to carry out long-running jobs on IIS? for a few reasons why.
Update
Now that you finally mentioned you are working with SignalR, I see that you are trying to host SignalR within ASP.NET, correct? I think you're going about this the wrong way, see the example NuGet package referenced on the project wiki. This example uses an IAsyncHttpHandler to manage tasks.
You can start a thread in your Global.asax, however it will only run until your ASP.NET process gets recycled. This will happen at least once a day, or when no one uses your site. If the process gets recycled, the only way the thread is restarted again is when you get a hit on your site. So the thread is not running continuously.
To get a continuous process, it is better to start a Windows service.
If you do go for the 'in process' solution, it really depends on what you are doing. The thread itself will not cause you any problems with memory or deadlocks. You should add a mechanism to stop your thread when the application stops; otherwise restarting will take a long time, because the runtime will wait for your thread to stop.
This is an old post, but as I was searching for this, I would like to report that .NET 4.5.2 has a native way to do it with QueueBackgroundWorkItem.
Take a look at this post: https://blogs.msdn.microsoft.com/webdev/2014/06/04/queuebackgroundworkitem-to-reliably-schedule-and-run-background-processes-in-asp-net/
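For illustration, a minimal sketch of the API (in System.Web.Hosting, with System.Threading.Tasks; the work item receives a CancellationToken that is signalled at app domain shutdown, and the runtime holds the shutdown for a grace period while queued items finish; DoPeriodicWork is hypothetical):

HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
{
    try
    {
        while (!cancellationToken.IsCancellationRequested)
        {
            DoPeriodicWork(); // hypothetical
            await Task.Delay(5000, cancellationToken);
        }
    }
    catch (OperationCanceledException)
    {
        // app domain is shutting down - exit quietly
    }
});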
MarianoC
It depends what you are trying to accomplish in your while loop, but in general this is the kind of situation where a Windows Service is the best answer. Installing a Windows Service is going to require that you have admin privileges on the web server.
With an infinite loop you end up with a lot of issues regarding the Windows message pump. This is the thing that keeps a Windows application alive even when the application isn't "doing" anything. Without it, a program simply ends.
The problem with an infinite loop is that the application is stuck "doing" something, which prevents other applications (or threads) from "doing" their thing. There have been a few workarounds, such as DoEvents in Windows Forms, but they all have serious drawbacks when it comes to responsiveness and resource management. (Acceptable on a small LOB application, maybe not on a web server.) Even if the while-loop is on a separate thread, it will constantly consume processing power.
Asynchronous programming is really designed for long-running processes, such as waiting for a database to return a result or waiting for a printer to come online. In these cases, it's the external process that is taking a long time, not a while-loop.
If a Windows Service is not possible, then I think your best bet is going to be setting up a separate thread with its own message pump, but it's a bit complicated. I've never done it on a web server, but you might be able to start an Application. This will provide a message pump for you and allow you to respond to Windows events, etc. The only problem is that this is going to start a Windows application (either WPF or WinForms), which may not be desirable on a web server.
What are you trying to accomplish? Is there another way you might go about it?

Special considerations for using threads in IIS

I'd like to start using asynchronous processing in IIS. Edit: I'm talking about using the task parallel library.
For example, on certain page loads I want to log a bunch of crap, send an email, update some tables, etc. But I don't want to make the user wait for me to log all that crap.
So normally what I do is I have a static Queue that I push the log info onto, and then I have a cron job that calls a special page every 10 minutes whose OnLoad flushes out the queue. This works, but it's kind of clunky to set up, especially when you want to log 50 things. I'd rather do this:
Task.Factory.StartNew(() => Log(theStuff));
However I'm terrified of running tasks in IIS because one slip up and your entire website goes down.
So now I have
SafeTask.FireAndForget(() => Log(theStuff));
This wraps the delegate in a try/catch and passes it into Task.Factory.StartNew. So if someone changes something that affects something else that generates an exception somewhere else that accidentally gets thrown on the task thread, we get a notification instead of a crashed website. Also, the error notification inside the catch is itself wrapped in its own try/catch, and that catch in turn has a try/catch that tries to log in a different way.
Now that I can safely run stuff asynchronously in IIS, what other things do I need to worry about before I can start using my SafeTask class?
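For reference, a wrapper in the spirit of the SafeTask class described above might look like this (a sketch of an assumed shape, not the poster's actual code; NotifyError is hypothetical, and System.Threading.Tasks is assumed):

public static class SafeTask
{
    public static void FireAndForget(Action work)
    {
        Task.Factory.StartNew(() =>
        {
            try
            {
                work();
            }
            catch (Exception ex)
            {
                // Never let an exception escape the task thread.
                try { NotifyError(ex); } // hypothetical notifier
                catch { /* a failed notification must not crash the site */ }
            }
        });
    }
}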
Every request in IIS and .NET is processed on one thread by default. This thread comes from a thread pool called the "application pool". Existing threads are reused, so you can't really use them for thread state unless you clear or set it every time. You define the size of this thread pool using a formula from MSDN, in machine.config or even your web.config.
Now, every async function call is put on a different thread. This includes async web service calls, async page functions, async delegates, etc. This thread comes from the same application pool, thus reducing the number of threads available for IIS to service new requests.
Most likely, your application will work just fine while using async function calls. If you are worried, or you have a lot of async tasks, you may want to create your own thread pool, or look at SmartThreadPool on CodePlex.
Hope this helps.
Consider using the page's OnUnload event. Read about it here: http://msdn.microsoft.com/en-us/library/ms178472.aspx
This event fires after the content is sent to the user (so the user isn't blocked while you do work), and should completely satisfy your requirement without introducing additional threads.
Specific to your question, you should be concerned about thread pool exhaustion only if your load and performance testing suggests you're running up against thread limits. If you're not then what you propose is certainly reasonable.

ASP.NET concurrency

I have an ASP.NET application that starts a long running operation during the Event Handler phase of the ASP.NET page life cycle. When the end user pushes a button, a bunch of queries are made to a database, a bunch of maps are generated, and then a movie is made from JPEG images of the maps. This process can take over a minute to complete.
Here's a link to the application
http://maxim.ucsd.edu/mapmaker/cbeo.aspx
I've tried using a thread from the thread pool, creating and launching my own thread, and using the AsyncCallback framework. The problem is that the new thread runs under a different userid. I assume the main thread runs under ASPNET; the new thread runs under AD\MAXIM$, where MAXIM is the hostname. I know this because there is an error when it tries to connect to the database.
Why is the new thread under a different userid?
If I can figure out the userid issue, what I'd like to do is check if the movie making process has finished by examining a Session variable in a Page_Load method, then add a link to the page to access the movie.
Does anyone have any good examples of using concurrency in a ASP.NET application that uses or creates threads in an EventHandler callback?
Thanks,
Matt
Did you read this?: http://msdn.microsoft.com/en-us/magazine/cc163725.aspx
Quoting one relevant portion from that link (you should read the whole thing):
A final point to keep in mind as you build asynchronous pages is that you should not launch asynchronous operations that borrow from the same thread pool that ASP.NET uses.
Not addressing the specific problem you asked about, but this is likely to come up soon:
At what point is this video used?
If it's displayed in the page or downloaded by the user, what does the generated html that the browser uses to get the video look like? The browser has to call that video somewhere using a separate http request, and you might do better by creating a separate http handler (*.ashx file) to handle that request, and just writing the url for that handler in your page.
If it's for storage or view elsewhere you should consider just saving the information needed to create the video at this point and deferring the actual work until the video is finally requested.
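For illustration, a minimal sketch of such a handler, assuming the finished movie is saved to a path keyed by an id (all names are hypothetical, and a real handler should validate the id before using it in a path):

public class MovieHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string id = context.Request.QueryString["id"];
        string path = context.Server.MapPath("~/App_Data/movies/" + id + ".avi");
        context.Response.ContentType = "video/x-msvideo";
        context.Response.TransmitFile(path); // streams the file without buffering it all in memory
    }
}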
The problem is that the new thread is run under a different userid. I assume the main thread is run under ASPNET, the new thread is run under AD\MAXIM$ where MAXIM is the hostname.
ASPNET is a local account; when the request travels over the network, it will use the computer's credentials (AD\MAXIM$).
What may be happening is that you're running under impersonation for the request - and without it on the ThreadPool. If that's the case, you might be able to store the current WindowsIdentity for the request and then impersonate that identity on the ThreadPool.
Or just let the ThreadPool hit the DB with SQL authentication (username and password).
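For illustration, a minimal sketch of the impersonation idea (note the caveat from the first question on this page: the request's token can be disposed when the request ends, so the work must finish before then or use a duplicated token; MakeMovie is hypothetical):

// Capture the identity while the request is still running.
var requestIdentity = (WindowsIdentity)HttpContext.Current.User.Identity;

ThreadPool.QueueUserWorkItem(_ =>
{
    using (WindowsImpersonationContext ctx = requestIdentity.Impersonate())
    {
        MakeMovie(); // long running DB/map/movie work under the request's identity
    } // impersonation is reverted when the using block exits
});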
