ASP.NET Threads in Visual Studio

I have a basic ASP.NET application. To test a problem I want two instances of IE calling an ASP.NET page; I want the first to block (Thread.Sleep) and the second to proceed, so I have code like this:
private static bool firstOne = true;
...
if (firstOne)
{
    firstOne = false;
    System.Threading.Thread.Sleep(10000);
}
What I'm actually seeing is that both pages wait for the sleep to finish; it's as if a single thread is servicing both.
NB I don't have the debugger attached nor any breakpoints
This is on my work pc, default machine.config and web.config. Nothing unusual has been configured.
This isn't something I normally have to do, so I could easily be missing something obvious here.
Any ideas?

The reason is the ASP.NET session state module, which locks the session for the duration of each request: the session is locked when the page request starts and unlocked when the page finishes.
Also, the thread you put to sleep is the request thread itself, not a new one. If you start a new thread and let the page return, the new thread will live and die on its own without affecting your session, because the session lock only spans the request.
The session is locked for the whole of each request from a given user in order to synchronize those requests. To avoid this, simply turn session state off for that page, or use handlers, which do not use the session by default.
You can also write a completely custom session state provider that handles this case.
If you are wondering whether this behaviour, the session locking out your user, is a good thing: yes, it is good for a site that is starting out, because it really helps you synchronize the user's actions. If the session did not lock per user, you would have to do that yourself with mutexes around each user action, e.g. whenever you insert something into a database.
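As a rough illustration of "turn it off for that page, or use handlers" (the directive and the StatusHandler class below are examples, not code from the question), session state can be disabled per page, or a handler can simply not opt into the session:

<%@ Page Language="C#" EnableSessionState="False" %>

// EnableSessionState="ReadOnly" is the other option: the page can still read
// the session but does not take the exclusive lock.

// A plain IHttpHandler gets no session at all unless it opts in by implementing
// IRequiresSessionState (or IReadOnlySessionState for a non-exclusive lock).
public class StatusHandler : System.Web.IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(System.Web.HttpContext context)
    {
        // No session lock is taken, so this request is never queued
        // behind other requests from the same user.
        context.Response.Write("ok");
    }
}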
One similar question:
Web app blocked while processing another web app on sharing same session

Related

asp.net page wait others server side / asynchrone page

I created an ASP.NET page to show an AJAX-driven "please wait" progress indicator. I have one page creating something that takes 30 seconds, and on every step it updates a session value.
I have another page, called via AJAX, that returns the session value so I can show the percentage of the creation that is complete. But, I don't know why, my AJAX page waits for the first page to finish its work, so I only get 100% at the very end.
Maybe it's because I use the VS development server and not IIS. If that is the problem, can I change the development server's settings to get asynchronous execution?
Or is it something else?
WebForms are not ideal for asynchronous operations.
Add SignalR to your project and use a Hub to push status data back to your page to update the current state of the process you are running asynchronously.
An example of a technique to perform this type of asynchronous notification is covered in my blog post titled "A Guide to using ASP.Net SignalR with RadNotifications"
Don't use ASP.Net session state to do that. It has an implicit reader/writer lock around it, meaning your other call is probably blocking until your process finishes. You can try storing your status in a database or the cache, but it would probably be better to redesign the interaction.
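For illustration only (the helper class and its names are not from the answer), progress could be stashed in HttpRuntime.Cache under a job id that both pages know, which avoids the session lock entirely:

using System.Web;

// Hypothetical sketch: progress reporting via the cache instead of Session.
// "jobId" is a correlation id the client passes to both pages.
public static class ProgressStore
{
    public static void Report(string jobId, int percentComplete)
    {
        // The cache is application-wide, so it is not serialized per user like Session.
        HttpRuntime.Cache.Insert("progress_" + jobId, percentComplete);
    }

    public static int Read(string jobId)
    {
        object value = HttpRuntime.Cache["progress_" + jobId];
        return value == null ? 0 : (int)value;
    }
}

The AJAX status page (with EnableSessionState="False") would then just call ProgressStore.Read and write the number out.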

Does ASP.NET Web Forms prevent a double click submission?

I'm working on an ASP.NET 4.0 Web Forms project, after a break of about 5 years, and I'm confused about the submit behaviour.
In the click event of a button, I sleep the thread so that I can click the submit button multiple times.
From what I can tell, ASP.NET is preventing the click event being called more than once. Is this true?
BTW: I've turned off javascript for this test.
"ASP.NET is preventing the click event being called more than once. Is this true?"
No. What is actually happening is that your events are being blocked by the session lock. (The session, in ASP.NET, is a module that keeps data for each user who visits the site.)
So when you submit, and the page is still processing that submit, and you make a second submit before the first one has replied, the second request actually waits for the first one to finish and release the session lock.
To test this and prove that the session is locking your request, turn session state off and run your test again.
Related:
Web app blocked while processing another web app on sharing same session
What perfmon counters are useful for identifying ASP.NET bottlenecks?
Replacing ASP.Net's session entirely
Trying to make Web Method Asynchronous
No. Because you are putting the main (request) thread to sleep, the session stays locked, which prevents any further requests from being processed; hence the effect you see. If you have an independent task that can be done without the web request and response, you can perform it on a separate thread, and that will not block your requests.
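As a rough sketch of that last suggestion (Submit_Click and DoIndependentWork are made-up placeholders for the page's code-behind), the independent work can be queued to the thread pool so the request, and therefore the session lock, is released immediately:

using System;
using System.Threading;

protected void Submit_Click(object sender, EventArgs e)
{
    // Queue the independent work and return; the request finishes and
    // releases the session lock without waiting for the work.
    ThreadPool.QueueUserWorkItem(_ => DoIndependentWork());
}

private static void DoIndependentWork()
{
    Thread.Sleep(10000); // stand-in for the real long-running task
}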

Special considerations for using threads in IIS

I'd like to start using asynchronous processing in IIS. Edit: I'm talking about using the Task Parallel Library.
For example, on certain page loads I want to log a bunch of crap, send an email, update some tables, etc. But I don't want to make the user wait for me to log all that crap.
So normally what I do is have a static Queue that I push the log info onto, and then I have a cron job that calls a special page every 10 minutes whose OnLoad flushes out the queue. This works, but it's kind of clunky to set up, especially when you want to log 50 things. I'd rather do this:
Task.Factory.StartNew(() => Log(theStuff));
However I'm terrified of running tasks in IIS because one slip up and your entire website goes down.
So now I have
SafeTask.FireAndForget(() => Log(theStuff));
This wraps the delegate in a try/catch and passes it into Task.Factory.StartNew. So if someone changes something that ends up throwing an exception on the task thread, we get a notification instead of a crashed website. The error notification inside the catch is itself wrapped in its own try/catch, and the catch for that also tries to log in a different way.
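Something along these lines is presumably the shape of SafeTask.FireAndForget; this is only a guess at it, and Notify stands in for whatever error notification is actually used:

using System;
using System.Threading.Tasks;

public static class SafeTask
{
    public static void FireAndForget(Action work)
    {
        Task.Factory.StartNew(() =>
        {
            try
            {
                work();
            }
            catch (Exception ex)
            {
                try { Notify(ex); }                // primary notification path
                catch { /* last resort: swallow so the task never faults */ }
            }
        });
    }

    private static void Notify(Exception ex) { /* email / event log / fallback logging */ }
}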
Now that I can safely run stuff asynchronously in IIS, what other things do I need to worry about before I can start using my SafeTask class?
Every request in IIS and .NET is processed on one thread by default. That thread comes from a thread pool called the "application pool". Existing threads are reused, so you can't really rely on them for thread state unless you clear or set it on every request. You define the size of this thread pool, using a formula from MSDN, in machine.config or even in your web.config.
Now, every async function call is put on a different thread. This includes async web service calls, async page functions, async delegates, etc. Those threads come from the same "application pool", thus reducing the number of threads available for IIS to service new requests.
Most likely, your application will work just fine while using async function calls. If you are worried, or you have a lot of async tasks, then you may want to create your own thread pool, or look at SmartThreadPool on CodePlex.
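For reference, a hedged example of what that pool-sizing configuration might look like (the values here are purely illustrative, not a recommendation; the processModel element normally lives in machine.config):

<system.web>
  <!-- autoConfig="true" (the default) lets ASP.NET size the pool itself;
       setting it to false lets you pin the limits explicitly. -->
  <processModel autoConfig="false"
                maxWorkerThreads="100"
                maxIoThreads="100"
                minWorkerThreads="50"
                minIoThreads="50" />
</system.web>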
Hope this helps.
Consider using the page's OnUnload event. Read about it here: http://msdn.microsoft.com/en-us/library/ms178472.aspx
This event fires after the content is sent to the user (so the user isn't blocked while you do work), and should completely satisfy your requirement without introducing additional threads.
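A minimal sketch of that suggestion (assuming AutoEventWireup, and reusing the question's own Log(theStuff) placeholder):

// In the page's code-behind; AutoEventWireup hooks this up automatically.
protected void Page_Unload(object sender, EventArgs e)
{
    // Per the answer above, this runs after the page has been rendered,
    // so the user is not kept waiting on the logging work.
    Log(theStuff);
}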
Specific to your question, you should be concerned about thread pool exhaustion only if your load and performance testing suggests you're running up against thread limits. If you're not then what you propose is certainly reasonable.

Passing HTTP authenticated principal onto another worker thread

We have a web front end on our business layer server.
Certain pages in our web application instantiate very long running tasks (could be up to 10+ minutes). The way these requests are handled is like so (on the HTTP request thread):
- we make a connection to the business server;
- we create a new thread to make the long running call, passing in the connection object;
- the HTTP request then completes, passing a handle back to the browser;
- the browser then periodically polls the web server to get updates on the long running task's progress.
All requests to the business server are authenticated - the connection's user principal must have permission to call the method on the business server.
This mechanism works fine as long as our web application is running in Classic mode.
When we run in Integrated Pipeline mode, we get ObjectDisposedExceptions when the browser polls:
System.ObjectDisposedException: Safe handle has been closed
at System.StubHelpers.StubHelpers.SafeHandleC2NHelper(Object pThis, IntPtr CleanupWorkList)
at Microsoft.Win32.Win32Native.GetTokenInformation(SafeTokenHandle TokenHandle, UInt32 TokenInformationClass, SafeLocalAllocHandle TokenInformation, UInt32 TokenInformationLength, ref UInt32 ReturnLength)
at System.Security.Principal.WindowsIdentity.GetTokenInformation(SafeTokenHandle tokenHandle, TokenInformationClass tokenInformationClass, ref UInt32 dwLength)
at System.Security.Principal.WindowsIdentity.get_User()
at System.Security.Principal.WindowsIdentity.GetName()
at System.Security.Principal.WindowsIdentity.get_Name()
The problem appears to be that the Windows principal used to make the connection is disposed when the original request ends (which is understandable; in fact I am surprised that the code worked at all!).
As a way around this problem, I was wondering whether it is possible either to create a duplicate of the HTTP request's principal and use that to create the connection (disposing of it when the long running task completes), or to impersonate the HTTP request's principal on the worker thread even after the original principal has been disposed?
Update
(My comment under Aliostad's question was incorrect: the test page did fail. I managed to confuse myself sufficiently that I wrote my test page so that it did not exercise the same code path as the real (faulting) code. Nevermind!)
I have written a "workaround" for this problem: -
I am in the fortunate position of knowing what roles/groups the business server logic will be querying for before the call to the business server is made. So my workaround is to create a new generic principal based upon the request's principal's membership of these roles. The long running task is run using the generic principal.
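A rough sketch of that workaround (the role names are placeholders, and this is exactly the hack being described rather than a recommended pattern):

using System.Linq;
using System.Security.Principal;
using System.Threading;
using System.Web;

void StartLongRunningTask()
{
    // Capture role membership while the request principal is still alive.
    string[] knownRoles = { "BusinessUsers", "ReportRunners" };   // placeholder names
    IPrincipal requestPrincipal = HttpContext.Current.User;
    string[] heldRoles = knownRoles.Where(requestPrincipal.IsInRole).ToArray();

    // Build a generic principal the worker thread can keep using after the request ends.
    var workerPrincipal = new GenericPrincipal(
        new GenericIdentity(requestPrincipal.Identity.Name), heldRoles);

    new Thread(() =>
    {
        Thread.CurrentPrincipal = workerPrincipal;   // role checks still succeed
        CallBusinessServer();                        // hypothetical long running call
    }).Start();
}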
I am not 100% happy with this workaround because it is very much a "hack" - i.e. I can see that it would easily fall down if some logic did the (eminently sensible) check of verifying that the principal's identity is authenticated.
So I would still very much appreciate any help / insight into this issue.
Thanks
OK, here is my catch on this.
First of all, if you create a thread, the current thread's security context is copied to the new thread by default. This operation is heavy but much needed (as you can imagine, most things would not work without it). In case you need to prevent it and you do not need the context copied, there is a way to do it, and it is explained in Richter's CLR via C#. Luckily, he has shared that very bit of the book here; it basically comes down to calling a static method to prevent the context from flowing:
ExecutionContext.SuppressFlow();
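For illustration (not part of the quoted book excerpt; DoBackgroundWork is a made-up method), suppressing the flow around thread creation might look like this:

using (ExecutionContext.SuppressFlow())
{
    // The new thread does NOT inherit the caller's security/execution context.
    new Thread(DoBackgroundWork).Start();
}
// Flow resumes here when the AsyncFlowControl returned by SuppressFlow is disposed.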
I cannot see this being called in WCF, although using Reflector I did find a single use of it, here:
[SecuritySafeCritical]
private IAsyncResult BeginGetContext(bool startListening)
{
    Exception exception;
    do
    {
        exception = null;
        try
        {
            try
            {
                if (ExecutionContext.IsFlowSuppressed())
                {
                    return this.listener.BeginGetContext(this.onGetContext, null);
                }
                using (ExecutionContext.SuppressFlow())
                {
                    return this.listener.BeginGetContext(this.onGetContext, null);
                }
            }
            // .... the rest
Interestingly enough, this is used in three places, one of them in SharedHttpTransportManager.
Now all this might look like we have found the issue and that it is a bug, but I very much doubt it.
My hunch is that a process recycle is happening in between and the context is lost. The way to prove or disprove this would be to use perfmon to record all process recycles and find out whether one happened in between.
My solution is basically - which you might not like! - to simply insert an item into a queue (MSMQ or a simple database queue) and have a Windows service read it. With an operation this important, I would never trust IIS to carry it through to the finish.
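As an illustrative sketch of that suggestion (the queue path, label and task id are invented here), the web request would only enqueue a message and the Windows service would do the actual work:

using System;
using System.Messaging;   // reference System.Messaging.dll

// In the web request: hand the work off and return immediately.
const string QueuePath = @".\private$\LongRunningTasks";   // hypothetical local queue
if (!MessageQueue.Exists(QueuePath))
    MessageQueue.Create(QueuePath);

string taskId = Guid.NewGuid().ToString();
using (var queue = new MessageQueue(QueuePath))
{
    queue.Send(taskId, "Long running task");   // body + label; the service dequeues and processes it
}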
Hope this is useful to you.

ASP.NET concurrency

I have an ASP.NET application that starts a long running operation during the event-handling phase of the ASP.NET page life cycle. This occurs when the end user pushes a button: a bunch of queries are made to a database, a bunch of maps are generated, and then a movie is made from JPEG images of the maps. This process can take over a minute to complete.
Here's a link to the application
http://maxim.ucsd.edu/mapmaker/cbeo.aspx
I've tried using a thread from the thread pool, creating and launching my own thread, and using the AsyncCallback framework. The problem is that the new thread runs under a different user id. I assume the main thread runs under ASPNET; the new thread runs under AD\MAXIM$, where MAXIM is the hostname. I know this because there is an error when it tries to connect to the database.
Why is the new thread under a different userid?
If I can figure out the userid issue, what I'd like to do is check if the movie making process has finished by examining a Session variable in a Page_Load method, then add a link to the page to access the movie.
Does anyone have any good examples of using concurrency in an ASP.NET application that uses or creates threads in an event handler callback?
Thanks,
Matt
Did you read this?: http://msdn.microsoft.com/en-us/magazine/cc163725.aspx
Quoting one relevant portion from that link (you should read the whole thing):
A final point to keep in mind as you build asynchronous pages is that you should not launch asynchronous operations that borrow from the same thread pool that ASP.NET uses.
Not addressing the specific problem you asked about, but this is likely to come up soon:
At what point is this video used?
If it's displayed in the page or downloaded by the user, what does the generated html that the browser uses to get the video look like? The browser has to call that video somewhere using a separate http request, and you might do better by creating a separate http handler (*.ashx file) to handle that request, and just writing the url for that handler in your page.
If it's for storage or view elsewhere you should consider just saving the information needed to create the video at this point and deferring the actual work until the video is finally requested.
The problem is that the new thread is run under a different userid. I assume the main thread is run under ASPNET, the new thread is run under AD\MAXIM$ where MAXIM is the hostname.
ASPNET is a local account; when the request travels over the network it will use the computer's credentials (AD\MAXIM$).
What may be happening is that you're running under impersonation during the request, but not on the ThreadPool thread. If that's the case, you might be able to store the current WindowsIdentity for the request and then impersonate that identity on the ThreadPool thread.
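A hedged sketch of that idea (it assumes Windows authentication so the request identity really is a WindowsIdentity, GenerateMovie is a placeholder, and whether the token is still valid by the time the work runs depends on your setup):

using System.Security.Principal;
using System.Threading;
using System.Web;

// Capture the impersonated identity while still on the request thread.
WindowsIdentity requestIdentity = (WindowsIdentity)HttpContext.Current.User.Identity;

ThreadPool.QueueUserWorkItem(_ =>
{
    // Impersonate the captured identity on the pool thread for the DB call.
    using (WindowsImpersonationContext ctx = requestIdentity.Impersonate())
    {
        GenerateMovie();   // hypothetical long running work
    }
});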
Or, just let the ThreadPool hit the DB with Sql Authentication (username and password).
