Code executed infinite times in threadpool because of exception - asp.net

I've heard it's bad to use the ThreadPool in ASP.NET, but I've used it here purely to educate myself. My goal was to determine whether the Application_Error event (handled in Global.asax) gets fired - my answer to that is: no, it does not get triggered.
But I observed something strange. The thread I wrote simply queues tasks onto the thread pool, and each task is meant to throw an error randomly. Yet the error keeps occurring far more often than the number of times I queued the task. A separate concern is that System.Diagnostics.Trace.WriteLine() isn't logging messages to my Output window in Visual Studio. Why this strange behaviour?
using System;
using System.Threading;

namespace ThreadPoolDemo.Web
{
    public partial class _default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
        }

        protected void Button1_Click(object sender, EventArgs e)
        {
            Thread t = new Thread(createthreads);
            t.IsBackground = true;
            t.Start();
        }

        void createthreads()
        {
            Thread.Sleep(10 * 1000);
            int i;
            System.Diagnostics.Trace.WriteLine("Queueing items");
            for (i = 0; i < 1; i++)
                ThreadPool.QueueUserWorkItem(new WaitCallback(ErrorTask), null);
            System.Diagnostics.Trace.WriteLine("End Queueing items");
        }

        void ErrorTask(object obj)
        {
            Random generator = new Random();
            int value = generator.Next(1);
            if (value == 0)
                throw new Exception("Sample exception thrown");
            else
                System.Diagnostics.Trace.WriteLine("Processed thread");
        }
    }
}

There are two problems with your ErrorTask. The first is that you're initializing a new Random instance every time the method is called. The default Random constructor seeds the random number generator with the value from Environment.TickCount, which will likely be the same for consecutive threads. So you'll get the same random sequence for multiple threads.
The bigger problem, though, is that generator.Next(1) will always return 0. Random.Next(int max) generates a random number, N, such that 0 <= N < max. So your ErrorTask will throw the exception for every thread.
I have no idea how or why it would be throwing that exception more than once per call to ErrorTask. That doesn't seem possible.
I would suggest the following modification:
private Random generator = new Random();

void ErrorTask(object obj)
{
    int value;
    lock (generator)
    {
        value = generator.Next(2);
    }
    if (value == 0)
        throw new Exception("Sample exception thrown");
    else
        System.Diagnostics.Trace.WriteLine("Processed thread");
}
generator now has class scope and is initialized only once. The lock is there to prevent multiple threads from trying to generate a number at the same time. Without the lock, the random number generator can get corrupted and it will start returning 0 on every call. And I changed the argument to generator.Next to 2, so you can get either 0 or 1.
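If you would rather avoid the lock entirely, another option (my suggestion, not part of the answer above) is to give each pool thread its own Random via ThreadLocal<T> (.NET 4+), seeded so that threads starting at nearly the same tick don't produce the same sequence:
// Sketch only: one Random per thread; the Guid-based seed avoids identical
// Environment.TickCount seeds when several workers start at almost the same time.
private static readonly ThreadLocal<Random> PerThreadRandom =
    new ThreadLocal<Random>(() => new Random(Guid.NewGuid().GetHashCode()));

void ErrorTask(object obj)
{
    int value = PerThreadRandom.Value.Next(2); // 0 or 1
    if (value == 0)
        throw new Exception("Sample exception thrown");
    else
        System.Diagnostics.Trace.WriteLine("Processed thread");
}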

Related

ASP.net cache access causing Collection Modified exception in foreach loop

Ok first things first. This is some exception information given by the support team. I know the line and code where it happens. It happens in a FirstOrDefault call over a dictionary obtained from cache.
1) Exception Information
*********************************************
Exception Type: System.InvalidOperationException
Message: Collection was modified; enumeration operation may not execute.
Data: System.Collections.ListDictionaryInternal
Now I wanted to simulate the problem and I could do it in a simple ASP.net application.
My page has 2 Buttons - Button_Process and Button_Add
The code behind is as follows:
public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            var data = Cache["key"];
            if (data == null)
            {
                var dict = new Dictionary<int, string>();
                for (int i = 0; i < 10; i++)
                {
                    dict.Add(i, i.ToString());
                }
                Cache["key"] = dict;
            }
        }
    }

    protected void ButtonProcess_Click(object sender, EventArgs e)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            foreach (var d in data.Values) // In the actual code there is a FirstOrDefault here
            {
                Thread.Sleep(1000);
                if (d.Contains("5"))
                {
                    // some operation
                }
            }
        }
    }

    protected void ButtonAdd_Click(object sender, EventArgs e)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            data.Add(new Random().Next(), "101");
            Cache["key"] = data;
        }
    }
}
Now assume there are 2 requests:
Request 1 - Someone clicks Button_Process and some operation on the cached object is taking place.
Request 2 - Someone clicks Button_Add and the first person gets the "Collection was modified" exception.
I understand the problem - it happens because both requests are accessing the same bit of memory. I have 2 solutions in mind:
1. Use a for loop instead of foreach (replacing the FirstOrDefault in the actual code) - I don't know how efficient this will be after the change, but since I never delete any item from the cache I was considering it.
2. Put some kind of lock around the cached object - but I don't know exactly where and how I should lock it.
Please help me with this. I am not able to figure out an efficient solution. What is the best way to handle such situations?
This happens because you're working directly with the object located in the cache. Good practice, to avoid those exceptions and other weird behavior (such as accidentally modifying the cached object), is to work with a copy of the cached data. There are several ways of achieving that, such as cloning or some kind of deep copy. What I prefer is keeping objects in the cache serialized (in any form you like - JSON/XML/binary or whatever else), since (de)serialization produces a deep copy of your object. The following small code snippet should clarify things:
using System.Web;
using System.Web.Caching;
using Newtonsoft.Json;

public static class CacheManager
{
    private static readonly Cache MyCache = HttpRuntime.Cache;

    public static void Put<T>(T data, string key)
    {
        MyCache.Insert(key, Serialize(data));
    }

    public static T Get<T>(string key)
    {
        var data = MyCache.Get(key) as string;
        if (data != null)
            return Deserialize<T>(data);
        return default(T);
    }

    private static string Serialize(object data)
    {
        // This is the Newtonsoft.Json serializer, but you can use whichever one you like.
        return JsonConvert.SerializeObject(data);
    }

    private static T Deserialize<T>(string data)
    {
        return JsonConvert.DeserializeObject<T>(data);
    }
}
And usage:
var myObj = new Dictionary<int, int>();
CacheManager.Put(myObj, "myObj");
//...
var anotherObj = CacheManager.Get<Dictionary<int, int>>("myObj");
Check out the Task Parallel Library for .NET 3.5. It has concurrent collections such as ConcurrentStack, ConcurrentQueue and ConcurrentDictionary.
http://www.nuget.org/packages/TaskParallelLibrary
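A rough sketch of that idea (mine, not from the original answer; it assumes .NET 4+, where ConcurrentDictionary lives in System.Collections.Concurrent, plus System.Linq for FirstOrDefault): cache a ConcurrentDictionary instead of a plain Dictionary, because its enumerator tolerates concurrent writes and will not throw the "Collection was modified" exception.
// In Page_Load (first request): seed the cache with a thread-safe dictionary.
var dict = new ConcurrentDictionary<int, string>();
for (int i = 0; i < 10; i++)
    dict.TryAdd(i, i.ToString());
Cache["key"] = dict;

// In ButtonProcess_Click: enumerating is safe even while another request adds items.
var data = Cache["key"] as ConcurrentDictionary<int, string>;
if (data != null)
{
    var hit = data.Values.FirstOrDefault(v => v.Contains("5"));
}

// In ButtonAdd_Click: concurrent adds no longer break the reader.
var target = Cache["key"] as ConcurrentDictionary<int, string>;
if (target != null)
{
    target.TryAdd(new Random().Next(), "101");
}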
The problem is that the cache object is global to the app domain, so the data stored in it is shared between all requests.
The only solution to this problem is to take a lock when you want to access the collection and release it afterwards (https://msdn.microsoft.com/en-us/library/vstudio/c5kehkcz%28v=vs.100%29.aspx).
(Sorry for my bad English.)
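To make the locking suggestion concrete, here is a minimal sketch (the lock field and its name are my own, not from the answer; it assumes System.Linq for FirstOrDefault). Every code path that reads or writes the cached dictionary must take the same lock:
// Single application-wide gate guarding the cached dictionary.
private static readonly object CacheLock = new object();

protected void ButtonProcess_Click(object sender, EventArgs e)
{
    lock (CacheLock)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            // Safe to enumerate: writers also take CacheLock.
            var hit = data.Values.FirstOrDefault(v => v.Contains("5"));
        }
    }
}

protected void ButtonAdd_Click(object sender, EventArgs e)
{
    lock (CacheLock)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            data.Add(new Random().Next(), "101");
            Cache["key"] = data;
        }
    }
}
The downside is that a long-running enumeration (like the Thread.Sleep in the repro) blocks every other request that touches the cache, which is why the copy-on-read or concurrent-collection approaches above are often preferable.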

ASP.NET Response.Filter

I need to create a filter that replaces <h2> tags in the HTML with <h3>:
My filter
public class TagsFilter : Stream
{
    HttpContext qwe;

    public TagsFilter(HttpContext myContext)
    {
        qwe = myContext;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        string html = System.Text.Encoding.UTF8.GetString(buffer);
        html = html.Replace("<h2>", "<h3>");
        qwe.Response.Write(html.ToCharArray(), 0, html.ToCharArray().Length);
    }

    // remaining Stream overrides omitted
}
My module
public class TagsChanger : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.Response.Filter = new TagsFilter(context.Context);
    }

    // Dispose() omitted
}
I get the error System.Web.HttpException: Response is not available in this context.
Look at Rick Strahl's post about "Capturing and Transforming ASP.NET Output with Response.Filter".
Response.Filter content is chunked. So to implement a Response.Filter effectively requires only that you implement a custom stream and handle the Write() method to capture Response output as it’s written. At first blush this seems very simple – you capture the output in Write, transform it and write out the transformed content in one pass. And that indeed works for small amounts of content. But you see, the problem is that output is written in small buffer chunks (a little less than 16k it appears) rather than just a single Write() statement into the stream, which makes perfect sense for ASP.NET to stream data back to IIS in smaller chunks to minimize memory usage en route.
Unfortunately this also makes it more difficult to implement any filtering routines, since you don't directly get access to all of the response content, which is problematic especially if those filtering routines require you to look at the ENTIRE response in order to transform or capture the output, as is needed for the solution the gentleman in my session asked for.
So in order to address this, a slightly different approach is required - one that captures all the Write() buffers into a cached stream and then makes the stream available only when it's complete and ready to be flushed.
As I was thinking about the implementation I also started thinking about the few instances when I’ve used Response.Filter implementations. Each time I had to create a new Stream subclass and create my custom functionality but in the end each implementation did the same thing – capturing output and transforming it. I thought there should be an easier way to do this by creating a re-usable Stream class that can handle stream transformations that are common to Response.Filter implementations.
Rick Strahl wrote his own stream filter implementation that handles text replacement the right way.
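To illustrate the buffering approach described above, here is a minimal sketch of my own (it is not Rick Strahl's ResponseFilterStream; it assumes System.IO and System.Text): every Write() call is captured into a MemoryStream, and the whole document is transformed and forwarded only when the response is flushed.
public class BufferedReplaceFilter : Stream
{
    private readonly Stream _inner;                 // the original Response.Filter
    private readonly MemoryStream _buffer = new MemoryStream();
    private bool _transformed;

    public BufferedReplaceFilter(Stream inner)
    {
        _inner = inner;
    }

    // Capture each chunk instead of forwarding it immediately.
    public override void Write(byte[] buffer, int offset, int count)
    {
        _buffer.Write(buffer, offset, count);
    }

    // When the response is flushed, transform the complete document in one pass.
    public override void Flush()
    {
        if (!_transformed)
        {
            string html = Encoding.UTF8.GetString(_buffer.ToArray());
            byte[] output = Encoding.UTF8.GetBytes(html.Replace("<h2>", "<h3>"));
            _inner.Write(output, 0, output.Length);
            _transformed = true;
        }
        _inner.Flush();
    }

    // Minimal plumbing required because Stream's members are abstract.
    public override bool CanRead { get { return false; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return true; } }
    public override long Length { get { return _buffer.Length; } }
    public override long Position { get; set; }
    public override int Read(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { }
}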
I made a small example. I think you have to write to the original filter stream, rather than accessing the HttpContext.
public class ReplacementStream : Stream
{
    private Stream stream;
    private StreamWriter streamWriter;

    public ReplacementStream(Stream stm)
    {
        stream = stm;
        streamWriter = new StreamWriter(stream, System.Text.Encoding.UTF8);
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        string html = System.Text.Encoding.UTF8.GetString(buffer);
        html = html.Replace("<h2>", "<h3>");
        streamWriter.Write(html.ToCharArray(), 0, html.ToCharArray().Length);
        streamWriter.Flush();
    }

    // all other necessary overrides go here ...
}
public class FilterModule : IHttpModule
{
    public String ModuleName
    {
        // Name referenced in Web.config when the module is registered
        get { return "FilterModule"; }
    }

    void context_BeginRequest(object sender, EventArgs e)
    {
        HttpContext context = HttpContext.Current;
        context.Response.Filter = new ReplacementStream(context.Response.Filter);
    }

    public void Init(HttpApplication context)
    {
        context.BeginRequest += new EventHandler(context_BeginRequest);
    }
}
Found the solution at this post on SO. Worked for me.
The problem is that you are applying the filter in the Init event, which only occurs once per application instance (it is essentially close to App_Start).
What you need to do is hook in the BeginRequest event from the Init event, and then apply the filter on BeginRequest.
public void Init(HttpApplication application)
{
    application.BeginRequest += BeginRequest;
}

private void BeginRequest(object sender, EventArgs e)
{
    var app = (HttpApplication)sender;
    var context = app.Context;
    context.Response.Filter = new TagsFilter(context);
}

Parallel activities appear to execute sequentially

I am learning WF4 and got stuck at the following place. Please help. Thanks.
1) I have created a static method, MyMethod, in a static class called Worker. Within this method I call Thread.Sleep(3000) and then print "MyMethod called".
2) I then created an activity, DoWork (DoWork.xaml), which consists of an InvokeMethod (the TargetType is the Worker class from step 1 and MethodName = MyMethod).
3) In the main method, I call two methods, OutputSequence() and OutputParallel(), which are as follows:
private static void OutputSequence()
{
    Sequence s = new Sequence() { Activities = { new DoWork(), new DoWork() } };
    WorkflowInvoker.Invoke(s);
}

private static void OutputParallel()
{
    Parallel p = new Parallel() { Branches = { new DoWork(), new DoWork() } };
    WorkflowInvoker.Invoke(p);
}
The OutputSequence() call is OK as it calls the target method twice (in sequence), but the parallel one seems to execute sequentially as well. I expected it to execute in parallel.
What am I missing?
The Parallel activity is not what you think it is - it allows you to wait for things in parallel, not to execute CPU-bound code in parallel. The WF4 threading model is that exactly one thread at a time is active in a workflow.
If you put two Delay activities in the Parallel, both of those waits would occur in parallel, as opposed to sequentially as they would in a Sequence.
The idea is that you want to wait for a number of actions when you don't know the order in which they will occur. The Parallel activity is then complete when all of its child branches have completed.
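A minimal sketch of that point (my example, not from the original answer, using the standard Delay activity from System.Activities.Statements): two Delay branches inside a Parallel finish in roughly the time of the longest delay, whereas the same two Delays in a Sequence take the sum of both.
// Two waits in parallel: the workflow finishes in about 3 seconds, not 6,
// even though only one workflow thread is ever active.
var p = new Parallel
{
    Branches =
    {
        new Delay { Duration = TimeSpan.FromSeconds(3) },
        new Delay { Duration = TimeSpan.FromSeconds(3) }
    }
};
WorkflowInvoker.Invoke(p);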
Actually the Parallel activity really does execute its branches one by one and has nothing to do with concurrent code execution, the way two threads would.
But there is an MS sample that shows "true" concurrent execution for blocks inside a Parallel activity. .NET 4 has the AsyncCodeActivity, which allows activities to execute concurrently. Please check http://msdn.microsoft.com/en-us/library/ee358731(VS.100).aspx
Below is the sample copy-pasted from the link above:
public sealed class GenerateRandom : AsyncCodeActivity<int>
{
    static Random r = new Random();

    protected override IAsyncResult BeginExecute(AsyncCodeActivityContext context, AsyncCallback callback, object state)
    {
        // Create a delegate that references the method that implements
        // the asynchronous work. Assign the delegate to the UserState,
        // invoke the delegate, and return the resulting IAsyncResult.
        Func<int> GetRandomDelegate = new Func<int>(GetRandom);
        context.UserState = GetRandomDelegate;
        return GetRandomDelegate.BeginInvoke(callback, state);
    }

    protected override int EndExecute(AsyncCodeActivityContext context, IAsyncResult result)
    {
        // Get the delegate from the UserState and call EndInvoke
        Func<int> GetRandomDelegate = (Func<int>)context.UserState;
        return (int)GetRandomDelegate.EndInvoke(result);
    }

    int GetRandom()
    {
        // This activity simulates taking a few moments
        // to generate the random number. This code runs
        // asynchronously with respect to the workflow thread.
        Thread.Sleep(5000);
        return r.Next(1, 101);
    }
}
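As a short usage sketch (mine, assuming the GenerateRandom activity above), dropping two of these into a Parallel now gives genuinely overlapping work, because each branch sleeps off the workflow thread:
// Both branches begin their asynchronous work before either finishes,
// so the Parallel completes in roughly 5 seconds rather than 10.
var p = new Parallel
{
    Branches =
    {
        new GenerateRandom(),
        new GenerateRandom()
    }
};
WorkflowInvoker.Invoke(p);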
Hope this helps someone else.

Regarding implementation/usage of Background Worker

Ok, I've poked around and can't find a suitable answer to my question. I have a rather complicated code base that uses a Timer to initiate the creation of a background worker that transmits something at regular intervals.
I am running into an issue when it calls ReportProgress:
"This operation has already had OperationCompleted called on it and further calls are illegal."
Stack Trace:
at System.ComponentModel.AsyncOperation.VerifyNotCompleted()
at System.ComponentModel.AsyncOperation.Post(SendOrPostCallback d, Object arg)
at System.ComponentModel.BackgroundWorker.ReportProgress(Int32 percentProgress, Object userState)
at NAME.TX.Work(Object sender, DoWorkEventArgs e) in file.cs:line 68
at System.ComponentModel.BackgroundWorker.OnDoWork(DoWorkEventArgs e)
at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument)
My question is relatively simple: the function called when the timer goes off creates a new background worker, but is it actually a new background worker?
My code (simplified, because it's very bulky):
//THIS IS THE FUNCTION CALLED BY TIMER
public void TransmitMSG(MSG msg)
{
    //Initialize _tx
    this.m_thread = new BackgroundWorker();
    m_thread.WorkerReportsProgress = true;
    //Initialize all events used by _tx
    m_thread.DoWork += new DoWorkEventHandler(this.Work);
    m_thread.ProgressChanged += new ProgressChangedEventHandler(this.Report);
    m_thread.RunWorkerCompleted += new RunWorkerCompletedEventHandler(this.Complete);
    //Run the backgroundworker
    m_thread.RunWorkerAsync(msg);
}

//THIS IS THE WORK FUNCTION
public override void Work(object sender, DoWorkEventArgs e)
{
    //This code is in the TX Thread
    //The sender should be a BackgroundWorker
    BackgroundWorker _tx = sender as BackgroundWorker;
    //Check the Argument
    if (e.Argument != null && e.Argument.GetType() == typeof(MSG))
    {
        //Transmit the Argument Message
        m_THING.Transmit((MSG)e.Argument);
        //Report progress according to the result
        _tx.ReportProgress(CONSTANTS.PROGRESS.SUCCESS_TX, (MSG)e.Argument);
    }
}
The Work function is an override because there is a higher handling class that allows for transmit and receive; these two inherit from the same class so I can have common ReportProgress and Close methods.

Example of Asynchronous page processing in ASP.net webforms (.NET 2.0)

Can someone provide me with a simple example of Asynchronous page processing in ASP.NET Webforms 2.0 (I'm using VS 2010, so new syntax like lambdas are ok)?
I have some long running requests that I don't want tying up IIS threads.
For simplicity's sake, let's say my current code looks like this:
protected void Page_Load(object sender, EventArgs e)
{
    string param1 = _txtParam1.Text;
    string param2 = _txtParam2.Text;

    //This takes a long time (relative to a web request)
    List<MyEntity> entities = _myRepository.GetEntities(param1, param2);

    //Conceptually, I would like IIS to bring up a new thread here so that I can
    //display the data after it has come back.
    DoStuffWithEntities(entities);
}
How can I modify this code so that it is asynchronous? Let's assume that I already set async="true" in the aspx page.
EDIT
I think I figured out how to get what I'm looking for. I've put the example code in an answer here. Feel free to point out any flaws or changes that can be made.
I asked some folks on the ASP.NET team. Here's their emailed response to me, and now, to you.
All that code ends up doing is spinning up a new thread and performing delegate invocation on that thread. So now there are two threads running: the request thread and the new thread. Hence this sample actually has worse performance than the original synchronous code would have had.
See http://www.asp.net/web-forms/tutorials/aspnet-45/using-asynchronous-methods-in-aspnet-45 for a sample on how to write and consume async methods in ASP.NET.
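For reference, a minimal sketch of the pattern that tutorial describes (my illustration; it needs ASP.NET 4.5+ with Async="true" on the @Page directive, and GetEntitiesAsync is a hypothetical async version of the repository method):
protected void Page_Load(object sender, EventArgs e)
{
    // The page yields the request thread back to the pool while the awaited call
    // is in flight, then resumes to render the result.
    RegisterAsyncTask(new PageAsyncTask(async () =>
    {
        List<MyEntity> entities =
            await _myRepository.GetEntitiesAsync(_txtParam1.Text, _txtParam2.Text);
        DoStuffWithEntities(entities);
    }));
}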
Here is a simple example of asynchronous processing.
protected void Page_Load(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadProc));
    ThreadPool.QueueUserWorkItem(state => Dokimes_Programming_multithread_QueryWorkThead.ThreadProc2());

    Debug.Write("Main thread does some work, then sleeps.");
    // If you comment out the Sleep, the main thread exits before
    // the thread pool task runs. The thread pool uses background
    // threads, which do not keep the application running. (This
    // is a simple example of a race condition.)
    // Thread.Sleep(4000);

    txtDebug.Text += "ended";
    Debug.Write("end.");
}

// This thread procedure performs the task.
static void ThreadProc(Object stateInfo)
{
    // No state object was passed to QueueUserWorkItem, so stateInfo is null.
    Debug.Write(" Hello from the thread pool 1.");
}

static void ThreadProc2()
{
    // No state object was passed to QueueUserWorkItem, so stateInfo is null.
    Debug.Write("Hello from the thread pool 2.");
}
Another way:
You can use PageAsyncTask; see a full example here:
http://msdn.microsoft.com/en-us/library/system.web.ui.pageasynctask.aspx
Something like
clAsynCustomObject oAsynRun = new clAsynCustomObject();
PageAsyncTask asyncTask = new PageAsyncTask(oAsynRun.OnBegin, oAsynRun.OnEnd, oAsynRun.OnTimeout, null, true);
Page.RegisterAsyncTask(asyncTask);
Page.ExecuteRegisteredAsyncTasks();
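For context, a rough sketch of what the handler class behind that snippet might look like (the class name comes from the answer above, but the method bodies and the Func<string> field are my assumptions; the signatures are the ones required by PageAsyncTask's BeginEventHandler/EndEventHandler delegates):
public class clAsynCustomObject
{
    private Func<string> _work;

    // Signature required by BeginEventHandler: start the long-running work on a
    // delegate so the page thread is released, and return its IAsyncResult.
    public IAsyncResult OnBegin(object sender, EventArgs e, AsyncCallback cb, object state)
    {
        _work = delegate { Thread.Sleep(5000); return "done"; };
        return _work.BeginInvoke(cb, state);
    }

    // Called when the asynchronous work finishes; harvest the result here.
    public void OnEnd(IAsyncResult ar)
    {
        string result = _work.EndInvoke(ar);
    }

    // Called if the work exceeds the page's AsyncTimeout.
    public void OnTimeout(IAsyncResult ar)
    {
    }
}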
I think I discovered how to do what I wanted to accomplish... though it may not be the best way, feel free to chime in.
At the time of writing there was only one answer in this thread, by Aristos. While he gave an example of executing an asynchronous request, what I wanted was a way to tell ASP.NET to execute some long running method, release the IIS thread so it can be available to service other requests, and then come back when the method finished.
Here's what I came up with, using the same (or similar) example in the question:
using System;
using System.Collections.Generic;
using System.Threading;
using System.Web.UI;

namespace WebApplication2
{
    public class MyEntity
    {
        public string Name { get; set; }
    }

    public class MyRepository
    {
        public List<MyEntity> GetEntities(string param1, string param2)
        {
            Thread.Sleep(10000);
            return new List<MyEntity> { new MyEntity { Name = "John Smith" } };
        }
    }

    public partial class Default : Page
    {
        private readonly MyRepository _myRepository = new MyRepository();
        private List<MyEntity> _myEntities;

        protected void Page_Load(object sender, EventArgs e)
        {
        }

        private void DoStuffWithEntities()
        {
            Response.Write("<br/><br/><b>" + _myEntities[0].Name + "</b><br/><br/>");
        }

        protected void _btnProcess_Click(object sender, EventArgs e)
        {
            AddOnPreRenderCompleteAsync(BeginExecution, EndExecution, null);
        }

        private void GetEntities()
        {
            string param1 = _txtParam1.Text;
            string param2 = _txtParam2.Text;

            //This takes a long time (relative to a web request)
            _myEntities = _myRepository.GetEntities(param1, param2);
        }

        private IAsyncResult BeginExecution(object sender, EventArgs e, AsyncCallback callback, object state)
        {
            var t = new ThreadStart(GetEntities);
            return t.BeginInvoke(callback, null);
        }

        private void EndExecution(IAsyncResult result)
        {
            //Conceptually, I would like IIS to bring up a new thread here so that I can
            //display the data after it has come back.
            DoStuffWithEntities();
        }
    }
}
