Use of timer causes HttpContext.Current to be null - ASP.NET

I have a function that parses an input file.
Private Function getSvSpelOdds(ByVal BombNo As Integer) As Boolean
    Dim InputFileBase As String = HttpContext.Current.Application("InputFileBase")
    strInputFile = InputFileBase & "PC_P7_D.TXT"
    OddsReader = New StreamReader(strInputFile)
    'some other code
End Function
If the file is not there (getSvSpelOdds returns False), I would like to retry after 30 seconds.
To achieve this I use a timer.
If Not getSvSpelOdds(y) Then
    Timer1.Interval = 30000
End If

Private Sub Timer1_Elapsed(sender As Object, e As System.Timers.ElapsedEventArgs) Handles Timer1.Elapsed
    getSvSpelOdds(y)
End Sub
The problem is that when the timer fires, HttpContext.Current (which I use to get the value of the global variable) is null.
Should I use some other approach to get this to work?

As already described, HttpContext.Current is expected to be null because Timer_Elapsed is called on a different thread. But you may use System.Web.HttpRuntime.Cache to pass the filename; the cache is accessible from all threads.
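A rough sketch of that idea, assuming you copy the value into HttpRuntime.Cache during a request and read it back in the Timer1.Elapsed handler (the cache key and the helper names here are illustrative, not part of your code):

Imports System.Web
Imports System.Web.Caching

Public Module InputFileCacheSketch

    ' Call this from request-handling code, where HttpContext.Current is available.
    Public Sub StoreInputFileBase()
        HttpRuntime.Cache.Insert("InputFileBase", _
                                 HttpContext.Current.Application("InputFileBase"), _
                                 Nothing, _
                                 Cache.NoAbsoluteExpiration, _
                                 Cache.NoSlidingExpiration)
    End Sub

    ' Call this from the Timer1.Elapsed handler, where HttpContext.Current is Nothing.
    Public Function GetInputFilePath() As String
        Dim inputFileBase As String = TryCast(HttpRuntime.Cache("InputFileBase"), String)
        If inputFileBase Is Nothing Then Return Nothing
        Return inputFileBase & "PC_P7_D.TXT"   ' file name taken from the question
    End Function

End Module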

HttpContext.Current only gives you the context you want when you call it on the thread that is handling the incoming request.
When you call it outside of such a thread, you get null. That matches your case, as Timer1_Elapsed is executed on a different thread.

Should I use some other approach to get this to work?
Almost certainly, yes. 30 seconds is a long time to wait without giving any feedback to users.
It would probably be better to return a "no results are available yet, but we're still looking" page to the user. That page can be set to refresh automatically after 30 seconds, by adding a suitable meta-tag:
<META HTTP-EQUIV="refresh" CONTENT="30">
You then get a fresh request/response cycle on the server, and you haven't tied up server resources in the meantime.
Other answers seem to address the other part of your question (about why it doesn't work in the timer callback).

The Elapsed event on the Timer runs on a separate thread, so it's expected behaviour for the current context to be null.
You can only access it from the thread that is handling the request.
Should I use some other approach to get this to work?
Yes. It's generally not a good idea to mix ASP.NET and threads, given the complexity of how ASP.NET works. As already mentioned, it's not a great user experience to have no feedback for 30 seconds; it's better to let the user know what's actually going on.
Also, you need to determine whether the timeout length is appropriate, or whether a timeout is needed at all. I don't know the nature of your application, but I assume there is some external means for the file to be generated and picked up by your site.

Related

ASP.net Thread Safety Confusion

I have a very long-running process in an ASP.NET application that we desperately need to dramatically shorten. The process in question is charging a large number of credit cards. Currently it performs at about 1 charge per second; we need this to be more like 10 per second.
So we decided that utilizing multiple simultaneous threads would be one way to go. So we basically take this large list of orders to process, divide the list into ten lists and then spawn a new thread to process each of the ten lists simultaneously.
An additional complication of this process is that we need to report progress on this process, and not only to the user session that initiated the process, but to any user, in any session in the application. So for example, if I log in and start this process, I will see a progress bar. If after I initiate the process, and it is still running, another user logs in elsewhere and goes to this same page, they will also see the progress bar.
I did some research and thought that I could use Application variables to store the relevant bits of information required to report progress. The client polls the server on a regular basis whenever on this page to see if there are any threads running, and if so, it returns various statistics on the progress of the process back to the client.
It would seem that this approach does not work. A simple counter of the number of currently running threads does not behave as expected. It seems that the so-called thread safety of the Application object means that no two threads can access the same variable at the same instant, but not that concurrent updates are serialized: if two threads both attempt to increment a variable, one increment takes effect and the other is lost, and rather than queuing up and incrementing in turn, the second thread just moves on. I'm sure this is my thread-safety ignorance shining through.
Another issue is that using Debug.Print or Debug.WriteLine seem to be the same kind of "thread-safe" as the Application object. As each thread starts, we use Debug.WriteLine to output the name and start time of the thread, and as it completes, we do the same thing to write that it completed. We consistently see ten threads start and four threads end in the debug window.
I don't think we need to use Application.Lock() and Application.UnLock(), but I have tried it both with and without those calls before and after every write operation, but to no avail; the results are the same either way.
I have a ton of code, so I'm not sure exactly which parts to share, but here are some of the relevant parts:
This is how we create and start the threads:
For Each oBatch As List(Of Guid) In oOrderBatches
    Dim t As New Threading.Thread(Sub() ProcessPaymentBatch(oBatch, clubrunid, oToken.UserID))
    t.IsBackground = True
    t.Start()
Next
Here is the sub that is started by each thread:
Private Sub ProcessPaymentBatch(oBatch As List(Of Guid), clubrunid As String, UserID As Guid)
    ThreadsRunning(clubrunid) += 1
    Try
        Debug.Print("Thread Start")
        For Each oID As Guid In oBatch
            'Do a bunch of processing stuff…
        Next
    Finally
        ThreadsRunning(clubrunid) -= 1
        Debug.Print("Thread End")
    End Try
End Sub
Finally, this is an example of one of the application variables that the threads attempt to access, but which seems to be failing.
Private Const _THREADSRUNNING As String = "ThreadsRunningThisRun_"

Public Property ThreadsRunning(clubid As String) As Integer
    Get
        Dim sToken As String = _THREADSRUNNING & clubid
        If Application(sToken) Is Nothing Then
            ThreadsRunning(clubid) = 0
        End If
        Return Application(sToken)
    End Get
    Set(ByVal value As Integer)
        Debug.Print(value)
        Dim sToken As String = _THREADSRUNNING & clubid
        Application.Lock()
        Application(sToken) = value
        Application.UnLock()
    End Set
End Property
The Debug output from this property looks something like this:
Thread Start
1
Thread Start
Thread Start
1
1
4
Thread End
5
3
Thread Start
6
3
1
-1
Thread End
-2
-3
I can't understand why there would be a different number of "Thread Start" and "Thread End" debug statements, and I don't understand how the thread count could get to negative numbers. This is why I am confused by the thread safety of the Application and Debug objects.
Your help in this matter would be greatly appreciated!
Never mind, I was just being an idiot. The problem had nothing to do with the Application or Debug objects not being thread safe; the problem was in my methodology (as was expected, really).
To clarify, the issue was that we were locking the global variables in the application object when writing, but not when reading. We then tried also locking when reading, but still had the same problem. What we failed to realize was that when incrementing a value, you are getting the current value, adding onto that, then setting the new value. The lock needed to bridge all three of those operations, so it goes like this:
Lock
Get
Add
Set
Unlock
What we were doing previously was:
Lock
Get
Unlock
Add
Lock
Set
Unlock
That allowed multiple threads to Get and then Set the same values as one another, which explains all of the oddities we were seeing in the debug window.
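For completeness, here is a sketch of how the increment looks with the lock bridging all three operations (the helper name is just illustrative; _THREADSRUNNING and Application are the same ones from the code above):

' Hypothetical helper: adjusts the per-club counter atomically by holding
' Application.Lock() across the read, the add and the write.
Private Sub ChangeThreadsRunning(clubrunid As String, delta As Integer)
    Dim sToken As String = _THREADSRUNNING & clubrunid
    Application.Lock()
    Try
        Dim current As Integer = 0
        If Application(sToken) IsNot Nothing Then
            current = CInt(Application(sToken))
        End If
        Application(sToken) = current + delta   ' Get, Add and Set inside one lock
    Finally
        Application.UnLock()
    End Try
End Sub

' Usage in ProcessPaymentBatch:
'   ChangeThreadsRunning(clubrunid, 1)    ' at the start
'   ChangeThreadsRunning(clubrunid, -1)   ' in the Finally block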

ApplicationInstance.CompleteRequest doesn't stop the execution of the code below it?

I was told that Response.Redirect is an expensive operation because it raises a ThreadAbortException, so instead we should be using the CompleteRequest method. So I gave it a try, but I noticed the code below it still runs, which I do not want. I want to instantly force the browser to jump to another website.
Public Shared Sub TestCompleteRequest()
    If 1 = 1 Then
        System.Web.HttpContext.Current.Response.Redirect("Http://Google.com", False)
        System.Web.HttpContext.Current.ApplicationInstance.CompleteRequest()
    End If
    Throw New ApplicationException("Hello, why are you here?")
End Sub
As for the code above, the ApplicationException is still thrown. But why? :(
One method doesn't replace the other directly. The CompleteRequest() method does not end execution when it's called. So if that's really what you want to do then Response.Redirect(string) would be the way to go.
CompleteRequest() simply bypasses the Response.End() method, which is what generates the ThreadAbortException you mentioned, but crucially CompleteRequest() flushes the response buffer. This means the HTTP 302 redirect response is sent to the browser at the line where you call CompleteRequest(), which gives you a chance to do operations that don't affect the response after it's been sent to the user.
The solution for you really depends on what you need to achieve, can you provide an example of what you're using Response.Redirect for and what other code is in the same method?
Calling a method in the ASP.NET framework deals with the request, but the fact is you're still writing and running VB.NET; there's nothing in the language (nor should there be, I'd say) that indicates 'when this method returns, perform an Exit Sub'.
Who's to say you wouldn't want to execute some more of the method after telling ASP.NET to complete the request, anyway?
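If the goal really is to not run the rest of the method, just return from it yourself after signalling the redirect. A sketch based on the code in the question (the early Return is the only change):

Public Shared Sub TestCompleteRequest()
    If 1 = 1 Then
        System.Web.HttpContext.Current.Response.Redirect("Http://Google.com", False)
        System.Web.HttpContext.Current.ApplicationInstance.CompleteRequest()
        Return   ' stop executing the rest of the method yourself
    End If
    Throw New ApplicationException("Hello, why are you here?")
End Sub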

Dictionary Behaves Strangely During Databinding

I was trying to do a little data access optimization, and I ran into a situation where a dictionary appeared to get out of sync in a way that should be impossible, unless I'm somehow getting into a multithreaded situation without knowing it.
One column of GridLabels binds to a property that does data access -- which is a tad expensive. However, multiple rows end up making the same call, so I should be able to head any problems off at the pass by doing a little caching.
However, elsewhere in the app this same code is called in ways where caching would not be appropriate, so I needed a way to enable caching on demand. My databinding code looks like this:
OrderLabelAPI.MultiSyringeCacheEnabled = True
Me.GridLabels.DataBind()
OrderLabelAPI.MultiSyringeCacheEnabled = False
And the expensive call where the caching happens looks like this:
Private Shared MultiSyringeCache As New Dictionary(Of Integer, Boolean)
Private Shared m_MultiSyringeCacheEnabled As Boolean = False

Public Shared Function IsMultiSyringe(orderLabelID As Integer) As Boolean
    If m_MultiSyringeCacheEnabled Then
        'Since this can get hit a lot, we cache the values into a dictionary. Obviously,
        'it goes away after each request. And the cache is disabled by default.
        If Not MultiSyringeCache.ContainsKey(orderLabelID) Then
            MultiSyringeCache.Add(orderLabelID, DoIsMultiSyringe(orderLabelID))
        End If
        Return MultiSyringeCache(orderLabelID)
    Else
        Return DoIsMultiSyringe(orderLabelID)
    End If
End Function
And here is the MultiSyringeCacheEnabled property:
Public Shared Property MultiSyringeCacheEnabled As Boolean
    Get
        Return m_MultiSyringeCacheEnabled
    End Get
    Set(value As Boolean)
        ClearMultiSyringeCache()
        m_MultiSyringeCacheEnabled = value
    End Set
End Property
Very, very rarely (unreproducibly rarely...) I will get the following exception: The given key was not present in the dictionary.
If you look closely at the caching code, that's impossible, since the first thing it does is ensure that the key exists. If DoIsMultiSyringe tampered with the dictionary (either explicitly or by setting MultiSyringeCacheEnabled), that could also cause problems, and for a while I assumed this had to be the culprit. But it isn't. I've been over the code very carefully several times. I would post it here, but it gets into a deeper object graph than would be appropriate.
So. My question is, does datagridview databinding actually get into some kind of zany multithreaded situation that is causing the dictionary to seize? Am I missing some aspect of shared members?
I've actually gone ahead and yanked this code from the project, but I want to understand what I'm missing. Thanks!
Since this is ASP.NET, you have an implicit multithreaded scenario. You are using a shared variable (see What is the use of a shared variable in VB.NET?), which is (as the keyword implies) "shared" across multiple threads (from different people visiting the site).
You can very easily have a scenario where one visitor's thread gets to here:
'Since this can get hit a lot, we cache the values into a dictionary. Obviously,
'it goes away after each request. And the cache is disabled by default.
If Not MultiSyringeCache.ContainsKey(orderLabelID) Then
    MultiSyringeCache.Add(orderLabelID, DoIsMultiSyringe(orderLabelID))
End If
' My thread is right here, when you visit the site
Return MultiSyringeCache(orderLabelID)
and then your thread comes in here and supersedes my thread:
Set(value As Boolean)
    ClearMultiSyringeCache()
    m_MultiSyringeCacheEnabled = value
End Set
Then my thread is going to try to read a value from the dictionary after you've cleared it.
That said, I am not sure what performance benefit you expect from a "cache" that you clear with every request. It looks like you should simply not make this variable Shared: make it an instance variable, and each request accessing it will have its own copy.
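For illustration, a minimal sketch of one way to keep the cache per-request instead of Shared, by parking the dictionary in HttpContext.Current.Items (the class name, the "MultiSyringeCache" key and the placeholder DoIsMultiSyringe body are assumptions for illustration):

Imports System.Web
Imports System.Collections.Generic

Public Class OrderLabelAPISketch

    ' Per-request cache: HttpContext.Items is private to the current request,
    ' so no other visitor's thread can clear it or read it.
    Private Shared ReadOnly Property RequestCache() As Dictionary(Of Integer, Boolean)
        Get
            Dim cache As Dictionary(Of Integer, Boolean) = _
                TryCast(HttpContext.Current.Items("MultiSyringeCache"), Dictionary(Of Integer, Boolean))
            If cache Is Nothing Then
                cache = New Dictionary(Of Integer, Boolean)()
                HttpContext.Current.Items("MultiSyringeCache") = cache
            End If
            Return cache
        End Get
    End Property

    Public Shared Function IsMultiSyringe(orderLabelID As Integer) As Boolean
        Dim cache As Dictionary(Of Integer, Boolean) = RequestCache
        Dim result As Boolean
        If Not cache.TryGetValue(orderLabelID, result) Then
            result = DoIsMultiSyringe(orderLabelID)   ' the expensive call from the question
            cache(orderLabelID) = result
        End If
        Return result
    End Function

    Private Shared Function DoIsMultiSyringe(orderLabelID As Integer) As Boolean
        ' Placeholder for the real data access described in the question.
        Return False
    End Function

End Class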

ASP.NET session object lifetime pessimistic assumption!

I check a session object, and if it exists I call another method which uses that object indirectly. Although the second method accesses the object within a few nanoseconds, I was thinking of a situation where the object expires exactly between the two calls. Does the Session object extend its lifetime on every read access from code, preventing such a problem? If not, how can I solve the problem?
If you are going to ask why I don't pass the retrieved object from the first method to the second one: it's because I pass the ASP.NET Page object, which carries many other parameters inside it, to the second method, and if I tried to pass each of them separately there would be many parameters, whereas now I just pass one Page object.
Don't worry, this won't happen
If I understand your situation it works sort of this way:
Access a certain page
If session is active it immediately redirects to the second page or executes a certain method on the first page.
Second page/method uses session
You're afraid that session will expire between execution of the first and second method/page.
Basically this is impossible, since your session timer is reset just before the first page starts processing. So if the first page had an active session, then your second page/method will have it as well (as long as processing finishes within 20 minutes, the default session timeout duration).
How is Session processed
Session is handled by an HTTP module that runs on every request, before the page starts processing. This explains the behaviour. If you're not familiar with HTTP modules, I suggest you read a bit about the IHttpModule interface.
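If you've never seen one, a bare-bones module looks roughly like this (the name and what it does are purely illustrative; the real SessionStateModule is much more involved, and a module like this would be registered in web.config):

Imports System
Imports System.Web

Public Class RequestTimestampModule
    Implements IHttpModule

    Public Sub Init(context As HttpApplication) Implements IHttpModule.Init
        ' Runs once per HttpApplication instance; hook pipeline events here.
        AddHandler context.BeginRequest, AddressOf OnBeginRequest
    End Sub

    Private Sub OnBeginRequest(sender As Object, e As EventArgs)
        ' Executes on every request, before the page handler runs.
        Dim app As HttpApplication = DirectCast(sender, HttpApplication)
        app.Context.Items("RequestStartedUtc") = DateTime.UtcNow
    End Sub

    Public Sub Dispose() Implements IHttpModule.Dispose
        ' Nothing to clean up in this sketch.
    End Sub
End Class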
It's quite difficult to understand your question, IMHO, but I will try.
From what I understand, you're doing something like:
string helloWorld = string.Empty;
if (this.Session["myObject"] == null)
{
    // The object was removed from the session or the session expired.
    helloWorld = this.CreateNewMyObject();
}
else
{
    // Session still exists.
    helloWorld = this.Session["myObject"].ToString(); // <- What if the session expired just now?
}
or
// What if the session existed here...
if (this.Session["myObject"] == null)
{
    this.Session["myObject"] = this.CreateNewMyObject();
}
// ... but expired just there?
string helloWorld = this.Session["myObject"].ToString();
I thought that the Session object is managed by the same thread as the page request, which would mean that it is safe to check whether the object exists and then use it without a try/catch.
I was wrong:
For Cache objects you have to be aware of the fact that you’re dealing essentially with an object accessed across multiple threads
Source: ASP.NET Cache and Session State Storage
I was also wrong in not reading carefully enough the answer by Robert Koritnik, which, in fact, clearly answers the question.
In fact, you are warned that an object might be removed during a page request. But since the Session lifespan is tied to page requests, you only need to take the removal of session variables into account if your request takes longer than the session timeout (see "How is Session processed" in the answer by Robert Koritnik).
Of course, such a situation is very rare. But if in your case you are pretty sure that the page request can take longer than 20 minutes (the default session timeout), then yes, you must take into account that an object may be removed after you've checked that it exists but before you really use it.
In this situation, you can obviously increase the session timeout, or use try/catch when accessing the session objects. But IMHO, if the page request takes dozens of minutes, you should consider other alternatives, such as a Windows service, to do the work.
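As a lighter alternative to try/catch, you can read the session entry into a local variable once and work only with that copy; even if the entry is removed a moment later, the local reference stays valid. A sketch reusing the names from the examples above (CreateNewMyObject is the hypothetical factory from those examples):

' Inside a Page method (Session here is Page.Session):
Private Function GetHelloWorld() As String
    ' Read the session value once into a local and work with that copy.
    Dim myObject As Object = Session("myObject")
    If myObject Is Nothing Then
        myObject = CreateNewMyObject()        ' hypothetical factory, as in the examples above
        Session("myObject") = myObject
    End If
    Return myObject.ToString()
End Function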
I'm having difficulty understanding what the problem here is, but let me try again, referring to thread safety.
Thread safety issue
If this is a thread safety issue, you can always take a lock when creating a certain session object so that parallel requests won't run into a problem by double-creating your object.
if (obj == null)
{
    lock (objLock)
    {
        if (obj == null)
        {
            obj = GenerateYourObject();
        }
    }
}
Check the lock documentation on MSDN if you've never used it before, and don't forget to check other web resources as well.

Need suggestion for ASP.Net in-memory queue

I have a requirement to create an HttpHandler that will serve an image file (a simple static file) and also insert a record into a SQL Server table (e.g. http://site/some.img, where some.img is served by the HttpHandler). I need an in-memory object (like a generic List) that I can add items to on each request (I also have to consider a few hundred or thousand requests per second), and I should be able to unload this in-memory object to a SQL table using SqlBulkCopy.
List --> DataTable --> SqlBulkCopy
I thought of using the Cache object: create a generic List, save it in HttpContext.Cache, and insert a new item into it on every request. This will NOT work, as the CacheItemRemovedCallback fires right away when the HttpHandler tries to add a new item. I can't use the Cache object as an in-memory queue.
Can anybody suggest anything? Would I be able to scale in the future if the load increases?
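For reference, here is roughly the List --> DataTable --> SqlBulkCopy unload step I have in mind (the table name, columns and entry type are just placeholders, not the real schema):

Imports System.Collections.Generic
Imports System.Data
Imports System.Data.SqlClient

Public Module BulkUnloadSketch

    ' Drain the queued requests into a DataTable and push them with SqlBulkCopy.
    Public Sub UnloadToSql(ByVal entries As List(Of KeyValuePair(Of DateTime, String)), ByVal connectionString As String)
        Dim table As New DataTable()
        table.Columns.Add("RequestedAt", GetType(DateTime))
        table.Columns.Add("ImageName", GetType(String))

        For Each entry As KeyValuePair(Of DateTime, String) In entries
            table.Rows.Add(entry.Key, entry.Value)
        Next

        Using bulk As New SqlBulkCopy(connectionString)
            ' Hypothetical destination table; add ColumnMappings if the
            ' destination columns don't line up by position.
            bulk.DestinationTableName = "dbo.ImageRequests"
            bulk.WriteToServer(table)
        End Using
    End Sub

End Module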
Why would CacheItemRemovedCallback fire when you ADD something to the queue? That doesn't make sense to me... Even if it does fire, there's no requirement to do anything here. Perhaps I am misunderstanding your requirements?
I have quite successfully used the Cache object in precisely this manner. That is what it's designed for and it scales pretty well. I stored a Hashtable which was accessed on every app page request and updated/cleared as needed.
Option two... do you really need the queue? SQL Server will scale pretty well also if you just want to write directly into the DB. Use a shared connection object and/or connection pooling.
How about just using the generic List to store requests and using a different thread to do the SqlBulkCopy?
This way, storing requests in the list won't block the response for long, and the background thread will be able to update SQL on its own schedule, say every 5 minutes or so.
You can even base the background thread on the Cache mechanism by performing the work in a CacheItemRemovedCallback.
Just insert some object with an expiration time of 5 minutes and reinsert it at the end of the processing work.
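A minimal sketch of that cache-based "timer" (the key name, the 5-minute window and the commented-out drain call are illustrative; reinserting the item in the callback keeps it ticking):

Imports System
Imports System.Web
Imports System.Web.Caching

Public Module CacheTimerSketch

    Private Const TimerKey As String = "BulkCopyTimer"

    ' Insert a dummy item that expires in 5 minutes; its removal callback does the work.
    Public Sub ScheduleNextRun()
        HttpRuntime.Cache.Insert(TimerKey, _
                                 DateTime.UtcNow, _
                                 Nothing, _
                                 DateTime.UtcNow.AddMinutes(5), _
                                 Cache.NoSlidingExpiration, _
                                 CacheItemPriority.NotRemovable, _
                                 AddressOf OnTimerExpired)
    End Sub

    Private Sub OnTimerExpired(ByVal key As String, ByVal value As Object, ByVal reason As CacheItemRemovedReason)
        If reason = CacheItemRemovedReason.Expired Then
            ' DrainQueueToSql()   ' hypothetical: swap the list and run SqlBulkCopy here
        End If
        ScheduleNextRun()        ' reinsert so the "timer" keeps ticking
    End Sub

End Module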
Thanks Alex & Bryan for your suggestions.
Bryan: When I try to replace the List object in the Cache for the second request (now the count should be 2), the CacheItemRemovedCallback gets fired, as I'm replacing the current Cache object with a new one. Initially I also thought this was weird behavior, so I'll have to look deeper into it.
Also, for the second suggestion, I will try inserting records (with the cached SqlConnection object) and see what performance I get when I do the stress test. I doubt I'll be getting fantastic numbers, as it's an I/O operation.
Meanwhile, I'll keep digging on my side for an optimal solution using your suggestions.
You can create a conditional requirement within the callback to ensure you are working on a cache entry that has been hit from an expiration instead of a remove/replace (in VB since I had it handy):
Private Shared Sub CacheRemovalCallbackFunction(ByVal cacheKey As String, ByVal cacheObject As Object, ByVal removalReason As Web.Caching.CacheItemRemovedReason)
    Select Case removalReason
        Case Web.Caching.CacheItemRemovedReason.Expired, Web.Caching.CacheItemRemovedReason.DependencyChanged, Web.Caching.CacheItemRemovedReason.Underused
            ' By leaving off Web.Caching.CacheItemRemovedReason.Removed, this will exclude items that are replaced or removed explicitly (Cache.Remove) '
    End Select
End Sub
Edit: Here it is in C# if you need it:
private static void CacheRemovalCallbackFunction(string cacheKey, object cacheObject, System.Web.Caching.CacheItemRemovedReason removalReason)
{
    switch (removalReason)
    {
        case System.Web.Caching.CacheItemRemovedReason.DependencyChanged:
        case System.Web.Caching.CacheItemRemovedReason.Expired:
        case System.Web.Caching.CacheItemRemovedReason.Underused:
            // This excludes System.Web.Caching.CacheItemRemovedReason.Removed, which is triggered when you overwrite a cache item or remove it explicitly (e.g., HttpRuntime.Cache.Remove(key))
            break;
    }
}
To expand on my previous comment... I get the picture you are thinking about the cache incorrectly. If you have an object stored in the Cache, say a Hashtable, any update/storage into that Hashtable will be persisted without you explicitly modifying the contents of the Cache. You only need to add the Hashtable to the Cache once, either at application startup or on the first request.
If you are worried about the bulk copy and page-request updates happening simultaneously, then I suggest you simply have TWO cached lists. Have one be the list which is updated as page requests come in, and one list for the bulk copy operation. When one bulk copy is finished, swap the lists and repeat. This is similar to double-buffering video RAM for video games or video apps.
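Roughly like this (the SyncLock around the add and the swap, and the names, are just one way to express the double-buffering idea):

Imports System.Collections.Generic

Public Module DoubleBufferSketch

    Private activeList As New List(Of String)()    ' page requests append here
    Private ReadOnly swapLock As New Object()

    ' Called by the HttpHandler on each request.
    Public Sub Enqueue(imageName As String)
        SyncLock swapLock
            activeList.Add(imageName)
        End SyncLock
    End Sub

    ' Called by the background bulk-copy job: swap in a fresh list and return
    ' the old one, so the bulk copy works on a list nobody else is touching.
    Public Function SwapOutForBulkCopy() As List(Of String)
        SyncLock swapLock
            Dim toCopy As List(Of String) = activeList
            activeList = New List(Of String)()
            Return toCopy
        End SyncLock
    End Function

End Module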

Resources