Detect session hang and kill it - asp.net

I have an asp.net page that runs a certain algorithm and returns its output. I was wondering what will happen, and how to handle the case, where the algorithm goes into an infinite loop due to a bug. It will hog the CPU and other sessions will be served very slowly.
I would love to have a way to tell IIS: if processing Algo.aspx takes more than 5 seconds, kill it, or something like that.
Thanks in advance

There is no such thing in IIS. What you can do instead is perform the work on a background thread, measure the time this background task takes to complete, and simply kill the thread if the wait is longer than expected.
You may take a look at the WaitHandle.WaitOne method, which allows you to specify a timeout when waiting for a particular event to be signaled from a background thread, for example.
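For what it's worth, here is a minimal sketch of that pattern, written in C++ purely to illustrate the idea (runAlgorithm and the 5-second budget are made-up stand-ins; in ASP.NET the timed wait would be WaitHandle.WaitOne as described above): run the work on a background task that checks a cancellation flag, wait up to the time budget, and request cancellation if the deadline passes.

```cpp
#include <atomic>
#include <chrono>
#include <future>
#include <iostream>
#include <string>
#include <thread>

// Made-up stand-in for the Algo.aspx work. To be stoppable, the algorithm has
// to check the cancellation flag periodically; a loop that never checks
// anything cannot be interrupted safely.
std::string runAlgorithm(const std::atomic<bool> &cancelRequested)
{
    while (!cancelRequested.load()) {
        // ... one iteration of the real work; return the result when done ...
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }
    return "cancelled";
}

int main()
{
    std::atomic<bool> cancelRequested{false};

    // Start the work on a background task.
    auto result = std::async(std::launch::async, runAlgorithm, std::cref(cancelRequested));

    // Wait at most 5 seconds for it to finish.
    if (result.wait_for(std::chrono::seconds(5)) == std::future_status::timeout) {
        cancelRequested = true;   // cooperative cancellation, not a hard thread kill
        std::cout << "Algorithm exceeded 5 s; cancellation requested\n";
    }

    std::cout << "Result: " << result.get() << "\n";
}
```

The same shape applies in ASP.NET: queue the algorithm on a worker thread, wait on a handle with a 5-second timeout, and abandon or cancel the work if the wait times out.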

Set the ScriptTimeout property. It will abort the page if the time is exceeded. It only works when debugging is disabled (debug="false" in web.config), though.

Related

Is QTimer smart enough to resynchronize itself

Let's say we start a QTimer with a 100ms interval at t0.
Let's say the first timeout occurs at t0+100ms. Fine.
Let's say that, due to huge CPU load and/or lots of events having to be handled by the event loop, the second timeout occurs at t0+230ms.
Let's say CPU is back to normal load. Is there any chance that the third timeout could occur at t0+300ms (the QTimer object realising it was late and trying to correct that by resynchronizing itself), or will it most likely time out at t0+330ms?
Per the QTimer documentation:
All timer types may time out later than expected if the system is busy or unable to provide the requested accuracy. In such a case of timeout overrun, Qt will emit timeout() only once, even if multiple timeouts have expired, and then will resume the original interval.
I'm not sure I understand this correctly but, apparently, it won't resynchronize itself and the third timeout will occur at t0+330ms.
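One way to see this for yourself is a small test like the one below (not from the question; the 100 ms interval and the deliberate 130 ms block are arbitrary). It logs the elapsed time at each timeout while a blocking sleep simulates a busy event loop for one interval; if QTimer does not resynchronize, the timeouts after the late one should keep roughly 100 ms spacing from it rather than snapping back to the original t0+300 grid.

```cpp
#include <QtCore>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QElapsedTimer clock;
    clock.start();

    QTimer timer;
    timer.setInterval(100);

    int count = 0;
    QObject::connect(&timer, &QTimer::timeout, [&] {
        qDebug() << "timeout" << ++count << "at" << clock.elapsed() << "ms";
        if (count == 1)
            QThread::msleep(130);   // simulate a busy event loop for one interval
        if (count == 6)
            app.quit();
    });

    timer.start();
    return app.exec();
}
```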

Asp.net PreRenderComplete taking a long time to finish

I'm having an issue with occasional slow performance on button click events on a particular page. There are times when it performs well within normal parameters, but whenever the server is under even moderate load (meaning I can reproduce this issue in our production environment but not in our dev or test environments), it seems to just hang. After enabling tracing, I see that it hangs between Begin PreRenderComplete and End PreRenderComplete; it just sits there for close to 30 seconds. I don't have any specific code that executes in that event space.
My understanding was that this event is supposed to be a non-event in the life cycle, since it is just there to make sure the PreRender phase finished. This page has a large number of controls and as such has a sizable view state, but my understanding is that view state is handled in the LoadState and SaveState events, which don't seem to be the phases eating all of my time.
I've run perfmon against the server, and at times when I am able to reproduce this behavior, system resources are normal and requests aren't queuing. I'm trying to understand what actions might be taking place behind the scenes that cause this slowness.
Are there any asynchronous actions on the page? I think that async calls will complete prior to that event, so if some of them are taking a while, such as a slow database or slow network call, that might cause the delay you're seeing.
I think I've found the problem. Through more in-depth profiling, it seems that my ScriptManager control was causing the delay while trying to combine scripts. Apparently this only presented a problem under load, and most of the script combining takes place in the PreRenderComplete event. Setting the CombineScripts="false" attribute seems to have cleared this issue.

endInterruption not being called every time

I am using openCL to play sounds, and I have noticed that the sounds stop working after a call comes in and I press Decline.
I was able to trace it to endInterruption not being called.
The problem is that this happens only about once out of every 5 times I repeat the steps.
That suggests my code is OK, because most of the time endInterruption does get called, but every now and then iOS decides not to call it and I have no idea why.
Check if you are calling the "play" functions from a background thread instead of the main thread. If sometimes you initiate the play/playAtTime call from a background thread, you will not receive the endInterruption callback.
Interestingly though, the system does call beginInterruption even when the play call was initiated from a background thread.
Hope this helps.

QTimer firing issue in QGIS (Quantum GIS)

I have been involved in building a custom QGIS application in which live data is to be shown in the application's viewer.
The IPC being used is Unix message queues.
The data is to be refreshed at a specified interval, say 3 seconds.
The problem I am facing is that processing the data to be shown takes more than 3 seconds, so what I have done is stop the refresh QTimer before the app starts to process data for the next update, and restart it again after the data has been processed. The app should work in such a way that after an update/refresh (during which the app goes unresponsive) the user gets ample time to continue working with the app, apart from seeing the data being updated. I am able to get acceptable pauses for the user to work with -- in one scenario.
But on a different OS (RHEL 5.0 vs. RHEL 5.2) the situation is something different. The timer goes wild and continues to fire without giving any pauses between successive updates, effectively going into an infinite loop. Handling this update data definitely takes longer than 3 seconds, but for that very reason I have stopped and restarted the timer while processing, and the same logic works in one scenario while in the other it doesn't. The other thing I have observed is that when this rapid firing of the timer happens, the time taken by the refresh function to exit is very small, about 300 ms, so the stop and restart of the timer that I have placed at the start and end of this function happens within that small time. So before the actual processing of the data finishes, there are 3-4 starts of the timer queued up waiting to be executed, and the infinite looping gets worse from that point on with every successive update.
The important thing to note here is that for the same code and the same amount of data, on one OS the refresh time is around 4000 ms (the actual processing time), while on the other OS it is 300 ms.
Maybe this has something to do with newer libs on the updated OS, but I don't know how to debug it because I am not able to get any clues as to why it is happening. Maybe something related to pthreads has changed between the OSs?
So, my query is: is there any way to ensure that some processing in my app runs on a timer (independent of the OS) without using QTimer, as I think QTimer is not a good option to achieve what I want?
What options are there? pthreads or Boost threads? Which one would be better if I am to use threads as an alternative? And how can I make sure there is at least a 3 second gap between successive updates, no matter how long the update processing takes?
Kindly help.
Thanks.
If I was trying to get an acceptable, longer-term solution, I would investigate updating your display in a separate thread. In that thread, you could paint the display to an image, updating as often as you desire... although you might want to throttle the thread so it doesn't take all of the processing time available. Then in the UI thread, you could read that image and draw it to screen. That could improve your responsiveness to panning, since you could be displaying different parts of the image. You could update the image every 3 seconds based on a timer (just redraw from the source), or you could have the other thread emit a signal whenever the new data is completely refreshed.
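As a rough illustration of that idea, here is a sketch (class and slot names such as DataWorker, processData and resultReady are hypothetical, and the QImage painting is just a placeholder for the real refresh). The heavy processing lives in a worker thread and only a finished image crosses back to the GUI thread through a queued signal; a single-shot timer is restarted only after each update has been displayed, which also guarantees at least 3 seconds between successive updates no matter how long the processing takes.

```cpp
// main.cpp -- assumes a qmake/CMake build with automoc enabled
#include <QtWidgets>

class DataWorker : public QObject
{
    Q_OBJECT
public slots:
    void processData()
    {
        QImage image(800, 600, QImage::Format_RGB32);
        image.fill(Qt::black);
        // ... read the message queue and paint the fresh data onto 'image' ...
        emit resultReady(image);
    }
signals:
    void resultReady(const QImage &image);
};

class Viewer : public QLabel
{
    Q_OBJECT
public:
    Viewer()
    {
        setMinimumSize(800, 600);

        worker = new DataWorker;
        worker->moveToThread(&workerThread);
        connect(&workerThread, &QThread::finished, worker, &QObject::deleteLater);
        connect(this, &Viewer::refreshRequested, worker, &DataWorker::processData);
        connect(worker, &DataWorker::resultReady, this, &Viewer::showResult);
        workerThread.start();

        refreshTimer.setSingleShot(true);   // restarted manually after each update
        connect(&refreshTimer, &QTimer::timeout, this, &Viewer::refreshRequested);
        refreshTimer.start(3000);
    }
    ~Viewer() override { workerThread.quit(); workerThread.wait(); }

signals:
    void refreshRequested();

private slots:
    void showResult(const QImage &image)
    {
        setPixmap(QPixmap::fromImage(image));   // cheap: just draw the finished image
        refreshTimer.start(3000);               // next refresh 3 s after this one finished
    }

private:
    QThread workerThread;
    DataWorker *worker = nullptr;
    QTimer refreshTimer;
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    Viewer viewer;
    viewer.show();
    return app.exec();
}

#include "main.moc"
```

Because the GUI thread never does the heavy work, the application stays responsive during a refresh, and stopping and restarting the timer from inside the refresh function is no longer needed.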

Can I use QWaitCondition.wait() in a slot called by the main thread?

If the maximum wait time is 10 ms, can I use QWaitCondition in Qt's main thread?
Nothing stops you from using QWaitCondition in the main thread. If you set the wait time to 10 ms and it expires without the condition being woken, you will probably not get the effect you want. The default is to wait indefinitely.
However, using a wait condition in the main thread will cause the GUI to become unresponsive while it waits. This is almost always undesired.
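For reference, a minimal, self-contained sketch of what that looks like (the dataReady flag and the 5 ms producer delay are made up; Qt 5.10+ assumed for QThread::create). Note that wait() blocks whichever thread calls it, which is exactly why calling it from a slot running on the main thread freezes the GUI for up to the timeout on every call.

```cpp
#include <QtCore>

QMutex mutex;
QWaitCondition condition;
bool dataReady = false;

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    // Background thread signals the condition after ~5 ms.
    QThread *producer = QThread::create([] {
        QThread::msleep(5);
        QMutexLocker locker(&mutex);
        dataReady = true;
        condition.wakeOne();
    });
    producer->start();

    // Calling thread (here the main thread) waits at most 10 ms for the data.
    {
        QMutexLocker locker(&mutex);
        while (!dataReady) {
            // wait() returns false if the 10 ms timeout expired without a wake-up.
            if (!condition.wait(&mutex, 10)) {
                qDebug() << "Timed out after 10 ms; data not ready";
                break;
            }
        }
        if (dataReady)
            qDebug() << "Data ready within 10 ms";
    }

    producer->wait();
    delete producer;
    return 0;
}
```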
