IIS Long operation request and performance issue - asp.net

just wondering
if I have a webpage that generates a PDF, but it could take a while to generate due to a long SQL query and the amount of data to insert and process before generating the PDF.
Suppose the request hasn't finished yet and the user, seeing that nothing has happened, clicks the button again and again and again.
What will happen in my web application and in the database?
Is it going to wait for previous request to be finished before throwing another one?
Does it accept multiple requests per session?
Is my web application going to freeze?
Is my database going to perform multiple SQL requests at the same time from the same user?
Is my sql server going to freeze?
I know I should not leave it like that and make the button unclickable and put a little message "Please wait" but I'm just interested in what would happen in that situation.
Sorry for bad English!
Thanks!

Check out this answer: Problem with IHttpAsyncHandler and ASP.NET "Requests Executing" counter for a technical viewpoint.
You had a number of questions:
If the request hasn't finished yet and the user, seeing that nothing happened, clicks the button again and again and again?
In this case, multiple requests will be queued on your web server, and they will all be processed. This will affect performance. As you mentioned, your UI should prevent this by disabling the button and giving the user some feedback that the request is being processed.
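As a rough sketch of that UI fix (the control name is hypothetical), a common WebForms trick is to disable the button client-side on the first click and then trigger the postback manually, so repeat clicks can't queue duplicate requests:

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // Disable the button and show feedback before the postback fires;
    // UseSubmitBehavior = false makes the injected __doPostBack call
    // the thing that actually submits the form.
    btnGeneratePdf.Attributes["onclick"] =
        "this.disabled = true; this.value = 'Please wait...'; " +
        ClientScript.GetPostBackEventReference(btnGeneratePdf, null) + ";";
    btnGeneratePdf.UseSubmitBehavior = false;
}
```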
What will happen in my web application and in the database? Is it going to wait for previous request to be finished before throwing another one?
No. Unless you are locking on a single resource in your database, each request will be handled by a separate worker in IIS. There are limits to the number of concurrent requests, but generally, things will happen in parallel. If the work you are doing is CPU intensive, there could be some contention for CPU resources and overall performance will suffer. You should definitely look into the async handler model.
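For illustration, here's a minimal sketch of that model using HttpTaskAsyncHandler (.NET 4.5+) rather than raw IHttpAsyncHandler; GeneratePdfAsync is a hypothetical stand-in for the real work:

```csharp
using System.Threading.Tasks;
using System.Web;

// An asynchronous handler releases the thread-pool worker while the
// slow SQL/PDF work is awaited, instead of pinning it for the whole
// request.
public class PdfHandler : HttpTaskAsyncHandler
{
    public override async Task ProcessRequestAsync(HttpContext context)
    {
        byte[] pdf = await GeneratePdfAsync(); // hypothetical long-running work
        context.Response.ContentType = "application/pdf";
        context.Response.BinaryWrite(pdf);
    }

    private static Task<byte[]> GeneratePdfAsync()
    {
        // Placeholder: the real code would run the slow query with an
        // async ADO.NET call (e.g. SqlCommand.ExecuteReaderAsync) and
        // build the PDF from the results.
        return Task.FromResult(new byte[0]);
    }
}
```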
Does it accept multiple requests per session?
Yes
Is my web application going to freeze?
Freeze might be the wrong word. It is possible that the queue for requests will grow large if requests are coming in at a rate faster than the web server can fulfill them. If this happens, it will appear to users that your web server is unresponsive or slow.
Is my database going to perform multiple SQL requests at the same time from the same user?
It might.
Is my sql server going to freeze?
Same as above. If your SQL Server cannot process requests faster than they are coming in, it may appear to your users that your application is unresponsive.

Related

ASP.Net connection keep alive for long running server-side task

I have an ASP.NET C# web application, running on IIS, that I'm supporting which involves generating Word documents. Some of these documents take a very long time (upwards of 20-30 minutes) to generate. What I notice while testing on my dev server is that the server closes the connection long before the process completes; the server-side ASP.NET code itself enters a loop and updates the status of a boolean value when the Word doc generation completes.
My workaround for this is to keep the connection alive by implementing a dynamically animated wait screen (using jQuery and AJAX) on the client side that's updated by a repeated asynchronous AJAX call to the server, which checks on the status of the operation via a server-side web method. I'm asking about that piece in another question.
Is the solution I'm looking at implementing the best approach to this problem? Are there more efficient or common methods for keeping the connection alive during a long running server-side operation? Any help or insight is appreciated, thanks.
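For reference, here's a minimal sketch of the kind of server-side web method my polling approach relies on (all names hypothetical; a real version would track status per job rather than in one static flag):

```csharp
using System.Web.Services;

public partial class GenerateDoc : System.Web.UI.Page
{
    // Set to true by the generation code once the Word document is done.
    // A single static flag only works for one job at a time; a real
    // version would use a dictionary keyed by a job id.
    private static volatile bool _docReady;

    // Called repeatedly from the client via jQuery/AJAX until it
    // returns true.
    [WebMethod]
    public static bool IsDocReady()
    {
        return _docReady;
    }
}
```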
UPDATE:
I tried Brian's suggestion; unfortunately I still get the same error from Chrome that no data is being sent from the server. The entirety of the error is as follows:
No data received. Unable to load the webpage because the server sent no data.
Here are some suggestions: Reload this webpage later.
Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.
I'll try again by setting the connection timeout in the advanced website settings as described and increasing the connection idle setting.
Ideally, you'd use a socket to notify the client when the process completes. Look at socket.io or native WebSocket implementations for how to do this.
You can control idle time within IIS 7. This is done by going to IIS Manager; select Application Pools; then right-click on the pool you're using for your website. Click "Advanced Settings"; here you will be able to control the idle timeout and some other settings for your website. Hope this is what you're looking for.

Why is ViewState not stored on the server by default?

Can someone give me a good reason why ViewState isn't stored on the server by default?
Why not send a small session token in place of ViewState, which can then be mapped to whatever ViewState info is needed on the server, to prevent the whole ViewState being posted back and forth multiple times?
Am I missing something?
Scalability - imagine how much server resource would be needed if a complex WebForms page was viewed by 1M users. The server would need to hold ViewState for at least the duration of the session timeout. Automatic server-side cleanup of viewstate would also be problematic - a user may be viewing several pages at once, so ViewState for all of those pages would need to be retained.
Edit
There are several techniques discussed in these posts on how to move viewstate to the server. However, before you do that, it would be a good idea to remove unnecessary viewstate from controls / pages which don't need it (e.g. view-only pages with no postback rendering).
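For example, a control that is data-bound on every request can have its viewstate switched off entirely (the grid name here is hypothetical):

```csharp
protected void Page_Init(object sender, EventArgs e)
{
    // A read-only grid that is re-bound on every request doesn't need
    // its state round-tripped in the hidden __VIEWSTATE field.
    ResultsGrid.EnableViewState = false;
}
```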
I'm guessing now, but when ViewState was designed 10 years or so ago, 1GB of RAM on a 32-bit server was about as good as it got, and MS presumably had to think of hosting providers wanting to load hundreds of apps per server. So bandwidth was probably viewed as cheaper than server RAM and disk storage.
There are a number of issues with storing the ViewState in memory.
If the application recycles, the VS for anyone using the application is lost.
It increases the memory consumption of the application. This isn't an issue if there are only a few apps hosted on a server; but there are cases when there could be many websites hosted on one box.
Scalability; the more active the application, the more VS needs to be stored. And you can't assume 1-1 (1 user - 1 VS). A user can have multiple tabs open, can go back, leave tabs inactive, etc... which leads to:
How long do you store VS? Keeping the data encoded on the page ensures that it'll still be there if the user leaves the site open for a while.
What happens if you're hosted on a web farm? We can't guarantee that the user will hit the same machine on each request.
That being said, there are a few solutions:
Memcached-Viewstate - stores the VS in distributed memory using Memcached. This isn't ideal - if a server goes down, the VS for anyone whose VS was stored on that server is lost - but it allows application pools to reset without issue.
SQL-Viewstate - stores the VS in a SQL database. This adds at least 1 DB read and 1 DB write per request. Again, not ideal, but if the VS is getting unmanageable, getting and setting the VS from the database is faster than sending and receiving it over HTTP.
Filesystem-Viewstate - stores the VS in the filesystem. It's less expensive than the SQL connection but would require a file server to work in a distributed environment.
It improves scalability because the server doesn't need to maintain all of that in memory. It is possible to store the viewstate in session but it's generally not recommended.
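For completeness, ASP.NET 2.0+ has a built-in hook for doing exactly that; a minimal sketch for a single page:

```csharp
// In the page's code-behind: overriding this property makes ASP.NET
// keep the page's viewstate in session state via the built-in
// SessionPageStatePersister, emitting only a small hidden token into
// the HTML instead of the full state.
protected override PageStatePersister PageStatePersister
{
    get { return new SessionPageStatePersister(this); }
}
```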
The root reason for using client-side view state is that the server doesn't know the current state of the page.
If a user is anxious and performs multiple (partial) postbacks on the page without waiting for the response, the browser will send out multiple partial postback requests. Each request creates a new view state on the server side, which will eventually flush out the initial view state that the page in the browser still references. When the user finally does his last postback, the initial copy is gone, and an exception is thrown.
Server-side view state also impacts server performance and user experience. If a user doesn't interact with the page for a day or some other long time, the view state on the server will expire. When the user posts back the page later, an exception is thrown.
For instance, say I watch a YouTube video 40 minutes in length. Yesterday I watched the first half, didn't close the tab, but hibernated my computer. Today I continue watching the second half and post something back; the page will error out if the view state is on the server and has expired.

asp.net infinite loop - can this be done?

This question is about limits imposed to me by ASP.NET (like script timeout etc').
I have a service running under ASP.NET and I want to create a counterpart service for monitoring.
The main service's data is located at a database.
I was thinking about having the monitor service query the database at intervals of 1 second, within a loop, triggered by an HTTP request from the remote client.
Now, the actual serving of this monitoring will be done by a client HTTP request, which will make the script loop (written in C#). When new data is detected, it'll aggregate that data into that one looping request's output buffer, send it, and exit the loop, thus finishing the request.
The client will have to issue a new request in order to keep getting updates.
This is actually exactly like TCP (precisely like Windows IOCP); you request data from the service and wait for it. When it arrives, you fire another request.
My actual question is: Have you done it before? How did it go? Am I limited by some (configurable) limits imposed by the IIS/ASP.NET framework? What are my limits in such situation, or, what are better options without complicating things too much?
Note that I do not expect many such monitoring requests at a time, maybe a few dozens.
This means, however, that 10 such concurrent monitoring requests will keep 10 threads busy, and the question is: can it hurt IIS/performance? How will IIS handle 10 busy threads? Will it issue more? What are the limits? This is just one example of a limit I can think of.
I think your main concern in this situation would be timeouts, which are pretty much configurable. But I think it is the wrong solution - you'd be better off with some background service, running constantly/periodically, that writes the monitoring data to some data store; your monitoring page would then just return it upon request.
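On the timeout point, a sketch of how the execution limit can be raised per request from code-behind (the site-wide equivalent is the executionTimeout attribute on httpRuntime in web.config):

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // Value is in seconds; note it is only enforced when
    // compilation debug="false" in web.config.
    Server.ScriptTimeout = 3600; // allow this request to run up to an hour
}
```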
If you want your page to display something only if the monitoring data is available, implement it with AJAX: on page load, query the monitoring service; then, if some monitoring events are available, render them; if not, sleep and query again.
IMO this would be a much better solution than really long-running requests.
I think it won't be a very good idea to monitor a service using ASP.NET due to the following reasons...
What happens when your application pool crashes?
What if you decide to do IISReset? Which application will come up first... the main app, or the monitoring app?
What if the monitoring application hangs due to load?
What if the load is already high on the Main Service? Wouldn't monitoring it every 1 sec increase the load on the Primary Service, as well as IIS?
You get the idea...

Keeping my web app running after Browser close

I have an ASPX web application that updates or adds files in a database. The clients access it through the browser, and one of the requirements is that they can start the update and be able to close the browser while the update continues. It appears to run for a little bit after I close the browser, but then it stops. How can you keep the application running in ASP.NET?
That's something you could very well solve with WF (Workflow Foundation). Create a workflow for the task that should survive closing the browser. Workflows have their own threads and lifecycles, separate from ASP.NET.
The web application will keep running in the application pool, but this will be recycled eventually. As long as the user's session runs, the application should be kept alive, so upping the session timeout may fix the problem.
A better approach though would be to move the long-running task into a service instead, but that may require a rewrite of your application.
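(On the session-timeout suggestion above, a quick sketch: the value can be raised either via the sessionState element in web.config or in code, in minutes.)

```csharp
// In Global.asax:
protected void Session_Start(object sender, EventArgs e)
{
    // Value is in minutes (the default is 20).
    Session.Timeout = 120;
}
```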
Usually for long-running or asynchronous processing, you want to dispatch the request to a back-end service to handle. Trying to keep the web-app alive to finish processing can lead to problems, especially with HTTP and session timeouts.
A common pattern for this is to put the request on a message queue and let a back-end service process it when it can.
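A minimal sketch of that hand-off using MSMQ via System.Messaging (the queue path and job shape are assumptions for illustration):

```csharp
using System.Messaging;

public static class JobDispatcher
{
    private const string QueuePath = @".\Private$\updateJobs"; // hypothetical

    public static void Enqueue(object job)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            // Serialized with the default XmlMessageFormatter; the
            // back-end service reads and processes it at its own pace.
            queue.Send(job);
        }
    }
}
```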
I would create a separate Windows service that you can push jobs onto from your web application, then check the status of the job(s) when the user logs in again.
The Windows service won't be tied to the ASP.NET app domain, so it will continue to run regardless of what's happening in your web application.
I've run into this pattern, and you have to decouple the work from the HTTP request. The way we've solved it is to abstract the computation to be done as an event to be scheduled. So, say a user at a browser takes an action that requires a (relatively) long-lived computation on the back end; this computation is given a name like 'doXYZForUser' and a parameter vector like (userId, params...) and sent off to the work queue. Some time in the future the user logs in again and can see what the status of their job is.
I'm running a Java stack and a Java Message Service (JMS), but the principle is the same. The request from the browser queues up an event, and the browser gets an ACK back saying the event is on the work queue. The queue is managed by an entirely separately running process, which in .NET I believe is just called Message Queuing. The job comes up on the queue, gets processed, and the results can be placed in a separate table containing a reference to the user that kicked off the job, so the next time they log in the job status/results can be returned.

browser timeouts while asp.net application keeps running

I'm encountering a situation where it takes a long time for ASP.NET to generate a reply with the web page (more than 2 hours). It's due to the code-behind running for a while (a very long, slow loop).
The browser (both IE & Firefox) stops waiting for the reply (after about an hour) and gives a generic cannot-display-webpage error (similar to what you would see if you tried to navigate to a non-existent server).
At the same time the ASP.NET app keeps going (I can see it in the debugger) and eventually completes.
Why does this happen? Are there any settings in web.config to influence this? I'm hoping there's a timeout setting that I'm missing that's causing this.
Maybe a setting in IE or Firefox? But I think they wait as long as the server keeps the connection alive.
I'm experiencing this even when I launch the app in debug mode (with compilation debug="true") on my local machine from VS (so it's not running on IIS, but on the ASP.NET Development Server).
I know it's bad that it takes so long to generate the page, but it doesn't matter at this stage. Speeding it up would take a lot of extra work and the delay doesn't really matter. This is used internally.
I realize I can redesign around this issue running logic to a background process and getting notified when it's done through AJAX, or pull it to a desktop app or service or whatever. Something along those lines will be done eventually, but that's not what I'm asking about right now.
Sounds like you're using IE and it is timing out while waiting for a response from the server.
You can find a Microsoft Knowledge Base article on how to adjust this limit:
http://support.microsoft.com/kb/181050
CAUSE
By design, Internet Explorer imposes a time-out limit for the server to return data. The time-out limit is five minutes for versions 4.0 and 4.01 and is 60 minutes for versions 5.x, 6, and 7. As a result, Internet Explorer does not wait endlessly for the server to come back with data when the server has a problem.
RESOLUTION
In general, if a page does not return within a few minutes, many users perceive that a problem has occurred and stop the process. Therefore, design your server processes to return data within 5 minutes so that users do not have to wait for an extensive period of time.
The entire paradigm of the Web is of request/response. Not request, wait two hours, response!
If the work takes so long to do, then have the page request trigger the work, and then not wait for it. Put the long-running code into a Windows service, and have the service listen to an MSMQ queue (or use WCF with an MSMQ endpoint). Have the page send requests for work to this queue. The service will read a request, maybe start up a new thread to process it, then write a response to another queue, file, or whatever.
The same page, or a different, "progress" page can poll the response queue or file for responses, and update the user, assuming the user still cares after two hours.
For something that takes this long, I would figure out a way to kick it off via AJAX and then periodically check on its status. The background process should update some status variable on a regular basis and store its data in the cache or session when complete. When it completes and the browser detects this (via AJAX), have the browser do a real postback (or a GET by changing location.href), pick up the saved data, and generate the page.
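A rough sketch of that kick-off-and-poll shape using page methods and the cache (all names hypothetical; a dedicated service would be more robust, since a thread-pool thread dies with the app domain):

```csharp
using System;
using System.Threading;
using System.Web;
using System.Web.Services;

public partial class Report : System.Web.UI.Page
{
    // Called once from the browser via AJAX to start the work.
    [WebMethod]
    public static string StartJob()
    {
        string jobId = Guid.NewGuid().ToString("N");
        HttpRuntime.Cache[jobId] = "running";

        ThreadPool.QueueUserWorkItem(_ =>
        {
            object result = RunSlowLoop();   // the existing long computation
            HttpRuntime.Cache[jobId] = result;
        });

        return jobId;
    }

    // Polled periodically from the browser; returns "running" until
    // the result shows up in the cache.
    [WebMethod]
    public static object CheckJob(string jobId)
    {
        return HttpRuntime.Cache[jobId];
    }

    private static object RunSlowLoop()
    {
        return "done"; // placeholder for the real slow loop
    }
}
```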
I have a process that can take a few minutes, so I spin off a separate thread and send the result via FTP. If an error occurs in the process, I send myself an error message including the stack trace. You may want to consider sending the results via email or some destination other than the browser, and using a thread as well.
