IIS (or perhaps ASP.NET) takes longer to respond to requests when they are sent simultaneously with other requests. For example, if a web page sends request A along with 20 other requests, it takes 500 ms, but when the same request is sent alone it takes 400 ms.
Is there a name for this behavior? Is it in IIS or ASP.NET? Can I disable or change it? Are there any benefits to it?
Notes:
I am seeing this issue on an ASP.NET Web API application.
I have checked the IIS settings (IIS 8.5 on Windows Server 2012 R2) and found nothing that limits its throughput. All constraints such as bandwidth and CPU throttling are set to high values. The server also has good hardware.
Update 1:
All requests read something from the database. I have checked them in Chrome's developer console. I also created a simple C# application that makes multiple parallel requests to the server. When the requests are truly parallel they take a long time, but when I add a wait between each call, the response time decreases dramatically.
Update 2:
I have a simple method in my application that just returns Ok:
[AllowAnonymous]
public IHttpActionResult CheckOnline()
{
    return Ok();
}
The same behavior exists here. In my custom C# tester, if I call this route multiple times simultaneously it takes more than 1000 ms to complete, but when I wait 5 seconds between each call, the response time drops below 20 ms.
This method is not IO- or CPU-bound. It seems that IIS detects that these requests come from a single specific user/client and so does not give them much attention.
If you use ASP.NET Session in your application, requests are queued and processed one by one, so the last request can be held in the queue while the previous requests are being processed.
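If session state really is the cause (note that Web API does not use session by default, but classic ASP.NET MVC and WebForms do), the per-session lock can be relaxed by marking the controller's session behaviour as read-only or disabled. A minimal, hypothetical MVC sketch (the controller name is illustrative, not from the question):

using System.Net;
using System.Web.Mvc;
using System.Web.SessionState;

// Requests to a controller marked ReadOnly (or Disabled) are not serialized by
// the session lock, so parallel requests from the same client can run concurrently.
[SessionState(SessionStateBehavior.ReadOnly)]
public class StatusController : Controller
{
    public ActionResult CheckOnline()
    {
        return new HttpStatusCodeResult(HttpStatusCode.OK); // no session writes here
    }
}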
Another possible reason is that all threads in the ASP.NET thread pool are busy. In that case a new thread must be created to process the new request, which takes additional time.
This is just a theory (my best guess); other causes are possible.
Related
Task:
I'm using static classes, so everyone shares data that has already been loaded. If someone makes a change, that request puts an item in a list with an incremental ID, and my idea is that every client keeps its own version on the client side and asks whether there is any change.
My solution: I use a $.post with a timeout of 5000 ms, sending the client's version. On the server side I have a 500-cycle for loop that checks whether there is anything newer and, if so, breaks out of the loop and returns the changes; it has a 10 ms Thread.Sleep in every cycle so it doesn't hog the CPU. If the post times out or errors on the client, I call it again; if it succeeds I process the returned data and then call the post again. This way I should always get the changes almost instantly without an overwhelming number of requests, and if something fails I only need to wait 5 seconds for it to resume.
My problem is that while this loop runs, other requests aren't handled. With the ASP.NET development server that's expected, because it's single-threaded, but the same thing happens with IIS 7.5 on Windows 7 Home Premium.
What I tried: setting MaxConcurrentRequestsPerCPU in the registry (HKLM\SOFTWARE\Microsoft\ASP.NET\4.0.30319.0\MaxConcurrentRequestsPerCPU), increasing the worker threads for the application pool, and updating the aspnet.config file with maxConcurrentRequestsPerCPU="12" maxConcurrentThreadsPerCPU="0" requestQueueLimit="5000". I also read that Windows 7 Home Premium should be able to use 3 threads. I also wondered whether it was an optimization, that using the same variables in one request causes the others to be queued, so I commented out those lines and left only the for loop with the sleep, but the result was the same.
Don't use Thread.Sleep on threads handling requests in ASP.NET. It essentially consumes a thread and prevents more requests from being started. There is a restriction on the number of threads ASP.NET will create to handle requests, which you have already tried to change, but a high number of threads makes the process less responsive and can easily cause an OutOfMemoryException for 32-bit processes, so it is not a good route.
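One way to keep the long poll without pinning a request thread is to await instead of sleep. This is only a sketch, assuming Web API with async support is available (on MVC 3/.NET 4 an AsyncController plus TaskCompletionSource is the rough equivalent); the controller and the change store are hypothetical stand-ins for the question's incremental-ID change list:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Web.Http;

public class ChangesController : ApiController
{
    // Hypothetical async long poll: checks for changes roughly every 100 ms for
    // up to ~5 s, awaiting instead of sleeping so no request thread is blocked.
    [HttpPost]
    public async Task<IHttpActionResult> Poll(int clientVersion)
    {
        var deadline = DateTime.UtcNow.AddSeconds(5);
        while (DateTime.UtcNow < deadline)
        {
            List<string> changes = GetChangesSince(clientVersion);
            if (changes.Count > 0)
                return Ok(changes);            // something new: answer immediately
            await Task.Delay(100);             // frees the thread while waiting
        }
        return Ok(new List<string>());         // nothing new; the client re-issues the poll
    }

    // Placeholder for the question's incremental-ID change list.
    private static List<string> GetChangesSince(int clientVersion)
    {
        return new List<string>();
    }
}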
There are several other threads discussing implementing long-poll requests with ASP.NET, like this one - Can ASP.NET MVC's AsyncController be used to service large number of concurrent hanging requests (long poll)? - and obviously Comet questions like this - Comet implementation for ASP.NET?.
This question is about the limits imposed on me by ASP.NET (like script timeout, etc.).
I have a service running under ASP.NET and I want to create a counterpart service for monitoring.
The main service's data is located at a database.
I was thinking of having the monitor service query the database at 1-second intervals, inside a loop driven by an HTTP request issued by the remote client.
The actual serving of this monitoring will be done by a client HTTP request, which will make the script (written in C#) loop; when new data is detected, it aggregates that data into that looping request's output buffer, sends it, and exits the loop, thus finishing the request.
The client will have to issue a new request in order to keep getting updates.
This is essentially like TCP (or more precisely, like Windows IOCP): you request data from the service and wait for it, and when it arrives you fire another request.
My actual question is: have you done this before? How did it go? Am I limited by some (configurable) limits imposed by the IIS/ASP.NET framework? What are my limits in such a situation, or what are better options that don't complicate things too much?
Note that I do not expect many such monitoring requests at a time, maybe a few dozen.
This means, however, that 10 such concurrent monitoring requests will keep 10 threads busy, and the question is: can this hurt IIS or performance? How will IIS handle 10 busy threads? Will it spawn more? What are the limits? This is just one example of a limit I can think of.
I think your main concern in this situation would be timeouts, which are pretty much configurable. But I think it is the wrong solution - you'd be better off with a background service, running constantly or periodically, that writes the monitoring data to some data store; your monitoring page would then just return it upon request.
If you want your page to display something only when monitoring data is available, implement it with AJAX: on page load, query the monitoring service; if monitoring events are available, render them; if not, sleep and query again.
IMO this would be a much better solution than really long-running requests.
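A rough sketch of that shape, assuming the collector can run on a timer and cache the latest events for the page to return immediately (all names are illustrative; ideally the collector would be a separate Windows service rather than code inside the web application):

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

// Hypothetical background collector: polls the database once per second on a
// timer and caches new events; the monitoring page just drains the cache and
// returns immediately instead of holding a request open.
public static class MonitorCache
{
    private static readonly ConcurrentQueue<string> Events = new ConcurrentQueue<string>();
    private static Timer _timer;

    public static void Start()
    {
        _timer = new Timer(_ =>
        {
            foreach (string evt in QueryDatabaseForNewEvents()) // hypothetical DB call
                Events.Enqueue(evt);
        }, null, TimeSpan.Zero, TimeSpan.FromSeconds(1));
    }

    // Called by the monitoring page/handler on each AJAX request.
    public static List<string> Drain()
    {
        var result = new List<string>();
        string item;
        while (Events.TryDequeue(out item))
            result.Add(item);
        return result;
    }

    private static IEnumerable<string> QueryDatabaseForNewEvents()
    {
        yield break; // placeholder; a real implementation would query the database
    }
}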
I don't think it's a very good idea to monitor a service using ASP.NET, for the following reasons...
What happens when your application pool crashes?
What if you decide to do IISReset? Which application will come up first... the main app, or the monitoring app?
What if the monitoring application hangs due to load?
What if the load on the main service is already high? Wouldn't monitoring it every second increase the load on the primary service, as well as on IIS?
You get the idea...
Under Windows Server 2008 64-bit, IIS 7.0 and .NET 4.0, an ASP.NET application (using the ASP.NET thread pool with synchronous request processing) is long-running (> 30 minutes). The web application has no pages; its main purpose is reading huge files (> 1 GB) in chunks (~5 MB) and transferring them to clients. Code:
// buffer is the ~5 MB chunk buffer; fileStream is the source file opened earlier
int bytesRead;
while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
{
    Response.OutputStream.Write(buffer, 0, bytesRead); // write only the bytes actually read
    Response.Flush();
}
A single-producer/single-consumer pattern is implemented, so for each request there are two threads. I don't use the Task library here, but please let me know if it has an advantage over traditional thread creation in this scenario (see the sketch below). An HTTP handler (.ashx) is used instead of an (.aspx) page. Under stress testing, CPU utilization is not a problem, but with a single worker process, after 210 concurrent clients new connections encounter timeouts. This is solved by web gardening, since I don't use session state. I'm not sure whether there's a big issue I've missed, but please let me know what other considerations you think should be taken into account.
For example, maybe IIS closes long-running TCP connections due to a "connection timeout", since normal ASP.NET pages are processed in less than 5 minutes, so I should increase that value.
I appreciate your ideas.
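On the Task-library question, here is how the same single-producer/single-consumer hand-off could look with a BlockingCollection and a long-running Task. This is only a sketch under assumed names (the class, file path and chunk size are placeholders, not taken from the question):

using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;
using System.Web;

public static class ChunkedTransfer
{
    // Producer reads ~5 MB chunks on its own long-running task; the consumer
    // (the request thread) writes each chunk to the response as it arrives.
    public static void TransferFile(HttpContext context, string path)
    {
        var chunks = new BlockingCollection<byte[]>(boundedCapacity: 4); // bounds memory use

        var producer = Task.Factory.StartNew(() =>
        {
            using (var fs = File.OpenRead(path))
            {
                var buffer = new byte[5 * 1024 * 1024];
                int read;
                while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                {
                    var chunk = new byte[read];
                    System.Buffer.BlockCopy(buffer, 0, chunk, 0, read);
                    chunks.Add(chunk);
                }
            }
            chunks.CompleteAdding(); // tells the consumer no more chunks are coming
        }, TaskCreationOptions.LongRunning);

        foreach (var chunk in chunks.GetConsumingEnumerable())
        {
            context.Response.OutputStream.Write(chunk, 0, chunk.Length);
            context.Response.Flush();
        }
        producer.Wait(); // surface any producer exception
    }
}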
Personally, I would look at a different mechanism for this type of processing. HTTP requests/web applications are NOT designed for this type of thing, and stability is going to be VERY hard; you have a number of risks that could cause you major issues with this type of model.
I would move that processing off to a backend process, so that you are OUTSIDE of the ASP.NET runtime; that way you have more control over start/shutdown, etc.
First, Never. NEVER. NEVER! do any processing that takes more than a few seconds in a thread pool thread. There are a limited number of them, and they're used by the system for many things. This is asking for trouble.
Second, while the handler is a good idea, you're a little vague about what you mean by "generate on the fly". Do you mean you are encrypting a file on the fly and this encryption can take 30 minutes? Or that you're pulling data from a database and assembling a file? Or that the download itself takes 30 minutes?
Edit:
As I said, don't use the thread pool for anything long-running. Create your own thread, or if you're using .NET 4, use a Task and specify it as long-running.
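For reference, "specify it as long-running" on .NET 4 is just a TaskCreationOptions flag; a minimal sketch (GenerateLargeFile is a hypothetical stand-in for the actual work):

using System.Threading;
using System.Threading.Tasks;

static class LongRunningExample
{
    // A Task marked LongRunning gets its own dedicated thread instead of a
    // thread-pool thread, which is the point of the advice above.
    public static Task StartGeneration()
    {
        return Task.Factory.StartNew(
            () => GenerateLargeFile(),       // hypothetical long-running work
            CancellationToken.None,
            TaskCreationOptions.LongRunning,
            TaskScheduler.Default);
    }

    private static void GenerateLargeFile()
    {
        // placeholder for the actual generation/encryption work
    }
}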
Long running processes should not be implemented this way. Pass this off to a service that you set up.
If you do want to have a page hang for a client, consider interfacing from AJAX to something that does not block on IO threads - like node.js.
Push notifications to many clients is not something ASP.NET can handle well due to thread usage, hence the node.js suggestion. If your load is low, you have other options.
Use web gardening for more stability in your application.
Turn off caching, since you don't have .aspx pages.
It's hard to advise further without performance analysis. Use the Visual Studio built-in profiler and find the bottlenecks.
The Web 1.0 way of dealing with long-running processes is to spawn them off on the server and return immediately. Have the spawned-off service update a database with its progress, and pages on the site can query for that progress.
The most common use of this technique is package-delivery tracking. The site can't hold an HTTP connection open until your package shows up, so it just gives you a way to query for progress. The background process handles orchestrating all of the steps it takes to get the item, wrap it up, get it onto a UPS truck, etc. All along the way, each step is recorded in the database. Conceptually, it's the same.
Edit based on question edit: just return a result page immediately and generate the binary on the server in a spawned thread or process. Use Ajax to check whether the file is ready, and when it is, provide a link to it.
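A minimal sketch of the "check whether the file is ready" endpoint, assuming the background job records its state in a shared store (a ConcurrentDictionary here only for brevity; a database in a real deployment, and all names are illustrative):

using System.Collections.Concurrent;
using System.Web;

// Hypothetical progress endpoint: the spawned background job writes its state
// under a job ID, and the page's Ajax call simply reads that state back.
public class ProgressHandler : IHttpHandler
{
    // jobId -> "working", or the download URL once the file is ready
    public static readonly ConcurrentDictionary<string, string> Jobs =
        new ConcurrentDictionary<string, string>();

    public void ProcessRequest(HttpContext context)
    {
        string jobId = context.Request.QueryString["jobId"];
        string status;
        context.Response.ContentType = "text/plain";
        context.Response.Write(Jobs.TryGetValue(jobId, out status) ? status : "unknown");
    }

    public bool IsReusable
    {
        get { return true; }
    }
}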
I'm trying to get a better handle on how threads work in ASP.NET, so I have a test site with a few pages, and I have a test WinForms client that creates 40 roughly concurrent requests to the test site. The requests take about 5-10 seconds to complete--they call a web service on another server. When I run the test client, I can use Fiddler to see that the requests are being made concurrently. However, when I look at Performance Monitor on the web server, with counters "ASP.NET Apps v2.0.xxx/Requests Executing", "ASP.NET/Requests Current", "ASP.NET Requests Queued", these counters never display more than 2. This is the case regardless of whether the test page I'm requesting is set up with Async=True and using the Begin/End pattern of calling the web service, or if it's set up to make the call synchronously. Judging by what I see in Fiddler, I would think I should be seeing a total of 40 requests in one of those states, but I don't. Why is that? Do these counters not mean what I think they mean?
My application overview is
[Application overview diagram] http://img823.imageshack.us/img823/8975/modelq.jpg
An ASP.NET web service handles requests from various applications for digital signing and verification via a client. The web service then routes these requests to a smart card.
When the system date changes, I want the following to happen.
New requests from the clients are made to wait.
Current work between the web service and the smart card should be completed.
If there are any prior pending requests, they should be completed as well.
The reason I need the above to happen is that I need to close the existing sessions between the smart card and the web service. This should happen only when no signing/verification of files is in progress. I cannot just close all the sessions, as that might affect a file being processed by one of the threads. So I need to make sure that there are no currently active threads between the web service and the smart card.
I wrote a piece of code that gives the total number of active threads between the web service and the smart card:
int vWorkerThreads, vWorkerThreadsMax;
int vPortThreads, vPortThreadsMax;
// GetAvailableThreads/GetMaxThreads are static, so call them on the ThreadPool type itself
System::Threading::ThreadPool::GetAvailableThreads(vWorkerThreads, vPortThreads);
System::Threading::ThreadPool::GetMaxThreads(vWorkerThreadsMax, vPortThreadsMax);
int ActiveThreadCount = vWorkerThreadsMax - vWorkerThreads;
Does this mean I also need to make the client requests wait?
CLEANUP MECHANISM: close the PKCS#11 library using the C_CloseAllSessions and C_Finalize calls, which free up the library so that it cleans up all the session objects. This should be done once every day.
Any ideas on how I can perform such a task?
UPDATE:
I could have been much clearer in my query. I want to make it clear that my aim is not to shut down the ASP.NET web service; my aim is to reset the smart card. As I am accessing the smart card via the ASP.NET web service, I need a mechanism to perform this task of resetting the smart card.
The current process is given below:
Client detects the date change, at midnight.
Client calls the function WebService_Close_SmartCard.
The web service receives the WebService_Close_SmartCard request and in turn calls PKCS11_Close_SmartCard. This call will be served by one of the available threads from the thread pool. PKCS11_Close_SmartCard will close all the existing sessions with the smart card.
At this point, I want to make sure that there are no threads with function calls such as PKCS11_DigitalSign_SmartCard or PKCS11_DigitalVerify_SmartCard talking to the smart card, as PKCS11_Close_SmartCard would abruptly end those ongoing sessions.
PS: I am new to ASP.NET and Multithreading.
The question was updated in a big way, so bear with me...
Given that no threads are being created directly or indirectly by your web method code:
Question: So you are not explicitly creating any new threads or using ThreadPool threads directly or indirectly; you are simply receiving calls to your web method and executing your code synchronously?
Answer: Yes, you are correct. There is a client API which calls the web service. The web service then manages the threads automatically (creates/allocates them, etc.) in response to the client's demands. The web service talks to a smart card by opening multiple sessions for encryption/decryption.
It is more helpful to rephrase the original question in terms of "requests" rather than threads, e.g.:
When the system date changes I want to restart my ASP.NET application and ensure that all requests that are currently executing are completed, and that any outstanding/queued requests are completed as well.
This is handled automatically as there is a concept of a request queue and active requests. When your ASP.NET application is restarted, all current and queued requests are completed (unless they do not complete in a timely fashion), and new requests are queued and then serviced when a new worker process comes back up. This process is followed when you recycle the Application Pool that your ASP.NET application belongs to.
You can configure your application pool to recycle at a set time in IIS Manager via the "Recycle" settings for the associated Application Pool. Presumably you want to do this at "00:00".
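If you prefer to script the schedule rather than set it in IIS Manager, the Microsoft.Web.Administration API can do the same thing. This is only a sketch, assuming the pool name and that the code runs with administrative rights:

using System;
using Microsoft.Web.Administration;

class RecycleScheduler
{
    static void Main()
    {
        using (var serverManager = new ServerManager())
        {
            // "MyAppPool" is an assumed pool name; replace it with the real one.
            ApplicationPool pool = serverManager.ApplicationPools["MyAppPool"];
            pool.Recycling.PeriodicRestart.Schedule.Clear();
            pool.Recycling.PeriodicRestart.Schedule.Add(TimeSpan.Zero); // recycle at 00:00
            serverManager.CommitChanges();
        }
    }
}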
Update
I think I can glean from your comments that you need to run some cleanup code when all requests have been serviced and then the application is about to shut down. You should place this code in the global "Application_End" event handler.
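A minimal Global.asax.cs sketch of where that clean-up would live (PKCS11_Close_SmartCard is the questioner's own function, referenced here only in a placeholder comment):

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_End(object sender, EventArgs e)
    {
        // Runs once the worker process has finished servicing its outstanding
        // requests and is shutting down (for example during an app-pool recycle).
        // Put the smart card clean-up here, e.g. the PKCS11_Close_SmartCard call.
    }
}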
Update 2
In answer to your updated question. Your requirements are:
When the application is restarted:
New requests from the clients are made to wait.
Current work between the web service and the smart card should be completed.
If there are any prior pending requests, they should be completed as well.
This is supported by the standard recycling pattern that I have described. You do not need to deal with request threads yourself - this is one of the pillars of the ASP.NET framework; it deals with this for you. It is request-oriented and abstracts how requests are handled, i.e. serviced on multiple threads. It manages putting requests onto threads and manages the lifecycle of those requests when the application is recycled.
Update 3
OK, I think we have the final piece of the scenario here. You are trying to shut down ASP.NET from your client by issuing a "CLOSED" web service call. Basically, you want to implement your own ASP.NET shutdown behaviour by making sure that all current and queued requests are dealt with before you execute your clean-up code.
You are trying to re-invent the wheel.
ASP.NET already has this behaviour and it is supported by:
a. Application recycling - it will service outstanding requests cleanly and start up a new process to serve new requests. It will even queue any new requests that are received whilst this process is going on.
b. Application_End - a global application event handler where you can put your clean-up code. It will execute after recycling has cleanly dealt with your outstanding requests.
You do not need your "CLOSED" command.
You should consider letting IIS recycle your application, as it has support for recycling at specified daily time(s). If you cannot configure IIS for deployment reasons, then you can use web.config "touching" to force a recycle outside of IIS:
a. Have a timer running on the server which checks for the date-change condition and then touches the web.config file.
b. Still have the client call a "CLOSED" web method, but have the "CLOSED" method just touch the web.config file.
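"Touching" web.config simply means bumping its last-write time so ASP.NET notices the change and recycles the application. A minimal sketch of what either option would call (the class name and path resolution are assumptions):

using System;
using System.IO;
using System.Web;

public static class AppRecycler
{
    // Updating web.config's timestamp makes ASP.NET restart the application:
    // current requests are drained, Application_End runs, and a fresh process
    // then serves new requests.
    public static void TouchWebConfig()
    {
        string configPath = HttpContext.Current.Server.MapPath("~/web.config");
        File.SetLastWriteTimeUtc(configPath, DateTime.UtcNow);
    }
}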
IIS, then option "a", are the most desirable.
Honestly, Microsoft has already thought about this. :)
Update 4
@Raj OK, let me try and rephrase this again.
Your conditions are:
You have a requirement to reset your smartcard once a day.
Before resetting your smartcard, all current and queued web service requests must be completed i.e. the outstanding requests.
After outstanding requests are completed, you reset your smartcard.
Any new requests that come in whilst this process is happening should be queued and then serviced after the smartcard has been reset.
These conditions allow you to complete existing requests, queue any new requests, reset your smartcard, and then start processing new requests after the card has been reset.
What I am suggesting is:
Place your smartcard reset code in "Application_End".
Configure IIS to recycle your application at "00:00". Ensure that, in the advanced settings for the associated Application Pool, you configure "Disable Overlapped Recycle = True".
At "00:00" application recycling ensures that all current and queued requests will be completed.
After "00:00" application recycling ensures that all new requests will be queued whilst requests in "3" are completed and the application performs shutdown steps.
After requests in "3" are completed, "Application_End" will be called automatically. This ensures that your smartcard is reset after all current requests are completed.
Application recycling ensures that your application is re-started in a new process, and that new requests queued in step "4" start to be processed. The important thing here is that your reset code has been called in "5".
Unless there is some detail missing from your question, the above appears to meet your conditions. You wish to do "x,y,z" and ASP.NET has built-in support which can be used to achieve "x,y,z" and gives you mature, guaranteed and well-documented implementations.
I am still struggling to understand why you are talking about threads. I do multi-threaded development, but talking about threads instead of requests when thinking about ASP.NET adds unnecessary complexity to this discussion. Unless your question is still unclear.
Perhaps you are missing the point I'm making here. I am drawing a parallel between the behaviour you require when you call "CLOSED" from your client application, and what happens when you recycle an application. You can use recycling and "Application_End" to achieve the required results.
I am trying to help you out here, as trying to implement this behaviour yourself is unnecessary and non-trivial.