w3wp.exe has high cpu usage on every request - asp.net

I'm running a Windows 2008 server (a VPS with 1GB of RAM), with SQL Server Express and IIS 7 installed. On it I'm hosting a NopCommerce 1.7 website, with a database of around 26 000 products.
Right now I'm the only user of the website (it's in development) and I'm getting rather bad performance from it. To be more specific, every time I make a request, the worker process goes to 90-100% CPU usage for a few seconds. Is it just me, or is this a lot for a one-user NopCommerce website? Any ideas why this happens and what I can do to rectify it or investigate further?
PS: the worker process uses between 100MB and 400MB of memory (private working set), and SQL Server with this database uses around 160MB. Do you have any suggestions other than the obvious one to get more RAM? I intend to get one more GB, but I fear this will not solve the CPU usage problem.

You've already stated you're going to get more RAM, but don't be surprised how much a lack of RAM can impact the CPU. If your RAM is not able to hold large objects efficiently because of lack of space (and I'd say using 40% of available RAM qualifies), then the CPU has to work harder to page things in and out of virtual memory. 90% is a little overkill for this, but with the server specs you give it's not impossible.
The most likely problem is that there is a hole in your code somewhere. My guess is that you have either an infinite loop or a direct memory leak (resources opened during requests that are never closed, perhaps?). Your best bet would be to get the IIS Debug Diagnostics tool, install it, and set up reports to find out what is going on directly on the server.
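To illustrate the "resources that aren't closed" idea, here is a minimal sketch (generic ADO.NET, not taken from the NopCommerce source; the Product table and column names are made up) showing how wrapping connections and commands in using blocks returns them to the pool even when an exception is thrown:

```csharp
using System.Data.SqlClient;

public static class ProductRepository
{
    // Hypothetical lookup, used only to illustrate the pattern.
    public static decimal GetProductPrice(string connectionString, int productId)
    {
        // "using" guarantees Dispose runs even if an exception is thrown,
        // so the connection goes back to the pool instead of leaking per request.
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Price FROM Product WHERE ProductId = @id", connection))
        {
            command.Parameters.AddWithValue("@id", productId);
            connection.Open();
            return (decimal)command.ExecuteScalar();
        }
    }
}
```

If Debug Diagnostics shows the number of open connections or handles growing with each request, missing Dispose calls like the ones above are the usual suspect.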

Related

Constant SQL Server 80% CPU Utilization

We have a small (for now) ASP.NET MVC 5 website on a dedicated VPS. When I go to the server and fire up Task Manager, I see that "SQL Server Windows NT - 64 bit" is using around 80% of the CPU and 170MB of RAM, while IIS is using 6% CPU and 400MB of RAM. Server specs are:
CPU 1.90Ghz dual core
Memory 2GB
Windows Server 2012
SQL Server Express 2012
Disk Space: 25GB, 2.35GB free.
The database is not very big. Its backup is less than 10MB.
I have tried to optimize the website as much as I could. I added caching to a lot of controllers and implemented donut caching for quite a lot of them. But today, even though there were only 5 users online, our search wouldn't work. I restarted Windows on the server and it started working, but I got the high CPU usage the minute the server started. Interestingly, when I open SQL Server Management Studio and try to get the report for top CPU-consuming queries, it says that there are no queries currently consuming any CPU!!! But at the same time I can see that SQL Server is consuming a lot of CPU. How can I examine what is taking all the CPU? Below is a picture from the server:
I was/am very careful with designing and implementing the website. All the database access is through the latest version of Entity Framework. I just wonder if the server's specs are too low. Any help would be very much appreciated.
Update:
Here's the result of the sp_who2 stored procedure.
This can happen if the memory SQL Server is set to use is more than the available memory on the box. The default "max server memory" setting is 2,147,483,647 MB, i.e. effectively unlimited. In our case the AWS box had only 30.5 GB, so we changed the setting to 26 GB and the CPU usage fell to 40%. You generally want to leave about 20% of memory for the OS and its operations.
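For reference, here is a minimal sketch of applying that cap from code (it assumes a sysadmin login and a box where roughly 26 GB should go to SQL Server, as in the case above); the same two sp_configure calls can simply be run in Management Studio instead:

```csharp
using System.Data.SqlClient;

public static class SqlServerConfig
{
    // Caps SQL Server's memory so the OS keeps some breathing room.
    public static void CapSqlServerMemory(string connectionString, int maxMemoryMb)
    {
        // 'max server memory (MB)' is an advanced option, so
        // 'show advanced options' has to be enabled first.
        const string sql = @"
            EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
            EXEC sp_configure 'max server memory (MB)', @maxMb; RECONFIGURE;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@maxMb", maxMemoryMb);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

// e.g. SqlServerConfig.CapSqlServerMemory(connStr, 26624); // ~26 GB, leaving ~20% for the OS
```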
I would also suggest running SQL Profiler to spot long query durations and large write operations. Try running Perfmon and watching for potential connection leaks (reclaimed connections).
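If the Management Studio report comes up empty, querying the plan-cache DMVs directly will still show which statements have accumulated the most CPU since the instance started. A minimal sketch (requires VIEW SERVER STATE permission; the connection string is yours to supply):

```csharp
using System;
using System.Data.SqlClient;

public static class SqlCpuReport
{
    public static void PrintTopCpuQueries(string connectionString)
    {
        // Aggregated CPU (worker time) per cached plan since the instance started.
        const string sql = @"
            SELECT TOP 10
                   qs.total_worker_time / 1000                 AS total_cpu_ms,
                   qs.execution_count,
                   ISNULL(SUBSTRING(st.text, 1, 200), '')      AS query_text
            FROM sys.dm_exec_query_stats AS qs
            CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
            ORDER BY qs.total_worker_time DESC;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0,10} ms  {1,8} execs  {2}",
                        reader.GetInt64(0), reader.GetInt64(1), reader.GetString(2));
                }
            }
        }
    }
}
```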

asp.net runs fast then slow on restart

Why is it that every time the server goes down and ASP.NET restarts, the response time is SUPER FAST when it comes back up, for a few minutes? I assume it's because everyone is off the server and I'm one of the few (or only) people back on it, so it's quick?
I have discussed this with our developers and they say the response time is due to everyone being on the server normally (200+ desktops), and when you are the only person on there, it flies. Really? Then does that mean we need newer, faster web servers?
I am not a programmer, but I think there may be two answers: one, what the devs say above is true; two, the system is accumulating temp files of some sort that get cleared out when the server crashes and then restarts.
How do we prove who might be right? Where does one start to look for asp.net bottlenecks?
windows server 2003
asp.net 3.0
iis6
12GB ram
sql server 2005 (db admin says there is no load issue on sql..)
Some very basic steps you can follow to check whether your server is working at its limits:
First, download Process Explorer from Sysinternals and run it to check two things.
Is your server working at its memory limit?
If so, which program is eating the memory? Usually SQL Server 2005 uses a lot of memory for its database cache, and this builds up after it has been running for a while.
Is the server using all of its computing power? If so, check which program needs all that computing power.
Next, download TCPView from Sysinternals, run it, and see how many connections are open, how fast they come and go, and so on. There you can spot anomalies, or see whether the machine is at its network limit as well.
The final step is to defragment your disks.
Also keep in mind that ASP.NET session state locks requests: requests that use the same session are serialized. So if you have a web application where some users make calls that take a long time to process, those users' other requests (and, under heavy load, the worker threads serving everyone else) can end up waiting.
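One concrete mitigation for the session-locking issue above: pages or handlers that only read session data can declare the session read-only, so ASP.NET takes a shared lock instead of queueing them behind other requests from the same session. A minimal sketch of such a handler (the handler name and logic are hypothetical; for Web Forms pages the equivalent is EnableSessionState="ReadOnly" in the @ Page directive):

```csharp
using System.Web;
using System.Web.SessionState;

// Implementing IReadOnlySessionState tells ASP.NET this request only reads
// session state, so it does not take the exclusive session lock and is not
// queued behind long-running requests from the same session.
public class ReportStatusHandler : IHttpHandler, IReadOnlySessionState
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Hypothetical example: read a value from session without writing to it.
        var userName = context.Session["UserName"] as string ?? "anonymous";
        context.Response.ContentType = "text/plain";
        context.Response.Write("Report status for " + userName);
    }
}
```

On IIS 6 the handler would still need to be registered under httpHandlers in web.config before it is reachable.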

IIS 6 RAM Allocation on Windows Server 2003

I have my IIS 6 running my website. It is on a Windows Server 2003 which has 4GB of RAM. I run SQL intensive code after the user submits a form (math statistics stuff). This process is not threaded (should it be, especially if 2 or more users run the same thing?). But my process seems to consume only a couple of GBs of memory and the server crawls. How do I get my IIS process to use nearly all the memory?
I see on other sites that it's 2GB or 3GB allocated using boot.ini. But is there another way for the process to use more memory? If I make it multithreaded, will there be a process for each thread?
If there is still memory free for IIS, it does not need more; even if you give it more memory, it will not perform better. It is good to see that some memory is unused and available for other processes besides IIS. If you make it multithreaded, whether more memory is used and whether you gain any performance depends on what you actually run in parallel.
The basic approach here is to start with your requirements and see what peak usage you can expect. Then run a performance test to see if your machine can handle that load. To be sure you have some headroom, run another test to find the peak load your machine can handle. Then you will know whether you have to invest any more time.
Check your database server to see whether the bottleneck is actually on that machine; most developers forget to optimize and maintain their databases.

ASP.NET - Single large web request triggers System.OutOfMemoryException - Still have plenty of available memory

Environment:
Windows 2003 Server (32 bit); IIS6, ASP.NET 2.0 (3.5); 4Gb Ram; 1 Worker Process
We have a situation where a very large System.Xml.XmlDocument is being loaded into memory and then fed into a compiled XSL transform.
What is happening is that when a web request comes in, the server is sitting in an idle state with 2500MB of available system memory.
As the XML DOM is populated, the available memory drops by approximately 500MB, at which point we get a System.OutOfMemoryException. At this point the system should theoretically still have 2000MB of memory available to service the request (according to Perfmon).
The related questions I have are:
1) At what level in the stack is this out of memory limitation being met? OS? IIS? ASP.NET? worker process? Is this a per individual web request limit?
2) Is this limit configurable somewhere?
3) Why can’t this web request access the full available system memory?
1) I would guess at the worker process, but this should be configurable within IIS as a limit on the memory a worker process can use. Another factor is the bitness of your software: a 32-bit process has a hard limit of 4 GB, since that is its total address space.
2) Probably, but don't forget that memory fragmentation may get you to out-of-memory faster than you think; e.g. a request for a contiguous 1000 MB block of memory may not be satisfiable even when that much total memory is free.
3) Have you examined dump data to see what is in the memory when the exception gets thrown? If not, there are ways to get a snapshot of the memory to see what it looks like as this may give you more clues about what is going on.
You are running in a process, and a 32-bit process can only address 2 GB of memory by default. This task is sharing that address space with everything else running in the process, so this bit of code does not get the full 2 GB, even if it is available.
There is a /3GB switch on the OS as well; it is set in boot.ini, but you will have to search MSDN for the details.
But realistically, you need to do this another way, possibly by switching to a SAX-style (streaming) XML parser.
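In .NET, the closest equivalent to a SAX parser is XmlReader, which walks the document forward-only without building a DOM. A minimal sketch of streaming through a large file (the element and attribute names are hypothetical):

```csharp
using System;
using System.Xml;

public static class LargeXmlStreamer
{
    public static void StreamLargeXml(string path)
    {
        // XmlReader is forward-only and streaming, so memory use stays roughly
        // constant regardless of how large the input document is.
        using (XmlReader reader = XmlReader.Create(path))
        {
            while (reader.Read())
            {
                // Hypothetical element/attribute names, for illustration only.
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "Order")
                {
                    string id = reader.GetAttribute("Id");
                    Console.WriteLine("Processing order " + id);
                }
            }
        }
    }
}
```

Note that if the compiled XSL transform itself is unavoidable, it will still materialize the input internally, so the real win comes from restructuring the work so the whole document never needs to be in memory at once.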
I'm sure there are some bright heads here that can answer your specific questions, but have you asked yourself if there is another way to do what you want? I specifically mean that you probably do not want to process a very large XML document, but you probably more specifically want to return something back to the client. Could you rewrite the code to avoid this XML document altogether, or perhaps not load it all into memory at the same time, and still produce the same end-result?
1) Dunno. Check your logs.
2) IIS limits memory divvied out to websites/application pools. Check your settings.
3) Servers are all about uptime; if a single app hogs all the resources, everybody else suffers. That's why enterprise apps like IIS limit memory, to prevent runaways from taking down the entire server.

Best way to determine the number of servers needed

How much traffic can one web server handle? What's the best way to see if we're beyond that?
I have an ASP.NET application that has a couple hundred users. Aspects of it are fairly processor-intensive, but thus far we have done fine with only one server running both SQL Server and the site. It's running Windows Server 2003, 3.4 GHz with 3.5 GB of RAM.
But lately I've started to notice slowdowns at various times, and I was wondering what's the best way to determine whether the server is overloaded by the application's usage, or whether I need to fix something in the application (I don't really want to spend a lot of time hunting down little optimizations if I'm just expecting too much from the box).
What you need is some info on Capacity Planning..
Capacity planning is the process of planning for growth and forecasting peak usage periods in order to meet system and application capacity requirements. It involves extensive performance testing to establish the application's resource utilization and transaction throughput under load. First, you measure the number of visitors the site currently receives and how much demand each user places on the server, and then you calculate the computing resources (CPU, RAM, disk space, and network bandwidth) that are necessary to support current and future usage levels.
If you have access to some profiling tools (such as those in the Team Suite edition of Visual Studio), you can try setting up a testing server, running some synthetic requests against it, and seeing if there's any specific part of the code taking unreasonably long to run.
You should probably check some graphs of CPU and memory usage over time before doing this, to see if it can even be that. (A number akin to the UNIX "load average" could be a useful metric; I don't know if Windows has anything like it. Basically, the average number of threads that want CPU time in every time slice.)
Also check the obvious, that you aren't running out of bandwidth.
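If you don't have the profiling tools, even a crude synthetic-load sketch like the one below (modern .NET and HttpClient assumed; the URL and request counts are placeholders) gives you a rough requests-per-second figure while you watch CPU, memory, and queue lengths in Perfmon:

```csharp
using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

public static class LoadTest
{
    public static async Task RunAsync(string url, int clients, int requestsPerClient)
    {
        using (var http = new HttpClient())
        {
            var stopwatch = Stopwatch.StartNew();

            // Each "client" issues its requests sequentially; clients run in parallel.
            var tasks = new Task[clients];
            for (int c = 0; c < clients; c++)
            {
                tasks[c] = Task.Run(async () =>
                {
                    for (int i = 0; i < requestsPerClient; i++)
                    {
                        using (var response = await http.GetAsync(url))
                        {
                            response.EnsureSuccessStatusCode();
                        }
                    }
                });
            }
            await Task.WhenAll(tasks);

            stopwatch.Stop();
            double total = (double)clients * requestsPerClient;
            Console.WriteLine("{0:F1} requests/second",
                total / stopwatch.Elapsed.TotalSeconds);
        }
    }
}

// e.g. await LoadTest.RunAsync("http://test-server/somepage.aspx", 20, 100);
```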
Measure, measure, measure. Rico Mariani always says this, and he's right.
Measure req/sec, RAM, CPU, Sessions, etc.
You may come up with a caching strategy (output caching, data caching, caching dependencies, and so on).
See also how your SQL Server is doing... indexes are a good place to start but not the only thing to look at..
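To make the caching-strategy suggestion concrete, here is a minimal data-caching sketch using ASP.NET's built-in cache with a file dependency (the cache key, file path, and what gets cached are hypothetical); output caching can then be layered on top via the OutputCache directive:

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class ProductCatalogCache
{
    // Returns cached catalog data, reloading it at most every 10 minutes
    // or whenever the dependency file changes.
    public static string GetCatalogXml(string catalogFilePath)
    {
        var cache = HttpRuntime.Cache;
        var cached = cache["catalog-xml"] as string;
        if (cached != null)
            return cached;

        string catalogXml = System.IO.File.ReadAllText(catalogFilePath);

        cache.Insert(
            "catalog-xml",
            catalogXml,
            new CacheDependency(catalogFilePath),   // invalidate when the file changes
            DateTime.UtcNow.AddMinutes(10),         // absolute expiration
            Cache.NoSlidingExpiration);

        return catalogXml;
    }
}
```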
On that hardware, a .NET application should be able to serve about 200-400 requests per second. If you have only a few hundred users, I doubt you are seeing even 2 requests per second, so I think you have a lot of capacity on that box, even with SQL server running.
Without knowing all of the details, I would say no, you will not see any performance improvement by adding servers.
By the way, if you're not using the Output Cache, I would start there.
