CPU Usage on File Upload with Telerik RadAsyncUpload (ASP.NET)

I'm seeing huge CPU usage when two or more users upload files using RadAsyncUpload. We tried limiting CPU usage to 20% in IIS, but that didn't solve it. On top of that, the upload speed sometimes drops to something like 1 Kbps :(

Try playing with the ChunkSize property to see if it helps: http://www.telerik.com/help/aspnet-ajax/asyncupload-disable-chunk-upload.html
If large files are uploaded, perhaps they end up in memory and that loads the server.
I have never heard of such a problem before, though.
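If you want to experiment with that, here is a minimal markup sketch; treating the attribute value as bytes is an assumption on my part, so check the linked article for the exact semantics in your Telerik version.
<%-- Hypothetical snippet: tune (or effectively disable) chunked upload by adjusting ChunkSize --%>
<telerik:RadAsyncUpload ID="RadAsyncUpload1" runat="server" ChunkSize="2097152" />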

Related

ASP.DLL memory leak (or something) forcing constant restart of w3wp

We've been struggling with this for the past 12 or so months. We think it's due to either one or two apps that are leaking memory, or a large number of leaks that have finally accumulated over years of programming in classic ASP. We've begun the conversion to ASP.NET, but we still have a large number of apps in classic.
We've tried changing how IIS recycles based on CPU and memory usage, and we've tried to clean up some processes. We've installed multiple analytical tools to track down exactly where it's coming from, to no avail.
Just today we finally managed to get a more detailed error message: "Detected possible blocking or leaked critical section at asp!g_TemplateCache+88 owned by thread 72 in W3WP". It also states that "ASP.DLL is currently holding a Critical Section Lock on ASP template cache manager...".
So, is there any tool that will help us track down where our leak is coming from? Or maybe a better way to restart this before it freezes our whole web process?
I appreciate your time!
Use a caching class (caching the rendered HTML) for your most-viewed pages; see http://www.webdevbros.net/2006/11/18/cache-object-for-classic-asp/.
Also make sure you close all connections at the end of every page.
These should solve the memory leak.
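For the second point, a minimal classic ASP sketch; rs and conn are just placeholders for whatever ADO objects a page actually opens.
<%
' Release ADO objects explicitly at the end of every page so the
' worker process does not accumulate unreleased COM references.
If Not rs Is Nothing Then
    rs.Close
    Set rs = Nothing
End If
If Not conn Is Nothing Then
    conn.Close
    Set conn = Nothing
End If
%>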

App Pool memory

My app pool is using about 180 MB to 220 MB at any given time.
It sometimes drops to 80 MB but climbs back to 180 MB within a few minutes.
Is this behaviour normal? If the memory usage is high, how can I reduce it?
We have around 500 employees, of whom at least 200 are working on that particular website at any given time.
I am using IIS 7.0, Windows Server 2008, and ASP.NET 3.5.
Any help is greatly appreciated.
Abhi
It is totally dependent on your site. 180-220 MB is nothing. On 32-bit Windows you have to start worrying at around 600 MB; on 64-bit Windows it can be much higher.
Right-click your app pool in IIS and choose Advanced Settings..., then scroll down and look for Private Memory Limit (KB) and Virtual Memory Limit (KB) near the very bottom. However, as BNL suggests, your usage is nothing to really be concerned about.
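Those two Advanced Settings fields correspond to the application pool's periodic-restart attributes, so the same limits can also be scripted. A sketch with appcmd; the pool name and the values (both in KB) are placeholders:
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /recycling.periodicRestart.privateMemory:512000
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /recycling.periodicRestart.memory:1048576
As the next answer points out, though, leaving these at 0 (no limit) is usually the better default unless the server is actually under memory pressure.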
Yes, the behavior you describe sounds normal. Garbage collection, among other things, can cause periodic fluctuations in memory use.
Unless your server is starting to page excessively, I wouldn't restrict the memory available to the AppPools. Internal ASP.NET features such as caching work better when they have plenty of memory available.
If you're concerned that the memory use is higher than it should be, then consider running your apps through a memory profiler, to find out how the memory is being used.

Ways to make ASP.NET build faster

When I build my web project it takes about 20 seconds to compile. Then, when I browse to a page in the project, ASP.NET does its runtime compilation (another 20 seconds). I know I can't escape these steps because that's how ASP.NET works; I just want to see if anyone has some kind of optimization to make these builds faster.
Trying to improve my Edit-Compile-Test loop
My machine details:
- Intel Core i7 processor @ 2.80 GHz
- 8 GB of RAM
- HD @ 7200 RPM
Buy a faster machine? Sounds like a smart answer. I know that the compiler can take advantage of multi-core machines. Also, during compilation there is a lot of hard drive access, so it may make sense to get a solid-state drive. Maybe not the answer you are looking for, but it's a definite solution.
The other thing you can do is configure your project to allow "Edit and Continue". This lets you change small things and continue debugging without doing a full recompile.
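On the multi-core point: if the site is a web application project built through MSBuild, the parallel-build switch is the usual lever. A sketch, with a hypothetical solution name:
msbuild MySolution.sln /m
/m (short for /maxcpucount) lets MSBuild build independent projects in the solution in parallel, so it mostly pays off when the solution contains several projects.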
Here are a couple of thoughts:
Disable any "realtime" virus / malware protection, at least during this process.
Disable indexing (Windows, Google desktop, etc.) for the folders that VS uses during this process.
Disable / stop other processes that may be accessing the hard disk. The biggest issue here is latency - even if other applications are accessing / writing tiny files, it is the access time that kills speed.
As the earlier answer suggested, your biggest bang will come from hardware: get an SSD and a processor with at least 4 cores. If you were to buy four cheap 64 GB SSDs and put them in RAID 0, you would be shocked at the difference and may even find that your CPU and RAM suddenly become the bottlenecks.
Move your code onto a RAMDisk, or buy an SSD drive.
Suspend ReSharper - R# helps tremendously when you're just coding, but it really slows down the Edit-Compile-Test loop.

Debugging ASP.Net shared pool techniques

I work for a hosting company providing ASP.NET 3.5 hosting. Honestly, we usually provide quite good uptime and speed. However, we are having problems with one of our shared pools. As usual, we try to maximize the number of sites that can run in one pool.
Lately we have been suffering continuous hangs. The process doesn't crash, but it starts throwing OutOfMemoryExceptions or stops processing requests. We think one of the applications is responsible (it would be great to know which one).
I have some memory dumps that I have processed with WinDbg. For example, I've run:
!dumpheap -stat
This command gives the global memory usage by object type. Nothing remarkable... I've also checked:
~*e!clrstack
I see various unmanaged threads. The managed ones show stacks like:
[HelperMethodFrame_1OBJ: 0f30e320]
System.Threading.WaitHandle.WaitMultiple(System.Threading.WaitHandle...
0f30e3ec 7928b3ff System.Threading.WaitHandle.WaitAny(System.Threading...
0f30e40c 7a55fc89 System.Net.TimerThread.ThreadProc()...
0f30e45c 792d6e46 System.Threading.ThreadHelper.ThreadStart_Context(System...
0f30e468 792f5781 System.Threading.ExecutionContext.runTryCode(System...
At least I haven't seen exceptions being thrown or anything similar (at that moment). I've also had access to two scripts written by Tess Ferrandez for calculating the number and size of sessions. No promising results there either; nothing peculiar or remarkable (24,000 bytes on average).
I would like to know what kind of strategies you usually use when facing this kind of problem. Have you ever used Microsoft Support?
Thanks a lot!
Very nice question; a single badly behaved ASP.NET app can hang all the web apps sharing the same pool...
OK, let's see... if the problem is memory, get VMMap from Sysinternals, and also Process Explorer.
Run them both, and in Process Explorer find the PID of the pool you wish to investigate; it sits under inetinfo.exe and is probably named aspnet_wp.exe (or w3wp.exe on IIS 6/7).
Now in VMMap, attach to that pool using the PID, and voilà: you can see the memory and the open images (the compiled .aspx files), which are probably numerous and causing the problems... The files you'll see live in the ASP.NET Framework's temporary files folder, but you can map them back to see which site they come from.
If the problem is not memory, but a programmer has written bad loops or even thread sleeps, then I think Process Explorer is the way to investigate the pools and find out what's eating the CPU.
Additional
Maybe a pool recycle every 15 minutes could solve this issue? For example:
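A hedged sketch of how such a recycle interval could be set from the command line (the pool name is a placeholder):
%windir%\system32\inetsrv\appcmd set apppool "SharedPool" /recycling.periodicRestart.time:00:15:00
That only papers over the leak, of course, but it can keep the pool responsive while the offending application is being tracked down.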
More about:
The following videos contain a lot of information about VMMap and the Windows memory manager:
Mysteries of Windows Memory Management, Part 1 and Part 2
There are many tools, but it sounds like your main goal is to determine which application is causing the problem. That can be done very simply with a binary search.
Split the pool in half and see which half crashes. Repeat until you have a crashing pool with only one application in it.
This is already O(log2 n), but you can speed the process up arbitrarily by dividing into more than two sub-pools.
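The mechanics of that split can be scripted; a hedged sketch with appcmd, using hypothetical site and pool names:
%windir%\system32\inetsrv\appcmd add apppool /name:"TestPoolA"
%windir%\system32\inetsrv\appcmd set app "Default Web Site/App1" /applicationPool:"TestPoolA"
Repeat the second command for half of the applications, watch which pool starts showing the OutOfMemoryExceptions, and keep halving.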

File upload limit in HTTP

Is there, theoretically, any limit on the size of file a client can upload through a browser's HTML form file upload?
I'm asking because Flash has a drawback where the largest file you can upload is smaller than the available RAM. I am wondering if there are any such restrictions in browsers...
If your file upload is larger than 2 GB you will run into problems with HTTP uploads.
The "available RAM" limit suggests that the file data is being read entirely into memory, which is very inefficient, especially as the file size grows. Streams are much more efficient for this.
Here is the result of a study of upload limits across web browsers.
At the moment, only Google Chrome and Opera are able to perform an upload of more than 4 GB.
BranTheMan is correct. I hit this issue a few years ago; we decompiled bits of ASP.NET and found that it reads the file into a byte array, so you cannot get around this.
Maybe with 64-bit hardware you can push past the 2 GB limit, but 2 GB is quite a lot anyway, so maybe it's enough.
The problem you may run into is lots of people uploading large files at once, e.g. 100 people uploading 20 MB files, because the process cannot allocate more than 2 GB on a standard 32-bit server (with no config changes).
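For reference, the ASP.NET and IIS request-size ceilings that this 2 GB discussion runs into live in web.config; a hedged sketch with illustrative values (maxRequestLength is in kilobytes, maxAllowedContentLength in bytes):
<configuration>
  <system.web>
    <!-- ASP.NET limit, in KB (about 1 GB here); the default is 4096 KB -->
    <httpRuntime maxRequestLength="1048576" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- IIS 7+ request filtering limit, in bytes (about 1 GB here); the default is roughly 30 MB -->
        <requestLimits maxAllowedContentLength="1073741824" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>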
I suppose it would depend on the server receiving the request, both its settings and how the receiving end is implemented; Apache probably implements it differently from IIS. On the client side the file is read from disk (again, this depends on the browser and how it is implemented), so there shouldn't be a limit there. I don't know if this is ever mentioned in, say, IE's documentation.
