Does it help to use NGEN? - asp.net

Is it better to NGEN an ASP.NET application when we know it is not going to change much? Or is the JIT good enough?
The only reason I ask is that this article by Jeffrey Richter from 2002 says:
And, of course, Microsoft is working quite hard at improving the CLR
and its JIT compiler so that it runs faster, produces more optimized
code, and uses memory more efficiently. These improvements will take
time. For developers that can't wait, the .NET Framework
redistributable includes a utility called NGen.exe.

NGen will only help startup time - it doesn't make the code execute any faster than it would after JITting. Indeed, I believe there are some optimizations which NGen doesn't do but the JIT does.
So, the main question is: do you have an issue with startup time? I don't know how much of an ASP.NET application's start-up time will be JITting vs. other costs, btw... you should probably look at the Performance Monitor counters for the JIT to tell you how much time it's really costing you.
(In terms of availability, having multiple servers so you can do rolling restarts is going to give you much more benefit than a single server with an NGENed web app.)
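As a starting point for that measurement, the CLR exposes JIT performance counters that can be sampled from the command line. A rough sketch (the "w3wp" instance name, sample interval, and count are illustrative; pick the right worker-process instance for your app pool):

```shell
:: Sample the CLR's JIT counters for the IIS worker process once per second, ten times.
:: "w3wp" is the usual IIS worker-process name; adjust it for your app pool.
typeperf "\.NET CLR Jit(w3wp)\% Time in Jit" "\.NET CLR Jit(w3wp)\# of Methods Jitted" -si 1 -sc 10
```

If "% Time in Jit" stays high only during the first few requests after a restart, the cost is a one-time startup cost, which is exactly the case NGen (or precompilation) addresses.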

NGen isn't the way to go for ASP.NET -- the creation of the .dlls in the bin folder isn't the final step -- they are compiled again, with the web.config/machine.config settings applied, into your C:\Windows\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files folder. Instead of NGen, to decrease initial load time, use the Publish Website tool or aspnet_compiler.exe
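For example, a precompile-at-deploy step might look roughly like this (the paths and virtual directory name are illustrative):

```shell
:: Precompile the site to a target folder so the first request doesn't pay the compile cost,
:: then deploy the contents of C:\Deploy\MySite instead of the raw sources.
:: -v is the virtual path, -p the physical source directory, the last argument the target.
aspnet_compiler -v /MySite -p "C:\Source\MySite" "C:\Deploy\MySite"
```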

I'm not sure that NGEN's primary benefit is to start-up time alone - if an application suffers from a high '% Time in JIT', NGEN is listed as a potential remedy:
http://msdn.microsoft.com/en-us/library/dd264972(VS.100).aspx.
The discussion is closely related to a different question about how JIT-compiled machine code is cached and re-used.

I was looking into this tonight and came across the following:
The common language runtime cannot load images that you create with NGEN into the shared application domain. Because ASP.NET standard assemblies are shared and are then loaded into a shared application domain, you cannot use Ngen.exe to install them into the native image cache on the local computer.
http://support.microsoft.com/kb/331979
Not sure whether this refers only to assemblies referenced from an ASP.NET app or to the app itself.

NGen helps startup time. For example, the Entity Framework documentation says this about the benefit of running NGen:
Empirical observations show that native images of the EF runtime assemblies can cut between 1 and 3 seconds of application startup time.
The context is just the Entity Framework assemblies. NGenning other assemblies would provide additional speedup.

Related

Profiling running ASP.NET application in production

I am trying to profile a web application running on IIS in a production environment (.NET Framework 4.0, WebForms, SQL Server, Windows Server 2008).
Several pages are repeatedly slow in Production, but we are unable to reproduce in Development.
We cannot install any IDE or similar tool in the Production environment.
Does anyone know of a DLL or a stand-alone exe that we could easily drop onto the server, run for about an hour, and then quickly and easily remove? We are seeking one important aspect of profiling:
--> the amount of time spent in each C# method. <--
Thank you in advance.
Try this command-line tool for profiling CPU load in production: https://github.com/jitbit/cpu-analyzer
We forked it from Sam Saffron's original CPU profiler, which is more or less abandoned now.
Disclaimer: I'm the maintainer of this project.
PerfView is one of the best tools for this purpose; I have used it to find numerous production issues.
You can narrow down the issue from multiple angles: it can trace from the network level to IIS to ASP.NET to your C# methods. It uses ETW events emitted by IIS, ASP.NET, and the CLR to do this.
https://channel9.msdn.com/Series/PerfView-Tutorial/PerfView-Tutorial-7-Using-the-Event-Viewer-in-ASPNET-Scenarios
https://channel9.msdn.com/Series/PerfView-Tutorial
Using the ThreadTime view you can narrow down to a particular method.
PerfView always collects system-wide data, and you can also set a circular buffer. The best way to collect data is when the issue starts happening: go to the server in question, start a PerfView trace, and reproduce the issue. You can then analyze the data later to find the performance bottleneck.
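A rough sketch of that collection workflow from an admin command prompt (flags taken from PerfView's command-line options; the file names are illustrative):

```shell
:: Start a machine-wide circular trace with thread-time stacks, reproduce the slow pages, then stop.
PerfView start /ThreadTime /CircularMB:500 /NoGui /LogFile:collect.log trace.etl
:: ... reproduce the issue on the affected pages, then:
PerfView stop /NoGui /LogFile:collect.log
```

Copy the resulting trace.etl off the server and open it in PerfView on a workstation; nothing needs to be installed on the production box, and the exe can simply be deleted afterwards.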

Do you have any suggestions on how to speed up precompiling a Kentico site? AKA: the aspnet_compiler is VERY slow

I have a Kentico 6 site that we have squeezed better performance out of by precompiling. We have a Team City Continuous Integration (CI) server that runs the build automatically on check in to the subversion repository.
However, the build takes 40 min! About 33 min of that is the aspnet_compiler step! This is a pretty long time for a build that SHOULD take less than 10 min. I've gone through some of the various perf improvements, such as moving the ASP.NET temporary files to a fast SSD. During the aspnet_compiler step, the server uses very little CPU, disk, and RAM - CPU usage seems to average about 1%! In researching aspnet_compiler, I have found many pages complaining about its speed, but a dramatic silence from Microsoft about it.
https://aspnet.uservoice.com/forums/41199-general-asp-net/suggestions/4417181-speed-up-the-aspnet-compiler http://programminglife.wordpress.com/2009/04/16/aspnet_compiler-compilation-speed-part-1/
I've come to the (perhaps erroneous) conclusion that if I can't speed up the aspnet_compiler, then perhaps I can reduce its workload. Since Kentico has lots of controls in the project, are there any savings I can glean by removing extraneous controls? I.e., if I run the aspnet_compiler on a project with a single page, it's quick.
I've also thought about maybe making the cmsdesk a separate application that only needs to be compiled after a new hotfix has been applied.
To recap, my two concerns are: #1 - can I speed up the aspnet_compiler somehow? #2 - if I can't, can I reduce its workload?
In relation to #1, maybe I can do incremental compilation, so that I only precompile files that have changed since the last build? I haven't found much info about doing this; there are a few unanswered questions on Stack Overflow about this very topic, e.g. "aspnet_compiler incremental precompile" and "Incremental Build aspnet_compiler".
FYI - for those of you unfamiliar with Kentico CMS, It's a Web Site Project, with LOTS of controls - maybe hundreds of them.
Any ideas?
PS - I have a reply on the Kentico forums: http://devnet.kentico.com/questions/do-you-have-any-suggestions-on-how-to-speed-up-precompiling-a-kentico-site-aka-the-aspnet_compiler-is-really-slow
One thing our team did try, to improve the compilation time, was to remove the modules and sets of controls that were not necessary to the project (forum, e-commerce, etc.).
Also, which approach do you use for development, portal or ASPX? We've noticed that compilation proves to be faster with the portal approach than with ASPX.
Everything I've read says that precompiling with aspnet_compiler is super difficult to achieve with Kentico.
However, I managed to find this post on Kentico's forums where someone appears to have figured out a way to make your system work for them (search for EHUGGINS-PINNACLEOFINDIANA on the page and you'll find it).
I have heard that using MSDeploy or MSBuild are a much more efficient way to precompile than directly calling aspnet_compiler (although I'm pretty sure both of those either still call aspnet_compiler or do the same thing it does, only faster).
I've personally never tried using any of these methods (and I'm pretty new to ASP.NET myself), but I figured I could at least give you some leads:
http://odetocode.com/blogs/scott/archive/2006/10/18/what-can-aspnet_compiler-exe-do-for-me.aspx
http://therightstuff.de/2010/02/06/How-We-Practice-Continuous-Integration-And-Deployment-With-MSDeploy.aspx
http://www.troyhunt.com/2010/11/you-deploying-it-wrong-teamcity_26.html
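For Web Application Projects, the MSBuild route mentioned above can be sketched like this (project name is illustrative; this drives precompilation and packaging through the VS2010 web-publishing pipeline rather than invoking aspnet_compiler by hand):

```shell
:: Build and produce a Web Deploy package in one step via the web-publishing pipeline.
msbuild MySite.csproj /p:Configuration=Release /p:DeployOnBuild=true /p:DeployTarget=Package
```

Note this applies to Web Application Projects with a .csproj; a Kentico Web Site Project would first need converting, which may or may not be practical.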

Does DLL deployment into GAC reduce memory usage when shared by 100+ web apps?

I have an ASP.NET app hosted for many domains, all mapped to the same root folder, i.e. the same code base. Will it take less memory if I move the DLLs currently located in the bin folder into the GAC? Are there any special code considerations for loading DLLs into the GAC for sharing (other than the security aspect)?
Is there any way of making DLLs sharable in .NET environment?
Thank you
Kris
Putting the DLLs in the GAC will not help with memory usage. The DLLs still need to be loaded into each app domain, and they will not be in shared memory.
The point of using the GAC is to centralize distribution of assemblies - so changes can be managed in one place. Sounds like this is something that you could still benefit from.
As Oded explained, just putting them in the GAC won't help. What might help is putting them into the GAC and NGENing the assemblies. See here and here. Be sure to measure the different memory usages, load times, and overall performance, because those can be negatively influenced by NGEN. See here and here.
Actually, the accepted answer from @Oded and the other from @LarsTruijens are both incorrect. Placing assemblies in the GAC DOES help reduce memory usage.
This is confirmed by Jeffrey Richter (from Wintellect who helped design the CLR with the .NET team) in his book CLR via C#:
Installing assemblies into the GAC offers several benefits. The GAC enables many applications to share assemblies, reducing physical memory usage on the whole....
It is also confirmed by Tess Ferrandez (memory and performance guru from Microsoft) - https://blogs.msdn.microsoft.com/tess/2006/04/12/asp-net-memory-you-use-the-same-dll-in-multiple-applications-is-it-really-necessary-to-load-it-multiple-times.
Wherever possible strong name and install to the global assembly cache (GAC) any assemblies that are used by more than one ASP.NET application. This will reduce memory consumption.
I've also confirmed this myself by testing (WinDbg, Task Manager, and ProcExplorer) on a x64 WebAPI service as an example. You'd see that the Private Working Set is lower with the GAC'd app. In the case of NGen, you would again see the Private Working Set decreased. However, the Page Faults are also greatly reduced in the NGen'd app compared to the baseline (almost by half in my test). I saw no difference in Page Faults between the GAC'd app and non-GAC'd app.
Note that the best solution in some cases is to combine NGen and the GAC by installing NGen'd assemblies into the GAC. This optimizes memory usage between apps that share assemblies as well as providing a performance boost at application startup!
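A sketch of that combination (the assembly name, version, and public key token are hypothetical placeholders; the assembly must be strong-named before it can go into the GAC):

```shell
:: Install the shared, strong-named assembly into the GAC...
gacutil /i MyShared.dll
:: ...then generate a native image for it, and compile any queued images immediately.
ngen install "MyShared, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef"
ngen executeQueuedItems
```

Run both from an elevated Visual Studio/SDK command prompt, and repeat on each framework bitness (32/64-bit) your app pools actually use.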

Opinions on MSDeploy

You know, the next "big" and "enterprisey" thing from Microsoft.
Is it just me, or is it really hardly usable by humans? The main pain points are (IMO):
Absolutely cryptic syntax (-skip:objectName=filePath,absolutePath=App_Offline.* just for skipping App_Offline.html)
Manifest as an afterthought
Lack of thorough documentation
Not a word about extensibility (except for several blog posts out there). Moreover, all these extensions, developed with great pains, have to be registered in the GAC and the registry
Waaay too low-level (metadata/metakey; all this IIS jazz)
No integration with MSBuild
Granted, MSDeploy and MSDeployAgent are quite powerful, but do they really need to be that complex for relatively simple tasks?
I too share your frustrations over the lack of documentation and the apparent low-level nature of this tool.
However what MS has done is finally create a free tool with which you can actually script whole server deployments, including parameterising addresses, configurations etc. This is unfortunately a very complicated thing to do - given how many bits of configuration actually go into a web server - and this is probably the best way to do it all.
What we need now is a really good GUI that can help build up these packages, and scripts etc. The GUI that is embedded within IIS is good - but again, short on explanation - so hopefully soon that'll be addressed.
On the functional side, I'm using it at the moment to deploy a site from dev -> staging -> live, with parameters to change bound IP addresses etc. I was deeply frustrated that it took me a few days to get it all working - however, now that I have it, I can remove a lot of the possibility of human error on the IT support side, who are responsible for our deployments. I now only have the configuration of my master staging server to worry about, and can be sure that all the servers in the web farm will be kept in sync whenever I deploy.
As Sayed mentions as well, there are MSBuild tasks in 2010 (the Website Deployment feature is now implemented using msdeploy) to work with this - which also brings the possibility of a true Continuous Integration environment to VS Team System. Having a team build that can actually perform a full web deployment as its last step is very exciting (and scary, granted!).
Actually there are MSBuild tasks for MSDeploy. They will be shipped with .NET 4/Visual Studio 2010.
Although a bit rough around the edges, I've come to like MSDeploy quite a bit. Using it to sync web servers in a farm is very useful, as it is efficient (it only copies changes) and takes care of actual IIS settings in addition to content files. MSDeploy seems to be a building block for various scenarios and uses. Also, as previously mentioned, there is an MSBuild task for MSDeploy in .NET 4. I've taken advantage of this MSBuild task to make deployment of my web applications from TeamCity trivially easy. I've blogged about it here:
Web Deploy (MS Deploy) from TeamCity - http://www.geekytidbits.com/web-deploy-ms-deploy-from-teamcity/
I have recently started implementing a deployment pipeline, and I found the links below quite useful:
MSBuild commands I used for Continuous Integration:
http://www.troyhunt.com/2010/11/you-deploying-it-wrong-teamcity_24.html
WebDeploy sync commands, I used for deployment packages to production server:
http://sedodream.com/2012/08/20/WebDeployMSDeployHowToSyncAFolder.aspx
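A typical folder-sync invocation looks roughly like this (the build folder, site name, and server name are placeholders; -whatif previews the changes without applying them):

```shell
:: Preview, then apply, a content sync from a local build folder to a remote IIS site.
msdeploy -verb:sync -source:contentPath="C:\Build\MySite" -dest:contentPath="Default Web Site/MySite",computerName=staging01 -whatif
msdeploy -verb:sync -source:contentPath="C:\Build\MySite" -dest:contentPath="Default Web Site/MySite",computerName=staging01
```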
Also I used these references:
Video about MSBuild on dnrtv.com
The Microsoft Press book "Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build", which you can buy as a PDF from O'Reilly
Finally, the "Continuous Delivery" book gave me good ideas about the deployment pipeline; although it does not focus on MSDeploy, it is really worth reading.
The state of the documentation is typical of a Microsoft 1.0 product; unfortunately MSDN no longer has dedicated Developer Technology Engineers to fill the gaps - instead, there is a blind faith that the web will provide it.
I am actually considering dusting off my writing skills and writing a short ebook on it, since there is likely a market for it....
Msdeploy definitely has a touch of the PowerShell to it: power over simplicity rather than worse is better.
There is no Windows alternative to it, however you can hybridize some of its powers to make automated deployments. For example:
Compile your solution with Team City and msbuild
Use msdeploy to transform your site and web.configs on the build server
Manually FTP a ZIP file of your site (it doesn't support FTP)
Alternatively, use its remote deploy capabilities. This requires port 8172 open, lots of security changes and as far as I'm aware no concessions for load balancing
Use msdeploy on the live site to sync changes
As a tool it's clearly aimed at service providers, as it's an enormous Swiss army knife. You can do all kinds of things to IIS with it, most of which are overkill for small businesses. I've no experience of large-scale IIS setups, so maybe that's where it shines.

ASP.NET Website DLL: Debug vs. Release version

When uploading my ASP.NET web application's .dll file to my website's /bin/ directory, are there any disadvantages to using the debug version as opposed to recompiling a release build?
For example, while working locally on the website, the build configuration is set to Debug. When things look good, I go ahead and upload the latest .dll for the website/webapp. Should I instead at that point switch the build configuration to Release, then compile, and then upload that version of the .dll to the server?
I'm hoping this question isn't a duplicate. I read a number of other questions with similar words in the subject, but didn't find anything directly related to my question.
Thanks,
Adam
Running with debug assemblies is a little heavier on performance, as they take up more memory, but usually not extremely so. You should only deploy a release build when it's really a "release". If you still anticipate some level of unexpected behavior in the wild, I'd consider using debug assemblies, so you will get more useful information from unhandled exceptions. The real performance "gotcha" is having debug="true" in your web.config.
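For reference, that switch lives in web.config's compilation element; a minimal sketch (keep it false, or omit the attribute, in production):

```xml
<configuration>
  <system.web>
    <!-- debug="true" disables compiler optimizations and batch compilation - this is the real performance trap. -->
    <compilation debug="false" targetFramework="4.0" />
  </system.web>
</configuration>
```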
A lot of it depends on your individual needs. In general, people frown on putting a debug build into production for performance reasons: the emitted debug code is not optimized and contains debug symbols that can slow down execution.
On the other hand, I've worked places where the policy was to put debug builds in production because it makes it easy to see line number, etc, when the code throws exceptions. I'm not saying I agree with this position, but I've seen people do it.
Scott Hanselman has a good post on doing a hybrid version of Debug and Release that could get you the best of both worlds here.
If you have a low volume website, you will never see the performance penalty of a Debug assembly in any measurable way. If you have high volume, look into other means of logging/instrumenting code instead.
On a high-volume website, you DO need to perform extensive stress and load testing to try very hard to break the application before it goes into production. I would do the first pass of that testing with Debug assemblies (since you probably WILL break stuff, and it will make it easier to see where). Then, repeat with the Release assemblies to make sure they behave the same way as the Debug ones.
Scott Guthrie: "Don't run production ASP.NET Applications with debug=true enabled" - http://weblogs.asp.net/scottgu/archive/2006/04/11/Don_1920_t-run-production-ASP.NET-Applications-with-debug_3D001D20_true_1D20_-enabled.aspx
Very few applications are going to see a significant difference in performance between release and debug builds. If you're running a small to medium sized application and you think there might be any bugs you haven't caught, use the debug build.
