We're seeing a lot of virtual memory fragmentation and out-of-memory errors, and then the process hits the 3 GB limit.
Compilation debug is set to true in web.config, but I get a different answer from everyone I ask: does debug="true" cause each .aspx to compile into random areas of RAM, fragmenting that RAM and eventually causing the out-of-memory problems?
Scott Guthrie (manager of the ASP.NET development team) has an interesting post about it.
The most important reasons not to leave debug="true" enabled are:
The compilation of ASP.NET pages takes longer (since some batch optimizations are disabled)
Code can execute slower (since some additional debug paths are enabled)
Much more memory is used within the application at runtime
Scripts and images downloaded from the WebResources.axd handler are not cached by the browser, resulting in more requests between client and server
He also mentions the <deployment retail="true"/> flag in machine.config, which lets you globally override the debug="true" setting of all applications running on a machine (e.g. on a production server).
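For reference, a minimal sketch of where that flag lives in machine.config (the surrounding elements are shown for context only):

```xml
<configuration>
  <system.web>
    <!-- Forces debug="false" (and other retail behavior) for every app on this machine -->
    <deployment retail="true" />
  </system.web>
</configuration>
```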
Update: deploying web apps with debug="true" is still bad, as you can read in Scott Hanselman's recent blog post:
Here's why debug="true" is bad. Seriously, we're not kidding.
Overrides request execution timeout making it effectively infinite
Disables both page and JIT compiler optimizations
In 1.1, leads to excessive memory usage by the CLR for debug information tracking
In 1.1, turns off batch compilation of dynamic pages, leading to 1 assembly per page.
For VB.NET code, leads to excessive usage of WeakReferences (used for edit and continue support).
An important note: contrary to what is sometimes believed, setting retail="true" in a <deployment/> element is not a direct antidote to having debug="true"!
The debug flag should be set to false in web.config, unless you actually need to debug the application.
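As a minimal sketch, the relevant web.config section looks like this:

```xml
<configuration>
  <system.web>
    <!-- Set to true only while actively debugging; keep false in production -->
    <compilation debug="false" />
  </system.web>
</configuration>
```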
Running in debug mode can increase memory usage somewhat, but it is not likely to cause problems as severe as those you describe. Still, you should set it to false to eliminate its effects and see whether you notice any improvement.
When running in debug mode, garbage collection works differently. The lifetime of each variable is extended from its last actual use to the end of its scope (so that the debugger can still show its value). This makes some objects live longer before they are garbage collected.
The compiler doesn't optimize the code when compiling in debug mode, and some extra nop instructions are added so that each source line has at least one instruction where a breakpoint can be placed.
Throwing an exception takes considerably longer in debug mode. (However, normally the code should not throw exceptions that often.)
AFAIK, debug="true" does not cause the situation you mention.
I faced the same problem with an ASP.NET application that created images on the fly, so I suspect you have a problem with undisposed resources.
If you deploy your .aspx files with code-behind files to the server, each page is compiled once when the first request for it arrives, and the result is cached until the file changes.
It absolutely could affect memory; just look at some of the perfmon counters and run a comparison with both configurations.
If your site has a lot of files, I would be more concerned with disk I/O in the ASP.NET temp folder.
Couple Questions...
Do you have a lot of files in your App_Code?
Are you allowing the site to be updatable or are you publishing it?
If updatable, is the site updated frequently, or is there a deployment process?
What is the hardware configuration?
Why not utilize multiple configurations?
Web.Debug.Config - Have debugging turned on
Web.UAT.Config - Whatever your preference
Web.Release.Config - Have Debugging turned off
This way you can minimize configuration regressions, such as a developer checking in a web.config with debug="true"
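The approach above can be sketched with a standard XDT transform; a Web.Release.config along these lines removes the debug attribute whenever you publish with the Release configuration:

```xml
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.web>
    <!-- Strips debug="true" from web.config during a Release publish -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```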
On production systems, always set debug="false". As the flag suggests, it should only be set to true when debugging a development system.
This flag has nothing to do with your memory fragmentation problem.
Related
I am working on a .NET WebForms application, and I have observed that after every build the very first page load takes longer than usual, even if I wait a while after building before loading a page. Is there a way to improve developer productivity by having IIS/.NET initialize things right after the build instead of on the first page load?
Yes you can, like this.
Quoting:
You can use Application Initialization Module which comes in-box with IIS 8.0, like this:
<applicationInitialization doAppInitAfterRestart="true">
  <add initializationPage="/" />
</applicationInitialization>
This will send a request to the root of your app (initializationPage="/") every time your app starts automatically.
You can also configure the Start Mode for your application pool to Always Running, which means that every time IIS restarts it will start your application pool immediately (found by right-clicking your application pool, then Advanced Settings).
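The same settings can also be made directly in applicationHost.config; a sketch, where the pool and site names "MyAppPool"/"MySite" are placeholders:

```xml
<!-- applicationHost.config (IIS 8.0+) -->
<applicationPools>
  <!-- Start the worker process as soon as IIS starts, not on the first request -->
  <add name="MyAppPool" startMode="AlwaysRunning" />
</applicationPools>
<sites>
  <site name="MySite">
    <!-- preloadEnabled works together with the Application Initialization module -->
    <application path="/" preloadEnabled="true" applicationPool="MyAppPool" />
  </site>
</sites>
```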
Professional servers have hardly any latency, though it requires quite a bit of tweaking. Also, by default, applications recycle regularly on IIS (as well as when some kinds of exceptions occur, when some files are changed, or when some thresholds are reached). Professional web application hosting is anything but simple :) You might get help with that on Server Fault, perhaps.
Another option is to avoid mixing pre-compilation and JIT-compilation - if you only pre-compile, you don't need to do any compilation when the application is deployed, resulting in faster startup times. If you only deploy sources, the application domain doesn't need to be torn down when you make a change, which means that only the change needs to be recompiled, which is much faster.
And of course, ASP.NET Core is much, much faster in both scenarios - it can do the whole compilation in-memory, unlike the legacy system which uses csc to build multiple assemblies, save them to disk, load them from disk, merge them together, save that again, just to load it again and initialize.
I am getting pretty frustrated by debugging, but maybe I am just doing it wrong.
When I am actively developing it is extremely cumbersome to write some code, fire up the debugger to test said code, wait a minute for the debugger to start, look at the page in the browser, stop the debugger, edit the code, rinse, lather, repeat.
I can get around that by using CTRL-F5 and CTRL-SHIFT-B during development but I lose all the benefits of the debugger.
Is there a better way to use the debugger, or something else I can do to get quick rebuilds and use of the debugger?
Thanks,
Kyle
P.S. I/we do write unit tests, but you also need to test your app in the browser so please no "you shouldn't have this problem if your unit tests were written properly." comments ;)
Update
Thanks for the "get a better machine" suggestions. Not much I can do there. Loads of RAM and an Intel SSD. I shouldn't need more than a $2500 machine to code a web app.
Debug fewer times: If you are stopping the debugger to change values or test different scenarios then don't. While debugging you can change the values of the variables using QuickWatch or the Immediate Window.
Make debugging less costly: turning off batch compilation will make your first page load faster, since ASP.NET will no longer precompile all of your pages and user controls up front. This is good for development if you are making changes quite often.
<compilation ... batch="false"> ...</compilation>
You should take a look at this post (tweeted by Scott Guthrie):
Slash your ASP.NET compile/load time without any hard work
http://blog.lavablast.com/post/2010/12/01/Slash-your-ASPNET-compileload-time.aspx
Summary
Get better hardware (Big impact)
Store your temporary IIS files on your fastest disk or a RAM disk e.g. <compilation ... tempDirectory="q:\temp\iistemp\"> ... </compilation>
Restructure your projects
Selectively build the necessary projects
Review a few magical settings (Most impact)
<compilation ... batch="false"> ...</compilation>
<compilation ... optimizeCompilations="true"> ... </compilation>
Get an SSD and boat-loads of RAM.
Maybe what you need is not to debug faster, but to reduce the amount of times you need to debug. Perhaps a more liberal Debug.* or trace logging approach would help.
@Kyle West - Well... there are a bunch of different ways you can go about it. The approach that works best for me is to use the MS Enterprise Library Logging app block (main site) to log events to a rolling daily file. The log level can be ratcheted up (verbose detail) or down (exceptions only), just by editing the .config file.
There is a lot in the app block, so we created a wrapper around the logging calls so that we can more easily make the calls that matter. For example,
DebugEvent.Log(String.Format("the value of _myVariable is {0}", _myVariable))
InfoEvent.Log("Reached the entry to the gatesOfHell method")
ExceptionEvent.Log(ex)
The nice thing about EL is you can change the config without having to change code. So if you want to log to the event log or even email, its just a few lines of configuration.
You can also substitute any other logger (log4Net, etc), or use the built in Debug or Trace in a way that is useful for you.
The statement "Really I just want to see any exceptions that don't bubble up to the UI." is a bit worrisome, and implies that exception swallowing or some similar poor practice is happening. (That's a whole other bucket of roosters, and may be the reason why you have to debug so much.)
Well, there are tools like Watin, which allow you to script browser interaction, but I don't think that's really what you want.
I guess the answer here is "Get a faster machine"...
I have an ASP.NET 3.5 website running under IIS7 on Windows 2008.
When I restart IIS (iisreset), then hit a page, the initial startup is really slow.
I see the following activity in Process Explorer:
w3wp.exe spawns, but shows 0% CPU activity for about 60 seconds
Finally, w3wp.exe goes to 50% CPU for about 5 seconds, and then the page loads.
I don't see any other processes using CPU during this time either. It basically just hangs.
What's going on during all that time? How can I track down what is taking all this time?
We had a similar problem and it turned out to be Windows timing out checking for the revocation of signing certificates. Check to see if your server is trying to call out somewhere (e.g. crl.microsoft.com). Perhaps you have a proxy setting incorrect? Or a firewall in the way? We ultimately determined we had enough control over the server and did not want to 'call home', so we simply disabled the check. You can do this with .NET 2.0 SP1 and later by adding the following to the machine.config.
<runtime>
  <generatePublisherEvidence enabled="false"/>
</runtime>
I am not sure if you can just put this in your app.config/web.config.
IL is being converted into native machine code by the just-in-time (JIT) compiler, and you get to wait while all the magic happens.
When compiling the source code to managed code, the compiler translates the source into Microsoft intermediate language (MSIL). This is a CPU-independent set of instructions that can be efficiently converted to native code. MSIL is produced as the output of a number of compilers and is the input to a just-in-time (JIT) compiler. The Common Language Runtime includes a JIT compiler for the conversion of MSIL to native code.
Before MSIL can be executed, it must be converted by the .NET Framework just-in-time (JIT) compiler to native code: CPU-specific code that runs on the same computer architecture as the JIT compiler. Rather than using time and memory to convert all of the MSIL in a portable executable (PE) file to native code at once, it converts the MSIL as needed during execution, then caches the resulting native code so it is accessible for subsequent calls.
source
That's the compilation of ASP.NET pages into intermediate language plus JIT compilation - it only happens the first time the page is loaded. (See http://msdn.microsoft.com/en-us/library/ms366723.aspx)
If it really bothers you then you can stop it from happening by pre-compiling your site.
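As a sketch, pre-compilation uses the aspnet_compiler tool that ships with the .NET Framework; the site name and paths below are placeholders:

```shell
rem Precompile the site in place (IIS metabase path form)
aspnet_compiler -v /MySite

rem Or precompile from a physical path into a separate target directory
aspnet_compiler -v / -p C:\inetpub\wwwroot\MySite C:\precompiled\MySite
```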
EDIT: Just re-read the question - 60 seconds is very long, and you would expect to see some processor activity during that time. Check the event log for errors/messages in the System and Application sections. Also try creating a crash dump of the w3wp process during those 60 seconds - there is a chance you might recognize what it's doing by looking at some of the call stacks.
If it takes exactly 60 seconds each time, then it's likely waiting for something to time out - 60 seconds is a nice round number. Make sure it has proper connections to the domain controllers, etc.
(If there are some IIS diagnostic tools that would do a better job then I'm afraid I'm not aware of them, this question might be more suited to ServerFault, the above is a much more developer-ish approach to troubleshooting :-p)
I found that there was a network delay making an initial connection from the front end web server to the database server.
The issue was peculiar to Windows 2008 and our specific network hardware.
The resolution was to disable the following on the web servers:
Chimney offload state
Receive window auto-tuning level
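Assuming Windows Server 2008, both settings can be disabled from an elevated command prompt with netsh (verify against your own NIC vendor's guidance first):

```shell
netsh int tcp set global chimney=disabled
netsh int tcp set global autotuninglevel=disabled
```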
Greater than 60 seconds sounds fishy. Try running a test.html page to see how long that takes. That will isolate IIS7's role.
Then temporarily rename your web.config, global.asax and application folders and try a test.aspx page (very simple page). That will isolate ASP.NET.
If both of those are fast (i.e. about 10 seconds), then it's your application. But, if either are slow then not the application and something with the server itself.
This has nothing to do with JIT compiling. The normal C# compiler compiles your code-behind files (.aspx.cs) into an intermediate-language assembly at startup if that assembly doesn't exist or the code files have changed. Your web site assembly is located in the "bin" folder of your web site.
The JIT compiling in fact occurs after that, but it is very fast and won't take several minutes. JIT compiling happens on every startup of a .NET application and won't take more than a few seconds.
You can avoid the compiling of your web site if you deploy the already-compiled website assembly (YourWebsite.dll) into the bin folder. It is also possible to deploy only the .aspx files and leave the code-behind files (.aspx.cs) out.
I've just been battling a similar issue. For me it turned out to be that I had enabled internal logging for NLog. It added about 3 minutes to the startup time!
Original config
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
autoReload="true"
throwExceptions="false" throwConfigExceptions="false"
internalLogLevel="Debug"
internalLogFile="C:\Temp\NLog.Internal.txt">
Fixed Config
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
autoReload="true"
throwExceptions="false" throwConfigExceptions="false">
For info: I discovered this by using Sysinternals' ProcMon.exe, filtering on the process name "w3wp.exe".
During our build process we run aspnet_compiler.exe against our websites to make sure that all the late-bound stuff in ASP.NET/MVC actually builds (I know nothing about ASP.NET but am assured this is necessary to prevent finding the failures at runtime).
Our sites are fairly large in size, with a few hundred pages/views/controls/etc. however the time taken seems excessive in the 10-15 minute range (for reference, this is longer than it takes the entire solution with approx 40 projects to compile, and we're only pre-compiling two website projects).
I doubt that hardware is the issue as I'm running on the latest Quad core Intel chip, with 4GB RAM and a WD Velociraptor 10,000rpm hard disk. And part of what's odd is that the EXE doesn't seem to be using much CPU (1-5%) and doesn't seem to be doing an awful lot of I/O either.
So... is this a known issue? Why is it so slow? And is there any way to speed it up?
Note: To clarify a couple of things people have answered about, I am not talking about the compilation of code within Visual Studio. We're using web application projects already, and the speed of compilation of those is not the issue. The problem is the pre-compilation of the site after these projects have already been compiled (see this MSDN page for more details) as part of the dev build script. We are performing in-place pre-compilation, not copying the files to a target directory.
Switching to the Roslyn compiler will most likely improve precompile time significantly. Here is a good article about it: https://devblogs.microsoft.com/aspnet/enabling-the-net-compiler-platform-roslyn-in-asp-net-applications/.
In addition, make sure that batch compilation is enabled by setting the batch attribute to true on the compilation element.
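A minimal sketch of that compilation element in web.config:

```xml
<configuration>
  <system.web>
    <!-- batch="true" (the default) compiles many pages per assembly
         instead of one assembly per page -->
    <compilation batch="true" />
  </system.web>
</configuration>
```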
Simply, the aspnet_compiler uses what is effectively a "global compiler lock" whenever it starts pre-compiling any individual aspx page; it is basically only allowed to compile each page sequentially.
There are reasons for this (although I personally disagree with them) - primarily, to detect and prevent circular references causing an infinite loop of sorts, and to ensure that all dependencies are properly built before the page that requires them is compiled; this avoids a lot of "nasty CS issues".
I once started writing a massively forked version of aspnet_compiler.exe the last time I worked at a web company, but got tied up with "real work" and never finished it. The biggest problem is the ASPX pages: you can parallelize the hell out of the MVC/Razor stuff, but the ASPX parse/compile engine is about 20 levels deep in internal and private classes/methods.
Compiler should generate second code-behind file for every .aspx page, check
During compilation, aspnet_compiler.exe will copy ALL of the web site files to the output directory, including css, js and images.
You'll get better compilation times using Web application project instead of Web site model.
I don't have any specific hot tips for this compiler, but when I have this sort of problem, I run ProcMon to see what the process is doing on the machine, and I run Wireshark to check that it isn't spending ages timing-out some network access to a long-forgotten machine which is referenced in some registry key or environment variable.
Just my 2 cents.
One of the things slowing down ASP.NET views precompilation significantly is the -fixednames command line option for aspnet_compiler.exe. Do not use it especially if you're on Razor/MVC.
When publishing the web app from Visual Studio, make sure you select "Do not merge", and do not select "create separate assembly", because this is what causes the global lock and slows things down.
More info here https://msdn.microsoft.com/en-us/library/hh475319(v=vs.110).aspx
Basically, what I'm wondering is if I need to set debug="false" before hitting the "Publish Web Site" button or if I can switch it after all the files have been published.
You do not have to turn that setting off, however, you will want to set debug="false" before running the website as a production application. It will have a profound impact on your site's performance.
As to what Ryan wrote - see debug code in production.
Another option you may want to use is retail="true".
You can keep it set to true when you publish/precompile, but once the site reaches production it is strongly recommended that you set the value to false; the reasons are outlined here by Scott Guthrie (he manages the ASP.NET team) himself.
Highlights from Scott's post:
Doing so causes a number of non-optimal things to happen including:
1) The compilation of ASP.NET pages takes longer (since some batch optimizations are disabled)
2) Code can execute slower (since some additional debug paths are enabled)
3) Much more memory is used within the application at runtime
4) Scripts and images downloaded from the WebResources.axd handler are not cached
On the production server you could put <deployment retail="true"/> in machine.config. This ensures that debug is always false for that server. Details here.
Unfortunately, no. You can publish it with the option set to true, although you certainly should not (if the site is going into production). I initially publish apps to a test environment with it set to true, then set it to false in the test environment, and it is absolutely false in production.
But your site will build and publish fine to whatever environment you send it to with debug set to true. As I understand it, there are quite a few sites out there in production with this setting enabled. Dror posted a great link to the issues that come with leaving this option on.
I hope this adds some value. I have been learning a lot about MSBuild, and what an incredible tool it is!
If you publish sites to production, there are probably steps you would like to automate. If you want to use a batch file instead of the Visual Studio interface, or to include this in your build scripts, the following will publish your website after it has compiled successfully:
msbuild <yourProjectFile>.csproj
  /target:"ResolveReferences;_CopyWebApplication"
  /p:debug=false;retail=true;WebProjectOutputDir=<YourCorrectOutputDir>;OutDir=<YourCorrectOutputDir>\bin\
The time we have invested in fairly extensive build scripts - which even run a few basic HTTP GET requests against the site after building and publishing it - has spared us a huge amount of frustration.
I am a firm believer in automating as much as possible; machines don't seem to forget to do things as often as I do :D
Hope it helps
Cheers
Rihan