I know there are several questions here asking "Why should I use release mode?". The problem I have with the answers is that they simply assert, quite strongly, that you should always use release mode when a website is in production.
Why?
I understand that the code in the assemblies is optimised, but to what level? Will it optimise code that is already well written? What kinds of optimisations does it perform?
Are there any analyses regarding this? Is there any way I can test the differences between debug and release?
I would really like someone who understands the why of this to at least provide a reference to some definitive reading material, as I have yet to find anything hard enough to satisfy my curiosity on this issue.
Read this first: http://blogs.msdn.com/tess/archive/2006/04/13/575364.aspx. I just found it while answering this question, and it's a great article.
See this question: At what level C# compiler or JIT optimize the application code? for some info on general compiler optimizations.
Also, keep in mind that for an ASP.NET web application, switching to release mode will compile the assemblies in release mode, but for the page compilations you may also need to edit the debug attribute of the compilation element in your web.config:
<compilation defaultLanguage="c#" debug="true">
Web applications do strange things when debug=true is set, for example they do not honor request timeouts because it would interfere with debugging.
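For production, that attribute should be flipped to false. A minimal sketch of the release-mode setting (the defaultLanguage attribute is simply carried over from the example above):

    <compilation defaultLanguage="c#" debug="false">

With debug="false", ASP.NET compiles pages with optimizations enabled and honors request timeouts again.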
Here is a great article from Scott Guthrie on the subject: Don't run production ASP.NET Applications with debug="true" enabled
MSDeploy - you know, the next "big" and "enterprisey" thing from Microsoft.
Is it just me, or is it really hardly usable by humans? The main pain points are (IMO):
Absolutely cryptic syntax (-skip:objectName=filePath,absolutePath=App_Offline.* just for skipping App_Offline.html - see the sketch after this list)
Manifest as an afterthought
Lack of thorough documentation
Not a word about extensibility (except for several blog posts out there). Moreover, all these extensions, developed with great pains, have to be registered in the GAC and the registry
Waaay too low-level (metadata/metakey; all this IIS jazz)
No integration with MSBuild
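To give a feel for the syntax, here is a hypothetical sync command using that skip rule (the paths and server names are illustrative, not from the original post):

    msdeploy -verb:sync ^
             -source:contentPath="C:\inetpub\wwwroot\MySite" ^
             -dest:contentPath="\\webfarm01\wwwroot\MySite" ^
             -skip:objectName=filePath,absolutePath=App_Offline.*

Every rule rides on the same -name:value1,value2 comma-separated convention, which is where much of the "cryptic" feel comes from.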
Granted, MSDeploy and MSDeployAgent are quite powerful, but do they really need to be that complex for relatively simple tasks?
I too share your frustrations over the lack of documentation and the apparent low-level nature of this tool.
However what MS has done is finally create a free tool with which you can actually script whole server deployments, including parameterising addresses, configurations etc. This is unfortunately a very complicated thing to do - given how many bits of configuration actually go into a web server - and this is probably the best way to do it all.
What we need now is a really good GUI that can help build up these packages, and scripts etc. The GUI that is embedded within IIS is good - but again, short on explanation - so hopefully soon that'll be addressed.
On the functional side, I'm using it at the moment to deploy a site from dev -> staging -> live, with parameters to change bound IP addresses etc. I was deeply frustrated that it took me a few days to get it all working; however, now that I have it, I can remove a lot of the possibility of human error on the IT Support side - who are responsible for our deployments. I now only have the configuration of my master staging server to worry about, and can be sure that all the servers in the web farm will be kept in sync whenever I deploy.
As Sayed mentions as well, there are MSBuild tasks in 2010 (the Website Deployment feature is now implemented using msdeploy) to work with this, which also brings the possibility of a true Continuous Integration environment to VS Team System: having a team build that can actually perform a full web deployment as its last step is very exciting (and scary, granted!).
Actually there are MSBuild tasks for MSDeploy. They will be shipped with .NET 4/Visual Studio 2010.
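In VS 2010, for instance, packaging a web application through those tasks is a one-liner (the project name is illustrative):

    msbuild MyWebApp.csproj /t:Package /p:Configuration=Release

This produces a Web Deploy package that msdeploy (or the generated .deploy.cmd script) can then push to a server.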
Although a bit rough around the edges, I've come to like MSDeploy quite a bit. Using it to sync web servers in a farm is very useful, as it is efficient (it only copies changes) and takes care of actual IIS settings in addition to content files. It seems like MSDeploy is a building block for various scenarios and uses. Also, as previously mentioned, there is an MSBuild task for MSDeploy in .NET 4. I've taken advantage of this MSBuild task to make deployment of my web applications from TeamCity trivially easy. I've blogged about it here:
Web Deploy (MS Deploy) from TeamCity - http://www.geekytidbits.com/web-deploy-ms-deploy-from-teamcity/
I have recently started implementing a deployment pipeline, and I found the links below quite useful.
MSBuild commands I used for Continuous Integration:
http://www.troyhunt.com/2010/11/you-deploying-it-wrong-teamcity_24.html
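Along the lines of Troy Hunt's post, a CI build that compiles and publishes in one step looks roughly like this (server name, site name and credentials are placeholders; this is a sketch of the VS 2010-era Web Publishing Pipeline properties, not a command lifted from the article):

    msbuild MyWebApp.csproj /p:Configuration=Release ^
            /p:DeployOnBuild=true ^
            /p:MSDeployServiceURL=https://deployserver:8172/msdeploy.axd ^
            /p:DeployIisAppPath="MyWebSite" ^
            /p:MSDeployPublishMethod=WMSVC ^
            /p:AllowUntrustedCertificate=true ^
            /p:UserName=deployuser /p:Password=secret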
WebDeploy sync commands, I used for deployment packages to production server:
http://sedodream.com/2012/08/20/WebDeployMSDeployHowToSyncAFolder.aspx
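The folder-sync scenario from the Sedodream post boils down to a single command of this shape (paths are illustrative):

    msdeploy -verb:sync ^
             -source:dirPath="C:\builds\MyWebApp" ^
             -dest:dirPath="C:\inetpub\wwwroot\MyWebApp"

dirPath is the simplest provider; contentPath or iisApp can be substituted when IIS metadata should travel along with the files.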
I also used these references:
A video about MSBuild on dnrtv.com
The Microsoft Press book "Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build", a PDF version of which you can buy from O'Reilly
Finally, the book "Continuous Delivery" gave me good ideas about deployment pipelines; it does not focus on MSDeploy, but it is really worth reading.
The state of the documentation is typical of an MSFT 1.0 product; unfortunately, MSDN no longer has dedicated Developer Technology Engineers to fill the gaps - instead, there is a blind faith that the web will provide it.
I am actually considering dusting off my writing skills and write a short ebook on it since there is likely a market for it....
MSDeploy definitely has a touch of PowerShell about it: power over simplicity rather than "worse is better".
There is no Windows alternative to it; however, you can combine some of its powers to build automated deployments. For example:
Compile your solution with Team City and msbuild
Use msdeploy to transform your site and web.configs on the build server
Manually FTP a ZIP file of your site (it doesn't support FTP)
Alternatively, use its remote deploy capabilities (sketched after this list). This requires port 8172 to be open and a lot of security changes, and as far as I'm aware there are no concessions for load balancing
Use msdeploy on the live site to sync changes
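A hypothetical remote sync over the Web Management Service looks like this (server, site and credentials are placeholders):

    msdeploy -verb:sync ^
             -source:iisApp="MySite" ^
             -dest:iisApp="MySite",computerName=https://liveserver:8172/msdeploy.axd,userName=deployuser,password=secret,authType=Basic ^
             -allowUntrusted

This is the port-8172 scenario mentioned above; each server in a farm would need the same call unless a sync step fans it out.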
As a tool it's clearly aimed at service providers, as it's an enormous Swiss army knife. You can do all kinds of things to IIS with it, which for the most part is overkill for small businesses. I've no experience of large-scale IIS setups, so maybe that's where it shines.
95% of my time I program ASP.NET (MVC) web sites.
Should I care about MSBuild?
We use MSBuild with CruiseControl.NET to manage the builds of most of our big ASP.NET projects. For every commit by a member of the team, a build is launched. It helps us detect incompatibilities quickly, before moving a feature to "staging" or "production".
I think it is really useful when working with a team on the same ASP.NET project, or if you are working alone on a big project.
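For illustration, the CruiseControl.NET side of such a setup is a small block in ccnet.config; all names and paths here are made up:

    <project name="MyWebApp">
      <triggers>
        <intervalTrigger seconds="60" />
      </triggers>
      <tasks>
        <msbuild>
          <executable>C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe</executable>
          <workingDirectory>C:\builds\MyWebApp</workingDirectory>
          <projectFile>MyWebApp.sln</projectFile>
          <buildArgs>/p:Configuration=Release</buildArgs>
          <targets>Build</targets>
        </msbuild>
      </tasks>
    </project>

A <sourcecontrol> block (omitted here) combined with the interval trigger is what makes a build fire when new commits are detected.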
That depends on your development environment.
If you have other folks that do deployment of your systems, and they take care of the build and deployment environment, then MSBuild probably won't be necessary for your work.
On the other hand, if you need to configure the build script to understand special situations that your code comes up with, then you will definitely need to understand MSBuild scripts.
Even for a one-man shop, it's a useful tool to know, especially if you are configuring a continuous integration server like Hudson.
No. Until you have to.
It's not absolutely necessary to know MSBuild, but it is useful to know.
It might not be needed for every kind of project, but it is extremely useful when you are working on a huge code base with an automated custom build solution, nightly builds, developer builds, and so on.
It's unlikely, unless you choose to use it, or you start to make use of Team Foundation Server's Team Build.
Your development processes need to get to a certain complexity before automated builds really deliver their true value and/or if you find need for automatic deployment (including database changes if applicable).
The upcoming Visual Studio 2010 is going to make it far easier to use, but for now it retains a fairly steep learning curve, which you can avoid by using alternatives or commercial products (e.g. Visual Build Pro, FinalBuilder, etc.).
The nice thing is that it is part of the .Net framework, so it's already available as long as you have the framework installed (which it probably is).
So, in short: not really. It's something very useful and powerful, though, and setting up deployments using MSBuild can pay off handsomely.
What should a developer know about MsBuild?
Every developer should know it exists and know its basic capabilities. If you know it exists, you won't duplicate its features, and you will know what it can do for you when you need it.
Minimum:
As an exercise, build your project through the command line: msbuild myproj.sln
Know the role of continuous integration
A little more than minimum:
Hack your csproj (or vbproj) with a message task, so it outputs something during clean (see the sketch below).
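A minimal sketch of that exercise - AfterClean is a standard extension point in MSBuild's common targets:

    <!-- Add inside the <Project> element of your .csproj -->
    <Target Name="AfterClean">
      <Message Text="Clean just finished for $(MSBuildProjectName)." Importance="high" />
    </Target>

Run msbuild myproj.csproj /t:Clean and the message shows up in the build output.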
All done. When you need to know more, you'll figure it out.
When uploading my ASP.NET web application's .dll file to my website's /bin/ directory, are there any disadvantages to using the debug version as opposed to recompiling a release build?
For example, while working locally on the website, the build configuration is set to Debug. When things look good, I go ahead and upload the latest .dll for the website/webapp. Should I instead at that point switch the build configuration to Release, then compile, and then upload that version of the .dll to the server?
I'm hoping this question isn't a duplicate. I read a number of other questions with similar words in the subject, but didn't find anything directly related to my question.
Running with debug assemblies is a little heavier on performance, as they take up more memory, but usually not extremely so. You should only deploy a release build when it's really a "release". If you still anticipate some level of unexpected behavior in the wild, I'd consider using debug assemblies, so you will get more useful information from unhandled exceptions. The real performance "gotcha" is having debug="true" in your web.config.
A lot of it depends on what your individual needs are. In general, people frown on putting debug builds into production for performance reasons: the emitted code is not optimized and the assemblies carry debug symbols, which can slow down execution.
On the other hand, I've worked places where the policy was to put debug builds in production because it makes it easy to see line numbers, etc., when the code throws exceptions. I'm not saying I agree with this position, but I've seen people do it.
Scott Hanselman has a good post on doing a hybrid version of Debug and Release that could get you the best of both worlds here.
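The gist of that hybrid approach is to compile with optimizations on while still emitting PDB symbol files; in project-file terms that is roughly the following (a sketch, not Hanselman's exact settings):

    <PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
      <Optimize>true</Optimize>
      <DebugType>pdbonly</DebugType>
      <DebugSymbols>true</DebugSymbols>
    </PropertyGroup>

Deploy the PDBs alongside the assemblies, and stack traces keep their line numbers without giving up the optimized code.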
If you have a low volume website, you will never see the performance penalty of a Debug assembly in any measurable way. If you have high volume, look into other means of logging/instrumenting code instead.
On a high-volume website, you DO need to perform extensive stress and load testing to try very hard to break the application before it goes into production. I would do the first pass of that testing with Debug assemblies (since you probably WILL break stuff, and it will make it easier to see where). Then, repeat with the Release assemblies to make sure they behave the same way as the Debug ones.
http://weblogs.asp.net/scottgu/archive/2006/04/11/Don_1920_t-run-production-ASP.NET-Applications-with-debug_3D001D20_true_1D20_-enabled.aspx
Very few applications are going to see a significant difference in performance between release and debug builds. If you're running a small to medium sized application and you think there might be any bugs you haven't caught, use the debug build.
I've been bitten in the past by Page.RegisterClientScriptBlock-registered JS not being emitted in a stable order from machine to machine in the bad old .Net 1.1 days. Now, I'm writing a set of user controls that use <asp:ScriptManager/> to reference JS, and although I haven't had any problems so far - order always seems to be conserved between <asp:ScriptReference> tags - I'm feeling a bit shy about it. MSDN seems silent on the topic, and various bloggers seem to indicate ordering is stable in .Net 2.0+, but I haven't found any definitive reference.
Does one exist? Is the order of inclusion of scripts I observe on my development machine guaranteed to be the one I'll see in all other contexts the webapp runs?
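For reference, the pattern in question looks something like this (the script paths are made up):

    <asp:ScriptManager ID="ScriptManager1" runat="server">
      <Scripts>
        <asp:ScriptReference Path="~/Scripts/library.js" />
        <asp:ScriptReference Path="~/Scripts/uses-library.js" />
      </Scripts>
    </asp:ScriptManager>

The question is whether uses-library.js is guaranteed to be emitted after library.js everywhere, not just on one machine.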
Further analysis suggests that the answer is yes - at least in my particular production and development environments.
Is it better to use NGen on an ASP.NET application when we know it is not going to change much? Or is the JIT good enough?
The only reason I asked was because this article by Jeffrey Richter in 2002 says :
"And, of course, Microsoft is working quite hard at improving the CLR and its JIT compiler so that it runs faster, produces more optimized code, and uses memory more efficiently. These improvements will take time. For developers that can't wait, the .NET Framework redistributable includes a utility called NGen.exe."
NGen will only help startup time - it doesn't make the code execute any faster than it would after JITting. Indeed, I believe there are some optimizations which NGen doesn't do but the JIT does.
So, the main question is: do you have an issue with startup time? I don't know how much of an ASP.NET application's start-up time is JITting vs. other costs, btw... you should probably look at the JIT performance counters (e.g. "% Time in Jit" in Perfmon) to tell you how much time it's really costing you.
(In terms of availability, having multiple servers so you can do rolling restarts is going to give you much more benefit than a single server with an NGENed web app.)
NGen isn't the way to go for ASP.NET - the creation of the .dlls in the bin folder isn't the final step - they are compiled again, with the web/machine.config settings applied, into your C:\Windows\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files folder. Instead of NGen, to decrease initial load time, use the Publish Website tool or aspnet_compiler.exe.
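A minimal aspnet_compiler invocation for precompiling a site rooted at an IIS virtual path looks like this (paths are illustrative):

    aspnet_compiler -v /MySite -p C:\inetpub\wwwroot\MySite C:\precompiled\MySite

Dropping the target directory precompiles the site in place; adding -u keeps the precompiled output updatable.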
I'm not sure that NGEN's primary benefit is to start-up time alone - if the application's suffering from a high '% Time in JIT', this is listed as a potential remedy:
http://msdn.microsoft.com/en-us/library/dd264972(VS.100).aspx.
The discussion is closely related to a different question: how is JIT'd machine code cached and re-used?
I was looking into this tonight and came across the following:
The common language runtime cannot load images that you create with NGEN into the shared application domain. Because ASP.NET standard assemblies are shared and are then loaded into a shared application domain, you cannot use Ngen.exe to install them into the native image cache on the local computer.
http://support.microsoft.com/kb/331979
I'm not sure if this just refers to assemblies referenced from an ASP.NET app, or to the app itself.
NGen helps startup time. For example, the Entity Framework documentation says this about the benefit of running NGen:
Empirical observations show that native images of the EF runtime assemblies can cut between 1 and 3 seconds of application startup time.
The context is just the Entity Framework assemblies. NGenning other assemblies would provide additional speedup.
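For the assemblies where NGen does apply, the usage is a one-liner per assembly, run from the matching Framework directory (the path and assembly name are an example):

    %WINDIR%\Microsoft.NET\Framework\v4.0.30319\ngen.exe install EntityFramework.dll

ngen display lists what is already in the native image cache, and ngen uninstall removes an entry.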