ASP.NET website first start is very slow

The first time I load the website on the production web server, it starts very slowly; subsequent pages load quickly (including the home page).
I precompiled the site, but nothing changed.
I don't have any code in Application_Start.
I don't have cached items.
Any ideas? How can I find out what is happening?

It's just your app domain loading up and pulling the binaries into memory. It also initializes static variables at that point, so if you have a static variable that loads a lot of data from the database, the first request can take a while.
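As a hedged illustration (the class and member names here are hypothetical), this is the kind of static initializer that silently runs inside the first request:

    using System.Collections.Generic;

    // The static initializer runs the first time the type is touched,
    // so an expensive load here is paid during the first request
    // after an app-domain recycle.
    public static class ProductCache
    {
        public static readonly IList<string> All = LoadAllFromDb();

        private static IList<string> LoadAllFromDb()
        {
            // Stand-in for a slow database query
            return new List<string> { "example" };
        }
    }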

When you published the site, did you choose to make the website "updatable" in the publish settings or not? If I remember correctly, the .aspx/.ascx files need to be compiled as well, and if they are "updatable" then the first start will cause a recompile of those resources.

Have you turned on tracing in your web.config?

Try clearing your event log?

Use http://www.iis.net/expand/ApplicationWarmUp to warm up your app.
This is for IIS 7.5, so it will work if you are running on Windows Server 2008 R2.

Make sure you publish your application in 'release' and not 'debug'. I've noticed this decreases loading time considerably. The web.config file will be updated.
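Concretely, the switch that matters is the debug attribute on the compilation element in web.config; a release publish should leave it off:

    <!-- web.config: debug="false" skips debug-mode compilation and its overhead -->
    <system.web>
      <compilation debug="false" />
    </system.web>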

This sounds very much like background compiling; though if you're precompiling, that shouldn't be an issue.
First thing I would look at is your ORM (if any). NHibernate, in particular, has a serious startup penalty, as it runs multiple compilers in the background at startup to turn each class in your data layer into its own in-memory assembly.

Just a quick nod to Darren. That's typical behavior of a .NET app after a DLL update is made. After the initial load, everything should zip along just fine.

When you say you "precompile" the site, are you using the aspnet_compiler utility, or simply using the "Build site" option in Visual Studio?
If you are not using the former, I recommend giving it a spin. Coupled with Web Deployment Projects, you should have an easier time deploying your site for each release.
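For reference, a typical invocation looks something like this (the virtual path and the directories are placeholders for your own site):

    rem Precompile the site in C:\Source\MySite into C:\Output\MySite
    aspnet_compiler -v /MySite -p C:\Source\MySite C:\Output\MySite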

The initial slowness comes down to a few things:
The app domain is being set up.
ASP.NET is parsing and compiling the ASPX pages.
Global contexts are being initialized.
This is normal behavior for ASP.NET.

#Mickey: No, it is turned off. Do I need to turn it on to find out?
The trace log will show you how long each action takes. It could help you find what is taking so long.
Here is a link that might help you get it set up.
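For reference, tracing is a one-line addition to web.config; localOnly="true" keeps the output away from remote visitors, and the collected traces can then be viewed at /trace.axd:

    <system.web>
      <trace enabled="true" localOnly="true" pageOutput="false" />
    </system.web>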

Related

ASP.NET Core 1.0 (aka ASP.NET 5) website refresh on any file change?

Looking for functionality similar to what browsersync gives Node applications: automatically reloading the browser on any file change.
I'm running ASP.NET Core (aka ASP.NET 5) with dnx-watch, and it restarts the Kestrel web server on any C# code change, but I still have to refresh the browser manually to see the changes, whether they are client- or server-side. I'm using Gulp for the build pipeline and am thinking of using it to drive both dnx-watch and the browser reload, but I can't find any example online.
Would love some help on this.
Thanks
There's no official support for your scenario, sorry!
However, it's interesting and I would like to have it at least on the backlog. Can you please open a request at https://github.com/aspnet/dotnet-watch ?
1) It is possible to just use gulp and browsersync. It works well and is fast, but it's a bit tricky because you have to start IIS Express first and use browsersync in proxy mode (see the sketch below).
2) A much better solution is the Visual Studio extension Browser Reload on Save, made by Mads Kristensen, a member of the ASP.NET team.
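For option 1, a minimal gulpfile sketch might look like the following; the port and the watch globs are assumptions, so adjust them to your project:

    // gulpfile.js -- minimal sketch for gulp 3.x + browser-sync
    var gulp = require('gulp');
    var browserSync = require('browser-sync').create();

    gulp.task('serve', function () {
        // Proxy the already-running IIS Express / Kestrel server
        browserSync.init({ proxy: 'http://localhost:5000' });

        // Reload the browser whenever views or static files change
        gulp.watch(['wwwroot/**/*', 'Views/**/*.cshtml'])
            .on('change', browserSync.reload);
    });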

Best approach for deploying an ASP.NET website

Just wondering what the best option is for deploying an ASP.NET website. At the moment I just place the code in a folder on the server and create a virtual directory in IIS referring to this folder. Then I open the website in VS2008 on the server and build it. Though it works fine for me, I am not sure whether I am following the best approach for deployment.
Thanks.
There's a wealth of opinion on this across the internet, and it is all opinion. To an extent it's down to you and your team (if you have one); if your approach is working for you then I don't see any huge reason to change, but I would suggest that you at least have a staging site where you can deploy the code for user testing before it goes to production.
That said, running VS on the server isn't great (and means you need another VS license, so it could be a waste), and as VS includes a Publish option anyway, it's rather redundant. I use Publish for smaller sites and it works fine.
Publish from inside VS is a pretty powerful tool as it lets you do web.config substitution. Check out the Hanselman talk Web Deployment Made Awesome: If You're Using XCopy, You're Doing It Wrong
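A minimal example of that substitution, assuming the standard Web.Release.config transform file sitting next to web.config:

    <!-- Web.Release.config: removes debug="true" when publishing in Release -->
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <system.web>
        <compilation xdt:Transform="RemoveAttributes(debug)" />
      </system.web>
    </configuration>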
You have several options which are preferable to running Studio on the server.
Depending on your team size, you could:
publish right from VS
set up continuous integration; check out Cruise Control for info on that
use a combination of CI and file sync (i.e. CI to a test server, then xcopy to production)
I'd advocate for CI since you tend to find issues faster that way, but it assumes you are using good version tracking and testing practices. Copying files can have unintended consequences like missed files, outdated files being retained, etc.
When you deploy by copying uncompiled source like that, anyone who gains access to the web server (which may be beyond your control if it is hosted) can view and possibly even alter your .aspx pages.
One alternative, which you can use from within Visual Studio, is to compile everything into a binary. You do that by choosing the menu Build > Publish and unchecking "Allow this precompiled site to be updatable." The downside, of course, is that even the tiniest change in a page's HTML will require recompiling the code and redeploying it.
It's a clear tradeoff between security and manageability, but precompilation can also aid in performance. Here is one explanation of precompilation alternatives.
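The command-line equivalent, for what it's worth: aspnet_compiler produces a non-updatable site by default, and only the -u switch keeps the markup editable (the paths below are placeholders):

    rem Non-updatable precompile; add -u if you want the .aspx markup to stay editable
    aspnet_compiler -v /MySite -p C:\Source\MySite C:\Deploy\MySite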
You might also consider the suggestions made in Key Configuration Settings When Deploying a Web Application. In a nutshell,
If you are deploying your web application to a machine that you have control over, such as a web server within your company's intranet or a dedicated web server at a web host provider, you can use the <deployment> element in machine.config to force all applications on the web server to adhere to the recommendations provided above (namely, using a custom error page, disabling output tracing, and not compiling the auto-compiled code in debug mode). Simply add the following markup to the machine.config file within the <system.web> element:
<deployment retail="true" />
Again, this is a pretty simple change to make.
On a project I work on, we originally built on a dev machine and zipped and copied the contents of the 'bin' directory across (unzipping, creating a site in IIS, etc.).
Later, when we had the time, we went for this approach:
Creating Windows installers in VS2008.
This has worked really well, as (literally) anyone is capable of doing the deployment. The real beauty is that the installer can account for the whole setup; at heart it is just a fancy way of wrapping the process of copying the 'bin' directory across...
Food for thought I hope.
Dave

What is the advantage of ASP.NET precompilation?

How useful is it to use Aspnet_compiler.exe instead of conventional Publish via Visual Studio? And what about resource (resx) files?
Precompilation, as opposed to a simple xcopy deployment, gives you two main advantages:
The filesystem will not have all the code in .aspx files, and all the code-behind is compiled into an assembly.
There is no ASP.NET compilation delay the first time you visit a page after the server starts up.
Having said that, my precompilation knowledge is a bit rusty these days; it's been a while since I last touched it.
By precompiling the site, your server won't have to compile it on the first visit. You have probably noticed that the first time you view an ASP.NET page there is a noticeable delay.
In addition, you don't have to ship all your source files, since the code is already compiled. This can be useful if you don't trust whoever is hosting your pages.
Visual Studio's "Publish" feature is actually just a nice frontend to aspnet_compiler.exe. Publish has the advantage of being very easy to execute, where aspnet_compiler.exe requires some tweaking to get the results you're after.

Why does JavaScript not work on my site under an existing virtual directory?

I deployed my ASP.NET application under an existing virtual directory. The new deployment has some features that use JavaScript. The new features are not working.
If I deploy this build under a new virtual directory, the features using JavaScript are working.
I restarted the IIS Admin service. The problem continues.
What could be going wrong here?
Since JavaScript runs on the client, and not on the server, I doubt that IIS, per se, has anything to do with your problem.
What have you done to attempt to diagnose the problem? Have you looked at the network interaction between the browser and the server? Perhaps some script files are not being found.
Have you turned on any debugging tools (for instance, Firebug, or the F12 developer tools in IE8)? You may be getting errors you don't know about.
Sounds like it could be a caching issue on the browser.
Is the code that calls the JavaScript routines being generated dynamically? If so, it might be a path assumption; your description was a bit vague. For instance, in ASP.NET you should use "~" to represent the application's root path, which can change between deployments. If you have code that just refers to "/" or another hard-coded path (perhaps the second attempted path), then perhaps it's just a bad assumption (see the example below)?
Please provide more specifics. There are a hundred possible scenarios that fit your description.
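As a hedged example of the "~" approach on an .aspx page (the script path is made up):

    <%-- ResolveUrl expands "~" to the application root, whatever the virtual directory is --%>
    <script src='<%= ResolveUrl("~/Scripts/site.js") %>' type="text/javascript"></script>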
Check the application pool in IIS Manager and the project's target framework in Visual Studio, and try to make them match.
If the JavaScript features are not working after a deployment, it may be because the browser is executing an already-cached copy of the script. In that case, try changing the JavaScript file's src by appending a version query string (the file name here is just an example).
From this:

    <script src="Scripts/site.js" type="text/javascript"></script>

To this:

    <script src="Scripts/site.js?v=2" type="text/javascript"></script>

This method should force your browser to load a new copy of the JS file.

Why does ASP.NET re-compile (re-JIT) everything when only one thing changes?

I have an ASP.NET 2.0 application (installed on IIS 6.0 from an MSI) which was compiled as a "web site", and precompiled/packaged using a web deployment project, in Visual Studio 2005. (I have put in a request to the developers to consider changing to a web application for the next version, but it won't change for this version).
Whenever the application is recycled (e.g. a change is made to the web.config), on first hit, ASP.NET JITs the application. As part of this, it takes all the assemblies required for the login page and compiles them into native code in the Temporary ASP.NET Files 'assembly\dl3' directory, which takes between 20 and 60 seconds. This only happens on a recycle, which happens infrequently — but when it does, it causes the page to take much longer to load, and I believe it may be possible to optimize this.
There appear to be 122 DLLs that it needs to consider, some of which are the precompiled code-behind, others are third party components for the web site (for example, NHibernate.dll, reporting components, etc.)
Why does it recompile/re-JIT everything? Why does it not detect that most of the assemblies have not changed, and not attempt to change them? Can I prove it's not batch compilation that is causing the problem? (I have <compilation debug="false"> set in the web.config.)
Other questions suggest NGEN might be useful but I read it's not possible to use it on ASP.NET 1.x; we are using 2.0 and I can't find a clean answer either way.
From my personal experience, a slow recycle is often caused by NHibernate/ActiveRecord if you have lots of entities. See http://nhibernate.info/blog/2009/03/13/an-improvement-on-sessionfactory-initialization.html for an explanation and a possible solution.
Are you running IIS? I'm fairly certain that if you restart your site in IIS, it will pick up any config changes without recopying the DLLs.
You may be able to improve your recycle time by installing common DLLs that change infrequently -- such as NHibernate or reporting tools -- into the GAC. That should prevent them from being re-JITted.
How to: Install an Assembly into the Global Assembly Cache
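For example (this assumes the assembly is strong-named, which GAC installation requires):

    rem Install a strong-named assembly into the GAC (run from an SDK command prompt)
    gacutil /i NHibernate.dll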
It's strange that merely copying the DLLs takes 20 seconds. I would suggest doing another check to make sure where the bottleneck really is.
How can you be certain that everything is in the proper state without recycling/resetting the AppDomain (or whatever happens)? Imagine that you have something in application start (global.asax) which sets the value of a static field based on a config value. Unless you reset the entire AppDomain, you cannot be sure.
Another reason: there is no way to unload a .NET DLL once it's loaded, so the app domain has to be recreated when something is updated.
