How to find out why my website is slow to load - asp.net

I have an ASP.NET website that is hosted with https://www.blacknight.com/.
Whenever I browse to my website in my browser it can take up to 10 seconds to appear.
How can I determine where the bottleneck is?
How can I determine if it's the web host's fault or if it's due to my own website?
Are there tools I can use, etc.?

Frequently, when ASP.NET websites are deployed to a hosting provider in a shared environment, the application can go idle and be unloaded from memory. The resulting behavior is that the first visit to the website after it has been idle for some time triggers a JIT recompilation and restart of the web application before it can be served again.
Hosting providers typically offer some form of "always on" option for your website which prevents this behavior. Otherwise, in the case of infrequently visited websites, this behavior is par for the course.
Depending on your hosting provider, they may respect web.config entries that enable "Always On" in IIS. You can find more information on this approach here.
Hope this makes sense. It is incredibly common for lower-traffic websites in shared environments.
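For what it's worth, if you do administer the server yourself (which is usually not the case on a shared plan, where you would ask the host to enable this for you), "always on" roughly corresponds to the application pool's idle timeout and start mode. A minimal sketch using Microsoft.Web.Administration; the pool name is a placeholder:

using System;
using Microsoft.Web.Administration;

class AlwaysOnConfig
{
    static void Main()
    {
        // Assumes you have admin rights on the box and the
        // Microsoft.Web.Administration assembly; "MyAppPool" is a placeholder.
        using (var manager = new ServerManager())
        {
            ApplicationPool pool = manager.ApplicationPools["MyAppPool"];

            // Never unload the worker process just because it has been idle.
            pool.ProcessModel.IdleTimeout = TimeSpan.Zero;

            // IIS 7.5+: start the pool eagerly instead of on the first request.
            pool["startMode"] = "AlwaysRunning";

            manager.CommitChanges();
        }
    }
}

On a typical shared plan you won't have these rights, which is why the provider has to offer "always on" as an option.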

Well, just from looking at your code, you need to change some stuff.
First, load the scripts at the end of the body to allow the website to load faster.
One of your CSS files is referenced after the body tag; move that reference inside the head tag.
Also do a resource speed test. You can do this easily in Google Chrome: right-click, select Inspect Element, then select the Network tab. Press the record button and reload the page. This will show you how fast each resource loads. There are also tests on Pingdom which will show you how fast your website loads, give it a score, and compare it to other websites.
But this may not be the issue; it may just be your server. Consider a dedicated server with at least 10 Mbps of bandwidth. You can also put Cloudflare in front of the site, and consider using a CDN to serve your assets faster.
Good Luck

Related

Unable to debug some aspx pages in ASP application

I have a classic ASP website running on IIS. I opened it with VS 2015 (Open website in File menu) and saved the solution (when opening it it said this is a precompiled website - whatever that means). Then I attached to process to debug it.
Now, the breakpoints I placed are hit on some of the .aspx pages, but not on others. Any idea why this might be the case? I checked the web.config and it has the debug option set to true. Probably some PDB files are missing. People suggest rebuilding the website, but when I click Build or Rebuild Solution, the process completes immediately with success, so I doubt anything was actually recompiled.
I can modify the code of those pages and IIS recompiles them on the next request, but I'm not sure why the breakpoints don't get hit there. They obviously are hit once I put something like Debugger.Launch() in my code, but that's not what I want.
I'm no expert so I'd be grateful if you could help me out with this.
A precompiled website is one whose pages and code have been compiled ahead of deployment ("site precompilation") rather than on the first request to each page. This speeds up the first access to pages in your site, which matters for high-volume, popular sites that need to respond instantly when a customer visits. It also means the deployed site may not include source or debug symbols for every page, which can explain why breakpoints bind on some pages but not others.
Hope this is helpful.

ASP.Net / Umbraco Website has (initially) very high server response time

I've got this problem.
I launched an ASP.NET website with the Umbraco CMS on an ISP.
(It's just a very basic informational site, nothing special.)
When I visit the website, however, the first page load takes a long time, sometimes up to 20 seconds. Of course this is ridiculous.
Afterwards, I am able to navigate the site relatively quickly.
So every first page load is slow, then everything is OK, more or less.
Does anybody have any idea what the problem could be? Would it be IIS? ASP.NET?
IIS is probably configured to shutdown the application pool after N minutes of inactivity.
AFAIK, this is the default behaviour on IIS.
On the first request after the shutdown, IIS has to start the app pool again, which can take a little time. Umbraco may also load some data on startup, but I don't have any experience with Umbraco, so that's beyond my knowledge.
-sa
What do you mean by first page load?
Have you just done a build? If this is a web site project then .NET will compile and load the DLL on the first request. Then IIS will cache page outputs.
Do you have any large images on the page?
Essentially there are an infinite number of reasons. Have you used Firebug to determine where the load time is going?
Do you have a link?
You may want to look into a keep-alive service. There are many available that regularly poll your site to keep the application pool running and prevent the startup delay you are seeing. More information here and here.
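If you would rather roll your own than use one of those services, the idea is just a periodic HTTP request. A minimal sketch, run from a scheduled task on another machine; the URL and the 10-minute interval are placeholders:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class KeepAlivePinger
{
    static async Task Main()
    {
        var client = new HttpClient();
        while (true)
        {
            try
            {
                // Any cheap page on the site works; the point is simply to
                // keep the application pool warm.
                HttpResponseMessage response =
                    await client.GetAsync("https://www.example.com/");
                Console.WriteLine("{0:u} -> {1}", DateTime.Now, (int)response.StatusCode);
            }
            catch (Exception ex)
            {
                Console.WriteLine("{0:u} -> {1}", DateTime.Now, ex.Message);
            }

            // Ping more often than the pool's idle timeout (20 minutes by default).
            await Task.Delay(TimeSpan.FromMinutes(10));
        }
    }
}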

How to put an asp.net application into offline/maintenance mode?

I've developed my first web application which, surprisingly, is getting very popular.
Because the website is now live, I have a hard time making changes, for fear that some people are still logged in and using the application.
I wish to avoid having a duplicated instance of the web application for testing.
Is there any way to put the website into 'maintenance mode' with only me having access to it? Like redirecting to a page with some info, telling users it's in maintenance mode.
I wish to avoid having a duplicated instance of the web application for testing.
That's your problem right there. For anything but the most trivial sites, you should have a staging or development instance. You should be using source control and have a script to update the main instance.
You can simply drop a file called app_offline.htm in the root of your website and ASP.NET will automatically route all traffic to this page. This file can contain any HTML you wish indicating that your site is down for a short period due to maintenance.
For more information please read App_Offline.htm and working around the "IE Friendly Errors" feature:
The way app_offline.htm works is that you place this file in the root of the application. When ASP.NET sees it, it will shut down the app-domain for the application (and not restart it for requests) and instead send back the contents of the app_offline.htm file in response to all new dynamic requests for the application. When you are done updating the site, just delete the file and it will come back online.
This is the answer to your question:
http://www.codeproject.com/Tips/219637/Put-the-website-in-Maintanance-Mode-Under-Construc
There's no such built-in functionality in ASP.NET except app_offline.htm, which doesn't quite fit your needs because even you will be denied access to the site. You have to build it on your own, and this is best done at the router and load-balancer level rather than at the application level. Of course this will depend on your network architecture.
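If you do want to handle it at the application level, a rough sketch of the "build it yourself" idea is an IHttpModule that lets your own IP through and serves a maintenance page to everyone else. The appSettings keys, the IP check, and the maintenance.html path below are all assumptions for illustration, not an established pattern:

using System;
using System.Configuration;
using System.Web;

public class MaintenanceModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;

        // Hypothetical appSettings keys used to toggle the mode and allow one IP.
        bool maintenanceOn =
            ConfigurationManager.AppSettings["MaintenanceMode"] == "true";
        string allowedIp = ConfigurationManager.AppSettings["MaintenanceAllowedIp"];

        if (!maintenanceOn) return;
        if (app.Request.UserHostAddress == allowedIp) return;

        // 503 tells browsers and crawlers the outage is temporary.
        app.Response.StatusCode = 503;
        app.Response.AddHeader("Retry-After", "3600");
        app.Response.ContentType = "text/html";
        app.Response.WriteFile(app.Server.MapPath("~/maintenance.html"));
        app.Response.End();
    }

    public void Dispose() { }
}

You would also need to register the module in web.config (under system.web/httpModules or system.webServer/modules, depending on your IIS pipeline mode).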
Besides building a dev replica of your website to build patches and fixes on, couldn't you just announce a site closing for maintenance several days in advance? I'm not a web programmer, but you might want to look into what Hattrick, a popular online soccer management game, does for maintaining their site. They use a notification system on the homepage, after users sign in, that announces when maintenance will be taking place (usually late at night in Europe, where a large portion of the players and all the devs are located) and they close down the website for a couple of hours. When they take the site down they post a page, using the same style as the rest of the site, and provide an estimate of when it will be up and running again. Simple, elegant, and when coupled with the long forewarning it seems to do a good job placating the user base.
Give users a long heads up that planned maintenance is scheduled to take place and give them some idea what it is for and most people will be able to accommodate the down time. Nothing is more frustrating than purposefully going to a web app that was up and running 10-20 minutes ago to find it suddenly unavailable and down for maintenance.
Try app_offline.htm ??
What version of ASP.NET? I'm sure there are a million more elegant ways of doing this, but you can change the Default Document in IIS to redirect to Maint.html (or similar).

Re-publishing an ASP.NET Web Application While Site is Live

I am trying to get a grasp on how to handle updates to a live, functioning ASP.NET (2.0 or greater) Application while there are users on the site.
For example, suppose SO is an ASP.NET Web Application project. The project code compiles down to the single .DLL in the BIN folder. Now, there are constantly users on SO, so what would happen to users' actions/sessions if you would use the Visual Studio .NET "Publish" feature (or just FTP everything again manually) while they are using the site?
Would creating an ASP.NET Web Site, instead, alleviate any problems that may or may not exist with the scenario above? I am beginning to develop a web site as a user-driven Web Application, and I want to make sure that my inexperience with this would not potentially annoy the [potentially] many users that I [want to] have 24/7.
EDIT: Sorry, I should have put this in a more exact context. Assume that this site is being hosted by a web hosting service with monthly fees. I won't be managing the server itself, just what the web host allows as a user of their services.
I create two Web sites in IIS. One is the production Web site, and the other is a static Web site with an HttpHandler that sends all requests to a single static "We're updating" HTML page served with an HTTP 503 Service Unavailable. Typically the update Web site is turned off. When it's time to update, we stop the production Web site, start the update Web site, and now we can fiddle with the production Web site all we want without worrying about DLLs being locked or worker processes needing to be spun down.
I started doing this because
App_Offline.htm really does not work well in Web Gardens, which we use.
App_Offline.htm serves its page as 404, which is bad if you're down for a meaningful period of time.
We can start the upgraded production Web site with modified settings (only listening on localhost), where we can do a last-minute acceptance/verification that everything is working before we flip the switch, turning off the update Web site and re-enabling the production Web site.
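For reference, here is a rough sketch of the kind of catch-all handler the static "update" site could use; the updating.html path and the wildcard mapping that routes every request to it are assumptions, not the exact setup described above:

using System.Web;

public class UpdatingHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // 503 (not 404) tells clients and search engines the outage is temporary.
        context.Response.StatusCode = 503;
        context.Response.StatusDescription = "Service Unavailable";
        context.Response.AddHeader("Retry-After", "600");
        context.Response.ContentType = "text/html";
        context.Response.WriteFile(context.Server.MapPath("~/updating.html"));
    }
}

The handler would be mapped to all requests (a wildcard handler mapping) in the update site's configuration.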
Things this does not solve include
Any maintenance that requires a restart of the server--you still have downtime where no page is served.
Any maintenance that diddles with the .NET runtime, like upgrading to the latest service pack.
Other approaches I've seen include
Having two servers. Send all load balancing requests to one server, upgrade the other one; then rinse and repeat. Most of us don't have this luxury.
Creating multiple bin directories, like bin-1.0.0.0 and bin-1.1.0.0 and telling ASP.NET which bin directory to use in the web.config file. (One advantage of this is that reverting to a previous binary is just editing a config file. A disadvantage is that it's harder to revert resources that don't end up in your binaries, like templates and images and such.) I don't remember how this actually worked--I think the application did some late assembly loading in its Global.asax based on its own web.config section (since you touched the web.config, the app had restarted, so it was okay).
If you find a better way, let me know!
Changing to the ASP.NET web site model won't have any effect, as the recycle will still happen. Some of the changes that are sure to trigger it: web.config, global.asax, and anything in App_Code.
After the recycle the user will still be logged in, because ASP.NET will simply re-validate the authentication ticket. That is, provided you use a fixed machine key; otherwise it changes on each recycle. Fixing the machine key is something you want to do anyway, as other things can break if the key changes across requests, e.g. viewstate validation and embedded resources (decryption of the URL fails).
If you can move the session out of process, for example into SQL Server, you will avoid losing the session. If you can't, your code will have to account for that. There are plenty of scenarios where you can avoid using session, and others where you can wrap it and re-retrieve the information if the session was cleared. This should leave you with a handful of specific cases that you know can give trouble to the users, and for those you can apply some of the suggestions others have already made.
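A rough sketch of that "wrap it and re-retrieve" idea, with a hypothetical UserProfile type standing in for whatever you actually keep in session:

using System.Web;

// Hypothetical type standing in for whatever you actually keep in session.
public class UserProfile
{
    public string UserName;
    public string DisplayName;
}

public static class SessionHelper
{
    public static UserProfile GetProfile(HttpContext context)
    {
        var profile = context.Session["UserProfile"] as UserProfile;
        if (profile == null)
        {
            // The entry is gone (recycle or timeout): rebuild it from the
            // backing store instead of assuming it is still in memory.
            profile = LoadProfileFromDatabase(context.User.Identity.Name);
            context.Session["UserProfile"] = profile;
        }
        return profile;
    }

    private static UserProfile LoadProfileFromDatabase(string userName)
    {
        // Placeholder for your real data access code.
        return new UserProfile { UserName = userName, DisplayName = userName };
    }
}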
One solution could be to deploy your application into a load balanced environment (web farm).
When deploying a new version you would use the load balancer to redirect requests to the server you are not deploying to.
App_offline.htm is a great solution for this, I think.
On SO we see an "application currently unavailable" page when a deployment begins.
I am not sure how SO handles it, but we usually put up a holding page, so whatever the user has done (adding a question or answering questions) does not get saved; as soon as they submit something they see a holding page asking them to try again after some time.
And if I were the user I would usually press the back button to make sure what I entered is saved in the browser history so that I can post it later.
Some sites we run are in a clustered environment, so I take one server offline and inform the load balancer that it will not be available; once I make sure the new version is working fine I bring it back live, then do the same thing for the next server.
Do we have any other option?
It is not a technical solution, but set up a scheduled maintenance window. You can announce it in advance, giving your user base fair warning that there is a possibility the application will not be available during that time frame.

Can you enable HTTP compression in IIS 6 without restarting IIS?

I'm currently optimizing the performance of my company's site, which was taking 6-10 seconds to download the 2 MB+ of our homepage and its assets (the site is mostly Flash with a lot of media, so it's not 2 MB of HTML and viewstate). There are a lot of things that will need to be done to get this download size down, but one thing I definitely want to do is enable HTTP compression for our static content, specifically XML, CSS, and JS; I don't imagine compression will do much for the SWFs and JPGs.
I want to enable this on just our staging site so I can do some server testing and benchmarking. This means I'm going to have to do some metabase editing, since IIS 6 doesn't allow you to set compression on an individual site via IIS Manager. The problem with that is the metabase is locked by IIS so I can't save; and even if I save the edits, I'm required to restart IIS for the changes to take effect, which will take down other live sites hosted on the same server. Is there any way to enable compression for one site without restarting IIS? I don't mind restarting our staging site; I just don't want this work to take down other sites on the server.
Any assistance is greatly appreciated.
Do you have "Enable Direct Metabase Edit" checked? If so you should be able to edit the metabase and when the file is save IIS will automatically pickup most of changes.More details here
You can also enable compression using adsutil.vbs. There are examples here and in the comments of this blog post.
cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoStaticCompression True
cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoDynamicCompression True
You do realize an IISRESET can happen in literally a couple seconds, and it can be so quick that user requests will merely "hang" until the server responds.
The only bad part is that if they are using server sessions, those might get lost.
You should enable HTTP compression; it's generally a good thing. Today's servers rarely use any significant amount of CPU, so the minor task of compressing the HTTP output will save you more in bandwidth than you lose in CPU time.
I should also mention that whoever is creating your Flash files is doing it incorrectly; the Flash developer needs to stream the Flash components, not deliver every single graphic, sound, and animation on the first page view. There's no reason any Flash front page should be more than 100 KB.
