Can you enable HTTP compression in IIS 6 without restarting IIS?

I'm currently optimizing performance on my company's site; it was taking 6-10 seconds to download the 2MB+ of our homepage and its assets (the site is mostly Flash with a lot of media, so it's not 2MB of HTML and viewstate). There are a lot of things that will need to be done to get this download size down, but one thing I definitely want to do is enable HTTP compression for our static content, specifically XML, CSS, and JS; I don't imagine compression will do much for the SWFs and JPGs.
I want to enable this on just our staging site so I can do some server testing and benchmarking. This means I'm going to have to do some Metabase editing, since IIS 6 doesn't allow you to set compression on an individual site via IIS Manager. The problem with that is the Metabase is locked by IIS, so I can't save; and even if I save the edits, I'm required to restart IIS for the changes to take effect, which will take down other live sites hosted on the same server. Is there any way to enable compression for one site without restarting IIS? I don't mind restarting our staging site; I just don't want this work to take down other sites on the server.
Any assistance is greatly appreciated.

Do you have "Enable Direct Metabase Edit" checked? If so, you should be able to edit the metabase, and when the file is saved IIS will automatically pick up most of the changes. More details here.
You can also enable compression using adsutil.vbs. There are examples here and in the comments of this blog post. Note that to enable compression you set the properties to True (False disables it), and site# is the site's numeric ID; if you don't know it, cscript adsutil.vbs enum /p w3svc lists the configured sites.
cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoStaticCompression True
cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoDynamicCompression True

You do realize an IISRESET can complete in literally a couple of seconds, and it can be so quick that user requests will merely "hang" until the server responds.
The only bad part is that if the sites are using in-process server sessions, those might get lost.
You should enable HTTP compression; it's generally a good thing. On today's servers you are rarely using a significant amount of CPU, so the minor cost of compressing the HTTP output will save you more in bandwidth than you lose in CPU time.
I should also mention that whoever is creating your Flash files is doing it incorrectly; the Flash developer needs to stream the Flash components, not deliver every single graphic, sound, and animation on the first page view. There's no reason any Flash front page should be more than 100k.

Related

How to find out why my website is slow loading

I have an ASP.NET website that is hosted with https://www.blacknight.com/.
Whenever I browse to my website in my browser it can take up to 10 seconds to appear.
How can I determine where the bottleneck is?
How can I determine if it's the web host's fault or if it's due to my own website?
Are there tools that I can use, etc.?
Frequently, when deploying ASP.NET websites to hosting providers in a shared environment, the application may become idle and be unloaded from memory. The resulting behavior is that any "new" visit to the website after it's been idle for some time results in a JIT recompilation of the web application before it can be served again.
Typically hosting providers offer some form of "always on" option for your website which prevents this behavior. Otherwise, in the case of infrequently visited websites, this behavior is par for the course.
Depending on your hosting provider, they may respect web.config entries that enable "Always On" in IIS. You can find more information on this approach here. If your host offers no such option, a scheduled keep-alive ping (sketched below) is a common workaround.
Hope this makes sense. It is incredibly common for lower traffic websites in Shared environments.
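A minimal sketch of such a keep-alive pinger, as a C# console program you could run from Task Scheduler or leave running; the URL and the five-minute interval are placeholders of mine, not anything a host prescribes:

using System;
using System.Net;
using System.Threading;

// Keep-alive pinger: request the site every few minutes so the
// application pool never idles out and triggers a cold JIT start.
class KeepAlive
{
    static void Main()
    {
        while (true)
        {
            try
            {
                using (var client = new WebClient())
                {
                    // Any page that spins up the app domain will do.
                    client.DownloadString("http://www.example.com/");
                    Console.WriteLine("{0}: pinged OK", DateTime.Now);
                }
            }
            catch (WebException ex)
            {
                Console.WriteLine("{0}: ping failed - {1}", DateTime.Now, ex.Message);
            }
            Thread.Sleep(TimeSpan.FromMinutes(5)); // stay under the typical 20-minute idle timeout
        }
    }
}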
Well, just from looking at your code, you need to change some stuff.
First, load the scripts at the end of the body to allow the page to render faster.
One of your CSS files is called after the body tag; call it inside the head tag instead.
Also do a resource speed test. You can do this easily in Google Chrome: right-click, select Inspect Element, then select the Network tab. Press the red record button and reload the page. This will show you how fast each resource loads. There are also tests on Pingdom which will show you how fast your website loads and give it a score; it also compares your website to other websites.
But this all may not be the case; it may just be your server. Get a dedicated server with at least 10 Mbps. You can also put Cloudflare in front of the site to speed things up, and consider using a CDN to serve your assets faster.
Good Luck

not loading script files (javascript, scriptresource, css, and any other included script files) intermittently

I've got a very odd situation in my application: most of the time the scripts I include in the <head> are not loaded, but sometimes they are. If I refresh the page (either F5 or the refresh button), sometimes it loads and sometimes it doesn't. But if I enter the URL directly into the address bar, the page loads as expected. This happens not just on one page but on most pages, intermittently. My application is deployed on my local IIS 5.1 (I'm using XP SP3). Mostly this happens after I rebuild my solution or restart IIS to clear its cache or restart my application. My guess is that IIS doesn't have enough memory to process all the requests at once, so it fails to load all of the included scripts. Can anyone shed some light on where I should look to solve this kind of problem? At first it only happened on my machine, but lately I've noticed it happening on our dev server too, and I don't want it to reach our test server, or worse, production.
EDIT:
Here's how I determined which scripts were not loaded.
I didn't change any scripts or CSS, nothing directly visible on the client side; I only changed a bit of code-behind logic that doesn't alter what the user sees in the UI. When I built it and tested it in the browser directly against the local IIS, the display was corrupted, some scripts didn't function, and some page-method errors were shown.
If I refresh the page, once or many times, it eventually displays normally. As soon as I noticed this, I knew it was a cache problem.
I used Firefox and saw the same thing, so I viewed the source. At first I didn't notice anything suspicious and everything seemed normal, until I clicked through each include on the page, CSS and scripts. Some of them were not the script at all; instead they returned an HTML page containing "The page cannot be displayed, There are too many people accessing the Web site at this time."
Once I noticed that some of the included scripts and CSS in the source returned "The page cannot be displayed, There are too many people accessing the Web site at this time.", which is a 403 (Access Forbidden) error, I wondered why access was forbidden. I did some research on my own and found out that IIS 5.1 on XP only allows a maximum of 10 concurrent keep-alive connections. So if the page (or part of it) is not cached yet, the browser requests it from IIS, and when requests pile up past 10 concurrent connections before the earlier ones finish, IIS returns a 403 to the excess requests until a request completes and frees up a connection.
For IIS 5.1 on XP, I can't do much but accept the fact that it only takes 10 concurrent requests (reportedly the MaxConnections metabase property can raise this to at most 40, but the cap is deliberate in the desktop edition). For IIS 6+, I found an article saying it can handle up to 3,000 concurrent connections easily and can go beyond that, depending on the hardware resources of the machine and some settings tweaks.

How to put an asp.net application into offline/maintenance mode?

I've developed my first web application which, surprisingly, is getting very popular.
Because the website is now live, I have a hard time making changes, for fear some people are still logged in and using the application.
I wish to avoid having a duplicated instance of the web application for testing.
Is there any way to put the website into 'maintenance mode' with only me having access to it? Like redirecting to a page with some info telling them it's in maintenance mode.
I wish to avoid having a duplicated instance of the web application for testing.
That's your problem right there. For anything but the most trivial sites, you should have a staging or development instance. You should be using source control and have a script to update the main instance.
You can simply drop a file called app_offline.htm in the root of your website, and ASP.NET will automatically serve it in response to all requests. This file can contain any HTML you wish, indicating that your site is down for a short period due to maintenance.
For more information, please read App_Offline.htm and working around the "IE Friendly Errors" feature (in short: pad the file past 512 bytes, or IE will replace its contents with its own generic "friendly" error page):
The way app_offline.htm works is that you place this file in the root of the application. When ASP.NET sees it, it will shut down the app domain for the application (and not restart it for requests) and instead send back the contents of the app_offline.htm file in response to all new dynamic requests for the application. When you are done updating the site, just delete the file and it will come back online.
This is the answer to your question:
http://www.codeproject.com/Tips/219637/Put-the-website-in-Maintanance-Mode-Under-Construc
There's no built-in functionality in ASP.NET for this except app_offline.htm, which doesn't quite fit your needs because even you will be denied access to the site. You have to build it on your own, and this is best done at the router and load-balancer level rather than at the application level, though that depends on your network architecture. If you do build it into the application, a rough sketch follows.
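A minimal sketch of an application-level approach, as an HttpModule that redirects everyone except an allowed IP to a static maintenance page; the flag file, the admin IP, and the page name are all illustrative assumptions of mine:

using System;
using System.Web;

// Maintenance-mode module: when the flag file exists, redirect every
// visitor except the allowed admin IP to a static maintenance page.
public class MaintenanceModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpContext ctx = ((HttpApplication)sender).Context;

        // Toggle maintenance mode by creating/deleting this file.
        bool maintenance = System.IO.File.Exists(ctx.Server.MapPath("~/maintenance.flag"));
        bool isAdmin = ctx.Request.UserHostAddress == "203.0.113.10"; // your own IP here
        bool isMaintPage = ctx.Request.Path.EndsWith("maintenance.html", StringComparison.OrdinalIgnoreCase);

        if (maintenance && !isAdmin && !isMaintPage)
        {
            ctx.Response.Redirect("~/maintenance.html");
        }
    }

    public void Dispose() { }
}

You'd register it under <httpModules> in web.config; note it only covers requests that reach ASP.NET, so static files served directly by IIS are unaffected.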
Besides building a dev replica of your website to build patches and fixes on, couldn't you just announce a site closing for maintenance several days in advance? I'm not a web programmer, but you might want to look into what Hattrick, a popular online soccer management game, does for maintaining its site. They use a notification system on the homepage, after users sign in, that announces when maintenance will take place (usually late at night in Europe, where a large portion of the players and all the devs are located), and they close down the website for a couple of hours. When they take the site down they post a page, using the same style as the rest of the site, and provide an estimate of when it will be up and running again. Simple, elegant, and when coupled with the long forewarning it seems to do a good job placating the user base.
Give users a long heads-up that planned maintenance is scheduled, and some idea of what it is for, and most people will be able to accommodate the downtime. Nothing is more frustrating than purposefully going to a web app that was up and running 10-20 minutes ago only to find it suddenly unavailable and down for maintenance.
Try app_offline.htm ??
What version of ASP.NET? I'm sure there are a million more elegant ways of doing this, but you can change the Default Document in IIS to point to Maint.html (or similar).

Re-publishing an ASP.NET Web Application While Site is Live

I am trying to get a grasp on how to handle updates to a live, functioning ASP.NET (2.0 or greater) Application while there are users on the site.
For example, suppose SO is an ASP.NET Web Application project. The project code compiles down to the single .DLL in the BIN folder. Now, there are constantly users on SO, so what would happen to users' actions/sessions if you would use the Visual Studio .NET "Publish" feature (or just FTP everything again manually) while they are using the site?
Would creating an ASP.NET Web Site, instead, alleviate any problems that may or may not exist with the scenario above? I am beginning to develop a web site as a user-driven Web Application, and I want to make sure that my inexperience with this would not potentially annoy the [potentially] many users that I [want to] have 24/7.
EDIT: Sorry, I should have put this in a more exact context. Assume that this site is being hosted by a web hosting service with monthly fees. I won't be managing the server itself, just what the web host allows as a user of their services.
I create two Web sites in IIS. One is the production Web site, and the other is a static Web site with an HttpHandler that sends all requests to a single static "We're updating" HTML page, served with HTTP 503 Service Unavailable (a sketch of such a handler follows). Typically the update Web site is turned off. When it's time to update, we stop the production Web site, start the update Web site, and now we can fiddle with the production Web site all we want without worrying about DLLs being locked or worker processes needing to be spun down.
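A minimal sketch of what such a handler could look like; the page name, the Retry-After value, and the catch-all wiring in web.config/IIS are my assumptions, not the poster's exact code:

using System.Web;

// Catch-all handler for the "we're updating" site: serve one static
// HTML page with a 503 status so clients and crawlers know the
// outage is temporary.
public class MaintenanceHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.StatusCode = 503;
        context.Response.StatusDescription = "Service Unavailable";
        context.Response.AddHeader("Retry-After", "600"); // seconds; match your maintenance window
        context.Response.ContentType = "text/html";
        context.Response.WriteFile(context.Server.MapPath("~/updating.html"));
    }

    public bool IsReusable
    {
        get { return true; }
    }
}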
I started doing this because
App_Offline.htm really does not work well in Web Gardens, which we use.
App_Offline.htm serves its page as 404, which is bad if you're down for a meaningful period of time.
We can start the upgraded production Web site with modified settings (only listening on localhost), where we can do a last-minute acceptance/verification that everything is working before we flip the switch, turning off the update Web site and re-enabling the production Web site.
Things this does not solve include
Any maintenance that requires a restart of the server--you still have downtime where no page is served.
Any maintenance that diddles with the .NET runtime, like upgrading to the latest service pack.
Other approaches I've seen include
Having two servers. Send all load balancing requests to one server, upgrade the other one; then rinse and repeat. Most of us don't have this luxury.
Creating multiple bin directories, like bin-1.0.0.0 and bin-1.1.0.0, and telling ASP.NET which bin directory to use in the web.config file. (One advantage of this is that reverting to a previous binary is just a config-file edit. A disadvantage is that it's harder to revert resources that don't end up in your binaries, like templates and images.) I don't remember exactly how this worked; I think the application did some late assembly loading in its Global.asax based on its own web.config section (and since you touched web.config, the app had restarted anyway, so it was okay). A rough reconstruction is sketched below.
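Since the poster is fuzzy on the mechanics, the following is only my guess at a reconstruction: an AssemblyResolve hook in Global.asax that probes whatever versioned bin directory an appSettings key (hypothetical name binVersion) points at:

using System;
using System.Configuration;
using System.Reflection;
using System.Web;
using System.Web.Hosting;

// Guessed reconstruction of the versioned-bin trick: when the runtime
// can't find an assembly, probe the directory named in appSettings.
public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        AppDomain.CurrentDomain.AssemblyResolve += delegate(object s, ResolveEventArgs args)
        {
            string binDir = ConfigurationManager.AppSettings["binVersion"]; // e.g. "bin-1.1.0.0"
            if (string.IsNullOrEmpty(binDir)) return null;

            string file = new AssemblyName(args.Name).Name + ".dll";
            string path = HostingEnvironment.MapPath("~/" + binDir + "/" + file);
            return System.IO.File.Exists(path) ? Assembly.LoadFrom(path) : null;
        };
    }
}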
If you find a better way, let me know!
Changing to the ASP.NET web site model won't have any effect; the recycle will still happen. Some of the changes that trigger it for sure: web.config, global.asax, App_Code.
After the recycle, the user will still be logged in, because ASP.NET will just re-validate the authentication ticket. That is, provided you use a fixed machine key; otherwise it changes on each recycle. A fixed machineKey is something you want anyway, as other things break if the key changes across requests, e.g. viewstate validation and embedded resources (decryption of the URL fails).
If you can move the session out of process, e.g. into SQL Server, you will avoid losing it. If you can't, your code will have to account for that. There are plenty of scenarios where you can avoid using session, and others where you can wrap access to it and re-retrieve the info if the session was cleared (a sketch follows). That should leave you with a handful of specific cases that you know can give users trouble, and for those you apply some of the suggestions others have already made.
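A minimal sketch of that wrapping idea; the helper name and the rebuild delegate are my own illustration, not an established API:

using System;
using System.Web.SessionState;

// If a value was lost because the app domain recycled (in-proc
// session), rebuild it from a durable source instead of failing.
public static class SessionHelper
{
    public static T GetOrRebuild<T>(HttpSessionState session, string key, Func<T> rebuild)
    {
        object value = session[key];
        if (value == null)
        {
            value = rebuild();   // e.g. re-load from the database
            session[key] = value;
        }
        return (T)value;
    }
}

Usage would look like SessionHelper.GetOrRebuild(Session, "cart", LoadCartFromDb), where LoadCartFromDb is whatever durable lookup your app already has.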
One solution could be to deploy your application into a load balanced environment (web farm).
When deploying a new version you would use the load balancer to redirect requests to the server you are not deploying to.
App_offline.htm is a great solution for this, I think.
On SO we see an "application currently unavailable" page when a deployment begins.
I am not sure how SO handles it, but we usually put up a holding page, so whatever the user has done (adding a question or answering one) does not get saved. As soon as he submits something, he sees a holding page asking him to try again after some time.
And if I am the user, I usually press the back button to make sure what I entered is saved in the browser history, so that I can post it later.
Some sites we run are in a clustered environment, so I take one server offline and inform the load balancer that it will not be available; once I make sure the new version is working fine, I bring it live and do the same thing for the next server.
Do we have any other option?
It is not a technical solution, but set up a scheduled maintenance window. You can announce it in advance, giving your user base fair warning that there is a possibility the application will not be available during that time frame.

How to cache images in memory on the web server for an ASP.NET MVC web app?

I am working on a web application with many images, using ASP.NET MVC. I want to be able to cache the images in memory to improve performance, but I would like to hear what the best way to do this is.
1) The images are accessible from a URL, like http://www.site.com/album/1.jpg. How are the images stored in memory? Are they going to be in the form of a memory stream?
2) How do I access an image from memory and send it to the web page? Right now the web pages use the image URL to embed the image directly in a tag.
Thanks!
Won't the web server and downstream caches be handling this for static resources anyway? I'm not sure there are many performance gains to be had, but knowing nothing of the app or setup I could be wrong.
To implement it, I'd set up a page that took an image filename and served it either from disk or from the ASP.NET in-memory cache, something like the sketch below.
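A minimal MVC-flavored sketch of that idea, using the built-in OutputCache attribute to keep responses in server memory; the route, the folder, and the one-hour duration are illustrative assumptions:

using System.Web.Mvc;
using System.Web.UI;

// Serves an image from disk and lets ASP.NET output caching keep the
// response in server memory (and the browser cache) for an hour.
public class ImageController : Controller
{
    [OutputCache(Duration = 3600, VaryByParam = "id",
                 Location = OutputCacheLocation.ServerAndClient)]
    public ActionResult Show(int id)
    {
        string path = Server.MapPath("~/App_Data/albums/" + id + ".jpg"); // illustrative path
        return File(path, "image/jpeg");
    }
}

Pages would then reference /Image/Show/1 in their img tags instead of the physical file, though as the other answers here note, IIS and downstream caches often make this unnecessary for plain static files.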
If the images are just static files on disk, then Beepcake is right that IIS will already be caching frequently used images and serving them from memory. Using separate caching servers shouldn't be any quicker than IIS serving an image from memory; it's more about scalability. Once you have a large server farm, you can have a group of servers just dealing with your code and a group of servers just dealing with static images. Also, if you have too much content for one server to cache it all, you can route requests so that each of your ten servers caches a different 10% of your content. This should be much better than having each server cache the same most-used 10% of the content.
Thanks for the response. I think I was thinking in the wrong direction. I just found out Flickr uses Squid to cache images.
If you want really good performance, I'd suggest Amazon CloudFront. Edge caching will give you better performance than memory caching, and CloudFront runs nginx, which is significantly better than IIS at static files (among other things).
Setting up edge caching is very easy: you log in and get a domain to use instead of your own for image URLs.
