ASP.NET - Basic checklist for putting a site into production

I'm building a static ASP.NET site (using Masterpages and a few forms) and I'm about to release it onto my production server.
I know about changing <compilation debug="true"> to false, but I'm wondering what other things I can do to obtain the highest speed possible. There is no data access in the site, it's all static content.
Does anyone have a checklist they run through or know of a good resource for setting up sites in a production environment, with a focus on performance?
Checklist so far (feel free to edit this yourself with any worthwhile additions; the web.config settings are shown in a sketch after this list)
Make sure <compilation debug="false" /> is actually set to false in Web.Config
Make sure <trace enabled="false" /> is actually set to false in Web.Config
Set necessary read/write/modify folder permissions for site
Enable GZIP in IIS (reduces size of pages/css/javascript dramatically)
Have you considered OutputCaching for any pages / controls?
Consider setting up web tests (e.g. WatiN for .NET) to make sure functionality on your site is still working OK
Make sure it isn't Friday afternoon!
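As a minimal sketch, here is what those first two web.config entries should look like in production (standard ASP.NET elements under system.web):

    <configuration>
      <system.web>
        <compilation debug="false" />
        <trace enabled="false" />
      </system.web>
    </configuration>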

If you're writing any log or output files, make sure the proper folder permissions are set up in the production environment. Typically debug/test environments are much more lax about file read/write permissions than production.

Don't deploy on Friday afternoons! This is guaranteed to mess up your head for the weekend.

Also, don't forget to check the gzip settings in IIS. Compressing output will make things travel across the wire much faster.
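On IIS 7 and later you can also turn compression on from web.config instead of the IIS manager; a minimal sketch, assuming the static/dynamic compression modules are installed on the server:

    <configuration>
      <system.webServer>
        <urlCompression doStaticCompression="true" doDynamicCompression="true" />
      </system.webServer>
    </configuration>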

There is actually a very good checklist on how to perform a security deployment review provided on MSDN.

If it's all static content, you'll want to use aggressive output caching.
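For a whole page, a sketch of the directive form (the one-hour duration is just an example value):

    <%@ OutputCache Duration="3600" VaryByParam="None" %>

The same directive works on user controls, which helps when only parts of a page are truly static.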

If your site uses a database only to present information, make the database read-only. That removes all lock handling and speeds up access a great deal.
If you have a back end that updates the data, make it a separate database and schedule periodic updates of the read-only database, once a day or whatever that application needs.
If you just present news and other small items on a company website that doesn't change very often, this solution is probably for you, even if it's a site with gigabytes of data. The key question is: how often do we update the data?
From what I see in daily business, no one really thinks about this solution because everything has to be "real time", but there are plenty of cases where it would be a perfect fit.

Review your web.config
Check debug (web.config / *.svc), tracing, ...
Update debug values to production values:
email addresses
(web) service addresses
location of log files
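One way to keep those production values out of your development config is a web.config transform that is applied when you publish; a rough sketch, assuming Visual Studio 2010 or later build configurations (the AdminEmail key is hypothetical):

    <!-- Web.Release.config -->
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <appSettings>
        <add key="AdminEmail" value="ops@example.com"
             xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
      </appSettings>
      <system.web>
        <compilation xdt:Transform="RemoveAttributes(debug)" />
      </system.web>
    </configuration>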

You should have some sort of test to verify the various functions of your site, and the permissions. For instance, once you publish, walk through a checklist: can I access X if I don't have permission? Do X, Y and Z work in the application? I do this after every publish, because small changes can have a big impact.
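A minimal sketch of such a post-publish check, assuming a small console program and hypothetical page URLs (WatiN or a proper test framework gives you far more, but even this catches a broken deployment):

    using System;
    using System.Net;

    class SmokeTest
    {
        static void Main()
        {
            string baseUrl = "https://www.example.com";            // hypothetical site
            string[] pages = { "/", "/About.aspx", "/Contact.aspx" };

            foreach (string page in pages)
            {
                try
                {
                    using (var client = new WebClient())
                    {
                        client.DownloadString(baseUrl + page);     // throws on 4xx/5xx
                        Console.WriteLine("OK      " + page);
                    }
                }
                catch (WebException ex)
                {
                    Console.WriteLine("FAILED  " + page + " - " + ex.Message);
                }
            }
        }
    }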

You should read this:
https://stackoverflow.com/questions/72394/what-should-a-developer-know-before-building-a-public-web-site
It's currently the 9th highest voted question on SO and in the top 3 most favorited. The caveat is that it's platform agnostic, so it's missing some ASP.Net-specific items.

Thoroughly test the site outside of your corporate firewall / proxy after clearing your browser cache. This will help to ensure that all resources are publicly accessible (and are not on a local server or cached). For instance, you might find that you have used absolute URLs to include, say, JavaScript or CSS files. These work fine in your development environment, but as soon as the site goes live they are inaccessible. Or you have a CSS file in your cache that has subsequently been deleted, but you don't notice.
Ensure that any products / applications you use that have keys tied to a domain will work on your live site. This includes things like Google Maps keys or commercial 3rd party applications. It also includes automatically generated hyperlinks sent out in, say, emails. You wouldn't want a user registration to have a link back to http://localhost/confirm.aspx or the like, would you?
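Two small sketches of what that means in practice; ResolveUrl is the standard ASP.NET helper, while the SiteBaseUrl key and confirm.aspx page are hypothetical:

    using System;
    using System.Configuration;
    using System.Web.UI;

    public class RegisterPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // App-relative paths survive the move from dev to production;
            // ResolveUrl maps "~" to whatever the application root actually is.
            ClientScript.RegisterClientScriptInclude("site", ResolveUrl("~/scripts/site.js"));
        }

        protected string BuildConfirmationLink(string token)
        {
            // Links that go out in emails should come from configuration,
            // not from the current request (which might be localhost on a dev box).
            string baseUrl = ConfigurationManager.AppSettings["SiteBaseUrl"];
            return baseUrl.TrimEnd('/') + "/confirm.aspx?token=" + token;
        }
    }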

Related

Publishing Umbraco pages in a development environment differs across clients

We're working on an Umbraco site - multiple development machines using a shared development database.
When one developer makes changes to content in the CMS and does a Save and Publish, the change is reflected on his machine but not on the other development machines.
This doesn't seem to make sense, as we're all looking at the same database. We've tried doing an IIS reset to see if it's caching at work, but this doesn't seem to make a difference either.
Any ideas what on earth could be going on?
Umbraco does a lot of caching, so it doesn't have to hit the database all the time. Normally, all of the published content is cached in an xml file at App_Data\umbraco.config. You just need to have your developers right click on the root of the content tree in the umbraco backoffice and click "Republish the entire site" to regenerate that xml cache on disk from the xml cache in the database.
You also might need to reindex your examine indexes. You can normally find the "Examine Management" dashboard on the developer section in the backoffice of umbraco. By default, there are three indexes: InternalMember, Internal, and External. Unless you have membership going on in your umbraco site, you can ignore that index. The External index is used mostly for site searches. The Internal index is much more critical. It is used to cache media. I believe it is also used in the backoffice, but I'm not 100% certain. Make sure that the Internal index is regenerated.
Remember that media files are stored in the /media directory by default. That means if developer 'A' uploads a file, the physical file won't show up on developer 'B's machine automatically.
I'll bet there are some cool ways to set up load balancing to handle caching for your dev setup. I'm pretty sure there are also ways to store the media in the database, so you don't have to worry about transferring files back and forth.

Is it best practice to keep settings in root web.config

I have hundreds of sub-web.configs that have the same app settings, so I was wondering if it's alright to merge them all and put them into the root web.config.
If you're the developer, or if you can chat to the developers so you can be sure that they are all likely to keep sharing the same values, then I say go ahead and do it. Bear in mind that future apps deployed onto the server would also have access to those settings, so if you can't trust them, and if the setting values are confidential, you may want to keep those settings separate.
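A minimal sketch of what that looks like, assuming the sub-applications all live under the same IIS site (the key names are hypothetical). Settings in the root web.config are inherited by every child application, and a child can still override or remove an individual key:

    <!-- Root web.config -->
    <configuration>
      <appSettings>
        <add key="SmtpServer" value="mail.example.com" />
        <add key="SupportEmail" value="support@example.com" />
      </appSettings>
    </configuration>

    <!-- Sub-application web.config, only where it needs to differ -->
    <configuration>
      <appSettings>
        <remove key="SmtpServer" />
        <add key="SmtpServer" value="mail-legacy.example.com" />
      </appSettings>
    </configuration>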

Storing site specific configuration data

I have been given a job to redevelop a news portal. The website already has a couple of thousand unique visits a day. I am going to develop it using ASP.NET Web Forms. I am currently in the planning phase, and I am thinking of offering the main admin a page where he can change site-specific configuration information. Some of these are:
Web site title "<title>"
site URL
footer text
default image directory
whether to accept comments without authorisation or not
I listed some settings above so that you can understand my scenario better.
What I can't decide is where to store all this information. Do I store it in a DB (costly?), a custom XML file, or a .config file (e.g. ConfigurationManager.AppSettings)?
Any pros or cons would make my day!
Thank you!
My opinion is to store them in web.config and read them through WebConfigurationManager.OpenWebConfiguration().GetSection(), because these values are critical and change only once, when the site is first set up.
For example, the default image directory stays the same for the rest of the site's life, and the same goes for the site URL and the others.
Also, when you change these settings you probably need to restart the web application anyway, because you will almost certainly have cached them in static variables.
And because these values stay as they are and are needed to start the web application (only then do you read the database and the rest), you want them close at hand, in web.config.
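A minimal sketch of reading and (rarely) writing such settings, assuming they live in appSettings and the key names are up to you. Note that saving web.config restarts the application, which fits the "change once at set-up" scenario described above:

    using System.Web.Configuration;

    public static class SiteSettings
    {
        public static string Get(string key)
        {
            // Plain read, e.g. SiteSettings.Get("SiteTitle").
            return WebConfigurationManager.AppSettings[key];
        }

        public static void Set(string key, string value)
        {
            // Open the application's web.config for writing and persist the change.
            // Requires write permission on web.config for the app pool identity.
            var config = WebConfigurationManager.OpenWebConfiguration("~");
            var settings = config.AppSettings.Settings;

            if (settings[key] == null)
                settings.Add(key, value);
            else
                settings[key].Value = value;

            config.Save();   // touching web.config recycles the app domain
        }
    }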

app_offline alternative

I usually place an app_offline.htm in my root directory when I am releasing a website to a production environment. However sometimes if there has been a few big changes to the site, I would like to click around first to make sure it's stable without allowing access to anyone other than me.
As far as I am aware this isn't possible, but I'm hoping someone has a neat solution...
The solution has to include if someone has a deeplink into the site, so using a default.htm/asp page in the root won't do the trick unfortunately.
I agree with the staging environment answer above, but otherwise here's one possible approach: Temporarily block all IP addresses besides your own. This can be achieved through IIS Directory Security configuration, or programmatically in any number of ways
You can redirect all the non-authorized users to an Under Construction page of some sort. Meanwhile, you can happily browse the site from your IP. When the site is vetted, you remove that IP restriction and the site becomes available to the world at large.
It's a difficult thing to achieve. That's why you should have a staging environment where everything should be validated before shipping into production. Then during the deployment process (if it takes long, but it shouldn't) you could use an App_Offline file. This staging environment should be as close as possible to your production environment (in terms of software, patches and configurations installed, not in terms of hardware power of course).
Another quick suggestion that would allow you to control things from the web.config might include a custom module that redirected all requests to a static page except those defined by a filter (i.e. hostname, url sniffing) that could be configured via the web.config.
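A rough sketch of that module idea, assuming the IIS integrated pipeline; the appSettings key, page name and class are all hypothetical, and the module would be registered under <system.webServer><modules> in web.config:

    using System;
    using System.Configuration;
    using System.Web;

    public class MaintenanceRedirectModule : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            app.BeginRequest += OnBeginRequest;
        }

        private static void OnBeginRequest(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;

            // Single allowed address read from web.config, e.g. your own IP.
            string allowedIp = ConfigurationManager.AppSettings["MaintenanceAllowedIp"];
            string path = context.Request.Url.AbsolutePath;

            // Let the allowed IP through, and don't redirect the maintenance page itself.
            if (context.Request.UserHostAddress == allowedIp ||
                path.EndsWith("/UnderConstruction.htm", StringComparison.OrdinalIgnoreCase))
            {
                return;
            }

            context.Response.Redirect("~/UnderConstruction.htm", true);
        }

        public void Dispose() { }
    }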

How do I take a .NET site down for maintenance?

I have an ASP.NET site that I'm going to have to take down to make some major structural updates to, and I was wondering how I should go about it from the client-side perspective. I have heard of an App_Offline.htm file or something like that, but I've never really gotten that to successfully work. Does anyone know how to do this?
EDIT
My app is running ASP.NET 4.0, for what it is worth.
Rather than messing with the app_offline silliness (among other reasons, you can't continue to see the site internally while performing maintenance), I created an additional "down for maintenance" site in IIS, which is normally stopped. It has the same IP, host headers, etc as the main site, but only has a default.aspx, an images folder and a stylesheet. That file contains the "This site will be down for maintenance until xx:xx PM CST" message.
When I'm ready to perform the update, I stop the main site and start the maintenance site, which then processes any requests it receives and, of course, returns the maintenance message.
When the maintenance is complete, stop the maintenance placeholder site, and restart your main site.
If you're using host headers, you can modify this approach so that the site remains internally accessible over your LAN/WAN while the maintenance site is handling external requests. A simple approach is to remove the host headers for <*>.yourdomain.com from the main site before starting the maintenance site, and ensure that the main site has an additional host header that is internally accessible (added to your local hosts file, for instance). When you start the maintenance site, it'll handle external requests while the primary site will handle requests to the internal-only header.
Alternatively (this seems complex, but saves you the trouble of adding and removing headers), create three sites:
Main site: Configured as in normal operation.
Maintenance site: Has same IP, host headers, etc as main site, but only contains default "down for maintenance" page and any images, css, etc that are required.
Internal test site: Duplicates the configuration of the main site and points to the same folders, but only has host headers, etc. for an internal name that is not in the public DNS.
This way, you have only to stop the main site and start the other two in order to funnel external traffic to the "down for maintenance" site, while you can still see and tweak the primary site. This is helpful for that last few minutes of testing/bug fixing that tends to come up during a deployment.
Update
If you don't have access to the server or IIS Manager, you most likely won't be able to use any of that. Assuming that your access is limited to your own folder, your options seem to be either to deploy app_offline.htm to the root of the site (ASP.NET checks for that filename), or to just replace the whole site with a "down for maintenance" app. Perhaps someone else will chime in with alternatives.
The trick for IE is to make app_offline.htm larger than 512 bytes; if the response is smaller, IE throws away your content and shows its own not-so-friendly error page anyway. Here are more details: http://weblogs.asp.net/scottgu/archive/2006/04/09/442332.aspx
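A rough sketch of a padded app_offline.htm; the only thing that matters for the IE quirk is that the file ends up larger than 512 bytes:

    <!DOCTYPE html>
    <html>
    <head><title>Down for maintenance</title></head>
    <body>
      <h1>We'll be back shortly</h1>
      <p>The site is down for scheduled maintenance and should be back within twenty minutes.</p>
      <!-- This comment exists only to pad the response past 512 bytes so that Internet
           Explorer renders this page instead of substituting its own error page.
           Padding padding padding padding padding padding padding padding padding. -->
    </body>
    </html>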
If you have a good pre-release testing process and careful release procedures you probably won't be down for long.
In that case dropping a file called App_Offline.htm into your site root works fine. ASP.NET will serve it in place of your site until you remove it. It's a painless way of making sure nothing's updating while you transition.
Mine just has a header with the site logo and a message that we'll be down for maintenance for up to twenty minutes. That's it. It took me about five minutes to write IIRC.
I would definitely recommend this for short sharp down periods of less than half an hour. Anything longer and you're probably looking at a major system change that warrants an approach like David Lively's.
