Background Info: File Replication is Lame
Currently, we have a massive, high-traffic ASP.NET web application load-balanced across 8 different IIS servers. Due to the nature of the site, minor changes to .aspx files and .ascx controls happen frequently throughout the day and, after being tested and published to live, are replicated out to each of the public web servers through xcopy deployment on a scheduled basis, every 10 minutes.
Of course this is incredibly inefficient, as each server must have a redundant copy of the entire site, and we would like to eliminate the 10-minute publishing lag.
Possible Improvement: Hosting from Shared Storage
We now have the option to use centralized storage with an iSCSI interface to host the entire site centrally, with each server believing that the remote storage is a local drive. Publishes would be instantaneous and system-wide.
Note: Hosting the drive off a UNC share is not possible, as there are so many different directories in the site structure, each requiring a FileSystemWatcher for ASP.NET to monitor for changes, that the SMB maximum command count is quickly reached. Yes, we know about the MaxCmds and MaxMpxCt registry settings.
The Problem: Web.config changes trigger massive recompiles
The problem we foresee is that certain changes to the file system structure can cause nearly every compiled .aspx or .ascx to have to recompile, causing queued requests and the perception that the server is down. Most resources are not used system-wide, so recompiling them on a change causes hardly a blip in resources. A global master page used by all pages on the site can cause this, but that can easily be managed in code.
The primary culprit is the web.config file. Changes to the web.config file cause the entire web application to recycle and recompilations to occur. So, we currently don't replicate web.config changes. Any web.config change requires bringing the web server off the load balancer, applying (and testing) the changes, and then warming the server up with junk requests before it is placed back on the load balancer.
However, if the web.config file, like the rest of the web application's directory structure, is located on centralized storage, there is only one copy of the file, and individual servers could not be patched and warmed up anymore.
The Question
Is there a way to get an ASP.NET web application to take its marching orders from a source other than a file named Web.config?
Ideally there would be one file per server, for example:
default.aspx
global.asax
Web-ServerA.config
Web-ServerB.config
...
Web-ServerN.config
Where is the name "web.config" defined anyway? Is there a registry setting that could be set on a per-server basis? Is there an entry that could be made in the machine.config or the global web.config to specify what file to use?
Things Out Of Scope
Just so I am clear, I am not asking how to have different AppSettings for debug, test, and live. There are other topics that cover this, and all my web.configs will be identical most of the time; the only time I need them to be different is when an update is being performed.
We aren't using the web.config for any appSettings information; this is for the really important stuff, like assembly references, httpHandler definitions, and other system.web settings that cannot be stored in a database.
Update
I tried searching the registry for Web.config, and found nothing except for applications that noted that I had recently edited web.config files, which I apparently do a lot. No help there.
My first question would be: what are you keeping in the web.config, and can you move it to a database? We keep every config setting in a table in our database and use the machine.config to store the DB connection information.
Not sure how much of a rewrite that would be for you, but it would avoid your issue.
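For what it's worth, a minimal sketch of that approach (the ConfigSettings table and the "ConfigDb" connection string name are hypothetical; the connection string itself would live in machine.config):

using System.Configuration;
using System.Data.SqlClient;

public static class DbConfig
{
    // Reads a single setting from a hypothetical ConfigSettings table;
    // only the connection string lives in machine.config.
    public static string Get(string key)
    {
        string connStr = ConfigurationManager.ConnectionStrings["ConfigDb"].ConnectionString;
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT SettingValue FROM ConfigSettings WHERE SettingKey = @key", conn))
        {
            cmd.Parameters.AddWithValue("@key", key);
            conn.Open();
            return (string)cmd.ExecuteScalar();
        }
    }
}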
Another option would be to house your configuration items in an external file and reference it from the web.config. Changes to that file would not be re-read until the ASP.NET worker process was recycled, but this would allow you to change settings and then cycle each server via an IISRESET.
<configuration>
  <appSettings file="OtherFile.config">
    ...
  </appSettings>
</configuration>
Just curious, what filesystem are you using? NTFS is not a shared-storage file system; in other words, you can't have more than one node writing to the filesystem at a time.
I would suggest virtual directories under the site in IIS. This will probably require a little bit of restructuring the layout of your code, but shouldn't be too major. So, you would have the root home dir of the site with the web.config that is specific to that machine and then a vdir mapped to whatever shared filesystem resource you setup.
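For example (every path here is hypothetical), each server would keep a small local root holding only its machine-specific files, with the shared content mapped in as a virtual directory:

C:\LocalSite\web.config                          <- local to this server
C:\LocalSite\global.asax                         <- local to this server
/app (virtual directory)  ->  S:\Site\app        <- the iSCSI-backed shared volume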
Related
My ASP.Net application used to restart unpredictably during requests, ruining my state, session data, etc. I determined that the problem was caused by a control that writes and deletes some files in a Temp folder that it creates next to the bin folder, so the web directory looks like this:
bin
Temp
....
Default.aspx
web.config
ASP.Net apparently reacts to changes inside the Temp folder the same way it reacts to changes inside bin or in web.config - it restarts the application.
I could partially solve the problem by moving Temp outside the site directory. That way the application doesn't get restarted every time something temporary is written or deleted, so this part works well. The problem is that some of the files inside the Temp directory should be web-accessible - like images generated on the fly and such.
So my question is actually threefold:
Should ASP.Net application get restarted even if the changes are made not in the bin directory, but in another directory at the same level? Or is there something wrong with my configuration?
Where and how do I create a temporary folder so it's web-accessible but it doesn't cause application restart?
How do I turn off restart on directory change from code or web.config (both in IIS and ASP.Net development server)?
Based on my understanding, here are the answers to your questions:
ASP.NET restarts the application when too many files are changed in one of the content directories. For more info about when an app restart happens, read: http://programming360.blogspot.in/2009/04/what-causes-application-restart.html
You can create the temporary folder anywhere on the same machine or on a different machine (file share); as long as the IIS user has access to the folder, it should work normally. The only thing you need to consider is the latency when the folder is on a different computer. (A sketch of serving files from such a folder follows after this list.)
I don't believe you can control application restarts; this is completely controlled by IIS. You might want to read up on this SO question: ASP.NET restarts when a folder is created, renamed or deleted
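For the second point, a minimal sketch of serving generated files out of a Temp folder located outside the application root (handler name and TempRoot path are hypothetical; you would map the handler under httpHandlers in web.config):

using System.IO;
using System.Web;

public class TempFileHandler : IHttpHandler
{
    // Hypothetical folder outside the site root, so writes here never
    // trigger an app-domain restart.
    private const string TempRoot = @"D:\SiteTemp";

    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string requested = context.Request.QueryString["file"];
        if (string.IsNullOrEmpty(requested))
        {
            context.Response.StatusCode = 404;
            return;
        }

        // GetFileName strips any directory part, so "..\" tricks can't escape TempRoot.
        string fullPath = Path.Combine(TempRoot, Path.GetFileName(requested));
        if (!File.Exists(fullPath))
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.ContentType = "image/png"; // assumption: images generated on the fly
        context.Response.WriteFile(fullPath);
    }
}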
I want to know all the possible reasons an IIS7 application pool might restart automatically, because I'm facing such a situation and I have no idea what I should be looking for.
There are a lot of reasons that an IIS app pool may restart. The best resource I've found is the blog post ASP.NET Case Study: Lost session variables and appdomain recycles by Tess Ferrandez, which goes into detail on how to identify the issue and fix it. She lists the following reasons that an app domain will recycle:
An application domain will unload when any one of the following occurs:
Machine.Config, Web.Config or Global.asax are modified
The bin directory or its contents is modified
The number of re-compilations (aspx, ascx or asax) exceeds the limit specified by the numRecompilesBeforeAppRestart setting in machine.config or web.config (by default this is set to 15)
The physical path of the virtual directory is modified
The CAS policy is modified
The web service is restarted (2.0 only)
Application Sub-Directories are deleted (see Todd's blog http://blogs.msdn.com/toddca/archive/2006/07/17/668412.aspx for more info)
I have a custom ASP.NET application that I utilize for several clients that I host. Each client has a separate domain, and the application is normally a child application under the root domain (http://domain.com/customapp). The application files are the same (aspx, ascx, style sheets, images, etc.); the only thing different is the web.config file for each client. As development of the application continues to evolve, I have to update the application in each directory, and this is obviously becoming tedious. I am trying to come up with a method to keep the application up to date. My first thought was placing the application into a single physical path and creating multiple applications pointing to that path (the problem with this method is that I can't have different web.config files). I am curious as to what solution others are using in this scenario...
If you want to handle this entirely in Visual Studio, VS2010 offers web.config transforms which could solve your problem.
In a nutshell, create a build configuration (In VS, select Build|Configuration Manager...) for each site. Add a web.config transform for each client, which only specifies the differences required for each application.
I use this for differentiating between development, staging and release configurations - each transform adjusts the connection string, app settings, etc - and it works quite well both within Visual Studio and when deploying via MSBuild.
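As a rough illustration (the client name and connection string are made up), a per-client transform file such as Web.Client1.config would contain only the overrides:

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="MainDb"
         connectionString="Data Source=sql01;Initial Catalog=Client1;Integrated Security=SSPI"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>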
Also, note that web.config settings are inherited by IIS applications. So, if you have a root site
/root
and client apps
/root/client1
/root/client2
...
you could place the client-specific config settings in a web.config in each client-specific folder, and global settings in a web.config in the root folder.
Can you just move your web.config content to a database and load it conditionally based on the domain that was referenced?
Select Case Request.Url.Host.ToLowerInvariant()
    Case "xyz.com", "www.xyz.com"
        'Load XYZ stuff
    Case "abc.com", "www.abc.com"
        'Load ABC stuff
    Case Else
        'Throw an error probably
End Select
Even better, store your domains in the database as keys so that you don't ever have to touch the code.
What is the best location to store the various configuration settings of a web site's modules?
Creating a class (that inherits ConfigurationSection) that maps the settings in the web.config file?
Or creating some DAL and BLL classes that work with a database?
I've used a simple heuristic to categorize each configuration variable into one of four categories:
Compile time configuration (changes done together with code changes) - if possible then inside the code or assembly (as an embedded resource), otherwise in web.config
Server instance specific configuration (SQL connection strings, local file paths) - in web.config
Application (database) configuration (feature selection and other global application settings that change rarely, if ever) - in the database, but without a UI
Application configuration - in database, accessible through an admin UI
Storing the configuration settings in the Web.config will have the effect that if you modify the web.config file, your application will be restarted and the new settings will take immediate effect.
If you are running the application on multiple machines, however, you will need to update each machine.
If you store the configuration settings in the database, you will need to either restart your web application manually or have a function (such as an admin page/site) to allow the application to re-read the settings.
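As a rough sketch of that last approach (the cache and loader below are hypothetical), settings read from the database can be cached in a static field and re-read on demand from an admin page, with no application restart:

using System.Collections.Generic;

public static class SettingsCache
{
    // volatile so a swapped-in dictionary is seen by all request threads
    private static volatile Dictionary<string, string> _settings;

    public static string Get(string key)
    {
        Dictionary<string, string> local = _settings;
        if (local == null)
        {
            local = LoadFromDatabase();
            _settings = local;
        }
        return local[key];
    }

    // Call this from an admin page to pick up changes without a restart.
    public static void Reload()
    {
        _settings = LoadFromDatabase();
    }

    private static Dictionary<string, string> LoadFromDatabase()
    {
        // Hypothetical DAL call that reads the settings table.
        Dictionary<string, string> result = new Dictionary<string, string>();
        // ... populate from the database here ...
        return result;
    }
}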
To actually answer the question:
Basic information is going to have to be stored locally in web.config (connection strings etc.)
Beyond that other information could be stored in either location.
Having it in the database means that it's easier to write admin pages to control the information rather than editing the web.config file directly.
How often are things going to change? If set up is a one-off thing then having admin pages would be overkill, but if there's ongoing changes (adding new users, categories etc.) then they might be a good idea.
Also, with data in the database, you can perform remote administration on the system.
So, without more information on your application I can't make a recommendation.
Most of the time, you have separate settings for each module on each page; thus, you have to save them in a database.
Build a configuration section. It is pretty straightforward and suits your needs.
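For example, a minimal custom section might look like this (section and property names are made up):

using System.Configuration;

public class ModuleSettingsSection : ConfigurationSection
{
    // One strongly typed property backed by the "pageSize" attribute.
    [ConfigurationProperty("pageSize", DefaultValue = 20)]
    public int PageSize
    {
        get { return (int)this["pageSize"]; }
        set { this["pageSize"] = value; }
    }
}

It would be registered and used in web.config like so:

<configSections>
  <section name="moduleSettings" type="MyApp.ModuleSettingsSection, MyApp" />
</configSections>
<moduleSettings pageSize="50" />

and read back with (ModuleSettingsSection)ConfigurationManager.GetSection("moduleSettings").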
I am trying to get a grasp on how to handle updates to a live, functioning ASP.NET (2.0 or greater) Application while there are users on the site.
For example, suppose SO is an ASP.NET Web Application project. The project code compiles down to the single .DLL in the BIN folder. Now, there are constantly users on SO, so what would happen to users' actions/sessions if you used the Visual Studio .NET "Publish" feature (or just FTPed everything again manually) while they are using the site?
Would creating an ASP.NET Web Site, instead, alleviate any problems that may or may not exist with the scenario above? I am beginning to develop a web site as a user-driven Web Application, and I want to make sure that my inexperience with this would not potentially annoy the [potentially] many users that I [want to] have 24/7.
EDIT: Sorry, I should have put this in a more exact context. Assume that this site is being hosted by a web hosting service with monthly fees. I won't be managing the server itself, just what the web host allows as a user of their services.
I create two Web sites in IIS. One is the production Web site, and the other is a static Web site with an HttpHandler that sends all requests to a single static "We're updating" HTML page served with an HTTP 503 Service Unavailable. Typically the update Web site is turned off. When it's time to update, we stop the production Web site, start the update Web site, and now we can fiddle with the production Web site all we want without worrying about DLLs being locked or worker processes needing to be spun down.
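The handler itself can be very small; a sketch of the idea (the static page name is hypothetical, and the handler would be mapped to all requests via httpHandlers in the update site's web.config):

using System.Web;

public class MaintenanceHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.StatusCode = 503;                 // Service Unavailable
        context.Response.AddHeader("Retry-After", "600");  // suggest retrying in 10 minutes
        context.Response.ContentType = "text/html";
        context.Response.WriteFile(context.Server.MapPath("~/Updating.htm"));
    }
}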
I started doing this because
App_Offline.htm really does not work well in Web Gardens, which we use.
App_Offline.htm serves its page as 404, which is bad if you're down for a meaningful period of time.
We can start the upgraded production Web site with modified settings (only listening on localhost), where we can do a last-minute acceptance/verification that everything is working before we flip the switch, turning off the update Web site and re-enabling the production Web site.
Things this does not solve include
Any maintenance that requires a restart of the server--you still have downtime where no page is served.
Any maintenance that diddles with the .NET runtime, like upgrading to the latest service pack.
Other approaches I've seen include
Having two servers. Send all load balancing requests to one server, upgrade the other one; then rinse and repeat. Most of us don't have this luxury.
Creating multiple bin directories, like bin-1.0.0.0 and bin-1.1.0.0, and telling ASP.NET which bin directory to use in the web.config file. (One advantage of this is that reverting to a previous binary is just editing a config file. A disadvantage is that it's harder to revert resources that don't end up in your binaries, like templates and images and such.) I don't remember how this actually worked -- I think the application did some late assembly loading in its Global.asax based on its own web.config section (since you touched the web.config, the app had restarted, so it was okay); a speculative sketch follows below.
If you find a better way, let me know!
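To make that last approach concrete, here is a speculative sketch of the late-assembly-loading idea (this is not the original poster's code; the appSettings key and folder naming are invented):

using System;
using System.Configuration;
using System.IO;
using System.Reflection;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // e.g. <add key="activeBinFolder" value="bin-1.1.0.0" /> in web.config
        string binFolder = ConfigurationManager.AppSettings["activeBinFolder"];
        string physicalBin = Server.MapPath("~/" + binFolder);

        // Resolve assemblies out of the versioned folder instead of ~/bin.
        AppDomain.CurrentDomain.AssemblyResolve += delegate(object s, ResolveEventArgs args)
        {
            string file = Path.Combine(physicalBin, new AssemblyName(args.Name).Name + ".dll");
            return File.Exists(file) ? Assembly.LoadFrom(file) : null;
        };
    }
}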
Changing to the ASP.NET web site model won't have any effect, as the recycle will still happen. Some of the changes that trigger it for sure: web.config, global.asax, App_Code.
After the recycle, the user will still be logged in, because ASP.NET will just revalidate the authentication cookie. That is, given you use a fixed machine key; otherwise it will change on each recycle. This is something you want to do anyway, as other stuff can break if the key changes across requests, i.e. viewstate validation and embedded resources (decryption of the URL fails).
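A fixed machine key is just a web.config entry (the key values are elided here; generate your own rather than copying any from the web):

<system.web>
  <machineKey validationKey="..." decryptionKey="..." validation="SHA1" decryption="AES" />
</system.web>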
If you can put the session out of process, like in SQL Server, you will avoid losing the session. If you can't, your code will have to account for that. There are plenty of scenarios where you can avoid using session, and others where you can wrap it and re-retrieve the info if the session was cleared. This should leave you with a handful of specific cases that you know can give trouble to the users, so for those you can apply some of the suggestions others have already made.
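Moving session out of process to SQL Server is likewise just a config change (the server name is hypothetical, and the session state database must first be set up with aspnet_regsql.exe):

<system.web>
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=SQLSERVER01;Integrated Security=SSPI"
                cookieless="false"
                timeout="20" />
</system.web>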
One solution could be to deploy your application into a load balanced environment (web farm).
When deploying a new version you would use the load balancer to redirect requests to the server you are not deploying to.
App_Offline.htm is a great solution for this, I think.
On SO, we see the "application currently unavailable" page when a deployment begins.
I am not sure how SO handles it, but we usually put up a holding page, so whatever the user has done (adding a question or answering questions) does not get saved. As soon as he submits something, he will see a holding page asking him to try again after some time.
And if I am the user, I usually press the back button to make sure what I entered is saved in the browser history, so that I can post it later.
Some sites we run are in a clustered environment, so I take one server offline, inform the load balancer that it will not be available, and once I make sure that the new version is working fine, I bring it back live. I do the same thing for the next server.
Do we have any other option?
It is not a technical solution, but set up a scheduled maintenance window. You can announce it in advance, giving your user base fair warning that there is a possibility that the application will not be available during that time frame.