Junctions or Virtual Directories for Web Applications? - asp.net

I see that junctions are a common way of referencing shared code in many projects. However, I have not seen them used in web applications before.
Our team is exploring the possibility of abandoning virtual directories in favor of junctions to simplify our build process. My goal is to compile a list of pros and cons in order to make an informed decision regarding this change.
Is it more appropriate to use junctions or virtual directories on web application projects?
Environment is ASP.NET, IIS6/IIS7, VS.NET.

Virtual directories vs. junctions is like comparing apples to pears: both create a sort of virtual copy of a directory, just as apples and pears are both fruit, but that is where the comparison ends.
First off, since Windows Vista, the new thing is symbolic links (which are essentially the same as junctions but can also point to a file or remote SMB path).
Symbolic links enable you to, for example, share every part of a web application except its Web.config and a stylesheet. This is something virtual directories can never do.
Also, virtual directories participate in ASP.NET's change monitoring. If you delete a file or directory from within your application, for instance, ASP.NET kills your app after the request completes, resulting in loss of session state, etc. If you use a symbolic link instead of a virtual directory, the change is not noticed and your app keeps on churning.
It's important to keep in mind that symbolic links are not an everyday feature in Windows. Yes, you can see in Explorer that a file or directory is a link, but it's not immediately visible what it links to. Also, from code it is much harder to tell whether a file is a link, so if you accidentally delete a file that a million symbolic links point to, all those symbolic links suddenly 'stop existing'.
Symbolic links also speed up deployment of multiple instances of the same application, since the only thing you have to do is copy a few actual files, and then create symbolic links to the source files for all the rest.
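As a rough sketch of what that looks like in practice (all paths and names below are made up, and mklink needs an elevated prompt on Vista/Server 2008):
rem D:\Sites\Master holds the real application; Instance2 gets its own
rem Web.config and stylesheet, everything else is linked back to the master.
mkdir D:\Sites\Instance2
copy D:\Sites\Master\Web.config D:\Sites\Instance2\Web.config
copy D:\Sites\Master\styles.css D:\Sites\Instance2\styles.css
mklink /D D:\Sites\Instance2\bin D:\Sites\Master\bin
mklink /D D:\Sites\Instance2\App_Code D:\Sites\Master\App_Code
mklink D:\Sites\Instance2\Default.aspx D:\Sites\Master\Default.aspx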

With virtual directories you need IIS installed in each environment. With both approaches, however, you need to manually maintain all references after each change (for example, when someone adds one more reference), which is not convenient.
Consider using a VCS with a referencing system, for example SVN with externals (see the sketch below). In that case you will have:
Automatic updating of references in each environment.
The ability to pin references to different versions of the external code, which avoids having to change every dependent application after each change to that code.
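For instance (repository URLs and folder names here are hypothetical), pinning shared code into a web application with svn:externals could look like this:
# pin the shared library at a known revision into a SharedLib folder
svn propset svn:externals "SharedLib -r1234 https://svn.example.com/shared/trunk" .
svn commit -m "Reference shared code via svn:externals"
# every environment then picks it up with a plain update
svn update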

Best way to deploy multiple instances of the same ASP.NET web application

Any creative minds out there want to help me think outside the box?
We have an ASP.NET web application which we need to deploy to say 100 machines. The web application is the same except for configuration files (web.config) and a few other files (scripts, mostly).
Our goals are:
To be able to upgrade the web application efficiently by sending out incremental changes which could also easily be reverted (on a per project or per file basis) in case of a problem.
Efficient support for binary files (as this web application contains compiled *.dlls)
Not require any compilation on the target machines.
Nice to haves would be:
Easy to remotely initiate an update
Opportunity to execute certain scripts (ex: database migrations) automatically post-update
We currently have scripts which commit a compiled version of each separate instance of the application to source control and copy over the appropriate configuration files to each folder. Each instance checks out (and later updates) a specific folder from source control.
This process is good enough for 100 instances, but would not scale to 1000 instances as it is storing clones of all the binary files with every release (one per instance). I'm looking for better ways of doing this.
The most obvious answer appears to be to keep the same process but have only one folder containing the files (one folder for all instances), with scripts that overwrite the appropriate files after an update on the target machines.
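Roughly, that idea could look like the batch sketch below (all paths are made up): mirror one shared release folder into each instance, then overlay the instance-specific files on top.
rem one shared, version-controlled release folder; per-instance override folders
svn update D:\Deploy\Release
for /D %%I in (D:\Deploy\Instances\*) do (
    robocopy D:\Deploy\Release %%I /MIR /XD .svn
    robocopy D:\Deploy\Overrides\%%~nxI %%I /E
)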
What's a better way of accomplishing this?

Where to put a new ASP.NET website?

Where's the best place for a production ASP.NET application? I mean a place that requires the least permission manipulation on folders and is probably the experts' choice.
Under C:\inetpub\wwwroot, or C:\inetpub, or elsewhere?
In the development/test phases I usually put it under C:\inetpub\wwwroot and create a new web application without setting bindings. But for the production version, with bindings, I'm not sure where the right place is.
You can put it anywhere you like; the key thing is to ensure that the app pool it is running under is set to run as a low-privileged user (like NT AUTHORITY\NETWORK SERVICE), then ensure that user has Read (and possibly Browse, if you want it) permissions on the folder you put your web app in. Very seldom (if ever) will the user need Write or Modify permissions on the folder.
and on a new system I had a lot of problems modifying batch files and setting permissions
Setting permissions should not be a problem; you should set the same basic permissions I mentioned above for the user you want to run the app pool as. You can use PowerShell or WMI for this, and you should use the same permissions no matter what folder you install into.
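As a command-line sketch of the same idea (the folder path is made up; icacls ships with Vista/Server 2008 and later, and is a simpler alternative to scripting it via PowerShell or WMI):
rem grant read & execute, inherited by subfolders and files, to the app pool account
icacls D:\WebSites\www.Foo.com /grant "NT AUTHORITY\NETWORK SERVICE:(OI)(CI)RX"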
You could always wrap all this up into an installer, then it can be as simple as hitting Next.. Next... Finish... in an installer wizard to set up your website on any machine. Doing this in an installer also gives you some certainty that nothing has been missed.
Personally I have a 'Development' folder on my D: drive which is then subdivided into different categories depending on the work. I generally don't use the inetpub directory, and any permission issues I come across I just set directly onto the relevant folder within my own development structure.
On production environments I've used in the past, we've generally done the same thing. Mainly to help backup scenarios really, but also because there's no strict need to use the default IIS directories - you're free to structure things how you like.
Personally, I always create a new folder (in the root of a drive) called WebSites. I then make sure it has the appropriate permissions for the website process(es) (aka App Pools).
eg.
C:\
 |_WebSites
    |_www.Foo.com
    |_www.Bah.com
It also makes it easier to manage because you don't have to hunt through the folder structures to find any/all websites.
But technically, it can be (more or less) anywhere - just needs to have the correct permissions set.
Bonus Answer
I also remove the Default Website from IIS .. which in effect means I can also delete c:\inetpub\wwwroot.
You can put the website anywhere on the server's hard disk; just make sure it is a secure folder. I also recommend not putting it on the same drive as the OS, in case that drive fails and you need to format it.
C:\inetpub\wwwroot and C:\inetpub are just the default places, nothing more.
Really depends on how the production server is configured and how the operations team likes to operate over there. Typically we set up a second "data" drive on servers for a few reasons:
a) Back in the old days, there were a lot of canonicalization attacks where the attacker would try to navigate from c:\inetpub to c:\winnt\cmd.exe. Putting things on a different drive prevented this sort of thing.
b) Recovery -- if the OS gets hosed, you can pretty easily reinstall/reimage, or move the data disks to another box and get things stood up fast.
c) It's typically a lot easier to do things like swap the non-OS disk in case you need more disk space or faster disks or whatever.
Basically, off the OS drive is a good idea. Though virtualization and modern deployment tools make lots of this matter less.

Working with multiple branches in ASP .NET

I've seen several other posts similar to this (namely https://stackoverflow.com/questions/5237/solutions-for-working-with-multiple-branches-in-asp-net) but there are several issues that I have that seem to be different than other similar posts.
I have an ASP .NET application that uses a virtual directory off of localhost. There are several spots in the code where I need to reference the name of the virtual directory so the virtual directory needs to be in place and named correctly in order for it to work. I'm also using my httpd.conf file to format my URLs to avoid cluttered querystrings.
That being said, I just published my application and now need to create a branched environment for bug-fixes whenever there is a bug in the live code and I don't want to upload the dev code.
The trouble is that I need to be able to easily run my branched code parallel to my dev code without needing to do a bunch of work with IIS and config files every time I want to load in my branched code. The drawbacks are that the parallel environment needs to have the virtual directory in place and work with the same httpd.conf (for URL formatting).
I don't think Cassini would work because I need SSL and of course...the httpd.conf and the virtual directories would need to still be in place.
The perfect solution in my mind would be to run a parallel website to localhost with the same httpd.conf and the same virtual directory...but I'm running XP Pro and they don't "do" multiple websites.
Have your build process create the virtual directory each time the build is run.
I've used NantContrib's mkiisdir task for this.
With this approach you can't run multiple branches simultaneously, but you can quickly switch between branches by building the branch you want to run.
I would do as above, but hook it into your solution's post-build event; this wouldn't be parallel, though, more of a quick switch. I think there's a registry hack out there to get multiple sites in IIS, or, if memory serves, creating an additional site through a script works and it's just the GUI that's locked down. The better solution would be to upgrade to Windows Server and have the different branches build to different ports.

SCM for ASP.net

As part of my overall development practices review I'm looking at how best to streamline and automate our ASP.net web development practices.
At the moment, our process goes something like this:
Designer builds frontend as static HTML/CSS on a network share. This gets tweaked until signed off. (e.g. http://myserver/acmesite_design)
Once signed off, developer takes over and copies over frontend HTML/CSS to a new directory on the same server (e.g. http://myserver/acmesite_development)
Multiple developers work on local copy until project is complete.
Developer publishes code to an external publicly accessible server for a client to review/signoff.
Edits made locally based on feedback.
Republish to external server.
Signoff
Developer publishes to live public server
What goes wrong? Lots of things!
Version Control — this is obviously a must and is being introduced
Configuration errors — many many times, there are environment specific paths and variables (such as DB names, image upload directories, web server paths etc. etc.) which incorrectly get copied from local to staging to live etc. etc. with very embarrassing results.
I'm pretty confident I've got no. 1 under control. What about configuration management? Does anyone have any advice on how best to manage an application's structure within ASP.NET apps to minimize these kinds of problems?
I found that using SVN, NAnt and NUnit with Cruise Control.net solves a lot of the issues you describe. I think it works well for small groups and it's all free. Just need to learn how to use them.
CruiseControl.net helps you put together builds and continuous integration.
Use NAnt or MSBuild to do different environment builds (DEV, TEST, PROD, etc).
http://confluence.public.thoughtworks.org/display/CCNET/Welcome+to+CruiseControl.NET
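For example (the solution name is made up), if you define solution configurations named DEV, TEST and PROD in Visual Studio, the CI server can produce each environment's build with a plain msbuild call:
rem one build per target environment, driven by solution configurations
msbuild AcmeSite.sln /t:Rebuild /p:Configuration=TEST
msbuild AcmeSite.sln /t:Rebuild /p:Configuration=PROD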
You got the most important part right. Use version control. Subversion is a good choice.
I usually store configuration along with the site; i.e. when coding a PHP-based site I have a file named config.php-dist. If you want the site to work at all you'll have to copy + edit in all the required parameters (this avoids storing passwords in version control). The -dist file should have reasonable defaults.
Upload directories should be relative if possible; actually all directories should be relative. I'm not experienced in ASP.net, but if it's anything like PHP the current directory is always the directory of the file being requested. If you channel all requests through a single file (i.e. index.asp), then this can even be found programmatically. Or you could find it programmatically by using the equivalent of dirname(__FILE__) in your configuration file.
I also recommend installing IIS (or whatever web server you are using) on all development workstations (including the designers'). Makes life easier, as no one can step on each other's toes. What one has to do is simply add test hosts to the hosts file (\windows\system32\drivers\etc\hosts, iirc) in addition to adding a site to the local IIS. This plays well with version control (checkout, add site to IIS and hosts file, edit edit edit, commit).
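For example (the host names are made up), from an elevated/administrator prompt:
rem add local test hosts pointing at the developer's own IIS
echo 127.0.0.1 acmesite.local>> %SystemRoot%\system32\drivers\etc\hosts
echo 127.0.0.1 acmesite-design.local>> %SystemRoot%\system32\drivers\etc\hosts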
One thing that really helps is making sure you keep your paths relative where you can and centralise them where you can't, so when I've been working with ASP.Net I have tended to use web.config to store any configuration and path related data that can't be found programmatically. It is quite possible to find information like your current application path programmatically through the Request object - it's worth looking in some detail over what the environment makes available to you.
One way to make sure you don't end up with something that depends on the path name is to have a continuous integration server execute your test suite against your application. Each time this happens, you create a random file path. As soon as someone introduces a dependency on the file path, it will fail.

Keeping dot files synched across machines?

Like most *nix people, I tend to play with my tools and get them configured just the way that I like them. This was all well and good until recently. As I do more and more work, I tend to log onto more and more machines, and have more and more stuff that's configured great on my home machine, but not necessarily on my work machine, or my web server, or any of my work servers...
How do you keep these config files updated? Do you just manually copy them over? Do you have them stored somewhere public?
I've had pretty good luck keeping my files under a revision control system. It's not for everyone, but most programmers should be able to appreciate the benefits.
Read
Keeping Your Life in Subversion
for an excellent description, including how to handle non-dotfile configuration (like cron jobs via the svnfix script) on multiple machines.
I also use subversion to manage my dotfiles. When I login to a box my confs are automagically updated for me. I also use github to store my confs publicly. I use git-svn to keep the two in sync.
Getting up and running on a new server is just a matter of running a few commands. The create_links script just creates the symlinks from the .dotfiles folder items into my $HOME, and also touches some files that don't need to be checked in.
$ cd
# checkout the files
$ svn co https://path/to/my/dotfiles/trunk .dotfiles
# remove any files that might be in the way
$ .dotfiles/create_links.sh unlink
# create the symlinks and other random tasks needed for setup
$ .dotfiles/create_links.sh
It seems like everywhere I look these days I find a new thing that makes me say "Hey, that'd be a good thing to use DropBox for"
Rsync is about your best solution. Examples can be found here:
http://troy.jdmz.net/rsync/index.html
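A minimal example (host name and file list are made up), pushing a few dot files to a remote home directory over ssh:
# copy selected dot files to the remote home directory, preserving permissions
rsync -avz -e ssh ~/.bashrc ~/.vimrc ~/.screenrc user@example.com:~/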
I use git for this.
There is a wiki/mailing list dedicated to the topic.
vcs-home
I would definitely recommend homesick. It uses git and automatically symlinks your files. homesick track tracks a new dotfile, while homesick symlink symlinks new dotfiles from the repository into your home folder. This way you can even have more than one repository.
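Roughly (the repository URL is hypothetical, and command names may differ between homesick versions), getting a new box set up looks something like:
# homesick is a Ruby gem; 'dotfiles' is the name of the hypothetical castle
gem install homesick
homesick clone https://github.com/you/dotfiles.git
homesick symlink dotfiles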
You could use rsync. It works over ssh, which I've found useful since I only set up new servers with ssh access.
Or, create a tar file that you move around everywhere and unpack.
I store them in my version control system.
I use svn ... having a public and a private repository ... so as soon as I get on a server I just
svn co http://my.rep/home/public
and have all my dot files ...
I store mine in a git repository, which allows me to easily merge beyond system dependent changes, yet share changes that I want as well.
I keep master versions of the files under CM control on my main machine, and where I need to, arrange to copy the updates around. Fortunately, we have NFS mounts for home directories on most of our machines, so I actually don't have to copy all that often. My profile, on the other hand, is rather complex - and has provision for different PATH settings, etc, on different machines. Roughly, the machines I have administrative control over tend to have more open source software installed than machines I use occasionally without administrative control.
So, I have a random mix of manual and semi-automatic process.
There is netskel where you put your common files on a web server, and then the client program maintains the dot-files on any number of client machines. It's designed to run on any level of client machine, so the shell scripts are proper sh scripts and have a minimal amount of dependencies.
Svn here, too. Rsync or unison would be a good idea, except that sometimes stuff stops working and I wonder what was in my .bashrc file last week. Svn is a life saver in that case.
Now I use Live Mesh which keeps all my files synchronized across multiple machines.
I put all my dotfiles in to a folder on Dropbox and then symlink them to each machine. Changes made on one machine are available to all the others almost immediately. It just works.
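For example (the Dropbox folder name is just a convention I'd assume):
# the real files live in Dropbox; each machine just gets symlinks to them
ln -s ~/Dropbox/dotfiles/.vimrc ~/.vimrc
ln -s ~/Dropbox/dotfiles/.bashrc ~/.bashrc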
Depending on your environment you can also use (fully backed-up) NFS shares ...
Speaking of storing dot files in public, there are
http://www.dotfiles.com/
and
http://dotfiles.org/
But it would be really painful to manually update your files as (AFAIK) none of these services provide any API.
The latter is really minimalistic (no contact form, no information about who made/owns it etc.)
briefcase is a tool to facilitate keeping dotfiles in git, including those with private information (such as .gitconfig).
By keeping your configuration files in a public git repository, you can share your settings with others. Any secret information is kept in a single file outside the repository (it's up to you to backup and transport this file).
-- http://jim.github.com/briefcase
mackup
https://github.com/lra/mackup
lra/mackup is a utility for Linux & Mac systems that syncs application preferences using almost any popular shared-storage provider (Dropbox, iCloud, Google Drive). It works by replacing the dot files with symlinks.
It also has a large library of hundreds of supported applications: https://github.com/lra/mackup/tree/master/mackup/applications
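Roughly (assuming Dropbox as the storage backend and that the command names are unchanged):
# on the first machine: move supported configs into Dropbox, leave symlinks behind
pip install mackup
mackup backup
# on every other machine: symlink the stored configs back into place
mackup restore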
