How do you deploy ASP.NET applications? Do you push them to the production servers using UNC paths/mapped drives? FTP? SFTP? SSH/SCP (via a third-party app installed on the server)? Something else? Or do you pull them from the production servers with a source-control update or some other mechanism? Consider production servers that are on the internet or in a DMZ: pushing requires opening insecure firewall ports (for UNC or FTP), does it not?
I'm trying to solidify my deployment philosophy for ASP.NET. The pieces that my ideal one-click build/deploy process will include are: MSBuild, Web Deployment Projects, and CruiseControl.NET. But I still struggle with how to actually deliver the bits to the production server.
After spending time on both Windows and *nix platforms, I get frustrated with the Windows deployment story and so am curious how others are doing this.
I use CruiseControl.NET and am very happy with it. I have an NAnt script that CruiseControl calls, which:
Gets the source from our staging SVN branch
Builds it
Copies the files to each production server
Creates a tag in our production branch in svn so we have a snapshot of what was pushed.
Works great! We used to do this all manually a year ago and I can't tell you how much better this is. There is a small learning curve and some upfront script writing. That might take a couple days but you will save much more time in the end.
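For a rough idea of what the NAnt targets amount to, the same steps expressed as plain commands would look something like this (the repository URL, solution name, and server shares are made-up examples, not the actual setup):

    rem get the source from the staging branch and build it
    svn checkout https://svn.example.com/myapp/branches/staging C:\build\myapp
    msbuild C:\build\myapp\MyApp.sln /p:Configuration=Release

    rem copy the build output to each production server (two servers shown as an example)
    robocopy C:\build\myapp\WebSite \\web01\inetpub\myapp /MIR
    robocopy C:\build\myapp\WebSite \\web02\inetpub\myapp /MIR

    rem tag what was pushed so there is a snapshot of the release
    svn copy https://svn.example.com/myapp/branches/staging https://svn.example.com/myapp/tags/release-20100101 -m "Production snapshot"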
This is how we publish:
Publish from Visual Studio to a local folder
Remove some unwanted files and folders (like the empty web reference folder that it creates for no apparent reason...)
Zip
Use Remote Desktop to copy the zip to the staging server
Unpack in a temp folder in the site (this gives the files the right permissions)
Move current files to a backup folder
Move new files to site
Test that it works on the staging server
Run a script (set up by the hosting company) that copies files to the two front end servers
(However, we had some hardware problems with one of the front-end servers, so as a short-term (?) solution we borrowed another server as a front-end server, and the script doesn't copy the files to that one. So for the last year or so I have had to use Remote Desktop to log on to the live server, manually copy the files, move the current files to a backup folder and move the new files into place.)
Environment: ASP.NET Core 2.1, Ubuntu.
In old style ASP.NET, when I did a bin deploy (uploaded some dll files for example), the webapp would detect that and reload itself - very useful.
With Core it doesn't do that. I need to stop and restart the dotnet MyApp.dll process.
How do I make it detect changes to binaries and reload?
There are file watchers on Ubuntu, like systemd or inotify, that can issue restart commands whenever files change, but I would strongly advise against that. Uploads can pause or be slow; when uploading 50 files, imagine a restart after every single one, every couple of seconds. The server has no way to know when you have finished uploading the last DLL. IIS has the same problem; it's only reliable in development because you refresh the page after the full DLL rebuild. In production you don't want random visitors to boot your site midway while it's still uploading - errors, file locks, all kinds of weird things can happen.
As pointed out by Chris Pratt, you want to script your deployment workflow. I don't know what environment you are developing on, but if you have Visual Studio and WinSCP it's as easy as writing a couple of lines with WinSCP's Scripting and Task Automation.
Then your publish workflow can be, for example, as follows:
Hit publish in Visual Studio
VS will execute the WinSCP script after the publish finishes (sketched below)
Authenticate on remote server
Upload publish folder to a remote folder
Remove old files
Prune logs
After everything is done, issue a systemctl restart kestrel-myapp command
Then your site is deployed, cleaned and restarted in the most reliable fashion with a single click.
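For example, the WinSCP script that Visual Studio kicks off after publishing might look roughly like the following. The host, key file, paths and service name are placeholders, and the restart line assumes the account has shell access and rights to restart the service; check WinSCP's scripting documentation for the exact syntax:

    # deploy.txt - run with: winscp.com /script=deploy.txt
    open sftp://deploy@myserver.example.com/ -privatekey=C:\keys\deploy.ppk
    synchronize remote C:\publish\MyApp /var/www/myapp -delete
    rm /var/log/myapp/*.log
    call systemctl restart kestrel-myapp
    exit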
There's nothing I'm aware of that will do this for you. IIS watches things like the bin directory, web.config, etc. and recycles the App Pool when it detects changes, but that's because it knows to. It's also a full-featured web server, and App Pool recycling on file changes is one of those features. Kestrel, which I assume you're using, is not. It's a very simple web server that does just what it needs to do, strictly as a web server. That's why a more traditional web server like IIS, Apache, Nginx, etc. is normally used as a reverse proxy in front of Kestrel - to provide more advanced functionality.
All that said, though, this is really just a matter of your release strategy. Personally, I'd encourage you to go with something far more robust than copy-pasting DLLs, but if you want to go that route, you can also script it. Create a shell script to copy the bin directory and restart your app. Your release should be on rails as much as possible. Every time human intervention is called for, you have a potential point of failure, because humans are inherently fallible. A script, however, once tested and confirmed to work, will pretty much work every time, because it always does the same things in the same order.
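If you do script it, a minimal sketch of such a shell script might look like this; the paths, service name and backup location are assumptions for illustration, not anything from the question:

    #!/bin/bash
    # Minimal "copy the binaries and restart" release script (illustrative names)
    set -e
    APP_DIR=/var/www/myapp
    INCOMING=/home/deploy/incoming/myapp
    BACKUP=/home/deploy/backups/myapp-$(date +%Y%m%d%H%M%S)

    sudo systemctl stop kestrel-myapp     # stop first so no DLLs are locked mid-copy
    cp -a "$APP_DIR" "$BACKUP"            # keep a copy to roll back to
    cp -a "$INCOMING/." "$APP_DIR/"       # overwrite with the newly uploaded binaries
    sudo systemctl start kestrel-myapp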
The problem is that when I commit the project directory, everything gets uploaded, including the source code.
Not really sure why you want to upload via FTP? You shouldn't commit your own compiled binaries to source control for deployment though.
You could take a look at AppHarbor: just push your code with Git and it will be built and deployed automatically.
more about AppHarbor
Real alternatives to Windows Azure PaaS (web role)?
Does it matter? Since ASP.NET pages can be compiled on the server, having source files on the web server is sometimes normal, and IIS knows not to allow access to them.
That said, uploading output binaries into source control is generally a bad idea - it is better to do the deployment from your build server.
Actually, this is kind of hard.
For months, I've tried to automate our deployment, without complete success. From my experience, I can see only one way to do that:
Have a build server on your deployment machine (or on the same network)
The build server will pull your code from the repository, say, once per minute and check for modifications. If there are modifications, it will execute the build scripts for that project. I suggest you use TeamCity, because it is very easy to use compared to CruiseControl (I'm not sure whether you can use Git with TFS). You can configure your build server to build your solution or project and then execute an MSBuild script that copies the files to the production folder (e.g. c:\inetpub\yourapp or \\my_server\inetpub\yourapp). You can use MSBuild's Copy task to do that.
UPDATE 1: I haven't tried it, but if it helps, you can push to an FTP server using git-ftp
UPDATE 2: It seems that someone did some workarounds and successfully deployed his app using Git and FTP.
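For reference, git-ftp's basic usage looks roughly like this (the credentials and URL are placeholders; check the project's README for the exact options):

    # first-time upload, then incremental pushes of committed changes
    git ftp init --user ftpuser --passwd secret ftp://ftp.example.com/public_html
    git ftp push --user ftpuser --passwd secret ftp://ftp.example.com/public_html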
I'm basically wondering what the best way to deploy an Asp.Net Web Site is, mostly from the point of view of security. Right now, I'm trying to publish the website using Visual Studio 2010. Could someone direct me to a good tutorial on how to do this securely? For example, can it be done over an encrypted connection via Visual Studio? Is it necessary to install any software on the server to do this? Should I use a different program to open up an SSL (TLS) connection first, and if so, which program (does it come standard with windows)?
The server is running Windows Server 2008. Development is on Vista.
Many thanks in advance for any direction in this matter!
Andrew
I would publish the site to your local machine and copy the files across to your test/production environment. As a rule we don't publish sites straight from VS to test or production.
For example you don't want to accidentally push things straight from dev into a live environment do you?
As far as the file transfer security goes you could use SFTP.
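For example, a one-shot SFTP copy with WinSCP's command-line client could look something like this; it assumes an SSH/SFTP server is installed on the Windows target, and the account and paths are placeholders:

    winscp.com /command "open sftp://deploy@testserver.example.com/" "put C:\publish\MyApp\* /inetpub/wwwroot/myapp/" "exit"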
Note: First thing is to check with the owner of the server, as they often will provide you an FTP connection and will take care of configuring IIS.
If you want to add security, make a key file and sign your assemblies, and consider running Dotfuscator on your DLLs; the Community Edition is included in Visual Studio. Here is an earlier question where I've put more info on Dotfuscator.
If you have to do the deployment yourself, here's a few things to consider.
XCopy (easy)
MSI (you have to create a setup program, which you can do easily in Visual Studio)
There is no security advantage in deploying using Visual Studio, but you can use Visual Studio to create a small setup program. One thing you want to make sure of for security is: DO NOT deploy any .cs files. Prepare your files: compile in Release mode, make sure debug is not enabled in your config file, and keep your bin folder and its DLLs, along with the .aspx, .asmx, .ascx, .svc, .css, .js, and config files.
XCopy: Install a small FTP server, or use one your company already has; this will allow you to get your files once you are logged into the target machine. You should be able to get an administrator account for the target machine (just ask the sysadmin of the domain), then log on using Remote Desktop, go to your FTP site, and download your files. Open IIS on the target machine, and create a virtual directory and an application pool. Copy your files to the location, configure your connection string to your DB if you use one, then test your website.
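Once the files are on the target machine, the "XCopy" part really is just a recursive copy into the folder the virtual directory points at, for example (paths are illustrative):

    rem copy the prepared release into the IIS folder
    xcopy C:\ftp\myapp-release C:\inetpub\wwwroot\myapp /E /I /Y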
MSI: same process as above, except the setup will create the virtual directory and pool for you.
Here is extra info on best practices from the official ASP.Net website.
If you have some control on the server (e.g. to configure IIS7), you might want to look into Microsoft Web Deploy (new product just been released):
http://weblogs.asp.net/scottgu/archive/2010/09/13/automating-deployment-with-microsoft-web-deploy.aspx
Haven't tried it myself, but looks quite slick and it apparently encrypts the data being copied up, so might suit you.
Currently we deploy compiled ASP.Net applications by publishing the web site locally and emailing a zip file to the system administrator with a (usually) lengthy set of instructions for deployment. This is because the first time we deployed an ASP.Net application to a customer the dev and test IIS instance were the same, and we were unable to deploy the site twice to the same machine. This set the tone for deployment on all subsequent projects.
I am now evaluating our deployment methods and am looking at the built-in deployment tools; specifically, custom installation tasks and using as much of the standard installer functionality as I can (mostly the user interface).
Secondly, I'm looking at merging deployments and automatic updates.
How do you go about deploying software in your organisation? What tools do you use, and what problems do you come across most frequently?
We have dedicated DEV, TEST, STAGE, and PRODUCTION servers.
We also have a dedicated build machine which runs Cruise Control.
Cruise Control is configured for a Continuous Integration build, which runs after code is checked in. It is also configured for separate Development, QA, Stage, and Production tasks.
To deploy to development, the code is first retrieved from SVN and built, then the "Precompiled Web" folder is copied to the development web site, and the web service project is copied to the development application server. Cruise Control is also configured to "tag" the source code before the build starts so we can reproduce the build at a later time, or branch from the tag if we need to do a hot fix.
To deploy to QA, the files are copied from the development machines to the QA machines.
Likewise, to deploy to Stage the files are copied from the QA machines to the Stage machines.
Finally, to deploy to production, the files are again copied from the Stage machines to the Production machines.
To configure each environment, we have a custom tool which is part of each environment's Cruise Control task that modifies connection strings, "debug=true|false", "customErrors=Off|RemoteOnly", and other environment-specific settings.
So each environment can be deployed with a button push from the Cruise Control dashboard.
One caveat is that we currently have the production database password configured in the Cruise Control config file... it would be nice to move it elsewhere!
Lastly, let me add that even though our production machines are in a dedicated hosting facility, the servers are accessible from our Cruise Control machine, which makes it very easy to do a production deployment. The only manual step is to encrypt the web.config files and remove the "AppOffline.html" file that Cruise Control puts up.
Let me know if this helps, or if you have any questions.
Thanks!
A couple of things that I have done are the following:
1) Use a Web Deployment Project in order to compile and clean the build, as well as handle web.config section replacement when the config changes between environments.
2) Use NAnt to do all of the building, archiving, and copying in a repetitive manner.
The Web Deployment Project ends up creating an MSBuild file which can be used in place of NAnt; however, I came from a Java background and used Ant all the time, so NAnt is my preference in .NET. If you add in the NAnt Contrib tasks, you will be able to deploy not only the files but also handle items such as your source control (in case it is not part of the default tasks) and SQL script execution for changes.
Currently I use both of the options together. I have my NAnt build file call the Web Deployment Project through MSBuild. With the configuration manager setup for each environment, it allows me to manage the web.config section replacements automatically and still have fairly decent control over my copying and archiving of a release.
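For context, the call from NAnt ends up being just an MSBuild invocation of the .wdproj file, roughly like this (the project file name and configuration are examples, and MSBuild.exe is assumed to be on the PATH):

    rem what the NAnt target executes, more or less
    msbuild MyApp_deploy.wdproj /t:Rebuild /p:Configuration=Release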
Hope this helps.
We use Web Deployment Projects, and VS 2008 setup projects to create an .msi from the output of the web deployment and other projects. A normal Windows app called 'setup' is used to do a lot of the DB creation and preliminary stuff, rather than trying to customise the setup projects with custom steps. It is a lot easier to do this yourself than to try to customise the MS code. This Windows app then calls the correct .msi files that the user needs.
Team Foundation Build runs every evening to rebuild the solution and copy everything to a 'Release CD' directory which anyone can access to do testing on the latest 'release'.
To be honest, TFS Build is a bit of overkill for a small team like ours, and I only use it because it's what I am used to.
In a previous company we used this http://www.finalbuilder.com/ and I can recommend it for ease of use and for the amount of software supported.
1) Build project with MSBUILD
2) FTP files to Production Environment
3) Copy / Paste manually to each web server
For intranet sites, we use CruiseControl in conjunction with SVN to have the site rebuilt automagically.
Theoretically you could extend this model over a VPN if you could map a drive remotely to a client's intranet. A quicker and dirtier solution might be to use a tool like SyncBack to sync the remote folder containing the compiled DLLs for the site.
Deploy Web Applications Using the Copy Web Tool
Text from the Microsoft Training Kit book on web-based development:
Web Setup Projects are useful if you are providing a Web application to many users (for example, allowing people to download the application from the Web and install it). If you are responsible for updating a specific Web site for your organization, it’s impractical to log on to the Web server and install a Windows Installer package each time you make an update. For internal applications, you can edit the Web application directly on the Web server. However, changes you make are immediately implemented in your production Web application, and this includes any bugs that might be there. To enable yourself to test a Web application, you can edit a local copy of the Web application on your computer and publish changes to the production Web server using the Copy Web tool. You can also use the Copy Web tool to publish changes from a staging server to a production Web server, or between any two Web servers. The Copy Web tool can copy individual files or an entire Web site to or from a source Web site and a remote Web site. You can also choose to synchronize files, which involves copying only changed files and detecting possible versioning conflicts in which the same file on both the source and remote site have been separately edited. The Copy Web tool cannot merge changes within a single file; only complete files can be copied.
We currently deploy web applications by creating a database and running SQL scripts through query analyzer. Then we copy the output from "publish website" and set up that website in IIS.
We have seen Web Setup Projects in Visual Studio, but that part seems to be thinly documented. For example, we are not clear how to ask the user for the IP address and password of the SQL Server instance. We also tend to get websites deployed this way coming up under folders like http://example.com/project, instead of just http://example.com.
Then there are issues with AJAX.NET not being installed, or some patch or other not applied.
So far, we have physical access to the servers. Pretty soon though we are going to be shipping CDROMs. What is the practical tradeoff between manual intervention and automation?
Avoid Visual Studio deployment, and automate as much as possible. Web Deployment Projects and NAnt can be your friends!
Briefly, our deployment setup:
We use RedGate SQL to script differences between dev and live database.
An NAnt build file which calls MSBUILD to build the web deployment project (.wdproj), zips up the resulting compiled web app (along with the SQL change script) and then uploads the zip file to the server.
On the server side, there is another NAnt build file which takes the application offline, backs up the database, backs up the website, runs the SQL change script, unzips the new version and brings the app online.
Step 3 is usually run "manually" (one double-click), but sometimes scheduled for late at night. You could do exactly the same from a CDROM, or even write a pretty little Windows Forms app as a wrapper.
Quite happy to give details of the NAnt script if you're interested.
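For orientation, the server-side sequence in step 3 boils down to something like the following. This is a sketch with made-up names (using the standard app_offline.htm trick, and assuming sqlcmd and 7-Zip are available), not the actual NAnt targets:

    rem take the app offline, back everything up, apply changes, bring it back
    copy app_offline.htm C:\inetpub\myapp\
    sqlcmd -S localhost -E -Q "BACKUP DATABASE MyAppDb TO DISK='D:\backups\MyAppDb.bak'"
    robocopy C:\inetpub\myapp D:\backups\myapp-previous /MIR
    sqlcmd -S localhost -E -d MyAppDb -i sql-changes.sql
    7z x release.zip -oC:\inetpub\myapp -y
    del C:\inetpub\myapp\app_offline.htm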
Have you tried using Web Deployment Projects? There is support for VS 2008 now, too.
I deploy mostly ASP.NET apps to Linux servers. Here is my standard workflow:
I use a source code repository (like Subversion)
On the server, I have a bash script that does the following:
Checks out the latest code
Does a build (creates the DLLs)
Filters the files down to the essentials (removes code files for example)
Backs up the database
Deploys the files to the web server in a directory named with the current date
Updates the database if a new schema is included in the deployment
Makes the new installation the default one so it will be served with the next hit
Checkout is done with the command-line version of Subversion and building is done with xbuild (msbuild work-alike from the Mono project). Most of the magic is done in ReleaseIt.
On my dev server I essentially have continuous integration but on the production side I actually SSH into the server and initiate the deployment manually by running the script. My script is cleverly called 'deploy' so that is what I type at the bash prompt. I am very creative. Not.
In production, I have to type 'deploy' twice: once to check-out, build, and deploy to a dated directory and once to make that directory the default instance. Since the directories are dated, I can revert to any previous deployment simply by typing 'deploy' from within the relevant directory.
Initial deployment takes a couple of minutes and reversion to a prior version takes a few seconds.
It has been a nice solution for me and relies only on the three command-line utilities (svn, xbuild, and releaseit), the DB client, SSH, and Bash.
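To make the shape of that concrete, here is a stripped-down sketch of such a deploy script. The repository URL, paths and the 'current' symlink switch are illustrative assumptions; the real work in my case is done by ReleaseIt, and the real script splits deployment and switchover into two runs:

    #!/bin/bash
    # deploy - simplified sketch of the checkout/build/dated-directory flow
    set -e
    STAMP=$(date +%Y-%m-%d_%H%M%S)
    SRC=/tmp/myapp-$STAMP
    RELEASE=/var/www/myapp/releases/$STAMP

    svn checkout https://svn.example.com/myapp/trunk "$SRC"
    xbuild "$SRC/MyApp.sln" /p:Configuration=Release
    # back up the database here with the site's DB client of choice
    mkdir -p "$RELEASE"
    cp -a "$SRC/MyApp/." "$RELEASE/"              # in reality, filter out source files (.cs) first
    ln -sfn "$RELEASE" /var/www/myapp/current     # make the new build the one that gets served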
I really need to update the copy of ReleaseIt on CodePlex sometime:
http://releaseit.codeplex.com/