Good way to make changes to production database / source code - asp.net

I'm interested to find out what would be a good way to make changes to the production database and source code of a web application (ASP.NET, SQL Server 2008).
A little more detail: we develop on local machines, and then we need to transfer the code and database changes to production (pretty much the standard story).
At the moment we do it in the evening: we change the database directly from Management Studio on the production server, and then just overwrite the existing ASP.NET code (copy/paste).

You're talking about release management. What you're asking about is a big subject with a LOT of different answers, and the best solution for you is not something we can tell you. There are trade-offs to consider.
For example, what you're describing is a very basic release management process that would be considered an "immature" process. It does not take into account rollback plans, versioning, separation of concerns, proper testing, or any of a hundred other factors that a "mature" release management process involves.
A mature process is very good, but if you don't have the resources, it's not feasible.
To get to the point, I don't think your question can be answered fully here. I'd suggest starting to research "change management", "release management", "application lifecycle management", and "application development lifecycle". I've put a few good starter links below.
Just a forewarning, though: you are asking a question that's going to open your eyes and your world in ways you probably haven't considered. There are things like automated builds to consider, and tools to do it for you (high-priced, free, and everything in between).
http://en.wikipedia.org/wiki/Release_management
http://en.wikipedia.org/wiki/Application_lifecycle_management
A few simple options for JUST what you're asking about can be found here:
http://msdn.microsoft.com/en-us/library/7hd4c0x3(VS.80).aspx
Also, since you talked about source code without mentioning which source control system you're using, I need to say: if you're not already using source control, you need to start. You'll wonder how you ever lived without it once you start using it.

It depends on whether it's the first deployment of a new app or an update to an existing app.
For small updates, record all your database changes as SQL scripts. You must strictly enforce that all changes to the development database are applied as SQL scripts. Put the scripts in source control. Deploy the update by running the scripts on production.
For new apps you may have thousands of scripts, and you can't run them individually; consolidating them into a master script takes too much time (although you still want to check EVERY script into source control). In this case, when you reach a milestone in development, FREEZE the development database and declare it a baseline. Use the database tools to generate the master script(s), and deploy to production by running those script(s). Manually create data scripts for your lookup tables to keep that data separate from junk dev data.
Avoid a database copy. Avoid changing things by hand through the GUI. Scripts are the way. How you go about collecting the scripts, consolidating them into master scripts, generating them, etc. is another story.
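For illustration only (this sketch is not from the original answer), an individual change script might look something like the following. The table, column and file names are invented, and the IF NOT EXISTS guards are just one convention for making scripts safe to re-run:

-- 0042_add_customer_email.sql: one small, reviewable change per script
IF NOT EXISTS (SELECT 1 FROM sys.columns
               WHERE object_id = OBJECT_ID(N'dbo.Customer') AND name = N'Email')
BEGIN
    ALTER TABLE dbo.Customer ADD Email NVARCHAR(256) NULL;
END
GO

-- Lookup data lives in its own script, separate from junk dev data
IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'SHIPPED')
    INSERT INTO dbo.OrderStatus (StatusCode, Description)
    VALUES ('SHIPPED', 'Order has been shipped');
GO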

Related

What is the standard procedure for deploying an MVC website with a team of programmers?

I am used to working in a team that uses Web Forms and Visual SourceSafe, so the procedure would be something like:
Get the latest version at the beginning of the day and before checking out.
Check in all files at the end of the day, and notify the team not to upload.
When a page is finished and ready to upload, take a backup, upload just your files, and check in.
The team was small enough that it was manageable.
Since you precompile in MVC and Web Application projects, it is not possible to upload the site while pages that are still in development are checked in.
What is the normal procedure for deployment in small/medium/large companies?
Thanks.
There is no normal procedure, although as a rule of thumb it generally gets more complex and convoluted the bigger the company.
Consider your own process: if there is nothing wrong with it, then don't change it.
If you need to expand your team, consider a more collaborative way to manage code and deployment. Deployment sucks and nobody wants to do it manually over and over, and verbally telling people you're uploading and asking them not to is even worse. Consider a build server such as TeamCity or TFS and set up a deploy process that manages this for you.
Consider moving from SourceSafe to Subversion, Git, TFS, etc.
Research ALM across the web (there's lots of good shared knowledge on blogs), but again, consider your needs first, and think about whether any changes will actually be cost-effective and gain you productivity.

How do I keep compiled code libraries up-to-date across multiple web sites using version control?

Currently, we have a long list of various websites throughout our company's intranet. Most are inside a firewall and require an Active Directory account to access. One of our problems, as of late, has been the increase in the number of websites and the addition of a common code library that stores our database access classes, common helper functions, serialization methods, etc. The goal is to use that framework across all websites throughout the company.
So far we have consistently upgraded the in-house data entry application with these changes; it is up to date. The problem, however, is maintaining all of the other websites. Is there a best practice or way to find out which version is on each website and upgrade accordingly? Can I have a centralized place where I keep these DLLs and have sites reference them? What's the best way to find out what versions are on these websites without having to go through each and every single website, find out the version, and upgrade after every change?
Keep in mind, we run the newest TFS and are a .NET development team.
At my job we have a setup similar to yours: lots of internal applications that use common libraries. I have spent the best part of a year sorting this all out.
The first thing to note is that nothing you mentioned really has anything to do with TFS, but is really a symptom of the way your applications, and their components, are packaged and deployed.
Here are some ideas to get you started:
Setup automated/continuous builds
This is the first thing you need to do. Use the build facility in TFS if you must, or make the investment into something like TeamCity (which is great). Evaluate everything. Find something which you love and that everyone else can live with. The reason why you need to find something you love is because you will ultimately be responsible for it.
The reason why setting up automated builds is so important is because that's your jumping off point to solve the rest of your issues.
Setup automated deployment
Every deployable artifact should now be built by your build server. No more manual deployment. No more deployment from workstations. No more Visual Studio Publish feature. It's hard to step away from this, but it's worth it.
If you have lots of web projects then look into Web Deploy, which can easily be automated using MSBuild or PowerShell, or go fancy and try something like Octopus Deploy.
Package common components using NuGet
By now your common code should have its own automated builds, but how do you automatically deploy a common component? Package it up into NuGet and either put it on a share for consumption or host it in a NuGet server (TeamCity has one built in). A good build server can automatically update your NuGet packages for you (if you always need to be on the latest version), and you can see which version you are referencing by checking your packages.config.
I know this is a lot to take in, but it is in its essence the fundamentals of moving towards continuous delivery (http://continuousdelivery.com/).
Please be aware that getting this right will take a long time, but the process is incremental and you can evolve it over time. However, the longer you wait the harder it will be. Don't feel like you need to upgrade all your projects at the same time; you don't. Just do the ones that are causing the most pain.
I hope this helps.
I'd just like to step outside the space of a specific solution for your problem and address the underlying desire you have to consolidate your workload.
Be aware that any patching/upgrading scenario will have costs that you must address - there is no magic pill.
Particularly, what you want to achieve will typically incur either a build/deploy overhead (as jonnii has outlined), or a runtime overhead (in validating the new versions to ensure everything works as expected).
In your case, because you have already built your products, I expect you will go the build/deploy route.
Just remember that even with binary equivalence (everything compiles, and unit tests pass), there is still the risk that the application will behave differently after an upgrade, so you will not be able to avoid at least some rudimentary testing across all of your applications (the GAC approach is particularly vulnerable to this risk).
You might find it easier to accept that just because you have built a new version of a binary, doesn't mean that it should be rolled out to all web applications, even ones that are already functioning correctly (if something ain't broke...).
If that is acceptable, then you will reduce your workload by only incurring resource expense on testing applications that actually need to be touched.

Best Practice for maintaining a TSQL database creation script for a web application

We have an ASP.NET web application and need to maintain the database creation and initialization script.
Are there any industry best practices that people know of for maintaining database creation and initialization scripts? I can think of two main approaches:
Maintain a T-SQL creation script directly by hand.
Maintain a master database and generate the script from it, which is then checked into SourceSafe.
Also, the script should be able to be tracked through source control, i.e. the table order should be controllable.
If possible it should also include the ability to track initialization data, either in the same or a separate script.
Currently we generate the script from Management Studio, but the order of the tables seems to be random.
And the more automated the solution the better.
The problem is not maintaining the script, nor maintaining a 'master' copy of the database. The real problem is upgrading existing database(s). You make your modifications in the developer environment, they are then propagated to the test environment, and finally pushed into the production environment. While at the developer and test stages it is possible to start from scratch, in production you always have to upgrade the existing deployment.
In my experience the best practice is to use upgrade scripts. This practice is useful even with a single deployed site, but it becomes invaluable with multiple locations that may be at different versions. Even with one single operational site it is still useful to be able to test the upgrade repeatedly (starting from backups of the current version), keep the changes in source control, and have a well formalized and peer reviewed change procedure (the upgrade script). Upgrade scripts can also be tailored to the specific needs of the operational site, like handling a large table with special care, or dealing with encrypted data, or any of the myriad details that diff-based tools neglect or ignore. The main disadvantage is that the scripts have to be written, which requires real T-SQL knowledge (forget all the 'designers' in your favorite management tool).
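To make that concrete (this sketch is mine, not the answerer's): a common convention is to guard each upgrade script with a check against a small version table, so it only runs against the schema version it was written for. The SchemaVersion table, version numbers and Invoice change below are all hypothetical:

-- upgrade_v12_to_v13.sql
IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE Version = 12)
BEGIN
    RAISERROR('Database is not at version 12; aborting upgrade.', 16, 1);
    RETURN;
END

BEGIN TRANSACTION;

ALTER TABLE dbo.Invoice ADD DueDate DATETIME NULL;  -- the actual schema change

UPDATE dbo.SchemaVersion SET Version = 13;

COMMIT TRANSACTION;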
You might want to check out RedGate SQL Source Control.
Are you looking for Visual Studio Database Projects?
I use database projects to store all database objects (tables, views, functions, keys, triggers, indexes, across schemas) and keep versioning in TFS. You can build the database project to ensure that everything is valid, and you can deploy to a fresh database or do a schema comparison with an existing database.
I also keep all reference and setup data in post-deployment scripts, which are run automatically after deployment.
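A typical post-deployment script of that kind (purely illustrative; the Country table here is made up) uses MERGE so the reference data can be re-applied safely on every deployment:

-- Script.PostDeployment.sql: idempotent reference data
MERGE dbo.Country AS target
USING (VALUES ('US', 'United States'),
              ('CA', 'Canada')) AS source (CountryCode, CountryName)
ON target.CountryCode = source.CountryCode
WHEN MATCHED THEN
    UPDATE SET CountryName = source.CountryName
WHEN NOT MATCHED THEN
    INSERT (CountryCode, CountryName)
    VALUES (source.CountryCode, source.CountryName);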

Should we have separate database instance for each developer?

What is the best way of developing a database-based application? We can have two approaches:
One common database for all the developers.
A separate database for each developer.
What are the pros and cons of each? And which one is better way?
Edit: More than one developer is supposed to update the database, and we already have SQL Server 2005 Express on each developer machine.
Edit: Most of you are suggesting a common database. However, if one of the devs has modified the code and the database schema, but has not committed the code changes while the schema changes have already gone to the common database, won't that possibly break the other developers' code?
Both -
I like a single database that changes are tested on before going live or to a 'formal' test environment. This is your developers' sanity check; it stays up to date with the live system and it makes sure they always consider each other's changes. The rule should be that changes don't go on here if they might break something else.
A database per developer is great (even essential) when more than one developer is making updates. It allows them all the development flexibility they want without breaking things for other developers.
The key is to have a process for moving database changes from development through to your live system, and stick to your process.
Shared database
Simpler
Fewer cases of "It works on my machine".
Forces integration
Issues are found quickly (fail fast)
Individual databases
Changes never affect other developers, but that is also a bad thing from a continuous integration point of view.
We use a shared development database and it works out nicely. Our schema rarely changes in a way that makes it backwards incompatible, but occasionally a design change will occur before we go live, and we simply ask the other developers to update.
We do have separate development application (web) servers, but they share the same database. Our developers do have the option to use their own database, as they know how to set this up, and will do that on occasion, but only temporarily. The norm, for us, is to share the database.
Thought I'd throw this out there: why not let every developer host their own instance of SQL Server Developer Edition on their desktop and then have a shared server for each of the other environments (development, QA, and prod)? I think even the basic MSDN subscription that comes with Visual Studio Pro (if you opt for it) includes a license for SQL Server Developer Edition.
The developer can work on their desktop without impacting the others and then you can have them move the code to the next shared environment as you see fit (at will, with daily/weekly builds, etc.).
EDIT:
I should add that the desktop instance allows developers to do things that the DBAs often restrict on shared environments. This includes database creation, backup/restore, Profiler, etc. These things are not essential, but they allow the developer to be much more productive while reducing the demands they make on your DBAs.
The shared environment is completely necessary for testing; I would not recommend going from desktop straight to production. But you gain a lot by allowing the developers to have 100% control over a given database environment (including isolation from others) at a relatively minor cost.
It depends on your development, testing and maintenance cycles, and also on the size and location of the development team (and of course the organization). If you support several versions of the database you might need even more environments.
In the real world I have found the following approach rather satisfying:
a single central database/application for testing purposes, into which the changes from the various developers are periodically merged
local copies for development (so you are free to drop and reload the whole database)
upgrade scripts maintained for any changes to the schema, auxiliary data and sample data sets
Here are some further points:
If two developers (or two teams) are working on changes that can affect each other, then they should complete their tasks independently and then integrate/merge and test. For this it is much better to have separate development environments (unless they have to work together, in which case I consider them part of the same team; still, they can work on their own copies of the database and share them if necessary).
If they are working on changes that do not influence each other, they can work on the main server, or on their own local copies of the database.
So developing on a local copy has all the benefits with no risk in the general case (when you support multiple versions of the system and maintain upgrade scripts anyway).
Still, it is great if you can share test cases, so the ability to dump and restore the database easily and quickly is a big plus.
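For the dump/restore part, refreshing a local copy from a shared backup can be as simple as the following sketch (the database name, file paths and logical file names are all invented for the example):

-- Refresh a local development copy from last night's backup of the shared database
RESTORE DATABASE MyAppDev
FROM DISK = N'C:\Backups\MyApp_nightly.bak'
WITH REPLACE,
     MOVE N'MyApp_Data' TO N'C:\SqlData\MyAppDev.mdf',
     MOVE N'MyApp_Log'  TO N'C:\SqlData\MyAppDev_log.ldf';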
EDIT:
All of the above assume that having a copy on the local machine of the whole system for testing purposes is feasible (size, performance, licenses, etc).
I would opt for solution #1 : One common database for all the developers.
Pros
Less expensive for the infrastructure;
Only one dump is required when it's time to refresh the development database;
Everyone develops with the same data, so it closely represents the production environment;
Cons
If one developer performs a bad operation, it could impact a large number of developers.
As for solution #2: one independent database for each of the developers.
Pros
This can be useful for new feature development, when the work requires isolation;
Cons
More expensive for the company (infrastructure, licences...);
Multiplication of problems caused by overly isolated development environments (works in the developer's environment, but is not integrated);
Multiplication of dumps the DBAs have to take of the same copy of the production environment.
Considering the above, I would recommend, depending on your company size:
One database for development;
One database for testing the integration;
One database for acceptance tests;
One for new feature development that will perhaps require integration tests.
If your company doesn't require integration tests, then go with acceptance tests, this step is crucial before going to production.
One per developer plus a continuous integration and build server to run unit and integration tests. That gives you the best of both worlds.
Having all developers modify a single dev database quickly becomes less productive once the amount of database change reaches a certain level, because it forces a developer to deploy changes to the shared database before they are ready to check in, which means other parts of the code line may break unnecessarily.
Simple answer:
Have one development database, and if developers want their own they can just run their own instance on their own machines. Just be sure to test and publish against the shared one.
We do both:
We use code generation where I work, and our database is generated as well. So we have an instance on each developer's box where the database is generated. Then we use the generated scripts to apply the changes to a central test database. If that goes well, we apply the changes to the production database during a release.
What's nice about this approach is that when our "source of truth" is checked into source control, all the database changes are automatically distributed to the other developers when they rebase and regenerate. It works well for us.
The best way is a single database on the Test/QA server and one database (probably on the developer's local computer) for each developer (so 10 developers work with 10 + 1 databases).
It is the same approach as for general development: each developer has their own copy of the source code on their local machine.
The multiple-database approach also simplifies keeping the database schema in version control. We keep our database creation scripts in SVN.
We are using the approach, described here:
http://www.sqlaccessories.com/Howto/Version_Control.aspx
You might also want to look at Refactoring Databases. Aside from discussing database changes, the author includes discussions of going from development to production in a way that reduces risk.
Why on earth would you want a separate database for all developers?
Have one common database for all; that way the table structure is consistent, and so are the SQL statements.
The biggest problems with developers having their own databases are:
First, a developer database is unlikely to be the size of the real production database (if you take all the databases we need to work with here, they would take up several hundred gigabytes of space, which I don't have available on my machine). This causes bad code to be written that will never work on a large database for performance reasons. SQL code should never be written against a data set significantly smaller than the one on prod.
Second, developers who use their own database create problems when they spend a long time developing something and then find out only after they merge with a real database that it affects something else. You find this stuff much faster when you share the environment, so in the end there is less wasted development time.
Third, developers working on related things need to know about the changes you are making, as it will affect their change.
When you know you are going to affect others, I think you tend to be more careful about what you do, which is a plus in my book.
Now, the shared database server should have what we call a scratch database: a place where people can create and test table changes. So if they are doing something that might need to drop and recreate a table (which should be a rare case!), they can test the process first by copying the table to the scratch database and running their process there, and then make the change to the real database when they are sure it works. Or we often copy a backup of a table to the scratch database before testing a particular change, so we can easily recreate the old data if it goes bad.
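As a small illustration of the scratch-database idea (the database and table names here are invented), copying the table aside first makes the risky step reversible:

-- Keep a copy of the table in the scratch database before trying the risky change
SELECT *
INTO ScratchDB.dbo.Customer_BeforeRestructure
FROM AppDB.dbo.Customer;

-- ...test the drop/recreate process against the scratch copy first...
-- If the change goes bad on the real database, the old data can be pulled back from this copy.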
I see no advantages at all to using individual databases.

ASP.NET website deployment best practices resource suggestions

I have looked through the related questions, and none of them have provided me the information I am looking for.
Currently the team I work on deploys individual .aspx (and .aspx.vb) files for bug fixes/enhancements. I am trying to effect change, as I really believe that deploying the whole compiled site is less error prone. As this is a significant change from the way things have been done, my suggestions have been met with significant resistance.
As my google-fu has not been up to par lately, I was hoping the SO community could either tell me that I am off my rocker, and that there is nothing wrong with moving individual files, or point me to some really good resources which would allow me to make a stronger case.
Edit:
This has all been great info and reinforces the arguments that I have already been making. Can anyone argue the other side?
Deploying individual files for bug fixes and enhancements is not a wise strategy. It sounds like you need a comprehensive build and deployment process. That doesn't mean it has to be complicated, as there are some good tools available nowadays.
Build and deployment can get detailed, so as a minimum start, take a look at the Microsoft Web Deployment Tool (http://www.iis.net/extensions/WebDeploymentTool). Install the tool on your build server and on your deployment server. Stage your ASP.NET content locally using the Visual Studio Publish command, then use the above tool to synchronize the entire package to the deployment server. I like this approach because it can be completely automated. When doing builds and deployments, aim for complete automation to reduce potential errors.
This is the bare minimum, but you will at least be certain that when specific files are changed, they are ALL synchronized on the deployment server.
Personally, being able to roll back immediately is the most important thing to me. Again, web site projects are very hard when it comes to tracking changes.
You can find a good detailed comparison here; I am reproducing the article below.
1) Deployment. If you need in-place deployment, this model is perfect. However, it's not recommended, since you are exposing your logic in clear text: anybody who has access to the physical server can mess with your code and you are never going to notice. You can try to make a precompiled web site, but you are going to end up with a lot of DLLs and almost untouchable aspx files. Microsoft recognized this limitation and released the Web Deployment Project tool.
2) You need to keep track of what you changed locally and what you uploaded to the production server. There is no version control. Visual Studio has a Copy Web Site tool, but it fails to help; I had to build my own tool, which kept track of changes based on Visual SourceSafe.
3) When you hit F5 for debug execution it takes a good two minutes to compile and execute the whole project. Of course you can attach the debugger to an existing process, but this is not an obvious solution.
4) If you ever try to generate controls on the fly you will hit the first unsolvable limitation: how to reference other pages and controls. Page and control compilation happens on a per-directory basis. In the best case you get an assembly for each directory; in the worst case each page or control gets its own assembly. If you need to reference another page from a control or another page you need to explicitly import it with the #Reference directive.
So for,
customControl = this.LoadControl("~/Controls/CustomUserControl.ascx") as CustomUserControl;
You need to add a <%@ Reference Control="~/Controls/CustomUserControl.ascx" %> directive to the page first, so that the CustomUserControl type is visible to your code.
But what if you want to add something really dynamically and can't put in all the appropriate #Reference directives? Or what if you are creating a server control and it doesn't have an ascx file, so you don't have a place for #Reference? Since each control has its own assembly, it's almost impossible to use reflection.
Web Application Projects re-appeared in Visual Studio 2005 SP1. They solve all the issues mentioned above.
1) Deployment. You get just one DLL per project. You can create redistributable packages and repeatable builds. You can have versioning and build scripts.
2) If you made a code-behind change you can upload just one DLL. If you made an aspx change you can upload just the aspx file.
3) Execution takes 2-3 seconds maximum.
4) The whole project is in one assembly, which makes it easy to reference any page or control. Conclusion: for any kind of serious work you should use Web Application Projects. Special thanks to Rick Strahl for his amazing article Compilation and Deployment in ASP.NET 2.0.
I agree with Rich.
Further information:
Deploying your SOURCE code (the .vb files) to the server is a BAD idea. Compile it. Obfuscate if you can; just don't deploy straight source. Imagine an attacker who gains access to the system: they could easily change your code and you might not ever notice. Yes, you can use a tool like Reflector to decompile, but it's really hard to decompile a full site, make the changes you want, and put them back into production.
Deploying a single file might very well cause some type of problem in a related module. I'm guessing you guys don't really do QA. Tell them it's time to grow up.
Compiling your site will reduce JIT (just in time) compilation. Think performance.
I'm also going to guess that pretty much everyone has production server access. This is bad from the company's perspective as you have no controls in place. What happens when an employee decides to cause some havoc before leaving?
What you are describing is in line with cowboy coding. Sure, it's fun to ride to the rescue, but this style frequently blows everything up.
It's bad for rolling back. If you deploy as a web site vs. a web app, yes, you can do quick patches of one or two files, but what if you ever need to roll back to a previous version? Good luck tracking down all the files that were updated to make the new version. I much prefer the concept of a "version" for organizational reasons, and the compiled web app is much more in line with this than a "website" project.
We had this dilemma and ended up going with the compiled version, mainly for the security reasons. If your site is external-facing you could be compromising your security by leaving the .vb files out there in plain text. I realize one could still get your code if they really wanted to, but it would be an additional hurdle they would need to get through. If you use Visual Studio as your development environment, you can publish the site precompiled and check the named-assemblies option when publishing; this will essentially create a DLL for each aspx page so you can do one-off page changes if necessary. This turned out to be a great feature, as we were constantly updating the whole site and there were times when things would get updated that shouldn't have been. After using that feature we no longer had updates getting pushed that shouldn't be. As far as rolling back goes, I hope you're using some type of source control / versioning system. Team Foundation Server is great for versioning/source control, but it is quite pricey.
What is the best deployment strategy depends a lot on what kind of environment you are working in, and what kind of developers you are working with.
Visual artists who started with graphic layout and worked towards programming are much more in tune with individual page generation and release. Also, the .aspx.vb files are simply server-side scripting, not really programming.
Programmers usually start at the command line and branch out to environments such as the web, and understandably feel that good programming practices should be applied to the web too, including standard test and release cycles (and compiled code).
If the site is in constant flux, individual pages would make more sense, but if you are required to deliver an installation package to your production group, MSI files are the way to go, since they can be easily backed out if necessary.
If you evaluate what your group's needs are, which includes the varied experience of everyone in the group, you should be able to convince either yourself or the group. This is not a matter of which is better, but which provides the best business model.
