Flyway - Multiple DBs with Central Migration

Our company has 30+ applications written in different languages (Java, C#, Visual Basic, Node.js, etc.).
Our aim is to have development teams keep their database change scripts in their own repositories, and to run the migrations from Jenkins, with the teams starting pipelines by supplying a version number. Development teams don't have access to the Jenkins configuration; they can only run jobs that we created and configured.
How should we go about this? Do we have to keep a separate Flyway instance for each application? And what about the pre-production and production stages?
Basically, how should we, as the DevOps team, set up Flyway to migrate different applications across different stages, without the development teams doing the migration part themselves?

This should be possible with the Flyway CLI. You can tell Flyway where to look for migrations and how to connect to the database. See the docs about configuring the CLI.
You can configure Flyway in various ways: environment variables, command-line arguments, and config files.
What you could do is allow each development team to specify a migrations directory and connection details for the Jenkins task. The task can then call the Flyway CLI, overriding the relevant config items via command-line arguments. For example, the command-line call to migrate a database:
flyway -url=jdbc:oracle:thin:@//<host>:<port>/<service> -locations=some-location migrate
Or you could allow your devs to specify environment variables, or provide a custom config file.
You can reuse a single Flyway instance since the commands are essentially stateless. The only bit of environmental state they need comes from the config file, which you have complete control over.
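As a rough sketch, the Jenkins job could shell out to the CLI with the team-supplied values along these lines (the job parameter names here are illustrative, not anything prescribed by Flyway, and flyway is assumed to be on the PATH):

    # Hedged sketch: one parameterized job per stage, with teams supplying the values.
    # DB_URL, DB_USER, DB_PASSWORD and MIGRATIONS_DIR are assumed Jenkins job parameters.
    & flyway "-url=$env:DB_URL" "-user=$env:DB_USER" "-password=$env:DB_PASSWORD" `
        "-locations=filesystem:$env:MIGRATIONS_DIR" migrate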
Hope that helps

Related

How do I use Flyway in a continuous release process of a Java application?

I am new to Flyway. I like it a lot and I'm about to integrate it into our Java workspace. However, I need a little "push" when planning the actual release procedure for these migrations. According to Flyway's documentation I have the choice of
distributing a list of SQL files, or
distributing a list of both SQL and Java files packed into a jar archive.
I would like to try the second option because it gives us more flexibility, and it sounds like I could simply include the migration scripts as resources in the jar file of the executable. However, as we deliver database changes quite often in a continuous release process, I can see the jar file eventually being polluted with tons of script files. Moreover, when using Ant to create the jar file, Ant puts the name of every file into the manifest's classpath, which leaves the manifest a mess.
With these concerns in mind, for those of you that use Flyway in production:
What is the recommended way of distributing migration scripts? Do you put all of them into a jar and pass it to Flyway on the server? Or do you copy all scripts to the server using a batch file every time you make a release? Thanks for your advice.
The war project is built with Maven.
The war project's src/main/resources/db/migration holds the migration scripts.
DB migration scripts are packaged when the war is built.
When the war is deployed, the Flyway migration runs as part of container startup.
The above approach is followed for the local development environment; Test/Stage environments are continuously deployed by a Jenkins pipeline, and DB migrations run automatically.
In case of failures in the continuous deployment pipeline, we run the DB migration manually using the Flyway command line.
We also have the option of running Flyway from the command line for production deployments and for troubleshooting.
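For the manual fallback, a hedged example of such a command-line run (the JDBC URL, credentials, and location are placeholders; in practice they would come from environment-specific config):

    # Hedged sketch: manual out-of-band migration when the pipeline fails.
    & flyway "-url=jdbc:<driver>://<host>/<db>" "-user=<user>" "-password=<password>" `
        "-locations=filesystem:src/main/resources/db/migration" migrate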
You can set up a Jenkins job to automate the Flyway migration. You can do it in two ways:
By configuring the build goals and options to run the Flyway Maven plugin, e.g.:
compile flyway:migrate -Dflyway.user="xxxxx" ...(your other options)...
By installing the Flyway Runner plugin on Jenkins
I would prefer doing it using the first option as you won't have to install anything on your Jenkins box.
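For reference, a slightly fuller version of that first option might look like this (the flyway.url/user/password property names come from the Flyway Maven plugin; the values are placeholders). In a Jenkins Maven job, everything after mvn goes into the goals/options field:

    # Hedged sketch: running the Flyway Maven plugin with explicit connection settings.
    mvn compile flyway:migrate "-Dflyway.url=jdbc:<driver>://<host>/<db>" `
        "-Dflyway.user=xxxxx" "-Dflyway.password=xxxxx"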

Running a post-deploy PowerShell script or executable

I am in the process of converting our legacy custom database deployment process, with its custom-built tools, into a full-fledged SSDT project. So far everything has gone very well. I have composite projects that can deploy a base database, as well as projects that deploy sample and test data.
The problem I am having now is finding a solution for running some sort of code that can call a web service to get an activation code and add it to the database as the final step of the process. Can anyone point me to a hook that I might be able to use?
UPDATE: To be clearer I am doing this to make it easier to maintain and deploy our sample and test data to a local machine. We can easily use Jenkins to activate the sites when they are deployed nightly to our official testing environments. I'm just hoping to be able to do this in a single step to replace the homegrown database deploy tool that we use now.
In my deployment scenario I wrapped the database deployment process in PowerShell scripts that handle the necessary prerequisites. For example:
the PowerShell script starts and first stops some services,
next it runs sqlpackage.exe or pre-produced SQL deployment scripts,
finally the PowerShell script starts the services again.
You can pass parameters from PowerShell to the SQL scripts or to sqlpackage.exe as SQLCMD variables. So you can call the web service first, then pass the activation code as a SQLCMD variable and use that variable in the post-deployment script.
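A minimal sketch of that flow, assuming a hypothetical activation web service and placeholder service, server, and dacpac names:

    # Hedged sketch of the wrapper described above; all names, paths and URLs are assumptions.
    Stop-Service -Name "MyAppService"

    # Call the (hypothetical) web service to obtain an activation code.
    $code = (Invoke-RestMethod -Uri "https://example.internal/api/activation").Code

    # Deploy the dacpac, passing the code in as a SQLCMD variable that the
    # SSDT post-deployment script can reference as $(ActivationCode).
    & sqlpackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" `
        /TargetServerName:"localhost" /TargetDatabaseName:"MyDatabase" `
        "/v:ActivationCode=$code"

    Start-Service -Name "MyAppService"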
Particularly if it's the final step, I'd be tempted to do this separately, using whatever tool you're using to do the deployment: Powershell scripts, msbuild, TFS, Jenkins, whatever. Presumably there's also a front-end of some description that gets provisioned this way?
SSDT isn't a do-everything wonder tool (an "egg-laying wool-milk-sow", as the German idiom has it); it's a set of tools for managing database changes.
I suspect if the final step were "provision a Google App Engine Instance and deploy a Python script", for example, it wouldn't appear to be a natural candidate for inclusion in an SSDT post-deploy script, and I reckon this falls into the same category.

Automated Deployment and Upgrade Strategy for ASP.Net MVC Application

I am working on an ASP.NET MVC4 project where the same project needs to be deployed to many clients on a daily basis. Each client has its own domain/subdomain, a separate app pool, and a separate database (MSSQL).
Doing each deployment manually takes at least 1-2 hours if everything goes well. Is there any way I can automate this?
Moreover, we also need to update all of the apps when a new version is released, maybe one by one or all of them at the same time. Doing this manually could take weeks, and once we have more clients it will not be possible to do these updates by hand.
The update involves suspending the app for some time, taking a full backup of the files and DB, updating the application code/files in the app folder, upgrading the DB with a script, then starting the app again and running a diagnostic script to check whether the update was successful; if not, we need to find out what went wrong.
How can we automate these updates? Any ideas on how to approach this would be great.
As a developer for BuildMaster, I can say that this scenario, known as the "Core Version" pattern, is a common one. If you're OK with a paid solution, you can set up deployment plans within the tool that do exactly what you described.
As a more concrete example, we experience this exact situation in a slightly different way. BuildMaster has a set of 60+ extensions that rely on a specific SDK version. In our recent 4.0 release, we had to re-deploy every extension because of breaking API changes within the SDK. This is essentially equivalent to having a bunch of customers and deploying to them all at once. We have set up our deployment plans such that any time we create a new release of the SDK application, we have the option to set a variable that says to build every extension that relies on the SDK.
In BuildMaster, the idea is to promote a build (i.e. an immutable object that travels through various environments like Dev, Test, Staging, Prod) to its final environment, where it becomes the deployed build for the release. In your case, this would be pushing your MVC application to its final environment, which would then trigger the deployments of all dependent applications (i.e. your customers' instances of your application).
For your scenario, you would only need a single action in the plan, "Promote Build". As I mentioned before, any dependents would then be promoted to their final environments, so all your customer deployments would kick off once that action is run during deployment. Our Azure extension's deployment plan for its final environment follows the same pattern.
These plans are marked "Shared", which means every extension we have uses the exact same deployment plan but with different variables to handle the minor differences like names, paths, etc.
Since this is such an enormous topic I could go on for ages, but I think that should be sufficient for your use-case if you wanted to try it out.
There are others, but you could set up Team Foundation Server to deploy automated builds.
http://msdn.microsoft.com/en-us/library/ff650529.aspx
I find the easiest way to do this from an MVC project is to create a publish profile.
This is done by right-clicking your project, selecting Publish, and then configuring it to your needs.
Then from TFS you create a new build definition; this kicks off a wizard which takes you through it.
There are quite a few options which would be too long to go into for every scenario.
The main change I usually find the most important is to set an MSBuild Argument to deploy with the publish profile.
This can be found at Process > Advanced > MSBuild Arguments.
Once this is configured correctly, it's a simple case of right-clicking and queuing a new build to build and deploy.
You will need a different publish profile/build configuration per deployment environment.
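For reference, the equivalent command-line build-and-deploy looks roughly like this; the MSBuild Arguments in the build definition carry the same /p: switches (the project and the "Staging" profile name are placeholders):

    # Hedged sketch: build and deploy using a publish profile from MSBuild.
    msbuild .\MyWebApp.csproj /p:Configuration=Release `
        /p:DeployOnBuild=true /p:PublishProfile=Staging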
For backups I use a PowerShell script which can be called manually or from TFS.
You also have a drop folder in TFS which keeps backups of the last x releases.
The databases are automatically configured via SQL Server to back up; TBH I didn't set that up, it was a DB admin guy who is also involved with releases.
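A minimal sketch of what such a backup script might contain (paths and the database name are assumptions; Invoke-Sqlcmd comes from the SqlServer PowerShell module, and the DB part here is a one-off backup rather than SQL Server's own scheduled jobs):

    # Hedged sketch: zip the site files and take a one-off database backup before deploying.
    $stamp = Get-Date -Format "yyyyMMddHHmm"
    Compress-Archive -Path "D:\Sites\MyApp\*" -DestinationPath "D:\Backups\MyApp_$stamp.zip"
    Invoke-Sqlcmd -ServerInstance "localhost" `
        -Query "BACKUP DATABASE [MyApp] TO DISK = N'D:\Backups\MyApp_$stamp.bak' WITH INIT"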
From the dev testing side I use JMeter (http://jmeter.apache.org/) to run some automated scripts that check that users can log in and view certain screens, just to confirm nothing major has gone wrong. However, there is usually a testing team to run more detailed tests; again, not set up by me.
All of the above will probably take you some time to set up, but in the long run it will literally save you weeks of time over a year.
A free alternative to TFS is http://www.cruisecontrolnet.org/; I have used this in the past too and it is pretty good.
You can automate your .NET deployments with Beanstalk, which will give you a way to trigger deployments with a single click, watch progress, manage permissions, and see the history of deployments. Check out this guide on the topic:
http://guides.beanstalkapp.com/deployments/deploy-dotnet.html
I hope you will find it useful.
P.S. - I work at Beanstalk.

Rake Strategy, DotNet Implementation

When reading about and playing with Rails last year, one of the tools that made the biggest impression on me was Rake: a database versioning system, integrated right into the build, that keeps all dev DBs identical... something like that would make life so much easier (and safer)!
However, one of the things that I haven't been able to figure out:
How do you move these changes to your production servers when you don't actually have access to the production servers? We have multiple servers across the country where the application is installed/upgraded by a setup package.
Note: This question is more about strategy than Rails/Rake-specific technologies. We don't use Rails, we use .NET. But if I can figure out this publish scenario, there seem to be several tools (Migratordotnet being one) that might enable us to do something similar.
As you probably know, the standard Rails way of running migrations in production is Capistrano. It has a deploy:migrations task that runs the migrations on remote servers using ssh.
You might be able to adapt Capistrano to do what you want. It's essentially a flexible way to run commands on groups of remote servers. You need to have Ruby installed on the machine you are deploying from in order to use it, but not on the machines you are deploying to.
Your best option may be to write a custom Capistrano task to upload the setup.exe, run it, then run the migrations (perhaps using Migrator.NET).
You might be able to use something like Red Gate's SQL Compare to produce schema diff scripts that would allow you to automate the process of updating the database. I've used the tool manually to make such changes and could easily see creating a program that would run these updates as part of the upgrade process. If I were going to automate it, though, I'd design in something that would let me check what version of the schema was in place and run the necessary scripts in the proper order to bring it up to the desired version.
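A hedged sketch of that version-check idea (the tracking table, script folder, and naming convention are all assumptions; Invoke-Sqlcmd is from the SqlServer PowerShell module):

    # Hedged sketch: read the current schema version, then apply newer numbered scripts in order.
    # Assumes scripts named like 003_add_column.sql and a dbo.SchemaVersion tracking table.
    $current = (Invoke-Sqlcmd -ServerInstance "localhost" -Database "AppDb" `
        -Query "SELECT ISNULL(MAX(Version), 0) AS V FROM dbo.SchemaVersion").V

    Get-ChildItem .\migrations -Filter "*.sql" | Sort-Object Name | ForEach-Object {
        $v = [int](($_.BaseName -split '_')[0])
        if ($v -gt $current) {
            Invoke-Sqlcmd -ServerInstance "localhost" -Database "AppDb" -InputFile $_.FullName
        }
    }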

Step-By-Step ASP.NET Automated Build/Deploy

Seems like there are so many different ways of automating one's build/deployment that it becomes difficult to parse through all the different scenarios that people support in tutorials on the web. So I wanted to present the question to the stackoverflow crowd ... what would be the best way to set up an automated build and deployment system using the following configuration:
Visual Studio 2008
Web Application Project
CruiseControl.NET
One of the first things I tried was to have CCnet automatically zip the output and copy it to the server, but then that requires manual work to unzip at the destination. However, if we try to copy all the files individually, then it could potentially take a long time if it's a large application (build server lives outside of the datacenter in our office ... I know).
Also of particular interest is how we would support multiple environments as we have dev, qa, uat, and then of course prod.
MSDeploy seems really interesting, but unless I'm interpreting the literature incorrectly, doesn't help in the scenario of deploying from the output of a build server. If anything, it seems like it'll be useful in deploying one build across a build farm ... but even for deploying from one environment to another, one would have to manually change config settings and web service URLs, etc.
I recently spent a few days working on automating deployments at my company.
We use a combination of CruiseControl, NAnt, and MSBuild to generate a release version of the app. Then a separate script uses MSDeploy and XCopy to back up the live site and transfer the new files over.
Our solution is briefly described in an answer to this question Automate Deployment for Web Applications?
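In outline, the transfer step can be as small as this (the paths and server name are assumptions; -verb:sync with contentPath is standard MSDeploy usage, and the script would back the site up first as described above):

    # Hedged sketch: sync the new build output over the live site with MSDeploy.
    & msdeploy.exe -verb:sync `
        "-source:contentPath=C:\Builds\MySite" `
        "-dest:contentPath=D:\Sites\MySite,computerName=WEBSRV01"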
You might be interested in MSDeploy. Here's a Scott Hanselman post on this. It's only available as a technical preview at the moment (September 2008) but is worth evaluating against your requirements.
There is another new build tool (a very intelligent wrapper) called NUBuild. It's lightweight, open source, and extremely easy to set up, and it provides almost no-touch maintenance. I really like this new tool and we have made it the standard tool for the continuous build and integration process of our projects (we have about 400 projects across 75 developers). Try it out.
http://nubuild.codeplex.com/
- Easy-to-use command-line interface
- Ability to target all .NET framework versions, i.e. 1.1, 2.0, 3.0 and 3.5
- Supports XML-based configuration
- Supports both project and file references
- Automatically generates the “complete ordered build list” for a given project - no-touch maintenance
- Ability to detect and display circular dependencies
- Performs parallel builds - automatically decides which of the projects in the generated build list can be built independently
- Ability to handle proxy assemblies
- Provides visual clues to the build process, e.g. showing “% completed”, “current status”, etc.
- Generates a detailed execution log in both XML and text format
- Easily integrated with the CruiseControl.NET continuous integration system
- Can use a custom logger like XMLLogger when targeting .NET 2.0+
- Ability to parse error logs
- Ability to deploy built assemblies to a user-specified location
- Ability to synchronize source code with the source-control system
- Version management capability
Do you have the ability to run commands remotely? The PsExec utility from Sysinternals would let you run a command-line unzip program on the remote machine. If you have a script that copies the build as a .zip file to the remote site, you would just need one more line for the PsExec call to unzip the files.
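As a hedged example (the server name, paths, and choice of unzip tool are all assumptions):

    # Copy the zipped build to the remote server, then unzip it there via PsExec.
    Copy-Item .\build.zip "\\WEBSRV01\D$\Drops\build.zip"
    & psexec.exe \\WEBSRV01 "C:\Tools\7z.exe" x "D:\Drops\build.zip" "-oD:\Sites\MyApp" -y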
I had a related question about getting a deployable set of files from an automated build. I found Web Deployment Projects (links and all in the old question) did what I needed - they're a VS and MSBuild add-on.
This is a common problem (and I wish I had read it sooner) for all development, not just ASP.NET. Being one of its developers, my team naturally uses BuildMaster internally for the entire release process, and for most scenarios it's free. Within the tool, we are able to perform all the standard CI builds to create artifacts and then set up an automation process to deploy these artifacts to any one of the 40+ servers we have internally or externally hosted, depending on the specific application or environment.
Since you specifically mentioned deployment to different testing environments, this is a fundamental aspect of the tool. The idea is to model the environment workflow (e.g. Integration -> QA -> Production) you already have in place and essentially promote a build all the way from source control to production. Most times, it's as simple as adding a deployment action that deploys an artifact to the environment, other times it can be much more complex.
You also casually mentioned configuration file changes are part of deployment, which is another built-in component to BuildMaster. The idea we had was to use the tool itself as the central hub for all configuration files and deployments, thus ensuring the latest changes are applied automatically with a simple "deploy configuration files" action in your deployment plan.
One thing you didn't mention with regard to this process is the database deployment aspect. Most ASP.NET applications require an associated database, otherwise they could just be static HTML files. It is crucial that the database schema gets updated to the appropriate database version with every deployment. There is, not surprisingly, a module within BuildMaster that handles this for you as well. The idea is to store DDL-DML scripts within the tool itself, and by executing scripts only once per environment, it ensures that all of your databases across each environment are up-to-date as your builds are deployed through them. Other scripts (e.g. stored procedures, views, triggers, etc.) are essentially code files and therefore belong in source control. These DROP-CREATE-CONFIGURE type scripts can be run each and every time in most cases with a simple deployment action.
Another piece of the deployment puzzle that most developers do not think about is process automation. Many developers need to perform sign-offs or fill out change request forms in order to manually perform these processes. Again, this is all available as part of the automated workflow setup within BuildMaster. You can set up blockers that do not allow promotion to, say, the QA environment unless all unit tests have passed, or block promotion to the Staging environment unless someone from the QA team approves the build and all issues in your issue tracking tool are resolved/closed for that particular release.
While I realize I left CC.NET out of the answer, our applications are all built and deployed through BuildMaster so we no longer need it, though we could just as easily pick up the artifacts from a drop location and deploy them in later environments.
I see that many people use CC for their .NET projects, but why not use Jenkins and SonarQube? They have everything you need. I set all this up in 3 days on a Win 2008 R2 server with MSSQL, Jenkins, VisualSVN and SonarQube.
It all works great and you get all the metrics on your project. SonarQube uses Gallio, Gendarme, FxCop, StyleCop, NDeps and PartCover to get your metrics, and all this is pretty straightforward since SonarQube does it automatically without much configuration.
Here is Jenkins, which builds and gets the Sonar metrics, with another job for deploying automatically to IIS; and SonarQube, with all the metrics for my project. This is a simple MVC4 app, but it works great!
If you want more information I can be more specific, but I think you should at least consider Jenkins. If CC suits you better, at least you looked at a good alternative before you chose.
This whole setup uses MSBuild to build and deploy the apps.
