I have a server that contains multiple databases, and I need to deploy the code to all the databases in parallel. Is it possible to achieve this in Flyway? For each database I need to have a separate config file, so I would like to know how to trigger all the configs in parallel.
Regards,
Adarsh
Why do you need a separate config file for each database? You can override most/all the data in the config file using the various Flyway command switches documented here:
http://flywaydb.org/documentation/commandline/migrate.html
For example, you can use the -url switch to override the JDBC connection string in the config file. This allows you to have a single config file but run the upgrade against different target databases.
It is generally preferable to use these switches as it avoids duplication of code. (You won't need to maintain so many config files.) It also means you can avoid putting stuff like passwords into your source code.
Next step is to create a script that runs flyway migrate against each of your target databases. For example, you could write a script that does something like:
flyway migrate -url=jdbc:mysql://<host>:<port>/<database1>
flyway migrate -url=jdbc:mysql://<host>:<port>/<database2>
flyway migrate -url=jdbc:mysql://<host>:<port>/<database3>
flyway migrate -url=jdbc:mysql://<host>:<port>/<database4>
Now when you run this script, each of your databases will be updated in sequence.
Alternatively, if you need the updates to run in parallel rather than in sequence, you need to find a way to schedule each line to run at the same time. One way you could achieve this is to use an automation tool like Octopus Deploy to orchestrate your deployments.
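If you'd rather script it yourself, here is a minimal PowerShell sketch that starts each migration as a background job (the database names and JDBC URL details are placeholders, as above):
# Start one background job per database so the migrations run in parallel
$databases = "database1", "database2", "database3", "database4"
$jobs = foreach ($db in $databases) {
    Start-Job -ScriptBlock {
        param($dbName)
        # Assumes flyway is on the PATH; fill in <host> and <port> for your server
        flyway migrate "-url=jdbc:mysql://<host>:<port>/$dbName"
    } -ArgumentList $db
}
# Block until every migration has finished, then show their output
$jobs | Wait-Job | Receive-Job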
If you want to use Octopus Deploy you may find this step template useful. This step template also includes a "Drift-Check" feature to ensure that your databases are in sync:
Flyway migrate step template for Octopus Deploy
If you plan to use any other tool you may find this PowerShell script useful (copied from the link above), where $flywayCmd (the path to the Flyway executable), $locationsPath, $targetUrl, $targetUser and $targetPassword are all variables.
# Executing deployment
Write-Host "*******************************************"
Write-Host "Executing deployment:"
Write-Host " - - - - - - - - - - - - - - - - - - - - -"
$arguments = @(
"migrate",
"-locations=filesystem:$locationsPath",
"-url=$targetUrl",
"-user=$targetUser",
"-password=$targetPassword"
)
Write-Host "Executing the following command: & $flywayCmd $arguments"
& $flywayCmd $arguments
Regards, Alex
(Open disclosure: I am a pre-sales engineer at Redgate Software. I wrote the step template mentioned above and worked with a team to build FlySQL, a tool that helps MySQL Flyway users to build their projects more efficiently.)
The Flyway documentation describes a switch to specify a different config file:
"To use an alternative configuration file use -configFiles=path/to/myAlternativeConfig.conf"
This way you should be able to use the same sql folder and yet apply different config settings.
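For example (a sketch with placeholder file names), you could keep one config file per database, each pointing at a different url, and run them all against the shared sql folder:
flyway -configFiles=conf/database1.conf migrate
flyway -configFiles=conf/database2.conf migrate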
Our company has about 30+ applications written in different languages (Java, C#, Visual Basic, Node.js, etc.).
Our aim is to have development teams keep the database change SQL scripts in their repositories, and to run the migration from Jenkins by having them start pipelines with a version number. Development teams don't have access to the Jenkins configuration; they can only run jobs that we created and configured.
How should we go about this? Do we have to keep different Flyway instances for each application? And what about pre-production and production stages?
Basically, how should we, as a DevOps team, maintain Flyway to run migrations for different applications with different stages, without the development teams doing the migration part?
This should be possible with the Flyway CLI. You can tell Flyway where to look for migrations and how to connect to the database. See the docs about configuring the CLI.
You can configure Flyway in various ways - environment variables, command line arguments, and config files.
What you could do is allow each development team to specify a migrations directory and connection details for the Jenkins task. The task can then call the Flyway CLI, overriding the relevant config items via command line arguments. For example, the command line call to migrate a database:
flyway -url=jdbc:oracle:thin:@//<host>:<port>/<service> -locations=some-location migrate
Or you could allow your devs to specify environment variables, or provide a custom config file.
You can reuse a single Flyway instance since the commands are essentially stateless. The only bit of environmental state they need comes from the config file, which you have complete control over.
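Putting it together, a rough sketch of the Jenkins-invoked wrapper (the parameter names are hypothetical): the job collects the team's inputs and passes them straight through to a shared Flyway CLI installation.
# Values supplied by the development team via Jenkins job parameters
param(
    [string]$MigrationsDir,   # path to the team's migration scripts
    [string]$JdbcUrl,         # target database for this stage
    [string]$DbUser,
    [string]$DbPassword
)
# One shared Flyway installation; per-team settings are passed on the command line
flyway "-locations=filesystem:$MigrationsDir" "-url=$JdbcUrl" "-user=$DbUser" "-password=$DbPassword" migrate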
Hope that helps
I am new to Flyway. I like it a lot and I'm about to integrate it into our Java workspace. However, I need a little "push" when planning the actual release procedure for these migrations. According to Flyway's documentation I have the choice of
distributing a list of SQL files, or
distributing a list of both SQL and Java files packed into a jar archive.
I would like to try the second option because it gives us more flexibility, and it sounds like I could simply include the migration scripts as resources in the jar file of the executable. However, as we deliver database changes quite often in a continuous release process, I can see the jar file eventually being polluted with tons of script files. Moreover, when using Ant to create the jar file, Ant will put the name of every file into the manifest's classpath, which just messes up the manifest.
With these concerns in mind, for those of you who use Flyway in production:
What is the recommended way of distributing migration scripts? Do you put all of them into a jar and pass it to Flyway on the server? Or do you copy all scripts to the server using a batch file every time you make a release? Thanks for your advice.
The WAR project is built with Maven.
The WAR project's src/main/resources/db/migration folder holds the migration scripts.
The DB migration scripts are packaged when the WAR is built.
When the WAR is deployed, the Flyway DB migration is executed as part of container startup.
The above approach is followed for the local development environment; the Test/Stage environments are continuously deployed by a Jenkins pipeline and the DB migrations are run automatically.
In case of failures in the continuous deployment pipeline, we manually run the DB migration using the Flyway command line.
We also have the option of running Flyway from the command line for production deployment and for troubleshooting.
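For those manual runs, the command is a standard Flyway CLI invocation pointed at the same scripts that ship in the WAR (connection details and paths below are placeholders):
flyway -url=jdbc:mysql://<host>:<port>/<database> -user=<user> -password=<password> -locations=filesystem:src/main/resources/db/migration migrate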
You can set up a Jenkins job to automate the Flyway migration. You can do it in two ways:
By configuring the Maven goals and options to run:
compile flyway:migrate -Dflyway.user="xxxxx" ...(your other options)...
By installing a Flyway Runner Plugin on Jenkins
I would prefer doing it using the first option as you won't have to install anything on your Jenkins box.
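For reference, a fuller version of the first option run from a shell might look like this (the property values are placeholders; flyway.url, flyway.user and flyway.password are standard flyway-maven-plugin properties):
mvn compile flyway:migrate -Dflyway.url="jdbc:mysql://<host>:<port>/<database>" -Dflyway.user="xxxxx" -Dflyway.password="xxxxx"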
I am in the process of converting our legacy custom database deployment process, with its custom-built tools, into a full-fledged SSDT project. So far everything has gone very well. I have composite projects that can deploy a base database as well as projects that deploy sample and test data.
The problem I am having now is finding a solution for running some sort of code that can call a web service to get an activation code and add it to the database as the final step of the process. Can anyone point me to a hook that I might be able to use?
UPDATE: To be clearer I am doing this to make it easier to maintain and deploy our sample and test data to a local machine. We can easily use Jenkins to activate the sites when they are deployed nightly to our official testing environments. I'm just hoping to be able to do this in a single step to replace the homegrown database deploy tool that we use now.
In my deployment scenario I wrapped the database deployment process in some PowerShell scripts which handle the necessary prerequisites. For example:
the PowerShell script is started and stops some services,
next it runs sqlpackage.exe or pre-produced SQL deployment scripts,
finally the PowerShell script starts the services again.
You can pass parameters from PowerShell to the SQL scripts or sqlpackage.exe as sqlcmd variables. So you can call the web service first, then pass the activation code as a sqlcmd variable and use that variable in a post-deployment script.
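A minimal PowerShell sketch of that idea (the web service URL, variable name and file paths are all hypothetical):
# Call the web service to fetch the activation code (hypothetical endpoint)
$activationCode = Invoke-RestMethod -Uri "https://example.com/api/activation-code"
# Hand the code to the dacpac deployment as a sqlcmd variable; the
# post-deployment script can then reference it as $(ActivationCode)
& sqlpackage.exe /Action:Publish `
  /SourceFile:"MyDatabase.dacpac" `
  /TargetConnectionString:"Server=localhost;Database=MyDatabase;Integrated Security=True" `
  /v:ActivationCode=$activationCode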
Particularly if it's the final step, I'd be tempted to do this separately, using whatever tool you're using to do the deployment: Powershell scripts, msbuild, TFS, Jenkins, whatever. Presumably there's also a front-end of some description that gets provisioned this way?
SSDT isn't an eierlegende Wollmilchsau (German for an "egg-laying wool-milk-sow", i.e. a do-everything tool); it's a set of tools for managing database changes.
I suspect if the final step were "provision a Google App Engine Instance and deploy a Python script", for example, it wouldn't appear to be a natural candidate for inclusion in an SSDT post-deploy script, and I reckon this falls into the same category.
I have 3 stages (dev / staging / production). I've successfully set up publishing for each, so that the code will be deployed, using msbuild, to the correct location, with the correct web configs transformed - all within Jenkins.
The problem I'm having is that I don't know how to deploy to staging the code that was built on dev (and to production the code that was built on staging). I'm currently using SVN as the source control, so I think I would need to somehow save the latest revision number dev has built and somehow tell Jenkins to build/deploy staging based on that number?
Is there a way to do this, or a better alternative?
Any help would be appreciated.
Edit: I decided to use the "save the revision number" method, which passes a file containing the revision number to the next job. To do this, I followed this answer:
How to promote a specific build number from another job in Jenkins?
It explains how to copy an artifact from one job to another using the promotion plugin. For the artifact itself, I added an "Execute Windows batch command" build step after the main build with:
echo DEV_ENVIRONMENT_CORE_REVISION:%SVN_REVISION%>env.properties
Then in the staging job, following the guide above, I copied that file and used the EnvInject plugin to read it and set an environment variable, which can then be used as a parameter in the SVN repository URL.
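For reference, the env.properties file ends up as a single key/value line (the revision number here is illustrative):
DEV_ENVIRONMENT_CORE_REVISION:12345
The staging job's repository URL can then pin that revision using SVN's peg-revision syntax, e.g. http://<svn-host>/<repo>/trunk@${DEV_ENVIRONMENT_CORE_REVISION} (assuming the SVN plugin expands the variable there).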
You should be able to identify the changeset number that was built in DEV and manually pass that changeset to the Jenkins build to pull the same changeset from SVN. Obviously that makes your deployment more manual. Maybe you can set up Jenkins to publish the changeset number to a file and then have the later environment's build read that file for the changeset number.
We used to use this model as well and it was always complex. Eventually we moved to a build once and deploy many times model using WebDeploy. This has made the process much more simple. Check it out - http://www.dotnetcatch.com/2016/04/16/msbuild-once-msdeploy-many-times/
How can I automate the web-application build process, which includes the following steps:
Change the connection string.
Recreate the database by scripts.
Deploy the website by FTP.
Copy some files to the server in addition to the application.
And maybe perform some initialization operations.
Should I write a script or program, use Visual Studio, or some other tool?
Personally I use a Continuous Integration tool to do this kind of work.
The one I mainly use is TeamCity by JetBrains.
This kind of software can watch your source control repo for new check-ins, perform builds, publish builds to servers, as well as run pre/post build events.
You have to start learning MSBuild. It is VERY simple and straightforward, so just start and you'll see ;)
In addition to the built-in features it has the MSBuild Community Tasks pack with many tasty things, so you will be able to:
Replace the connection string in the config file using a regex, or replace the whole config with one containing a predefined connection string (FileUpdate or Copy task; see the sketch after this list)
Execute database scripts (MSBuild.Community.Tasks.SqlServer.ExecuteDDL)
Deploy the site using the Copy task
And many others...
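If it helps to see the shape of those steps, here's a rough PowerShell equivalent of the first three (the paths, regex and server names are hypothetical); the MSBuild Community Tasks do the same work declaratively inside your build file:
# 1. Replace the connection string in the config file (illustrative regex)
(Get-Content "Web.config") -replace 'connectionString=".*?"', 'connectionString="Server=prod;Database=MyApp;Integrated Security=True"' |
    Set-Content "Web.config"
# 2. Execute the database scripts (sqlcmd ships with the SQL Server tools)
sqlcmd -S prod -d MyApp -i "recreate-database.sql"
# 3. Deploy the site by copying the build output
Copy-Item -Path "bin\Release\*" -Destination "\\webserver\site" -Recurse -Force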
You can run pre- and post-build events in Visual Studio. To do this, simply right-click on the project and in the project properties navigate to the 'Build Events' options. Here you can specify the pre- and post-build events (you can also specify when the event runs - on a successful build or otherwise).
Once the project has been successfully built, the post-build event can be set up to perform the tasks specified. You can detail the steps either in a separate script file or directly in the Visual Studio project's build events.
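For example, a post-build event that runs a small PowerShell copy to a server share might look like this (the share path is a placeholder; $(TargetDir) is a standard Visual Studio build macro):
powershell -NoProfile -Command "Copy-Item -Path '$(TargetDir)*' -Destination '\\myserver\deploy\MyApp' -Recurse -Force"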
More information
Pre/Post Build event command line arguments
How to: Specify Build Events (C#)
Much along the lines of the continuous integration concept Jamie mentions, we use BuildMaster internally for all of our applications, since we develop it :)
Now that we have a version offered for free, I'll share some thoughts on each of your bullet points:
Change connection string
This is something that is handled uniquely by the tool. Each environment would get its own "instance" of a configuration file and in a deployment plan you can use the "deploy configuration files" action to put them in any environment. This means there are no transforms to worry about since the config file is stored and versioned within the tool.
Recreate database by scripts
This is another major feature we have. Object code (stored procs, views, etc.) can be run every time with a DROP/CREATE combo, but changes such as adding indexes or dropping columns can only be done once (you can't bring a column's data back without a restore!).
BuildMaster handles these types of change scripts differently - they can only be run at most once against an environment's instance of your database. This makes it super easy to bring any new or existing initialized database schema up-to-date.
Deploy the website by FTP
Just add an action to your deployment plan; when you click Create Build or Promote Build, it will do that.
Copy some files to the server in addition to the application
If the process is repeatable you can do this easily, if need be by using a manual action that will remind you to do it.
And maybe perform some initialization operations
This sounds like a "change control" to me, a one-time change when you release. We support these as well, but unfortunately not in the free version.