In my Symfony project, I regularly need to download data from a Git repository and then import it into the database. I think the best way to do this is a console command called by a cronjob.
My question: what would be best practice? Do it all in the console command, or have a controller do the actual work, with the command serving just as the interface for the cronjob?
I am new to Flyway. I like it a lot and I'm about to integrate it into our Java workspace. However, I need a little "push" when planning the actual release procedure for these migrations. According to Flyway's documentation, I have the choice of
distributing a list of SQL files, or
distributing a list of both SQL and Java files packed into a jar archive.
I would like to try the second option because it gives us more flexibility, and it sounds like I could simply include the migration scripts as resources in the jar file of the executable. However, as we deliver database changes quite often in a continuous release process, I can see the jar file eventually being polluted with tons of script files. Moreover, when using Ant to create the jar file, Ant will put the name of every file into the manifest's classpath, which leaves the manifest a mess.
With these concerns in mind, for those of you who use Flyway in production:
What is the recommended way of distributing migration scripts? Do you put all of them into a jar and pass it to Flyway on the server? Or do you copy all the scripts to the server using a batch file every time you make a release? Thanks for your advice.
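To make the second option concrete: as I understand it, a Java-based migration is just a class on the classpath whose name encodes its version. A rough sketch (assuming Flyway 5+, where Java migrations extend BaseJavaMigration; the class and table names here are made up):

    package db.migration;

    import java.sql.PreparedStatement;

    import org.flywaydb.core.api.migration.BaseJavaMigration;
    import org.flywaydb.core.api.migration.Context;

    // Flyway derives the version (2) and description from the class name,
    // just as it does from SQL file names like V2__normalize_codes.sql.
    public class V2__Normalize_codes extends BaseJavaMigration {
        @Override
        public void migrate(Context context) throws Exception {
            // Use the connection Flyway hands us so the change takes part
            // in Flyway's transaction handling.
            try (PreparedStatement stmt = context.getConnection()
                    .prepareStatement("UPDATE customer SET code = UPPER(code)")) {
                stmt.execute();
            }
        }
    }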
The war project is built with Maven.
The war project's src/main/resources/db/migration contains the migration scripts.
The DB migration scripts are packaged when the war is built.
When the war is deployed, the DB migration is executed via Flyway as part of container startup.
The above approach is followed for the local development environment; the Test/Stage environments are continuously deployed by a Jenkins pipeline, and the DB migrations run automatically.
If the continuous deployment pipeline fails, we run the DB migration manually using the Flyway command line.
We also have the option of running Flyway from the command line for production deployments and for troubleshooting.
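For reference, the container-startup step boils down to a few lines against the Flyway API. A sketch, assuming Flyway 5+ (the DataSource would come from the container, and the locations value matches the scripts packaged under db/migration):

    import javax.sql.DataSource;

    import org.flywaydb.core.Flyway;

    public final class StartupMigrator {
        // Call this once during application/container startup,
        // e.g. from a ServletContextListener.
        public static void migrate(DataSource dataSource) {
            Flyway flyway = Flyway.configure()
                    .dataSource(dataSource)
                    .locations("classpath:db/migration")
                    .load();
            flyway.migrate(); // applies any pending versioned migrations
        }
    }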
You can set up a Jenkins job to automate the Flyway migration. You can do it in two ways:
By configuring the Maven goals and options to run:
compile flyway:migrate -Dflyway.user="xxxxx" ...(your other options)...
By installing the Flyway Runner plugin on Jenkins
I would prefer the first option, since you won't have to install anything on your Jenkins box.
I am in the process of converting our legacy custom database deployment process, with its custom-built tools, into a full-fledged SSDT project. So far everything has gone very well. I have composite projects that can deploy a base database, as well as projects that deploy sample and test data.
The problem I am having now is finding a solution for running some sort of code that can call a web service to get an activation code and add it to the database as the final step of the process. Can anyone point me to a hook that I might be able to use?
UPDATE: To be clearer I am doing this to make it easier to maintain and deploy our sample and test data to a local machine. We can easily use Jenkins to activate the sites when they are deployed nightly to our official testing environments. I'm just hoping to be able to do this in a single step to replace the homegrown database deploy tool that we use now.
In my deployment scenario, I wrapped the database deployment process in PowerShell scripts that take care of the necessary prerequisites. For example:
the PowerShell script starts and then stops some services
next, it runs sqlpackage.exe or pre-generated SQL deployment scripts
finally, the PowerShell script starts the services again.
You can pass parameters from PowerShell to the SQL scripts or to sqlpackage.exe as SQLCMD variables. So you can call the web service first, then pass the activation code as a SQLCMD variable and use that variable in the post-deployment script.
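For what it's worth, the same flow doesn't have to live in PowerShell. Here is a rough sketch of the call-the-service-then-deploy step in Java (the endpoint URL, dacpac and variable names are hypothetical; the post-deployment script would read the value as $(ActivationCode)):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public final class ActivateAndDeploy {
        public static void main(String[] args) throws Exception {
            // 1. Fetch the activation code from the web service (hypothetical URL).
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest
                    .newBuilder(URI.create("https://example.internal/activation-code"))
                    .build();
            String code = client
                    .send(request, HttpResponse.BodyHandlers.ofString())
                    .body()
                    .trim();

            // 2. Hand it to sqlpackage.exe as a SQLCMD variable (/v:Name=Value).
            new ProcessBuilder(
                    "sqlpackage.exe",
                    "/Action:Publish",
                    "/SourceFile:MyDatabase.dacpac",
                    "/TargetServerName:localhost",
                    "/TargetDatabaseName:MyDatabase",
                    "/v:ActivationCode=" + code)
                    .inheritIO()
                    .start()
                    .waitFor();
        }
    }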
Particularly if it's the final step, I'd be tempted to do this separately, using whatever tool you're using to do the deployment: PowerShell scripts, MSBuild, TFS, Jenkins, whatever. Presumably there's also a front-end of some description that gets provisioned this way?
SSDT isn't an all-in-one tool that does everything; it's a set of tools for managing database changes.
I suspect if the final step were "provision a Google App Engine Instance and deploy a Python script", for example, it wouldn't appear to be a natural candidate for inclusion in an SSDT post-deploy script, and I reckon this falls into the same category.
I use the Scrum methodology and deploy functionality in builds every sprint.
I need to perform various changes to the stored data (meaning data in the database and on the filesystem). I'd like to implement these as PHP scripts invoked from the console, but they should be executed only once, during the deployment.
Is there any way to implement this through app/console without it appearing in the list of registered console commands? Or is there some other way to implement run-once scripts?
DoctrineMigrations covers part of my requirements, but it's hard to implement complex changes to the model with it, and it doesn't cover changes to files on the filesystem.
I don't think Symfony has a facility for that, and besides, hiding the command is not the same as securing the command.
Instead, I would make the script determine whether it has already been run (this could be as simple as writing a version number to a file and checking that number before running) and stop if it detects that it has run before.
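The guard itself is only a few lines in any language. For illustration, here's the idea sketched in Java (the marker-file path and version tag are made up; the same logic ports directly to a PHP console script):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public final class RunOnceGuard {
        private static final String VERSION = "sprint-12";                  // made-up version tag
        private static final Path MARKER = Path.of("var/runonce.version"); // made-up path

        public static void main(String[] args) throws IOException {
            // Skip if this version has already been applied.
            if (Files.exists(MARKER) && Files.readString(MARKER).trim().equals(VERSION)) {
                System.out.println("Already ran for " + VERSION + ", skipping.");
                return;
            }

            // ... perform the one-time data/filesystem changes here ...

            // Record the version only after the changes succeed.
            Files.createDirectories(MARKER.getParent());
            Files.writeString(MARKER, VERSION);
        }
    }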
I am trying to create a setup procedure which installs my entire web application. I am using Visual Studio's Setup and Deployment Project. So far I've gotten it to deploy my website to the Inetpub folder, and I've also added some custom actions which allow it to run some SQL and setup my database.
The last thing I have to integrate into the setup process is my two SSIS packages. Not only do these need to be installed, they also need to be scheduled to run nightly.
The packages are simple, and don't reference anything unusual. They are just 2 .dtsx files.
So far, I've seen that I can use something called dtutil in order to create a dtsinstall.exe file which can be run to install the SSIS packages to either the file system or the database.
First of all, is this the easiest way to do it? And secondly, how would I go about scheduling the packages to be run nightly?
One caveat is that I need this to install silently, without prompting the user for any input.
First, create a new job in SQL Agent. In the steps panel, click New, give it a name, and for the job type, select SQL Server Integration Services Package. For the Package source, select File system, and point it to where you want the file to live.
Then you can select Schedules from the left panel, and configure how often you want it to run.
After that, you should be able to handle deployment of the SSIS package by copying the .dtsx file to the location you specified when you created the job.
I have an ASP.NET web application that includes code for enforcing its own database schema; this code runs on application start.
I've recently started using LINQ to SQL, and I've added a pre-build event to run SqlMetal on my database so that I get objects representing my db tables.
What would be really cool is if I could enforce the database schema in the pre-build event, and then run SqlMetal. As it is, if the schema changes (e.g. I add a field to a table), I have to (a) build and run the website once so that application start fires and the schema is enforced, and then (b) build it again so that SqlMetal runs.
So: What are my options for running code that lives in my web application, from the command line?
Here's what we do.
We have a local one-click build that is required to be run before check-in (an integration build also runs in a separate environment on every check-in...).
The NAnt script will:
1. Rebuild the database from scratch using Tarantino (database change management)
2. Clean & compile
3. Copy DLLs to a separate directory
4. Run unit tests against the DLLs
We have a separate script for SqlMetal, but your question is going to have me look at inserting that call between steps 1 and 2. That way, your database changes and LINQ-generated files are always in sync.
You could either write a small program that uses CodeDOM to compile and run a file from your repository, or directly call the compiler executable inside your pre-build event.
Using CodeDOM avoids having to know where the compiler executable lives, but it's unusable if your code can't be contained in one file without dependencies; in that case, calling the compiler and then executing the result is the better option.