I use Scrum methodology and deploy functionality in builds every sprint.
I need to perform various changes to stored data (meaning data in the database and files on the filesystem). I'd like to implement these as PHP scripts invoked from the console, but they should be executed only once, during the deployment.
Is there any way to implement this through app/console without the scripts showing up in the list of registered console commands? Or is there some other way to implement run-once scripts?
DoctrineMigrations covers part of my requirements, but it is hard to implement complex changes to the model with it, and it does not cover changes to files on the filesystem.
I don't think Symfony has a facility for that, and besides, hiding the command is not the same as securing the command.
Instead, I would make the script determine whether it has been run already (this could be as simple as writing a version number to a file and checking that number before running) and stop if it detects that it has.
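A minimal sketch of that idea, assuming a console-invoked PHP script; the version identifier and marker file are placeholders to adapt to your project:

```php
<?php
// Run-once deployment script: bail out if this version has already been applied.
const VERSION = '2014-06-01-restructure-uploads';   // placeholder identifier
$markerFile = __DIR__ . '/var/applied_versions.txt'; // placeholder location

$applied = file_exists($markerFile)
    ? file($markerFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
    : [];

if (in_array(VERSION, $applied, true)) {
    echo "Already applied, nothing to do.\n";
    exit(0);
}

// ... perform the database / filesystem changes here ...

// Record that this version has run so the script is a no-op next time.
file_put_contents($markerFile, VERSION . PHP_EOL, FILE_APPEND);
echo 'Applied ' . VERSION . "\n";
```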
I'm creating an app where a user can create a piece of data that could be presented in the UI at a later date. I'm trying to find a way to create cron entries dynamically, using either Java code (for Android devices) or Node.js code (a Firebase Cloud Function generates a cron job). I haven't been able to find a way to do it dynamically, and based on what I've read it may not be possible. Does anyone out there know a way?
Presently the only way to create GAE cron jobs is via deployment of a cron configuration file (by itself or, in some cases, together with the app code), which today can only be done via the CLI tools (from the GAE or Cloud SDKs).
I'm unsure if you'd consider programmatically invoking such CLI tools as qualifying to 'create cron entries dynamically'. If you do, generating the cron config file and scripting the desired CLI-based deployment would be an approach (sketched below).
But creating jobs via an API is still a feature request; see How to schedule repeated jobs or tasks from user parameters in Google App Engine?. It also contains another potentially acceptable approach (along the same lines as the one mentioned in @ceejayoz's comment).
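If scripting the CLI does qualify for you, here is a rough sketch of that approach: generate a cron.yaml from the user-created data, then deploy just that file with the Cloud SDK. It is shown in PHP purely for illustration (the same write-config-then-shell-out idea works from Java or Node.js), and the job URL and schedule are made-up placeholders.

```php
<?php
// Build a cron.yaml from user-defined job data, then deploy only the cron config.
$jobs = [
    ['url' => '/tasks/send-reminder', 'schedule' => 'every 24 hours'], // placeholder job
];

$yaml = "cron:\n";
foreach ($jobs as $job) {
    $yaml .= "- description: user-defined job\n";
    $yaml .= "  url: {$job['url']}\n";
    $yaml .= "  schedule: {$job['schedule']}\n";
}
file_put_contents('cron.yaml', $yaml);

// Requires the Cloud SDK to be installed and authenticated on the machine running this.
passthru('gcloud app deploy cron.yaml --quiet', $exitCode);
```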
The normal way of dealing with Doctrine Migrations is via the standard commands: during development one runs them manually to, e.g., generate diffs and apply the migrations, and deployment typically applies them by the same approach but automatically. Occasionally, when working in a team on a local instance, there are new migrations, but I've updated my source from version control rather than done a deployment, so I need to apply the new migrations manually, and I need to know that I need to do that! An improvement could be to display a warning on a rendered webpage that migrations are out of sync and action needs to be taken.
Is there a way to access the Migrations API directly in PHP/Symfony code, so that I could detect a mismatch between committed and applied migrations? I haven't found any documentation about that. I've had an initial poke around the code and it seems heavily skewed towards Commands (reasonably enough).
Firstly, updating your source code from version control is a deployment too, and applying Doctrine Migrations should be part of that. You should create a checklist of all the steps you need to perform during a deployment, including rollbacks. Depending on the complexity of the application, many things could go wrong.
To answer your question: you can execute the doctrine:migrations:status (or diff) command from your code with the Process component and parse the output to determine whether there are migrations still to be applied, as sketched below.
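A minimal sketch of that, assuming the Symfony Process component (newer versions take the command as an array) and the doctrine:migrations:status command; the "New Migrations" line it parses varies between doctrine/migrations versions, so verify the output format against yours:

```php
use Symfony\Component\Process\Process;

// Run the status command and look for unapplied ("new") migrations.
$process = new Process(['php', 'bin/console', 'doctrine:migrations:status', '--no-interaction']);
$process->run();

$pending = 0;
if (preg_match('/New Migrations:\s*(\d+)/i', $process->getOutput(), $matches)) {
    $pending = (int) $matches[1];
}

if ($pending > 0) {
    // e.g. flash a warning in the layout that the schema is out of sync
    $warning = sprintf('%d migration(s) are committed but not yet applied.', $pending);
}
```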
I am trying to find a reasonable approach to getting a code coverage report for code that is called from within a test via HTTP. Basically I am testing my own API the way it is supposed to be called, but because of that PHPUnit/Xdebug are unaware of the execution of the code, even though it lives in the same codebase.
Basically what I want to achieve is already done by the PHPUnit Selenium extension, but I don't run Selenium; I call the code through an OAuth2 client which in turn uses cURL.
Is it possible to call my API with a GET parameter that triggers a code coverage report, and to have PHPUnit read that report and merge it with the other code coverage? Is there a project that already does this, or do I have to resort to writing my own PHPUnit extension?
The OP says the problem is that Xdebug-based code coverage collection won't/can't collect coverage data because Xdebug isn't enabled in all (PHP) processes that execute the code.
There would seem to be only two ways out of this.
1) Find a way to enable Xdebug in all the processes invoked. I don't know how to do this offhand, but I would expect there to be some configuration parameter for the PHP interpreter that causes it. I also can't speak to whether separate Xdebug-based coverage reports can be merged into one. On the face of it, the raw coverage data is abstractly just a set of "this location got executed" signals, so merging should just be a set union; how these sets are collected and encoded may make this more problematic. (A sketch of this approach follows after this list.)
2) Find a coverage solution that doesn't involve Xdebug, so whether Xdebug is enabled or not is irrelevant. My company's (see bio) PHP Test Coverage Tool does not use Xdebug, so it can collect the test coverage data you want without an issue. You can download it and try it; there's a built-in example of test coverage collection triggered exactly by HTTP requests. The tool has a built-in ability to merge separate test-coverage runs into an integrated result. (I'd provide a direct link, but some SO people are virulently against this.)
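To illustrate option 1: a minimal sketch of collecting coverage inside the server-side PHP process and dumping it to a file that the test run can pick up afterwards. It assumes Xdebug is enabled for that process (with coverage allowed, e.g. xdebug.mode=coverage on Xdebug 3) and is wired in via auto_prepend_file; the COLLECT_COVERAGE trigger parameter and the dump location are made-up placeholders.

```php
<?php
// coverage_prepend.php - set as auto_prepend_file for the PHP process under test.
// Starts Xdebug code coverage when the (hypothetical) trigger parameter is present
// and serializes the raw data on shutdown so the PHPUnit side can merge it later.
if (isset($_GET['COLLECT_COVERAGE'])) {
    xdebug_start_code_coverage(XDEBUG_CC_UNUSED | XDEBUG_CC_DEAD_CODE);

    register_shutdown_function(function () {
        $data = xdebug_get_code_coverage();
        $file = sys_get_temp_dir() . '/coverage.' . uniqid('', true) . '.ser';
        file_put_contents($file, serialize($data));
    });
}
```

On the PHPUnit side you would then unserialize each dump and merge the per-file line arrays back into your report (a set union, as described above), or hand them to php-code-coverage's append() if your version still accepts raw Xdebug arrays.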
How can I automate the web application build process, which includes the following steps:
Change the connection string.
Recreate the database from scripts.
Deploy the website via FTP.
Copy some files to the server in addition to the application.
And maybe perform some initialization operations.
Should I write a script/program, use Visual Studio, or some other tool?
Personally I use a Continuous Integration tool to do this kind of work.
The one I mainly use is TeamCity by JetBrains.
This kind of software can watch your source control repo for new check-ins, perform builds, and publish builds to servers, as well as run pre/post build events.
You have to start learning MSBuild. It is VERY simple and straightforward, so just start and you'll see ;)
In addition to the built-in features it has the MSBuild Community Tasks pack with many tasty things, so you will be able to:
Replace the connection string in the config file using a regex, or replace the whole config with one that has the predefined connection string (FileUpdate or Copy task)
Execute database scripts (MSBuild.Community.Tasks.SqlServer.ExecuteDDL)
Deploy site using Copy task
And many others...
You can run pre- and post-build events in Visual Studio. To do this, simply right-click on the project and, in the project properties, navigate to the 'Build Events' options. Here you can specify the pre- and post-build events (you can also specify when the event runs: on a successful build or otherwise).
Once the project has been successfully built, the post-build event can be set up to perform the tasks specified. You can detail the steps either in a separate file or in the Visual Studio project's build events themselves.
More information:
Pre/Post Build event command line arguments
How to: Specify Build Events (C#)
Much along the lines of the continuous integration concept Jamie mentions, we use BuildMaster internally for all of our applications, since we develop it :)
Now that we have a version offered for free, I'll share some thoughts on each of your bullet points:
Change the connection string
This is something that is handled uniquely by the tool. Each environment would get its own "instance" of a configuration file and in a deployment plan you can use the "deploy configuration files" action to put them in any environment. This means there are no transforms to worry about since the config file is stored and versioned within the tool.
Recreate the database from scripts
This is another major feature we have. Object code (stored procs, views, etc.) can be run every time with a DROP/CREATE combo, but changes such as adding indexes or dropping columns can only be done once (you can't bring a column's data back without a restore!).
BuildMaster handles these types of change scripts differently - they can only be run at most once against an environment's instance of your database. This makes it super easy to bring any new or existing initialized database schema up-to-date.
Deploy the website via FTP
Just add an action to your deployment plan; when you click Create Build or Promote Build, it will do that.
Copy some files to the server in addition to the application
If the process is repeatable you can do this easily, if need be by using a manual action that will remind you to do it.
And maybe perform some initialization operations
This sounds like a "change control" to me: a one-time change when you release. We support these as well, but unfortunately not in the free version.
When reading about and playing with Rails last year, one of the tools that made the biggest impression on me was Rake. A database versioning system that keeps all dev DBs identical, integrated right into the build... something like that would make life so much easier (and safer)!
However, one of the things that I haven't been able to figure out:
How do you move these changes to your production servers when you don't actually have access to the production servers? We have multiple servers across the country where the application is installed/upgraded by a setup package.
Note: This question is more about strategy than Rails/Rake-specific technologies. We don't use Rails, we use .NET. But if I can figure out this publish scenario, there seem to be several tools (Migratordotnet being one) that might enable us to do something similar.
As you probably know, the standard Rails way of running migrations in production is Capistrano. It has a deploy:migrations task that runs the migrations on remote servers using ssh.
You might be able to adapt Capistrano to do what you want. It's essentially a flexible way to run commands on groups of remote servers. You need to have Ruby installed on the machine you are deploying from in order to use it, but not on the machines you are deploying to.
Your best option may be to write a custom Capistrano task to upload the setup.exe, run it, then run the migrations (perhaps using Migrator.NET).
You might be able to use something like Red Gate's SQL Compare to produce schema diff scripts that would allow you to automate the process of updating the database. I've used the tool manually to do such changes and could easily see creating a program that would run these updates as part of the upgrade process. If I were going to automate it, though, I'd design in something that would let me check what version of the schema was in place and run the necessary scripts in the proper order to bring it up to the desired version.
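That last idea is language-agnostic even though the question is about .NET; below is a rough sketch of it (in PHP only because that is the language used elsewhere on this page), with a hypothetical schema_version table and numbered .sql scripts, each assumed to contain a single executable batch.

```php
<?php
// Apply numbered migration scripts (001_*.sql, 002_*.sql, ...) that are newer
// than the version recorded in a schema_version table. The table, DSN and file
// layout are illustrative assumptions.
$pdo = new PDO('sqlsrv:Server=localhost;Database=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$current = (int) $pdo->query('SELECT MAX(version) FROM schema_version')->fetchColumn();

$scripts = glob(__DIR__ . '/sql/*.sql');
sort($scripts); // numeric prefixes keep them in application order

foreach ($scripts as $script) {
    $version = (int) basename($script); // "002_add_index.sql" -> 2
    if ($version <= $current) {
        continue; // already applied on this server
    }
    $pdo->exec(file_get_contents($script));
    $pdo->prepare('INSERT INTO schema_version (version) VALUES (?)')
        ->execute([$version]);
    echo "Applied {$script}\n";
}
```

A setup package could ship these scripts and run the same check on each server, so every installation is brought forward only by the steps it is missing.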