Synchronise the back office with GitHub in a stateful CMS (WordPress)

For the continuous integration and deployment of websites, I use a pipeline that goes from GitHub through Docker to Kubernetes (described below).
But for many CMSs like WordPress, PrestaShop, Magento and others, the configuration of the website and the installation of plugins are done in the back office of the deployed website.
For now, I build a Docker image on top of the CMS base image, replacing the whole /var/www/html directory with the files from GitHub. Kubernetes then deploys the containers and attaches a database and a persistent storage volume.
Hence, this breaks my pipeline: imagine that someone installs and configures a plugin in the back office, then someone else modifies a file and pushes it to GitHub. The GitHub repo has no record that a plugin was installed, so it will build and deploy a new image without it.
How can I integrate all the modifications done in the back office into my GitHub repository?
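For concreteness, the current pipeline might look roughly like this; the registry, image, and deployment names are placeholders, not the real setup:
# Rough sketch of the pipeline described above: build an image from the
# GitHub repo (which overwrites the CMS web root), push it, then roll it
# out on Kubernetes.
TAG=$(git rev-parse --short HEAD)
docker build -t registry.example.com/site:$TAG .
docker push registry.example.com/site:$TAG
kubectl set image deployment/site site=registry.example.com/site:$TAG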

The solution we use is an override of the DB class.
We monitor a number of tables (configuration, module, hook, etc.) and store every query that touches them in a SQL file.
So at commit time we also have a .sql file of actions to perform on the database side.
Once deployed, either you execute the SQL manually, or a script detects that new SQL files are present and executes them.
This way we are always up to date.
We developed this solution in the form of PrestaShop modules that track all these actions.
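The approach above is implemented as PrestaShop modules that override the Db class; as a much simpler shell-level sketch of the same general idea (committing the state of the configuration-related tables as a SQL file next to the code), assuming the default ps_ table prefix and a local MySQL database:
# Snapshot the configuration-related tables into a SQL file and commit it,
# so back-office changes travel with the code. This is an illustration, not
# the module described above; table names assume the default "ps_" prefix.
mysqldump --skip-dump-date --skip-comments -u prestashop -p prestashop \
  ps_configuration ps_module ps_hook ps_hook_module > db/backoffice-config.sql
git add db/backoffice-config.sql
git commit -m "Sync back-office configuration changes"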

My (by no means ideal) working solution:
Create a plugins folder outside Docker and symlink this folder to /wp-content/plugins inside the container.
Recreate the same setup in production.
Installing a new plugin then doesn't break the CI flow, but it does require two independent installations and configurations if you (or the dev team) need to install something new.
So you basically treat the plugin files the same way you already treat the DB.
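A minimal sketch of that setup using a Docker bind mount; the host path, port, and container name are assumptions:
# Keep the plugins directory outside the image and mount it into the
# container, so plugins installed from the back office survive rebuilds.
mkdir -p /srv/wp-plugins
docker run -d --name wordpress \
  -v /srv/wp-plugins:/var/www/html/wp-content/plugins \
  -p 8080:80 \
  wordpress:latest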

Related

Bitbucket Pipelines with AWS EB and WordPress

I'm working on figuring out how to deploy a WordPress site from Bitbucket Pipelines to AWS Elastic Beanstalk. It all makes sense except for one thing: I want the repository to contain only the theme and plugin files.
I do not want the full WP directory being deployed each time. Media would be handled via an S3 bucket and the DB would use RDS.
What is the best way to get WP installed but have only the themes and plugins deploy through Pipelines? And when I want to update to the latest version of WP, how would that work?
Or am I going about this wrong?
The simple solution, and best practice in my humble opinion, is to keep the entire WordPress installation in the repo, including the WordPress core and all your custom themes and plugins.
Having the entire installation in one repo solves many problems: you can tag and release versions, and you can install all the software locally with a simple git clone.
Regarding the file system, definitely consider EFS instead of S3. It is much more reliable and easier to mount in a Linux-based system. Make sure to set the file-path environment variable so you can point WordPress to the files. You will want to mount this outside of the software file tree.
I have been running this kind of setup for three years with no problems. We do releases through the CodeDeploy service daily. Very straightforward and easy to maintain.
To upgrade WordPress, just check out the current version from the repo, apply the upgrade release, do comparisons, test, commit, and release.
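A rough sketch of the file-system side on an Elastic Beanstalk instance; the EFS ID and paths are placeholders, and the mount assumes amazon-efs-utils is installed:
# Mount the EFS volume outside the deployed code tree and point the
# WordPress uploads directory at it with a symlink.
sudo mkdir -p /mnt/efs
sudo mount -t efs fs-12345678:/ /mnt/efs
mkdir -p /mnt/efs/wp-uploads
ln -sfn /mnt/efs/wp-uploads /var/app/current/wp-content/uploads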

WordPress Plugin Development - How to re-upload/update a plugin while developing, without deleting the current version?

I just started developing plugins for WordPress (fun!) and am curious if there is a faster/better way to upload a new version of the plugin, without having to delete the current one first?
I am not using the WordPress.org Developer Center, as the plugin I am making will be strictly for internal purposes. I am just uploading the plugin folder (zipped) through the WP Plugin Dashboard.
Since I am new to developing plugins, I am doing a lot of testing and frequent updating of the code. Would be nice if I could save time with the whole delete-then-reinstall-and-activate process.
Several options:
Sync files with Git, possibly on pushing to a central repo - that's what I use daily.
rsync, run manually when needed (a sketch follows below).
Some people use Capistrano or something similar, but I personally think that's unnecessary if you're making frequent changes in a staging environment.
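For the rsync option, a single command along these lines is enough; the plugin name, host, and paths are placeholders:
# Push the local plugin folder straight into the site's plugins directory,
# removing remote files that no longer exist locally.
rsync -avz --delete ./my-plugin/ \
  user@example.com:/var/www/html/wp-content/plugins/my-plugin/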

How do I download my AppFog app's live file system?

Hello, I am an AppFog beginner, and I have a question about uploading pictures/plugins/themes via the WordPress admin. Because AppFog does not currently support a persistent file system, all the plugins/pictures/themes that are not in the source code will be lost. Is there any way to back up the current live file system and include these files in the source code that I upload? The download source code button and the af pull command only download the last source code I uploaded, not changes that were made, for example, when I installed a plugin.
You can add a helper PHP script to your app, like this one:
https://gist.github.com/4134750
You can manually download single files using af files <appname> /app/<filename>, but this would be painful for your purposes.
You would be much better served by setting up your WordPress installation to run locally using MAMP or XAMPP. Pull your app as it is from AppFog, host it locally, make your file-system changes, then push those changes back to AppFog.
Here are a few reasons why making changes locally and then updating AppFog apps is better:
If you're running multiple instances of your WordPress app, only one of them will get the installed plugin. Installing the plugin locally and pushing ensures all instances get the plugin.
It's much faster to develop and test locally, and you can see the results of your changes before impacting your live site.
Your live production site will not go down if your plugin install fails or somehow makes an unintended change. This is also true for WordPress updates: do them locally, then push to production.
If you have the changes on your local box, you can use version control to track and tag releases before updating production.
Blue-green deployments become trivial. Have two production apps, a primary and a slave. Update your code locally, update the slave, test it, then promote it to primary by mapping the domain to it. Then demote the previous primary to slave by unmapping the domain. The slave is always one update older, and you can switch back to it if you discover an issue with your primary.
Curating your WordPress apps this way will allow you to take advantage of the power the AppFog platform provides.
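A hedged sketch of that local-first workflow with the AppFog CLI; the app name is a placeholder, and af update is assumed here to be the command that pushes a new version:
# Pull the live app, modify it locally (e.g. under MAMP/XAMPP), then push
# the result back. "af update" is an assumption about the deploy command.
af pull mywordpressapp
cd mywordpressapp
# ... install/configure plugins locally and test against a local database ...
af update mywordpressapp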
I found this "zipit" script even better than the listing script Sea Comet provided. It zips up the entire live app directory, and then you download it. This way, you can make changes via the WordPress admin, get everything working the way you want, then use zipit, unzip the file, and push it to your app on AppFog, and the state is fully saved across restarts.
https://github.com/zeroecco/zipit/blob/master/zipit.php
You can find more info in this blog post over on the old PhpFog blog:
http://blog.phpfog.com/2012/11/16/how-to-download-your-entire-application-not-just-code-from-php-fog/

How to develop using Git on a database-driven website project?

I like to use a simple Git workflow for static websites, but I build Joomla and WordPress sites on a semi-regular basis too. However, I am at a loss as to how to use Git with database-driven site development.
For a static site I would push to dev.websitename.com, then push to www.websitename.com once the dev site checks out. How would I mimic that process with a database-driven site like WordPress or Joomla?
Thanks in advance for your insight!
You can definitely use Git with your website code, such as changes to your WordPress theme/plugin, exactly as you would if you were developing a static website.
However, you wouldn't use it for your database. Git provides version control for code, while WordPress and Joomla already manage the content stored in the database. Plus, Git doesn't understand a database, so it wouldn't have any advantage over a periodic backup, which you should already be doing. Take a look at running a dev copy of your site for how to download your database directly from your server.
By the way, if you use Git with WordPress/Joomla, you should add directories like cache, logs, and tmp to .gitignore. There are also lots of tutorials out there; try searching, e.g., http://google.com/search?q=wordpress+git.
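A starting point for that .gitignore, written as a shell snippet; the entries are illustrative and should be adjusted to your WordPress or Joomla layout:
# Append common cache, log, and upload paths to .gitignore so generated
# files stay out of the repository.
cat >> .gitignore <<'EOF'
cache/
logs/
tmp/
wp-content/cache/
wp-content/uploads/
EOF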
In addition, Chris, you may want to embark on your Git workflow without the handy script approach (at least initially). The script approach and Git hooks can seem sexy (well, because they are) and handy too, but initially why not go with a more manual command-line approach, which will also help you familiarize yourself with Git.
Once you've got your repo set up (GitHub, Bitbucket, somewhere else), you've pushed your latest code to it, and you're ready to deploy to production or staging, just log in to your host and, from wherever you've initialized the Git repo (site root, for example /site), do a:
git pull origin master
This will fetch and merge your code. It's a good idea to test this in a dev/staging environment, and if the merge goes well, then do it in production.

Duplicate a Drupal installation from one server to another

I have been developing a Drupal 6 site on my PC using XAMPP. I'm done now, and everything looks peachy.
Problem is, I need to put all my content (including custom modules and themes) up onto a staging server, which only has a fresh Drupal 6 install on it. I can't imagine having to set up all my custom content types and whatnot all over again on the staging server.
So I ask: how does one go about doing what I need to do, which is essentially duplicating my Drupal install from my PC to the staging server?
The staging server is running Linux, and I develop on a Windows PC, if that helps.
Thanks in advance.
Copy all the files up from development to live, mysqldump your database, and run that dump on the live server. Then all you have to do is change the settings.php file to point at the right database host, if for some reason 'localhost' is not also where your MySQL database lives.
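Roughly, that looks like the following; the host names, database names, and paths are placeholders:
# Dump the local database, copy the site files and the dump to the staging
# server, then load the dump into the staging database.
mysqldump -u root -p drupal_dev > drupal_dev.sql
rsync -avz ./drupal/ user@staging.example.com:/var/www/drupal/
scp drupal_dev.sql user@staging.example.com:/tmp/
ssh user@staging.example.com "mysql -u drupal -p drupal_stage < /tmp/drupal_dev.sql"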
The quickest solution is probably the backup_migrate module. It is only a tool for copying your database; you could also use phpMyAdmin or something similar instead if you wanted, but backup_migrate has good default settings for which tables to skip (like the cache tables). All the settings that are not defined in code are stored in your DB, so you only need to copy the DB to be set. You can choose to exclude some tables, like the node or user tables, if you don't want to bring over your test data.
If you don't use Subversion, then you have to copy the files manually (rsync, scp, whatever) and the DB (mysqldump).
What we usually do is keep a hierarchy of independent Subversion repos as follows:
core
sites/all/modules/contributed
sites/all/modules/custom
sites/all/themes/ (we develop our own and don't use contributed themes)
sites/all/libraries
Then we use the svn:externals property so that if you check out "core" you get every associated repo.
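Wiring that up looks roughly like this; the repository URLs are placeholders:
# Attach the other repositories to the "core" working copy via svn:externals,
# so a single checkout of core pulls in every associated repo.
svn propset svn:externals \
"sites/all/modules/contributed http://svn.example.com/drupal/contrib
sites/all/modules/custom http://svn.example.com/drupal/custom
sites/all/themes http://svn.example.com/drupal/themes
sites/all/libraries http://svn.example.com/drupal/libraries" .
svn commit -m "Set externals" && svn update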
We have about two main developers, plus four other guys who may also contribute code to the site. Each has their own local dev environment, and we all share a common sandbox where we make sure the stuff we wrote doesn't break someone else's module (it has happened before!).
We use SVN commit hooks to update the beta/staging/sandbox site on commit.
With all that set up, [re]deploying a site is a simple matter of going to the proper folder, issuing a "svn co http://repolocation/reponame ." and then updating the DB.
Two last things to consider:
We are moving from SVN to Git.
The Features module will let you save the changes you make to your site's configuration (views, content types, etc.) and package it all into a deployable module so you don't have to duplicate your efforts. We are also looking into using this ourselves.
I hope this helps you.
I second using backup_migrate. It's great.
When I'm installing a fresh site from development to production, I:
back up the site using the backup_migrate module
copy all the files up to the server
edit sites/default/settings.php to have the right database path and account info
do an import of the last backup_migrate dump (usually using mysql < backupfilename.sql, unless I already have Drupal set up and backup_migrate installed, in which case I use the GUI)
But take a look here for the official version:
http://drupal.org/node/776864
Now, you didn't ask, but when the site is live and users are contributing content, moving future development versions of your site from development/staging to production without blowing away live content is a whole different problem, and one that Drupal doesn't have a good answer for...
