Context:
We have a WordPress/WooCommerce site with a wide range of custom-tailored features that solve specific marketing needs.
We want to have variants of this same site (around 80% of the code would be the same) for different domains hosted on completely different servers.
Question:
What would be the best approach to instantiate and maintain the clone sites?
Additional Details
We don't track WordPress core files in git.
We track in git only the plugins vital to the site; the remaining ones are ignored (see the .gitignore sketch below).
The differences between sites would be mainly branding, but they are both content- AND code-related.
The idea is to set up a new "clone" site in a short time and still be able to roll out new features to it in the future.
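For illustration, a minimal sketch of the kind of .gitignore this setup implies; the plugin names are hypothetical placeholders:

    # WordPress core is not tracked; the provisioning script downloads it
    /wp-admin/
    /wp-includes/
    /wp-*.php
    /index.php

    # Ignore all plugins by default...
    /wp-content/plugins/*
    # ...except the vital ones we maintain ourselves (hypothetical names)
    !/wp-content/plugins/acme-marketing/
    !/wp-content/plugins/acme-branding/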
Deployment Specs
We use Laravel Forge to provision AWS servers.
We use a Bash script for installing all dependencies, downloading WP core, and restoring a sample DB (a sketch follows below).
We use composer for dependency management.
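As a rough sketch of that provisioning flow, assuming WP-CLI is available on the server and using made-up paths, domains, and a sample.sql dump:

    #!/usr/bin/env bash
    set -euo pipefail

    SITE_ROOT=/home/forge/example.com   # hypothetical Forge site root
    CLONE_DOMAIN=clone.example.com      # the new variant's domain
    cd "$SITE_ROOT"

    # Download WP core (not tracked in git), skipping bundled themes/plugins
    wp core download --skip-content

    # Install the tracked plugins and other PHP dependencies
    composer install --no-dev --optimize-autoloader

    # Restore the sample database and point it at the clone's domain
    wp db import sample.sql
    wp search-replace 'https://template.example.com' "https://${CLONE_DOMAIN}" --all-tables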
Related
I'm very new to Drupal, so please don't be too mad in case I have any major misunderstandings :) I've tried searching for a similar problem, but I just couldn't find a suitable solution for my case.
We're currently setting up a Drupal 9 project, which will eventually have a shared development environment and a production environment, as well as a local instance to develop on. I'd like to have a way to synchronize those instances so they share the same configuration, content types, and optionally even content.
At the moment, I'm developing a theme locally, which means I have installed a Drupal instance inside a XAMPP server. That theme is versioned with git, so it can be handed over to another developer without a problem.
For migrating the structure and content (which is obviously saved in the database), I tried using Backup & Migrate, but I faced two issues: the D9 version is not fully supported yet, so installation via composer fails with default security settings, and there seems to be an already repeatedly reported bug when trying to back up the entire site. You can work around it by backing up the database and the files separately, but this is pretty inconvenient due to other issues (but let's keep it a little short...).
I also tried exporting the whole database, which actually works (after this little fix), but the overhead seems a little high to me, especially when I just want to copy new content types from the dev to the prod environment without users, content, and so on.
So, to finally come to an end: is there any best practice for this case? Or should I consider going a whole different way?
Thanks in advance!
I definitely wouldn't recommend using Backup & Migrate for this - that's so Drupal 7! Drupal 9 has better tools that are baked into core!
There are many possible ways to import/export Config and Content entities across environments, but I'll share what I believe to be the current best practices.
For Configuration, Drupal 9 has a built-in Configuration Management system that makes it quite easy to migrate Config across environments. It can be used within the Drupal UI, and also integrates with Drush (a command-line tool for Drupal).
Essentially, the Config system exports all Config settings as standardized YAML files, which can easily be included in your Git repository. This will make it incredibly easy to set up another Drupal environment that is identical in its Config settings.
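For example, the usual round trip with Drush looks like this (the sync directory is whatever you've configured in settings.php):

    # On the source environment: export all active configuration as YAML
    drush config:export        # short alias: drush cex

    # Commit the sync directory to git; then, on the target environment:
    git pull
    drush config:import        # short alias: drush cim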
More info on Configuration Management can be found here.
For Content, Drupal 9 has a built-in Migrate API, which facilitates migrations from any data source into a Drupal 9 environment. That means you could set up a migration that would allow you to migrate your Content entities across environments.
I've only ever used this for migrating Content (migrated from Drupal 7), but I believe it's also possible to use this to migrate Config as well.
If you decide to use the Migrate API, you may (depending on the setup of your content) need to install Migrate Tools and Migrate Plus.
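A rough sketch of that setup from the command line; the module names are real, but the migration ID is a made-up placeholder:

    # Install the contrib helpers commonly used for custom migrations
    composer require drupal/migrate_plus drupal/migrate_tools
    drush pm:enable migrate_plus migrate_tools

    # List available migrations, then run one defined by your module
    drush migrate:status
    drush migrate:import my_content_migration   # hypothetical migration ID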
More info on the Migrate API can be found here.
I've never worked with any CMS, and I simply wanted to play with one. Since I originally come from .NET roots, I was thinking about choosing Orchard Core CMS.
Let's imagine a very simple scenario: together with my colleague, I'd like to create a blog. As I'm used to working on web-based business systems and applications, it's kind of normal for me to work with a code repository, multiple environments (dev/test/stage/prod), CI/CD, and database changes via migrations or scripts.
Now the question is: do I need all of this when working on our blog with a CMS?
To be more specific, I can ask a few questions:
Shall I create the blog using the CMS locally (on my PC), create a few articles, and then deploy it to the web, or should I create the blog directly on the internet and add articles in the prod environment?
How do I synchronize databases between environments (dev/prod)?
I can add that, as I do not expect many visitors on the website, I was thinking of using Orchard Core CMS together with SQLite. I also expect to customize code, add new modules, extend existing ones, etc., not only add content (articles). Please take that into consideration when answering the question.
So basically my question is: what should the workflow be for a person who wants to create, administer, and maintain a CMS site (let it be a blog), either alone or as a team?
Shall I work and create content locally, then publish it and somehow synchronize both the application and the database (the database is my main question mark, also in the context of how to do that properly using SQLite)?
Or should all the changes, code and content alike, simply be managed directly on a server, let's call it the production environment?
Excuse me if the question is silly or hard to understand, but I'm looking for any advice, as I really couldn't find any good examples or information about this; or maybe I'm searching in a totally wrong direction.
Thanks in advance.
Great question, not at all silly ;)
When dealing with a CMS, you need to think about the data/content in very different terms from the code/modules, despite the fact that the boundary between them is not always completely obvious.
For Orchard, the recommendation is not to install modules in production, but to have a dev - staging - production type of environment: install new modules in a dev environment, test them in staging, and then deploy to production when it's safe to do so. Depending on the scale of the project, staging may be skipped for a more agile dev-to-prod setup, but the idea remains the same, and it is not very different from any modular application.
Then you have the activation and configuration of the settings of the modules you deploy. Because in a CMS like Orchard, those settings are considered data and stored in the database, they should be handled like content. This includes metadata such as the very shape of the content of your site: content types are data.
Data is typically not deployed like code is, with staging and prod environments (although it can be, to a degree; more on that in a moment). One reason for this is that a CMS will often feature user-provided data, such as reviews, ratings, comments or usage stats. Synchronizing all of that in both directions is very impractical. Another, even more important reason is that the very point of using a CMS is to let non-technical owners of the site manage content themselves in a fast and direct manner.
The difference between code and data is also visible in the way you secure their changes: for code, usual source control is still the rule, whereas for content, you'll set up database backups.
Also important to mention is the structure of the database. You typically don't have to worry about this until you write your own modules: Orchard comes with a rich data migration feature that makes sure the database structure gets updated with the code that uses it. So don't worry about that, the database will just update itself as you deploy code to production.
Finally, I must mention that some CMS sites do need to be able to stage content and test it before exposing it to end users. There are variations of that: in some cases, being able to draft and preview content items is enough. Orchard supports that out of the box: any content type can be marked draftable. When that is not enough, there is an optional feature called Deployments that enables rich content deployment workflows that can be repeated, scheduled and validated. An important point concerning that module is that the deployment only applies to the subset of the site's content you decide it should apply to (and excludes, obviously, stuff like user-provided content).
So in summary: treat code and modules as something you deploy in a one-way fashion from the dev box all the way to production, with ordinary source control and deployment methods, and treat data depending on the scenario, from a simple database instance managed directly in production with a good backup policy, to drafts stored in production, all the way to complex content deployment rules.
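To make that concrete for the SQLite setup from the question, a one-way deployment could be as simple as this sketch (the host and paths are made up, and I'm assuming the default location of Orchard Core's SQLite file under App_Data):

    # Build the Orchard Core app locally: code and modules, no data
    dotnet publish -c Release -o ./publish

    # Push the build to production, never touching the data directory
    rsync -az --exclude 'App_Data/' ./publish/ user@prod-host:/var/www/blog/

    # On the server, back up the SQLite database instead of syncing it
    ssh user@prod-host 'cp /var/www/blog/App_Data/Sites/Default/yessql.db ~/backups/yessql-$(date +%F).db'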
I have one Developer Portal in Bluemix API Connect for the Development environment, one for the Testing environment, and one for the Production environment. I have made some customizations to the structure of the modules, settings, and content, and I would like to migrate them from one environment to the others.
I have a theme for the styles, fonts, and images, but there are settings and content that are not included in the theme. I have found some Drupal plugins to migrate tables from the database. However, there is a risk of overwriting tables related to the API, products, plans, etc.
I would like to know if there is a recommended way of doing this migration without having to do everything again manually.
There isn't currently a simple way to migrate that sort of configuration between portal sites.
There is a Drupal module called "Features" which provides export/import capability, but it doesn't support all configuration and isn't a process we have tested or documented, so you'd be using it at your own risk.
You presumably uploaded your custom theme to one site, so you can just upload it again to the second site - that bit is simple.
If it's an extensive amount of configuration, then you can raise a support ticket and ask Ops to overwrite the target site with the configuration of the previous site - but that's a one-off process; it would completely wipe the target site. So that isn't really going to help with ongoing changes.
You can write a custom Drupal module to make your configuration changes - then simply load the module on each site and it will make the desired changes. However, that can be a lot of work. If you only have a couple of sites, it's likely easier to simply redo the same changes manually.
Hopefully this will improve in the future.
(Full Disclosure: I am part of the API Connect development team)
I sometimes develop Drupal sites. This development involves writing custom modules and, of course, lots of configuration work in the admin interface.
I keep track of my custom modules using SCM (git, of course). Unfortunately, the configuration of all the Drupal modules is even more important and fragile. These settings are spread across the database and therefore cannot easily be tracked.
I create a backup of my development DB on a daily basis, but when I realize that something went wrong, it is a pain to compare the backup with the current state to hunt for differences.
Do you have any best practices or suggestions for how to do this professionally? (I still use Drupal 6, if that matters, but I'm interested in the new features of versions 7 and 8 as well.) I read about the Features module, which is very promising, but not exactly what I need.
My first ideas were (1) a module that would store all the settings in files that can be tracked with SCM easily, or (2) some automated process that would export the tables to files every time something changes.
More and more configuration can be moved into SCM as time goes by.
As of Drupal 7, some people have started developing their sites as installation profiles, e.g.:
http://walkah.net/blog/every-drupal-site-install-profile/
Features are another way of tracking changes and are useful for changing configuration over time, e.g. when several people work on a site and want to share the configuration they build on their local dev machines. The usability of Features can be enhanced using ctools and drush (see the Drush CTools Export Bonus module if you take that route).
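The Drupal 7 Drush workflow for Features is roughly this (the feature name is a placeholder):

    # Export selected configuration into a feature module
    drush features-export my_settings

    # Check which features have drifted from the exported state
    drush features-diff my_settings

    # Push the code state back into the database
    drush features-revert my_settings   # or: drush fra (revert all)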
For quick import/export of Node types, Taxonomy, User, Field API fields and Field groups http://drupal.org/project/bundle_copy seems a good option.
Here is a good blog post about the different options: http://palantir.net/blog/multi-headed-drupal
With Drupal 8 we'll see a big shift in configuration management, as configuration export will be built into core. There are several core initiatives, one of which is the Configuration Management Initiative. A backport of some of the functionality is available as a Drupal 7 module.
Besides this, a way I like to handle things while working is to note what I have changed, either in my time tracking or in the issue tracker of the project I am working on.
Consider this scenario:
I could use a CMS, say WordPress, to create a product catalogue where my products are effectively tagged and categorised for ease of navigation. For employees and customers, this would provide an effective and visual means of browsing a catalogue of products.
The problem with this is that it requires a connection to the internet to serve up the information. There could be many situations where the users of this catalogue are not connected to the internet, but still need to browse the catalogue - like field sales staff, for example.
How then, is it possible to make this entire site available for viewing (and distributing) offline? It would need to function exactly as the internet-connected version, serving up the same information and images.
Is it possible!?
I guess the limitation is that the WP database serves up the info, and that would require everyone to have a MAMP-type installation with WordPress on their machines?
You could create a static mirror of the site, e.g. wget -km http://DOMAIN. Package that into an archive and have them install a new archive whenever it's updated.
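For a slightly more robust mirror than that one-liner, something like this should work (the domain is a placeholder):

    # Mirror the site, rewrite links for offline use, fetch page assets,
    # and add .html extensions so pages open straight from the filesystem
    wget --mirror --convert-links --page-requisites --adjust-extension \
         --no-parent https://catalogue.example.com/

    # Package the result for distribution
    tar -czf catalogue-offline.tar.gz catalogue.example.com/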
If you need it to function exactly like the live site, as you mentioned, you might want to check out XAMPP. It is a package containing an Apache web server, MySQL, Perl and PHP. It does not need to be installed before being used, but it does require starting the components, which could probably be scripted.
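On Linux, for instance, starting and stopping the components can be scripted via XAMPP's bundled control script (default install path shown):

    sudo /opt/lampp/lampp start    # starts Apache and MySQL
    sudo /opt/lampp/lampp stop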
The downside is that you will need to customize this version unless you want to include all your information in the catalogs. Also, since your current server likely has different modules than what comes standard with XAMPP, this could lead to having to maintain what are basically two versions of the site.
If you don't need the databases to sync (e.g. portable POS systems), MAMP is a great solution. I've implemented this several times in cases where field agents required web-based promo materials. Easy to update, maintenance free, small learning curve. MAMP all the way.
I'm developing a WordPress site, mirrored locally under http://localhost. I'm able to transfer the database with a simple plugin that handles backups; then, before I load it locally, I remap the URI strings inside the SQL dump. Because the values are PHP-serialized, some care is needed to keep the string lengths aligned, i.e. change each occurrence of s:N:"...http://your_site/" to s:M:"...http://localhost/your_site/", with M = N + 10.
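For what it's worth, WP-CLI can handle that serialized-length bookkeeping automatically; the same remap as above would be:

    # search-replace is serialization-aware, so s:N:"..." lengths are fixed up
    wp search-replace 'http://your_site' 'http://localhost/your_site' --all-tables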