WordPress - Best practice for pushing updates to multiple servers using version control

So I am about to add another WP blog, but I'd like to keep it under version control. Then I started thinking about how that would affect my current WP workflow. Based on my limited experience with WP, when an update is pushed from the WP dev team, I see an indication in my admin control panel. From there I can simply click the button, and the changes are implemented behind the scenes. This approach is great for a single WP instance outside of version control, but what about multiple nodes under version control?
Some of the WP updates include both code and schema changes, so I can't simply publish the code without also implementing the new schema changes. The best I can figure is to do the following:
Check out a local copy of the current WP version stored in version control
Download the latest (stable) WP files
Extract to the local path (created in step 1)
Diff changes (optional)
Commit changes to version control
Log into each server
Put into maintenance mode
Pull latest changes
Implement new schema changes (????)
Test
Take out of maintenance mode
Step 9 is what is tripping me up. Do I do a schema dump from my local (freshly updated) database, then import that schema on every server (or use the provided schema change file, if WP includes one)?
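One option for step 9, rather than dumping and importing the schema yourself, is to let WordPress run its own database upgrade routine on each node once the new code has been pulled. A rough sketch, assuming SSH access to each server and a reasonably recent WP-CLI (the maintenance-mode command is not in older releases); hostnames and paths are placeholders:

import subprocess

SERVERS = ["web1.example.com", "web2.example.com"]  # placeholder hostnames
WP_PATH = "/var/www/blog"                           # placeholder install path

def run(host, command):
    # Run a command on a remote node over SSH; fail loudly on errors.
    subprocess.run(["ssh", host, command], check=True)

for host in SERVERS:
    run(host, f"cd {WP_PATH} && wp maintenance-mode activate")      # step 7
    run(host, f"cd {WP_PATH} && git pull --ff-only origin master")  # step 8
    # Step 9: let WordPress apply its own schema changes instead of
    # importing a dump taken from the locally upgraded copy.
    run(host, f"cd {WP_PATH} && wp core update-db")
    run(host, f"cd {WP_PATH} && wp maintenance-mode deactivate")    # step 11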
Is there a better approach to this?
---- EDIT 1.20.2014 ----
After further consideration, I wonder if setting up some type of MySQL replication would be the way to go: have one node with read/write access so it can make changes that are restricted to the database only (e.g. deactivating a widget), while the other servers serving the blog content read from read-only MySQL instances that are replicated to. This way only one server makes changes, and the others pull from it. During my research I have noticed that some changes, like alterations to a child theme via functions.php or style.css, can be tracked in version control, but other changes, like activating/deactivating widgets, are purely SQL-based and would be impossible to track in version control.

Is there a better approach to this?
Don't touch WP core (do you really need it?)
OR
Hack core only once, in order to replace the default repository URL of WP core with yours, and later use the system auto-updater with your repository

Related

How do I make a programmatically changed site dashboard refresh without restarting the Alfresco/Tomcat service?

I've created a web-script module extension and have verified that it works correctly. What it does is take the dashboard.xml and related page.component-X-Y.type~id~dashboard.xml files from one site, delete all dashboard-related files on another site, then copy the source files to the new site that had them deleted.
pseudo-code
var siteDashboard = getDashboard("site1-shortname");
siteDashboard = renameShortNames(siteDashboard, "site1-shortname", "site2-shortname");
deleteDashboard("site2-shortname");
createDashboard("site2-shortname", siteDashboard);
renameShortNames just renames the site id inside the dashboard files to the new site's id.
This all works; I've tested and verified it. My problem is that when I go to http://alfrescosite.com/alfresco/s/remoteadm/get/s/sitestore/alfresco/site-data/pages/site/site2-shortname/dashboard.xml it shows me the new dashboard layout from site1-shortname, which is the correct behavior, but when I go to the actual site's dashboard within Alfresco Share it shows the old site2-shortname dashboard. The only way I can get the new dashboard to show is by restarting the Alfresco/Tomcat service. I've even tried looking at the dashboard with a different browser just in case it was a local caching issue, but it's not.
Any ideas on how to make the dashboards refresh to the new layout without having to restart the Alfresco/Tomcat service every time?
I figured out what the problem was. The problem was that I was deleting and recreating the dashboard via Remote API calls to the Alfresco Repository and doing it that way was making the appropriate changes but not telling Alfresco Share of those changes.
The solution was to use a combination of calls on the Share root object sitedata to remove the component bindings, delete the components, and recreate them through Share, so that the changes are automatically picked up on the front end without the need for a service restart.
Basically this ended up being a modified version of the code in customise-dashboard.post.json.js inside Alfresco Share.

The Features module and blocks on Drupal 7

I am trying to implement the Features module on one of my Drupal 7 sites for managing blocks. I have a couple of questions though. First, when you create a new feature on the source site, do you then take that newly created feature, put it in your modules directory, and enable it on the source server AND the destination server, or JUST the destination server?
Also, I'm wondering how it works when you are trying to manage blocks with a test server and a live server when the live server is a clone of test. In other words, we create a test server, construct our site including content and blocks, and when it's finished we clone test to live. Then we install the Features module on test and create a feature that contains ALL of our custom blocks. When I did this though, and moved that feature to the live server and enabled it, it was immediately in an overridden state. Are features only meant for moving NEW blocks from one site to another and not meant to manage blocks that already exist on BOTH servers? Should I create the feature containing all the blocks on the test server, then delete the blocks on the live server, and THEN enable the feature on live, which would populate the blocks on live? I'm just not sure if I'm missing something or going about this the wrong way.
THANKS
UPDATE: OK, I'm pulling my hair out over here. Again, I have two sites, a source and a destination. The destination is an exact clone of the source. I have three blocks on both sites that I would like to manage via Features. So, on the source site, I decided to test with just ONE block first. I first edited the block so it would be different than the one on the destination site. I then created the feature including the block and block settings (by the way, I'm using Features Extra to accomplish this), placed the feature on the destination site, and when I activate the feature, it is actually NOT in an overridden state, and the changes that I made to the block on the source site show up on the destination site no problem. HOWEVER, if I try to add the other two blocks to this feature on the source site, recreate it, and export it out to the destination site, the feature on the destination site is now in an overridden state, which is fine, but no matter how many times I "Revert" the feature to take the blocks out of the DB and into code, it will NOT get out of an overridden state. I have flushed the cache, disabled the feature and re-enabled it, and tried reverting, and it is stuck as overridden, and I do not see the changes to the other two blocks that I made. I then thought maybe it's because I am doing three blocks at once, so I took JUST block number two by itself, created a feature for it, and put it on the destination site, and it gets stuck in overridden status. Same goes for block number three. Block number one by itself is fine and does not get stuck in overridden status. It's just blocks number two and three. As far as I can tell, all three blocks were created the same exact way and do not have any different settings as far as roles, pages, etc. I am stumped on this one for sure.
A comment doesn't allow this long a post, so posting as an answer.
I can't say much without knowing the exact problem, but this is how Features works: you make your changes on the source site, then create a feature of those changes. On the destination site you then enable that feature. If you already have those changes on the destination site, the feature will be overridden; you revert it and get the changes.
As you say, you added the two other blocks to the feature, but you didn't change anything in those blocks, so they already exist on the destination site; that is why the feature is in an overridden state. When you revert, it does apply the changes, but sometimes it doesn't change the state shown at admin/structure/features.
I don't know your exact requirement, but I think you should make your changes on the source site, then pick them up in a feature and enable it on the destination site.
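If it helps, the revert can also be scripted so it is easy to repeat on the live server after each deployment. A minimal sketch (assuming drush plus the Features module's drush commands are available; the @live alias and the feature name are placeholders):

import subprocess

def revert_feature(site_alias: str, feature: str) -> None:
    # drush features-revert (alias: fr) forces the feature's components back
    # to what is stored in code, which should clear the overridden state.
    subprocess.run(["drush", site_alias, "features-revert", feature, "-y"], check=True)

revert_feature("@live", "my_blocks_feature")  # placeholder alias and feature name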

What does increasing the modification attribute do?

In %TRIDION_HOME%\web\WebUI\WebRoot\Configuration\System.config we can increment the modification attribute's value to instruct the Content Manager to force a download of items.
The setting is mentioned on the PowerTools discussion but also on the Skinning the Content Manager Explorer topic on SDL Live Content.
<server version="6.1.0.55920" modification="7">
Alternatives to updating the CME include clearing browser cache (CTRL+Shift+Delete in Chrome) or setting cache settings per user.
Question
Should I use this for any CM-side changes such as GUI extensions, schema changes, or template linked schemas? Or does it only apply to certain parts of the Content Manager Explorer?
In other words, after a schema and template change, what's the best way to make users get the latest versions of components, schema drop-downs, and template selections?
The values of the modification and version attributes become part of the URL of every CSS and JavaScript file that the Tridion UI generates/merges, and of many of the static (image) files too. So the URLs look like this: edit_v.6.1.0.55920.7.aspx?mode=css. Since the browser sees this as a new URL, there is no way it can have the file in its cache yet, and thus it will always have to download the files from the server instead of using (possibly outdated) files from the local cache.
This technique of injecting some version information into the URL is known as "URL fingerprinting". Google, for example, commonly embeds a hash value of the file into the URL, ensuring that the fingerprinting happens without requiring the developers to increase a version number manually. But whichever way of fingerprinting is used, the technique is a pretty efficient way to ensure that all browsers download the latest version of your code.
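As a generic illustration of the fingerprinting idea (not Tridion's actual implementation), a build step could derive the fingerprint from a hash of the file's contents:

import hashlib
from pathlib import Path

def fingerprinted_url(path: str) -> str:
    # Embed a short content hash in the file name so that any change to the
    # file produces a new URL and forces browsers to re-download it.
    digest = hashlib.md5(Path(path).read_bytes()).hexdigest()[:8]
    name, dot, ext = path.rpartition(".")
    return f"{name}.{digest}.{ext}" if dot else f"{path}.{digest}"

print(fingerprinted_url("edit.css"))  # e.g. edit.0a1b2c3d.css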
If you are developing a GUI extension, you can indeed typically get the same effect by clearing your browser cache or even disabling it completely (for the Tridion domain). But once you roll out your extension to a non-development server, changing the modification attribute is the most certain way to ensure that all your users get the latest JavaScript/CSS changes without each of them having to clear their cache manually.
The URL fingerprinting in Tridion only affects CSS, JavaScript and image files. The actual CMS data (such as Schemas and Components) is loaded using XMLHttpRequests and thus not affected by the modification attribute.
As far as I know,
<server version="6.1.0.55920" modification="7">
This clears only JS- and CSS-related caching. When a user accesses the CM, it then loads all the files, including the latest copies.
Should I use this for any CM-side changes such as GUI extensions, schema changes, or template linked schemas? Or does it only apply to certain parts of the Content Manager Explorer?
For this, the answer is no. Whenever a user makes changes to a schema, the changes should refresh across all publications; currently this is not happening in the browser.
Hopefully this will be fixed in upcoming versions.
In other words, after a schema and template change, what's the best way to make users get the latest versions of components, schema drop-downs, and template selections?
Currently the user has to do a forced refresh to get updated info across all publications.
The SDL Tridion CMS interface caches CMS Items in order to provide faster browsing and loading of its own interface. This does mean that sometimes:
Custom GUI extensions may not display latest versions of the files
Recently created or modified CMS items may not be shown, or may not show their latest version.
This is why sometimes a new keyword isn't shown within a component field, or a new component template isn't shown when trying to add a component page.
Incrementing the modification number in this node will cause all CMS items to show their latest versions to the CMS user(s). You'll see it uses this value to reference the CSS and JS files used by the CMS GUI.
As a developer I also turn off my Firefox cache (I prefer Firefox for the Firebug extension, which is great for working with GUI extensions), as this means you don't need to go and change this value; a simple browser refresh seems to always do the trick. Turning off the cache is explained here: https://superuser.com/questions/23134/how-to-turn-off-the-firefox-cache

How do I implement a dynamic role in Plone 3?

I want to allow access to certain content to certain users for a limited time,
using a 'Dynamic Role' in Plone 3 ( http://collective-docs.readthedocs.org/en/latest/security/dynamic_roles.html ).
To this end I've created an add-on with a copy-paste of the example code - except that for now getDummyRolesOnContext() always returns my role.
But Plone never calls or instantiates my DummyLocalRoleAdapter, and obviously my users never get the role assigned.
Here's what I know so far:
My dynamic role is defined in a rolemap.xml and gets created upon add-on installation.
My add-on is being imported (an exception on its first line prevents Zope from starting).
None of DummyLocalRoleAdapter's methods are being called - I've spiked all of them with warnings and exceptions.
The adapter does get registered.
How do I continue debugging this - what's the magic part I'm missing?
Thanks!
My guess is that you need to somehow activate the borg.localrole PAS plug-in in acl_users:
https://github.com/plone/borg.localrole/blob/master/borg/localrole/utils.py
There might have been a borg.localrole add-on installer entry in the past, but now there doesn't seem to be one. My guess is that you need to call the actions from borg.localrole's add-on setup code manually in your own add-on.
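A rough sketch of what that manual activation could look like from your own add-on's install code; the plug-in id and the assumption that the plug-in object already exists in acl_users should both be checked against the linked utils.py:

from Products.CMFCore.utils import getToolByName
from Products.PluggableAuthService.interfaces.plugins import ILocalRolesPlugin

def activate_localrole_plugin(portal, plugin_id="borg_localroles"):
    # Assumption: the borg.localrole plug-in already sits in acl_users under
    # plugin_id; make sure it is activated as an ILocalRolesPlugin provider.
    acl_users = getToolByName(portal, "acl_users")
    plugins = acl_users.plugins
    active_ids = [pid for pid, plugin in plugins.listPlugins(ILocalRolesPlugin)]
    if plugin_id not in active_ids:
        plugins.activatePlugin(ILocalRolesPlugin, plugin_id)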
When borg.localrole is correctly installed, the plug-in shows up in acl_users.

How do I update a Plone custom policy (e.g. mysite.policy) add-on?

When I first created my Plone (4.1) site, I made a mysite.policy add-on to include some custom users and a custom workflow.
I need to make some corrections to both the workflow and the permissions. I updated the src to include these changes, but updating the package through the Plone add-on manager (uninstall - install) does not work. As soon as I uninstall, the status of all my entries switches to "local policy", so I cannot get the fine-grained status settings back when I reinstall.
Also, the user permissions do not seem to change, possibly because they were already created at set-up of the site. But I cannot figure out how to code a change to permissions versus a setup of permissions in rolemap.xml. I assumed that whatever is in that XML is what rules my Plone world, but that does not seem to be working.
So far I cannot find anything about this in the manuals and books I have at hand. Any hints on how to solve this? Perhaps the only way to go about this is a series of manual changes through the ZMI, but it is so much less elegant to do it that way.
There are plenty of options. I'll try to describe a couple of them.
If your changes only involve the Generic Setup profile of your site policy (the files under ./src/my/site/policy/profile/default/) and you don't want to automate the upgrade, you could simply update the profile files and re-run the specific import steps for your policy:
Open the ZMI (site/manage) for your site and look for portal_setup.
Select the Import tab on portal_setup.
Select the profile of your site policy from the Select Profile or Snapshot list (the title of your profile is defined by the registerProfile directive in configure.zcml or profiles.zcml of your policy product).
Click to select the import steps for Role / Permission Map and Workflow Tool.
At the bottom of the page, deselect Include dependencies.
Click the Import selected steps button.
Go to the portal_workflow tool in the ZMI and run Update security settings, if your workflow update should modify permissions in existing workflow states.
These steps re-import only the selected import steps of your site policy product's Generic Setup profile. Re-importing individual steps this way should be quite safe, but be careful: accidental clicks on the portal_setup screens may have unpredictable consequences.
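The same re-import can also be done from code (for example from a bin/instance debug session), which avoids the risk of stray clicks in portal_setup. A sketch, with my.site.policy standing in for your package name:

from Products.CMFCore.utils import getToolByName

def reimport_rolemap_and_workflow(portal):
    # Re-run only the rolemap and workflow import steps of the policy profile.
    setup = getToolByName(portal, "portal_setup")
    profile = "profile-my.site.policy:default"
    setup.runImportStepFromProfile(profile, "rolemap", run_dependencies=False)
    setup.runImportStepFromProfile(profile, "workflow", run_dependencies=False)
    # Equivalent of the ZMI "Update security settings" button: reapply the
    # (possibly changed) workflow permissions to existing objects.
    getToolByName(portal, "portal_workflow").updateRoleMappings()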
These steps can also be automated by defining a Generic Setup upgrade step.
I hope that the default Generic Setup profile of your site policy product includes a metadata.xml with the line <version>1</version>.
Update that line to <version>2</version>.
Open the ZCML file with the registerProfile directive and, after it, add:
<genericsetup:upgradeDepends
source="1" destination="2" sortkey="1"
title="Upgrade my.site.policy (1 to 2)"
description="Upgrades my.site.policy's default profile from version version 1 to 2."
profile="my.site.policy:default"
import_steps="rolemap workflow"
run_deps="false"
/>
These steps should register an upgrade step from profile version 1 to 2, which re-imports the rolemap and workflow steps (rolemap.xml and workflows.xml). You should be able to run the upgrade step from the Plone Site Setup's Add-ons screen, where there should now be an upgrade button next to your installed policy product.
As mentioned by @toutpt, the Collective Developer Manual has more examples on upgrade steps. If you have ever wondered why it's recommended to use integers in metadata.xml, usually independently of the product's release version number, this is the reason :).
Any changes that need an upgrade must be signalled by incrementing the number in profile/default/metadata.xml (keep it an integer). Next you have to write an upgrade step. It will add an upgrade button in the add-ons control panel.
Please follow this tutorial to learn how to create an upgrade step: http://collective-docs.readthedocs.org/en/latest/components/genericsetup.html?highlight=upgradestep#upgrade-steps
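As a sketch of what such an upgrade step can look like in code (the handler would be registered in ZCML with a genericsetup:upgradeStep directive whose handler attribute points at this function; my.site.policy is a placeholder):

# upgrades.py in my.site.policy
PROFILE = "profile-my.site.policy:default"

def upgrade_1_to_2(setup_tool):
    # setup_tool is the portal_setup tool that Generic Setup hands to the
    # upgrade step; re-import only the steps that changed in this release.
    setup_tool.runImportStepFromProfile(PROFILE, "rolemap")
    setup_tool.runImportStepFromProfile(PROFILE, "workflow")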
