Working on a local Drupal website

I have a website on an external server and a local Linux machine.
I prefer working on the local version, because it is faster.
After adding new modules and changing DB content (for example, adding a node), I want to upload these changes to the external server.
But how do I avoid losing DB changes (and site files) added by real users in the meantime?
I'm using drush rsync and sql-sync.

Use the Features module to create exportable features for the external website; they will show up as modules. For example, you can create a feature which contains a new content type and a view. You can also export roles, fields, etc.
Use the Strongarm module to export system variables and settings. You can export nodes and taxonomy terms by integrating them with the Features module via the UUID Features module or the Default content module.
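With that in place, the database only ever moves from live to local, never the other way; code and exported config move from local to live. A minimal sketch of such a deployment with drush, assuming placeholder site aliases @local and @prod (with %files defined) and a hypothetical feature name:

# Pull the live database and files down to the local site,
# so real users' changes are never overwritten.
drush sql-sync @prod @local
drush rsync @prod:%files @local:%files

# Export the local changes (content type, view, variables) into the feature,
# then push the code to the server, leaving uploaded files alone.
drush features-update my_feature
drush rsync @local @prod --exclude-paths=sites/default/files

# On the live site, apply the exported configuration in place.
drush @prod features-revert my_feature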


System Templates version 1.31.0 and higher implementation

I have upgraded my cloud Artifactory to "7.52.0".
Prior to the upgrade I was using System Templates to deploy my pipelines.
Although there is still backward compatibility after the upgrade, the new way to deploy and use System Templates for creating new pipelines is not working for me.
From the release notes I got to this link to configure System Templates in the new way.
https://www.jfrog.com/confluence/display/JFROG/System+Templates
So in my repository A I have two files, 'pipelines.yml' and 'values.yml'.
pipelines.yml is configured as follows:
valuesFilePath: ./values.yml
Include:
  template: myTemplates/TestTemplate/1.0.0
My values file contains values for the TestTemplate.
Then I go to https://example.jfrog.io/ui/admin/pipelines/pipelineSources and I try to create a new pipeline from repository A.
Looking at https://example.jfrog.io/ui/pipelines/myPipelines/myPipelines I don't see any pipeline created from the template.
Is that the right way to implement the new System Template?
I have also made sure that the templates are in the Artifactory by checking:
https://example.jfrog.io/ui/pipelines/templates
and also in the Artifactory directory tree.
Currently I am using the REST API to CRUD my Template Sources (https://example.jfrog.io/ui/pipelines/sources) and also to create new pipeline sources from a system template (apparently this is the old way), because after the upgrade creating a source pipeline doesn't sync the old/new templates, nor does it create a new pipeline from a system template that is located in Artifactory.
You need to use the syntax documented in the Global Templates link.
Using the "jfrog/PublishTemplate" global template documentation (https://www.jfrog.com/confluence/display/JFROG/Global+Templates), I noticed that in order to create and upload a system template you need to use the following syntax:
valuesFilePath: ./values.yml
include:
  template: jfrog/<global_template_name>/<template_version>
According to the system template documentation this is the syntax that got me confused:
valuesFilePath: ./values.yml
Include:
  template: jfrog/PublishTemplate/1.0.0
So, following the documentation, I had been using a capital "I" instead of a lowercase "i", together with bad indentation, when trying to create a new pipeline from my system template, which is why it failed.
You use the global template "PublishTemplate" to upload your system template into your Artifactory, and then use the uploaded templates to create your new pipelines.
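Applied to the question's setup, the working pipelines.yml would therefore look like this (a sketch reusing the myTemplates/TestTemplate/1.0.0 path from the question):

valuesFilePath: ./values.yml
include:
  template: myTemplates/TestTemplate/1.0.0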

How do you import config sync files in a Drupal 8 functional test?

I would like to know how to import config sync files in my functional tests for modules I am testing. For instance, I have some custom content types I would like to test against, and there are a number of files in config/sync that pertain to the node module and define the custom content types.
class ArticleControllerTest extends BrowserTestBase {
  protected static $modules = ['node', 'dist_source'];
}
At the top of my test I define the modules, which do successfully install, but that doesn't include the config sync settings, so none of my custom content types are present. How can I import these into my test environment?
At the beginning of testing for Drupal 8, I had the same question. After reading some documents and tutorials, I tried and learned several methods:
Use $this->configImporter() to import the configuration from sync to active storage
$this->configImporter() exists in Drupal\Tests\ConfigTestTrait, and the trait is used by some base test classes, like BrowserTestBase.
However, the method doesn't work for me, because I used the Thunder installation profile. The default content exists after the profile installation has completed, so once $this->configImporter() starts to import the sync configuration, it encounters errors that some entity types fail to be updated, because the entities already exist.
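For reference, a minimal sketch of that first approach in a test class like the one from the question (the config name, file path, and test body are illustrative placeholders):

use Drupal\Core\Serialization\Yaml;
use Drupal\Tests\BrowserTestBase;

class ArticleControllerTest extends BrowserTestBase {

  protected static $modules = ['node', 'dist_source'];

  public function testImportsSyncConfig() {
    // Seed the sync storage from the active storage so the importer
    // has a complete baseline to diff against.
    $sync = $this->container->get('config.storage.sync');
    $this->copyConfig($this->container->get('config.storage'), $sync);
    // Overlay the exported YAML you want to test, then import it,
    // which is roughly what "drush cim sync" does.
    $yaml = file_get_contents('path/to/config/sync/node.type.custom.yml');
    $sync->write('node.type.custom', Yaml::decode($yaml));
    $this->configImporter()->import();
  }

}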
Create a testing profile
(Haven't tried)
If the Drupal site is installed with the standard profile, you may try to put the sync configuration into a testing profile. The Install Profile Generator module may help you create the testing profile. There is a related issue #2788777 about config and profiles.
Put the configuration which depends on the module into config/install or config/optional
(Works for me)
Contributed modules always put their config into config/install and config/optional. Once the module is installed, the configuration is also written into the database (active storage). See the documentation: Include default configuration in your Drupal 8 module.
When developing configuration in a module, the Configuration development module helps export the config into the module's config/install directory.
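For example, a custom content type can ship with the module as a YAML file such as modules/custom/dist_source/config/install/node.type.article.yml (the module and type names echo the question; the values are a minimal sketch):

langcode: en
status: true
dependencies: {}
name: Article
type: article
description: 'Custom content type installed with the module.'
help: ''
new_revision: true
preview_mode: 1
display_submitted: true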
If anyone has the same experience, I look forward to you sharing it with us.
I do it in my testing base class (extends BrowserTestBase) in setUp() like this:
use Drupal\Core\Config\FileStorage;

// Import optional config from the sync directory.
$config_path = '/home/rainer/src/asdent/config/sync';
$config_source = new FileStorage($config_path);
\Drupal::service('config.installer')->installOptionalConfig($config_source);
Works great. It's just like a drush cim sync, and it provides the production config to my end-to-end automated PHPUnit tests in GitLab CI.

Database backup Azure Resource Manager

Is there a way in Azure Resource Manager to take a copy of an existing database? Currently I know there is a database import option, which points to a bacpac file in Blob Storage and creates a new database from that file, but the process to create the file is a manual one at this point. Given that, what is the current process to create bacpacs and put them in Blob Storage in an automated way through ARM?
There is a way to specify the createMode of your database in the ARM template. This is largely undocumented; I found it in the REST API documentation and then just tried it in the ARM template.
You can specify the properties "createMode" and "sourceDatabaseId".
I am not using this functionality myself, because the sourceDatabaseId needs to be in the same subscription, which was not the case for me. So I export the bacpac manually and then use an ARM template to import it (which is also undocumented, but I commented on the ARM template used here: https://azure.microsoft.com/en-us/documentation/articles/sql-database-import/)
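A minimal sketch of the copy approach as an ARM template resource (server, database, and parameter names are placeholders; the apiVersion is one that supported these properties at the time):

{
  "type": "Microsoft.Sql/servers/databases",
  "apiVersion": "2014-04-01",
  "name": "[concat(parameters('serverName'), '/', parameters('copyDatabaseName'))]",
  "location": "[resourceGroup().location]",
  "properties": {
    "createMode": "Copy",
    "sourceDatabaseId": "[resourceId('Microsoft.Sql/servers/databases', parameters('serverName'), parameters('sourceDatabaseName'))]"
  }
}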

Can buildout create content as part of a Plone install?

I'm trying to achieve a repeatable deployment of Plone for a site, using buildout and basically following Martin Aspeli's book Professional Plone 4 Development. I can set up the system with my source products <site>.policy and <site>.theme, and have the theme activated automatically, but when I run buildout, I still have to instantiate a Plone site and activate the policy product manually before creating the standard objects for the site.
Can buildout check for the existence of content objects, like the Plone site object or particular folders, during setup, and create them with the right settings if they don't exist? Can I do that in a separate <site>.content product, or should this be handled in the <site>.policy?
In principle buildout can do anything you can code, as long as you create a recipe to do the thing for you.
Luckily, someone already created a recipe to create the Plone site for you, called collective.recipe.plonesite:
[buildout]
parts =
    ...
    plonesite

[plonesite]
recipe = collective.recipe.plonesite
site-id = <site>
profiles-initial =
    <site>.policy:default
post-extras =
    ${buildout:directory}/src/<site>.content/site/content/create_content.py
The recipe provides several hooks that let you control site creation: you can execute system commands before or after the site is created, or run extra Python code before or after the GenericSetup profiles are run.
In the above example, post-extras runs a create_content.py script with the variables app and site set:
from Products.CMFPlone.utils import _createObjectByType

if 'someobject' not in site:
    _createObjectByType('SomeType', site, 'someobject', title='Foo Bar')

How do I make new nodes and edits pass through an intermediary for acceptance first?

I would like my users to be able to create and edit their own content. However, after they create new content, and after they edit it, I would like to have a way to accept those changes. Meanwhile, before changes are accepted, the previous version of the node should remain visible. Can this be accomplished with Workbench, or is there another route I should take?
You can make the content type unpublished by default from admin/structure/types/manage/article (where "article" is the content type), and then use the Views module to display all unpublished nodes, so you can manage them and publish them.
Then you can use the Rules module to unpublish the node after it is edited by the user.
Take a look at the Maestro module. It is a way to implement workflows into your site.
From the project page:
The Maestro module is a workflow engine/solution that will facilitate simple and complex business process automation. The first release of this module will be for Drupal v7.
Maestro has a number of components that include the workflow engine and the visual workflow editor. The workflow editor is used to define the workflow, creating a workflow template. The workflow engine runs in the background and executes the workflow tasks, testing the tasks' execution results and branching the workflow if required. The workflow engine will run every x seconds and execute all tasks that are in the queue which have not yet completed. Once they execute and return a success status, the engine will archive them and step the workflow forward. Both these components have been developed to support any number of different task types. New task types can be developed and added, much like the Drupal CCK module can support new field types.
I have installed and configured the Maestro module in Drupal Commons.
Actually, in the Maestro module, every time we have to load the workflow and it goes through step by step.
Like:
Admin: loads the publishing workflow.
User1: assigned to add new content.
User2: reviews the content.
User3: publishes the content.
Here I want to skip the Admin step. Is that possible? When a user creates a new content item, it should automatically go for review.
