How to create new content types using Drush commands in Drupal 8 - drupal

I know the config export and import option is available, but I have 5 environments. After implementing new things on the dev environment, I don't want to export and import on the other environments every time; instead, I want my pipeline to apply the required changes on the other environments by executing the required Drush commands.

The problem is that in Drupal, config import/export is the best way to do that.
The config export doesn't export only the content type configuration, but the entire website configuration.
So those commands help you keep your environments identical.
If your problem is about overriding some config values between environments, the Config Split module can help you with it.
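In a pipeline, that usually boils down to a couple of Drush commands. A minimal sketch, assuming Drush 9+ command names and that the exported YAML is committed to the repository:

# On dev, after making changes:
drush config:export -y
# On each other environment, e.g. from the pipeline:
drush config:import -y

Depending on the Config Split version, the split that is active for an environment is applied as part of that same import, or you use the module's own Drush commands (csex / csim) instead.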

Without config import, I would solve it by installing a module of your own that runs your custom code.
Steps:
Create a module my_custom_contenttype
Make a hook_install() (my_custom_contenttype_install())
Run this Drush command (uninstall first if you want to install it again, so you can run it more than once):
drush pm:uninstall my_custom_contenttype; drush pm:enable my_custom_contenttype;
function my_custom_contenttype_install() {
  // Add the content type here if it does not exist yet.
}
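For reference, a slightly fuller sketch of what the install hook could do; the machine name my_article and its labels are made up for illustration, and in a real module you would normally also create fields and displays:

use Drupal\node\Entity\NodeType;

/**
 * Implements hook_install().
 */
function my_custom_contenttype_install() {
  // Only create the content type if it does not already exist.
  if (!NodeType::load('my_article')) {
    NodeType::create([
      'type' => 'my_article',
      'name' => 'My Article',
      'description' => 'Content type created in code on module install.',
    ])->save();
  }
}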

You could do this using manually-managed configuration. Either by:
Creating a custom module and including your configurations in custom_module/config/install, and using "drush en" to install the module
OR
By using an existing custom module and using a post_update hook to install the configuration.
We do this on a number of sites, and loading and installing the configurations is just complex enough that we built a helper module to do it. You can pull this module in with Composer and use update hooks to activate it, or you can look at the installConfig function here and use it as a template. (The module has a helpful readme, too.)
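If you go the post_update route from an existing module, a rough sketch could look like the following; my_custom_module and the config names are placeholders, and writing to the config storage directly like this skips the checks a full config import would do:

use Drupal\Core\Config\FileStorage;

/**
 * Install the content type configuration shipped in config/install.
 */
function my_custom_module_post_update_install_content_type() {
  $source = new FileStorage(drupal_get_path('module', 'my_custom_module') . '/config/install');
  $target = \Drupal::service('config.storage');
  foreach (['node.type.my_article', 'field.field.node.my_article.body'] as $name) {
    $target->write($name, $source->read($name));
  }
}

Running drush updatedb in the pipeline then applies the hook on each environment.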

Related

How to test a new cloudify plugin with a blueprint that internally uses another plugin?

I'm developing a new cloudify plugin that I test using tox, nosetests and the workflow_test decorator, following the plugin template.
I'd like to test the plugin interacting with another plugin (specifically the openstack plugin). Hence I'm using a blueprint that imports my plugin (test yaml file) and the openstack yaml file, and then defines some nodes from my plugin and from openstack.
The problem is that I'm getting Python module import errors, as the openstack plugin is not found in the test environment created by tox/nosetests. I tried installing the plugin using wagon before running nosetests, but the installation fails.
Could anyone show me how to do that?
Have you tried requirements files? In your tox.ini file, you should be able to define your test requirements like this:
[testenv:py27]
deps =
-rdev-requirements.txt
-rtest-requirements.txt
Then put the URL of the branch zip (such as master) in your test-requirements.txt file:
https://github.com/cloudify-cosmo/cloudify-openstack-plugin/archive/master.zip
nose>=1.3
tox
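With those requirement files in place, the test environment picks up the openstack plugin the next time it is built; for example, recreating and running the py27 environment:

tox -r -e py27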
In general, I find the workflow_test not to be very useful, and usually I end up writing something else for the same purpose. For example, these functions basically do the same thing in this test base:
https://github.com/cloudify-incubator/cloudify-ecosystem-test/blob/791f02a27313ac0b63b029c66ead333cb17c4d9c/ecosystem_tests/__init__.py#L100
https://github.com/cloudify-incubator/cloudify-ecosystem-test/blob/791f02a27313ac0b63b029c66ead333cb17c4d9c/ecosystem_tests/__init__.py#L151
https://github.com/cloudify-incubator/cloudify-ecosystem-test/blob/791f02a27313ac0b63b029c66ead333cb17c4d9c/ecosystem_tests/__init__.py#L225

How to install modules with Drupal 8 and Composer?

I installed Drupal 8 via composer with:
composer create-project drupal-composer/drupal-project:8.x-dev my_site --stability dev --no-interaction
This downloaded all the files and ran composer install. According to this tutorial - https://www.drupal.org/node/2718229 - doing it this way also configures composer.json to allow installation of modules, themes etc. via Composer. Nice.
However, I'm trying to install a new module:
$ composer require drupal/codesnippet
Using version ^1.6 for drupal/codesnippet
./composer.json has been updated
> DrupalProject\composer\ScriptHandler::checkComposerVersion
Loading composer repositories with package information
Updating dependencies (including require-dev)
- Installing drupal/codesnippet (1.6.0)
Downloading: 100%
Writing lock file
Generating autoload files
> DrupalProject\composer\ScriptHandler::createRequiredFiles
However, when I go to Admin Bar > Extend > Install new module, I can search for the module and it says it's not installed yet. If I try to enable/install it from there, it tells me I need to download the plugin and copy it to the /libraries directory:
Before you can use the CKEditor CodeSnippet module, you need to download the codesnippet plugin from ckeditor.com and place it in /libraries/codesnippet. Check the README.txt for more information. Get the plugin here. (Currently using CodeSnippet version Plugin not detected)
Are these two completely different methods? How can I complete the installation with composer of this module?
Composer is a dependency manager, and whether or not third-party dependencies are included depends on how the module author managed their dependencies in the first place.
You aren't going to be able to complete the install via Composer alone, if a specific dependency isn't present on the repository that Composer downloads its packages from.
You're going to have to download the CKEditor CodeSnippet module from ckeditor.com. Composer can't manage that dependency for you, because that CKEditor plugin isn't a Composer package.
You can download it here: http://ckeditor.com/addon/codesnippet
Martyn, I guess you are conflating two different things: the Drupal module and the external library required by the module.
The Drupal module codesnippet (https://www.drupal.org/project/codesnippet) is just a Drupal integration module for the CKEditor addon of the same name, which you can download (http://download.ckeditor.com/codesnippet/releases/codesnippet_4.6.2.zip) and place manually in the Drupal webroot /libraries folder (in your case my_site/web/libraries/, to be more specific - you have to create it if it does not exist already).
Then you should be able to enable the drupal module.
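A rough command-line sketch of that manual step, assuming the drupal-composer/drupal-project layout with the webroot in web/ (the zip is expected to unpack into a codesnippet/ folder; verify after extracting):

mkdir -p web/libraries
cd web/libraries
wget http://download.ckeditor.com/codesnippet/releases/codesnippet_4.6.2.zip
unzip codesnippet_4.6.2.zip && rm codesnippet_4.6.2.zip
cd ../.. && drush en codesnippet -y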
PS: You could also add the library requirement to composer.json manually. That might be a bit more complicated for beginners, because you also have to specify things like a repository type, URL and installer-paths for the extra external library you need, but it might be easier in the long run: you can deploy new Drupal 8 installations with the same requirements from a single proper composer.json file, without having to download external libraries manually. There is a similar comment of mine (user zet) that you could read on this Drupal dropzonejs module issue: https://www.drupal.org/node/2853274
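For illustration, the manual composer.json wiring mentioned in the PS could look roughly like this; the package name, version and paths are assumptions to adapt, and drupal-composer/drupal-project already ships an installer-paths entry similar to the one shown:

{
    "repositories": [
        {
            "type": "package",
            "package": {
                "name": "ckeditor/codesnippet",
                "version": "4.6.2",
                "type": "drupal-library",
                "dist": {
                    "url": "http://download.ckeditor.com/codesnippet/releases/codesnippet_4.6.2.zip",
                    "type": "zip"
                }
            }
        }
    ],
    "extra": {
        "installer-paths": {
            "web/libraries/{$name}": ["type:drupal-library"]
        }
    }
}

After that, composer require ckeditor/codesnippet downloads the zip straight into web/libraries/codesnippet.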

Override postinst scriptlet to not start service and register it

I'm using SBT native packager 1.2.0-M3 for packaging a Play Framework 2.5.3 application as an RPM (targeted at RHEL 7 with systemd). I would like to change the behavior of the generated RPM so that it does NOT automatically start the service after install but only enables it (systemctl enable <name>.service).
I've been following the instructions outlined at http://www.scala-sbt.org/sbt-native-packager/archetypes/java_server/customize.html. Specifically, I created a file src/rpm/scriptlets/post-rpm containing the single line systemctl enable <name>.service. As far as I understand the documentation, that's all that is required. However, on installation of the RPM the service still gets started automatically. Is there any additional configuration required?
This is currently the default behaviour. There is a historical explanation here.
What you actually need to use is maintainerScripts in Rpm.
There is a helper trait which lightens the build definition. Something like:
import RpmConstants._
maintainerScripts in Rpm := maintainerScriptsAppend((maintainerScripts in Rpm).value)(
  Post -> "systemctl enable <name>.service"
)
And there is a feature request to implement this in native-packager directly.

Make grunt-eslint use globally installed eslint plugin

Calling grunt-eslint causes a Cannot find module 'eslint-plugin-react' error that doesn't happen when calling eslint directly from the command line.
I have eslint-plugin-react installed globally.
Is there an easy way to make grunt eslint behave the same way as eslint?
Assuming you don't want to install the node module locally for some reason, I can think of two options: 1. use grunt-exec within your Gruntfile to run eslint directly, or 2. as per the answer in the link below, set up a symbolic link to your global node modules folder:
How to use grunt-html installed globally?
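If you go the symlink route, a single npm command is usually enough (run from the project root; it links the globally installed package into the local node_modules):

npm link eslint-plugin-react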

Osgi commandline install multiple plugins simultaneously

Is it possible to simultaneously load all the plugins available in a directory in osgi commandline?
path/to/bin/java -jar org.eclipse.osgi_3.6.1.R36x_v20100806.jar -console -clean
This brought up the OSGi console and activated the org.eclipse.osgi bundle. ss shows me this:
id State Bundle
0 ACTIVE org.eclipse.osgi_3.6.1.R36x_v20100806
I have a bunch of bundles in a directory: /path/to/all/bundles
I can certainly do one-by-one on the osgi console using the following:
osgi> install file:///path/to/bundle/org.springframework.osgi.core
I want to be able to load all the bundles at once, and the next step is to be able to install them as well.
Any pointers?
Thanks!
I agree that you definitely don't want to install all your bundles by hand every time. Installing the single Apache Felix FileInstall bundle will make the framework automatically load anything you put into a watched directory (./load by default).
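A hedged sketch of that setup; the FileInstall bundle path and version are illustrative, and the watched directory can be pointed at your bundles folder with the felix.fileinstall.dir system property instead of the ./load default:

path/to/bin/java -Dfelix.fileinstall.dir=/path/to/all/bundles -jar org.eclipse.osgi_3.6.1.R36x_v20100806.jar -console -clean
osgi> install file:///path/to/org.apache.felix.fileinstall-3.6.4.jar
osgi> start <bundle id printed by the install command>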
You might be interested in using Apache Karaf. It gives you features, where a feature is a set of bundles defined by either Maven locations or file locations. Besides this it also gives you a lot of other benefits for working with OSGi bundles; for example, you get more than 200 shell commands to work with, and many of them will help you find problems with your bundles.
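For illustration, a minimal Karaf features file (names, versions and locations are placeholders) that groups several bundles into a single installable feature:

<features name="my-features" xmlns="http://karaf.apache.org/xmlns/features/v1.2.0">
  <feature name="my-bundles" version="1.0.0">
    <bundle>mvn:org.springframework.osgi/spring-osgi-core/1.2.1</bundle>
    <bundle>file:///path/to/all/bundles/another-bundle.jar</bundle>
  </feature>
</features>

After registering the file as a feature repository, a single feature:install my-bundles (features:install on older Karaf versions) installs all the listed bundles at once.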
