I've been reading about Drupal install profiles, and I'm wondering if there's much of a difference between using a packaged install profile vs. installing core and then manually installing the modules listed in the install profile.
I'd like to do the latter (manually installing each) to control the versions of each module installed, which I can't control with a packaged install profile that may not have been maintained.
But should I, or will I be opening the door to something I'm not aware of? Shouldn't the two be identical, just one automated and the other manual?
What kiamlaluno said, plus the fact that installation profiles may perform custom configuration of settings on install, might construct custom views/content types/etc. (especially by means of features.module, which you can see heavy use of in OpenAtrium), and might provide other custom code in distro-specific modules.
The short answer is: no, you can't just replicate an install profile by downloading a clean Drupal with all those modules -- your best bet is to use the install profile. If you're worried about module versions, just make sure you're using a profile that's actively maintained.
The difference is that an installation profile includes the right version of all the modules it needs.
This means that, unlike when you manually install each module, you don't need to verify which version of module X actually works together with module Y; there are cases where one module doesn't work well when version A of another module is installed, and you need to install version B instead if you want to avoid problems.
An installation profile can have a custom installation page that allows you to change some parameters of your site; it also allows the installation profile author to define a patch that needs to be applied to a module, in order to fix a bug in the module, or to make it work better with another module.
If you need to set up a site for a particular purpose, installation profiles are useful for you, as they allow you to set the site up correctly without having to know all the details about how a Drupal site needs to be configured.
I believe you can specify the versions of the modules you want to install (see below).
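For example, a profile can ship with a Drush make file that pins the exact releases to download. A minimal sketch (Drupal 6-era syntax; the version numbers are made up for illustration):

; mysite.make -- pins core and contrib versions for the profile
core = 6.x
api = 2
projects[drupal][version] = "6.20"
projects[views][version] = "2.12"
projects[cck][version] = "2.9"
projects[pathauto][version] = "1.6"

Building from such a file gives you the same module versions on every install, which addresses the concern about unmaintained packaged profiles.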
I'm currently starting with JFrog Artifactory. Up to now I have only been working with source code control systems not with binary repositories.
Can someone please tell me how the versioning of files is done in Artifactory?
I have been trying to deploy a file, then change it and deploy it again.
The checksum has changed, so it's the new file. But it seems that the old version is gone.
So it looks like there is no versioning of files. If I want that, do I have to encode it in the filename?
I found versioning related to packages.
But I was hoping to use it for other files as well.
Thanks for your help
Christoph
Artifactory, unlike a VCS, does not manage a history of versions for a given path. When you deploy an artifact over an existing artifact, it will overwrite it (you can block this by configuring the right permissions).
If you wish to manage versions for generic artifacts (ones which are not managed by a known package manager like npm, Maven etc.), there are a couple of options you can take:
Add the version as part of the artifact name, for example foo-1.0.0.zip
Add the version as part of the artifact path, for example /foo/1.0.0/foo.zip
Combine the 2 above approaches, for example /foo/1.0.0/foo-1.0.0.zip
Use an existing package management tool which is flexible enough to handle generic packages. Many people are using Maven to manage all types of packages beyond Java ones (it comes with its pros and cons)
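To illustrate the path-based approach, here is a sketch of a plain curl deployment to a versioned path (the host, repository name and credentials are placeholders):

# Deploy foo.zip as version 1.0.0 of "foo" into a generic repository
curl -u myuser:mypassword -T foo.zip \
  "https://artifactory.example.com/artifactory/generic-local/foo/1.0.0/foo-1.0.0.zip"

Each new version then goes to its own path (e.g. /foo/1.1.0/foo-1.1.0.zip), so nothing gets overwritten.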
From the Artifactory point of view there are a couple of capabilities you can leverage:
Generic repositories - aimed at managing proprietary packages which are not managed by a known package manager
Custom repository layout - can be used to define a custom layout for your generic repository and assist with tasks like automatic snapshot version cleanup
Properties - can be used to add version (and other) metadata to your artifacts, which can be used for searching, querying, resolution and more
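As a sketch of the properties approach (host and repository names are placeholders): you can attach a version property at deploy time using matrix parameters, and later find artifacts by that property through the REST API.

# Deploy and tag the artifact with a "version" property in one request
curl -u myuser:mypassword -T foo.zip \
  "https://artifactory.example.com/artifactory/generic-local/foo/1.0.0/foo-1.0.0.zip;version=1.0.0"

# Find every artifact in the repository carrying version=1.0.0
curl -u myuser:mypassword \
  "https://artifactory.example.com/artifactory/api/search/prop?version=1.0.0&repos=generic-local"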
Lastly, Conan is another option you should consider. Conan is a package manager intended for C and C++ packages. It is natively supported in Artifactory and can give you a more complete solution for managing your C libraries.
I'm looking to remove/modify the autocomplete-plus package that is bundled together with atom on install.
After a while of struggling and failing, I come to the wisdom of stack-overflow for how I can either:
Modify behaviour of autocomplete-plus
Prevent it from loading in the first place (i.e. remove it from the bundle)
The default packages are stored inside an asar file (i.e. Atom.app/Contents/Resources/app.asar on macOS), so it's highly impractical to tamper with its contents, not to mention that your changes will be lost with each Atom update.
Since you haven't given us a reason why you would want to do that, there is no ideal answer to your question. Generally speaking, I think there are better alternatives:
Disable the autocomplete-plus package and install your fork as you would install any other package. The Atom API offers ways to disable packages programmatically, if you want your fork to handle this (see the init-script sketch after this list).
Build your own custom version of Atom that suits your needs. The default packages are listed as packageDependencies in package.json.
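If you want the "disable it programmatically" route from the first alternative, a minimal sketch for an init script might look like this (assumes a recent Atom that loads a JavaScript init script from ~/.atom/init.js; the idea of pairing it with an installed fork is just an example):

// ~/.atom/init.js -- a sketch, not part of any package
atom.packages.onDidActivateInitialPackages(() => {
  // Keep the bundled provider off so an installed fork can take over.
  if (!atom.packages.isPackageDisabled('autocomplete-plus')) {
    atom.packages.disablePackage('autocomplete-plus');
  }
});

Disabling via the API persists to your config, so it behaves the same as disabling the package in the Settings view.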
You can go to Edit -> Preferences in the main menu, then check under 'Packages' in the left-hand menu, search for 'autocomplete-plus' and then click on 'Disable'.
I wanted to enable full text searching in Plone 4.2 (Windows). I ultimately installed the Products.OpenXml and ftw.tika add-ons using buildout, properly adding their packages in the eggs as well as the zcml section, after which they both show up in the portal_transforms tool.
I.e. I included this in buildout.cfg and ran it:
eggs =
Products.OpenXml
ftw.tika
zcml =
Products.OpenXml
ftw.tika
But indexing still does not include anything except the title in the SearchableText index, even after using clear and rebuild from the portal_catalog tool.
Please help me enable this properly; I am a novice, so please explain in detail if possible.
Or is there another, better way (maybe faster, supporting multiple formats, or simply better) to enable full text searching for external formats (doc, pdf, ...) in version 4.2?
I've never installed ftw.tika on Windows, but if you manage to set up the service and it's up and running, it should work.
To use ftw.tika you need to install the ftw.tika package in your Plone site by following the instructions in the README. This means you need to install the egg and the necessary zcml configuration to point to your local tika app:
zcml =
<configure xmlns:tika="http://namespaces.plone.org/tika">
<tika:config path="${tika-app-download:destination}/${tika-app-download:filename}"
port="${tika:server-port}" />
</configure>
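The ${tika-app-download:...} and ${tika:...} substitutions above refer to buildout parts. A rough sketch of what they could look like (the recipe, download URL and Tika version here are assumptions on my part -- check the ftw.tika README for the exact part names and options it expects):

[tika]
server-port = 9998

[tika-app-download]
recipe = hexagonit.recipe.download
url = https://archive.apache.org/dist/tika/tika-app-1.7.jar
filename = tika-app.jar
download-only = true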
Please also make sure you have a recent version of Java installed, because it tries to run the tika-app.jar.
You can check if ftw.tika is installed properly by looking into the portal_transforms tool. There should be a tika_to_plain_text transform (http://plone/portal_transforms/tika_to_plain_text/manage_main).
If not, use quickinstaller, or portal_setup to install ftw.tika.
Also, the server should run fine on Windows, since it's Java as well. I guess you cannot use the examples from the package instructions directly, since they're made for Unix machines.
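If you want to verify the transform from the command line rather than through the ZMI, something along these lines should work from bin/instance debug (a sketch for Plone 4.2 / Python 2; the site id "plone" and the PDF path are placeholders):

# Run inside: bin/instance debug
from Testing.makerequest import makerequest
app = makerequest(app)
portal = app.plone  # "plone" is your Plone site id
transforms = portal.portal_transforms

# Feed a sample document through the transform chain to text/plain
f = open('/path/to/sample.pdf', 'rb')
data = f.read()
f.close()
result = transforms.convertTo('text/plain', data, mimetype='application/pdf')
if result is not None:
    print result.getData()[:200]  # first 200 characters of the extracted text
else:
    print 'no transform chain found for application/pdf'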
Usually I have to activate my available add-ons in order to make them work. But I find that collective.geo.behaviour and my custom transmogrifier package seem to work well without activating them. This makes me wonder what the trick is behind the scenes. Will something go wrong if I keep using these add-ons without activating them?
Python packages that are installed for use in your Plone environment show up in your add-on list because they have Generic Setup profiles for addition to a Plone environment. Usually these profiles do things like set browser layers, add skin layers, add types, or set up the catalog. They can also specify that the Generic Setup profile for some other add-on(s) should be run when this package is installed.
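For example, a minimal profiles/default/metadata.xml that does nothing except declare a dependency on another add-on's profile might look like this (the dependency name is a placeholder):

<?xml version="1.0"?>
<metadata>
  <version>1</version>
  <dependencies>
    <dependency>profile-some.other.addon:default</dependency>
  </dependencies>
</metadata>

Installing such a profile through the add-on control panel only triggers the listed dependency profiles; it registers nothing of its own.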
The two cases you mention here have different things going on:
crgis transmogrifier has a GS profile, but -- as far as I can tell based on examining its repository -- does not need one. Its GS profile does nothing, so the install add-on choice will do nothing. Drop a note to the add-on author and tell them that.
Collective Geo Behavior's GS profile does nothing but specify that a couple of other add-on GS profiles be run. If you have already done the add-on installation for those, then this step does nothing. But, don't rely on that fact for future installations.
I think it's because these profiles don't do anything other than declare dependencies, and the code & ZCML are loaded at startup. So as long as you have already imported the dependencies listed in the profile, you should be fine.
Also, I think you could have packages without a profile at all, if you don't have a dependency or anything to register.
When I create a new Drupal site I usually end up with at least one custom module and several community contributed modules. To get the site working as it should, many configuration values need to be set on the various modules. This makes deployment onto a fresh Drupal instance painstaking and error-prone.
I would like to give my custom module the ability to configure all the other modules. Either on install or on the click of a button on my custom module's administration page, all the necessary configuration values on the other modules would be programmatically set.
How would I best go about doing this?
AFAIK, there's no way to achieve what you mean easily. I tend to put as much as I can in hook_update_N() implementations and do frequent DB synchronisations as described in my answer to this question. However that does not work when you already have a live server with which you will have to merge data.
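To make that concrete, here is a minimal sketch of such an implementation (Drupal 6-era APIs; the module name mysite_config and the specific settings are placeholders, not taken from the answer above):

<?php
/**
 * Implementation of hook_update_N() for a hypothetical mysite_config module.
 * Runs once when update.php is executed, so every environment ends up with
 * the same configuration.
 */
function mysite_config_update_6001() {
  $ret = array();
  // Make sure a contrib module is enabled before configuring it.
  module_enable(array('pathauto'));
  // Set configuration values that would otherwise be clicked through the UI.
  variable_set('site_frontpage', 'node/1');
  variable_set('pathauto_node_pattern', '[type]/[title-raw]');
  return $ret;
}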
For that purpose, I use various tools according to the need. None of them is perfect, but here is a small collection of my favorites:
Features. This is a new concept and a new module. The idea is pretty awesome: it allows you to define a set of configuration/modules/settings and to export them as a feature. This feature will then be installed as if it were a module on the target site. This module does not export every possible setting, but it does however do a good job with the modules that need the hardest configuration, such as CCK, Views, ImageCache and others... You can see a screencast demo (~10 mins) here.
Backup and migrate. This is a more radical approach: it simply dumps and rebuilds the entire database on a target system. It is good only if you need to overwrite the target system completely.
Node export. This allows you to export (and import) nodes from one Drupal installation to another. It supports bulk operations but - unfortunately - it does not support the migration of attached files and images.
Deploy. Because of the limitations of Node export I once looked into using this module (still in development). In the end I did not, and preferred to merge the production and staging databases, but the concept seems very valid, as it allows you to import/export complex data types via SOAP.
Taxonomy import/export. I suppose the name is self-explanatory. It uses files to achieve the tasks (XML or CSV).
Installation profiles (suggested by ctford) are useful when configuring new sites. They allow you to specify the modules to enable, the theme to default to, etc. on installation. They can be quite convenient because there is a command-line tool called Drush that automates the building of installation profiles. The downside is that the profiles are designed to be used on installation - not deployment of an individual module. It might be possible, however, to take the configuration code generated by Drush and call it when your module is enabled.
Finally, you can find a collection of tools for importing/exporting data here.
HTH!
Have you looked at the "Features" module? It is a new paradigm introduced as part of the Open Atrium distribution, but also available as a stand-alone module. From their description:
"The features module enables the capture and management of features in Drupal. A feature is a collection of Drupal entities which taken together satisfy a certain use-case.
Features provides a UI and API for taking different site building components from modules with exportables and bundling them together in a single feature module. A feature module is like any other Drupal module except that it declares its components (e.g. views, contexts, CCK fields, etc.) in its .info file so that it can be checked, updated, or reverted programmatically."
http://drupal.org/project/features
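For reference, a generated feature module's .info file might look roughly like this (Drupal 6-era Features syntax; the names are placeholders, not from any real feature):

; blog_feature.info -- declares the components the feature bundles
name = "Blog feature"
core = "6.x"
package = "Features"
dependencies[] = "views"
dependencies[] = "content"
features[views][] = "blog_listing"
features[content][] = "blog-field_teaser_image"
features[node][] = "blog"

Because the components are declared here, Features can detect when the live site has drifted from the exported state and revert or update it programmatically.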
Installation profiles are useful when configuring new sites. They allow you to specify the modules to enable, the theme to default to, etc. on installation. They can be quite convenient because there is a command-line tool called Drush that automates the building of installation profiles.
The downside is that the profiles are designed to be used on installation - not deployment of an individual module. It might be possible, however, to take the configuration code generated by Drush and call it when your module is enabled.
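As a rough illustration of the Drush side (the commands are standard Drush commands, but it assumes the make extension is available, and the file and module names are placeholders):

# Build a full Drupal tree from a profile's make file, pinning module versions
drush make profiles/mysite/mysite.make /var/www/mysite
# Enable the modules a freshly installed site needs, non-interactively
drush pm-enable views cck pathauto -y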
I know what you mean, it's a pain to set all modules up.
I'm sure you can investigate all 3rd party modules to see how configuration takes place and mimic that in your custom module, but I'd advise you against that...
The problem is that modules may change the way they store their settings from one revision to another, so whenever you update to a new version of any module you should do some reverse-engineering to see if your 'ultimate-one-click-configuration module' still works ok - which, if you ask me, is even more painful than manually configuring all modules for each project.
Just relax, take it easy, and enjoy Drupal :)
As the initialization is only required when Drupal is installed, I would think that an installation profile is the better solution; keeping a module that is no longer used once the installation is configured seems a little excessive, IMO.
Changing the installation profile used by a site, and making the new installation profile run its installation code, isn't something that Drupal allows out of the box. I would create a custom installation profile before creating the sites I need, and only for the features I know all the sites will share. For the other features, I would create separate custom modules that I can later install, and eventually uninstall when the features they implement aren't necessary anymore.