As a typical 'integrator' programmer customising Plone, what should I know about the ZMI to help me code more effectively? What are the settings, tools, pitfalls, shortcuts and dark corners that will save me time and help me write better code?
Edit: take it as read that I am coding on the filesystem, using GenericSetup profiles to make settings changes. I know making changes in the ZMI is a bad idea and generally steer clear. But occasionally the ZMI sure is useful: for inspecting a workflow, or examining a content item's permissions, or installing only one part of a profile via portal_setup. Is there really nothing worth knowing about the ZMI? Or are there other useful little tidbits in there?
There are a few places in the ZMI that I find myself returning to for diagnostic information (a rough debug-prompt equivalent for some of these is sketched after the list):
/Control_Panel/Database: Select a ZODB mountpoint. Cache Parameters tab shows how much of your designated ZODB cache size has been used. Activity tab shows how many objects are being loaded to cache and written over time.
/Control_Panel/DebugInfo/manage: Lots of info, including showing what request each thread is serving at the current moment. The 'Cache detail' and 'Cache extreme detail' links give info on what classes of objects are currently in the ZODB cache.
Components tab of the Plone site root: Quick way to see what local adapters and utilities are registered. DON'T HIT THE APPLY BUTTON!
Undo tab of most objects: See who has committed transactions affecting the object lately.
Security tab: See what permissions are actually in effect for an object. You really don't want to change permissions here 90% of the time; it's too hard to keep track of where permissions are set and they are liable to be reset by workflow. Use the Sharing tab in the Plone UI to assign local roles instead. (The one exception is that I often find it handy to enable the add permission for a particular type in specific contexts.) In Zope 2.12, there is a new feature on this tab to enter a username and see what permissions and roles would be in effect for that user, which is handy.
Catalog tab of portal_catalog: See what index data and metadata is stored for a particular path. (Can also remove bogus entries from the index.)
Index tab of portal_catalog: Select an index, then click its Browse tab to get an overview of what keys are indexed and which items are associated with each key.
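Several of these checks can also be done from a Zope debug prompt when the ZMI is inconvenient. A minimal, Plone 3/4-era (Python 2) sketch, assuming a buildout instance started with bin/instance debug, a site with id Plone, and a content item at front-page (all assumptions, adjust to your site):

# Inside `bin/instance debug`; `app` is the Zope application root
from Products.CMFCore.utils import getToolByName
portal = app.Plone  # assumed site id

# ZODB cache numbers, roughly what /Control_Panel/Database shows
db = portal._p_jar.db()
print 'configured cache size:', db.getCacheSize()
print 'objects currently cached:', db.cacheSize()

# Permissions in effect on one object, roughly the Security tab
doc = portal.unrestrictedTraverse('front-page')  # assumed content path
print doc.rolesOfPermission('Modify portal content')

# Index data and metadata for a path, as on the Catalog tab of portal_catalog
catalog = getToolByName(portal, 'portal_catalog')
rid = catalog.getrid('/Plone/front-page')
print catalog.getIndexDataForRID(rid)
print catalog.getMetadataForRID(rid)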
The key thing to know is that while many ZMI tools provide quick, through-the-web customization, the customizations that you make this way are hard to export out of the database. So, they don't move easily from development to production environments or from one deployment to another.
Ideally, a new developer should use the ZMI to explore and find points of intervention, then learn how to implement the same changes in policy add-ons (products), which move from one deployment to another much more reproducibly.
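For example, a setting prototyped in the ZMI can be moved into a GenericSetup setup handler in a policy package. This is only a sketch: the package name my.policy, the property and the workflow chain are illustrative assumptions, and the handler still has to be registered in the profile's ZCML.

# setuphandlers.py in a hypothetical my.policy package (Python 2 era)
from Products.CMFCore.utils import getToolByName

def setup_site_policy(context):
    """Re-apply, reproducibly, what was prototyped in the ZMI."""
    portal = context.getSite()

    # A site_properties tweak instead of editing portal_properties in the ZMI
    site_props = getToolByName(portal, 'portal_properties').site_properties
    site_props.manage_changeProperties(enable_livesearch=False)

    # Assign a workflow chain instead of clicking through portal_workflow
    workflow = getToolByName(portal, 'portal_workflow')
    workflow.setChainForPortalTypes(('Document',), 'simple_publication_workflow')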
If you want to write code for Plone, it's best to avoid the ZMI. Doing things through the ZMI is very limited and discouraged; more and more functionality is no longer available there, and it will go away at some point.
The actual Plone control panels offer you most of the configuration options you can use. For anything else the file system is the best place to look.
I agree with the other posters that you shouldn't configure too much via the ZMI, as it's not in version control and you can easily lose track of the changes.
But the ZMI is still very useful for debugging and to see specific site configurations.
Here are some tools in the ZMI that I regularly consult (a couple of debug-prompt sketches follow the list):
portal_javascripts: To turn debugging on or off, and to check which scripts are registered, what their conditions for rendering are, and whether they are found.
portal_css: Basically the same as portal_javascripts, but for stylesheets.
portal_types: To see what a type's properties are. Can it be created globally? What types can you create inside it? What is its default view? Etc.
portal_catalog: What indexes are there? What metadata is in the catalog? You can clear and rebuild the catalog and even browse it.
portal_workflow: What states/transitions/permissions are there in a certain workflow? What workflow is active on a certain type?
portal_properties/site_properties: View and set site-wide properties. A lot of these settings are in the plone_control_panel (i.e. outside of the ZMI), but here they are on one page and the ZMI is quicker to navigate.
portal_skins: See which skins folders are installed and the ordering of the skin layers (via the Properties tab). You can also edit the templates, stylesheets and javascripts in the skins directories. Not recommended! But useful for debugging.
portal_setup: Some very big and complex Plone websites can break if you just willy-nilly add/remove/reinstall add-ons. It is often safer to run just a specific GenericSetup step. For example, if you have added a new portlet, import only the specific (portlets.xml) step via portal_setup's Import tab rather than reinstalling the whole product (see the second sketch after this list).
portal_actions: Configure which actions are visible/present.
portal_quickinstaller: Quickly reinstall or uninstall add-ons. Often quicker and more lightweight than loading the Plone Control Panel equivalent.
acl_users: Sometimes when using an add-on like LDAPUserFolder, you'll have to dig around in acl_users to configure and test it. You can also create users here, although it's better to do this via the Plone Control Panel (i.e. not in the ZMI).
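Most of those questions can also be answered from a debug prompt or a small script, which is sometimes faster than clicking through the ZMI. A rough Plone 3/4-era (Python 2) sketch; the site id Plone, the Document type and the property name are assumptions:

# Inside `bin/instance debug`; `app` is the Zope application root
from Products.CMFCore.utils import getToolByName
portal = app.Plone  # assumed site id

types_tool = getToolByName(portal, 'portal_types')
print types_tool.Document.global_allow      # can it be created globally?

workflow = getToolByName(portal, 'portal_workflow')
print workflow.getChainFor('Document')      # active workflow chain for a type

catalog = getToolByName(portal, 'portal_catalog')
print catalog.indexes()                     # index names
print catalog.schema()                      # metadata columns

site_props = getToolByName(portal, 'portal_properties').site_properties
print site_props.getProperty('enable_livesearch')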
There are many more tools and things to tweak (and break your site with) in the ZMI, but the above ones are what I use 90% of the time.
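The portal_setup advice above, running a single import step instead of reinstalling a whole add-on, can be scripted too. A hedged sketch: my.addon and its profile name are made up, and the step id must match one listed on portal_setup's Import tab ('typeinfo' is a standard step; the portlets step may have a different id, so read it off the Import tab first).

# Inside `bin/instance debug`; `app` is the Zope application root
import transaction
from Products.CMFCore.utils import getToolByName

portal = app.Plone  # assumed site id
setup = getToolByName(portal, 'portal_setup')

# Re-run just one step of a (hypothetical) add-on profile
setup.runImportStepFromProfile('profile-my.addon:default', 'typeinfo',
                               run_dependencies=False)
transaction.commit()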
The portal_historiesstorage tool can eat a lot of disk space. Any content type set to save revisions saves them here, and by default Plone keeps all revisions (see the portal_purgepolicy tool).
I want all revisions on the production Data.fs, but after taking a copy for development the first thing I do is purge portal_historiesstorage. The procedure is:
Go to your Plone site in the ZMI
Delete the portal_historiesstorage tool
Go to portal_setup, Import tab
Under 'Select Profile or Snapshot' choose 'CMFEditions'
Select the step with handler Products.GenericSetup.tool.importToolset
Uncheck 'Include dependencies?'
Hit 'Import selected steps' to re-add portal_historiesstorage
Pack the Data.fs and delete the resulting Data.fs.old from the filesystem
On my 3G Data.fs, this little sequence removes 2.5G!
I have only ever done this on a development Data.fs. Without advice from someone who really knows, I don't recommend doing this on your production site.
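For reference, the same sequence can be scripted from a debug prompt against a development copy. A rough, untested sketch; the CMFEditions profile id and the 'toolset' step id are my best guesses at what the Import tab shows, so double-check them there first, and again, only run this on a development Data.fs:

# Inside `bin/instance debug` against a development Data.fs only
import transaction
from Products.CMFCore.utils import getToolByName

portal = app.Plone  # assumed site id

# 1. Delete the tool (as in the ZMI)
portal.manage_delObjects(['portal_historiesstorage'])

# 2. Re-run only the toolset step of the CMFEditions profile to recreate it
setup = getToolByName(portal, 'portal_setup')
setup.runImportStepFromProfile('profile-Products.CMFEditions:CMFEditions',
                               'toolset', run_dependencies=False)
transaction.commit()

# 3. Pack the database (equivalent to packing from the ZMI control panel),
#    then delete the resulting Data.fs.old from the filesystem as above
app._p_jar.db().pack(days=0)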
There is usually no reason for an integrator or a developer to touch the ZMI other than for occasional maintenance tasks. Almost any customization can be done using Python or a GenericSetup profile. The advantages of profiles are repeatability, being able to maintain them on the filesystem, and being able to put the files under revision control.
Being able to work and configure things through the ZMI partly works against Plone, especially when Plone is doing extra work under the hood. So the only recommendation can be: stay out of the ZMI if you can. The ZMI is not a suitable replacement for the Plone UI and should only be touched if you really know what you are doing.
Yep, the ZMI is for the occasional maintenance task or, when pressed, a quick-and-dirty CSS or template tweak. It's not meant for any real "coding" work, and in the context of Plone is best thought of as an odd and minimally useful leftover from Zope history.
portal_actions is also useful for more flexible top-level navigation, but again it is best configured via GenericSetup.
Related
In my new project we are going to use Alfresco as the back end and Angular as the front end, so we wish to remove/disable Share completely if possible. I read a bit on the internet and some people just removed the share.war file. Is this safe? Is it the correct way of doing this? Will any errors appear in the future because of this?
Yes, you can just remove it. You will, of course, not have the fancy front end. But if you are just using it for back-end work, it will be fine. There are no dependencies, and you should get no errors.
Yes, you can definitely remove share.war, as it is completely separate from alfresco.war. It won't give you any errors. Refer to this.
As said above, yes, you can live with only the alfresco.war installed. Where I work we don't use Share; we only use the repo via its API.
But keep in mind that if you have a recent version of Alfresco, you no longer have access to the repo UI, and the Share UI gives you access to a lot of repo configuration you can change without a restart. I would keep Share, disable all of its services (the properties in alfresco-global that enable services for Share, like thumbnail/preview generation, the SWF transformers, the activity feed, etc.), and keep it private to your admins.
Usually I have to activate my available add-ons in order to make them work. But I find that collective.geo.behaviour and my custom transmogrifier package seem to work well without activating them. This makes me wonder what the trick is behind the scenes. Will something go wrong if I keep using these add-ons without activating them?
Python packages that are installed for use in your Plone environment show up in your add-on list because they have GenericSetup profiles for addition to a Plone environment. Usually these profiles do things like set browser layers, add skin layers, add types or set up the catalog. They can also specify that the GenericSetup profile for some other add-on(s) should be run when this package is installed.
The two cases you mention here have different things going on:
crgis transmogrifier has a GS profile but, as far as I can tell from examining its repository, does not need one. Its GS profile does nothing, so the install add-on choice will do nothing. Drop a note to the add-on author and tell them that.
Collective Geo Behavior's GS profile does nothing but specify that a couple of other add-on GS profiles be run. If you have already done the add-on installation for those, then this step does nothing. But, don't rely on that fact for future installations.
I think it's because these profiles don't do anything other than declare dependencies, and the code & ZCML are loaded at startup. So as long as you have already imported the dependencies listed in the profile, you should be fine.
Also, I think you could have packages without a profile at all, if you don't have dependencies or anything to register.
My team uses TFS for source control and continuous integration. I'd like to come up with a nice, clean way to show release notes to end users each time we deploy. I'm curious what others are doing to manage release notes in an ASP.NET / TFS environment.
I put together a basic release notes report (for TFS2008) that you may find useful. Not sure if it's what you're after, but it works fairly well for me. You can always take it and do what you want with it to make it more suited to your environment and needs.
Well, I typically keep documents like that as part of the solution under source control, so the document is versioned and tracked. From there, there are several options: bundle it with the release (attach a link to the file in one of the projects and select "Copy Local" = true), or embed it into one of the projects and use it in a popup; the latter can be done with the installer project or as part of the "About" dialog. Or do both.
When I create a new Drupal site I usually end up with at least one custom module and several community contributed modules. To get the site working as it should, many configuration values need to be set on the various modules. This makes deployment onto a fresh Drupal instance painstaking and error-prone.
I would like to give my custom module the ability to configure all the other modules. Either on install or on the click of a button on my custom module's administration page, all the necessary configuration values on the other modules would be programmatically set.
How would I best go about doing this?
AFAIK, there's no way to achieve what you mean easily. I tend to put as much as I can in hook_update_N() implementations and do frequent DB synchronisations as described in my answer to this question. However, that does not work once you already have a live server whose data you will have to merge.
For that purpose, I use various tools according to the need. None of them is perfect, but here is a small collection of my favorite ones:
Features. This is a new concept and a new module. The idea is pretty awesome: it allows you to define a set of configuration/modules/settings and export them as a feature. That feature is then installed as if it were a module on the target site. This module does not export every possible setting, but it does a good job with the modules that need the most configuration, such as CCK, Views, ImageCache and others. You can see a screencast demo (~10 mins) here.
Backup and migrate. This is a more radical approach: it simply dumps and rebuilds the entire database on a target system. It is good only if you need to overwrite the target system completely.
Node export. This allows you to export (and import) nodes from one Drupal installation to another. It supports bulk operations but, unfortunately, it does not support the migration of attached files and images.
Deploy. Because of the limitations of Node export I once looked into using this module (still in development). In the end I did not, and preferred to merge the production and staging databases, but the concept seems very valid, as it allows you to import/export complex data types via SOAP.
Taxonomy import/export. I suppose the name is self-explanatory. It uses files to achieve the tasks (XML or CSV).
Installation profiles (suggested by ctford) are useful when configuring new sites. They allow you to specify which modules to enable, which theme to default to, etc., on installation. They can be quite convenient because there is a command-line tool called Drush that automates the building of installation profiles. The downside is that the profiles are designed to be used on installation, not for deployment of an individual module. It might be possible, however, to take the configuration code generated by Drush and call it when your module is enabled.
Finally, you can find a collection of tools for importing/exporting data here.
HTH!
Have you looked at the "Features" module? It is a new paradigm introduced as part of the Open Atrium distribution, but it is also available as a stand-alone module. From their description:
"The features module enables the capture and management of features in Drupal. A feature is a collection of Drupal entities which taken together satisfy a certain use-case.
Features provides a UI and API for taking different site building components from modules with exportables and bundling them together in a single feature module. A feature module is like any other Drupal module except that it declares its components (e.g. views, contexts, CCK fields, etc.) in its .info file so that it can be checked, updated, or reverted programmatically."
http://drupal.org/project/features
Installation profiles are useful when configuring new sites. They allow you to specify which modules to enable, which theme to default to, etc., on installation. They can be quite convenient because there is a command-line tool called Drush that automates the building of installation profiles.
The downside is that the profiles are designed to be used on installation, not for deployment of an individual module. It might be possible, however, to take the configuration code generated by Drush and call it when your module is enabled.
I know what you mean, it's a pain to set all modules up.
I'm sure you can investigate all 3rd party modules to see how configuration takes place and mimic that in your custom module, but I'd advise you against that...
The problem is that modules may change the way they store their settings from one revision to another, so whenever you update to a new version of any module you should do some reverse-engineering to see if your 'ultimate-one-click-configuration module' still works ok - which, if you ask me, is even more painful than manually configuring all modules for each project.
Just relax, take it easy, and enjoy Drupal :)
As the initialization is only required when Drupal is installed, I would think that an installation profile is the better solution; keeping a module around that is no longer used once the installation is configured seems a little excessive, IMO.
Changing the installation profile used by a site, and making the new installation profile run its installation code, isn't something Drupal allows out of the box. I would create a custom installation profile before creating the sites I need, and only for the features I know all the sites will share. For the other features, I would create separate custom modules that I can install later, and eventually uninstall when the features they implement are no longer necessary.
Setting up a Flex project for group development can be a bit tricky. There are lots of little local settings that might need to be tweaked in order to have a project that can be easily checked out.
I've had limited success using the built-in import/export flex project utilities. I seem to wind up editing by hand a lot and I think I might be missing something.
UPDATE
I neglected to mention originally that my goal is to make it possible to checkout a project from subversion and get up and running with as little fuss as possible. The biggest problems that I have run into all revolve around managing the "dot" files and how to make them flexible enough to deal with different developer environments.
For example, even with just me, I would like to have this ability: at work I use a Vista machine, and at home I use a Mac. There are certainly differences in the way certain paths are described, but they really are quite similar. On Vista, the Flex root is c:/ColdFusion8/wwwroot; on OS X, it is /Applications/ColdFusion8. I have been able to set up a linked resource path variable for both CF_FLEX_SERVER and WEBSERVER that I then reference using the ${WEBSERVER}/myProject syntax.
So far, it seems to work pretty well, but I find there are a few places that it still has issues. Specifically, in the .project file you find something like:
<linkedResources>
<link>
<name>bin-debug</name>
<type>2</type>
<location>c:/inetpub/wwwroot/myProject-debug</location>
</link>
</linkedResources>
Unfortunately, if I try to change the location entity to ${WEBSERVER}/wwwroot/myProject-debug, flex throws a compiler error. That's a shame, because pretty much everything else works.
I have worked through this problem before and generally set my projects up as such:
Application/trunk/source/ <-- workspace is here (can also be in 'trunk')
Application/trunk/source/Application <-- Application here
I DO keep my project (.actionScriptProperties, .flexProperties, .project, .settings) in SVN, but NOT my workspace (.metadata) because it's too big.
I find that importing projects via Import -> Flex Project enforces a lot of restrictions. For example, if your workspace was in the 'trunk' directory above, then importing as a Flex Project will cause the project to be copied into trunk/Application or simply complain about the location.
The better way to go about it is to create the workspace and then Import -> [General] Existing Projects into Workspace. The only difference is that you will have to manually add the Flex Development perspective.
Edit: I'd also recommend setting your compiler options to "Use default SDK" and then setting the appropriate SDK as default. This will prevent commit-tennis when each developer names his SDK differently.
Since Flex Builder is written on top of Eclipse, it can integrate with Subclipse. This allows you to pretty easily tag files as 'SVN ignore' to avoid project-specific settings. I've used this to add my Flex projects to an existing SVN repository, which I've checked out to multiple sites. I have noticed a few issues here and there (some checkins get errors, but they're relatively rare), but it generally works.