How to check if Drupal modules are up to date?

I am using a monitoring tool (Sensu) to execute multiple checks to know if a server has problems.
I have already written a Ruby script to check whether a WordPress install is up to date. To do that I connect through an SSH tunnel to the server, connect to its WordPress database, and parse some data from a table; for example, if response=latest, the core is up to date.
I want to do the same for Drupal, but I can't find any useful data in the Drupal database that tells me whether a module or the core is up to date; I only find the version number in the system table.
Do you have any idea how I can check whether Drupal modules are up to date, if possible from a server other than the one where Drupal is installed?
Thanks.
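For reference, the only version data available in a standard Drupal 6/7 database is what the system table exposes; a minimal sketch of querying it over the tunnel (host, port, and credentials are placeholders), where the release string sits inside the serialized info column:
# Sketch only: host/port/credentials are placeholders for the SSH-tunnelled connection.
# The release version of each module is buried in the serialized "info" column.
mysql -h 127.0.0.1 -P 3307 -u drupal -p drupal_db -e \
  "SELECT name, status, schema_version, info FROM system WHERE type = 'module' ORDER BY name;"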

There is a module called nagios (https://www.drupal.org/project/nagios) that will allow you to visit a "check page" and it will check the status of a number of different things that you can monitor.
One caution: if you are using a Drupal distribution, not all of its modules get updated in a timely fashion, but if you are using the standard Drupal installation you should be fine.

There is one Nagios plugin I found which does not require any modules to be installed into Drupal. However, it requires drush (http://www.drush.org) on the server that hosts the Drupal site:
https://github.com/cytopia/check_drupal
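If drush is on the Drupal host anyway, the same kind of check can be run remotely with nothing more than SSH; a minimal sketch, where the host name, user, and --root path are placeholders and a reasonably recent Drush provides pm-updatestatus:
# Sketch: ask drush for pending core/module updates from the monitoring server.
# Host, user and --root are placeholders; a wrapper script would turn lines such as
# "Update available" / "SECURITY UPDATE available" into a Sensu/Nagios status.
ssh monitor@drupal-host "drush --root=/var/www/drupal pm-updatestatus"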

Related

Install WordPress with its plugins using chef-solo

I have a WordPress website up and running with many plugins installed and a huge database. I need to use chef-solo to create an environment that can install the same website with all its plugins and also import its database.
I need it to be like using chef to install the same website on a different server, exactly the same.
Now here are my questions:
1. I know we can use chef to install WordPress, but can we set it up in a way that we don't need to configure WordPress and everything is already set once it's running?
2. What to do with the plugins? Can we install them using chef, or should that be done manually?
3. How about importing the database, can that be done with chef-solo as well?
4. The whole website is on git, can I somehow import the whole thing?
5. Is there any other issue I may possibly face if I want to do that?
There is a WordPress cookbook openly available for Chef.
By "configure", I take it you mean setting up data in the database. Assuming that you've separated the database instance from the server instance, and you're attempting to scale up the number of servers, then you should be able to skip data setup. You should be configuring the new server instance (node) to point to the same database via Chef.
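For context, the chef-solo invocation itself is small; a sketch, assuming the wordpress cookbook already sits under the configured cookbook_path and noting that the exact attribute names depend on which cookbook version you pin:
# Sketch: minimal chef-solo run. File contents shown in comments are illustrative only.
# solo.rb:    cookbook_path "/var/chef/cookbooks"
# node.json:  { "run_list": ["recipe[wordpress]"] }   (DB/site attributes depend on the cookbook)
chef-solo -c solo.rb -j node.json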
I stumbled into this question looking for the answer myself. From what I can tell, the start may be here.
It's kind of hand-wavy, but this should enable you to do some WordPress setup via the command line with Chef, rather than the point-and-click interface it prefers.
As per #1, you should not need to import the database. If the database goes down, you'll want to focus on that as a separate but connected recipe, since then you'll want to be taking snapshots and uploading them somewhere like S3 via a cron job. I believe there are plugins that can enable this.
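A sketch of that kind of snapshot cron job, assuming the AWS CLI is installed and using placeholder credentials, schedule, and bucket name:
# Sketch: nightly database snapshot pushed to S3; credentials, schedule and bucket are placeholders.
# crontab entry:  30 2 * * * /usr/local/bin/wp-db-backup.sh
mysqldump -u wp_user -p'secret' wordpress_db | gzip > /tmp/wordpress-$(date +%F).sql.gz
aws s3 cp /tmp/wordpress-$(date +%F).sql.gz s3://example-backup-bucket/wordpress/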
You'll have to be a little more clear by "import". If it's in a code base you may be able to short-cut your cookbook path by pulling down the git repo onto the host. You may want to look at git-archive.
Other issues that I'm looking at are images. We're migrating from a hosted solution to AWS, and it appears that instead of storing the images in the database, WordPress pulls them into a local directory. This means that if we scale to > 1 host, we'll have issues with images. Something to think about; there's a wealth of plugins that can probably solve this.
Hope this is helpful,
Ben

Updating multiple live drupal installations from a central template

I'm helping my university switch from Lenya to Drupal for their CMS. We plan on offering a drupal install to every department that wants one. The installs will all share the same codebase (the custom drupal "template" I'm developing now) but will each have their own database, allowing each site to have its own users, nodes, etc.
The problem I have is when, after making changes to the template, I'd like to update all of the installations. If the change is to core code or that of an installed module, for example, there's no problem since all installations are running off the same codebase. If, on the other hand, I need to make changes to the database, I'm screwed because there's a tonne of installed databases, and they're all different and need to be preserved. Even for simple changes like installing a new module, the module shows up fine on the list of installed modules, but I have to manually go into each installation and enable it by hand.
There must be an easier way! Is there some easy way (like a module I haven't heard about) to force Drupal databases to update certain tables from a master database? I'm thinking of something similar to the update.php script that I could invoke en masse from drush.
Thanks for the help, all!
You can try to use drush make for that. Check out this site http://drushmake.me/ to see an example of how it works. All you need is to install the drush script on your server (http://drupal.org/project/drush). Later you can build your own .make files and run them against each site's database.
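Once drush is on the server, the per-site database work (running pending schema updates, enabling a newly added module) can also be scripted across every installation; a sketch, where the site URIs and module name are placeholders:
# Sketch: apply schema updates and enable a new module on every departmental site.
# URIs and the module name are placeholders; with configured site aliases this
# can also be expressed as "drush @sites updatedb -y".
for site in dept-a.example.edu dept-b.example.edu dept-c.example.edu; do
  drush --root=/var/www/drupal --uri=$site updatedb -y
  drush --root=/var/www/drupal --uri=$site pm-enable new_module -y
done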

Installing third-party Drupal modules on Azure

I've just started playing around with the new "Website" feature in Azure that allows you to create websites with just one step - and also allows you to create websites from a "Gallery", including Drupal. And I can get my Drupal site up and running, no problem. But if I try to add a third-party module (for instance, Mindtree's ODataDrupal), then I get this error message:
Installation failed! See the log below for more information.
odata_support
Error installing / updating
File Transfer failed, reason: Cannot chmod /DWASFiles/Sites/theparentsunion/VirtualDirectory0/site/wwwroot/sites/all/modules/odata_support.
More-or-less the same thing happens if I try to update some of the existing modules (which Drupal warns, with big red flashing letters, are out of date), except then my Drupal install is left crippled, with no way to fix it that I've been able to find.
Is this as-designed, or some limitation of the beta website integration? (Because a Drupal installation is kinda worthless if you can't add new modules to it, or update existing ones.) Or am I doing something wrong?
If you are trying to use plugins and third-party modules with Drupal-based Windows Azure Websites, the results may vary from person to person. This is mainly because the kind of configuration a specific module or plugin needs may or may not be supported by the Windows Azure Websites model; not every kind of custom configuration will work on Windows Azure Websites, and in those cases you would need to move to Windows Azure Virtual Machines.
Regarding application-specific structure, you can open the website's FTP folder, and whatever you see there is user-configurable, so you can configure it the way you want. However, if your application tries to make changes outside of that limited scope, you will hit errors like the one above.
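As a workaround for the failing in-site installer, the module files can usually be unpacked locally and pushed over that FTP endpoint instead; a rough sketch, assuming lftp is available and using placeholder credentials, host name, and module tarball:
# Rough sketch: manual module deployment over the Azure Websites FTP endpoint.
# The tarball URL/version, FTP host and credentials are all placeholders from the publish profile.
wget https://ftp.drupal.org/files/projects/MODULE-7.x-1.0.tar.gz
tar -xzf MODULE-7.x-1.0.tar.gz
lftp -u 'sitename\deployuser,password' ftp://waws-prod-xx.ftp.azurewebsites.windows.net \
  -e "mirror -R MODULE /site/wwwroot/sites/all/modules/MODULE; quit"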
Here is a case study where an Azure VM was used for a Drupal-based migration, which shows that for complex applications you may need to use an Azure VM rather than Azure Websites.

How to know which Drupal modules are out of date?

I have a Drupal-based website with too many enabled modules, and I need to know which modules have updates available.
In other words, I need to update all of my modules to the latest release.
How can I get a list of all the out-of-date modules, and what is the best way to update them?
I think there should be a better way to update modules than downloading the latest release to /sites/all/modules and then running update.php.
Thanks for your help.
Actually, a manual update is the only solution (that I know for sure works in the way it is supposed to work) in Drupal 6.
You get a list of available updates from the admin/reports/updates page.
With Drupal 7, you have the ability to use admin/reports/updates/update and update the modules from there. I haven't had the chance to test that myself, but I assume that it should work OK. Still, Drupal core needs to be updated manually.
If you have SSH access to your website, consider installing Drush (http://drupal.org/project/drush) and using the command
drush pm-update
If all goes well, your site has the latest versions of all modules and Drupal itself upgraded in less than 5 minutes. Make sure to create a backup of the site beforehand, however (modules and database).
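A sketch of that sequence with a quick backup first (paths are placeholders; sql-dump and pm-update are standard drush commands):
# Sketch: back up code and database, then let drush update everything. Paths are placeholders.
cd /var/www/drupal
tar -czf $HOME/backups/site-code-$(date +%F).tar.gz .
drush sql-dump --result-file=$HOME/backups/site-db-$(date +%F).sql
drush pm-update -y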

Duplicate a Drupal installation from one server to another

I have been developing a Drupal 6 site on my PC using XAMPP. I'm done now, and everything looks peachy.
Problem is, I need to put all my content (including custom modules and themes) up onto a staging server which only has a fresh Drupal 6 install on it. I can't imagine having to set up all my custom content types and whatnot all over again on the staging server.
So I ask, how does one go about doing what I need to do? Which is essentially duplicating my Drupal install from my PC, to the staging server.
The staging server is running Linux, and I develop on a Windows PC, if that helps.
Thanks in advance.
Copy up all the files from development to live, and mysqldump your database and run that dump on the live server. Then all you have to do is change the settings.php file to point at the right database, if for some reason your MySQL database is not also at 'localhost'.
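A sketch of that move, with hosts, paths, and credentials as placeholders (on the Windows/XAMPP side this assumes an rsync-capable shell such as Cygwin or Git Bash; plain SFTP copying works too):
# Sketch: push the code, then pipe the database dump straight into the staging MySQL server.
# Hostnames, paths and credentials are placeholders.
rsync -avz --exclude=sites/default/settings.php /c/xampp/htdocs/drupal/ deploy@staging:/var/www/drupal/
mysqldump -u root -p drupal_dev | ssh deploy@staging "mysql -u drupal -p'secret' drupal_stage"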
The quickest solution is probably the backup_migrate module. It is only a tool to copy your database; you could also use phpMyAdmin or similar instead if you wanted. The backup_migrate module does have some good default settings as to which tables to skip (like cache tables). All the settings etc. that are not defined in code are stored in your db, so you only need to copy the db to be set. You can choose to exclude some tables, like the node or user table, if you don't want to bring over your test data.
If you don't use subversion, then you gotta manually copy the files (rsync, scp, whatever) and the db (mysqldump).
what we usually do is have a hierarchy of independent subversion repos as follows:
core
sites/all/modules/contributed
sites/all/modules/custom
sites/all/themes/ (we develop our own and don't use contributed themes)
sites/all/libraries
then we use the svn:externals property so that if you check out "core" you get every associated repo.
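A sketch of how the externals can be wired onto the core checkout (repository URLs are placeholders; run inside the core working copy and commit):
# Sketch: attach the sibling repos to the "core" working copy; URLs are placeholders.
svn propset svn:externals "
sites/all/modules/contributed http://svn.example.edu/drupal/contributed
sites/all/modules/custom http://svn.example.edu/drupal/custom
sites/all/themes http://svn.example.edu/drupal/themes
sites/all/libraries http://svn.example.edu/drupal/libraries" .
svn commit -m "wire up externals" .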
we have 2 main developers plus 4 other people who may also contribute code to the site. each has their own local dev environment, and we all share a common sandbox where we make sure the stuff we wrote doesn't break someone else's module (it has happened before!).
we use svn commit hooks to update the beta/staging/sandbox site upon commit.
with all that setup, [re]deploying a site is a simple matter of going to the proper folder and issuing a "svn co http://repolocation/reponame ." and then updating the DB.
two last things to consider:
we are moving from svn to git
the features module will allow you to save changes you make to your own modules (views, content types, etc) and package all that into a deployable module so you don't have to duplicate your efforts. we are also looking into using this for ourselves.
I hope this helps you.
I second using backup_migrate. It's great.
When I'm installing a fresh site from development to production, I:
backup the site using backup_migrate module
copy all the files up to the server
edit the sites/default/settings.php to have the right database path and account info
do an import of the last backup_migrate dump (usually using mysql < backupfilename.sql, unless I already have Drupal set up and backup_migrate installed, in which case I use the GUI)
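A sketch of those last two steps on a Drupal 6 site (database name, user, and password are placeholders):
# Sketch: point the copied site at the production database, then load the dump.
# In sites/default/settings.php (Drupal 6 syntax; credentials are placeholders):
#   $db_url = 'mysqli://drupal:secret@localhost/drupal_prod';
mysql -u drupal -p'secret' drupal_prod < backupfilename.sql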
But take a look here for the official version:
http://drupal.org/node/776864
Now, you didn't ask, but when the site is live and users are contributing content, moving future development versions of your site from development/staging to production without blowing away live content is a whole different problem, and one that Drupal doesn't have a good answer for...
Andy-
