Drupal to Drupal Migration Across Servers

I am in the process of migrating a D7 site from one server to another. I have successfully exported and uploaded the settings to the new site using Features, but I need to get the content over to the site as well. I've been looking at several modules to try and solve this problem, but I have not found anything suitable for this task. Please let me know if I am overlooking a really simple solution.
Thanks!
Mark

The easiest solution is to export a database dump and import it into your new server. You can do it with phpMyAdmin, but I recommend using Drush.
This way you can simply do a database dump via:
drush sql-dump > ~/sql-dump-file-name.sql
and later import via:
drush sql-cli < ~/sql-dump-file-name.sql
Also copy your files directory, located in sites/default/files, from the old server to the new one.
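If both servers are reachable over SSH, here is a minimal sketch of that copy using rsync (the paths and the user@host value are placeholders for your own setup):
# push sites/default/files from the old server to the new one,
# preserving permissions and timestamps (-a) and compressing in transit (-z)
rsync -az /var/www/oldsite/sites/default/files/ user@newserver:/var/www/newsite/sites/default/files/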

I've successfully used the Backup and Migrate module for these tasks. True, creating a dump and then spooling it into the other database works, but that typically also copies all the caches.
The backup_migrate module allows you to save backups on your server, but also to download them to your hard disk, from where you can upload them again to the other site.
A neat thing here is that you can exclude tables, such as cache tables, which makes the transfer much faster.
Obviously you need a core installation on the other end, and the backup_migrate module already installed for this to work, but I assume that since you only ask about the db, you must have mirrored the file structure already (excluding the settings files).
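If you end up using plain Drush anyway, you can get a similar cache-skipping effect there too. A rough sketch, assuming a reasonably recent Drush (the exact table list here is only an example, adjust it to your site):
# dump only the table structure (no rows) for cache, session and log tables
drush sql-dump --structure-tables-list="cache,cache_*,history,sessions,watchdog" --gzip > ~/site-dump.sql.gz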

Related

Install WordPress with its plugins using chef-solo

I have a WordPress website up and running with many plugins installed on it and a huge database. I need to use chef-solo to create an environment in which I can install the same website with all its plugins and also import its database.
In other words, I need to use Chef to install the same website on a different server, exactly the same.
Now here are my questions:
1. I know we can use Chef to install WordPress, but can we set it up in a way that we don't need to configure WordPress at all, and everything is already set once it's running?
2. What to do with the plugins? Can we install them using Chef, or does that have to be done manually?
3. How about importing the database, can that be done with chef-solo as well?
4. The whole website is on git, can I somehow import the whole thing?
5. Are there any other issues I may face if I want to do that?
There is a WordPress cookbook openly available for Chef.
By "configure", I take it you mean setting up data in the database. Assuming that you've separated the database instance from the server instance, and you're attempting to scale up the number of servers, then you should be able to skip data setup. You should be configuring the new server instance (node) to point to the same database via Chef.
I stumbled onto this question while looking for the answer myself. From what I can tell, that cookbook is the place to start.
It's kind of hand-wavy, but it should enable you to do some WordPress setup via the command line with Chef, rather than the point-and-click it prefers.
As per #1, you should not need to import the database. If the database goes down, you'll want to handle that as a separate but connected recipe, since you'll want to be taking snapshots and uploading them somewhere like S3 via a cron job. I believe there are plugins that can enable this.
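As a rough sketch of that kind of snapshot job (the bucket name, credentials, database name and schedule are all placeholders, and it assumes the AWS CLI is installed; a backup plugin may do this for you instead):
# crontab entry: nightly dump at 02:00, gzipped and pushed to S3
0 2 * * * mysqldump -u wpuser -p'secret' wordpress | gzip > /tmp/wp.sql.gz && aws s3 cp /tmp/wp.sql.gz s3://example-backups/wordpress/wp.sql.gz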
You'll have to be a little clearer about what you mean by "import". If it's in a code base, you may be able to shortcut your cookbook path by pulling the git repo down onto the host. You may want to look at git-archive.
Other issues that I'm looking at are images. We're migrating from a hosted solution to AWS, and it appears that instead of storing the images in the database, WordPress pulls them into a local directory. This means that if we scale to more than one host, we'll have issues with images. Something to think about; there's a wealth of plugins that can probably solve this.
Hope this is helpful,
Ben

Set up for working offline and uploading to remote site

I am currently using Komodo Edit for coding, and have a setup where I am using MAMP, a local install of Drupal, and SASS to build my site offline.
Once it's ready to test online, I upload it to the remote site. However, I then end up working sometimes on the remote site and sometimes on the local one, and running into problems.
I'm not using SASS on the remote site, so I am working in the CSS file. I don't have all the Drupal panels in code yet, so I'm having to rebuild them on the remote site, and I'm making tweaks and changes on the fly.
I end up with two slightly different versions of the site and need to keep track of the changes I want to keep from both. What can I do to clean up my workflow?
It would be better if I could work entirely locally and then sync that to the remote environment.
With something like NetBeans I think I could have a local copy of the site running and then right-click and upload each file onto the remote server, so there are two copies of the file.
I could do with some advice as to what the cleanest set up is.
I have an actual dev server with its own IP and a live server with its own IP, but they both connect to the same MySQL server. There are two databases set up, db1 and db2.
I use a PHP script with basic SQL instructions to:
Check which db is in use.
Back up that db (say, db1).
Sync the databases.
Import that db into db2.
Once that is done, I use this:
cp webdb2.settings.php settings.php
So I always have two databases available and can roll right back (in this case with:
cp webdb1.settings.php settings.php) if something went wrong.
This seems to be a pretty good system. I only ever work on the dev, and then push it to the live with the process above.
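A hedged shell equivalent of that cycle (the database names and settings file names follow the scheme above; the MySQL user and credentials are placeholders):
# back up the live database (db1), then load that dump into db2
mysqldump -u dbuser -p db1 > ~/db1-backup.sql
mysql -u dbuser -p db2 < ~/db1-backup.sql
# switch the site to db2; to roll back, copy webdb1.settings.php back instead
cp webdb2.settings.php settings.php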
The best thing that you can do is work offline and then sync the content (themes/modules) when you have to.
If you work on both, you could lose data or lose track of code changes.

Drupal Backup and Restoring

So I wrote this script that basically creates an SQL dump of the Drupal databases as well as a tar of the www directory. I took these off the server and put them on my local machine. I want to take these backup files and test whether the backup is stable, as well as learn the process.
My problem is that I can't find any clear instructions on how I would be able to do this. Can anyone give me a hand?
Any help is much appreciated.
You need to have a LAMP stack installed on your local machine. In addition, you'll need to modify the settings.php file to change the database connection strings to match your local environment. You may also need to modify the $base_url variable in settings.php.
This would not be necessary if you were simply restoring in place, but since you're moving the install, it is required.
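A rough sketch of that restore on a local LAMP stack (the database name, credentials and backup file names are placeholders for whatever your script produced):
# unpack the www tarball into the local web root
tar -xf www-backup.tar -C /var/www/
# create an empty database and load the dump into it
mysql -u root -p -e "CREATE DATABASE drupal_local"
mysql -u root -p drupal_local < drupal-dump.sql
# then point settings.php at drupal_local ($db_url in Drupal 6, $databases in Drupal 7)
# and adjust $base_url if needed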

Duplicate a Drupal installation from one server to another

I have been developing a Drupal 6 site on my PC using XAMPP. I'm done now, and everything looks peachy.
Problem is, I need to put all my content (including custom modules and themes) up onto a staging server which only has a fresh Drupal 6 install on it. I can't imagine having to set up all my custom content types and whatnot all over again on the staging server.
So I ask, how does one go about doing what I need to do? Which is essentially duplicating my Drupal install from my PC, to the staging server.
The staging server is running Linux, and I develop on a Windows PC, if that helps.
Thanks in advance.
Copy up all the files from development to live, mysqldump your database, and run that dump on the live server. Then all you have to do is change the settings.php file to point at the right database, if for some reason 'localhost' is not also your MySQL database host.
The quickest solution is probably the backup_migrate module. It is only a tool for copying your database; you could also use phpMyAdmin or similar instead if you wanted. The backup_migrate module does have some good default settings as to which tables to skip (like cache tables). All the settings etc. that are not defined in code are stored in your db, so you only need to copy the db to be set up. You can choose to exclude some tables, like the node or user table, if you don't want to bring over your test data.
If you don't use Subversion, then you'll have to manually copy the files (rsync, scp, whatever) and the db (mysqldump).
What we usually do is keep a hierarchy of independent Subversion repos as follows:
core
sites/all/modules/contributed
sites/all/modules/custom
sites/all/themes/ (we develop our own and don't use contributed themes)
sites/all/libraries
Then we use the svn:externals property so that if you check out "core" you get every associated repo.
We have about 2 main developers, with 4 other guys who may also contribute code to the site. Each has their own local dev environment, and we all have a common sandbox where we make sure the stuff we wrote doesn't break someone else's module (it has happened before!).
We use svn commit hooks to update the beta/staging/sandbox site upon commit.
With all that set up, [re]deploying a site is a simple matter of going to the proper folder, issuing a "svn co http://repolocation/reponame .", and then updating the DB.
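For reference, a sketch of wiring up one of those externals (the repository URL is a placeholder, and the property syntax differs slightly between Subversion versions):
# have "contributed" pulled into sites/all/modules on every checkout of core
svn propset svn:externals "contributed http://svn.example.com/drupal-contrib/trunk" sites/all/modules
svn commit -m "add contrib external" sites/all/modules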
Two last things to consider:
We are moving from svn to git.
The Features module will allow you to save changes you make to your own modules (views, content types, etc.) and package all that into a deployable module so you don't have to duplicate your efforts. We are also looking into using this for ourselves.
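A sketch of that Features round trip with Drush (the feature name is made up, and the exact command names depend on your Features and Drush versions, so check drush help):
# re-export changed views/content types into the existing feature module
drush features-update my_site_config
# on the target site, enable the feature and pull the database in line with the exported code
drush pm-enable my_site_config
drush features-revert my_site_config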
I hope this helps you.
I second using backup_migrate. It's great.
When I'm installing a fresh site from development to production, I:
backup the site using backup_migrate module
copy all the files up to the server
edit the sites/default/settings.php to have the right database path and account info
do an import of the last backup_migrate dump (usually using mysql < backupfilename.sql, unless I already have Drupal set up and backup_migrate installed, in which case I use the GUI)
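For that import step, a hedged example (the database name, user and file name are placeholders; if you chose gzip compression in backup_migrate, unpack the dump first):
gunzip sitename-backup.mysql.gz
mysql -u dbuser -p drupal_db < sitename-backup.mysql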
But take a look here for the official version:
http://drupal.org/node/776864
Now, you didn't ask, but when the site is live and users are contributing content, moving future development versions of your site from development/staging to production without blowing away live content is a whole different problem, and one that Drupal doesn't have a good answer for...
Andy-

Moving Raw MYSQL Data Files to a Different Directory

I have a directory-only backup of a previous server that hosted multiple sites. I had access to a few .sql backups for our databases, but there were some that had not been backed up in that fashion. I located the .MYD, .frm, and .MYI files for the tables in my db in the var/lib/mysql/db_name directory.
I would like to know if there is a way to get the data from these files and move them into the new existing MySQL installation. I tried copying the files from the db folder into a db folder with the exact same name, but I still get a "Can't find db_name/table_name.frm" error when trying to access any of the tables. They do show up in the phpMyAdmin table list; the error comes up when trying to access the tables.
Is this possible? If so, how do I go about taking these table files and turning them into usable data?
I am sorry if my question or explanation doesn't make any sense. This is part of an ongoing 11+ hour emergency server recovery project that got sprung on me today, so my brain is fried. I will answer any questions necessary.
Have you checked that the files in /var/lib/mysql/db_name/ are owned by mysql and not root? Usually copying the files in should 'just work' (certainly it has and does for me). I assume you're using the same or very similar version of MySQL?
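A hedged sketch of those checks (paths assume a stock Linux MySQL layout; db_name stands in for your database directory):
# the table files must be owned and readable by the mysql user
ls -l /var/lib/mysql/db_name/
chown -R mysql:mysql /var/lib/mysql/db_name/
# then flush the table cache and check the MyISAM tables
mysql -u root -p -e "FLUSH TABLES"
mysqlcheck -u root -p db_name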
You really need to restore everything relating to that database, including the matching 'mysql' database from your previous installation.
