I have been developing a Drupal 6 site on my PC using XAMPP. I'm done now, and everything looks peachy.
Problem is, I need to put all my content (including custom modules and themes) up onto a staging server which only has a fresh Drupal 6 install on it. I can't imagine having to set up all my custom content types and whatnot all over again on the staging server.
So I ask: how does one go about duplicating my Drupal install from my PC to the staging server?
The staging server is running Linux, and I develop on a Windows PC, if that helps.
Thanks in advance.
Copy up all the files from development to live, and mysqldump your database and run that dump on the live server. Then all you have to do is change the settings.php file to point at the right database, if for some reason the live database isn't also at 'localhost'.
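A minimal sketch of that flow, assuming SSH access and a unix-y shell (the hostnames, paths, and database names below are hypothetical):

rsync -avz /path/to/drupal/ user@live.example.com:/var/www/drupal/   # copy the files up
mysqldump -u devuser -p drupal_dev > site.sql                        # dump the dev database
scp site.sql user@live.example.com:/tmp/
ssh -t user@live.example.com 'mysql -u liveuser -p drupal_live < /tmp/site.sql'  # load it on live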
The quickest solution is probably the backup_migrate module. It is only a tool to copy your database; you could also use phpMyAdmin or similar instead if you wanted, but backup_migrate does have some good default settings as to which tables to skip (like cache tables). All the settings etc. that are not defined in code are stored in your db, so you only need to copy the db to be set. You can choose to exclude some tables, like the node or user table, if you don't want to bring over your test data.
If you don't use subversion, then you gotta manually copy the files (rsync, scp, whatever) and the db (mysqldump).
what we usually do is have a hierarchy of independent subversion repos as follows:
core
sites/all/modules/contributed
sites/all/modules/custom
sites/all/themes/ (we develop our own and don't use contributed themes)
sites/all/libraries
then we use the svn:externals property so that if you check out "core" you get every associated repo.
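for example, with hypothetical repository URLs, the externals can be wired up like this (old-style "dir URL" format, one entry per line):

svn propset svn:externals 'sites/all/modules/contributed http://svn.example.com/contributed/trunk
sites/all/modules/custom http://svn.example.com/custom/trunk
sites/all/themes http://svn.example.com/themes/trunk
sites/all/libraries http://svn.example.com/libraries/trunk' .
svn commit -m "wire up externals" && svn update   # the update pulls in the externals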
we have about 2 main developers, with 4 other guys who may also contribute code to the site. each has their own local dev environment, and we all share a common sandbox - where we make sure the stuff we wrote doesn't break someone else's module (it has happened before!).
we use svn commit hooks to update the beta/staging/sandbox site upon commit.
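a bare-bones sketch of such a hook (hypothetical paths; the script lives in the repository's hooks/ directory on the svn server and must be executable):

#!/bin/sh
# hooks/post-commit -- svn passes the repository path and new revision as arguments
REPOS="$1"
REV="$2"
# update the sandbox working copy and log the output for debugging
svn update /var/www/sandbox --non-interactive >> /var/log/svn-deploy.log 2>&1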
with all that setup, [re]deploying a site is a simple matter of going to the proper folder and issuing a "svn co http://repolocation/reponame ." and then updating the DB.
two last things to consider:
we are moving from svn to git
the features module will allow you to save changes you make to your own modules (views, content types, etc.) and package all that into a deployable module so you don't have to duplicate your efforts. we are also looking into using this for ourselves.
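as a hedged sketch of that features workflow (assumes the features module plus drush; the feature and component names are made up):

drush features-export my_feature node:article views:frontpage_listing   # package a content type and a view
drush -y pm-enable my_feature                                            # enable the packaged feature on the target site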
I hope this helps you.
I second using backup_migrate. It's great.
When I'm installing a fresh site from development to production, I:
backup the site using backup_migrate module
copy all the files up to the server
edit the sites/default/settings.php to have the right database path and account info
do an import of the last backup_migrate dump (usually using mysql < backupfilename.sql, unless I already have Drupal set up and backup_migrate installed, in which case I use the GUI)
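The import step, for the record (database and file names are hypothetical; backup_migrate typically gzips its dumps):

gunzip backupfilename.mysql.gz
mysql -u dbuser -p drupal_db < backupfilename.mysql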
But take a look here for the official version:
http://drupal.org/node/776864
Now, you didn't ask, but when the site is live and users are contributing content, moving future development versions of your site from development/staging to production without blowing away live content is a whole different problem, and one that Drupal doesn't have a good answer for...
Andy-
Related
I have a WordPress website up and running with many plugins installed on it and a huge database. I need to use chef-solo to create an environment in which I can install the same website with all its plugins and also import its database.
I need to use Chef to install the same website on a different server, exactly the same.
Now here are my questions:
1. I know we can use Chef to install WordPress, but can we set it up in a way that we don't need to configure WordPress, and everything is already set once it's running?
2. What to do with the plugins? Can we install them using Chef, or should that be done manually?
3. How about importing the database? Can that be done with chef-solo as well?
4. The whole website is on git; can I somehow import the whole thing?
5. Is there any other issue I might face if I want to do that?
There is a WordPress cookbook openly available for Chef.
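As a rough sketch of wiring it into chef-solo (the cookbook and recipe names are assumptions; check the cookbook's README for its actual attributes):

cat > node.json <<'EOF'
{ "run_list": [ "recipe[wordpress]" ] }
EOF
chef-solo -c solo.rb -j node.json   # solo.rb points cookbook_path at your downloaded cookbooks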
When you say configure, I take it you mean setting up data in the database. Assuming that you've separated the database instance from the server instance, and you're attempting to scale up the number of servers, then you should be able to skip data setup. You should be configuring the new server instance (node) to point to the same database via Chef.
I stumbled onto this question looking for the same answer. From what I can tell, the start may be here.
Kind of hand-wavy, but this should enable you to do some WordPress stuff via the command line with Chef, rather than the point-and-click it prefers.
As per #1, you should not need to import the database. If the database goes down, you'll want to focus on that as a separate but connected recipe, since then you'll want to be taking snapshots and uploading them somewhere like S3 via a cron job. I believe there are plugins that can enable this.
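A hedged illustration of that cron idea (the bucket name and schedule are assumptions; assumes the aws CLI and DB credentials stored in ~/.my.cnf):

# crontab entry: nightly snapshot of the WP database straight to S3
0 3 * * * mysqldump wordpress_db | gzip | aws s3 cp - s3://my-backups/wp-$(date +\%F).sql.gz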
You'll have to be a little more clear by "import". If it's in a code base you may be able to short-cut your cookbook path by pulling down the git repo onto the host. You may want to look at git-archive.
Other issues that I'm looking at are images. We're migrating from a hosted solution to AWS, and it appears that instead of storing the images in the database, WordPress pulls them into a local directory. This means that if we scale to > 1 host, we'll have issues with images. Something to think about; there's a wealth of plugins that can probably solve this.
Hope this is helpful,
Ben
I'm helping my university switch from Lenya to Drupal for their CMS. We plan on offering a drupal install to every department that wants one. The installs will all share the same codebase (the custom drupal "template" I'm developing now) but will each have their own database, allowing each site to have its own users, nodes, etc.
The problem I have is when, after making changes to the template, I'd like to update all of the installations. If the change is to core code or that of an installed module, for example, there's no problem since all installations are running off the same codebase. If, on the other hand, I need to make changes to the database, I'm screwed because there's a tonne of installed databases, and they're all different and need to be preserved. Even for simple changes like installing a new module, the module shows up fine on the list of installed modules, but I have to manually go into each installation and enable it by hand.
There must be an easier way! Is there some easy way (like a module I haven't heard about) to force Drupal databases to update certain tables from a master database? I'm thinking of something similar to the "update.php" script that I could invoke en masse from drush.
Thanks for the help, all!
You can try to use drush make for that. Check out this site http://drushmake.me/ to see an example of how it looks. All you need is to install the drush script on your server: http://drupal.org/project/drush. Later you can build your own .make files and run them against different databases.
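For the mass-enable/update part of your question, drush site aliases can fan one command out across every site that shares the codebase; a hedged sketch (the module name is made up):

drush @sites pm-enable mymodule -y   # enable the module on every site
drush @sites updatedb -y             # run update.php's DB updates everywhere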
I'm not a WordPress developer, but I'm trying to determine the optimal way for a team of folks to work with WordPress. For a Rails project or most anything else, it's easy to work locally and deploy upstream, but my understanding is that WordPress doesn't make this quite as easy. Maybe this is a myth?
From what I gather, it's not uncommon for URLs and file paths to be stored in the database, which seems like it would make it difficult to deploy a WP project from dev --> stg --> prd (where each environment has its own URL and possibly different file paths), much less for individual developers to have their own dev environment that would need to be "merged" into a unified copy for deployment.
I could configure all developer sandboxes to use a single database, but here again, if URLs and file paths are stored then nothing is gained.
There are a series of smaller questions here, but the more I think about those, the more I realize that what I'm really asking for is advice about how to structure things for optimal development of a WordPress site that will be hacked on by a team of developers. I'd prefer the sandboxed approach we use for other projects, but I have no idea if/how things can be unified once all development is complete.
Warning: incoming wall of text...
@Rob, WP is hell when it comes to working in teams; however, with a little work (and some symlink magic) you can set up your WP projects so that your working files for your themes or plugins can reside separately from the WP core. Some of this uses WP's built-in mechanisms; some of it is related to SVN externals (hint). I'll let you google that since it's outside the scope of your question.
A note on WP GUIDs
WARNING: DO NOT replace GUIDs. WP GUIDs are there for external feed readers. Feed readers use the GUID to determine if the content is recent. Changing it basically tells those readers that every entry in the feed is new (especially for posts). That introduces a lot of extra overhead for legacy content that you just don't need. GUIDs are a legacy feature that should have been changed a long time ago to UUIDs. Technically, you can use anything in the guid field, but WP uses the permalink to populate that field -- legacy.
The only time it is ever acceptable to change the GUID is for new wp projects where content is brand-spankin' new.
To answer your question:
WP stores explicit references to the current domain in a dozen places in its DB. These locations are a pain to track down and change, and the last thing you want to do is deal with manual edits to a *.sql dump file that you're going to import into production. It just smacks of bad development practices.
There's a couple ways to get around this, but it means a little bit of work if you're already further down your development lifecycle. I'll address the first case.
Case 1: Project Onset
When you're starting the project, you'll likely have a development sandbox and DB ready. You'll likely have WP already installed by now, so it's essentially clean for all intents and purposes.
The first thing you're going to want to do is change how your config file works. Most folks keep with the standard wp-config.php file (beyond a team production project, there's not really any reason to edit it.) However, you can set it up with some logic to include developer-specific or environment-specific config files. For example:
wp-config.php
$current_environment = $_SERVER['HTTP_HOST']; // the hostname decides which config to load

switch( $current_environment )
{
    case 'jack.local' : include( 'wp-config-jack.php' ); break; // Jack's sandbox
    case 'jill.local' : include( 'wp-config-jill.php' ); break; // Jill's sandbox
    default           : include( 'wp-config-remote.php' ); break; // Staging & Production
}
The next thing you're going to want to do is include the normal contents of the wp-config.php file in a wp-config-remote.php file for use on staging/production. Next, edit your wp-config-remote.php file so that you can use 1 config file across multiple environments (staging,production). An if(...) or switch(...) block is all you need, e.g.
if( (strpos( $_SERVER[ "HTTP_HOST" ], "localhost" ) !== false) || (strpos( $_SERVER[ "HTTP_HOST" ], "local" ) !== false) )
(There are better ways to write that condition... this is just a crude example.)
Configure all of your WP settings specific to each of your remote environments. Hopefully you'll be checking this into a source control repository.
That basically frees you up to let your team have config settings specific to their environment, while letting you check in settings for each of the remote environments once.
The second thing you're going to want to do is build a mechanism to intercept and filter domain-specific links. The intent behind this mechanism is to replace any references to the current domain with a token/placeholder. I've outlined the technique to do this here: http://www.farfromfearless.com/2010/09/07/url-token-replacement-techniques-for-wordpress-3-0/
It basically amounts to creating a filter that acts on the content before it's submitted to the DB and before the content is rendered to the page. The technique is transparent in that it won't affect normal editing practices. You can still create your content in the editor, reference other pages, posts, images, etc. and they'll show up just fine while editing in different environments.
In recent projects, I've wrapped all of this and a few other WP "normalization" features into a single bootstrap plugin that I set & forget.
Case 2: Project Ongoing
Now, in your case, you're further along in your development lifecycle. It's going to take some work to replace those domain references, but if you follow the steps I've outlined above you should only ever have to do this once. The link I supplied above gives you the SQL you'll need to do that job. It's important to note that in a multi-site environment, you'll need to do this for every "sub-site" you've created.
Once you've updated your DB, I suggest implementing the steps in CASE 1 so you don't have to repeat the steps again.
Bonus: synchronizing content
Synchronizing content is a pain. What I've done in recent projects is had clients work on the staging server and promote changes upstream to production. So then, that leaves you with synchronizing downstream to your sandbox(es). Write a shell script that dumps a copy of SPECIFIC content tables from your staging DB, and imports them into your sandbox DB (effectively replacing content tables.) You should be able to see the benefit of the domain-token-replacement technique.
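A minimal sketch of such a script, assuming standard wp_ table prefixes and DB credentials stored in ~/.my.cnf on both machines (hostnames and database names are hypothetical):

# dump only the content tables from staging and load them into the sandbox
TABLES="wp_posts wp_postmeta wp_comments wp_commentmeta wp_terms wp_term_taxonomy wp_term_relationships"
ssh user@staging.example.com "mysqldump staging_db $TABLES | gzip" > content.sql.gz
gunzip < content.sql.gz | mysql sandbox_db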
Images that aren't checked into source control (e.g. client images) should be pushed to a common location, such as an S3 bucket. There are WP plugins that can help you with that. That'll save a lot of time having to synchronize assets across environments.
I hope this helps you out -- if not, there's always SilverStripe ;)
It's easy: we build on a dev server and move to the live server with these SQL queries:
UPDATE wp_posts SET guid = REPLACE(guid, 'devserver.com', 'liveserver.com');
UPDATE wp_posts SET post_content = REPLACE(post_content, 'devserver.com', 'liveserver.com');
UPDATE wp_options SET option_value = REPLACE(option_value, 'devserver.com', 'liveserver.com');
Last year I wrote a bash script to mirror a live MU installation into a sandbox. It's not perfect and not ideal, but a good starting point. It consists of mirroring the databases, files and rewriting the mirrored database to reflect the sandbox.
See http://pp19dd.com/2011/01/bash-script-to-mirror-wordpress-mu-installation-into-a-sandbox/
It's important for developers to be able to take live and exact content snapshots to replicate conditions.
I just ran into this issue myself launching a new website. My solution was to use Vagrant. Vagrant is also platform agnostic, so you could be developing on a Mac while a teammate is using Windows; the same Vagrant project runs on both.
I wrote a guide on how to set Vagrant up with WordPress from a production environment running locally on your machine. I don't use WordPress that often, but every time I do it's always a hassle to set up Apache and PHP on my Mac, then make sure all the WordPress site URLs are updated in the database.
Once you've configured your Vagrant project, it's a single command for any developer on your team to be up and running with a local instance of WordPress. In short, Vagrant will mount your project directory from your host machine in the guest machine and run Apache, MySQL, and PHP through the guest machine. You still use your host machine's IDE (as you normally would) and your host machine's browser. There is no uploading of files anywhere; it's just code, save, refresh browser, all on your local machine.
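The day-to-day commands are about as simple as it gets:

vagrant up     # boot and provision the VM on first run
vagrant ssh    # shell into the guest if you need to poke around
vagrant halt   # shut the VM down when you're done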
http://www.distilnetworks.com/wordpress-development-with-vagrant/ <- Explains how to set it all up with Vagrant
https://gist.github.com/markmalek/fd2e6e65385400d9cd47 <- the shell script when provisioning Vagrant
The shell script could probably be a lot better; this is what worked for me, but I would love to hear better suggestions or ideas. I'm new to Vagrant and now use it for some of our other projects, so I thought it would be a good fit here as well.
I do update the GUID in the shell script, which I've read in another answer that you shouldn't do because feed readers use it. In this case it's irrelevant, since it's just for your local instance of WordPress, but I wouldn't make this change in production. See this answer for a better explanation.
Not a big deal: simply back up your whole site with the All-in-One WP Migration plugin and import it on the live server installation; the plugin will replace all URLs automatically.
I like to use a simple Git workflow for static web sites, but I build Joomla and WordPress sites on a semi-regular basis too. However, I am at a loss as to how to use Git with database-driven site development.
For a static site I would 'push' to dev.websitename.com, then push to www.websitename.com once the dev site checks out. How would I mimic that process with a database-driven site like WordPress or Joomla?
Thanks in advance for your insight!
You can definitely use Git with your website code, such as changes to your WordPress theme/plugin, exactly as you would if you are developing a static website.
However, you wouldn't use it for your database. Git provides version control for code, while WordPress and Joomla already manage content stored in the database. Plus, Git wouldn't understand a database, so it wouldn't have any advantage over a periodic backup, which you should already be doing. Take a look at running a dev copy of your site for how to download your database directly from your server.
By the way, if you use Git with WordPress/Joomla, you should add e.g. cache, logs, tmp to .gitignore. There are also lots of tutorials out there--try searching e.g. http://google.com/search?q=wordpress+git.
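For instance, a hedged starting point (the exact paths vary by CMS and setup):

cat >> .gitignore <<'EOF'
# runtime artifacts the CMS regenerates
cache/
logs/
tmp/
EOF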
In addition Chris, you may want to embark on your Git workflow without the handy script approach (at least initially). The script approach and using Git hooks can sure seem sexy (well, because they are) and handy too, but initially why not go with a more manual cmd line approach, which will also help you familiarize yourself with Git.
Once you've got your repo setup (GitHub, Bitbucket, somewhere else) and you've pushed your latest to it and are ready to deploy to production or staging, just login to your host and from wherever you've initialized the git repo (site root, example: /site) just do a:
git pull origin master
This will fetch and merge your code. Good idea to test this on a dev/staging environment and if the merge goes well then do it in production.
Can you help me understand how to do Drupal website deployment and development?
Suppose I developed version 1.0 of the Berty&Frank website. I copied everything to their production server and it is alive and kicking now. The site is already full of content and is growing.
I am asked to add additional features to the website. I am now experimenting with how I can implement them in a dev version: I am creating/deleting content types, filling created nodes with demo data just to see how they look, etc. Now I have found the way, and I want to upgrade the production website to the same structure as my dev version. How do I do that?
Is the only way to manually make every change I made in dev version?
I would explore the Aegir project for the future management of your website. It allows you to clone a site, then to upgrade the site to a new "platform" which could be the next release of Drupal or another Drupal system (such as OpenAtrium).
More can be found at the aegir wiki.
You can export/import views and content types, but a lot of settings etc. are stored in the db. This gives a few options:
One is to use something like backup & migrate to import your settings from dev. This won't work if you have test data, though, as you would overwrite the db.
The other option is to repeat on the live site what you did in dev.
A third option could be to take a fresh dump of the live site, do all the settings in that db in the dev environment, and overwrite the live db with that. You could lose some comments etc., but it shouldn't be a big deal.
I use Subversion, and just do an update on my production server when I am satisfied with the code on my development server (actually, I have a staging server that is a duplicate of the production machine, so I update that before the production; I can see any bugs that might pop up).
For database changes, I haven't found anything better than just keeping track of my changes (usually adding/modifying CCK fields) and performing the same changes to the production database. I also download my production database regularly, so that dev and staging have almost the same content. That helps to minimize the confusion.
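The download step can be a short pipeline, assuming DB credentials live in ~/.my.cnf on each machine (hostnames and database names are hypothetical):

ssh user@prod.example.com 'mysqldump prod_db | gzip' > prod.sql.gz   # snapshot production
gunzip < prod.sql.gz | mysql dev_db                                  # load it into dev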
read http://www.drupal.org/upgrade/