Use a purchased template in a testing environment - concrete5

I've been tasked with building the website for a small company, and they want me to build it with concrete5. They have also chosen and purchased a theme, but only one licence.
Since I want to build the website on my local machine first, I wanted to ask whether it is possible to migrate a website that uses a purchased theme once I have finished building it locally, even though we only have one licence (the theme would then effectively be used twice; also, when purchasing a theme, you are asked to assign the licence to a project, making it available only for that one).

That is fine: the license is for one site, not one install, so staging environments are perfectly OK. You might have to get your client to add your URL for automatic updates, though I'm not sure of the exact process for that. You can simply FTP or rsync the files down and install locally.
In practice, all the license-checking mechanism actually does is show you when there are updates, which, if you're going to be customizing the theme at all, you might not even want: otherwise the client could see there's an update, apply it, and blow away your customizations. In that case, you might not want to associate the license with the marketplace at all; just leave it unassigned and download a copy of the files manually.

Related

Migrating the API Connect Developer Portal between environments in Bluemix

I have one Developer Portal in Bluemix API Connect for the Development environment, one for the Testing environment and one for the Production environment. I have made some customizations to the structure of the modules, settings and content, and I would like to migrate them from one environment to another.
I have a theme for the styles, fonts and images, but there are settings and content that are not included in the theme. I have found some Drupal plugins to migrate tables from the database; however, there is a risk of overwriting tables related to the APIs, products, plans, etc.
I would like to know if there is a recommended way of doing this migration without having to do everything again manually.
There isn't currently a simple way to migrate that sort of configuration between portal sites.
There is a Drupal module called "Features" which provides export/import capability, but it doesn't support all configuration and isn't a process we have tested or documented, so you'd be using it at your own risk.
You presumably uploaded your custom theme to one site, so you can just upload it again to the second site - that part is simple.
If it's an extensive amount of configuration, you can raise a support ticket and ask Ops to overwrite the target site with the configuration of the source site - but that's a one-off process that completely wipes the target site, so it isn't really going to help with ongoing changes.
You can write a custom Drupal module to make your configuration changes - then simply load the module on each site and it will make the desired changes. However, that can be a lot of work; if you only have a couple of sites, it's likely easier to redo the same changes manually.
Hopefully this will improve in the future.
(Full Disclosure: I am part of the API Connect development team)

WordPress deployment solution, ideas?

I develop a WordPress site on a local machine and I'm now looking for a mechanism to deploy it easily and quickly. I'm thinking of a DEV environment (on my local machine), a STAGING environment (a subdomain on the client's site, e.g. staging.example.com) and of course a LIVE environment (example.com)!
My current workaround:
As I work with Aptana, I can sync my changed files with the deploy mechanism the IDE provides. I export my local database, find/replace the permalinks, and import the whole thing - finished! To deploy to live, I replace all the live files with the staging files.
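The export / find-replace / import steps can be scripted. A minimal sketch (table contents and URLs are made up); note that a plain textual replace corrupts PHP-serialized values whose lengths change, so serialization-aware tools are safer for option and meta tables:

```shell
set -e
work=$(mktemp -d)
# Stand-in for a mysqldump export of the DEV database.
cat > "$work/dev-dump.sql" <<'EOF'
INSERT INTO wp_options VALUES ('siteurl','http://localhost/mysite');
INSERT INTO wp_posts VALUES ('<a href="http://localhost/mysite/about">About</a>');
EOF

# Rewrite permalinks for the STAGING environment.
sed 's|http://localhost/mysite|http://staging.example.com|g' \
    "$work/dev-dump.sql" > "$work/staging-dump.sql"

# The result would then be imported with: mysql staging_db < staging-dump.sql
grep staging.example.com "$work/staging-dump.sql"
```

Using `|` as the sed delimiter avoids having to escape every slash in the URLs.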
This should be easier! Is there anyone out there with a better workflow? I'm open to, and really excited about, your ideas!
Thanks a lot!
Yep, it's frustrating and, frankly, insane that WordPress requires this process because it stores absolute URLs in the database. I develop in a similar fashion, using multiple staging sites for QA and client review. After my first deployment with WordPress I almost gave up on the platform entirely; all of the solutions recommended by core developers and others simply didn't work.
So I wrote a plugin: http://wordpress.org/extend/plugins/root-relative-urls/
that fixes the problem. With this plugin you don't need to do a search & replace on your content - no hosts-file hacks or DNS tricks. You can access the site via IP address, computer name, or any type of forwarded host. And since it converts URLs to root-relative before they enter the database, you won't have to worry about them working across the different domain formats. And since it doesn't hard-code the scheme (http/s) in the URL, you won't have to worry about the 520 or so bugs reported in the WordPress Trac database if you use SSL.
It's a staple for any WordPress project I work on these days. I have also written a couple of other plugins to deal with idiosyncrasies in the platform, which you can check out here: http://wordpress.org/extend/plugins/profile/marcuspope
Hope that solves your problem.
I use Capistrano (https://github.com/capistrano/capistrano/wiki/) for all my deployment needs and it is a really good solution. You can script pretty much anything and it just works.
It could work for your deployment scheme too.
I also use Capistrano for both WordPress and Drupal deployments. I typically install modules locally for testing, then push to the test and production environments. For uploads etc., I add custom tasks to manage syncing files that are stored in SCM and those that are not. Here is a simple guide I put together:
http://www.celerify.com/deploy-wordpress-drupal-using-capistrano

Making an entire website available offline?

Consider this scenario:
I could use a CMS, say WordPress, to create a product catalogue, where my products are effectively tagged and categorised for ease of navigation. For employees and customers, this would provide an effective and visual means to browse a catalogue of products.
The problem with this is that it requires a connection to the internet to serve up the information. There could be many situations where the users of this catalogue are not connected to the internet, but still need to browse the catalogue - like field sales staff, for example.
How then, is it possible to make this entire site available for viewing (and distributing) offline? It would need to function exactly as the internet-connected version, serving up the same information and images.
Is it possible!?
I guess the limitation is that the WP database serves up the info, and that would require everyone to have a MAMP-type installation with WordPress on their machines?
You could create a static mirror of the site e.g. wget -km http://DOMAIN. Package that into an archive and get them to install a new archive whenever it's been updated.
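The mirror-and-package step might look like this; the wget call is commented out so the sketch runs without network access, with a placeholder page standing in for the mirrored site:

```shell
set -e
work=$(mktemp -d)
mkdir -p "$work/catalog"
# Real mirroring step (-k rewrites links for local viewing, -m mirrors):
#   wget -km -P "$work/catalog" http://DOMAIN
echo "<html><body>Product catalogue</body></html>" > "$work/catalog/index.html"

# Package the static mirror for distribution to offline users.
tar -czf "$work/catalog.tar.gz" -C "$work" catalog
tar -tzf "$work/catalog.tar.gz"
```

Recipients just extract the archive and open index.html in a browser; re-run the mirror and re-ship the archive whenever the catalogue changes.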
If you need it to function exactly as the online version, like you mentioned, you might want to check out XAMPP. It is a package containing an Apache web server, MySQL, Perl and PHP. It doesn't need to be installed before being used, but it does require starting the components, which could probably be scripted.
The downside is that you will need to customize this version unless you want to include all your information in the catalogues. Also, since your current server likely has different modules than what comes standard with XAMPP, this could mean having to maintain what are essentially two versions of the site.
If you don't need the databases to sync (e.g. portable POS systems), MAMP is a great solution. I've implemented this several times in cases where field agents required web-based promo materials. Easy to update, maintenance free, small learning curve. MAMP all the way.
I'm developing a WordPress site, mirrored locally under http://localhost. I can transfer the database with a simple plugin that handles backup; then, before I load it locally, I remap the URI strings inside the SQL. Because the data is PHP-serialized, some care is needed to keep the declared string lengths aligned, i.e. change each occurrence of s:N:"...http://your_site/" to s:M:"...http://localhost/your_site/", with M = N + 10.

How to develop a WordPress site collaboratively?

For simple projects, it suffices to use Subversion to synchronise theme/plugin code between the team's WordPress installations. However, in larger projects, where themes/plugins are content-dependent, content needs to be synchronised as well. Is there a way to do this automatically instead of using WordPress's Import/Export tools?
You can set this plugin to make an hourly backup: http://wordpress.org/extend/plugins/wp-db-backup. It can send backups to email, save them on the server, or download them to your machine (if run manually). Content keeps its revisions anyway, so I don't think you'll have problems with that.
SVN for the files actually sounds pretty neat, even without other people working on your WordPress installation - but can you install an SVN client on shared hosting? If you have a server of your own, this won't be an issue.
Hope this helps you a bit!

Deploying changes on a live Drupal site

I really like Drupal, somehow, but what bothers me most is that I can't figure out a clear way of deploying. Drupal stores a lot of configuration in the database (Views, CCK, Workflow, Triggers, etc.) that needs to be updated.
I've seen some modules that could be used for this task (e.g. Features), but I'm not sure they are sufficient. Also, they are only for Drupal 6, and I currently have to work on a Drupal 5 site where upgrading is not yet an option.
Any ideas?
This is a weakness. Drupal doesn't have the built-in developer tools that make development and deployment easy the way Rails does, for example. One problem is that Drupal isn't natively aware of its environment. Secondly, there are too many different methods and modules that require special care, which can get very confusing. But things are getting better with drush and drush make.
I'm assuming here that you have a development environment on your local machine and a live or staging server you upload to.
The first thing to do is work out how to move your database and your code between your server and your development environment very quickly. Make this procedure as painless as possible so you can keep the different versions of your site in sync without much effort; that way you'll hopefully have less change to manage every time you deploy.
Moving the database around isn't too hard. You could use phpMyAdmin or mysqldump, but the Backup and Migrate module is my favourite tool.
Uploading code from your local repository can be done in a few ways. If you use a version control system like git, you can commit on your local machine and check out again on the staging server. There are also dedicated deployment tools like Capistrano that you should take a look at (if you know this stuff already, it may still benefit others). If you're using FTP, you should probably try something different.
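The commit-locally / check-out-on-the-server flow can be sketched end to end with plain git; the bare "hub" repository below is a local stand-in for what would live on your staging server and be reached over SSH:

```shell
set -e
demo=$(mktemp -d)
# Shared bare repo: on a real setup this sits on the staging server.
git init -q --bare --initial-branch=main "$demo/hub.git"
# Local development checkout.
git init -q --initial-branch=main "$demo/work"
cd "$demo/work"
git config user.email dev@example.com
git config user.name Dev
echo "<?php // theme tweak" > template.php
git add template.php
git commit -qm "Local change to deploy"
git remote add origin "$demo/hub.git"
git push -q origin main

# On the staging/live server: pull the committed code into the docroot.
git clone -q "$demo/hub.git" "$demo/docroot"
ls "$demo/docroot"
```

On subsequent deployments the server side is just `git pull` in the docroot, which makes rollbacks a `git checkout` of an earlier commit.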
If you're working with a site that is already in production, you can make small incremental changes to your local site, then repeat them on the live site and download the new version of the database once your changes are in place. This means you handle the database twice, but it is a safe way of doing things: it keeps your two databases closer to each other and minimises risk.
You can also export Views and move them to your server either in your code or by importing them into your live site. There is a hack to get around deploying CCK changes here: http://www.tinpixel.com/node/53 - it works OK but cannot truly manage changes such as rollbacks. (Respect to the guy who wrote it.)
You can also use hook_update_N to capture changes and then run update.php to apply them. I worked on a Drupal 5 site with dozens of developers, and this was the only way to keep things moving forward. This may be a good option if your site is live, or if you need all database schema changes captured in version control (so you can roll back).
Also, take a look at drush and drush make. These tools can be of great benefit, though I can't remember how much support there is for Drupal 5.
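For illustration, the drush side of a deployment might look like this (the docroot path is hypothetical, and the commands are only echoed here because they need a real Drupal site to run against):

```shell
site=/var/www/example   # hypothetical Drupal docroot
# Typical post-deploy steps: apply pending hook_update_N implementations,
# then rebuild caches after code/schema changes.
deploy_cmds="drush -r $site updatedb
drush -r $site cache-clear all"
echo "$deploy_cmds"
```

Running updatedb from the command line instead of visiting update.php in a browser makes the step easy to fold into a deployment script.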
One final way of dealing with this is not to use CCK or Views at all (and rely on hook updates instead). But this is really only suitable for enterprise sites with big developer resources. It may seem like a strange suggestion, but it can sidestep the whole problem completely.
Sorry I could not give you a clear answer - one does not exist yet. You'll find your own rhythm once you get into it. Just keep backups of your database so that you can roll back to them easily enough.
