Do I need to attach storage to Bitnami WordPress on Azure?

I'm unable to find the details in the Bitnami WordPress documentation, but if I use it in Azure, do I also need to set up a managed disk? If yes, how can I configure Bitnami to use the managed disk instead of the OS disk?

Bitnami Engineer here. The WordPress solutions we provide in Azure don't use managed disks by default. All the components and files are included in the same instance, so if you want to use a different disk, you will need to create the new disk, move the data to it, and attach it to the instance.
Please remember that you will need to continue using /opt/bitnami as the installation directory, so if you mount a new disk, you will need to create a symlink linking /opt/bitnami to the files on the new disk, as in the sketch below.
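A minimal sketch of that process, assuming the new data disk shows up as /dev/sdc1 (the device name is an assumption; check lsblk on your instance) and using an arbitrary mount point of /datadisk:
sudo mkfs -t ext4 /dev/sdc1
sudo mkdir /datadisk
sudo mount /dev/sdc1 /datadisk
# stop the Bitnami services before moving the installation
sudo /opt/bitnami/ctlscript.sh stop
sudo mv /opt/bitnami /datadisk/bitnami
# keep the expected installation path working via a symlink
sudo ln -s /datadisk/bitnami /opt/bitnami
sudo /opt/bitnami/ctlscript.sh start
To make the mount persistent across reboots, also add a matching entry to /etc/fstab.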
I hope this information helps.

Related

How to access WordPress (Elementor) code on a local machine

I'm developing a website using WordPress (with the Elementor plugin) and I want to create a custom widget. I found a tutorial I want to follow (https://www.youtube.com/watch?v=Ko9i153o_iU), the only problem is that I have no idea how to access the code on my local machine to begin. From what I can tell, everything I'm doing is on the WordPress website, and the code isn't on my local machine. How do I go about getting the code onto my local machine so I can begin working with it in VS Code?
You need a way to spin up a local Apache or Nginx server and MySQL, along with WordPress. I suggest the app Local:
https://localwp.com/
Then, if you want to copy your production environment (your website), you need to get the files for the theme and any plugins onto your local environment. Local pairs with some hosting providers to make this easy. Otherwise, you can install them by downloading the files: some hosting providers give you FTP or SFTP access to your site files. Figure out how you can gain access and download them.
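For example, a rough sketch of pulling the theme and plugin files down over SCP (the host, user, and remote paths are placeholders for whatever your hosting provider gives you):
# copy themes and plugins from the host into the local wp-content directory
scp -r user@example-host.com:public_html/wp-content/themes ./wp-content/
scp -r user@example-host.com:public_html/wp-content/plugins ./wp-content/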
Lastly, if you want, you can copy the database over to your local environment. There is a great plugin called WP Sync DB that does this for free. It can also be a way to push your local environment to your production environment, but I definitely suggest keeping backups if you are going to do that.

Do I need a VM instance for each WordPress instance on Google Cloud?

I've been playing with Google Cloud, trying to figure out the most cost-effective way to host multiple low-traffic WordPress websites.
With Bitnami, it seems that for every new WordPress instance, I have to provision a new virtual machine. I also tried Google's click-to-deploy WordPress setup, and it forced me to provision a cluster with three VMs.
Each of the new VMs costs money, so I'm wondering if there's a way to do something similar to shared Linux hosting, where I could host multiple WordPress instances on a single virtual machine.
You can use the Bitnami WordPress Multisite stack, which allows multiple sites to run on one server.
If you don't want to use the Bitnami Multisite solution, you can also install multiple WordPress apps on the same server without installing multiple database or web servers. Bitnami provides modules to install on top of an already installed stack (normally a LAMP stack), and the WordPress module allows you to set the name of the blog you want to create.
The module can be downloaded with the commands below, which you will need to run in the instance (they download the version current at the time of writing):
wget https://bitnami.com/redirect/to/269995/bitnami-wordpress-4.9.8-0-module-linux-x64-installer.run
chmod a+x bitnami-wordpress-4.9.8-0-module-linux-x64-installer.run
sudo ./bitnami-wordpress-4.9.8-0-module-linux-x64-installer.run --wordpress_instance_name NEW_BLOG_NAME
Once you have the module installed, you will be able to access it through http://localhost/NEW_BLOG_NAME.
More info in the Bitnami documentation
https://docs.bitnami.com/installer/apps/wordpress/configuration/install-several-wordpress-modules/
I found the following post, which explains how it might be done.
http://designhack.slashlab.net/en/how-to-setup-multiple-wordpress-without-multisite-ft-bitnami/
Make sure to back up important data before you start.
It is possible to set up multiple websites using Bitnami, but I recommend keeping every site separate, to avoid database confusion and to let you extend the functionality of each website independently.
https://bitnami.com/stack/wordpress-multisite
I'm using a single VM per domain to avoid confusion with DNS.

Drupal to Drupal Migration Across Servers

I am in the process of migrating a D7 site from one server to another. I have successfully exported and uploaded the settings to the new site using Features, but I need to get the content over to the site as well. I've been looking at several modules to try to solve this problem, but I have not found anything suitable for the task. Please let me know if I am overlooking a really simple solution.
Thanks!
Mark
The easiest solution is to export a database dump and import it into your new server. You can do it with phpMyAdmin, but I recommend using Drush.
This way you can simply do a database dump via:
drush sql-dump > ~/sql-dump-file-name.sql
and later import via:
drush sql-cli < ~/sql-dump-file-name.sql
Also copy your files directory, located at sites/default/files under your Drupal root, from the old server to the new one.
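For example, a rough sketch using rsync (user, host, and paths are placeholders; assumes SSH access between the two servers):
# run on the new server to pull the public files directory across
rsync -avz user@old-server:/var/www/drupal/sites/default/files/ /var/www/drupal/sites/default/files/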
I've successfully used the Backup and Migrate module for these tasks. True, creating a dump and then spooling it into the other database works, but this typically also copies all the caches.
The backup_migrate module allows you to save backups on your local server, but also to your hard disk, from where you can upload them again to the other site.
A neat thing here is that you can exclude tables, such as cache tables, which makes the transfer much faster.
Obviously you need a core installation on the other end, with the backup_migrate module already installed, for this to work. But I assume that since you only ask about the database, you must have mirrored the file structure already (excluding the settings files).
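If you would rather stay on the command line, Drush can approximate the same cache-table exclusion by dumping those tables structure-only. A hedged sketch, assuming a Drush version that supports the --structure-tables-list option and a typical D7 table set:
# dump cache/session/log tables without their data to keep the file small
drush sql-dump --structure-tables-list=cache,cache_*,sessions,watchdog > ~/sql-dump-no-cache.sql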

Install WordPress with its plugins using chef-solo

I have a WordPress website up and running with many plugins installed on it and a huge database. I need to use chef-solo to create an environment in which I can install the same website with all its plugins and also import its database.
In other words, I need to use Chef to install the same website on a different server, exactly the same.
Now here are my questions:
1. I know we can use Chef to install WordPress, but can we set it up in such a way that we don't need to configure WordPress and everything is already set once it's running?
2. What to do with the plugins? Can we install them using Chef, or does that have to be done manually?
3. How about importing the database? Can that be done with chef-solo as well?
4. The whole website is on Git; can I somehow import the whole thing?
5. Are there any other issues I may face if I want to do that?
There is a WordPress cookbook openly available for Chef.
When you say configure, I take it you mean setting up data in the database. Assuming that you've separated the database instance from the server instance, and you're attempting to scale up the number of servers, you should be able to skip the data setup. You should be configuring the new server instance (node) to point to the same database via Chef.
I stumbled onto this question looking for the same answer. From what I can tell, the start may be here.
Kind of hand-wavy, but this should enable you to do some WordPress tasks via the command line with Chef, rather than the point-and-click WordPress prefers; see the sketch below.
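A hedged sketch of the idea, using WP-CLI commands that a Chef execute resource could wrap (the plugin slugs and the install path are placeholders, not anything from the cookbook itself):
cd /var/www/wordpress
# install and activate plugins non-interactively
wp plugin install nextgen-gallery --activate
wp plugin install wordpress-seo --activate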
As per #1, you should not need to import the database. If the database goes down, you'll want to treat that as a separate but connected recipe, since you'll want to be taking snapshots and uploading them somewhere like S3 via a cron job. I believe there are plugins that can enable this.
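A minimal sketch of such a cron job, assuming mysqldump and the AWS CLI are installed and credentials are configured; the database name, credentials file, and bucket are placeholders:
# crontab entry: nightly dump at 03:00, gzipped and pushed to S3
0 3 * * * mysqldump --defaults-extra-file=/root/.my.cnf wordpress | gzip > /tmp/wp-db.sql.gz && aws s3 cp /tmp/wp-db.sql.gz s3://example-backups/wordpress/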
You'll have to be a little more clear about what you mean by "import". If it's in a code base, you may be able to shortcut your cookbook path by pulling the Git repo down onto the host. You may want to look at git-archive.
Other issues that I'm looking at are images. We're migrating from a hosted solution to AWS, and it appears that instead of storing the images in the database, WordPress pulls them into a local directory. This means that if we scale to more than one host, we'll have issues with images. Something to think about; there's a wealth of plugins that can probably solve this.
Hope this is helpful,
Ben

Enable S3 bucket contents to always be public

I've managed to use S3FS to mount an Amazon S3 folder into my WordPress site. Basically, my gallery folder for NextGEN Gallery is a symlink to a mounted S3FS folder of the bucket, so when I upload an image, the file is automatically added to the S3 bucket.
I'm busy writing an Apache rewrite rule to replace the links so that gallery images are fetched from S3 instead, without having to hack or change anything in NextGEN. But one problem I'm finding is that images are not public by default on S3.
Is there a way to change a parent folder, to make its children always be public, including new files as they are generated?
Is it possible or advisable to use a cron task to manually make a folder public using the S3 command line API?
I'm the lead developer and maintainer of RioFS, an open-source userspace filesystem for mounting Amazon S3 buckets.
Our project is an alternative to s3fs; its main advantages over s3fs are simplicity, speed of operations, and bug-free code. Currently the project is in a beta state, but it's been running on several high-load file servers for quite some time.
We are seeking more people to join our project and help with testing. From our side, we offer quick bug fixes and will listen to your requests for new features.
Regarding your issue:
If you use RioFS, you could mount a bucket and have read/write access to it using the following command (assuming you have installed RioFS and have exported the AWSACCESSKEYID and AWSSECRETACCESSKEY environment variables):
riofs -o allow_other http://s3.amazonaws.com bucket_name /mnt/static.example.com
(please refer to the project description for command-line arguments)
Please note that the project is still in development, and there could still be a number of bugs left.
If you find that something doesn't work as expected, please file an issue report on the project's GitHub page.
Hope it helps, and we look forward to seeing you join our community!
I downloaded s3curl and used that to add the bucket policy to S3.
See this link: http://blog.travelmarx.com/2012/09/working-with-s3curl-and-amazon-s3.html
You can generate your bucket policies using the AWS Policy Generator:
http://awspolicygen.s3.amazonaws.com/policygen.html
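For reference, a hedged example of the kind of policy this produces, making everything under a gallery/ prefix publicly readable, including files added later (the bucket name and prefix are placeholders; apply it with s3curl, the AWS console, or the generator above):
# write the policy to a file ready to be attached to the bucket
cat > public-gallery-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "PublicReadGallery",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::example-bucket/gallery/*"
  }]
}
EOF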
