Back Up a WordPress Website Directly to Remote Storage - wordpress

We have very limited space on our hosting provider's server. Is it possible to back up a WordPress website directly to remote storage, without first storing temporary files on the hosting server?
We tried the popular UpdraftPlus backup/restore plugin, but we see that compressed files are created locally first, and only then can they be sent to cloud/remote storage (with the Premium version of the plugin).
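For what it's worth, a stream-based backup from the command line can avoid local temp files entirely: tar and mysqldump both write to stdout, which can be piped straight into ssh (or rclone rcat for cloud storage). A minimal sketch, with hypothetical paths, host, and database names:

```shell
#!/bin/sh
# Sketch: stream files and database straight to remote storage.
# SITE_DIR, REMOTE, and the database name wp_db are placeholders.
SITE_DIR=${SITE_DIR:-/home/user/public_html}
REMOTE=${REMOTE:-backup@storage.example.com}

backup_files() {
  # tar writes the archive to stdout (-f -); nothing touches local disk
  tar -czf - -C "$SITE_DIR" . \
    | ssh "$REMOTE" "cat > wp-files-$(date +%F).tar.gz"
}

backup_db() {
  # mysqldump -> gzip -> ssh, again with no local temp file
  mysqldump --single-transaction wp_db | gzip \
    | ssh "$REMOTE" "cat > wp-db-$(date +%F).sql.gz"
}

# With rclone configured, the same stream can go to S3/Drive/etc.:
#   tar -czf - -C "$SITE_DIR" . | rclone rcat remote:backups/wp-files.tar.gz
```

Note this still uses CPU on the hosting server for compression; it only avoids the disk-space spike of a temporary archive.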

Related

Need to transfer site currently on Siteground

My client wants to keep using the hosting their site is currently on; I was given cPanel and FTP logins. I built the WordPress site on a temporary domain on SiteGround. How do I go about transferring the site to their hosting?
There are many plugins that can transfer your website and database to a different host. The good ones also rewrite all the URLs in your database to those of your new host.
I suppose the easiest one is Akeeba Backup for WordPress:
Download the plugin, install it in your WordPress site, and activate it.
In your WordPress menu you will have an Akeeba Backup entry; click it and create a new backup (for bigger websites I recommend splitting the backup file into smaller parts in the plugin settings).
Upload all parts of your newly created backup (you will find them in wp-content/plugins/akeebabackupwp/app/backups/) to the new server's root through FTP.
Download Akeeba Kickstart (Standalone); these files will uncompress your backup on the server. Copy both files from the downloaded zip into the server root.
Visit your new hosting domain in your browser with /kickstart.php appended (example.com/kickstart.php) and follow the on-screen instructions to complete the restore.
In case you are stuck, there is documentation for Akeeba Backup for WordPress and for Akeeba Kickstart.
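The upload step can also be batched from the command line instead of a GUI FTP client. A sketch, assuming an SFTP-capable host; the host name, remote path, and the Kickstart translation file name are placeholders (Akeeba part files typically end in .jpa/.j01/.j02 and so on):

```shell
#!/bin/sh
# Hypothetical sketch: push every backup part plus the Kickstart files
# to the new server's web root in one SFTP batch session.
upload_backup() {
  sftp newuser@new-host.example.com <<'EOF'
cd /public_html
put wp-content/plugins/akeebabackupwp/app/backups/*.j*
put kickstart.php
put en-GB.kickstart.ini
EOF
}
```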

Drupal 7 uses the s3fs module to store images; how can I use the AWS CloudFront CDN for my website?

My website is built with Drupal 7 and uses the s3fs module to store all files and images. I use i18n to translate the website from the source language, English, into French, German, and more, and the Domain module for a multi-domain setup, which was working well.
I want to use the CDN module to speed up my website, so I am trying to use the AWS CloudFront service for it. However, I am confused as to how to do it.
The s3fs module already serves image files from an S3 bucket, but how do I serve all of the website's files through CloudFront?
You can create a CloudFront distribution for your S3 bucket; that is unrelated to how you upload your files to the S3 bucket itself.
After that you can install the CDN module and point it at your S3 distribution.
From this AWS tutorial:
There is one module, not included in the Core Package, that I would consider. This is the CDN Module, produced by Wim Leers. It has been actively developed since its initial release in 2008 and has been re-written for Drupal 8. This module changes file URLs so that CSS, JavaScript, images, audio, and videos can be cached within CloudFront more easily. It also changes the URL when a file is changed by a user in Drupal. This allows the content to be cached early, without having to think about expiring content from the cache. To enable the module, go to your Drupal Administration site, and click the Extend Tab. From there, click the blue button for “+ install new module.” I looked up the latest version supporting Drupal 8, and provided the URL to the .tar.gz file. In my case, this was https://ftp.drupal.org/files/projects/cdn-8.x-3.2.tar.gz.
Also remember that you need to invalidate the CloudFront cache if you change the underlying content of a file in S3, e.g. you update a file's content but the filename stays the same.
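An invalidation can be issued from the AWS CLI; the distribution ID and path below are placeholders:

```shell
#!/bin/sh
# Hypothetical sketch: after overwriting an S3 object under the same
# key, ask CloudFront to drop its cached copy so viewers get the new
# content before the cache TTL expires.
invalidate_file() {
  aws cloudfront create-invalidation \
    --distribution-id E1ABCDEFGHIJK \
    --paths "$1"
}

# usage: invalidate_file "/sites/default/files/logo.png"
```

Note that AWS bills per invalidation path beyond a monthly free allowance, so versioned filenames (which the CDN module's URL changing gives you) are usually cheaper than frequent invalidations.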

WordPress FTP using Google Cloud Platform

I am doing some testing to determine the feasibility of moving my small web hosting business over to Google Cloud Platform. All of my client websites are WordPress sites built by me, and I also fully manage them.
I have set up a free 60-day trial and am about to install my first project, which will be a prebuilt CMS (WordPress) found in the software packages list in the Google Developers Console.
There are at least two things I want to test:
1. Using WordPress Multisite (as I intend to move all of my existing clients' websites into WordPress MU).
2. The speed of websites on this network (one concern is latency, as the datacenter is not located in my country).
So, to test the above, I would like to set up clones of existing client websites on the Google Cloud project I create.
Question:
How do I get file and directory access to the WordPress CMS on Google Cloud so I can upload websites produced on my local system or another server?
(I need to copy up media files, e.g. images, content, and themes.)
Or is my only means of file and directory access via WordPress plugins with this solution?
It depends on which tool you choose to use.
If you use WordPress for App Engine, you'll have to use a combination of deployment techniques and plugins to get data onto your instance.
If you use WordPress Multisite then you interact with it just like any other install of WordPress:
1) Create a key pair (https://cloud.google.com/compute/docs/instances/connecting-to-instance#standardssh)
2) Connect via SFTP (CyberDuck, FileZilla)
3) Upload files into your /home/yourusername directory
4) Connect to the server via shell; you can do this from your Google dashboard (Cloud Shell)
5) Switch to root by typing sudo /bin/bash
6) Now you can move files: sudo mv file/you/want/to/move.html /var/www/html/
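Once the key pair from step 1 exists, steps 2 through 6 can be collapsed into a short script; the username, IP address, and key path below are hypothetical:

```shell
#!/bin/sh
# Hypothetical sketch: scp copies a site directory into the home
# directory over SSH, then a remote sudo mv drops it into the web root.
deploy_site() {
  scp -i ~/.ssh/gcp_key -r "$1" yourusername@203.0.113.10:/home/yourusername/
  ssh -i ~/.ssh/gcp_key yourusername@203.0.113.10 \
    "sudo mv /home/yourusername/$(basename "$1")/* /var/www/html/"
}

# usage: deploy_site ./my-client-site
```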
I recommend you use SFTP, but if you want to connect with plain FTP, see this blog post to solve your problems.
If your VM is based on Linux then you have to use an application like vsftpd to set up an FTP server.
Here are the steps:
Deploy a virtual instance on Google Cloud
Open an SSH terminal
Install vsftpd
Create a user
Configure the vsftpd.conf file
Prepare an FTP directory
Set up FTPS (FTP over SSL, optional)
Open the ports in the Google Cloud firewall
Test and connect
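The steps above can be sketched for a Debian/Ubuntu VM as follows; the user name, directories, and passive-port range are illustrative, and note that vsftpd requires the chroot directory itself to be non-writable:

```shell
#!/bin/sh
# Hypothetical sketch of the vsftpd setup steps.
setup_vsftpd() {
  sudo apt-get update && sudo apt-get install -y vsftpd  # install vsftpd
  sudo adduser --disabled-password --gecos "" ftpuser    # create a user
  sudo mkdir -p /home/ftpuser/ftp/files                  # FTP directory
  sudo chown nobody:nogroup /home/ftpuser/ftp            # chroot: not writable
  sudo chown ftpuser:ftpuser /home/ftpuser/ftp/files     # uploads go here
  # minimal config: local logins, writes, chroot, fixed passive range
  printf '%s\n' 'listen=YES' 'local_enable=YES' 'write_enable=YES' \
    'chroot_local_user=YES' 'pasv_min_port=40000' 'pasv_max_port=41000' \
    | sudo tee /etc/vsftpd.conf >/dev/null
  sudo systemctl restart vsftpd
}

open_ftp_ports() {
  # open control port 21 plus the passive range in the GCP firewall
  gcloud compute firewall-rules create allow-ftp \
    --allow tcp:21,tcp:40000-41000
}
```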

How to publish a WAMP Server project with one click?

I'm making lots of WordPress sites and currently edit files directly on cPanel accounts via FTP. I would like to do this a different way, and I wonder if it is possible to:
Develop in WAMP on my local machine (I have this set up).
Publish and/or update the project to a cPanel account (files and database) with one click.
Sync the live cPanel site back to the WAMP environment for ongoing work and testing (files and database).
I am NOT looking for instructions on manually moving WP to a server; those are all over the place and I know how. I want to work from my local machine and sync/publish to the server with a single click, or at least something very simple.
Thank you in advance.
I use WP Migrate DB Pro to manage database and media copies from one server to another, and Beanstalk with auto-deployment to push code to my FTP servers (I have at least one test site and a single live site for every website I work on).
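As a DIY alternative, the whole push can be scripted, assuming SSH access to the cPanel host and WP-CLI installed there; the hosts, database names, and URLs below are placeholders:

```shell
#!/bin/sh
# Hypothetical one-click "push" from a local WAMP site to a cPanel host.
push_site() {
  # 1) files: rsync only what changed
  rsync -az --delete ./wordpress/ deploy@client-host.example.com:public_html/
  # 2) database: dump locally, load remotely, no intermediate file
  mysqldump -u root wp_local \
    | ssh deploy@client-host.example.com 'mysql cpanel_db'
  # 3) rewrite dev URLs to the live domain (WP-CLI handles serialized data)
  ssh deploy@client-host.example.com \
    'wp search-replace "http://localhost/mysite" "https://client-site.example.com" --path=public_html'
}
```

Reversing the rsync direction and the dump/load pair gives you the live-to-local sync; a plain SQL search-and-replace is not safe here because WordPress stores serialized PHP in the database, which is why the sketch leans on wp search-replace.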

How can I synchronize WordPress between the local side and the remote side?

I have installed WordPress in my free web space, and I have also installed it on my local PC.
I write blog posts in the local WordPress when I can't get online.
How do I synchronize WordPress between the local and remote sides?
Which plugin can I download to do the job?
Should I write some PHP code to synchronize WordPress?
I can do the job by hand: export my blog as an XML file from the remote WordPress installed on my free web space, and import that XML file into my local WordPress. Can I do that job automatically?
There's a WordPress plugin called Database Sync which lets you sync your WP database across multiple servers (e.g. dev, QA, and production), but it doesn't sync your uploaded files.
http://wordpress.org/plugins/database-sync/
You could use FTP synchronization software to keep your uploaded files in sync. I use WinSCP for that.
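If you prefer something scriptable over WinSCP's GUI, lftp's mirror command does the same job over plain FTP; the host and credentials below are placeholders:

```shell
#!/bin/sh
# Hypothetical sketch: mirror the local uploads folder to the remote
# host (-R = reverse/upload; --only-newer skips unchanged files).
sync_uploads() {
  lftp -u ftpuser,ftppass ftp.example.com \
    -e "mirror -R --only-newer wp-content/uploads /public_html/wp-content/uploads; quit"
}
```

Dropping the -R flag mirrors in the other direction, pulling the remote uploads down to the local install.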
