Setting up WordPress to use remote cloud-based storage

I have a site with a large number of images, and I'd like to host these on a remote cloud storage solution, as we're getting close to our storage limit on the current server.
I can get a remote cloud storage service set up; what needs to be done in the WordPress configuration to use this as the new location for uploads?
Thanks

Specifically for AWS S3:
The company I work for uses this: https://deliciousbrains.com/wp-offload-s3/ and it's worked a treat!
This should handle the automatic upload of your old media, plus updating your posts/pages. To be safe, download a local copy of your WP files and database, and run it all locally using a test bucket. Or have a backup to hand in case the upload doesn't work. Can't be too safe!
We've only had one issue with it this past week: if you upload a file but change the file extension afterwards, it never offloads that particular image to S3 and continues to load the old /wp-content/<year>/<month> version.

Related

Can't access uploaded files through Firebase Storage on the web

I'm trying to access/download files that have been uploaded to Firebase Storage, but I can't figure out how. Some of the files have a "create new access token" button under the storage location, which gives me a link to the file on the web. Unfortunately, this is only a few files, and it seems to only be the ones uploaded from localhost.
I can't find any reference to this in the documentation; surely this should be achievable through the Firebase dashboard?
I've tried setting the access rules to allow reads in all cases which hasn't helped.
Thanks
In general, you're supposed to use getDownloadURL(), as shown in the documentation, from within your web or mobile app to generate a download URL for use within your app. Typically, the console is only used to revoke the tokens that enable the download of each file through that URL.
If that's not specifically what you're trying to do, perhaps you want to read up on other ways to make data public in Cloud Storage. The Firebase console is not really the best mechanism for managing downloadable content.
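For reference, this is roughly what that looks like in a web app. A minimal sketch using the v9 modular web SDK; the config values and the images/photo.jpg path are placeholders:

    import { initializeApp } from "firebase/app";
    import { getStorage, ref, getDownloadURL } from "firebase/storage";

    // Placeholder config; substitute your project's own values.
    const app = initializeApp({
      projectId: "your-project-id",
      storageBucket: "your-project-id.appspot.com",
    });
    const storage = getStorage(app);

    // Generate a tokenized download URL for a file in the bucket.
    getDownloadURL(ref(storage, "images/photo.jpg"))
      .then((url) => {
        console.log("Download URL:", url);
      })
      .catch((error) => {
        // e.g. storage/object-not-found or storage/unauthorized
        console.error(error);
      });

Note that this goes through your security rules, so reads must be allowed for the requesting user.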

How to upload a website to Firebase Hosting without attaching storage.rules & firestore.rules files?

Today I uploaded my website to Firebase Hosting, and in the process I had to create two rules files, one called storage.rules and another called firestore.rules. I also had to create a firestore.indexes.json file. But I remember that previously I didn't have to create any such files while uploading another website to Firebase Hosting.
I need to get rid of these three files, because every time I deploy, the rules for both Storage and Firestore are switched to private, which is unnecessary for me; I need them to remain public. Besides that, I can't open the two rules files to edit them (I'm on a Mac), or even just to see what's inside them. How can I do this? Thanks!
It sounds like you used the Firebase CLI to initialize several products in the same project, including Cloud Storage and Firestore. If you don't want to work with those other products in your project files, you shouldn't select them during initialization.
The easiest thing to do would be to start over in a new directory and initialize only the products you want to use. It sounds like that's only Firebase Hosting.
You could also edit firebase.json and remove the products you don't want to use any more.
If you do want to work with Storage and Firestore, but you only want to deploy to Hosting, then just use firebase deploy --only hosting.
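For illustration, a hosting-only firebase.json looks something like this (a sketch; the "public" directory name depends on your project):

    {
      "hosting": {
        "public": "public",
        "ignore": ["firebase.json", "**/.*", "**/node_modules/**"]
      }
    }

With no storage or firestore sections present, a plain firebase deploy has nothing to deploy for those products, so your existing rules are left alone.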

Firebase Functions - generate and host static webpage

I'm using Firebase Cloud Functions to generate an HTML file, and now I'd like to host it together with related assets (JS, CSS, fonts, etc.), but so far without success.
I call the function, it generates the file properly and puts it in Firebase Storage together with the JS/CSS/other assets. Now I would like to return a URL for the index.html file so that the user can access it in the browser, with the .html page having access to the assets. Unfortunately, the generated URL forces a download, and I'm pretty sure that even if I got around that somehow, the page wouldn't be able to access the asset files.
I know it's possible on AWS (S3 bucket) but can I do it on Firebase? Firebase Hosting doesn't seem to be the right solution in that case, does it?
Don't save it to Storage; that's a bad fit for this scenario. Instead, deploy it to Hosting:
https://firebase.google.com/docs/hosting/
Also, consider serving the content directly from the Cloud Function; there's probably no need to create a static version first.
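As a sketch of that last suggestion, here is a Cloud Function that serves the page over HTTPS instead of writing it to Storage first. The generateHtml helper is a hypothetical stand-in for whatever currently builds your page:

    import * as functions from "firebase-functions";

    // Hypothetical stand-in for your existing HTML generation logic.
    function generateHtml(): string {
      return "<!DOCTYPE html><html><body><h1>Hello</h1></body></html>";
    }

    // Serve the generated page directly; no intermediate file needed.
    export const page = functions.https.onRequest((req, res) => {
      res.set("Content-Type", "text/html");
      // Cache hint, useful if Hosting rewrites requests to this function.
      res.set("Cache-Control", "public, max-age=300, s-maxage=600");
      res.status(200).send(generateHtml());
    });

If you add a Hosting rewrite that points a URL at this function, the generated page and your static assets (JS, CSS, fonts) are served from the same domain, which solves the asset-access problem.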

Amazon Elastic Beanstalk Content Management

I have created my Elastic Beanstalk application with WordPress to test. However, I am struggling to understand the best way of managing dynamic content changes made online alongside development changes made locally.
I upload my initial WordPress installation to my AWS bucket
I run the initial WordPress setup
-- Let's presume that I have included a live theme and uploaded some products, and time has passed: changes are made online, new products added to WooCommerce, etc.
I make a new page template locally and want to upload it to the bucket
I use eb deploy, but when I do this, all content online in my bucket is overwritten with the local content.
Now I do of course accept this is by design, but how is the problem best addressed?
Does anyone have any advice to offer with managing content of this sort in the AWS EB?
The instances managed by EB have to be considered disposable. This means that they can disappear without notice.
If the changes are dynamic (e.g. files are being uploaded), you cannot store those files in the file system of the instances, as the instances are disposable.
In addition, keep in mind that if you scale to several instances, you will have different instances managing different data sets (e.g. a file is uploaded to only one instance, not to all of them).
There are several approaches you can try, for example:
Use a Network File System (NFS) server: on a separate instance, set up an NFS server, and configure the EB instances to mount a remote mount point at startup. With this approach you can centralize the storage for all your EB instances (see the sketch after this list).
Check out the EFS service from AWS. It's like an NFS server, but Amazon-flavored. I haven't tried it yet, but it looks promising.
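As a rough sketch of the mount step for either approach (the file system ID, region, and mount point are placeholders; on EB you would typically put this in an .ebextensions config so every new instance mounts the share at startup):

    # Create a mount point and mount the shared file system on this instance.
    # fs-12345678 and us-east-1 are placeholders for your own values.
    sudo mkdir -p /mnt/efs
    sudo mount -t nfs4 -o nfsvers=4.1 \
        fs-12345678.efs.us-east-1.amazonaws.com:/ /mnt/efs

You would then point the upload directory (e.g. wp-content/uploads) at the mounted path so every instance sees the same files.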
To resolve these problems I created a couple of extra S3 buckets, the first for images and the second for CSS, JS, and the like.
Since this was a WordPress/WooCommerce installation, I commissioned the S3 Offload plugin from Delicious Brains (https://deliciousbrains.com/wp-offload-s3/), which was a little expensive, but it moved and monitored content of this sort, copied it off to the other S3 buckets, and allowed eb deploy to leave the working content untouched.

Maintaining images when switching servers

I just changed the server one of my sites is hosted on. In doing so, I lost all the images. The CCK file upload fields show "ghost" data, but no longer contain the actual image data they did before the site transfer.
All my data is fine, however.
Is there a way to prevent this so all my images are maintained?
Thanks
You can transfer the files with rsync directly, or use drush to rsync them (rough examples below). You'll need SSH access to the servers to get this to work.
Here's some info on setting up your drush aliases:
http://www.leveltendesign.com/blog/dustin-currie/synchronize-one-drupal-site-to-another
http://drupal.org/project/drush
If you're only performing this task once, you could also use scp to copy the files across:
http://www.go2linux.org/scp-linux-command-line-copy-files-over-ssh
If you're on a shared hosting platform, just FTP the files over old school style.
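Roughly, those options look like this (hostnames, paths, and the @old/@new site aliases are placeholders):

    # Pull the files directory straight from the old server with rsync.
    rsync -avz user@oldserver:/var/www/html/sites/default/files/ \
        sites/default/files/

    # Or, with Drush site aliases configured, sync the files path directly.
    drush rsync @old:%files @new:%files

    # For a one-off copy, scp works too.
    scp -r user@oldserver:/var/www/html/sites/default/files sites/default/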
