What can WooCommerce site owners do to optimize their sites? These are owners for whom I developed an online store.
After some time in operation, the site grows in disk size; I assume this is due to growth in the MySQL database.
Is there a plugin the customers can use to optimize their site without knowing anything about databases or the WordPress tech stack?
There are a lot of guides on how to optimize WordPress websites; here is one. The most basic thing everyone should do is use a caching plugin and an image optimization plugin.
But for WooCommerce-specific actions, the one thing you can do is install the custom orders table plugin, which moves WooCommerce order data into its own database tables. It does require wp-cli, though, so a person would have to know how to SSH in and run the command. That said, it is easy to automate.
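Since the automation is the easy part, here is a rough sketch of what it could look like. The host, path, and plugin slug are placeholders, and the `wp wc cot migrate` subcommand in particular is an assumption — check the plugin's documentation for the exact wp-cli command it ships:

```shell
# Build the remote command rather than running it directly, so it can be reviewed first.
SITE_HOST="user@example.com"   # placeholder
WP_PATH="/var/www/html"        # placeholder
CMD="cd $WP_PATH && wp plugin install woocommerce-custom-orders-table --activate && wp wc cot migrate"
echo "would run: ssh $SITE_HOST \"$CMD\""
# Once reviewed, run it for real:
# ssh "$SITE_HOST" "$CMD"
```

`wp plugin install --activate` is standard wp-cli; only the migration subcommand varies by plugin version.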
As for bigger disk sizes: if a WordPress database is more than a few GB, something is terribly wrong. Normally it's only a few hundred MB. So no, large websites don't happen because the database is growing; they come from what gets stored in the wp-content folder. WordPress stores multiple copies of each image at different resolutions, and this, along with having lots of plugins and themes installed, can increase the size of the site, but it still won't get ridiculously big.

Once you add a backup plugin on top, though, the size of the website gets multiplied. A good backup plugin should default to not keeping too many local backups. The most serious issues I've seen happen when there are multiple backup plugins and each starts including the other plugin's backups in its own. You then get exponentially bigger backups, and the size of the website quickly gets out of control.
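If you want to see where the space is actually going, a couple of stock shell commands are enough. This is a sketch; `WP_ROOT` is a placeholder for the site's document root:

```shell
WP_ROOT="${WP_ROOT:-/var/www/html}"

# Size of each top-level wp-content folder -- uploads is usually the biggest
du -sh "$WP_ROOT"/wp-content/*/ 2>/dev/null | sort -rh

# Count the extra resized copies WordPress keeps per image (name-123x456.jpg style)
find "$WP_ROOT/wp-content/uploads" \
  -regex '.*-[0-9]+x[0-9]+\.\(jpg\|jpeg\|png\|webp\)' 2>/dev/null | wc -l
```

If the `du` output shows a backups folder rivaling uploads, you've found the runaway-backup problem described above.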
So that's what makes websites big. Site owners will inevitably add more plugins and images over time, increasing the size of the site; even worse, all of the additional plugins will slow the website down.
You also need to make sure your clients are using a good web host. The guide I shared has some criteria you can use to judge a hosting company. The one thing many hosting companies lack is some sort of in-memory cache: look for hosts that use Varnish, LiteSpeed, or nginx's proxy_cache to serve pages before they hit WordPress.
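A quick way to check whether such a cache is actually answering is to look at the response headers. The URL is a placeholder, and the header names vary by stack, so treat the list as examples rather than a complete set:

```shell
check_cache_headers() {
  grep -iE 'x-cache|x-varnish|x-litespeed-cache|cf-cache-status'
}
# HIT/MISS style headers mean a proxy cache sits in front of WordPress
curl -sI https://example.com/ | check_cache_headers || echo "no cache headers seen"
```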
Related
I have a classified website, pkwhistle.com, that is leading in multiple countries and has a huge collection of image media. Is there any way to automatically store newly uploaded listing images outside WordPress and fetch them back to my site? clasificadospr.com is the best example of my idea, because that website uses the service I am actually asking about: the "thumbor" service. Please help me with this so I can increase the speed of my website. More than ten thousand images on a website can kill its speed.
Well, it's called hosting/loading your images from a CDN, and there are many providers that work nicely with WordPress!
With 10,000 images you'll mostly end up with a premium solution such as WP Offload Media from Delicious Brains (highly recommended, and I am not in any way affiliated with them, I just love their products). They also have a free version.
You can hook it up with all the big asset storage providers (DigitalOcean Spaces, Amazon's AWS).
And the integration with WP is great: it syncs between the CDN and your WordPress media library.
Alternatively, there are some free options. You can use Photon from WordPress.com, which does almost the same thing but is hosted on Photon's servers; it comes with the Jetpack plugin.
Another free option is Cloudinary (they have a plugin as well), but its free plan is limited.
Good luck!
I have one WordPress website on which I gave other users access to create their own sites, but each time a user creates a site, 11 tables are added to the database. I have almost 10.5 million users, so when they all create their sites the main database will hold around 120 million tables, and because of this our main website has gone down.
So please suggest how to overcome this problem. Kindly respond as soon as possible.
Thanks,
Rajesh Mishra
CIET
I run a multisite with around 20k sites at the moment and add around 7k each year. We deal with the scale in part by using Multi-DB, but there are a few others, like ShardDB, that let you split things up. That helps considerably.
I have seen other people use subdomains and chunk their WP multisite installs into more manageable (but separate) installs on an X sites per install per server basis.
I have not pursued the route of having WP generate fewer tables.
Depending on the level of user activity in the dashboard and with authoring content, you'll have a really large load on the server. WPMU DEV has at least an introduction to the large-scale (although still only 500k sites) multisite conversation.
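For the numbers in the question above, the arithmetic alone shows why a single database can't work (11 tables per site is the figure given in the question; stock multisite creates roughly that many per site):

```shell
USERS=10500000        # figure from the question
TABLES_PER_SITE=11    # figure from the question
echo $((USERS * TABLES_PER_SITE))   # 115500000 -- the ~120 million tables quoted above
```

No MySQL server copes with a hundred million tables in one schema, which is why the sharding plugins above split sites across many databases.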
To give you a little background, I have a website with WordPress as my content management system, which revolves around users uploading panoramic photos. The site is hosted on a small Amazon EC2 instance. After encountering a few days of noticeably slow speeds, I decided to address the issue. In following the suggestions of several speed-diagnostic sites (i.e., enabling browser caching, gzip compression, and keep-alive), I was able to increase my scores substantially and speed up basic site usage. Unfortunately, the site remains incredibly slow when uploading files, as panoramic photos tend to be large. When a user uploads a file, a new post is created with a resized version of the panoramic image, and once complete, the user is redirected to the new URL. Does anyone have any suggestions to expedite this process? Are there any options besides upgrading my server?
The following plugin does exactly that:
Dynamic Image Resizer
It changes the way WordPress creates images so that they are generated on the fly, only when they are actually used somewhere. Images created this way are saved in the normal upload directories for fast serving by the webserver later. The result is that space is saved (images are only created when needed) and uploading images is much faster (since the resized copies are no longer generated on upload).
The author is a WordPress core developer and knows the WP code inside out.
I've had to migrate many WordPress sites from domains on one server to different domains on different servers. In a few cases, a simple export was sufficient. In many cases, the import failed to load the media correctly and I was forced to use a common workaround.
Workaround (for those wondering):
Via FTP, I download the uploads folder, where my media is stored, from the original site and upload it to the new site. Once this transfer is complete, I use the Add From Server plugin to select each individual image, one directory at a time.
This is the best workaround I've found, but it's hardly efficient: it's incredibly time-consuming and hard on your bandwidth.
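If you have SSH access to both hosts, the same file transfer can be done far more efficiently with rsync (a sketch; host and paths are placeholders). Because rsync only copies files that changed, interrupted or repeated runs are cheap compared with a full FTP download/upload:

```shell
# Pull the uploads folder from the old host onto the new one (run on the new host)
rsync -az --progress \
  "olduser@old-host.example:/var/www/html/wp-content/uploads/" \
  "/var/www/html/wp-content/uploads/" ||
  echo "rsync failed -- check the SSH credentials and paths"
```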
If you have any better suggestions, I'm all ears. But primarily, I want to know the "why" behind this question: what causes WordPress to have such a hard time migrating media, when migrating posts, pages, and users is much less of a headache?
There is an excellent tool that certainly eases WordPress migration: the WordPress (and others) Search and Replace Tool. With it, it's easy to search the entire database for all occurrences of the old domain and replace them with the new one. After the replacement, all the pictures and widgets should work properly.
The way I'm moving WordPress:
export and import the database with phpMyAdmin
transfer the files with an FTP program like FileZilla
edit the wp-config.php settings for the new domain
run a search and replace on the database with the InterconnectIT Search and Replace Tool
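The reason a serialization-aware tool matters: WordPress stores some options and metadata as PHP-serialized strings with embedded byte lengths, so a naive replace corrupts them. A small illustration (the URLs are placeholders; `wp search-replace` is a wp-cli command that also handles serialized data):

```shell
# A serialized WordPress option embeds the string length (s:18 = 18 bytes)
VALUE='s:18:"http://old.example";'
# A naive replace changes the URL but not the length prefix...
BROKEN=$(echo "$VALUE" | sed 's|http://old.example|https://new.example|')
echo "$BROKEN"   # the new URL is 19 bytes, but s:18 still says 18 -- PHP will reject it
# ...which is why you reach for a serialization-aware tool instead, e.g.:
#   wp search-replace 'http://old.example' 'https://new.example' --dry-run
```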
Well, with an entirely file-based CMS you can easily put the whole directory into a version control system to record any changes to the site. Synchronization with the server would also be trivial, because it would only involve uploading the files via FTP.
With these benefits in mind, I am a little puzzled about the popularity of databases as the only storage mode, even when the CMS in question is meant to be used by amateurs for small websites.
What does your versioning and synchronization workflow look like?
What kind of simplified versioning/synchronization workflow would you suggest for a casual, non-technical WordPress user, to give them the benefit of working locally and encourage them to keep a backup of their site?
Most CMSs nowadays tend to have some backup solution in place to help you. Since WordPress is a CMS for the masses that also caters to the non-technical population, you're sure to find a plugin that can help with this. I know its built-in backup solution just exports posts etc. to XML, but even this does a pretty decent job of restoring over a clean WordPress installation and working fine.
But I found this plugin (which works for WordPress and Joomla) by asking Google, and it is most probably the answer to your question: XCloner.
Also, in terms of workflow, specifically for WordPress: don't give the user admin privileges, but editor or contributor instead, so they can still edit content but not make changes that could mess up the CMS itself. Maybe XCloner can do some kind of recurring backup as well. Otherwise, I suggest you move to a LAMP-stack hosting environment where you can at least set up cron jobs to back up your database and files regularly. Most hosting companies do this in any case at no cost.
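The cron-based approach could be sketched like this. Paths, database name, and schedule are placeholders, and the `mysqldump` line assumes credentials are already configured (e.g. in ~/.my.cnf):

```shell
#!/bin/sh
# Nightly backup sketch: archive wp-content, dump the database, and keep
# only the last 7 copies so backups don't eat the disk.
WP_ROOT="${WP_ROOT:-/var/www/html}"
BACKUP_DIR="${BACKUP_DIR:-$HOME/backups}"
STAMP=$(date +%Y%m%d)
mkdir -p "$BACKUP_DIR"
tar -czf "$BACKUP_DIR/wp-content-$STAMP.tar.gz" -C "$WP_ROOT" wp-content 2>/dev/null ||
  echo "nothing to archive at $WP_ROOT"
# mysqldump wordpress | gzip > "$BACKUP_DIR/db-$STAMP.sql.gz"   # db name is a placeholder
ls -1t "$BACKUP_DIR"/wp-content-*.tar.gz 2>/dev/null | tail -n +8 | xargs -r rm -f
# In crontab, run nightly at 03:00:  0 3 * * * /usr/local/bin/wp-backup.sh
```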
WordPress also keeps revisions of all posts and pages, so if a user doesn't like an update they've made, the full revision history is available. If you aren't seeing this option, check Screen Options at the top to make sure Revisions is ticked. A nice built-in feature.
Depending on the host, you can also schedule database/file backups through cPanel, in addition to scheduled database-backup plugins in WordPress. Some will save remotely or even email the database out.
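A minimal local versioning workflow of the kind the question asks about could look like the following sketch, assuming the site's files have been synced to a local folder first (names here are placeholders):

```shell
# One-time setup in a local working copy of the site
mkdir -p my-site && cd my-site
git init -q .
printf 'wp-config.php\n' > .gitignore            # keep credentials out of history
echo '<?php // theme functions' > functions.php  # stand-in for the site's files
git add -A
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Initial snapshot of the site"
git log --oneline   # one line per recorded snapshot of the site
```

From there, each edit-review-commit cycle gives a restorable history, and the upload-via-FTP step stays exactly as it was; pushing to a free remote adds an off-site copy.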