Why is the waiting time of the website so high? - WordPress

I have a WordPress website and its waiting time is too high. I have already optimized it with Autoptimize and added the gzip compression code to the .htaccess file.
Please help me out: is this an issue on my end or is it a server issue?
I am placing the link to a recent test done with GTmetrix for my website:
https://gtmetrix.com/reports/unicommerce.com/vein49uQ
Please help me out with suggestions I can apply to improve the speed and the scores.

Your server sends the data with an outdated HTTP protocol and is generally quite slow.
HTTP/1.1 200 OK
Try to implement the HTTP/2 protocol on your server, because it can multiplex many requests over a single connection. Then combining CSS files and scripts, moving them to the footer, and so on becomes largely obsolete.
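For example, if the site runs on Apache 2.4.17 or newer, a minimal sketch of enabling HTTP/2 looks like the lines below; note these go in the main or virtual-host configuration (not .htaccess), and on shared hosting the provider usually has to enable it for you:

# Load the HTTP/2 module (Apache 2.4.17+); the module path can differ per distribution
LoadModule http2_module modules/mod_http2.so
# Prefer HTTP/2: h2 over TLS, h2c over cleartext, falling back to HTTP/1.1
Protocols h2 h2c http/1.1

On Nginx, HTTP/2 is typically enabled via the listen directive of the server block.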
What kind of hosting do you have?
And there are some typical problems to eliminate, like render-blocking resources and so on:
https://developers.google.com/speed/pagespeed/insights/?hl=en&url=https%3A%2F%2Funicommerce.com
Regards Tom

You need to compress JS, CSS and HTML files using the .htaccess file.
Combine your CSS into one file so that many separate stylesheets do not load.
Block unnecessary JS and CSS files from loading.
Add a browser-cache plugin.
Use the EWWW plugin for image compression.
A sample .htaccess sketch is shown below; also visit https://samaxes.com/2008/04/htaccess-gzip-and-cache-your-site-for-faster-loading-and-bandwidth-saving/
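As a rough sketch (assuming Apache with mod_deflate and mod_expires available), the compression and browser-caching part of the .htaccess could look like this:

# Compress text-based responses (requires mod_deflate)
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json image/svg+xml
</IfModule>
# Let browsers cache static assets for a month (requires mod_expires)
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
</IfModule>

The linked article covers the same idea in more detail.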

The first thing you can check is which file is taking the longest to load, then apply the defer or async attribute to that script tag so your loading improves.

Related

gzip compression through CloudFront blocks loading through an iframe

I recently enabled website compression through CloudFront so that it compresses all compressible files (HTML, JS, CSS).
It seemed to work fine in every browser I tried, but recorded browser tests failed over and over (both in Datadog and in Ghost Inspector).
This is the error I get: "Your website does not support being loaded through an iframe".
Has anyone come across something similar to this?
I would appreciate any help! :)

Is it feasible to use SSI (combine CSS/JS files using .htaccess) to speed up a website?

I went through this article today:
http://isitvivid.com/blog/combining-your-cssjs-files-with-htaccess
and found that we can use the .htaccess file to combine CSS and JS files; this technique is called Server Side Includes (SSI).
I am already using gzip compression, the Keep-Alive feature and caching for my website. Will using SSI to combine CSS/JS help speed up the website?
I'm sure you could speed up your site a little bit by combining assets, but unless you're loading hundreds of things synchronously you shouldn't expect more than a micro-optimization. You're more likely to get speed benefits by using the defer attribute on your script tags and minifying your CSS files.
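For reference, a minimal sketch of the .htaccess side of the SSI approach that article describes (standard Apache directives, but your host must allow the Options override):

# Enable server-side includes and run CSS/JS files through the INCLUDES filter
Options +Includes
AddOutputFilter INCLUDES .css .js

The combined file then just contains one <!--#include virtual="/css/base.css" --> line per asset. Keep in mind that parsing every CSS/JS request through SSI adds server work, which is part of why the gain tends to be small.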

PageSpeed Module - build a resources server

I use Nginx and I have installed the Google PageSpeed Module on one of my domains. This module is really useful and easy to use. All CSS and JS are minified, my images are compressed... it has reduced the weight of my pages by 500 KB.
My question is: can I use this module to deliver only resources? I want to create a kind of CDN containing all my CSS, images, JS... But with Nginx + the PageSpeed module installed, the module does not work for a single image on its own, for example. It works with an HTML page and compresses the images in that page, but can it work with an image accessed directly? Thanks.
Yes, you can use InPlaceResourceOptimization to optimize images even when they are not being rewritten as part of an HTML page (note: that doc says this is an Apache-only feature, but that's out of date; it works in the latest Nginx as well). Add this directive to your config:
pagespeed InPlaceResourceOptimization on;
Note that the default way that ngx_pagespeed works is by rewriting resources found in HTML. That is the most efficient way to run it. If you only use InPlaceResourceOptimization you will not get some advantages like cache extension and image resizing. However this is a convenient feature if you cannot optimize resources in HTML.

Caching JavaScript and CSS files in the browser

I am developing a web app and I want to ensure that the client browser caches the static JS and CSS files and updates them only when the files are modified. So if files are modified one month later, no requests for JS and CSS files would be made for a month. If files are modified within hours, the new files will be requested and delivered. I am wondering if it is possible to get the browser to first ask whether files have been modified - or is there any other way?
You are certainly looking for something like App Cache.
Please check HTML5 Rocks and Mozilla Docs
It's also good to set an expiry date on the HTTP response that caches these files, so that old versions are not kept indefinitely. This is the general use case for a highly scalable, modern full-stack JavaScript application.
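If you want the behaviour you describe with plain HTTP caching instead, here is a minimal .htaccess sketch (assuming Apache with mod_headers) that makes the browser ask the server whether a JS/CSS file changed and receive a small 304 Not Modified response if it has not:

# Cache JS/CSS but revalidate with a conditional request on each use (requires mod_headers)
<IfModule mod_headers.c>
<FilesMatch "\.(js|css)$">
Header set Cache-Control "no-cache"
</FilesMatch>
</IfModule>
# Build ETags from modification time and size so revalidation can detect changes
FileETag MTime Size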
Hope that answers your query.

JavaScript script combining and caching

I am building an AJAX-intensive web application (using ASP.NET, jQuery, and WCF web services) and am looking into building an HTTP handler that handles script combining and compression for my JavaScript and CSS files. I know not combining the scripts is generally a less preferred approach, and I'm sure it's probably the way I will end up going, but my question is this...
Since so many of my JS files come from the controls I use, don't they get cached by the browser after the first load anyway? Since so many of these controls appear on many pages of my web application, is it actually faster to combine all of my scripts and serve that one file (which will vary for every page), or to serve the individual files, which will get cached? I guess what I'm getting at is: by enabling script combining, am I losing part of the browser's caching ability? I know I can cache the combined script, but the combined script will be different for every page, whereas with the individual control scripts each one will be cached and the number of new scripts will be minimal for each page call.
Does this make any sense? Thoughts?
The fewer JS files you serve, the faster your pages will be, due to a smaller number of round trips to the server. I would manually put all the common JS code into one file (or as few files as possible), all the CSS code into one file, etc., and not worry about using a handler to combine the files. The handler is going to take processing time to combine the files, so you pay that penalty as well. You can turn on gzip compression in IIS and have it handle that for you. I would run something like YUI Compressor on the JavaScript files used in production.
If the handler changes the file contents from page to page, browsers won't be able to cache it. If you are using SSL this point is moot anyway, as the browser won't cache the files.
EDIT
I've been corrected: some browsers (like Firefox) can cache SSL content, but not all.
As others mentioned: minify, gzip and turn on caching (set an expiry time and make sure you support ETags) for the one static JS file and the one static CSS file you have. On top of this, it is recommended to load your CSS files as early as possible and your JS files as late as possible (JS file loading blocks other downloads, and the browser can render the page faster the sooner it gets the CSS). Sprites also help if you have many small images/icons. Loading static content from subdomains helps the browser make more simultaneous downloads, and you could drop all your cookies for those subdomains to lower the HTTP header size.
You could consult YSlow for performance analysis; it's a great tool!
"generally a less preferred"
Is that for live JS? I'm thinking of JavaScriptMVC, which compresses all the code into one file when compiled for production, not development... It's heavyweight, I believe.
Usually it's better to combine all the scripts; this way you reduce HTTP overhead. Minified control scripts are usually quite small. In the rare case where you are using a quite large control, you could leave it out of the main combined JS file.
