Google PageSpeed Insights Shows Inconsistent Results - WordPress

I have a WordPress website running and I am using the W3 Total Cache plugin to make the site load faster. When I scan the site in Google PageSpeed Insights, I notice I am getting inconsistent scan results. I have a Facebook Messenger chat floating on the webpage and a Google map. Since these two gave me the "Reduce the impact of third-party code" warning, I made changes so that they are loaded only after the DOM has loaded completely (I actually used a jQuery setTimeout for this). I did manage to remove the warning from the result by doing this, but now and then I notice the same warning coming back, even though I have made the adjustments. If I scan the site two or three times in quick succession the warning may disappear, but it will be back again once I try after a while.
These are the results of frequent scans. Do you guys have any idea what could be going wrong here? I spent a lot of time searching but couldn't get my head around it.

With the classic HTTP/1.0 Hypertext Transfer Protocol, resources like JavaScript, CSS, HTML, images etc. are loaded in request/response pairs, meaning the browser sends a request for a resource (be it CSS, JavaScript, etc.) and waits for the response to come back before it requests another resource. Even though they are loaded in request/response pairs, those pairs are not always going to follow the same sequence strictly, due to randomness in network latency, server response time, the load the server is currently experiencing, etc.
With HTTP/2 and HTTP/3, the newer versions of the HTTP protocol, instead of waiting for a response to come back before sending another request, requests can be sent all at once. I checked your website and saw that it is using HTTP/2 and HTTP/3. Since requests can be sent all at once, these protocols can contribute to a degree of "inconsistency" as well, among other things. Even with HTTP/1 there is always a degree of randomness, since many factors play into it: the server response time will vary, the network latency will vary, etc.
To illustrate this, if you are using the Chrome browser, open "Developer Tools" by clicking the three dots at the very top right corner of the browser, then "More Tools", then "Developer Tools". Alternatively, press "Ctrl+Shift+I" on Windows or "Command + Option + I" on Mac. Then go to the "Network" tab and refresh the page. Each time you refresh the page, the resources are loaded in a slightly different sequence:
In the image above, using the Google Tag Manager UA-174548329-1 JavaScript as an example (I know it's probably not the Google Map), it is loaded as the 4th resource.
When I refresh the page again, your Google Tag Manager UA-174548329-1 JavaScript is loaded as the 11th resource:
When the page is being loaded, or when you run it through Google's PageSpeed Insights, the main thread is sometimes busy and sometimes not, due to the random nature of the requests and responses. Your main thread is also constructing the DOM and doing a lot of work, and sometimes it gets blocked by render-blocking resources, such as JavaScript.
JavaScript will block the critical rendering path by default. Without looking at your setTimeout code it's hard to say what implementation you are using to delay your JavaScript, but it's safe to assume it probably doesn't help clear the critical rendering path. Instead of using setTimeout, you should use defer or async.
You can look more into the Critical Rendering Path here. The main thread is the main process your browser runs to do most of the work of processing and rendering the CSS, JavaScript and HTML on a page. The critical rendering path is "the sequence of steps the browser goes through to convert the HTML, CSS, and JavaScript into pixels on the screen" (quoted from Critical Rendering Path) - in other words, the sequence in which your JavaScript, HTML, CSS, images and other resources are downloaded and rendered. Optimizing your critical rendering path requires a lot of knowledge and is no easy job. However, there are two attributes you can try in the script tag, namely "async" and "defer", to control when your JavaScript will be executed.
Take a look at this image:
Credit: Growing with the Web
https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/loading-third-party-javascript/?utm_source=lighthouse&utm_medium=unknown
As you can see, you can try putting the async attribute or the defer attribute in your script tag and see if it helps.
With the 'async' attribute in the script tag, your JavaScript will be executed asynchronously as soon as it's downloaded. The blue bar under <script async> in the image shows that the script is downloaded while the HTML is being parsed, since the green bar and the blue bar run in parallel. As soon as the download finishes, the script is executed; at that point the HTML parsing is paused until the script has finished executing. Without the 'async' attribute, your HTML parsing is paused (or blocked) while the script is being downloaded and executed.
With the 'defer' attribute in the script tag, you are deferring the execution of your JavaScript until the DOM has finished parsing. The script will still be downloaded as soon as the browser discovers the resource, but the download won't block the HTML parsing.
In summary, you can use the 'async' attribute on your third-party scripts to 'unblock' your main thread to a certain degree, so that they are downloaded and executed in the background while your DOM is being rendered. This speeds up the main thread a bit, but one caveat is that the execution is still going to be render-blocking. Another very important thing to note is that with the 'async' attribute you should be prepared for some erratic page behaviour: more 'inconsistencies' can appear, because the JavaScript can now execute at any point in the rendering path, and if something needs to happen before or after the script you might break its flow and logic.
Or you can use the 'defer' attribute on your third-party scripts to tell them to execute only after the DOM has been loaded completely. This only speeds things up a little, because the download of the script can now happen in parallel with the HTML parsing (versus the default script tag without defer or async), but the execution still takes an overhead on the main thread.
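For reference, the three variants look like this in markup (the file name is just a placeholder for whatever third-party script you are loading):
<!-- default: HTML parsing pauses while the script downloads and executes -->
<script src="third-party.js"></script>
<!-- async: downloads in parallel with parsing, executes as soon as it arrives (order not guaranteed) -->
<script async src="third-party.js"></script>
<!-- defer: downloads in parallel with parsing, executes only after the document has been parsed, in order -->
<script defer src="third-party.js"></script>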
As per Google's support document, there's a section on "How do you load third-party script efficiently?". Here are a few ways:
"
Load the script using the async or defer attribute to avoid blocking document parsing.
Consider self-hosting the script if the third-party server is slow.
Consider removing the script if it doesn't add clear value to your site.
Consider Resource Hints like <link rel=preconnect> or <link rel=dns-prefetch> to perform a DNS lookup for domains hosting third-party scripts.
"
Other methods:
Look into how to compress, minify or combine various JavaScript files into one file (if you are using JavaScript in the form of files). Use GZIP compression to compress your JavaScript and CSS. Also look into loading third-party scripts via a CDN (Content Delivery Network / Content Distribution Network), among other things.
Updated Aug 12, 2020:
In response to your comment: since you mentioned that your third-party scripts come from plugins and you can't add the 'async' or 'defer' attribute to the script tags yourself, you can consider adding this before your other scripts:
<script>
// If your script tag has an id, use either one below:
document.getElementById("your_script_tag_id").async = true;
document.getElementById("your_script_tag_id").defer = true;
// If your script tag has a class name, use either one below:
document.getElementsByClassName("your_script_tag_class_name")[0].async = true;
document.getElementsByClassName("your_script_tag_class_name")[0].defer = true;
// To apply it to every script tag on the page, loop over them:
var allScripts = document.getElementsByTagName("script");
for (var i = 0; i < allScripts.length; i++) {
    allScripts[i].async = true;   // or: allScripts[i].defer = true;
}
</script>
You can also check out Async JavaScript; this plugin lets you defer or async your JavaScript, including the third-party scripts.

From what I can see you have set the "delay" to 3 seconds on Facebook Messenger chat. However your site takes a lot longer than this to load the initial content.
Your site will often not have loaded the "above the fold" content within 3 seconds due to things like network latency, load on your server etc.
For this reason the Facebook Messenger chat script is getting loaded at a point where the CPU may or may not be busy. For things like "Total Blocking Time" this is important, as that metric listens for the CPU's first quiet period to work out when the page is usable.
For working out "impact of third party code" it is looking at when the CPU is working while trying to render the "above the fold" content, hence why sometimes it shows as an impact and other times it does not as sometimes your above the fold content has loaded sufficiently before the Facebook Messenger is initialised.
Additionally, you have to consider when your main JS file containing the timeout is loaded; sometimes it will be loaded sooner depending on latency etc., so this will affect the time at which the fbDiv is added as well.
There is a lot to cover, so to simplify the answer (as there is an awful lot to explain as to why this happens): either increase the delay on the Facebook Messenger chat or only load it on a button click.
For example, you could have a button that says "chat with us" and then use the click event to load Facebook Messenger (and hide the "chat with us" button). This would be my recommendation.
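A rough sketch of that idea, assuming your chat is the standard Facebook customer chat SDK (the SDK URL and the rest of the markup are placeholders - your WordPress plugin normally outputs its own markup and page configuration, so adapt accordingly):
<button id="chat-with-us">Chat with us</button>
<script>
document.getElementById('chat-with-us').addEventListener('click', function () {
    // Only inject the Messenger SDK once the visitor actually asks for it
    var s = document.createElement('script');
    s.src = 'https://connect.facebook.net/en_US/sdk/xfbml.customerchat.js'; // placeholder: use the URL your plugin loads
    s.async = true;
    document.body.appendChild(s);
    this.style.display = 'none'; // hide the button while the chat widget loads
});
</script>
This way the third-party code never competes with your above-the-fold rendering, and PageSpeed Insights will not trigger it at all during its trace.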
Alternatively, looking at the load speed of your site, you could set the delay to about 7 seconds and it would then (probably) be consistent.

Related

Display dynamic content from embedded web server

I have an embedded device running a slimmed-down version of an HTTP server. Currently, it can display static HTML pages. Here is an example of how it displays a static HTML page:
char *text="HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n"
"<html><body>Hello World!</body></html>";
IPWrite(socket, (uint8*)text, (int)strlen(text));
IPClose(socket);
What I'd like to do is display dynamic content, e.g. a reading from a sensor. What I've thought of so far is to have the page refresh every once in a while with
<meta http-equiv="refresh" content="600">
and use sprintf() to attach the sensor reading to the text variable for the response.
Is there a way I can do this without having to refresh the page constantly?
You can try the following (from my experience) approach:
- Divide static and dynamic content, minimize dynamic content.
- Create a pseudo-CGI interface, i.e. bind the URL your_embedded_site/sensor.cgi to the generation of the following HTTP response:
sprintf(cgi_str, "HTTP/1.0 200 OK\r\nContent-Type: text\r\nContent-Length: %d\r\n\r\nvalue=%02d", 8, sensor_value);
or just (it's all down to your design considerations):
sprintf(cgi_str, "HTTP/1.0 200 OK\r\nContent-Type: text\r\nContent-Length: %d\r\n\r\n%02d", 2, sensor_value);
- Use simple JavaScript or a small Java applet to periodically request your_embedded_site/sensor.cgi (a small sketch follows below). Note that JavaScript is somewhat browser dependent and may be switched off; a Java applet will also require additional static content - some sensor_reader.class - but it has far more freedom in presenting the data and extending the simple read-and-show with more features.
This lets you organize the communication in a very efficient way instead of reloading the full page: part of the code - the user front end - runs in the browser, and the other part - the back end - runs on the embedded device.
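For the JavaScript route, a minimal sketch (assuming the device answers /sensor.cgi with just the reading as plain text, and the page contains an element with id "sensor-value"):
<span id="sensor-value">--</span>
<script>
// Poll the pseudo-CGI endpoint every 5 seconds and update the page in place,
// without reloading the whole page.
function pollSensor() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/sensor.cgi', true);
    xhr.onload = function () {
        if (xhr.status === 200) {
            document.getElementById('sensor-value').textContent = xhr.responseText;
        }
    };
    xhr.send();
}
setInterval(pollSensor, 5000);
pollSensor();
</script>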
Please do not use a Java applet to provide this.
AJAX and client-side JavaScript make this kind of thing easy, without the nastiness of an embedded applet.
Where "nastiness" can include:
Java security issues
Java runtime mismatches
increased payload size (applets are large, compared to snippets of Javascript code)
speed issues (it can be slow to start up the applet)
web page is harder to maintain
and so on.
In summary: it's 2013, just use Javascript and AJAX.

ASP.NET fragment cache -- control is null second time round

I have an .ascx control which works just fine. It is contained in a larger .aspx page. I want to put it in the fragment cache, so I added the appropriate OutputCache directive at the top. However, now the control variable in the underlying .aspx.cs file is null the second time the page loads. I found a few places on the web where it said this would happen, but I didn't find a solution for accessing the control.
What am I missing?
Also, can I control where it is cached? Can I make it cache in the browser cache rather than at the server?
Question #1: Output caching only stores the rendered HTML result on the server. If you want to interact with the user control or run any code in it at all, you cannot use full output caching. You may want to look into lower-level caching, perhaps database or object caching, or embed another user control within this one that uses full output caching itself while the outer user control no longer does.
Question #2: "Can I control where it is cached?" If you use output caching, then no. That always means cache on the server. However, there are lots of different levels of caching. You can only cache a full HTTP response at the browser: a single HTML page, a CSS file, etc. If you want to cache only part of a page at the browser, but have the rest of the page dynamic, you would have to do it with some kind of JavaScript. Either HTML5 local storage, or via AJAX that has appropriate caching headers or responds with a 304 Not Modified response.
Side note: The term "fragment cache" is more often referred to "partial caching" in the ASP.Net world.
SO Tips: These are two questions, and should really be asked as two individual questions in the future.
Also, there are many ways to solve your problems here; if you provided more context to what you are doing and the performance problem you are trying to solve, we could offer more specific answers.

ASP.NET: How to enforce a reload of a static web file

When serving webpages, the client/browser decides whether it updates a file like an image, .css or .js, or whether it takes it from the cache.
In the case of an .aspx page it is the server that decides.
Sure, at the IIS level, or also using some HttpModule techniques, I can change the response headers to tell the client whether and for how long a file should be cached.
Now, I do have a website where the .aspx goes hand-in-hand with a corresponding .js. So, perhaps I have some jQuery code in the .js which accesses an element in the .aspx. If I remove that element from the .aspx I would also adapt the .js. If the user goes to my page he will get the new .aspx but he might still get the old .js, leading to funny effects.
My site uses lots of scripts and lots of images. For performance reasons I configured in the IIS that those files "never" expire.
Now, from time to time a file DOES change and I want to make sure that users get the updated files.
In the beginning I helped myself by renaming the files. So, I had SkriptV1.js and SkriptV2.js and so on. That's about the worst option since the repository history is broken and I need to adapt both the references and the file name.
Now, I improved here and change only the references by using Skript.js?v=1 or Skript.js?v=2.
That forces the client to refresh the files. It works fine, but still I have to adapt the references.
Now, there is a further improvement here like this:
<script type='text/javascript' src='../scripts/<%# GetScriptLastModified("MyScript.js") %>'></script>
So, the "GetScriptLastModified" will append the ?v= parameter like this:
protected string GetScriptLastModified(string FileName)
{
    string File4Info = System.Threading.Thread.GetDomain().BaseDirectory + @"scripts\" + FileName;
    System.IO.FileInfo fileInfo = new System.IO.FileInfo(File4Info);
    return FileName + "?v=" + fileInfo.LastWriteTime.GetHashCode().ToString();
}
So, the rendered .js-Link would look like this to the client:
<script type='text/javascript' src='/scripts/GamesCharts.js?v=1377815076'></script>
The link will change every time I upload a new version, so I can be sure that the user immediately gets the new script or image when I change it.
Now, two questions:
a) Is there a more elegant way to achieve this?
b) If not: Has someone a guess how big the performance overhead on the server would be? There can be easily 50 versioned elements on one page, so for one .aspx the GetScriptLastModified would be invoked 50 times.
Looking forward to a discussion :)
There are a few different answers to this question.
First of all, if you have files which only change every once in a while, set the Expires and Cache-Control headers to expire in one year. Only if the files truly never change should you say that they never expire. You're seeing the issues with saying "never expire" right now.
Also, if you are having performance issues on your site from serving up lots of images and JavaScript, the commonly accepted solution is to use a CDN (Content Delivery Network). There are many different providers and I'm sure that you can find one that meets your budget. You'll also save money in the long run as the CDN will offload a great deal of I/O and CPU time from IIS. It's astounding how big of a difference it can make.
Lastly, one way to make sure that users get the latest version of files which almost never change is to implement some sort of versioning scheme in your asset URLs for cache busting. There are many different ways to do this, but one (very naive) way is to have a version number that increases every time you deploy your site.
E.g. all your asset URLs will look like /static/123/img/dog_and_pony.jpg
Then, the next time you deploy your site, you increase the version number so that it's "124". You need some way to keep track of the version, dynamically inject it into asset URLs, and make sure the version number changes every time you deploy. The idea is that anything referencing an asset automatically picks up the new version number.
In terms of performance, it's an admirable goal to never need the user to refresh or have to download the same thing twice. But sometimes it's just a lot less hassle, and if users are only refreshing everything periodically, that's probably okay for most websites.
Hope this helps.

Openfire fastpath chat causing site to load slowly

We are adding openfire fastpath chat to our site. It will determine and indicate when live chat is available or not and display an appropriate image to indicate the current status and links for each state.
The JavaScript call hits a function that is on another box, and this function uses document.write to output the HTML to the page. I know there is a delay because it is making the request to another server and waiting for a result to be returned. The pause here is about half a second, but it causes the rest of the page load to be held up.
Has anyone experienced a similar issue, or can anyone offer any tips for getting this to load without holding up the rest of the page? I tried putting this into an ASPX AJAX panel, but that seemed to cause other issues.
Using an iframe is the only thing I could find that seemed to work.

How to speed up Google adsense and analytics loading time?

This might fall under the category of "you can't", but I thought it might be prudent to at least see if there is something I can do about this.
According to Firebug, the major bottleneck in my page load times seems to be a gap between the loading of the HTML and the loading of Google AdSense and Analytics. Notice in the screenshot below that the initial GET only takes 214 ms, and that AdSense + Analytics loading takes roughly 130 ms combined. However, the entire page load time is 1.12 seconds due to that large pause between the initial GET and the AdSense/Analytics loading.
If it makes any difference at all, the site is running off of the ASP.NET MVC RC1 stack.
(Screenshot: http://kevinwilliampang.com/pics/firebug.jpg)
Update: After removing AdSense and Analytics, I'm still seeing a slow response time. Hovering over the initial GET request, I see the following timings: 96 ms Receiving Data, 736 ms DOMContentLoaded (event), 778 ms 'load' (event). I'm guessing then that the performance is a result of my own jQuery JavaScript that has processing tied to the $(document).ready() event?
You should place your analytics code at the bottom of the page so that everything else loads first. Other than that, I don't think there's much you can do.
edit: Actually, I just found this interesting blog post on a way to speed up analytics by hosting your own urchin.js file. Maybe it's worth a look.
I've never seen anything like that using Firebug on Stack Overflow and we use Analytics as well.
I just ran a trace and I see the request for
http://www.google-analytics.com/__utm.gif?...
happening directly after the DOMContentLoaded event (the blue line). So I'd suspect the AdSense first. Have you tried disabling that?
As it happens, I have researched this rather heavily just this week. Long story short, you are screwed. As others have said, the best you can do is put it at the bottom of the list of requests and make the rest of your code depend on ready rather than onload events - jQuery is really good here. Some portion of the JS is static, so you could clone that locally if you keep an eye on it for maintenance purposes.
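As a rough sketch of the "depend on ready rather than onload" idea, assuming a reasonably recent jQuery is already on the page:
<script>
// Wire up lightweight behaviour as soon as the DOM is ready...
$(document).ready(function () {
    // cheap DOM wiring only
});
// ...and push heavy or third-party work behind the window load event,
// optionally wrapped in a timeout so it never competes with rendering.
$(window).on('load', function () {
    setTimeout(function () {
        // kick off analytics/ads or other expensive work here
    }, 0);
});
</script>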
The Google code isn't quite as helpful as it could be in this area*, but it's their ballgame and anything you do to change it is going to be both complex and risky. In theory, wrapping it in a non-blocking script call in the header is possible, but it would be unlikely to gain you a benefit given the additional abstraction, and ultimately with AdSense your payload is an HTML source, not a script.
* it's possible google have a good reason, but nothing I can deduce from the code they expose
Probably not anything you can do aside from putting those includes right before the closing body tag, if you haven't already. JavaScript includes block parallel HTTP requests, which is why they should be kept out of <head>.
Surely Google's servers will be the fastest part of the loading, given that your ISP and most ISPs will have a local cache of the files too?
You could inject the script into the head on page load perhaps, but I'm not sure how that affects urchin.js.
Could it be that your page simply takes that long to parse? It seems nothing network-related is happening; it simply waits around a second before the AdSense/Analytics requests are even fired off.
I don't suppose you have a few hundred tables on the page or something? ;)
