This might fall under the category of "you can't", but I thought it might be prudent to at least see if there is something I can do about this.
According to Firebug, the major bottleneck in my page load time seems to be a gap between the loading of the HTML and the loading of Google AdSense and Analytics. Notice in the screenshot below that the initial GET only takes 214 ms, and that AdSense + Analytics loading takes roughly 130 ms combined. However, the entire page load time is 1.12 seconds, due to that large pause between the initial GET and the AdSense/Analytics loading.
If it makes any difference at all, the site is running off of the ASP.NET MVC RC1 stack.
(Screenshot: http://kevinwilliampang.com/pics/firebug.jpg)
Update: After removing AdSense and Analytics, I'm still seeing a slow response time. Hovering over the initial GET request, I see the following timings: 96 ms Receiving Data, 736 ms DOMContentLoaded (event), 778 ms 'load' (event). I'm guessing, then, that the slowdown comes from my own jQuery code, which has processing tied to the $(document).ready() event?
You should place your analytics code at the bottom of the page so that everything else loads first. Other than that, I don't think there's much you can do.
edit: Actually, I just found this interesting blog post on a way to speed up analytics by hosting your own urchin.js file. Maybe it's worth a look.
I've never seen anything like that using Firebug on Stack Overflow and we use Analytics as well.
I just ran a trace and I see the request for http://www.google-analytics.com/__utm.gif?... happening directly after the DOMContentLoaded event (the blue line). So I'd suspect AdSense first. Have you tried disabling it?
As it goes, I happen to have rather heavily researched this just this week. Long story short, you are screwed. As others have said the best you can do is put it at the bottom of the list of requests and make the rest of your code depend on ready rather than onload events - jQuery is really good here. Some portion of the js is static, so you could clone that locally if you keep an eye on it for maintenance purposes.
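For what it's worth, a minimal sketch of that ready-vs-load split (initNavigation and startCarousel are hypothetical placeholders for your own functions):
<script>
// Fires as soon as the DOM is parsed, without waiting for images,
// iframes, or third-party scripts such as AdSense to finish.
$(document).ready(function () {
    initNavigation(); // hypothetical: critical page setup
});
// Fires only after every resource on the page has loaded, so it
// can run much later; keep only non-critical work here.
$(window).on("load", function () {
    startCarousel(); // hypothetical: nice-to-have extras
});
</script>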
The Google code isn't quite as helpful as it could be in this area*, but it's their ballgame, and anything you do to change it is going to be both complex and risky. In theory, wrapping it in a non-blocking script call in the header is possible, but that would be unlikely to gain you much given the additional abstraction, and ultimately with AdSense your payload is HTML source, not script.
* it's possible google have a good reason, but nothing I can deduce from the code they expose
There's probably not much you can do aside from putting those includes right before the closing body tag, if you haven't already. JavaScript includes block parallel HTTP requests, which is why they should be kept out of <head>.
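A rough sketch of that layout (file names are just placeholders):
<html>
<head>
    <title>Page</title>
    <link rel="stylesheet" href="site.css"> <!-- CSS can stay in head -->
</head>
<body>
    <!-- ... page content renders before any script downloads ... -->
    <script src="jquery.js"></script> <!-- dependencies first -->
    <script src="site.js"></script>   <!-- your code last, just before </body> -->
</body>
</html>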
Surely Google's servers will be the fastest part of the loading, given that your ISP and most ISPs will have a local cache of the files too?
You could perhaps inject the script into the head on page load, but I'm not sure how that affects urchin.js.
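Something like this is what I mean, assuming the legacy urchin.js API (_uacct / urchinTracker) is what you're loading; the account ID is a placeholder:
<script>
window.onload = function () {
    // Create the script element only after the page has loaded,
    // so the download can't block the initial render.
    var s = document.createElement("script");
    s.src = "http://www.google-analytics.com/urchin.js";
    s.onload = function () {
        window._uacct = "UA-XXXXX-X"; // placeholder account ID
        urchinTracker();              // legacy urchin.js tracking call
    };
    document.getElementsByTagName("head")[0].appendChild(s);
};
</script>
One caveat: older IE versions don't fire onload on script elements reliably, which is part of why this approach is fiddly.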
Could be that your page simply takes that long to parse? It seems nothing network-related is happening. It simply waits around a second before the adsense/analytics requests are even fired off.
I don't suppose you have a few hundred tables on the page or something? ;)
I have a WordPress website running and I am using the W3 Total Cache plugin to make the site load faster. When I scan the site in Google PageSpeed Insights, I notice I am getting inconsistent results. I have a Facebook Messenger chat widget floating on the page and a Google Map. Since these two gave me the "Reduce the impact of third-party code" warning, I made changes so that they are loaded only after the DOM has loaded completely; I used a jQuery setTimeout for this. That did remove the warning from the result, but now and then I notice the same warning coming back, even though I made the adjustments. If I scan the site two or three times in quick succession the warning may go away, but it will be back once I try again after a while.
These are the results of several scans. Do you have any idea what could be going wrong here? I have spent a lot of time searching but couldn't get my head around it.
With the classic HTTP/1.0 protocol, resources like JavaScript, CSS, HTML, and images are loaded in request/response pairs: the browser sends a request for a resource (CSS, JavaScript, etc.) and waits for the response to come back before requesting another one. Even so, the request/response pairs will not always follow the same sequence strictly, due to randomness in network latency, server response time, the load the server is currently experiencing, and so on.
With HTTP/2 and HTTP/3, the newer versions of the protocol, requests can be sent all at once instead of waiting for a response before sending the next one. I checked your website and saw that it is using HTTP/2 and HTTP/3. Since requests can be sent all at once, this can contribute a degree of "inconsistency" as well, among other things. Even with HTTP/1.x there is always some randomness, because many factors play into it: the server response time will differ, the network latency will differ, and so on.
To illustrate this, if you are using the Chrome browser, open Developer Tools by clicking the three dots in the very top right corner of the browser, then "More Tools", then "Developer Tools". Alternatively, press Ctrl+Shift+I on Windows or Command+Option+I on Mac. Then go to the "Network" tab and refresh the page. Each time you refresh, the resources are loaded in a slightly different sequence:
In the image above, using the Google Tag Manager UA-174548329-1 JavaScript as an example (I know it's probably not the Google Map), it is loaded as the 4th resource.
When I refresh the page again, your Google Tag Manager UA-174548329-1 JavaScript is loaded as the 11th resource:
When the page is being loaded, or when you run it through Google's PageSpeed Insights, the main thread is sometimes busy and sometimes not, due to the randomness of the requests and responses. Your main thread is also constructing the DOM and doing a lot of work. Sometimes it gets blocked by render-blocking resources, such as JavaScript.
JavaScript is always going to block the critical rendering path by default. Without seeing your setTimeout code it's hard to say exactly what implementation you are using to delay your JavaScript, but it's safe to assume it probably doesn't help with clearing the critical rendering path. Instead of setTimeout, you should use defer or async.
You can look more into the critical rendering path here. The main thread is the main process your browser runs to do most of the work of processing and rendering the CSS, JavaScript, and HTML on a page. The critical rendering path is "the sequence of steps the browser goes through to convert the HTML, CSS, and JavaScript into pixels on the screen" (quoted from Critical Rendering Path): the sequence of your JavaScript, HTML, CSS, images, and other resources being downloaded and rendered. Optimizing the critical rendering path takes a lot of knowledge and is no easy job. However, there are two attributes you can try in the script tag, "async" and "defer", to control when your JavaScript is executed.
Take a look at this image:
Credit: Growing with the Web
https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/loading-third-party-javascript/?utm_source=lighthouse&utm_medium=unknown
As you can see, you can try putting the async or defer attribute on your script tag and see if it helps.
With the 'async' attribute in the script tag, your JavaScript is executed asynchronously as soon as it's downloaded. The blue bar under <script async> in the image shows the script being downloaded while the HTML is still being parsed: the green and blue bars run in parallel. As soon as the download finishes, the script executes, and HTML parsing pauses until it is done. Without 'async', HTML parsing is paused (blocked) for both the download and the execution of the script.
With the 'defer' attribute in the script tag, you are deferring the execution of your JavaScript until the DOM has finished parsing. The script is still downloaded as soon as the browser discovers it, but the download won't block HTML parsing.
In summary, you can use the 'async' attribute on your third-party scripts to 'unblock' your main thread to a certain degree: they are downloaded in the background while your DOM is being rendered, which speeds up the main thread a bit. One caveat is that the execution is still render-blocking. Another important thing to note is that with 'async' you should be prepared for some erratic page behavior: more 'inconsistencies' can appear, because the JavaScript can now execute at any point in the rendering path, so if something needs to happen before or after the script, you might break that flow and logic.
Or you can use the 'defer' attribute on your third-party scripts to have them execute only after the DOM has been parsed completely. This speeds things up only a little, because the download can now happen in parallel with HTML parsing (versus a default script tag with neither attribute), but the execution still takes its toll on the main thread.
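Concretely, the three variants look like this (the src URL is just a placeholder):
<script src="third-party.js"></script>
<!-- default: parsing stops while the script downloads and executes -->
<script async src="third-party.js"></script>
<!-- downloads in parallel, executes as soon as it arrives (order not guaranteed) -->
<script defer src="third-party.js"></script>
<!-- downloads in parallel, executes after parsing finishes, in document order -->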
As per Google's support document, there's a section on "How do you load third-party script efficiently?"; here are a few of its suggestions:
"
Load the script using the async or defer attribute to avoid blocking document parsing.
Consider self-hosting the script if the third-party server is slow.
Consider removing the script if it doesn't add clear value to your site.
Consider Resource Hints like <link rel=preconnect> or <link rel=dns-prefetch> to perform a DNS lookup for domains hosting third-party scripts.
"
Other methods:
Check out how to compress, minify, or combine your various JavaScript files into one file (if you are using JavaScript in the form of files). Use GZIP compression for your JavaScript and CSS. Also check out how to load third-party scripts from a CDN (Content Delivery Network / Content Distribution Network), among others.
Updated Aug 12, 2020:
In response to your comment: since your third-party scripts come from plugins, and you can't add the 'async' or 'defer' attribute to their script tags yourself, you can consider adding something like this before your other scripts:
<script>
// Note: these flags only matter if they are set before the target
// script is fetched and executed.
// If your script tag has an id, use either one below:
document.getElementById("your_script_tag_id").async = true;
document.getElementById("your_script_tag_id").defer = true;
// If your script tag has a class name, use either one below:
document.getElementsByClassName("your_script_tag_class_name")[0].async = true;
document.getElementsByClassName("your_script_tag_class_name")[0].defer = true;
// To target the first script tag on the page, use either one below:
document.getElementsByTagName("script")[0].async = true;
document.getElementsByTagName("script")[0].defer = true;
</script>
You can also check out Async JavaScript, a plugin that lets you defer or async your JavaScript, including third-party scripts.
From what I can see you have set the "delay" to 3 seconds on Facebook Messenger chat. However your site takes a lot longer than this to load the initial content.
Your site will often not have loaded the "above the fold" content within 3 seconds due to things like network latency, load on your server etc.
For this reason the Facebook Messenger chat script is loaded at a point where the CPU may or may not be busy. For metrics like "Total Blocking Time" this matters, as that metric listens for the CPU's first quiet period to work out when the page is usable.
For working out the "impact of third-party code", it looks at when the CPU is busy while trying to render the above-the-fold content. That is why it sometimes shows an impact and sometimes does not: sometimes your above-the-fold content has loaded sufficiently before the Facebook Messenger is initialised.
Additionally, you have to consider when your main JS file containing the timeout is loaded; sometimes it will load sooner or later depending on latency etc., so this also affects the time at which the fbDiv is added.
There is a lot to cover here (there is an awful lot to explain as to why this happens), so to keep the answer simple: either increase the delay on Facebook Messenger, or only load it on a button click.
For example, you could have a button that says "chat with us" and use its click event to load Facebook Messenger (and hide the "chat with us" button). This would be my recommendation.
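A rough sketch of that pattern (the element ID is an assumption, and the real widget also needs its fb-customerchat markup and page ID; this only shows the loading pattern):
<button id="chat-with-us">Chat with us</button>
<script>
document.getElementById("chat-with-us").addEventListener("click", function () {
    var s = document.createElement("script");
    s.src = "https://connect.facebook.net/en_US/sdk/xfbml.customerchat.js";
    s.async = true;
    document.body.appendChild(s); // widget only downloads on demand
    this.style.display = "none";  // hide the button once loading starts
});
</script>
This way the script costs nothing at all unless a visitor actually asks for the chat.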
Alternatively, looking at the load speed of your site, you could set the delay to about 7 seconds; it would then (probably) be consistent.
My WordPress website is suddenly slow to respond when I navigate to another page. For example, if I click a link in the menu, it takes around 10-15 seconds for the website to move to the linked page. However, once the website responds and moves to the page, the content loads fast. So I think it is not about the loading speed of my website. Correct me if I am wrong.
Is there any solution for this?
Thanks
This is likely a TTFB (time to first byte) problem: it takes a long time for your server to generate the HTML of the page and send it in answer to the request. Once the HTML has been sent, all the JS files and images load fast enough, as you described; there is just this delay between switching pages.
This could be caused by plenty of factors. I recommend reading a thorough article to understand all the nuances, like this one: https://kinsta.com/learn/page-speed/
But in short, the first and most reliable way to cut down page load time in the situation you describe is page caching. You can use the free https://uk.wordpress.org/plugins/w3-total-cache/ plugin as a first step.
Managed WP hosting with built-in caching would be even better, but prices start from $30 per month, if that works for you.
Please open this link and follow the steps accordingly
https://www.codeinwp.com/blog/ways-to-speed-up-wordpress/
I know this question has been asked before, but Google's own support documentation contradicts itself, which left me confused.
When generating a Google Analytics code, Google Analytics tells me to put my tracking code immediately after the opening <body> tag.
However, I read on the Google support forum that it should be inside my <head> tag.
https://support.google.com/analytics/answer/1008080?hl=en-GB
Paste your snippet (unaltered, in its entirety) into every web page that you want to track. Paste it immediately before the closing </head> tag.
Does anyone know which one is best?
As long as it's enclosed properly between <script> tags, it should work either way. The only tradeoffs are:
including it in the <head> section results in slower page rendering (because the JS executes before the rest of the page is parsed);
including it at the bottom of the page might not count people who land on your page, stay for 3 seconds, and leave.
The latter won't happen on small pages, but I've seen it happen on forums and blogs where there is a lot of HTML to render and a couple of feet of scrolling below the fold, so page completion takes several seconds. In those cases it is possible for people to leave before the GA snippet executes.
The current analytics code is asynchronous, so even if you put it in the head it shouldn't affect the rendering time by more than a few milliseconds.
TL;DR: when it comes to normal pages, it's about the same.
As long as you place the code as it appears from Google, the tracking will fire.
The higher on the page you have your code, the better your stats will be.
For example, suppose a user has a slow internet connection (think smartphone with a poor signal) and your page takes 3 seconds to load. If your tracking code is higher on the page, the code fires sooner and starts tracking the user's time on site, including the bulk of the load time.
Now say the GA code is the very last tag on the page, and after 2 seconds the user sees the link they're looking for and clicks it before the page has fully loaded. In this case the GA code may never fire from the bottom of the page, and you've missed analytics on a hit/visit/visitor that actually reached your site.
Having the GA code in the header or near the top of your code DOES NOT have to slow down the load time. You can call the GA code asynchronously to keep it from slowing down the rest of the load (documented here: https://developers.google.com/analytics/devguides/collection/gajs/).
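For reference, the documented asynchronous ga.js snippet looks like this (UA-XXXXX-X is the placeholder for your own property ID):
<script type="text/javascript">
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_trackPageview']);
(function () {
    // Inject ga.js with async set, so it never blocks rendering.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
})();
</script>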
I have recently developed 4 websites in WordPress using the rtpanel theme framework.
When I put the websites live, I noticed that a couple of them upon clicking through to the blog page, are taking up to 25 seconds to load. (see link)
http://tools.pingdom.com/fpt/#!/brQN7J/www.exactabacussoftware.com/blog
Can anyone tell me what is causing this long wait? If I change my theme back to Twenty Twelve it loads fine, and the same applies to the other sites, e.g. http://www.exactabacusfulfilment.com/blog
The two examples are both running on the same server using the same theme but I cannot find out what is slowing the software site down so much.
Any help would be greatly appreciated!
It seems that PHP execution is taking a lot of time. The analysis of your site shows that it takes around 22 seconds to generate the HTML.
There could be a few reasons why PHP execution is taking so long:
You have activated some plugin which is causing your site to slow down.
There may be some theme component which is causing your site to load slow.
Your database queries are taking long to execute. (If this is the case, check why, and you can enable Memcache to cache MySQL queries.)
Install and activate P3 (Plugin Performance Profiler) on the website to find out which component is dragging the site's performance down. To debug in more detail, you can also try the Query Monitor plugin.
Once the issue is tracked down and resolved, you may activate PHP APC on your server, provided you do not make frequent changes to the code.
There are a couple of really easy ways to investigate the reasons behind slow loads:
Google PageSpeed. Basically you feed it the URL and you get a list of suggestions for improving the page load speed. Here is the URL for testing your site:
http://developers.google.com/speed/pagespeed/insights/?url=www.exactabacussoftware.com%2Fblog
The first thing I notice is a very high server response time (in my tests, between 0.5 sec and 1.6 sec). This means it takes at least 0.5 sec plus the resource download time to load every image, JavaScript file, etc. If you have 100 resources this can add up to 50 seconds or so, which is a lot. So you might want to look for hosting alternatives.
Google PageSpeed will give you more details on what could be fixed and improved; look through it and try to solve those issues. It should improve your speed quite a bit.
Another option is Google Chrome Developer Tools, Firefox Firebug or similar tools. Just open Network tab and reload the page, you will be able to see how long it takes to load one or another resource of your page.
Building on that.
It looks like there are 2 seconds of latency before your server even answers the first GET request, followed by another 2 seconds that contain 84 more GET requests.
Now, a 4 second load time isn't awful, but if you want it to go faster, the best thing you can do is:
1). Combine all of your javascript files into one file - making sure jQuery/other dependencies are first.
2). Combine all of your PNGs into one file -- a sprite -- or, alternatively, Base64 encode them all.
3). A lot of those PNGs could be compressed: 5 KB for an icon is a bit big, and 66 KB for an image is certainly too big.
4). Same thing with your CSS -- combine them all, and there will be fewer requests.
I have an ASP.NET C# (.NET 3.5) page which contains several user controls. I am noticing that sometimes the HTML loaded in the browser is incomplete; it seems to get cut off.
Any suggestions on how to troubleshoot what the root cause is?
This can be symptomatic of server errors or proxy problems. I would use Fiddler to check what's going back and forth between your browser and the server. If you are getting any 500 (server error) response codes, that would be a good place to look.
Another thing to check would be javascript errors on the page, because depending on what your javascripts are doing, errors can prevent loading of other content in some cases.
womp probably has most of the bases covered, but the other angle that can lead to issues like this is an exception getting eaten while still causing processing to stop, thereby sending half the page or some such.
Verify that your HTML is being written to the page by viewing the source code of the page after it loads. My guess is that the HTML that is being output is invalid, and that the browser isn't able to properly display it. Make sure all your HTML tags are properly closed and balanced.
It could also be an issue with the request being ended midway through. Try removing one control at a time from the page and see if the situation improves. If it does, you'll know which control is to blame.
It is quite unlikely to be the same problem, but I had this happen before where the page had a custom filter attached to Response.Filter that reformatted the output to fix up some .NET SEO problems. The filter we wrote had a bug where one regular expression consumed a bit too much copy in some instances and broke the output.