Gtag.js increasing LCP in PageSpeed Insights - google-analytics

I have a problem with my LCP score in Core Web Vitals (mobile). My LCP is 3 seconds, while the required LCP on mobile is 2.5 seconds. I have tried everything: optimized images with a premium plugin (EWWW Image Optimizer), served the files through a CDN, reduced network requests to a minimum, and applied the other common fixes. The only way I can get the LCP down is by removing gtag.js, which reduces it to 2 seconds, but I really need this script for analyzing traffic. Is there any other way I can reduce the LCP? I have tried all the suggestions listed in Lighthouse: removed unused CSS and JS, and deferred loading of images, CSS, and JS.

The PageSpeed Insights results that stand out to me are:
Avoid an excessive DOM size (1,889 elements)
Minimize main-thread work (3.5 s)
In the mobile test, PSI reports 3.5 seconds spent on the main thread, with 1.275 s on style and layout. I think some of this can be attributed to the large DOM. So try to find ways to trim excess content or markup from the page to simplify the HTML structure.
Script evaluation also accounts for 0.987 s, suggesting that a significant amount of JavaScript runs on page load. I'm not too familiar with GTM, but based on the results it does seem to be contributing a lot to main-thread consumption (2.88 s). Some things to check are disabling or removing unnecessary services in GTM, since these third parties can add excess JavaScript, and optimizing the services you do need so that anything not absolutely necessary for the initial page load is deferred until later. See the Loading Third-Party JavaScript guide for some additional advice related to GTM.
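If it is gtag.js itself you want to delay (rather than tags inside GTM), one rough sketch is to queue the gtag() calls up front and only inject the library after the window load event, so it doesn't compete with the LCP image. This is not an official Google snippet, and G-XXXXXXX is a placeholder for your own measurement ID:

window.dataLayer = window.dataLayer || [];
function gtag() { dataLayer.push(arguments); }
gtag('js', new Date());
gtag('config', 'G-XXXXXXX'); // calls queue in dataLayer until the library arrives

window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = 'https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX';
  s.async = true;
  document.head.appendChild(s);
});

The queued calls are processed once gtag.js loads, so the pageview is still recorded; the trade-off is that visitors who leave before the load event fires may not be counted.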

On the live site, the slider image is the LCP, and this is loaded via CSS. The browser does not know to load the image until the CSS is loaded and it knows the image is needed on the page. You could speed things up with a preload on that image.
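For example, a preload hint in the <head> lets the browser start fetching the image before the CSS has been downloaded and parsed (a sketch; /images/slider-hero.jpg is a placeholder for the actual slider image URL):

<link rel="preload" as="image" href="/images/slider-hero.jpg">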
You're also using a lot of web fonts. You would free up bandwidth for the main image by reducing the number of fonts.

Related

Lowering Largest Contentful Paint, Over A Full Second After FCP On Mobile

I've spent quite a bit of time on performance optimization for my site and I'm right there at the finish line for getting all the green "Good" scores in Search Console for mobile usability and Core Web Vitals. The last outstanding thing is getting my LCP under 2.5 seconds for my blog posts.
I'm consistently getting 2.6-2.7 and I can't quite figure out why. The LCP element is the <h1> tag on my blog posts and there are no above-the-fold images.
Example URL With Higher LCP
Let me say that I am using a Beaver Builder website and some marketing tags like GA, GTM, etc., but I've spent a lot of time analyzing my script and style loads to create an optimal experience with preconnects and preloads of various resources. I know it's a big ask to try to get a site like this down in load time. I'm on Kinsta hosting with low TTFB, full-page caching and a CDN. I am also using the Perfmatters plugin to control various aspects of load times.
I've done everything I can think of to get the LCP down. It seems like the <h1> tag is visible almost immediately but may then be repainted later, towards the end of the page load, and I can't figure out the cause of this.
Anyone out there feeling generous that could take a look?
Page Speed Insights URL
As far as I can see, PageSpeed is showing the aggregated real-user experience of your entire domain, not of this specific page but of all pages on the domain.
The DOM content load time for your page on mobile is around 3 seconds, which means your LCP has to be greater than that.
You need to reduce your DOM content load time first, and prioritize your network calls so that redundant dependencies are minimized.
Also, on the desktop page you are lazy-loading the first-fold image, which is not considered a good user experience; it might be contributing to the LCP of the desktop version of your page and can also negatively impact your SEO.
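In other words, keep the first-fold image eager and reserve lazy loading for images below the fold, along the lines of this sketch (hero.jpg and gallery-1.jpg are placeholders, and fetchpriority is a priority hint that is not supported in every browser):

<img src="hero.jpg" alt="Hero" loading="eager" fetchpriority="high">
<img src="gallery-1.jpg" alt="Gallery" loading="lazy">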

Improve Pagespeed when using Facebook Pixel

I am creating a general Facebook integration solution which uses the Facebook Pixel; however, if I use the Pixel it instantly reduces the PageSpeed score, which is also a requirement for me.
I am using PageSpeed Insights to measure the page speed, and by adding the Facebook Pixel my Largest Contentful Paint takes a hit and I get this opportunity showing the Facebook Pixel load as the problem:
The only solution I found is to wrap the pixel script in a setTimeout(fb, 3000) so that it only gets loaded after the page is fully rendered; however, I am concerned about this solution since it will largely postpone the Facebook load and may cause problems for some of my users.
Does anyone know any other way to fix this PageSpeed hit when using the Facebook Pixel?
Thanks for providing the clarification.
Your best bet is to use defer on your script embed instead of arbitrary timeouts; this means your critical rendering path and interactivity are unaffected. LCP specifically should improve, as your application rendering remains unaffected by the pixel.
via MDN:
This Boolean attribute is set to indicate to a browser that the script is meant to be executed after the document has been parsed, but before firing DOMContentLoaded.
Note: if you are not observing specific interaction issues and are purely focused on addressing LCP numbers, deferring may meet the need.
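As a rough sketch of that approach (the path /js/fb-pixel.js and YOUR_PIXEL_ID are placeholders), move the Pixel code into an external file and load it with defer instead of a timeout:

<script src="/js/fb-pixel.js" defer></script>

where /js/fb-pixel.js contains Facebook's standard fbevents.js bootstrap followed by the usual calls:

fbq('init', 'YOUR_PIXEL_ID');
fbq('track', 'PageView');

The deferred script still executes before DOMContentLoaded, so tracking fires on every page view, but it no longer blocks HTML parsing the way the inline, immediately-executed snippet can.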

Slow Product Category Page WooCommerce - Need Speeding Up

I have installed and customized WooCommerce product pages on my WordPress site, but one of the product category pages takes about 7 seconds on average to load. Other category pages load in around 3 seconds. I am struggling to find the reason for this. There are fewer products on this page than on other pages, and fewer sub-categories. I have installed plug-ins such as 'W3TC' and 'Better WordPress Minify' but it hasn't made much difference.
Has anyone else experienced an issue like this and if so, would you mind sharing how you resolved it?
Any help would be greatly appreciated.
Thanks
Using caching plugins is fine and dandy, but the reason these pages load slowly is simply the data model that WordPress uses: post types and the metadata look-ups. The only way to truly get speeds up is good hosting and turning on object caching on the server.
We enabled this on a WP Engine site and it was night and day: 12 seconds turned into 2.5 seconds.
Object caching
Object Caching is designed to capture queries to the database and store them in memory. This allows you to run an "expensive" query - a query that takes a long time - one time, and then reuse the results again. When used properly, Object Caching can give your site a speed boost by reducing the time that is spent accessing the database. Please note that this change can take a while to take effect.
There can be many reasons for a WordPress page to load slowly, but your problem seems to be unique.
Here are some useful tips by which you can speed up your page loading:
Optimize Your Images
The page on which you are having the issue might have high-resolution images.
Avoid displaying Flash on your page
Avoid too many advertisements
Cut unnecessary ads from the page.
Do not use inline cascading style sheets
Instead of using inline styles, create a CSS file and reference it on every page of your site; this will also help keep download sizes down.
Put stylesheets at the top - Put scripts at the bottom
Put JavaScript at the bottom of the page; this helps the page load faster. While the browser is downloading a script it blocks other parallel downloads, so moving scripts to the bottom lets the rest of the page content download first.
Use CSS Sprites
A CSS sprite is a single image composed of other images used by your design, acting as something of a map containing the coordinates of each image. Some clever CSS is used to show the proper section of the sprite when your design is loaded.
That way you do not have to load multiple separate images used on your site; loading a single sprite image does all the work (see the sketch after this list).
Limit Your External Scripts
There might be an external script loading on that page; check for this and limit it.
Add LazyLoad to your images
You can use this technique to load the page part by part, so images are only fetched as they are needed.
Control the number of post revisions stored
I saved this post to draft about 8 times.
WordPress, left to its own devices, would store every single one of these drafts, indefinitely.
Turn off pingbacks and trackbacks
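As a quick sketch of the CSS sprite idea above (icons.png, .icon-home and .icon-cart are made-up names; assume two 32x32 icons stacked vertically in one file):

.icon { width: 32px; height: 32px; background: url('icons.png') no-repeat; }
.icon-home { background-position: 0 0; }
.icon-cart { background-position: 0 -32px; }

One HTTP request fetches both icons, and each class simply shifts the sprite to show the right section.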
Let me know if these tips resolve the problem for your site.
The list of suggestions that WisdmLabs mentions above is great!
However, I'm not sure if you've seen the plugin for WordPress called W3 Total Cache. It has a load of built-in functionality to automatically improve the performance of your WordPress web pages.
It's free and worthwhile using if you are looking to improve the performance across your whole site.
https://wordpress.org/plugins/w3-total-cache/

Soundcloud iFrame Embed Leaking Memory

I'm currently building a single-page backbone app that embeds up to 10 separate Soundcloud iFrames on a single page. Users can then view other pages, each of which contain their own set of iFrames.
I've noticed that each time a new set of iframes is loaded the memory consumption for the tab increases roughly 80-100MB (according to the Chrome Task Manager). This memory is never relinquished, so after a few clicks the tab easily reaches 300MB and becomes intolerably slow. This slowness occurs in both Chrome 20 and Firefox 13.
After each page change I've tried .remove()'ing all the iframes as well as clearing out the container element via .html('') and neither stems the memory growth.
Provided in this gist is sample code that exhibits the same behavior as described above. On each load the individual iFrame consumes roughly 10MB of additional memory.
https://gist.github.com/3202151
Is the Soundcloud embed code doing something to maintain a handle to the iframe and preventing it from being GC'd? Is there another way I can remove the elements from the DOM to avoid the memory bloat?
Note: I cannot add all the tracks to a single set which can be loaded once since tracks being embedded are not my own.
I have been running into a similar problem. I'm using the SoundCloud JS SDK to stream audio in a custom player on my site. I got it running and let it go all night (because I was suspicious of the SWF size). Sure enough, in the morning the SWF was massive and my computer was noticeably slow. The SoundCloud SDK uses SoundManager2 to stream/play audio, so it creates a soundManager object which you can access in JS. I ended up managing the SWF size by calling
soundManager.reboot();
...between each song that plays. In your case you could call it between the various pages, keeping the memory to the 80-100MB max at a time. It increases the load time by a fraction of a second, but that's a small price to pay for fixing the ever-growing memory issue.
I'm not sure if the iframe player also creates the soundManager object, but if so, give this a try!
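For example, something along these lines, assuming a Backbone router and jQuery, and assuming soundManager is exposed on the parent window (the #players container id is made up):

router.on('route', function () {
  $('#players').empty();     // drop the previous page's SoundCloud iframes
  if (window.soundManager) {
    soundManager.reboot();   // destroy and re-create the SoundManager2 engine
  }
});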
Check out the SoundManager2 documentation here:
http://www.schillmania.com/projects/soundmanager2/doc/#soundmanager-reboot

How can I verify that overall a site is faster after compression has been switched on?

I have suggested that we enable dynamic content compression on IIS7 to improve the user experience. The QA department wants me to demonstrate that it is in fact faster. I've been using Firebug to view the load-time waterfall chart generated under the Net tab, and it is inconsistent with the overall (total) page load time from page to page, i.e. sometimes faster but sometimes slower.
Dynamic pages by themselves are always faster, but now some static uncompressed content appears to be slower.
What I would like is a tool (Firefox addin) that can add together all page load times during a typical workflow (usage) of a site and give me a final time figure. I would then use that with dynamic compression enabled and disabled to see what the total net effect is. Any suggestions?
Use Fiddler from Microsoft; it interacts with the browser at a much lower level and can easily be used to produce comparative graphs.
Firebug is very useful for a lot of stuff, but for any kind of measurement that involves the network, Fiddler is much better: instead of examining the page, it works as a local proxy and so can examine the network traffic far more accurately.
Site link: http://www.fiddler2.com/fiddler2/
The Net tab in Firebug gives you timings for every item the browser downloads. There's also an option to disable the cache, to guarantee accurate timings. I believe that by default it will clear the timings after every page load, but that can be disabled.
