Core Web Vitals results consistency problem, even though all URLs are good (LCP | CLS) - pagespeed

1. In Core Web Vitals, does Google Search Console rely on traffic to show the LCP and CLS results? If yes, what minimum traffic is required? In the live PageSpeed Insights tool, all URLs are in good condition.
LCP: under 2.5 s | CLS: under 0.1
2. Do the Core Web Vitals results in Google Search Console depend on CrUX data for each device, or on the PageSpeed Insights tool?
3. All the pages have a CLS of less than 0.01, but when Google Search Console averages the URL group it shows a CLS of 0.23. I have checked the sample URLs given in the group and they are also good. Is there any way to check how it was calculated?
4. Sometimes in Search Console my Core Web Vitals URL count drops to 0 on desktop. Does anyone know why this happens?

It seems like you might be conflating the Chrome UX Report data from real users at the top of the page with the Lighthouse data from lab tests at the bottom. Core Web Vitals are assessed using real-user data from the Chrome UX Report.
In your first screenshot, it shows a CLS value of 0.22. This means 75% of desktop users on the page have a CLS of at or below 0.22, which is in the "Needs Improvement" category. Search Console is saying that the average experience over all pages in the group is a CLS value of 0.23, which is very similar to this particular page's performance.
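To make that "75% of users at or below" aggregation concrete, here is a minimal sketch of a CrUX-style 75th-percentile calculation over per-load CLS samples (the sample values are hypothetical):

```javascript
// Hypothetical per-page-load CLS samples collected over a 28-day period.
const clsSamples = [0.01, 0.05, 0.08, 0.12, 0.22, 0.25, 0.3, 0.02];

// CrUX-style aggregation: report the value at the 75th percentile,
// i.e. the experience that 75% of page loads are at or below.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

const p75 = percentile(clsSamples, 75);
console.log(p75); // with these samples: 0.22 ("Needs Improvement")
```

A single bad sample does not move this number much, which is why a handful of good lab runs can coexist with a "Needs Improvement" field value.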
The CLS value of 0.001 is based on a Lighthouse test. Importantly, Lighthouse doesn't interact with the page as a real user would, so it doesn't encounter types of CLS that may be caused by scrolling or clicking. Think of this value more as a lower bound for CLS than your page's actual user-perceived performance. That's also why Core Web Vitals are not assessed using data from Lighthouse.

Related

Failing on CLS in the google console

Our website is failing on CLS under the Core Web Vitals, but Google's PageSpeed results seem to contradict themselves.
https://pagespeed.web.dev/report?url=https%3A%2F%2Fshoppingonline.ie%2Fshops%2Fnext-ie%2F
It says, under
Discover what your real users are experiencing
Cumulative Layout Shift (CLS)
0.13
Yet under
Diagnose performance issues
Cumulative Layout Shift
0
How can it be 0.13 and 0 ?
Kind Regards
Scott
Expecting the results to be the same.

Why is there a CLS problem with real user data but not with the lab data?

When I test the page in DevTools, there is no problem with CLS.
But with PageSpeed Insights, a problem appears with the mobile page's CLS.
Not in the lab data, only in the real-user data:
https://federhilfe.de/rotkehlchen-anlocken-im-garten-ansiedeln-fuetterung-nisthilfen/
https://pagespeed.web.dev/report?url=https%3A%2F%2Ffederhilfe.de%2Frotkehlchen-anlocken-im-garten-ansiedeln-fuetterung-nisthilfen%2F&hl=de
Do you have any idea how to solve this problem?
Thank you very much!
Alex
The guide at web.dev/lab-and-field-data-differences mentions a few reasons why CLS may be different across lab and field measurement:
CLS may depend on real-user interaction, while lab tests do not simulate any interactions (by default)
Personalized content from real users may affect CLS, which may not be represented in lab testing
Real users may have warm caches, so content may not need to load, minimizing CLS, whereas lab tests traditionally run with empty caches
To help identify why CLS is happening to real users, the first step I'd recommend is to use the website yourself with something like the Web Vitals extension enabled. Just from scrolling to the bottom of the page with the heads-up display (HUD) turned on, I can see CLS values gradually increasing and exceeding the "good" threshold of 0.1.
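You can also watch those shifts directly from the DevTools console. The sketch below sums layout-shift entries the way CLS counts them, ignoring shifts caused by recent user input; the observer wiring is browser-only, so it is shown as a comment:

```javascript
// Sum layout-shift entries the way CLS counts them: shifts that occur
// shortly after a user input (hadRecentInput) are excluded.
// Note: this is a simple running total; the current CLS metric groups
// shifts into session windows, but the total is still useful for debugging.
function sumShifts(entries) {
  return entries
    .filter((entry) => !entry.hadRecentInput)
    .reduce((total, entry) => total + entry.value, 0);
}

// Browser-only wiring (paste into the DevTools console while scrolling):
// new PerformanceObserver((list) => {
//   console.log('Layout shift total so far:', sumShifts(list.getEntries()));
// }).observe({ type: 'layout-shift', buffered: true });

// Example with fake entries:
console.log(sumShifts([
  { value: 0.05, hadRecentInput: false },
  { value: 0.2,  hadRecentInput: true },  // ignored: caused by user input
  { value: 0.07, hadRecentInput: false },
]));
```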
I'd also recommend measuring Web Vitals in the field to ensure that you have additional debug data about what might be causing CLS to your real users.
Even though this kind of interaction-induced CLS wouldn't be reproduced by lab testing, lab testing can still identify possible causes of real-user CLS. In the PageSpeed Insights link you shared, you can click Show audits relevant to: CLS. For the desktop results, there are two audits that need attention:
Image elements do not have explicit width and height
Avoid large layout shifts

Page speed does not pass even after scoring 90+ on PageSpeed Insights

My webpage scores 90+ on the desktop version, yet its Field Data test result shows "does not pass", while the same page on mobile with a 70+ speed score is marked as "Passed".
What are the criteria here, and what else is needed to pass the test on the desktop version? Here is the page on which I'm performing the test: Blog Page
Note: This page has been at 90+ for about 2 months. Moreover, if anyone can guide me on improving page speed on mobile in WordPress using the DIVI builder, that would be helpful.
Although 6 items show in "Field Data" only three of them actually count towards your Core Web Vitals assessment.
First Input Delay (FID)
Largest Contentful Paint (LCP)
Cumulative Layout Shift (CLS)
You will notice that they are denoted with a blue marker.
On mobile all 3 of them pass, despite a lower overall performance score.
However, on desktop your LCP occurs at 3.6 seconds (75th percentile), which is not a pass (it needs to be within 2.5 seconds).
That is why you do not pass on Desktop but do on mobile.
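The pass/fail logic can be summarized in a small sketch (the helper below is illustrative; the thresholds are the published "good" values, assessed at the 75th percentile):

```javascript
// "Good" thresholds for the three Core Web Vitals:
// LCP <= 2500 ms, FID <= 100 ms, CLS <= 0.1.
const GOOD_THRESHOLDS = { lcp: 2500, fid: 100, cls: 0.1 };

// A page passes the assessment only if all three metrics are "good";
// the Lighthouse performance score plays no part in this.
function passesCoreWebVitals({ lcp, fid, cls }) {
  return lcp <= GOOD_THRESHOLDS.lcp &&
         fid <= GOOD_THRESHOLDS.fid &&
         cls <= GOOD_THRESHOLDS.cls;
}

// Desktop example from the question: an LCP of 3.6 s fails the assessment.
console.log(passesCoreWebVitals({ lcp: 3600, fid: 50, cls: 0.05 })); // false
// Mobile passes despite a lower overall performance score.
console.log(passesCoreWebVitals({ lcp: 2300, fid: 80, cls: 0.08 })); // true
```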
At a glance, this appears to be something with your font causing a late switch-out (sorry, I'm not at a PC to test properly). I could be wrong; as I said, I haven't had a chance to test, so you need to investigate using Dev Tools etc.
Bear in mind that the score you see (95+ on Desktop, 75+ on mobile) is part of a synthetic test performed each time you run Page Speed Insights and has no bearing on your Field Data or Origin Summary.
The data in the "Field Data" (and Origin Summary) is real world data, gathered from browsers, so they can be far apart if you have a problem at a particular screen size (for example) etc. that is not picked up in a synthetic test.
Field Data passes or fails a website based on historical data.
Field Data: Over the previous 28-day collection period, field data shows
that this page does not pass the Core Web Vitals assessment.
So if you have made recent changes to your website to improve your score, you need to wait at least a month so that the Field Data reflects the newer data.
https://developers.google.com/speed/docs/insights/v5/about#distribution

Largest Contentful Paint - different scores?

I'm having trouble with the Largest Contentful Paint on my desktop pages. For a few days the LCP has been growing and growing, and I don't know why or how to stop it. Another problem is that I always get different scores.
PageSpeed Insights tells me that the featured image is the LCP and it takes 0.7 to 0.8 seconds to load. The 28-day score shows a value of 4.6 s. That would be OK. But in Search Console, under the Core Web Vitals section, I have 215 URLs for desktop that have an LCP score of 7.4 seconds, and it is getting higher and higher. When I run a test in DevTools in incognito mode, the LCP for desktop is about 9700 ms (too much). Why is there such a difference, and what can I do to get the LCP down? The LCP wasn't a problem a few days ago...
Greetings Kathrin
Keep in mind that the data you see in Chrome (Lighthouse) is so-called lab data and can differ significantly from what you see in Search Console, as Search Console uses field data.
The field data is collected from all devices with Chrome installed that access your website. So, for example, if your servers are in the US but most of your users come from Africa, your LCP might be high because the image takes a long time to load.
Another thing that could be a problem in your particular case is that, after some changes to your site, a different element became the LCP. You can check in Lighthouse which element is actually the LCP.

Struggling to get CLS down under 0.1 on mobile. Can't reproduce it in tests

I am trying to optimize the overall page speed of this page, but I can't get the CLS under 0.1 on mobile. I really don't know why, as I use critical CSS, page caching and font preloading, and I can't reproduce the behaviour in tests.
https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fwww.birkengold.com%2Frezept%2Fselbstgemachte-zahnpasta
Tested with a simulated Galaxy S5 on 3G Fast.
https://www.webpagetest.org/result/210112_DiK9_256ca61d8f9383a5b927ef5f55644338/
In no scenario do I get anywhere near 0.1 CLS.
Field Data and Origin Summary
Field data and Origin Summary are real-world data.
That is the key difference between these metrics and the synthetic test that PageSpeed Insights runs.
For example: CLS is measured until page unload in the real world, as mentioned in this explanation on CLS from Addy Osmani who works on Google Chrome.
For this reason your CLS can be high for pages if they perform poorly at certain screen sizes (as Lighthouse / PSI only tests one mobile screen size by default) or if there are things like lazy loading not performing well in the real world and causing layout shifts when things load too slowly.
It could also be certain browsers, connection speeds etc. etc.
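As a side note, real-user CLS is now reported per session window: shifts less than 1 second apart belong to the same window, each window is capped at 5 seconds, and the largest window's total is the page's CLS. A simplified sketch of that grouping (the entry shape with time and value fields is assumed):

```javascript
// Compute CLS using session windows: shifts less than 1 s apart share a
// window, a window lasts at most 5 s, and the reported CLS is the
// largest window's total. `shifts` must be ordered by time (ms).
function computeCls(shifts) {
  let maxWindow = 0;
  let windowSum = 0;
  let windowStart = 0;
  let prevTime = -Infinity;
  for (const { time, value } of shifts) {
    const newWindow = time - prevTime >= 1000 || time - windowStart > 5000;
    if (newWindow) {
      windowSum = 0;
      windowStart = time;
    }
    windowSum += value;
    maxWindow = Math.max(maxWindow, windowSum);
    prevTime = time;
  }
  return maxWindow;
}

// Two bursts: the second burst (0.05 + 0.1) forms the larger window,
// even though the page kept shifting long after load.
console.log(computeCls([
  { time: 1000, value: 0.06 },
  { time: 8000, value: 0.05 },
  { time: 8500, value: 0.1 },
]));
```

This is why a page can look fine in a short lab run but fail in the field: shifts that happen late, e.g. from lazy loading while scrolling, still land in a session window.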
How can you find the page / root cause that is ruining your Web Vitals?
Let's assume you have a page that does well in the Lighthouse synthetic test but it performs poorly in the real world at certain screen sizes. How can you identify it?
For that you need to gather Real User Metrics (RUM) data.
RUM data is data gathered in the real world as real users use your site and stored on your server for later analysis / problem identification.
There is an easy way to do this yourself, using the Web Vitals Library.
This allows you to gather CLS, FID, LCP, FCP and TTFB data, which is more than enough to identify pages that perform poorly.
You can pipe the data gathered to your own API, or to Google Analytics for analysis.
If you gather the web vitals information and combine it with User-Agent strings (to get the browser and OS) and browser size information (to get the effective screen size), you can narrow down whether the issue is down to a certain browser, a certain screen size, or a certain connection speed (you can spot slower connections from high FCP / LCP figures), etc.
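As a sketch of that pipeline, the payload you would send to your own endpoint could look like the following. All field names and the /rum endpoint are hypothetical; in the browser you would fill the context from navigator.userAgent and window.innerWidth and send it with navigator.sendBeacon:

```javascript
// Build a RUM payload that combines a metric with the context needed to
// slice it later (browser, OS, effective screen size).
function buildVitalsPayload(metric, context) {
  return {
    name: metric.name,                    // e.g. 'CLS', 'LCP', 'FID'
    value: metric.value,
    page: context.page,
    userAgent: context.userAgent,         // to derive browser and OS
    viewportWidth: context.viewportWidth, // effective screen size
    timestamp: context.timestamp,
  };
}

// Browser-only wiring with the web-vitals library, roughly:
// onCLS((metric) => navigator.sendBeacon('/rum', JSON.stringify(
//   buildVitalsPayload(metric, {
//     page: location.pathname,
//     userAgent: navigator.userAgent,
//     viewportWidth: window.innerWidth,
//     timestamp: Date.now(),
//   }))));

const payload = buildVitalsPayload(
  { name: 'CLS', value: 0.23 },
  { page: '/blog', userAgent: 'Mozilla/5.0', viewportWidth: 360, timestamp: 0 }
);
console.log(payload.name, payload.viewportWidth); // CLS 360
```

Grouping these records by viewportWidth or by the parsed User-Agent is usually enough to spot a problem that only occurs at one screen size or in one browser.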
