Failing on CLS in the Google console - PageSpeed

Our website is failing on CLS under Core Web Vitals, but Google's PageSpeed results seem to contradict themselves.
https://pagespeed.web.dev/report?url=https%3A%2F%2Fshoppingonline.ie%2Fshops%2Fnext-ie%2F
It's saying, under "Discover what your real users are experiencing":
Cumulative Layout Shift (CLS): 0.13
Yet under "Diagnose performance issues":
Cumulative Layout Shift: 0
How can it be 0.13 and 0?
Kind Regards
Scott
Expecting the results to be the same.

Related

Core Web Vitals results consistency problem, even though all URLs are good (LCP | CLS)

1. In Core Web Vitals, does Google Search Console rely on traffic to show the LCP and CLS results? If yes, what is the minimum traffic required? In the live PageSpeed Insights tool, all URLs are in good condition.
LCP: under 2.5 s | CLS: under 0.1
2. Do the Google Search Console Core Web Vitals results depend on CrUX for each device, or on the PageSpeed Insights tool?
3. All the pages have a CLS of less than 0.01, but when Google Search Console averages the URL group it shows a CLS of 0.23. As far as I have checked, all the sample URLs given in the group are also good. Is there any way to check how this was calculated?
4. Sometimes in Search Console my Web Vitals URL count drops to 0 on desktop. Does anyone know why this happens?
It seems like you might be conflating the Chrome UX Report data from real users at the top of the page with the Lighthouse data from lab tests at the bottom. Core Web Vitals are assessed using real-user data from the Chrome UX Report.
Your first screenshot shows a CLS value of 0.22. This means 75% of desktop users on the page have a CLS at or below 0.22, which is in the "Needs Improvement" category. Search Console is saying that the average experience over all pages in the group is a CLS value of 0.23, which is very similar to this particular page's performance.
The CLS value of 0.001 is based on a Lighthouse test. Importantly, Lighthouse doesn't interact with the page as a real user would, so it doesn't encounter types of CLS that may be caused by scrolling or clicking. Think of this value more as a lower bound for CLS than your page's actual user-perceived performance. That's also why Core Web Vitals are not assessed using data from Lighthouse.
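The gap between the two numbers can be made concrete with a small sketch. CrUX summarizes real-user data at the 75th percentile, so a page where most loads are stable can still fail if the worst quarter of loads shifts a lot. (The percentile method below is the simple nearest-rank rule, used here only for illustration; the sample values are invented, and CrUX's real aggregation pipeline is more involved.)

```javascript
// Nearest-rank 75th percentile, illustrating how field data is
// summarized into the single value shown in PageSpeed Insights.
function p75(values) {
  const sorted = [...values].sort((a, b) => a - b);
  return sorted[Math.ceil(0.75 * sorted.length) - 1];
}

// Invented CLS samples: most loads are stable, the worst quarter is not.
const clsSamples = [0, 0, 0.01, 0.02, 0.03, 0.25, 0.3, 0.4];
console.log(p75(clsSamples)); // 0.25 -- fails, even though a single lab run can report ~0
```

This is why one clean Lighthouse run and a failing field value are not actually contradictory: they summarize different populations of page loads.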

Why is there a CLS problem with real user data but not with the lab data?

When I test the page in DevTools, there is no problem with CLS.
But in PageSpeed Insights a CLS problem appears for the mobile page. It shows up only in the real-user data, not in the lab data:
https://federhilfe.de/rotkehlchen-anlocken-im-garten-ansiedeln-fuetterung-nisthilfen/
https://pagespeed.web.dev/report?url=https%3A%2F%2Ffederhilfe.de%2Frotkehlchen-anlocken-im-garten-ansiedeln-fuetterung-nisthilfen%2F&hl=de
Do you have any idea how to solve this problem?
Thank you very much!
Alex
The guide at web.dev/lab-and-field-data-differences mentions a few reasons why CLS may be different across lab and field measurement:
CLS may depend on real-user interaction, while lab tests do not simulate any interactions (by default)
Personalized content from real users may affect CLS, which may not be represented in lab testing
Real users may have warm caches, so content may not need to load, minimizing CLS, whereas lab tests traditionally run with empty caches
To help identify why CLS is happening to real users, the first step I'd recommend is to use the website yourself with something like the Web Vitals extension enabled. Just from scrolling to the bottom of the page with the heads-up display (HUD) turned on, I can see CLS values gradually increasing and exceeding the "good" threshold of 0.1.
I'd also recommend measuring Web Vitals in the field to ensure that you have additional debug data about what might be causing CLS to your real users.
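That gradual accumulation matches how CLS is defined: since mid-2021, layout shifts are grouped into "session windows" (shifts less than 1 second apart, each window capped at 5 seconds), and the page's CLS is the largest window total. A rough sketch of that rule over plain data (times in milliseconds; this is an illustration, not Chrome's implementation):

```javascript
// Group layout-shift values into session windows (gap < 1s between
// shifts, window at most 5s long) and return the largest window
// total, which is the CLS that gets reported.
function clsFromShifts(shifts) {
  let maxWindow = 0;
  let windowSum = 0;
  let windowStart = 0;
  let prevTime = -Infinity;
  for (const { value, startTime } of shifts) {
    if (startTime - prevTime >= 1000 || startTime - windowStart > 5000) {
      windowSum = 0; // gap too long or window too old: start a new window
      windowStart = startTime;
    }
    windowSum += value;
    prevTime = startTime;
    maxWindow = Math.max(maxWindow, windowSum);
  }
  return maxWindow;
}

// Two small shifts close together while scrolling, then a later, separate shift:
const shifts = [
  { value: 0.05, startTime: 0 },
  { value: 0.05, startTime: 500 },
  { value: 0.08, startTime: 3000 },
];
console.log(clsFromShifts(shifts)); // 0.1 -- the first window is the largest
```

So shifts triggered by scrolling keep feeding the current window, which is why the HUD value climbs as you move down the page.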
Even though this kind of interaction-induced CLS wouldn't be reproduced by lab testing, lab testing can still identify possible causes of real-user CLS. In the PageSpeed Insights link you shared, you can click "Show audits relevant to: CLS". For the desktop results, there are two audits that need attention:
Image elements do not have explicit width and height
Avoid large layout shifts
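For the first audit, the usual fix is giving each image explicit dimensions so the browser can reserve the space before the file arrives. A generic example (the filename, sizes, and alt text are placeholders, not taken from the page in question):

```html
<!-- width/height let the browser compute the aspect ratio up front;
     the inline CSS keeps the image responsive without reintroducing a shift. -->
<img src="robin.jpg" width="800" height="600" alt="Robin in the garden"
     style="max-width: 100%; height: auto;">
```

Modern browsers derive the intrinsic aspect ratio from the width and height attributes, so the reserved box scales correctly even when CSS resizes the image.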

Page speed does not pass even after scoring 90+ on PageSpeed Insights

My webpage scores 90+ on the desktop version, yet its Field Data result shows "does not pass", while the same page on mobile, with a 70+ score, is marked as "Passed".
What are the criteria here, and what else is needed to pass the test on the desktop version? Here is the page on which I'm performing the test: Blog Page
Note: this page has scored 90+ for about 2 months. Also, if anyone can offer guidance on improving mobile page speed in WordPress using the DIVI builder, that would be helpful.
Although 6 items show in "Field Data", only three of them actually count towards your Core Web Vitals assessment:
First Input Delay (FID)
Largest Contentful Paint (LCP)
Cumulative Layout Shift (CLS)
You will notice that they are denoted with a blue marker.
On mobile all 3 of them pass, despite a lower overall performance score.
However, on desktop your LCP occurs at 3.6 seconds on average, which is not a pass (it needs to be within 2.5 seconds).
That is why you do not pass on Desktop but do on mobile.
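The assessment described above reduces to checking the three p75 field values against the documented "good" thresholds. A sketch (only the 3.6 s LCP comes from the report; the FID and CLS numbers in the examples are invented):

```javascript
// Pass/fail check against the "good" thresholds for the three
// Core Web Vitals: LCP <= 2.5 s, FID <= 100 ms, CLS <= 0.1.
// Inputs are 75th-percentile field values.
function passesCoreWebVitals({ lcpSeconds, fidMs, cls }) {
  return lcpSeconds <= 2.5 && fidMs <= 100 && cls <= 0.1;
}

// Desktop: an LCP of 3.6 s alone fails the assessment,
// regardless of a 90+ lab performance score.
console.log(passesCoreWebVitals({ lcpSeconds: 3.6, fidMs: 50, cls: 0.05 })); // false
// Mobile-like values that pass despite a lower lab score:
console.log(passesCoreWebVitals({ lcpSeconds: 2.1, fidMs: 60, cls: 0.08 })); // true
```

Note that the overall performance score never enters this check; it is a separate lab-only number.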
At a glance, this appears to be something with your font causing a late swap (sorry, not at a PC to test properly). I could be wrong; as I said, I haven't had a chance to test, so you need to investigate using DevTools etc.
Bear in mind that the score you see (95+ on Desktop, 75+ on mobile) is part of a synthetic test performed each time you run Page Speed Insights and has no bearing on your Field Data or Origin Summary.
The data in the "Field Data" (and Origin Summary) is real world data, gathered from browsers, so they can be far apart if you have a problem at a particular screen size (for example) etc. that is not picked up in a synthetic test.
Field Data passes or fails a website based on historical data:
"Over the previous 28-day collection period, field data shows that this page does not pass the Core Web Vitals assessment."
So if you have made recent changes to your website to improve your score, you need to wait at least a month for Field Data to show results based on the newer data.
https://developers.google.com/speed/docs/insights/v5/about#distribution

How can I change the graph interval in Live Metrics Stream

The current interval of 1 second and max of 60 seconds is too small and issues may be missed.
When viewing the Live Metrics Stream page of Application Insights, the interval on all graphs is 1 second, and it only goes up to 60 seconds. I am trying to use this as a monitoring page to keep an eye on recently released or updated function apps. For this I need to be able to change the interval to view more data at once, without having to keep watching it. Right now, if we don't watch it every minute, we may miss something important.
I have searched the Microsoft documentation, the git repository, stackoverflow, and various other sites trying to find my answer but the only thing I found was from over 4 years ago and I would hope that this has changed since then.
Live Metrics Stream lets you peek at what's going on right now with 1-second resolution. But it doesn't persist data anywhere, so the data is only stored in the UX (browser), and right now only for 60 seconds.
For bigger intervals it might make sense to refer to other Application Insights experiences (including Analytics).
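For longer windows, the same telemetry can be queried in the Logs (Analytics) experience, since that data is persisted. A sketch in KQL (the `requests` table and `duration` column are the standard Application Insights schema; adjust the window and bin size to taste):

```kusto
requests
| where timestamp > ago(24h)
| summarize requestCount = count(), avgDurationMs = avg(duration) by bin(timestamp, 5m)
| render timechart
```

A query like this gives a 24-hour view at 5-minute resolution, which covers the "did we miss something after the release?" case that the 60-second live window can't.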

Industry benchmarks for site search

We have been asked to improve the performance of a client's site search. Before we start, we would like to set benchmarks. I have asked the client if they are comfortable with enabling anonymous data sharing so we have access to industry benchmarks, as I don't have control over this setting: http://support.google.com/analytics/bin/answer.py?hl=en&answer=1011397 However, it sounds like things have changed in the Google Analytics camp and these reports are only available via a newsletter now? Is this true?
Also, will these reports give me industry standards to compare my clients current search performance against? Or is there another service that has these baseline standards available?
Here's an example of the data we are interested in. This is our clients current search performance:
Visits with Search: 772
Total Unique Searches: 1,093
Results Page Views/Search: 1.36
% Search Exits: 56.45%
% Search Refinements: 24.78%
Time after Search: 00:01:40
Search Depth: 0.59
I work at large ecommerce site, and I asked our AdWords rep about this, having recently wanted access to this kind of data myself.
He said that benchmarking was removed 3/15/11, at which point they were experimenting with a monthly newsletter format to deliver the same kind of data.
From what I've seen they may have done one newsletter before (quietly) retiring it completely. I never saw the newsletter, but I think I remember reading reports of people who did receive one.
It's disappointing to know they had access to all that data but pulled the plug on the program. I wonder if they killed it due to data-integrity concerns: they can't guarantee correct tracking-code installations on all the sites opting in, so what is the data worth if it's of questionable quality? I dunno... just a total guess.
We used to use coremetrics here, and they had an opt-in benchmarking program. So if you know any other webmasters using Coremetrics, you could probably ask them to pull some benchmarking info.
We were able to get some benchmarking data from fireclick.com, but none of it (that I've seen anyways) covers on-site search. Mainly just top line metrics. :-/
So the search for benchmark data continues...
