First Input Delay and Interaction to Next Paint showing N/A in Google PageSpeed

I am checking my website in Google PageSpeed Insights and getting a score of 97. The Core Web Vitals assessment also shows as passed.
But why am I getting N/A for First Input Delay and Interaction to Next Paint?
Is there an issue with my code, or do I need to change some settings?

Not all pages will have any interaction from users. For example, a simple blog site may just have readers who open an article, scroll through it (which doesn't count toward FID or INP), and then leave the page happy.
FID and INP are therefore optional in terms of Core Web Vitals, which is why, as you say, your Core Web Vitals assessment still says: Pass.

Delay in Website Loading in Page Insights

Please check this screenshot: Page Insights Image
I have marked the issues in it. I have two questions:
Why is the website loading only after 4-5 slides in the Page Insights filmstrip?
Why is FID not showing in Page Insights?
Thanks
Could be any number of reasons. A few slides without content are expected for most sites. The filmstrip is supposed to show how the page loads (in your case the menu loads first, then the image), rather than how slowly it loads. The stats above it are more important, and they do show the page being slow. PageSpeed Insights gives plenty of advice and guidance on improving your performance below the filmstrip.
FID is a measure of how long it takes from an interaction (e.g. clicking a button) until that interaction is processed. Many static content sites (newspapers, blogs and other articles) will have no interaction at all (scrolling doesn't count as an interaction for FID, for example), so it's perfectly normal for there to be no FID score. In this case FID is ignored when deciding whether you pass or fail Core Web Vitals, so no FID is effectively the same as a passing FID.

How to fix Core Web Vitals FID issue in Asp.Net?

I have created an e-commerce web application in ASP.NET (backend: VB) with the BVCommerce tool, and it works absolutely fine. But due to Google's new guidelines for page ranking and SEO, the application has to pass the Web Vitals test, so I have changed some of my code. Now Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS) pass for more than 85% of visits, but First Input Delay (FID) passes for only 40%. Can you please help me solve this?
I have also tried removing all homepage content (images, menu and footer), but the results are still the same as earlier.
The application uses a full page-load mechanism everywhere because it was created 4 years ago, and this creates issues. I have also attached the result image here.
If anyone has an idea about this, please let us know.
First Input Delay (FID) is a pure real-user metric: it measures the delta between when an input event is received and when the main thread is next idle to execute that event, so main-thread blocking time is directly proportional to this metric's score.
The web.dev guidance suggests some good optimizations to improve this score:
Break up Long Tasks
Optimize your page for interaction readiness
Use a web worker
Reduce JavaScript execution time
It all comes down to main-thread optimization in the end; an "idle until urgent" approach is the best in terms of optimizing performance, and hence the Lighthouse score. Try reducing non-critical network requests for resources (CSS and JavaScript files) that load over the main thread, and load them asynchronously instead.
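The "break up long tasks" point from the list above can be sketched as follows: process work in small chunks and yield back to the event loop between them, so a pending input event never has to wait behind one long loop. `processChunked`, `CHUNK_SIZE` and `handleItem` are illustrative names, not part of any library.

```javascript
// Sketch: break one long task into chunks that yield between runs.
// Each chunk runs synchronously, then hands control back to the event
// loop via setTimeout so queued input events can be handled first.
const CHUNK_SIZE = 50; // tune so each chunk stays well under 50 ms

function processChunked(items, handleItem, done) {
  let i = 0;
  function runChunk() {
    const end = Math.min(i + CHUNK_SIZE, items.length);
    for (; i < end; i++) handleItem(items[i]);
    if (i < items.length) {
      setTimeout(runChunk, 0); // yield to the main thread, then continue
    } else if (done) {
      done();
    }
  }
  runChunk();
}
```

The same structure works for any per-item work (rendering rows, parsing data); the important part is that no single synchronous run exceeds the budget that would delay input handling.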
Image-heavy pages are better optimized by detecting when a media item has entered the user's viewport and firing an event; this is achievable with the Intersection Observer API, hence the lazy-loading methodology used by modern mobile-first frameworks for a better user experience.
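The swap logic behind that lazy-loading idea, written as a plain function so it can be followed outside the browser. Holding the real URL in a `data-src` attribute while `src` points at a placeholder is a common convention, not a built-in feature.

```javascript
// When an observed element reports it has entered the viewport, swap
// the placeholder for the real image source.
function onIntersect(entries) {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // element is in view: load the real image
    }
  }
}

// Browser wiring (assumes IntersectionObserver support):
// const observer = new IntersectionObserver(onIntersect);
// document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));
```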

GTM Strips URL fragments breaking functionality

We have a physician directory search on our site which has been working cross-platform for years. Over the last few months we have been getting reports that the functionality is broken. My debugging of this issue has led me to find that GTM is actually stripping the URL fragments, breaking the functionality in all browsers but IE.
We use Ajax calls to retrieve the directory page by page, 10 items at a time. Some results can yield up to 15 pages, but users are no longer able to get past page 2 of the result set; after page 2 it produces the search page again.
This was rewritten a number of years ago to use the URL hash instead of the original cookie-based system, which simply didn't work. It can be easily reproduced in Chrome by following these steps:
Visit https://www.montefiore.org/doctors
Click Search By Specialty
Click Family Practice
Navigate to any secondary page; you will see that the hash fragments have been stripped
When you try to navigate to any tertiary page, you are simply presented with the specialty list again.
I have gone through various debugging sessions and have even outsourced the issue to our outside developers, but the issue remains unresolved. Any idea on what could be causing GTM to strip out the fragments?

Google Experiments A/B Transaction Tracking

I am adding a step into the buying journey on an eCommerce retail website. I want to test to see if this affects overall conversion rates. Our platform doesn't support A/B testing itself so I'm hoping I can test this using Google Experiments.
A lot of examples for Google Experiments involve having a different page/URL to test with, e.g. a different button style, page layout, or different content. In my case this doesn't apply because my test is a multi-page test: if I add a step here, how many of those customers end up on my order confirmation page? The business logic that determines whether to add the additional step is handled via JavaScript, so essentially I want half of my visitors to load the old JS file and half to load the new JS file, and then to see how they differ in Google Experiments.
I'm guessing the only way I can do this is to set up an experiment where the original page would be
www.mydomain.com/product_page
and my variation page would be
www.mydomain.com/product_page?some_variable=1
Then in my framework I could check for that URL parameter and load the new JS file. But would that track through the whole buying journey once the customer has left that page?
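For reference, the check I have in mind would look something like the sketch below. `some_variable` is the query parameter from the URLs above; persisting the choice (e.g. in sessionStorage or a cookie) so later pages keep loading the same JS file is just one assumed approach to keeping the variant sticky, not anything Google Experiments provides.

```javascript
// Decide which variant a visitor is in, based on the landing URL.
function getVariant(href) {
  const value = new URL(href).searchParams.get('some_variable');
  return value === '1' ? 'variation' : 'original';
}

// In the browser you might then persist it once, on landing:
// sessionStorage.setItem('experiment_variant', getVariant(location.href));
// and on every subsequent page load old.js or new.js based on that value.
```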
Any help or suggestions would be very welcome.
Thank You
Saul

Compare URL access with Google Analytics

We have a bunch of URLs of the form https://sensr.net/cameras/... Sometimes the suffix is an integer, sometimes it's text.
For example:
https://sensr.net/cameras/88
https://sensr.net/cameras/columbine-lake-webcam
https://sensr.net/cameras/ap-cam1--2
I would like to find a way in GA to find which of these URLs are most popular. Is there some way to track URLs of a specific form in GA for comparison?
You can check page performance using the "All Pages" report, found under Behavior >> Site Content.
Once you open the report you can see all the pages along with some metrics, and based on these metrics you can measure performance.
For example, if people read and stay longer on some pages than on others, you can assume those are quality pages. Measure this with the "Pageviews" and "Avg. Time on Page" metrics.
If people are leaving a page or the site quickly, that means those pages need attention. Maybe the content is not relevant, or there is not enough of it to keep users engaged. Use the "Bounce Rate" metric to measure this: a high bounce rate means page performance is poor.
Refer to this link for more info: https://support.google.com/analytics/answer/2404517?hl=en&ref_topic=1120718
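As for isolating just the /cameras/... URLs from the question: the All Pages report's advanced filter supports regular-expression matching on the Page dimension, so a pattern like the one below (shown here in JavaScript so it can be checked) should match exactly those paths, whether the suffix is an integer or text.

```javascript
// One path segment after /cameras/, numeric or text.
// The same expression can be pasted into the report's advanced filter.
const cameraPath = /^\/cameras\/[^\/]+$/;

cameraPath.test('/cameras/88');                    // numeric suffix: matches
cameraPath.test('/cameras/columbine-lake-webcam'); // text suffix: matches
cameraPath.test('/cameras/88/edit');               // extra segment: does not match
```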
