domInteractive vs Time to Interactive - what's the difference? - google-analytics

Google offers a number of polyfill libraries for measuring and tracking First Input Delay (FID) and Time to Interactive (TTI) on analytics platforms. However, these metrics do not come standard with GA.
domInteractive, however, is a metric you can track out of the box with GA.
What's the difference? The only explanation I've found for the competing interactive metrics is a vague forum post explaining that TTI may offer a more complex look at interactive delays, but without much in the way of details.
Am I better off tracking TTI on my users if I'm concerned about input delays affecting conversion, or am I fine to stick with domInteractive?

My understanding is the following:
Time to Interactive (TTI) is when the website is visually usable and engaging. For example, when a user can click around on the UI and the website is functional. Ideally, we want all experiences to get interactive as soon as possible. Examples of websites with poor TTI are websites where a user can be actively engaged with the UI for a good amount of time before anything actually happens. Poor TTI is caused by too much (main-thread) JavaScript, which delays interactivity for UI elements that are already visible. An example of this is here. This is an especially important metric to consider for mobile, since not everyone has a high-end phone (so it will take longer to parse the JavaScript needed to load a site), and because of the variance that comes with different network speeds: Wi-Fi, 3G, 4G, etc.
domInteractive, however, is when a page's primary content is visible and meaningful paints have occurred. At this stage a user can visually see the webpage and the corresponding UI elements that represent the site's DOM.
First Input Delay (FID) is the measurement of how long it took to respond to a user event. For example, how long did it take for a button's event handler to take over and respond once the user clicked the button.
As far as I know FID and TTI are experimental metrics right now so they probably wouldn't be baked into Google Analytics by default. As for your question: "Am I better off tracking TTI on my users if I'm concerned about input delays affecting conversion, or am I fine to stick with domInteractive?" You actually want to track FID if you're concerned with input delays affecting conversion. TTI is still a very useful metric to track since it measures when your site as a whole is interactive and both TTI and FID will provide more value than domInteractive.
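For completeness, here is a minimal sketch of how the polyfills mentioned above could be wired into analytics.js; the event category and labels are arbitrary choices of mine, and the exact APIs may differ between polyfill versions:
```js
// Sketch only: assumes the tti-polyfill module and the first-input-delay snippet
// (which exposes a global `perfMetrics`) from Google are already loaded on the page.
import ttiPolyfill from 'tti-polyfill';

ttiPolyfill.getFirstConsistentlyInteractive().then((tti) => {
  // Report TTI (ms since navigation start) as a non-interaction GA event.
  ga('send', 'event', {
    eventCategory: 'Performance Metrics',   // arbitrary category name
    eventAction: 'time-to-interactive',
    eventValue: Math.round(tti),
    nonInteraction: true,
  });
});

perfMetrics.onFirstInputDelay((delay, evt) => {
  // Report FID and the type of input that triggered it.
  ga('send', 'event', {
    eventCategory: 'Performance Metrics',
    eventAction: 'first-input-delay',
    eventLabel: evt.type,
    eventValue: Math.round(delay),
    nonInteraction: true,
  });
});
```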
If you're still interested check out this explanation on the Cost of JavaScript by Addy Osmani. He does a beautiful job explaining the performance issues we are facing with JavaScript as well as talking about TTI and FID.
Cheers

According to this link, domInteractive is "when the parser finished its work on the main document". Time to Interactive is when all page scripts (including libraries such as Angular as well as your own) have finished initializing, the page is not frozen, and the user can start interacting with it.

Had to dig into the Spec but I think I found what I was looking for:
The DOMContentLoaded event fires after the transition to "interactive" but before the transition to "complete", at the point where all subresources apart from async script elements have loaded.
Basically domInteractive will not reflect async scripts that are still loading in, which is why your TTI metric can vary so widely.
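For what it's worth, domInteractive itself is just a Navigation Timing timestamp you can read directly in the browser; a quick sketch:
```js
// domInteractive from the Navigation Timing API, in ms relative to the start of navigation.
const [nav] = performance.getEntriesByType('navigation');
if (nav) {
  console.log('domInteractive:', Math.round(nav.domInteractive), 'ms');
} else {
  // Fallback for older browsers that only expose the legacy performance.timing object.
  const t = performance.timing;
  console.log('domInteractive:', t.domInteractive - t.navigationStart, 'ms');
}
```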

Related

Unusual amount of pageviews in GA4 - result of headless web?

I manage the analytics of a website that uses a headless web, and have noticed an unusual number of page_view events on some of the pages.
Perhaps it could have something to do with the website being headless, meaning that the URL doesn't change/refresh when clicking, even though the content on the site changes as if it were a URL redirect.
Does this make sense? Does anyone have any good suggestions on why my events might be off?
My first thought was that the event tracking configuration wasn't set up correctly, resulting in multiple pageviews on the wrong pages (i.e. first page visit → 2nd page → 3rd page = three pageview fires on first page), but upon investigation this doesn't seem to be the problem.
Checked for bot traffic and it doesn't seem to be that, as we're also tracking through UA and Matomo and those numbers look far more plausible.
First, what you've described is not necessarily a headless website; it's just a misconfigured single-page application (SPA) that doesn't bother updating the URL. That is a huge SEO issue, but not a blocker for analytics. And when an SPA affects analytics, it most commonly produces fewer events, not more.
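To illustrate why that usually means fewer events rather than more: an SPA normally has to send its own virtual pageviews on every in-app navigation, roughly like the gtag sketch below (the measurement ID and the place where you call the helper are placeholders):
```js
// Turn off the automatic page_view that fires on config, so it isn't double-counted...
gtag('config', 'G-XXXXXXX', { send_page_view: false }); // placeholder measurement ID

// ...and send one manually every time the SPA swaps in a new "page".
function trackVirtualPageview(path, title) {
  gtag('event', 'page_view', {
    page_location: location.origin + path,
    page_title: title,
  });
}
```
If nobody wired up that second part, the SPA under-reports; it takes an actively duplicated trigger to over-report.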
If the bot traffic inflates one analytics system as a side-effect of whatever it does, it will inflate similarly pretty much any other analytics system, so if numbers in UA and Matomo look alike, it doesn't rule out bots. Especially if your GTM sends events to both systems on the same triggers.
Now, there are ways to debug this beyond just visiting the website and watching how a few pages track.
In cases like this, you want to use data to debug your tracking.
You build a report (a custom report, or just use the pregenerated UA reports) in which you compare the anomalous traffic period to the previous period, so that you have a baseline. Then, for whatever dimension you're inspecting, you look for the value or the few values of that dimension that contain most of the anomalous traffic. This is to see if any dimension contains an outlier that would explain the nature of the anomaly.
Dimensions that I would look at right away are: hostname, country, hour of the day, source, page, landing page, exit page, referrer. I would also take a quick look at all the conversion numbers, bounce rate and avg time on site.
If all of these look organic, then I would presume natural growth of good traffic.

How to fix Core Web Vitals FID issue in Asp.Net?

I have created an e-commerce web application in ASP.NET (backend: VB) with the BVcommerce tool, and it is working absolutely fine. However, due to Google's new guidelines for page ranking and SEO, the application has to pass the Web Vitals test, so I have changed some of my code. Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS) now pass at more than 85%, but First Input Delay (FID) passes at only 40%, so can you please help me solve that?
I have also tried removing all homepage content (images, menu and footer) but the results are still the same as earlier.
My application uses a full page-load mechanism because it was created 4 years ago, so that creates an issue. I have also attached the result image here.
If anyone has an idea about this, please let us know.
First Input Delay (FID) is a pure real-user metric: it measures the delta between when an input event is received and when the main thread is next idle to execute that event's handlers, so main-thread blocking time is directly proportional to this metric's score.
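As a rough sketch of how that delta is observed in the field (this uses the Event Timing API's first-input entry; in practice you would more likely use Google's web-vitals library):
```js
// FID = time between the user's first input and when its handlers could start running.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const fid = entry.processingStart - entry.startTime;
    console.log('FID:', Math.round(fid), 'ms, input type:', entry.name);
  }
}).observe({ type: 'first-input', buffered: true });
```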
The Web Vitals documentation suggests some good optimizations to improve this score:
Break up Long Tasks
Optimize your page for interaction readiness
Use a web worker
Reduce JavaScript execution time
It all comes down to main-thread optimization in the end. An "idle until urgent" approach is among the best for optimizing performance, and hence the Lighthouse score. Try reducing non-critical network requests for resources (CSS and JavaScript files) that load over the main thread, and load them asynchronously instead.
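As an illustration of the "break up long tasks" point above, here is a small sketch that chunks a big batch of work and yields back to the main thread between chunks, so input handlers don't get stuck behind one long task (the item list and per-item function are hypothetical):
```js
// Process items in small chunks, yielding to the browser between chunks,
// so user input isn't blocked behind one long task.
function processInChunks(items, handleItem, chunkSize = 50) {
  let i = 0;
  function runChunk() {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      handleItem(items[i]); // hypothetical per-item work
    }
    if (i < items.length) {
      setTimeout(runChunk, 0); // yield to the main thread, then continue
    }
  }
  runChunk();
}
```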
Image-heavy pages are better optimized by determining when a media item has entered the user's viewport and only loading it then. This is achievable with the Intersection Observer API, and it is the lazy-loading methodology used by modern mobile-first frameworks for a better user experience.
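A minimal Intersection Observer sketch along those lines, assuming images are marked up with a data-src attribute so the real download is deferred:
```js
// Swap in the real image source only when the element approaches the viewport.
const lazyImages = document.querySelectorAll('img[data-src]');
const io = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // start the actual download
      observer.unobserve(img);   // each image only needs to load once
    }
  }
}, { rootMargin: '200px' });     // start a little before it scrolls into view

lazyImages.forEach((img) => io.observe(img));
```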

Do I necessarily need to remove "Render Blocking CSS"

I put my homepage through Google's PageSpeed test and it gave me a score of 69 for Mobile and 95 for Desktop. The one and only issue being a Render Blocking CSS.
Now, all the web pages on my website are above the fold, i.e. there is no scroll involved anywhere. Given this, I personally feel I should not be doing anything special, since the CSS is required to view my page the way I designed it, right from the get-go.
If I do asynchronous loading or something, it'll end up showing the content on a black-and-white, unorganised page just before the intended output.
Do I ignore Google? It would mean that I'd never score 100/100, and wouldn't that affect my SEO chances?
TL;DR — No, you don't have to. But in most cases, it helps, indirectly.
Render blocking is in place to prevent FOUC (flash of unstyled content).
Ideally you should only load the CSS responsible for rendering the "above the fold" of your page as render blocking and all the rest of your styles using async methods.
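One simple way to do that without any build tooling is to keep only the critical CSS render-blocking and append the rest from script once the page has rendered; a rough sketch (the stylesheet path is a placeholder):
```js
// Load non-critical CSS after first render so it doesn't block painting.
window.addEventListener('load', () => {
  const link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/non-critical.css'; // placeholder path
  document.head.appendChild(link);
});
```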
However, most sites load all their CSS as render blocking. Why? Because most websites cannot afford a CSS specialist to customize their CSS loading for their specific case. They'll sometimes pay for a theme, but that's it.
Themes are not typically optimized from this point of view because there is no way to know what elements the user will want in their above the fold area.
Is this a huge problem?
NO.
First of all, all of this only matters when the user loads the very first page of your website. All the other pages will use the cached stylesheets (already loaded on the first page visited), unless you load different stylesheets for different pages.
And second of all, the general idea that Google lowers your page's SEO score for having render-blocking CSS is, technically, wrong. They do penalize for a lot of other reasons (like accessibility, readability and responsiveness issues) but not for having render-blocking CSS.
However, there is an indirect correlation between the two.
Google PageSpeed is a tool that tells you how you can improve the loading speed of your page, or how to leave the impression that the page loads faster:
if you fix the problems it identifies, the page will load faster, or at least it will seem to load faster;
if your page is, or feels, faster, there is less chance users will hit the back button while waiting for your page to load.
THIS user behavior is where the SEO penalty comes in. Google registers any such behavior as a general "the user did not find what they were looking for on that website" and lowers the page's SEO score for whatever the user searched for.
Any method of keeping users from hitting the back button in the first 30 seconds after they leave the search results for your website (anything that keeps the bounce rate down) is a good method to fight SEO penalties.
And... it's true: one of the most efficient methods is to make your page load faster.
Others include:
make the loading process look professional (place correctly sized placeholders for images, so the page doesn't jump around when loading);
keep FOUC as close to 0 as possible
render something, rather than nothing
if possible, give users a general idea of how much of the page has loaded (in %)
make the website load up with some basic schema of what's on longer pages. Users will read the schema, trying to figure out if they're on the right page, and they won't notice the loading time, since you give them something to do while waiting
cut the "bla bla" and try to be honest about whatever your page has to offer
I can't emphasize this enough: it really pays off to be honest. There is a huge difference in results, SEO wise:
If your page is about A, but you want to show this to users looking for B, do not tell them you've got B and don't hide it from them. Just tell them:
"Look, this is not B, it's A, but here are a few reasons why you should consider A instead of B."
Most users will read those reasons, especially if they're well written, address real problems, and don't look like they're just trying to buy time.
A very good idea is to place your strongest argument second or third in the list (second if first is rather long, third if first two are not so long).
The reasoning is: if you place it further down, many users don't read past three weak arguments - they label the entire list as unconvincing and go back.
Also, if you place the arguments in the order of their importance, the user will realize it and, as soon as they reach two arguments that are not convincing, they'll assume it gets worse further down the list and, again, they'll hit the back button.
But if you place a second or third argument stronger than the previous ones, they will read through the entire list hoping to find another one.
Now, if your arguments are compelling, the user will go for A instead of B => Win.
If not, they will still go for B, but at least they'll do it later (after they read your reasons), and the penalty will be much smaller, if any (the longer time a user spends on your page, the less the penalty, should they press back) => No loss.
If you can keep the user occupied for more than 30 seconds, you're typically in the clear SEO wise. And that's the really important SEO issue at hand, not render-blocking per-se.
In the end, it is totally possible to create a page with a very low score on Page Speed while having a very high SEO score. It's unlikely, but totally possible.

Finding click-counter for NFP website, written in iframe

I am a non-programmer working for a church. We have no tech staff. Our website is based upon a template that doesn't provide a widget for counting clicks. We'd like to add one (or preferably two) jpg image(s) with a counter(s) to track the number of times clicked, and display the cumulative total next to the jpg(s). Church members will go to the page and click each time they participate in one or both of two different church objectives.
Our web host says to do this I must find, write, or purchase 3rd party code written in iframe, to embed into one of our pages.
I googled the issue and am only finding hit counters which track visitors to a page, rather than clicks on an image. We'd prefer two different jpgs to track two different objectives, but if necessary I can change from two jpgs to one, if having two counters on the same page is a problem.
Can anyone point me to where I could get code like this either for free, or for pay, and what it would cost?
There is a lot of good information here. They talk about an issue with the iframe receiving the click vs. you recording it. If you keep reading, there is a possible way to work around it. Hope this helps!
Look here: Detect Click into Iframe using JavaScript
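For reference, the gist of the technique discussed in that linked question, sketched very loosely: watch for the parent window losing focus while the iframe is the active element. The element ids here are placeholders, and this only counts clicks in the visitor's current page session; for a cumulative total you would still need to send each detected click to some server-side counter.
```js
// Very rough sketch: treat "window lost focus while the iframe is focused" as a click.
const frame = document.getElementById('counter-frame');            // placeholder iframe id
let clicks = 0;

window.addEventListener('blur', () => {
  if (document.activeElement === frame) {
    clicks++;
    document.getElementById('click-total').textContent = clicks;   // placeholder display element
    setTimeout(() => window.focus(), 0); // take focus back so later clicks can be detected
  }
});
```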

How to speed up Google adsense and analytics loading time?

This might fall under the category of "you can't", but I thought it might be prudent to at least see if there is something I can do about this.
According to FireBug, the major bottleneck in my page loading times seems to be a gap between the loading of the html and the loading of Google adsense and analytics. Notice in the screenshot below that the initial GET only takes 214 ms, and that adsense + analytics loading takes roughly 130 ms combined. However, the entire page load time is 1.12 seconds due to that large pause in between the initial GET and the adsense/analytics loading.
If it makes any difference at all, the site is running off of the ASP.NET MVC RC1 stack.
(Screenshot: http://kevinwilliampang.com/pics/firebug.jpg)
Update: After removing AdSense and Analytics, I'm still seeing a slow response time. Hovering over the initial GET request, I see the following timings: 96 ms Receiving Data, 736 ms DOMContentLoaded (event), 778 ms 'load' (event). I'm guessing then that the performance is a result of my own jQuery JavaScript that has processing tied to the $(document).ready() event?
You should place your analytics code at the bottom of the page so that everything else loads first. Other than that, I don't think there's much you can do.
edit: Actually, I just found this interesting blog post on a way to speed up analytics by hosting your own urchin.js file. Maybe it's worth a look.
I've never seen anything like that using Firebug on Stack Overflow and we use Analytics as well.
I just ran a trace and I see the request for
http://www.google-analytics.com/__utm.gif?...
happening directly after the DOMContentLoaded event (the blue line). So I'd suspect AdSense first. Have you tried disabling that?
As it goes, I happen to have rather heavily researched this just this week. Long story short, you are screwed. As others have said the best you can do is put it at the bottom of the list of requests and make the rest of your code depend on ready rather than onload events - jQuery is really good here. Some portion of the js is static, so you could clone that locally if you keep an eye on it for maintenance purposes.
The Google code isn't quite as helpful as it could be in this area*, but it's their ballgame and anything you do to change it is going to be both complex and risky. In theory, wrapping it in a non-blocking script call in the header is possible, but it would be unlikely to gain you a benefit given the additional abstraction, and ultimately with AdSense your payload is an HTML source, not a script.
* it's possible Google has a good reason, but nothing I can deduce from the code they expose
Probably not anything you can do aside from putting those includes right before the closing body tag, if you haven't already. JavaScript includes block parallel HTTP requests, which is why they should be kept out of <head>.
Surely Google's servers will be the fastest part of the loading, given that your ISP and most ISPs will have a local cache of the files too?
You could inject the script into the head on page load perhaps, but I'm not sure how that affects urchin.js.
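Injecting it looks roughly like the sketch below: create the script element from JavaScript and mark it async, so the browser fetches it without blocking parsing (the URL is the legacy ga.js one from this era; the newer official snippets already do essentially this):
```js
// Create the analytics <script> tag from JS so it loads without blocking HTML parsing.
(function () {
  var s = document.createElement('script');
  s.src = 'http://www.google-analytics.com/ga.js'; // legacy URL; adjust to whatever you actually load
  s.async = true;
  var first = document.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(s, first);
})();
```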
Could be that your page simply takes that long to parse? It seems nothing network-related is happening. It simply waits around a second before the adsense/analytics requests are even fired off.
I don't suppose you have a few hundred tables on the page or something? ;)
