I know this question has been asked before, but Google's own support documentation contradicts itself, which is why I am confused.
When generating a Google Analytics tracking code, Google Analytics tells me to put the code immediately after the opening <body> tag.
However, I read in Google's support documentation that it should be inside my <head> tag:
https://support.google.com/analytics/answer/1008080?hl=en-GB
Paste your snippet (unaltered, in its entirety) into every web page that you want to track. Paste it immediately before the closing </head> tag.
Does anyone know which one is best?
As long as it's enclosed properly between <script> tags, it will work either way. The only trade-offs are these:
including it in the <head> section can result in slower page rendering (because the JS executes before the rest of the page is parsed);
including it at the bottom of the page might not count people who land on your page, stay for 3 seconds, and leave.
The latter won't happen on small pages, but I've seen it happen on forums and blogs where there is a lot of HTML to render, a couple of feet of scrolling below the fold, and several seconds until the page completes. In those cases it is possible for people to leave before the GA snippet has executed.
The current analytics code is asynchronous, so even if you put it in the head it shouldn't affect rendering time by more than a few milliseconds.
TL;DR: it's about the same for normal pages.
As long as you place the code as Google provides it, the tracking will fire.
The higher on the page you have your code, the better your stats will be.
For example, suppose a user is on a slow internet connection (think smartphone with a poor signal) and your page takes 3 seconds to load. If your tracking code is higher on the page, it will fire sooner and start tracking the user's time on site, including the bulk of the load time.
Now say that the GA code is the very last tag on the page, and after 2 seconds the user sees the link they're looking for and clicks it before the page has fully loaded. In that case the GA code may never fire from the bottom of the page, and you've missed analytics on a hit/visit/visitor that actually reached your site.
Having the GA code in the header or near the top of your code does NOT have to slow down the load time: you can call the GA code asynchronously to keep it from slowing down the rest of the load (documented here: https://developers.google.com/analytics/devguides/collection/gajs/).
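For reference, this is roughly the asynchronous ga.js snippet from that documentation, placed near the top of the <head> (UA-XXXXX-X is a placeholder for your own property ID):
<script type="text/javascript">
// Commands queue up in _gaq until ga.js has loaded
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder property ID
_gaq.push(['_trackPageview']);
// Load ga.js asynchronously so it doesn't block HTML parsing
(function() {
var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
})();
</script>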
Related
I have these 2 tags that work properly everywhere except on the landing pages hosted by our LP-builder provider, RDStation. One sends the conversion to Google Analytics and the other to Facebook. Both are triggered by a single element-visibility trigger. What I'm tracking is engagement with a 3-step form; I expect an event in GA and FB after completion of the first step.
These tags work in preview mode, and both the Pixel and Tag Manager are present on the live pages. The form on other pages outside this host uses the same code. I've tried moving code around, editing the form, and editing and recreating tags and triggers, with no success. For some reason the tags very occasionally work, but I wasn't able to isolate the cause, though.
I've been stuck on this for 2 days; any ideas?
LP: https://materiais.bodyscience.pt/endermologia-02-2021-cons-online
If you have published the GTM container with the updated tags, then in my opinion the problem may be in how your page is built. If you look at the page source, you can see that the Tag Manager snippet appears twice, and that you have two <head> tags and two <body> tags... Some conflict is probably due to this, so first fix the page and then try again.
I have a WordPress website running and I am using the W3 Total Cache plugin to make the site load faster. When I scan the site in Google PageSpeed Insights, I notice I am getting inconsistent scan results. I have a Facebook Messenger chat floating on the page and a Google map. Since these two gave me the "Reduce the impact of third-party code" warning, I made changes so that they load only after the DOM has loaded completely; I used jQuery with setTimeout for this. I actually managed to remove the warning from the result by doing this. But every now and then I notice the same warning coming back, even though I have made the adjustments. If I scan the site two or three times in a row, the warning may go away, but it is back again when I try after a while.
These are the results of frequent scans. Do you have any idea what could be going wrong here? I've spent a lot of time searching but couldn't get my head around it.
With the classic HTTP/1.0 Hypertext Transfer Protocol, resources like JavaScript, CSS, HTML, images, etc. are loaded in request/response pairs: the browser sends a request for a resource (be it CSS, JavaScript, and so on) and waits for the response to come back before it requests another resource over that connection. Even so, the request/response pairs are not always going to follow the same strict sequence, due to randomness in network latency, server response time, the load the server is currently experiencing, etc.
With HTTP/2 and HTTP/3, the newer versions of the HTTP protocol, requests can be sent all at once instead of waiting for a response before sending the next one. I checked your website and saw that it is using HTTP/2 and HTTP/3. Since requests can be sent all at once, this can contribute a degree of "inconsistency" as well, among other things. Even with HTTP/1 there is always a degree of randomness, since many factors play into it: the server response time will differ, the network latency will differ, etc.
To illustrate this, if you are using the Chrome browser, open Developer Tools by clicking the three dots at the very top right corner of the browser, then "More Tools", then "Developer Tools". Alternatively, press "Ctrl+Shift+I" on Windows or "Command+Option+I" on Mac. Then go to the "Network" tab and refresh the page. Each time you refresh, the resources load in a slightly different sequence:
In that screenshot, using the Google Tag Manager UA-174548329-1 JavaScript as an example (I know it's probably not the Google map), it loaded as the 4th resource.
When I refreshed the page again, the same Google Tag Manager UA-174548329-1 JavaScript loaded as the 11th resource:
When the page is being loaded, or when you run it through Google's PageSpeed Insights, the main thread is sometimes busy and sometimes not, due to the randomness of these requests and responses. Your main thread is also constructing the DOM and doing a lot of other work, and it sometimes gets blocked by render-blocking resources such as JavaScript.
JavaScript blocks the critical rendering path by default. Without looking at your setTimeout implementation it's hard to say exactly how you are delaying your JavaScript, but it's safe to assume it probably doesn't help clear the critical rendering path. Instead of setTimeout, you should use defer or async.
You can read more about the critical rendering path here. The main thread is the main process your browser runs to do most of the work of processing and rendering the CSS, JavaScript, and HTML on a page. The critical rendering path is "the sequence of steps the browser goes through to convert the HTML, CSS, and JavaScript into pixels on the screen" (quoted from Critical Rendering Path), i.e. the sequence in which your JavaScript, HTML, CSS, images, and other resources are downloaded and rendered. Optimizing the critical rendering path requires a lot of knowledge and is no easy job. However, there are two attributes you can try in the script tag, namely "async" and "defer", to control when your JavaScript is executed.
Take a look at this image (credit: Growing with the Web):
https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/loading-third-party-javascript/?utm_source=lighthouse&utm_medium=unknown
As you can see, you can try putting the async attribute or the defer attribute on your script tag and see if it helps.
With the 'async' attribute on the script tag, your JavaScript is executed asynchronously as soon as it has downloaded. The blue bar under <script async> in the image shows that the script downloads at the same time the HTML is being parsed, since the green and blue bars run in parallel. As soon as the download finishes, the script executes; at that point, HTML parsing pauses until the script has finished executing. Without the 'async' attribute, HTML parsing is paused (blocked) while the script is both downloaded and executed.
With the 'defer' attribute on the script tag, you are deferring the execution of your JavaScript until the DOM has finished parsing. The script still starts downloading as soon as the browser encounters it, but the download doesn't block HTML parsing.
In summary, you can use the 'async' attribute on your third-party scripts to 'unblock' your main thread to a certain degree: they download in the background while your DOM is being rendered, which speeds up the main thread a bit. One caveat is that the execution itself is still render-blocking. A very important thing to note is that with 'async' you should be prepared for some possibly erratic page behaviour: more 'inconsistencies' can appear, because the JavaScript can now execute at any time in the rendering path, so if something needs to happen before or after the script, you might break that flow and logic.
Or you can use the 'defer' attribute on your third-party scripts to tell them to execute only after the DOM has loaded completely. This speeds things up only a little, because the download can now happen in parallel with HTML parsing (versus a default script tag with neither defer nor async), but the execution still takes an overhead on the main thread.
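For example, a minimal sketch of the three variants (the script URL is a placeholder for whichever third-party script you are loading):
<!-- Default: blocks HTML parsing while the script downloads and executes -->
<script src="https://example.com/third-party.js"></script>
<!-- async: downloads in parallel, executes as soon as the download finishes -->
<script async src="https://example.com/third-party.js"></script>
<!-- defer: downloads in parallel, executes only after HTML parsing finishes -->
<script defer src="https://example.com/third-party.js"></script>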
As per Google's support document, there's a section on "How do you load third-party script efficiently?". Here are a few of its suggestions (a sketch of the resource-hints option follows the list):
"
Load the script using the async or defer attribute to avoid blocking document parsing.
Consider self-hosting the script if the third-party server is slow.
Consider removing the script if it doesn't add clear value to your site.
Consider Resource Hints like <link rel=preconnect> or <link rel=dns-prefetch> to perform a DNS lookup for domains hosting third-party scripts.
"
Other methods:
Check out how to compress, minify, or combine multiple JavaScript files into one (if you are serving your JavaScript as files). Use GZIP compression for your JavaScript and CSS. Also check out how to load third-party scripts via a CDN (Content Delivery Network), among other things.
Updated Aug 12, 2020:
In response to your comment: since your third-party scripts come from plugins and you can't add the 'async' or 'defer' attribute to their script tags yourself, you can consider adding this before your other scripts:
<script>
// If the script tag has an id, use either one of these:
document.getElementById("your_script_tag_id").async = true;
document.getElementById("your_script_tag_id").defer = true;
// If the script tag has a class name, use either one of these:
document.getElementsByClassName("your_script_tag_class_name")[0].async = true;
document.getElementsByClassName("your_script_tag_class_name")[0].defer = true;
// To apply it to every script tag on the page, loop over all of them:
var scripts = document.getElementsByTagName("script");
for (var i = 0; i < scripts.length; i++) {
scripts[i].async = true; // or scripts[i].defer = true;
}
</script>
You can also check this out: Async JavaScript, which lets you defer or async your JavaScript, including the third-party scripts.
From what I can see, you have set the "delay" on the Facebook Messenger chat to 3 seconds. However, your site often takes a lot longer than that to load the initial content.
Your site will often not have loaded the "above the fold" content within 3 seconds, due to things like network latency, load on your server, etc.
For this reason, the Facebook Messenger chat script gets loaded at a point where the CPU may or may not be busy. This matters for things like "Total Blocking Time", which listens for the CPU's first quiet period to work out when the page is usable.
For working out "impact of third-party code", the audit looks at whether the CPU is busy with third-party work while trying to render the "above the fold" content. That is why it sometimes shows as an impact and sometimes does not: sometimes your above-the-fold content has loaded sufficiently before Facebook Messenger is initialised.
Additionally, you have to consider when your main JS file containing the timeout is loaded; sometimes it will load sooner depending on latency etc., so this affects the time at which the fbDiv is added as well.
There is a lot to cover, so to simplify the answer (there is an awful lot to explain about why this happens): either increase the delay on Facebook Messenger, or only load it on a button click.
For example, you could have a button that says "chat with us" and use the click event to load Facebook Messenger (and hide the "chat with us" button). This would be my recommendation.
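A rough sketch of that approach (the SDK URL and initialisation details depend on how your Messenger plugin is set up, so treat them as placeholders):
<button id="chat-with-us">Chat with us</button>
<script>
document.getElementById("chat-with-us").addEventListener("click", function () {
// Only load the Messenger chat script once the user asks for it
var s = document.createElement("script");
s.src = "https://connect.facebook.net/en_US/sdk/xfbml.customerchat.js"; // placeholder URL
s.async = true;
document.body.appendChild(s);
this.style.display = "none"; // hide the button while the chat loads
});
</script>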
Alternatively, looking at the load speed of your site, you could set the delay to about 7 seconds and it would then (probably) be consistent.
We are having an issue with our tracking on www.x3tradesmen.com, where a Google Tag Manager tag is firing way too many times and we cannot determine why...
We only have one website event tag linked to Google Analytics, called Form Submit, and we would typically receive between 2 and 10 Form Submit events per day at most. Recently, however, we have noticed the tag firing thousands of times sporadically, and we cannot pinpoint the issue. We have also noticed that our user count increases drastically for short periods (minutes/hours). We typically get 40-80 users per day on our website, but we once saw a massive spike of around 400 users in less than an hour.
We recently added the Facebook pixel via GTM, and that is really the only change we have made; now we are seeing these issues. Does anyone know of common reasons why it would behave this way, or can anyone see any major issues with our implementation of GA or GTM on our website that would cause this?
I know this information is vague, so please let me know if there is specific information that would help identify the issue.
Thanks in advance!
I presume it is the FB pixel. Facebook automatically collects information in addition to what you have configured yourself and uses post/submit events to send it. You can disable that behaviour as per the documentation and see if it makes a difference:
Automatic Configuration
The Facebook pixel will send button click and page metadata (such as data structured according to Opengraph or Schema.org formats) from your website to improve your ads delivery and measurement and automate your pixel setup. To configure the Facebook pixel to not send this additional information, in the Facebook pixel base code, add fbq('set', 'autoConfig', 'false', '') above the init call.
I had a similar issue where additional submit events suddenly turned up in the GTM preview pane, which I finally tracked down to FB, so there is a good chance that yours is the same problem.
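For orientation, the call sits in the base code roughly like this (a sketch based on the standard pixel snippet; YOUR_PIXEL_ID is a placeholder):
<script>
!function(f,b,e,v,n,t,s){if(f.fbq)return;n=f.fbq=function(){n.callMethod?
n.callMethod.apply(n,arguments):n.queue.push(arguments)};if(!f._fbq)f._fbq=n;
n.push=n;n.loaded=!0;n.version='2.0';n.queue=[];t=b.createElement(e);t.async=!0;
t.src=v;s=b.getElementsByTagName(e)[0];s.parentNode.insertBefore(t,s)}(window,
document,'script','https://connect.facebook.net/en_US/fbevents.js');
// Disable automatic event/metadata collection BEFORE the init call
fbq('set', 'autoConfig', 'false', 'YOUR_PIXEL_ID'); // placeholder pixel ID
fbq('init', 'YOUR_PIXEL_ID');
fbq('track', 'PageView');
</script>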
Since the evening of last Wednesday (March 15, 2017), I have noticed that the Google Analytics script is not being requested by my website consistently. When the script does load, the tags are not firing, for example on button clicks.
Is anyone else having this issue? I have searched for threads about any malfunction on Google's side and could not find any. Google recently had an issue with their captcha, which is why I am thinking this is an issue on their side.
To note, the site is built on a CMS and the tag manager code snippet is inherited on every page. This snippet has not been altered in the time surrounding March 15th.
Edit: Moving the code snippet from the head to the body creates consistent behaviour: gtm.js and analytics.js load, and the tags fire and register in GA.
Check that you are not declaring dataLayer = [{...}] after the script part of the container snippet in your page code. That declaration should only appear before the container snippet.
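In other words, roughly (the pushed key/value is just a hypothetical example):
<script>
// Correct: declare and populate dataLayer BEFORE the GTM container snippet,
// and push onto it afterwards rather than redeclaring it
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({ pageCategory: 'landing-page' }); // hypothetical value
</script>
<!-- GTM container snippet goes here, after the declaration above -->
Declaring dataLayer = [{...}] after the snippet replaces the array GTM is already using, which can break tag firing.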
The first thing to check, which you may have already done, is the position of the two code snippets used for the implementation. I always position the <script> tag first in the <head>. Also, make sure the <noscript> tag is the first tag rendered in the <body>.
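That layout looks roughly like this (GTM-XXXXXXX is a placeholder container ID and the snippet body is abbreviated):
<!DOCTYPE html>
<html>
<head>
<!-- GTM container <script> snippet as high in the <head> as possible -->
<script>/* GTM snippet from your container's install instructions */</script>
</head>
<body>
<!-- GTM <noscript> fallback immediately after the opening <body> tag -->
<noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-XXXXXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<!-- rest of the page -->
</body>
</html>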
I had a similar issue once: my tags would fire inconsistently. After checking the console log, I found that jQuery was being declared toward the bottom of the <body>. When I moved the jQuery tag to the <head>, the issue was resolved. I can't imagine that your issue is related, because mine was caused by a developer's mistake. Nonetheless, check your console log to see if you notice any errors; it may help you determine the root of the problem.
One other thing to try is the Tag Assistant Chrome extension. Run it on your website, and it will run checks on your implementation.
Hope this helps! Let me know if you have further questions.
This might fall under the category of "you can't", but I thought it might be prudent to at least see if there is something I can do about this.
According to Firebug, the major bottleneck in my page load times seems to be a gap between the loading of the HTML and the loading of Google AdSense and Analytics. Notice in the screenshot below that the initial GET takes only 214 ms, and that the AdSense and Analytics loads take roughly 130 ms combined. However, the entire page load takes 1.12 seconds due to the large pause between the initial GET and the AdSense/Analytics requests.
If it makes any difference at all, the site is running off of the ASP.NET MVC RC1 stack.
(Firebug screenshot: http://kevinwilliampang.com/pics/firebug.jpg)
Update: After removing AdSense and Analytics, I'm still seeing a slow response time. Hovering over the initial GET request, I see the following timings: 96 ms Receiving Data, 736 ms DOMContentLoaded (event), 778 ms 'load' (event). I'm guessing, then, that the performance is a result of my own jQuery JavaScript that has processing tied to the $(document).ready() event?
You should place your analytics code at the bottom of the page so that everything else loads first. Other than that, I don't think there's much you can do.
edit: Actually, I just found this interesting blog post on a way to speed up analytics by hosting your own urchin.js file. Maybe it's worth a look.
I've never seen anything like that using Firebug on Stack Overflow and we use Analytics as well.
I just ran a trace, and I see the request for http://www.google-analytics.com/__utm.gif?... happening directly after the DOMContentLoaded event (the blue line). So I'd suspect AdSense first. Have you tried disabling it?
As it happens, I researched this rather heavily just this week. Long story short: you are screwed. As others have said, the best you can do is put it at the bottom of the list of requests and make the rest of your code depend on ready rather than onload events; jQuery is really good here. Some portion of the JS is static, so you could clone that locally if you keep an eye on it for maintenance purposes.
The Google code isn't quite as helpful as it could be in this area*, but it's their ballgame, and anything you do to change it is going to be both complex and risky. In theory, wrapping it with a non-blocking script call in the header is possible, but it would be unlikely to gain you a benefit given the additional abstraction, and ultimately with AdSense your payload is an HTML source, not a script.
* It's possible Google has a good reason, but it's nothing I can deduce from the code they expose.
Probably not much you can do, aside from putting those includes right before the closing body tag if you haven't already. JavaScript includes block parallel HTTP requests, which is why they should be kept out of <head>.
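For example (a sketch; the file names are placeholders):
<body>
<!-- ...page content renders first... -->
<!-- Script includes last, just before the closing body tag -->
<script src="/js/analytics-include.js"></script>
<script src="/js/adsense-include.js"></script>
</body>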
Surely Google's servers will be the fastest part of the load, given that your ISP, like most ISPs, will have a local cache of the files too?
You could perhaps inject the script into the head on page load, but I'm not sure how that affects urchin.js.
Could it be that your page simply takes that long to parse? It seems nothing network-related is happening; it simply waits around a second before the AdSense/Analytics requests are even fired off.
I don't suppose you have a few hundred tables on the page or something? ;)