I have a problem with Google Analytics. In the Google Ads campaign report, the number of clicks is six times higher than the number of sessions, and I have no idea why.
https://imgur.com/a/nENMVbx
Some discrepancy between clicks and sessions is normal, but the one shown in the image is unusually high.
It can be caused, for example, by slow page loading, by loss of the gclid parameter (for instance when redirects are involved), or by an error on the site that blocks tracking.
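To check the gclid scenario specifically, a small check like the one below on the landing page (purely illustrative) will tell you whether the auto-tagging parameter actually survives any redirects:

```javascript
// Purely illustrative check: with Google Ads auto-tagging on, the landing
// page URL should arrive with a gclid query parameter. If it is missing
// after a redirect chain, Analytics cannot tie the session to the ad click.
var params = new URLSearchParams(window.location.search);
if (!params.has('gclid')) {
  console.warn('No gclid on the landing page - a redirect may be stripping it.');
}
```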
Looking at Google Analytics Real-Time tracking, why is there such a difference between the two numbers?
It is a Google Analytics bug: the count of active users accumulates over time without the total being decreased once a user is no longer in that state. There is nothing to do; it realigns itself on its own.
https://www.analyticstraps.com/bug-numero-anomalo-di-utenti-attivi-in-tempo-reale/
We are using the free level of GA and have been creating reports using Custom Dimensions and Metrics since last summer.
We also use the Google Sheets Analytics add-on to post process data pulled from the API.
Overnight on 16-17 May (UK time), our reports suddenly started showing as sampled. Prior to that we had no sampling at all; because our reports are scheduled, I can look back through the revision history to see the changes made each time the scheduled reports ran.
This sampling occurs both in custom reports viewed in the GA platform and in the Google Sheets add-on. From some analysis, it appears to occur only once more than one Custom Dimension is added to a report, or when the GA dimensions ga:hour or ga:dateHour are used (ga:date does not trigger sampling).
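For anyone wanting to reproduce this, the equivalent request against the Reporting API v4 looks roughly like the sketch below (the view ID and custom dimension slots are placeholders); a sampled response carries samplesReadCounts and samplingSpaceSizes fields:

```javascript
// Sketch of a Reporting API v4 request that reproduces the sampling.
// The view ID and the ga:dimensionN slots are placeholders.
const reportRequest = {
  reportRequests: [{
    viewId: '123456789',                                  // placeholder view ID
    dateRanges: [{ startDate: '30daysAgo', endDate: 'yesterday' }],
    metrics: [{ expression: 'ga:sessions' }],
    dimensions: [
      { name: 'ga:dateHour' },                            // triggers sampling
      { name: 'ga:dimension1' },                          // hit-scoped custom dimension
      { name: 'ga:dimension2' }                           // second custom dimension
    ],
    samplingLevel: 'LARGE'                                 // request the largest sample
  }]
};
// POST this to https://analyticsreporting.googleapis.com/v4/reports:batchGet
// with an OAuth token; if reports[0].data.samplesReadCounts is present in
// the response, the report was sampled.
```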
All our Custom Dimensions and Custom Metrics are set at Hit level (I've read a post where it was claimed to be due to mixing scopes on Dimensions & Metrics, but we are not doing this).
If I reduce the date range of a query (suggested as a solution on many blogs), the sampling level actually gets worse rather than better.
For the month of May we didn't even hit 4k sessions at property level. I can't find any reference anywhere to any changes being made to GA that would cause sampling to apply to our reports (change documentation, Google Blogs etc).
Is anyone else experiencing this or can anyone shed any light on why this might be happening? Given how we use GA if we can't resolve this then it's a year of work down the drain, so I'm really keen to at least know why this has suddenly happened even if ultimately nothing can be done about it.
I have several months worth of data in Google Analytics that is currently more or less useless, because some pageviews were being counted 2 or 3 (or more?) times.
As far as I can tell, the issue came from a combination of a jQuery address plugin and AddThis adding a tracking # to URLs shared on social media.
I've removed the plugins and implemented this filter to stop Google Analytics from counting the trailing-slash and non-trailing-slash versions of a URL as two or more pageviews.
It works now, but is there a good way to go back and apply the filter to previous months?
The data should be salvageable, I think, since Analytics tracked the Sessions correctly, but the Pageviews and Unique Pageviews are way too high.
Comparing an affected time period last year to the same (unaffected) time period this year:
Unfortunately... Once GA data is sent and processed on their servers, it can't be changed.
Although your "pages/session" and "bounce rate" are unreliable, most of the other information should still be fine, for example "device type". Even conversion rate percentages are calculated as Conversions / Users - so in most cases that won't be affected.
My blog is about 7 months old. At my current level I usually get around 100 sessions per day. I have always actively filtered out all ghost referrers as they appear and thus should have virtually none of them appearing in my Google Analytics data. I have also checked the box that instructs Analytics to ignore "known bots".
So I'm wondering after all these measures, how many of my sessions each day should I still reasonably chalk up to bot traffic?
And a side question, is there anything else I can do to make my Analytics data more accurate in detecting only real human traffic?
One thing you could do is add an invisible link to your main page; anyone clicking on that link has a very high probability of being a bot, so you can flag and exclude those sessions.
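A rough sketch of that idea, assuming the Universal Analytics analytics.js snippet (the /bot-trap URL and event names are placeholders); sessions that fire this event can then be excluded with a segment or a filter:

```javascript
// Honeypot sketch: the link is invisible to humans but present in the DOM,
// so a click on it is a strong signal that the visitor is a bot.
document.addEventListener('DOMContentLoaded', function () {
  var trap = document.createElement('a');
  trap.href = '/bot-trap';                      // placeholder URL, never shown to users
  trap.setAttribute('aria-hidden', 'true');
  trap.setAttribute('tabindex', '-1');
  trap.style.cssText = 'position:absolute;left:-9999px;';
  trap.addEventListener('click', function (e) {
    e.preventDefault();
    // Flag the session; nonInteraction keeps the trap from affecting bounce rate.
    ga('send', 'event', 'bot-trap', 'click', { nonInteraction: true });
  });
  document.body.appendChild(trap);
});
```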
I am using the Google Analytics online tool to monitor visits to my site.
What bugs me is that I often see records containing the following entries:
Page Visits: 1.00
Average Visit Duration: 00:00:00
Bounce Rate: 100%
What does that mean?
If a visitor comes to my site, shouldn't they stay at least a couple of seconds before leaving?
Could this mean that something is wrong with accessing my site? (I had similar problems before, but I believe I fixed them, since I no longer get any errors when I try to access the site from different computers.)
When a visitor comes to your page, Google Analytics sets a cookie in which a timestamp is stored. When the user visits a second page on your site, Google compares the stored timestamp to the current time and calculates the visit duration from the difference between the two. If all your visitors have bounced, there is no second data point to compare the stored value to, and Google is unable to compute a duration.
A common workaround is to set a JavaScript timeout and trigger an event after ten seconds or so, sent as an interaction event (i.e. with the non-interaction flag left false; see the Google Analytics event tracking docs for details). The assumption is that somebody who looks at your page for more than ten seconds is not really a bounce. (I think that because "bounce rate" has such negative connotations, people try to avoid high bounce rates even at the price of introducing bad data; you should realize that a "bounce" simply means there are not enough data points to say anything meaningful about that particular visitor.)
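A minimal sketch of that workaround, assuming the classic ga.js snippet covered by the guide linked below (the category and action names are just examples):

```javascript
// After ten seconds, send an interaction event so the visit is no longer
// counted as a bounce. The last argument (non-interaction) is left false
// on purpose so the event does affect bounce rate.
setTimeout(function () {
  _gaq.push(['_trackEvent', 'engagement', 'time-on-page', '10s', 10, false]);
}, 10000);
```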
Personally I do not like that approach, because it redefines a visitor's inaction as action. A better idea (IMO) is to implement a meaningful interaction point - like a "read more" link that loads content via AJAX, or something similar - and track that via event tracking or a virtual pageview (see the sketch after the link below).
Event tracking guide:
https://developers.google.com/analytics/devguides/collection/gajs/eventTrackerGuide
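And a sketch of the "read more" approach mentioned above, again assuming classic ga.js (the element id and the virtual path are hypothetical):

```javascript
// Track a meaningful interaction instead of mere inaction: the click on a
// hypothetical "read more" link that loads extra content via AJAX.
document.getElementById('read-more').addEventListener('click', function () {
  _gaq.push(['_trackEvent', 'content', 'read-more', window.location.pathname]);
  // Alternatively, record a virtual pageview for the expanded content:
  // _gaq.push(['_trackPageview', window.location.pathname + '/read-more']);
});
```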
Short update: with Universal Analytics the technical details have changed (there are no longer cookies with timestamps; all information is processed on the GA servers), so the first paragraph is no longer up to date. However, the rest of the answer is still valid.
I'm having a similar issue. I monitor those placements and recently found out that the traffic is hardly reaching my site. A recent experiment showed that those are placements triggered by clicks from the GDN, but people never even reached my page; they were blocked by a pop-up blocker or similar software.