Google Analytics data limits - google-analytics

We are using Google Analytics on a webshop. Recently we added Enhanced Ecommerce to measure more events so we can optimize the webshop, but now we are seeing fewer pageviews and other data is missing.
I don't know what causes it, but on a specific page that was no longer being measured, I removed some items from the ec:addImpression data and the pageview was measured again.
I can find limits for GA, but I can't find anything about the amount of data that can be sent to GA, and this seems to be related to the amount of data being sent: if I shorten the name of a product, the pageview is measured again as well. GA is practically broken for us now because we are missing huge numbers of pageviews.
Where can I find these limits, or how will I ever know when I'm running into these limits?

On the one hand, I'm not sure how you are building your hits, but you should keep in mind the payload limit for sending information to GA (the limit is 8 KB).
On the other hand, there are collection limits that you should also consider (see the docs):
This applies to analytics.js, the Android/iOS SDKs, and the Measurement Protocol.
200,000 hits per user per day
500 hits per session
If you go over either of these limits, additional hits will not be processed for that session / day, respectively. These limits apply to Analytics 360 as well.
My best advice is to regulate the number of events you send, carefully considering which information has value. Enhanced Ecommerce data is no doubt important, so if the problem is the size, you should partition the productImpression data across multiple hits (a sketch follows below).
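Purely as an illustration of that partitioning idea, here is a minimal sketch. It assumes analytics.js with the ec plugin is already loaded and a hypothetical products array of your own; the chunk size and list name are placeholders you would tune yourself.

// Assumption: analytics.js and the ec plugin are loaded; `products` is your own array.
var CHUNK_SIZE = 10; // pick a size that keeps each hit safely under the 8 KB payload limit
for (var i = 0; i < products.length; i += CHUNK_SIZE) {
  var chunk = products.slice(i, i + CHUNK_SIZE);
  chunk.forEach(function (product, index) {
    ga('ec:addImpression', {
      id: product.id,
      name: product.name,
      list: 'Catalog',
      position: i + index + 1
    });
  });
  // Flush this chunk with a non-interaction event so it does not affect bounce rate.
  ga('send', 'event', 'Ecommerce', 'Impression batch', { nonInteraction: true });
}

Keep in mind that each flush counts against the 500 hits per session limit, so the chunk size is a trade-off between payload size and hit count.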
And finally, migrate to GTM.

A Google Analytics request can send at most about 8 KB of data:
POST:
payload_data – The BODY of the post request. The body must include
exactly 1 URI encoded payload and must be no longer than 8192 bytes.
URL Endpoint
The length of the entire encoded URL must be no longer than 8000
Bytes.
If your hit exceeds that limit (happens e.g. with large product lists in EEC tracking) it is not (as far as I can tell) processed.
There are also restrictions on field length for some fields (e.g. custom dimensions have a maximum of 150 bytes; others are detailed in the parameter reference).
In some cases the data type is relevant, e.g. if in your event tracking the event value is set to a string the call might fail.
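If you want to inspect the payload size at run time, one option (a sketch, assuming analytics.js with a tracker already created) is to wrap the sendHitTask and look at the hitPayload field before the hit goes out:

// Assumption: analytics.js is loaded and a tracker has been created with ga('create', ...).
ga(function (tracker) {
  var originalSendHitTask = tracker.get('sendHitTask');
  tracker.set('sendHitTask', function (model) {
    var payload = model.get('hitPayload');
    if (payload.length > 8192) {
      // The hit would exceed the documented 8192-byte body limit; log it instead of losing it silently.
      console.warn('GA hit not sent, payload is ' + payload.length + ' bytes', payload);
      return;
    }
    originalSendHitTask(model);
  });
});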

I think this is the page you are looking for: the Quotas and Limits page can help.
These limits apply to the Web Property / Property / Tracking ID.
10 million hits per month per property
If you go over this limit, the Google Analytics team might contact you and ask you upgrade to Analytics 360 or implement client sampling to reduce the amount of data being sent to Google Analytics.

Related

Google Analytics Hit Quotas

I wonder whether someone can help me please.
I have a user who under a specific property, sporadically receives the following error:
Some hits sent on 03-Jul-2018 to property ...... exceeded one or more hit quotas and were therefore not processed.
Hits can be dropped when daily or monthly hit limits are exceeded. You can view your hit volume levels in Property Settings in Analytics.
Hits can also be dropped if visitor hit limits are exceeded. This can happen when your site is incorrectly generating the visitor ID for a GA session. Contact your website administrator to check that the visitor ID generation has been correctly implemented.
They are not using a Premium (Analytics 360) account, but when I look at the data for the day in question there aren't any issues with regard to 'High Cardinality', which, unless I've misunderstood, I'd expect to see.
Could someone please look at this and offer some guidance on where the issue may be, because this area is fairly new to me.
Many thanks and kind regards
Chris
Collection limits are influenced by 2 factors:
The tracker: whether you use ga.js, gtag.js, analytics.js, etc.; here are the details.
The property type: whether you are using GA (10M hits / month) or GA 360 (2B hits / month).
In your case you are facing a property limit. To find out when such limits were reached, you can create a custom report using a time dimension (e.g. date + hour) combined with the hits metric. You can also combine the hits metric with other dimensions (country, browser, device) to see if you find any patterns as to why you're getting so many hits (a programmatic version of that report is sketched after this answer).
Cardinality is something else: it refers to the number of unique value combinations for your dimensions. For instance, if you have 500K events where each event category is different, you'll have a cardinality of 500K on the event category dimension. The more hits, the more likely you are to have high cardinality, but the two aren't necessarily related (if you send 10B events with the same category, the cardinality on the category is 1).
So focus on identifying and solving your limits/quotas issue, as it's the real issue here:
If the number of hits is legitimate (you have a huge amount of traffic), then the only options are to upgrade to GA 360 or reduce the number of hits for each session
If the number of hits is abnormally high (eg traffic is stable but hits increased dramatically), look for implementation issues, especially generic event trackers such as error tracking with tools like Google Tag Manager
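As a rough sketch of that custom report done programmatically, assuming the Core Reporting API v3 and your own view ID plus an OAuth access token you have already obtained (both placeholders here), a query for hits per hour could look like this:

// Hypothetical example: VIEW_ID and accessToken are placeholders you must supply yourself.
var query = 'https://www.googleapis.com/analytics/v3/data/ga' +
  '?ids=ga:VIEW_ID' +
  '&dimensions=ga:dateHour' +
  '&metrics=ga:hits' +
  '&start-date=2018-07-01' +
  '&end-date=2018-07-03';

fetch(query, { headers: { Authorization: 'Bearer ' + accessToken } })
  .then(function (response) { return response.json(); })
  .then(function (report) {
    // Each row is [dateHour, hits]; spikes point to when the quota was exceeded.
    console.log(report.rows);
  });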

Google Analytics Client Sampling when over 10 million hits/month

Google Analytics Documentation says the following:
These limits apply to the Web Property / Property / Tracking ID.
10 million hits per month per property
If you go over this limit, the Google Analytics team might contact you
and ask you upgrade to Analytics 360 or implement client sampling to
reduce the amount of data being sent to Google Analytics.
For monthly total Analytics 360 limits, please contact your account
manager or service representative.
What does this mean exactly?
I know there is sampling, the kind you see in your reports.
But if your traffic exceeds 10 million hits per month, is there an automatic sampling system that prevents you from capturing all incoming traffic?
In other words: does Google limit your traffic automatically at the source? Not in the reports but at the source. Let's say I capture 20 million hits a month, will I have all that traffic in my property, or does it stop at a certain point?
Again: I'm not talking about report sampling but about the actual data captured per month.
Thanks in advance
No, Google does not limit data collection. You have to implement this yourself, although they give you a means to do that, at least in the JavaScript tracking code. Implementing sampling yourself would be a little tricky, since you want to sample out whole sessions, not individual pageviews.
If you record 20 million hits you will have them in your property. But at that point you are operating outside your quota, and Google has the right to terminate your account (they will not do that without getting in contact with you, provided you respond to mails sent to the Google accounts authorized to use your GA properties).
So far Google has been, in my experience, very generous even with large overruns, but you should not base something business critical on the violation of TOS for a free service.
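For reference, the built-in means in analytics.js is the sampleRate field passed when creating the tracker. A minimal sketch (with a placeholder property ID) that keeps data for roughly half of your users:

// sampleRate specifies the percentage of users to track; sampling is based on
// the client ID, so whole users (and therefore whole sessions) are kept or dropped together.
ga('create', 'UA-XXXX-Y', 'auto', { sampleRate: 50 });
ga('send', 'pageview');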

Google Analytics: How to overcome payload size restriction?

I use Google Analytics Enhanced Ecommerce for some shops. On a catalog page I have many products and I need to track their impressions. I do not track each product one by one, because that would cause many requests; instead I add all of them through ec:addImpression and then track the entire batch by sending a single pageview.
And everything was going well until I hit a problem: on pages with too many products the requests to /collect stopped working, with no error. I installed the Google Analytics Debugger for Chrome and found out that I had exceeded the payload limit, which is set to 8 KB (according to the official documentation):
payload_data – The BODY of the post request. The body must include
exactly 1 URI encoded payload and must be no longer than 8192 bytes.
And this is fine, but here's my question: is there any way to overcome this restriction? Maybe some option or method that lets me ignore the payload size because it is automatically split into proper chunks? Or at least a method to get the payload at run time so I can check its size? I went through the documentation and found nothing.
Note: currently I manually track a "safe" number of products (discovered by experimentation) added via addImpression and then send them with a non-interaction pageview hit. Of course this solves my problem, but I want to know if there's a built-in solution.
Another possibility is to send only true impressions, that is, only for those products/items that the user is actually seeing above the fold. Not all products that you are sending impressions for are actually seen by the user until they scroll down the page below the fold. So this would require a modification to the implementation where you send your impression data as the user scrolls down the page and reveals more products. You can likely send more information with each product and still not exceed the payload, and you get a more accurate measure of your impressions.
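A rough sketch of that approach, assuming analytics.js with the ec plugin is loaded and your product elements carry hypothetical data-id/data-name attributes, could use IntersectionObserver to send an impression only when a product actually scrolls into view:

// Hypothetical markup: <div class="product" data-id="P123" data-name="Example product">...</div>
var seen = new Set();
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    var el = entry.target;
    if (!entry.isIntersecting || seen.has(el.dataset.id)) return;
    seen.add(el.dataset.id);
    ga('ec:addImpression', {
      id: el.dataset.id,
      name: el.dataset.name,
      list: 'Catalog'
    });
    // Flush the impression with a non-interaction event so bounce rate is unaffected.
    ga('send', 'event', 'Ecommerce', 'Product impression', { nonInteraction: true });
  });
});
document.querySelectorAll('.product').forEach(function (el) { observer.observe(el); });

Note that sending one event per impression raises the hit count per session, so you may want to buffer a few visible products per event instead of flushing each one individually.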
Create a product data import that matches your product IDs to product attributes (name, category, price, etc.). Wait until the data is processed, then change your tracking code so that only product IDs are sent.
That should shrink the request body enough to send all products, and the IDs will be joined with the imported data when the incoming hits are processed.
Imported data is not applied retroactively, so it's important that you do the data import first.
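With such an import in place, the impression calls could then be reduced to just the ID. A sketch, again assuming analytics.js with the ec plugin and your own products array:

// Only the product ID is sent; name, category, price etc. come from the Data Import.
products.forEach(function (product, index) {
  ga('ec:addImpression', { id: product.id, list: 'Catalog', position: index + 1 });
});
ga('send', 'pageview'); // the impressions ride along with this hit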
AFAIK there is no way to get your payload size from Google Analytics, and it's a crying shame that analytics.js does not handle this issue automatically, since the analytics.js library that constructs the payloads is best suited to handle this and thus minimise the load on Google's servers.
I like Eike's solution, though if your products change a lot it might require automation. As nyuen implies, sending only real impressions may help and is more accurate.
Another trick is to send the impressions one at a time (as they are shown, or on page load). This requires the smallest change and reduces each payload to well below the limit.

Site Speed for Google Analytics not working

I am successfully tracking traffic on a site (so far over 150 hits); however, Site Speed is still showing all zeroes. I know the default is to only calculate speed for 1% of requests, but it seems that nothing is being tracked at all for Site Speed. I'm using the newest version of Chrome.
Any ideas? Thanks!
Site speed sample rate sets the percentage of users to be tracked, not the percentage of requests. Here's what the docs say:
This setting determines how often site speed tracking beacons will be sent. By default, 1% of users will automatically be tracked. Note: Analytics restricts Site Speed collection hits for a single property to the greater of 1% of users or 10K hits per day in order to ensure an equitable distribution of system resources for this feature.
In your case, if you don't expect to have more than 10K hits to your property in a single day, you can safely set the sample rate to 100%. Note: this has to be done in the create method:
ga('create', 'UA-XXXX-Y', {'siteSpeedSampleRate': 100});
Have you called the _setSiteSpeedSampleRate() method? That would be my best guess. According to the Google docs:
The _setSiteSpeedSampleRate() method must be called prior to _trackPageview() in order to be effective.
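For completeness, a minimal sketch of that ordering for the classic ga.js (asynchronous) snippet, with a placeholder property ID:

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXX-Y']);     // placeholder property ID
_gaq.push(['_setSiteSpeedSampleRate', 100]); // must come before _trackPageview
_gaq.push(['_trackPageview']);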

"There is no data for this view." in reports with custom dimensions in Measurement Protocol of Google Analytics

Our application needs to gather usage data through the Measurement Protocol of Google Analytics.
We can successfully send "appview" hits to the Google Analytics server and get a proper response from it (a GIF image). The appview hits appear on the GA Dashboard, along with the country of origin, session duration, etc.
We also have several custom dimensions and metrics that we want to track for each hit. We have set those up in the GA Admin panel with the correct scope, index and active state. We have 3 hit-scoped dimensions, 3 user-scoped dimensions and 1 hit-scoped metric, all set to Active.
We send the dimensions and metrics as described in the docs at
Custom Dimensions / Metrics
attached to the hits they apply for, like so:
...&cm1=3 (for the metric)
and
...&cd6=15 (for the dimensions; some dimensions have numeric values, others are text)
The problem is that those metrics and dimensions don't show up in our custom reports: the reports always say "There is no data for this view.". For example, we have a report that has one dimension and one metric, without any filters, set to "Any view". It doesn't matter if the Type of the report is Explorer, Flat Table or Map Overlay, it never shows anything.
Several days have passed since the hits were received and appeared in the dashboard, but the reports are still empty, so we can rule out processing lag.
We tried sending "event" hits instead of "appview" hits - again, the hits show up in the Dashboard, but the reports are empty.
We cannot get any useful insights without using dimensions and metrics - so there is no way to get by without this.
Because of reasons too long to describe, we cannot use any of the Google-provided Analytics libraries.
Is there anything else we need to do to see data in those reports?
When using the Measurement Protocol you need to make sure that the view (profile) is not set to exclude bots and spiders.
Go to the Google Analytics website, under Admin, and check the settings of the view (profile) in question.
Beyond that, check the Real-Time reports; you should see the hits coming in. You will need to wait 24-48 hours for data to appear in the standard reports due to data processing lag.
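As a point of comparison, a minimal Measurement Protocol hit carrying a custom dimension and metric might look like the sketch below. The tracking ID, client ID, app name, screen name and index numbers are placeholders; adjust them to your own property and custom definition indexes.

// Builds and sends one Measurement Protocol hit with cd6 and cm1 attached.
var params = new URLSearchParams({
  v: '1',                                      // protocol version
  tid: 'UA-XXXX-Y',                            // placeholder tracking ID
  cid: '35009a79-1a05-49d7-b876-2b884d0f825b', // anonymous client ID
  an: 'MyApp',                                 // app name (required for app hit types)
  t: 'screenview',                             // 'appview' is the older name for this hit type
  cd: 'Home',                                  // screen name
  cd6: 'trial user',                           // custom dimension, index 6
  cm1: '3'                                     // custom metric, index 1
});

fetch('https://www.google-analytics.com/collect', {
  method: 'POST',
  body: params.toString()
});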
