The Tag Assistant plugin is giving a warning that a custom dimension parameter is too long. Since it has been like that for quite a while, I would like to assess how the data was impacted.
Does anyone know how Analytics handles a request with a property that is too long? Does the entire event get dropped, is only the too-long property dropped, or does it get trimmed?
Someone asked an identical question here but it got no answer:
https://support.google.com/tagmanager/thread/15716265?hl=en
The custom dimension maximum length is 150 bytes (150 characters, as long as they are single-byte ASCII characters):
https://developers.google.com/analytics/devguides/collection/analyticsjs/field-reference#dimension
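If you prefer to enforce that limit on your own side before the hit is sent, a minimal sketch of a byte-aware truncation helper (TypeScript, assuming a runtime that provides TextEncoder; the helper name is made up) could look like this:

```typescript
// A minimal sketch (not from the documentation): trim a custom dimension value so
// its UTF-8 encoding stays within the documented 150-byte limit before sending it.
function truncateToBytes(value: string, maxBytes = 150): string {
  const encoder = new TextEncoder();
  let result = value;
  while (encoder.encode(result).length > maxBytes) {
    result = result.slice(0, -1); // drop one character at a time until it fits
  }
  return result;
}

// Example: a 420-character label gets cut down to at most 150 bytes.
const longLabel = "category/subcategory/".repeat(20);
console.log(truncateToBytes(longLabel).length);
```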
However, I had fun running a test with the Hit Builder (https://ga-dev-tools.appspot.com/hit-builder/) on one of my Analytics properties. I entered values of various lengths and initially received a warning message:
The value provided for parameter 'cd1' is over the recommended limit.
However, I still managed to send hits whose custom dimension value had the following numbers of characters:
150 characters (limit declared in Google documentation)
300 characters
600 characters
1200 characters
2400 characters
4800 characters
8107 characters (limit beyond which the hit builder returned an error and did not validate the hit)
The result in Google Analytics was the following:
So I would say that the answer is .... the limits are made to be overcome :)
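If you want to reproduce this kind of test yourself outside the Hit Builder, a hedged sketch against the Measurement Protocol debug endpoint, which validates hits without recording them, could look like the following (the property ID and client ID are placeholders):

```typescript
// Hedged sketch: send a hit with an oversized cd1 value to the Measurement Protocol
// debug endpoint, which returns a validation verdict instead of recording data.
async function sendOversizedDimensionHit(length: number): Promise<void> {
  const payload = new URLSearchParams({
    v: "1",                  // protocol version
    tid: "UA-XXXXX-Y",       // placeholder property ID
    cid: "555",              // placeholder client ID
    t: "pageview",           // hit type
    dp: "/cd-length-test",   // document path
    cd1: "x".repeat(length), // custom dimension value of the chosen length
  });

  const response = await fetch("https://www.google-analytics.com/debug/collect", {
    method: "POST",
    body: payload.toString(),
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
  });
  console.log(length, await response.json()); // validation result from the debug endpoint
}

// Try the same lengths as in the test above.
[150, 300, 600, 1200, 2400, 4800, 8107].forEach((n) => sendOversizedDimensionHit(n));
```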
Related
I've been using the Bing Web Search v7 API for roughly a month now and I'm starting to notice some obscure inconsistencies. Sometimes when I run a query, I get a JSON response with no results; I can then run the exact same query shortly after, and this time the JSON returned will contain results. I'm not sure if I'm nearing my TPS limit (which is currently 100/s), but I don't believe I am. I also explicitly check the URL length to ensure Bing doesn't return an error for a string that's too long. I'm doing queries in the same way that's described here. Each query contains a capture group with a set of terms "OR"'d together (which is done following the Bing advanced search guidelines). I had an issue with double spaces causing no results, but that has been fixed. Is this possibly a result of the HTTP GET length requirements? Or maybe something wrong with my key? Any help is appreciated, and let me know if any more information is required!
I'm using Bing Web Search API v7, I'm sending following requests (selected few):
/bing/v7.0/search?q=mate%C5%99sk%C3%A1%20%C5%A1kola&count=50&offset=0&responseFilter=Webpages
/bing/v7.0/search?q=mate%C5%99sk%C3%A1%20%C5%A1kola&count=50&offset=50&responseFilter=Webpages
/bing/v7.0/search?q=mate%C5%99sk%C3%A1%20%C5%A1kola&count=50&offset=950&responseFilter=Webpages
/bing/v7.0/search?q=mate%C5%99sk%C3%A1%20%C5%A1kola&count=50&offset=1000&responseFilter=Webpages
/bing/v7.0/search?q=mate%C5%99sk%C3%A1%20%C5%A1kola&count=50&offset=1050&responseFilter=Webpages
The first search query request with offset=0 returns 50 records; the value of totalEstimatedMatches is > 50000.
The second request with offset=50 returns another 50 records; the value of totalEstimatedMatches is different, but still above 50000.
And so on with increasing offset (not shown above).
However, a request with offset=1000, or any offset >= 1000, returns records identical to those returned for the request with offset=950.
This behavior in fact matches the Bing web search site itself: when I click on page 101 (offset 1001), or any higher page, I actually get page 96 (offset 951).
So I can't figure out any way to access more than 1000 results, even though there should be more than 50000 of them (I'm aware that totalEstimatedMatches is only an estimate and the real value can differ).
Does anyone know how to get more than 1000 webpage results (more than 100 pages with 10 records / more than 20 pages with 50 records)?
Search engines optimize their index and return fewer results than totalEstimatedMatches in order to 1) stop serving repetitive pages and 2) focus on the relevance of the top pages only. The bulk (if not 99.x%) of users alter their query if they don't find results in the first 2-3 pages, so for search engines it is probably not worth serving a deep index of billions of pages for a given query. Note that this behavior is common across all search engines, not only Bing.
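In practice, the most workable approach seems to be paging with count/offset and stopping once the API starts repeating itself, roughly as sketched below (the host, header name and response shape follow the standard Bing v7 request pattern shown above, but treat the details as assumptions; the subscription key is a placeholder):

```typescript
// Hedged sketch: page through Bing v7 web results with count/offset and stop as soon
// as a page contains only URLs we have already seen (which happens around offset ~1000).
const SUBSCRIPTION_KEY = "YOUR_KEY"; // placeholder

async function fetchAllBingResults(query: string): Promise<string[]> {
  const seen = new Set<string>();
  const count = 50;

  for (let offset = 0; offset < 2000; offset += count) {
    const url =
      `https://api.cognitive.microsoft.com/bing/v7.0/search` +
      `?q=${encodeURIComponent(query)}&count=${count}&offset=${offset}&responseFilter=Webpages`;
    const res = await fetch(url, {
      headers: { "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY },
    });
    const body = await res.json();
    const pages: { url: string }[] = body.webPages?.value ?? [];

    // Count how many results on this page are genuinely new.
    const fresh = pages.filter((p) => !seen.has(p.url));
    if (fresh.length === 0) break; // the API has started repeating itself
    fresh.forEach((p) => seen.add(p.url));
  }
  return [...seen];
}

fetchAllBingResults("mateřská škola").then((urls) => console.log(urls.length));
```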
We are using Google Analytics on a webshop. Recently we added Enhanced Ecommerce to measure more events so we can optimize the webshop. But now we are seeing fewer pageviews, and other data is missing.
I don't know what the cause is, but on one specific page we were no longer measuring anything. I removed some items from the ga:addImpression data, and now the pageview is measured again.
I can find limits for GA, but I can't find anything about the amount of data that can be sent to GA, and this does seem to be related to the amount of data being sent: if I shorten the name of a product, the pageview is also measured again. GA is practically broken for us now because we are missing huge numbers of pageviews.
Where can I find these limits, or how will I ever know when I'm running into these limits?
On the one hand, I'm not sure how you are building your hits, but you should keep in mind the payload limit for sending information to GA (the limit is 8 KB).
On the other hand, there are collection limits that you should also consider (Docs):
This applies to analytics.js, the Android and iOS SDKs, and the Measurement Protocol.
200,000 hits per user per day
500 hits per session
If you go over either of these limits, additional hits will not be processed for that session / day, respectively. These limits apply to Analytics 360 as well.
My best advice is to regulate the number of events you send, really considering which information has value. No doubt the Enhanced Ecommerce data is important, so if the problem is the size, you should partition the productImpression data into multiple hits (as shown in the screenshot, and sketched below).
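A rough sketch of that partitioning idea with plain analytics.js (the batch size of 20 and the event names are illustrative, not taken from the original answer):

```typescript
// Sketch: send product impressions in batches so no single hit approaches the
// 8 KB payload limit. The ga() global is provided by analytics.js on the page.
declare function ga(...args: unknown[]): void;

interface Impression {
  id: string;
  name: string;
  list: string;
  position: number;
}

function sendImpressionsInBatches(impressions: Impression[], chunkSize = 20): void {
  for (let i = 0; i < impressions.length; i += chunkSize) {
    const chunk = impressions.slice(i, i + chunkSize);
    chunk.forEach((impression) => ga("ec:addImpression", impression));
    // Flush this batch with a non-interaction event so it doesn't affect bounce rate.
    ga("send", "event", "Ecommerce", "Impression batch", { nonInteraction: true });
  }
}
```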
And finally, migrate to GTM.
EDIT: Steps to see what the dataLayer has in it (in a given moment)
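The original steps appear to have been screenshots; as a rough console-based equivalent (assuming a standard GTM setup that exposes window.dataLayer), you can dump everything pushed so far:

```typescript
// Rough console equivalent: list every message pushed to the dataLayer so far.
// Paste into the browser's developer console on the page in question.
const dataLayer = (window as any).dataLayer ?? [];
dataLayer.forEach((message: Record<string, unknown>, index: number) => {
  console.log(index, JSON.stringify(message));
});
```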
A Google Analytics request can send at most about 8 KB of data:
POST:
payload_data – The BODY of the post request. The body must include exactly 1 URI encoded payload and must be no longer than 8192 bytes.
URL Endpoint:
The length of the entire encoded URL must be no longer than 8000 bytes.
If your hit exceeds that limit (which happens e.g. with large product lists in Enhanced Ecommerce tracking), it is, as far as I can tell, not processed.
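To get a feel for when a hit is at risk, here is a hedged sketch that builds a Measurement Protocol style payload and checks its encoded size against the 8192-byte limit (the property ID and the simulated product list are made up; il1pi<N>nm is the standard impression-name parameter):

```typescript
// Sketch: build a Measurement Protocol-style payload and check its encoded size
// against the 8192-byte POST body limit before sending.
function payloadSizeInBytes(params: Record<string, string>): number {
  const encoded = new URLSearchParams(params).toString();
  return new TextEncoder().encode(encoded).length;
}

const hit: Record<string, string> = {
  v: "1",
  tid: "UA-XXXXX-Y", // placeholder property ID
  cid: "555",
  t: "pageview",
  dp: "/category/shoes",
};

// Simulate a large Enhanced Ecommerce impression list (il1pi1nm, il1pi2nm, ...).
for (let i = 1; i <= 200; i++) {
  hit[`il1pi${i}nm`] = `Some fairly long product name number ${i}`;
}

const size = payloadSizeInBytes(hit);
if (size > 8192) {
  console.warn(`Hit is ${size} bytes - over the 8192-byte limit, split it into smaller hits.`);
}
```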
There are also restrictions on the length of individual fields (e.g. custom dimensions have a maximum of 150 bytes; others are detailed in the parameter reference).
In some cases the data type is relevant too, e.g. if the event value in your event tracking is set to a string, the call might fail.
I think this is the page you are looking for: the Quota and limits page can help.
These limits apply to the Web Property / Property / Tracking ID.
10 million hits per month per property
If you go over this limit, the Google Analytics team might contact you and ask you to upgrade to Analytics 360 or implement client sampling to reduce the amount of data being sent to Google Analytics.
I would like to get some data from GA via the Spreadsheet add-on, as I did a few weeks ago (I gathered ~200,000 rows). I am using the same metrics, dimensions and the rest of the settings, but I keep getting this error:
https://i.stack.imgur.com/hTpIg.png
I found that I do get some data when I do not set "max-results", but the default is 1000, which is not enough for my needs. Why?
What I have tried to solve this problem (none of it worked):
change GA views
change dimensions and metrics
change time range
create new spreadsheet
set the spreadsheet's sharing settings to "public on the web"
I found the link regarding API limits and quotas (https://developers.google.com/analytics/devguides/config/mgmt/v3/limits-quotas#), and the limit should be 50,000 requests per project per day, which I would already exceed on the first run. So another question: how is it even possible to get more data than I'm supposed to get?
Should I really order more requests, or does "request" mean something other than "one row"? And secondly, why, or what is going on?
There is no further explanation given for the error.
Perhaps I am missing something, appreciate your help.
In short: while one can only guess what causes your problem, it's most certainly not the API limit. Rows and requests are not at all the same thing; every request may fetch up to 10,000 rows.
"Request" is a call to the API, which might include one or many rows of data (unless your script somehow only requests one row at a time, which would be unusual).
If you exceeded your API quota the error message would say pretty much that.
The default is 1000 rows because that's a sensible default (a compromise between convenience and performance). The API will return at most 10,000 rows per request, so to fetch 200,000 results the add-on would have to make 20 requests, not 50,000.
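To illustrate the rows-versus-requests distinction, here is a hedged sketch of what those 20 requests look like when made directly against the Core Reporting API v3, which is roughly what the add-on does under the hood (the view ID, dates, metric, dimension and access token are placeholders):

```typescript
// Hedged sketch: fetch up to 200,000 rows in pages of 10,000 using max-results
// and start-index, i.e. at most 20 requests, far below the 50,000/day quota.
async function fetchAllRows(accessToken: string): Promise<unknown[][]> {
  const rows: unknown[][] = [];
  const pageSize = 10000;

  for (let startIndex = 1; startIndex <= 200000; startIndex += pageSize) {
    const params = new URLSearchParams({
      ids: "ga:12345678", // placeholder view ID
      "start-date": "30daysAgo",
      "end-date": "today",
      metrics: "ga:sessions",
      dimensions: "ga:pagePath",
      "max-results": String(pageSize),
      "start-index": String(startIndex),
    });
    const res = await fetch(`https://www.googleapis.com/analytics/v3/data/ga?${params}`, {
      headers: { Authorization: `Bearer ${accessToken}` },
    });
    const body = await res.json();
    rows.push(...(body.rows ?? []));
    if (!body.rows || body.rows.length < pageSize) break; // no more data
  }
  return rows;
}
```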
Also, a Google spreadsheet supports at most 2 million cells, which your result set might exceed.
"Service error" is a very unspecific error message which can be caused by a variety of causes from out-of-bound ranges to script timeouts or network latency. Sometimes the spreadsheet service dumps an additional error message in the browser console, so you should check your developer tools.
I've just finished tagging a website and am currently submitting test data.
On one page we fire an Ecommerce purchase event, which consists of the transaction information and product information.
The transaction object is populated by retrieving values off the page, i.e. from HTML labels, and this is all working as expected. However, when we check Google Analytics > All Web Site Data, it seems GA is formatting the number and removing trailing zeros for some strange reason. Please see an example:
This doesn't happen to all purchase events, as you can see at the top of the table there is an entry for $1,793.04 and this has been displayed correctly.
Regarding populating the transaction object, we aren't doing any formatting whatsoever; we simply reference the HTML label value and pass that along, so I'm unsure how this could be happening. Has anyone experienced this before?
Personally, I have not found this documented anywhere else, but just to be sure, you can check that you conform to the formatting expected by the Measurement Protocol, meaning the HTTP request that is ultimately sent to the GA servers to process your values. The transaction revenue and most monetary values in analytics.js are of type CURRENCY, and you can find the documentation of this data type in the parameter reference. From that link I quote:
A decimal point is used as a delimiter between the whole and fractional portion of the currency.
So what I would advise you to do is use a small JS function to format your string values: remove the thousands commas and then use the dot (.) as the separator between the whole and fractional parts.
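A minimal sketch of such a function, assuming the labels use a comma as the thousands separator and a dot as the decimal separator (as in the $1,793.04 example above):

```typescript
// Sketch: normalize a label like "$1,793.04" into the CURRENCY format the
// Measurement Protocol expects (digits, one dot as decimal separator).
function toAnalyticsCurrency(label: string): string {
  const cleaned = label
    .replace(/[^0-9.,-]/g, "") // strip currency symbols and spaces
    .replace(/,/g, "");        // drop thousands separators
  return parseFloat(cleaned).toFixed(2);
}

console.log(toAnalyticsCurrency("$1,793.04")); // "1793.04"
console.log(toAnalyticsCurrency("$2,500.00")); // "2500.00"
```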