Response times of the Google Analytics API

I'm using a proxy to make requests to the Google Analytics API. The proxy has a 30-second timeout, and I sometimes hit it. Is it normal for Google to take more than 30 seconds to respond? My biggest request contains 4 metrics and 4 dimensions.
I'd also like to know if there is any way to improve these response times, for example by requesting fewer results per page or some similar mechanism.
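Requesting fewer results per page is indeed possible. Assuming the v4 Reporting API and the google-api-python-client library, each report request accepts a pageSize field, and each response carries a nextPageToken for fetching the rest, so one slow monolithic response can become several quick ones. A minimal sketch; the key file, view ID, metrics, and dimensions are all placeholders:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]

# Hypothetical service-account key file; substitute your own credentials.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "XXXXXX",  # placeholder view ID
        "dateRanges": [{"startDate": "7daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:country"}],
        # Smaller pages return sooner; walk the remaining pages
        # with the nextPageToken from each response.
        "pageSize": 1000,
    }]
}).execute()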

Related

VCL: limit too many GET requests on the same URL

We are experiencing a problem where twice a day (when we send out a newsletter via a third-party provider) the home page gets hit by thousands of requests from a particular bot for about 3-5 minutes. The traffic seems to be generated by a legitimate bot, but it creates fake data in our third-party analytics aggregators. We wonder if we could block or deny this traffic without penalizing the bot and its IPs. We would like to set up a rule in the VCL to block/deny/reject traffic when GET requests from the same user-agent hit the same exact URL too many times in a very short period of time. Do you have any suggestions?
You can do that using the vsthrottle VMOD: https://github.com/varnish/varnish-modules/blob/master/src/vmod_vsthrottle.vcc. Its is_denied() function takes a key (for example the user-agent concatenated with the URL), a request limit, and a time window, and returns true once that key has exceeded the limit within the window, so you can answer those requests with a synthetic error from vcl_recv.

Low request throughput in Analytics API

I've been using the Google Analytics API to authenticate on behalf of my customers and show them customized reports. The request throughput was always good enough (11 req/sec).
However, since a couple of days ago, the throughput has decreased drastically, from 11 req/sec to 5 req/sec, and reports that usually took 10 seconds now take 40 seconds.
Nothing has been modified in the last few days, neither the queries nor the way we access the API.
I've looked but haven't found any report of performance degradation on Google's side.
Here are the request rates for the last 4 days (chart omitted).
Is there anything I can do to validate what's wrong or to locate the problem?
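One unofficial way to narrow it down is to time the same query repeatedly and compare the latency distribution against your old baseline: if the medians have quadrupled while the queries and access path are unchanged, the slowdown is upstream. A sketch, where run_query is a placeholder for the actual report call:

import statistics
import time

def time_requests(run_query, n=20):
    """Run the same query n times and print latency statistics."""
    latencies = []
    for _ in range(n):
        start = time.monotonic()
        run_query()  # e.g. analytics.reports().batchGet(...).execute()
        latencies.append(time.monotonic() - start)
        time.sleep(1)  # stay well under the API's per-second quota
    print("median %.2fs  max %.2fs"
          % (statistics.median(latencies), max(latencies)))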

Keep getting 403 rateLimitExceeded from the Reporting API

I have an application which gets Google Analytics statistics for 28 Google users (i.e. accounts/logins/emails) from the Reporting API.
Each client requests its own GA data (some metrics for the last 3 days) once every 10 minutes.
Everything was fine for a long period, but yesterday around 5:00 pm UTC one of the GA users started to get a 403 rateLimitExceeded error in response.
A cycle of 10 repeated requests gets the same 403 error. Ten minutes later the next cycle starts, and the result is the same.
All other clients on the same application keep updating fine, without getting the 403 rateLimitExceeded error in return.
I have a "sleep" of 1 second before making each request, and I send a unique "quotaUser" with each request. My application makes less than 1 request per second and stays within 20K requests per day.
As far as I'm aware, the 403 rateLimitExceeded error refers to the overall daily request limit for the whole application, yet in my case all clients except this one keep updating properly and the overall daily limit of 50K is not being exceeded.
UPDATE:
At 0:00 UTC this GA user stopped getting the 403 error and now keeps updating fine.
What could be the possible reasons for this client getting the 403 rateLimitExceeded error, and what can I do to avoid running into the same problem again?
The reason was too many failed requests within a short time period; in this case you have to wait for the quotas to reset at midnight.
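Beyond waiting for the reset, Google's general API guidance is to retry rate-limited requests with exponential backoff rather than a fixed 1-second sleep, since a tight loop of failing retries can keep tripping the limit. A sketch, assuming the google-api-python-client library; the function name and retry count are illustrative:

import random
import time

from googleapiclient.errors import HttpError

def execute_with_backoff(request, max_retries=5):
    """Execute an API request, backing off exponentially on rate limits."""
    for attempt in range(max_retries):
        try:
            return request.execute()
        except HttpError as err:
            if err.resp.status not in (403, 429, 503):
                raise  # not a rate-limit style error; don't retry
            # Production code should also inspect the error reason
            # (e.g. rateLimitExceeded) before retrying a 403.
            # Wait 2^attempt seconds plus random jitter, then retry.
            time.sleep(2 ** attempt + random.random())
    raise RuntimeError("request still failing after %d retries" % max_retries)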

What is the rate limit for direct use of the Google Analytics Measurement Protocol API?

The documentation for Google Analytics Collection Limits and Quotas gives the rate limits that are implemented by the various Google-provided libraries. I can't seem to find a published rate limit for users that are POSTing directly to the Measurement Protocol (https://www.google-analytics.com/collect).
Is there one, and if so, what is it?
Edit on 10 July 2015 -
A few commenters asked for an example of the kind of data I am sending in.
I am using a series of calls to wget with a sleep of one second between each call.
Here is an example with the app name and tracking code removed:
wget -nv \
    --post-data 'ul=en&qt=7150000&av=0.0.1&ea=PLET&v=1&tid=<my_tracking_code>&ec=Move+to+Object&cid=1434738538-738-654031&an=<my_app_name>&t=event' \
    -O /dev/null 'https://www.google-analytics.com/collect'
I've tried sending these queries to the /debug endpoint and all of them are valid. My first upload worked as expected and reports looked good. Subsequent uploads of the same data set to different GA properties have had mixed results. Sometimes no data appears in reports. Sometimes partial data appears in reports. During upload, realtime reports always show activity, though.
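For anyone wanting to script that validation step, the debug endpoint returns a JSON report for each hit. A sketch using Python's requests library; the tracking ID and payload values are placeholders standing in for the real ones:

import requests

payload = {
    "v": "1",
    "tid": "UA-XXXXX-Y",            # placeholder tracking ID
    "cid": "1434738538-738-654031",
    "t": "event",
    "ec": "Move to Object",
    "ea": "PLET",
}

# /debug/collect validates the hit without recording it.
resp = requests.post("https://www.google-analytics.com/debug/collect",
                     data=payload)
for result in resp.json()["hitParsingResult"]:
    print(result["valid"], result.get("parserMessage", []))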
Directly from the documentation, Google Analytics Collection Limits and Quotas:

These limits apply to the Web Property / Property / Tracking ID:
10 million hits per month per property

Universal Analytics Enabled
This applies to analytics.js, the Android and iOS SDKs, and the Measurement Protocol:
200,000 hits per user per day
500 hits per session, not including ecommerce (item and transaction hit types)
If you go over either of these limits, additional hits will not be processed for that session / day, respectively. These limits apply to Premium as well.
Now, I agree it doesn't specifically state the per-second rate for the Measurement Protocol, but the above lumps the Measurement Protocol in with analytics.js, so I think we can assume it shares the analytics.js limit:
Each analytics.js tracker object starts with 20 hits that are replenished at a rate of 2 hits per second. Applies to all hits except for ecommerce (item or transaction).
But just to make sure, I am sending an email off to the development team; they should make it clearer where the per-second rate of the Measurement Protocol lies. I will post here when I hear back from them.
Response from Google
The Measurement Protocol does not do any kind of rate limiting or quota-ing by IP address or tracking ID or anything like that. However, most of the client libraries do rate limit in some form or another.
As Linda points out in her answer, there are various limits and quotas imposed by the back end, but those are done at processing time, not collection time.
Conclusion
There is no limit on sending data through the Measurement Protocol, but limits may be applied when the data is processed. I think they may be referring to the monthly hit limit quoted above. It seems it's the client libraries that apply limits on how fast you can send data, not the Measurement Protocol directly.
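If that conclusion is right and the limits live in the client libraries, a direct Measurement Protocol sender that wants to behave like analytics.js can implement the same token bucket itself: a 20-hit burst refilled at 2 hits per second, as quoted above. A sketch of that logic only; nothing here is an official server-side limit:

import time

class TokenBucket:
    """Client-side pacing mimicking the documented analytics.js limiter."""

    def __init__(self, capacity=20, refill_rate=2.0):
        self.capacity = capacity
        self.refill_rate = refill_rate   # tokens added per second
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def try_send(self):
        # Refill according to the time elapsed since the last check.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # OK to send this hit now
        return False      # out of tokens; delay or drop the hit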
Last update: please watch this video, which explains all the GA quota policies:
https://youtu.be/1UfER93ALxo
In particular, your issue might be the result of the 10 requests / 1 second limitation:
https://youtu.be/1UfER93ALxo?t=5m27s
I can confirm the same thing. In my case I had my own buildHitTask which constructs the URL for a Measurement Protocol request (MPR) and stores it in the hitPayload field. But instead of the original GA reporting, I was saving those URLs into cookies for delayed reporting.
In my experiment, only 10-20% of 2,000 Measurement Protocol requests were actually "stored".
The rest of the hits are not available in the GA Reporting UI, the API, or BigQuery. Each request was sent with a 2-second delay via the new Image() method, slowing down further in case of errors. The received results are not consistent: both successful and failed hits are randomly distributed across the whole time period.
Please let me know if you find more details on this constraint!

Unclear Google Analytics API quota restrictions

I've recently been fixing my application, which apparently reached some GA quota limitations, and I've found a couple of things that were not clear to me:
1. Does the 4 concurrent requests limitation apply per application, per web property, or something else?
2. If we break the 10 requests in any given 1-second period or the 4 concurrent requests limitation, how long does it take before GA stops responding with the 503 ServiceUnavailable error?
3. Does quota per application refer to the application name string only? We are running two different web applications using different GA application strings. Both apps connect to the GA API from the same IP address. Can we expect the quota per application to be calculated separately for each application string in this case?
4. Are the status codes sent with the 503 ServiceUnavailable response documented anywhere? Can we be sure that rateLimitExceeded refers to the 10 requests per second limitation? How can I find out the cause of a 503 response?
By the way, is it possible that stronger quota restrictions than documented sometimes take effect? For example, is it possible that GA replies with a 503 ServiceUnavailable response after just 6 fast consecutive requests, or because of some other undesired client behavior that isn't covered in the documentation?
Regards,
Pavel
It's just been answered by Nick in the GA Google Group.
Main points: the 10 qps and 4 parallel-request limitations are counted per IP, so even an application running on a different machine in the same network may be counted against the same quota.
I've submitted a documentation bug report to the GData issue tracker.
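A note on the per-IP counting: for server-side applications that send all requests from one address on behalf of many end users, Google's standard quotaUser query parameter lets each end user be counted separately for per-user rate limits. A sketch against the v3 Core Reporting API; the key file, view ID, and user key are placeholders:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # placeholder key file
analytics = build("analytics", "v3", credentials=creds)

result = analytics.data().ga().get(
    ids="ga:XXXXXX",            # placeholder view (profile) ID
    start_date="3daysAgo",
    end_date="today",
    metrics="ga:sessions",
    quotaUser="customer-42",    # any unique string per end user
).execute()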
