Google Analytics Reporting API v4 - 1MM row limit

How do I increase the data limit for Analytics 360 using resource-based quota (100M sessions)?
As per the documentation, I'm setting the useResourceQuotas flag to True to enable this feature, which should allow up to 100M sessions per request:
https://developers.google.com/analytics/devguides/reporting/core/v4/resource-based-quota
After running a test query for a single date in Postman, I do see that sampling is removed (samplesReadCounts is absent from the response). However, I'm still getting only 1MM total rows in the response, even though I know for a fact there are more than 1MM rows.
Endpoint: analyticsreporting.reports.batchGet
Some things to note:
I am requesting data on behalf of a client and accessing it via an access token (the client has a Google Analytics 360 account).
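For reference, a minimal sketch of that batchGet request with the flag set, shown with the Python client purely for illustration (the actual test was made from Postman; the view ID, date and credentials are placeholders):

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build  # google-api-python-client

# Placeholder: the client's OAuth access token mentioned above.
credentials = Credentials(token='ACCESS_TOKEN')
analytics = build('analyticsreporting', 'v4', credentials=credentials)

body = {
    'reportRequests': [{
        'viewId': 'VIEW_ID',                 # placeholder Analytics 360 view
        'dateRanges': [{'startDate': '2019-01-01', 'endDate': '2019-01-01'}],
        'metrics': [{'expression': 'ga:sessions'}],
        'dimensions': [{'name': 'ga:pagePath'}],
        'pageSize': 100000,                  # maximum rows per page
    }],
    'useResourceQuotas': True,               # enables resource-based quota
}
response = analytics.reports().batchGet(body=body).execute()
# Further pages are requested by passing the returned nextPageToken back as
# pageToken in the next reportRequests entry.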

Related

Google Calendar API - Deeper insight into calendar usage limits exceeded errors

I have an application that's been running since 2015. It both reads and writes to approx 16 calendars via a service account, using the Google node.js library (calendar v3 API). We also have G Suite for Education.
The general process is:
Every 30 seconds it caches all calendar data via a list operation
Periodically a student will request an appointment "slot"; the app first checks whether the slot is still open (via a list call) and then performs an insert, as sketched just below.
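A rough sketch of that check-then-insert step, shown in Python purely for illustration (the application itself uses the Node.js client; the key file, calendar ID, time window and attendee are placeholders):

from google.oauth2 import service_account
from googleapiclient.discovery import build  # google-api-python-client

# Placeholder service account key and scope.
credentials = service_account.Credentials.from_service_account_file(
    'service-account.json', scopes=['https://www.googleapis.com/auth/calendar'])
calendar = build('calendar', 'v3', credentials=credentials)

slot_start, slot_end = '2018-11-01T09:00:00Z', '2018-11-01T09:30:00Z'

# 1. Check whether the slot is still open by listing events in that window.
events = calendar.events().list(
    calendarId='CALENDAR_ID',
    timeMin=slot_start,
    timeMax=slot_end,
    singleEvents=True,
).execute()

# 2. If nothing occupies the slot, insert the appointment with the student
#    added as an attendee (the service account owns the event).
if not events.get('items'):
    calendar.events().insert(
        calendarId='CALENDAR_ID',
        body={
            'summary': 'Appointment',
            'start': {'dateTime': slot_start},
            'end': {'dateTime': slot_end},
            'attendees': [{'email': 'student@example.edu'}],
        },
    ).execute()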
That's all it does. It had been running fine until the past few days, when API insert calls started failing:
{
"code": 403,
"errors": [{
"domain": "usageLimits",
"reason": "quotaExceeded",
"message": "Calendar usage limits exceeded."
}]
}
This isn't all that special - the documentation has three "solutions":
Read more on the Calendar usage limits in the G Suite Administrator help.
If one user is making a lot of requests on behalf of many users of a G Suite domain, consider using a Service Account with authority delegation (setting the quotaUser parameter).
Use exponential backoff.
I'm not exceeding any of the stated limits as far as I can tell.
While I'm using a service account, it isn't making a request on behalf of a user. The service account has write access to the calendar and adds the user as an attendee.
Finally, I do not think exponential backoff will help, although I do not have it implemented. The time between one insert request and the next is measured in seconds, not milliseconds. Additionally, running the calls directly from the command line with a simple script produces the same problem.
Some stats:
2015 - 2,466 inserts, 186 errors
2016 - 25,747 inserts, 237 errors
2017 - 42,815 inserts, 225 errors
2018 - 41,390 inserts, 1,074 errors (990 of which are in the past 3 days)
I have updated the code over the years, but it has remained largely untouched this term.
At this point I'm unsure what to do: there is no channel to reach Google, and while I have not implemented a backoff strategy, the way timings work in this application means subsequent calls are already separated by seconds and processed in a queue that handles requests sequentially. The only concurrent requests would be list operations.
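For completeness, the exponential backoff the documentation recommends amounts to roughly the following (an illustrative Python sketch; the application itself uses the Node.js client):

import random
import time

from googleapiclient.errors import HttpError

def execute_with_backoff(request, max_retries=5):
    # Retry quota/rate-limit errors (403/429) with exponentially growing
    # delays plus jitter; re-raise anything else, or give up after max_retries.
    for attempt in range(max_retries):
        try:
            return request.execute()
        except HttpError as err:
            if err.resp.status not in (403, 429) or attempt == max_retries - 1:
                raise
            time.sleep(2 ** attempt + random.random())

# Usage: execute_with_backoff(calendar.events().insert(calendarId=CAL_ID, body=event))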

How does Google Analytics calculate 10000 requests per Profile?

I am fetching data from Google Analytics for the metrics (pageviews, unique pageviews, time on page, exits) as below:
// Core Reporting API (v3) query: four metrics, one dimension, filtered by page path
DataResource.GaResource.GetRequest r = GAS.Data.Ga.Get(
    profileID,
    startdate.ToString("yyyy-MM-dd"),
    enddate.ToString("yyyy-MM-dd"),
    "ga:pageviews,ga:uniquePageviews,ga:timeOnPage,ga:exits");
r.Dimensions = "ga:pagePath";
r.Filters = "ga:pagePath=~ItemID=" + strPagePath + "*";
r.MaxResults = 1000;
GaData d = r.Fetch();
Then I receive the following exception after fetching metrics for some random number of videos:
Error while fetching pageviews from GA: Google.Apis.Requests.RequestError
Quota Error: profileId ga:****** has exceeded the daily request limit. [403]
Errors [
  Message[Quota Error: profileId ga:****** has exceeded the daily request limit.] Location[ - ] Reason[dailyLimitExceeded] Domain[global]
]
I am fetching these four metrics (pageviews, unique pageviews, and so on) for one ItemID.
Does Google Analytics count that as four different requests or as one single request?
Each request you send against the Google Analytics API counts as one. The quota is not project or user based.
Pagination:
In your request above you are asking for a maxResults of 1000; if the total number of rows is 100,000, you will have to make 100 requests to get all of the data.
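A sketch of that pagination loop, shown with the Python client purely for illustration (the question uses the .NET client; the profile ID, dates and credentials are placeholders):

from googleapiclient.discovery import build  # google-api-python-client

# Assumes `credentials` is already authorized for the Analytics readonly scope.
service = build('analytics', 'v3', credentials=credentials)

rows, start_index, page_size = [], 1, 1000
while True:
    # Each iteration counts as one request against the per-view daily quota.
    result = service.data().ga().get(
        ids='ga:PROFILE_ID',
        start_date='2018-01-01',
        end_date='2018-01-31',
        metrics='ga:pageviews,ga:uniquePageviews,ga:timeOnPage,ga:exits',
        dimensions='ga:pagePath',
        max_results=page_size,
        start_index=start_index,
    ).execute()
    rows.extend(result.get('rows', []))
    if start_index + page_size > result.get('totalResults', 0):
        break
    start_index += page_size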
All APIs:
Requests to all of the APIs count against the same quota, so if you are also using the Management API, those calls count alongside your Reporting API calls.
All users and applications:
Now here is the fun part about the current quota system: it is not project related. Let's say my company has a profile 1234567 and everyone on our marketing team has access. Each member of the team likes a different app and installs the one they like best. They are all drawing from the same 10,000-request quota.
Reset:
Your quota resets at midnight, US West Coast time; no one will be able to query that view ID until then. Top tip: when testing, create a development view under the web property and run your requests against it, so you don't blow out your production view's quota.

Google Analytics API returns "Quota Error: User Rate Limit Exceeded." while only few thousand queries

I've created a custom dashboard (JavaScript on localhost) which has about 10 views generated with the Google Analytics API and Google Charts.
After a day or two of debugging, it has started to return the error "Quota Error: User Rate Limit Exceeded."
The Google console shows only 1,416 queries for the past 4 days.
And yet, the quota states following:
Queries per day 50,000
Queries per 100 seconds per user 100
Queries per 100 seconds 2,000
Am I perhaps using some other Google API that has much smaller quota limits?
However, I couldn't find any other API in the Google console.
Well, with the correct keywords it was easy to find.
https://developers.google.com/analytics/devguides/reporting/core/v3/errors
One can raise the limit to 1,000 queries per 100 seconds per user:
Go to the Google Developers Console
Dashboard > under the API column, click on "Analytics API"
"Quotas" tab > change "Queries per 100 seconds per user" from 100 to 1000
src: How to fix "Quota Error: User Rate Limit Exceeded"

"Bing Error - Out of call volume quota" on first use

I am trying to use Blockspring in Excel (and Google Sheets) to access Bing Web Search to return URLs based upon a list of company names (method is described here: https://www.youtube.com/watch?v=35U-FKAlaPY). I am receiving this: "#ERROR! Bing Error - Out of call volume quota. Quota will be replenished in 15.01:52:46", even though my Microsoft Cognitive Services account shows zero of 1,000 calls used before 1/7/17. Note that this is a free trial subscription (if that matters). Any ideas would be appreciated.

Quota violation is not working as per quota set in API proxies

I have created the quota below, which allows the API to be consumed 6 times per hour. The proxy uses Verify API Key authentication.
URL is http://damuorgn-test.apigee.net/weatherforecastforlongandlat?apikey=dJAXoH8y6GfVNJSjlDhpVIB4XCVyJZ1R
But the Quota exception occurs only after the 8th call (it should actually occur on the 7th). Also, when I try to change the quota limit and re-deploy the API proxy, I still see the Quota exception on the very first call. Please advise.
I am using a free Apigee cloud organization.
<Quota name="Quota-1">
    <DisplayName>Quota 1</DisplayName>
    <Allow count="6"/>
    <Interval ref="request.header.quota_count">1</Interval>
    <Distributed>false</Distributed>
    <Synchronous>false</Synchronous>
    <TimeUnit>hour</TimeUnit>
    <StartTime>2014-6-11 19:00:00</StartTime>
    <AsynchronousConfiguration>
        <SyncIntervalInSeconds>20</SyncIntervalInSeconds>
        <SyncMessageCount>5</SyncMessageCount>
    </AsynchronousConfiguration>
</Quota>
Okay two things...
1) Your Quota is set to <Distributed>false</Distributed>.
By default your Apigee instance runs on two separate Message Processors (the servers that do the heavy lifting). This means that each MP keeps its own count, so with a Quota of 6 you effectively have 6 * 2 servers = 12.
2) Your Quota is Distributed but Asynchronous in the second example.
When <Distributed> is true but <Synchronous> is false, Apigee shares Quota counts by checking in with the central data server asynchronously. There is always some lag with this, and you have set your AsynchronousConfiguration to check in with the central server every 20 seconds or every 5 messages, so each MP could count up to 5 before checking in with the other servers.
Keep in mind that in a distributed processing model like Apigee you will never get an absolutely precise number, because even with Distributed and Synchronous both set to true there will always be some lag with the servers talking to each other.
Also, you might want to strip out the request.header.quota_count and other request.header variables -- if I passed a number (say 100000) as a header like
quota_count: 100000
Apigee will use the 100000 rather than your value of 1 (it uses the referenced variables and rolls back to the default value only if the reference is NULL).
And... you probably want to add an <Identifier ref="client.ip"/> or something; otherwise the quota is global to all users. See the Apigee variables reference at http://apigee.com/docs/api-services/api/variables-reference for the variables that are available in every flow.
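Putting those points together, a policy along the lines suggested above might look like the sketch below (the values are illustrative, not the asker's exact configuration):

<Quota name="Quota-1">
    <DisplayName>Quota 1</DisplayName>
    <!-- Fixed allowance of 6 calls per hour; no request.header overrides -->
    <Allow count="6"/>
    <Interval>1</Interval>
    <TimeUnit>hour</TimeUnit>
    <!-- Share one counter across Message Processors and update it synchronously -->
    <Distributed>true</Distributed>
    <Synchronous>true</Synchronous>
    <!-- Count per client rather than in one global bucket -->
    <Identifier ref="client.ip"/>
</Quota>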
