I am fetching data from Google Analytics for the metrics (pageviews, unique pageviews, timeOnPage, exits) as below:
// Build a Core Reporting API request for the four metrics.
DataResource.GaResource.GetRequest r = GAS.Data.Ga.Get(
    profileID,
    startdate.ToString("yyyy-MM-dd"),
    enddate.ToString("yyyy-MM-dd"),
    "ga:pageviews,ga:uniquePageviews,ga:timeOnPage,ga:exits");
r.Dimensions = "ga:pagePath";                            // one row per page path
r.Filters = "ga:pagePath=~ItemID=" + strPagePath + "*";  // regex filter on the item's path
r.MaxResults = 1000;
GaData d = r.Fetch();
Then I received the following exception after fetching data (metrics) for some seemingly random number of videos:
Error while fetching pageviews from GA Google.Apis.Requests.RequestError
Quota Error: profileId ga:****** has exceeded the daily request limit. [403]
Errors [
    Message[Quota Error: profileId ga:****** has exceeded the daily request limit.] Location[ - ] Reason[dailyLimitExceeded] Domain[global]
]
I am fetching these four metrics (pageviews, unique pageviews, and so on) for one ItemID.
Does Google Analytics count that as four different requests or as one single request?
Each request you send against the Google Analytics API counts as one, no matter how many metrics it asks for. The quota is not project or user based.
Pagination:
In the request above you set maxResults to 1000; if the total number of rows matching your query is 100,000, then you will have to make 100 requests to get all of the data.
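As a rough sketch, continuing from the request object r in the question (and assuming the v3 .NET client's StartIndex and TotalResults members behave as documented), the pagination loop looks something like this:

// Page through every matching row, 1000 at a time.
// Each Fetch() call counts as one request against the daily quota.
var allRows = new List<IList<string>>();
int startIndex = 1;                      // start-index is 1-based in the v3 API
GaData page;
do
{
    r.StartIndex = startIndex;
    page = r.Fetch();                    // one quota unit per page
    if (page.Rows != null)
        allRows.AddRange(page.Rows);
    startIndex += 1000;                  // advance by maxResults
} while (allRows.Count < (page.TotalResults ?? 0));

So a 100,000-row result set costs 100 units of the view's daily quota by itself.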
All APIs:
Requests to all of the Google Analytics APIs count against the same quota, so if you are also using the Management API, those calls count just like the Reporting API calls do.
All Users and applications:
Now here is the fun part about the current quota system: it is not project related.
Let's say my company has a profile 1234567, and our whole marketing team has access to it. Each member of the team likes a different app, and they all install the one they like best. They are all drawing from the same 10,000-request quota for that profile.
Reset:
Your quota resets at midnight US West Coast time; no one will be able to access that view ID until then. Top tip: when testing, create a development view under the web property and request against that, so you won't blow out your production view.
Related
How to Increase Data limit for Analytics 360 using resource based quota: 100M sessions
As per the documentation, I'm setting the useResourceQuotas flag to true to enable this feature, which should allow up to 100M sessions per request:
https://developers.google.com/analytics/devguides/reporting/core/v4/resource-based-quota
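For reference, the flag sits at the top level of the batchGet request body, next to reportRequests; a minimal example (view ID, dates, and metric are placeholders):

{
  "reportRequests": [
    {
      "viewId": "XXXXXXXX",
      "dateRanges": [{ "startDate": "2019-06-01", "endDate": "2019-06-01" }],
      "metrics": [{ "expression": "ga:sessions" }],
      "pageSize": 100000
    }
  ],
  "useResourceQuotas": true
}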
After running a test query for a single date in Postman, I do see that sampling is removed (samplesReadCounts is absent from the response). However, I'm still getting only 1M total rows in the response, even though I know for a fact there are more than 1M rows.
Endpoint: analyticsreporting.reports.batchGet
Some things to note:
I am requesting data on behalf of a client and accessing it via an access token (the client has a Google Analytics 360 account).
I have an application that's been running since 2015. It both reads and writes to approximately 16 calendars via a service account, using the Google node.js client library (Calendar v3 API). We also have G Suite for Education.
The general process is:
Every 30 seconds it caches all calendar data via a list operation.
Periodically a student will request an appointment "slot"; the app first checks that the slot is still open (via a list call), then performs an insert (roughly the flow sketched below).
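(The app itself is node.js; purely as an illustration of the flow, here is a hedged sketch using the equivalent .NET Calendar v3 client, where service is an authenticated CalendarService and the calendar ID, slot times, and student email are placeholders.)

// Re-check that the slot is still open, then insert the appointment.
var listRequest = service.Events.List(calendarId);
listRequest.TimeMin = slotStart;          // the slot's start time
listRequest.TimeMax = slotEnd;            // the slot's end time
Events existing = listRequest.Execute();  // one list call

// Only insert if nothing already occupies the slot.
if (existing.Items == null || existing.Items.Count == 0)
{
    var appointment = new Event
    {
        Summary = "Student appointment",
        Start = new EventDateTime { DateTime = slotStart },
        End = new EventDateTime { DateTime = slotEnd },
        Attendees = new[] { new EventAttendee { Email = studentEmail } }
    };
    service.Events.Insert(appointment, calendarId).Execute();  // one insert call
}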
That's all it does. It had been running fine until the past few days, when API insert calls started failing:
{
    "code": 403,
    "errors": [{
        "domain": "usageLimits",
        "reason": "quotaExceeded",
        "message": "Calendar usage limits exceeded."
    }]
}
This isn't all that special - the documentation has three "solutions":
Read more on the Calendar usage limits in the G Suite Administrator help.
If one user is making a lot of requests on behalf of many users of a G Suite domain, consider using a Service Account with authority delegation (setting the quotaUser parameter).
Use exponential backoff.
I'm not exceeding any of the stated limits as far as I can tell.
While I'm using a service account, it isn't making requests on behalf of a user; the service account has write access to the calendar and simply adds the user as an attendee.
Finally, I do not think exponential backoff would help, although I have not implemented it. The time between one insert request and the next is measured in seconds, not milliseconds. Additionally, running calls directly from the command line with a simple script produces the same problem.
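(For completeness, since the docs keep pointing at it: backoff is only a few lines. A minimal sketch in C#, purely illustrative since the app is node.js; InsertEvent() stands in for the real insert call.)

// Retry with exponential backoff plus jitter on quota errors.
var rng = new Random();
for (int attempt = 0; attempt < 5; attempt++)
{
    try
    {
        InsertEvent();                   // placeholder for the Calendar insert
        break;                           // success, stop retrying
    }
    catch (Google.GoogleApiException e)
        when (e.HttpStatusCode == System.Net.HttpStatusCode.Forbidden)
    {
        // Wait 2^attempt seconds plus up to one second of random jitter.
        int delayMs = (int)Math.Pow(2, attempt) * 1000 + rng.Next(1000);
        System.Threading.Thread.Sleep(delayMs);
    }
}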
Some stats:
2015 - 2,466 inserts, 186 errors
2016 - 25,747 inserts, 237 errors
2017 - 42,815 inserts, 225 errors
2018 - 41,390 inserts, 1,074 errors (990 of which are in the past 3 days)
I have updated the code over the years, but it has remained largely untouched this term.
At this point I'm unsure what to do - there is no channel to reach Google, and while I have not implemented a backoff strategy, the way timings work in this application means subsequent calls are already delayed by seconds and handled by a queue that processes requests sequentially. The only concurrent requests would be list operations.
I have been testing an application I am developing using the web API, and I have started to get the following error message:
GCSP: Hello error: [1010] The Gracenote ODP 15822 [Name: *registered-name*] [App: *registered-app*] application has reached its daily lookup limit with Gracenote. You may try again tomorrow or may contact Gracenote support at support@gracenote.com.
[Gracenote Error: <ERR>]
The application I am developing looks up track details and cover artwork for songs streamed from the Mood/Pandora for Business service. It makes approximately one call per song, so something like 15 searches per hour on average. I may have done more during testing, but not a lot more.
Once completed, I would expect this service to make fewer than 500 searches per day per location, initially across 4 locations (so roughly 2,000 searches per day in total).
What are the lookup limits I am running into?
What are my options to get a higher lookup limit?
Thanks
I am issuing GET requests, as defined in the Google Measurement Protocol, from our server to record offline conversions.
The following test request (tracking ID obfuscated) validates against the /debug endpoint (using Postman):
https://www.google-analytics.com/debug/collect?v=1&tid=xx&cid=111300&t=transaction&ti=1500000&tr=100
{
"hitParsingResult": [ {
"valid": true,
"parserMessage": [ ],
"hit": "/debug/collect?v=1\u0026tid=xxu0026cid=111300\u0026t=transaction\u0026ti=1500000\u0026tr=100"
} ],
"parserMessage": [ {
"messageType": "INFO",
"description": "Found 1 hit in the request."
} ]
}
The hit also shows up in the Sales Performance report in Google Analytics when submitted to the production endpoint with Postman (i.e. without /debug/).
However, I can't see any of the actual production data submitted from the server in that report.
Any ideas?
This is kind of tricky. Yes, the transaction is valid, but the debugger only checks the syntax; your Google Analytics configuration has not enabled that type of hit (t=transaction is only for standard e-commerce). In my test account I ran that hit and it worked. In your case, if your account uses enhanced e-commerce, the hit is being filtered out during processing.
Here is a screenshot of your hit on my test view running classic e-commerce.
So you have two options to fix this; the first is to downgrade your e-commerce (not recommendable in all cases).
Downgrade
If you want to use that syntax, you have to uncheck enhanced e-commerce and that should work in your case. With your hit and my configuration this works (a new account with no filters and standard e-commerce enabled).
Attach information
Enhanced e-commerce was designed to be sent attached to other hits (mainly events or pageviews).
For example, the hit below is a non-interaction event, and it is valid for carrying the transaction and the purchase. Using non-interaction events avoids fake sessions and lets you import transaction data without altering metrics such as bounce rate.
https://www.google-analytics.com/collect?v=1&t=event&ni=1&ec=Ecommerce&ea=Transaction&cid=2.2&tid=UA-xxxxx-1&ti=T12345&tr=35.43&pa=purchase
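(A minimal C# sketch of firing that hit server-side; the property ID, client ID, and transaction values are the placeholders from the example URL above.)

using System;
using System.Net.Http;
using System.Threading.Tasks;

class SendHit
{
    static async Task Main()
    {
        using var http = new HttpClient();
        // Non-interaction event carrying the transaction, as in the URL above.
        var url = "https://www.google-analytics.com/collect"
                + "?v=1&t=event&ni=1&ec=Ecommerce&ea=Transaction"
                + "&cid=2.2&tid=UA-xxxxx-1&ti=T12345&tr=35.43&pa=purchase";
        HttpResponseMessage resp = await http.GetAsync(url);
        // Note: the live /collect endpoint returns 200 even for malformed
        // hits, so validate against /debug/collect while testing.
        Console.WriteLine((int)resp.StatusCode);
    }
}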
There is data latency with Google Analytics. Officially it's 24-72 hours before data shows up in the standard reports.
From my own experience I can say that, depending upon how much data is in your account, you can see it as early as 12-24 hours.
If the debug endpoint says it's a valid hit, you can assume it's working fine.
I've created a custom dashboard (JavaScript on localhost) which has about 10 views that are generated with the Google Analytics API and Google Charts.
Now that I've been debugging it for a day or two, it has started to return the error "Quota Error: User Rate Limit Exceeded".
The Google console shows only 1,416 queries for the past 4 days.
And yet, the quota states the following:
Queries per day: 50,000
Queries per 100 seconds per user: 100
Queries per 100 seconds: 2,000
Am I perhaps using some other Google API that has much smaller quota limits? However, I couldn't find any other API in the Google console.
Well, with the correct keywords it was easy to find:
https://developers.google.com/analytics/devguides/reporting/core/v3/errors
The error is the per-user rate limit: a dashboard that fires its ~10 queries at once can trip the 100-queries-per-100-seconds-per-user limit even with a tiny daily total. One can raise that limit to 1,000 queries per 100 seconds per user:
Go to the Google Developers Console.
Dashboard > under the API column, click on "Analytics API".
"Quotas" tab > change "Queries per 100 seconds per user" from 100 to 1,000.
src: How to fix "Quota Error: User Rate Limit Exceeded"