I'm using a proxy to make requests to the Google Analytics API. This proxy has a timeout of 30 seconds, and I sometimes hit that timeout. Do you know if it is normal for Google to take more than 30 seconds to respond? The biggest request I have contains 4 metrics and 4 dimensions.
Also, is there any way to improve these response times, for example by requesting a lower number of results per page or some similar mechanism?
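One lever you do have is the page size: the Reporting API v4 `reportRequests` schema accepts a `pageSize` (default 1,000 rows) and a `pageToken` for fetching subsequent pages. A sketch of building such a request body (the view ID, metrics, and dimensions here are placeholders, not values from the question):

```python
# Sketch of a Reporting API v4 request body that limits results per page.
# Fewer rows per response generally means a smaller, faster reply; use the
# nextPageToken from each response to fetch the remaining pages.
def build_report_request(view_id, metrics, dimensions,
                         page_size=1000, page_token=None):
    request = {
        "viewId": view_id,
        "dateRanges": [{"startDate": "3daysAgo", "endDate": "today"}],
        "metrics": [{"expression": m} for m in metrics],
        "dimensions": [{"name": d} for d in dimensions],
        "pageSize": page_size,  # default is 1,000; lower it to shrink each response
    }
    if page_token:
        request["pageToken"] = page_token
    return {"reportRequests": [request]}

body = build_report_request("ga:12345678", ["ga:sessions"], ["ga:country"],
                            page_size=500)
```

Whether a smaller page actually avoids the proxy timeout depends on where the time is spent; if Google is slow to compute the report rather than to transfer it, paging may not help much.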
I have an application which gets Google Analytics statistics for 28 Google users (a.k.a. accounts/logins/emails) from the Reporting API.
Every client makes a request for its own GA data (some metrics for the last 3 days) once every 10 minutes.
Everything was fine for a long period, but yesterday around 5:00 pm UTC one of the GA users started to get a 403:rateLimitExceeded error in response.
A cycle of 10 repeated requests gets the same 403 error. In 10 minutes the next cycle starts, and the result is the same.
All other clients on the same application keep updating fine, without getting the 403:rateLimitExceeded error in return.
I have a "sleep" function for 1 second before making a request, and I send a unique "quotaUser" in each request. My application makes less than 1 RPS and stays within 20K requests per day.
As far as I am aware, the 403:rateLimitExceeded error refers to the overall limit of requests for the whole application per day; however, in my case all other clients except this one keep updating properly, and the overall daily limit of 50K is not being exceeded.
UPDATE:
At 0:00 UTC this GA user stopped getting the 403 error and now keeps updating fine.
Please advise what the possible reasons for getting the 403:rateLimitExceeded error for that client could be, and what I can do to avoid running into the same problem again.
The reason was too many failed requests within a short time period; in this case you should wait for the quota reset at midnight.
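Repeating the failed request immediately every cycle, as in the question, only prolongs the lockout. The usual mitigation Google recommends for 403:rateLimitExceeded is exponential backoff with jitter; a minimal sketch (`RateLimitError` here stands in for however your client surfaces the 403):

```python
import random
import time

class RateLimitError(Exception):
    """Placeholder for a 403:rateLimitExceeded response from the API."""

def with_backoff(call, max_retries=5, base=1.0):
    """Retry `call` with exponential backoff when it raises RateLimitError.

    Waits base * 2^attempt seconds plus random jitter before each retry,
    instead of hammering the API with an immediate repeat of the failed
    request. Re-raises after `max_retries` failed attempts.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise
            time.sleep(base * 2 ** attempt + random.uniform(0, base))
```

With `base=1.0` the waits are roughly 1–2 s, 2–3 s, 4–5 s, and so on, which spreads retries out enough that a transient per-user limit can recover.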
Is there a way to request an increase of the 6-per-minute limit on the calcHistogram method?
This limitation doesn't currently allow for any kind of development or testing, let alone a production environment.
At the moment I don't see any subscription option, upgrade, or way to request removal of the limit.
The Academic Knowledge API is still in preview and is subject to the transaction limitations on calcHistogram. We are in the process of adding a new tier with higher transaction limits, and these should be online in the next couple of months.
As far as I know, Marketo limits the number of REST API requests to 10,000 per day. Is there a way to overcome this limit? Can I pay to get more of them?
I found out that REST API requests and SOAP API requests are counted separately, but I'm trying to find a solution limited to the REST API.
Moreover, in order to get an access token I need to sacrifice a request, so I need to know how long this access token stays alive in order to save as many requests as possible.
You can increase your limit just by asking your account manager. It costs about $15K per year to increase your limit by 10K API calls.
Here are the default limits in case you don't have them yet:
Default Daily API Quota: 10,000 API calls (counter resets daily at 12:00 AM CST)
Rate Limit: 100 API calls in a 20 second window
Documentation: REST API
You'll want to ask your Marketo account manager about this.
I thought I would update this with some more information since I get this question a lot:
http://developers.marketo.com/rest-api/
Daily Quota: Most subscriptions are allocated 10,000 API calls per day (which resets daily at 12:00AM CST). You can increase your daily quota through your account manager.
Rate Limit: API access per instance limited to 100 calls per 20 seconds.
Concurrency Limit: Maximum of 10 concurrent API calls.
For the Daily limit:
Option 1: Call your account manager. This will cost you money; for a client I work with, we negotiated a much higher limit.
Option 2: Store and Batch your records. For example, you can send a batch of 300 leads in a single lead insert/update call. Which means you can insert/update 3,000,000 leads per day.
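The arithmetic behind Option 2: 10,000 calls/day × 300 leads/call = 3,000,000 lead upserts per day. A sketch of the batching, assuming the standard `POST /rest/v1/leads.json` sync endpoint with a `createOrUpdate` action (the emails are made-up placeholders):

```python
import json

MARKETO_BATCH_SIZE = 300  # documented max leads per sync call

def chunk(records, size=MARKETO_BATCH_SIZE):
    """Split records into batches of at most `size` items."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def build_sync_payloads(leads):
    """Build one createOrUpdate payload per batch of up to 300 leads.

    Each payload is the JSON body for a single POST to
    /rest/v1/leads.json, so one API call covers up to 300 records
    instead of one call per lead. The actual HTTP POST is omitted here.
    """
    return [
        json.dumps({"action": "createOrUpdate", "input": batch})
        for batch in chunk(leads)
    ]

payloads = build_sync_payloads(
    [{"email": f"user{i}@example.com"} for i in range(650)]
)
# 650 leads -> 3 API calls (300 + 300 + 50) instead of 650
```

The same accumulate-then-flush pattern works for any of the batchable endpoints; the point is to buffer records on your side until you can fill a payload.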
For the Rate limit:
Option 1 will probably not work: your account manager will be reluctant to change this unless you are a very large company.
Option 2: You need to add some governance to your code. There are several ways to do this, including queues, timers with a counter, etc. If you make multi-threaded calls, you will need to take into account concurrency etc.
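One way to implement that governance for the 100-calls-per-20-seconds limit is a sliding-window limiter that records call timestamps and blocks when the window is full; a thread-safe sketch:

```python
import threading
import time
from collections import deque

class SlidingWindowLimiter:
    """Allow at most `max_calls` calls per `window` seconds (e.g. 100 per 20 s).

    acquire() blocks until a slot is free, so multi-threaded callers
    can share one limiter instance.
    """
    def __init__(self, max_calls=100, window=20.0):
        self.max_calls = max_calls
        self.window = window
        self.calls = deque()          # timestamps of recent calls
        self.lock = threading.Lock()

    def acquire(self):
        while True:
            with self.lock:
                now = time.monotonic()
                # Drop timestamps that have left the window.
                while self.calls and now - self.calls[0] >= self.window:
                    self.calls.popleft()
                if len(self.calls) < self.max_calls:
                    self.calls.append(now)
                    return
                # Wait until the oldest call ages out of the window.
                wait = self.window - (now - self.calls[0])
            time.sleep(wait)

limiter = SlidingWindowLimiter(max_calls=100, window=20.0)
# call limiter.acquire() immediately before each Marketo API call
```

A queue-based dispatcher or a fixed-interval timer achieves the same thing; the essential part is that every API call, from every thread, goes through one shared gate.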
Concurrent call limit:
You have to limit your concurrent threads to 10.
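A semaphore is the simplest way to enforce that cap when your code is multi-threaded; a minimal sketch:

```python
import threading

MAX_CONCURRENT = 10  # Marketo's documented concurrency limit
api_slots = threading.Semaphore(MAX_CONCURRENT)

def call_api(fn, *args, **kwargs):
    """Run fn while holding one of the 10 concurrency slots.

    The 11th thread to arrive blocks until one of the first 10
    releases its slot, so at most 10 API calls are in flight at once.
    """
    with api_slots:
        return fn(*args, **kwargs)
```

Wrap every outbound Marketo call in `call_api` (or the equivalent in your HTTP layer) and the concurrency limit takes care of itself; combine it with the rate governance above, since the two limits are independent.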
There are multiple ways to handle API quota limits.
If you want to avoid hitting the API limit altogether, try to achieve your functionality through Marketo webhooks. Marketo webhooks are not subject to API limits, but they have their own cons; please research this.
If you use the REST API, design your strategy to batch the maximum number of records into a single payload instead of smaller chunks: e.g., instead of sending 10 different API calls with 20 records each, accumulate the maximum allowed payload and call the Marketo API once.
The access token is valid for 1 hour after authenticating.
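Since the token lives for an hour, you only need to spend a call on the identity endpoint when the cached token is about to expire. A sketch of that caching, assuming `fetch` wraps your call to Marketo's identity endpoint and returns the `access_token` and `expires_in` fields from its response (the 60-second safety margin is my choice, not a Marketo requirement):

```python
import time

class TokenCache:
    """Cache a Marketo access token until shortly before it expires.

    `fetch` is any callable returning (token, expires_in_seconds).
    Re-using the token for its full lifetime means one authentication
    call per hour instead of one per request.
    """
    def __init__(self, fetch, margin=60):
        self.fetch = fetch
        self.margin = margin      # refresh this many seconds early
        self.token = None
        self.expires_at = 0.0

    def get(self):
        if self.token is None or time.monotonic() >= self.expires_at:
            self.token, expires_in = self.fetch()
            self.expires_at = time.monotonic() + expires_in - self.margin
        return self.token
```

Note that the identity response's `expires_in` reports the remaining lifetime, so trusting it is safer than hard-coding 3600 seconds.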
Marketo's Bulk API can be helpful with regard to rate limiting, since once you have the raw activities, the updates etc. on the lead object can be done without pinging Marketo for each lead: http://developers.marketo.com/rest-api/bulk-extract/. However, be aware of the export limits you may run into when bulk-exporting leads + activities. Currently, Marketo only counts the size of an export against the limit once the job has completed, which means that as a workaround you can launch a maximum of 2 concurrent export jobs (which together sum to more than the limit) at the same time. Marketo will not kill a running job once the limit has been reached, as long as the job was launched before the limit was reached.
Marketo has recently raised the default limit:
Daily Quota: Subscriptions are allocated 50,000 API calls per day (which resets daily at 12:00AM CST). You can increase your daily quota through your account manager.
Rate Limit: API access per instance limited to 100 calls per 20 seconds.
Concurrency Limit: Maximum of 10 concurrent API calls.
https://developers.marketo.com/rest-api/
We've been developing with Firebase for a couple of months and recently we've seen some long delays in downloading data (e.g. 20 seconds). During those times the "forge" web UI is also tremendously slow to respond.
After a while, it seems to clear up and go back to its lightning-fast self.
Could this be because I'm using a significant portion of the free quota (80 MB / 100MB of storage and 1.6 GB / GB in bandwidth)? Are there undocumented rate limits we're hitting?
The last time this happened we had 6 concurrent users, and our all-time peak so far has been 13.
The short answer is no, dev accounts aren't rate-limited. They're capped on connections, data storage, and monthly data transfer, but there's no rate-limiting.
If you're having performance issues, your best bet would be to email support@firebase.com detailing what you're seeing and the name of your Firebase so that we can investigate. Typically, delays are the result of large data transfers going into or out of your Firebase (e.g. downloading your entire Firebase, which could be accidentally triggered by opening Forge), and there are usually mitigation strategies that we can help you out with.