I tried searching but I didn't find any useful resource that would answer my question.
I'm trying to develop a service for my customers where I will need to connect to their analytics data and combine it with information from other services that I already provide. However, given the API request quota, how can I get it to work for several customers?
I mean, the limit is 10,000 requests per month, and I will probably make around 40-50 requests per day per customer. That means that if I get more than 7 customers using it I would reach the monthly quota. What is the best approach to make this scalable?
Thank you in advance!
I think you are a little confused about the Google Analytics API limits.
The Management API and Metadata API have a limit of 10,000 requests per day and 10 requests per second.
The Core Reporting API allows 10,000 requests per day per user and/or view (which used to be called a profile), and 50,000 requests per day per application. You can request that the 50k be extended, but you need to show that there aren't a lot of errors coming from your application.
It might also be a good idea to send either userIp or quotaUser with all of your requests; this will ensure that each of your users gets 10k requests each day. If you don't send quotaUser or userIp then Google lumps them all under the same quota user and, as a group, they are limited to the 10k. This may or may not be a problem if you can ensure that several users won't be requesting the same data from the same view (used to be called a profile).
Another thing to remember is that nextLinks count towards the limit as well, so you should either refine your requests so that you don't get too many rows back, or set max-results high enough that you don't get too many nextLinks.
You can read more about how and why you should use quotaUser here: Google Analytics QuotaUser
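For illustration, here is a minimal sketch of passing quotaUser (and a large max-results) on a Core Reporting API v3 call. It assumes the google-api-python-client and an already-authorized credentials object; fetch_sessions, view_id and customer_id are just placeholder names:

    from googleapiclient.discovery import build

    def fetch_sessions(credentials, view_id, customer_id):
        # Build the Analytics v3 service from already-authorized credentials.
        service = build('analytics', 'v3', credentials=credentials)
        return service.data().ga().get(
            ids='ga:%s' % view_id,
            start_date='7daysAgo',
            end_date='yesterday',
            metrics='ga:sessions',
            dimensions='ga:date',
            max_results=10000,      # large pages mean fewer nextLinks to follow
            quotaUser=customer_id,  # any stable string that identifies the end user
        ).execute()

With a distinct quotaUser value per customer, each customer draws on their own per-user quota instead of all requests being lumped together.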
The quota is 10,000 per day per profile.
You should be fine especially if each of your clients has a separate profile.
https://developers.google.com/analytics/devguides/reporting/core/v2/limits-quotas#core_reporting
Does anyone know how many concurrent requests per second can be made to the Sabre developer Bargain Finder Max API?
I have searched everywhere and all I can find is "contact your rep to have it increased."
Our setup included 50 sessions, but that was not a hard limit. We were also able to extend it to 150 at no cost. As far as I know there is no limit on concurrent requests, but depending on what calls you make you should keep your "look to book" ratio (usually 500:1) in mind.
I believe they are usually sold in bundles of 50 sessions per EPR but can be increased. I do not believe there is any way of viewing that information about a given EPR in any kind of tool, so your rep is the best place to go for that kind of information. Pricing depends on all sorts of contractual items, I believe.
There is a limit on the number of concurrent requests for the API. If it is exceeded, you get the error USG_CONNECTOR_IS_BUSY, which means that the maximum number of concurrent requests for the API has been exceeded. Contact your Sabre account manager to determine or increase your allocated concurrent request limit for this API. In the meantime, wait at least 500 milliseconds and resend the request.
The allocations are usually increased in bundles of 50. Your Sabre account manager would be the best person who can provide specific details.
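To make the "wait at least 500 milliseconds and resend" advice concrete, here is a rough sketch; send_bfm_request is a hypothetical stand-in for however your application posts the BFM SOAP envelope and returns the raw response text:

    import time

    def call_with_retry(send_bfm_request, payload, max_attempts=5):
        """Resend a BFM request when the concurrent-request limit is hit."""
        for attempt in range(max_attempts):
            response = send_bfm_request(payload)
            if 'USG_CONNECTOR_IS_BUSY' not in response:
                return response
            # Concurrency limit exceeded: wait at least 500 ms, then try again.
            time.sleep(0.5 * (attempt + 1))
        raise RuntimeError('BFM still busy after %d attempts' % max_attempts)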
#PCM7,
There is no specific limit, as this depends on the commercial agreement between the travel agency and Sabre; the token generated for the BFM is managed directly at Sabre.
At the agency where I work we have several tourism companies, and each one has a specific TPS (transactions per second) allocation for BFM.
This information can be obtained directly from the Sabre account executive for the travel agency in question.
https://developer.sabre.com/docs/soap_apis/air/search/bargain_finder_max
I have an app that has hundreds of users and connects to Evernote. As I get more users I make more requests to Evernote, and it is causing a lot of rate limiting for my users and causing frustration. Is there a way to get my current limit increased by Evernote?
I have fixed a lot of inefficient calls I used to make, but we still have the same issue.
Rate limits are applied to calls against the Evernote API on a per API key, per user, per time period basis. This means that the API limits the number of calls a third-party app can make for each individual user during a given one-hour period. [source]
The number of users of your application is irrelevant. The source for that quote details a number of reasons and fixes.
If you've optimised your code fully, this may be a "special case". You should contact Evernote developer support.
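Whatever limit you end up with, the app still needs to handle the rate-limit error gracefully rather than surfacing it to users. A minimal sketch, assuming the official Evernote Python SDK (EDAMSystemException, EDAMErrorCode and rateLimitDuration come from its Thrift definitions):

    import time

    from evernote.edam.error.ttypes import EDAMErrorCode, EDAMSystemException

    def call_evernote(api_call, *args, **kwargs):
        """Run one Evernote API call, sleeping out the rate limit once if it is hit."""
        try:
            return api_call(*args, **kwargs)
        except EDAMSystemException as e:
            if e.errorCode != EDAMErrorCode.RATE_LIMIT_REACHED:
                raise
            # rateLimitDuration is the number of seconds until calls are allowed again.
            time.sleep(e.rateLimitDuration + 1)
            return api_call(*args, **kwargs)

    # Usage, e.g. with a note store obtained from the SDK's EvernoteClient:
    # notebooks = call_evernote(note_store.listNotebooks)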
We are using the Core Reporting API and Real Time Reporting API.
These APIs are limited to 10,000 requests/day per view, but I want to increase this limit.
Is this possible?
If possible, please let me know how we can increase the quota limits and what the price would be for 20,000 requests/day.
There are several quotas for the Google Analytics APIs and Google APIs in general.
requests/day: 50,000
requests/100 seconds/user: 100
requests/day/view: 10,000
Your application can make 50,000 requests per day by default. This can be extended, but it takes a while to get permission, so when you are getting close to this limit (around 80%) it's best to request an extension.
Your user can make at most 100 requests per 100 seconds (this must have gone up recently; last I knew it was only 10 requests a second). A user is denoted by IP address. There is no way to extend this quota beyond the maximum; you can't apply for it or pay for it.
Then there is the last quota, the one you asked about. You can make at most 10,000 requests a day to a view. This isn't just application based: if a user runs my application and your application, then together we have only 10,000 requests that can be made. This quota is a pain if you ask me. Now for the bad news: there is no way to extend this quota. You can't apply for it, you can't pay for it, and you can't beg the Google Analytics dev team (I have tried).
Answer: No, you can't extend the per-view per-day quota limit.
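The only thing you can do in code is back off and retry the short-term rate errors; the per-view daily quota simply resets the next day. A rough sketch, assuming the google-api-python-client, with the error reason strings checked loosely because they arrive inside the HttpError body:

    import random
    import time

    from googleapiclient.errors import HttpError

    RETRIABLE_REASONS = ('rateLimitExceeded', 'userRateLimitExceeded', 'quotaExceeded')

    def execute_with_backoff(request, max_retries=5):
        """Execute a prepared API request, retrying short-term 403 rate errors."""
        for attempt in range(max_retries):
            try:
                return request.execute()
            except HttpError as err:
                if err.resp.status == 403 and any(r in str(err) for r in RETRIABLE_REASONS):
                    time.sleep((2 ** attempt) + random.random())  # exponential backoff with jitter
                else:
                    raise
        raise RuntimeError('Still rate limited after %d retries' % max_retries)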
I am planning to develop a website which will allow registered users to view their analytics data from various sites like Google Analytics in one dashboard, somewhat similar to http://www.cyfe.com/ which provides an all-in-one dashboard.
I am thinking of two approaches to implement this application.
Approach #1: once the user logs in to my web application and requests data, my application makes a call to the analytics site using its API (e.g. the Google Analytics API) and displays the response data.
Approach #2: run a job at a regular interval (say every 30 minutes) that retrieves analytics data for all registered users and saves it in my application database. When a user requests data, my application displays the data from the application database instead of sending a request to the analytics site.
Can anyone please suggest the pros/cons of each approach and which one is good to implement?
Remember Google Analytics data isn't done processing for 24-48 hours, so requesting data every 30 minutes is overkill; the data won't be complete or accurate. Run your application once a day to get data for two days ago.
The main problem you are going to have is the limit of 7 dimensions and 10 metrics per request. There is no primary key so there is no way of linking data from one request back to the data of another request.
Another issue is that you can return at most 10k rows per request; depending on how many rows your query produces, you can end up making a large number of requests against the API, which will be hard on your quota.
You may also end up with quota issues: you can make a maximum of 10k requests to each profile per day. Once you have hit that quota you will not be able to make any more requests against that profile until the next day. This quota cannot be extended.
You can also make a maximum of 10 requests a second per user / profile. You can tweak this a little using quotaUser, but your application will not be able to run very fast; it takes on average half a second for each request to return data. Things are going to take time unless you want to run multiple instances of your extraction application, but again that will require you to tweak quotaUser. This quota cannot be extended.
Your application can make a maximum of 50k requests against the API per day across all profiles. Once you reach 80% of that quota I recommend you apply for an extension; it can take a month or more to get this quota extended, so it is a good idea to plan ahead.
Note: I am the lead developer on a business intelligence application that exports data from Google Analytics into a data warehouse daily, and I have run into each of these issues. What you are planning to do is possible; you just need to understand the limitations of the Google Analytics API before you begin your development process.
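To tie the "once a day, two days back" and pagination points together, here is a rough sketch, again assuming the google-api-python-client; the service object, view_id and the chosen metrics/dimensions are placeholders:

    import datetime

    def daily_export(service, view_id):
        """Pull one fully-processed day (two days ago) and page through every row."""
        report_date = (datetime.date.today() - datetime.timedelta(days=2)).isoformat()
        rows, start_index = [], 1
        while True:
            response = service.data().ga().get(
                ids='ga:%s' % view_id,
                start_date=report_date,
                end_date=report_date,
                metrics='ga:sessions,ga:pageviews',
                dimensions='ga:date,ga:medium',
                max_results=10000,        # the per-request maximum keeps nextLinks down
                start_index=start_index,
            ).execute()
            rows.extend(response.get('rows', []))
            if 'nextLink' not in response:
                return rows
            start_index += response['itemsPerPage']  # every extra page is one more request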
Recently I was developing an application using the LinkedIn People Search API. The documentation says that a developer registration gets 100,000 (1 lakh) API calls per day, but when I registered this API and ran a Python script, after some 300 calls it said the throttle limit was exceeded.
Has anyone faced this kind of issue using the LinkedIn API? Comments are appreciated.
Thanks in advance.
It's been a while, but the stats suggest people still look at this; I'm experimenting with the LinkedIn API and can provide some more detail.
The typical throttles are stated as both a max (e.g. 100K) and a per-user-token number (e.g. 500). Those numbers together mean you can get up to a maximum of 100,000 calls per day to the API but even as a developer a single user token means a maximum of 500 per day.
I ran into this, and after setting up a barebones app and getting some users I can confirm a daily throttle of several thousand API calls. [Deleted discussion of what was probably, upon further consideration, an accidental back door in the LinkedIn API.]
As per the Throttle Limits published by LinkedIn:
LinkedIn API keys are throttled by default. The throttles are designed
to ensure maximum performance for all developers and to protect the
user experience of all users on LinkedIn.
There are three types of throttles applied to all API keys:
Application throttles: These throttles limit the number of each API call your application can make using its API key.
User throttles: These throttles limit the number of calls for any individual user of your application. User-level throttles serve
several purposes, but in general are implemented where there is a
significant potential impact to the user experience for LinkedIn
users.
Developer throttles: For people listed as developers on their API keys, they will see user throttles that are approximately four times
higher than the user throttles for most calls. This gives you extra
capacity to build and test your application. Be aware that the
developer throttles give you higher throttle limits as a developer of
your application. But your users will experience the User throttle
limits, which are lower. Take care to make sure that your application
functions correctly with the User throttle limits, not just for the
throttle limits for your usage as a developer.
Note: To view current API usage of your application and to ensure you haven't hit any throttle limits, visit
https://www.linkedin.com/developer/apps and click on "Usage & Limits".
The throttle limit for individual users of People Search is 100, with 400 being the limit for the person that is associated with the Application as the developer:
https://developer.linkedin.com/documents/throttle-limits
When you run into a limit, view the API usage for the application on the application page to see which throttle you are hitting.
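If the script keeps running after the 403, it just burns more calls against the same throttle. Here is a rough sketch of treating it as a hard stop, assuming the old v1 People Search REST endpoint (now retired) and an OAuth bearer token; adapt the URL, parameters and error check to whatever you are actually calling:

    import sys

    import requests

    SEARCH_URL = 'https://api.linkedin.com/v1/people-search'  # legacy v1 endpoint, assumed

    def search_people(token, keywords):
        response = requests.get(
            SEARCH_URL,
            params={'keywords': keywords, 'format': 'json'},
            headers={'Authorization': 'Bearer %s' % token},
        )
        if response.status_code == 403 and 'throttle' in response.text.lower():
            # Throttle reached for this key/user token; it resets daily,
            # so retrying now only wastes calls.
            sys.exit('LinkedIn throttle limit reached; stopping for today.')
        response.raise_for_status()
        return response.json()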