My squad is building a Grafana dashboard that uses a custom datasource plugin (https://github.com/vistaprint/Application-Insights-Datasource-Plugin) which queries the Azure Application Insights API.
We have discovered that there is a limit on the maximum number of API requests (https://dev.applicationinsights.io/documentation/Authorization/Rate-limits):
* Throttling limit: no more than 15 requests can be made in a minute across all API paths (/metrics, /events and /query), and
* Daily cap: no more than 1500 requests per day (UTC day) per API key can be made across all API paths.
We're worried that 1500 requests per day may not be enough, especially if the dashboard is refreshed frequently and accessed by many users. We wonder if there is a way to increase that limit.
The next section of the article you linked states:
Limits when using Azure Active Directory authentication
If using the Azure API and Azure Active Directory for per-user authentication, the throttling limit is 60 requests per minute per user and there is no daily cap. [emphasis added]
If you need higher limits than API key auth allows, you'll need to look at using AAD auth instead.
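In the meantime, a client-side limiter in whatever backend issues the queries can keep requests under the published 15-per-minute throttle. Here is a minimal sliding-window sketch in Python; the plugin itself is written in TypeScript, so treat this as illustrative only (the limit and window values come from the docs above):

```python
import time
from collections import deque


class RateLimiter:
    """Tracks call timestamps and reports how long to wait so that
    at most `limit` calls happen in any `window`-second span."""

    def __init__(self, limit: int = 15, window: float = 60.0, clock=time.monotonic):
        self.limit = limit
        self.window = window
        self.clock = clock          # injectable for testing
        self.calls: deque = deque()

    def wait_time(self) -> float:
        """Seconds to wait before the next call is allowed (0 if allowed now)."""
        now = self.clock()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.limit:
            return 0.0
        return self.window - (now - self.calls[0])

    def record(self) -> None:
        """Call this after each API request is sent."""
        self.calls.append(self.clock())
```

A caller would check `wait_time()`, sleep that long if nonzero, then `record()` before issuing the request.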
Related
I have never used the Vision API before, but recently I have found it very powerful for a project of mine. However, I have two concerns regarding limiting its budget, so that I don't get an unexpected bill:
Is it possible to set a monthly cost limit? I am used to Compute Engine, which gives me an almost exact cost for the month, but that does not seem possible here. Since I will be using the API for labelling, I have set the label detection requests per minute and per user to a specific amount; to be safe, I have also set the global requests per minute and per user to the same amount, and all the other quotas to 0. If I have understood correctly, setting the max calls quota to 4 per minute, for example, should give a maximum of 178,560 calls per month, right? Should this limit my budget? Am I safe?
The API will be accessed with an API key from a mobile app. I have followed the code examples for iOS & Android and I have seen that the key is written in the code. Is this safe? For better security I have restricted the key to the iOS/Android app bundles and to the Cloud Vision API only. Would that be a safe enough option?
Thanks everyone for any help!
Yes, it's possible to set a monthly cost limit. Refer to this doc for more information about creating a budget and setting the budget scope, amount, and thresholds. And yes, your understanding is correct: setting the max calls quota to 4 per minute gives a maximum of 178,560 calls per month. Note that the budget itself doesn't change your maximum quotas.
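As a quick sanity check, the arithmetic behind that figure (assuming a 31-day month):

```python
# Maximum Vision API calls per month if the per-minute quota is 4.
CALLS_PER_MINUTE = 4
MINUTES_PER_DAY = 60 * 24
DAYS_PER_MONTH = 31  # worst case

max_calls_per_month = CALLS_PER_MINUTE * MINUTES_PER_DAY * DAYS_PER_MONTH
print(max_calls_per_month)  # 178560
```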
API keys that are embedded in code are neither safe nor secure.
Do not embed API keys directly in code. API keys that are embedded in code can be accidentally exposed to the public. For example, you may forget to remove the keys from code that you share. Instead of embedding your API keys in your applications, store them in environment variables or in files outside of your application's source tree.
Refer to this doc for more information about best practices for securing an API key.
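As a minimal illustration of the environment-variable approach (the variable name `VISION_API_KEY` is arbitrary and chosen for this example):

```python
import os


def load_api_key(var_name: str = "VISION_API_KEY") -> str:
    """Read the API key from the environment rather than hard-coding it.

    The variable name is an assumption for this example; use whatever
    naming convention your deployment follows.
    """
    key = os.environ.get(var_name)
    if not key:
        raise RuntimeError(f"Set the {var_name} environment variable first")
    return key
```

This keeps the key out of source control; in a mobile app, the stronger option is to proxy Vision API calls through your own backend so the key never ships with the app at all.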
Edit based on a question in the comment:
Can the quotas be seen as a hard limit?
The quotas can be treated as a hard limit only if the Vision API requests are the only billable resource running in your GCP project. Refer to this doc for more information about capping API usage.
If you want to set a hard limit and disable billing, configure a Cloud Function to call the Cloud Billing API that disables billing for the project as described in the GCP doc.
Note: Use this feature only if you want to stop the spending and are willing to shut down all your Google Cloud services and usage when your budget limit is reached.
I was just wondering whether it is possible to set a limit on the number of calls in Cognitive Services; this is in the context of keeping API keys in the app.
There is currently no way to set limits or caps on your quota. If you have a paid subscription through Azure, you are able to monitor your monthly usage.
This information appears when you are viewing the resource item (your subscription) in Azure.
We are using the Core Reporting API and the Real Time Reporting API.
These APIs are limited to 10,000 requests/day per view, but I want to increase this limit.
Is this possible?
If so, please let me know how we can increase the quota limits, and the price for the case of 20,000 requests/day.
There are several quotas for the Google Analytics APIs, and for Google APIs in general:
* requests/day: 50,000
* requests/100 seconds/user: 100
* requests/day/view: 10,000
Your application can make 50,000 requests per day by default. This can be extended, but it takes a while to get permission; when you are getting close to this limit (around 80%), it's best to request an extension.
Each user can make at most 100 requests per 100 seconds, which must be something that has gone up recently; last I knew it was only 10 requests per second. A user is denoted by IP address. There is no way to extend this quota beyond the max; you can't apply for it or pay for it.
Then there is the last quota, the one you asked about. You can make at most 10,000 requests per day to a view. This isn't just application-based: if a user runs my application and your application, then together we have only 10,000 requests that can be made. This quota is a pain, if you ask me. Now for the bad news: there is no way to extend this quota. You can't apply for it, you can't pay for it, and you can't beg the Google Analytics dev team (I have tried).
Answer: No, you can't extend the per-view per-day quota limit.
I am planning to develop a website which will allow registered users to view their analytics data from various sites like Google Analytics in one dashboard, somewhat similar to http://www.cyfe.com/ which provides an all-in-one dashboard.
I am thinking of two approaches to implement this application.
Approach #1: once the user logs in to my web application and requests data, my application would call the analytics website's API (e.g. the Google Analytics API) and display the response data.
Approach #2: run a job at a particular interval (say every 30 min) that retrieves analytics data for all registered users and saves it in my application database. When a user requests data, my application would display the data from the application database instead of sending a request to the analytics website.
Can anyone please suggest the pros/cons of each approach and which one is good to implement?
Remember, Google Analytics data isn't done processing for 24-48 hours, so requesting data every 30 minutes is overkill; the data won't be complete or accurate. Run your job once a day to get the data for two days ago.
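For example, computing the report date for "two days ago" when the daily job runs (a sketch; the ISO date string is the format the Core Reporting API accepts for `start-date`/`end-date`):

```python
from datetime import date, timedelta


def report_date(today: date) -> str:
    """GA data may take 24-48 hours to finish processing, so a daily
    export job should request the day before yesterday."""
    return (today - timedelta(days=2)).isoformat()
```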
The main problem you are going to have is the limit of 7 dimensions and 10 metrics per request. There is no primary key, so there is no way of linking the data from one request back to the data of another request.
Another issue is that a request can return at most 10,000 rows; depending upon how many rows your queries produce, you may end up making a large number of requests against the API, which will be hard on your quota.
You may also run into quota issues: you can make at most 10,000 requests per day to each profile. Once you have hit that quota, you will not be able to make any more requests against that profile until the next day. This quota cannot be extended.
You can also make at most 10 requests per second per user/profile. You can tweak this a little using quotaUser, but your application will not run very fast; it takes on average half a second for each request to return data. Things are going to take time, unless you run multiple instances of your extraction application, but again that requires tweaking quotaUser. This quota cannot be extended.
Your application can make at most 50,000 requests against the API per day across all profiles. Once you reach 80% of that quota, I recommend you apply for an extension; it can take a month or more to get it, so it is a good idea to plan ahead.
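To gauge how fast those quotas are consumed, a rough back-of-the-envelope estimate (the report sizes below are made up for illustration):

```python
import math

ROWS_PER_REQUEST = 10_000  # max rows returned per Core Reporting API request
VIEW_QUOTA = 10_000        # requests per view per day (cannot be extended)
APP_QUOTA = 50_000         # default requests per application per day


def requests_for(total_rows: int) -> int:
    """Paged requests needed to pull total_rows rows from one report."""
    return math.ceil(total_rows / ROWS_PER_REQUEST)


# Hypothetical daily export: 40 reports of 250,000 rows each from one view.
daily_requests = 40 * requests_for(250_000)
print(daily_requests)                # 1000
print(daily_requests <= VIEW_QUOTA)  # True
print(daily_requests <= APP_QUOTA)   # True
```

With numbers like these a daily export fits comfortably; the quotas bite once many users share a view or the dashboard triggers ad hoc queries on every page load.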
Note: I am the lead developer on a business intelligence application that exports data from Google Analytics into a data warehouse daily, and I have run into each of these issues. What you are planning is possible; you just need to understand the limitations of the Google Analytics API before you begin your development process.
Recently I was developing an application using the LinkedIn People Search API. The documentation says that a developer registration gets 100,000 (1 lakh) API calls per day, but when I registered for the API and ran a Python script, after some 300 calls it said the throttle limit was exceeded.
Has anyone faced this kind of issue using the LinkedIn API? Comments are appreciated.
Thanks in advance.
It's been a while, but the stats suggest people still look at this; I've been experimenting with the LinkedIn API and can provide some more detail.
The typical throttles are stated as both a maximum (e.g. 100K) and a per-user-token number (e.g. 500). Together, those numbers mean the application can make up to 100,000 calls per day to the API, but even as the developer, a single user token is limited to 500 per day.
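The interaction of those two caps can be expressed directly (100K and 500 are just the example figures above; your app's actual throttles may differ):

```python
def effective_daily_calls(app_limit: int, per_token_limit: int, tokens: int) -> int:
    """Each user token is capped individually; the application-wide cap
    applies across all tokens combined."""
    return min(app_limit, per_token_limit * tokens)


# A single developer token tops out at the per-token cap:
print(effective_daily_calls(100_000, 500, 1))    # 500
# It takes 200 active user tokens to saturate a 100K application throttle:
print(effective_daily_calls(100_000, 500, 200))  # 100000
```

That is why a lone Python script hits a throttle after a few hundred calls even though the headline number is 100K.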
I ran into this, and after setting up a barebones app and getting some users I can confirm a daily throttle of several thousand API calls. [Deleted discussion of what was probably, upon further consideration, an accidental back door in the LinkedIn API.]
As per the Throttle Limits published by LinkedIn:
LinkedIn API keys are throttled by default. The throttles are designed
to ensure maximum performance for all developers and to protect the
user experience of all users on LinkedIn.
There are three types of throttles applied to all API keys:
Application throttles: These throttles limit the number of each API call your application can make using its API key.
User throttles: These throttles limit the number of calls for any individual user of your application. User-level throttles serve
several purposes, but in general are implemented where there is a
significant potential impact to the user experience for LinkedIn
users.
Developer throttles: For people listed as developers on their API keys, they will see user throttles that are approximately four times
higher than the user throttles for most calls. This gives you extra
capacity to build and test your application. Be aware that the
developer throttles give you higher throttle limits as a developer of
your application. But your users will experience the User throttle
limits, which are lower. Take care to make sure that your application
functions correctly with the User throttle limits, not just for the
throttle limits for your usage as a developer.
Note: To view current API usage of your application and to ensure you haven't hit any throttle limits, visit
https://www.linkedin.com/developer/apps and click on "Usage & Limits".
The throttle limit for individual users of People Search is 100, with 400 being the limit for the person who is associated with the application as the developer:
https://developer.linkedin.com/documents/throttle-limits
When you run into a limit, view the API usage on the application page to see which throttle you are hitting.