I currently use Here API v7 and get a free 250k transaction allowance per month which I use for getting route directions.
I see that v7 is now discontinued and I need to migrate to v8.
Looking at their website, there is no mention of the 250k transactions anymore, just a daily 1k allowance, so presumably this has changed.
Does anyone know whether, if I switch to v8 whilst keeping my existing account, I will still have the 250k allowance, or whether it will drop to the new one? It's quite a big difference.
Yes, if you continue to use your existing developer portal account, you will still get a free 250k transaction allowance per month across all location services, including the Routing v8 API. The 1k daily allowance applies to accounts created on the newer HERE platform that subscribe to the Limited Plan.
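For the migration itself, the request shape changes between v7 and v8. As a minimal sketch, here is how a Routing v8 request URL can be assembled; the endpoint and parameter names reflect the v8 routing API as I understand it, so verify them against HERE's current documentation before relying on this:

```python
from urllib.parse import urlencode

def routing_v8_url(api_key: str, origin: str, destination: str) -> str:
    """Build a HERE Routing v8 request URL for a simple car route.

    origin/destination are "lat,lng" strings; only a route summary is requested.
    """
    base = "https://router.hereapi.com/v8/routes"
    params = {
        "transportMode": "car",
        "origin": origin,
        "destination": destination,
        "return": "summary",
        "apiKey": api_key,  # each successful call counts against the allowance
    }
    return f"{base}?{urlencode(params)}"
```

Each routing call made this way is one transaction against whichever allowance applies to your account.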
I have an app that has hundreds of users and connects to Evernote. As I gain more users, I make more requests to Evernote, which is causing a lot of rate limiting and frustration for my users. Is there a way to get my current limit increased by Evernote?
I have fixed a lot of inefficient calls I used to make, but we still have the same issue.
Rate limits are applied to calls against the Evernote API on a per API key, per user, per time period basis. This means that the API limits the number of calls a third-party app can make for each individual user during a given one-hour period. [source]
The number of users of your application is irrelevant. The source for that quote details a number of reasons and fixes.
If you've optimised your code fully, this may be a "special case". You should contact Evernote developer support.
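One mitigation worth having regardless: when Evernote rejects a call for a user, it reports how long until that user's limit resets, so the app can back off instead of hammering the API. This is a hedged sketch; `EDAMSystemException` and `rateLimitDuration` are the names I recall from the Evernote SDK, but the exception class here is a stand-in, not the real import:

```python
import time

class EDAMSystemException(Exception):
    """Stand-in for the Evernote SDK's rate-limit exception."""
    def __init__(self, rate_limit_duration):
        super().__init__("rate limit reached")
        # seconds until this user's hourly limit resets
        self.rateLimitDuration = rate_limit_duration

def with_rate_limit_retry(call, retries=1):
    """Run `call`, sleeping out the advertised cooldown if rate limited."""
    for _ in range(retries + 1):
        try:
            return call()
        except EDAMSystemException as e:
            time.sleep(e.rateLimitDuration)
    raise RuntimeError("still rate limited after retrying")
```

Since limits are per user, one rate-limited user backing off this way does not affect your other users.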
I have been playing around with Google Datastore and am thinking of using it in production. My concern is that because it autoscales and you pay per query (after the free tier), a surge in traffic also means an increased bill. Is there a way to limit how much Google Datastore scales? I would rather have users experience a slow site than get a huge bill.
And before anyone suggests setting a budget: I don't want to shut down the site, just have it slower.
Based on the Pricing and Quota documentation, Google Cloud Datastore charges for stored data, entity reads, entity writes, entity deletes and small operations. It does not charge for autoscaling. This means that whether requests are served fast or slow, you are accessing the data either way, so you will be billed either way.
e.g. currently the price is $0.036 per 100,000 entity read operations per day. This means that if during the day your users have already read 50,000 entities in total (the daily free quota), you will be billed $0.036 for the next 100,000 entity read operations that day.
The only way to limit this is to avoid any further read operations for that day, which would make your application unusable.
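The arithmetic above can be sketched as a small cost estimator. This assumes the figures quoted in the answer (a 50,000-reads/day free tier and $0.036 per 100,000 billable reads); check the current pricing page before budgeting with it:

```python
FREE_READS_PER_DAY = 50_000          # daily free quota assumed from the answer
PRICE_PER_100K_READS = 0.036         # USD, price assumed from the answer

def daily_read_cost(reads: int) -> float:
    """Estimated USD cost for `reads` entity read operations in one day."""
    billable = max(0, reads - FREE_READS_PER_DAY)
    return billable / 100_000 * PRICE_PER_100K_READS
```

So a day with 150,000 reads would cost about $0.036, and anything under 50,000 reads is free.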
Does anyone know if the "Out of call volume quota" error applies exclusively to free-trial users, and whether, if we subscribe to the monthly plan, there will be no limit on the number of calls to the Microsoft Face API?
I would also like to know: since the API can take 10 requests per second on a paid key, does that mean that by making requests from different processes simultaneously, the total processing time can be shortened?
Thank you
From the pricing page, the limit for the free trial is 30,000 API calls a month, and that limit is removed on the paid tier. The standard paid tier has a maximum throughput of 10 transactions per second, which could come, for example, from multiple apps all submitting calls at the same time. If you need higher volume, please reach out to the team via the "contact us" link at the bottom of the page at www.microsoft.com/cognitive
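To answer the parallelism question concretely: yes, fanning requests out across workers shortens wall-clock time, as long as the combined rate stays under the 10 TPS cap. A minimal sketch, with the actual HTTP call stubbed out (the pacing logic is the point, not a real Face API client):

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

class RateLimiter:
    """Allow at most `rate` acquisitions per second (simple even pacing)."""
    def __init__(self, rate: float):
        self.interval = 1.0 / rate
        self.lock = threading.Lock()
        self.next_time = time.monotonic()

    def acquire(self):
        with self.lock:
            now = time.monotonic()
            wait = self.next_time - now
            self.next_time = max(now, self.next_time) + self.interval
        if wait > 0:
            time.sleep(wait)

limiter = RateLimiter(rate=10)  # standard-tier cap from the answer above

def call_face_api(image_url: str) -> str:
    limiter.acquire()
    # a real requests.post(...) to the Face endpoint would go here
    return f"processed {image_url}"

with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(call_face_api, [f"img{i}.jpg" for i in range(5)]))
```

With 10 workers sharing one limiter, calls overlap in flight but never exceed 10 starts per second, so you get the throughput benefit without tripping the throttle.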
After reading Ryan's answer I tried to determine from Microsoft how much it would cost to increase the transaction limit.
As of today, Microsoft Inside Sales confirmed that it is not possible.
We are stymied, and so are considering other cloud services with a higher rate limit or none at all, such as Kairos (https://www.kairos.com/pricing)
We are using the Core Reporting API and Real Time Reporting API.
The API is limited to 10,000 requests/day per view, but I want to increase this.
Is this possible?
If so, please let me know how we can increase the quota limits, and what the price would be for 20,000 requests/day.
There are several quotas for the Google Analytics APIs and Google APIs in general.
- requests/day: 50,000
- requests/100 seconds/user: 100
- requests/day/view: 10,000
Your application can make 50,000 requests per day by default. This can be extended, but it takes a while to get permission, so when you are getting close to this limit (around 80%) it's best to request an extension at that time.
Each user can make at most 100 requests per 100 seconds, which must have gone up recently; last I knew it was only 10 requests a second. A user is denoted by IP address. There is no way to extend this quota beyond the maximum: you can't apply for it or pay for it.
Then there is the last quota, the one you asked about. You can make at most 10,000 requests a day to a view. This isn't per application: if the user runs my application and your application, then together we share that one 10,000-request budget. This quota is a pain, if you ask me. Now for the bad news: there is no way to extend this quota. You can't apply for it, you can't pay for it, and you can't beg the Google Analytics dev team (I have tried).
Answer: no, you can't extend the per-view, per-day quota limit.
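Since the per-view quota is a hard wall shared by all applications touching that view, the best you can do is count your own calls and stop cleanly before hitting it. A minimal client-side sketch (the view IDs are placeholders, and this only tracks your app's share of the shared budget):

```python
from collections import defaultdict
from datetime import date

class ViewQuotaTracker:
    """Refuse requests once a view's daily budget is spent."""
    DAILY_LIMIT = 10_000  # the per-view/day quota described above

    def __init__(self):
        self.counts = defaultdict(int)
        self.day = date.today()

    def try_request(self, view_id: str) -> bool:
        if date.today() != self.day:   # new day: the quota has reset
            self.counts.clear()
            self.day = date.today()
        if self.counts[view_id] >= self.DAILY_LIMIT:
            return False               # budget spent; try again tomorrow
        self.counts[view_id] += 1
        return True
```

When `try_request` returns False, queue the work for the next day rather than issuing the call and burning a guaranteed quota error.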
I am planning to develop a website which will allow registered users to view their analytics data from various sites, like Google Analytics, in one dashboard, somewhat similar to http://www.cyfe.com/ which provides an all-in-one dashboard.
I am thinking of two approaches to implement this application.
Approach #1: once the user logs in to my web application and requests data, my application makes a call to the analytics site using its API (e.g. the Google Analytics API) and displays the response data.
Approach #2: run a job at a fixed interval (say every 30 minutes) which retrieves analytics data for all registered users and saves it in my application database. When a user requests data, my application displays the data from the application database instead of sending a request to the analytics site.
Can anyone please suggest the pros/cons of each approach, and which one is better to implement?
Remember, Google Analytics data isn't done processing for 24-48 hours, so requesting data every 30 minutes is overkill: the data won't be complete or accurate. Run your job once a day to get the data for two days ago.
The main problem you are going to have is the limit of 7 dimensions and 10 metrics per request. There is no primary key, so there is no way to link the data from one request back to the data of another request.
Another issue is that you can return at most 10,000 rows per request. Depending on how many rows a report has, you may end up making a large number of requests against the API, which will be hard on your quota.
You may also run into quota issues: you can make at most 10,000 requests to each profile per day. Once you have hit that quota, you will not be able to make any more requests against that profile until the next day. This quota cannot be extended.
You can also make at most 10 requests a second per user/profile. You can tweak this a little using quotaUser, but your application will not run very fast: it takes on average half a second for each request to return data. Things are going to take time unless you run multiple instances of your extraction application, but again that requires you to tweak quotaUser. This quota cannot be extended.
Your application can make at most 50,000 requests against the API per day across all profiles. Once you reach 80% of that quota, I recommend you apply for an extension; it can take a month or more to be granted, so it is a good idea to plan ahead.
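Putting the timing and row-limit points together, a daily Approach #2 job ends up as a paging loop. A hedged sketch, where `run_report` stands in for the real client-library call (its parameter names here are assumptions modelled on the v3-style `start-index`/`max-results` paging, not the actual API):

```python
from datetime import date, timedelta

MAX_ROWS = 10_000  # per-request row cap noted above

def fetch_all_rows(run_report, view_id: str):
    """Page through one day's report, 10k rows at a time.

    Targets the day two days ago, since fresher data may still be processing.
    """
    day = (date.today() - timedelta(days=2)).isoformat()
    rows, start = [], 1
    while True:
        batch = run_report(view_id=view_id, start_date=day, end_date=day,
                           start_index=start, max_results=MAX_ROWS)
        rows.extend(batch)
        if len(batch) < MAX_ROWS:   # short page means the report is exhausted
            return rows
        start += MAX_ROWS
```

Every iteration of that loop is one request against the per-view, per-user and per-application quotas above, which is why wide reports get expensive fast.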
Note: I am the lead developer on a business intelligence application that exports data from Google Analytics into a data warehouse daily, and I have run into each of these issues. What you are planning is possible; you just need to understand the limitations of the Google Analytics API before you begin your development process.