Remove the calcHistogram 6 calls per minute limit - microsoft-cognitive

Is there a way to request an increase to the 6-per-minute limit for the calcHistogram method?
This limitation currently doesn't allow for any kind of development or testing, let alone a production environment.
At the moment I don't see any subscription option, upgrade path, or way to request removal of the limit.

The Academic Knowledge API is still in preview and is subject to the transaction limitations on calcHistogram. We are in the process of adding a new tier with higher transaction limits, and these should be online in the next couple of months.

Related

Firebase Error: The project cannot be created because you have exceeded your allotted project quota [duplicate]

I have created a total of 30 projects in the Google Developers Console, including 23 between 12/23 and 12/27. Most recently, 3 projects were created on 12/27. When I tried to create a 4th project on 12/27, I got the message "You have exceeded the quota for project creations per day". It has now been well over 24 hours since then, and I still cannot add new projects.
This same question has gone unanswered (at least) here, here and here.
Is it possible that nobody actually knows what the quota is? Since I have waited more than 24 hours after receiving the message before adding any new projects, and only 3 were added in the 24 hour period prior to that, it would appear that I have triggered something that has longer lasting impact, and isn't just a limit for the current day.
I can live with a ~20 per day limit, but not with a maximum of ~30 projects. Is there another account type I need to have with Google? Does anyone have experience in getting past this or in contacting Google directly for assistance?
So I ran into this a few months ago. At the time I did not see a solution. However, when I tried again today, the error notice had been updated with a link to a form where you can request an increase of the project quota for your account. This is likely an ongoing rollout at the moment (I don't see it in all my accounts).
You may not see this updated message yet. In any case, the form link is https://support.google.com/code/contact/project_quota_increase. After filling in the form, support got in touch within about a day, and my quota is now raised.
This may not be helpful to everyone with this problem, but here's what worked for our needs...
We were creating new projects in the Developers Console for each app so that each would have its own unique sender key for push notifications. What we discovered is that all apps can have the same sender key, as long as the push notification server (AWS SNS in our case) can differentiate the delivery targets by another means, which we're doing by bundle ID when devices register for push. In the end, we didn't need to have unique projects for each app.
It doesn't answer the questions of what the limits are or how to get past them once they've been reached, but it provided the means to the end we were seeking.

Can the Google Calendar API events watch be used without risking to exceed the usage quotas?

I am using the Google Calendar API to preprocess events that are being added (adjust their content depending on certain values they may contain). This means that theoretically I need to update any number of events at any given time, depending on how many are created.
The Google Calendar API has usage quotas, especially one stating a maximum of 500 operations per 100 seconds.
To tackle this I am using a time-based trigger (every 2 minutes) that does up to 500 operations (and only updates sync tokens when all events are processed). The downside of this approach is that I have to run a check every 2 minutes, whether or not anything has actually changed.
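The batch-under-quota approach described above can be sketched roughly as follows. This is a hypothetical outline, not real Calendar API code: `fetch_changed_events`, `update_event`, and `save_sync_token` are stand-ins for the actual API calls, and the key idea is that the sync token is only persisted once every pending event has been processed.

```python
MAX_OPS = 500          # Calendar API quota: 500 operations per 100 seconds
CHECK_INTERVAL = 120   # the time-based trigger fires every 2 minutes

def process_batch(fetch_changed_events, update_event, save_sync_token):
    """Process up to MAX_OPS changed events per run; only persist the
    sync token when every pending event has been handled."""
    events, next_token = fetch_changed_events()
    if len(events) > MAX_OPS:
        # Too many changes for one window: process a slice now and keep
        # the old sync token so the remainder is retried on the next run.
        for event in events[:MAX_OPS]:
            update_event(event)
        return False  # not done yet
    for event in events:
        update_event(event)
    save_sync_token(next_token)  # all events processed
    return True
```

Returning `False` signals the trigger that there is still work queued, which is why the token must not advance in that branch.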
I would like to replace the time-based trigger with a watch. I'm not sure though if there is any way to limit the amount of watch calls so that I can ensure the 100 seconds quota is not exceeded.
My research so far shows me that it cannot be done. I'm hoping I'm wrong. Any ideas on how this can be solved?
AFAIK, that is one of the best practices suggested by Google. Using watch and push notifications allows you to eliminate the extra network and compute costs involved with polling resources to determine whether they have changed. Here are some tips from this blog for working within the quota:
Use push notifications instead of polling.
If you cannot avoid polling, make sure you only poll when necessary (for example, poll very seldom at night).
Use incremental synchronization with sync tokens for all collections instead of repeatedly retrieving all the entries.
Increase page size to retrieve more data at once by using the maxResults parameter.
Update events when they change, avoid re-creating all the events on every sync.
Use exponential backoff for error retries.
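The last tip, exponential backoff, can be sketched as a small generic wrapper; this is an illustrative pattern, not code from any Google client library.

```python
import random
import time

def with_backoff(call, max_retries=5):
    """Retry `call` with exponential backoff plus jitter, the pattern
    commonly recommended for rate-limit (403/429) errors. `call` is any
    zero-argument function that raises on a retryable failure."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error
            # 1s, 2s, 4s, 8s ... plus up to 1s of random jitter
            time.sleep(2 ** attempt + random.random())
```

The jitter spreads retries from concurrent clients so they don't all hammer the API at the same instant.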
Also, if you cannot avoid exceeding your current limit, you can always request additional quota.

API rate limits on calling getSyncChunk()

Given that Evernote doesn't publish its exact API rate limits (at least I can't find them), I'd like to ask for some guidance on their usage.
I'm creating an application that will sync the user's notes and store them locally. I'm using getFilteredSyncChunk to do this.
I'd like to know how often I can make this API call without hitting the limits. I understand that the limits are on a per-user basis, so would it be acceptable to call this every 5 minutes to get the latest notes?
TIA
The rate limit is on a per API key basis. You'll be okay calling getFilteredSyncChunk every five minutes, although it's a little more efficient to call getSyncState instead.
In case you haven't seen it yet, check out this guide for info on sync (accessible from this page).
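The cheap-poll pattern suggested in the answer can be sketched like this. It is a rough outline only: `get_sync_state` and `get_filtered_sync_chunk` are stand-ins for the real Evernote NoteStore methods, and the point is that the lightweight state check gates the heavier chunk fetch.

```python
import time

POLL_INTERVAL = 300  # five minutes, as discussed above

def sync_loop(get_sync_state, get_filtered_sync_chunk, last_update_count=0):
    """Poll the cheap state call first and only fetch sync chunks when
    the server's updateCount shows something actually changed."""
    while True:
        state = get_sync_state()
        if state["updateCount"] > last_update_count:
            chunk = get_filtered_sync_chunk(last_update_count)
            last_update_count = state["updateCount"]
            yield chunk  # hand the new data to the caller
        time.sleep(POLL_INTERVAL)
```

Most polls then cost only the single state call, keeping well under a per-key limit.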

Overcome Marketo's quota limits

As far as I know, Marketo limits the number of REST API requests to 10,000 per day. Is there a way to overcome this limit? Can I pay and get more of those?
I found out that the REST API requests and the SOAP API requests counts separately but I'm trying to find a solution that is limited to REST API.
Moreover, in order to get an access token I need to sacrifice a request. I need to know how long this access token stays alive in order to save as many requests as possible.
You can increase your limit just by asking your account manager. It costs about $15K per year to increase your limit by 10K API calls.
Here are the default limits in case you don't have them yet:
Default Daily API Quota: 10,000 API calls (counter resets daily at 12:00 AM CST)
Rate Limit: 100 API calls in a 20 second window
Documentation: REST API
You'll want to ask your Marketo account manager about this.
I thought I would update this with some more information since I get this question a lot:
http://developers.marketo.com/rest-api/
Daily Quota: Most subscriptions are allocated 10,000 API calls per day (which resets daily at 12:00AM CST).  You can increase your daily quota through your account manager.
Rate Limit: API access per instance limited to 100 calls per 20 seconds.
Concurrency Limit:  Maximum of 10 concurrent API calls.
For the Daily limit:
Option 1: Call your account manager. This will cost you $'s. For a client I work for we have negotiated a much higher limit.
Option 2: Store and Batch your records. For example, you can send a batch of 300 leads in a single lead insert/update call. Which means you can insert/update 3,000,000 leads per day.
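The arithmetic behind Option 2 is easy to check with a small batching helper; the 300-records-per-call figure comes from the answer above, and the helper itself is just an illustrative sketch.

```python
BATCH_SIZE = 300  # max records per lead insert/update call, per the answer

def batched(records, size=BATCH_SIZE):
    """Split records into maximum-sized batches so each API call
    carries the largest allowed payload."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

# 10,000 calls/day x 300 leads/call = 3,000,000 leads/day
```

For example, 750 leads become 3 calls instead of 750 single-record calls.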
For the Rate limit:
Option 1 will probably not work. Your account manager will be reluctant to change this unless you are a very large company.
Option 2: You need to add some governance to your code. There are several ways to do this, including queues, timers with a counter, etc. If you make multi-threaded calls, you will need to take into account concurrency etc.
Concurrent call limit:
You have to limit your concurrent threads to 10.
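The governance mentioned above (a timer with a counter for the rate limit, plus a cap on concurrent threads) can be sketched with a sliding window and a semaphore. This is a client-side outline under the limits quoted in the answer, not anything from a Marketo SDK.

```python
import threading
import time
from collections import deque

class MarketoThrottle:
    """Client-side governance for the published limits: at most
    100 calls per 20-second window and 10 concurrent calls.
    Wrap every outgoing request in acquire()/release()."""

    def __init__(self, max_calls=100, window=20.0, max_concurrent=10):
        self.max_calls = max_calls
        self.window = window
        self.timestamps = deque()              # times of recent calls
        self.lock = threading.Lock()
        self.slots = threading.Semaphore(max_concurrent)

    def acquire(self):
        self.slots.acquire()                   # enforce concurrency limit
        while True:
            with self.lock:
                now = time.monotonic()
                while self.timestamps and now - self.timestamps[0] > self.window:
                    self.timestamps.popleft()  # drop calls outside the window
                if len(self.timestamps) < self.max_calls:
                    self.timestamps.append(now)
                    return
                wait = self.window - (now - self.timestamps[0])
            time.sleep(wait)                   # window full: back off

    def release(self):
        self.slots.release()
```

A production version would also want a timeout and error handling, but the two mechanisms (window counter, semaphore) map directly onto the two limits.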
There are multiple ways to handle API Quota limits.
If you want to avoid hitting the API limit altogether, try to achieve your functionality through Marketo Webhooks. Marketo webhooks have no API limits, but they have their own cons. Please research this.
You may use the REST API, but design your strategy to batch the maximum number of records into a single payload instead of smaller chunks; e.g. rather than sending 10 different API calls of 20 records each, accumulate the maximum allowed payload and call the Marketo API once.
The access token is valid for 1 hour after authenticating.
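Since the token lives about an hour and each token request costs a call, it is worth caching it rather than re-authenticating per request. A minimal sketch, where `fetch_token` is a hypothetical callable hitting the identity endpoint and returning `(token, expires_in_seconds)`:

```python
import time

class TokenCache:
    """Cache the access token for its ~1 hour lifetime so the token
    request itself doesn't eat into the daily quota."""

    def __init__(self, fetch_token, margin=60):
        self.fetch_token = fetch_token
        self.margin = margin          # refresh a minute early, to be safe
        self.token = None
        self.expires_at = 0.0

    def get(self):
        if self.token is None or time.monotonic() >= self.expires_at:
            self.token, ttl = self.fetch_token()
            self.expires_at = time.monotonic() + ttl - self.margin
        return self.token
```

Every call site asks the cache for a token; only one authentication request per hour actually goes out.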
Marketo's Bulk API can be helpful with regard to rate limiting: once you have the raw activities, the updates etc. on the lead object can be done without pinging Marketo for each lead (http://developers.marketo.com/rest-api/bulk-extract/). However, be aware of the export limits you may run into when bulk-exporting leads plus activities. Currently, Marketo only counts the size of an export against the limit once the job has completed, which means that as a workaround you can launch a maximum of 2 concurrent export jobs (which together sum to more than the limit) at the same time. Marketo will not kill a running job when the limit is reached, as long as the job was launched before the limit was hit.
Marketo has recently upgraded the maximum limit:
Daily Quota: Subscriptions are allocated 50,000 API calls per day (which resets daily at 12:00AM CST). You can increase your daily quota through your account manager.
Rate Limit: API access per instance limited to 100 calls per 20 seconds.
Concurrency Limit: Maximum of 10 concurrent API calls.
https://developers.marketo.com/rest-api/

What are the Best Practices For SQL Inserts on Large Scale in reference to ad impressions?

I am working on a site where I will need to be able to track ad impressions. My environment is ASP.Net with IIS using a SQL Server DBMS, and potentially Memcached so that there are not as many trips to the database. I must also think about scalability, as I am hoping that this application becomes a global phenomenon (keeping my fingers crossed and working my ass off)! So here is the situation:
My Customers will pay X amount for Y Ad impressions
These ad impressions (right now, only text ads) will then be shown on a specific page.
The page is served from Memcached, lessening the trips to the DB
When the ad is shown, there needs to be a "+1" tick added to the impression count for the database
So the dilemma is this: I need to be able to add that "+1" tick mark to each ad impression counter BUT I cannot run that SQL statement every time that ad is loaded. I need to somehow store that "+1" impression count in the session (or elsewhere) and then run a batch every X minutes, hours, or day.
Please keep in mind that scalability is a huge factor here. Any advice that you all have would be greatly appreciated.
I've seen projects deal with this by deploying SQL Server Express edition on each web farm server and relying on Service Broker to deliver the tracking audit to the central server. Because each server in the farm updates a local SQL instance, they can scale as high as the sky. Service Broker assures near-real-time, reliable delivery. I've seen web farms handle 300-400 requests per second on average, 24x7 over a long time, with the queued nature of Service Broker able to absorb spikes of 5000-7500 hits per second for hours on end and recover in reasonable time, without audit loss and with the backlog staying under control.
If you really expect to scale and become the next MySpace, then you should learn from how they do it, and queue based asynchronous, decoupled processing is the name of the game.
Something you can do is increment the counts into a less permanent store, and periodically (every 1 minute, 5 minutes, hour..) sync into your more reliable database.
If your temporary store goes down, you lose the hit counts for some short period of time; the net effect of this is that in a rare event of a malfunction, the people paying for ads get some free impressions.
You can send an "atomically increment this value by +1" command to memcache. You could also do something like write a line to a flat file every time an ad is displayed, and have your "every 5 minutes" sync job rotate the log, then count all the lines in the file that was just rotated out.
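The buffer-and-flush idea running through these answers can be sketched as follows. This is a language-neutral illustration (the question's stack is ASP.Net), with an in-memory counter standing in for memcached's atomic increment and a hypothetical `write_to_db` callable standing in for the batched SQL UPDATE.

```python
import threading
from collections import Counter

class ImpressionBuffer:
    """Buffer '+1' impression ticks in memory and flush them to the
    database as one batched write per flush interval, instead of one
    SQL statement per ad view."""

    def __init__(self, write_to_db):
        self.counts = Counter()
        self.lock = threading.Lock()
        self.write_to_db = write_to_db  # takes {ad_id: count}

    def tick(self, ad_id):
        with self.lock:
            self.counts[ad_id] += 1     # cheap in-memory increment

    def flush(self):
        with self.lock:
            pending, self.counts = dict(self.counts), Counter()
        if pending:
            self.write_to_db(pending)   # one batched write, not N
```

The trade-off matches the answer above: if the process dies between flushes, a few minutes of ticks are lost, which here just means some free impressions for advertisers.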