How can we increase the rate limit of the Google DLP API - google-cloud-dlp

The default rate limit of 600 requests/min is very low for our application, where we are trying to process millions of records through a Spark job.
Even with a cluster of 16 nodes with 4 cores each, we are hitting the rate limit.
We plan to have 25 such jobs running in parallel. Can you please suggest how the rate limit can be increased to something like 20k per minute?

According to this official GCP documentation:
You can edit your quotas up to their maximum values by selecting Edit Quotas from the Quotas page of the Google Cloud Dashboard. To request an increase in quota, edit your quota with your requested increase and justification and submit your update. You are notified when your request is received. You might be contacted for more information regarding your request. After your request is reviewed, you are notified whether it has been approved or denied.

https://cloud.google.com/docs/quota is the source documentation for Google Cloud quotas.
Once you submit your quota request, we'll send you some follow-up questions to help us calculate the capacity needed to grant it, so keep an eye out for that email.
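
While a quota increase request is pending, the job can at least degrade gracefully instead of failing outright. Below is a minimal sketch, assuming the Python google-cloud-dlp client, of wrapping inspect calls in exponential backoff so each Spark worker waits out the 600 req/min ceiling; the project ID and info types are placeholders.

    # Minimal sketch: retry DLP inspect calls with exponential backoff
    # when the per-minute quota is exhausted.
    from google.api_core import exceptions, retry
    from google.cloud import dlp_v2

    PROJECT_ID = "your-project-id"  # placeholder
    client = dlp_v2.DlpServiceClient()

    # Retry only on rate-limit errors, backing off 1 s -> 60 s,
    # giving up after 300 s in total.
    backoff = retry.Retry(
        predicate=retry.if_exception_type(exceptions.ResourceExhausted),
        initial=1.0, maximum=60.0, multiplier=2.0, timeout=300.0,
    )

    def inspect_record(text):
        request = {
            "parent": f"projects/{PROJECT_ID}",
            "item": {"value": text},
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        }
        return backoff(client.inspect_content)(request=request)

Batching also helps: a single inspect_content request can carry a table of rows rather than one value, which cuts the number of requests per record considerably.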

Related

Sabre Concurrent request limit

Does anyone know how many concurrent requests per second can be made to the Sabre developer Bargain Finder Max API?
I have searched everywhere, and all I can find is "contact your rep to have it increased."
Our setup included 50 sessions, but that was not a hard limit. We were also able to extend it to 150 at no cost. As far as I know, there is no limit on concurrent requests, but depending on what calls you make, you should keep your "look-to-book ratio" (usually 500:1) in mind.
I believe they are usually sold in bundles of 50 sessions per EPR but can be increased. I do not believe there is any way of viewing that information about a given EPR in any kind of tool, so your rep is the best place to go for that kind of information. Pricing depends on all sorts of contractual items, I believe.
There is a certain limit on the number of concurrent requests for the API. If it is exceeded, you get the following error: USG_CONNECTOR_IS_BUSY. This means that the maximum number of concurrent requests for the API has been exceeded. Contact your Sabre account manager to determine or increase your allocated concurrent-request limit for this API. In the meantime, wait at least 500 milliseconds and resend the request.
The allocations are usually increased in bundles of 50. Your Sabre account manager would be the best person who can provide specific details.
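
To illustrate the retry advice above, here is a rough sketch of waiting at least 500 ms and resending when USG_CONNECTOR_IS_BUSY comes back. The endpoint URL is a placeholder and the payload handling is simplified; real Bargain Finder Max calls are SOAP requests with session headers.

    # Hypothetical sketch: resend after >= 500 ms on USG_CONNECTOR_IS_BUSY.
    import time
    import requests

    SABRE_ENDPOINT = "https://example.invalid/sabre-bfm"  # placeholder

    def send_with_busy_retry(payload, max_attempts=5):
        for attempt in range(max_attempts):
            response = requests.post(
                SABRE_ENDPOINT, data=payload,
                headers={"Content-Type": "text/xml"})
            if "USG_CONNECTOR_IS_BUSY" not in response.text:
                return response.text
            # Concurrent-request limit hit: back off before resending.
            time.sleep(0.5 * (attempt + 1))
        raise RuntimeError(f"still busy after {max_attempts} attempts")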
#PCM7,
There is no specific limit, as this depends on the commercial agreement between the travel agency and Sabre; the token generated for BFM is managed directly at Sabre.
In the agency where I work, we serve several tourism companies, and each one has a specific amount of TPS for BFM.
This information can be obtained directly from the Sabre account executive for the travel agency in question.
https://developer.sabre.com/docs/soap_apis/air/search/bargain_finder_max

How to prevent throttling with Firestore when executing a high volume of reads?

I am a Google Blaze plan user, and I have an Express server containing a simple endpoint that just pulls from Firestore. During high-traffic hours, I can receive 5,000+ simultaneous read requests, which eventually throws the error below:
Error: 8 RESOURCE_EXHAUSTED: Quota exceeded.
After I wait a few minutes I am able to read the collection again.
Update:
Unsure why the downvotes without any explanations... but I also have a mutex system which I think may be leading to hitting these limits. If it fails to lock a document using transactions, it goes down an array of snapshots until a lock is acquired. If the array becomes empty, it does another read to Firebase for another set of N documents, which is only 50 in my case.
So my question is: is there a limit on the number of transactions or reads per second that we're allowed to do from a single connection (my Express server)? I don't think it's stated anywhere in the documentation.
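For context, a rough sketch of the mutex pattern described above might look like the following; this is in Python rather than the Express server's Node.js, and the "locked_by" field is a hypothetical name. Each worker tries to claim a document inside a transaction and falls through the array of candidates.

    # Hedged sketch of the document-lock pattern: claim a document in a
    # transaction, or fall through to the next candidate.
    from google.cloud import firestore

    db = firestore.Client()

    @firestore.transactional
    def try_lock(transaction, doc_ref, worker_id):
        # Assumes the candidate documents already exist.
        snapshot = doc_ref.get(transaction=transaction)
        data = snapshot.to_dict() or {}
        if data.get("locked_by"):
            return False  # someone else holds the lock
        transaction.update(doc_ref, {"locked_by": worker_id})
        return True

    def acquire_any(candidates, worker_id):
        for doc_ref in candidates:
            if try_lock(db.transaction(), doc_ref, worker_id):
                return doc_ref
        return None  # empty array: caller fetches another batch of N docs

Note that each lock attempt costs at least one read (plus a write on success), so under contention a worker walking a long candidate array multiplies reads quickly, which fits the RESOURCE_EXHAUSTED symptom.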
It looks like you're reaching one of the read/write/transaction limits stated on this page.
It might be this one: "Maximum writes per second per database: 10,000 (up to 10 MiB per second)", but I'm only guessing...
To answer your question: according to this link, the maximum number of concurrent connections for mobile/web clients per Firebase database is 1,000,000, so your connections do not seem to exceed the limits.
For a Blaze plan project, the limit for Cloud Firestore document reads is 50K/day, since the free usage from the Spark plan is included in the Blaze plan. That limit applies unless you have set a budget limit in your billing account. Usage resets at midnight PST. If you upgrade to the Flame plan, the limit is 250K/day.
Here you may read about the official Cloud Firestore quotas and limits, such as maximum document reads and maximum document size, which can be useful. Furthermore, you can monitor your database usage and check your plan's limits from the "Usage" tab in the Firebase console. You can check usage over the current billing period, the last 30 days, or the last 24 hours.
Stackdriver Monitoring is also a practical tool for monitoring document reads/writes/deletes, active connections and snapshot listeners.
A good practice, if you want to avoid unexpected charges on your billing account, would be to create an alerting policy based on the Cloud Firestore metrics, as stated here.
Additionally, you can estimate and verify your monthly costs on the “Blaze Plan” by using this Blaze Plan calculator.
For anyone who runs into this issue in the future, please check your App Engine budget settings under "Application Settings". I set the daily spending limit to avoid unnecessary charges during testing and it slipped my mind. I increased the budget and the error is currently gone.
AWS usually sends me an email when my budget has been exceeded.

Can we increase the number of topics and subscriptions in a Google console project?

I am using Google Cloud Pub/Sub to receive push notifications for any Gmail user by setting a watch on the user's mailbox, and I am able to do that. I am creating one topic per user in my Google console project, and one subscription per topic. As far as I know, we cannot create more than 10,000 topics and subscriptions in one GCP project. My question is: can this limit be increased? I want to increase it to 100K, or even 10 million. Is that possible? If yes, kindly suggest some resources or references where I can read about it.
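For reference, the per-user setup described in the question roughly looks like the sketch below, assuming the google-cloud-pubsub and google-api-python-client libraries; the project ID and naming scheme are placeholders, and the topic must also grant publish rights to gmail-api-push@system.gserviceaccount.com for the watch to deliver notifications.

    # Sketch: one topic and one subscription per user, then a Gmail watch.
    from google.cloud import pubsub_v1
    from googleapiclient.discovery import build

    PROJECT_ID = "your-project-id"  # placeholder

    def watch_user(creds, user_key):
        publisher = pubsub_v1.PublisherClient()
        subscriber = pubsub_v1.SubscriberClient()
        topic = publisher.topic_path(PROJECT_ID, f"gmail-{user_key}")
        sub = subscriber.subscription_path(PROJECT_ID, f"gmail-{user_key}")
        publisher.create_topic(request={"name": topic})
        subscriber.create_subscription(request={"name": sub, "topic": topic})
        gmail = build("gmail", "v1", credentials=creds)
        return gmail.users().watch(
            userId="me", body={"topicName": topic}).execute()

With this one-topic-per-user scheme, the 10,000-topic quota caps the number of watched mailboxes per project.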
At this time, the 10,000 topics and subscriptions limit cannot be increased. It is a dimension we'd like to scale up at some point, but for now, this is not a quota for which one can request an increase.

Google calendar API service account rate limit

I have a Google service account set up for the Calendar API, but it seems as though I can only make 5 requests per second. I've only figured that out through trial and error; there are no per-second rate limit settings in my developer console.
I have 'queries per day' and 'queries per 100 seconds per user', both of which are currently set to 1,000,000.
I'm definitely not hitting these limits, so I can only assume there is a hidden per-second rate limit being applied. Does anyone know if that is the case?
Thanks!
I think this documentation will help you understand the Calendar usage limits better.
Google Calendar puts certain limits in place to protect our users and infrastructure from abusive behavior. When these limits are reached by a user, Google Calendar will go into read-only mode for that user, and all edit actions will fail for a certain period of time. Most users will never hit these limits, as they are well above the activity level of a typical Calendar user.
I'd also like to add a few tips to work efficiently with your quota:
Use push notifications instead of polling.
If you cannot avoid polling, make sure you only poll when necessary (for example, poll very infrequently at night).
Use incremental synchronization with sync tokens for all collections instead of repeatedly retrieving all the entries.
Increase page size to retrieve more data at once by using the maxResults parameter.
Update events when they change, avoid re-creating all the events on every sync.
Use exponential backoff for error retries (see the sketch after this list).
Check the performance tips of the Calendar API
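
As a concrete example of the backoff tip above, here is a minimal sketch, assuming the google-api-python-client library, that retries rate-limited Calendar calls with exponentially growing waits; the calendar ID is a placeholder.

    # Minimal exponential-backoff sketch for rate-limited Calendar calls.
    import random
    import time
    from googleapiclient.errors import HttpError

    def list_events_with_backoff(service, calendar_id="primary",
                                 max_retries=5):
        for attempt in range(max_retries):
            try:
                # maxResults=2500 also applies the "increase page size" tip.
                return service.events().list(
                    calendarId=calendar_id, maxResults=2500).execute()
            except HttpError as err:
                if err.resp.status not in (403, 429, 503):
                    raise  # not a rate-limit error; don't retry
                # Wait 2^attempt seconds plus jitter, then retry.
                time.sleep(2 ** attempt + random.random())
        raise RuntimeError(f"still rate limited after {max_retries} retries")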

I want to increase the Real Time Reporting API request limit

We are using the Core Reporting API and the Real Time Reporting API.
This API is limited to 10,000 requests/day per view, but I want to increase this limit.
Is this possible?
If possible, please let me know how we can increase the quota limits, and the price for 20,000 requests/day.
There are several quotas for the Google Analytics APIs and Google APIs in general.
requests/day: 50,000
requests/100 seconds/user: 100
requests/day/view: 10,000
Your application can make 50,000 requests per day by default. This can be extended, but it takes a while to get permission, so when you are getting close to this limit (around 80%), it's best to request the extension.
Your user can make at most 100 requests per 100 seconds, which must be something that went up recently; last I knew, it was only 10 requests a second. A user is denoted by IP address. There is no way to extend this quota beyond the max; you can't apply for it or pay for it.
Then there is the last quota, the one you asked about. You can make at most 10,000 requests per day to a view. This isn't just application-based: if the user runs my application and your application, then together we have only 10,000 requests that can be made. This quota is a pain, if you ask me. Now for the bad news: there is no way to extend this quota. You can't apply for it, you can't pay for it, and you can't beg the Google Analytics dev team (I have tried).
Answer: No, you can't extend the per-view, per-day quota limit.
