Vision API quota/budget limit and API key security - google-cloud-vision

I have never used the Vision API before, but recently I have found it very powerful for a project of mine. However, I have two concerns about limiting its budget so that I don't get an unexpected bill:
Is it possible to set a monthly cost limit? I am used to Compute Engine, which gives me an almost exact cost for the month, but that doesn't seem possible here. Since I will be using the API for labelling, I have set the label-detection requests per minute and per user to a specific amount; to be safe, I have also set the global requests per minute and per user to the same amount, and all the other quotas to 0. If I have understood correctly, setting the max-calls quota to 4 per minute, for example, should allow a maximum of 178,560 calls per month (4 × 60 × 24 × 31), right? Should this limit my budget? Am I safe?
The API will be called with an API key from a mobile app. I have followed the code examples for iOS and Android and I have seen that the key is written directly in the code. Is this safe? For better security I have restricted the key to the iOS/Android app bundles and to the Cloud Vision API only. Would that be a safe enough option?
Thanks everyone for any help!

Yes, it’s possible to set a monthly cost limit. Refer to this doc for more information about creating a budget and setting the budget scope, amount and thresholds. Yes, your understanding is correct: setting the max-calls quota to 4 per minute allows a maximum of 178,560 calls per month. Note that a budget does not change your maximum quotas.
API keys that are embedded in code are neither safe nor secure.
Do not embed API keys directly in code. API keys that are embedded in code can be accidentally exposed to the public. For example, you may forget to remove the keys from code that you share. Instead of embedding your API keys in your applications, store them in environment variables or in files outside of your application's source tree.
Refer to this doc for more information about best practices for securing an API key.
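As a rough sketch of that advice (not an official sample): a small Node.js backend can hold the key in an environment variable and proxy the Vision request for the mobile app, so the key never ships inside the app binary. GOOGLE_VISION_API_KEY and the image URI below are example names, and Node 18+ is assumed for the global fetch.

const apiKey = process.env.GOOGLE_VISION_API_KEY; // example variable name, set outside source control
if (!apiKey) {
  throw new Error('Set GOOGLE_VISION_API_KEY before running this script');
}

async function labelImage(imageUri) {
  // The Vision REST endpoint accepts the key as a query parameter.
  const url = `https://vision.googleapis.com/v1/images:annotate?key=${apiKey}`;
  const body = {
    requests: [
      {
        image: { source: { imageUri } },
        features: [{ type: 'LABEL_DETECTION', maxResults: 5 }],
      },
    ],
  };
  const res = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
  return res.json();
}

labelImage('gs://example-bucket/photo.jpg').then(console.log).catch(console.error);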
Edit based on a question in the comments:
Can the quotas be seen as a hard limit?
The quotas act as a hard limit only if you don't have any other resources running in your GCP project besides the Vision API requests. Refer to this doc for more information about capping API usage.
If you want to set a hard limit and disable billing, configure a Cloud Function to call the Cloud Billing API and disable billing for the project, as described in the GCP doc.
Note: Use this feature only if you want to stop spending entirely and are willing to shut down all your Google Cloud services and usage when your budget limit is reached.

Related

Firebase storage cost explosion (or how to prevent it) [duplicate]

Some functions in the Google Developers Console, like the Analytics API, are free until you reach a quota. Other functions, like Google Cloud Storage, create costs from the first click.
When I upload a file under https://console.developers.google.com/ > Storage > Cloud Storage > Storage Browser and I make this file publicly available, I pay about $0.12 per GB traffic.
But theoretically the traffic to this link could explode, e.g. because of sudden popularity. Therefore I would like to set something like a daily or monthly cost limit.
Q: How do I protect myself from overly high costs in the Google Developers Console?
You cannot. I asked Google about this; here's their response, from May 7, 2016:
(GCE = Google Compute Engine: no spending limits.
GAE = Google App Engine: yes, it has spending limits.)
... you are eligible for support on ... only ...
... [various helpful links] ...
That being said, at the moment there is not a feature that allows you to configure a limited budget on GCE. This feature is certainly available for GAE [1]. As you mentioned in your comments, you can either totally shut down your VMs (this will depend on your use case) or set the VMs to send you alerts if they reach a certain traffic limit [2].
Sincerely,
Someone's first name
Technical Solutions Representative
Google Cloud Platform
[1] https://cloud.google.com/appengine/docs/quotas
[2] https://cloud.google.com/monitoring/support/notification-options
@wmdry, you wrote: "traffic to this link could explode". I'm afraid of this too, which is why I asked Google about it. I'm planning to avoid Google's CDN because of this and use another CDN provider instead, one that has spending limits, because, unlike with Nginx, I don't see any way for me to rate limit or throttle Google's CDN.
I do plan to use GCE (Google Compute Engine), though. So right now I'm reading about how to rate limit my Nginx server, because if I configure Nginx correctly, those $0.12/GB you mentioned cannot possibly explode to something like $10k in a month. What if Google sends me a $10k bill when I'm back from a few weeks' vacation, just because of my hobby project and a few people downloading a 1 MB movie over and over again forever (because: evil)? Hmm, and the bigger and faster my servers, the higher the risk.
I hope Google will add spending limits, because I did want to use Google's CDN.
Update 2020: Apparently this does bite people from time to time; look here:
"Burnt $72k testing Firebase and Cloud Run and almost went bankrupt", Dec 08, 2020, https://news.ycombinator.com/item?id=25372336
In that case they contacted Google and in the end didn't need to pay.
As of July 2017 you can set budgets that send notifications via email but do not cap spending:
To set an alert-only budget, which will not cap spending:
Go to the Cloud Platform Console.
Open the console left side menu and click Billing.
If you have more than one billing account, click the billing account name.
On the left, click Budgets & alerts.
Official help page: https://support.google.com/cloud/answer/6293540?hl=en
I found that Google's documentation now provides two methods to actually limit the cost of a GCP project. They involve the following setup:
Create a Cloud Function that checks the cost against the budget and carries out a certain action if the cost exceeds the budget. Google's documentation provides sample code snippets that can either shut down all VM instances in a project or disable billing for the project. Shutting down all VMs stops all VM-related cost, but you keep your data (and still pay for the storage). Disabling billing for the project effectively zaps all cost-generating activity, and you could lose data. You can name the Cloud Function "budget-enforcer".
The Google code snippet provided above has a hard-coded ZONE variable. Remember to change it to match your zone!
Create a Service Account to run the Cloud Function "budget-enforcer". For shutting down VMs, the Service Account needs the role "Compute Instance Admin (v1)". For disabling billing on a project, the Service Account needs the role "Project Billing Manager".
Set a Pub/Sub topic for the Cloud Function (I call mine "proj-name-stop-vm" and "proj-name-disable-bill").
Set up a budget alert as usual, and connect it to one of the Pub/Sub topics above.
Please note that Google's documentation mentions there can be a delay between the moment the cost exceeds a budget and the moment the function is triggered, so build in a buffer if you have an absolute hard cost limit. I use 90% of the budget as the trigger line for shutting down my instances.
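To make the setup more concrete, here is a minimal Node.js sketch of what such a "budget-enforcer" function could look like, modelled loosely on Google's capping-usage sample rather than copied from it; PROJECT_ID, ZONE and the 90% trigger line are assumptions to adjust for your own project.

const { google } = require('googleapis');
const compute = google.compute('v1');

const PROJECT_ID = process.env.GCP_PROJECT; // assumed to be set in the runtime
const ZONE = 'us-central1-a';               // change to match your zone

exports.budgetEnforcer = async (pubsubEvent) => {
  // Budget notifications arrive as base64-encoded JSON that includes
  // costAmount and budgetAmount.
  const data = JSON.parse(Buffer.from(pubsubEvent.data, 'base64').toString());
  if (data.costAmount <= data.budgetAmount * 0.9) {
    return `No action: cost ${data.costAmount} is below 90% of the budget`;
  }

  // Authenticate with the function's service account
  // (it needs the Compute Instance Admin (v1) role).
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  google.options({ auth });

  // Stop every instance that is currently running in the zone.
  const res = await compute.instances.list({ project: PROJECT_ID, zone: ZONE });
  const running = (res.data.items || []).filter((i) => i.status === 'RUNNING');
  await Promise.all(
    running.map((i) =>
      compute.instances.stop({ project: PROJECT_ID, zone: ZONE, instance: i.name })
    )
  );
  return `Stopped ${running.length} instance(s) in ${ZONE}`;
};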
The API usage can be limited with a hard limit:
Depending on the API, you can explicitly cap requests in a variety of ways, including: requests per day, requests per 100 seconds, and requests per 100 seconds per user. You might want to limit the billable usage by setting caps. For example, to prevent getting billed for usage beyond the free courtesy usage limits, you can set requests per day caps.
Source
You can combine budget Pub/Sub alerts with a Cloud Function that disables billing for the project when a threshold is met.
Full Tutorial Here:
https://www.youtube.com/watch?v=KiTg8RPpGG4
GitHub Repo Here: https://github.com/aioverlords/Google-Cloud-Platform-Killswitch
To Disable Billing
const { google } = require('googleapis');
// Cloud Billing API client; auth (e.g. via google.options({ auth })) must be
// configured with an account that has the Project Billing Manager role.
const billing = google.cloudbilling('v1').projects;

const _disableBillingForProject = async projectName => {
  // Clearing billingAccountName detaches the billing account from the
  // project, which disables billing; projectName is 'projects/PROJECT_ID'.
  const res = await billing.updateBillingInfo({
    name: projectName,
    resource: { billingAccountName: '' }, // Disable billing
  });
  console.log(res);
  console.log("Billing Disabled");
  return `Billing disabled: ${JSON.stringify(res.data)}`;
};
Simply go to the developer console:
https://console.developers.google.com/project
Select your project.
Select "billings & settings"
Enable billing.
Then go to Compute/AppEngine/Settings and set a daily budget.
Go to Google Cloud console, and then to Billing / Budgets and Alerts and create a new budget for one or all your projects. You can select which services should be included in the limit and set a monthly amount that should not be exceeded.

How can you limit the billing in Firebase? They used to have this possibility; it looks like they removed it [duplicate]

I'm currently working on a social network app and I need to build a search feature. Firestore does not support this kind of query, so I need to use an external service like Algolia.
The problem is that the free plan does not support connecting to external websites/APIs other than Google's own ones, so I can't connect to Algolia to get my search system working.
I have read multiple stories about devs paying high bills because of loops or errors in their code, and as the Blaze plan is a pay-as-you-go plan, they get charged for what they used. If a loop generated 10 TB of files, they get charged for that.
I also know that Blaze plan's features are free as long as each of them (individually) stay below the limits of the free Spark plan.
So as my question says, is there a way to set limits? For example, I would like to tell Firebase to limit my cloud functions invocations to 100k per month. That way it would be free and I would never be able to get over 100k as it's limited, which means I'll never get billed for that.
Take into account that the only thing I need right now from a paid plan is the connection to external networks. I don't need anything else as we're just starting and the app is not in production, so there's no need for huge limits.
Every Firebase project is also a Google Cloud Platform project. This means that many of the advanced features of Google Cloud Platform are also available for your Firebase project.
For example, you can set up billing alert for your Firebase project, so that you are alerted when the usage reaches a certain level. While you can't configure it to switch off the project at some point, the alert should typically be quite good for alerting you to unusual usage patterns.
For more on this see:
The recent blog post Tracking your spending with budgets.
The GCP documentation on how to set budget alerts, which is what Firebase uses under the hood.
The GCP documentation now also has a section on capping (disabling) billing to stop usage. This is a brute force approach though and may lead to data being lost, so I'd recommend investigating all other options first.
Update (December 2020): Firebase's Todd Kerpelman just released a series of videos where he disables billing using the process from the documentation mentioned above.
You cannot set spending limits for your app now.
As of December 12, 2019, you can no longer create spending limits, but you can change or remove existing spending limits.
https://cloud.google.com/appengine/pricing#spending_limit
You can create budgets, which will alert you when you reach the budget amount. But they won't stop the usage when you hit the budget.
https://cloud.google.com/billing/docs/how-to/budgets#add-new-budget
The screenshot here seems to show a Spending Limit setting for Firebase projects: Firebase: Budget and Daily Spending Limit
That settings page is located here (the Spending Limit setting apparently only shows up once you set up billing for the project): https://console.cloud.google.com/appengine/settings
It's disabled in the poster's case, but I think that's only because he connected it up to a "NodeJS App Engine app", which isn't the case for many Firebase developers.
I haven't tried it yet myself, but will do so once I start a paid plan.
EDIT: Yep, the setting shows up once you switch to a paid plan (in my case, Blaze). I don't have enough traffic yet to confirm that it works as expected, but if I find later that it doesn't, I'll give an update here.
"This example shows you how to cap costs and stops usage for a project by disabling Cloud Billing. This will cause all Google Cloud services to terminate non-free tier services for the project."
Google Cloud Source

Evernote Rate Limit

I have an app that has hundreds of users and connects to Evernote. As I get more users, I make more requests to Evernote, and it is causing a lot of rate limiting and frustration for my users. Is there a way to get my current limit increased by Evernote?
I have fixed a lot of the inefficient calls I used to make, but we still have the same issue.
Rate limits are applied to calls against the Evernote API on a per API key, per user, per time period basis. This means that the API limits the number of calls a third-party app can make for each individual user during a given one-hour period. [source]
The number of users of your application is irrelevant. The source for that quote details a number of reasons and fixes.
If you've optimised your code fully, this may be a "special case". You should contact Evernote developer support.

Google calendar API service account rate limit

I have a Google service account set up for their Calendar API, but it seems as though I can only make 5 requests per second. I've only figured that out through trial and error; there are no per-second rate limit settings in my developer console.
I have 'queries per day' and 'queries per 100 seconds per user', both of which are currently set to 1,000,000.
I'm definitely not hitting these limits, so I can only assume there is a hidden 'per second' rate limit that is being applied. Does anyone know if that is the case?
Thanks!
I think this documentation will help you understand the Calendar usage limits better.
Google Calendar puts certain limits in place to protect our users and infrastructure from abusive behavior. When these limits are reached by a user, Google Calendar will go into read-only mode for that user, and all edit actions will fail for a certain period of time. Most users will never hit these limits, as they are well above the activity level of a typical Calendar user.
I'd also like to add a few tips to work efficiently with your quota:
Use push notifications instead of polling.
If you cannot avoid polling, make sure you only poll when necessary (for example, poll very seldom at night).
Use incremental synchronization with sync tokens for all collections instead of repeatedly retrieving all the entries.
Increase page size to retrieve more data at once by using the maxResults parameter.
Update events when they change, avoid re-creating all the events on every sync.
Use exponential backoff for error retries (see the sketch below).
Check the performance tips of the Calendar API
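As an illustration of the exponential backoff tip above, here is a minimal, generic Node.js sketch; callCalendarApi is a hypothetical placeholder for whatever Calendar API request you make, and treating 403/429 responses as retryable is an assumption you may want to tune.

async function withBackoff(fn, maxRetries = 5) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      const status = err.code || (err.response && err.response.status);
      const retryable = status === 403 || status === 429;
      if (!retryable || attempt === maxRetries) throw err;
      // Wait 2^attempt seconds plus random jitter before retrying.
      const delayMs = Math.pow(2, attempt) * 1000 + Math.random() * 1000;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage (hypothetical): withBackoff(() => callCalendarApi()).then(console.log);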

Linkedin API throttle limit

Recently I was developing an application using the LinkedIn people-search API. The documentation says that a developer registration gets 100,000 (1 lakh) API calls per day, but when I registered for this API and ran a Python script, after some 300 calls it said the throttle limit was exceeded.
Has anyone faced this kind of issue using the LinkedIn API? Comments are appreciated.
Thanks in advance.
It's been a while, but the stats suggest people still look at this, and I'm experimenting with the LinkedIn API, so I can provide some more detail.
The typical throttles are stated as both a maximum (e.g. 100K) and a per-user-token number (e.g. 500). Taken together, those numbers mean your application can make up to 100,000 calls per day to the API, but even as a developer, a single user token allows a maximum of 500 per day.
I ran into this, and after setting up a barebones app and getting some users I can confirm a daily throttle of several thousand API calls. [Deleted discussion of what was probably, upon further consideration, an accidental back door in the LinkedIn API.]
As per the Throttle Limits published by LinkedIn:
LinkedIn API keys are throttled by default. The throttles are designed to ensure maximum performance for all developers and to protect the user experience of all users on LinkedIn.
There are three types of throttles applied to all API keys:
Application throttles: These throttles limit the number of each API call your application can make using its API key.
User throttles: These throttles limit the number of calls for any individual user of your application. User-level throttles serve several purposes, but in general are implemented where there is a significant potential impact on the user experience for LinkedIn users.
Developer throttles: People listed as developers on their API keys see user throttles that are approximately four times higher than the user throttles for most calls. This gives you extra capacity to build and test your application. Be aware that the developer throttles give you higher limits as a developer of your application, but your users will experience the User throttle limits, which are lower. Take care to make sure that your application functions correctly with the User throttle limits, not just with the limits for your own usage as a developer.
Note: To view current API usage of your application and to ensure you haven't hit any throttle limits, visit
https://www.linkedin.com/developer/apps and click on "Usage & Limits".
The throttle limit for individual users of People Search is 100, with 400 being the limit for the person that is associated with the Application as the developer:
https://developer.linkedin.com/documents/throttle-limits
When you run into a limit, view the API usage for the application on the application page to see which throttle you are hitting.
