I've been researching the use of Dynamic Links with Firebase and was curious whether anyone knows of a limit on the number of dynamic links we can create. For example, we're looking to generate links server-side that will log users into our apps, and the volume will run into the tens of thousands. Will this be too extreme for Firebase and therefore not a viable solution?
Thanks in advance.
That should be fine. The limits on FDL concern how many links you can create per second, so as long as you spread the creation out, you won't hit a problem.
Based on my (short) experience, if using the free plan there is a limit of 100 links per 100 seconds per user, which means if you generate links on the backend (like I do) you are basically limited to creating 1 link per second, which is not much. If you exceed this limit you receive an error like this:
429 Too Many Requests
Insufficient tokens for quota 'DefaultQuotaGroup' and limit 'USER-100s' of service
Also, the Dynamic Links API often returns a 503 Service Unavailable error instead of 429 when generating links. I don't know why; I don't believe they actually have availability issues, so it's just confusing.
Here is what I see on my project's quotas page:
Default quota, per day: 100,000
Default quota, per 100 seconds: 5,000
Default quota, per 100 seconds per user: 100
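A minimal sketch of handling those 429/503 responses with exponential backoff when creating links server-side. The `post_fn` parameter is a hypothetical stand-in for the actual HTTP call to the Dynamic Links REST endpoint, injected so the retry logic itself can be tested offline:

```python
import random
import time

RETRYABLE = {429, 503}  # quota exceeded / service unavailable, as seen above

def create_short_link(post_fn, long_link, max_attempts=5,
                      base_delay=1.0, sleep_fn=time.sleep):
    """Create one Dynamic Link, retrying 429/503 with exponential backoff.

    post_fn(payload) -> (status_code, body) abstracts the HTTP POST to the
    FDL REST endpoint (hypothetical helper, not part of any Firebase SDK).
    """
    payload = {"longDynamicLink": long_link}
    for attempt in range(max_attempts):
        status, body = post_fn(payload)
        if status == 200:
            return body
        if status not in RETRYABLE:
            raise RuntimeError(f"FDL API error {status}: {body}")
        # Exponential backoff with jitter: ~1s, ~2s, ~4s, ...
        sleep_fn(base_delay * (2 ** attempt) + random.random())
    raise RuntimeError("gave up after repeated 429/503 responses")
```

With the free plan's 100-per-100-seconds quota, a backoff like this spreads bursts out rather than failing outright.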
Related
I am using the Google Calendar API to preprocess events that are being added (adjust their content depending on certain values they may contain). This means that theoretically I need to update any number of events at any given time, depending on how many are created.
The Google Calendar API has usage quotas, especially one stating a maximum of 500 operations per 100 seconds.
To tackle this I am using a time-based trigger (every 2 minutes) that does up to 500 operations (and only updates sync tokens when all events are processed). The downside of this approach is that I have to run a check every 2 minutes, whether or not anything has actually changed.
I would like to replace the time-based trigger with a watch. I'm not sure though if there is any way to limit the amount of watch calls so that I can ensure the 100 seconds quota is not exceeded.
My research so far shows me that it cannot be done. I'm hoping I'm wrong. Any ideas on how this can be solved?
AFAIK, that is one of the best practices suggested by Google. Using watch and push notifications lets you eliminate the extra network and compute costs of polling resources to determine whether they have changed. Here are some tips from this blog on working within the quota:
Use push notifications instead of polling.
If you cannot avoid polling, make sure you only poll when necessary (for example, poll only rarely at night).
Use incremental synchronization with sync tokens for all collections instead of repeatedly retrieving all the entries.
Increase page size to retrieve more data at once by using the maxResults parameter.
Update events when they change, avoid re-creating all the events on every sync.
Use exponential backoff for error retries.
Also, if you cannot avoid exceeding your current limit, you can always request additional quota.
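A minimal sketch of the governance the question asks about: a rolling-window gate that blocks before each API call so the 500-operations-per-100-seconds quota can't be exceeded. The class and method names are my own, not part of any Google client library; the clock and sleep are injectable for testing:

```python
import collections
import time

class QuotaGate:
    """Block before each API call until it fits in the rolling window.

    Defaults match the ~500 operations / 100 seconds quota from the
    question; this is a sketch, not an official client feature.
    """
    def __init__(self, max_ops=500, window_s=100,
                 clock=time.monotonic, sleep=time.sleep):
        self.max_ops, self.window_s = max_ops, window_s
        self.clock, self.sleep = clock, sleep
        self.stamps = collections.deque()  # timestamps of recent calls

    def acquire(self):
        now = self.clock()
        # Drop timestamps that have aged out of the window.
        while self.stamps and now - self.stamps[0] >= self.window_s:
            self.stamps.popleft()
        if len(self.stamps) >= self.max_ops:
            # Sleep until the oldest call leaves the window.
            self.sleep(self.window_s - (now - self.stamps[0]))
            now = self.clock()
            while self.stamps and now - self.stamps[0] >= self.window_s:
                self.stamps.popleft()
        self.stamps.append(now)
```

Calling `gate.acquire()` before every Calendar API operation (whether triggered by a watch notification or a timer) keeps the burst rate under the quota.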
Firestore offers 50,000 document read operations per day as part of its free tier.
However, in my application, the client is fetching a collection containing price data. The price data is created over time. Hence, starting from a specific timestamp, the client can read up to 1000 documents. Each document represents one timestamp with the price information.
This means that if the client refreshes his/her web browser 50 times, my quota is exhausted immediately. And that is just a single client.
That is what happened, and I got this error:
Error: 8 RESOURCE_EXHAUSTED: Quota exceeded
The price data are static: once written, they are not supposed to change.
Is there a solution for this issue, or should I consider a database other than Firestore?
The error message indicates that you've exhausted the quota that is available. On the free plan the quota is 50,000 document reads per day, so you've read that number of documents already.
Possible solutions:
Upgrade to a paid plan, which has a much higher quota.
Wait until tomorrow to continue, since the quota resets every day.
Try in another free project, since each project has its own quota.
If you have a dataset that will never (or rarely) change, why not ship it as a JSON object in the app itself? You could make it a separate .js file and import it to build your table.
Alternatively, is there a reason your users would ever navigate through all 1,000 records? You can simulate a full table while limiting each call to, say, 10 documents, then paginate to fetch more results as needed.
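A sketch of that pagination idea, assuming the price documents are ordered by timestamp. `query_fn` is a hypothetical stand-in for a Firestore order-by + start-after + limit query, injected so the cursor logic can be tested without a database:

```python
def fetch_page(query_fn, page_size=10, cursor=None):
    """Return (docs, next_cursor) for one page of price documents.

    query_fn(cursor, limit) abstracts the database call (e.g. Firestore's
    order_by("timestamp") + start_after(cursor) + limit(n)). A short page
    means we've reached the end, so next_cursor is None.
    """
    docs = query_fn(cursor, page_size)
    next_cursor = docs[-1]["timestamp"] if len(docs) == page_size else None
    return docs, next_cursor
```

With a page size of 10, a browser refresh costs 10 reads instead of 1,000, so the same daily quota covers ~100x more refreshes.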
Given that Evernote doesn't publish its exact API rate limits (at least I can't find them), I'd like to ask for some guidance on usage.
I'm creating an application that will sync the user's notes and store them locally. I'm using getFilteredSyncChunk to do this.
I'd like to know how often I can make this API call without hitting the limits. I understand that the limits are on a per-user basis, so would it be acceptable to call this every 5 minutes to get the latest notes?
TIA
The rate limit is on a per API key basis. You'll be okay calling getFilteredSyncChunk every five minutes, although it's a little more efficient to call getSyncState instead.
In case you haven't seen it yet, check out this guide for info on sync (accessible from this page).
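A sketch of the getSyncState-first polling described above. The wrapper functions are hypothetical, but they mirror the shape of the Evernote sync API: getSyncState returns an updateCount you can compare against the last count you synced, so the expensive chunk download only happens when something actually changed:

```python
def poll_for_changes(get_sync_state, get_chunks, last_update_count):
    """One cheap poll cycle: a single getSyncState-style call, and a
    chunk download only when the server's updateCount has advanced.

    get_sync_state() -> {"updateCount": int}
    get_chunks(after_usn) -> list of sync chunks (hypothetical wrappers
    around getSyncState / getFilteredSyncChunk).
    """
    state = get_sync_state()
    if state["updateCount"] <= last_update_count:
        return last_update_count, []   # nothing new; skip the chunk call
    chunks = get_chunks(after_usn=last_update_count)
    return state["updateCount"], chunks
```

Run this every five minutes; most cycles cost one lightweight call instead of a full chunk fetch.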
As far as I know, Marketo limits the number of REST API requests to 10,000 per day. Is there a way to overcome this limit? Can I pay and get more of those?
I found out that REST API requests and SOAP API requests are counted separately, but I'm looking for a solution limited to the REST API.
Moreover, in order to get an access token I need to sacrifice a request. I need to know how long this access token stays valid so I can save as many requests as possible.
You can increase your limit just by asking your account manager. It costs about $15K per year to increase your limit by 10,000 API calls.
Here are the default limits in case you don't have them yet:
Default Daily API Quota: 10,000 API calls (counter resets daily at 12:00 AM CST)
Rate Limit: 100 API calls in a 20 second window
Documentation: REST API
You'll want to ask your Marketo account manager about this.
I thought I would update this with some more information since I get this question a lot:
http://developers.marketo.com/rest-api/
Daily Quota: Most subscriptions are allocated 10,000 API calls per day (which resets daily at 12:00AM CST). You can increase your daily quota through your account manager.
Rate Limit: API access per instance limited to 100 calls per 20 seconds.
Concurrency Limit: Maximum of 10 concurrent API calls.
For the Daily limit:
Option 1: Call your account manager. This will cost you money. For a client I work for, we have negotiated a much higher limit.
Option 2: Store and batch your records. For example, you can send a batch of 300 leads in a single lead insert/update call, which means you can insert/update up to 3,000,000 leads per day.
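A sketch of the batching in Option 2, splitting a large lead list into payloads of up to 300 (the 300-lead batch size comes from the answer above; the field names are illustrative):

```python
def batch_leads(leads, batch_size=300):
    """Yield successive slices of up to batch_size leads, so each push
    to the insert/update endpoint uses one API call instead of one
    call per lead."""
    for i in range(0, len(leads), batch_size):
        yield leads[i:i + batch_size]
```

For example, 650 leads become 3 API calls (300 + 300 + 50) rather than 650.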
For the Rate limit:
Option 1 will probably not work. Your account manager will be reluctant to change this unless you are a very large company.
Option 2: You need to add some governance to your code. There are several ways to do this, including queues, timers with a counter, etc. If you make multi-threaded calls, you will need to take into account concurrency etc.
Concurrent call limit:
You have to limit your concurrent threads to 10.
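A minimal sketch of enforcing that 10-call concurrency cap with a semaphore. The class name is my own; a production version would also layer in the rate-limit counter and retry handling discussed above:

```python
import threading

class ConcurrencyGovernor:
    """Caps in-flight API calls at max_concurrent (Marketo allows 10).

    Threads block in run() until a slot frees up, so no more than
    max_concurrent calls are ever outstanding at once.
    """
    def __init__(self, max_concurrent=10):
        self._slots = threading.Semaphore(max_concurrent)

    def run(self, call):
        with self._slots:   # released even if the call raises
            return call()
```

Wrap every outgoing API call in `governor.run(...)` from your worker threads and the 10-concurrent limit is enforced in one place.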
There are multiple ways to handle API Quota limits.
If you want to avoid hitting the API limit altogether, try to achieve your functionality through Marketo webhooks. Webhooks are not subject to API limits, but they have their own cons; please research this.
If you use the REST API, design your strategy to batch the maximum number of records into a single payload instead of smaller chunks: rather than sending 10 separate API calls with 20 records each, accumulate the maximum allowed payload and call the Marketo API once.
The access token is valid for 1 hour after authenticating.
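A sketch of caching the access token so authentication doesn't cost an extra request per API call. The 60-second refresh margin is an arbitrary safety buffer of mine; in practice you'd use the `expires_in` value Marketo returns alongside the token:

```python
import time

class TokenCache:
    """Reuse the OAuth access token until just before it expires.

    fetch_token() -> (token, expires_in_seconds) abstracts the identity
    endpoint call (hypothetical wrapper). Only when the cached token is
    missing or within margin_s of expiry do we spend a request on auth.
    """
    def __init__(self, fetch_token, clock=time.monotonic, margin_s=60):
        self._fetch, self._clock, self._margin = fetch_token, clock, margin_s
        self._token, self._expires_at = None, 0.0

    def get(self):
        if self._token is None or self._clock() >= self._expires_at - self._margin:
            self._token, expires_in = self._fetch()
            self._expires_at = self._clock() + expires_in
        return self._token
```

With an hour-long token, thousands of API calls share a single authentication request instead of paying one each.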
Marketo's Bulk API can help with rate limiting: once you have the raw activities, updates to the lead object can be done without pinging Marketo for each lead: http://developers.marketo.com/rest-api/bulk-extract/. However, be aware of the export limits you may run into when bulk-exporting leads plus activities. Currently, Marketo only counts an export's size against the limit once the job has completed, which means that as a workaround you can launch a maximum of 2 concurrent export jobs (which together sum to more than the limit) at the same time. Marketo will not kill a running job when the limit is reached, as long as the job was launched before the limit was hit.
Marketo has recently raised the default limits:
Daily Quota: Subscriptions are allocated 50,000 API calls per day (which resets daily at 12:00AM CST). You can increase your daily quota through your account manager.
Rate Limit: API access per instance limited to 100 calls per 20 seconds.
Concurrency Limit: Maximum of 10 concurrent API calls.
https://developers.marketo.com/rest-api/
What is the meaning of the following?
50 Max Connections, 5 GB Data Transfer, 100 MB Data Storage.
Can anyone explain? Thanks
EDIT - Generous limits for hobbyists
Firebase has now updated the free plan limits
Now you have
100 max connections
10 GB data transfer
1 GB storage
That means you can have only 50 concurrent users, transfer only 5 GB of data per month, and store only 100 MB of data.
E.g., suppose you have an online web store: only 50 users can be on it at once, only 100 MB of data (title, price, item image) can be stored in the DB, and your site can deliver only 5 GB of data to users per month (i.e., if your page is 1 MB in size, users will be able to load it only about 5,000 times).
UPD: to check the size of a given page (to decide whether 5 GB is enough for you), in Google Chrome right-click anywhere on the page, choose "Inspect Element", and switch to the "Network" tab. Then refresh the page. The bottom status bar shows the amount of data transferred (the current Stack Overflow page, for example, is about 25 KB).
From the same page where the question was copied/pasted:
What is a concurrent connection?
A connection is a measure of the number of users that are using your app or site simultaneously. It's any open network connection to our servers. This isn't the same as the total number of visitors to your site or the total number of users of your app. In our experience, 1 concurrent corresponds to roughly 1,400 monthly visits.
Our Hacker Plan has a hard limit on the number of connections. All of the paid Firebases, however, are "burstable", which means there are no hard caps on usage. REST API requests don't count towards your connection limits.
Data transfer refers to the amount of bytes sent back and forth between the client and server. This includes all data sent via listeners--e.g. on('child_added'...)--and read/write ops. This does not include hosted assets like CSS, HTML, and JavaScript files uploaded with firebase deploy.
Data storage refers to the amount of persistent data that can live in the database. This also does not include hosted assets like CSS, HTML, and JavaScript files uploaded with firebase deploy.
These limits mentioned and discussed in the answers, are per project
The number of free projects is not documented. Since this is an abuse vector, the number of free projects is based on some super secret sauce--i.e. your reputation with Cloud. Somewhere in the range of 5-10 seems to be the norm.
Note also that deleted projects take around a week to be fully purged, and they continue to count against your quota during that time.