Firebase Functions quota when serving locally using firebase serve --only functions

TL;DR:
Does my Firebase project incur charges or use up quota when I run Firebase functions locally using firebase serve --only functions?
Details:
I have a collection in my Firestore database called locations, with thousands of documents that include location data such as address and postCode. I want to add some geolocation data to each document using an external API service (Open Postcode Geo API). I've written an HTTP-triggered Firebase function, triggered by hitting the endpoint /api/geocodeLocations. This cloud function will run through each document in the locations collection and use the postCode data to make an API call to Open Postcode Geo, which will respond with the geolocation data (along with other data). It will then take the data it needs from the response and write it to the document in Firestore.
I am aware that if I deploy this function and call it by hitting https://my-project/api/geocodeLocations, I will start racking up my quotas and maybe incur charges for the number of external requests it makes (and it will more than likely time out...). However, if I run firebase serve --only functions locally on my dev machine to emulate functions and then hit the URL http://localhost:5000/my-project/us-central1/app/api/geocodeLocations, will I still rack up quotas for external requests?
I don't think I will, as the requests are between my local machine and the external API, but I can't find any Firebase documentation on quotas when running locally.
Thanks in advance!

Running functions locally in any way doesn't cost anything in terms of billing or quota for the Cloud Functions product.
If you read data out of Firestore or Realtime Database in a function running locally, you will still be billed for that access, but it won't be in relation to Cloud Functions billing or quotas.

Related

Which is the optimal base URL for Functions fetch calls?

Background: Some days ago, Firebase Hosting added support for NextJS.
I have Firebase Functions and Firebase Hosting inside the same project.
I'm implementing SSR with NextJS on Firebase Hosting, and now I need to run some fetch calls inside getServerSideProps. Those fetch calls will hit my own Firebase Functions, and I wonder if there is a better base URL for making a "local fetch" while getting that data. Some base URL to tell Firebase Hosting: "Hey, this is a call to a resource that is inside Google".
My goal is to reduce the response time, while reducing the "tracert" stack.
In other words, I need a "local URL" to point locally in the Google-Firebase-Functions-Servers.

Caching in firebase

I have my app hosted on Firebase and I'm using a Cloud Function to fetch data from a third-party API. This function is HTTP-triggered and runs whenever the client requests data. I want to reduce Cloud Function calls (the project is currently on the Blaze plan), so I'm thinking of adding caching to my app.
1. What caching strategies can I use on the client (web browser) as well as on the server side (Node.js)?
2. Overall I want to reduce Cloud Function calls to minimize costs, so that the function doesn't need to be called on every client request; instead the client can fetch data from a cache.
3. Should I use the Firebase Realtime Database to store the data from the third-party API and update it in Firebase periodically so that the data stays up to date? The data on the third-party side doesn't change frequently.
4. Would fetching data from the Realtime Database (as mentioned in point 3 above) instead of calling the Cloud Function eventually reduce costs?
If you host your Cloud Function behind Firebase Hosting, you can write a cache header in every response and manage the cache behavior of Firebase Hosting that way. Allowing the content to be served from Firebase Hosting's CDN may significantly reduce the number of times your Function gets executed, and thus its cost.
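A minimal sketch of that approach, assuming a Node.js HTTPS function exposed through a Firebase Hosting rewrite (the function name, response payload, and cache durations are placeholders):

```ts
import * as functions from "firebase-functions";

// HTTPS function served behind Firebase Hosting via a rewrite in firebase.json.
// s-maxage lets the Hosting CDN cache the response, so repeat requests within
// that window are answered from the CDN without invoking the function again.
export const getThirdPartyData = functions.https.onRequest(async (req, res) => {
  // Replace this with the real call to the third-party API.
  const data = { fetchedAt: new Date().toISOString() };

  res.set("Cache-Control", "public, max-age=300, s-maxage=600");
  res.json(data);
});
```

max-age controls how long browsers may reuse the response, while s-maxage controls how long the Hosting CDN may serve it without re-invoking the function.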

Firebase Cloud Functions pricing for Admin SDK

Does the 125k invocations per month for Firebase Cloud Functions apply to the Admin SDK used on a custom server? How is the price of Cloud Functions calculated if it is used with the Admin SDK on a self-hosted server?
The 125k invocations per month you are referring to describes how many times a Cloud Function is executed/triggered (the term used by Firebase here is invoked). Use of the Admin SDK on your own hardware or a third-party server doesn't make use of Cloud Functions and is unrelated to the invocation limits.
Let's say you set up an HTTPS Cloud Function called date. Each time a user visits https://us-central1-<project-id>.cloudfunctions.net/date, this would count as one invocation of that Cloud Function (ignoring response caching or use of a CDN).
Another example is a Cloud Firestore trigger, call it createUser, that listens for new data. Each time a new users/someUserID document is created, the createUser function would be invoked.
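A minimal sketch of those two examples, assuming the Node.js runtime and the v1 firebase-functions API (the names mirror the ones used above):

```ts
import * as functions from "firebase-functions";

// HTTPS example: each request to .../date counts as one invocation.
export const date = functions.https.onRequest((req, res) => {
  res.send(new Date().toISOString());
});

// Firestore example: invoked once for every new users/{userId} document.
export const createUser = functions.firestore
  .document("users/{userId}")
  .onCreate((snapshot, context) => {
    console.log(`New user document created: ${context.params.userId}`);
    return null;
  });
```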
For such trivial use cases, you aren't likely to hit the 125k limit. But if you have Cloud Functions that deal with frequently modified data or rapidly triggered pub/sub topics, you can quickly approach these limits if care is not taken.
One example of this is a Realtime Database Cloud Function that incorrectly listens to any data under /posts. Every time a user (or server) changed any data under /posts, the function would be invoked. If your Cloud Function updated /posts/count every time it was called, it would retrigger itself, leading to an infinite loop.
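A sketch of that pitfall and one way to avoid it, assuming the v1 firebase-functions API; the paths mirror the /posts and /posts/count example above:

```ts
import * as functions from "firebase-functions";

// Listening broadly under /posts means the function's own write to
// /posts/count matches the trigger again and keeps re-invoking it.
// Narrowing the trigger to one level and skipping the counter key breaks the loop.
export const countPosts = functions.database
  .ref("/posts/{postId}")
  .onWrite((change, context) => {
    if (context.params.postId === "count") {
      return null; // ignore the write caused by our own counter update
    }
    const countRef = change.after.ref.parent!.child("count");
    return countRef.transaction((current) => (current || 0) + 1);
  });
```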
Does the 125k invocations per month for Firebase Cloud Functions apply to the Admin SDK used on a custom server?
Cloud Functions billing is not at all related to whatever billing is incurred by the Admin SDK. If you use the Admin SDK, you will be billed according to the products it uses, in addition to whatever billing might happen for Cloud Functions.
How is the price of Cloud Functions calculated if it is used with the Admin SDK on a self-hosted server?
It's not possible to self-host Cloud Functions. Cloud Functions only runs within Google Cloud infrastructure.
If you use the Admin SDK on your own host, outside of Cloud Functions or any Google Cloud hosting, it does not change the billing compared to the same usage in Cloud Functions.
If you want to know what the cost is for using the Admin SDK, you should understand which product you are accessing with that SDK, and look up its own pricing. The Admin SDK itself does not bill - it is the usage of the underlying Firebase or Cloud product that incurs billing.
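For illustration, a minimal sketch of Admin SDK usage on a self-hosted server, assuming a service account is available through application default credentials and that a locations collection exists (both are placeholders). Running this is billed as Firestore document reads, not as Cloud Functions invocations, because no function is involved:

```ts
import * as admin from "firebase-admin";

// On your own host this picks up credentials from the environment
// (e.g. GOOGLE_APPLICATION_CREDENTIALS); no Cloud Function is involved.
admin.initializeApp();

async function listLocations(): Promise<void> {
  // Billed by Firestore as document reads, not by Cloud Functions.
  const snapshot = await admin
    .firestore()
    .collection("locations")
    .limit(10)
    .get();
  snapshot.forEach((doc) => console.log(doc.id, doc.data()));
}

listLocations().catch(console.error);
```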

Need Python files stored in Google Cloud Storage to compile on Google Compute Engine and return data to an iOS app

My Current Plan:
I'm currently creating an iOS app that will access/change Java/Python files stored in Google Cloud Storage. Once confirmed, the app will talk to App Engine, which will have a Compute Engine VM receive the files and compile them. Once compiled, the result is returned to the iOS app.
Is there a better or easier method to achieve this task? Should I use Firebase or Google Cloud Functions? Would they be any help?
Currently, I'm lost on how to design this and how to have requests sent between so many platforms.
It would also depend on what type of data processing you are doing to the files in Cloud Storage. Ideally you want to avoid as many "hops" between services as possible. You could do everything via Cloud Functions and listen for GCS triggers; you can think of Cloud Functions as a pseudo App Engine backend for quick request handling.
Use Cloud Functions to respond to events from Cloud Storage or Firebase Storage to process files immediately after upload.
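A minimal sketch of such a trigger, assuming the v1 firebase-functions API and the project's default bucket; the processing step is a placeholder for the compile-and-return flow described in the question:

```ts
import * as functions from "firebase-functions";

// Invoked once for each object finalized (i.e. fully uploaded) in the default bucket.
export const onSourceFileUploaded = functions.storage
  .object()
  .onFinalize(async (object) => {
    console.log(
      `Received ${object.name} (${object.contentType}, ${object.size} bytes)`
    );
    // Download the file and hand it to the compile/processing step here,
    // then write the result somewhere the iOS app can read it.
    return null;
  });
```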
If you are already using Firebase, it would be better to stay within their ecosystem as much as possible. If you are doing bigger or more intensive data processing you might want to look at different options.
With more information and current pain points, we may be able to offer more insight.

How to obtain Firebase Storage / Google Cloud Storage download analytics

I understand that many of Firebase's services are wrappers around Google Cloud Platform products (Functions, Storage, etc.). I would like to obtain analytics on Google Cloud Storage downloads on a per-object basis (downloads, time). The Firebase console shows the number of requests as well as the amount of data downloaded, but I don't know which objects were downloaded and how often.
Is there a logging method or API in Google Cloud I can utilize to obtain this data?
Cloud Storage offers Access Logs & Storage Logs, which give you information about all requests made to a bucket.
You will have to set up a dedicated bucket to store the access and storage logs, and then enable logging on each bucket you want to track, pointing it at the dedicated one.
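A sketch of that setup using the Cloud Storage Node.js client; the bucket names are placeholders, and the same change can be made with gsutil's logging command. Note that the dedicated log bucket must already exist and must allow Cloud Storage's log-delivery account to write to it:

```ts
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

// Point my-app-bucket's access and storage logs at a dedicated log bucket.
async function enableBucketLogging(): Promise<void> {
  await storage.bucket("my-app-bucket").setMetadata({
    logging: {
      logBucket: "my-logs-bucket",
      logObjectPrefix: "access-log",
    },
  });
}

enableBucketLogging().catch(console.error);
```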
