Firebase Cloud Functions Response Time

So up until mid-2018 there were complaints about performance issues with Firebase Cloud Functions and Google Cloud Functions (which I believe are the same under the hood). Like these ones:
https://github.com/googleapis/google-cloud-node/issues/2374
https://github.com/firebase/firebase-functions/issues/161
I remember seeing that a simple Hello World example had a response time of 500ms - 800ms. EDIT: I know about cold starts, but as described in the GitHub issues, cold starts were not the main problem. A Firebase Cloud Function would randomly take up to 10s to respond, which looked like a problem within Firebase itself.
I am currently considering building a project with Firebase and would like to build a REST API with Firebase Cloud Functions - but bad performance would be a deal breaker.
What's the current status? Do these problems still occur? None of these GitHub issues were properly answered by Google, but no more users have complained since either …

Cold start times are a fact of life for serverless backends such as Cloud Functions. It's due to the way server instances are automatically scaled up and down to handle load in a cost-effective way. You can always expect the first request to a new server instance to take somewhat longer than the subsequent requests directed to that same instance. How much longer varies depending on a number of factors, including the type of trigger and what needs to happen during that first request.
If you want to learn more about Cloud Functions scale, what you can expect as a result, and what you can do to mitigate cold starts, watch my video series on the matter.
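One concrete mitigation (beyond what this answer covers) is to keep a minimum number of instances warm; newer versions of the firebase-functions SDK expose this via runWith({ minInstances }), at the cost of paying for idle instances. A minimal sketch:

```ts
import * as functions from "firebase-functions";

// Keeping one instance warm trades a small idle-time cost for fewer cold starts.
export const api = functions
  .runWith({ minInstances: 1, memory: "256MB" })
  .https.onRequest((req, res) => {
    res.status(200).send("Hello from a pre-warmed instance");
  });
```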

Cloud Functions for Firebase are Google Cloud Functions with a wrapper that lets them integrate better with other Firebase products. Therefore a small loss of performance is to be expected.
The important factor in deciding which one to use is what you are integrating with the most.
If your project runs on Firebase and uses Firebase Authentication etc., then Cloud Functions for Firebase is the best choice.
On the other hand, if you are using Google Cloud Platform products, then Google Cloud Functions is the best choice.
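To make the difference concrete, here is a rough sketch (not from the answer above) of the same HTTPS endpoint written both ways: the Firebase wrapper declares the trigger in code and deploys with the Firebase CLI, while a plain Google Cloud Function is just an exported handler whose trigger is configured at deploy time.

```ts
import * as functions from "firebase-functions";
import type { Request, Response } from "express";

// Cloud Functions for Firebase: the firebase-functions wrapper declares the
// trigger in code and is deployed with the Firebase CLI.
export const helloFirebase = functions.https.onRequest((req, res) => {
  res.send("Hello from Cloud Functions for Firebase");
});

// Plain Google Cloud Functions (Node.js runtime): a bare Express-style
// handler; the HTTP trigger is configured with gcloud at deploy time.
export const helloGcf = (req: Request, res: Response) => {
  res.send("Hello from Google Cloud Functions");
};
```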

Related

Firebase - Perform Analytics from database/firestore data

I am using Firebase as my authentication and database platform in my React Native-Expo app. I have not yet decided if I will be using the realtime-database or Firestore database.
I need to perform statistical analysis on daily data gathered from my users, which is stored in the database. For example, users type in their daily protein intake; from that I would like to calculate their weekly average and expected monthly average, provide suggestions of types of food if protein intake is too low, and so on.
What would be the best approach to achieve this in my specific situation?
I am really unfamiliar with this and am stepping into uncharted territory regarding how to accomplish it. I have read that Firebase Analytics generates various basic analytics regarding usage of the app, number of crash-free users, etc. But can it perform analytics on custom events? Can I create a custom event for Firebase Analytics to keep track of a certain node in my database, and output analytics from that? And then of course, if yes, does it work with React Native-Expo or do I need to detach from Expo? In addition, I have read that Firebase Analytics can be combined with Google BigQuery. Would this be an alternative for my case?
Are there any other ways of performing such data analysis on my data stored in Firebase database? For example, export the data and use Python and SciKit Learn?
Whatever opinion or advice you may have, I would be grateful if you could share it!
You're not alone - many people building web apps on GCP have this question, and there is no single answer.
I'm not too familiar with Firebase Analytics, but I can answer the question for Firestore and for your custom analytics (e.g. weekly average protein consumption).
The first thing to point out is that Firestore, unlike other NoSQL databases, is storage only. You can't perform aggregations in real time like you can with MongoDB, so the calculations have to be done somewhere else.
The best practice recommended by GCP in this case is indeed to do a regular export of your Firestore data into BQ (BigQuery) and run your analytical calculations there. You could also, when a user inputs some data, send it to Pub/Sub and use one of GCP Dataflow's streaming templates to stream the data into BQ, and have everything in near real time.
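A minimal sketch of that streaming variant, assuming hypothetical collection and topic names (a Firestore trigger publishes each new entry to Pub/Sub, and a Dataflow streaming template such as "Pub/Sub to BigQuery" writes it into BQ):

```ts
import * as functions from "firebase-functions";
import { PubSub } from "@google-cloud/pubsub";

const pubsub = new PubSub();

// Hypothetical names: "users/{userId}/dailyIntake" collection, "daily-intake" topic.
export const streamIntakeToBigQuery = functions.firestore
  .document("users/{userId}/dailyIntake/{entryId}")
  .onCreate(async (snap, context) => {
    await pubsub.topic("daily-intake").publishMessage({
      json: { userId: context.params.userId, ...snap.data() },
    });
  });
```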
Here's the issue with that, however: while this solution gives you near real time and is very scalable, it gets expensive fast, and if you're more used to Python than SQL for running analytics, it can be a steep learning curve. Here's an alternative I've used for smaller web apps, which scales well for <100k users and costs <$20 a month at GCP's current pricing:
Write a Python script that grabs the data from Firestore (using the Firestore Python SDK), generates the analytics you need, and writes the results back to a Firestore collection (see the sketch after this list)
Create an endpoint for that function using Flask or Django
Deploy that server application on Cloud Run, preventing unauthenticated invocations (you'll only be calling it from within GCP) - see this article, steps 1 and 2 only. You can also deploy the Python script(s) to GCP's Vertex AI or hosted Jupyter notebooks if you're more comfortable with that
Use Cloud Scheduler to call that function every x minutes - see these docs for authentication
Have your React app query the "analytics results" collection to get the results
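The answer describes this service in Python; purely as an illustration, here is the same shape sketched in TypeScript with Express and the Firestore client. The dailyIntake and analyticsResults collections and their fields are hypothetical.

```ts
import express from "express";
import { Firestore } from "@google-cloud/firestore";

const app = express();
const db = new Firestore();

// Reads the last 7 days of "dailyIntake" docs, computes a weekly protein
// average per user, and writes it to "analyticsResults" for the client app.
app.post("/run-analytics", async (_req, res) => {
  const cutoff = Date.now() - 7 * 24 * 60 * 60 * 1000;
  const snapshot = await db
    .collection("dailyIntake")
    .where("timestamp", ">=", cutoff)
    .get();

  // Group protein values by user and average them.
  const byUser = new Map<string, number[]>();
  snapshot.forEach((doc) => {
    const { userId, protein } = doc.data() as { userId: string; protein: number };
    byUser.set(userId, [...(byUser.get(userId) ?? []), protein]);
  });

  const batch = db.batch();
  for (const [userId, values] of byUser) {
    const avg = values.reduce((a, b) => a + b, 0) / values.length;
    batch.set(db.collection("analyticsResults").doc(userId), {
      weeklyAvgProtein: avg,
      updatedAt: Date.now(),
    });
  }
  await batch.commit();
  res.json({ usersProcessed: byUser.size });
});

app.listen(Number(process.env.PORT) || 8080);
```

Cloud Scheduler then hits the /run-analytics endpoint on whatever interval you choose, and the React app only ever reads the small, pre-computed results collection.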
My solution is a Flutter Web based dashboard that displays relevant data in (near) real time, like the regular Flutter iOS/Android app, along with some aggregated data.
The aggregated data is compiled using a few Node.js based triggers on the database that do the analytic lifting, and hence it is also near real time. If you study the pricing you will learn that function invocations are pretty cheap unless of course you happen to make a 'desphew' :)
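As an illustration of that trigger-based aggregation (assuming Firestore and a hypothetical orders/aggregates schema, since the answer doesn't show its own):

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Each new order document bumps counters on a single aggregate document,
// so the dashboard reads one doc instead of scanning the whole collection.
export const aggregateOrders = functions.firestore
  .document("orders/{orderId}")
  .onCreate((snap) => {
    const { amount } = snap.data() as { amount: number };
    return admin
      .firestore()
      .doc("aggregates/orders")
      .set(
        {
          orderCount: admin.firestore.FieldValue.increment(1),
          totalAmount: admin.firestore.FieldValue.increment(amount),
        },
        { merge: true }
      );
  });
```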
I came up with a great solution.
I used the built-in Firebase BigQuery extension. Then I used Cube.js (deployed on GCP - Cloud Run on Docker) on top of BigQuery.
Cube.js just makes everything so easy. You don't need to write queries manually; it tries to optimize the queries for you. On top of that, it uses caching, so you won't get big bills on GCP. I think this is the best solution I was able to find, and it is infinitely scalable and totally real time.
Also, if you are a small startup then it is mostly free with GCP - free limits on Cloud Run and BigQuery.
Note: I am not affiliated in any way with Cube.js.
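For reference, this is roughly what querying such a setup from a dashboard looks like with the Cube.js JavaScript client; the cube name DailyIntake, its members, and the API URL are all hypothetical placeholders.

```ts
import cubejs from "@cubejs-client/core";

// The Cube.js deployment on Cloud Run sits in front of BigQuery and serves
// cached, pre-aggregated results to the dashboard.
const cubejsApi = cubejs(process.env.CUBEJS_TOKEN!, {
  apiUrl: "https://cube.example.com/cubejs-api/v1",
});

async function weeklyProtein(userId: string) {
  const resultSet = await cubejsApi.load({
    measures: ["DailyIntake.avgProtein"],
    timeDimensions: [
      { dimension: "DailyIntake.timestamp", granularity: "week" },
    ],
    filters: [
      { member: "DailyIntake.userId", operator: "equals", values: [userId] },
    ],
  });
  return resultSet.tablePivot();
}
```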

Throttling callable https firebase cloud function execution per user?

I was not able to find any resources about this, hence I wanted to ask if it is a good idea / necessary to add throttling to callable HTTPS cloud functions in Firebase on a per-user basis.
For example, I want to limit a user to only being able to call an HTTPS function once every 5 seconds.
If it is a viable thing to do, how would it be achieved?
There aren't any built-in per-user throttling capabilities in Cloud Functions. You have a few options for doing your own:
Put logic in your client-side apps that tracks the number of times a user is calling them, and deny the call if it's too frequent
The issue here is that if someone is trying to game you this wouldn't be 100% effective, as they could use multiple windows, etc.
You could implement a database solution where you track their usage, and at the beginning of your function check whether they are violating your rate limit (see the sketch after this list)
The issue here is that your functions are still being triggered and still incurring the costs.
If it were a super big issue for you, I would recommend looking at an API management platform such as Apigee, where you can apply policies such as rate limiting
This is a heavyweight solution with increased cost, so I wouldn't do it unless necessary
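A minimal sketch of the database-tracking option, assuming a callable function and a hypothetical rateLimits collection (note that, as mentioned above, rejected invocations still run and are still billed):

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

const WINDOW_MS = 5000; // one call per user every 5 seconds

export const throttledCall = functions.https.onCall(async (data, context) => {
  const uid = context.auth?.uid;
  if (!uid) {
    throw new functions.https.HttpsError("unauthenticated", "Sign in first.");
  }

  // Check and update the user's last-call timestamp atomically.
  const ref = admin.firestore().collection("rateLimits").doc(uid);
  await admin.firestore().runTransaction(async (tx) => {
    const doc = await tx.get(ref);
    const last = doc.data()?.lastCall ?? 0;
    if (Date.now() - last < WINDOW_MS) {
      throw new functions.https.HttpsError(
        "resource-exhausted",
        "Too many calls, try again in a few seconds."
      );
    }
    tx.set(ref, { lastCall: Date.now() });
  });

  // ... the actual work of the function goes here ...
  return { ok: true };
});
```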

Performance difference Firestore through Firebase Functions vs Firestore SDK

Our team is developing a mobile app and is currently in use of (Firebase) Firestore for our backend. We wrapped every DB access with Firebase Functions in order to clean up the object returned to the client app.
Does this approach introduce any (additional) non-negligible overhead compared to accessing Firestore directly?
Yes but no, depending on your use case.
If you have a small number of users with relatively low usage (in terms of the given quotas), it is recommended to use Cloud Functions. As stated in the documentation, Cloud Functions for Firebase offers generous quotas in terms of resource limits, time limits and rate limits, with good pricing, especially on the Spark plan (free).
The advantage of using Cloud Functions is that they run on fast, scalable computing/processing units, which can shorten the processing time of a specific function compared with using the mobile phone's CPU, which in some cases has low computing power (you have to consider a variety of users, as not everyone owns a high-spec phone). To provide a better user experience (UX), all this heavy lifting can be done by Cloud Functions!
Note: I do agree with Doug that cost is one factor, but we should also consider performance and other perspectives.
Yes, at the very least, now your path to get data has two hops instead of one. Before, you directly accessed the database using a channel that's optimized for returning the query results. Now, you have to pay the cost of an additional hop to Cloud Functions, which makes the query. And it's possible that the results returned to the client are bigger than if you made the query directly.
Perhaps the biggest loss you'll experience is the client side caching of documents that's automatically performed by the client (enabled by default on Android and iOS). If you repeat a query and none of the documents have changed, you get immediate results from the cache instead of having to wait for the server. And you won't have to pay for document reads for cache hits. So, if you aren't also caching your results, you're also paying the monetary cost of Cloud Functions and the query to Firestore for every request.
Yes, but the answer could be different based on the situation.
If a client wants to fetch a record exactly as it is in the database, the Firebase SDK might be faster because there is no overhead of calling Firebase Functions.
If we have heavy processing after fetching a record, then Firebase Functions + the Firebase Admin SDK could be faster, because the processing unit in Firebase Functions could be faster than a mobile CPU. However, if the initial response comes back quickly, the client app could display a message that something was fetched and is currently being processed while the heavy processing runs, so the user experience could still be acceptable.
The only case I can come up with where Firebase Functions could always win is when the server reduces the data size, so that the overhead introduced by Firebase Functions (including processing time) is compensated by the shorter network transfer. This also has the advantage of saving the client's data plan.
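For context, this is roughly the wrapping pattern the question describes: a callable function that queries Firestore and returns only a trimmed object. The orders collection and its fields are hypothetical.

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Returns only the fields the client screen actually needs, trading the
// extra hop (and the loss of the client SDK's local cache) for a smaller payload.
export const getOrderSummary = functions.https.onCall(
  async (data: { orderId: string }) => {
    const snap = await admin.firestore().collection("orders").doc(data.orderId).get();
    if (!snap.exists) {
      throw new functions.https.HttpsError("not-found", "No such order.");
    }
    const { status, total, updatedAt } = snap.data() as {
      status: string;
      total: number;
      updatedAt: number;
    };
    return { status, total, updatedAt }; // trimmed response
  }
);
```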

How to host daemon process that aperiodically updates firebase database?

I have so far been very impressed with the firebase platform for hosting a client-side single page app and for data storage. However, I have one component that I don't know where to host...
I want to have a background process that aperiodically updates the database. The nature of when an update is needed is based on an external source and, although the general timeframe of when updates are available is known, the exact timing is not. My thinking was to have a background task running that has some smarts to determine when an update is needed, and then trigger an update at that time.
I don't know where I would host something like this. I considered running it in a loop in a Firebase function, but with the pricing model being based on time, that would get very expensive, and functions are not suited for daemon-type processes. The actual "database update" would be suitable for a function, but not the triggering logic. Also, I have seen functions-cron, which does offload the triggering logic, but since my updates are not truly periodic, it doesn't seem exactly appropriate. I haven't looked too much into App Engine and how it relates to the Firebase platform...so basically my question:
What are the options for "reasonably-priced" hosting an always-running background task?
Google App Engine Standard is something you want to look at more. It is reasonably priced, since what you are doing will likely fit into GAE Standard's free daily quota. In GAE Standard, you create a scheduled cron job: GAE will call your task as if it were an incoming web request.
See Firebase doc for integrating with GAE
See GAE doc for cron jobs
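A minimal sketch of such a cron-triggered task on GAE Standard, using the Node.js runtime with Express; the path, schedule, and update-checking helpers are hypothetical.

```ts
import express from "express";

const app = express();

// GAE Standard calls this path on the schedule defined in cron.yaml, e.g.:
//   cron:
//   - description: "check external source for updates"
//     url: /tasks/check-updates
//     schedule: every 10 minutes
// Cron requests carry the X-Appengine-Cron header, which App Engine strips
// from external traffic, so it can be trusted as "this came from cron".
app.get("/tasks/check-updates", async (req, res) => {
  if (req.header("X-Appengine-Cron") !== "true") {
    res.status(403).send("Forbidden");
    return;
  }

  // checkExternalSource() and applyDatabaseUpdate() are hypothetical:
  // decide whether the external source has new data, then write to Firebase.
  // if (await checkExternalSource()) {
  //   await applyDatabaseUpdate();
  // }
  res.status(200).send("ok");
});

app.listen(Number(process.env.PORT) || 8080);
```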

An alternative to Firebase Functions? Is it okay to run them on a VM?

I am using Firebase Functions for an Uber-like product. I can't get the expected performance. In particular, it takes a long time to load data from the Realtime Database - up to 2-3 seconds for a read.
It may be due to cold starts, which are discussed here: Why is Cloud Functions for Firebase taking 25 seconds?
So I decided to move the functionality of these functions to a VM instance. Using Firebase onWrite listeners and the Admin SDK, similar functionality can be achieved on a virtual machine.
Is it okay to do so? Will I get any scalability issue?
It is definitely possible to run similar code on your own hardware/VM. In fact, that is how many of Firebase's own back-end processes ran before Cloud Functions was available.
What you'll miss is the auto-scaling of Cloud Functions, though. Your machine/VM will always be running and has a limited capacity (how much it can handle). Unlike Cloud Functions, it has a fixed capacity.
Cloud Functions, on the other hand, scales down to 0 when there are no requests and scales up to meet demand as needed. Whether that is needed for your use case, only you can determine.
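For illustration, a long-running listener on the VM using the Admin SDK might look like this; the database URL and the rides node are hypothetical placeholders.

```ts
import * as admin from "firebase-admin";

// Runs as a long-lived process on the VM instead of as a Cloud Function.
// Assumes application default credentials are available on the instance.
admin.initializeApp({
  databaseURL: "https://your-project-default-rtdb.firebaseio.com",
});

admin
  .database()
  .ref("rides")
  .on("child_added", (snapshot) => {
    // Same kind of work an onWrite/onCreate Cloud Function would do,
    // but capacity is fixed to whatever this single VM can handle.
    console.log("New ride request:", snapshot.key, snapshot.val());
  });
```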
