Firebase access latency

We have an issue with Firebase access latency.
We have a Node.js application that uses the Firebase SDK.
The code makes the following sequence of requests to Firebase:
1. Get keys via GeoFire.
2. Then fetch several (from 3 to 100) branches by those keys, in parallel.
3. Then fetch the same number of branches from another entity by the same keys, in parallel.
In JavaScript, the parallel requests look like this:
const cardsRefs = keys.map((key) => this.db.ref('cards').child(key));
return Promise.all(
  cardsRefs.map((ref) =>
    ref
      .once('value')
      .then((snapshot) => snapshot.val())
  )
);
That's all; not a big workload, I think.
But it can take a lot of time, and the latency is unpredictable: sometimes it's 700 ms, and sometimes it's much more (up to 5000 ms). We expected more predictable behaviour from Firebase at this point.
At first we thought the unstable latency was caused by a poorly chosen Google Cloud location (the code ran on Google Compute Engine). We tried us-west1 and us-central1. Please point us to the right datacenter location.
But then we rewrote the code as a Google Cloud Function and got the same result.
Please tell us how we can get more predictable and consistent latency on Firebase requests.

We migrated our backend functions to Cloud Functions and the situation has improved. Each function now runs roughly 1.5 to 2 seconds faster. For the moment, we are satisfied with this.

Related

Firebase One-time Functions

I'm sure these are common scenarios, but after researching for some hours, I couldn't really find what the common practice is. Maybe someone with more experience in Firebase can point me in the right direction.
I have two particular scenarios:
1. Code that runs once
Example 1: adding new data to all users in Firestore, which is needed for a new feature
Example 2: starting to duplicate data into existing documents
I currently write the code in a cloud function, run it on a Firestore event (onUpdate of a "hidden" document), and then immediately delete the function if everything goes well (a sketch of this pattern follows below).
I also increase the timeout and memory for this function, as the idea is to potentially update millions of documents.
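For illustration, a minimal sketch of what that one-off function looks like; the migrations/run trigger path and the users collection are placeholders, not part of my actual setup:
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

// One-off migration with raised limits, fired by updating a "hidden" document.
exports.runMigration = functions
  .runWith({ timeoutSeconds: 540, memory: '2GB' })
  .firestore.document('migrations/run')
  .onUpdate(async () => {
    const snap = await admin.firestore().collection('users').get();
    for (const doc of snap.docs) {
      // ... apply the migration to each document ...
    }
  });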
2. Manually trigger a function from the Firebase console (or command line)
Example: giving a user admin privileges in the app (a function that sets custom claims and Firestore data). We don't have time to implement a back office, so doing this from the Firebase web portal/console would be ideal, specifying the user id.
My current solution is to use an HTTPS function and run it from the GCP portal (on the function's "Testing" tab, which lets me pass a JSON payload). BUT the function can be triggered publicly, which I don't really like...
What are the common practices for these scenarios?
To expand on my comment: if you want to create a Node script to run one-off code, you just write your JS code as you would for any cloud function, but run it directly. Something like this:
const admin = require('firebase-admin');

admin.initializeApp();

const db = admin.firestore();

db.collection('users')
  .where('myField', '==', true)
  .get()
  .then((querySnapshot) => {
    querySnapshot.docs.forEach((doc) => {
      // update doc
    });
  });
If you save this as script.js and execute it with node script.js, you'll be pointed towards downloading a JSON file with credentials. Follow the instructions, run the script again, and now you're running your own code against Firestore from the command line.
For administrative-type operations, you're better off just running them on your desktop or some other server you control. Cloud Functions is not well suited for long-running operations, or for things that must happen just once on demand.
Case 1 really should be managed by a standalone program or script that you can monitor by running it on your desktop.
Case 2 can be done a number of ways, such as building your own admin web site. But you might find it easiest to mirror the contents of a document into custom claims using a Firestore trigger (sketched below). Read this: https://medium.com/firebase-developers/patterns-for-security-with-firebase-supercharged-custom-claims-with-firestore-and-cloud-functions-bb8f46b24e11
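A minimal sketch of that mirroring pattern; the admins/{uid} collection and the admin flag are placeholder names, not taken from the linked article:
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

// Mirror an 'admins/{uid}' document into that user's custom claims.
exports.mirrorAdminClaims = functions.firestore
  .document('admins/{uid}')
  .onWrite((change, context) => {
    // Clear the claim if the document was deleted; otherwise mirror it.
    const isAdmin = change.after.exists && change.after.data().admin === true;
    return admin.auth().setCustomUserClaims(context.params.uid, { admin: isAdmin });
  });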

Firestore Deadline Exceeded Node

I would like to load a collection of ~30k records, i.e. load it via:
const db = admin.firestore();
let documentsArray: Array<{}> = [];

db.collection(collection)
  .get()
  .then((snap) => {
    snap.forEach((doc) => {
      documentsArray.push(doc);
    });
  })
  .catch((err) => console.log(err));
This always throws a Deadline Exceeded error. I have searched for some mechanism that would allow me to paginate through it, but I find it hard to believe that I can't query this fairly modest amount of data in one go.
I thought that my rather slow machine might be the reason I was hitting the limit, but then I deployed a simple Express app to App Engine to do the fetching and still had no luck.
Alternatively, I could export the collection with gcloud beta firestore export, but it does not provide JSON data.
I'm not sure about Firestore, but on Datastore I was never able to fetch that much data in one shot; I'd always have to fetch pages of about 1000 records at a time and build them up in memory before processing. You said:
I have searched for some mechanism that would allow me to paginate through it
Perhaps you missed this page:
https://cloud.google.com/firestore/docs/query-data/query-cursors
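A minimal sketch of that approach with the Admin SDK; the fetchAll helper and the page size of 1000 are my own choices, not a documented limit:
const admin = require('firebase-admin');

admin.initializeApp();

const db = admin.firestore();

// Fetch a whole collection in pages, using the last document of each
// page as the cursor for the next query.
async function fetchAll(collection) {
  const documents = [];
  let last = null;
  while (true) {
    let query = db.collection(collection).orderBy('__name__').limit(1000);
    if (last) {
      query = query.startAfter(last);
    }
    const snap = await query.get();
    if (snap.empty) {
      break;
    }
    documents.push(...snap.docs);
    last = snap.docs[snap.docs.length - 1];
  }
  return documents;
}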
In the end, the issue was that the machine processing the 30k records from Firestore was not powerful enough to fetch the data in time. Solved by using a GCE n1-standard-4 instance.

How can I have a continuous Firebase cloud function for a continuous stream of data?

I need to use the Twitter Stream API to stream tweet data to my Firebase cloud function, like this:
client.stream('statuses/filter', params, (stream) => {
  stream.on('data', (tweet) => {
    console.log(tweet);
  });
  stream.on('error', (error) => {
    console.log(error);
  });
});
The stream is continuous, but the Firebase cloud function shuts down after a certain period of time. What solution could I use to continuously receive the stream data?
Cloud Functions have a max running time of 540 seconds, as documented. You would probably have to look at using a Compute Engine instance from Google Cloud, where your code can run without such limits. Or you could use Google Cloud Scheduler to run your function every so often to fetch new tweets, as sketched below.
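A minimal sketch of the scheduler approach; the schedule, the search query, and the credentials are placeholders, and it assumes the same twitter client package from the question:
const functions = require('firebase-functions');
const Twitter = require('twitter');

// Credentials elided; the same client setup as in the question is assumed.
const client = new Twitter({ /* consumer_key, consumer_secret, ... */ });

// Poll for recent tweets on a schedule instead of holding a stream open.
exports.pollTweets = functions.pubsub
  .schedule('every 5 minutes')
  .onRun(async () => {
    const tweets = await client.get('search/tweets', { q: 'firebase' });
    console.log(`Fetched ${tweets.statuses.length} tweets`);
  });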
The accepted response suggests running GCE, and while it's certainly correct, I'd like to point out that anyone who was interested in Cloud Functions - a serverless solution - might find GAE (App Engine) much more viable for streaming data.
Our application utilises App Engine Standard as an ingestion service and it works like a charm, removing the overhead that GCE requires. If your app requires advanced networking features, App Engine Flexible or GKE (Kubernetes Engine) might also be worth a look!

Firebase: First write is slow

I'm currently developing a hybrid mobile app using Ionic. When the app starts up and a user writes to the Realtime Database for the first time, the write is always delayed by around 10 seconds or more. But any subsequent writes are almost instantaneous (less than 1 second).
I measure the delay by watching the database in the Firebase console.
Is this a known issue, or am I doing something wrong? Please share your views.
EDIT:
The write happens via a Firebase Cloud Function.
This is the call to the Firebase Cloud Function:
this.http.post(url + "/favouritesAndNotes", obj, this.httpOptions)
  .subscribe((data) => {
    console.log(data);
  }, (error) => {
    console.log(error);
  });
This is the actual function:
app.post('/favouritesAndNotes', async (request, response) => {
  const db = admin.database().ref("users/" + request.body.uid);
  const favourites = request.body.favourites;
  const notes = request.body.notes;
  const writes = [];
  if (favourites !== undefined) {
    writes.push(db.child("favourites").set(favourites));
  }
  if (notes !== undefined) {
    writes.push(db.child("notes").set(notes));
  }
  // Wait for the writes to finish before responding; in Cloud Functions,
  // work still pending after the response is sent may never complete.
  await Promise.all(writes);
  console.log("Write successful");
  response.status(200).end();
});
The first time you interact with the Firebase Database in a client instance, the client/SDK has to do quite a few things:
1. If you're using authentication, it needs to check if the token that it has is still valid, and if not refresh it.
2. It needs to find the server that the database is currently hosted on.
3. It needs to establish a web socket connection.
Each of these may take multiple round trips, so even if you're a few hundred ms from the servers, it adds up.
Subsequent operations from the same client don't have to perform these steps, so are going to be much faster.
If you want to see what's actually happening, I recommend checking the Network tab of your browser. For the realtime database specifically, I recommend checking the WS/Web Socket panel of the Network tab, where you can see the actual data frames.
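As a hedged aside (not from the answer above, just a common workaround): you can trigger the handshake at app startup with a trivial read, so the first real write doesn't pay for it. This sketch assumes the v8-style client SDK is already initialised; .info/connected is a built-in Realtime Database path:
// Warm up the connection early; the token check, server lookup and web
// socket handshake all happen before the user's first write.
firebase.database().ref('.info/connected').once('value')
  .then(() => console.log('Realtime Database connection established'));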

On how many processes do Cloud Functions run?

I would like to use Google Cloud Functions to update an additional database living on Heroku, because the Firebase Realtime Database is not cutting it.
However, there is a limit on the number of concurrent connections that Heroku accepts, so I'm wondering how many of them will be opened via Cloud Functions? Is it 1? Is there no limit?
I have something like this in my functions:
import * as functions from 'firebase-functions';
import { Pool } from 'pg';

// Pool expects a config object, not a bare connection string.
const pool = new Pool({ connectionString });

exports.onAuth = functions.auth.user().onCreate((event) => {
  // Return the promise so the function waits for the query to finish.
  return pool.query(...);
});
Cloud Functions will spin up as many server instances as is required to meet the demand on your functions. It could be as low as 0 if there is no load, and much higher if there is a lot of load. Each instance could be doing work for your function. You can't constrain the number of instances for your functions - it just scales automatically on demand. So you can think of it as having no limit (as long as you are paying your bills).
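One hedged consequence for the Heroku connection limit: each live instance holds its own pool, so the total is roughly (number of instances) x (pool max). pg's Pool accepts a max option to cap connections per instance; the DATABASE_URL environment variable below is a placeholder:
import { Pool } from 'pg';

const pool = new Pool({
  connectionString: process.env.DATABASE_URL, // placeholder
  max: 1, // at most one open connection per function instance
});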
