Firebase REST API persistent connection, speed optimisation

Retrieving multiple documents in a loop with the Firebase JavaScript library is almost as fast as retrieving one document, whether that is thanks to the websocket or something else.
Doing the same with the REST API is linearly slow: each request takes a bit less than one second, and 10 GET requests take about 9 seconds on my machine. Setting the 'Connection' header to 'keep-alive' does not improve the speed.
Given the quote below from the Firebase docs, I'd like to know how one can optimise the speed of multiple lookup requests via the REST API.
Is it really okay to look up each record individually? Yes. The Firebase protocol uses web sockets, and the client libraries do a great deal of internal optimization of incoming and outgoing requests. Until we get into tens of thousands of records, this approach is perfectly reasonable. In fact, the time required to download the data (i.e. the byte count) eclipses any other concerns regarding connection overhead.

You can use the Firebase JavaScript SDK in Node.js for backend functions to take advantage of sockets.
For example, I'm using the JavaScript API to send an SMS on a child_added event in Firebase.
var Firebase = require('firebase');
var ref = new Firebase('YOUR_FIREBASE_URL_REFERENCE');
ref.authWithCustomToken("YOUR_SECRET_TOKEN", function(error, authData) {
  if (error) {
    console.log("Authentication Failed!", error);
  } else {
    console.log("Authenticated successfully with payload:", authData);
  }
});
var messageRef = new Firebase('https://cozywait.firebaseio.com/messages');
messageRef.orderByChild('status').equalTo('requested').on('child_added', function(snapshot) {
  console.log('Message notification sending to ', snapshot.val().number);
  initSendSMS(snapshot);
});
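For the multi-document lookup in the original question, a minimal sketch with the same legacy SDK (the /items path and the IDs are hypothetical) shows why a loop of reads stays fast: every read shares the SDK's single socket connection.
// Hypothetical path and IDs, for illustration only: all of these reads go
// over one websocket connection, so the total time is roughly one round
// trip plus download time rather than one round trip per request.
var itemsRef = new Firebase('YOUR_FIREBASE_URL_REFERENCE/items');
['id1', 'id2', 'id3'].forEach(function(id) {
  itemsRef.child(id).once('value', function(snapshot) {
    console.log(id, snapshot.val());
  });
});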


How to continue running Firebase Cloud Function after request is finished

Really bizarre that Firebase doesn't seem to work quite like a typical Express app. Whatever I write in Express and copy-paste into Firebase Functions typically throws an error. There is one problem I can't figure out on my own, though.
This endpoint is designed to start a function and live long enough to finish an even longer task. The request is a webhook (send docs, we will transform them and ping you at another specified webhook when it's done). A very simplified example below:
router.post('/', (req, res) => {
  try {
    generateZipWithDocuments(data) // on purpose it's not async so request can return freely
    res.sendStatus(201)
  } catch (error) {
    res.send({ error })
  }
})
On my local machine it works (both as a pure Express app and as locally emulated Firebase Functions), but in the cloud it has problems, and even though I put in a cavalcade of console.log() calls I don't get much information. No error from Firebase.
If generateZipWithDocuments() is not asynchronous, res.sendStatus() will be executed immediately after it, and the Cloud Function will be terminated (and the job done by generateZipWithDocuments() will not be completed). See the doc here for more details.
You have two possibilities (both sketched below):
You make it asynchronous and you wait until its job is completed before sending the response. You would typically use async/await for that. Note that the maximum execution time for a Cloud Function is 9 minutes.
You delegate the long-running job to another Cloud Function and then send the response. For delegating the job to another Cloud Function, you should use Pub/Sub. See Pub/Sub triggers, the sample quickstart, and this SO thread for more details on how to implement that. In the Pub/Sub-triggered Function, when the job is done, you can inform the user via an email, a notification, the update of a Firestore document on which you have set a listener, etc. If generateZipWithDocuments() takes a long time, this is clearly the most user-friendly option.
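Rough sketches of both options, keeping the router, data and generateZipWithDocuments names from the question and assuming generateZipWithDocuments() returns a Promise; functions refers to the usual firebase-functions import, and the Pub/Sub topic name generate-zip is made up for the example.
// Option 1: await the work, then respond.
router.post('/', async (req, res) => {
  try {
    await generateZipWithDocuments(data); // respond only once the zip is done
    res.sendStatus(201);
  } catch (error) {
    res.status(500).send({ error: error.message });
  }
});

// Option 2: publish the job to Pub/Sub and respond immediately;
// the heavy work runs in a separate, Pub/Sub-triggered function.
const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

router.post('/', async (req, res) => {
  await pubsub.topic('generate-zip').publishMessage({ json: { data } });
  res.sendStatus(202); // accepted, processing happens elsewhere
});

exports.generateZip = functions.pubsub.topic('generate-zip').onPublish(async (message) => {
  await generateZipWithDocuments(message.json.data);
  // e.g. update a Firestore document here so the client knows the job is done
});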

If I implement onSnapshot real-time listener to Firestore in Cloud Function will it cost more?

I have a listener on Firestore DB changes and it fetches automatically every time there is a change. However, if I decide to implement it in a Cloud Function and call it from the client app, will it cost more because it will be running 24/7 even when users are not using the app?
This is in Client side:
firestore()
  .collection('Collection').doc().collection('public')
  .where('act', '==', 1)
  .orderBy('time', 'asc')
  .limit(10)
  .onSnapshot({
    error: (e) => this.setState({ errorMessage: e, loading: false }),
    next: (querySnapshot) => { this._calculateLocationDistance(querySnapshot) },
  });
Moreover, is it necessary to do it in a Cloud Function? Is it risky if I leave it on the client side?
You can't really use listeners effectively in Cloud Functions. Cloud Functions are meant to be stateless. They serve a single request at a time, and clean up afterward. If you try to use a listener, it just won't work the way you expect. Cloud Functions also don't keep a socket open to the requester. Once a response is sent, the connection is closed, and there's no way to keep it open.
Given these constraints, functions typically just use get() to fetch data a single time, and return the results to the client. If you want realtime results, that should be implemented on the client.
If you are working with a backend that can keep a socket connection open to a client, it is no less expensive to have a listener on the backend that delivers results to the client. You are still charged a document read for each document read by the listener as it continues to receive results.
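A minimal sketch of that one-time fetch pattern, using the Admin SDK in a callable function; the collection-group query over public and the function name getPublicItems are illustrative assumptions, not taken from the question's exact data model.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.getPublicItems = functions.https.onCall(async () => {
  // Single read; no listener stays open after the response is sent.
  const querySnapshot = await admin.firestore()
    .collectionGroup('public')
    .where('act', '==', 1)
    .orderBy('time', 'asc')
    .limit(10)
    .get();
  return querySnapshot.docs.map((doc) => ({ id: doc.id, ...doc.data() }));
});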

Firebase: First write is slow

Currently developing a hybrid mobile app using Ionic. When the app starts up and a user writes to the Realtime Database for the first time, the write is always delayed by around 10 or more seconds, but any subsequent writes are almost instantaneous (less than 1 second).
My estimate of the delay is based on watching the database in the Firebase console.
Is this a known issue, or am I doing something wrong? Please share your views.
EDIT:
The write happens via a Firebase Cloud Function.
This is the call to the Cloud Function:
this.http.post(url + "/favouritesAndNotes", obj, this.httpOptions)
  .subscribe((data) => {
    console.log(data);
  }, (error) => {
    console.log(error);
  });
This is the actual function
app.post('/favouritesAndNotes', (request, response) => {
  var db = admin.database().ref("users/" + request.body.uid);
  var favourites = request.body.favourites;
  var notes = request.body.notes;
  if (favourites !== undefined) {
    db.child("favourites/").set(favourites);
  }
  if (notes !== undefined) {
    db.child("notes/").set(notes);
  }
  console.log("Write successful");
  response.status(200).end();
});
The first time you interact with the Firebase Database in a client instance, the client/SDK has to do quite a few things:
If you're using authentication, it needs to check if the token that it has is still valid, and if not refresh it.
It needs to find the server that the database is currently hosted on.
It needs to establish a web socket connection.
Each of these may take multiple round trips, so even if you're a few hundred ms from the servers, it adds up.
Subsequent operations from the same client don't have to perform these steps, so are going to be much faster.
If you want to see what's actually happening, I recommend checking the Network tab of your browser. For the realtime database specifically, I recommend checking the WS/Web Socket panel of the Network tab, where you can see the actual data frames.
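If you would rather see this from the SDK side than from the Network tab, the database SDK can also log its own connection handshake and operations. A minimal sketch, assuming the Admin SDK used in the Cloud Function above; enableLogging must be called before the first database operation.
const admin = require('firebase-admin');
admin.initializeApp();

// Print the websocket connection setup and every read/write to the logs.
admin.database.enableLogging(true);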

GCF HTTP Request, Error: quota exceeded (CPU allocation in function invocations : per day)

I am currently working with Firebase Cloud Functions, doing an HTTP request through Functions. The HTTP request is made by a 3G module, and I need to always pick up a value change in the database.
This system is to be used in an application where, as soon as there is a change in the DB, I should notify the 3G module, so currently I am doing it with an HTTP request.
exports.moduleRequest = functions.https.onRequest((req, res) => {
  var change = admin.database().ref('/userInfo');
  // Once there is a change in any userInfo child, do something
  change.once('child_changed', (snapshot) => {
    res.send(snapshot.val());
  });
});
This works perfectly fine; the problem is that I leave the HTTP request open until there is a change in the DB, so it exhausts the quota provided by Firebase in about 60 minutes.
Error: quota exceeded (CPU allocation in function invocations : per day); check and increase your quota at https://console.cloud.google.com/iam-admin/quotas?project=pass-e098f&service=cloudfunctions.googleapis.com&usage=ALL. Function killed.
Do you know if there is another approach to get this system working?
I found that the easiest and best way to solve my issue is to use the REST API, as it lets me do streaming through an HTTP GET request. My SIMCOM SIM5320 3G module acts as the client, and the server then sends an event with the database update at the requested path.
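A rough sketch of that streaming GET from Node (the database hostname is a placeholder): the important part is the Accept: text/event-stream header, which makes the Realtime Database REST API keep the connection open and push change events for the requested path.
const https = require('https');

const options = {
  hostname: 'YOUR_PROJECT.firebaseio.com', // placeholder database host
  path: '/userInfo.json',
  headers: { Accept: 'text/event-stream' }, // request a Server-Sent Events stream
};

https.get(options, (res) => {
  // Note: the server may first answer with a redirect that the client should follow.
  res.setEncoding('utf8');
  res.on('data', (chunk) => {
    // Each event block describes a put or patch at or below /userInfo.
    console.log(chunk);
  });
});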

Emulate RPC with Firebase

I am using Firebase to monitor machines across the building.
So the architecture is multiple front-ends and multiple machines.
At certain moments I want to be able to trigger actions on these machines, such as:
take screenshot and put to ftp
encode a certain video file
analyze a large data-set
I am used to ActionScript, where there are NetConnection and Client objects on which one can invoke remote methods.
Is there something similar in Firebase?
How would you implement such a feature easily ?
I thought of having a message box, using an Array, where a message could be a data structure like:
{
  'client_id': 'xxx-yyy-zzz',
  'name': 'takeScreenshot',
  'body': { 'creator': 'my-name' },
  'timestamp': 1406214344
}
How it might work
a method call is a message entering this message box array
listen with value_changed on this message box
pop the item from the array (this will trigger another value_changed)
use the item to perform an async operation
when the async operation is done, use the client_id to notify the invoker about the operation
But to implement this correctly a lot of work must be done. Does anyone know if there is an easy way to achieve this kind of functionality?
Since Firebase is a powerful backend service, scalable, and has a RESTful API in addition to SDKs (not yet for Python, unfortunately), it generally makes the most sense to just use it directly, rather than fashioning API services on top of it.
One fast and effective way to do this is to utilize a queue approach. Have each client write data into an in/ path, and have the recipient of the event listen for child_added on that path. Then perform the remote invocation, and write data back to an out/ path for the requesting client.
client
// send it
var ref = new Firebase(QUEUE_URL);
var request = ref.child('in').push( requestData );
// wait for a reply and remove after processing
ref.child('out/' + request.name()).on('value', function(snap) {
  if (snap.val() !== null) {
    console.log(snap.val());
    request.remove();
    // stop listening
    snap.ref().off();
  }
});
remote service
var ref = new Firebase(QUEUE_URL);
// listen for queue events
ref.child('in').on('child_added', function(snap) {
  /*
    ... process queue event ...
  */
  doneProcessing(snap, resultData);
});

function doneProcessing(snap, results) {
  ref.child('out/' + snap.name()).set(results);
  snap.ref().remove();
}
