Access all children on node and update them for Firebase Function - firebase

I detected some recursion on one of the nodes of my Realtime Database and I want to delete (or set to null) that specific node. This is my Firebase function so far:
exports.cleanForms = functions.https.onRequest((req, res) => {
  const parentRef = admin.database().ref("forms");
  return parentRef.once('value').then(snapshot => {
    snapshot.forEach(function(child) {
      admin.database().ref('forms/' + child.key + '/user/forms').set(null);
    });
  });
});
Basically it should iterate over all the records inside the forms node and delete each record's user/forms property.
But calling the function at this URL: https://.cloudfunctions.net/cleanForms gives me this error:
Error: could not handle the request
And this is what I see on the logs:
10:47:57.818 PM cleanForms Function execution took 13602 ms, finished
with status: 'connection error'
The forms node has fewer than 3,000 records, but as I mentioned before, it has some recursion in it. I don't know if it is failing because of its size or something related to that.

You are using an HTTPS Cloud Function; therefore you must "send a response to the client at the end" (watch this official video by Doug Stevenson for more detail: https://youtu.be/7IkUgCLr5oA).
In your case, the "end" of the function is when ALL of your asynchronous set() operations are done. Since the set() method returns a Promise, you have to use Promise.all() (again, watch this official video: https://youtu.be/d9GrysWH1Lc).
So the following should work (not tested, however):
exports.cleanForms = functions.https.onRequest((req, res) => {
  const parentRef = admin.database().ref("forms");
  parentRef.once('value')
    .then(snapshot => {
      const promises = [];
      snapshot.forEach(child => {
        promises.push(admin.database().ref('forms/' + child.key + '/user/forms').set(null));
      });
      return Promise.all(promises);
    })
    .then(results => {
      res.send({result: results.length + ' node(s) deleted'});
    })
    .catch(error => {
      res.status(500).send(error);
    });
});

Related

Firebase cloud functions - update a different object within OnUpdate cloud trigger

Assume there is a collection of users and each user is associated with accounts, which are kept in a separate collection. For each account there is a balance which is updated periodically by some external means (e.g. the HTTP trigger below). I need to be able to query for the user's total balance across all of her accounts.
I added an onUpdate trigger which gets called every time an account changes and updates the total accordingly. However, it seems that there is some race condition, e.g. when two accounts get updated around the same time: after onUpdate is called for the first account and updates the total balance, it is still not updated when onUpdate is called for the second account. I'm guessing I need to somehow use a "transaction" for the bookkeeping, but I'm not sure how.
const data = {
  'users/XXX': {
    email: "a#b.com",
    balance: 0
  },
  "accounts/YYY": {
    title: "Acc1",
    userID: "XXX",
    balance: 0
  },
  "accounts/ZZZ": {
    title: "Acc2",
    userID: "XXX",
    balance: 0
  }
};
exports.updateAccounts = functions.https.onRequest((request, response) => {
  admin.firestore().collection('accounts').get().then((accounts) => {
    accounts.forEach((account) => {
      return admin.firestore().collection('accounts').doc(account.id).update({balance: WHATEVER});
    });
    response.send("Done");
  });
});
exports.updateAccount = functions.firestore
  .document('accounts/{accountID}')
  .onUpdate((change, context) => {
    const userID = change.after.data().userID;
    admin.firestore().doc("users/" + userID).get().then((user) => {
      const new_balance = change.after.data().balance;
      const old_balance = change.before.data().balance;
      var user_balance = user.data().balance + new_balance - old_balance;
      admin.firestore().doc("users/" + userID).update({balance: user_balance});
    });
  });
Looking at your code, we can see several parts that could lead to incorrect results. Without thoroughly testing and reproducing your problem it is not possible to be 100% sure that correcting them will totally solve it, but they are most probably the cause of the problems.
HTTP Cloud Function:
With the forEach() loop you trigger several asynchronous operations (the update() method), but you don't wait until all of these asynchronous operations have completed before sending back the response. You should do as follows, using Promise.all() to wait until all the asynchronous methods have completed before sending the response:
exports.updateAccounts = functions.https.onRequest((request, response) => {
  const promises = [];
  admin.firestore().collection('accounts').get()
    .then(accounts => {
      accounts.forEach((account) => {
        promises.push(admin.firestore().collection('accounts').doc(account.id).update({balance: WHATEVER}));
      });
      return Promise.all(promises);
    })
    .then(() => {
      response.send("Done");
    })
    .catch(error => {....});
});
onUpdate background triggered Cloud Function
There you need to correctly return the promise chain in order to indicate to the platform when the Cloud Function is complete. The following should do the trick:
exports.updateAccount = functions.firestore
  .document('accounts/{accountID}')
  .onUpdate((change, context) => {
    const userID = change.after.data().userID;
    return admin.firestore().doc("users/" + userID).get() // Note the return here. (In the HTTP Cloud Function we don't need it! See the link to the video series below.)
      .then(user => {
        const new_balance = change.after.data().balance;
        const old_balance = change.before.data().balance;
        var user_balance = user.data().balance + new_balance - old_balance;
        return admin.firestore().doc("users/" + userID).update({balance: user_balance}); // Note the return here.
      });
  });
I would suggest that you watch the 3 videos about "JavaScript Promises" from the Firebase video series: https://firebase.google.com/docs/functions/video-series/. They explain all the key points that were corrected above.
At first sight, it seems that if you modify, in the updateAccounts Cloud Function, several account documents that share the same user you will indeed need to implement the user balance update in a transaction, as several instances of the updateAccount Cloud Function may be triggered in parallel. The doc on Transactions is here.
Update:
You could implement a transaction as follows in the updateAccount Cloud Function (untested):
exports.updateAccount = functions.firestore
  .document('accounts/{accountID}')
  .onUpdate((change, context) => {
    const userID = change.after.data().userID;
    const userRef = admin.firestore().doc("users/" + userID);
    return admin.firestore().runTransaction(transaction => {
      // This code may get re-run multiple times if there are conflicts.
      return transaction.get(userRef).then(userDoc => {
        if (!userDoc.exists) {
          throw "Document does not exist!";
        }
        const new_balance = change.after.data().balance;
        const old_balance = change.before.data().balance;
        var user_balance = userDoc.data().balance + new_balance - old_balance;
        transaction.update(userRef, {balance: user_balance});
      });
    }).catch(error => {
      console.log("Transaction failed: ", error);
      return null;
    });
  });
In addition to what @Renaud Tarnec covered in their answer, you may also want to consider the following approaches:
Batched Write
In your updateAccounts function you are writing many pieces of data at once; if any one of these writes fails, you may end up with a database that contains a mix of correctly updated data and data that failed to be updated.
To solve this, you can use a batched write to write the data atomically: either all the new data is written successfully or none of it is, leaving your database in a known state.
exports.updateAccounts = functions.https.onRequest((request, response) => {
  const db = admin.firestore();
  db.collection('accounts')
    .get()
    .then((qsAccounts) => { // qs -> QuerySnapshot
      const batch = db.batch();
      qsAccounts.forEach((accountSnap) => {
        batch.update(accountSnap.ref, {balance: WHATEVER});
      });
      return batch.commit();
    })
    .then(() => response.send("Done"))
    .catch((err) => {
      console.log("Error whilst updating balances via HTTP Request:", err);
      response.status(500).send("Error: " + err.message);
    });
});
Splitting the counters
Instead of storing a single "balance" in your user document, it may be desirable (based on what you are trying to do) to store each account's balance separately in the user's document.
"users/someUser": {
...,
"balances": {
"accountId1": 10,
"accountId4": -20,
"accountId23": 5
}
}
If you need the cumulative balance, just add the entries together on the client (see the sketch below). If you need to remove a balance, simply delete its entry in the user document.
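A rough client-side sketch of that summation (the totalBalance helper and the userDoc snapshot are illustrative, not part of the original answer):
// Sum the per-account balances stored in the user's "balances" map.
// 'userDoc' is assumed to be a DocumentSnapshot of the user document shown above.
function totalBalance(userDoc) {
  const balances = userDoc.data().balances || {};
  return Object.values(balances).reduce((sum, b) => sum + b, 0);
}
The onUpdate trigger below then only has to write the single entry for the account that changed: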
exports.updateAccount = functions.firestore
  .document('accounts/{accountID}')
  .onUpdate((change, context) => {
    const db = admin.firestore();
    const accountID = context.params.accountID;
    const newData = change.after.data();
    const accountBalance = newData.balance;
    const userID = newData.userID;
    return db.doc("users/" + userID)
      .get()
      .then((userSnap) => {
        return db.doc("users/" + userID).update({["balances." + accountID]: accountBalance});
      })
      .then(() => console.log(`Successfully updated account #${accountID} balance for user #${userID}`))
      .catch((err) => {
        console.log(`Error whilst updating account #${accountID} balance for user #${userID}`, err);
        throw err;
      });
  });

await response of image upload before continue function

So I am working on an upload function for multiple images in an array. After a lot of struggling I have finally got my upload function to work, and the images are showing up in the Firebase database. However, I have yet to find a working way to make sure my upload function completes before continuing.
Below is the part where I am calling the upload function and trying to store the response in uploadurl; the uploadurl variable is later used in the dispatch function to store the URL along with other data.
try {
  uploadurl = await uploadImages();
  address = await getAddress(selectedLocation);
  console.log(uploadurl);
  if (!uploadurl.length) {
    Alert.alert('Upload error', 'Something went wrong uploading the photo, please try again', [
      { text: 'Okay' }
    ]);
    setIsLoading(true);
    return;
  }
  dispatch(
The image upload function is below. It works to the point that the images are uploaded; however, the .then call to get the download URL is not started correctly, and the .then(images => ...) callback is not working either.
uploadImages = () => {
  const provider = firebase.database().ref(`providers/${uid}`);
  let imagesArray = [];
  try {
    Promise.all(photos)
      .then(photoarray => {
        console.log('all responses are resolved succesfully')
        for (let photo of photoarray) {
          let file = photo.data;
          const path = "Img_" + uuid.v4();
          const ref = firebase
            .storage()
            .ref(`/${uid}/${path}`);
          var metadata = {
            contentType: 'image/jpeg',
          };
          ref.putString(file, 'base64', metadata).then(() => {
            ref
              .getDownloadURL()
              .then(images => {
                imagesArray.push({
                  uri: images
                });
                console.log("Out-imgArray", imagesArray);
              })
          })
        }
        return imagesArray
      })
  } catch (e) {
    console.error(e);
  }
};
So I want to return imagesArray AFTER all the photos are uploaded, so that imagesArray is then set as uploadurl in the first function. Only after all the image URLs are set in imagesArray and passed to uploadurl should my dispatch function, which uploads the rest of the data, continue. How can I make sure this happens as expected?
I have changed this so many times now, because I keep getting pointed to different ways of doing it, that I am completely at a loss as to how to continue :(
Most of your uploadImages() code was correct; however, in several places you didn't return the promise from each asynchronous action.
Quick sidestep: Handling many promises
When working with lots of asynchronous tasks based on an array, it is advised to map() the array to an array of Promises rather than use a for loop. This allows you to build an array of promises that can be fed to Promise.all() without the need to initialise and push to another array.
let arrayOfPromises = someArray.map((entry) => {
  // do something with 'entry'
  return somePromiseRelatedToEntry();
});

Promise.all(arrayOfPromises)
  .then((resultsOfPromises) => {
    console.log('All promises resolved successfully');
  })
  .catch((err) => {
    // an error in one of the promises occurred
    console.error(err);
  });
The above snippet will fail if any of the contained promises fail. To silently ignore individual errors or defer them to handle later, you just add a catch() inside the mapped array step.
let arrayOfPromises = someArray.map((entry) => {
  // do something with 'entry'
  return somePromiseRelatedToEntry()
    .catch(err => ({hasError: true, error: err})); // silently ignore errors for processing later
});
Updated uploadImages() code
Updating your code with these changes, gives the following result:
uploadImages = () => {
  const provider = firebase.database().ref(`providers/${uid}`);
  // CHANGED: removed 'let imagesArray = [];', no longer needed
  return Promise.all(photos) // CHANGED: return the promise chain
    .then(photoarray => {
      console.log('all responses are resolved successfully');
      // take each photo, upload it and then return its download URL
      return Promise.all(photoarray.map((photo) => { // CHANGED: used Promise.all(someArray.map(...)) idiom
        let file = photo.data;
        const path = "Img_" + uuid.v4();
        const storageRef = firebase // CHANGED: renamed 'ref' to 'storageRef'
          .storage()
          .ref(`/${uid}/${path}`);
        let metadata = {
          contentType: 'image/jpeg',
        };
        // upload current photo and get its download URL
        return storageRef.putString(file, 'base64', metadata) // CHANGED: return the promise chain
          .then(() => {
            console.log(`${path} was uploaded successfully.`);
            return storageRef.getDownloadURL() // CHANGED: return the promise chain
              .then(fileUrl => ({uri: fileUrl}));
          });
      }));
    })
    .then((imagesArray) => {                      // These lines can
      console.log("Out-imgArray: ", imagesArray); // safely be removed.
      return imagesArray;                         // They are just
    })                                            // for logging.
    .catch((err) => {
      console.error(err);
    });
};

Using a callable function to send data back to the client from Firebase

I have created a callable Cloud Function to read data from Firebase and send the results back to the client; however, only "null" is being returned to the client.
exports.user_get = functions.https.onCall((data, context) => {
  if (context.auth && data) {
    return admin.firestore().doc("users/" + context.auth.uid).get()
      .then(function (doc) {
        return { doc.data() };
      })
      .catch(function (error) {
        console.log(error);
        return error;
      })
  } return
});
I just reproduced your case, connecting from a Cloud Function to a Firestore database and retrieving data. As far as I can see, you are trying to access the field in the wrong way when you use "users/" + context.auth.uid; the method can't find the field, so it returns a null value.
I followed the "Quickstart using a server client library" documentation to populate a Firestore database and perform a get from it with Node.js.
After that I followed the "Deploying from the GCP Console" documentation in order to deploy an HTTP-triggered Cloud Function with the following code:
// 'firestore' is initialised elsewhere in the example, e.g.:
// const {Firestore} = require('@google-cloud/firestore');
// const firestore = new Firestore();
exports.helloWorld = (req, res) => {
  firestore.collection('users').get()
    .then((snapshot) => {
      snapshot.forEach((doc) => {
        console.log(doc.id, '=>', doc.data().born);
        let ans = {
          date: doc.data().born
        };
        res.status(200).send(ans);
      });
    });
};
And this is returning the desired field.
You can take a look at my entire example code here.
This is because you are making a query to a Firestore database; however, the cloud support team has made it very convenient to protect your applications from data leakage, and so in a callable function, as the name suggests, you can only return data you passed to that same callable function through the data parameter and nothing else. If you want to access a database, I suggest you use an onRequest function and use its endpoint to get your data. That way you not only protect your database but also avoid data and memory leakage.
Examples of what you can return from a callable function:
exports.sayHello = functions.https.onCall((data, context) => {
  const name = data.name;
  console.log(`hello ${name}`);
  return `It was really fun working with you ${name}`;
});
First create a function in your index.js file and accept data through the data parameter; but, as I said, you can only return data you passed in through the data parameter.
Now call the function.
This is in the frontend code (attach an event listener to a button or something and trigger it):
/* just say hello from Firebase */
callButton.addEventListener('click', () => {
  const sayHello = firebase.functions().httpsCallable('sayHello');
  sayHello({ name: 'someName' }).then(results => { // pass the data the callable expects
    console.log("results >>> ", results);
  });
});
You can get your data using an onRequest function like so:
/* get users */
exports.getAllUsers = functions.https.onRequest((request, response) => {
cors(request, response, () => {
const data = admin.firestore().collection("users");
const users = [];
data.get().then((snapshot) => {
snapshot.docs.forEach((doc) => {
users.push(doc.data());
});
return response.status(200).send(users);
});
});
});
Use fetch() in your frontend code to get the response of the new onRequest function; you can find the function's endpoint in your Firebase console dashboard (a sketch follows the cors line below).
Note that to hit the endpoint from your frontend code, you need to add CORS support to your Firebase Cloud Functions to allow access to the endpoint.
You can do that by adding this line to the top of the index.js file in your Firebase functions directory:
const cors = require("cors")({origin: true});
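A minimal fetch() sketch could then look like the following (the URL is a placeholder; use the URL shown for your function in the Firebase console):
// Hypothetical endpoint; replace with your deployed function's URL.
const url = "https://us-central1-<your-project>.cloudfunctions.net/getAllUsers";

fetch(url)
  .then((res) => res.json())
  .then((users) => {
    console.log("users >>> ", users);
  })
  .catch((err) => console.error(err));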

Unhandled Rejection in Google Cloud Functions

I got the following cloud function which works great.
It's listening for an update in the Realtime Database and updates Firestore accordingly.
Everything is fine, except when my user does not yet exist in my Firestore database.
This is where I need to deal with the unhandled rejection that I see in the Google Cloud Functions log.
So, see below in the shortened version of the function: for my db.collection("users").where("email", "==", email).get(), how do I stop the function from moving forward and prevent the crash?
exports.updateActivities = functions.database.ref("delegates/{userId}/activities").onWrite((event) => {
//Here I set all the needed variable
return rtdb.ref(`delegates/${userId}/email`).once("value", snapshot => {
//Here I'm fine, email is always present.
})
.then(() => {
db.collection("users").where("email", "==", email).get()
//This is where I need to handle when there is not matching value, to stop moving forward.
.then(querySnapshot => {
querySnapshot.forEach(val => {
console.log("Found match in FireStore " + val.id);
firestoreId = val.id;
})
})
.then(() => {
//Here I start my update on Firestore
});
})
});
You should use catch() on every promise that you return from your function that could be rejected. This tells Cloud Functions that you handled the error. The promise returned from catch() will be resolved.
Typically you log the error from catch() so you can see it in the console logs:
return somePromise
  .then(() => { /* do your stuff */ })
  .catch(error => { console.error(error) });
I used Bluebird with suppressUnhandledRejections to get around this issue:
import Promise from 'bluebird';

function pool_push(pool, promise)
{
  // In Google Cloud there is a chance of a crash at `unhandledRejection`.
  // The following trick basically tells Bluebird not to emit
  // `unhandledRejection`, since `.catch` will be attached
  // a bit later.
  const tmp = Promise.resolve(promise);
  tmp.suppressUnhandledRejections();
  pool.items.push(tmp);
}

export default pool_push;
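For context, a hypothetical usage could look like this (the snapshot and the Firestore update are illustrative; only the pool shape matches the function above):
// Collect several writes into a pool and await them all at the end
// of the Cloud Function.
const pool = { items: [] };

snapshot.forEach((doc) => {
  pool_push(pool, doc.ref.update({ processed: true }));
});

return Promise.all(pool.items);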

firebase admin failing to query large items

Using firebase-admin to retrieve data from a collection in a Cloud Function fails for large result sets. Sample code I am using to query the collection from the Cloud Function is as follows:
admin.database().ref('transactions').orderByChild('mmyyyy').equalTo(month).once('value');
This call fails when I try to retrieve 10,600 items (I'm trying to figure out why). In the Google console there is this log, but nothing else that can point me in the right direction:
textPayload: "Function execution took 18547 ms, finished with status: 'response error'"
After many failed attempts, I decided to try executing this call on the client using the Firebase SDK, as follows:
result = await firebase.database().ref(`transactions`).orderByChild('mmyyyy').equalTo(month).once('value');
This works perfectly on the client, without error, returning all of my items, 17,000 of them (the JSON is about 26 MB).
Why is this the case? Is there any limitation that is not documented?
Note:
I increased my Cloud Function's memory to 1 GB and its timeout to 5 minutes; it didn't help.
Here is the full sample code:
const admin = require('firebase-admin');
var functions = require('firebase-functions');
admin.initializeApp(functions.config().firebase);
const cors = require('cors')({
  "origin": "*",
  "methods": "POST,GET",
  "allowedHeaders": "Content-Type,uid,agence,month,course,raceType,raceNumber,status",
  "preflightContinue": false,
  "optionsSuccessStatus": 204
});

function _findTransactions(agence, month, course, raceType, raceNumber, status) {
  return new Promise((resolve, reject) => {
    try {
      let db = admin.database();
      let findPromise = db.ref(`transactions`).orderByChild('mmyyyy').equalTo(month).once('value');
      findPromise.then((result) => {
        let transactions = result.val();
        //removed business logic
        resolve(transactions);
      }).catch((err) => {
        console.log(err);
        reject(err);
      });
    } catch (error) {
      console.log(error);
      reject(error);
    }
  });
}
exports.findTransactions = functions.https.onRequest((req, res) => {
  let uid;
  try {
    cors(req, res, () => {
      uid = req.headers.uid;
      let agence = req.headers.agence;
      let month = req.headers.month;
      let course = req.headers.course;
      let raceType = req.headers.raceType;
      let raceNumber = req.headers.raceNumber;
      let status = req.headers.status;
      if (req.method !== 'GET') {
        return handleResponse(req, res, 403);
      }
      if (!uid || uid == null || uid == undefined) {
        return handleResponse(req, res, 401);
      }
      _validateUserId(uid, ['central_cashier', 'admin'])
        .then(() => {
          _findTransactions(agence, month, course, raceType, raceNumber, status)
            .then((result) => {
              return handleResponse(req, res, 200, result);
            }).catch((error) => {
              return handleResponse(req, res, 500);
            });
        }).catch((error) => {
          return handleResponse(req, res, 401);
        });
    });
  } catch (error) {
    return handleError(res, uid, error);
  }
});
Your payload is too large and is exceeding the quota for Google Cloud Functions, as you stated.
Two options come to mind:
Compress the payload. Gzip the data before sending it to the client. This is easy with Node.js's built-in zlib module (see the sketch after this list), or
Set up a virtual machine. Virtual machines are not bound to these restrictions.
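A rough sketch of the first option, assuming the firebase-functions setup from the question's code and that `transactions` already holds the data you retrieved (names are placeholders, untested):
const zlib = require('zlib');

exports.findTransactionsGz = functions.https.onRequest((req, res) => {
  // ...retrieve `transactions` from the Realtime Database as before...
  const json = JSON.stringify(transactions);
  // Gzip the JSON before sending it; the browser decompresses it
  // automatically thanks to the Content-Encoding header.
  zlib.gzip(json, (err, gzipped) => {
    if (err) {
      return res.status(500).send(err.message);
    }
    res.set('Content-Type', 'application/json');
    res.set('Content-Encoding', 'gzip');
    res.status(200).send(gzipped);
  });
});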
I did some testing and concluded that Google Cloud Functions (GCF) enforces some kind of timeout or "abort" action when a query returns a large number of results (i.e. many Datastore entities). See my comments attached to this question for some background.
tl;dr I created my own Express.js webserver and ran my GCF code on it.
This is how I tested it: I created an Ubuntu instance with HTTP/HTTPS and the Datastore API enabled. On the instance, I installed Node and Express and got a basic HTTPS server working (a self-signed certificate was fine since this is just testing an API backend service). Then I copy-pasted my GCF code (the function that was failing in GCF) into my minimal Express webserver. I pointed my React app to use my instance, which triggered a query that resulted in over 32,000 Datastore entities. My GCF function sends a query with datastore.runQuery(), which is common.
It took about a minute, but eventually all 32,000 entities were served by Express and loaded in the React app (browser) with no errors.
A basic Express route calls my GCF function:
app.post('/foo', (req, res) => {
  myCloudFunction(req, res);
});

const myCloudFunction = (req, res) => {
  // Inspects req, queries Datastore, and returns the results.
};
For this test, my React app just points to https://mydomain.example.com:3000/foo
(because my Express server listens on port 3000).
So it seems that GCF is not good enough for my application, unless I add pagination to the app (which is on the roadmap; a rough sketch with Datastore cursors is shown below).
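For reference, a rough sketch of cursor-based pagination with the @google-cloud/datastore client (the kind name and page size are placeholders, not from the original post):
const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

// Fetch one page of entities; 'cursor' is the endCursor returned by
// the previous page (undefined for the first page).
async function getPage(cursor) {
  let query = datastore.createQuery('Transaction').limit(1000);
  if (cursor) {
    query = query.start(cursor);
  }
  const [entities, info] = await datastore.runQuery(query);
  const done = info.moreResults === Datastore.NO_MORE_RESULTS;
  return { entities, nextCursor: done ? null : info.endCursor };
}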
