Wait for Cloud Function to be finished

Is there any way to wait for a Cloud Function that was triggered by a Firestore document write to finish?
Context:
My app has groups. Owners can invite other users to a group via an invite code. Users can write themselves as a member of a group if they have the right invite code. They do this by writing the groups/{groupId}/members/{userId} document that contains their profile info.
To make reading more efficient, this info is copied to a members array in the groups/{groupId} document by a Cloud Function.
The Cloud Function that does this is triggered by the document write. It usually finishes after a couple of seconds, but there's no predictable execution time and it might take a bit longer on a cold start.
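For readers who haven't set up such a fan-out before, the trigger might look roughly like this (a minimal sketch assuming the v1 Node.js Functions SDK and firebase-admin; names and the members payload are illustrative, not the asker's actual code):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.syncMemberToGroup = functions.firestore
  .document('groups/{groupId}/members/{userId}')
  .onWrite((change, context) => {
    const groupRef = admin.firestore().doc(`groups/${context.params.groupId}`);
    const userId = context.params.userId;
    if (!change.after.exists) {
      // member doc deleted > remove the entry from the group's members array
      return groupRef.update({
        members: admin.firestore.FieldValue.arrayRemove(userId),
      });
    }
    // member doc created or updated > add the entry to the group's members array
    return groupRef.update({
      members: admin.firestore.FieldValue.arrayUnion(userId),
    });
  });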
After the user has joined the group, I forward them to the groups view in my app which reads the group document. In order for the view to render correctly, the membership info needs to be available. So I would like to forward AFTER the Cloud Function has finished.
I found no way to track the execution of a Cloud Function that was triggered by a Firestore document write.
A fellow developer recommended just polling the groups/{groupId} document until the info is written and then proceeding, but this doesn't seem like a clean solution to me.
Any ideas how this could be done better?
Is it possible to get a promise that resolves after the Cloud Function has finished? Is there a way to combine a Firestore document write and a Cloud Function execution into one transaction?

Thanks for the hints, I came up with the following ways to deal with the problem. The approach depends on whether/when the user is allowed to read the document:
A) User is a member and leaves the group > at the start of the transaction they are allowed to read the group > the moment they can't read anymore confirms that the membership was successfully revoked:
async function leaveGroup (groupId) {
  await deleteDoc(doc(db, 'groups', groupId, 'members', auth.currentUser.uid))
  // Cloud Function removes the membership info
  // from the group doc...
  await new Promise((resolve) => {
    const unsubscribeFromSnapshot = onSnapshot(
      doc(db, 'groups', groupId),
      () => { }, // success callback
      () => { // error callback
        // membership info is not in the group anymore
        // > user can't read the doc anymore
        // > read access was revoked > transaction was successful:
        unsubscribeFromSnapshot()
        resolve()
      }
    )
  })
}
B) User is not a member and wants to join the group > at the start of the transaction they are not allowed to read the group > the moment they can read the group confirms that the membership was successfully granted (this is a simplified version that does not check the invite code):
async function joinGroup (groupId) {
  try {
    await setDoc(
      doc(db, 'groups', groupId, 'members', auth.currentUser.uid),
      {
        userId: auth.currentUser.uid,
        userDisplayName: auth.currentUser.displayName
      }
    )
    // Cloud Function adds the membership
    // information to the group doc ...
    await new Promise((resolve, reject) => {
      let maxRetries = 10
      const interval = setInterval(async () => {
        try {
          const docSnap = await getDoc(doc(db, 'groups', groupId))
          if (docSnap.data().members.includes(auth.currentUser.uid)) {
            // membership info is in the group doc
            // > transaction was successful
            clearInterval(interval)
            resolve()
            return
          }
        } catch (error) {
          // read access not granted yet > keep polling
        }
        maxRetries--
        if (maxRetries < 1) {
          clearInterval(interval)
          reject(new Error('Timed out waiting for the Cloud Function'))
        }
      }, 2000)
    })
  } catch (error) {
    // handle write errors / timeout here
  }
}
Note: I went with polling here, but similar to what @samthecodingman suggested, another solution could be that the Cloud Function confirms the membership by writing back to the members document (which the user can always read) and you listen to snapshot changes on this document.
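A sketch of that write-back variant (assuming the modular v9 web SDK used above and a hypothetical confirmedAt field that the Cloud Function sets on the members document) could look like this:
function waitForMembershipConfirmation (groupId) {
  return new Promise((resolve) => {
    const memberRef = doc(db, 'groups', groupId, 'members', auth.currentUser.uid)
    const unsubscribe = onSnapshot(memberRef, (snap) => {
      // 'confirmedAt' is an illustrative field name, not part of the original code
      if (snap.exists() && snap.data().confirmedAt) {
        unsubscribe()
        resolve()
      }
    })
  })
}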
C) Most straightforward way: someone else (the group owner) removes a member from the group > they have read access through the whole transaction > directly listen to snapshot changes:
async function endMembership (groupId, userId) {
  await deleteDoc(doc(db, 'groups', groupId, 'members', userId))
  // Cloud Function removes the membership info
  // from the group doc...
  await new Promise((resolve) => {
    const unsubscribe = onSnapshot(doc(db, 'groups', groupId), (snapshot) => {
      if (!snapshot.data().members.includes(userId)) {
        // membership info is not in the group doc anymore
        // > transaction was successful
        unsubscribe()
        resolve()
      }
    })
  })
}
In any case you should add proper error handling that covers other failure causes. I left it out here to keep the focus on how to use the success/error handlers while waiting to gain or lose read access.
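For example, one simple safeguard (not part of the original answer) is to race the waiting promise against a timeout so it can never hang forever if the Cloud Function fails; the 15-second limit below is an arbitrary choice:
function withTimeout (promise, ms = 15000) {
  let timer
  const timeout = new Promise((resolve, reject) => {
    timer = setTimeout(() => reject(new Error('Timed out waiting for the Cloud Function')), ms)
  })
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer))
}
// usage: wrap whichever waiting promise you use, e.g.
// await withTimeout(new Promise((resolve) => { /* onSnapshot / polling logic */ }))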

Related

Firestore Transactions is not handling race condition

Objective
When a user clicks the purchase button on the web frontend, it sends a POST request to the backend to create a purchase order. The backend first checks the number of available stocks. If available is greater than 0, it reduces available by 1 and then creates the order.
The setup
The backend (NestJS) queries Firestore for the latest available value and reduces available by 1. For debugging, I return the available value.
let available;
try {
  await runTransaction(firestore, async (transaction) => {
    const sfDocRef = doc(collection(firestore, 'items_available'), documentId);
    const sfDoc = await transaction.get(sfDocRef);
    if (!sfDoc.exists()) {
      throw 'Document does not exist!';
    }
    const data = sfDoc.data();
    available = data.available;
    if (available > 0) {
      transaction.update(sfDocRef, {
        available: available - 1,
      });
    }
  });
} catch (e) {
  console.log('Transaction failed: ', e);
}
return { available };
My stress test setup
Our goal is to see every API request return a different available value; that would mean the Firestore transaction decrements the value correctly even when multiple requests come in concurrently.
I wrote a simple multi-threaded program that queries the backend's create-order API and saves the available value returned for each API request.
The stress test runs at about 10 transactions per second: I have 10 concurrent processes querying the backend, and each process issues 20 http.get requests:
const http = require('http');

function call() {
  http.get('http://localhost:5000/get_item_available', res => {
    let data = [];
    res.on('data', chunk => {
      data.push(chunk);
    });
    res.on('end', () => {
      console.log('Response: ', Buffer.concat(data).toString());
    });
  }).on('error', err => {
    console.log('Error: ', err.message);
  });
}

for (var i = 0; i < 20; i++) {
  call();
}
The problem
Unfortunately, the available values I got from the requests contain repeated values, i.e. several requests returned the same available value instead of each request getting a unique one.
What is wrong? Aren't Firestore transactions meant to handle race conditions? Any suggestions on what I could change to handle multiple requests hitting the server and return a new value for each request?
You have a catch clause to handle the case where the transaction fails, but you still end up returning a value to the caller with return { available }. In that situation you should return an error to the caller instead.
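A minimal sketch of that fix under the same setup (BadRequestException comes from @nestjs/common; the 'Out of stock' error is illustrative):
let available;
try {
  await runTransaction(firestore, async (transaction) => {
    const sfDocRef = doc(collection(firestore, 'items_available'), documentId);
    const sfDoc = await transaction.get(sfDocRef);
    if (!sfDoc.exists()) {
      throw new Error('Document does not exist!');
    }
    available = sfDoc.data().available;
    if (available <= 0) {
      throw new Error('Out of stock');
    }
    transaction.update(sfDocRef, { available: available - 1 });
    available = available - 1; // the value this particular request reserved
  });
} catch (e) {
  // don't fall through to `return { available }` on failure:
  // surface the error so the caller knows no order was created
  throw new BadRequestException('Purchase failed: ' + e);
}
return { available };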

Decrease response time in Firebase Vue app when liking a post

I have an app with different 'procedures' (think posts or pages), which one can like. Currently the process works like this: tap like => run method "likeProcedure" => run dispatch action "likeProcedure" => update UI. It usually happens almost immediately, but sometimes there's a lag that gives this a "non-native" feel. Is there some way I could return feedback immediately, while still holding a single source of truth in the Firebase database?
Thank you!
Page Code:
<v-icon
  v-if="!userProfile.likedProcedures || !userProfile.likedProcedures[procedure.id]"
  color="grey lighten-1"
  @click="likeProcedure({ id: procedure.id })"
>
  mdi-star-outline
</v-icon>
and
computed: {
  ...mapState(["userProfile"]),
  procedures() {
    return this.$store.getters.getFilteredProcedures();
  },
},
Vuex code:
async likeProcedure({ dispatch }, postId) {
  const userId = fb.auth.currentUser.uid;
  // update user object
  await fb.usersCollection.doc(userId).update({
    [`likedProcedures.${postId.id}`]: true,
  });
  dispatch("fetchUserProfile", { uid: userId });
},
Side note: I'm trying to remove the dispatch("fetchUserProfile") call, but this doesn't work, because then I'm destructuring dispatch without using it. And I cannot remove dispatch, because then the destructured object is empty. And I cannot remove the object, because then the argument ('postId') no longer works. So if anyone knows how to deal with that, that would be extremely helpful.
Thank you :)
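Regarding the side note: the first argument of a Vuex action is the context object, so if you don't need dispatch you can simply accept the context under a throwaway name instead of destructuring it, for example:
async likeProcedure(_context, postId) {
  const userId = fb.auth.currentUser.uid;
  await fb.usersCollection.doc(userId).update({
    [`likedProcedures.${postId.id}`]: true,
  });
},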
So this is the best solution I've come up with yet. It kind of destroys the idea of a single source of truth, but at least it provides an immediate UI update:
async likeProcedure({ dispatch, state }, postId) {
  console.log("likeProcedure");
  const userId = fb.auth.currentUser.uid;
  // line below provides immediate update to state and hence to the UI
  state.userProfile.likedProcedures[postId.id] = true;
  // line below updates Firebase database
  await fb.usersCollection.doc(userId).update({
    [`likedProcedures.${postId.id}`]: state.userProfile.likedProcedures[
      postId.id
    ],
  });
  // line below then fetches the updated profile from Firebase and updates
  // the profile in state. Kind of useless, but ensures that client and
  // Firebase are in-sync
  dispatch("fetchUserProfile", { uid: userId });
},
async fetchUserProfile({ commit }, user) {
  // fetch user profile
  const userProfile = await fb.usersCollection.doc(user.uid).get();
  // set user profile in state
  commit("setUserProfile", userProfile.data());
  // change route to dashboard
  if (router.currentRoute.path === "/login") {
    router.push("/");
  }
},
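An alternative sketch that keeps the optimistic update inside a mutation and rolls it back if the write fails (setProcedureLiked is a mutation you would have to add; it is not part of the code above):
mutations: {
  setProcedureLiked(state, { id, liked }) {
    // replace the object so Vue 2 change detection picks up new keys
    state.userProfile.likedProcedures = {
      ...state.userProfile.likedProcedures,
      [id]: liked,
    };
  },
},
actions: {
  async likeProcedure({ commit }, postId) {
    const userId = fb.auth.currentUser.uid;
    // optimistic update through a mutation, so the UI reacts immediately
    commit("setProcedureLiked", { id: postId.id, liked: true });
    try {
      await fb.usersCollection.doc(userId).update({
        [`likedProcedures.${postId.id}`]: true,
      });
    } catch (error) {
      // roll back the optimistic update if the Firebase write fails
      commit("setProcedureLiked", { id: postId.id, liked: false });
      throw error;
    }
  },
},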

Cloud Functions for Firebase: how to get authenticated user in a database trigger [duplicate]

In the example below, is there a way to get the uid of the user who wrote to /messages/{pushId}/original?
exports.makeUppercase = functions.database.ref('/messages/{pushId}/original')
  .onWrite(event => {
    // Grab the current value of what was written to the Realtime Database.
    const original = event.data.val();
    console.log('Uppercasing', event.params.pushId, original);
    const uppercase = original.toUpperCase();
    // You must return a Promise when performing asynchronous tasks inside a Function,
    // such as writing to the Firebase Realtime Database.
    // Setting an "uppercase" sibling in the Realtime Database returns a Promise.
    return event.data.ref.parent.child('uppercase').set(uppercase);
  });
UPDATED ANSWER (v1.0.0+):
As noted in @Bery's answer below, version 1.0.0 of the Firebase Functions SDK introduced a new context.auth object which contains the authentication state, such as uid. See "New properties for user auth information" for more details.
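In a v1.0+ trigger that looks roughly like this (a minimal sketch; the null check matters because context.auth is not set for admin-triggered writes):
exports.makeUppercase = functions.database.ref('/messages/{pushId}/original')
  .onWrite((change, context) => {
    // context.auth is undefined when the write came from an admin context
    const uid = context.auth ? context.auth.uid : null;
    console.log('Written by', uid, 'authType:', context.authType);
    return null;
  });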
ORIGINAL ANSWER (pre v1.0.0):
Yes, this is technically possible, although it is not currently documented. The uid is stored with the event.auth object. When a Database Cloud Function is triggered from an admin situation (for example, from the Firebase Console data viewer or from an Admin SDK), the value of event.auth is:
{
  "admin": true
}
When a Database Cloud Function is triggered from an unauthenticated reference, the value of event.auth is:
{
  "admin": false
}
And finally, when a Database Cloud Function is triggered from an authed, but not admin, reference, the format of event.auth is:
{
  "admin": false,
  "variable": {
    "provider": "<PROVIDER>",
    "provider_id": "<PROVIDER>",
    "user_id": "<UID>",
    "token": {
      // Decoded auth token claims such as sub, aud, iat, exp, etc.
    },
    "uid": "<UID>"
  }
}
Given the information above, your best bet to get the uid of the user who triggered the event is to do the following:
exports.someFunction = functions.database.ref('/some/path')
  .onWrite(event => {
    var isAdmin = event.auth.admin;
    var uid = event.auth.variable ? event.auth.variable.uid : null;
    // ...
  });
Just note that in the code above, uid would be null even if isAdmin is true. Your exact code depends on your use case.
WARNING: This is currently undocumented behavior, so I'll give my usual caveat of "undocumented features may be changed at any point in the future without notice and even in non-major releases."
Ever since Firebase Functions reached version 1.0, this behavior is no longer undocumented but has slightly changed. Be sure to read the docs.
Context has been added to Cloud Functions, and you can use it like this:
exports.dbWrite = functions.database.ref('/path/with/{id}').onWrite((data, context) => {
  const authVar = context.auth; // Auth information for the user.
  const authType = context.authType; // Permissions level for the user.
  const pathId = context.params.id; // The ID in the Path.
  const eventId = context.eventId; // A unique event ID.
  const timestamp = context.timestamp; // The timestamp at which the event happened.
  const eventType = context.eventType; // The type of the event that triggered this function.
  const resource = context.resource; // The resource which triggered the event.
  // ...
});

How to implement transactions in Meteor Method calls

Suppose I have 2 collections, "PlanSubscriptions" and "ClientActivations". I am serially doing an insert on both collections.
The later insert depends on the previous one; if either part of the transaction fails, the entire operation must roll back.
How can I achieve that in Meteor 1.4?
Since MongoDB doesn't support multi-document atomicity, you will have to manage it with method chaining.
You can write a method, say transaction, where you call PlanSubscriptions.insert(data, callback). Then, in the callback, you call ClientActivations.insert(data, callback1) if the first insertion succeeded, and in callback1 you return a truthy value if the second insertion succeeded, otherwise a falsy one. If the first insertion returns an error you don't need to do anything, but if the second insertion returns an error, remove the document with the id you got from the insertion into the first collection.
I can suggest the following structure:
'transaction'() {
  PlanSubscriptions.insert(data, (error, result) => {
    if (result) {
      // result contains the _id
      let id_plan = result;
      ClientActivations.insert(data, (error, result) => {
        if (result) {
          // result contains the _id
          return true;
        }
        else if (error) {
          PlanSubscriptions.remove(id_plan);
          return false;
        }
      })
    }
    else if (error) {
      return false;
    }
  })
}
There is no way to do that in Meteor, since MongoDB is not an ACID-compliant database. It has single-document update atomicity, but not multi-document atomicity, which is what your case with the two collections requires.
From the MongoDB documentation:
When a single write operation modifies multiple documents, the modification of each document is atomic, but the operation as a whole is not atomic and other operations may interleave.
A way to isolate the visibility of your multi-document updates is available, but it's probably not what you need.
Using the $isolated operator, a write operation that affects multiple documents can prevent other processes from interleaving once the write operation modifies the first document. This ensures that no client sees the changes until the write operation completes or errors out.
An isolated write operation does not provide “all-or-nothing” atomicity. That is, an error during the write operation does not roll back all its changes that preceded the error.
However, there are a couple of libraries which try to tackle the problem at the app level. I recommend taking a look at fawn.
In your case, where you have exactly two dependent collections, it's possible to take advantage of the two phase commits technique. Read more about it here: two-phase-commits
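A rough sketch of how the two-phase-commit idea could be applied to this case, assuming Meteor's synchronous server-side collection API (the TxLog collection, the state values, and the field names are illustrative, not a drop-in implementation):
import { Meteor } from 'meteor/meteor';
import { Mongo } from 'meteor/mongo';

const TxLog = new Mongo.Collection('txLog');

Meteor.methods({
  createSubscriptionWithActivation(subscriptionData, activationData) {
    // 1. record the intent first
    const txId = TxLog.insert({ state: 'pending', createdAt: new Date() });
    try {
      // 2. perform both inserts, tagged with the transaction id
      const subId = PlanSubscriptions.insert(Object.assign({}, subscriptionData, { txId }));
      const actId = ClientActivations.insert(Object.assign({}, activationData, { txId }));
      // 3. only now mark the transaction as done
      TxLog.update(txId, { $set: { state: 'done', subId, actId } });
      return true;
    } catch (err) {
      // roll back whatever was written and mark the transaction as failed;
      // a periodic cleanup job could also remove documents whose txId never reached 'done'
      PlanSubscriptions.remove({ txId });
      ClientActivations.remove({ txId });
      TxLog.update(txId, { $set: { state: 'failed', error: err.message } });
      return false;
    }
  },
});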
Well, I figured it out myself.
I added the package babrahams:transactions.
In the server-side Meteor method I call the tx object that is globally exposed by the package. The overall server-side Meteor.methods({}) looks like below.
import { Meteor } from 'meteor/meteor';
import { PlanSubscriptions } from '/imports/api/plansubscriptions/plansubscriptions.js';
import { ClientActivations } from '/imports/api/clientactivation/clientactivations.js';

Meteor.methods({
  'createClientSubscription' (subscriptionData, clientActivationData) {
    var txid;
    try {
      txid = tx.start("Adding Subscription to our database");
      PlanSubscriptions.insert(subscriptionData, { tx: true });
      ClientActivations.insert(clientActivationData, { tx: true });
      tx.commit();
      return true;
    } catch (e) {
      tx.undo(txid);
    }
    return false;
  }
});
With every insert I added { tx: true }, which marks it as part of the transaction.
Server Console Output:
I20170523-18:43:23.544(5.5)? Started "Adding Subscription to our database" with
transaction_id: vdJQvFgtyZuWcinyF
I20170523-18:43:23.547(5.5)? Pushed insert command to stack: vdJQvFgtyZuWcinyF
I20170523-18:43:23.549(5.5)? Pushed insert command to stack: vdJQvFgtyZuWcinyF
I20170523-18:43:23.551(5.5)? Beginning commit with transaction_id: vdJQvFgtyZuWcinyF
I20170523-18:43:23.655(5.5)? Executed insert
I20170523-18:43:23.666(5.5)? Executed insert
I20170523-18:43:23.698(5.5)? Commit reset transaction manager to clean state
For more information you can go to: https://github.com/JackAdams/meteor-transactions
NOTE: I am using Meteor 1.4.4.2
Just sharing this link for future readers:
https://forums.meteor.com/t/solved-transactions-with-mongodb-meteor-methods/48677
import { MongoInternals } from 'meteor/mongo';

// utility async function to wrap async raw mongo operations with a transaction
const runTransactionAsync = async asyncRawMongoOperations => {
  // setup a transaction
  const { client } = MongoInternals.defaultRemoteCollectionDriver().mongo;
  const session = await client.startSession();
  await session.startTransaction();
  try {
    // running the async operations
    let result = await asyncRawMongoOperations(session);
    await session.commitTransaction();
    // transaction committed - return value to the client
    return result;
  } catch (err) {
    await session.abortTransaction();
    console.error(err.message);
    // transaction aborted - report error to the client
    throw new Meteor.Error('Database Transaction Failed', err.message);
  } finally {
    session.endSession();
  }
};

import { runTransactionAsync } from '/imports/utils'; // or where you defined it

Meteor.methods({
  async doSomething(arg) {
    // remember to check method input first
    // define the operations we want to run in transaction
    const asyncRawMongoOperations = async session => {
      // it's critical to receive the session parameter here
      // and pass it to every raw operation as shown below
      const item = await collection1.rawCollection().findOne(arg, { session: session });
      const response = await collection2.rawCollection().insertOne(item, { session: session });
      // if Mongo or you throw an error here runTransactionAsync(..) will catch it
      // and wrap it with a Meteor.Error(..) so it will arrive to the client safely
      return 'whatever you want'; // will be the result in the client
    };
    let result = await runTransactionAsync(asyncRawMongoOperations);
    return result;
  }
});

How to bulk delete Firebase anonymous users

Due to my probable misuse of anonymous authentication (see How to prevent Firebase anonymous user token from expiring) I have a lot of anonymous users in my app that I don't actually want.
I can't see any way to bulk delete these users. Do I have to do it manually one-by-one? Is there any way to use the API to access user accounts and manipulate them for users other than the current user?
This code sample uses the Firebase Admin SDK for Node.js, and will delete any user that has no providerData, which means the user is anonymous:
function deleteAnonymousUsers(nextPageToken) {
  adminApp
    .auth()
    .listUsers(20, nextPageToken)
    .then(function(listUsersResult) {
      listUsersResult.users.forEach(function(userRecord) {
        if (userRecord.providerData.length === 0) { // this user is anonymous
          console.log(userRecord); // do your delete here
          adminApp.auth().deleteUser(userRecord.uid)
            .then(function() {
              console.log("Successfully deleted user");
            })
            .catch(function(error) {
              console.log("Error deleting user:", error);
            });
        }
      });
      if (listUsersResult.pageToken) {
        // List next batch of users.
        deleteAnonymousUsers(listUsersResult.pageToken);
      }
    })
    .catch(function(error) {
      console.log('Error listing users:', error);
    });
}
There is no way in the Firebase Console to bulk-delete users.
There is no API to bulk-delete users.
But there is an administrative API that allows you to delete user accounts. See https://firebase.google.com/docs/auth/admin/manage-users#delete_a_user
I just wanted to add a method I just used to (sort-of) bulk-delete. Mostly because I felt clever after doing it and I am not that clever.
I downloaded a mouse-automation application that lets you record your mouse clicks then replay it automatically. I just deleted almost 1000 users while playing the piano lol.
I used Macro Recorder and it worked like a charm. Just recorded a few iterations in the console of me deleting users, set it to repeat 500 times and walked away.
I know this isn't a very technical answer, but it saved me hours of monotonous mouse clicking so hopefully someone else looking for a way to bulk-delete will benefit from it as well. I hated the fact that there was no bulk-delete and really needed a way out of it. It only took about 20 manual deletes to realize there were apps that could do what I was doing.
If you do not need to do it on a large scale and you want to delete some anonymous users from the Firebase Console UI, but you are too lazy to click through 250 users one by one, run the following code in your browser console (on the screen where the users table is shown):
rows = Array.from(document.querySelectorAll('td.auth-user-identifier-cell')).map(td => td.parentNode).filter((tr) => tr.innerText.includes('anonymous'))

var nextTick = null

function openContextMenu(tr) {
  console.log('opening menu')
  tr.querySelector('.edit-account-button').click()
  nextTick = deleteUser
}

function deleteUser() {
  console.log('deleting user')
  document.querySelector('.cdk-overlay-connected-position-bounding-box button:last-of-type').click()
  nextTick = confirmDelete
}

function confirmDelete() {
  console.log('confirming action')
  document.querySelector('.cdk-global-overlay-wrapper .confirm-button').click()
  nextTick = getUser
}

function getUser() {
  console.log('getting user')
  openContextMenu(rows.shift())
}

nextTick = getUser
step = 500

setInterval(() => {
  nextTick()
}, step)
It basically selects all rows which contain an anonymous user and simulates you clicking the three dots, then clicking on delete account, and as a last step it confirms the action in the modal which appears.
Before running the script, select 250 rows per page in the table's footer. When all anonymous users are removed, you must manually go to the next page and re-run the script (or code another "tick" which paginates for you).
It takes 1.5 seconds to delete one user (you can modify this with the step variable, but I do not recommend going lower than 500ms - mind the UI animations).
It also runs in a background tab, so you can watch some YT in the meantime :)
Update 2021:
I had around 10,000 anonymous users, and @regretoverflow's solution led to exceeding the delete user quota. However, slightly tweaking the code to utilize the admin's deleteUsers([userId1, userId2, ...]) API works like a charm.
function deleteAnonymousUsers(nextPageToken: string | undefined) {
  firebaseAdmin
    .auth()
    .listUsers(1000, nextPageToken)
    .then(function (listUsersResult) {
      const anonymousUsers: string[] = [];
      listUsersResult.users.forEach(function (userRecord) {
        if (userRecord.providerData.length === 0) {
          anonymousUsers.push(userRecord.uid);
        }
      });
      firebaseAdmin
        .auth()
        .deleteUsers(anonymousUsers)
        .then(function () {
          if (listUsersResult.pageToken) {
            // List next batch of users.
            deleteAnonymousUsers(listUsersResult.pageToken);
          }
        });
    });
}

deleteAnonymousUsers(undefined);
There is a firebase-functions-helper package that can help delete Firebase users in bulk.
// Get all users
firebaseHelper.firebase
  .getAllUsers(100)
  .then(users => {
    users.map(user => {
      firebaseHelper.firebase
        .deleteUsers([user.uid]);
    })
  })
The code above will get 100 users and delete all of them. If you don't pass the number, the default value is 1000. You can read the instructions in the GitHub repository.
I faced the same problem today, then I found the Firebase Admin SDK. I am using Node.js, which is very easy to install, so you can try the following code. It is not a complete answer, I know, but one can build their own script/application to delete stored uids. Still, there is no way to retrieve a list of them, so you have to build one yourself somehow every time you create an anonymous account.
First, download your 'serviceAccountKey.json', which can be done through the Firebase Console (Project Settings). In my case I renamed the downloaded file to a friendlier name and saved it to my documents folder.
console.firebase.google.com/project/yourprojectname/settings/serviceaccounts/adminsdk
Useful links:
Firebase Admin SDK Setup
Firebase Admin User Management
Firebase Admin Database API
Then, play around using Windows cmd.exe or any other shell. The 'npm install -g' installs firebase-admin globally on your machine.
$ npm install firebase-admin -g
$ node
> var admin = require("firebase-admin");
> admin.initializeApp({
    credential: admin.credential.cert("./documents/yourprojectname-firebase-admin.json"),
    databaseURL: "https://yourprojectname.firebaseio.com"
  });
> var db = admin.database();
// Of course use an existing UID of your choice
> admin.auth().getUser('2w8XEVe7qZaFn2ywc5MnlPdHN90s').then((user) => console.log(user))
> admin.auth().deleteUser('2w8XEVe7qZaFn2ywc5MnlPdHN90s').then(function() {
    console.log("Successfully deleted user");
  }).catch(function(error) {
    console.log("Error deleting user:", error);
  });
// To get access to some key/values in your Database:
> var ref = db.ref("users/1234");
> ref.once("value", function(snapshot) {
    console.log(snapshot.val());
  });
I wrote myself a Firebase Cloud Function using Firebase Auth.
It works like a charm for me, and I can clean up with one API call.
// Delete all anonymous users
exports.deleteUser = functions.https.onRequest(async (req, res) => {
  const admin = require("firebase-admin");
  // initialize the Admin SDK
  admin.initializeApp();
  // create an auth instance
  const auth = admin.auth();
  // get the list of all users
  const allUsers = await auth.listUsers();
  // map anonymous users to their uid, all other users to null
  const allUsersUID = allUsers.users.map((user) => (user.providerData.length === 0) ? user.uid : null);
  // remove the nulls
  const filteredallUsersUID = allUsersUID.filter(e => e !== null);
  // delete and answer the API call
  return auth.deleteUsers(filteredallUsersUID).then(() => res.send("All Anon-User deleted"));
});
With this you can simply call your API URL:
https://[Your_API_URL]/deleteUser
It just requires basic knowledge of Firebase Functions.
I assume this could also be added to a cron job.
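A sketch of that cron variant using a scheduled function (assuming the v1 functions.pubsub.schedule API; the schedule string is arbitrary, and listUsers() only returns up to 1000 users per call, so you would need paging for more):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.cleanupAnonUsers = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    // collect the uids of users without any provider data (anonymous users)
    const result = await admin.auth().listUsers();
    const anonUids = result.users
      .filter((user) => user.providerData.length === 0)
      .map((user) => user.uid);
    if (anonUids.length > 0) {
      await admin.auth().deleteUsers(anonUids);
    }
    return null;
  });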
I had the same problem. Firebase doesn't provide any API to bulk-delete users, but this is how I deleted all anonymous users:
Download all the users as JSON via the Firebase CLI:
firebase auth:export users --format=json
https://firebase.google.com/docs/cli/auth#file_format
You can write a Firebase Cloud Function to trigger this, or trigger it from an action method.
Import the JSON file into your file:
const Users = require('./users.json'); // ES5
import Users from './users.json'; // ES6
Normally an anonymous user doesn't have an email, so it is easy to delete the records which don't have an email id:
Users.users.map(user => {
  setTimeout(() => {
    admin.auth().deleteUser(user.localId).then(() => {
      console.log("Successfully deleted user");
    })
    .catch((error) => {
      console.log("Error deleting user:", error);
    });
  }, 20000);
});
Don't try to reduce the timeout. It will throw the error below:
Error deleting user: { Error: www.googleapis.com network timeout. Please try again.
The Firebase Admin SDK can also delete multiple users at once.
Here is a Node.js sample:
admin.auth().deleteUsers([uid1, uid2, uid3])
  .then(deleteUsersResult => {
    console.log('Successfully deleted ' + deleteUsersResult.successCount + ' users');
    console.log('Failed to delete ' + deleteUsersResult.failureCount + ' users');
    deleteUsersResult.errors.forEach(err => {
      console.log(err.error.toJSON());
    });
  })
  .catch(error => {
    console.log('Error deleting users:', error);
  });
Note: there is a limitation, as with listing all users.
The maximum number of users allowed to be deleted is 1000 per batch.
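So if you have collected more uids than that, chunk the list first (a small sketch):
async function deleteInBatches(uids, batchSize = 1000) {
  for (let i = 0; i < uids.length; i += batchSize) {
    const batch = uids.slice(i, i + batchSize);
    const result = await admin.auth().deleteUsers(batch);
    console.log('Deleted ' + result.successCount + ', failed ' + result.failureCount);
  }
}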

Resources