What's the best way to paginate and filter a large set of data in Firebase?

I have a large Firestore collection with 10,000 documents.
I want to show these documents in a table, paging and filtering the results 25 at a time.
My idea, to limit the "reads" (and therefore the costs), was to request only 25 documents at a time (using the 'limit' method) and to load the next 25 documents on each page change.
But there's a problem: in order to show the number of pages I have to know the total number of documents, and I would be forced to query all the documents to find that number.
I could opt for an infinite scroll, but even in this case I would never know the total number of results that my filter has found.
Another option would be to request all documents at the beginning and then page and filter on the client.
So, what is the best way to show data in this situation while optimizing performance and costs?
Thanks!

You will find in the Firestore documentation a page dedicated to Paginating data with query cursors.
I paste here the example which "combines query cursors with the limit() method".
var first = db.collection("cities")
  .orderBy("population")
  .limit(25);

return first.get().then(function (documentSnapshots) {
  // Get the last visible document
  var lastVisible = documentSnapshots.docs[documentSnapshots.docs.length - 1];
  console.log("last", lastVisible);

  // Construct a new query starting at this document,
  // get the next 25 cities.
  var next = db.collection("cities")
    .orderBy("population")
    .startAfter(lastVisible)
    .limit(25);
});
If you opt for an infinite scroll, you can easily know if you have reached the end of the collection by looking at the value of documentSnapshots.size. If it is under 25 (the value used in the example), you know that you have reached the end of the collection.
If you want to show the total number of documents in the collection, the best is to use a distributed counter which holds the number of documents, as explained in this answer: https://stackoverflow.com/a/61250956/3371862
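If you maintain such a counter, computing the number of pages only costs one extra small read. Here is a minimal sketch, assuming a counter document at counters/cities with a numeric count field (both the path and the field name are assumptions):
// Hypothetical counter document: counters/cities { count: <number> }
var PAGE_SIZE = 25;

db.collection("counters").doc("cities").get().then(function (counterSnap) {
  var totalDocs = counterSnap.get("count") || 0;
  var totalPages = Math.ceil(totalDocs / PAGE_SIZE);
  console.log("total pages:", totalPages);
});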

Firestore does not provide a way to know how many results would be returned by a query without actually executing the query and reading each document. If you need a total count, you will have to somehow track that yourself in another document. There are plenty of suggestions on Stack Overflow about counting documents in collections.
Cloud Firestore collection count
How to get a count of number of documents in a collection with Cloud Firestore
However, the paging API itself will not give you a total count. You need to track it on your own, which is not very easy, especially for flexible queries that could have any number of filters.
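As a hedged sketch of the "track it yourself" approach, you could bump a counter document from your write paths with FieldValue.increment whenever a document is created or deleted (the counters/posts path and the count field are assumptions):
// Assumed counter document: counters/posts { count: <number> }
var counterRef = db.collection("counters").doc("posts");

// call this right after creating a document
function incrementCount() {
  return counterRef.update({ count: firebase.firestore.FieldValue.increment(1) });
}

// call this right after deleting a document
function decrementCount() {
  return counterRef.update({ count: firebase.firestore.FieldValue.increment(-1) });
}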

My guess is that you are using MatPaginator and the next button is disabled because you cannot specify the exact length. In that case (or even if not), a simple workaround is to fetch (pageSize + 1) documents each time from Firestore, sorted by a field (such as createdAt). After a page is loaded you will then always have one document belonging to the next page, which enables the "next" button on the paginator; a sketch of this idea follows.
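A minimal sketch of the pageSize + 1 trick, assuming a posts collection, a createdAt field and a page size of 25 (all three are assumptions):
var PAGE_SIZE = 25;

function loadPage(lastVisibleDoc) {
  var q = db.collection("posts")
    .orderBy("createdAt")
    .limit(PAGE_SIZE + 1); // fetch one extra document to peek at the next page
  if (lastVisibleDoc) q = q.startAfter(lastVisibleDoc);

  return q.get().then(function (snap) {
    var hasNextPage = snap.docs.length > PAGE_SIZE; // the extra doc exists, so enable "next"
    var pageDocs = snap.docs.slice(0, PAGE_SIZE);   // show only the real page
    return { docs: pageDocs, hasNextPage: hasNextPage };
  });
}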

What worked best for me:
Create a simple query
Create a simple pagination query
Combine both (after validating each one works separately)
Simple Pagination Query
// v9 modular imports:
// import { collection, query, where, orderBy, startAt, startAfter, limit, getDocs } from 'firebase/firestore'
const queryHandler = query(
  collection(db, 'YOUR-COLLECTION-NAME'),
  orderBy('ORDER-FIELD-GOES-HERE'),
  startAt(0),
  limit(50)
)
const result = await getDocs(queryHandler)
which will return the first 50 results (ordered by your criteria)
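To move to the following page you would then replace startAt(0) with a cursor built from the last document of the previous result; a hedged sketch:
const lastVisible = result.docs[result.docs.length - 1]
const nextPage = query(
  collection(db, 'YOUR-COLLECTION-NAME'),
  orderBy('ORDER-FIELD-GOES-HERE'),
  startAfter(lastVisible), // cursor taken from the previous page
  limit(50)
)
const nextResult = await getDocs(nextPage)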
Simple Query
const queryHandler = query(
  collection(db, 'YOUR-COLLECTION-NAME'),
  where('FIELD-NAME', 'OPERATOR', 'VALUE')
)
const result = await getDocs(queryHandler)
Note that the result object has both a query field (with the relevant query) and a docs field (with the actual data).
So... combining both will result with:
const queryHandler = query(
  collection(db, 'YOUR-COLLECTION-NAME'),
  where('FIELD-NAME', 'OPERATOR', 'VALUE'),
  orderBy('FIELD-NAME'),
  startAt(0),
  limit(50)
)
const result = await getDocs(queryHandler)
Please note that if the where clause uses a range operator (<, <=, >, >=), the field in the where clause and the field in the first orderBy must be the same one. It is also worth mentioning that some combinations of filters and ordering require a composite index, and the query will fail until that index has been created.
My tip: inspect the error itself; it contains a detailed description of why the operation failed and what should be done in order to fix it.

Firebase v9 functional approach. Don't forget to enable persistence so you won't get huge bills. Also remember to add a where() clause if some documents are restricted by security rules: Firestore will throw an error if even one document matched by the query is not readable by the user. In the case below, documents have to have isPublic = true.
firebase.ts
function paginatedCollection(collectionPath: string, initDocumentsLimit: number, initQueryConstraint: QueryConstraint[]) {
  const data = vueRef<any[]>([]) // Vue 3 Ref<T> object; you can change it to a simple array.
  let snapshot: QuerySnapshot<DocumentData>
  let firstDoc: QueryDocumentSnapshot<DocumentData>
  let unSubSnap: Unsubscribe
  let docsLimit: number = initDocumentsLimit
  let queryConst: QueryConstraint[] = initQueryConstraint

  const onPagination = (option?: "endBefore" | "startAfter" | "startAt") => {
    if (option && !snapshot) throw new Error("Your first onPagination call has to have no arguments.")
    let que = queryConst
    if (option === "endBefore") {
      que = [...que, limitToLast(docsLimit), endBefore(snapshot.docs[0])]
    } else {
      que = [...que, limit(docsLimit)]
    }
    if (option === "startAfter") que = [...que, startAfter(snapshot.docs[snapshot.docs.length - 1])]
    if (option === "startAt") que = [...que, startAt(snapshot.docs[0])]
    const q = query(collection(db, collectionPath), ...que)
    const unSubscription = onSnapshot(q, snap => {
      if (!snap.empty && !option) { firstDoc = snap.docs[0] }
      if (option === "endBefore") {
        const firstDocInSnap = JSON.stringify(snap.docs[0])
        const firstSaved = JSON.stringify(firstDoc)
        if (firstDocInSnap === firstSaved || snap.empty || snap.docs.length < docsLimit) {
          return onPagination()
        }
      }
      if (option === "startAfter" && snap.empty) {
        onPagination("startAt")
      }
      if (!snap.empty) {
        snapshot = snap
        data.value = []
        snap.forEach(docSnap => {
          const doc = docSnap.data()
          doc.id = docSnap.id
          data.value = [...data.value, doc]
        })
      }
    })
    if (unSubSnap) unSubSnap()
    unSubSnap = unSubscription
  }

  function setLimit(documentsLimit: number) {
    docsLimit = documentsLimit
  }

  function setQueryConstraint(queryConstraint: QueryConstraint[]) {
    queryConst = queryConstraint
  }

  function unSub() {
    if (unSubSnap) unSubSnap()
  }

  return { data, onPagination, unSub, setLimit, setQueryConstraint }
}

export { paginatedCollection }
How to use example in Vue 3 in TypeScript
const { data, onPagination, unSub } = paginatedCollection("posts", 8, [where("isPublic", "==", true), where("category", "==", "Blog"), orderBy("createdAt", "desc")])
onMounted(() => onPagination()) // Lifecycle function
onUnmounted(() => unSub()) // Lifecycle function
function next() {
  onPagination('startAfter')
  window.scrollTo({ top: 0, behavior: 'smooth' })
}
function prev() {
  onPagination('endBefore')
  window.scrollTo({ top: 0, behavior: 'smooth' })
}
You might still have a problem knowing which document is the last one, for example in order to disable the "next" button.

Related

How do I know if there are more documents left to get from a firestore collection?

I'm using Flutter and Firebase. I use pagination, with a maximum of 5 documents per page. How do I know if there are more documents left to get from a Firestore collection? I want to use this information to enable/disable a next-page button presented to the user.
limit: 5 (5 documents each time)
orderBy: "date" (newest first)
startAfterDocument: latestDocument (just a variable that holds the latest document)
This is how I fetch the documents.
collection.limit(5).orderBy("date", descending: true).startAfterDocument(latestDocument).get()
I thought about checking if the number of docs received from Firestore is equal to 5 and then assuming there are more docs to get. But this will not work if there are a total of n * 5 docs in the collection.
I thought about getting the last document in the collection, storing it and comparing it to every doc in the batches I get; if there is a match then I know I've reached the end, but this means one excess read.
Or maybe I could keep on getting docs until I get an empty list and assume I've reached the end of the collection.
I still feel there is a much better solution to this.
Let me know if you need more info, this is my first question on this account.
There is no flag in the response to indicate there are more documents. The common solution is to request one more document than you need/display, and then use the presence of that last document as an indicator that there are more documents.
This is also what the database would have to do to include such a flag in its response, which is probably why this isn't an explicit option in the SDK.
You might also want to check the documentation on keeping a distributed count of the number of documents in a collection as that's another way to determine whether you need to enable the UI to load a next page.
Here's a way to get a large amount of data from a Firestore collection, batch by batch:
let latestDoc = null; // stores the last doc from the previous query
const dataArr = []; // stores the data fetched from Firestore
let loadMore = true; // flag: is there more data to fetch?

const initialQuery = async () => {
  const first = db
    .collection("recipes-test")
    .orderBy("title")
    .startAfter(latestDoc || 0)
    .limit(10);
  const data = await first.get();
  data.docs.forEach((doc) => {
    dataArr.push(doc.data()); // push the document data into the array
  });
  //! update the latest doc (cursor for the next batch)
  latestDoc = data.docs[data.docs.length - 1];
  //! stop the loop when there are no more docs
  if (data.empty) {
    loadMore = false;
  }
};

// run the queries through this function so we can actually await each batch
const run = async () => {
  // keep looping until we get all the docs
  while (loadMore) {
    console.log({ loadMore });
    await initialQuery();
  }
};

Why is it not possible to orderBy on different fields in Cloud Firestore and how can I work around it?

I have a collection in firebase cloud firestore called 'posts' and I want to show the most liked posts in the last 24h on my web app.
The post documents have a field called 'like_count' (number) and another field called 'time_posted' (timestamp).
I also want to be able to limit the results to apply pagination.
I tried applying a filter to only get the posts from the last 24 hours and then ordering them by 'like_count' and then by 'time_posted', since I want the posts with the most likes to appear first.
postsRef.where("time_posted", ">", twentyFourHoursAgo)
.orderBy("like_count", "desc")
.orderBy("time_posted", "desc")
.limit(10)
However, I quickly found out that it is not possible to filter and then sort by two different fields.
(See the Limitations part of the documentation for Order and limit data with Cloud Firestore)
It states:
Invalid: Range filter and first orderBy on different fields
I thought about sorting the results by 'like_count' in the frontend, but this won't work properly because I don't have all the documents. And getting all the documents is infeasible for a large number of daily posts.
Is there an easy work-around I am missing or how can I go about this?
When performing a query, Firestore must be able to traverse an index in a continuous fashion.
This introduction video is a little outdated (because "OR" queries are now possible using the "in" operator) but it does give a good visualization of what Firestore is doing as it runs a query.
If your query was just postsRef.orderBy("like_count", "desc").limit(10), Firestore would load up the index it has for a descending "like_count", pluck the first 10 entries and return them.
To handle your query, it would have to pluck an entry off the descending "like_count" index, compare it to your "time_posted" requirement, and either discard it or add it to a list of valid entries. Once it has all of the recent posts, it then needs to sort the results as you specified. As these steps don't make use of a continuous read of an index, it is disallowed.
The solution would be to build your own index from the recent posts and then pluck the results off of that. Because doing this on the client is ill-advised, you should instead make use of a Cloud Function to do the work for you. The following code makes use of a Callable Cloud Function.
const functions = require('firebase-functions');
const admin = require('firebase-admin');

const MS_TWENTY_FOUR_HOURS = 24 * 60 * 60 * 1000;

exports.getRecentTopPosts = functions.https.onCall(async (data, context) => {
  // unless otherwise stated, return only 10 entries
  const limit = Number(data.limit) || 10;
  const postsRef = admin.firestore().collection("posts");

  // OPTIONAL CODE SEGMENT: Check Cached Index

  const twentyFourHoursAgo = Date.now() - MS_TWENTY_FOUR_HOURS;
  const recentPostsSnapshot = await postsRef
    .where("time_posted", ">", twentyFourHoursAgo)
    .get();

  const orderedPosts = recentPostsSnapshot.docs
    .map(postDoc => ({
      snapshot: postDoc,
      like_count: postDoc.get("like_count"),
      time_posted: postDoc.get("time_posted")
    }))
    .sort((p1, p2) => {
      const deltaLikes = p2.like_count - p1.like_count; // descending sort based on like_count
      if (deltaLikes !== 0) {
        return deltaLikes;
      }
      return p2.time_posted - p1.time_posted; // descending sort based on time_posted
    });

  // OPTIONAL CODE SEGMENT: Save Cached Index

  return orderedPosts
    .slice(0, limit)
    .map(post => ({
      _id: post.snapshot.id,
      ...post.snapshot.data()
    }));
});
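On the client, a callable function like this one can be invoked roughly as follows (a hedged sketch using the v8 web SDK; the function name matches the export above):
// client-side sketch (Firebase JS SDK v8 style)
const getRecentTopPosts = firebase.functions().httpsCallable('getRecentTopPosts');

getRecentTopPosts({ limit: 10 }).then((result) => {
  const topPosts = result.data; // array of post objects returned by the function
  console.log(topPosts);
});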
If this code is expected to be called by many clients, you may wish to cache the index to save it getting constantly rebuilt by inserting the following segments into the function above.
// OPTIONAL CODE SEGMENT: Check Cached Index
if (!data.skipCache) { // allow option to bypass cache
  const cachedIndexSnapshot = await admin.firestore()
    .doc("_serverCache/topRecentPosts")
    .get();

  const oneMinuteAgo = Date.now() - 60000;

  // if the index was created in the past minute, reuse it
  if (cachedIndexSnapshot.get("timestamp") > oneMinuteAgo) {
    const recentPostMetadataArray = cachedIndexSnapshot.get("posts");
    const recentPostIdArray = recentPostMetadataArray
      .slice(0, limit)
      .map((postMeta) => postMeta.id);

    const postDocs = await fetchDocumentsWithId(postsRef, recentPostIdArray); // see https://gist.github.com/samthecodingman/aea3bc9481bbab0a7fbc72069940e527

    // postDocs is not ordered, so we need to be able to find each entry by its ID
    const postDocsById = {};
    for (const doc of postDocs) {
      postDocsById[doc.id] = doc;
    }

    return recentPostIdArray
      .map(id => {
        // may be undefined if not found (i.e. recently deleted)
        const postDoc = postDocsById[id];
        if (!postDoc) {
          return null; // deleted post, up to you how to handle
        } else {
          return {
            _id: postDoc.id,
            ...postDoc.data()
          };
        }
      });
  }
}
// OPTIONAL CODE SEGMENT: Save Cached Index
if (!data.skipCache) { // allow option to bypass cache
  await admin.firestore()
    .doc("_serverCache/topRecentPosts")
    .set({
      timestamp: Date.now(),
      posts: orderedPosts
        .slice(0, 25) // cache the maximum expected amount
        .map(post => ({
          id: post.snapshot.id,
          like_count: post.like_count,
          time_posted: post.time_posted,
        }))
    });
}
Other improvements you could add to this function include:
A field mask - i.e. instead of returning every part of each post document, return just the title, like count, time posted and author (see the sketch after this list).
Variable post age (instead of 24 hours)
Variable minimum likes count
Filter by author
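A minimal sketch of the field-mask idea, replacing the final return statement of the function above (the title and author field names are assumptions; adjust them to your schema):
// return only selected fields instead of the whole document (hypothetical field names)
return orderedPosts
  .slice(0, limit)
  .map(post => ({
    _id: post.snapshot.id,
    title: post.snapshot.get("title"),
    author: post.snapshot.get("author"),
    like_count: post.like_count,
    time_posted: post.time_posted
  }));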

How to query an array of objects in a Firebase Cloud Function, to get a matching object and then update

I am using a scheduled task in a Firebase Cloud Function to query an array which contains a number of objects that need to be updated if a matching condition exists. My current attempt uses the 'array-contains' method to get the objects and then loops over them to find a matching condition, batch-updating the matching items. This is my data structure:
I need to find an object whose date is <= the current time and whose 'active' value is false.
export const liveMeetingsTrigger = functions.runWith({ memory: '1GB' }).pubsub
  .schedule('every 1 minutes').onRun(async context => {
    const now = admin.firestore.Timestamp.now();
    const liveMeetings = await admin.firestore().collection('fl_content').where('meeting', 'array-contains', 'liveMeetingDate').get();
    const batch = admin.firestore().batch();
    liveMeetings.forEach(doc => {
      if (doc.data().liveMeetingDate <= now && doc.data().active == false) {
        batch.update(doc.ref, 'active', true);
      }
    });
    return await batch.commit();
  });
I have also tried using an exact object in the query instead of just using 'liveMeetingDate', but I still get no results back. Any help would be great - thanks.
Debugging: As the array I am trying to reach is inside the (map) object 'liveMeetings', I have tried the dot notation (liveMeetings.meeting) with no success. Trying a new collection with the 'meeting' array at the top level has also not worked.
Simple logging in the console (liveMeetings.size) shows that nothing is being returned by the query, so the code never even reaches the loop.
As explained in this answer, the following query will not work:
const liveMeetings = await admin.firestore().collection('fl_content').where('meeting', 'array-contains', 'liveMeetingDate').get();
because the meeting array contains objects rather than "simple" primitive data (e.g. a string or number).
You could query it with the exact objects, like:
const obj = {active: false, liveMeetingDate: ..., meetingId: ..., ....};
const liveMeetings = await admin.firestore().collection('fl_content').where('meeting', 'array-contains', obj).get();
Another approach would be to create a new collection which contains the similar documents (same Document ID) but with a meeting Array that contains only the liveMeetingDate property.
Finally, note that since your Array is within a map, you need to do
await admin.firestore().collection('fl_content').where('liveMeetings.meeting', 'array-contains', ...).get();
(PS: I don't mark this question as duplicate since you expressly ask for more help in the comments of the duplicate question/answer)

Firebase best practice for counting lists [duplicate]

You can get the child count via
firebase_node.once('value', function(snapshot) { alert('Count: ' + snapshot.numChildren()); });
But I believe this fetches the entire sub-tree of that node from the server. For huge lists, that seems RAM and latency intensive. Is there a way of getting the count (and/or a list of child names) without fetching the whole thing?
The code snippet you gave does indeed load the entire set of data and then counts it client-side, which can be very slow for large amounts of data.
Firebase doesn't currently have a way to count children without loading data, but we do plan to add it.
For now, one solution would be to maintain a counter of the number of children and update it every time you add a new child. You could use a transaction to count items, like in this code tracking upvotes:
var upvotesRef = new Firebase('https://docs-examples.firebaseio.com/android/saving-data/fireblog/posts/-JRHTHaIs-jNPLXOQivY/upvotes');
upvotesRef.transaction(function (current_value) {
  return (current_value || 0) + 1;
});
For more info, see https://www.firebase.com/docs/transactions.html
UPDATE:
Firebase recently released Cloud Functions. With Cloud Functions, you don't need to create your own Server. You can simply write JavaScript functions and upload it to Firebase. Firebase will be responsible for triggering functions whenever an event occurs.
If you want to count upvotes for example, you should create a structure similar to this one:
{
  "posts" : {
    "-JRHTHaIs-jNPLXOQivY" : {
      "upvotes_count": 5,
      "upvotes" : {
        "userX" : true,
        "userY" : true,
        "userZ" : true,
        ...
      }
    }
  }
}
And then write a javascript function to increase the upvotes_count when there is a new write to the upvotes node.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
exports.countlikes = functions.database.ref('/posts/{postid}/upvotes').onWrite(event => {
  return event.data.ref.parent.child('upvotes_count').set(event.data.numChildren());
});
You can read the Documentation to know how to Get Started with Cloud Functions.
Also, another example of counting posts is here:
https://github.com/firebase/functions-samples/blob/master/child-count/functions/index.js
Update January 2018
The firebase docs have changed so instead of event we now have change and context.
The given example throws an error complaining that event.data is undefined. This pattern seems to work better:
exports.countPrescriptions = functions.database.ref(`/prescriptions`).onWrite((change, context) => {
  const data = change.after.val();
  const count = Object.keys(data).length;
  return change.after.ref.child('_count').set(count);
});
This is a little late in the game as several others have already answered nicely, but I'll share how I might implement it.
This hinges on the fact that the Firebase REST API offers a shallow=true parameter.
Assume you have a post object and each one can have a number of comments:
{
  "posts": {
    "$postKey": {
      "comments": {
        ...
      }
    }
  }
}
You obviously don't want to fetch all of the comments, just the number of comments.
Assuming you have the key for a post, you can send a GET request to
https://yourapp.firebaseio.com/posts/[the post key]/comments.json?shallow=true.
This will return an object of key-value pairs, where each key is the key of a comment and its value is true:
{
  "comment1key": true,
  "comment2key": true,
  ...,
  "comment9999key": true
}
The size of this response is much smaller than requesting the equivalent data, and now you can calculate the number of keys in the response to find your value (e.g. commentCount = Object.keys(result).length).
This may not completely solve your problem, as you are still calculating the number of keys returned, and you can't necessarily subscribe to the value as it changes, but it does greatly reduce the size of the returned data without requiring any changes to your schema.
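A minimal sketch of this from a web client (the post key and database URL are placeholders; note the .json suffix required by the REST API):
// hypothetical post key and database URL
const postKey = "somePostKey";
const url = "https://yourapp.firebaseio.com/posts/" + postKey + "/comments.json?shallow=true";

fetch(url)
  .then((response) => response.json())
  .then((result) => {
    const commentCount = result ? Object.keys(result).length : 0; // result is null when there are no comments
    console.log("comment count:", commentCount);
  });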
Save the count as you go - and use validation to enforce it. I hacked this together for keeping a count of unique votes, a question which keeps coming up! And this time I have tested my suggestion (notwithstanding cut/paste errors!).
The 'trick' here is to use the node priority as the vote count...
The data is:
vote/$issueBeingVotedOn/user/$uniqueIdOfVoter = thisVotesCount, priority=thisVotesCount
vote/$issueBeingVotedOn/count = 'user/'+$idOfLastVoter, priority=CountofLastVote
,"vote": {
".read" : true
,".write" : true
,"$issue" : {
"user" : {
"$user" : {
".validate" : "!data.exists() &&
newData.val()==data.parent().parent().child('count').getPriority()+1 &&
newData.val()==newData.GetPriority()"
user can only vote once && count must be one higher than current count && data value must be same as priority.
}
}
,"count" : {
".validate" : "data.parent().child(newData.val()).val()==newData.getPriority() &&
newData.getPriority()==data.getPriority()+1 "
}
count (last voter really) - vote must exist and its count equal newcount, && newcount (priority) can only go up by one.
}
}
Test script to add 10 votes by different users (for this example the IDs are faked; you should use auth.uid in production). Count down (i--) from 10 to see the validation fail.
<script src='https://cdn.firebase.com/v0/firebase.js'></script>
<script>
window.fb = new Firebase('https:...vote/iss1/');

window.fb.child('count').once('value', function (dss) {
  votes = dss.getPriority();
  for (var i = 1; i < 10; i++) vote(dss, i + votes);
});

function vote(dss, count) {
  var user = 'user/zz' + count; // replace with auth.id or whatever
  window.fb.child(user).setWithPriority(count, count);
  window.fb.child('count').setWithPriority(user, count);
}
</script>
The 'risk' here is that a vote is cast but the count is not updated (hacking or script failure). This is why the votes have a unique 'priority' - the script should really start by ensuring that there is no vote with a priority higher than the current count; if there is, it should complete that transaction before doing its own - get your clients to clean up for you :)
The count needs to be initialised with a priority before you start - Forge doesn't let you do this, so a stub script is needed (before the validation is active!).
Write a Cloud Function to update the node count.
// the function below keeps a count of the children of the given node
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

exports.userscount = functions.database.ref('/users/')
  .onWrite(event => {
    console.log('users number : ', event.data.numChildren());
    return event.data.ref.parent.child('count/users').set(event.data.numChildren());
  });
Refer to: https://firebase.google.com/docs/functions/database-events
root
 |- users       (this node contains the full list of users)
 |- count
     |- userscount   (this node is added dynamically by the cloud function with the user count)
