Google Cloud Datastore batch operation sequence - google-cloud-datastore

I'm wondering whether a batched get request to Datastore will preserve the order of the records when the request finishes.
I'm using the standard google-cloud/datastore client library to perform a batched get request, like this (from Datastore's docs):
const keys = [taskKey1, taskKey2];
datastore.get(keys)
  .then((results) => {
    // Tasks retrieved successfully.
    const tasks = results[0];
    console.log(tasks);
  });
I expected the results to be in the same order as in the keys array, but in my working example they are not.
So, am I doing something wrong, or does Google Datastore simply not preserve order in batch operations?

To get ordered results, you need to use a query. First, filter by key values, then order by the property(ies) you want.
See these docs for more info:
How to filter by Key
How to order results
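For illustration, here is a minimal sketch of both pieces with the Node.js client, following the patterns in those two docs; the 'Task' kind, the key value, and the 'created' property are illustrative names, not taken from the question:
// Key filter, along the lines of the "filter by key" docs:
const keyQuery = datastore
  .createQuery('Task')
  .filter('__key__', '>', datastore.key(['Task', 'someTask']));

// Ordering, along the lines of the "order results" docs:
const orderedQuery = datastore
  .createQuery('Task')
  .order('created');

datastore.runQuery(orderedQuery)
  .then(([tasks]) => {
    console.log(tasks); // entities arrive sorted by 'created'
  });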

Related

Firebase cloud functions: How to handle multiple triggers caused by another cloud function?

I am writing a web app on Firebase and have the following Firestore schema and data structure:
db.collection('users').doc({userid}) // Each doc stores data under a 'userinfo' field, which is an object (map).
db.collection('posts').doc({postid}) // Each doc contains 'userinfo', which is the data about the person who posted.
db.collection('saved').doc({userid}) // Each doc stores data under a 'saved' field, which is an array of carbon copies of documents in the 'posts' collection.
I am thinking about writing below cloud functions:
- Cloud function A: Listens to updates to a doc in the 'users' collection, and updates each doc in the 'posts' collection that contains the previous userinfo.
- Cloud function B: Listens to updates to a doc in the 'posts' collection, and updates each doc in the 'saved' collection that contains the previous postinfo.
The complication here is that cloud function A, once triggered, will update multiple documents in the 'posts' collection, each of which will again be a trigger; if the user has written 100 posts, there can be 100 triggers at once (a rough sketch of function A follows below).
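For reference, a rough sketch of what function A, as described above, could look like with Cloud Functions and the Firebase Admin SDK; the 'userinfo.userid' field used to find a user's posts is an assumption, since the question doesn't show how posts reference their author:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.onUserInfoUpdate = functions.firestore
  .document('users/{userid}')
  .onUpdate(async (change, context) => {
    const userinfo = change.after.data().userinfo;

    // Find every post written by this user (assumed 'userinfo.userid' field)...
    const posts = await admin.firestore()
      .collection('posts')
      .where('userinfo.userid', '==', context.params.userid)
      .get();

    // ...and copy the new userinfo into each of them.
    // Note: a single write batch is capped at 500 operations.
    const batch = admin.firestore().batch();
    posts.forEach((doc) => batch.update(doc.ref, { userinfo }));
    return batch.commit();
  });
Each of these writes to 'posts' would then fire function B, which is exactly the fan-out described above.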
To handle such a case, which of the following is the natural next step for me? I'm asking because the answer depends on how Firebase Cloud Functions handles this kind of situation, and I don't have much knowledge of that at the moment:
a) Write function B as a transaction, because Firebase Cloud Functions will handle this situation by queuing each of the triggers in some order.
b) Write function B to still listen to the update and reflect it in the 'saved' collection, but not as a transaction, to avoid massive backlogs caused by database lockups.
c) Rethink the database structure and/or the cloud function logic to avoid such a situation from the beginning.
There might not be a single right or wrong answer (or is there one in this case?), but I just wanted to get some guidance on direction before actually writing the code. Any advice? Thanks a lot in advance!

Optimizing the number of reads from firestore server using caching or snapshot listener

I am rendering the following view using Firebase. So basically the search is powered by a Firebase query.
I am using the following code:
Query query = FirebaseUtils.buildQuery(
    fireStore, 'customers', filters, lastDocument, documentLimit);
print("query =" + query.toString());
QuerySnapshot querySnapshot = await query.getDocuments();
print("Got reply from firestore. No of items =" + querySnapshot.documents.length.toString());
Questions:
If the user runs the same query again and again, it still hits the server. I checked this using doc.metadata.isFromCache, and it always returns false.
Will using query snapshots help reduce the number of reads for this search query? I guess not, as the user keeps changing the query.
Is there any other way to limit the number of reads?
If the user runs the same query again and again, it still hits the server. I checked this using doc.metadata.isFromCache, and it always returns false.
If you are online, it will always return false and that's the expected behavior since the listener is always looking for changes on the server. If you want to force the retrieval of the data from the cache while you are online, then you should explicitly specify this to Firestore by adding Source.CACHE to your get() call. If you're offline, it will always return true.
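For comparison, an explicit cache read in the JavaScript web SDK looks roughly like this (the 'customers' collection name follows the question above):
// Force this one get() to be served from the local cache instead of the server.
db.collection('customers')
  .get({ source: 'cache' })
  .then((snapshot) => {
    snapshot.forEach((doc) => {
      console.log(doc.id, doc.metadata.fromCache); // true when served from the cache
    });
  });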
Will using query snapshots help reduce the number of reads for this search query? I guess not, as the user keeps changing the query.
No, it won't. What does a query snapshot represent? It's basically an object that contains the results of your query. However, if you perform the same query again and again and nothing has changed on the server, you will not be charged for any read operations, because the second time you perform the query the results come from the cache. If you perform a new search each time, you'll be billed for a number of read operations equal to the number of elements returned by your query. Furthermore, if you create new searches and some of the returned elements are already in your cache, you'll be billed for a read operation only for the new ones.
Is there any other way to limit the number of reads?
The simplest method to limit the results of a query is to use a limit() call and pass as an argument the number of elements you want your query to return:
limit(10)
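For example, capping a query like the one in the question at 10 documents ('name' is an assumed ordering field):
db.collection('customers')
  .orderBy('name')  // assumed ordering field
  .limit(10)        // at most 10 documents are read and returned
  .get()
  .then((snapshot) => console.log(snapshot.size));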

Firebase implementation of order by field and count method

I have a collection of schedules whose documents have fields like this.
I want to get the route with the highest traffic by running a query that orders by the route field and counts occurrences of that field. My question is: how can I count after ordering by a field? This is my query right now:
scheduleByRoutes() {
  return this.afs.collection('schedules', ref => ref.orderBy('route', 'asc')).snapshotChanges();
}
There is no direct way to count the number of documents of each route "category" returned by your orderBy() query.
You should either:
1/ Count them from your client, iterating on the query results.
or
2/ If you know the different routes upfront, issue a query for each route and use the size() method of each QuerySnapshot. You may use Promise.all() to make these calls in parallel.
or
3/ Maintain some counters for each route in, for example, another collection. For that you would use a set of Cloud Functions that would update the counters upon Creation/Modification/Deletion.
Be aware that approaches #1 and #2 will cost a document read for each document of the collection. If your collection contains a lot of documents, you may use approach #3.
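As a rough sketch of approach #1, using the plain Firestore JS SDK (again, every returned document is a billed read; the field names follow the question):
db.collection('schedules')
  .orderBy('route', 'asc')
  .get()
  .then((snapshot) => {
    const counts = {};
    snapshot.forEach((doc) => {
      const route = doc.data().route;
      counts[route] = (counts[route] || 0) + 1;
    });
    // The route with the highest traffic:
    const busiest = Object.keys(counts).sort((a, b) => counts[b] - counts[a])[0];
    console.log(busiest, counts[busiest]);
  });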

Firebase realtime db not able to filter data with db-only rules

Firebase Realtime Database.
I am trying to limit the number of items returned from a query only by changing the database rules on Firebase.
Is this possible? I don't want to change the app-side code.
What is the rule if I have to fetch the top 100 using limitToFirst?
Firebase's server-side security rules merely determine whether a certain operation is allowed. They don't filter data by themselves.
If you want to retrieve the first 100 items, put a limitToFirst(100) in your query.
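For example, from the client (the 'items' path is an assumption for illustration):
firebase.database()
  .ref('items')
  .orderByKey()
  .limitToFirst(100)
  .once('value')
  .then((snapshot) => console.log(snapshot.numChildren()));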
If you only ever want the first 100 items to be retrieved (as in: you want other read operations to be rejected), have a look at the documentation on securing queries, which contains this example:
You can also use query-based rules to limit how much data a client downloads through read operations.
For example, the following rule limits read access to only the first 1000 results of a query, as ordered by priority:
messages: {
  ".read": "query.orderByKey &&
            query.limitToFirst <= 1000"
}
Example queries:
db.ref("messages").on("value", cb) // Would fail with PermissionDenied
db.ref("messages").limitToFirst(1000)
.on("value", cb) // Would succeed (default order by key)

Firebase firestore collection count with angularFire 2

I want to get the total number of documents that exist in Firestore.
I don't want to get the data, only the total number; inside the Products collection I have 200,000 items. Is that possible with Angular 4/5 (not AngularJS)?
Can someone tell me how I can achieve that?
This is my code so far, and it does not work:
get_total_messages() {
  this.messages_collection = this.afs.collection<MessageEntity>('messages');
  return this.messages_collection.snapshotChanges();
}
And this is how I try to get the data, but it is not what I want:
this.firebase_Service.get_total_messages().subscribe( data => {
  console.log(data);
});
There is no API to get the count of the number of documents in a Firestore collection. This means that the only ways to get the count are:
Get all documents and count them client-side.
Store the count as a separate property and update that as you add/remove documents.
Both approaches are quite common in NoSQL databases, with the second of course being a lot more efficient as the number of documents grows.
Firebase provides a sample of using Cloud Functions to keep a counter. While this sample is written for the Firebase Realtime Database, it can easily be modified to work on Cloud Firestore too.
Firestore also provides documentation on running aggregation queries and running distributed counters. Both seem slightly more involved than the first sample I linked though.
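As a rough sketch of the second option with Cloud Functions and firebase-admin; the 'messages' collection matches the code above, while the 'counters/messages' document is an assumed location for the counter:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const counterRef = admin.firestore().doc('counters/messages');

// Increment the counter whenever a message is created...
exports.onMessageCreate = functions.firestore
  .document('messages/{id}')
  .onCreate(() =>
    counterRef.set({ count: admin.firestore.FieldValue.increment(1) }, { merge: true }));

// ...and decrement it when one is deleted.
exports.onMessageDelete = functions.firestore
  .document('messages/{id}')
  .onDelete(() =>
    counterRef.set({ count: admin.firestore.FieldValue.increment(-1) }, { merge: true }));
The client then reads the single 'counters/messages' document instead of the whole collection.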
this.firebase_Service.get_total_messages().subscribe( data => this.totalnumber = data.length);
// now you can get the total number of messages

Luckily, I've solved it somehow using the code below. Try this; it works well:
this.db.collection('User').valueChanges()
  .subscribe( result => {
    console.log(result.length);
  })
