I'm creating a simple 2D multiplayer game with Unity, and I chose Firebase as the backend. I'm facing some issues when trying to fill up rooms using Firebase Cloud Functions. Here is how I planned this to work:
Player clicks the "Join Room" button
The unique device ID is sent to the Realtime Database under "Players Searching For Room", and an event listener is added to that ID
The Cloud Function is triggered by the 'onWrite' event. The function then checks whether the room array is empty. If it is, the Cloud Function pushes a new room to the Realtime Database
The Cloud Function pushes the room ID under the player ID in "Players Searching For Room"
Because the player is already listening to his own ID under "Players Searching For Room", a callback runs when the room ID is pushed under his ID. This tells the player that the Cloud Function successfully found a room for him.
Below is the Cloud function:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

// Create and Deploy Your First Cloud Functions
// https://firebase.google.com/docs/functions/write-firebase-functions

var room = [];
// ID of the room we are filling at the moment
var room_currentID;

// This function triggers each time something is added to or deleted from "Players Searching For Room"
exports.findRoom = functions.database
    .ref('/Players Searching For Room/{pushId}')
    .onWrite(event => {
        // Check if data exists (if not, this was triggered by a delete -> return)
        if (!event.data.exists()) {
            return;
        }
        // If this player already has a room, we want to return
        if (event.data.val().inRoom != "none")
            return;
        // Store the data under the changed pushId (if a player was added to waiting players then data is -> "size": 4)
        const data = event.data.val();
        // Name of the player. We get this from the pushId of the item we pushed ("container" for the pushed data).
        var name = event.params.pushId;
        // Size of the room the player wants to join
        var size = data.size;
        // With if-statements, check which room_size_n object array we want to loop through
        console.log("Player called " + name + " is waiting for a room that has a maximum of " + size + " players");
        // We can push the user to the room array since it can never be full
        // (we clear the array before allowing the next guy to join)
        room.push(name);
        // If this was the first guy, we need to create a new room
        if (room.length == 1) {
            admin.database().ref('/Rooms').push({
                onGoing: false // We need to set something, might as well set something useful
            }).then(snapshot => {
                // Because this function is triggered by changes in the Firebase Realtime Database,
                // we can't return anything to the player. BUT we can inform the player about the room
                // he's been attached to by adding the roomID to the player name in "Players Searching For Room";
                // the player's device will then handle the removal.
                // Store the ID of the room so that we can send it to later joiners of this room.
                room_currentID = snapshot.key;
                data.inRoom = room_currentID;
                return event.data.ref.set(data);
            });
        }
        // If there already exists a suitable room with space in it
        else if (room.length > 1) {
            // We can attach the stored roomID to the player so he knows which room's onGoing flag to watch.
            data.inRoom = room_currentID;
            // Attach the roomID to the player and then check if the room is full.
            // Waiting for the roomID to attach to the player before setting onGoing to TRUE
            // prevents other players from getting a head start.
            event.data.ref.set(data).then(snapshot => {
                // If the roomID was attached to the player, we can check the room size
                if (room.length == size) {
                    // ...and if the room became full we need to set onGoing to true
                    admin.database().ref('/Rooms/' + room_currentID).set({
                        onGoing: true
                    }).then(snapshot => {
                        room = [];
                    });
                }
            });
        }
    });
The problem is that if multiple users click the Join Game button within a short period of time, it messes up the system. Adding the player ID under "Players Searching For Room" works every time, but sometimes the Cloud Function never attaches a room ID to the player ID, and sometimes the Cloud Function creates more rooms than it should. I tested this simply by having a button that attached a random ID under "Players Searching For Room" each time it was clicked, then rapidly clicked that button 10 times. That should have attached 5 different room IDs to those 10 random IDs and generated 5 new rooms. Instead it generated 7 rooms and added room IDs to only 8 of the 10 random IDs.
The problem, I think, is:
Abe clicks the Join Game button at 12.00.00
The Cloud Function starts (execution takes 5 seconds)
Rob clicks the Join Game button at 12.00.02
The Cloud Function triggers again before finishing Abe's request
Everything gets messed up
Is it possible with Firebase to change this so that if Rob triggers the Cloud Function before Abe's request is done, Rob is put on hold until Abe finishes? When Abe is finished, it's Rob's turn. Ugh, awfully long explanation, hopefully somebody will read this :)
At Google I/O 2017, I gave a talk on building a multiplayer game using only Firebase on the backend. Cloud Functions implements pretty much all the logic of the game. It also has a simple matching feature, and you could extend that scheme to do more complicated things. You can watch the talk here, and source code for the project will be coming soon.
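The talk's matchmaking code isn't reproduced here, but one common way to avoid the race described in the question is to keep the "current room" state in the database instead of in a global variable, and mutate it with a transaction so concurrent invocations retry instead of overwriting each other. Below is only a sketch under those assumptions; the node names mirror the question, and the /Matchmaking/current node is invented for illustration:
exports.findRoom = functions.database
    .ref('/Players Searching For Room/{pushId}')
    .onWrite(event => {
        if (!event.data.exists() || event.data.val().inRoom != "none") {
            return;
        }
        const size = event.data.val().size;
        // Hypothetical node holding the room currently being filled.
        const matchmakingRef = admin.database().ref('/Matchmaking/current');
        // The transaction is retried automatically if another invocation
        // modified /Matchmaking/current concurrently, so no two players
        // can claim the same slot.
        return matchmakingRef.transaction(current => {
            if (current === null || current.count >= size) {
                // Start a new room; push() without arguments only generates a key.
                return { roomID: admin.database().ref('/Rooms').push().key, count: 1 };
            }
            current.count = current.count + 1;
            return current;
        }).then(result => {
            const assigned = result.snapshot.val();
            // Creating the /Rooms entry and flipping onGoing to true once
            // count reaches size are omitted for brevity.
            return event.data.ref.child('inRoom').set(assigned.roomID);
        });
    });
Because the counter lives in the database rather than in the function instance, two concurrent invocations can no longer both see an "empty" room array, which is what produced the extra rooms.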
Related
I have data stored in Firebase as below.
The planned start and planned finish are calculated mathematically:
planned finish = planned start + duration
planned start = planned finish of the dependent sl no
For sl no 10 the dependent is 9, so the planned start of 10 is equal to the planned finish of 9.
My problem is: if I update the planned finish of sl no 9, how can I make sl no 10 automatically update its planned start and finish?
The code showing how the data is saved is below:
if (slno != null && activity != null) {
  if (action == 'create') {
    // Persist a new product to Firestore
    Future<Timestamp> getslno() async {
      var document = await _schedule.doc(dependent.toString()).get();
      Timestamp sl = document['plannedfinish'];
      return sl;
    }

    var planneds = await getslno();
    await _schedule.doc(slno.toString()).set({
      "slno": slno,
      "activity": activity,
      "duration": duration,
      "dependent": dependent,
      "plannedstart": planneds,
      "plannedfinish": planneds
          .toDate()
          .add(Duration(days: duration!.toInt()))
    });
  }

  if (action == 'update') {
    // Update the product
    await _schedule.doc(documentSnapshot!.id).update({
      "slno": slno,
      "activity": activity,
      "duration": duration,
      "dependent": dependent
    });
  }
}
I need all the stored data to be updated automatically if at least one value changes.
Any suggestions on how to do this?
Use Firebase Cloud Functions and utilize the Firestore document onUpdate trigger. Even if you don't know JavaScript, the logic of your equation is easy to implement, and the Firebase Admin SDK for Firestore is largely similar to what you are already writing in Flutter.
That would be the ideal solution. Otherwise, whenever someone updates a document, you would need another function in your app to fetch the dependent document, recalculate, and update it again on the client side, which isn't very favored.
Enabling Cloud Functions asks for billing info, but you aren't billed until after two million API invocations. I would strongly recommend you dive into this and unleash the power.
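As a rough illustration of that suggestion, a minimal sketch of such a trigger might look like the following. It assumes the collection is named schedule, that documents are keyed by slno as in the question's code, and it reuses the question's field names; adjust these to match your actual project:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.propagatePlannedDates = functions.firestore
    .document('schedule/{slno}')
    .onUpdate(async (change, context) => {
        const before = change.before.data();
        const after = change.after.data();

        // Only act when the planned finish actually changed.
        if (before.plannedfinish.isEqual(after.plannedfinish)) {
            return null;
        }

        // Find every row whose dependent is this sl no (e.g. 10 depends on 9).
        const dependents = await admin.firestore()
            .collection('schedule')
            .where('dependent', '==', after.slno)
            .get();

        // planned start = this row's planned finish;
        // planned finish = planned start + duration (in days).
        const updates = dependents.docs.map(doc => {
            const startDate = after.plannedfinish.toDate();
            const finishDate = new Date(
                startDate.getTime() + doc.data().duration * 24 * 60 * 60 * 1000);
            return doc.ref.update({
                plannedstart: after.plannedfinish,
                plannedfinish: admin.firestore.Timestamp.fromDate(finishDate)
            });
        });
        return Promise.all(updates);
    });
Because each of those updates fires the trigger again, a change to sl no 9 cascades down the whole dependency chain (10, then 11, and so on) automatically.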
I have an events parent collection that has an Attendee subcollection recording all users that will attend the event. The Attendee subcollection contains user data.
I also have a users parent collection that has an attendedEvents subcollection recording all events that the user will attend. The attendedEvents subcollection contains event data.
I use denormalization, so the event data is duplicated in the attendedEvents subcollection.
I then make a cron job using a Cloud Function. This cron job's task is to evaluate whether an event has passed (expired) or not. If the event has passed, the function should:
update the field of the event data from isActive == true to isActive == false
read all the Attendee documents of all expired events, get all the attendeeIDs, and then delete all the corresponding event data in the attendedEvents subcollection of the users collection.
As you can see, the second task of my cron job function may need to read around 50,000 - 100,000 documents and then delete around 50,000 - 100,000 documents in the worst-case scenario (peak).
So my question is: is it OK to perform thousands of read and delete operations in one Cloud Function like this?
I am worried there is a limitation that I don't know about. Is there something I have not considered? Is there a better approach for this, maybe?
Here is my cloud function code:
exports.cronDeactivatingExpiredEvents = functions.https.onRequest(async (request, response) => {
    const now = new Date()
    const oneMonthAgo = moment().subtract(1, "month").toDate()
    try {
        const expiredEventsSnapshot = await eventRef
            .where("isActive", "==", true)
            .where("hasBeenApproved", "==", true)
            .where("dateTimeStart", ">", oneMonthAgo)
            .where("dateTimeStart", "<", now)
            .get()
        const eventDocumentsFromFirestore = expiredEventsSnapshot.docs

        // 1. update isActive to false on every expired event document
        const updateEventPromises = eventDocumentsFromFirestore.map(eventSnapshot => {
            const event = eventSnapshot.data()
            return admin.firestore()
                .doc(`events/${event.eventID}`)
                .update({ isActive: false })
        })
        await Promise.all(updateEventPromises)
        console.log(`Successfully deactivated ${expiredEventsSnapshot.size} expired events in Firestore`)

        for (const eventSnapshot of eventDocumentsFromFirestore) {
            const eventID = eventSnapshot.data().eventID

            // 2. get all attendeeIDs of this expired event.
            // this may need to read around 50,000 documents
            const eventAttendeeSnapshot = await db.collection("events").doc(eventID).collection("Attendee").get()
            const attendeeIDs = eventAttendeeSnapshot.docs.map(attendeeSnapshot => attendeeSnapshot.data().uid)

            // 3. then delete the expired event from each user's attendedEvents subcollection.
            // this may need to delete around 50,000 documents
            const deletePromises = attendeeIDs.map(attendeeID =>
                db.collection("users").doc(attendeeID).collection("attendedEvents").doc(eventID).delete()
            )
            await Promise.all(deletePromises)
        }
        console.log(`Successfully deleted all event data in the users' subcollections`)
        response.status(200).send(`Successfully deactivated ${expiredEventsSnapshot.size} expired events and deleted the event data in the attendedEvents subcollections`)
    } catch (error) {
        response.status(500).send(error)
    }
})
You have to pay attention to a few things here.
1) There are some limits on the Cloud Functions side. A quota you might hit, depending on how you use the data you're reading, is Outbound Socket Data, which is 10GB/100 seconds excluding HTTP response data. In case you hit this quota you can request a quota increase by going to IAM & admin >> Quotas >> Edit Quotas and selecting Cloud Functions API (outgoing socket traffic for the region you want).
However, there is also the maximum function duration of 540 seconds. I believe what you have described should not take that long. In case it does, and you are committing a batched delete, the deletion will still be carried out even if your function fails by exceeding the duration.
2) On the Firestore side, you have some limits too. Here you can read about some best practices when dealing with read/write operations and high read, write, and delete rates. Depending on the structure and the type of your data, you might encounter issues such as connection errors if you try to delete lexicographically close documents at a high rate.
Also keep in mind the more generic Firestore quotas on the number of read/write operations for each payment plan.
In any case, even with the best calculations there is always room for error. So my advice would be to run a test scenario with the highest peak you are expecting. If you hit any quotas you can request a quota increase, or if you hit any hard limits you can contact Google Cloud Platform Support with specific details on your project and use case.
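Since the answer mentions batched deletes, here is a minimal sketch of chunking the per-user deletions into Firestore write batches (a batch accepts at most 500 operations); the collection and field names simply mirror the question:
async function deleteAttendedEvents(db, attendeeIDs, eventID) {
    const BATCH_SIZE = 500; // Firestore's hard limit per batch
    for (let i = 0; i < attendeeIDs.length; i += BATCH_SIZE) {
        const batch = db.batch();
        for (const attendeeID of attendeeIDs.slice(i, i + BATCH_SIZE)) {
            batch.delete(
                db.collection("users").doc(attendeeID)
                    .collection("attendedEvents").doc(eventID))
        }
        // Committing sequentially keeps the delete rate modest,
        // which helps avoid the high-rate issues mentioned above.
        await batch.commit()
    }
}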
I have one Firebase database instance, and I would like to add a counter to a certain node.
Every time a user runs a specific action, I would like to increment the node's value. How do I do that without getting synchronization problems? How can I use Cloud Functions to do that?
Ex.:
database {
    node {
        counter: 0
    }
}
At a certain time, 3 different users read the value of counter and try to increment it. As they read at the exact same time, all of them read "0" and increment it to "1", but the desired value at the end of execution should be "3", since it was read 3 times.
================== update ==================
@renaud suggested using transactions to keep the saved data synchronized, but I have another scenario where I need the synchronization on the read side as well:
e.g.
The user reads the actual value and, according to it, performs a different action, finishing by incrementing the value by one...
In an SQL-like environment I would write a stored procedure for this, because no matter what the user does with the info, I will always finish by incrementing by one.
If I understood @renaud's answer correctly, in that scenario 4 different users reading the database at the same time would all get 0 as the current value; with the transactional update the final stored value would be 4, but on the client side each of them would only have read 0.
You have to use a Transaction in this case; see https://firebase.google.com/docs/database/web/read-and-write#save_data_as_transactions and also https://firebase.google.com/docs/reference/js/firebase.database.Reference#transaction
A Transaction will "ensure there are no conflicts with other clients writing to the same location at the same time."
In a Cloud Function you could write your code along the following lines, for example:
....
const counterRef = admin.database().ref('/node/counter');

return counterRef
    .transaction(current_value => {
        return (current_value || 0) + 1;
    })
    .then(counterValue => {
        if (counterValue.committed) {
            // For example, update another node in the database
            const updates = {};
            updates['/nbrOfActionsExecuted'] = counterValue.snapshot.val();
            return admin.database().ref().update(updates);
        }
    });
or simply the following, if you just want to update the counter (since a transaction returns a promise, as explained in the second link referred to above):
exports.testTransaction = functions.database.ref('/path').onWrite((change, context) => {
    const counterRef = admin.database().ref('/node/counter');
    return counterRef
        .transaction(current_value => {
            return (current_value || 0) + 1;
        });
});
Note that, in this second case, I have used a Realtime Database trigger as an example of trigger.
I am building a recommender system where I use Firebase to store and retrieve data about movies and user preferences.
Each movie can have several attributes, and the data looks as follows:
{
    "titanic": {
        "1997": 1, "english": 1, "dicaprio": 1, "romance": 1, "drama": 1
    },
    "inception": {
        "2010": 1, "english": 1, "dicaprio": 1, "adventure": 1, "scifi": 1
    }
    ...
}
To make the recommendations, my algorithm requires all the data (movies) as input, matched against a user profile.
However, in production I need to retrieve over 10,000 movies. While the algorithm can handle this relatively fast, loading the data from Firebase takes a lot of time.
I retrieve the data as follows:
firebase.database().ref(moviesRef).on('value', function(snapshot) {
    // snapshot.val();
}, function(error) {
    console.log(error)
});
I am therefore wondering if you have any thoughts on how to speed things up? Are there any plugins or techniques known to solve this?
I am aware that denormalization could help split the data up, but the problem is that I really need ALL the movies and ALL the corresponding attributes.
My suggestion would be to use Cloud Functions to handle this.
Solution 1 (Ideally)
If you can calculate suggestions every hour / day / week
You can use a Cloud Functions cron to fire daily / weekly and calculate recommendations per user. This way you can achieve a result more or less similar to what Spotify does with their weekly playlists / recommendations.
The main advantage is that your users wouldn't have to wait for all 10,000 movies to be downloaded: a cloud function would run every Sunday night, compile a list of 25 recommendations, and save them into each user's data node, which you can download when the user accesses their profile.
Your cloud function code would look like this:
var movies, allUsers;

exports.weekly_job = functions.pubsub.topic('weekly-tick').onPublish((event) => {
    getMoviesAndUsers();
});

function getMoviesAndUsers () {
    firebase.database().ref(moviesRef).on('value', function(snapshot) {
        movies = snapshot.val();
        firebase.database().ref(allUsersRef).on('value', function(snapshot) {
            allUsers = snapshot.val();
            createRecommendations();
        });
    });
}

function createRecommendations () {
    // do something magical with movies and allUsers here.
    // then write the recommendations to each user's profile, kind of like:
    userRef.update({"userRecommendations" : {"reco1" : "Her", "reco2" : "Black Mirror"}});
    // etc.
}
Forgive the pseudo-code. I hope this gives an idea though.
Then on your frontend you would have to get only the userRecommendations for each user. This way you can shift the bandwidth and computing from the user's device to a cloud function. In terms of efficiency, without knowing how you calculate recommendations, I can't make any suggestions.
Solution 2
If you can't calculate suggestions every hour / day / week, and you have to do it each time the user accesses their recommendations panel
Then you can trigger a cloud function every time the user visits their recommendations page. A quick cheat solution I use for this is to write a value into the user's profile, like {getRecommendations: true}, once on page load, and then in Cloud Functions listen for changes to getRecommendations. As long as you have a structure like this:
userID > getRecommendations : true
and proper security rules so that each user can only write to their own path, this method also gets you the correct userID making the request, so you will know which user to calculate recommendations for. A cloud function can most likely pull 10,000 records faster and save the user bandwidth, and it would finally write only the recommendations to the user's profile (similar to Solution 1 above). Your setup would look like this:
[Frontend Code]
//on pageload
userProfileRef.update({"getRecommendations" : true});
userRecommendationsRef.on('value', function(snapshot) { gotUserRecos(snapshot.val()); });
[Cloud Functions (Backend Code)]
exports.userRequestedRecommendations = functions.database.ref('/users/{uid}/getRecommendations').onWrite(event => {
    const uid = event.params.uid;
    firebase.database().ref(moviesRef).on('value', function(snapshot) {
        movies = snapshot.val();
        firebase.database().ref(userRefFromUID).on('value', function(snapshot) {
            usersMovieTasteInformation = snapshot.val();
            // do something magical with movies and the user's preferences here.
            // then
            return userRecommendationsRef.update({"getRecommendations" : {"reco1" : "Her", "reco2" : "Black Mirror"}});
        });
    });
});
Since your frontend will be listening for changes at userRecommendationsRef, as soon as your cloud function is done, your user will see the results. This might take a few seconds, so consider using a loading indicator.
P.S. 1: I ended up using more pseudo-code than originally intended, and removed error handling etc., hoping that this generally gets the point across. If there's anything unclear, comment and I'll be happy to clarify.
P.S. 2: I'm using a very similar flow for a mini-internal-service I built for one of my clients, and it's been happily operating for longer than a month now.
Firebase NoSQL JSON structure best practice is to "avoid nesting data", but you said you don't want to change your data. So, given your constraints, you can make REST calls to any particular node (the node of each movie) in Firebase.
Solution 1) You can create a fixed number of threads via a ThreadPoolExecutor. From each worker thread you can make an HTTP (REST) request as below. Based on your device's performance and memory, you can decide how many worker threads you want to run via the ThreadPoolExecutor. You can have a code snippet something like this:
/* Creates a thread pool that reuses a fixed number of worker threads */
ExecutorService threadPoolExecutor = Executors.newFixedThreadPool(10); /* 10 different worker threads */

for (int i = 0; i < 100; i++) { /* load the first 100 movies */
    /* the loop variable must be effectively final to be captured by the lambda */
    final int index = i;
    /* you can use your 10 different threads to read 10 movies at a time */
    threadPoolExecutor.execute(() -> {
        /* OkHttp request; urlStr can be something like
           "https://earthquakesenotifications.firebaseio.com/movies?print=pretty" */
        Request request = new Request.Builder().url(urlStr + "/" + index).build();
        /* Note: Firebase, by default, stores an index for every array element.
           Since you store all your movies in one JSON array, it is easiest to
           read the first movie (0) from the first worker thread, the second (1)
           from the second worker thread, and so on. */
        try {
            /* OkHttpClient is the HTTP client used to execute the request */
            Response response = new OkHttpClient().newCall(request).execute();
            String str = response.body().string();
            /* parse str and collect the movie here */
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
}
threadPoolExecutor.shutdown();
Solution 2) Solution 1 is not based on the listener-observer pattern. Actually, Firebase uses push technology: whenever a particular node changes in the Firebase NoSQL JSON tree, every client that has registered a listener for that node gets the new data via onDataChange(DataSnapshot dataSnapshot). For this you can attach a listener to each movie node, like below:
DatabaseReference moviesRef = FirebaseDatabase.getInstance().getReference().child("movies");
for (int i = 0; i < 100; i++) { /* one listener per movie node */
    moviesRef.child(String.valueOf(i)).addValueEventListener(new ValueEventListener() {
        @Override
        public void onDataChange(DataSnapshot dataSnapshot) {
            /* show the i-th movie in your list. Even with a RecyclerView, showing each
               movie in its own item is still slow, so store the movie in a Movies
               ArrayList first; when everything completes, update the RecyclerView. */
        }

        @Override
        public void onCancelled(DatabaseError databaseError) {
        }
    });
}
Although you stated your algorithm needs all the movies and all attributes, it does not mean that it processes them all at once. Any computation unit has its limits, and within your algorithm, you probably chunk the data into smaller parts that your computation unit can handle.
Having said that, if you want to speed things up, you can modify your algorithm to parallelize fetching and processing of the data/movies:
| fetch    | -> | process  | -> | fetch    | ...
| chunk(1) |    | chunk(1) |    | chunk(3) |

(in parallel)   | fetch    | -> | process  | ...
                | chunk(2) |    | chunk(2) |
With this approach, you can hide almost the whole processing time (all but the last chunk) if processing is really faster than fetching (but you have not said how "relatively fast" your algorithm runs compared to fetching all the movies).
This "high-level" approach to your problem is probably your best chance if fetching the movies is really slow, although it requires more work than simply activating a hypothetical "speed up" button in a library. It is a sound approach when dealing with large chunks of data.
For example, I have the following database structure:
/
+ users
  + 1
    + items
      + -xxx: "hello"
  + 2
    + items
Then:
var usersRef = new Firebase("https://mydb.firebaseio.com/users");
usersRef.on("child_changed", function(snapshot) {
utils.debug(JSON.stringify(snapshot.exportVal()));
});
If a value, "world", is pushed to "/users/1/items", I may get:
{"items": {"-xxx": "hello", "-yyy": "world"}}
So, how do I tell which one changed?
Do I need to call on("child_added") on every single ref "/users/$id/items"?
NOTE: I'm trying to write an admin process in Node.js.
The child_changed event only provides information on which immediate child has changed. If a node deeper in a data structure changed, you'll know which immediate child was affected but not the full path to the changed data. This is by design.
If you want granular updates about exactly what changed, you should attach callbacks recursively to all of the elements you care about. That way, when an item changes, you'll know what the item was by which callback is triggered. Firebase is actually optimized for this use case; attaching large numbers of callbacks -- even thousands -- should work fine. Behind the scenes, Firebase aggregates all of the callbacks together and only synchronizes the minimum set of data needed.
So, for your example, if you want to get alerted every time a new item is added for any user, you could do the following:
var usersRef = new Firebase("https://mydb.firebaseio.com/users");
usersRef.on("child_added", function(userSnapshot) {
    userSnapshot.ref().child("items").on("child_added", function(itemSnapshot) {
        utils.debug(itemSnapshot.val());
    });
});
If you are working with a very large number of users (hundreds of thousands or millions), and synchronizing all of the data is impractical, there's another approach. Rather than have your server listen to all of the data directly, you could have it listen to a queue of changes. Then when clients add items to their item lists, they could also add an item to this queue so that the server becomes aware of it.
This is what the client code might look like:
var itemRef = new Firebase("https://mydb.firebaseio.com/users/MYID/items");
var serverEventQueue = new Firebase("https://mydb.firebaseio.com/serverEvents");
itemRef.push(newItem);
serverEventQueue.push(newItem);
You could then have the server listen for child_added on that queue and handle the events when they come in.
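To complete the picture, the server side of that queue might look something like this sketch (using the same legacy SDK as the answer's code; handleEvent is a placeholder for your own processing):
var serverEventQueue = new Firebase("https://mydb.firebaseio.com/serverEvents");
serverEventQueue.on("child_added", function(snapshot) {
    handleEvent(snapshot.val()); // your own processing logic
    snapshot.ref().remove();     // dequeue the event once it has been handled
});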
Andrew Lee gave a nice answer, but I think you should try to use Cloud Functions. Something like this should work:
exports.getPath = functions.database.ref('/users/{id}/items/{itemId}')
    .onWrite(event => {
        // Grab the current value of what was written to the Realtime Database.
        const original = event.data.val();
        console.log('user id', event.params.id);
        console.log('item id', event.params.itemId);
    });