I just created a mobile game with Flutter and would like to extend it to a multiplayer online game. So I thought about building that with Firebase. I'm rather new to this domain, so I hope my questions don't sound too stupid.
As the characters in my game move around fluidly, I need to update their positions at least once per frame, so with Firebase I would need a cloud function that runs about 60 times per second and never ends. This function would take all the objects in the game (which are stored in the Realtime Database), update their positions according to their speed, and write the updated values back to the database, to which the clients listen. The clients would then update the values in the game and render the objects at the correct positions. Whenever a player interacts with the game, another cloud function would be called to handle that and update whatever is needed.
Is it possible to build something like this, and does it make sense? Using cloud functions this way does not sound very smooth to me, but I don't have another idea for implementing it. And how would it affect the costs? It sounds like a lot of function invocations and database reads/writes.
If this is totally unthinkable, what could be alternatives? I wouldn't mind too much using another database and/or framework to make the game itself.
Thank you very much
Communicating positions and the like through a database won't be fast enough for you to do it 60 times per second. You'll have to use some form of mixed UDP and TCP solution, and unfortunately that doesn't exist pre-built for Flame like it does for some game engines, but it might not be too complicated to build one yourself, depending on how complex your use case is.
Like @Fattie suggests, it could be a good idea to try out whether PubNub works for you.
Update: There is now an early-stage Nakama (open-source game server) integration for Flutter here.
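If you do end up rolling something yourself, a very rough sketch of a relay server might look like the following. This assumes Node.js with the ws package and a made-up { playerId, x, y } message shape, and it is WebSocket (TCP) only rather than the mixed UDP/TCP setup mentioned above - a starting point, not a finished solution:

```typescript
// Minimal WebSocket position relay (sketch; assumes Node.js and the `ws` package).
// The message shape ({ playerId, x, y } as JSON) is made up for illustration.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket: WebSocket) => {
  socket.on("message", (data) => {
    // Relay every position update to all other connected clients.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});
```

The clients would then serialize their position each frame, send it over the socket, and render the updates they receive from the other players.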
Firebase has utterly no connection to what you are trying to do.
You just use Unity and one of the realtime multiplayer services - like Photon.
It is inconceivable that, as a hobbyist, you'd start from scratch and try to write realtime, interframe, predictive quaternion networking!
In five minutes ...
https://doc.photonengine.com/en-us/pun/v2/demos-and-tutorials/pun-basics-tutorial/intro
It takes 5 minutes to download the Photon/Unity demo and be playing a realtime multiuser game with robots running around.
You then just learn how to add guns or whatever and add your own models.
Flame ...
Flame is spectacular, BTW. To do a realtime multiplayer demo with Flutter/Flame, I'd suggest just using PubNub, which would take no time.
http://pubnub.com/developers/demos/codoodler (all of the demos are terrific). It's the world's major realtime backbone, nothing is faster, and they support Flutter - so that's about it.
I'm making a turn-based game, kind of like multiplayer checkers, that works with the Firebase Realtime Database, so each client needs to know when moves are made.
I'm limited by a third-party framework that only allows REST API requests, but doesn't allow REST API streaming because there is no way to "Set the client's Accept header to text/event-stream" or "Respect HTTP Redirects, in particular HTTP status code 307".
So, I'm thinking of reading the database with GET requests every second to see if there is new data, but I'm worried that this could be inefficient in terms of data and cause a large bill. How much worse is this solution than a REST API streaming one and is it practical?
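For reference, the kind of polling I have in mind looks roughly like this - a sketch only, where the database URL and the games/{gameId} path are placeholders:

```typescript
// Sketch of 1-second polling against the Realtime Database REST API.
// DATABASE_URL and the /games/demo-game path are placeholders for illustration.
const DATABASE_URL = "https://<your-project>.firebaseio.com";

async function pollGame(gameId: string): Promise<void> {
  // A GET on <path>.json returns the current data at that location.
  const res = await fetch(`${DATABASE_URL}/games/${gameId}.json`);
  const game = await res.json();
  console.log("Current game state:", game);
  // Compare against the previously seen state here to detect new moves.
}

// Each poll is a billed read of the whole node, every second, per client.
setInterval(() => pollGame("demo-game").catch(console.error), 1000);
```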
Since response time is very critical in multiplayer games, I think you should also consider how inefficient this may be in terms of user experience. But of course that will depend on how the game works.
But if you think it is OK for users to have a 1000 ms delay, then the question becomes how many players will be playing the game daily and how long each game takes to finish (turn-wise).
(avg. turns per game) × (avg. # of players in a single game) × (games played per day) will be the minimum number of reads for the gameplay part alone. Also, you must consider whether you will have to constantly check multiple documents. There will probably be many writes and reads in the other parts of the game as well.
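To put hypothetical numbers on that formula: at 30 turns per game, 2 players per game and 1,000 games per day, that is already 60,000 reads per day just for the moves - and polling every second also bills a read for every second in which nothing changed at all.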
So overall, I think this is a very inefficient way to solve the problem.
What platform are you using? Maybe someone could find a way around the limitation.
Firebase provides callback listeners for requests. You can attach a ChildEventListener to your query to track real-time changes in your database. As long as it stays connected, it is considered a single request.
Refer to this link
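For completeness, here is a sketch of such a listener using the Firebase JavaScript SDK's onChildAdded, the closest equivalent of Android's ChildEventListener. The config object and the games/demo-game/moves path are placeholders, and note that the asker's framework is limited to REST, so this may not be usable there:

```typescript
// Sketch of a realtime child listener with the Firebase JS SDK.
// The databaseURL and the games/demo-game/moves path are placeholders.
import { initializeApp } from "firebase/app";
import { getDatabase, ref, onChildAdded } from "firebase/database";

const app = initializeApp({ databaseURL: "https://<your-project>.firebaseio.com" });
const db = getDatabase(app);

const movesRef = ref(db, "games/demo-game/moves");
onChildAdded(movesRef, (snapshot) => {
  // Fires once for each existing move and again whenever a new move is pushed.
  console.log("Move received:", snapshot.key, snapshot.val());
});
```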
I'm writing a small game for Android in Unity. Basically, the player has to guess what's on the photo. Now my boss wants me to add an additional feature: after a successful/unsuccessful guess, the player gets a panel to rate the photo (basically like or dislike), because we want to track which photos are not good and remove photos after a couple of successful guesses.
My understanding is that if we want to add +1 to a value in Firebase, first I have to make a call to get it, and then make a separate call that writes back the value we got plus 1. I was wondering if there is a more efficient way to do it?
Thanks for any suggestions!
Instead of requesting Firebase at the moment you want to add, you can request Firebase at the beginning (in an onCreate-like method), save the object, and then use it when you want to update it.
thanks
Well, one thing you can do is store your data temporarily in some object, but NOT send it to Firebase right away. Instead, you can send the data to Firebase when the app/game is about to be paused/minimized, hence reducing potential lags and increasing player satisfaction. OnApplicationPause(bool) is one such function that gets called when the game is minimized.
To do what you want, I would recommend using a Transaction instead of just doing a SetValueAsync. This lets you change values in your large shared database atomically, by first running your transaction against the local cache and later against the server data if it differs (see this question/answer).
This touches on some of the more interesting parts of the Firebase Unity plugin. Reads/writes run against your local cache, so you can do things like attach a listener to the "likes" node of a picture. As your cache syncs online and your transaction runs, this callback is triggered asynchronously, letting you keep the value up to date without worrying about syncing during app launch/shutdown or writing your own caching logic. This also means that, generally, you don't have to worry too much about your online/offline state throughout your game.
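The question is about the Unity (C#) SDK, but the transaction idea is the same across SDKs. Here is a minimal sketch with the JavaScript SDK, assuming a hypothetical photos/{photoId}/likes counter:

```typescript
// Sketch of an atomic "+1" using runTransaction (JS SDK shown; the Unity SDK's
// transaction API works on the same principle). The photos path is a placeholder.
import { initializeApp } from "firebase/app";
import { getDatabase, ref, runTransaction } from "firebase/database";

const app = initializeApp({ databaseURL: "https://<your-project>.firebaseio.com" });
const db = getDatabase(app);

async function likePhoto(photoId: string): Promise<void> {
  const likesRef = ref(db, `photos/${photoId}/likes`);
  // The update function runs against the local cache first and is retried
  // against the server value if it differs, so concurrent likes are not lost.
  await runTransaction(likesRef, (currentLikes: number | null) => {
    return (currentLikes ?? 0) + 1;
  });
}

likePhoto("photo-123").catch(console.error);
```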
After watching a fair number of YouTube videos, it seems that Google advocates multipath updates when changing data stored in multiple places. However, the more I've messed with Cloud Functions, the more they seem like an even more viable option, as they can just sit in the back, listen for changes to a specific reference, and push changes as needed to the other references in real time. Is there a con to going this route? Just curious as to why Google doesn't recommend them for this use case.
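For context, by a multipath update I mean something roughly like the following - a sketch with the JS SDK, where the posts/user-posts paths are made up for illustration:

```typescript
// Sketch of a client-side multipath (fan-out) update: one atomic write touches
// every copy of the data. The /posts and /user-posts paths are made up.
import { initializeApp } from "firebase/app";
import { getDatabase, ref, update, push, child } from "firebase/database";

const app = initializeApp({ databaseURL: "https://<your-project>.firebaseio.com" });
const db = getDatabase(app);

function createPost(uid: string, post: { title: string; body: string }) {
  const postKey = push(child(ref(db), "posts")).key;
  const updates: Record<string, unknown> = {
    [`/posts/${postKey}`]: post,
    [`/user-posts/${uid}/${postKey}`]: post,
  };
  // Either every path is written or none of them is.
  return update(ref(db), updates);
}

createPost("user-1", { title: "Hello", body: "First post" }).catch(console.error);
```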
NEWER UPDATE: Literally as I was writing this, I received a response from Google regarding my issues. It's too late to turn our app's direction around at this point, but it may be useful for someone else.
If your function doesn't return a value, then the server doesn't know how long to wait before giving up and terminating it. I'd wager a quick guess that this might be why the DB calls aren't getting invoked.
Note that since DatabaseReference.set() returns a promise, you can simply return that if you want.
Also, you may want to add a .catch() and log the output to verify the set() op isn't failing.
- firebase-support@google.com
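Applied to a database trigger, that advice would look roughly like this - a sketch only, with made-up /scores and /leaderboard paths:

```typescript
// Sketch of a database trigger that returns the write promise so the runtime
// knows when the work is done. The /scores and /leaderboard paths are made up.
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

export const mirrorScore = functions.database
  .ref("/scores/{userId}")
  .onWrite((change, context) => {
    const score = change.after.val();
    // Returning the promise keeps the function alive until set() finishes;
    // .catch() surfaces failures in the function logs.
    return admin
      .database()
      .ref(`/leaderboard/${context.params.userId}`)
      .set(score)
      .catch((err) => {
        console.error("Mirror write failed:", err);
      });
  });
```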
UPDATE: My experience with Cloud Functions over the last month or so has been sort of a love-hate relationship. A lot of our denormalized data relied on Cloud Functions to keep everything in sync. Unfortunately (and this was a bad idea from the start), we were dealing with transactional/money data, and storing that in multiple areas was uncomfortable. When we started having issues with Cloud Functions, i.e. their execution on a DB listener was not 100% reliable, we knew that Firebase would not work, at least for our transaction data.
Overall the concept is awesome. They work amazingly well when they trigger, but due to some inconsistencies in triggering the functions, they weren't reliable enough for our use case.
We're currently using SQL for our transactional data, and we store user data and other objects that need to be maintained in real time in Firebase. So far that's working pretty well for us.
I know this issue may have been raised multiple times, but I have read most of the available questions and did not find any that exactly answer mine. As proposed by the Firebase team, the fan-out technique is the recommended way to ensure fast data reads, at the cost of data duplication. I know this question is subjective and depends on the application, but which solution is best in terms of cost saving ($) and data reads?
1. Post the same node under multiple children (saves data reads, since it is read only once, but the data is redundant, so it consumes more Firebase storage) (see image "Firebase Database - the Fan Out technique")
2. Post only one node, and have the others reference that node by its key (not redundant and consumes less Firebase storage, but requires reading twice - once to get the key and once to get the node for that key) (see image https://stackoverflow.com/a/38215398/1423345)
For context, I am building a non-profit marketplace app, so I need to apply the best solution in terms of balancing cost saving ($) and fast data reads.
On the other hand: reading twice (bandwidth) vs. bigger storage - which one is more cost-effective?
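To make the two options concrete, the data shapes being compared might look roughly like this - a hypothetical items/categories layout, for illustration only:

```typescript
// Two hypothetical layouts for the same marketplace data, for comparison only.

// Option 1 - fan-out: the item is duplicated under every place it is listed,
// so one read returns everything, but storage grows with each copy.
const fanOut = {
  items: { "item-1": { name: "Chair", price: 10 } },
  categoryItems: {
    furniture: { "item-1": { name: "Chair", price: 10 } },
  },
};

// Option 2 - references: categories store only the key, so storage is smaller,
// but showing a category needs a second read per item to fetch the details.
const references = {
  items: { "item-1": { name: "Chair", price: 10 } },
  categoryItems: {
    furniture: { "item-1": true },
  },
};

console.log(Object.keys(fanOut), Object.keys(references));
```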
I would start by saying that, ideally, in Firebase you read or sync only what's necessary. So your database queries are combined with other filters to make each query as specific as possible. If you can nail that, you will in any case end up with a very intelligent data structure which will be cost-effective.
Now to the real debate: the fan-out technique, or just posting a reference to the nodes. I personally prefer fan-out and use it successfully, so I will answer with reference to that technique only, which will also give you an indication of the reasons that keep me from storing just a reference.
The first and foremost thing is end-user experience and performance, which comes in the form of big-data-chunk synchronization. In general, it means that instead of downloading many small chunks you aim for the biggest reasonable one, so that you reduce cell-radio usage, battery drain and bandwidth, and also keep the app updated and in sync as quickly as possible.
If you aim for that kind of app performance, then fan-out is the clear winner over the other technique, for the following reasons.
You download one big data chunk stored in a single node, which keeps your cell radio from staying on for long.
As you download the whole set of information at once, your app performs better. Obviously, by "whole" I don't mean that you should download the full database; it's all about the smart balance of downloading just what is required in the first go.
It's not that this is the only technique which will give you faster reads and a better data structure. There are other techniques, like indexing, data validation and security rules, which are equally important. All of them, properly coupled with the correct data structure, will give you far better performance.
In a situation where you have just a reference to another node and not the actual data, you might end up with nothing to show your users. Let's say your users aren't getting good connectivity, so after one read which gave you just the reference, the network drops. Until the network is up again, your users see nothing, and trust me, that is a very bad situation for the app. Your aim as a developer should be to reduce the chances of those situations.
So I would recommend you go for the fan-out technique, as it is faster and cost-effective when you also consider other factors like data filtering, indexing and security rules. Yes, it comes with the slight price of higher storage usage, but what does lower storage mean when you don't have happy users? Still, it all comes down to personal preference; I have shared my experience and thoughts, and I hope it helps you make the right decision.
I would encourage you to go through this and gain a deeper understanding of NoSQL data modelling.
Do let me know if this info helped you.
I used the get-room-list function to get the rooms on the server, but once I joined a particular room, I am not able to see the list any more. How can I see the list even while I am connected to a room?
Short Answer: I'm afraid the answer is you can't - it's not supported out of the box.
Long Answer:
It's by design. The reason for this is the scalable architecture of Photon LoadBalancing, which is the server-side application and part of the SDK that is usually used in conjunction with PUN. LoadBalancing is also what drives the Photon Cloud, in case that is what you are referring to.
What happens is that your client initially connects to what is called the Master, where the matchmaking takes place. Your client can either create a new room or join a room using the list of rooms or any of the other matchmaking options, like JoinRandom, for instance. Once any of this happens, the Master sends your client the information required to connect to the GameServer hosting the game. In the case of a newly created game, it sends your client to the server with the least load.
Under load, the list of games and their properties can change very quickly:
- games are created
- players join and leave
- games can reach their max allowed # of players
These changes are propagated from all the GameServers back to the Master, but they are not propagated to the other GameServers, meaning the list is not available to your client once it is in a room.
Options
There is a longer thread about this topic here:
http://forum.exitgames.com/viewtopic.php?f=17&t=3093
The gist of the discussion is:
In case you are using the self-hosted version of Photon (not the Photon Cloud), you could extend LoadBalancing.
You could connect a second peer and leave it connected to the Master, with the disadvantage that it is counted as a second CCU, which might be a problem for your costs, since pricing is all based on CCU.
Disadvantages of Game Lists
In general, the use of game lists as a design component in games tends to be overrated - it is an easy way to have matchmaking during development, but it doesn't work well (for most games) when going live.
Players usually don't care and can be matched to games based on filters (there is no need for game lists for that, since Photon supports matchmaking with filters); given a list, players usually just choose one of the first games on it. Under load this often means that many players try to join the same game, which is usually a problem, since most games have some kind of maximum number of players.
BTW: if they do care, it's often "I want to play with a friend", which is also supported by the Photon matchmaking.
If you have mobile clients, the list of games can be a burden, especially with very active games: it takes time to load the lists and it uses bandwidth to load and maintain them. Due to the latency, it's also more likely that players select games that are already full.
You might be able to do it by creating a second instance of PhotonNetwork, connecting it to said server where the games are hosted, and having this instance on a game object that also talks to the object that has your lobby script. I'm not too sure if this will work or how hard it would be to implement; I'm currently working with Photon myself.