Firebase database structure for chat application - firebase

I am trying to structure a Firebase database for a chat system. What I want to achieve is that after a user successfully logs in, they see a list of the messages they have sent to different users. Each message preview shows the last message in that conversation. The user can then select a preview to view the full chat. It should work like Facebook Messenger.
My current design is structured like this:
chatMessage
  sender *(Assume this one is the user)*
    threads
      threadID1
        messageID1
          datetime, content, receiver, status
        messageID2
          datetime, content, receiver, status
      threadID2
        messageID1
          datetime, content, receiver, status
        messageID2
          datetime, content, receiver, status
  sender *(Assume this one is the admin)*
    threads
      threadID1
        messageID1
          datetime, content, receiver, status
        messageID2
          datetime, content, receiver, status
With the design above, when (say) userID1 logs in, I can retrieve all the messages he has sent. However, I cannot tell whether there has been a reply to a message, and therefore I am not able to retrieve the last message of a conversation.
How can I actually restructure it so that I can achieve what I mentioned above? Any suggestions?
Thanks!

It sounds like you want to:
Have chat "rooms" between users
Show a list of each user's chat rooms, with the latest message for that room
If those are your requirements, I'd model precisely those in your database.
So for each chat room (a chat between a certain set of users), model the messages for that room:
chats: {
  $roomId: {
    $messageId: {
      senderId: "...",
      message: "..."
    }
  }
}
Now for each user, model a separate list of their chats and the latest message:
userRooms: {
  $uid: {
    $roomId: {
      message: "..."
    }
  }
}
Now whenever a user posts a message to a room, you will need to push that message to /chats/$roomId and, for each user in that chat room, write the message to /userRooms/$uid/$roomId (overwriting the existing message there).
This type of data duplication is known as fanning out data, because you're spreading a single snippet of information over multiple places in the database. It is quite common in NoSQL databases, and is part of the reason they scale so well: they trade write complexity for read performance.
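For illustration, here is a minimal server-side sketch of that fan-out write using the Realtime Database REST API; the database URL, room id, member uids and the client-generated message key are all placeholders, authentication is omitted, and in the client SDKs a single multi-location update() call does the same thing:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class ChatFanOut
{
    private static readonly HttpClient Http = new HttpClient();

    // databaseUrl is assumed, e.g. "https://your-project.firebaseio.com"
    public static async Task PostMessageAsync(string databaseUrl, string roomId,
        string senderId, string text, IEnumerable<string> memberUids)
    {
        var messageId = Guid.NewGuid().ToString("N"); // stand-in for a push id
        var message = new { senderId, message = text };

        // One key per location; Firebase applies all paths in a single update.
        var update = new Dictionary<string, object>
        {
            [$"chats/{roomId}/{messageId}"] = message
        };
        foreach (var uid in memberUids)
        {
            // Overwrite the "latest message" entry for every participant in the room.
            update[$"userRooms/{uid}/{roomId}"] = message;
        }

        var request = new HttpRequestMessage(new HttpMethod("PATCH"), $"{databaseUrl}/.json")
        {
            Content = new StringContent(JsonSerializer.Serialize(update), Encoding.UTF8, "application/json")
        };
        (await Http.SendAsync(request)).EnsureSuccessStatusCode();
    }
}

Because every path travels in one PATCH, either the /chats entry and all the /userRooms entries are written together or none of them are, which keeps the previews consistent with the actual chat.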

Related

SignalR multi-room session

My application works as follows:
The user sees the rooms on the login screen. When he chooses one of the rooms and types his username, he logs into that room.
Rooms are reached via URLs like these:
www.domain.com/room/1
www.domain.com/room/2
Then he texts with other users in the room.
My question is as follows:
When a user who is connected to a room opens the room link in a different tab, a new "Context.ConnectionId" value is assigned, so the room is joined again from scratch. I want the connection to carry on in the same room even when it comes from a different tab.
In addition, a user should be able to be in both room 1 and room 2 at the same time, if they wish.
How can I do that?
You want to look into using "Groups" in SignalR. Depending on your design, users can open a new tab, log in and join a different room, and they will receive messages from both rooms, each through its own connection.
public async Task JoinRoom(string roomName)
{
    await Groups.Add(Context.ConnectionId, roomName);
    Clients.Group(roomName).addChatMessage(Context.User.Identity.Name + " joined.");
}
Reference Documentation - https://learn.microsoft.com/en-us/aspnet/signalr/overview/guide-to-the-api/working-with-groups
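Group membership is tracked per connection, so each new tab (each new Context.ConnectionId) has to be added to the user's rooms again. A rough sketch of one way to do that with the same classic ASP.NET SignalR API as above is shown below; the static dictionary and the use of Context.User.Identity.Name are assumptions you would adapt to however your users identify themselves:

using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class ChatHub : Hub
{
    // userName -> set of room names (swap this in-memory map for a database in production)
    private static readonly ConcurrentDictionary<string, HashSet<string>> Rooms =
        new ConcurrentDictionary<string, HashSet<string>>();

    public async Task JoinRoom(string roomName)
    {
        var user = Context.User.Identity.Name;
        Rooms.AddOrUpdate(user,
            _ => new HashSet<string> { roomName },
            (_, set) => { lock (set) set.Add(roomName); return set; });

        await Groups.Add(Context.ConnectionId, roomName);
        Clients.Group(roomName).addChatMessage(user + " joined.");
    }

    public override async Task OnConnected()
    {
        // A new tab means a new ConnectionId, so re-add this connection to every
        // room the user already belongs to; every tab then receives that room's messages.
        if (Rooms.TryGetValue(Context.User.Identity.Name, out var rooms))
        {
            string[] snapshot;
            lock (rooms) snapshot = rooms.ToArray();
            foreach (var room in snapshot)
                await Groups.Add(Context.ConnectionId, room);
        }
        await base.OnConnected();
    }
}

A user who calls JoinRoom for both room 1 and room 2 is simply a member of two groups at once, so being in several rooms at the same time needs no extra work.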

How to get specified message from Azure Service Bus Topic and then delete it from Topic?

I’m writing functionality for receiving messages from an Azure Service Bus topic and deleting a specified message from that topic. Before deleting the message, I need to send it to another topic.
static async Task ProcessMessagesAsync(Message message, CancellationToken token)
{
    // Process the message.
    Console.WriteLine($"Received message: WorkOrderNumber:{message.MessageId} SequenceNumber:{message.SystemProperties.SequenceNumber} Body:{Encoding.UTF8.GetString(message.Body)}");
    Console.WriteLine("Enter the WorkOrder Number you want to delete:");
    string workOrderNumber = Console.ReadLine();
    if (message.MessageId == workOrderNumber)
    {
        // TODO: Post the message to the other (priority) topic, then delete it from the current topic.
        var status = await SendMessageToBus(message);
        if (status == true)
        {
            await normalSubscriptionClient.CompleteAsync(message.SystemProperties.LockToken);
            Console.WriteLine($"Successfully deleted your message from Topic:{NormalTopicName}-WorkOrderNumber:" + message.MessageId);
        }
        else
        {
            Console.WriteLine($"Failed to send message to PriorityTopic:{PriorityTopicName}-WorkOrderNumber:" + message.MessageId);
        }
    }
    else
    {
        Console.WriteLine($"Failed to delete your message from Topic:{NormalTopicName}-WorkOrderNumber:" + workOrderNumber);
        // Complete the message so that it is not received again.
        // This can only be done if the subscriptionClient was created in ReceiveMode.PeekLock mode (the default).
        await normalSubscriptionClient.CompleteAsync(message.SystemProperties.LockToken);
        // Note: use the cancellationToken passed in to determine whether the subscriptionClient has already been closed.
        // If it has, you can choose not to call CompleteAsync() or AbandonAsync() to avoid unnecessary exceptions.
    }
}
My issue with this approach is:
It’s not scalable: what if the message I want is the 50th in the collection? We’d have to iterate 49 times before we could mark it as deleted.
It’s a long-running process.
To avoid these problems, I want to get the specified message from the queue based on its index or sequence number, and then delete it from the topic.
So, can anyone suggest how to resolve this problem?
So if I understand your questions and comments correctly you are trying to do something like this:
Incoming messages come into either a standard topic or a priority topic.
Some process checks messages in the standard topic and "moves" them to the priority topic based on some criteria, by deleting them from the standard topic and adding them to the priority topic.
Messages are processed as normal.
As Sean noted, step 2 simply won't work. Service Bus is a first-in-first-out-ish system where a consumer simply picks up the next available message. You can sort through a queue by pulling out all the messages and abandoning/completing them based on specific criteria, but scaling is a problem. In addition, you can think of each topic subscription as its own separate queue: removing a message from one subscription does not remove it from any of the other subscriptions.
Instead of trying to pull everything out of the topics and then put back the messages you want to keep, I would suggest adding a sorting queue in front of the two topics. If you don't need to sort the high-priority messages, you could put this sorting process in front of the standard-priority topic only.
This is how the process would work:
Incoming messages are added to a sorting queue. Note that this is a single queue, not a topic; at this point in the process we want to ensure there is only one copy of each message.
A sorting process moves messages from the sorting queue into either the standard or priority queue as is appropriate. Using something like Azure Functions you can scale this process fairly easily.
Messages are processed from the topics as normal.
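As a rough sketch of the sorting step, using the same Microsoft.Azure.ServiceBus package the question uses (the queue/topic names, connection string and the IsHighPriority check are placeholders, and the same loop could run inside an Azure Function triggered by the sorting queue):

using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.ServiceBus.Core;

public class MessageSorter
{
    private readonly MessageReceiver sortingReceiver;
    private readonly MessageSender standardSender;
    private readonly MessageSender prioritySender;

    public MessageSorter(string connectionString)
    {
        sortingReceiver = new MessageReceiver(connectionString, "sorting-queue");
        standardSender = new MessageSender(connectionString, "standard-topic");
        prioritySender = new MessageSender(connectionString, "priority-topic");
    }

    public async Task SortOneAsync()
    {
        Message message = await sortingReceiver.ReceiveAsync();
        if (message == null) return; // sorting queue is currently empty

        // Copy the payload into a fresh message for the destination topic.
        var forwarded = new Message(message.Body) { MessageId = message.MessageId };

        var destination = IsHighPriority(message) ? prioritySender : standardSender;
        await destination.SendAsync(forwarded);

        // Only remove the message from the sorting queue once the forward succeeded,
        // so a failure leaves it in place to be retried.
        await sortingReceiver.CompleteAsync(message.SystemProperties.LockToken);
    }

    // Placeholder criterion: route on a user property set by the sender.
    private static bool IsHighPriority(Message message) =>
        message.UserProperties.TryGetValue("Priority", out var value) && (string)value == "High";
}

Because subscribers only ever see the standard or priority topic, nothing downstream has to sift through messages it does not care about.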

Granular domain events

Initially we were using domain events to handle communication with external systems. For instance, every time a user updated his phone number or his name we raised a PhoneNumberUpdated or a NameUpdated event respectively. These are then caught by handlers, processed, and sent to other systems.
public void SetName(Name name)
{
    if (Name == name) return;
    (...)
    RaiseEvent(new NameUpdated(Id, name));
}

public void SetPhoneNumber(PhoneNumber number, PhoneNumberType type)
{
    RaiseEvent(new PhoneNumberUpdated());
}
It works great as long as we do not need to "aggregate" events. For example, we got a new requirement asking us to send one single email whenever a user updates his name and/or his phone number. With the current structure, our handlers would be notified multiple times (once for each event raised) and this would result in multiple emails being sent.
Making our events more generic doesn't seem to be a good solution. But then how would we aggregate several events raised within one transaction?
Thx
Seb
I believe your new requirement is a separate concern from your actual domain. Your domain generates events that describe what has happened. User notification, on the other hand, is a projection of that stream of events into email form. Just like you would keep your read model requirements separate from your domain, you should keep this separate as well.
A simple solution would be to capture the events you care about into a table and then, once a day on a schedule, send one email per aggregate.
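A minimal sketch of that shape, with hypothetical IPendingNotificationStore and INotificationMailer abstractions standing in for whatever persistence and email infrastructure you already have: the handlers only record that something notification-worthy happened, and a scheduled job later sends exactly one email per aggregate.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public class PendingNotification
{
    public Guid UserId { get; set; }
    public string EventType { get; set; }   // e.g. "NameUpdated" or "PhoneNumberUpdated"
    public DateTime OccurredOn { get; set; }
}

public interface IPendingNotificationStore
{
    Task AddAsync(PendingNotification notification);
    Task<IReadOnlyList<PendingNotification>> GetUnsentAsync();
    Task MarkSentAsync(Guid userId);
}

public interface INotificationMailer
{
    Task SendContactDetailsChangedAsync(Guid userId, IEnumerable<string> changes);
}

// Event handlers just record that something notification-worthy happened.
public class ContactDetailsProjection
{
    private readonly IPendingNotificationStore store;
    public ContactDetailsProjection(IPendingNotificationStore store) => this.store = store;

    public Task When(Guid userId, string eventType) =>
        store.AddAsync(new PendingNotification
        {
            UserId = userId,
            EventType = eventType,
            OccurredOn = DateTime.UtcNow
        });
}

// Runs once a day and sends a single email per user, however many events were raised.
public class DailyNotificationJob
{
    private readonly IPendingNotificationStore store;
    private readonly INotificationMailer mailer;

    public DailyNotificationJob(IPendingNotificationStore store, INotificationMailer mailer)
    {
        this.store = store;
        this.mailer = mailer;
    }

    public async Task RunAsync()
    {
        var pending = await store.GetUnsentAsync();
        foreach (var group in pending.GroupBy(n => n.UserId))
        {
            await mailer.SendContactDetailsChangedAsync(group.Key, group.Select(n => n.EventType).Distinct());
            await store.MarkSentAsync(group.Key);
        }
    }
}

The domain keeps raising its granular NameUpdated/PhoneNumberUpdated events; the aggregation happens entirely in this projection, so no event needs to become more generic.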

How to structure data in Firebase to avoid N+1 selects?

Since Firebase security rules cannot be used to filter children, what's the best way to structure data for efficient queries in a basic multi-user application? I've read through several guides, but they seem to break down when scaled past the examples given.
Say you have a basic messaging application like WhatsApp. Users can open chats with other groups of users to send private messages between themselves. Here's my initial idea of how this could be organized in Firebase (a bit similar to this example from the docs):
{
  users: {
    $uid: {
      name: string,
      chats: {
        $chat_uid: true,
        $chat2_uid: true
      }
    }
  },
  chats: {
    $uid: {
      messages: {
        message1: 'first message',
        message2: 'another message'
      }
    }
  }
}
Firebase permissions could be set up to only let users read chats that are marked true in their user object (and restrict adding arbitrarily to the chats object, etc).
However this layout requires N+1 selects for several common scenarios. For example: to build the home screen, the app has to first retrieve the user's chats object, then make a get request for each thread to get its info. Same thing if a user wants to search their conversations for a specific string: the app has to run a separate request for every chat they have access to in order to see if it matches.
I'm tempted to set up a node.js server to run root-authenticated queries against the chats tree and skip the client-side firebase code altogether. But that's defeating the purpose of Firebase in the first place.
Is there a way to organize data like this using Firebase permissions and avoid the N+1 select problem?
It appears that n+1 queries do not necessarily need to be avoided and that Firebase is engineered specifically to offer good performance when doing n+1 selects, despite being counter-intuitive for developers coming from a relational database background.
An example of n+1 in the Firebase 2.4.2 documentation is followed by a reassuring message:
// List the names of all Mary's groups
var ref = new Firebase("https://docs-examples.firebaseio.com/web/org");
// fetch a list of Mary's groups
ref.child("users/mchen/groups").on('child_added', function(snapshot) {
  // for each group, fetch the name and print it
  var groupKey = snapshot.key();
  ref.child("groups/" + groupKey + "/name").once('value', function(snapshot) {
    console.log("Mary is a member of this group: " + snapshot.val());
  });
});
Is it really okay to look up each record individually? Yes. The Firebase protocol uses web sockets, and the client libraries do a great deal of internal optimization of incoming and outgoing requests. Until we get into tens of thousands of records, this approach is perfectly reasonable. In fact, the time required to download the data (i.e. the byte count) eclipses any other concerns regarding connection overhead.

Entity Framework - Should I edit an object in a function, or after a function completes

I am coding an MVC 5 internet application where I retrieve many Account objects that need emails sent to them, and then I send the emails. After the emails have been sent, I need to update a DateTime field in each Account object to record that the email has been sent.
Here is my code:
public async Task SendDailyExpirationEmails(int dayInterval)
{
    IEnumerable<Account> freeTrialAccounts = GetFreeTrialAccountsForSendDailyExpirationEmails(dayInterval).ToList();
    IEnumerable<Account> paidServiceAccounts = GetPaidServiceAccountsForSendDailyExpirationEmails(dayInterval).ToList();
    await SendFreeTrialSubscriptionExpirationEmails(freeTrialAccounts);
    await SendPaidSubscriptionExpirationEmails(paidServiceAccounts);
}
The SendEmail functions, for both the freeTrialAccounts and paidServiceAccounts, use a foreach loop to iterate over each Account in the IEnumerable.
My question is this:
Should I update the DateTime field after both the SendEmail functions have been completed or within the SendEmail functions?
Is there a common coding practice for this situation?
Thanks in advance.
To keep the DateTime value as accurate as possible while reducing database calls, you will want to make a record of it as soon as each email has been sent, but wait to persist that information until your email process has completed.
I would say have a class property keep track of when each email was sent and then once all emails have been sent, make your call(s) to the database to update the sent date/time(s).
That said, if you have some other job/task/application that relies on that information as soon as possible, then you will need to persist the data as soon as each email is sent. Otherwise I don't see a problem with delaying it.
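As a rough sketch of how that could look against the code in the question, assuming the accounts are tracked by an EF context (here called _dbContext) and that Account gains a nullable DateTime property (here called ExpirationEmailSentUtc); both names and the SendExpirationEmailAsync call are placeholders:

public async Task SendDailyExpirationEmails(int dayInterval)
{
    IEnumerable<Account> freeTrialAccounts = GetFreeTrialAccountsForSendDailyExpirationEmails(dayInterval).ToList();
    IEnumerable<Account> paidServiceAccounts = GetPaidServiceAccountsForSendDailyExpirationEmails(dayInterval).ToList();

    await SendFreeTrialSubscriptionExpirationEmails(freeTrialAccounts);
    await SendPaidSubscriptionExpirationEmails(paidServiceAccounts);

    // One database round trip for the whole batch: the tracked entities
    // already carry the timestamps recorded below.
    await _dbContext.SaveChangesAsync();
}

private async Task SendFreeTrialSubscriptionExpirationEmails(IEnumerable<Account> accounts)
{
    foreach (Account account in accounts)
    {
        await SendExpirationEmailAsync(account);           // your existing email call
        account.ExpirationEmailSentUtc = DateTime.UtcNow;  // recorded in memory only, persisted above
    }
}

Each account's timestamp is captured the moment its email goes out, so the value stays accurate, but the database is only touched once at the end of the run.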
