SignalR architecture - central or distributed hubs - ASP.NET

Suppose I have 3 applications -
WebApp 1 - a NancyFX app that serves HTML pages. There's also a SignalR hub for messaging between the users of that app (and it sometimes sends messages to WebApp 2).
WebApp 2 - a NancyFX app that serves HTML pages. There's a SignalR hub that receives messages from WebApp 1 and updates the users of WebApp 2.
WebApp 3 - a self-hosted Web API that doesn't have a SignalR hub, but sends messages to WebApp 2 in order to update its connected clients.
So my question: is keeping the two hubs in WebApp 1 and WebApp 2 the way to go, or should I have a (scalable) dedicated SignalR server that hosts the hubs of WebApp 1 and WebApp 2 to facilitate the communication?
Thanks..

Tough to say what's best for you, since we have no details about your load requirements or how authentication/authorization works in your application. However, I'll say this:
Your scenario could be viewed as similar to a more typical SignalR scale-out situation, where you have a single application deployed to a web farm behind a load-balancer. In this scenario, you use SignalR's scaleout ("backplane") feature for server-to-server communication so that outgoing messages reach clients no matter which server they happen to be connected to. Your situation is really no different, except you have three different applications in play. As long as all three of your applications are hosting the same hub class (via a shared hub assembly) and are connected to the same scaleout backplane, it ought to work fine.
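
For illustration, here is a minimal sketch of that setup with classic SignalR 2.x and the Redis backplane. It assumes the hub class lives in a shared assembly referenced by all three applications; the Redis host, port, and event key below are placeholders.

    using Microsoft.AspNet.SignalR;
    using Owin;

    // Shared hub class, placed in an assembly referenced by all three applications.
    public class MessagingHub : Hub
    {
        public void SendToAll(string message)
        {
            // Dynamic client method; connected browsers handle "receiveMessage".
            Clients.All.receiveMessage(message);
        }
    }

    // OWIN startup, used in each application.
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Every app points at the same Redis backplane, so a message sent from
            // any of them reaches clients regardless of which app they connected to.
            GlobalHost.DependencyResolver.UseRedis("redis-host", 6379, "", "SignalRBackplane");
            app.MapSignalR();
        }
    }

An application that mostly sends (like WebApp 3) can push through the same backplane with GlobalHost.ConnectionManager.GetHubContext<MessagingHub>().Clients.All.receiveMessage(msg), provided it is configured with the same backplane and shared hub assembly.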

Related

SignalR Core scaling/hosting

I have some questions regarding SignalR Core on the server side;
My server is written in ASP.NET Core, and it uses SignalR for sending notifications to users. The server uses Controllers with endpoints that clients interact with.
1) Can I host the entire thing in Azure App Service and add the SignalR service to it? Or would it be better to split the SignalR code out to its own server, which is called from the "main" server when needed?
2) The SignalR Service has a "Serverless" option, which according to the documentation doesn't support clients calling any server RPCs while in that mode. Could I run this thing in Serverless mode, since I'm only using the sockets for sending notifications to the clients? Or is it reserved for Azure Functions?
3) Is there a way to get the number of connections for a user in a SignalR hub? I would like to send a push message to the user if he doesn't have any connections to the server. If not - what is the recommended way of handling this? I was thinking of adding a singleton service that keeps count, but am unsure if this would work at scale, especially with the SignalR service.
Thanks.
1) Better to use the Azure SignalR Service.
2) Use it with the hub (the default mode); Serverless mode is aimed at scenarios such as Azure Functions, where no hub server is running.
3) If you use the Azure SignalR Service, you can just see the connection count in the portal. In code, whether you use Azure SignalR or not, you can record the user ID on each connection and count the connections yourself. If you have multiple hubs and servers, you need to do more (if using a Redis backplane, for example).
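
For point 3, a rough sketch of in-hub connection counting with ASP.NET Core SignalR; the hub name is illustrative, and note the caveat in the comments about multiple server instances.

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.SignalR;

    public class NotificationHub : Hub
    {
        // userId -> number of active connections. This dictionary is local to one
        // server instance; with multiple servers (or the Azure SignalR Service
        // distributing connections) the counts would need to live in shared
        // storage such as Redis instead.
        private static readonly ConcurrentDictionary<string, int> Connections =
            new ConcurrentDictionary<string, int>();

        public override Task OnConnectedAsync()
        {
            var userId = Context.UserIdentifier ?? Context.ConnectionId;
            Connections.AddOrUpdate(userId, 1, (_, count) => count + 1);
            return base.OnConnectedAsync();
        }

        public override Task OnDisconnectedAsync(Exception exception)
        {
            var userId = Context.UserIdentifier ?? Context.ConnectionId;
            Connections.AddOrUpdate(userId, 0, (_, count) => count > 0 ? count - 1 : 0);
            return base.OnDisconnectedAsync(exception);
        }

        // The push-notification path can consult this before falling back to push.
        public static bool HasConnections(string userId) =>
            Connections.TryGetValue(userId, out var count) && count > 0;
    }

If you use the Azure SignalR Service, the server side typically just adds services.AddSignalR().AddAzureSignalR(), and the service owns the physical client connections, which is another reason a per-server counter alone is not enough at scale.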

SignalR and Web API communication

In one server, I have 2 web applications. One of them is a Web API, and the other one is SignalR. Both apps are hosted in IIS, under 2 different application pools.
What is the best way to communicate between those 2 web applications? Is using either SignalR or REST calls viable, for example?
You can use several approaches:
1) A message queue system would work. Since your server runs IIS (Windows), you can use MSMQ.
2) As an alternative to MSMQ, you can use RabbitMQ (see the sketch below).
3) As you mentioned, you can use HTTP calls.
4) You already have SignalR, so you could use it for the communication: write a hub that both servers connect to.
Which option fits depends on your requirements. Backend servers mostly communicate through a message queue system; HTTP calls are also acceptable.
The biggest difference between HTTP and a message queue is asynchrony. When an HTTP call tries to reach an endpoint, it waits for a response, and if the server is down you have to retry until it is back up. A message queue system, on the other hand, just uses a queue: fire and forget the data, and the other side of the connection can pick it up whenever it is ready.
SignalR is too risky for this job.
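
To make option 2 concrete, here is a rough fire-and-forget sketch using the RabbitMQ.Client package. The host name, queue name, and hub name are illustrative, and the consuming side assumes classic SignalR 2.x hosted in the second application.

    using System.Text;
    using Microsoft.AspNet.SignalR;
    using RabbitMQ.Client;
    using RabbitMQ.Client.Events;

    // In the Web API application: fire-and-forget a message onto the queue.
    public static class NotificationPublisher
    {
        public static void Publish(string message)
        {
            var factory = new ConnectionFactory { HostName = "localhost" };
            using (var connection = factory.CreateConnection())
            using (var channel = connection.CreateModel())
            {
                channel.QueueDeclare("notifications", true, false, false, null);
                channel.BasicPublish("", "notifications", null, Encoding.UTF8.GetBytes(message));
            }
        }
    }

    // In the SignalR application: consume the queue and broadcast to connected clients.
    public class NotificationListener
    {
        public void Start()
        {
            // Connection and channel are intentionally kept open for the app's lifetime.
            var factory = new ConnectionFactory { HostName = "localhost" };
            var connection = factory.CreateConnection();
            var channel = connection.CreateModel();
            channel.QueueDeclare("notifications", true, false, false, null);

            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, args) =>
            {
                var message = Encoding.UTF8.GetString(args.Body.ToArray());
                // "NotificationHub" stands in for whatever hub the SignalR app already exposes.
                GlobalHost.ConnectionManager.GetHubContext("NotificationHub")
                          .Clients.All.notify(message);
            };
            channel.BasicConsume("notifications", true, consumer);
        }
    }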

When we should use SignalR self hosted and when we should not?

I am at the stage of using SignalR in my project and I don't understand when to use the self-hosted option and when not to. As an example, if I want to host my web application in a server farm:
There will be separate hosting servers
Separate SignalR hubs in each IIS server
If we want to broadcast a message to each client, how does this work in SignalR?
The issue with SignalR running in multiple instances is that clients connected to instance A cannot get messages from clients connected to instance B.
From the SignalR scaleout documentation:
However, when you scale out, clients can get routed to different servers. A client that is connected to one server will not receive messages sent from another server.
The solution to this is using a backplane: every time a server receives a message, it forwards it to all other servers. You can do this using Azure Service Bus, Redis, or SQL Server.
The way I see it, you use the self-host option either when you don't want the full IIS pipeline running (because you have some lightweight operations that don't require all of IIS's heaviness) or when you don't want a web server at all (for example, to add real-time functionality to an already existing, say, Windows Forms application, or to any other process).
Be sure to read the documentation for self-hosting SignalR and decide whether you actually need to self host SignalR.
If you are developing a web application under IIS, I don't see any reason why you would want to self-host SignalR.
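
For reference, a minimal self-host sketch (SignalR 2.x with the Microsoft.Owin.Hosting and Microsoft.Owin.Cors packages); the URL, port, and hub are placeholders.

    using System;
    using Microsoft.AspNet.SignalR;
    using Microsoft.Owin.Cors;
    using Microsoft.Owin.Hosting;
    using Owin;

    public class ChatHub : Hub
    {
        public void Send(string message)
        {
            Clients.All.addMessage(message);
        }
    }

    class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            app.UseCors(CorsOptions.AllowAll);   // browsers will connect from another origin
            app.MapSignalR();
        }
    }

    class Program
    {
        static void Main()
        {
            // Runs outside IIS, e.g. inside a console app or Windows service.
            using (WebApp.Start<Startup>("http://localhost:8080"))
            {
                Console.WriteLine("SignalR self-host running on http://localhost:8080");
                Console.ReadLine();
            }
        }
    }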
Hope this helps. Best of luck!

Using ASP.NET Web application as SignalR client

My team is in the middle of deciding the architecture of our backend system:
Webserver A is an ASP.NET MVC application with ASP.NET Web API component, hosted in Azure Website.
Windows Service B is a self-hosted OWIN server that will periodically push notifications to clients who subscribe to the notifications, hosted in Azure VM.
Windows Service C is a client that subscribes to notification from B, hosted in Azure VM.
Since we are more-or-less entrenched in .NET stack, we implemented B as SignalR server with C being the SignalR client. This part seems to work well.
Now comes a point where we also want A to subscribe to B, but I realize that this means an ASP.NET web server is going to act as a SignalR CLIENT, instead of the typical scenario where it acts as the SignalR server.
I presume we can initialize the SignalR connection in Global.asax and make the process ever-running to avoid AppDomain recycles. However, I feel a bit iffy when a web server is made to do something other than serving web requests. This solution also makes the web server not stateless, since it needs to keep the WebSocket connection alive.
Is there something fundamentally wrong with making an ASP.NET application a SignalR client? Is there any possible gotcha with this setup?
In Azure you cannot guarantee that your AppDomain will not recycle. For many reasons it can restart itself to heal, and then you will end up making a new connection to the SignalR server. Is that OK for you?
Also, SignalR is mostly used to improve web functionality, where it makes polling and refreshing on web clients simple. But since your requirement seems to be entirely back-end stuff, I would suggest you go with some other event-driven pattern. Check the Azure Service Bus topic/subscription model to have different components listen to various events and act accordingly.
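
If you do stay with the web-app-as-SignalR-client approach, the reconnection concern above might be handled roughly like this with the Microsoft.AspNet.SignalR.Client package, started from Application_Start in Global.asax. The URL, hub name, and event name are illustrative.

    using System;
    using System.Threading.Tasks;
    using Microsoft.AspNet.SignalR.Client;

    public static class NotificationClient
    {
        private static HubConnection _connection;

        public static void Start()   // call from Application_Start in Global.asax
        {
            _connection = new HubConnection("http://service-b.example.com/signalr");
            var proxy = _connection.CreateHubProxy("NotificationHub");

            proxy.On<string>("notify", payload =>
            {
                // Handle the pushed notification inside the web app.
            });

            // When the connection closes (including after an AppDomain recycle
            // tears it down and the app restarts it), wait a bit and reconnect.
            _connection.Closed += async () =>
            {
                await Task.Delay(TimeSpan.FromSeconds(5));
                await _connection.Start();
            };

            _connection.Start().Wait();
        }
    }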

Where to host SignalR when long-running service via WCF is backend

I'm sure that was a confusing enough title.
I have a long-running Windows service dealing with things happening in the world. This service is my canonical source of truth for the rest of my system. Now I want to slap a web interface onto this so the clients can see what is actually going on. At first this would simply be an MVC5 application with some Web API stuff. Then I plan to use SignalR 2.0 and Ember.js to make this application more interactive and "realtime".
The client communicates with the Windows service over named pipes using WCF. A client (such as a web app) could request an instance of, for example, IEventService, would be given a WCF proxy client, and could read about events through this interface. Simple enough.
However, a web application basically just exists in the sense that it responds to requests from the user. The way I understand it, this is not the optimal environment for a long lived WCF client proxy to raise events in, and thus I wonder how to host my SignalR stuff. Keep in mind that a user would log in to the MVC5 site, but through the magic of SignalR, they will keep interacting with the service without necessarily making further requests to the website.
The way I see it, there are two options:
1) Host SignalR stuff as part of the web app. Find a way to keep it "long-running" while it has active clients, so that it can react to events on the WCF client proxy by passing information out to the connected web users.
2) Host SignalR stuff as part of my Windows service. This is already long-running, but I know nada about OWIN and what this would mean for my project. Also the SignalR client will have to connect to a different port than where the web app was served from, I assume.
Any advice on which is the right direction to go in? Keep in mind that in extreme cases, a web user would log in when they get to work in the morning, and only have SignalR traffic going back and forth (i.e. no web requests) for a full work day before logging out. I need them to keep up with realtime events all that time.
Any takers? :)
The benefit of self-hosting as part of your Windows service is that you can integrate the calls to clients directly with your existing code and events. If you host the SignalR server separately, you'd have another layer of communication between your service and the SignalR server.
If you've already decided on using WCF named pipes for that, then it probably won't make a difference whether you self-host or host in IIS (as long as it's on the same machine). The SignalR server itself is always "long-running" in the sense that as long as a client is connected, it will receive updates. It doesn't require manual requests from the user.
In any case, you'll probably need a web server to serve the HTML, scripts and images.
Having clients connected for a day shouldn't be a problem either way, as far as I can see.
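
A rough sketch of option 2: self-hosting SignalR inside the existing Windows service and wiring the service's own events straight to connected clients. WorldEventSource stands in for whatever event source the service already has, and the port and names are placeholders.

    using System;
    using Microsoft.AspNet.SignalR;
    using Microsoft.Owin.Hosting;
    using Owin;

    public class EventsHub : Hub { }   // clients only listen, so no hub methods needed

    public class ServiceHost
    {
        private IDisposable _webApp;

        public void OnStart()
        {
            // Hosted inside the existing Windows service, on its own port.
            _webApp = WebApp.Start("http://+:8081", app => app.MapSignalR());

            // Wire the service's existing events directly to SignalR clients.
            // WorldEventSource is a hypothetical stand-in for the service's own event source.
            WorldEventSource.SomethingHappened += description =>
            {
                var hub = GlobalHost.ConnectionManager.GetHubContext<EventsHub>();
                hub.Clients.All.eventOccurred(description);
            };
        }

        public void OnStop() => _webApp?.Dispose();
    }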
