Pub-Sub model for communication between Web API applications - asp.net

We have an application with a set of APIs developed in .NET Core and a UI developed in Angular that consumes these APIs (both are independent projects). Both are hosted on Azure App Service.
The Web API integrates with 3rd-party APIs. Some of those transactional APIs are very time consuming, so we want a fire-and-forget way of calling them and to be notified when they complete.
Once our API receives the response from the 3rd party, we want the UI to be notified.
So we want some messaging or Pub-Sub mechanism to achieve this.
I am considering SignalR and Kafka.
From the documentation I have read about SignalR, it seems it can be used between an API and a client, so I can use it for the second scenario.
Can SignalR be used between 2 APIs?
Coming to Kafka, it seems to be a good fit for high-end data streaming, which I consider over-engineering for our requirement.
Our application will run both on Azure and on-prem, so we also looked at Azure Service Bus, but the limit on message size and the cost seem to be problems for us.
So I want to know if there are any ways to ease this communication between two Web API applications?
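For the second scenario (the API notifying the UI), a minimal sketch of what this could look like with ASP.NET Core SignalR; the hub, route, client event name and the 3rd-party URL are all hypothetical, and a real implementation would use a proper background job queue instead of Task.Run:

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;

// Hypothetical hub the Angular UI connects to (e.g. mapped at /hubs/notifications).
public class NotificationsHub : Hub { }

[ApiController]
[Route("api/transactions")]
public class TransactionsController : ControllerBase
{
    private readonly IHubContext<NotificationsHub> _hub;
    private readonly IHttpClientFactory _httpFactory;

    public TransactionsController(IHubContext<NotificationsHub> hub, IHttpClientFactory httpFactory)
    {
        _hub = hub;
        _httpFactory = httpFactory;
    }

    [HttpPost]
    public IActionResult Start()
    {
        // Fire and forget: kick off the slow 3rd-party call and return immediately.
        _ = Task.Run(async () =>
        {
            var client = _httpFactory.CreateClient();
            var result = await client.GetStringAsync("https://third-party.example/slow-operation");

            // When the 3rd-party call completes, push the result to the connected UI clients.
            await _hub.Clients.All.SendAsync("transactionCompleted", result);
        });

        return Accepted();
    }
}

The Angular client would subscribe to the same hub (e.g. with @microsoft/signalr) and handle the "transactionCompleted" event; whether SignalR is also a good fit for API-to-API notification is the open question above.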

Related

Can the SuperTokens Core API layer be scaled?

We are doing a POC with SuperTokens for authentication. We require Passwordless, Email/Password, and Social Login functionality. With that functionality required, is the SuperTokens Core API layer scalable? If so, what is the recommended approach for scaling?
SuperTokens has two sets of APIs:
Core driver interface (CDI): The API exposed via the SuperTokens Core microservice.
Frontend driver interface (FDI): The API exposed via our backend SDK.
Your app's frontend only calls the FDI APIs and in turn, our backend SDK calls the CDI APIs. So your backend is a proxy between the frontend and the SuperTokens core service.
The scalability of the FDI APIs (which are called by your frontend) is dependent on the scalability of your API layer - which is completely controlled by you.
The scalability of the CDI APIs (exposed via the SuperTokens core service) depends on:
The number of instances of the core that have been deployed.
How "far" away is the core from your backend.
The scalability of the underlying database that the core connects to.
Each SuperTokens core instance is stateless and can be scaled up / down easily. However, all of them need to connect to the same instance of a db and therefore the limiting factor here becomes the scalability of the db itself.
Since only your backend API layer queries the SuperTokens core, it is recommended to host the core in the same region as your backend.
That being said, one instance of the core can handle several hundred requests per second comfortably. You can further improve its performance by setting:
The max number of parallel requests to serve.
The max number of parallel db connections.
Finally, if we consider the different types of auth operations, session verification is by far the most common operation (as compared to signing in / out or changing a password...). By default, SuperTokens verifies a user's session in a stateless manner. This means that your backend API layer doesn't need to query the core at all for session verification.
This in turn implies that you can easily scale SuperTokens to handle millions of users with hundreds of thousands of concurrent sessions with a fairly low number of core instances.

How costly is server-side Blazor?

Server-side Blazor apparently maintains a connection to the server using SignalR.
SignalR is the service you need to pay for. The number of simultaneous SignalR connections in use equals the number of online users your app has at any moment in time.
Do I understand correctly that I will need to pay for the next SignalR tier once I reach the free tier limit, and only because I use Blazor, not because I use SignalR for other purposes?
And two cost-reduction alternatives are:
use client-side Blazor WebAssembly
don't use Blazor at all
We run 3 instances of a P3V3 app service to serve ~450 users (it is a LOB app, so it is kept open by most users, most of the day).
Our SignalR service cost is double the app service plan.
You can use SignalR over WebSockets (self-hosted) instead of the Azure SignalR Service, but for some unknown reason it is unstable and you'll see lots of SignalR connection drop-outs.
There's no way to save costs on SignalR - no reservations, no bulk discounts.
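For reference, a minimal sketch of where that choice is made in a Blazor Server app's Startup; the Azure SignalR Service variant assumes the Microsoft.Azure.SignalR package and its documented ServerStickyMode option:

using Microsoft.Azure.SignalR;                     // only needed for Option B
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddRazorPages();
        services.AddServerSideBlazor();

        // Option A: default, self-hosted SignalR over WebSockets.
        // No Azure SignalR Service resource and no per-connection service cost,
        // but every circuit stays pinned to the app instance that owns it.

        // Option B: offload the connections to the Azure SignalR Service.
        // services.AddSignalR().AddAzureSignalR(options =>
        // {
        //     // Blazor Server requires sticky server mode.
        //     options.ServerStickyMode = ServerStickyMode.Required;
        // });
    }
}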

Azure - Sending data from IoT Hub to Web App Backend

I'm searching for a solution to get data from the Azure IoT Hub to the backend of a Web App also hosted in Azure which is written in ASP.NET 4.6.
It would be best to just receive the raw Json string as fast as possible.
I found others suggesting using Webhooks or Azure Functions for a similar purpose, but the delay these solutions bring isn't really acceptable.
It would be best to just connect directly to the IoT endpoint and get every message as it comes in. Can anybody please point me in the right direction?
You can simply use the EventHub .NET SDK in your web app, connect to the EventHub-compatible endpoint of the IoT Hub and directly consume the events in your app. This has minimal delay and involves no extra components.
How-to guide (.NET Core, but the same applies to .NET Framework): https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-dotnet-standard-getstarted-send#receive-events
using Microsoft.Azure.EventHubs;            // PartitionReceiver
using Microsoft.Azure.EventHubs.Processor;  // EventProcessorHost

// Connects to the Event Hub-compatible endpoint of the IoT Hub; checkpoints
// are stored in the given blob storage container.
var eventProcessorHost = new EventProcessorHost(
    EventHubName,
    PartitionReceiver.DefaultConsumerGroupName,
    EventHubConnectionString,
    StorageConnectionString,
    StorageContainerName);

// Registers the event processor (a class implementing IEventProcessor)
// and starts receiving messages.
await eventProcessorHost.RegisterEventProcessorAsync<SimpleEventProcessor>();
The Azure SignalR Service can help to broadcast messages to the Web App instances.
There is no direct integration between the Azure IoT Hub and the Azure SignalR Service. Basically, you can use one of two patterns for this integration: PULL-PUSH and PUSH-PUSH.
Note that the PUSH-PUSH pattern with Azure Event Grid is suitable for solutions where the subscriber (consumer) does not need to process events in order.
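As a minimal sketch, here is what the SimpleEventProcessor registered above could look like if the Web App also pushes each message straight to browser clients with self-hosted ASP.NET SignalR 2.x (the Web App is ASP.NET 4.6), rather than via the Azure SignalR Service; TelemetryHub and the newTelemetry client method are hypothetical names:

using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;              // classic SignalR 2.x
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.EventHubs.Processor;

// Hypothetical hub the browser clients connect to.
public class TelemetryHub : Hub { }

public class SimpleEventProcessor : IEventProcessor
{
    public Task OpenAsync(PartitionContext context) => Task.CompletedTask;

    public Task CloseAsync(PartitionContext context, CloseReason reason) => Task.CompletedTask;

    public Task ProcessErrorAsync(PartitionContext context, Exception error) => Task.CompletedTask;

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<TelemetryHub>();

        foreach (var eventData in messages)
        {
            // The raw JSON string sent by the device.
            var json = Encoding.UTF8.GetString(
                eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count);

            // Broadcast to all connected SignalR clients.
            hub.Clients.All.newTelemetry(json);
        }

        // Checkpoint so processing resumes from here after a restart.
        await context.CheckpointAsync();
    }
}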

Service Bus architecture for ASP.NET Web API

I am developing a mobile application using Telerik Platform. The services consumed by the app are ASP.NET Web API RESTful services which are hosted on Azure. I'd like to build some resilience into the app by adding a service bus and have been looking at Azure Service Bus which seems to be what I'm looking for.
It's fairly new to me and I have a few questions.
Can Azure Service Bus be used for RESTful services that return data or are they fire-and-forget only?
For simple RESTful services is Azure Service Bus the way to go or Azure Storage Queue? When would you use one vs the other?
When would I use a Queue vs Topic / Subscription?
ASB is about messaging. You use messaging for communication between parts of your system/services. RESTful services can leverage ASB by translating a request into a message to perform some work. The emphasis is on converting the intent into a message that instructs what work needs to take place, not on executing the work itself.
ASB or ASQ is your choice. This is where you need to choose between the features and capabilities each one provides. There's good MSFT comparison documentation on it.
Queues vs Topics/Subscriptions - if you need to send a message to a single destination (a command), a queue is simpler. If a message needs to be broadcast to multiple receivers (events), topics/subscriptions are your friends.
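As a concrete illustration of the first point, a minimal sketch (using the current Azure.Messaging.ServiceBus SDK) of a Web API action translating an HTTP request into a queued command; the "orders" queue name and the PlaceOrderCommand type are hypothetical:

using System.Text.Json;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Microsoft.AspNetCore.Mvc;

// Hypothetical command type carried by the message.
public record PlaceOrderCommand(string OrderId, int Quantity);

[ApiController]
[Route("api/orders")]
public class OrdersController : ControllerBase
{
    private readonly ServiceBusSender _sender;

    // ServiceBusSender for the (hypothetical) "orders" queue, registered in DI.
    public OrdersController(ServiceBusSender sender) => _sender = sender;

    [HttpPost]
    public async Task<IActionResult> Post(PlaceOrderCommand command)
    {
        // Translate the HTTP request into a message; the actual work is done
        // later by whatever worker consumes the "orders" queue.
        var message = new ServiceBusMessage(JsonSerializer.Serialize(command));
        await _sender.SendMessageAsync(message);

        // Accepted: the work has been queued, not performed.
        return Accepted();
    }
}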

The best option for sending asynchronous callbacks from a Web API to a .NET client

I have the following scenario in my project:
The client makes use of ASP.NET Web API to make HTTP service requests. The Web API sits on top of a couple of WCF services, which in turn handle all the business logic. The client subscribes to a particular type of event with the Web API. Whenever the Web API receives notifications from the internal WCF services about the occurrence of the event, the Web API in turn needs to notify (push events to) all the subscribed clients about the events along with their details.
I want to understand the different options available for sending asynchronous callbacks from an ASP.NET Web API to the clients. (Currently we are working on a prototype for which the client is a C# Windows Forms application. Later we might opt for an ASP.NET MVC4 web application.)
I also want to know which option would be ideal for sending asynchronous notifications back to the client when the data that accompanies the notification is large. In our scenario, the notification data sent back from the service may be large (roughly in the range of 5 KB - 50 MB).
In the scenario I described above, can SignalR be used to notify the C# client from the Web API, as and when the Web API receives the callback from the internal WCF services?
Note :- The Web API is currently hosted in a Windows Service and the client is a .NET Windows Forms application.
Any pointers to such code samples or directions on how this can be achieved would be extremely helpful.
Cheers
SignalR is a good fit for the scenario you're describing, so I'd suggest using it for the notifications (especially since you want to start with a WinForms application and later switch to browser clients - with SignalR, you'll be able to connect to the same server-side code).
However, I'd also suggest keeping the notification messages lightweight: instead of sending the data to the client with them, I'd send a token the client can then use to retrieve the data from the Web API (SignalR isn't really ideal for large file transfers).
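A minimal sketch of that pattern on the WinForms side, using the classic SignalR 2.x .NET client; the hub name ("NotificationsHub"), the client method ("notify"), the URLs and the download route are all hypothetical:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR.Client;

public class NotificationListener
{
    private static readonly HttpClient Http = new HttpClient();

    public async Task StartAsync()
    {
        // Base URL of the self-hosted Web API / SignalR endpoint (hypothetical).
        var connection = new HubConnection("http://localhost:8080/");
        IHubProxy proxy = connection.CreateHubProxy("NotificationsHub");

        // Lightweight notification: only a small token arrives over SignalR.
        proxy.On<string>("notify", async token =>
        {
            // The actual (possibly large, 5 KB - 50 MB) payload is then pulled over HTTP.
            var data = await Http.GetByteArrayAsync(
                "http://localhost:8080/api/notifications/" + token);
            Console.WriteLine($"Received {data.Length} bytes for token {token}");
        });

        await connection.Start();
    }
}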
