Azure Service Bus not all messages received in hosted service web app - asp.net

Inside a .NET web app, I set up a hosted service to receive messages from an Azure Service Bus topic. The problem is that not all messages are received, only an arbitrary number (e.g. of 20 messages only 12 arrive). The rest end up in the dead-letter queue. This happens when the messages are sent simultaneously.
I tried the following steps to solve this:
Increased the maximum number of concurrent calls, which helped but didn't provide a guarantee
Added a prefetch count
I also tried sending messages via the send functionality of the Service Bus resource in the Azure portal: 500 messages with no interval --> not all messages were received; 500 messages with a 1 s interval --> all messages were received.
I just don't understand why the receiver is not receiving all of the messages.
I want to build an event-driven architecture and cannot have it be a gamble whether all messages will be processed.
Startup.cs
...
public void ConfigureServices(IServiceCollection services)
{
services.AddSingleton<IServiceBusTopicSubscription,ServiceBusSubscription>();
services.AddHostedService<WorkerServiceBus>();
}
...
WorkerService.cs
public class WorkerServiceBus : IHostedService, IDisposable
{
private readonly ILogger<WorkerServiceBus> _logger;
private readonly IServiceBusTopicSubscription _serviceBusTopicSubscription;
public WorkerServiceBus(IServiceBusTopicSubscription serviceBusTopicSubscription,
ILogger<WorkerServiceBus> logger)
{
_serviceBusTopicSubscription = serviceBusTopicSubscription;
_logger = logger;
}
public async Task StartAsync(CancellationToken stoppingToken)
{
_logger.LogInformation("Starting the service bus queue consumer and the subscription");
await _serviceBusTopicSubscription.PrepareFiltersAndHandleMessages().ConfigureAwait(false);
}
public async Task StopAsync(CancellationToken stoppingToken)
{
_logger.LogInformation("Stopping the service bus queue consumer and the subscription");
await _serviceBusTopicSubscription.CloseSubscriptionAsync().ConfigureAwait(false);
}
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual async void Dispose(bool disposing)
{
if (disposing)
{
await _serviceBusTopicSubscription.DisposeAsync().ConfigureAwait(false);
}
}
}
ServiceBusSubscription.cs
public class ServiceBusSubscription : IServiceBusTopicSubscription
{
private readonly IConfiguration _configuration;
private const string TOPIC_PATH = "test";
private const string SUBSCRIPTION_NAME = "test-subscriber";
private readonly ILogger _logger;
private readonly ServiceBusClient _client;
private readonly IServiceScopeFactory _scopeFactory;
private ServiceBusProcessor _processor;
public ServiceBusSubscription(IConfiguration configuration,
ILogger<ServiceBusSubscription> logger,
IServiceScopeFactory scopeFactory)
{
_configuration = configuration;
_logger = logger;
_scopeFactory = scopeFactory;
var connectionString = _configuration.GetConnectionString("ServiceBus");
var serviceBusOptions = new ServiceBusClientOptions()
{
TransportType = ServiceBusTransportType.AmqpWebSockets
};
_client = new ServiceBusClient(connectionString, serviceBusOptions);
}
public async Task PrepareFiltersAndHandleMessages()
{
ServiceBusProcessorOptions _serviceBusProcessorOptions = new ServiceBusProcessorOptions
{
MaxConcurrentCalls = 200,
AutoCompleteMessages = false,
PrefetchCount = 1000,
};
_processor = _client.CreateProcessor(TOPIC_PATH, SUBSCRIPTION_NAME, _serviceBusProcessorOptions);
_processor.ProcessMessageAsync += ProcessMessagesAsync;
_processor.ProcessErrorAsync += ProcessErrorAsync;
await _processor.StartProcessingAsync().ConfigureAwait(false);
}
private async Task ProcessMessagesAsync(ProcessMessageEventArgs args)
{
_logger.LogInformation($"Received message from service bus");
_logger.LogInformation($"Message: {args.Message.Body}");
var payload = args.Message.Body.ToObjectFromJson<List<SchedulerBookingViewModel>>();
// Create scoped dbcontext
using var scope = _scopeFactory.CreateScope();
var dbContext = scope.ServiceProvider.GetRequiredService<dbContext>();
// Process payload
await new TestServiceBus().DoThings(payload);
await args.CompleteMessageAsync(args.Message).ConfigureAwait(false);
}
private Task ProcessErrorAsync(ProcessErrorEventArgs arg)
{
_logger.LogError(arg.Exception, "Message handler encountered an exception");
_logger.LogError($"- ErrorSource: {arg.ErrorSource}");
_logger.LogError($"- Entity Path: {arg.EntityPath}");
_logger.LogError($"- FullyQualifiedNamespace: {arg.FullyQualifiedNamespace}");
return Task.CompletedTask;
}
public async ValueTask DisposeAsync()
{
if (_processor != null)
{
await _processor.DisposeAsync().ConfigureAwait(false);
}
if (_client != null)
{
await _client.DisposeAsync().ConfigureAwait(false);
}
}
public async Task CloseSubscriptionAsync()
{
await _processor.CloseAsync().ConfigureAwait(false);
}
}

This is how we solved the problem: it was related to the message lock duration, which is configured on the subscription in the Azure portal. Previous value: 30 s. New value: 3 min. With a short lock duration and a large prefetch count, prefetched messages can wait in the client buffer longer than the lock allows; their locks expire, the delivery count climbs, and they eventually end up in the dead-letter queue.
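For reference, the processor can also be told to keep renewing locks while a handler is running, which softens the same failure mode. A minimal sketch of alternative ServiceBusProcessorOptions, with illustrative values that are not from the original code:

var serviceBusProcessorOptions = new ServiceBusProcessorOptions
{
    // Keep concurrency and prefetch modest so prefetched messages don't
    // wait in the client buffer until their lock has already expired.
    MaxConcurrentCalls = 20,
    PrefetchCount = 40,
    AutoCompleteMessages = false,
    // The processor renews the lock in the background while the message is
    // still being processed (best effort, so the lock duration configured on
    // the subscription still matters).
    MaxAutoLockRenewalDuration = TimeSpan.FromMinutes(5)
};
_processor = _client.CreateProcessor(TOPIC_PATH, SUBSCRIPTION_NAME, serviceBusProcessorOptions);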

Related

How to create Global Variable per hosted service in .Net core

How to create a global variable that can be unique per hosted service execution?
Complete Code:
https://github.com/matvi/dotnet-hosted-services
The problem:
When running hosted services, it is difficult to keep track of the execution without logs. In order to keep track of the execution of the hosted services, I implemented logs with a unique traceId (GUID).
The problem is that the TraceLogId is created per task using static memory, and when 2 tasks run at the same time the first TraceLogId gets overwritten by the second task.
Is there any way to avoid the traceLogId being overridden?
public static class GlobalVariables
{
public static Guid TraceLogId { get; set; }
}
public class Task1Service : ITask1Service
{
public async Task StartAsync(CancellationToken cancellationToken)
{
GlobalVariables.TraceLogId = Guid.NewGuid();
Console.WriteLine($"Task1 executing with traceLogId = {GlobalVariables.TraceLogId}");
Console.WriteLine($"Task1 will wait 5 seconds = {GlobalVariables.TraceLogId}");
await Task.Delay(5000, cancellationToken);
Console.WriteLine($"Task1 ending = {GlobalVariables.TraceLogId}");
}
}
public class Task2Service : ITask2Service
{
public async Task StartAsync(CancellationToken cancellationToken)
{
Console.WriteLine("Task2 executing");
GlobalVariables.TraceLogId = Guid.NewGuid();
Console.WriteLine($"Task2 executing with traceLogId = {GlobalVariables.TraceLogId}");
Console.WriteLine($"Task2 ending = {GlobalVariables.TraceLogId}");
}
}
When the code is executed, Task1 gets a TraceLogId, but by the time it finishes it holds the traceLogId that was assigned in Task2.
using System;
using System.Threading;
using System.Threading.Tasks;
using Cronos;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
namespace HostedServicesPoc.Tasks
{
public abstract class CronJobServiceBase : IHostedService, IDisposable
{
private readonly ILogger _log;
private readonly HostedServiceTaskSettingsBase _hostedServiceTaskSettingsBase;
private System.Timers.Timer _timer;
private readonly CronExpression _expression;
private readonly TimeZoneInfo _timeZoneInfo;
protected CronJobServiceBase(IOptions<HostedServiceTaskSettingsBase> hostedServiceSettings, ILogger<CronJobServiceBase> log)
{
_log = log;
_hostedServiceTaskSettingsBase = hostedServiceSettings?.Value;
_expression = CronExpression.Parse(_hostedServiceTaskSettingsBase.CronExpressionTimer, CronFormat.Standard);
_timeZoneInfo = TimeZoneInfo.Local;
}
public virtual async Task StartAsync(CancellationToken cancellationToken)
{
_log.LogInformation($"{GetType()} is Starting");
if (_hostedServiceTaskSettingsBase.Active)
{
await ScheduleJob(cancellationToken);
}
}
public Task StopAsync(CancellationToken cancellationToken)
{
_log.LogInformation($"{GetType()} is Stopping");
return Task.CompletedTask;
}
private async Task ScheduleJob(CancellationToken cancellationToken)
{
var next = _expression.GetNextOccurrence(DateTimeOffset.Now, _timeZoneInfo);
if (next.HasValue)
{
var delay = next.Value - DateTimeOffset.Now;
if (delay.TotalMilliseconds <= 0) // prevent non-positive values from being passed into Timer
{
await ScheduleJob(cancellationToken);
}
_timer = new System.Timers.Timer(delay.TotalMilliseconds);
_timer.Elapsed += async (sender, args) =>
{
_timer.Dispose(); // reset and dispose timer
_timer = null;
if (!cancellationToken.IsCancellationRequested)
{
await ExecuteTaskAsync(cancellationToken);
}
if (!cancellationToken.IsCancellationRequested)
{
await ScheduleJob(cancellationToken); // reschedule next
}
};
_timer.Start();
}
await Task.CompletedTask;
}
protected virtual async Task ExecuteTaskAsync(CancellationToken cancellationToken)
{
await Task.Delay(5000, cancellationToken);
}
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool dispose)
{
try
{
if (dispose)
{
_timer?.Dispose();
}
}
finally
{
}
}
}
}
TaskServices:
public class Task1HostedService : CronJobServiceBase
{
private readonly IServiceProvider _serviceProvider;
public Task1HostedService(
IOptions<Task1HostedServiceSettings> hostedServiceSettings,
ILogger<CronJobServiceBase> log,
IServiceProvider serviceProvider) : base(hostedServiceSettings, log)
{
_serviceProvider = serviceProvider;
}
protected override async Task ExecuteTaskAsync(CancellationToken cancellationToken)
{
using var scope = _serviceProvider.CreateScope();
var task1Service = scope.ServiceProvider.GetRequiredService<ITask1Service>();
await task1Service.StartAsync(cancellationToken);
}
}
I recommend a scoped value for this; AsyncLocal<T> fits the bill.
public static class GlobalVariables
{
private static AsyncLocal<Guid> _TraceLogId = new();
public static Guid TraceLogId => _TraceLogId.Value;
public static IDisposable SetTraceLogId(Guid value)
{
var oldValue = _TraceLogId.Value;
_TraceLogId.Value = value;
return Disposable.Create(() => _TraceLogId.Value = oldValue);
}
}
public class Task1Service : ITask1Service
{
public async Task StartAsync(CancellationToken cancellationToken)
{
using var traceIdScope = GlobalVariables.SetTraceLogId(Guid.NewGuid());
Console.WriteLine($"Task1 executing with traceLogId = {GlobalVariables.TraceLogId}");
Console.WriteLine($"Task1 will wait 5 seconds = {GlobalVariables.TraceLogId}");
await Task.Delay(5000, cancellationToken);
Console.WriteLine($"Task1 ending = {GlobalVariables.TraceLogId}");
}
}
This uses Disposable from my Disposables library.
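If you'd rather not take the library dependency, a minimal stand-in for Disposable.Create could look like this (a sketch, not the library's actual implementation):

public static class Disposable
{
    // Wraps an action so it runs exactly once when the returned scope is disposed.
    public static IDisposable Create(Action dispose) => new AnonymousDisposable(dispose);

    private sealed class AnonymousDisposable : IDisposable
    {
        private Action _dispose;
        public AnonymousDisposable(Action dispose) => _dispose = dispose;

        public void Dispose()
        {
            // Interlocked.Exchange makes repeated Dispose calls harmless.
            Interlocked.Exchange(ref _dispose, null)?.Invoke();
        }
    }
}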

Why is my Blazor app leaving so many ports open?

I created a .net 6 app using server side Blazor and SignalR. The app was basically a single page with 10 different components. Each component was a client that looked something like this:
@code {
private HubConnection? hubConnection;
private ExampleViewModel data { get; set; } = new ExampleViewModel();
protected override async Task OnInitializedAsync()
{
hubConnection = new HubConnectionBuilder()
.WithUrl(NavigationManager.ToAbsoluteUri("/mainhub"))
.Build();
hubConnection.On<ExampleViewModel>("example", (Data) =>
{
data = Data;
StateHasChanged();
});
await hubConnection.StartAsync();
}
public async ValueTask DisposeAsync()
{
if (hubConnection is not null)
{
await hubConnection.DisposeAsync();
}
}
}
Each component has a "broadcaster" that runs on a timer and makes a call to the database using Mediator and Dapper. Example:
public class ExampleBroadcaster : IDataBroadcaster
{
private readonly IMediator _mediator;
private readonly ILogger<ExampleBroadcaster> _logger;
private readonly IHubContext<MainHub> _mainHub;
private readonly IMemoryCache _cache;
private const string Something = "example";
private Timer _timer;
public ExampleBroadcaster(IHubContext<MainHub> mainHub,
IMediator mediator, ILogger<ExampleBroadcaster> logger,
IMemoryCache cache)
{
_mainHub = mainHub;
_mediator = mediator;
_logger = logger;
_cache = cache;
}
public void Start()
{
_timer = new Timer(BroadcastData, null, 0, 30000);
}
private async void BroadcastData(object? state)
{
ExampleViewModel viewModel;
try
{
if (_cache.TryGetValue(Something, out ExampleViewModel data))
{
viewModel = data;
}
else
{
viewModel = _mediator.Send(new GetExampleData()).Result;
_cache.Set(Something, viewModel, TimeSpan.FromMinutes(10));
}
await _mainHub.Clients.All.SendAsync("example", viewModel);
}
catch (Exception ex)
{
_logger.LogError(ex, ex.Message);
}
}
}
The mediator handler simply uses Dapper to get data from the database:
public class GetExampleData : IRequest<ExampleViewModel>
{
}
public class GetExampleDataHandler : IRequestHandler<GetExampleData, ExampleViewModel>
{
private readonly IDbConnectionFactory _connectionFactory;
private string _storedProcedure = "some sproc name";
public GetExampleDataHandler(IDbConnectionFactory connectionFactory)
{
_connectionFactory = connectionFactory;
}
public async Task<ExampleViewModel> Handle(GetExampleData request, CancellationToken cancellationToken)
{
using (var connection = _connectionFactory.GetReadOnlyConnection())
{
return await connection.QueryFirstAsync<ExampleViewModel>(_storedProcedure, commandType: CommandType.StoredProcedure);
}
}
}
This is the main razor page that houses all the individual components:
@code {
private HubConnection? hubConnection;
protected override async Task OnInitializedAsync()
{
try
{
hubConnection = new HubConnectionBuilder()
.WithUrl(NavigationManager.ToAbsoluteUri("/mainhub"))
.Build();
await hubConnection.StartAsync();
await hubConnection.SendAsync("Init");
}
catch(Exception exception)
{
Logger.LogError(exception, exception.Message);
}
}
public async ValueTask DisposeAsync()
{
if (hubConnection is not null)
{
await hubConnection.DisposeAsync();
}
}
}
Finally, the MainHub.cs code:
public class MainHub : Hub
{
IEnumerable<IDataBroadcaster> _broadcasters;
private static bool _started;
public MainHub(IEnumerable<IDataBroadcaster> broadcasters)
{
_broadcasters = broadcasters;
}
public void Init()
{
if (!_started)
{
StartBroadcasting();
_started = true;
}
}
private void StartBroadcasting()
{
foreach (var broadcaster in _broadcasters)
{
broadcaster.Start();
}
}
}
This all worked fine locally, in our dev environment, and in our test environment. In production, we found that the app was crashing after a number of hours. According to the server admins, the app opens hundreds or thousands of ports and leaves them open until the allotted number of ports is exhausted, causing the app to crash.
What is the issue here? The broadcasters are registered as singletons. This app only runs on one web server.

Quartz .NET The instance of entity type 'TABLENAME' cannot be tracked because

We have built an API with .NET Core 3.1 that extracts data from an Excel file and stores it via EF Core into an MS SQL database. We use Quartz.NET so that the import is handled on a background thread. For DI we use Autofac.
We use scoped services to be able to use the DbContext via DI (as described here: https://andrewlock.net/creating-a-quartz-net-hosted-service-with-asp-net-core/).
Unfortunately, saving the data still does not work when multiple users are using the application at the same time. We get the following error message:
The instance of entity type 'TABLENAME' cannot be tracked because another instance with the same key value for {'TABLEKEY'} is already being tracked. When attaching existing entities, ensure that only one entity instance with a given key value is attached. Consider using 'DbContextOptionsBuilder.EnableSensitiveDataLogging' to see the conflicting key values.
Here is our related code:
Startup.cs
// Add DbContext
services.AddDbContext<ApplicationDbContext>(options => options.UseSqlServer(Configuration.GetConnectionString("Default"), b => b.MigrationsAssembly("XY.Infrastructure")));
// Add Quartz
services.AddQuartz(q =>
{
// as of 3.3.2 this also injects scoped services (like EF DbContext) without problems
q.UseMicrosoftDependencyInjectionJobFactory();
// these are the defaults
q.UseSimpleTypeLoader();
q.UseDefaultThreadPool(tp =>
{
tp.MaxConcurrency = 24;
});
});
services.AddQuartzServer(options =>
{
// when shutting down we want jobs to complete gracefully
options.WaitForJobsToComplete = true;
});
// Add Services
services.AddHostedService<QuartzHostedService>();
services.AddSingleton<IJobFactory, SingletonJobFactory>();
services.AddSingleton<ISchedulerFactory, StdSchedulerFactory>();
services.AddTransient<ImportJob>();
services.AddScoped<IApplicationDbContext, ApplicationDbContext>();
services.AddScoped<IMyRepository, MyRepository>();
Logic.cs
// Grab the Scheduler instance from the Factory
var factory = new StdSchedulerFactory();
var scheduler = await factory.GetScheduler();
var parameters = new JobDataMap()
{
new KeyValuePair<string, object>("request", message),
new KeyValuePair<string, object>("sales", sales),
};
var jobId = $"processJob{Guid.NewGuid()}";
var groupId = $"group{Guid.NewGuid()}";
// defines the job
IJobDetail job = JobBuilder.Create<ImportJob>()
.WithIdentity(jobId, groupId)
.UsingJobData(parameters)
.Build();
// defines the trigger
ITrigger trigger = TriggerBuilder.Create()
.WithIdentity($"Trigger{Guid.NewGuid()}", groupId)
.ForJob(job)
.StartNow()
.Build();
// schedule Job
await scheduler.ScheduleJob(job, trigger);
// and start it off
await scheduler.Start();
QuartzHostedService.cs
public class QuartzHostedService : IHostedService
{
private readonly ISchedulerFactory _schedulerFactory;
private readonly IJobFactory _jobFactory;
private readonly ILogger<QuartzHostedService> _logger;
private readonly IEnumerable<JobSchedule> _jobSchedules;
public QuartzHostedService(
ISchedulerFactory schedulerFactory,
IJobFactory jobFactory,
IEnumerable<JobSchedule> jobSchedules,
ILogger<QuartzHostedService> logger)
{
_schedulerFactory = schedulerFactory;
_jobSchedules = jobSchedules;
_jobFactory = jobFactory;
_logger = logger;
}
public IScheduler Scheduler { get; set; }
public async Task StartAsync(CancellationToken cancellationToken)
{
try
{
Scheduler = await _schedulerFactory.GetScheduler(cancellationToken);
Scheduler.JobFactory = _jobFactory;
foreach (var jobSchedule in _jobSchedules)
{
var job = CreateJob(jobSchedule);
var trigger = CreateTrigger(jobSchedule);
await Scheduler.ScheduleJob(job, trigger, cancellationToken);
}
await Scheduler.Start(cancellationToken);
}
catch (Exception ex)
{
_logger.LogError(ex.Message);
}
}
public async Task StopAsync(CancellationToken cancellationToken)
{
await Scheduler?.Shutdown(cancellationToken);
}
private static IJobDetail CreateJob(JobSchedule schedule)
{
var jobType = schedule.JobType;
return JobBuilder
.Create(jobType)
.WithIdentity(jobType.FullName)
.WithDescription(jobType.Name)
.Build();
}
private static ITrigger CreateTrigger(JobSchedule schedule)
{
return TriggerBuilder
.Create()
.WithIdentity($"{schedule.JobType.FullName}.trigger")
.StartNow()
.Build();
}
}
SingletonJobFactory.cs
public class SingletonJobFactory : IJobFactory
{
private readonly IServiceProvider _serviceProvider;
public SingletonJobFactory(IServiceProvider serviceProvider)
{
_serviceProvider = serviceProvider;
}
public IJob NewJob(TriggerFiredBundle bundle, IScheduler scheduler)
{
try
{
return _serviceProvider.GetRequiredService(bundle.JobDetail.JobType) as IJob;
}
catch (Exception ex)
{
throw;
}
}
public void ReturnJob(IJob job) { }
}
Importjob.cs
[DisallowConcurrentExecution]
public class ImportJob : IJob
{
private readonly IServiceProvider _provider;
private readonly ILogger<ImportJob> _logger;
public ImportJob(IServiceProvider provider, ILogger<ImportJob> logger)
{
_provider = provider;
_logger = logger;
}
public async Task Execute(IJobExecutionContext context)
{
try
{
using (var scope = _provider.CreateScope())
{
var jobType = context.JobDetail.JobType;
var job = scope.ServiceProvider.GetRequiredService(jobType) as IJob;
var repo = _provider.GetRequiredService<MYRepository>();
var importFactSales = _provider.GetRequiredService<IImportData>();
var savedRows = 0;
var request = (MyRequest)context.JobDetail.JobDataMap.Get("request");
var sales = (IEnumerable<MyData>)context.JobDetail.JobDataMap.Get("sales");
await importFactSales.saveValidateItems(repo, request, sales, savedRows);
}
}
catch (Exception ex)
{
_logger.LogError(ex.Message);
}
}
}
I have found a solution in the meantime. As @marko-lahma described in the comments, use the built-in hosted service and don't implement your own JobFactory.
Remove SingletonJobFactory.cs and QuartzHostedService.cs.
Use the Autofac.Extras.Quartz and Quartz.Extensions.Hosting NuGet packages.
Don't use CreateScope anymore; inject all needed dependencies through the constructor.
Register QuartzAutofacFactoryModule and QuartzAutofacJobsModule in Startup (a registration sketch follows this list).
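Roughly, the registration could look like this; a sketch assuming ImportJob lives in the application assembly, with names adapted from the snippets above rather than copied from the real project:

// Autofac side (e.g. in Startup.ConfigureContainer(ContainerBuilder builder)):
builder.RegisterModule(new QuartzAutofacFactoryModule());
builder.RegisterModule(new QuartzAutofacJobsModule(typeof(ImportJob).Assembly));

// ASP.NET Core side -- let Quartz.Extensions.Hosting host the scheduler
// instead of the hand-rolled QuartzHostedService:
services.AddQuartzHostedService(options =>
{
    options.WaitForJobsToComplete = true;
});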

Cannot access a disposed object. with SignalR and Timer Manager

I want my function to send data in real time (every 2 seconds, or whenever there is a change in the database table), but an exception keeps appearing in the code below.
The exception details are:
'Cannot access a disposed object.'
public class MyHub : Hub
{
private readonly IRepository<MyTable, long> _repository;
private readonly IUnitOfWorkManager _unitOfWorkManager;
public MyHub(IUnitOfWorkManager unitOfWorkManager, IRepository<MyTable, long> repository)
{
_repository = repository;
_unitOfWorkManager = unitOfWorkManager;
}
public void Get(TestDto testDto)
{
try {
using (var unitOfWork = _unitOfWorkManager.Begin())
{
var result= _repository.GetDbContext().Set<MyTable>()
.Include(x => x.list)
.ThenInclude(x => x.list2)
.ThenInclude(x => x.obj).ToList();
new TimerManager(async () =>
await Clients.All.SendAsync("listen", result) //<====== in this Line the exception occured
);
}
}
catch(Exception ex)
{
throw new UserFriendlyException(ex.InnerException.Message.ToString());
}
}
and TimerManager Code is
public class TimerManager
{
private Timer _timer;
private AutoResetEvent _autoResetEvent;
private Action _action;
public DateTime TimerStarted { get; }
public TimerManager(Action action)
{
_action = action;
_autoResetEvent = new AutoResetEvent(false);
_timer = new Timer(Execute, _autoResetEvent, 1000, 2000);
TimerStarted = DateTime.Now;
}
public void Execute(object stateInfo)
{
_action();
if ((DateTime.Now - TimerStarted).Seconds > 60)
{
_timer.Dispose();
}
}
}
So is the problem in TimerManager, or in MyHub, or is the way I'm simulating the real-time data with TimerManager simply not acceptable?
Once you exit the hub method you aren't guaranteed to be able to access the Clients property. If you want to do something like that, you should inject an IHubContext<THub> into your Hub's constructor and use that instead. You can read more about IHubContext at https://learn.microsoft.com/aspnet/core/signalr/hubcontext?view=aspnetcore-3.1#get-an-instance-of-ihubcontext
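A rough sketch of that approach with the hub from the question; the data access is collapsed into a hypothetical LoadData() helper, and only the IHubContext wiring is the point here:

public class MyHub : Hub
{
    private readonly IHubContext<MyHub> _hubContext;

    public MyHub(IHubContext<MyHub> hubContext)
    {
        _hubContext = hubContext;
    }

    public void Get(TestDto testDto)
    {
        var result = LoadData(); // hypothetical helper standing in for the repository query above

        // IHubContext remains usable after the hub method returns, so the timer
        // callback can keep broadcasting without touching the disposed hub instance.
        new TimerManager(async () =>
            await _hubContext.Clients.All.SendAsync("listen", result));
    }

    private List<MyTable> LoadData() => new List<MyTable>(); // placeholder for the real query
}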

How to send message to only caller client in SignalR?

Below is my SignalR Hub class code.
public class ChatHub : Hub
{
public void Send(string name, string message)
{
// Call the addNewMessageToPage method to update clients.
Clients.All.addNewMessageToPage(name, message);
}
public async void webAPIRequest()
{
HttpClient client = new HttpClient();
HttpResponseMessage response = await client.GetAsync("https://jsonplaceholder.typicode.com/posts");
//Clients.All.addWebAPIResponseToPage(response);
Clients.Caller.addWebAPIResponseToPage(response);
await Task.Delay(1000);
response = await client.GetAsync("http://www.google.com");
Clients.Caller.addWebAPIResponseToPage(response);
//Clients.All.addWebAPIResponseToPage(response);
await Task.Delay(1000);
response = await client.GetAsync("https://jsonplaceholder.typicode.com/posts?userId=1");
//Clients.All.addWebAPIResponseToPage(response);
Clients.Caller.addWebAPIResponseToPage(response);
}
}
As per my understanding,
Clients.Caller.addWebAPIResponseToPage(response);
sends the message only to the caller client, whereas
Clients.All.addWebAPIResponseToPage(response);
sends the message to all clients.
Is my understanding correct?
If not, then which method needs to be called to send a message only to the caller client?
Yes, your understanding is correct. Read about it here:
https://learn.microsoft.com/en-us/aspnet/signalr/overview/guide-to-the-api/hubs-api-guide-server#selectingclients
You can use Caller, you can provide the current user's connection ID and send the message to that, or, as I have seen in some places, you can use a 'self' group that keeps the user's connections from various devices together and send the message to that group.
For example, if you are logged in on a desktop and on a mobile device as well, you will have two connection IDs but you are the same user. You can add this user to a self_username_unique_group_name kind of group and then send a message to that group, which will be delivered to every device where the user is connected.
You can also manage the connection IDs for a single user in a separate table and send the message to all of those connection IDs if you want.
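As an illustration, a small sketch of the 'self group' idea in ASP.NET Core SignalR; GetUserName() is a hypothetical helper for resolving the logged-in user:

public class ChatHub : Hub
{
    public override async Task OnConnectedAsync()
    {
        // Every connection belonging to the same user joins the same per-user group.
        await Groups.AddToGroupAsync(Context.ConnectionId, $"self_{GetUserName()}");
        await base.OnConnectedAsync();
    }

    public async Task SendToMyDevices(string message)
    {
        // Reaches the caller on every device where they are connected.
        await Clients.Group($"self_{GetUserName()}").SendAsync("addWebAPIResponseToPage", message);
    }

    private string GetUserName() => Context.User?.Identity?.Name ?? Context.ConnectionId;
}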
Too much flexibility and magic
Enjoy
I found this to work quite well where ConnectionMapping is described in https://learn.microsoft.com/en-us/aspnet/signalr/overview/guide-to-the-api/mapping-users-to-connections
public class Startup
{
public void ConfigureServices(IServiceCollection services)
{
services.AddSingleton<IHttpContextAccessor, HttpContextAccessor>();
services.AddScoped<SomeService>();
services.AddScoped<SessionService>();
services.AddScoped<ProgressHub>();
}
}
public class SomeService
{
ProgressHub _hub;
public SomeService(ProgressHub hub)
{
_hub = hub;
}
private async Task UpdateProgressT(T value)
{
_hub.Send(value);
}
}
public class ProgressHub : Hub
{
private readonly static ConnectionMapping<string> _connections = new ConnectionMapping<string>();
private readonly IHubContext<ProgressHub> _context;
private readonly SessionService _session;
public ProgressHub(IHubContext<ProgressHub> context, SessionService session)
{
_context = context;
_session = session;
}
public override Task OnConnectedAsync()
{
_connections.Add(_session.SiteId, Context.ConnectionId);
return base.OnConnectedAsync();
}
public override Task OnDisconnectedAsync(Exception exception)
{
_connections.Remove(_session.SiteId, Context.ConnectionId);
return base.OnDisconnectedAsync(exception);
}
public async Task Send(object data)
{
foreach (var connectionId in _connections.GetConnections(_session.SiteId))
{
await _context.Clients.Client(connectionId).SendAsync("Message", data);
}
}
}
public class SessionService
{
private readonly ISession _session;
public SessionService(IHttpContextAccessor accessor)
{
_session = accessor.HttpContext.Session;
if (_session == null) throw new ArgumentNullException("session");
}
public string SiteId
{
get => _session.GetString("SiteId");
set => _session.SetString("SiteId", value);
}
}
