I have a requirement to batch a number of web service calls on receipt of a single message appearing in an (MSMQ) queue.
Is "sagas" the way to go?
The interaction with the third-party web service is further complicated because I need to call it once, then subsequently poll for an acknowledgement using a correlation id returned in the reply to the initial call.
Yes, sagas could be the way to coordinate the process of initiating the call, polling until the operation has ended, and then doing something else when all the work is done.
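For completeness, here is a very rough sketch of what such a saga could look like with Rebus - all the type names (StartPolling, PollTick, PollSagaData) are invented for illustration, and bus.Defer assumes that delayed delivery/timeouts are configured:
public class PollSagaData : SagaData
{
    public string CorrelationId { get; set; }
}

public class PollSaga : Saga<PollSagaData>, IAmInitiatedBy<StartPolling>, IHandleMessages<PollTick>
{
    readonly IBus _bus;

    public PollSaga(IBus bus) => _bus = bus;

    protected override void CorrelateMessages(ICorrelationConfig<PollSagaData> config)
    {
        // both messages are correlated to the saga instance via the correlation id
        config.Correlate<StartPolling>(m => m.CorrelationId, d => d.CorrelationId);
        config.Correlate<PollTick>(m => m.CorrelationId, d => d.CorrelationId);
    }

    public async Task Handle(StartPolling message)
    {
        Data.CorrelationId = message.CorrelationId;

        // make the initial web service call here, then schedule the first poll
        await _bus.Defer(TimeSpan.FromSeconds(10), new PollTick { CorrelationId = message.CorrelationId });
    }

    public async Task Handle(PollTick message)
    {
        // poll the web service for the acknowledgement here
        var acknowledged = false; // <- result of the poll goes here

        if (acknowledged)
        {
            // do whatever needs to happen when all the work is done, then end the saga
            MarkAsComplete();
        }
        else
        {
            await _bus.Defer(TimeSpan.FromSeconds(10), message);
        }
    }
}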
If you don't care too much about accidentally making the web service call more than once, you can easily use Rebus' async capabilities to implement the polling - I am currently in the process of building something that basically does this:
public async Task<SomeReply> Handle(SomeMessage message)
{
    // the initial call returns the URL to poll and the URL to fetch the final result from
    var response = await _client.Get<SomeResponse>("https://someurl");
    var pollUrl = response.PollUrl;
    var resultUrl = response.ResultUrl;

    while (true)
    {
        var result = await _client.Get<PollResult>(pollUrl);

        if (result.Status == PollStatus.Processing)
        {
            await Task.Delay(TimeSpan.FromSeconds(2));
            continue;
        }

        if (result.Status == PollStatus.Done)
        {
            var finalResult = await _client.Get<FinalResult>(resultUrl);
            return new SomeReply(finalResult);
        }

        throw new Exception($"Unexpected status while polling {pollUrl}: {result.Status}");
    }
}
thus taking advantage of async/await to poll the external web service while it is processing, while consuming minimal resources on our end.
When I set a breakpoint at the line with XREAD, the program does nothing. Maybe I need to configure this XREAD command?
public async void ListenTask()
{
var readTask = Task.Run(async () =>
{
while (!Token.IsCancellationRequested)
{
var result = db.StreamRead(streamName, "$", 1);
if (result.Any())
{
var dict = ParseResult(result.Last());
var sb = new StringBuilder();
foreach (var key in dict.Keys)
{
sb.Append(dict[key]);
}
Console.WriteLine(sb.ToString());
}
await Task.Delay(1000);
}
});
}
It's an illegal operation in StackExchange.Redis
Because of its unique multiplexed architecture, StackExchange.Redis (the library you appear to be using) does not support blocking XREAD operations. All the commands going over the interactive interface (basically everything that isn't pub/sub) use the same connection, so if you block one of those connections, everything else in your app that depends on the multiplexer will be backed up waiting for the block to complete. The StackExchange.Redis library actually goes so far as to consider the $ id an illegal id - its only purpose, after all, is to block. What's most likely happening (you don't see it because the exception is being swallowed by the synchronization context) is that var result = db.StreamRead(streamName, "$", 1); is throwing an InvalidOperationException: System.InvalidOperationException: StreamPosition.NewMessages cannot be used with StreamRead.
Work Arounds
There are two potential workarounds in this case. First, you can poll with the XRANGE command rather than using blocking reads.
var readTask = Task.Run(async () =>
{
    var lastId = "-";
    while (!token.IsCancellationRequested)
    {
        var result = await db.StreamRangeAsync("a-stream", lastId, "+");
        if (result.Any())
        {
            lastId = result.Last().Id;
        }
        await Task.Delay(1000);
    }
});
You're already effectively pausing between reads, so this polling operation is probably good enough for what you're looking for.
If you really need blocking operations, you'll have to use a different library; if you try to force the issue with StackExchange.Redis (you possibly can, via the Execute/ExecuteAsync commands), you can seriously degrade its performance.
Articles on doing so with ServiceStack.Redis and CsRedis are available on the Redis developer site (I'm the author of them).
One final thing
You'll probably also want to make sure that when you issue these commands you are being as async as possible. You're currently using the sync StreamRead command in an async context; almost every command in StackExchange.Redis has a sync/async pair, so use the async version when possible.
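For example, the listening loop could use the async API with an explicit last-seen id instead of $ - a rough sketch, reusing the db, streamName and token from your code:
var readTask = Task.Run(async () =>
{
    // "0-0" reads from the very beginning of the stream; track the last id we have seen
    RedisValue lastId = "0-0";
    while (!token.IsCancellationRequested)
    {
        // async, non-blocking read of up to 1 entry with an id greater than lastId
        var entries = await db.StreamReadAsync(streamName, lastId, 1);
        if (entries.Any())
        {
            var entry = entries.Last();
            lastId = entry.Id;

            var sb = new StringBuilder();
            foreach (var field in entry.Values)
            {
                sb.Append(field.Value);
            }
            Console.WriteLine(sb.ToString());
        }
        await Task.Delay(1000);
    }
});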
Hello, I'm developing a server-client application that communicates via SignalR. What I have to implement is a mechanism that will allow my server to call a method on the client and get the result of that call. Both applications are developed with .NET Core.
My concept is: the server invokes a method on the client, providing an id for that invocation; the client executes the method and in response calls a method on the server with the result and the provided id, so the server can match the invocation with the result.
Usage looks like this:
var invocationResult = await Clients
.Client(connectionId)
.GetName(id)
.AwaitInvocationResult<string>(ClientInvocationHelper._invocationResults, id);
AwaitInvocationResult is an extension method on Task:
public static Task<TResultType> AwaitInvocationResult<TResultType>(this Task invoke, ConcurrentDictionary<string, object> lookupDirectory, InvocationId id)
{
return Task.Run(() =>
{
while (!ClientInvocationHelper._invocationResults.ContainsKey(id.Value)
|| ClientInvocationHelper._invocationResults[id.Value] == null)
{
Thread.Sleep(500);
}
try
{
object data;
var stingifyData = lookupDirectory[id.Value].ToString();
//First we should check if invocation response contains exception
if (IsClientInvocationException(stingifyData, out ClientInvocationException exception))
{
throw exception;
}
if (typeof(TResultType) == typeof(string))
{
data = lookupDirectory[id.Value].ToString();
}
else
{
data = JsonConvert.DeserializeObject<TResultType>(stingifyData);
}
var result = (TResultType)data;
return Task.FromResult(result);
}
catch (Exception e)
{
Console.WriteLine(e);
throw;
}
});
}
As you can see, basically I have a dictionary where the key is the invocation id and the value is the result of that invocation as reported by the client. In a while loop I'm checking whether the result is already available for the server to consume; if it is, the result is converted to the specific type.
This mechanism is working pretty well, but I'm observing weird behaviour that I don't understand.
If I call this method with the await keyword, the method in the Hub that is responsible for receiving the result from the client is never invoked.
/// This method gets called by the client to return the value of a specific invocation
public Task OnInvocationResult(InvocationId invocationId, object data)
{
ClientInvocationHelper._invocationResults[invocationId.Value] = data;
return Task.CompletedTask;
}
As a result, the while loop in AwaitInvocationResult never ends and the Hub is blocked.
Maybe someone can explain this behaviour to me so I can change my approach or improve my code.
As mentioned in the answer by Brennan, before ASP.NET Core 5.0 a SignalR connection was only able to handle one non-streaming hub method invocation at a time. And since your invocation was blocked, the server wasn't able to handle the next invocation.
But in this case you can probably try to handle client responses in a separate hub, like below.
public class InvocationResultHandlerHub : Hub
{
public Task HandleResult(int invocationId, string result)
{
InvoctionHelper.SetResult(invocationId, result);
return Task.CompletedTask;
}
}
While a hub method invocation is blocked, no other hub methods can be invoked over the caller's connection. But since the client has a separate connection for each hub, it will be able to invoke methods on other hubs. This is probably not the best way, because the client won't be able to reach the first hub until the response has been posted.
Another approach you can try is streaming invocations. Currently SignalR doesn't await them before handling the next message, so the server will handle invocations and other messages between streaming calls.
You can check this behavior here in the Invoke method; the invocation isn't awaited when it is a stream:
https://github.com/dotnet/aspnetcore/blob/c8994712d8c3c982111e4f1a09061998a81d68aa/src/SignalR/server/Core/src/Internal/DefaultHubDispatcher.cs#L371
So you can try to add some dummy streaming parameter that you will not use:
public async Task TriggerRequestWithResult(string resultToSend, IAsyncEnumerable<int> stream)
{
var invocationId = InvoctionHelper.ResolveInvocationId();
await Clients.Caller.SendAsync("returnProvidedString", invocationId, resultToSend);
var result = await InvoctionHelper.ActiveWaitForInvocationResult<string>(invocationId);
Debug.WriteLine(result);
}
and on the client side you will also need to create and populate this parameter:
var stringResult = document.getElementById("syncCallString").value;
var dummySubject = new signalR.Subject();
resultsConnection.invoke("TriggerRequestWithResult", stringResult, dummySubject);
dummySubject.complete();
More details: https://learn.microsoft.com/en-us/aspnet/core/signalr/streaming?view=aspnetcore-5.0
If you can use ASP.NET Core 5, you can try the new MaximumParallelInvocationsPerClient hub option. It allows several invocations to execute in parallel for one connection. But if your client calls too many hub methods without providing results, the connection will hang.
More details: https://learn.microsoft.com/en-us/aspnet/core/signalr/configuration?view=aspnetcore-5.0&tabs=dotnet
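For reference, a minimal configuration sketch (the value 5 is arbitrary):
services.AddSignalR(options =>
{
    // allow up to 5 hub method invocations to run in parallel per client connection
    // (available since ASP.NET Core 5.0)
    options.MaximumParallelInvocationsPerClient = 5;
});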
Actually, since returning values from client invocations isn't implemented by SignalR, maybe you can look into streams to return values to hubs?
This is now supported in .NET 7: https://learn.microsoft.com/en-us/aspnet/core/signalr/hubs?view=aspnetcore-7.0#client-results
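With the .NET 7 client-results feature linked above, the server can simply await the client, roughly like this (a sketch; "GetName" here stands for whatever handler the client registers):
public class SomeHub : Hub
{
    public async Task<string> AskClientForName(string connectionId, CancellationToken cancellationToken)
    {
        // InvokeAsync sends the invocation to the client and waits for the returned value
        var name = await Clients.Client(connectionId)
            .InvokeAsync<string>("GetName", cancellationToken);
        return name;
    }
}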
By default a client can only have one hub method running at a time on the server. This means that when you wait for a result in the first hub method, the second hub method will never run since the first hub method is blocking the processing loop.
It would be better if the OnInvocationResult method ran the logic from your AwaitInvocationResult extension, and the first hub method just registered the id and called the client.
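A rough sketch of that idea, using a TaskCompletionSource per invocation instead of a polling loop (PendingInvocations and the hub method names are made up for illustration):
public static class PendingInvocations
{
    private static readonly ConcurrentDictionary<string, TaskCompletionSource<string>> _pending =
        new ConcurrentDictionary<string, TaskCompletionSource<string>>();

    public static Task<string> Register(string invocationId)
    {
        var tcs = new TaskCompletionSource<string>(TaskCreationOptions.RunContinuationsAsynchronously);
        _pending[invocationId] = tcs;
        return tcs.Task;
    }

    public static void Complete(string invocationId, string result)
    {
        if (_pending.TryRemove(invocationId, out var tcs))
        {
            tcs.TrySetResult(result);
        }
    }
}

public class SomeHub : Hub
{
    // kick off the client call and return immediately - do not block the hub
    public async Task RequestName(string connectionId, string invocationId)
    {
        PendingInvocations.Register(invocationId);
        await Clients.Client(connectionId).SendAsync("GetName", invocationId);
    }

    // the client reports the result here; completing the pending task replaces
    // the polling loop that was holding up the connection's processing loop
    public Task OnInvocationResult(string invocationId, string data)
    {
        PendingInvocations.Complete(invocationId, data);
        return Task.CompletedTask;
    }
}
Whoever needs the value awaits the Task returned by Register, outside the hub method that started the call.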
I'm running into a problem sending a massive number of requests to a .NET Core web service. I'm using a SemaphoreSlim to limit the number of simultaneous requests. When I get a 10061 error (the web service has refused the connection), I want to dial back the number of simultaneous requests. My idea at the moment is to de-reference the SemaphoreSlim and create another:
await this.semaphoreSlim.WaitAsync().ConfigureAwait(false);
counter++;
Uri uri = new Uri($"{api}/{keyProperty}", UriKind.Relative);
string rowVersion = string.Empty;
try
{
HttpResponseMessage getResponse = await this.httpClient.GetAsync(uri).ConfigureAwait(false);
if (getResponse.IsSuccessStatusCode)
{
using (HttpContent httpContent = getResponse.Content)
{
JObject currentObject = JObject.Parse(await httpContent.ReadAsStringAsync().ConfigureAwait(false));
rowVersion = currentObject.Value<string>("rowVersion");
}
}
}
catch (HttpRequestException httpRequestException)
{
SocketException socketException = httpRequestException.InnerException as SocketException;
if (socketException != null && socketException.ErrorCode == PutHandler.ConnectionRefused)
{
this.semaphoreSlim = new SemaphoreSlim(counter * 90 / 100, counter * 90 / 100);
}
}
finally
{
this.semaphoreSlim.Release();
}
If I do this, what will happen to the other tasks that are waiting on the Semaphore that I just de-referenced? My guess is that nothing will happen until the object is garbage collected and disposed.
A SemaphoreSlim (just like any other object in .NET) will exist as long as there are references to it.
However, there is a bug in your code: the SemaphoreSlim being released is this.semaphoreSlim, and if this.semaphoreSlim is changed between being acquired and being released, then the code will release a different semaphore than the one that was acquired. To avoid this problem, copy this.semaphoreSlim into a local variable at the beginning of your method, and acquire and release that local variable.
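For example, a sketch of that local-copy fix applied to the code above:
var semaphore = this.semaphoreSlim;                 // capture the instance we are about to acquire
await semaphore.WaitAsync().ConfigureAwait(false);
try
{
    // ... make the HTTP call as before ...
}
finally
{
    // release the same instance that was acquired, even if this.semaphoreSlim
    // has been replaced in the meantime
    semaphore.Release();
}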
More broadly, there's a difficulty in the attempted solution. If you start 1000 tasks, they will all reference the old semaphore and ignore the updated this.semaphoreSlim. So you'd need a separate solution. For example, you could define a disposable "token" which is permission to call the API. Then have an asynchronous collection of these tokens (e.g., a Channel). This gives you full control over how many tokens are released at once.
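A rough sketch of that token idea, assuming System.Threading.Channels (ApiThrottle is a made-up name, not an existing type):
public sealed class ApiThrottle
{
    private readonly Channel<object> _tokens = Channel.CreateUnbounded<object>();

    public ApiThrottle(int initialCount)
    {
        // each token in the channel represents permission for one concurrent API call
        for (var i = 0; i < initialCount; i++)
        {
            _tokens.Writer.TryWrite(new object());
        }
    }

    // asynchronously wait for permission to call the API
    public ValueTask<object> AcquireAsync(CancellationToken cancellationToken = default) =>
        _tokens.Reader.ReadAsync(cancellationToken);

    // hand the token back, or drop it to shrink the effective concurrency limit
    // after a connection-refused error
    public void Release(object token, bool shrink = false)
    {
        if (!shrink)
        {
            _tokens.Writer.TryWrite(token);
        }
    }
}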
I am doing a little research to understand C#'s async / await better.
I found a web site with the following code, which shows how much slower synchronous processing is than async / await:
public IActionResult Index()
{
Stopwatch watch = new Stopwatch();
watch.Start();
ContentManagement service = new ContentManagement();
var content = service.GetContent();
var count = service.GetCount();
var name = service.GetName();
watch.Stop();
ViewBag.WatchMilliseconds = watch.ElapsedMilliseconds;
return View();
}
[HttpGet]
public async Task<ActionResult> IndexAsync()
{
Stopwatch watch = new Stopwatch();
watch.Start();
ContentManagement service = new ContentManagement();
var contentTask = service.GetContentAsync();
var countTask = service.GetCountAsync();
var nameTask = service.GetNameAsync();
var content = await contentTask;
var count = await countTask;
var name = await nameTask;
watch.Stop();
ViewBag.WatchMilliseconds = watch.ElapsedMilliseconds;
return View("Index");
}
public class ContentManagement
{
public string GetContent()
{
Thread.Sleep(2000);
return "content";
}
public int GetCount()
{
Thread.Sleep(5000);
return 4;
}
public string GetName()
{
Thread.Sleep(3000);
return "Matthew";
}
public async Task<string> GetContentAsync()
{
await Task.Delay(2000);
return "content";
}
public async Task<int> GetCountAsync()
{
await Task.Delay(5000);
return 4;
}
public async Task<string> GetNameAsync()
{
await Task.Delay(3000);
return "Matthew";
}
}
I understand the above code at a high level and why it performs faster.
What I don't understand is if threads are not being used, how is the processing running at the same time?
I have read in a couple of places that async / await does not create new threads to do the processing. So, what is async / await doing to allow processing to happen at the same time? The three await Task.Delay are running in parallel, correct? If it is not creating 3 threads, what is it doing?
I just want to understand what is happening at a high level.
Let me know.
Thanks in advance.
if threads are not being used, how is the processing running at the same time?
Threads let you parallelize computations on the same system. When communications or other I/O are involved, there is a different system with which your code communicates. When you initiate the task, the other system starts doing work. This happens in parallel to your system, which is free to do whatever else it needs to do until you await the task.
The three await Task.Delay are running in parallel, correct?
They are not exactly running; they are sleeping in parallel. Sleeping takes very little resources. That's why they appear to be "running" in parallel.
What I don't understand is if threads are not being used, how is the processing running at the same time?
You can think of it as an event firing when the operation is complete, as opposed to a thread being blocked until the operation is complete.
I have read in a couple of places that async / await does not create new threads to do the processing.
async and await do not; that is true. For more about how async and await work, see my intro post.
So, what is async / await doing to allow processing to happen at the same time?
One of the primary use cases of async/await is for I/O-based code. I have a long blog post that goes into the details of how asynchronous I/O does not require threads.
The three await Task.Delay are running in parallel, correct?
I prefer to use the term "concurrently", just to avoid confusion with Parallel and Parallel LINQ, both of which were created for CPU-bound parallelism and do not work as generally expected with async/await. So, I would say that both parallelism and asynchrony are forms of concurrency, and this is an example of asynchronous concurrency.
(That said, using the term "parallel" is certainly in concord with the common usage of the term).
If it is not creating 3 threads, what is it doing?
Task.Delay is not an I/O-based operation, but it is very similar to one. It uses timers underneath, so it's completely different than Thread.Sleep.
Thread.Sleep will block a thread - I believe it does go all the way to an OS Sleep call, which causes the OS to place the thread in a wait state until its sleep time has expired.
Task.Delay acts more like an I/O operation. So, it sets up a timer that fires off an event when the time expires. Timers are managed by the OS itself - as time proceeds forward (clock ticks on the CPU), the OS will notify the timer when it has completed. It's a bit more complex than that (for efficiency, .NET will coalesce managed timers), but that's the general idea.
So, the point is that there is no dedicated thread for each Task.Delay that is blocked.
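You can see this for yourself by awaiting the three asynchronous methods from the sample above together; the elapsed time is roughly that of the longest delay, and no thread sits blocked in the meantime (a sketch reusing the ContentManagement class from the question):
var watch = Stopwatch.StartNew();
var service = new ContentManagement();

// start all three operations; none of them blocks a thread while waiting
var contentTask = service.GetContentAsync(); // ~2 s
var countTask = service.GetCountAsync();     // ~5 s
var nameTask = service.GetNameAsync();       // ~3 s

await Task.WhenAll(contentTask, countTask, nameTask);

watch.Stop();
// prints roughly 5000 ms (the longest delay), not 10000 ms
Console.WriteLine($"Elapsed: {watch.ElapsedMilliseconds} ms");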
I have set up a SignalR hub which has the following method:
public void SomeFunction(int SomeID)
{
try
{
Thread.Sleep(600000);
Clients.Caller.sendComplete("Complete");
}
catch (Exception ex)
{
// Exception Handling
}
finally
{
// Some Actions
}
m_Logger.Trace("*****Trying To Exit*****");
}
The issue I am having is that SignalR initiates, defaults to Server-Sent Events, and then hangs. Even though the function/method exits minutes later (10 minutes), the method is initiated again (> 3 minutes), even when sendComplete and hub.stop() have already been called on the client. Should the user stay on the page, the initial "/send?" request stays open indefinitely. Any assistance is greatly appreciated.
To avoid blocking the method for so long, you could use a Task and call the client method asynchronously.
public void SomeFunction(Int32 id)
{
var connectionId = this.Context.ConnectionId;
Task.Delay(600000).ContinueWith(t =>
{
var message = String.Format("The operation has completed. The ID was: {0}.", id);
var context = GlobalHost.ConnectionManager.GetHubContext<SomeHub>();
context.Clients.Client(connectionId).SendComplete(message);
});
}
Hubs are created when a request arrives and destroyed after the response is sent down the wire, so in the continuation task you need to create a new context for yourself to be able to work with a client by their connection identifier, since the original hub instance will no longer be around to provide you with the Clients property.
Also note that you can leverage the nicer syntax that uses the async and await keywords for describing asynchronous program flow. See the examples in the ASP.NET site's SignalR Hubs API Guide.
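For instance, the same flow written with async/await might look roughly like this - note that, unlike the ContinueWith version, the hub invocation itself stays pending until the delay completes, although no thread is blocked while awaiting, and the hub instance (and thus Clients.Caller) remains available for the whole method:
public async Task SomeFunction(Int32 id)
{
    // asynchronously wait instead of blocking the thread with Thread.Sleep
    await Task.Delay(TimeSpan.FromMinutes(10));

    var message = String.Format("The operation has completed. The ID was: {0}.", id);
    Clients.Caller.SendComplete(message);
}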