Tornado async endpoint does not work - asynchronous

I have two microservices:
Service a:
Tornado service with two endpoints, /foo and /bar
/foo
async def get(...):
    x = await test()
    return x

async def test():
    y = call to service b, FooBar rpc
    return y

/bar
def get(...):
    return True
Service b:
gRPC service with rpc FooBar
rpc FooBar
def FooBar(...):
    return requests.get("/bar")
If a client hits the /foo endpoint in service a:
the code hits the FooBar rpc in service b
the FooBar rpc can't hit the /bar endpoint in service a, as that service is blocked.
AFAIK, using x = await test() should prevent such blocking. What have I missed?

Since the rpc calls aren't async, they will block the Tornado process.
You can avoid blocking the main process by running the rpc calls in a separate thread.
First, make test() a regular function, not a coroutine (remove the async keyword).
Example code:
from tornado.ioloop import IOLoop

async def get(...):
    # run the blocking calls on a thread-pool executor so the IOLoop stays free
    x = await IOLoop.current().run_in_executor(None, test)
    return x

# regular function, not a coroutine
def test(...):
    # make the blocking rpc calls here
    return x

Related

SignalR return value from client method

Hello, I'm developing a server-client application that communicates with SignalR. What I have to implement is a mechanism that will allow my server to call a method on the client and get the result of that call. Both applications are developed with .NET Core.
My concept is: the server invokes a method on the client, providing the Id of that invocation; the client executes the method and in response calls a method on the server with the method result and the provided Id, so the server can match the invocation with the result.
Usage looks like this:
var invocationResult = await Clients
    .Client(connectionId)
    .GetName(id)
    .AwaitInvocationResult<string>(ClientInvocationHelper._invocationResults, id);
AwaitInvocationResult is an extension method on Task:
public static Task<TResultType> AwaitInvocationResult<TResultType>(this Task invoke, ConcurrentDictionary<string, object> lookupDirectory, InvocationId id)
{
    return Task.Run(() =>
    {
        while (!ClientInvocationHelper._invocationResults.ContainsKey(id.Value)
               || ClientInvocationHelper._invocationResults[id.Value] == null)
        {
            Thread.Sleep(500);
        }

        try
        {
            object data;
            var stingifyData = lookupDirectory[id.Value].ToString();

            // First we should check if invocation response contains exception
            if (IsClientInvocationException(stingifyData, out ClientInvocationException exception))
            {
                throw exception;
            }

            if (typeof(TResultType) == typeof(string))
            {
                data = lookupDirectory[id.Value].ToString();
            }
            else
            {
                data = JsonConvert.DeserializeObject<TResultType>(stingifyData);
            }

            var result = (TResultType)data;
            return Task.FromResult(result);
        }
        catch (Exception e)
        {
            Console.WriteLine(e);
            throw;
        }
    });
}
As you can see, basically I have a dictionary where the key is the invocation Id and the value is the result of that invocation, which the client reports. In a while loop I check whether the result is already available for the server to consume; if it is, the result is converted to the specific type.
This mechanism is working pretty well, but I'm observing weird behaviour that I don't understand.
If I call this method with the await modifier, the method in the Hub that is responsible for receiving the result from the client is never invoked.
/// This method gets called by the client to return the value of a specific invocation
public Task OnInvocationResult(InvocationId invocationId, object data)
{
    ClientInvocationHelper._invocationResults[invocationId.Value] = data;
    return Task.CompletedTask;
}
As a result, the while loop in AwaitInvocationResult never ends and the Hub is blocked.
Maybe someone can explain this behaviour to me so I can change my approach or improve my code.
As mentioned in the answer by Brennan, before ASP.NET Core 5.0 a SignalR connection was only able to handle one non-streaming hub method invocation at a time. And since your invocation was blocked, the server wasn't able to handle the next invocation.
But in this case you can probably try to handle client responses in a separate hub, like below.
public class InvocationResultHandlerHub : Hub
{
    public Task HandleResult(int invocationId, string result)
    {
        InvoctionHelper.SetResult(invocationId, result);
        return Task.CompletedTask;
    }
}
While a hub method invocation is blocked, no other hub methods can be invoked over the caller's connection. But since the client has a separate connection for each hub, it will be able to invoke methods on other hubs. Probably not the best way, because the client won't be able to reach the first hub until the response has been posted.
Another way you can try is streaming invocations. Currently SignalR doesn't await them before handling the next message, so the server will handle invocations and other messages between streaming calls.
You can check this behavior in the Invoke method here; the invocation isn't awaited when it is a stream:
https://github.com/dotnet/aspnetcore/blob/c8994712d8c3c982111e4f1a09061998a81d68aa/src/SignalR/server/Core/src/Internal/DefaultHubDispatcher.cs#L371
So you can try to add some dummy streaming parameter that you will not use:
public async Task TriggerRequestWithResult(string resultToSend, IAsyncEnumerable<int> stream)
{
    var invocationId = InvoctionHelper.ResolveInvocationId();
    await Clients.Caller.SendAsync("returnProvidedString", invocationId, resultToSend);
    var result = await InvoctionHelper.ActiveWaitForInvocationResult<string>(invocationId);
    Debug.WriteLine(result);
}
and on the client side you will also need to create and populate this parameter:
var stringResult = document.getElementById("syncCallString").value;
var dummySubject = new signalR.Subject();
resultsConnection.invoke("TriggerRequestWithResult", stringResult, dummySubject);
dummySubject.complete();
More details: https://learn.microsoft.com/en-us/aspnet/core/signalr/streaming?view=aspnetcore-5.0
If you can use ASP.NET Core 5, you can try the new MaximumParallelInvocationsPerClient hub option. It allows several invocations to execute in parallel for one connection. But if your client calls too many hub methods without providing a result, the connection will hang.
More details: https://learn.microsoft.com/en-us/aspnet/core/signalr/configuration?view=aspnetcore-5.0&tabs=dotnet
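For illustration, a minimal sketch of enabling that option in Startup.ConfigureServices (the value 5 is an arbitrary example, not a recommendation):
services.AddSignalR(hubOptions =>
{
    // allow up to 5 invocations from the same connection to run in parallel
    // (the default of 1 is what serializes hub method processing per connection)
    hubOptions.MaximumParallelInvocationsPerClient = 5;
});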
Actually, since returning values from client invocations isn't implemented by SignalR, maybe you can look into streams to return values to hubs?
This is now supported in .NET 7: https://learn.microsoft.com/en-us/aspnet/core/signalr/hubs?view=aspnetcore-7.0#client-results
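For reference, a rough sketch of the .NET 7 client-results API inside a Hub class (the hub method name RequestName and the client method "GetName" are illustrative, mirroring the question):
public async Task<string> RequestName(string connectionId)
{
    // InvokeAsync sends "GetName" to the specified client connection and awaits its
    // return value, so no manual invocation-id bookkeeping is needed
    return await Clients.Client(connectionId)
        .InvokeAsync<string>("GetName", CancellationToken.None);
}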
By default a client can only have one hub method running at a time on the server. This means that when you wait for a result in the first hub method, the second hub method will never run since the first hub method is blocking the processing loop.
It would be better if the OnInvocationResult method ran the logic in your AwaitInvocationResult extension and the first hub method just registers the id and calls the client.
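A minimal sketch of that restructuring, under the assumption that the invocation id is a string key as in the question's dictionary (class and method names here are hypothetical):
using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class InvocationHub : Hub
{
    private static readonly ConcurrentDictionary<string, bool> _pending =
        new ConcurrentDictionary<string, bool>();

    // First hub method: just register the id and call the client, then return immediately,
    // so the connection's processing loop is never blocked.
    public async Task RequestName(string connectionId, string invocationId)
    {
        _pending[invocationId] = true;
        await Clients.Client(connectionId).SendAsync("GetName", invocationId);
    }

    // The client reports back here; the result-handling logic that previously lived in
    // AwaitInvocationResult (deserialization, continuing the workflow) runs here instead.
    public Task OnInvocationResult(string invocationId, object data)
    {
        if (_pending.TryRemove(invocationId, out _))
        {
            // process `data` for this invocation
        }
        return Task.CompletedTask;
    }
}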

FastAPI Async Def calling common function

I want to wrap the upload part in a common function, since it is used in multiple API routes, but how do I do that, given that it uses async def here?
@app.post("/api/od")
async def image_classification(files: typing.List[fastapi.UploadFile] = fastapi.File(...)):
    upload_path = pathlib.Path("upload")  # .joinpath(token)
    upload_path.mkdir(exist_ok=True)
    ...
    return results

Making Azure Function signalR payloads camel case

We have an HttpTriggered function, as shown in the following code snippet:
[FunctionName("commandcompleted")]
public static Task SendMessage(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = "commandcompleted/{userId}")]
    object message,
    string userId,
    [SignalR(HubName = Negotitate.HubName)] IAsyncCollector<SignalRMessage> signalRMessages,
    ILogger log)
{
    return signalRMessages.AddAsync(
        new SignalRMessage
        {
            UserId = userId,
            Target = "CommandCompleted",
            Arguments = new[] { message }
        });
}
The client app, which is in fact a SignalR client, receives a notification upon completion of an operation once the mentioned trigger is invoked.
It's observed that the payload received by the client app is always in Pascal Case. How can we augment the function's code so that it broadcasts the payload in camel case format? Please note that decorating the object's properties with [JsonProperty("camelCasePropertyName")] is not an option and we'd like to avoid it.
The application tier which prepares the message object must take care of serializing it in camel case format before submitting it to the http-triggered function.
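For example, assuming the caller uses Newtonsoft.Json and an HttpClient instance, a minimal sketch of serializing the payload with camel-case property names before posting it to the function (the URL is a placeholder based on the route above):
using System.Text;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

var settings = new JsonSerializerSettings
{
    ContractResolver = new CamelCasePropertyNamesContractResolver()
};

// serialize with camel-case names so the function receives (and forwards) the payload as-is
var payload = JsonConvert.SerializeObject(message, settings);
var content = new StringContent(payload, Encoding.UTF8, "application/json");
await httpClient.PostAsync($"https://<function-app>/api/commandcompleted/{userId}", content);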

Must a gRPC service method have exactly one input parameter and one return value?

Let's say I have a proto file like this. Can I define a service like this?
rpc SayHello () returns (Response) {} // service has no input
rpc SayHello (Request1, Request2) returns (Response) {} // service has two inputs
//.proto file
syntax = "proto3";

service Greeter {
  rpc SayHello (Request) returns (Response) {}
}

message Request {
  string request = 1;
}

message Response {
  string response = 1;
}
gRPC service methods have exactly one input message and exactly one output message. Typically, these messages are used as input and output to only one method. This is on purpose, as it allows easily adding new parameters later (to the messages) while maintaining backward compatibility.
If you don't want any input or output parameters, you can use the well-known proto google.protobuf.Empty. However, this is discouraged as it prevents you from adding parameters to the method in the future. Instead, you would be encouraged to follow the normal practice of having a message for the request, but simply with no contents:
service Greeter {
  rpc SayHello (SayHelloRequest) returns (SayHelloResponse) {}
}

message SayHelloRequest {} // service has no input
Similarly, if you want two request parameters, just include both in the request message:
message SayHelloRequest { // service has two inputs
  string request = 1;
  string anotherRequestParam = 2;
}

Asynchronous action methods and IO completion ports

One of the reasons it is important to use asynchronous programming when our application relies on external services is that it lets ASP.NET use IO completion ports: rather than blocking a thread while waiting for the external service to respond, ASP.NET can park the execution in an IO completion port and use the thread to serve another request; whenever the external service responds, ASP.NET picks that execution up again and resumes it. This way, no thread is blocked.
An example of an asynchronous method would be:
[HttpPost]
public async Task<ActionResult> Open(String key)
{
    Foo foo = await _externalService.GetFoo(key);
    return View(foo);
}
But what happens if we make multiple requests to external services? How does ASP.NET handle it?
[HttpPost]
public async Task<ActionResult> Open()
{
    List<Task<Foo>> tasks = new List<Task<Foo>>();
    foreach (var key in this.Request.Form.AllKeys)
        tasks.Add(_externalService.GetFoo(key));

    var foos = await Task.WhenAll(tasks);

    Foo foo = null;
    foreach (var f in foos)
    {
        if (foo == null && f != null)
            foo = f;
        else
            foo.Merge(f);
    }
    return View(foo);
}
Is it still using IO completion ports? Or is Task.WhenAll blocking a thread?
It still uses I/O completion ports. Task.WhenAll is asynchronous and does not block a thread: await Task.WhenAll(tasks) asynchronously waits for all the tasks to complete, so the request thread is released back to the pool in the meantime.
