Is there an explanation for this threading code? - asynchronous

So I have come across some code very similar to this, and I am just wondering if someone can explain it to me.
See how it uses an Rx scheduler, then Parallel.ForEach, and inside that a new Task.Factory.StartNew:
IDisposable subscription = someObservable.ObserveOn(ThreadPoolScheduler.Instance)
    .Subscribe(o =>
    {
        Parallel.ForEach(xxxs,
            x =>
            {
                var theKey = x.Key;
                if (!theTasks.ContainsKey(theKey) ||
                    (theTasks.ContainsKey(theKey) && theTasks[theKey].IsCompleted))
                {
                    theTasks[theKey] = Task.Factory.StartNew(
                        () =>
                        {
                            try
                            {
                                .....
                            }
                            catch (CommunicationObjectAbortedException ex)
                            {
                                ....
                            }
                            catch (ObjectDisposedException ex)
                            {
                                ....
                            }
                            catch (Exception e)
                            {
                                ....
                            }
                        });
                }
            });
    },
    ex =>
    {
        ....
    },
    () =>
    {
        ....
    });
I know what all of these things do individually, but I am not really sure what the combined threading effect is here. Can anyone hazard a guess?

Ah yes, the concurrency Turducken.
ThreadPoolScheduler schedules work on the thread pool, which is distinct from the task pool. ThreadPoolScheduler was meant to be used on platforms where a task pool was not available - prefer TaskPoolScheduler when possible.
It feels like the writer was trying to save up the task pool for only the task at hand (pardon the pun), by using the thread pool.
Parallel.ForEach blocks until the loop has completed. So the subscription runs on the thread pool, and each time a new item is emitted, the next ForEach runs on a borrowed thread-pool thread.
As for the inner bit, the writer wants one Task to run per unique key, if one isn't already running.
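As a rough illustration of that last point only (not the original author's code): the "one Task per key, if it isn't already running" bookkeeping is easier to get right with a ConcurrentDictionary, since a plain dictionary isn't safe to check and mutate from inside Parallel.ForEach. The HandleItem name and the use of Task.Run below are my own assumptions.

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

static class KeyedTaskRunner
{
    // Sketch only: at most one running Task per unique key.
    private static readonly ConcurrentDictionary<string, Task> theTasks =
        new ConcurrentDictionary<string, Task>();

    public static void HandleItem(string theKey, Action work)
    {
        theTasks.AddOrUpdate(
            theKey,
            _ => Task.Run(work),                 // no Task yet for this key: start one
            (_, existing) => existing.IsCompleted
                ? Task.Run(work)                 // previous Task finished: start a new one
                : existing);                     // still running: leave it alone
    }
}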

Related

Using SemaphoreSlim with Parallel.ForEach

This is what I am trying to achieve. Let's say I have a process which runs every minute and performs some I/O operations. I want 5 threads to execute simultaneously and do the operations. Suppose 2 of those threads take longer than a minute; when the process runs again after a minute, it should only execute 3 threads simultaneously, as 2 threads are still busy with their operations.
So, I used the combination of SemaphoreSlim and Parallel.ForEach. Please let me know if this is the correct way of achieving this, or whether there is a better way.
private static SemaphoreSlim _semaphoreSlim = new SemaphoreSlim(5);

private async Task ExecuteAsync()
{
    try
    {
        var availableThreads = _semaphoreSlim.CurrentCount;
        if (availableThreads > 0)
        {
            var lists = await _feedSourceService.GetListAsync(availableThreads); // select #top(availableThreads) * from table
            Parallel.ForEach(
                lists,
                new ParallelOptions
                {
                    MaxDegreeOfParallelism = availableThreads
                },
                async item =>
                {
                    await _semaphoreSlim.WaitAsync();
                    try
                    {
                        // I/O operations
                    }
                    finally
                    {
                        _semaphoreSlim.Release();
                    }
                });
        }
    }
    catch (Exception ex)
    {
        _logger.LogError(ex.Message, ex);
    }
}
Let's say I have a process which runs every minute and performs some I/O operations... Suppose 2 of those threads take longer than a minute; when the process runs again after a minute, it should only execute 3 threads simultaneously, as 2 threads are still busy with their operations.
This kind of problem description is somewhat common, but is surprisingly difficult to code correctly. This is because you have a polling-style timer (time based) that is trying to periodically adjust a throttling mechanism. Doing this correctly is quite difficult.
So, the first thing I'd recommend is to change the problem description. Consider having the polling mechanism read all outstanding work, and then use normal throttling from there (e.g., adding them to an execution-constrained ActionBlock); a sketch of that idea follows.
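A rough sketch of that simpler shape, assuming TPL Dataflow is available; FeedItem, ProcessItemAsync and the parameterless GetListAsync call are placeholders of mine, not the original API:

// ActionBlock lives in the System.Threading.Tasks.Dataflow namespace (TPL Dataflow package).
// The block itself caps concurrency at 5, so the polling timer only posts work.
private static readonly ActionBlock<FeedItem> _worker = new ActionBlock<FeedItem>(
    async item =>
    {
        await ProcessItemAsync(item); // the I/O operation (placeholder)
    },
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 5 });

private async Task PollAsync()
{
    // Read all outstanding work; items queue up inside the block and at most
    // five are processed at a time, even across timer ticks.
    var lists = await _feedSourceService.GetListAsync();
    foreach (var item in lists)
    {
        _worker.Post(item);
    }
}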
That said, if you'd prefer to continue down the more complex path, code like this would avoid the problem of using async delegates with Parallel.ForEach:
private static SemaphoreSlim _semaphoreSlim = new SemaphoreSlim(5);

private async Task ExecuteAsync()
{
    try
    {
        var availableThreads = _semaphoreSlim.CurrentCount;
        if (availableThreads > 0)
        {
            var lists = await _feedSourceService.GetListAsync(availableThreads); // select #top(availableThreads) * from table
            var tasks = lists.Select(
                async item =>
                {
                    await _semaphoreSlim.WaitAsync();
                    try
                    {
                        // I/O operations
                    }
                    finally
                    {
                        _semaphoreSlim.Release();
                    }
                }).ToList();
            await Task.WhenAll(tasks);
        }
    }
    catch (Exception ex)
    {
        _logger.LogError(ex.Message, ex);
    }
}

Continue code when awaited PushModalAsync Page is closed Xamarin.Forms

I have a method like this:
public async Task BtnLoad_OnClick()
{
    MediaPage galleryPage = new MediaPage();
    await Application.Current.MainPage.Navigation.PushModalAsync(galleryPage);
    try
    {
        // some logic here
    }
    catch (Exception ex)
    {
        //
    }
}
My intention was to open MediaPage() and wait until it is closed before the try{}catch{} that follows runs.
As of now, as soon as the MediaPage opens, the try{}catch{} is executed straight away, and this is not what I intend.
How can I wait until the page shown by PushModalAsync(galleryPage) is closed?
OK, I found the solution:
galleryPage.Disappearing += (sender2, e2) =>
{
    // after closing logic here
};
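If you want BtnLoad_OnClick itself to await the close rather than move the logic into the event handler, a common variation is to bridge the Disappearing event with a TaskCompletionSource. This is only a sketch built on the answer above, not code from the original post (and note that Disappearing can also fire when the app is backgrounded on some platforms):

public async Task BtnLoad_OnClick()
{
    MediaPage galleryPage = new MediaPage();

    // Complete a TaskCompletionSource when the modal page disappears,
    // so this method can await the page being closed.
    var closed = new TaskCompletionSource<bool>();
    galleryPage.Disappearing += (sender2, e2) => closed.TrySetResult(true);

    await Application.Current.MainPage.Navigation.PushModalAsync(galleryPage);
    await closed.Task; // resumes only after MediaPage has been dismissed

    try
    {
        // some logic here
    }
    catch (Exception ex)
    {
        //
    }
}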

Plugin BLE (v1.3.0) delay after characteristic.ReadAsync()

I'm developing an app and I want to read some characteristics one after another.
My issue is that after a read is done I must add a delay, otherwise I get an error.
Why does it need a delay? Is there a way to correctly write read tasks one after the other?
I'm using Xamarin.Forms and the BLE v1.3.0 plugin.
I've tried "await Task.Delay(200)" between two consecutive ReadAsync() calls and it works fine, but if I remove the delay, the second ReadAsync gets an exception.
private async Task<byte[]> ReadChr(ICharacteristic LocalCharacteristic)
{
    byte[] localData = { };
    if (LocalCharacteristic.CanRead)
    {
        try
        {
            return localData = await LocalCharacteristic.ReadAsync();
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex.Message);
            return null;
        }
    }
    return localData;
}
if (firstCharacteristic.CanRead)
{
    var ccc = await ReadChr(firstCharacteristic);
}

await Task.Delay(200);

if (secondCharacteristic.CanRead)
{
    var ddd = await ReadChr(secondCharacteristic);
}
I'm looking for something like polling the read status of the characteristic; a delay after ReadAsync does not seem like good coding practice.
Any ideas?

AsDocumentQuery HasMoreResults parallelism?

I have not yet faced a situation where a query for documents has more than one "set". I was wondering what would happen if, instead of
while (queryable.HasMoreResults)
{
    foreach (Book b in await queryable.ExecuteNextAsync<Book>())
    {
        // Iterate through books
    }
}
I use
ConcurrentBag<IPost> result = new ConcurrentBag<IPost>();
List<Task> tasks = new List<Task>();
var outQ = query.AsDocumentQuery<IPost>();
while (outQ.HasMoreResults)
{
    var parcialResult = outQ.ExecuteNextAsync<IPost>().ContinueWith((t) =>
    {
        foreach (var item in t.Result)
        {
            result.Add(item);
        }
    });
    tasks.Add(parcialResult);
}
return Task.WhenAll(tasks).ContinueWith((r) => { return result.AsEnumerable(); });
I'm under the impression that the second approach, being a parallel one, would yield more performance, since the partial queries would be executed concurrently... but I'm afraid that while (outQ.HasMoreResults) won't become false until all the async operations have finished...
Your concern regarding HasMoreResults not updating quickly enough is well-founded. You cannot dispatch an async operation and assume the object will immediately transition to the state expected of it at the end of the async operation. In this particular case you have no choice but to wait for the async operation to complete.
I assume that the library authors deliberately chose an API which does not lend itself to parallelisation - most likely because it encapsulates the kind of work which needs to be done serially.
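In other words, the loop has to stay serial: await each page before asking HasMoreResults again. Roughly, reusing the names from the question (a sketch, collecting into a list like the parallel attempt, but one page at a time):

var results = new List<IPost>();
var outQ = query.AsDocumentQuery<IPost>();
while (outQ.HasMoreResults)
{
    // Await the page before re-checking HasMoreResults; the query object
    // only advances once ExecuteNextAsync has actually completed.
    foreach (var item in await outQ.ExecuteNextAsync<IPost>())
    {
        results.Add(item);
    }
}
return results;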

SignalR Start()'s task is continuing even if connection is not possible

I have a C# SignalR client and I want to perform some actions upon success/failure of the connection to my server. Here is my code:
this.connection.Start().ContinueWith(task =>
{
    if (task.IsFaulted)
    {
        this.OnRaiseServerConnectionClosedEvent();
    }
    else
    {
        this.JoinGroup();
        this.StopTimer();
        this.OnRaiseServerConnectionOpenedEvent();
    }
});
The else block is always executed, regardless of whether a server is there or not...
I have also tried with await or with Wait(), but it's the same scenario.
I think I understand .NET tasks correctly, but here I am stuck.
Right now my code looks like this:
try
{
    this.connection.Start().Wait();
    if (this.connection.State == ConnectionState.Connected)
    {
        this.JoinGroup();
        this.StopTimer();
        this.OnRaiseServerConnectionOpenedEvent();
    }
}
catch (AggregateException)
{
    this.OnRaiseServerConnectionClosedEvent();
}
catch (InvalidOperationException)
{
    this.OnRaiseServerConnectionClosedEvent();
}
When no server is present, the task created by the Start() method returns without fault and with the status Connecting. You have to check the state of the connection if you want to follow up with some actions or retry connecting.
The task you are receiving from Connection.Start is likely cancelled due to a timeout instead of being faulted. This should be an easy fix:
this.connection.Start().ContinueWith(task =>
{
    if (task.IsFaulted || task.IsCanceled)
    {
        this.OnRaiseServerConnectionClosedEvent();
    }
    else
    {
        this.JoinGroup();
        this.StopTimer();
        this.OnRaiseServerConnectionOpenedEvent();
    }
});
If you use Wait() instead of ContinueWith, an AggregateException containing an OperationCanceledException in its InnerExceptions collection will be thrown when the task is canceled.
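For example, a sketch of the Wait() variant with an exception filter for that case; the filter and the second catch block are my additions, not code from the original posts:

try
{
    this.connection.Start().Wait();
    // Wait() returning normally still doesn't guarantee a connection,
    // so the state check from the question remains useful.
    if (this.connection.State == ConnectionState.Connected)
    {
        this.JoinGroup();
        this.StopTimer();
        this.OnRaiseServerConnectionOpenedEvent();
    }
}
catch (AggregateException ex)
    when (ex.InnerExceptions.Any(e => e is OperationCanceledException)) // requires System.Linq
{
    // The Start() task was canceled (e.g. it timed out) rather than faulted.
    this.OnRaiseServerConnectionClosedEvent();
}
catch (AggregateException)
{
    // Faulted for some other reason.
    this.OnRaiseServerConnectionClosedEvent();
}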
