Can we execute another process while MPI_Waitall is in progress - mpi

I would like to execute some other work while MPI_Waitall is blocking. Is that possible?
recv(....)
send(....)
stream(..) // the process I want to perform while waitall is in progress
MPI_Waitall(...)
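For reference, the pattern sketched above maps onto MPI's non-blocking calls roughly as follows (a minimal C/C++ sketch; the buffer names, count and peer rank are placeholders, and stream() stands for the independent work). Note that MPI_Waitall itself blocks the calling process, so any work that should overlap the communication has to run before the wait, or be interleaved with MPI_Testall in a polling loop:

#include <mpi.h>

/* stream() stands for the independent work from the question; placeholder. */
static void stream(void) { /* ... */ }

/* Minimal overlap sketch: post the non-blocking calls, do the work, then wait. */
void exchange(double *recvbuf, double *sendbuf, int count, int peer)
{
    MPI_Request reqs[2];

    /* Post the non-blocking receive and send; both return immediately. */
    MPI_Irecv(recvbuf, count, MPI_DOUBLE, peer, 0, MPI_COMM_WORLD, &reqs[0]);
    MPI_Isend(sendbuf, count, MPI_DOUBLE, peer, 0, MPI_COMM_WORLD, &reqs[1]);

    /* MPI_Waitall blocks this process, so the work that should overlap the
       communication has to run here, before the wait (or be interleaved with
       MPI_Testall in a loop if it must keep running until completion). */
    stream();

    /* Now block until both transfers have completed. */
    MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);
}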

Related

Cancel QThread in PyQt5

I have a GUI in PyQt5, which starts a QThread that reads from a serial port. The thread does quit when it has read all the data, but I want to be able to stop it when I click on a stop button. How do I do that? Here is the basic code:
# ...
class Worker(QObject):
    finished = pyqtSignal()
    progress = pyqtSignal(list)

    def __init__(self):
        QObject.__init__(self)
        self._Reader = Reader()
        self._Reader.progress = self.progress
        self._Reader.finished = self.finished

    def run(self):
        self._Reader.read()


class Ui(QtWidgets.QMainWindow):
    # ...
    def startClicked(self):
        self.thread = QThread()
        self.worker = Worker()
        self.worker.moveToThread(self.thread)
        self.thread.started.connect(self.worker.run)
        self.worker.finished.connect(self.thread.quit)
        self.worker.finished.connect(self.worker.deleteLater)
        self.worker.finished.connect(self.workerFinished)
        self.thread.finished.connect(self.thread.deleteLater)
        self.worker.progress.connect(self.reportProgress)
        self.thread.start()

    def stopClicked(self):
        # How do I stop the thread?
        pass
When managing threads you can do as stated in the docs here: https://doc.qt.io/qt-5/qthread.html
You can stop the thread by calling exit() or quit().
https://doc.qt.io/qt-5/qthread.html#exit
exit:
Tells the thread's event loop to exit with a return code.
After calling this function, the thread leaves the event loop and returns from the call to QEventLoop::exec(). The QEventLoop::exec() function returns returnCode.
By convention, a returnCode of 0 means success, any non-zero value indicates an error.
https://doc.qt.io/qt-5/qthread.html#quit
quit:
Tells the thread's event loop to exit with return code 0 (success). Equivalent to calling QThread::exit(0).
This function does nothing if the thread does not have an event loop.
I assume that you read data in some data processing loop. If this assumption is wrong, then the following is not valid, of course.
You cannot call the secondary thread's quit() directly from the main thread and expect that the secondary thread will process it immediately and quit. The reason is that the thread is busy reading the data in the data processing loop, so you need to break the data processing loop in the secondary thread to make the event loop idle.
(Btw. do not confuse the data processing loop with the event loop. The data processing loop is the one you have written yourself to read data from the port. The event loop is the loop created by Qt automatically after you called QThread::start(), and it is what processes events, signals and slots in the secondary thread. This event loop is blocked while your data processing loop is running.)
In order to break the data processing loop, you need to do two things:
call QThread::requestInterruption() from the main thread in response to some "Abort"/"Stop" button being pressed (do not worry about thread safety, requesting an interruption is thread-safe/atomic)
within the loop in the secondary thread, periodically check QThread::isInterruptionRequested(); if it returns true, break the loop and emit the worker's finished() signal
Once you have broken out of the data processing loop in the secondary thread, the event loop in the secondary thread becomes free to process signals sent from the main thread.
I can see in your code that the worker's finished() signal is connected to QThread::quit(). So emitting finished() from the secondary thread (after breaking out of the data processing loop) calls the thread's quit(), which is processed by the now-idle event loop of the secondary thread. That quits the event loop and subsequently the thread, and if everything is connected correctly it also deletes the worker and the thread (though I have not checked this part of your code).
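For illustration, here is a minimal C++ sketch of that interruption pattern, using placeholder names (Worker, readChunkFromPort()); PyQt5 exposes the same QThread methods, so the idea translates directly:

#include <QObject>
#include <QThread>

// Worker and readChunkFromPort() are placeholders standing in for the Reader logic.
class Worker : public QObject
{
    Q_OBJECT
public slots:
    void run()
    {
        // The data processing loop: keep going until an interruption is requested
        // from the main thread (a real loop would also stop once all data is read).
        while (!QThread::currentThread()->isInterruptionRequested()) {
            readChunkFromPort();
        }
        emit finished();   // the now-idle event loop will then process thread->quit()
    }
signals:
    void finished();
private:
    void readChunkFromPort() { /* read one chunk from the serial port */ }
};

// In the GUI thread, the "Stop" button handler only requests the interruption:
//     void Ui::stopClicked() { thread->requestInterruption(); }

In PyQt5 terms, the check inside Reader.read()'s loop would be QThread.currentThread().isInterruptionRequested(), and stopClicked() would call self.thread.requestInterruption().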

Golang concurrent R/W to database

I'm writing some Go software that is responsible for downloading and parsing a large number of JSON files and writing that parsed data to a sqlite database. My current design has 10 go routines simultaneously downloading/parsing these JSONs and communicating them to another go routine whose sole job is to listen on a specific channel and write the channel contents to the DB.
The system does some additional read operations after all writing should have been completed, which leads to an issue where queries return incorrect results because not all of the data has been written to the table. Because the JSON data I'm pulling is dynamic, I have no easy way to know when all the data has been written.
I've considered two possibilities for solving this, though I'm not super happy with either solution:
Listen on the channel and wait for it to be empty. This should work in principle; however, it does not ensure that the data has been written, only that it has been received on the channel.
Synchronize access to the DB. This again should work in principle; however, I would still need to ensure the query operation runs after all the write operations.
Are there any other design decisions I should consider to rectify this issue? For reference, the libraries I'm using to pull this data are go-colly and go-sqlite3. Appreciate all the help!
You can use a sync.WaitGroup
e.g.
package main

import "sync"

// JobInfo is a placeholder for whatever data a single job needs.
type JobInfo struct{}

func main() {
	// Some sort of job queue for your workers to process. This job queue should be closed by the
	// process that populates it with items. Once the job channel is closed, any for loops ranging
	// over the channel will read the remaining items and then break.
	jobChan := make(chan JobInfo)

	// Populate the job queue from its own goroutine so the sends on the unbuffered channel
	// don't block main before the workers are running; close the channel once everything
	// has been queued.
	go func() {
		// Populate the job queue here...
		// ...
		close(jobChan)
	}()

	// Number of concurrent workers.
	workerCount := 10

	// Initialize the WaitGroup.
	wg := sync.WaitGroup{}
	wg.Add(workerCount)

	// Create the worker goroutines.
	for i := 0; i < workerCount; i++ {
		go func() {
			// When jobChan is closed and no more jobs are available, the for loop exits,
			// wg.Done() is called, and the anonymous function returns.
			for job := range jobChan {
				// Process job.
				_ = job
			}
			wg.Done()
		}()
	}

	// Wait for all workers to call wg.Done().
	wg.Wait()

	// Whatever you want to do after all queue items have been processed goes here.
	// ...
}

How to wait for all completed AsyncOperations on app close

I'm sending messages using asynchronous write operations. When the app is closed I need to write 2 messages to a device, but only 1 message gets successfully written.
Each write operation is chained to a message queue, so write operations are issued sequentially, each one after the previous write completes, while the message queue is non-empty. Basically, the following code reaches the first Completed callback, but it never reaches the second one before the app closes. I tried adding a Windows Sleep call between and after the async operations, but this didn't work. I also tried waiting for the completion callback in a while loop to see if the 2nd operation ever completes, which it never does.
ComPtr<IAsyncOperation<GattCommunicationStatus>> writeOp;
GattWriteOption option = GattWriteOption_WriteWithoutResponse;
hr = customCharacteristic->WriteValueWithOptionAsync(buffer.Get(), option, &writeOp);
hr = writeOp->put_Completed(Callback<IAsyncOperationCompletedHandler<GattCommunicationStatus>>(
         this, &DataGloveBluetooth::OnCharacteristicWriteComplete).Get());
This is only an issue on app close, as the 2nd operation's callback is never reached then. Also, the messages are too large to combine into one, so I need to be able to send more than 1 message.
Is there a proper way I can wait? This is some pseudo-code to help explain the ordering:
WriteMessage(LED_RESET); // adds to message queue, then calls the async op
WriteMessage(CLOSE);     // adds to message queue; its async op is called once the first message is sent; the Completed callback is never reached
Sleep(5000);             // no sleep amount ever helps the 2nd message finish
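One approach that avoids a fixed Sleep() is to signal a Win32 event from the Completed handler of the last queued write and wait on that event during shutdown. The sketch below is only an illustration, not the original code: g_lastWriteCompleted, RegisterShutdownHandler and the 2-second timeout are made-up names/values, and the exact ABI headers/namespaces may need adjusting for your project.

#include <windows.h>
#include <wrl.h>
#include <windows.devices.bluetooth.genericattributeprofile.h>

using namespace Microsoft::WRL;
using namespace ABI::Windows::Foundation;
using namespace ABI::Windows::Devices::Bluetooth::GenericAttributeProfile;

// Placeholder event signalled by the Completed handler of the last queued write.
static HANDLE g_lastWriteCompleted = CreateEventW(nullptr, FALSE, FALSE, nullptr);

// Register a Completed handler that signals the event (used for the final message,
// in place of the member-function callback shown in the question).
HRESULT RegisterShutdownHandler(IAsyncOperation<GattCommunicationStatus>* writeOp)
{
    return writeOp->put_Completed(
        Callback<IAsyncOperationCompletedHandler<GattCommunicationStatus>>(
            [](IAsyncOperation<GattCommunicationStatus>*, AsyncStatus) -> HRESULT
            {
                SetEvent(g_lastWriteCompleted);   // the write finished (or failed)
                return S_OK;
            }).Get());
}

// During app shutdown, wait for that event (with a timeout) instead of sleeping:
//     WaitForSingleObject(g_lastWriteCompleted, 2000);

One caveat: if this wait runs on an STA/UI thread, a plain blocking wait may itself keep the completion callback from being dispatched; in that case CoWaitForMultipleHandles, or doing the wait on a worker thread, may be the safer choice.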

Sleep for x seconds before running next operation

I have been trying in various ways to make my program sleep for 10 seconds before running the next line of code.
this.SetContentView (Resource_Layout.Main)
let timer = new System.Timers.Timer(10000.0)
async{do timer.Start()}
this.SetContentView (Resource_Layout.next)
I can't get any solution to work.
If you want to use async rather than the more direct way (of creating a timer and setting the content view in the event handler of the timer), then you need something like this:
this.SetContentView (Resource_Layout.Main)
async {
    do! Async.Sleep(10000)
    this.SetContentView (Resource_Layout.next) }
|> Async.StartImmediate
The key points:
Using do! Async.Sleep you can suspend the asynchronous computation for the given number of milliseconds (note that Async.Sleep takes an int) without blocking a thread
By moving the SetContentView call inside the async, it will happen after the sleep
Using Async.StartImmediate, you start the workflow on the current thread, and the continuation after the sleep resumes in the same threading context (meaning that it runs on the UI thread and the code can access UI elements).

Asynchronous Execution of UPDATEs without wait

There is code in the Data User Manual about asynchronous execution:
Statement stmt = (ses << "SELECT (age) FROM Person", into(age), async); // asynchronous statement
Statement::Result result = stmt.execute(); // executes asynchronously
stmt.execute(); // throws InvalidAccessException
The point is that we should always call wait on the result before executing the next request.
What about the case when I don't expect any answer to the requests from the DB? Is it safe to asynchronously call a chain of UPDATEs without waiting for the result?
The reason for the exception is that the statement is still executing. If you want to issue updates in parallel, you need separate Statement objects.
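For illustration, here is a minimal sketch of the "separate Statement objects" approach with POCO Data and SQLite. The connection string, table and column names are made up, and depending on the backend's thread-safety guarantees it may be safer to give each asynchronous statement its own Session:

#include "Poco/Data/Session.h"
#include "Poco/Data/Statement.h"
#include "Poco/Data/SQLite/Connector.h"

using namespace Poco::Data::Keywords;   // brings in async, use, into, now
using Poco::Data::Session;
using Poco::Data::Statement;

int main()
{
    Poco::Data::SQLite::Connector::registerConnector();
    Session ses("SQLite", "sample.db");

    // Each asynchronous statement gets its own Statement object, so neither
    // execute() call trips over a statement that is still running.
    Statement upd1 = (ses << "UPDATE Person SET age = age + 1 WHERE name = 'Alice'", async);
    Statement upd2 = (ses << "UPDATE Person SET age = age + 1 WHERE name = 'Bob'", async);

    upd1.execute();   // returns immediately
    upd2.execute();   // returns immediately

    // Even for UPDATEs with no result set, wait before reusing a statement
    // or letting it go out of scope / closing the session.
    upd1.wait();
    upd2.wait();
}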
