Okay, so I have a controller method which needs to make a bunch of SOAP calls to an external service, each one quite heavy. I am trying to run them in parallel to save some time, but unless I build the async calls from GlobalScope, the deferreds are resolved in sequence. Let me show you.
Executing the following code
@ResponseBody
@GetMapping(path = ["/buildSoapCall"])
fun searchStations(): String = runBlocking {
    val travels: List<Travel> = service.getTravels().take(500)
    val deferred = travels
        .map {
            async {
                print("START")
                val result = service.executeSoapCall(it)
                print("END")
                result
            }
        }
    println("Finished deferred")
    val callResults = deferred.awaitAll()
    println("Finished Awaiting")
    ""
}
gives me the following console output:
Finished deferred
START-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-END.....
The "-" is printed by executeSoapCall.
As you can see, the deferreds are executed in sequence.
But if I use GlobalScope, like this:
@ResponseBody
@GetMapping(path = ["/buildSoapCall"])
fun searchStations(): String = runBlocking {
    val travels: List<Travel> = service.getTravels().take(500)
    val deferred = travels
        .map {
            GlobalScope.async {
                print("START")
                val result = service.executeSoapCall(it)
                print("END")
                result
            }
        }
    println("Finished deferred")
    val callResults = deferred.awaitAll()
    println("Finished Awaiting")
    ""
}
I get the following console output:
STARTSTARTSTARTSTARTSTARTSTARTSTARTSTARTSTARTSTARTSTARTFinished deferred
START-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART--ENDENDSTARTSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-ENDSTART-END...START-END-END-END-END-END-END-END-END-END-END-END-ENDFinished Awaiting
showing that the deferreds all start in parallel. In addition, the overall processing time is much shorter.
I don't really understand why I have this behaviour.
Your call to service.executeSoapCall blocks the thread the runBlocking coroutine is running on. You need to start each async coroutine on a different thread to get concurrent behavior. You can achieve that by using a thread pool, e.g. Dispatchers.IO:
...
async(Dispatchers.IO) {
    print("START")
    val result = service.executeSoapCall(it)
    print("END")
    result
}
...
or by creating a new thread for every call:
...
async(newSingleThreadContext("MyThread")) {
    print("START")
    val result = service.executeSoapCall(it)
    print("END")
    result
}
...
GlobalScope works because it uses a thread pool by default, but you should avoid using it. You can read this article by Roman Elizarov about that topic.
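Putting the two snippets together, a minimal sketch of the corrected controller method, reusing the service and Travel type from the question (the Spring annotations are taken verbatim from it):

import kotlinx.coroutines.*

@ResponseBody
@GetMapping(path = ["/buildSoapCall"])
fun searchStations(): String = runBlocking {
    val travels: List<Travel> = service.getTravels().take(500)
    // Each blocking SOAP call now runs on a thread from the IO pool,
    // so the calls overlap instead of queuing on runBlocking's single thread
    val deferred = travels.map { travel ->
        async(Dispatchers.IO) {
            service.executeSoapCall(travel)
        }
    }
    deferred.awaitAll()
    ""
}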
Firebase methods run on a worker thread automatically, but I have used coroutines and callbackFlow to implement the Firebase listener code synchronously, i.e. to get a return value from the listener.
Below is the code I described.
Coroutine await with Firebase for a one-shot call:
override suspend fun checkNickName(nickName: String): Results<Int> {
    lateinit var result: Results<Int>
    fireStore.collection("database")
        .document("user")
        .get()
        .addOnCompleteListener { document ->
            if (document.isSuccessful) {
                val list = document.result.data?.get("nickNameList") as List<String>
                if (list.contains(nickName))
                    result = Results.Exist(1)
                else
                    result = Results.No(0)
                //document.getResult().get("nickNameList")
            } else {
            }
        }.await()
    return result
}
callbackFlow with a Firebase listener:
override fun getOwnUser(): Flow<UserEntity> = callbackFlow {
    val document = fireStore.collection("database/user/userList/")
        .document("test!!!!!")
    val subscription = document.addSnapshotListener { snapshot, _ ->
        if (snapshot!!.exists()) {
            val ownUser = snapshot.toObject<UserEntity>()
            if (ownUser != null) {
                trySend(ownUser)
            }
        }
    }
    awaitClose { subscription.remove() }
}
So I am wondering whether these approaches are good or bad practice, and why.
Do not combine addOnCompleteListener with coroutines await(). There is no guarantee that the listener gets called before or after await(), so it is possible the code in the listener won't be called until after the whole suspend function returns. Also, one of the major reasons to use coroutines in the first place is to avoid using callbacks. So your first function should look like:
override suspend fun checkNickName(nickName: String): Results<Int> {
    try {
        val userList = fireStore.collection("database")
            .document("user")
            .get()
            .await()
            .get("nickNameList") as List<String>
        return if (userList.contains(nickName)) Results.Exist(1) else Results.No(0)
    } catch (e: Exception) {
        // return a failure result here
    }
}
Your use of callbackFlow looks fine, except you should add a buffer() call to the flow you're returning so you can specify how to handle backpressure. However, it's possible you will want to handle that downstream instead.
override fun getOwnUser(): Flow<UserEntity> = callbackFlow {
    //...
}.buffer(/* Customize backpressure behavior here */)
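If you decide to handle backpressure downstream instead, a small consumption sketch could look like the following; the UserRepository interface, the conflate() strategy and the println are illustrative assumptions, not part of the original code:

import kotlinx.coroutines.flow.*

// Hypothetical repository interface exposing the callbackFlow from above
interface UserRepository {
    fun getOwnUser(): Flow<UserEntity>
}

// conflate() keeps only the latest snapshot if the collector is slower
// than the Firestore listener
suspend fun observeOwnUser(repository: UserRepository) {
    repository.getOwnUser()
        .conflate()
        .collect { user -> println(user) }
}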
Let's say I have a list of repos. I want to iterate through all of them, and as each repo returns its result, I want to pass it on.
val repos = listOf(repo1, repo2, repo3)
val deferredItems = mutableListOf<Deferred<List<Result>>>()
repos.forEach { repo ->
    deferredItems.add(async { getResult(repo) })
}
val results = mutableListOf<Any>()
deferredItems.forEach { deferredItem ->
    results.add(deferredItem.await())
}
println("results :: $results")
In the above case, it waits for each repo to return its result. It fills the results in sequence: the result of repo1 followed by the result of repo2. If repo1 takes more time than repo2 to return a result, we wait for repo1's result even though we already have the result for repo2.
Is there any way to pass the result of repo2 as soon as we have the result?
The Flow API supports this almost directly:
repos.asFlow()
    .flatMapMerge { flow { emit(getResult(it)) } }
    .collect { println(it) }
flatMapMerge first collects all the Flows that come out of the lambda you pass to it, then collects them concurrently and sends each result downstream as soon as it completes.
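For reference, a minimal runnable sketch of this approach; getResult and the delays are stand-ins for the real repo calls, and depending on your kotlinx.coroutines version flatMapMerge may warn that it is a preview/experimental API:

import kotlinx.coroutines.*
import kotlinx.coroutines.flow.*

// Stand-in for the real repository call; repo1 is deliberately the slowest
suspend fun getResult(repo: String): String {
    delay(if (repo == "repo1") 300L else 100L)
    return "result of $repo"
}

fun main() = runBlocking {
    listOf("repo1", "repo2", "repo3").asFlow()
        .flatMapMerge { repo -> flow { emit(getResult(repo)) } }
        .collect { println(it) } // repo2 and repo3 arrive before repo1
}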
That's what channels are for:
val repos = listOf("repo1", "repo2", "repo3")
val results = Channel<Result>()
repos.forEach { repo ->
    launch {
        val res = getResult(repo)
        results.send(res)
    }
}
for (r in results) {
    println(r)
}
This example is incomplete: since I don't close the channel, the resulting code will be suspended forever. Make sure that in your real code you close the channel once all results have been received:
val count = AtomicInteger()
for (r in results) {
    println(r)
    if (count.incrementAndGet() == repos.size) {
        results.close()
    }
}
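Combined into one self-contained sketch (the repo names, getResult and the random delays are placeholders, not part of the original answer):

import kotlinx.coroutines.*
import kotlinx.coroutines.channels.Channel
import java.util.concurrent.atomic.AtomicInteger

// Placeholder for the real repository call
suspend fun getResult(repo: String): String {
    delay((100L..300L).random())
    return "result of $repo"
}

fun main() = runBlocking {
    val repos = listOf("repo1", "repo2", "repo3")
    val results = Channel<String>()
    repos.forEach { repo ->
        launch {
            results.send(getResult(repo))
        }
    }
    val count = AtomicInteger()
    for (r in results) {
        println(r) // printed in completion order, not list order
        if (count.incrementAndGet() == repos.size) {
            results.close()
        }
    }
}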
You should use Channels.
suspend fun loadReposConcurrent() = coroutineScope {
    val repos = listOf(repo1, repo2, repo3)
    val channel = Channel<List<YourResultType>>()
    for (repo in repos) {
        launch {
            val result = getResult(repo)
            channel.send(result)
        }
    }
    var allResults = emptyList<YourResultType>()
    repeat(repos.size) {
        val result = channel.receive()
        allResults = allResults + result
        println("results :: $result")
        //updateUi(allResults)
    }
}
In the code above, in the for (repo in repos) {...} loop, all the requests are executed in separate coroutines with launch, and as soon as a result is ready it is sent to the channel.
In repeat(repos.size) {...}, channel.receive() waits for new values from the coroutines and consumes them.
I am trying to understand the best way to fire an asynchronous job at a scheduled rate in Kotlin, while the application is running its normal tasks. Let's say I have a simple application that only prints out "..." every second, but every 5 seconds I want another job / thread / coroutine (whichever suits best) to print "You have a message!". For the async job I have a class NotificationProducer, and it looks like this.
class NotificationProducer {
    fun produce() {
        println("You have a message!")
    }
}
Then, my main method looks like this.
while (true) {
    println("...")
    sleep(1000)
}
Should I use GlobalScope.async, Timer().schedule(...) or some Quartz job to achieve what I want? Any advice is highly appreciated. The point is that the notification must come from another class (e.g. NotificationProducer).
If I correctly understand the issue, using Kotlin coroutines you can implement it as follows:
class Presenter : CoroutineScope { // implement CoroutineScope to create a local scope
    private var job: Job = Job()
    override val coroutineContext: CoroutineContext
        get() = Dispatchers.Default + job

    // This method will help to stop execution of the coroutine.
    // Call it to cancel the coroutine and break the while loop defined in the coroutine below.
    fun cancel() {
        job.cancel()
    }

    fun schedule() = launch { // launching the coroutine
        var seconds = 1
        val producer = NotificationProducer()
        while (true) {
            println("...")
            delay(1000)
            if (seconds++ == 5) {
                producer.produce()
                seconds = 1
            }
        }
    }
}
Then you can use an instance of the Presenter class to launch the coroutine and stop it:
val presenter = Presenter()
presenter.schedule() // calling `schedule()` function launches the coroutine
//...
presenter.cancel() // cancel the coroutine when you need
For simple scheduling requirements, you can consider using coroutines:
class NotificationProducerScheduler(val service: NotificationProducer, val interval: Long, val initialDelay: Long?) :
    CoroutineScope {
    private val job = Job()
    private val singleThreadExecutor = Executors.newSingleThreadExecutor()

    override val coroutineContext: CoroutineContext
        get() = job + singleThreadExecutor.asCoroutineDispatcher()

    fun stop() {
        job.cancel()
        singleThreadExecutor.shutdown()
    }

    fun start() = launch {
        initialDelay?.let {
            delay(it)
        }
        while (isActive) {
            service.produce()
            delay(interval)
        }
        println("coroutine done")
    }
}
Otherwise, the Java concurrency API is pretty solid too:
class NotificationProducerSchedulerJavaScheduler(
    val service: NotificationProducer,
    val interval: Long,
    val initialDelay: Long = 0
) {
    private val scheduler = Executors.newScheduledThreadPool(1)
    private val task = Runnable { service.produce() }

    fun stop() {
        scheduler.shutdown()
    }

    fun start() {
        scheduler.scheduleWithFixedDelay(task, initialDelay, interval, TimeUnit.MILLISECONDS)
    }
}
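A brief usage sketch of the coroutine-based scheduler, wired to the main loop from the question; the 5000 ms interval and initial delay are just example parameters:

fun main() {
    val scheduler = NotificationProducerScheduler(
        service = NotificationProducer(),
        interval = 5000L,     // "You have a message!" every 5 seconds
        initialDelay = 5000L
    )
    scheduler.start()
    repeat(20) {
        println("...")
        Thread.sleep(1000)
    }
    scheduler.stop()
}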
This function will run a task in the background while proceeding with a "main" task that controls the lifecycle of the background job. Below is an example of usage.
/**
 * Runs a task in the background in IO while the op proceeds.
 * The job is canceled when op returns.
 * This is useful for updating caches and the like.
 */
suspend fun withBackgroundTask(task: suspend () -> Unit, op: suspend () -> Unit) {
    val job = CoroutineScope(Dispatchers.IO).launch { task() }
    try {
        op()
    } finally {
        job.cancel()
    }
}

/**
 * Updates the cache in a background task while op runs.
 */
suspend fun withCache(cache: Cache<*>, op: suspend () -> Unit) {
    suspend fun cacheUpdate() {
        cache.fetchInternal()
        while (true) {
            delay(cache.cycle)
            cache.fetchInternal()
        }
    }
    withBackgroundTask(::cacheUpdate, op)
}
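A hypothetical call site, assuming a Cache<User> shaped like the Cache type used above (User is only for illustration):

// Keeps userCache refreshed in the background while the main work runs;
// the refresh job is cancelled as soon as the block returns
suspend fun handleRequest(userCache: Cache<User>) {
    withCache(userCache) {
        // ... main work that reads from userCache ...
    }
}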
I need to asynchronously fetch cats, dogs and mice and then do some post-processing. Here is roughly what I am doing:
Promise<List<Cat>> fetchCats = task {}
Promise<List<Mouse>> fetchMice = task { }
Promise<List<Dog>> fetchDogs = task {}
List promiseList = [fetchCats, fetchMice, fetchDogs]
List results = Promises.waitAll(promiseList)
The problem I am facing is that the order of items in the results list is not fixed, i.e. in one execution results can be [cats, dogs, mice] and in another execution results can be [dogs, mice, cats].
This means that to access the cats I need to explicitly check the type of each element of results, and similarly for the dogs and mice, which makes my code look bad.
Upon going through the documentation here, I found that the PromiseMap API can help me, as it provides a pretty way of accessing results through key-value pairs. Here is what it offers:
import grails.async.*

def map = new PromiseMap()
map['one'] = { 2 * 2 }
map['two'] = { 4 * 4 }
map['three'] = { 8 * 8 }
map.onComplete { Map results ->
    assert [one: 4, two: 16, three: 64] == results
}
PromiseMap has an onComplete method, but it does not make the current thread wait for all the promises to finish.
Using PromiseMap, how can I block the current thread until all the promises have finished?
If your only concern is making the current thread wait until the PromiseMap completes, you can use Thread's join():
import grails.async.*

def map = new PromiseMap()
map['one'] = { println "task one" }
map['two'] = { println "task two" }
map['three'] = { println "task three" }

Thread t = new Thread() {
    public void run() {
        println("pausing the current thread, let promiseMap complete first")
        map.onComplete { Map results ->
            println("Promisemap processing : " + results)
        }
    }
}
t.start()
t.join()
println("\n CurrentThread : I can win the race if you just comment the t.join() line in the code")
Use .get()
From the PromiseMap source:
/**
 * Synchronously return the populated map with all values obtained from promises used
 * inside the populated map
 *
 * @return A map where the values are obtained from the promises
 */
Map<K, V> get() throws Throwable {
I'm writing a C++ class ScriptProcess, meant to be used in QML, that acts as an interface to a child process. Said child process loads a script, then executes functions on demand. When you call a function, the result (be it a value or an exception) is returned asynchronously through signals.
import QtQuick 2.5
import QtTest 1.0
import dungeon 1.0

TestCase {
    id: test
    name: "ScriptProcessTest"
    property ScriptProcess session: null

    signal exceptionWhenLoadingCode(string type, int line, string message)

    SignalSpy {
        id: exceptionSpy
        target: test
        signalName: "exceptionWhenLoadingCode"
    }

    Component {
        id: process
        ScriptProcess {
            id: script
            onExceptionWhenLoadingCode: test.exceptionWhenLoadingCode(type, line, message)
        }
    }

    function startScript(scriptUrl) {
        var request = new XMLHttpRequest()
        request.open("GET", "data/%1.js".arg(scriptUrl), false)
        request.send(null)
        return process.createObject(test, {code: request.responseText})
    }

    function cleanup() {
        if (session) {
            session.stop()
        }
        delete session
        session = null
    }

    function test_syntaxErrorInGlobalContextIsReported() {
        var count = exceptionSpy.count
        session = startScript("syntax-error-in-global-context")
        compare(exceptionSpy.count, count + 1)
    }

    function test_errorThrownInGlobalContextIsReported() {
        var count = exceptionSpy.count
        session = startScript("error-in-global-context")
        compare(exceptionSpy.count, count + 1)
    }
}
In a nutshell, I do the following:
1. For each test, open the auxiliary process and load a script from a file. This is done by instantiating a Component, with the script given via the ScriptProcess.code property.
2. Run the test.
3. When the test finishes, kill the process and delete the object that manages it.
My problem is that the SignalSpy called exceptionSpy is not being triggered; exceptionSpy.count is always zero, and I have no idea why. Why is this the case? Am I misusing SignalSpy or Component?
XMLHttpRequest is asynchronous, so you should probably change startScript() to something like:
function startScript(scriptUrl) {
    var object = process.createObject(test)
    var request = new XMLHttpRequest()
    request.onreadystatechange = function() {
        if (request.readyState === XMLHttpRequest.DONE)
            object.code = request.responseText
    }
    request.open("GET", "data/%1.js".arg(scriptUrl), true)
    request.send(null)
    return object
}
And since the signal is not going to be emitted right away, you'll have to wait for it instead of comparing the count immediately after creating the object:
function test_foo() {
    var count = exceptionSpy.count
    session = startScript("...")
    tryCompare(exceptionSpy, "count", count + 1)
}