What's the best practice for calling a constructor of a class in Scala 2.10 (M4+)?
Answering my own question:
Calling a constructor is different from invoking a method. Here's the right way to do it in Scala 2.10:
import reflect.runtime.universe._
import reflect.runtime.currentMirror
val typ = typeOf[Range]
val constructor = typ.members.find(_.kind == "constructor").get.asMethodSymbol
currentMirror reflectClass typ.typeSymbol.asClassSymbol reflectConstructor constructor apply (1,10,1)
As expected, the result is:
res7: Any = Range(1, 2, 3, 4, 5, 6, 7, 8, 9)
I would like to use a function that takes an enumerable and a function and does the same thing as Python's itertools.accumulate. For example,
iex> accumulate([1, 3, 7], &Kernel.+/2)
[1, 4, 11]
To explain, the result is equal to [1, 1+3, 1+3+7]. Does such a function exist in Elixir's standard library?
Enum.scan/2 does the same thing.
Enum.scan([1, 3, 7], &+/2)
#⇒ [1, 4, 11]
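For reference, Python's itertools.accumulate mentioned in the question produces the same running sums:

```python
from itertools import accumulate
import operator

# Running sums of [1, 3, 7]: [1, 1+3, 1+3+7]
print(list(accumulate([1, 3, 7], operator.add)))  # → [1, 4, 11]
```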
The closest implementation I can find is aiostream's chunks, which allows the generation of "chunks of size n from an asynchronous sequence. The chunks are lists, and the last chunk might contain less than n elements".
I have implemented something similar, but the key difference is that it prioritises filling one batch at a time as opposed to filling multiple batches at once.
import asyncio
import itertools
import time
import aiostream
from collections import deque
class IterableAsyncQueue:
    def __init__(self):
        self.queue = asyncio.Queue()

    async def put(self, value):
        await self.queue.put(value)

    def __aiter__(self):
        return self

    async def __anext__(self):
        return await self.queue.get()

class Batch:
    def __init__(self, n):
        self.batch_size = n

    def __call__(self, iterable, *args):
        self.iterable = iterable
        self.calls = deque()
        self.pending = set()
        self.initialised = False
        return self

    def __iter__(self):
        iterable = iter(self.iterable)
        return iter(lambda: tuple(itertools.islice(iterable, self.batch_size)), ())

    def __aiter__(self):
        return self

    async def __anext__(self):
        # Request one batch's worth of items from the source up front.
        self.pending |= {asyncio.create_task(self.iterable.__anext__())
                         for _ in range(self.batch_size)}
        if self.initialised:
            # Queue up behind whichever caller is currently filling a batch.
            future = asyncio.get_running_loop().create_future()
            self.calls.append(future)
            await future
        else:
            self.initialised = True
        batch = []
        while len(batch) < self.batch_size:
            done, _ = await asyncio.wait(self.pending, return_when=asyncio.FIRST_COMPLETED)
            done = list(done)[0]
            batch.append(await done)
            self.pending.discard(done)
        # Wake the next waiting caller, if any.
        if self.calls:
            next_call = self.calls.popleft()
            next_call.set_result(None)
        return batch

async def consumer(n, a):
    start = time.time()
    async for x in a:
        print(n, x, time.time() - start)

async def producer(q):
    for x in range(50):
        await asyncio.sleep(0.5)
        await q.put(x)

q = IterableAsyncQueue()
# a = Batch(5)(q)
a = aiostream.stream.chunks(q, 5)
loop = asyncio.get_event_loop()
loop.create_task(producer(q))
loop.create_task(consumer(1, a))
loop.create_task(consumer(2, a))
loop.run_forever()
The output using aiostream.stream.chunks:
1 [0, 2, 4, 6, 8] 4.542179107666016
2 [1, 3, 5, 7, 9] 5.04422402381897
1 [10, 12, 14, 16, 18] 9.575451850891113
2 [11, 13, 15, 17, 19] 10.077155828475952
The output using my implementation of priority batch:
1 [0, 1, 2, 3, 4] 2.519313097000122
2 [5, 6, 7, 8, 9] 5.031418323516846
1 [10, 11, 12, 13, 14] 7.543889045715332
2 [15, 16, 17, 18, 19] 10.052537202835083
It seems to me that the priority batch is fundamentally more useful, as it yields results sooner than chunks, allowing the calling code to await another batch. This means that if there are m consumers, each awaiting a batch of size n, then there are always between (m-1)×n and m×n results being waited upon. With the chunks implementation the number of results being waited upon varies between m and m×n.
What I would like to know is why I haven't been able to find an existing implementation of this, and whether this is the best way of implementing it.
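The heart of the Batch.__anext__ method above is the asyncio.wait(..., return_when=FIRST_COMPLETED) loop that drains completed tasks one at a time. A minimal standalone sketch of that pattern (the function names here are illustrative, not from any library):

```python
import asyncio

async def take_first(aws, n):
    # Gather the first n results in completion order, pulling one
    # completed task per wait() round, as in Batch.__anext__ above.
    pending = {asyncio.ensure_future(a) for a in aws}
    results = []
    while len(results) < n:
        done, _ = await asyncio.wait(pending, return_when=asyncio.FIRST_COMPLETED)
        task = next(iter(done))
        results.append(task.result())
        pending.discard(task)
    for t in pending:  # cancel whatever wasn't needed
        t.cancel()
    return results

async def delayed(value, delay):
    await asyncio.sleep(delay)
    return value

async def main():
    out = await take_first(
        [delayed("a", 0.03), delayed("b", 0.01), delayed("c", 0.02)], 2)
    print(out)  # completion order: ['b', 'c']

asyncio.run(main())
```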
If I have tasks 1, 2, 3, 4, and 5, and I want 1 -> 2, then 2 -> 3, 2 -> 4, 2 -> 5, what is the best way to set this up?
Would 3.set_upstream(2), 4.set_upstream(2), and 5.set_upstream(2) be sufficient?
Yes, that's enough. Don't forget 2.set_upstream(1) as well.
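A sketch of that wiring using Airflow's bitshift composition, where t1 >> t2 is equivalent to t2.set_upstream(t1). The DAG id, task ids, and use of EmptyOperator (DummyOperator in older Airflow versions) are illustrative; this assumes Airflow 2.4+ is installed:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG("fan_out_example", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    t1, t2, t3, t4, t5 = (EmptyOperator(task_id=f"t{i}") for i in range(1, 6))
    # 1 -> 2, then 2 -> 3, 2 -> 4, 2 -> 5
    t1 >> t2 >> [t3, t4, t5]
```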
I have to pass a list of values into the IN clause of a SQL query, but when the list is converted to a string, the brackets [ ] come along with the data, and the query language can't read them.
For example I have list as:
def val = new ArrayList<Integer>(Arrays.asList(1,2,3,4,5,6,7,8))
println(val) prints [1, 2, 3, 4, 5, 6, 7, 8], but the query needs it as: 1, 2, 3, 4, 5, 6, 7, 8
In Java, System.out.println(val.toString().replaceAll("[\\[\\]]", "")) works, but not in Groovy. Is there a collection method to remove the brackets like this?
Instead of:
def val = new ArrayList(Arrays.asList(1,2,3,4,5,6,7,8))
use:
def val = new ArrayList(Arrays.asList(1,2,3,4,5,6,7,8)).join(', ')
or simply:
def val = [1,2,3,4,5,6,7,8].join(', ')
Try using a GString and the minus operator:
println "${val}" - '[' - ']'
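The same join idea, for comparison, in Python (note that inlining values into SQL like this is shown only for illustration; parameterized queries are safer against injection):

```python
vals = [1, 2, 3, 4, 5, 6, 7, 8]

# join produces the comma-separated form without brackets
in_clause = ", ".join(str(v) for v in vals)
print(in_clause)  # → 1, 2, 3, 4, 5, 6, 7, 8
print(f"SELECT * FROM t WHERE id IN ({in_clause})")
```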
What is the most concise way of converting a java.util.List into a normal JavaFX sequence (in JavaFX)?
e.g.
def myList = java.util.Arrays.asList(1, 2, 3);
def mySequence = ... // a sequence containing [1, 2, 3]
This is the most concise way I could find; there may be a more direct method, though:
def myList = java.util.Arrays.asList(1, 2, 3);
def mySequence = for (i in myList) i;
println("mySequence={mySequence}");