Is there a zipWithLatestList function for zio streams? - zio

Monix streams have a combineLatestList method: given a list of streams, it produces a stream of Lists containing the latest value seen on each input.
ZIO streams have zipWithLatest, but it works only for 2 streams. I wonder how we can make it work for a List of streams?
Thanks!
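One way to approximate this yourself is to fold zipWithLatest over the list. A sketch only: zipWithLatestList is not an existing ZIO combinator, the helper name and code below are mine and untested, assuming ZIO 2.x:

```scala
import zio.stream._

// Hypothetical helper, not part of zio-streams: combine a list of streams
// into a stream of Lists holding the latest value seen on each input.
def zipWithLatestList[R, E, A](streams: List[ZStream[R, E, A]]): ZStream[R, E, List[A]] =
  streams match {
    case Nil          => ZStream.succeed(Nil)
    case head :: tail =>
      tail.foldLeft(head.map(List(_))) { (acc, next) =>
        // zipWithLatest emits whenever either side emits,
        // pairing it with the latest value from the other side
        acc.zipWithLatest(next)((list, a) => list :+ a)
      }
  }
```

As with combineLatest semantics, the result only starts emitting once every input has emitted at least once; appending to the accumulated List keeps the values in input order.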

Related

How to split serialized byte stream (e.g. Avro, Protobuf) in network?

Recently I have been writing an application using Netty. The format of the messages I want to send is a serialized object stream, using Avro or Protobuf.
One question remains for me: when one side receives the byte stream from the other, how can it split the stream? That is, how can it know where one serialized object ends and the next one begins?
I have seen the tip of putting special delimiter characters between the byte streams of different objects, but won't Avro or Protobuf possibly generate those same characters while serializing objects?
I think you most likely want to "prefix" each serialized object with the number of bytes it serializes to. This allows you to read exactly the right number of bytes per object and so do the right thing when de-serializing it.
Netty itself contains, for example, the LengthFieldBasedFrameDecoder, which allows you to "slice" out the bytes for an object, and the LengthFieldPrepender, which allows you to prefix each of them when doing the encoding.
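The length-prefix idea itself is independent of Netty. A minimal sketch in plain Java (the class and method names are mine, not a real API) showing framing and un-framing of an opaque payload, so delimiter bytes inside the serialized data never matter:

```java
import java.util.*;

public class LengthPrefixFraming {

    // Prefix the payload with a 4-byte big-endian length field.
    static byte[] frame(byte[] payload) {
        int n = payload.length;
        byte[] out = new byte[4 + n];
        out[0] = (byte) (n >>> 24);
        out[1] = (byte) (n >>> 16);
        out[2] = (byte) (n >>> 8);
        out[3] = (byte) n;
        System.arraycopy(payload, 0, out, 4, n);
        return out;
    }

    // Read back every complete frame from a concatenated byte stream.
    static List<byte[]> deframe(byte[] stream) {
        List<byte[]> frames = new ArrayList<>();
        int pos = 0;
        while (pos + 4 <= stream.length) {
            int len = ((stream[pos] & 0xFF) << 24) | ((stream[pos + 1] & 0xFF) << 16)
                    | ((stream[pos + 2] & 0xFF) << 8) | (stream[pos + 3] & 0xFF);
            pos += 4;
            // Each object's bytes are copied out whole, regardless of content.
            frames.add(Arrays.copyOfRange(stream, pos, pos + len));
            pos += len;
        }
        return frames;
    }

    public static void main(String[] args) {
        byte[] f1 = frame("avro-object-1".getBytes());
        byte[] f2 = frame("pb2".getBytes());
        byte[] wire = new byte[f1.length + f2.length];
        System.arraycopy(f1, 0, wire, 0, f1.length);
        System.arraycopy(f2, 0, wire, f1.length, f2.length);
        List<byte[]> frames = deframe(wire);
        System.out.println(frames.size());              // 2
        System.out.println(new String(frames.get(0)));  // avro-object-1
    }
}
```

Netty's LengthFieldPrepender/LengthFieldBasedFrameDecoder do the same work on the wire, additionally handling frames that arrive split across TCP reads.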

Stream processors with multiple inputs and outputs as arrows

Reading John Hughes's Generalising monads to arrows, I understand that arrows can be used to represent and combine stream processors with a single input and a single output. It is also possible to represent multiple inputs and outputs using pairs, or using ArrowChoice.
However, using a pair means the input is a stream of pairs, which isn't enough to express processing streams that arrive at different rates. ArrowChoice is able to express that, but it "multiplexes" the two streams into a single one.
I'm looking for a way to combine streams with multiple inputs and multiple outputs, while still being able to distinguish between the case where the streams are multiplexed and the case of separate streams.
Is that possible?
Maybe you could use the These type (from here), which is defined as:
data These a b = This a | That b | These a b
This way you could express that you are receiving one stream, or the other, or both.
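As a small sketch of that idea (the function name align is mine), aligning two finite streams of different lengths with These keeps the leftovers of the longer one instead of truncating or multiplexing:

```haskell
data These a b = This a | That b | These a b
  deriving Show

-- Pair up two streams element-wise; once one runs dry,
-- the remaining elements of the other are tagged This/That.
align :: [a] -> [b] -> [These a b]
align (x:xs) (y:ys) = These x y : align xs ys
align xs     []     = map This xs
align []     ys     = map That ys

-- align [1,2,3] "ab"  ==  [These 1 'a', These 2 'b', This 3]
```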

Muxing non-synchronised streams to Haali

I have 2 input streams of data that are being passed to a Haali Muxer (mp4 format).
Currently I stream these to Haali directly in a DirectShow graph without a clock. I wondered if I should be trying to write these to the muxer synchronised, or whether it happily accepts a stream of audio data that stops before the video data stream stops. (I have issues with the output file not playing audio after seeking, and I'm not sure why this could occur)
I can't find much in the way of documentation for muxing with the Haali muxer, does anyone know the best place to look for info on this filter?
To have the streams multiplexed into a single MP4 file you need a single instance of the multiplexer (Haali, GDCL, a commercial one, a wrapper over the mp4v2 library, over a Media Foundation sink, etc.) with two (or more) input pins on it connected to the respective sources, which in turn are going to be written as tracks.
The filter graph clock does not matter. The clock is for presentation, and file writers accept incoming data and write it as soon as possible anyway. It is more accurate to remove the clock, as you seem to already be doing, but having the standard clock is not going to make a difference.
Data is synchronized using the time stamps on individual media samples, the parts of the media streams. The multiplexer builds an internal queue for every stream and then consumes data from the streams to build a single file, in such a way that the original stream data is interleaved. If one stream supplies too much data, that is, if its data is available too early while another stream supplies data slowly, the multiplexer blocks further data reception on this particular stream by not returning from the respective processing call (IPin::Receive), expecting that during this wait the slow stream provides additional input. Ultimately, what the multiplexer looks at when matching data from different streams is the data time stamps.
To obtain synchronized data in resulting MP4 file you, thus, need to make sure the payload data is properly time stamped. Multiplexer will take care of the rest.
This also means that the time stamps should be monotonically increasing within a stream, and that key frames/splice points are indicated accordingly. Otherwise some multiplexers might fail immediately, while others would produce the output file but it might have playback issues (esp. seeking).
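The interleaving decision described above can be sketched, very roughly, as a timestamp-ordered merge of per-stream queues. This is a toy model, not DirectShow code; the names are mine:

```cpp
#include <deque>
#include <string>
#include <vector>

// Toy model of a muxer's interleaving: each stream is a queue of
// (timestamp, streamId) samples; the output always takes the sample
// with the earliest front timestamp across all queues.
struct Sample {
    long long timestamp;
    std::string streamId;
};

std::vector<Sample> interleave(std::deque<Sample> audio, std::deque<Sample> video) {
    std::vector<Sample> out;
    while (!audio.empty() || !video.empty()) {
        bool takeAudio = !audio.empty() &&
            (video.empty() || audio.front().timestamp <= video.front().timestamp);
        std::deque<Sample>& src = takeAudio ? audio : video;
        out.push_back(src.front());  // earliest-stamped sample wins
        src.pop_front();
    }
    return out;
}
```

A real multiplexer additionally bounds its queues, blocking the over-eager stream inside IPin::Receive as described above, but the ordering decision is the same: compare the front timestamps.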

Streams and procedures with variable number of arguments in Scheme

I'm trying to create a stream version of map that takes a variable number of streams as arguments. The problem I have is that I want it to handle streams of various sizes, terminating when one of them is empty. If I were dealing with lists instead of streams I would just do this:
(if (member '() args)
    '())
But since this materializes the whole stream each time, I guess it defeats the purpose? I can't seem to think of any other way to check whether one of the streams is empty than to do it like this.
The main difference between the stream version and the ordinary one is that you need to use stream-cons, etc. You can still have a list of streams instead of a list of lists, so your little check can be written:
(if (memq stream-null args)
    stream-null
    (stream-cons (apply proc (map stream-car args))
                 (apply stream-map proc (map stream-cdr args))))
It does not materialize the streams, since args is a list of streams. Thus every stream gets checked for being the empty stream, and that's a very simple eq? test (hence I changed it to memq).

How to render a byte array from socket/application using DirectShow?

I have an application in which I will receive a big array of encoded bytes. I have to decode them and render the result. For decoding, I am using a custom decoder class. After the decode, how can I construct a DirectShow graph that will receive input data from the decoder? Please give some direction/samples on this.
Have a look at the PushSource sample in the DirectShow SDK. This sample shows you how to create a source filter that can be rendered. It is all about setting the output media type of your filter correctly so that the rest of the graph can be rendered. The sample also shows you how to feed media samples to the rest of the media pipeline. In your case what do you decode to? The PushSource sample outputs RGB24 IIRC.
Also, it sounds like you're decoding in the same filter as you're receiving the bytes in? Typically in DirectShow you would write a source filter that is able to receive bytes from the network and outputs samples in the encoded format. You would then connect this filter to a custom decoder filter that outputs either RGB24 or some raw media format that is understood by DirectShow. Similarly for audio, you could output, say, PCM.
Edit:
I have used the same approach (CSource, CSourceStream). That is correct: DoBufferProcessingLoop calls FillBuffer. My general approach has been to use the producer-consumer pattern. The network-reading thread populates the queue with samples, and in my overridden DoBufferProcessingLoop I check whether the queue has any data, calling FillBuffer if there is. You can of course try other methods, such as waiting on events (frame availability). To see the approach I used, you can download the source code of an example RTSP source filter at http://sourceforge.net/projects/videoprocessing/ and see if it suits you. The best advice I can give is to just try stuff and learn as you go along.
