Store a video stream in a local buffer in AS3 - apache-flex

I am recording video using Red5 and ActionScript, but sometimes the Red5/Flex connection closes due to network problems. I want to store the live video stream in a local buffer (no more than 3 minutes) if my Red5 connection closes, fails, or is lost. Is it possible to store the video locally?

Related

Is it possible for the receiver to decrement an HTTP/2 stream flow control window?

Section 6.9 of RFC 7540 describes the mechanism for HTTP/2 flow control. There is a flow control window for the connection as a whole, and a separate flow control window for each stream on that connection. It provides a way for the receiver to set the initial flow control window for a stream:
Both endpoints can adjust the initial window size for new streams by including a value for SETTINGS_INITIAL_WINDOW_SIZE in the SETTINGS frame that forms part of the connection preface.
And a way for the receiver to increase the connection and stream flow control windows:
The payload of a WINDOW_UPDATE frame is one reserved bit plus an unsigned 31-bit integer indicating the number of octets that the sender can transmit in addition to the existing flow-control window. The legal range for the increment to the flow-control window is 1 to 2^31-1 (2,147,483,647) octets.
[...]
A sender that receives a WINDOW_UPDATE frame updates the corresponding window by the amount specified in the frame.
And a way for the receiver to increment or decrement the flow control windows for all streams (but not the connection) at once:
When the value of SETTINGS_INITIAL_WINDOW_SIZE changes, a receiver MUST adjust the size of all stream flow-control windows that it maintains by the difference between the new value and the old value.
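(To make the quoted mechanisms concrete, here is a sketch of how a WINDOW_UPDATE frame is laid out on the wire per RFC 7540; the helper is illustrative, not from any particular library.)

```typescript
// Build a WINDOW_UPDATE frame: a 9-octet frame header followed by a
// 4-octet payload holding one reserved bit and a 31-bit increment.
function buildWindowUpdate(streamId: number, increment: number): Uint8Array {
  const frame = new Uint8Array(13);
  const view = new DataView(frame.buffer);
  frame[2] = 4;                              // 24-bit payload length = 4
  frame[3] = 0x8;                            // frame type: WINDOW_UPDATE
  frame[4] = 0x0;                            // no flags defined for this type
  view.setUint32(5, streamId & 0x7fffffff);  // stream id (0 targets the connection)
  view.setUint32(9, increment & 0x7fffffff); // increment, legal range 1..2^31-1
  return frame;
}
```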
But as far as I can tell, there is no way for the receiver to decrement a single stream's flow control window without changing the initial window size.
Is that correct? If so, why not? This seems like a reasonable thing to want to do if you are multiplexing many long-lived streams over a single connection. You may have some BDP-controlled memory budget for the overall connection, carved up across the streams, and are tuning the proportion that each stream gets according to its recent bandwidth demand. If one of them temporarily goes idle you'd like to be able to reset its window to be small so that it doesn't strand the memory budget, without affecting the other streams, and without making it impossible to receive new streams.
(Of course I understand that there is a race, and the sender may have sent data before receiving the decrement. But the window is already allowed to go negative due to the SETTINGS_INITIAL_WINDOW_SIZE mechanism above, so it seems like it would be reasonable to allow for a negative window here too.)
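(To spell out that arithmetic: per RFC 7540, a SETTINGS_INITIAL_WINDOW_SIZE change moves every stream window by the delta, and the result may legitimately be negative. The function name here is mine.)

```typescript
// Window adjustment on a SETTINGS_INITIAL_WINDOW_SIZE change.
function adjustStreamWindow(
  currentWindow: number,
  oldInitial: number,
  newInitial: number
): number {
  // e.g. currentWindow = 10_000, oldInitial = 65_535, newInitial = 0
  // => 10_000 - 65_535 = -55_535; the sender must then wait for
  // WINDOW_UPDATE frames before transmitting on this stream again.
  return currentWindow + (newInitial - oldInitial);
}
```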
Is it really not possible to do this without depending on forward progress from the sender to eat up the stranded bytes in the flow control window?
Here's more detail on why I'm interested in the question, because I'm conscious of the XY problem.
I'm thinking about how to solve an RPC flow control issue. I have a server with a limited memory budget, and incoming streams with different priorities for how much of that memory they should be allowed to consume. I want to implement something like weighted max-min fairness across them, adjusting their flow control windows so that they sum to no more than my memory budget, but when we're not memory constrained we get maximum throughput.
For efficiency reasons, it would be desirable to multiplex streams of different priorities on a single connection. But then as demands change, or as other connections show up, we need to be able to adjust stream flow control windows downward so they still sum to no more than the budget. When stream B shows up or receives a higher priority but stream A is sitting on a bunch of flow control budget, we need to reduce A's window and increase B's.
Even without the multiplexing, the same problem applies at the connection level: as far as I can tell, there is no way to adjust the connection flow control window downward without changing the initial window size. Of course it will be adjusted downward as the client sends data, but I don't want to need to depend on forward progress from the client for this, since that may take arbitrarily long.
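For illustration, here is a minimal sketch of the weighted max-min fair allocation I have in mind, assuming per-stream weights and demands are known (the names and shapes are mine, not from any HTTP/2 library):

```typescript
// Water-filling allocation: streams that demand less than their
// weighted fair share are satisfied exactly, and their surplus is
// redistributed among the remaining streams.
interface Stream { id: string; weight: number; demand: number; }

function weightedMaxMinFair(streams: Stream[], budget: number): Map<string, number> {
  const alloc = new Map<string, number>();
  let remaining = [...streams];
  let budgetLeft = budget;
  while (remaining.length > 0) {
    const totalWeight = remaining.reduce((sum, st) => sum + st.weight, 0);
    const satisfied = remaining.filter(
      (st) => st.demand <= (budgetLeft * st.weight) / totalWeight
    );
    if (satisfied.length === 0) {
      // Everyone is constrained: each gets its weighted share.
      for (const st of remaining) {
        alloc.set(st.id, (budgetLeft * st.weight) / totalWeight);
      }
      break;
    }
    for (const st of satisfied) {
      alloc.set(st.id, st.demand);
      budgetLeft -= st.demand;
    }
    remaining = remaining.filter((st) => !satisfied.includes(st));
  }
  return alloc;
}
```

The hard part is not computing the targets but shrinking a live stream's window down to its new target, which is exactly what the protocol seems not to allow.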
It's possible there is a better way to achieve this!
A server that has N streams, some idle and some actively downloading data to the client, will typically re-allocate the connection window to the active streams.
For example, say you are watching a movie and downloading a big file from the same server at the same time.
Say the connection window is 100, and each stream has a window of 100 too (obviously, with many streams the sum of all stream windows will be capped by the connection window, but with only one stream it can be at the maximum).
Now, when you watch and download at the same time, each stream gets 50.
If you pause the movie, and the server knows about that (i.e. it does not exhaust the movie stream's window), then the server only has to serve one stream, with a connection window of 100 and a single stream (the download one) that also has a window of 100, thereby reallocating the whole window to the active stream.
You only get into problems if the client doesn't tell the server that the movie has been paused.
In this case, the server will continue to send movie data until the movie stream's window is exhausted (or nearly exhausted), and the client does not acknowledge that data because the movie is paused.
At that point, the server notices that data on one stream is not acknowledged and stops sending data to it, but of course part of the connection window is now taken, reducing the window available to the active download stream.
From the server's point of view, it has a perfectly good connection where one stream (the download one) works wonderfully at max speed, but another stream hiccups, exhausts its window, and causes the other stream to slow down (possibly to a halt), even though it's the same connection!
Obviously it cannot be a connection/communication issue, because one stream (the download one) works perfectly fine at max speed.
Therefore it is an application issue.
The HTTP/2 implementation on the server does not know that one of the streams is a movie that can be paused -- it's the application that must communicate this to the server and keep the connection window as large as possible.
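To illustrate the accounting, here is a toy sketch (not a real HTTP/2 implementation) of how every DATA frame consumes both its stream window and the shared connection window, so a stalled stream strands connection budget that the active stream can no longer use:

```typescript
// Toy window bookkeeping for the movie/download example above.
class Http2Windows {
  connectionWindow = 100;
  streamWindows = new Map<string, number>([
    ["movie", 100],
    ["download", 100],
  ]);

  // Sending DATA debits both the stream window and the connection window.
  send(streamId: string, bytes: number): boolean {
    const sw = this.streamWindows.get(streamId) ?? 0;
    if (bytes > sw || bytes > this.connectionWindow) return false; // blocked
    this.streamWindows.set(streamId, sw - bytes);
    this.connectionWindow -= bytes;
    return true;
  }

  // A WINDOW_UPDATE replenishes one window; a paused player that stops
  // reading simply never sends these for its stream, stranding whatever
  // connection window its unacknowledged data consumed.
  windowUpdate(streamId: string | null, increment: number): void {
    if (streamId === null) {
      this.connectionWindow += increment;
    } else {
      const sw = this.streamWindows.get(streamId) ?? 0;
      this.streamWindows.set(streamId, sw + increment);
    }
  }
}
```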
Introducing a new HTTP/2 frame to "pause" downloads (or changing the semantics of the existing frames to accommodate a "pause" command) would have complicated the protocol quite substantially, for a feature that is 100% application-driven: it is the application that must trigger the sending of the "pause" command, and at that point it can just as well send its own "pause" message to the server without complicating the HTTP/2 specification.
It is an interesting case where HTTP/1.1 and HTTP/2 behave very differently and require different code to work in a similar way.
With HTTP/1.1 you would have one connection for the movie and one for the download, they would be independent, and the client application would not need to communicate to the server that the movie was paused -- it could just stop reading from the movie connection until it became TCP congested without affecting the download connection -- assuming that the server is non-blocking to avoid scalability issues.

nginx-rtmp video stream seeking functionality

I am trying to build a video streaming service, using nginx-rtmp to read the stream and serve it as DASH/HLS media.
I used this link for live streaming media.
I used OBS to send the stream to nginx.
But it is a live stream, and the user can't seek within it.
So, is there a way for users to actually seek it?
Or is there a better way to stream video using DASH/HLS?
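For reference, the relevant part of my nginx config looks roughly like the sketch below (paths and timings are placeholders, not my exact values); as I understand it, hls_playlist_length bounds how far back a player can seek within the live window.

```nginx
rtmp {
    server {
        listen 1935;                 # OBS publishes here via RTMP
        application live {
            live on;
            hls on;                  # write HLS segments for playback
            hls_path /tmp/hls;       # illustrative segment directory
            hls_fragment 4s;
            hls_playlist_length 10m; # seekable window kept in the playlist
        }
    }
}
```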

How do I set up a live audio streaming http server?

I was hoping to build an application that streams audio (mp3, ogg, etc.) from my microphone to a web browser.
I think I can use the html5 audio tag to read/play the stream from my server.
The area I'm really stuck on is how to set up the streaming HTTP endpoint. What technologies will I need, and how should my server be structured to get the live audio from my mic and make it accessible from my server?
For example, for streaming mp3, do I constantly respond with mp3 frames as they are recorded?
Thanks for any help!
First off, let's split this problem up into a few parts. You have the audio capture (recording), the encoding/codec, the server, and the receiving clients.
Capture -> Codec -> Server -> Several Clients
For audio capture, you will need to use the Web Audio API along with getUserMedia. This will allow you to get 32-bit floating point PCM samples from the recording device. This data stream takes up a ton of bandwidth... a few megabits per second for a stereo stream. This stream is not directly playable in an HTML5 audio tag, and while you could play it on the receiving end with the Web Audio API, it takes up too much bandwidth to be useful. You need to use a codec to get the bandwidth usage down.
The codecs you want to look at include MP3, AAC (and its variants such as HE-AAC), and Opus. Not all browsers support all codecs. MP3 is the most widely compatible but AAC provides better quality for a given bitrate. Opus is a free and open codec but still doesn't have the greatest client adoption. In any case, there isn't yet a codec that you can run in-browser with any real stability. (Although it's being worked on! There are a lot of test projects made with Emscripten.) I solved this problem by reducing the bit depth of my samples to 16-bit signed integers and sending this PCM stream to a server, over a binary websocket, to do the encoding.
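Here is a minimal sketch of that capture-and-downconvert path, assuming a hypothetical WebSocket endpoint (ScriptProcessorNode is deprecated in favor of AudioWorklet, but it keeps the sketch short):

```typescript
// Capture mic audio as 32-bit float PCM, down-convert to 16-bit
// signed integers, and ship chunks to an encoding server over a
// binary WebSocket. The endpoint URL is a placeholder.
async function captureAndStream(): Promise<void> {
  const ws = new WebSocket("wss://example.com/pcm");
  ws.binaryType = "arraybuffer";

  const media = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(media);

  // 4096-sample chunks, mono in, mono out.
  const processor = ctx.createScriptProcessor(4096, 1, 1);
  processor.onaudioprocess = (e) => {
    const float32 = e.inputBuffer.getChannelData(0); // floats in [-1, 1]
    const int16 = new Int16Array(float32.length);    // half the bits, half the bandwidth
    for (let i = 0; i < float32.length; i++) {
      const s = Math.max(-1, Math.min(1, float32[i]));
      int16[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
    }
    if (ws.readyState === WebSocket.OPEN) ws.send(int16.buffer);
  };
  source.connect(processor);
  processor.connect(ctx.destination); // node must be connected to fire events
}
```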
This encoding server took the PCM stream and ran it through a codec server-side. Here you can use whatever you'd like, such as a licensed codec binary or a tool like FFmpeg which encapsulates multiple codecs.
Next, this server streamed the data to a real streaming media server like Icecast. SHOUTcast and Icecast servers take the encoded stream and relay it to many clients over an HTTP-like connection. (Icecast is HTTP compliant whereas SHOUTcast is close but not quite there, which can cause compatibility issues.)
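As a sketch of that encode-and-relay hop, assuming Node.js with the ws package and an ffmpeg binary on the PATH (host, port, mount point, and password are all placeholders):

```typescript
// Sketch of an encoding relay: 16-bit PCM arrives over a binary
// WebSocket, is piped into an ffmpeg child process that encodes to
// MP3, and the result is pushed to an Icecast mount point.
import { spawn } from "node:child_process";
import { WebSocketServer } from "ws";

const ffmpeg = spawn("ffmpeg", [
  "-f", "s16le",           // raw 16-bit signed little-endian PCM on stdin
  "-ar", "44100",          // must match the capture side's sample rate
  "-ac", "1",              // mono, matching the capture sketch above
  "-i", "pipe:0",
  "-acodec", "libmp3lame",
  "-b:a", "128k",
  "-content_type", "audio/mpeg",
  "-f", "mp3",
  "icecast://source:hackme@localhost:8000/live", // placeholder credentials
]);

const wss = new WebSocketServer({ port: 8080 });
wss.on("connection", (socket) => {
  socket.on("message", (chunk) => {
    ffmpeg.stdin.write(chunk as Buffer); // forward PCM to the encoder
  });
});
```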
Once you have your streaming server set up, it's as simple as referencing the stream URL in your <audio> tag.
Hopefully that gets you started. Depending on your needs, you might also look into WebRTC which does all of this for you but doesn't give you options for quality and also doesn't scale beyond a few users.

Synchronize sound stream with video stream

I'm working on a video/audio conference project and I have the following problem:
I record sound with DirectSound and send it over the network (multicast) every time the audio buffer is full (approx. 200 milliseconds of raw PCM).
Using the DirectX.Capture project (from CodeProject), I'm sending images over the network (multicast).
Do you have any ideas how to synchronize these two streams? On a LAN I have no problem with synchronization, but I think there will be problems over the internet because of differences in network speed between peers, routing, etc.
Thank you!

Scheduled Media Streaming

I have a video that needs to be delivered through streaming, but all viewers need to be synchronized at the same point in time regardless of when they started the video. If the video starts streaming at 7:00 and someone visits the page at 7:05, they should see the footage from 7:05 onwards.
Does Red5 or Flash Media Server or any other streaming server have a feature to handle this? Or is this something that needs to be handled by the player?
Regardless of how you load an active stream in Flash, it will start at the beginning of the file stream. For real-time streams, that is the moment the user joins the stream, since the file stream starts at that moment.
