HTTP streaming

Is HTTP streaming possible without using any streaming servers?

Of course. You can output data and flush it; it reaches the client before the script ends, so it is streaming.

For live streaming, only segmented delivery like Apple HLS works this way; other variants of segmented streaming (like OSMF) are not widely supported at the moment.
Microsoft's IIS can also do Smooth Streaming (and Apple HLS as well).
Apple HLS can be served from any web server if you pre-segment the stream into chunks and simply upload them to a path on the server.
For VoD streaming, there are lots of modules for all major web servers.

Yes, although libraries have varying levels of support. What you need is HTTP chunked transfer encoding, so that the library does not try to buffer the whole request/response in memory (to compute the Content-Length header) and instead indicates that the content arrives in chunks.
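For illustration, here is a minimal chunked-streaming sketch in TypeScript on Node.js (the port, path, and payload are made up): because no Content-Length is set, Node responds with Transfer-Encoding: chunked, and each write() reaches the client as soon as it is produced.

// Minimal chunked-streaming sketch (Node.js). Since we never set
// Content-Length, Node uses Transfer-Encoding: chunked automatically.
import { createServer } from "node:http";

createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  let n = 0;
  const timer = setInterval(() => {
    res.write(`chunk ${n}\n`); // sent to the client immediately
    if (++n === 10) {
      clearInterval(timer);
      res.end(); // terminates the chunked response
    }
  }, 1000);
}).listen(8080);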

Yes, not only is it possible, it has been implemented by various media server companies; the main reason they still use dedicated servers is commercial. Basically, the content you want to stream is divided into chunks/packets, and the client machine can then request those chunks via simple HTTP GET requests.
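As a sketch of the client side (the URL and byte range are placeholders, and this assumes an ES module context for top-level await), requesting one chunk with an HTTP Range GET looks like this in TypeScript:

// Fetch a single chunk of a media file via an HTTP Range request.
const response = await fetch("https://example.com/video.mp4", {
  headers: { Range: "bytes=0-1048575" }, // ask for the first 1 MiB
});
if (response.status === 206) { // 206 Partial Content: range was honoured
  const chunk = await response.arrayBuffer();
  console.log(`received ${chunk.byteLength} bytes`);
}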

Well, if you have WebSockets available, you can actually get quite low-latency streaming in low-FPS scenarios by sending video frames as JPEGs.
You can also send the audio separately and play it using Web Audio in the browser. I imagine it could work for scenarios where you do not require perfect audio-video sync.
Another approach is to stream MPEG chunks through WebSockets, decode them in JS using jsmpeg, and render to a canvas. You can find more here (video-only):
http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets
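To make the JPEG-over-WebSocket idea concrete, here is a minimal receiving-side sketch in TypeScript (the WebSocket URL and canvas id are assumptions; the server pushing one JPEG per binary message is not shown):

// Draw JPEG frames arriving over a WebSocket onto a canvas.
const canvas = document.getElementById("player") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;
const ws = new WebSocket("ws://localhost:9000/frames");
ws.binaryType = "blob"; // each message is one JPEG-encoded frame

ws.onmessage = async (event: MessageEvent<Blob>) => {
  const frame = await createImageBitmap(event.data); // decode the JPEG
  ctx.drawImage(frame, 0, 0, canvas.width, canvas.height);
  frame.close(); // release the decoded bitmap
};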

Yes, the answer to your problem with HTTP streaming is MPEG-DASH.
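For a rough idea of the client side, here is a minimal sketch using the dash.js reference player (this assumes the dashjs package is bundled, and the manifest URL is a placeholder):

import dashjs from "dashjs";

// Attach the dash.js player to a <video> element and point it at an
// MPEG-DASH manifest (.mpd); the player then fetches segments over HTTP.
const video = document.querySelector("video") as HTMLVideoElement;
const player = dashjs.MediaPlayer().create();
player.initialize(video, "https://example.com/stream/manifest.mpd", true);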

Related

Consume RTMP and distribute via WebSocket

I have a Linux PC which streams video (with audio) from a webcam to an RTMP server (nginx). The nginx RTMP server then converts the video into HLS, and that HLS stream is shown in browsers. Everything works well. The only problem is the delay introduced by the HLS protocol (10-20 seconds, depending on the HLS playlist size).
I am looking for an alternative to HLS which can run on most major browsers. I cannot use WebRTC due to the lack of audio, and I cannot use Flash due to the lack of support in mobile browsers. So my question is: is there any way to consume the RTMP stream, then distribute it via WebSocket and play it in modern WebSocket-supporting browsers without any additional plugin? I am using ffmpeg to publish the RTMP stream from the Linux PC. If required, the source stream can easily be changed to another live streaming protocol such as RTSP. So if there's some other solution which can solve this problem without RTMP, I can go for that too.
Thanks in advance.
Yes, this is possible, but there's an even simpler solution: just stream the data over HTTP.
WebSockets are only needed for bi-directional communication. You're just sending the video to the client.
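A minimal sketch of that idea in TypeScript on Node.js (assuming ffmpeg is installed; the RTMP URL, port, and options are illustrative): pull the RTMP feed with ffmpeg, remux it to fragmented MP4 without re-encoding, and pipe ffmpeg's stdout straight into the HTTP response.

// Remux an RTMP feed to fragmented MP4 and stream it over plain HTTP.
import { createServer } from "node:http";
import { spawn } from "node:child_process";

createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "video/mp4" });
  const ffmpeg = spawn("ffmpeg", [
    "-i", "rtmp://localhost/live/stream",      // feed published by the Linux PC
    "-c", "copy",                              // remux only, no re-encoding
    "-f", "mp4",
    "-movflags", "frag_keyframe+empty_moov",   // fragmented MP4 is streamable
    "pipe:1",                                  // write output to stdout
  ]);
  ffmpeg.stdout.pipe(res);                     // becomes a chunked HTTP response
  req.on("close", () => ffmpeg.kill("SIGKILL")); // stop ffmpeg when client leaves
}).listen(8081);

A <video> element pointed at this endpoint should then play the stream, codec support permitting.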

HLS, Nginx, and Streaming Media

This is a process question more than anything.
I've been reading up on HTTP streaming (HLS) over the past week.
My goal is to be able to deliver content from my NGINX web server using HLS.
I have looked at Clappr with hls.js as a player; however, I'm just unclear on what I need to do to deliver the content. Do I need a streaming media server, or just a web server?
I think I can use ffmpeg to make the HLS streams.
Eventually I'm hoping to be able to record incoming streams for processing later. Right now I just want to be able to put out HLS streams.
Any advice or infographic or something to put this in perspective would be appreciated.
Do I need a streaming media server? Just a web server?
One of the main purposes of HLS is that you can serve the data with any HTTP server: the content is effectively a playlist plus a set of small media files (chunks). No special streaming server is needed. Nginx is fine.
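Since the question mentions ffmpeg, here is a hedged sketch that produces an HLS playlist plus segments as plain files (input path, segment length, and output directory are assumptions); it is invoked from TypeScript for consistency, though the same ffmpeg command works directly in a shell:

// Generate an HLS playlist (.m3u8) and media segments with ffmpeg.
import { spawn } from "node:child_process";

const ffmpeg = spawn("ffmpeg", [
  "-i", "input.mp4",            // source file (could also be a live input)
  "-c:v", "h264", "-c:a", "aac",
  "-f", "hls",
  "-hls_time", "6",             // target segment duration in seconds
  "-hls_list_size", "0",        // keep all segments in the playlist (VoD)
  "/var/www/html/stream/index.m3u8",
], { stdio: "inherit" });

ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));

Nginx then serves the output directory as ordinary static files, and a player such as Clappr or hls.js loads index.m3u8.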

How do I set up a live audio streaming http server?

I was hoping to build an application that streams audio (MP3, Ogg, etc.) from my microphone to a web browser.
I think I can use the HTML5 audio tag to read/play the stream from my server.
The area I'm really stuck on is how to set up the streaming HTTP endpoint. What technologies will I need, and how should my server be structured to get the live audio from my mic and make it accessible from my server?
For example, for streaming MP3, do I constantly respond with MP3 frames as they are recorded?
Thanks for any help!
First off, let's split this problem up into a few parts. You have the audio capture (recording), the encoding/codec, the server, and the receiving clients.
Capture -> Codec -> Server -> Several Clients
For audio capture, you will need to use the Web Audio API along with getUserMedia. This will allow you to get 32-bit floating point PCM samples from the recording device. This data stream takes up a ton of bandwidth... a few megabits per second for a stereo stream. This stream is not directly playable in an HTML5 audio tag, and while you could play it on the receiving end with the Web Audio API, it takes up too much bandwidth to be useful. You need to use a codec to get the bandwidth usage down.
The codecs you want to look at include MP3, AAC (and its variants such as HE-AAC), and Opus. Not all browsers support all codecs. MP3 is the most widely compatible, but AAC provides better quality for a given bitrate. Opus is a free and open codec but still doesn't have the greatest client adoption. In any case, there isn't yet a codec that you can run in-browser with any real stability. (Although it's being worked on! There are a lot of test projects made with Emscripten.) I solved this problem by reducing the bit depth of my samples to 16-bit signed integers and sending this PCM stream to a server, over a binary WebSocket, to do the encoding there.
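A minimal sketch of that capture path in TypeScript (the WebSocket URL is a placeholder; it assumes an ES module context for top-level await, and ScriptProcessorNode is deprecated in favour of AudioWorklet but keeps the sketch short):

// Capture mic audio, down-convert 32-bit float PCM to 16-bit signed
// integers, and send it to an encoding server over a binary WebSocket.
const ws = new WebSocket("ws://localhost:8090/pcm");
ws.binaryType = "arraybuffer";

const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const audioCtx = new AudioContext();
const source = audioCtx.createMediaStreamSource(stream);
const tap = audioCtx.createScriptProcessor(4096, 1, 1);

tap.onaudioprocess = (e: AudioProcessingEvent) => {
  const float32 = e.inputBuffer.getChannelData(0); // samples in [-1, 1]
  const int16 = new Int16Array(float32.length);
  for (let i = 0; i < float32.length; i++) {
    const s = Math.max(-1, Math.min(1, float32[i]));
    int16[i] = s < 0 ? s * 0x8000 : s * 0x7fff;    // scale to 16-bit range
  }
  if (ws.readyState === WebSocket.OPEN) ws.send(int16.buffer);
};

source.connect(tap);
tap.connect(audioCtx.destination); // ScriptProcessor must be wired up to fire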
This encoding server took the PCM stream and ran it through a codec server-side. Here you can use whatever you'd like, such as a licensed codec binary or a tool like FFmpeg, which encapsulates multiple codecs.
Next, this server streamed the data to a real streaming media server like Icecast. SHOUTcast and Icecast servers take the encoded stream and relay it to many clients over an HTTP-like connection. (Icecast is HTTP-compliant, whereas SHOUTcast is close but not quite there, which can cause compatibility issues.)
Once you have your streaming server set up, it's as simple as referencing the stream URL in your <audio> tag.
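In code (the mount URL is a placeholder), that is simply:

// Point an HTML5 audio element at the Icecast/SHOUTcast mount and play it.
const player = new Audio("http://localhost:8000/live.mp3");
player.play();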
Hopefully that gets you started. Depending on your needs, you might also look into WebRTC, which does all of this for you but doesn't give you options for quality and also doesn't scale beyond a few users.

Does YouTube stream videos via TCP?

I just sniffed some traffic using Wireshark and noticed that the YouTube traffic relies on TCP. I thought they were using UDP? But it seems as if they are using HTTP octet streams. Is YouTube really using TCP for streams, or am I missing something?
Because they need everything TCP provides (slow start, transmit pacing, exponential backoff, receive windows, reordering, duplicate rejection, and so on), they would either have to use TCP or try to do all those things themselves. There's no way they could do that better than each operating system's optimized TCP implementation.
Obviously, Google is currently experimenting with its own protocol implementations, such as QUIC (Quick UDP Internet Connections), as one can see when examining the HTTP response:
HTTP/1.1 200 OK
...
Content-Type: video/mp4
Alternate-Protocol: 80:quic
...
However, currently, they seem to rely on TCP, just like David mentioned before.
From http://www.crazyengineers.com/threads/youtube-use-tcp-or-udp.38419/:
...of course the YouTube page uses HTTP [which is over TCP]. The real work does not happen via the HTTP page but through the Flash object embedded in that page. The Flash object that appears on YouTube is the Flash video player, which acts as an iframe (technically an incorrect term) for the content that is called for streaming via the Flash object. For storing media content, YouTube has installed a media server whose content gets called when you press the play button.
For streaming media to the Flash player, the Real Time Streaming Protocol (RTSP) is used. The play button on the Flash player acts as an RTSP invoker for the media being called, and the media is streamed via UDP packets. In fact, you don't need to navigate away from the page, because the embedded object, not the HTTP page, calls for the video; but since the object is embedded in the HTTP page, once you close the page the object is closed as well.

Understanding REST: REST as a high volume transport?

I'm designing a system that will need to move multi-GB backup images over TCP, and I'm looking at REST as an alternative to ONC RPC.
For example, I might have
POST http://site/backups/image1
where image1 is a 50 GB file whose data is contained in the HTTP body.
My question: is this within the scope of what REST is meant for? Is it inappropriate to move massive files over HTTP? My preliminary testing shows that the performance isn't too bad, and I like the clean, debuggable protocol, as opposed to a custom ONC RPC server. But is this overloading the role of a web server?
Thanks,
-Steve
HTTP has about the same overhead as FTP.
An HTTP server is often asked to do more work than an FTP server. But otherwise, using HTTP to send a large file is about the same as using FTP.
The only consideration is making sure your web server and web application framework are configured to do this kind of thing without needlessly buffering the entire 50 GB file inside Apache.
Steve,
HTTP has a look-before-you-leap 'feature' that allows the client to ask the server whether it will accept the data submission before it actually sends out the data. I'd look into using this to avoid transferring GBs of data only to find out that the server is currently not willing to handle them. Look at the HTTP Expect header and the 100 Continue status code.
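A hedged sketch of that handshake in TypeScript on Node.js (host, path, and file name are placeholders): the body is streamed from disk, not buffered, and only after the server answers 100 Continue.

// POST a large file using Expect: 100-continue, streaming from disk.
import { request } from "node:http";
import { createReadStream } from "node:fs";

const req = request({
  method: "POST",
  host: "site",
  path: "/backups/image1",
  headers: {
    "Expect": "100-continue",
    "Content-Type": "application/octet-stream",
  },
});

// Node emits 'continue' when the server replies with 100 Continue.
req.on("continue", () => {
  createReadStream("image1.img").pipe(req); // stream without buffering it all
});

req.on("response", (res) => console.log(`server replied ${res.statusCode}`));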
Also, you can use FTP within a RESTful approach; IOW, think along the lines of
<backup-store href="ftp://example.org/site/backup/images/"/>
and make your clients understand the ftp URI scheme.
Finally, the T in HTTP means transfer, not transport; an important distinction to make, because the former is an application semantic (HTTP is an application protocol) and the latter is not.
HTH,
Jan
REST has nothing to do with how large your data is or which method you use to transport it.
