HLS, Nginx, and Streaming Media

This is a process question more than anything.
I've been reading up on HTTP streaming (HLS) over the past week.
My goal is to be able to deliver content from my NGINX web server using HLS.
I have looked at Clappr with HLS.js as a player, but I'm unclear on what I need to do to deliver the content. Do I need a streaming media server, or just a web server?
I think I can use ffmpeg to make the HLS streams.
Eventually I'm hoping to be able to record incoming streams for processing later. Right now I just want to be able to put out HLS streams.
Any advice or infographic or something to put this in perspective would be appreciated.

Do I need a streaming media server, or just a web server?
One of the main purposes of HLS is that you can serve the data with any HTTP server: the stream is just a playlist plus a series of short media segment files. No special streaming server is needed; nginx is fine.
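To put the pieces in perspective: you segment once with ffmpeg, let nginx serve the resulting files, and attach a player in the browser. Below is a minimal playback sketch using hls.js in TypeScript; the playlist path /hls/stream.m3u8 is a hypothetical example, and the segments could be produced beforehand with something like ffmpeg -i input.mp4 -c copy -f hls -hls_time 6 stream.m3u8 (assuming H.264/AAC input).

// Minimal hls.js playback sketch (browser).
import Hls from 'hls.js';

const video = document.querySelector<HTMLVideoElement>('video')!;

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource('/hls/stream.m3u8'); // hypothetical path served by nginx
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari and iOS play HLS natively, no library needed.
  video.src = '/hls/stream.m3u8';
}

On the nginx side nothing special is required beyond correct MIME types: application/vnd.apple.mpegurl for .m3u8 playlists and video/mp2t for .ts segments.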

Related

Consume RTMP and distribute via WebSocket

I have a Linux PC which streams video (with audio) from a webcam to an RTMP server (nginx). The nginx RTMP server then converts the video into HLS, and that HLS stream is shown in browsers. Everything works well. The only problem is the delay introduced by the HLS protocol (10-20 seconds, depending on the HLS playlist size).
I am looking for an alternative to HLS which can run on most major browsers. I cannot use WebRTC due to the lack of audio, and I cannot use Flash due to its lack of support in mobile browsers. So my question is: is there any way to consume the RTMP stream, distribute it via WebSocket, and play it in modern WebSocket-capable browsers without any additional plugin? I am using ffmpeg to publish the RTMP stream from the Linux PC. If required, the source stream can easily be changed to another live streaming protocol such as RTSP. So if there is some other solution which solves this problem without RTMP, I can go for that too.
Thanks in advance.
Yes, this is possible, but there's an even simpler solution: just stream the data over HTTP.
WebSockets are only needed for bi-directional communication; here you're only sending video from the server to the client.
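As a sketch of that idea, assuming Node with TypeScript and ffmpeg on the PATH: remux the RTMP feed to fragmented MP4 and pipe it straight into the HTTP response. The RTMP URL and port are hypothetical, and spawning one ffmpeg per viewer is only sensible for a handful of clients; with more viewers you would fan out a single remuxed stream.

// Relay a live RTMP stream to browsers over plain HTTP.
import { createServer } from 'http';
import { spawn } from 'child_process';

createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/mp4' });
  // Remux (no re-encode) to fragmented MP4, which browsers can play
  // progressively from a non-seekable stream.
  const ffmpeg = spawn('ffmpeg', [
    '-i', 'rtmp://localhost/live/cam',          // hypothetical source
    '-c', 'copy',
    '-movflags', 'frag_keyframe+empty_moov',
    '-f', 'mp4', 'pipe:1',
  ]);
  ffmpeg.stdout.pipe(res);
  req.on('close', () => ffmpeg.kill('SIGKILL')); // stop when viewer leaves
}).listen(8080);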

How do I set up a live audio streaming http server?

I was hoping to build an application that streams audio (mp3, ogg, etc.) from my microphone to a web browser.
I think I can use the HTML5 audio tag to read/play the stream from my server.
The area I'm really stuck on is how to set up the streaming HTTP endpoint. What technologies will I need, and how should my server be structured so that the live audio from my mic is accessible from my server?
For example, for streaming MP3, do I constantly respond with MP3 frames as they are recorded?
Thanks for any help!
First off, let's split this problem up into a few parts. You have the audio capture (recording), the encoding/codec, the server, and the receiving clients.
Capture -> Codec -> Server -> Several Clients
For audio capture, you will need to use the Web Audio API along with getUserMedia. This will allow you to get 32-bit floating-point PCM samples from the recording device. This data stream takes up a ton of bandwidth... a few megabits per second for a stereo stream. It is not directly playable in an HTML5 audio tag, and while you could play it on the receiving end with the Web Audio API, it takes up too much bandwidth to be useful. You need a codec to get the bandwidth usage down.
The codecs you want to look at include MP3, AAC (and its variants such as HE-AAC), and Opus. Not all browsers support all codecs: MP3 is the most widely compatible, AAC provides better quality for a given bitrate, and Opus is a free and open codec but still doesn't have the greatest client adoption. In any case, there isn't yet a codec that you can run in-browser with any real stability. (Although it's being worked on! There are a lot of test projects built with Emscripten.) I solved this problem by reducing the bit depth of my samples to 16-bit signed integers and sending that PCM stream over a binary WebSocket to a server that did the encoding.
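For illustration, a sketch of that capture-and-ship step in TypeScript; the WebSocket endpoint ws://example.com/ingest is hypothetical, and ScriptProcessorNode is used (despite being deprecated in favour of AudioWorklet) to keep the sketch short.

// Capture mic audio, reduce 32-bit floats to 16-bit signed PCM,
// and send the result over a binary WebSocket.
async function startCapture(): Promise<void> {
  const ws = new WebSocket('ws://example.com/ingest');
  ws.binaryType = 'arraybuffer';

  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);

  const processor = ctx.createScriptProcessor(4096, 1, 1);
  processor.onaudioprocess = (e) => {
    const float32 = e.inputBuffer.getChannelData(0);
    const int16 = new Int16Array(float32.length);
    for (let i = 0; i < float32.length; i++) {
      // Clamp to [-1, 1], then scale to the 16-bit signed integer range.
      const s = Math.max(-1, Math.min(1, float32[i]));
      int16[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
    }
    if (ws.readyState === WebSocket.OPEN) ws.send(int16.buffer);
  };
  source.connect(processor);
  processor.connect(ctx.destination); // needed for onaudioprocess to fire
}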
This encoding server took the PCM stream and ran it through a codec server-side. Here you can use whatever you'd like, such as a licensed codec binary or a tool like FFmpeg, which wraps multiple codecs.
Next, this server streamed the data to a real streaming media server like Icecast. SHOUTcast and Icecast servers take the encoded stream and relay it to many clients over an HTTP-like connection. (Icecast is HTTP-compliant, whereas SHOUTcast is close but not quite there, which can cause compatibility issues.)
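A sketch of that middle hop, assuming Node with the ws package and ffmpeg doing the actual encoding; the Icecast credentials, mount point /live.mp3, and ports are all hypothetical, and the -ar/-ac values must match the capture side.

// Receive raw 16-bit PCM over a WebSocket, encode to MP3 with ffmpeg,
// and push the result to an Icecast mount via ffmpeg's icecast protocol.
import { WebSocketServer } from 'ws';
import { spawn } from 'child_process';

const ffmpeg = spawn('ffmpeg', [
  '-f', 's16le', '-ar', '44100', '-ac', '1', '-i', 'pipe:0',
  '-c:a', 'libmp3lame', '-b:a', '128k',
  '-f', 'mp3', 'icecast://source:hackme@localhost:8000/live.mp3',
]);

new WebSocketServer({ port: 9000 }).on('connection', (socket) => {
  // Forward each binary PCM message into the encoder's stdin.
  socket.on('message', (pcm) => ffmpeg.stdin.write(pcm as Buffer));
});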
Once you have your streaming server set up, it's as simple as referencing the stream URL in your <audio> tag.
Hopefully that gets you started. Depending on your needs, you might also look into WebRTC which does all of this for you but doesn't give you options for quality and also doesn't scale beyond a few users.

Proxy an RTMP stream

How could I proxy an RTMP stream?
I have two Raspberry Pis streaming live video from raspicams on my LAN. Each Raspberry Pi sends the video to ffmpeg, which wraps it in FLV and sends it to crtmpserver.
A third server, running nginx, has a static HTML page with two instances of jwplayer, each pointing to one Raspberry Pi.
The setup is just like this one.
The web server uses authentication, and I'd like the streams not to be public either.
I'm thinking of trying nginx-rtmp-module, but I am not sure whether it would help me. Also, it seems dormant and has many open issues.
I'm open to suggestions, thanks in advance!
You can use MonaServer with this client script (copy it into MonaServer's www/ directory), which listens on UDP port 6666 and waits for an FLV stream to publish under the name "file".
Then you should already be able to play your stream with jwplayer (at the address rtmp:///file) or with any other player. MonaServer supports the HTTP protocol, so you can host your HTML page without nginx if you want.
Now, if you want to filter the subscriptions to "file", you need to write a client:onSubscribe function in your main.lua script, like this:
function onConnection(client)
  INFO("Connection from ", client.address)

  -- Called whenever this client tries to subscribe to a publication.
  function client:onSubscribe(listener)
    INFO("Subscribing to ", listener.publication.name, "...")
    -- Raising an error here rejects the subscription.
    if not client.right then
      error("no rights to play it")
    end
  end
end
(Here you need to replace the "not client.right" check with your own authentication logic.)
Going further, you could use another Flash video client that supports RTMFP in order to handle a large number of clients. Contact me (jammetthomas AT gmail.com) for more information.

Understanding REST: REST as a high volume transport?

I'm designing a system that will need to move multi-GB backup images over TCP, and I'm looking at REST as an alternative to ONC RPC.
For example, I might have
POST http://site/backups/image1
where image1 is a 50 GB file whose data is contained in the HTTP body.
My question: is this within the scope of what REST is meant for? Is it inappropriate to move massive files over HTTP? My preliminary testing shows that the performance isn't too bad, and I like the clean, debuggable protocol, as opposed to a custom ONC RPC server. But is this overloading the role of a web server?
Thanks,
-Steve
HTTP has about the same overheads as FTP.
An HTTP server is often asked to do more work than an FTP server. But otherwise, using HTTP to send a large file is about the same as using FTP.
The only consideration is making sure your web server and web application framework are configured to stream the request rather than needlessly buffering the entire 50 GB file inside Apache.
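The same idea in Node terms, as a sketch with hypothetical paths: pipe the request body straight to disk so memory stays flat regardless of body size.

// Accept a multi-GB upload without buffering it in memory.
import { createServer } from 'http';
import { createWriteStream } from 'fs';
import { pipeline } from 'stream';

createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/backups/image1') {
    // Stream the body to disk; backpressure is handled automatically.
    pipeline(req, createWriteStream('/var/backups/image1'), (err) => {
      res.statusCode = err ? 500 : 201;
      res.end();
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8080);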
Steve,
HTTP has a look-before-you-leap 'feature' that allows the client to ask the server whether it will accept the data submission before actually sending the data. I'd look into using this to avoid transferring gigabytes of data only to find out that the server is currently not willing to handle them. Look at the HTTP Expect header and the 100 Continue status code.
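For example, here is a client-side sketch of that handshake using Node's http module (host and path taken from the example above; the local file path is hypothetical). Node emits a 'continue' event when the server answers 100 Continue, and only then is the body sent.

// Ask before sending: POST with Expect: 100-continue.
import { request } from 'http';
import { createReadStream } from 'fs';

const req = request({
  method: 'POST',
  host: 'site',
  path: '/backups/image1',
  headers: { Expect: '100-continue' },
});

req.on('continue', () => {
  // Server said 100 Continue: now stream the 50 GB body.
  createReadStream('/var/backups/image1').pipe(req);
});

req.on('response', (res) => {
  console.log('server answered', res.statusCode);
});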
Also, you can use FTP within a RESTful approach; in other words, think along the lines of
<backup-store href="ftp://example.org/site/backup/images/"/>
and make your clients understand the ftp URI scheme.
Finally, the T in HTTP means transfer and not transport - an important distinction to make, because the former is an application semantic (HTTP is an application protocol) and the latter is not.
HTH,
Jan
REST has nothing to do with how large your data is or which method you use to transport it.

HTTP streaming

Is HTTP streaming possible without using any streaming servers?
Of course. You can output and flush; the data reaches the client before you end the script, so it is streaming.
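For instance, in Node with TypeScript (any server-side stack with flushable output behaves the same way): because no Content-Length is set, the response goes out with chunked transfer encoding and each write reaches the client immediately.

// A plain HTTP response delivered piece by piece.
import { createServer } from 'http';

createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  let n = 0;
  const timer = setInterval(() => {
    res.write(`chunk ${n++}\n`); // flushed to the client right away
    if (n === 10) {
      clearInterval(timer);
      res.end();
    }
  }, 1000);
  req.on('close', () => clearInterval(timer)); // client went away
}).listen(8080);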
For live streaming, only segmented approaches like Apple HLS are viable; other variants of segmented streaming (like OSMF) are not widely supported at the moment.
Microsoft's IIS can also do Smooth Streaming (and Apple HLS as well).
Apple HLS can be served from any web server if you pre-segment the stream into chunks and simply upload them to a path on the web server.
For VoD streaming, there are lots of modules for all major web servers.
Yes, although libraries have varying levels of support. What needs to be used is HTTP chunked transfer encoding, so that the library does not try to buffer the whole request/response in memory (to compute the Content-Length header) and instead indicates that the content arrives in chunks.
Yes, not only is it possible, it has been implemented by various media server companies; the main reason they still sell dedicated servers is commercial. Basically, the content you want to stream is divided into chunks/packets, and the client machine can then request those chunks via simple HTTP GET requests.
Well, if you have WebSockets available, you can get quite low-latency streaming for low-FPS scenarios by sending video frames as JPEGs.
You can also send audio separately and play it using the Web Audio API in the browser. I imagine it could work for scenarios where you do not require perfect audio-video sync.
Another approach is to stream MPEG chunks through WebSockets, decode them in JavaScript using jsmpeg, and render them to a canvas. You can find more here (video-only):
http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets
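A sketch of the receiving side of the JPEG-frames idea in TypeScript; ws://example.com/frames is a hypothetical endpoint where each binary message is one JPEG-encoded frame.

// Decode incoming JPEG frames and draw them onto a canvas.
const canvas = document.querySelector<HTMLCanvasElement>('canvas')!;
const ctx2d = canvas.getContext('2d')!;

const ws = new WebSocket('ws://example.com/frames');
ws.binaryType = 'blob';

ws.onmessage = async (ev) => {
  const frame = await createImageBitmap(ev.data as Blob);
  ctx2d.drawImage(frame, 0, 0, canvas.width, canvas.height);
  frame.close(); // release the decoded bitmap
};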
Yes: another answer to HTTP streaming is MPEG-DASH.
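As a quick sketch of what that looks like on the client with the dash.js reference player (the manifest URL /dash/stream.mpd is hypothetical; the MPD and segments can be produced with ffmpeg's dash muxer or a dedicated packager):

// Attach a DASH manifest to a <video> element with dash.js.
import * as dashjs from 'dashjs';

const video = document.querySelector<HTMLVideoElement>('video')!;
const player = dashjs.MediaPlayer().create();
player.initialize(video, '/dash/stream.mpd', /* autoplay */ true);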
