Streaming from a webcam through Nginx to video.js without Flash

Has anyone managed to use ffmpeg to stream from a webcam and then serve the stream with nginx to a page running video.js? I can't believe this isn't possible without using Flash.
I'd like a pure HTML5 solution with no Flash, and I've tried using RTMP.
I can pick up the RTMP stream in VLC, and I've got the page with video.js working, but I can't work out how to link it all together. I'm doing this on a Raspberry Pi 3, so I've been using the hardware encoder:
ffmpeg -f video4linux2 -i /dev/video0 -c:v h264_omx -c:a copy -b:v 1500k rtmp://localhost/hls/movie
Here is the RTMP nginx setup; I compiled the nginx-rtmp-module (available on GitHub) into nginx:
rtmp {
    server {
        listen 1935;
        ping 30s;
        notify_method get;

        application hls {
            live on;
            # sample HLS
            hls on;
            hls_path /tmp/hls;
        }
    }
}
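For anyone in the same spot, the missing link is usually the HTTP side: the fragments the rtmp module writes to hls_path have to be served over plain HTTP, and the player then loads the .m3u8 playlist. A sketch, not the poster's config — the port and paths are assumptions matching the hls_path above:

```nginx
# Goes in nginx.conf alongside the rtmp {} block.
# Serves the fragments the rtmp hls module writes to /tmp/hls.
http {
    server {
        listen 8080;

        location /hls {
            root /tmp;   # "/hls" is appended, so files come from /tmp/hls
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            add_header Cache-Control no-cache;
        }
    }
}
```

video.js would then point at http://&lt;pi-address&gt;:8080/hls/movie.m3u8, where movie is the stream name from the ffmpeg RTMP URL above.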
Thanks

OK, I tried three methods, using Raspbian Stretch on a Pi+ over wireless with a Logitech C270 webcam, for use as a baby monitor:
RTMP
ffserver
motion
RTMP worked, but slowly, and it relies on Flash, so it was unacceptable.
I couldn't get ffserver to work at all, and it is being deprecated (see the ffserver deprecation notice).
Motion worked fine, with good resolution and frame rate as well as low latency.
I'm just adding this to save other people from trying the other solutions before hitting the one that worked for me.
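For reference, a minimal motion configuration for this kind of MJPEG baby-monitor stream might look like the sketch below. The directive names are from motion 4.x's stock config; the values are illustrative assumptions, not the poster's actual settings:

```
# /etc/motion/motion.conf (sketch)
stream_port 8081        # MJPEG stream served over HTTP on this port
stream_localhost off    # allow viewing from other machines on the LAN
framerate 15
width 1280
height 720
```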

Related

How to convert cctv footage to h265 for streaming and recording in nginx-rtmp module?

I am using the nginx-rtmp-module here to pull an RTSP stream from a camera on the local network and convert it to RTMP. This RTMP stream is then converted to HLS and made available for livestreaming. It is also being recorded in 5-minute segments. (All of this can be seen in the nginx.conf below.)
I want to change the video codec to h265 to save storage space, since each 5-minute video is ~230 MB; using OpenCV and Python I was able to get under 100 MB per 5-minute video with h265, so I know there is a lot of room for storage savings.
How can I change the codec of the stream to h265?
I have tried installing libx265-dev and setting -vcodec libx265, but ffmpeg tells me flv is an invalid container, and I'm getting nowhere with finding a valid container for streaming + recording.
My nginx.conf:
rtmp {
    server {
        listen 1935; # Listen on standard RTMP port

        application relay {
            live on;
            hls on;
            hls_path /tmp/hls;
            hls_fragment 15s;
            exec_static /usr/bin/ffmpeg -i rtsp://test:test#192.168.100.10:8554/fhd -vcodec copy -f flv rtmp://localhost:1935/relay/fhd;

            # record block
            record video;
            record_path /tmp/hls;
            record_unique on;
            record_interval 5m;
        }
    }
}
The RTMP protocol has no support for the H.265 codec, so there is no standard way to do this.
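A possible workaround (a sketch, untested): keep the live RTMP/FLV stream in H.264, and instead of the record block, run a side ffmpeg process that transcodes only the stored copy to H.265 using ffmpeg's segment muxer. The output path and encoder settings below are assumptions:

```nginx
application relay {
    live on;
    hls on;
    hls_path /tmp/hls;
    hls_fragment 15s;

    # Replaces the record block: transcode only the stored copy to
    # H.265 in 5-minute MP4 segments (the directory must exist).
    # The live RTMP/FLV stream itself must remain H.264.
    exec_static /usr/bin/ffmpeg -i rtmp://localhost:1935/relay/fhd
        -c:v libx265 -preset fast -c:a aac
        -f segment -segment_time 300 -reset_timestamps 1
        /tmp/recordings/fhd-%03d.mp4;
}
```

MP4 (or Matroska) accepts H.265, unlike FLV, which is why the recording path can use it while the streaming path cannot.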

nginx push rtmp stream to ffmpeg

On my Raspberry Pi with a camera module, I am trying to set up a web-based streaming platform. I want to preview the stream in my browser and use CGI scripts to start/stop broadcasting to YouTube (etc.).
This is how I did the streaming setup so far:
Nginx sets up an RTMP application webcam. This is where I send the camera and audio stream using ffmpeg. It publishes the stream as HLS for the web preview. It also pushes the stream to another application, source. That's where I want to (occasionally) hook up another ffmpeg process for broadcasting to YouTube (and other) RTMP endpoints.
I initiate the stream using ffmpeg like this:
ffmpeg -loglevel debug -f v4l2 -framerate 15 -video_size 1280x720 -input_format h264 -i /dev/video0 -f alsa -i hw:2 -codec:v copy -g 15 -codec:a aac -b:a 128k -ar 44100 -strict experimental -f flv "rtmp://localhost:1935/webcam/hhart"
So far everything works fine. I can preview the HLS stream using a video.js viewer on my website (also served by nginx).
Now I want to start another ffmpeg process for broadcasting to my youtube channel, hooked up to the source application like this:
ffmpeg -loglevel debug -f flv -listen 1 -i rtmp://localhost:1935/source/hhart -c copy 'rtmp://a.rtmp.youtube.com/live2/<KEY>'
(in the final setup, launching and killing this process will be done via CGI scripts)
This is what ffmpeg returns:
Opening an input file: rtmp://localhost:1935/source/hhart.
[flv # 0x2032480] Opening 'rtmp://localhost:1935/source/hhart' for reading
[rtmp # 0x2032a10] No default whitelist set
[tcp # 0x20330f0] No default whitelist set
and then... nothing happens. No stream arrives in YouTube Studio, but there are no error messages either.
Some other tests I did:
From the webcam application, pushing directly to the YouTube RTMP endpoint works! (But it's not what I want: I want the HLS stream to always be online, and the broadcasting to run only when I'm going live.)
Displaying the stream at rtmp://localhost:1935/source/hhart in VLC behaves like ffmpeg: there's no error message, and the progress bar keeps loading.
So I have the impression that something is going on, but no actual data is transmitted.
RTMP section in nginx.conf:
rtmp {
    server {
        listen 1935;
        chunk_size 4000;

        application webcam {
            live on;
            hls on;
            hls_path /Services/Webcam/HLSStream;
            hls_fragment 3;
            hls_playlist_length 60;
            #deny play all;
            push rtmp://localhost:1935/source/;
            #push rtmp://a.rtmp.youtube.com/live2/<KEY>;
        }

        application source {
            live on;
            record off;
        }
    }
}
Of course, I may be totally on the wrong track, so any suggestions how I can realize my requirements in a better way, are welcome!
OK, I recompiled nginx with --with-debug, and that led me to a solution.
Rather than pushing the stream to another application, I have to push it to an RTMP address on another port, where the second ffmpeg process can pick it up. It also seems better to use 127.0.0.1 instead of localhost.
Like this:
rtmp {
    server {
        listen 1935;
        chunk_size 4000;

        application webcam {
            live on;
            hls on;
            hls_path /Services/Webcam/HLSStream;
            hls_fragment 3;
            hls_playlist_length 60;
            #deny play all;
            push rtmp://127.0.0.1:1936/test/; # ADDED
            record off;
        }
    }
}
Launching the broadcast to YouTube:
ffmpeg -loglevel debug -f flv -listen 1 -i rtmp://127.0.0.1:1936/test/ -c copy -f flv 'rtmp://a.rtmp.youtube.com/live2/<KEY>'
Now my HLS stream is always online, and I can control broadcasting to YouTube by launching/killing the second ffmpeg process.

How to make Nginx HLS application pull RTMP on request

This is my situation: I want to use Nginx to serve streams over HTTP using HLS. The thing is, I want the HLS streams to be generated/started only when a request for them comes in.
I can successfully achieve this behavior with RTMP, as I have the following config file
application myApp {
    live on;
    exec_pull myScript.sh $app $name;
    exec_kill_signal term;
}
In my myScript.sh file I have something like this:
ffmpeg -i sourceLink.m3u8 -codec copy -bsf:a aac_adtstoasc -f flv rtmp://localhost/$APP/$NAME;
So that when I request rtmp://myAddress/myApp/someLink it works perfectly.
However, I want to stream HLS instead of RTMP. I have seen other solutions that do this by having one application start the RTMP pull and push it to the HLS application. What I want instead is: when I receive a request for an HLS stream, THEN the RTMP pull is started (which starts FFmpeg), and as soon as the requests for the stream stop, the RTMP stream stops being consumed and FFmpeg stops with it.
Here is my config file with what I have tried so far:
application myHLSApp {
    live on;
    pull rtmp://localhost:1935/myApp/someLink name=test static;

    # Turn on HLS
    hls on;
    hls_path /mnt/hls/;
    hls_fragment 3;
    hls_playlist_length 60;

    # disable consuming the stream from nginx as rtmp
    deny play all;
}
My main objective is to start the FFmpeg process in myScript.sh ONLY when it is needed, and then serve the incoming stream as HLS from my machine. Any ideas?
Thanks a lot for the help!

Nginx RTMP with Flask

I have followed the documentation/tutorial on how to set up the config file for RTMP streaming from here: https://www.nginx.com/blog/video-streaming-for-remote-learning-with-nginx/ and it is pretty straightforward. However, I am not sure how to have my Flask backend redirect the stream to an HLS/DASH video player embedded in an HTML template, which is sent in response to a client requesting a specific HTTP endpoint. The tutorial shows how to view the stream locally in VLC, but not how to embed it in an HTML file that gets sent to the client. How would I go about doing this? For reference, I am hosting my website on Heroku, set up with its Nginx buildpack from here, https://github.com/heroku/heroku-buildpack-nginx, and I am not sure whether I need Heroku to install additional dependencies to set up an RTMP server and listen for a stream.
Use the HLS protocol (HTTP Live Streaming).
Nginx serves HTTP perfectly well, so you just need to create and update the playlist and fragments of the HLS stream, and remove old fragments, which is what the nginx-rtmp HLS module does. It is located in the hls directory, but it is not built by default, since it requires the libavformat library from the ffmpeg package. To build nginx with HLS support, add this module explicitly during configuration:
./configure --add-module=/path/to/nginx-rtmp-module --add-module=/path/to/nginx-rtmp-module/hls
To generate HLS, just specify the following directives:
application myapp {
    live on;
    hls on;
    hls_path /tmp/hls;
    hls_fragment 5s;
}
And finally, in the http {} section, configure serving of everything related to HLS:
location /hls {
    root /tmp;
}
To show the stream in a browser, create an HTML page with content like this (example):
<video width="600" height="300" controls="1" autoplay="1" src="http://example.com/hls/mystream.m3u8"></video>
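One caveat: a bare &lt;video&gt; element pointed at an .m3u8 plays natively only in Safari; most other browsers need a JavaScript HLS player such as hls.js or video.js. A hedged sketch using hls.js (the stream URL is illustrative, matching the example above):

```html
<video id="player" width="600" height="300" controls autoplay></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@1"></script>
<script>
  var video = document.getElementById('player');
  var src = 'http://example.com/hls/mystream.m3u8';
  if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = src;            // native HLS support (Safari)
  } else if (Hls.isSupported()) {
    var hls = new Hls();        // MSE-based playback everywhere else
    hls.loadSource(src);
    hls.attachMedia(video);
  }
</script>
```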
Update 1:
You attached a link to an Nginx setup tutorial, so I'm referencing their "Compiling NGINX with the RTMP Module" step, with changes for the HLS module:
$ cd /path/to/build/dir
$ git clone https://github.com/arut/nginx-rtmp-module.git
$ git clone https://github.com/nginx/nginx.git
$ cd nginx
$ ./auto/configure --add-module=../nginx-rtmp-module --add-module=../nginx-rtmp-module/hls
$ make
$ sudo make install

Is it possible to serve HLS and MPEG-DASH both at a time

Actually, I'm kind of a newbie to the Nginx RTMP server. I have set up my nginx.conf file to accept both HLS and MPEG-DASH, but the problem is that I can only serve one of HLS or MPEG-DASH at once. So my question is: is it possible to serve both HLS and MPEG-DASH at the same time for two different videos? I'm streaming video from OBS Studio.
Here are the MPEG-DASH and HLS sections of my nginx.conf file:
rtmp {
    server {
        listen 1935; # Listen on standard RTMP port
        chunk_size 4000;

        application show {
            live on;
            # Turn on HLS
            hls on;
            hls_path /nginx/hls/;
            hls_fragment 3;
            hls_playlist_length 60;
            deny play all;
        }

        application dash {
            live on;
            dash on;
            dash_path /nginx/dash;
        }
    }
}
Thank you in advance.
Yes, that is possible; you can serve as many streams as you need. There are two options (as I have tried them):
1. Use OBS Studio for streaming, or
2. Use the FFmpeg command in the terminal.
I suggest using two terminals, or two windows of OBS Studio, to make it possible. But make sure your network is strong enough to support streaming two videos at once. Try it and let us know. You may also need some more configuration in the dash block; for more details refer to peer5.com.
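Note also that both outputs can come from a single application, so one incoming OBS stream yields HLS and DASH simultaneously without streaming twice. A sketch based on the config above:

```nginx
application show {
    live on;

    # HLS and DASH are generated from the same incoming RTMP stream
    hls on;
    hls_path /nginx/hls/;
    hls_fragment 3;
    hls_playlist_length 60;

    dash on;
    dash_path /nginx/dash;
}
```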
