RTMP Nginx exec_push with alaw and h264

I have two RTMP servers: one NGINX with the RTMP module, and a second one that can only consume RTMP with H264 video and ALAW audio, which will receive video from the NGINX one.
I successfully pushed my camera image to the second server using GStreamer:
gst-launch-1.0 v4l2src device="/dev/video0" ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 ! flvmux name=flvmux ! rtmpsink location=rtmp://<second_server_ip>:1935/publish/foobar audiotestsrc ! alawenc ! flvmux.
Now, I need to send a video from the NGINX server to the second server, but it needs to have H264 video encoding and ALAW audio encoding.
I tried one more intermediate step, where I streamed to the NGINX server and ran
gst-launch-1.0 rtmpsrc location=rtmp://<nginx_ip>:1935/live/100 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 ! flvmux name=flvmux ! rtmpsink location=rtmp://<second_server_ip>:1935/publish/foobar audiotestsrc ! alawenc ! flvmux.
but I got
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstRTMPSrc:rtmpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTMPSrc:rtmpsrc0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
What am I missing in the GStreamer command? And how can I add it to the NGINX application to automatically push the streams received? I'm ok with doing it with FFmpeg if I need to.
Thank you
EDIT:
My first server will be fed with a stream from a mobile app. This server (NGINX) is configured like this:
application live {
    live on;
    interleave on;
}
This server should push to a second server (written in Go), which will convert RTMP to WebRTC so I can play the stream in a browser. As I previously stated, I have successfully streamed RTMP to the second server with GStreamer, but I want the first server to push to the second automatically, and it's that push that I haven't been able to do yet.
EDIT 2:
I have checked with VLC and the video encoding is already H264, so no problem there. But I do need ALAW instead of MPEG AAC audio; how can I convert it on the NGINX server when pushing to the second server?
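One way to do the automatic push is nginx-rtmp's exec_push directive driving an FFmpeg transcode: the video is copied (it is already H264) and the audio is re-encoded to ALAW. A minimal sketch, assuming FFmpeg lives at /usr/bin/ffmpeg and that the second server accepts G.711 A-law in FLV (the exact flags may need tuning):

application live {
    live on;
    interleave on;
    # Pull the incoming stream back from this application, copy the H264
    # video, re-encode the audio to ALAW, and push to the second server.
    # $name is the stream name of the incoming publish; adjust the ffmpeg
    # path and destination application to your setup.
    exec_push /usr/bin/ffmpeg -i rtmp://127.0.0.1:1935/live/$name
        -c:v copy -c:a pcm_alaw -ar 8000 -ac 1
        -f flv rtmp://<second_server_ip>:1935/publish/$name;
}

As for the GStreamer error: rtmpsrc outputs a muxed FLV stream, so the pipeline would need flvdemux and a decoder before videoconvert; re-muxing with FFmpeg as above sidesteps that.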

Related

Error while trying to send logs with rsyslog without local storage

I'm trying to send logs into datadog using rsyslog. Ideally, I'm trying to do this without having the logs stored on the server hosting rsyslog. I've run into an error in my config that I haven't been able to find out much about. The error occurs on startup of rsyslog.
omfwd: could not get addrinfo for hostname '(null)':'(null)': Name or service not known [v8.2001.0 try https://www.rsyslog.com/e/2007 ]
Here's the portion I've added to the default rsyslog.conf:
module(load="imudp")
input(type="imudp" port="514" ruleset="datadog")
ruleset(name="datadog"){
action(
type="omfwd"
action.resumeRetryCount="-1"
queue.type="linkedList"
queue.saveOnShutdown="on"
queue.maxDiskSpace="1g"
queue.fileName="fwdRule1"
)
$template DatadogFormat,"00000000000000000 <%pri%>%protocol-version% %timestamp:::date-rfc3339% %HOSTNAME% %app-name% - - - %msg%\n "
$DefaultNetstreamDriverCAFile /etc/ssl/certs/ca-certificates.crt
$ActionSendStreamDriver gtls
$ActionSendStreamDriverMode 1
$ActionSendStreamDriverAuthMode x509/name
$ActionSendStreamDriverPermittedPeer *.logs.datadoghq.com
*.* ##intake.logs.datadoghq.com:10516;DatadogFormat
}
First things first.
The imudp module enables log reception over UDP.
The omfwd module enables log forwarding over TCP, UDP, etc.
So most probably, or at least as far as I can tell, with rsyslog you just want to log messages locally and then send them to Datadog.
I don't know anything about the $ActionSendStreamDriver directives, so I can't help you there. But what jumps out is that your action doesn't define where the logs should be sent.
ruleset(name="datadog"){
action(
type="omfwd"
target="10.100.1.1"
port="514"
protocol="udp"
...
)
...
}
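If the goal is to forward straight to Datadog over TLS, the legacy $ActionSendStreamDriver settings can also be expressed as action() parameters, which keeps everything in one place. A sketch, assuming rsyslog v8 with the gtls driver available (parameter names are from the omfwd documentation; verify the intake host, port, and the API-key prefix in the template against Datadog's current docs — the run of zeros is the placeholder from the original config):

# CA bundle for the TLS connection to the intake endpoint.
global(DefaultNetstreamDriverCAFile="/etc/ssl/certs/ca-certificates.crt")

# Datadog line format; the leading zeros stand in for the API key.
template(name="DatadogFormat" type="string"
         string="00000000000000000 <%pri%>%protocol-version% %timestamp:::date-rfc3339% %HOSTNAME% %app-name% - - - %msg%\n")

ruleset(name="datadog"){
    action(
        type="omfwd"
        target="intake.logs.datadoghq.com"
        port="10516"
        protocol="tcp"
        template="DatadogFormat"
        StreamDriver="gtls"
        StreamDriverMode="1"
        StreamDriverAuthMode="x509/name"
        StreamDriverPermittedPeers="*.logs.datadoghq.com"
        action.resumeRetryCount="-1"
        queue.type="linkedList"
        queue.saveOnShutdown="on"
        queue.maxDiskSpace="1g"
        queue.fileName="fwdRule1"
    )
}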

nginx push rtmp stream to ffmpeg

On my Raspberry Pi with camera module, I'm trying to set up a web-based streaming platform. I want to preview the stream in my browser and use CGI scripts to start/stop broadcasting to YouTube (and other services).
This is how I did the streaming setup so far:
Nginx runs an RTMP application, webcam. This is where I'll send the camera and audio stream using ffmpeg. It publishes the stream as HLS for the web preview, and it also pushes the stream to another application, source. That's where I want to (occasionally) hook up another ffmpeg process for broadcasting to YouTube (or other) RTMP endpoints.
I initiate the stream using ffmpeg like this:
ffmpeg -loglevel debug -f v4l2 -framerate 15 -video_size 1280x720 -input_format h264 -i /dev/video0 -f alsa -i hw:2 -codec:v copy -g 15 -codec:a aac -b:a 128k -ar 44100 -strict experimental -f flv "rtmp://localhost:1935/webcam/hhart"
So far everything works fine. I can preview the HLS stream using a video.js viewer on my website (also served by nginx).
Now I want to start another ffmpeg process for broadcasting to my youtube channel, hooked up to the source application like this:
ffmpeg -loglevel debug -f flv -listen 1 -i rtmp://localhost:1935/source/hhart -c copy 'rtmp://a.rtmp.youtube.com/live2/<KEY>'
(in the final setup, launching and killing this process will be done via CGI scripts)
This is what ffmpeg returns:
Opening an input file: rtmp://localhost:1935/source/hhart.
[flv # 0x2032480] Opening 'rtmp://localhost:1935/source/hhart' for reading
[rtmp # 0x2032a10] No default whitelist set
[tcp # 0x20330f0] No default whitelist set
and then... nothing happens. There's no stream coming in at Youtube studio, but there are no error messages either.
Some other tests I did:
from the webcam application, push directly to the YouTube RTMP endpoint => that works! (but it's not what I want, because I want the HLS stream to be online always, and the broadcasting active only when I'm going live.)
from VLC, display the stream at rtmp://localhost:1935/source/hhart => similar to ffmpeg: there's no error message, but the progress bar keeps loading.
So I have the impression that there is something going on, but there's no actual data transmitted.
RTMP section in nginx.conf:
rtmp {
    server {
        listen 1935;
        chunk_size 4000;

        application webcam {
            live on;
            hls on;
            hls_path /Services/Webcam/HLSStream;
            hls_fragment 3;
            hls_playlist_length 60;
            #deny play all;
            push rtmp://localhost:1935/source/;
            #push rtmp://a.rtmp.youtube.com/live2/<KEY>;
        }

        application source {
            live on;
            record off;
        }
    }
}
Of course, I may be totally on the wrong track, so any suggestions on how I can realize my requirements in a better way are welcome!
OK, I recompiled nginx with --with-debug and that got me to a solution.
Rather than pushing the stream to another application on the same server, I have to push it to an RTMP address on another port, where the second ffmpeg process can pick it up. It also seems better to use 127.0.0.1 instead of localhost.
Like this:
rtmp {
    server {
        listen 1935;
        chunk_size 4000;

        application webcam {
            live on;
            hls on;
            hls_path /Services/Webcam/HLSStream;
            hls_fragment 3;
            hls_playlist_length 60;
            #deny play all;
            push rtmp://127.0.0.1:1936/test/; # ADDED
            record off;
        }
    }
}
Launching the broadcast to YouTube:
ffmpeg -loglevel debug -f flv -listen 1 -i rtmp://127.0.0.1:1936/test/ -c copy -f flv 'rtmp://a.rtmp.youtube.com/live2/<KEY>'
Now my HLS stream is always online, and I can control broadcasting to youtube by launching/killing the second ffmpeg process.
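An alternative worth noting (a sketch, not part of the original answer): since nginx-rtmp also serves players, the broadcast process can simply pull from the webcam application on demand instead of having nginx push to a listener:

# Pull the live stream from nginx-rtmp and relay it to YouTube unchanged.
ffmpeg -i rtmp://127.0.0.1:1935/webcam/hhart -c copy -f flv 'rtmp://a.rtmp.youtube.com/live2/<KEY>'

This keeps the HLS preview untouched and avoids the second port, at the cost of one extra play connection on the webcam application.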

How to send RTP stream to Janus from NGINX RTMP module?

This is my first post here, even though this platform has already helped me a lot.
So, I'm trying to create a stream and display it in a browser. I have already configured NGINX with the RTMP module and my stream works very well with HLS (between 5 and 10 seconds of latency).
Now I would like to set up a low-latency stream, which is why I have installed the janus-gateway WebRTC server, which takes an RTP stream as input and provides a WebRTC stream as output.
Here's the schema I'd like to follow :
OBS -> RTMP -> Nginx-rtmp-module -> ffmpeg -> RTP -> Janus -> webRTC -> Browser
But I have a problem with this part: "nginx-rtmp-module -> ffmpeg -> Janus".
In fact, my Janus server is running and the streaming demos work very well on localhost, but when I try to provide an RTP stream, Janus doesn't detect it in the demos (it shows "No remote video available").
Can anyone help me, please?
Resources:
My janus.plugin.streaming.jcfg configuration:
rtp-sample: {
    type = "rtp"
    id = 1
    description = "Opus/VP8 live stream coming from external source"
    metadata = "You can use this metadata section to put any info you want!"
    audio = true
    video = true
    audioport = 5002
    audiopt = 111
    audiortpmap = "opus/48000/2"
    videoport = 5004
    videopt = 100
    videortpmap = "VP8/90000"
    secret = "adminpwd"
}
My nginx.conf application:
application test {
    deny play all;
    live on;
    on_publish http://localhost/test/backend/sec/live_auth.php;
    exec ffmpeg -i rtmp://localhost/test/$name -an -c:v copy -flags global_header -bsf dump_extra -f rtp rtp://localhost:5004;
}
If you need anything more to help me, don't hesitate! Thank you in advance, and sorry for my bad English :)
I finally solved this problem with the following command:
sudo ffmpeg -y -i "rtmp://127.0.0.1/app/stream" -c:v libx264 -profile:v main -s 1920x1080 -an -preset ultrafast -tune zerolatency -g 50 -f rtp rtp://127.0.0.1:5004
Unfortunately, when I use -c:v copy, it doesn't work. It only works when encoding with libx264, which adds latency, so I get between 3 and 4 seconds of latency.
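For completeness, the audio branch of the rtp-sample mountpoint above could be fed the same way. A sketch, assuming the source stream's audio can be transcoded to Opus and that the mountpoint's videortpmap is changed to H264/90000 to match the libx264 output (untested):

# One input, two RTP outputs: video to port 5004, Opus audio to port 5002.
# -payload_type 111 matches the audiopt in the mountpoint config.
sudo ffmpeg -y -i "rtmp://127.0.0.1/app/stream" \
    -an -c:v libx264 -profile:v main -s 1920x1080 -preset ultrafast -tune zerolatency -g 50 -f rtp rtp://127.0.0.1:5004 \
    -vn -c:a libopus -ar 48000 -ac 2 -payload_type 111 -f rtp rtp://127.0.0.1:5002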
However, when I installed Janus, my goal was to do better than HLS, a protocol with which I already reach 2.5 seconds of latency.
So Janus did not meet my need. Moreover, I was warned that it was not a streaming server. After some research I came across the Oven Media Engine project on GitHub, a streaming server that offers latency of less than 1 s. The documentation is complete on the dedicated site, and a player (Oven Media Player) adapted to this server is available under the MIT license. The server is under the GPLv2 license.
Here is the current schema of my architecture :
OBS -> Nginx (which gates publishing with on_publish, because OME doesn't support that yet; the stream is then pushed to the OME server) -> OME -> transcoding to different bitrates and resolutions (optional) -> OME -> edge OME (optional) -> player.
If you have any questions, don't hesitate; the support is very friendly!
Hope it helps

GStreamer/iMX6: streaming h264 encoded video over serial port between iMX6 and PC

Recently, I have started to work on a project that aims at live video streaming applications based on imx6 processors. A quick description of what I have done so far and what I am trying to do:
Setup: an imx6 board (Boundary Devices Sabre Lite) acting as the video server (using the GStreamer imx plugins), and a PC running Ubuntu that receives data from the imx6 and renders the video using GStreamer.
On the imx6 processor, I ran 'videotestsrc' and successfully streamed it using RTP over UDP across the Ethernet interface between the imx6 and the PC.
Next, accessing the serial port through its Linux device file, I tried dumping the video data from the imx6 board to a serial port and reading that serial port on the PC. For this, the baud rate of both devices was configured to 115200 baud, and the encoder 'bitrate' to 5 kbps. Here are the commands:
IMX6:
#gst-launch-1.0 -v videotestsrc pattern=18 ! video/x-raw,width=100,height=50 ! imxvpuenc_h264 bitrate=5 ! h264parse ! filesink location=/dev/ttyUSB1
PC:
#CAPS=video/x-h264
#gst-launch-1.0 -v filesrc location=/dev/ttyUSB1 ! $CAPS ! h264parse ! avdec_h264 ! autovideosink sync=true
There are no errors observed on the imx6 board.
However, I see the following errors at the PC side:
# GST_DEBUG=3 gst-launch-1.0 -v filesrc location=/dev/ttyUSB1 ! $CAPS ! h264parse ! avdec_h264 ! autovideosink sync=true
Setting pipeline to PAUSED ...
0:00:00.066439392 15475 0x556d8a01d160 WARN basesrc gstbasesrc.c:3583:gst_base_src_start_complete: pad not activated yet
Pipeline is PREROLLING ...
0:00:21.730466251 15475 0x556d8a000940 WARN capsfilter gstcapsfilter.c:455:gst_capsfilter_prepare_buf: error: Filter caps do not completely specify the output format
0:00:21.730523691 15475 0x556d8a000940 WARN capsfilter gstcapsfilter.c:455:gst_capsfilter_prepare_buf: error: Output caps are unfixed: video/x-h264, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:21.730676173 15475 0x556d8a000940 WARN basetransform gstbasetransform.c:2159:default_generate_output: could not get buffer from pool: error
0:00:21.730742223 15475 0x556d8a000940 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop: error: Internal data stream error.
0:00:21.730775478 15475 0x556d8a000940 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop: error: streaming stopped, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstCapsFilter:capsfilter0: Filter caps do not completely specify the output format
Additional debug info:
gstcapsfilter.c(455): gst_capsfilter_prepare_buf (): /GstPipeline:pipeline0/GstCapsFilter:capsfilter0:
Output caps are unfixed: video/x-h264, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Since the encoding rate is 5 kbps (bitrate=5, as specified in the above command), I suppose it is possible to send this amount of data via the serial port. I realize that currently the caps negotiation fails; however, I am unsure how to proceed with this.
On the PC side, reading the serial port with 'cat /dev/ttyUSB1' succeeds with limited data. The data is unreadable (as expected), but it is not a continuous stream.
Does anyone have an idea how to solve this? I also think I may be misinterpreting the usage of Linux device files when attempting to read the serial data file with GStreamer.
My later test would be to use an actual camera (MIPI) and try streaming it over the serial port. Does that seem feasible, or is it a completely crazy idea?
With the following commands, I could get this working over serial at 19200 baud. However, the latency is very high, in the range of 5-6 seconds. With a baud rate of 1M, it works with less noticeable latency, under 1 s.
imx6:
gst-launch-1.0 -v videotestsrc pattern=18 ! video/x-raw,width=100,height=50 ! imxvpuenc_h264 bitrate=5 ! h264parse ! filesink location=/dev/ttyUSB0 blocksize=1024 max-bitrate=19000 sync=false
PC:
gst-launch-1.0 -v filesrc location=/dev/ttyUSB1 blocksize=1024 ! $CAPS ! h264parse ! avdec_h264 lowres=2 skip-frame=0 ! autovideosink sync=false
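For reference, the original "Output caps are unfixed" error came from the capsfilter: filesrc pushes raw buffers with no caps, so the filter caps must be fully fixed before h264parse can take over. A sketch of a fully specified CAPS variable, assuming the 100x50 test source above (30/1 is videotestsrc's default framerate; untested):

# Fully fixed caps so the capsfilter no longer has open ranges.
CAPS="video/x-h264,width=100,height=50,framerate=30/1,stream-format=byte-stream,alignment=au"
gst-launch-1.0 -v filesrc location=/dev/ttyUSB1 blocksize=1024 ! $CAPS ! h264parse ! avdec_h264 ! autovideosink sync=false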

Streaming from a webcam through Nginx to videojs without flash

Has anyone managed to use ffmpeg to stream from a webcam, and then serve this using nginx to a page running video.js? I can't believe this isn't possible without using flash.
I'd like a pure html5 solution without any flash, and I've tried using rtmp.
I can pick up the RTMP stream using VLC, and I've got the page with video.js working, but I can't work out how to link it all up. I'm doing this from an RPi 3, so I have been using the hardware encoder:
ffmpeg -f video4linux2 -i /dev/video0 -c:v h264_omx -c:a copy -b:v 1500k rtmp://localhost/hls/movie
Here was the rtmp nginx setup; I'd compiled it into nginx as a module (module on GitHub):
rtmp {
    server {
        listen 1935;
        ping 30s;
        notify_method get;

        application hls {
            live on;
            # sample HLS
            hls on;
            hls_path /tmp/hls;
        }
    }
}
Thanks
OK, I tried 3 methods, using Raspbian Stretch on a Pi+ over wireless, with a Logitech C270 webcam, for use as a baby monitor:
RTMP
ffserver
Motion
RTMP worked, slowly, but uses Flash, so it was unacceptable.
I couldn't get ffserver to work at all, and it's being deprecated (see the ffserver deprecation notice).
Motion worked fine, with good resolution and frame rate, as well as low latency.
Just adding this to try and stop other people from trying other solutions before hitting one that worked, for me anyway.
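For anyone who wants to stick with the HLS route from the question's config, the usual missing link is an http block serving the HLS fragments, plus video.js's built-in HLS support on the page. A sketch, assuming the hls_path above (the port and paths are illustrative):

http {
    server {
        listen 8080;

        location /hls {
            # Serve the playlist and segments written by the rtmp module.
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;
            add_header Cache-Control no-cache;
        }
    }
}

The page would then point video.js at http://<pi_ip>:8080/hls/movie.m3u8; recent video.js versions play HLS natively, with no flash involved.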
