HLS stream -> NGINX (free) -> RTSP stream - nginx

Is that even possible? I would like to stream a 4K stereo video, and although VLC seems to work, I want a more stable option like NGINX.

Related

How to set up nginx with RTMP (MP3) audio?

I have Ubuntu 18.04 and I installed nginx with the RTMP module. When I send live audio and video, the audio does not work. The video codec is H.264 and the audio is MP3. How can I make the audio work?
Quick answer: nginx-rtmp does not support MP3 over HLS, but there are some workarounds.
Since you didn't describe your use case in much detail, let me cover the general scenarios.
Publish MP3 Live Stream
You can use FFmpeg to publish the live stream, or use an encoder that supports the MP3 audio codec. For example:
ffmpeg -re -i doc/source.flv -vcodec copy -acodec libmp3lame \
-f flv rtmp://localhost/live/livestream
Note: As far as I know, OBS only supports the AAC audio codec.
You can use nginx-rtmp or SRS as the media server, then play the RTMP stream with:
FFPLAY: ffplay rtmp://localhost/live/livestream
VLC: rtmp://localhost/live/livestream
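For reference, a minimal nginx-rtmp server configuration that accepts the stream published above might look like the following sketch (the port and application name are the conventional defaults matching rtmp://localhost/live/..., and this assumes nginx was built with the nginx-rtmp-module):

```
rtmp {
    server {
        listen 1935;              # default RTMP port
        application live {        # matches rtmp://localhost/live/livestream
            live on;
        }
    }
}
```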
You might want to convert the RTMP stream to other protocols for other players, such as HTML5 (H5) or mobile players; I describe this in the next section.
Play MP3 Live Stream
Viewers rarely use ffplay or VLC as a player; they use HTML5 or mobile players, so you must convert MP3 over RTMP to other protocols, which some media servers do not support.
HLS is the most commonly used protocol for players. Please note that nginx-rtmp does not support MP3 over HLS; see #181. After some research, I found that nginx-rtmp forces AAC in the HLS stream, so you must fix this by merging this patch.
Another workaround is to use SRS, which supports MP3 over HLS and other protocols. For detailed usage, please see #296. For example:
MP3 over HLS: supported. It is recommended to enable hls_acodec mp3; so the first segment is written without changing the PMT of the HLS stream.
MP3 over HTTP-TS: supported. You can use the mpegts.js H5 player to play the live stream, which has lower latency than HLS.
Convert MP3 to WebRTC (Opus): supported. Playing the live stream over WebRTC is another solution.
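As a sketch, enabling the hls_acodec mp3; option mentioned above in the SRS configuration might look like this (the vhost name is the SRS default, and the path is an assumption; check the SRS documentation for your version):

```
vhost __defaultVhost__ {
    hls {
        enabled     on;
        hls_acodec  mp3;    # write the first segment with an MP3 audio codec in the PMT
        hls_path    ./objs/nginx/html;
    }
}
```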
Alternatively, you can convert the MP3 codec to AAC with FFmpeg, because AAC is widely used now and almost all servers and players support it. Note that this consumes more CPU (about 2% per stream) for the audio transcoding:
ffmpeg -i rtmp://localhost/live/livestream -vcodec copy -acodec aac \
-f flv rtmp://localhost/live/livestream-aac
PS: You can also run a pure-audio live stream, delivered as HTTP-MP3. And if you need to DVR the live stream, both FLV and MP4 work.
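For the pure-audio case, a hedged FFmpeg sketch (the source file name is a placeholder) could publish MP3 without any video track, since FLV carries MP3 audio natively:

```
ffmpeg -re -i source.mp3 -vn -acodec copy \
    -f flv rtmp://localhost/live/livestream
```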

How to convert a UDP MPEG-TS stream to an HTTP stream?

I'm trying to stream live video on a website, and I already have a UDP MPEG-TS stream. I cannot show this stream in HTML, so I want to convert it to HTTP on the server and then send it to clients. How can I do that using ffmpeg?
Any other solution is accepted too.
Thank you.
The key question is: do you need low-latency live streaming, or is 15-30 seconds of latency OK for you? If you don't need low latency, use ffmpeg to ingest your UDP MPEG-TS and output HLS.
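A sketch of that non-low-latency path could look like this (the UDP address and output path are assumptions for illustration, and -c copy only works if the TS already carries HLS-compatible codecs such as H.264/AAC; otherwise transcode):

```
ffmpeg -i udp://0.0.0.0:1234 \
    -c copy \
    -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments \
    /var/www/html/live/stream.m3u8
```

Then serve that directory with any HTTP server (nginx, Apache) and point an HLS-capable player at the .m3u8 file.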
For low-latency live streaming to web browsers, you will have to install media server software, such as Wowza, Unreal Media Server, Red5, or similar.
The media server will ingest your UDP MPEG-TS and convert it to WebRTC streams playable by web browsers.

What would be the best strategy to take an RTP stream and send it to an RTMP server?

I'm receiving RTP/UDP from a hardware encoder. I have tried ffmpeg, so that it takes this input and outputs the stream as FLV (which is sent to NGINX with nginx-rtmp-module). However, I'm not able to play the stream smoothly once it's received by nginx: some frames are broken or lost, etc.
I think my CPU is too slow for this format change (FLV), and/or ffmpeg is missing a lot of RTP packets. Any ideas?
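One thing worth trying is to remux instead of transcode, which largely removes the CPU cost: have ffmpeg read the RTP session from an SDP file and copy the streams into FLV. This is only a sketch (stream.sdp and the RTMP URL are placeholders, and it assumes the hardware encoder already produces H.264, which FLV/RTMP can carry as-is):

```
ffmpeg -protocol_whitelist file,udp,rtp \
    -i stream.sdp \
    -c copy \
    -f flv rtmp://localhost/live/stream
```

If packets are still being dropped, enlarging the OS-level UDP receive buffer on the machine running ffmpeg is a common mitigation.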

HTTP Live Streaming encoders

I'm developing an iPhone app that helps you stream video from an IP camera to your iPhone. Here I need to use HTTP Live Streaming. I want to know about any encoders required to convert the IP camera output to an MPEG-2 transport stream. Thanks.
It might sound obvious, but you could push the camera stream to a media server like Wowza and transcode it to HLS there easily. HLS is native to iOS, so your iPhone won't have any trouble with playback.
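If you'd rather avoid a full media server, an FFmpeg sketch can often do the RTSP-to-HLS conversion directly (the camera URL and output path are assumptions, and -c copy only works if the camera already emits H.264 with AAC audio; otherwise re-encode):

```
ffmpeg -rtsp_transport tcp -i rtsp://camera.local/stream \
    -c copy \
    -f hls -hls_time 6 -hls_list_size 5 \
    /var/www/html/cam/index.m3u8
```

The HLS muxer writes MPEG-2 transport stream segments, which is exactly the container HTTP Live Streaming expects.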

Playing an RTP stream in Flex

I am trying to play an RTP payload in Flex, but with no success. Can someone enlighten me on how to achieve this without using an RTMP server as middleware?
You can't do that without using an RTMP server. The NetConnection class you find in Flex can send and receive RTMP streams, and those streams can carry the same payload you find in RTP packets. However, to unpack RTP packets and create RTMP packets you need an RTMP server like Wowza Media Server, or something similar.
There are several open source media servers you can use:
Red5
Wowza
RTMPD
Any of these would suit your purpose. Flex makes the client side pretty trivial too.
