ffmpeg or vlc playlist to rtmp stream? - nginx

So, I've read all the articles here and unfortunately I can't seem to find the answers I'm looking for. I've gotten close, but the right magic strings elude me.
I'm running HLS live streaming (nginx) on an Ubuntu 17.10 server. In short, I can get the server streaming one video at a time fine with ffmpeg (with subtitles) using the following:
ffmpeg -re -i "1.mkv" -vcodec libx264 -vprofile baseline -g 30 -b:v 1000k -s 852x480 -acodec aac -strict -2 -b:a 192k -ac 2 -vf subtitles=1.srt -f flv rtmp://localhost:1935/show/stream
However, I cannot find a way to run a playlist using this method. It seems impossible, and when I try VLC via sout (internally or externally) I receive either buffering problems or the aac experimental codec error:
[aac @ 0xb162e900] The encoder 'aac' is experimental but experimental codecs are not enabled, add '-strict -2' if you want to use it.
Example string that spits that error:
vlc "1.mkv" --sout '#transcode{soverlay,vb=1000,vcodec=h264,width=853,height=480,acodec=mp4a,ab=128,channels=2,samplerate=44100}:std{access=rtmp,mux=ffmpeg{mux=flv},dst=rtmp://localhost:1935/show/stream}'
No other audio codec works with flv. I'm at a loss; I've tried almost every combination I could think of or dig up just to get to this point. The best of them has been ffmpeg: it doesn't buffer video at all and plays smoothly, but it just can't play a playlist. VLC, on the other hand, can play a playlist but buffers and has no sound (internally). I've tried aenc=ffmpeg{strict=-2}, batch pipes, etc. I need help. Nothing works. Is there any solution? All I want is to run a playlist of 25 videos, all different variations, on a loop to the m3u8 for embedding.

A friend of mine mentioned he used bash scripts to get a seamless playlist-like viewing feature. Hopefully that points you in the direction you need. I can try digging them up if you want to work together on this, because I'm also interested in finding out more about it.
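In that spirit, here is a minimal bash sketch of the idea: loop forever over a list of files and push each one to the RTMP endpoint with the same ffmpeg options as above. The playlist.txt file, the assumption that every video has a matching .srt next to it, and the -nostdin flag are additions for illustration, not from the original posts.

#!/bin/bash
# Loop a playlist of videos to the nginx-rtmp endpoint forever.
# playlist.txt holds one video file name per line (assumed).
while true; do
  while IFS= read -r video; do
    # -nostdin stops ffmpeg from swallowing the rest of the playlist
    # that the outer loop is reading from stdin.
    ffmpeg -nostdin -re -i "$video" \
      -vcodec libx264 -vprofile baseline -g 30 -b:v 1000k -s 852x480 \
      -acodec aac -strict -2 -b:a 192k -ac 2 \
      -vf "subtitles=${video%.*}.srt" \
      -f flv rtmp://localhost:1935/show/stream
  done < playlist.txt
done

One caveat: viewers will see a short reconnect between files, since each ffmpeg run closes and reopens the RTMP connection; ffmpeg's concat demuxer can avoid that, but it is pickier about the inputs matching each other.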

Related

Combine two input audio/video in Nginx RTMP

I'm trying to do a Web TV for my radio station but I'm stuck on this problem.
I'm trying to play a video loop (mp4) and add to it the audio source of my radio, which streams as m3u8 via Shoutcast.
Is it possible to do this? I've searched all over the internet without any particular result.
Use -stream_loop:
ffmpeg -re -stream_loop -1 -i input.mp4 -i rtmp:// -map 0:v -map 1:a output
-re will read input.mp4 in real time for streaming, instead of as fast as possible.
Make sure you use FFmpeg 4.0 or newer or it will not work.
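As a rough sketch of the full command, assuming the radio audio is pulled from an HLS URL and the target is an nginx-rtmp application (both URLs below are placeholders, not from the post):

ffmpeg -re -stream_loop -1 -i input.mp4 -i http://example.com/radio/playlist.m3u8 -map 0:v -map 1:a -c:v libx264 -c:a aac -b:a 128k -f flv rtmp://localhost:1935/show/stream

-map 0:v takes the video from the looping mp4 and -map 1:a takes the audio from the radio stream; both are then encoded to H.264/AAC in an FLV container for RTMP.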

DirectShow: How to capture audio and video

I am looking for a way to capture my desktop. I came across something called DirectShow, but I cannot seem to get the syntax right in ffmpeg.
What can I do to capture the audio and video?
I tried the syntax given on the DirectShow site but I'm not sure about it.
I just got mine to work, and below I've given two examples of how you can do it and play it back.
The first one is:
ffmpeg -f dshow -i video="screen-capture-recorder":audio="virtual-audio-capturer" -vcodec h264_nvenc -f mpegts udp://10.1.0.0:1234
This streams it on the local network to the given UDP address.
Play it by typing ffplay udp://@10.1.0.0:1234.
You can change the UDP address to whatever you want; try different variations until it works. You can also open the same URL in VLC, which will work too.
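For example, opening the same stream from the command line with VLC, mirroring the ffplay form above (same placeholder address):

vlc udp://@10.1.0.0:1234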
The second one is:
ffmpeg -f dshow -i video="screen-capture-recorder":audio="virtual-audio-capturer" -vcodec h264_nvenc output.mp4
You will get an mp4 file with the recording. Just press Ctrl+C to stop the recording. Or, if you know how long to record for, add -t *seconds*, replacing *seconds* with the actual number of seconds you want to record; the -t option goes before the output file name.
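For instance, to stop automatically after 60 seconds (the duration is just an example):

ffmpeg -f dshow -i video="screen-capture-recorder":audio="virtual-audio-capturer" -vcodec h264_nvenc -t 60 output.mp4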

DirectShow stream using ffmpeg point to point streaming through TCP protocol

I had set up a point-to-point stream using ffmpeg over the UDP protocol and the stream worked, but there was screen tearing, etc. I already tried raising the buffer size, but it did not help. This is a work network, so the UDP protocol won't work.
Here is the full command:
ffmpeg -f dshow -i video="UScreenCapture" -r 30 -vcodec mpeg4 -q 12 -f mpegts udp://192.168.1.220:1234?pkt_size=188?buffer_size=65535
I've tried to make this work with TCP, with no success.
Here's what I've got now:
ffmpeg -f dshow -i video="UScreenCapture" -f mpegts tcp://192.168.1.194:5555
This returns an error:
real-time buffer [UScreenCapture] [Video input] too full or near too
full <323% of size: 3041280 [rtbufsize parameter]>! frame dropped!
This last message repeated xxxx times (it went up to around 1400 and I just turned it off).
I've tried to implement the -rtbufsize parameter and raised the buffer size up to 800000000, but it didn't help.
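For reference, a rough sketch of how -rtbufsize is usually placed (as an input option before the dshow -i it applies to) together with a listening TCP receiver; the buffer size, addresses, and the ?listen option on the receiving side are assumptions, not taken from the post:

Receiver (start it first, so the sender has something to connect to):
ffplay -f mpegts tcp://192.168.1.194:5555?listen
Sender:
ffmpeg -f dshow -rtbufsize 800M -i video="UScreenCapture" -r 30 -vcodec mpeg4 -q 12 -f mpegts tcp://192.168.1.194:5555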
I would appreciate any suggestions on how to solve this.

How to convert time series data into a live streaming video

There is an event going on; it runs for 2 hours and produces new series data every second. How can I take this data (x=[], y=[]), convert it into a graph, turn that into a live video stream, and then view it?
Is this even possible?
I have never done this but it seems possible.
First, you create a PNG with the graph using http://www.gnuplot.info.
Gnuplot is a portable command-line driven graphing utility for Linux, OS/2, MS Windows, OSX, VMS, and many other platforms. [...] It was originally created to allow scientists and students to visualize mathematical functions and data interactively, but has grown to support many non-interactive uses such as web scripting.
You can create a script that routinely updates the PNG with new data.
There is also a node.js module to interface with gnuplot.
https://www.npmjs.com/package/plotframes
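A minimal sketch of such an update loop in bash, assuming the incoming samples are appended to a two-column file named data.txt (the file name and the one-second cadence are assumptions):

#!/bin/bash
# Re-render the graph once per second from the latest data.
# Writing to a temp file and renaming avoids the streamer reading a
# half-written image.
while true; do
  gnuplot -e "set terminal png size 960,540; set output 'graph_960x540.tmp.png'; plot 'data.txt' using 1:2 with lines title 'live data'"
  mv graph_960x540.tmp.png graph_960x540.png
  sleep 1
done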
Then, you could use FFmpeg to create an HLS stream looping an image over and over again.
ffmpeg -loop 1 -r 30000/1001 -i graph_960x540.png -an -s 960x540 -r 30000/1001 -c:v libx264 -crf 10 -maxrate 900k -b:v 900k -profile:v baseline -bufsize 1800k -pix_fmt yuv420p -hls_time 2 -hls_list_size 0 -hls_segment_filename 'png2hls/file%03d.ts' png2hls/index.m3u8
However, is a video stream required to display a graph? Why not display the image and use JavaScript to update the image as new graphs are produced?
Cheers.

How to stream with ffmpeg via http protocol

I'm currently doing a stream that is supposed to display correctly within Flowplayer.
First I send it to another PC via RTP. Here, I also checked with VLC that the codecs etc. arrive correctly, which they do.
Now I want to expose this stream to Flowplayer as a file, so it can be displayed, via something I used in VLC:
http://localhost:8080/test.mp4
for example.
The full line I got is: ffmpeg -i input -f mp4 http://localhost:8080/test.mp4
However, no matter how I try to do this, I only get an input/output error. Is this only possible with something like ffserver or another tool?
I think this doesn't work because ffmpeg can't act as a server; with VLC it works since it can. (Though VLC mangles the codecs I set, and the result can't be read afterwards for some reason.)
A (sort of) workaround I can use is saving the RTP stream to a file and then letting Flowplayer load it. This, however, only works once the file is no longer being written; otherwise I get a codec error.
To have FFmpeg act as an HTTP server, you need to pass the -listen 1 option. Additionally, -f mp4 will result in a non-fragmented MP4, which is not suitable for streaming. You can get a fragmented MP4 with -movflags frag_keyframe+empty_moov. A full working command line is:
ffmpeg -i input -listen 1 -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080
Other options you may find helpful are -re to limit the streaming speed to the input framerate, -stream_loop -1 to loop the input, and -c copy to avoid reencoding.
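Putting those together, a sketch of a looping variant that streams an existing file without re-encoding (input.mp4 is a placeholder):

ffmpeg -re -stream_loop -1 -i input.mp4 -c copy -listen 1 -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080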
You need this command line:
ffmpeg -f v4l2 -s 320x240 -r 25 -i /dev/video0 -f alsa -ac 1 -i hw:0 http://localhost:8090/feed1.ffm
Make sure that your feed name ends with ".ffm"; if it doesn't, add "-f ffm" before your feed URL to specify the output format manually (because ffmpeg won't be able to figure it out automatically any more), like this: "-f ffm http://localhost:8090/blah.bleh". Note that an .ffm feed is consumed by ffserver, so this approach assumes an ffserver instance is already running and listening on port 8090.
