There is an event going on; it runs for 2 hours, and every second it produces new series data. How can I take this data (x=[], y=[]), turn it into a graph, convert that into a live video stream, and then watch it?
Is this even possible?
I have never done this but it seems possible.
First, you create a PNG of the graph using gnuplot (http://www.gnuplot.info).
Gnuplot is a portable command-line driven graphing utility for Linux, OS/2, MS Windows, OSX, VMS, and many other platforms. [...] It was originally created to allow scientists and students to visualize mathematical functions and data interactively, but has grown to support many non-interactive uses such as web scripting.
You can create a script that routinely updates the PNG with new data.
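A minimal sketch of such a script (the data file name, labels, and refresh interval below are placeholders, and the data is assumed to be two whitespace-separated columns appended to data.txt):
while true; do
  gnuplot <<'EOF'
set terminal png size 960,540
set output 'graph_960x540.png'
set xlabel 'time (s)'
set ylabel 'value'
plot 'data.txt' using 1:2 with lines title 'series'
EOF
  sleep 1   # re-render the PNG once per second as new samples arrive
done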
There is also a node.js module to interface with gnuplot.
https://www.npmjs.com/package/plotframes
Then, you could use FFmpeg to create an HLS stream looping an image over and over again.
ffmpeg -loop 1 -r 30000/1001 -i graph_960x540.png -an -s 960x540 -r 30000/1001 -c:v libx264 -crf 10 -maxrate 900k -b:v 900k -profile:v baseline -bufsize 1800k -pix_fmt yuv420p -hls_time 2 -hls_list_size 0 -hls_segment_filename 'png2hls/file%03d.ts' png2hls/index.m3u8
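If you just want to sanity-check the result locally, any HLS-capable player can open the generated playlist, for example:
ffplay png2hls/index.m3u8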
However, is a video stream required to display a graph? Why not display the image and use JavaScript to update the image as new graphs are produced?
Cheers.
I'm trying to set up a web TV channel for my radio station, but I'm stuck on this problem.
I want to loop a video (mp4) and add to that loop the audio source of my radio, which streams as m3u8 via Shoutcast.
Is it possible to do this? I've searched all over the internet without finding anything useful.
Use -stream_loop:
ffmpeg -re -stream_loop -1 -i input.mp4 -i rtmp:// -map 0:v -map 1:a output
-re reads input.mp4 at its native frame rate for streaming, instead of as fast as possible.
Make sure you use FFmpeg 4.0 or newer, or it will not work.
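As a concrete sketch (the radio playlist URL and RTMP destination below are placeholders, not real endpoints), pairing the looping video with the radio's audio could look like this:
ffmpeg -re -stream_loop -1 -i input.mp4 -i http://example.com/radio/stream.m3u8 -map 0:v -map 1:a -c:v libx264 -c:a aac -f flv rtmp://your-server/live/stream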
So, I've read all the articles here and unfortunately I can't seem to find the answers I'm looking for. I've gotten close, but certain magic strings elude me.
I'm running HLS live streaming (nginx) on an Ubuntu 17.10 server. In short, I can get the server running one video at a time fine with ffmpeg (with subtitles) using the following:
ffmpeg -re -i "1.mkv" -vcodec libx264 -vprofile baseline -g 30 -b:v 1000k -s 852x480 -acodec aac -strict -2 -b:a 192k -ac 2 -vf subtitles=1.srt -f flv rtmp://localhost:1935/show/stream
Though, I cannot find a solution to run a playlist using this method. It seems impossible, and when I try VLC via sout (internally or externally) I receive either buffering problems or the experimental AAC codec error:
[aac @ 0xb162e900] The encoder 'aac' is experimental but experimental codecs are not enabled, add '-strict -2' if you want to use it.
Example string that spits that error:
vlc "1.mkv" --sout '#transcode{soverlay,vb=1000,vcodec=h264,width=853,height=480,acodec=mp4a,ab=128,channels=2,samplerate=44100}:std{access=rtmp,mux=ffmpeg{mux=flv},dst=rtmp://localhost:1935/show/stream}'
Every other audio codec doesn't work with FLV. I'm at a loss; I've tried almost every combination I could think of or dig up just to get to this point. The best-functioning option so far has been ffmpeg: it doesn't buffer video at all and plays smoothly, but it just can't play a playlist. VLC, on the other hand, can play a playlist but buffers and has no sound (internally). I've tried aenc=ffmpeg{strict=-2}, batch pipes, etc. I need help. Nothing works. Is there any solution? All I want is to run a playlist of 25 videos, all different variations, on a loop to the m3u8 for embedding.
A friend of mine mentioned he used bash scripts to get a seamless, playlist-like viewing feature. Hopefully that points you in the direction you need. I can try digging them up if you want to work together on this, because I too am interested in finding out more about it.
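One hedged sketch of that approach uses ffmpeg's concat demuxer, so the files play back-to-back and the whole list loops (the file names below are placeholders, all inputs are assumed to have compatible streams, and per-file subtitle burning would have to happen beforehand, since a single -vf subtitles filter can't switch between files):
printf "file '%s'\n" 1.mkv 2.mkv 3.mkv > playlist.txt
ffmpeg -re -stream_loop -1 -f concat -safe 0 -i playlist.txt -vcodec libx264 -vprofile baseline -g 30 -b:v 1000k -s 852x480 -acodec aac -strict -2 -b:a 192k -ac 2 -f flv rtmp://localhost:1935/show/stream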
FFmpeg is taking too much time encrypting a video file.
I am running the following command:
ffmpeg -i "Sample.mp4" -hls_time 10 -hls_list_size 0 -hls_key_info_file enc.keyinfo "sample.m3u8"
Encrypting a 708 MB video file takes around 22 minutes, which is too long.
Is there any way to decrease the time without changing the video quality?
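One thing worth checking (an assumption on my part, since the source codecs aren't stated): with no codec option, ffmpeg re-encodes the video, and that re-encode is what takes most of the 22 minutes. If Sample.mp4 is already H.264/AAC, adding -c copy makes ffmpeg only segment and AES-encrypt the existing streams, which is much faster and leaves quality untouched:
ffmpeg -i "Sample.mp4" -c copy -hls_time 10 -hls_list_size 0 -hls_key_info_file enc.keyinfo "sample.m3u8"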
I am trying to create an HLS stream out of an .mp4 file. So far I can create a manifest + .ts files, but I don't have a playlist.m3u8 to decide which manifest I should give the users based on their bandwidth. How do I do that?
Here is my current command which creates HLS streams (no playlist):
ffmpeg -i test.mp4 -codec copy -vbsf h264_mp4toannexb -map 0 -f segment -segment_list out.m3u8 -segment_time 10 out%03d.ts
What this creates is out.m3u8:
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
out000.ts
#EXTINF:10,
out001.ts
What I want to create:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1, BANDWIDTH=860000
low.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1, BANDWIDTH=512000
medium.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1, BANDWIDTH=160000
high.m3u8
To do adaptive streaming with HLS, you first need to encode your video at the bitrates you want to support. Take a look at Apple's encoding recommendations for some examples.
Once you've done that, you then need to segment each video and generate a playlist for it. The final step is to create a master playlist where you add the URLs of the variant playlists and information about each stream, such as the bandwidth, resolution, and so on - this is the playlist you will use as the video source for the player.
For example, let's assume that your source video was shot in 1080p and you want to generate a 360p variant with a video bitrate of 1200k. You could do that with the following ffmpeg command:
ffmpeg -i 1080p.mov -c:v libx264 -vprofile baseline -vlevel 3.1 -s 640x360 -b:v 1200k -strict -2 -c:a aac -ar 44100 -ac 2 -b:a 96k 360p.mov
Note that the (source) video you generate the variants from needs to be high quality - you can't encode a 1080p video from a 720p one (without upscaling).
Next, run the command similar to the one in your question to generate the playlist and the segments for this video:
ffmpeg -i 360p.mov -codec copy -vbsf h264_mp4toannexb -map 0 -f segment -segment_time 10 -segment_format mpegts -segment_list 360p/playlist.m3u8 -segment_list_type m3u8 360p/fileSequence%d.ts
Now create a master playlist and add the (relative) URL of the playlist you just created. So something like this:
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1228800,CODECS="mp4a.40.2,avc1.4d401e",RESOLUTION=640x360
360p/playlist.m3u8
(The bandwidth attribute should also take into account the bitrate of the audio, which I haven't done here.)
Repeat the process for the other variants.
The player will use the information about the available streams in the playlist, and the available bandwidth at the time, to determine which stream is the most appropriate to play.
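If your ffmpeg is 4.x or newer (an assumption; older builds lack these options), the hls muxer can also produce all variants and the master playlist in a single run via -var_stream_map and -master_pl_name. A rough sketch with placeholder bitrates and resolutions:
ffmpeg -i 1080p.mov -map 0:v -map 0:a -map 0:v -map 0:a -c:v libx264 -c:a aac -b:v:0 1200k -s:v:0 640x360 -b:v:1 2800k -s:v:1 1280x720 -b:a 96k -f hls -hls_time 10 -hls_list_size 0 -var_stream_map "v:0,a:0 v:1,a:1" -master_pl_name master.m3u8 -hls_segment_filename "stream_%v_%03d.ts" stream_%v.m3u8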
I'm currently setting up a stream that is supposed to display correctly within Flowplayer.
First I send it to another PC via RTP. Here, I also checked with VLC that the codecs etc. arrive correctly, which they do.
Now I want to expose this stream to Flowplayer as a file, so it can be displayed, via something like what I used in VLC:
http://localhost:8080/test.mp4
for example.
The full command I have is: ffmpeg -i input -f mp4 http://localhost:8080/test.mp4
However, no matter how I try to do this, I only get an input/output error. Is this only possible with something like ffserver or another tool?
My guess is that this doesn't work because ffmpeg can't act as a server; with VLC it works since it can. (Though VLC ruins the codecs I set, and the result can't be read afterwards for some reason.)
A (sort of) workaround I can use is saving the RTP stream to a file and then letting Flowplayer load it. This, however, only works once the file is no longer being accessed; otherwise I get a codec error.
To have FFmpeg act as an HTTP server, you need to pass the -listen 1 option. Additionally, -f mp4 will result in a non-fragmented MP4, which is not suitable for streaming. You can get a fragmented MP4 with -movflags frag_keyframe+empty_moov. A full working command line is:
ffmpeg -i input -listen 1 -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080
Other options you may find helpful are -re to limit the streaming speed to the input framerate, -stream_loop -1 to loop the input, and -c copy to avoid reencoding.
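Putting those together, a sketch that loops the input, streams at realtime, avoids reencoding, and serves it over HTTP would be:
ffmpeg -re -stream_loop -1 -i input -listen 1 -c copy -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080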
You need this command line:
ffmpeg -f v4l2 -s 320x240 -r 25 -i /dev/video0 -f alsa -ac 1 -i hw:0 http://localhost:8090/feed1.ffm
Make sure that your feed name ends with ".ffm". If that's not the case, add "-f ffm" before your feed URL to specify the output format manually (because ffmpeg won't be able to figure it out automatically), like this: "-f ffm http://localhost:8090/blah.bleh".