We use ffmpeg in one of our applications to slice videos. While it's working fine for slicing PAL videos, it is not working for QT videos... Here's the command we use:
ffmpeg.exe -i "input.mp4" -ss startTime -c copy -to stopTime -y "output.mp4"
It throws an error: "[mp4 @ 0515c240] Could not find tag for codec pcm_s16le in stream #1, codec not currently supported in container"
The input videos are created by a solid-state digital video recording system which records video from the following input channels:
6 channels of video input (4 DVI, 2 PAL)
2 channels of audio input (Left & Right)
2 channels of MIL STD 1553B bus data
2 channels of RS422 data
What could be the issue & how can it be resolved?
FFmpeg's MP4 muxer does not support PCM audio such as pcm_s16le; the MOV container does, so output to MOV instead.
Run
ffmpeg -i in.mp4 -ss startTime -to stopTime -c copy -avoid_negative_ts make_zero out.mov
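Alternatively, if you would rather keep the MP4 container, a sketch of the same cut that copies the video but re-encodes only the PCM audio to AAC (the bitrate is an assumption, adjust to taste) is:
ffmpeg -i in.mp4 -ss startTime -to stopTime -c:v copy -c:a aac -b:a 192k -y out.mp4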
I use nginx and ffmpeg to restream video from my provider. Previously I used ffmpeg arguments that re-encoded both the video and the audio, but because my server is too slow I gave up on re-encoding.
So now I use this command:
ffmpeg -re -i http://link.somelink.com:6565/21d12d1/17233 -map 0 -c copy -bsf:a aac_adtstoasc -f flv -flvflags no_duration_filesize rtmp://test_ip/canal/stream
This works only when my provider streams with the AAC audio codec, but sometimes my provider changes the audio codec to AC-3, and then it doesn't work. I tried something like this:
ffmpeg -thread_queue_size 32768 -re -i http://link.somelink.com:6565/21d12d1/17233 -c:v copy -c:a aac -f flv -flvflags no_duration_filesize rtmp://test_ip/canal/stream
Everything looks fine in the ffmpeg console, but my restreamed video doesn't work. Nginx sometimes throws a 304 exception.
Any suggestions?
Please help,
It's very important for me...
AC-3 is not in the supported codecs list, so you should encode your stream accordingly.
RTMP supports only a limited number of codecs. The most popular RTMP video codecs are H264 and Sorenson H263 (aka flv), and the audio codecs are AAC, MP3, Nellymoser, and Speex. If your video is encoded with these codecs (the most common pair is H264/AAC) then you do not need any conversion. Otherwise you need to convert the video to one of the supported codecs.
https://github.com/arut/nginx-rtmp-module/wiki/Getting-started-with-nginx-rtmp
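For example, a sketch that copies the video and unconditionally re-encodes the audio to AAC, so it works whether the provider sends AAC or AC-3 (the bitrate and channel count are assumptions):
ffmpeg -re -i http://link.somelink.com:6565/21d12d1/17233 -c:v copy -c:a aac -b:a 128k -ac 2 -f flv -flvflags no_duration_filesize rtmp://test_ip/canal/stream
Note that when the audio is re-encoded, the aac_adtstoasc bitstream filter is no longer needed, since the encoder already produces raw AAC.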
So, I've read all the articles here and unfortunately I can't seem to find the answers I'm looking for. I've gotten close, but the magic strings elude me.
I'm running HLS live streaming (nginx) on an Ubuntu 17.10 server. In short, I can get the server running one video at a time with ffmpeg (with subtitles) using the following:
ffmpeg -re -i "1.mkv" -vcodec libx264 -vprofile baseline -g 30 -b:v 1000k -s 852x480 -acodec aac -strict -2 -b:a 192k -ac 2 -vf subtitles=1.srt -f flv rtmp://localhost:1935/show/stream
However, I cannot find a way to run a playlist using this method. It seems impossible, and when I try VLC via sout (internally or externally) I receive either buffering problems or the experimental AAC codec error:
[aac @ 0xb162e900] The encoder 'aac' is experimental but experimental codecs are not enabled, add '-strict -2' if you want to use it.
Example string that spits that error:
vlc "1.mkv" --sout '#transcode{soverlay,vb=1000,vcodec=h264,width=853,height=480,acodec=mp4a,ab=128,channels=2,samplerate=44100}:std{access=rtmp,mux=ffmpeg{mux=flv},dst=rtmp://localhost:1935/show/stream}'
No other audio codec works with FLV. I'm at a loss; I've tried almost every combination I could think of and dug around just to get to this point. The best performer has been ffmpeg: it doesn't buffer video at all and plays smoothly, but it just can't play a playlist. VLC, on the other hand, can play a playlist but buffers and has no sound (internally). I've tried aenc=ffmpeg{strict=-2}, batch pipes, etc. I need help; nothing works. Is there any solution? All I want is to run a playlist of 25 videos, all different variations, on a loop to the m3u8 for embedding.
A friend of mine mentioned he used bash scripts to get a seamless, playlist-like viewing feature. Hopefully that points you in the direction you need. I can try digging them up if you want to work together on this, because I too am interested in finding out more about it.
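Roughly, that bash-script approach just loops over the files and hands each one to the same ffmpeg command from the question in turn. A minimal sketch, assuming the videos are named 1.mkv through 25.mkv with matching .srt subtitle files:
while true; do
  for i in $(seq 1 25); do
    ffmpeg -re -i "$i.mkv" -vcodec libx264 -vprofile baseline -g 30 -b:v 1000k -s 852x480 \
      -acodec aac -strict -2 -b:a 192k -ac 2 -vf "subtitles=$i.srt" \
      -f flv rtmp://localhost:1935/show/stream
  done
done
The RTMP connection is re-opened between files, so players may glitch briefly at each transition; truly seamless playback would require concatenating the inputs first.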
There is an event going on; it runs for 2 hours and each second it creates new series data. How can I take this data (x=[], y=[]), convert it into a graph, then convert that into a live video stream and watch it?
Is this even possible?
I have never done this but it seems possible.
First, you create a PNG with the graph using http://www.gnuplot.info.
Gnuplot is a portable command-line driven graphing utility for Linux, OS/2, MS Windows, OSX, VMS, and many other platforms. [...] It was originally created to allow scientists and students to visualize mathematical functions and data interactively, but has grown to support many non-interactive uses such as web scripting.
You can create a script that routinely updates the PNG with new data.
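As a minimal sketch of such a script (assuming gnuplot is built with the pngcairo terminal and the data lives in a two-column data.dat file), you could regenerate the image once per second:
while true; do
  gnuplot -e "set terminal pngcairo size 960,540; set output 'graph_960x540.png'; plot 'data.dat' using 1:2 with lines title 'live data'"
  sleep 1
done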
There is also a node.js module to interface with gnuplot.
https://www.npmjs.com/package/plotframes
Then, you could use FFmpeg to create an HLS stream looping an image over and over again.
ffmpeg -loop 1 -r 30000/1001 -i graph_960x540.png -an -s 960x540 -r 30000/1001 -c:v libx264 -crf 10 -maxrate 900k -b:v 900k -profile:v baseline -bufsize 1800k -pix_fmt yuv420p -hls_time 2 -hls_list_size 0 -hls_segment_filename 'png2hls/file%03d.ts' png2hls/index.m3u8
However, is a video stream required to display a graph? Why not display the image and use JavaScript to update the image as new graphs are produced?
Cheers.
I'm trying to encode video files that users upload to my server.
I treat the file as a stream arriving at my server over HTTP and use ffmpeg for real-time encoding while the upload procedure executes.
When the source file is in .avi format, the encoding succeeds, but with the .mp4 format this error appears:
---------------------
[buffer @ 0000000000308380] Unable to parse option value "-1" as pixel format
Last message repeated 1 times
[buffer @ 0000000000308380] Error setting option pix_fmt to value -1.
---------------------
I think this might be because the .mp4 contains its "moov atom" data at the end of the file.
I think so because when I process the file with "-movflags faststart" before encoding, I also get a successful result.
This is the command I am using now:
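For reference, that pre-processing pass is just a remux that moves the moov atom to the front of the file (a sketch; the filenames are placeholders):
ffmpeg -i video2.mp4 -c copy -movflags faststart video2_faststart.mp4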
ffmpeg -i http://myhost.com/app/video/video2.mp4 -f mp4 -vcodec libx264 -b:v 800K -acodec libvo_aacenc -b:a 128K -ar 44100 -ac 2 -y c:/watch-and-get/video/video5.mp4
Can I resolve this problem and encode multiple video formats as a stream without any extra steps?
You are running an old version of ffmpeg; this problem has been fixed.
-pix_fmt sets the pixel format, and for some reason ffmpeg is resolving its value to -1 here (I am not sure why), hence the error you get, but updating would solve the problem.
Extra info: run ffmpeg -pix_fmts to see all the available pixel formats.
Download the latest version.
I would recommend installing the latest version from a binary, as it is much simpler. I have answered about the same here.
I'm currently doing a stream that is supposed to display correctly within Flowplayer.
First I send it to another PC via RTP. Here I also checked with VLC that the codecs etc. arrive correctly, which they do.
Now I want to expose this stream to Flowplayer as a file, so it can be displayed, via something I used in VLC:
http://localhost:8080/test.mp4
for example.
The full line I got is: ffmpeg -i input -f mp4 http://localhost:8080/test.mp4
However, no matter how I try to do this, I only get an input/output error. Is this only possible with something like ffserver or another tool?
I think this doesn't work because ffmpeg can't act as a server; with VLC it works since it can. (Though VLC ruins the codecs I set, and the result can't be read afterwards for some reason.)
A (sort of) workaround I can use is saving the RTP stream to a file, and then letting flowplayer load it. This, however, only works once the file is not accessed anymore; I get a codec error otherwise.
To have FFmpeg act as an HTTP server, you need to pass the -listen 1 option. Additionally, -f mp4 will result in a non-fragmented MP4, which is not suitable for streaming. You can get a fragmented MP4 with -movflags frag_keyframe+empty_moov. A full working command line is:
ffmpeg -i input -listen 1 -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080
Other options you may find helpful are -re to limit the streaming speed to the input framerate, -stream_loop -1 to loop the input, and -c copy to avoid reencoding.
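Combining those, a looping passthrough variant might look like this (a sketch; it assumes the input codecs are already MP4-compatible, e.g. H.264/AAC):
ffmpeg -re -stream_loop -1 -i input -listen 1 -c copy -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080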
You need this command line:
ffmpeg -f v4l2 -s 320x240 -r 25 -i /dev/video0 -f alsa -ac 1 -i hw:0 http://localhost:8090/feed1.ffm
Make sure that your feed name ends with ".ffm". If that's not the case, add "-f ffm" before your feed URL to manually specify the output format (because ffmpeg won't be able to figure it out automatically any more), like this: "-f ffm http://localhost:8090/blah.bleh".
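Put together, that variant of the full command would look something like this (simply restating the command above with the explicit format flag; the feed URL is the same placeholder):
ffmpeg -f v4l2 -s 320x240 -r 25 -i /dev/video0 -f alsa -ac 1 -i hw:0 -f ffm http://localhost:8090/blah.bleh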