LinkedIn large video file upload

I'm trying to upload a large video file using the multipart method, but when I try to complete the upload I always get a "Multipart upload metadata deserialization failure" error:
{'message': 'com.linkedin.vector.utils.logic.LogicLayerInvalidException: Multipart upload metadata deserialization failure', 'status': 400}
Does anyone have any idea how to fix this?
I'm using the Python requests library.

Compress the video before uploading, e.g. ffmpeg -i input.mp4 -vcodec libx265 -crf 28 output.mp4
You can use a Python ffmpeg wrapper to do this programmatically. For example, with ffmpeg-python:
import ffmpeg

# Re-encode to H.265 at CRF 28, equivalent to the command line above
ffmpeg.input('input.mp4').output('output.mp4', vcodec='libx265', crf=28).run()

Related

Streaming video in a folder with ffmpeg

I have configured RTMP to work with Nginx on an Ubuntu server, following the guide at https://www.youtube.com/watch?v=Js1OlvRNsdI
I have tested the setup and everything works end to end. However, I also have a folder containing movies, and I want them to be streamed or played independently via a web player, for example JWPlayer, and I have failed to get that working. Does anyone have an idea of how to go about it?
Do you have a folder containing a set of movies that you want to stream as one single stream, or do you want each file as its own stream?
Either way, you can do it with FFmpeg:
for file in movies/*.mp4; do
  ffmpeg -re -i "$file" -c copy -f flv rtmp://server/app/stream
done
If you want each file as its own stream, start a separate ffmpeg process per file, as in the sketch below.
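A minimal sketch of the per-file variant, assuming the RTMP application at rtmp://server/app/ accepts arbitrary stream keys and that each key is simply derived from the file name:
# One background ffmpeg process per file, each publishing under its own stream key.
# -c copy assumes the inputs already use FLV-compatible codecs (e.g. H.264/AAC).
for file in movies/*.mp4; do
  name=$(basename "$file" .mp4)
  ffmpeg -re -i "$file" -c copy -f flv "rtmp://server/app/$name" &
done
wait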

RTMP proxy to crop original video and send it to another RTMP server

I need to crop the video from an RTMP stream and send it to another RTMP server, which changes all the time. My understanding is that I should use an nginx proxy and ffmpeg; can anybody help me set this up?
I suppose I need to send the stream to an endpoint like /stream/:stream-key/:next-server-ip, process the stream with ffmpeg, and then send it on to :next-server-ip. What language should I use in the backend for this?
There are two strategies for this kind of task:
"Pull"
You have a published RTMP stream and use ffmpeg to pull it, convert it, and send the result to another server:
ffmpeg -i rtmp://source-server/stream -filter:v "crop=out_w:out_h:x:y" -vcodec h264 -acodec copy -f flv rtmp://next-server/stream
"Push"
RTMP stream is pushed to your server which processes it and sends result to another server. For such task you can use nginx-rtmp module for nginx and setup ffmpeg command using exec_push directive:
application src {
    live on;
    exec_push ffmpeg -i rtmp://localhost/src/$name -filter:v "crop=out_w:out_h:x:y" -vcodec h264 -acodec copy -f flv rtmp://next-server/stream 2>>/var/log/ffmpeg-$name.log;
}
When someone starts streaming to rtmp://your-server/src/stream_name, this ffmpeg command is executed and the processing begins.
For additional information about video cropping and the related ffmpeg parameters, see https://video.stackexchange.com/a/4571
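Since the destination server keeps changing, one option is to wrap the "pull" command in a small script that your backend (in whatever language you prefer) invokes with the stream key and the next server's IP as arguments. This is only a minimal sketch: the script name relay.sh, the source-server host, the app paths and the 640x360 crop are placeholders rather than values from the question.
#!/bin/sh
# relay.sh <stream-key> <next-server-ip>
STREAM_KEY="$1"
NEXT_SERVER="$2"
ffmpeg -i "rtmp://source-server/app/$STREAM_KEY" \
  -filter:v "crop=640:360:0:0" \
  -vcodec h264 -acodec copy \
  -f flv "rtmp://$NEXT_SERVER/app/$STREAM_KEY"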

FFmpeg -> JSMpeg Websocket Closes Repeatedly

I'm trying to create a fairly simple streaming server/site. Here's the current flow:
OBS streams to an RTMP URL
Nginx accepts the RTMP stream and uses exec_push to have FFmpeg pick up the stream and transcode it
FFmpeg transcodes the stream and outputs it to a JSMpeg application, which displays the stream on a webpage.
When I have my exec_push statement as follows, everything seems to work perfectly, except the browser says Possible garbage data. Skipping. on every frame it receives:
exec_push /usr/bin/ffmpeg -re -i rtmp://127.0.0.1:1935/$app/$name -f mpeg1video http://localhost:8080/supersecret;
This behavior makes sense, because JSMpeg expects MPEG-TS data rather than a raw MPEG-1 video stream; it sees the raw MPEG-1 frames and treats them as garbage.
So through some online research, I found this:
exec_push /usr/bin/ffmpeg -re -i rtmp://127.0.0.1:1935/$app/$name -c:v copy -c:a copy -f mpegts http://localhost:8080/supersecret;
Supposedly this converts my RTMP stream into MPEG-TS format, which should be compatible with JSMpeg.
However, with the second version of the command, my FFmpeg -> JSMpeg stream keeps connecting and disconnecting, over and over. This is what I see in the terminal:
Stream Connected: ::1:40208
close
Stream Connected: ::1:40212
close
Stream Connected: ::1:40216
close
Stream Connected: ::1:40220
close
Stream Connected: ::1:40224
close
...
What would cause this? I am pretty certain the issue is in my exec_push command. OBS is perfectly content, which tells me the stream is making it to the server, and a test push to Ustream works just fine, which tells me Nginx is processing the stream with at least some reasonable degree of success.
Disclaimer: I have no idea what I'm talking about. Everything I know about FFmpeg and JSMpeg/Node is from snippets of code that I found online.
Answer credit goes to @Mulvya.
In the second exec_push command, the -c:v copy -c:a copy options should not be there. With them, no transcoding happens at all; the stream is just passed through unchanged.
Removing -c:v copy -c:a copy from the command and restarting Nginx yields a working stream.
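For reference, the corrected directive is simply the second command with the copy options removed, so that ffmpeg actually re-encodes into the MPEG-TS output:
exec_push /usr/bin/ffmpeg -re -i rtmp://127.0.0.1:1935/$app/$name -f mpegts http://localhost:8080/supersecret;
JSMpeg's documentation additionally recommends forcing -codec:v mpeg1video -codec:a mp2, since those are the codecs its in-browser decoder expects.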

Encoding video stream by http protocol using ffmpeg library

I'm trying to encode video files that users upload to my server.
I treat the incoming file as a stream arriving at my server over HTTP and use ffmpeg to encode the file in real time while the upload is still in progress.
When the source file is in .avi format the encoding succeeds, but with .mp4 I get this error:
---------------------
[buffer # 0000000000308380] Unable to parse option value "-1" as pixel format
Last message repeated 1 times
[buffer # 0000000000308380] Error setting option pix_fmt to value -1.
---------------------
I think this might be because an .mp4 file stores its "moov atom" data at the end of the file.
I think so because when I preprocess the file with -movflags faststart before encoding, the encoding also succeeds.
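That preprocessing step is just a remux that moves the moov atom to the front of the file, along the lines of the following (file names here are placeholders):
ffmpeg -i video2.mp4 -c copy -movflags faststart video2-faststart.mp4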
This is the command I am using now:
ffmpeg -i http://myhost.com/app/video/video2.mp4 -f mp4 -vcodec libx264 -b:v 800K -acodec libvo_aacenc -b:a 128K -ar 44100 -ac 2 -y c:/watch-and-get/video/video5.mp4
Can I resolve this problem and encode multiple video formats as streams, without any extra steps?
You are running an old version of ffmpeg; this problem has since been fixed.
-pix_fmt is the pixel format and its value should be an integer. (ffmpeg somehow takes this value as -1, I am not sure why, hence the error you get; updating should solve the problem.)
Extra info: run ffmpeg -pix_fmts to see all the available pixel formats.
Download the latest version.
I would recommend installing the latest version from a binary, as it is much simpler; I have answered the same question here.

How to stream with ffmpeg via http protocol

I'm currently setting up a stream that is supposed to display correctly within Flowplayer.
First I send it to another PC via RTP. There I also checked with VLC that the codecs etc. arrive correctly, which they do.
Now I want to expose this stream to Flowplayer as a file so it can be displayed, via the approach I used in VLC, for example:
http://localhost:8080/test.mp4
The full line I have is: ffmpeg -i input -f mp4 http://localhost:8080/test.mp4
However, no matter how I try this, I only get an input/output error. Is this only possible with something like ffserver or another tool?
I think this doesn't work because ffmpeg can't act as a server; in VLC it works since VLC can. (Though VLC ruins the codecs I set, and the stream can't be read afterwards for some reason.)
A (sort of) workaround would be to save the RTP stream to a file and let Flowplayer load that. However, this only works once the file is no longer being written to; otherwise I get a codec error.
To have FFmpeg act as an HTTP server, you need to pass the -listen 1 option. Additionally, -f mp4 will result in a non-fragmented MP4, which is not suitable for streaming. You can get a fragmented MP4 with -movflags frag_keyframe+empty_moov. A full working command line is:
ffmpeg -i input -listen 1 -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080
Other options you may find helpful are -re to limit the streaming speed to the input framerate, -stream_loop -1 to loop the input, and -c copy to avoid reencoding.
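Putting those options together, a sketch of a looping, copy-only HTTP server plus a quick client-side check (the input name is a placeholder, as above):
ffmpeg -re -stream_loop -1 -i input -c copy -listen 1 -f mp4 -movflags frag_keyframe+empty_moov http://localhost:8080
# then, on the client side:
ffplay http://localhost:8080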
You need this command line:
ffmpeg -f v4l2 -s 320x240 -r 25 -i /dev/video0 -f alsa -ac 1 -i hw:0 http://localhost:8090/feed1.ffm
Make sure that your feed name ends with ".ffm"; if it doesn't, add "-f ffm" before your feed URL to specify the output format manually (ffmpeg won't be able to figure it out automatically otherwise), like this: "-f ffm http://localhost:8090/blah.bleh". Note that this approach relies on an ffserver instance listening on port 8090 to receive the feed.
