How to send QImages via GStreamer over UDP - Qt

I am using an unsupported V4L camera and I need to stream video to a remote PC using GStreamer.
I have successfully streamed it on the host computer using Qt and QImages.
Earlier I asked a question here about how to feed external frames into GStreamer.
I read a blog post here and tried to implement it with GStreamer 1.0, but somehow it is not working as expected.
So I thought of streaming QImages via GStreamer on the same network but to a different workstation. I am wondering if someone can give me a starting point for how to send QImages using GStreamer 1.0. I am not asking for code, just a direction.
I am very new to this multimedia stuff, so I would be grateful if you could explain it in layman's terms.
Thanks in advance.

First you need to determine what protocol and encoding type you want to use to transmit it over UDP. I'd recommend H.264 over RTP for most cases.
GStreamer offers a wide collection of plugins to do this, as well as a command-line utility called gst-launch. Here are some basic send/receive commands:
gst-launch-1.0 videotestsrc pattern=ball ! video/x-raw,width=1280,height=720 ! videoconvert ! x264enc ! h264parse ! rtph264pay ! udpsink host=localhost port=7777
gst-launch-1.0 udpsrc port=7777 ! rtpbin ! rtph264depay ! decodebin ! videoconvert ! autovideosink
When you write applications that use a GStreamer pipeline, the simplest way to get frames to that pipeline is with an appsrc. So you might have something like this:
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
const char* pipeline = "appsrc name=mysrc caps=video/x-raw,width=1280,height=720,format=RGB ! videoconvert ! x264enc ! h264parse ! rtph264pay ! udpsink host=localhost port=7777";
GError* error = NULL;
GstElement* bin = gst_parse_bin_from_description(pipeline, true, &error);
if(bin == NULL || error != NULL) {
...
}
/* Fetch the appsrc by the name given in the pipeline description. */
GstElement* appsrc = gst_bin_get_by_name(GST_BIN(bin), "mysrc");
/* Register need-data / enough-data callbacks (need_buffers, enough_buffers and
   your_data are your own functions and context) so the pipeline tells you when
   to push frames and when to back off. */
GstAppSrcCallbacks* appsrc_callbacks = (GstAppSrcCallbacks*)malloc(sizeof(GstAppSrcCallbacks));
memset(appsrc_callbacks, 0, sizeof(GstAppSrcCallbacks));
appsrc_callbacks->need_data = need_buffers;
appsrc_callbacks->enough_data = enough_buffers;
gst_app_src_set_callbacks(GST_APP_SRC(appsrc), appsrc_callbacks, (gpointer)your_data, free);
gst_object_unref(appsrc);
Then in a background thread you make calls to gst_app_src_push_buffer, which would involve reading the raw data from your QImage and converting it to a GstBuffer.
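For illustration, a minimal sketch of that conversion might look like the following. This is an assumption-laden example rather than the original answer's code: the function name push_qimage is made up, the caps are taken from the appsrc description above (packed RGB at 1280x720), QImage::sizeInBytes() requires Qt 5.10 or newer, and the timestamping is deliberately simplistic.
// Hedged sketch: copy a QImage's pixels into a GstBuffer and hand it to the appsrc.
#include <QImage>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
void push_qimage(GstAppSrc* appsrc, const QImage& input, GstClockTime pts, GstClockTime duration)
{
    // Make the pixel layout match the negotiated caps (video/x-raw,format=RGB).
    QImage frame = input.convertToFormat(QImage::Format_RGB888);
    const gsize size = static_cast<gsize>(frame.sizeInBytes()); // Qt >= 5.10
    GstBuffer* buffer = gst_buffer_new_allocate(NULL, size, NULL);
    // Copy the raw pixels into the buffer; if bytesPerLine() contains padding,
    // copy row by row instead of one block.
    gst_buffer_fill(buffer, 0, frame.constBits(), size);
    GST_BUFFER_PTS(buffer) = pts;
    GST_BUFFER_DURATION(buffer) = duration;
    // gst_app_src_push_buffer takes ownership of the GstBuffer.
    gst_app_src_push_buffer(appsrc, buffer);
}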
OR
Maybe Qt has some easier way that I'm not aware of. :)

Related

Display multiple H.264 video streams at fixed positions with GStreamer, Qt and Weston

I am trying to display multiple H.264 video streams from IP cameras on our embedded system (i.MX8 QXP) along with a Qt (5.14.2) application. Ideally we would like to position each video at a certain x & y with a certain width & height. We run Weston with desktop-shell.
I have tried different approaches:
Approach 1: From the Qt application, make a process call to gst-launch-1.0.
For example, to display one stream:
gst-launch-1.0 udpsrc port=50004 caps="application/x-rtp,media=(string)video, clock-rate=(int)90000, encodingname=(string)H264, payload=(int)96" ! queue ! rtpjitterbuffer latency=500 mode=slave do-lost=false ! rtph264depay ! h264parse ! v4l2h264dec ! imxvideoconvert_g2d ! waylandsink sync=false window-width=640 window-height=480
This gives acceptable latency and appears on top of the application, but the position (x, y) is random.
I understand that the positioning is determined by the compositor (in this case desktop-shell), but is there any way around this? (I looked at ivi-shell, but it seems unsupported by GStreamer.)
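As a side note, a minimal sketch of how approach 1 can be spawned from the Qt side might look like the following. This is an illustrative assumption rather than code from the original post: the function name startStream is made up, the pipeline string mirrors the one quoted above (with the caps written without embedded spaces so simple argument splitting works), and error handling is omitted.
// Hedged sketch: launch gst-launch-1.0 from the Qt application with QProcess.
#include <QProcess>
#include <QString>
#include <QStringList>
void startStream(QObject* parent)
{
    const QString pipeline =
        "udpsrc port=50004 caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96 "
        "! queue ! rtpjitterbuffer latency=500 mode=slave do-lost=false "
        "! rtph264depay ! h264parse ! v4l2h264dec ! imxvideoconvert_g2d "
        "! waylandsink sync=false window-width=640 window-height=480";
    QProcess* gst = new QProcess(parent);
    // Split the pipeline description into individual arguments for gst-launch-1.0.
    gst->start("gst-launch-1.0", pipeline.split(' '));
}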
Approach 2: gst-launch with imxcompositor_g2d to combine multiple streams, for example:
gst-launch-1.0 -v imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=640 sink_0::height=480 \
sink_1::xpos=0 sink_1::ypos=480 sink_1::width=640 sink_1::height=480 ! \
waylandsink \
videotestsrc ! comp.sink_0 \
videotestsrc ! comp.sink_1
The problem with this approach is that xpos=0 seems to start at ~width/2, and I cannot move the image further to the left. Another problem is that the "width area" not covered by the video is filled with black, instead of showing the Qt application.
Approach 3: Let the Qt application handle the video stream.
I use the QML types MediaPlayer and VideoOutput to create a custom gst-pipeline and display the video.
The problem with this approach is the added latency. By using the pipeline described in approach 1, and replacing waylandsink with qtvideosink, the latency is increased by more than ~2s.
MediaPlayer {
    id: mediaPlayer2
    source: "gst-pipeline: udpsrc port=50004 caps=application/x-rtp,media=video,clock-rate=90000,encodingname=H264,payload=96 ! queue ! rtpjitterbuffer latency=500 mode=slave do-lost=false ! rtph264depay ! h264parse ! v4l2h264dec ! imxvideoconvert_g2d ! qtvideosink"
    autoPlay: true
    autoLoad: true
}
VideoOutput {
    width: 640
    height: 400
    x: 100
    y: 100
    source: mediaPlayer2
    z: 1
}

Using more bandwidth when sending video in 480p than in 720p with GStreamer

I'm building an application where I use GStreamer to transmit video. My pipeline is really simple: I get the video from my application, convert it, encode it in H.264, build RTP packets, and send them over UDP. It works perfectly fine.
However, during testing I noticed something strange: I use more bandwidth (measured with iptraf) when the video is sent at 640x480 px than at 1280x720 px. As the video is of higher quality in the second case, I would expect it to use more bandwidth. Any idea why this happens? Thanks!
Here are the pipelines I use, in case you want to test them:
Sender:
gst-launch-1.0 v4l2src ! videoconvert ! x264enc tune=zerolatency noise-reduction=10000 speed-preset=superfast ! rtph264pay config-interval=0 pt=96 ! udpsink host=127.0.0.1 port=5000
Receiver:
gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 lowres=2 ! videoconvert ! xvimagesink
Bandwidth use at 640x480 px: around 2000 kb/s
Bandwidth use at 1280x720 px: around 1100 kb/s
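Not part of the original post, but for reference: since the bandwidth here is driven by the encoder's rate control rather than by the resolution alone, one way to make the comparison deterministic is to pin the x264enc bitrate explicitly (the value 1000 kbit/s below is an arbitrary example):
gst-launch-1.0 v4l2src ! videoconvert ! x264enc tune=zerolatency bitrate=1000 speed-preset=superfast ! rtph264pay config-interval=0 pt=96 ! udpsink host=127.0.0.1 port=5000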

Qt Overlay over GStreamer

I have a question concerning the current setup:
Yocto Linux on iMX6
Neither a window-, nor a display-manager
A fully functional Qt Application, tested on Debian 9
The application consists of 2 main elements:
A GStreamer part, with an imxg2dvideosink
A semi-transparent Qt Overlay, which should be displayed over the stream
The question:
How can I display the overlay over the stream while having both parts fullscreen (filling the whole screen)? Possible solutions:
/dev/fb1 as an overlay to /dev/fb0 (how do I split a single application across two framebuffers?)
Use a display manager?
Use a window manager?
linuxfb instead of eglfs?
My current (not working) solution:
Using -platform eglfs
The application first starts GStreamer, and afterwards shows the overlay.
I found the solution myself; it is shared below:
1) Run Qt Application on /dev/fb1:
export QT_QPA_EGLFS_FB=/dev/fb1 (Specify /dev/fb1 as eglfs framebuffer)
echo 0 > /sys/class/graphics/fb1/blank (Unblank framebuffer)
fbset -fb /dev/fb1 --geometry <your geometry here> (Set framebuffer geometry)
./YourApplication -platform eglfs (Run application)
Use a color key if you want full opacity while keeping fully transparent parts of your overlay.
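A heavily hedged sketch of setting up the color key (this is my own assumption, not part of the original answer: it presumes an NXP/Freescale vendor kernel that ships <linux/mxcfb.h>, and the exact ioctl names, struct layouts and which /dev/fbN must be targeted can differ between BSP versions):
// Hedged sketch: enable global alpha plus a color key on an i.MX framebuffer
// so the keyed color becomes transparent and the video plane shows through.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/mxcfb.h>
int set_color_key(const char* fbdev, unsigned int rgb_key)
{
    int fd = open(fbdev, O_RDWR);
    if (fd < 0)
        return -1;
    struct mxcfb_gbl_alpha alpha;
    alpha.enable = 1;
    alpha.alpha = 255;          // overlay fully opaque where it is not keyed
    struct mxcfb_color_key key;
    key.enable = 1;
    key.color_key = rgb_key;    // e.g. 0x000000ff: pure blue becomes transparent
    int ret = 0;
    if (ioctl(fd, MXCFB_SET_GBL_ALPHA, &alpha) < 0 || ioctl(fd, MXCFB_SET_CLR_KEY, &key) < 0)
        ret = -1;
    close(fd);
    return ret;
}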
2) Run GStreamer on /dev/fb0:
gst-launch-1.0 videotestsrc ! imxg2dvideosink framebuffer=/dev/fb0
This is the solution for eglfs. Other possibilities are linuxfb.

GStreamer automation in an STB (set-top box)

Mates, I have a set-top box which I communicate with through a serial port. This box has the GStreamer media framework (Linux platform, C language).
I am trying to automate GStreamer, i.e. gst-launch, gst-inspect, etc. There are also other frameworks, like Qt, which I want to automate.
The following are my attempts at this problem:
Attempt 1:
I tried pySerial and got it working: I was able to access my port and communicate with my board, but I found no way to automate things.
import serial
import time
port = "COM1"
baud = 115200
ser = serial.Serial(port, baud, xonxoff=False, rtscts=False, dsrdtr=False, timeout=None)
ser.flushInput()
ser.flushOutput()
if ser.isOpen():
    print(ser.name + ' is open...')
while True:
    cmd = input("Enter command or 'exit':")
    if cmd == 'exit':
        ser.close()
        exit()
    else:
        ser.write(cmd.encode() + b'\r\n')
        time.sleep(0.5)  # give the box a moment to respond before polling the buffer
        bytesToRead = ser.inWaiting()
        out = ser.read(bytesToRead)
        print(out.decode())
Attempt 2:
To have a communicator installed on my board which can communicate with my box.
If this is the correct approach, I have no idea how to proceed with it.
Any help toward STB automation will be greatly appreciated.

GStreamer problems: network streaming with gst-launch

I'm quite new to GStreamer, but I'm trying to implement a network stream using the gst-launch command. So far I've managed to get the pipeline working with videotestsrc, but when I try to put a filesrc in its place I have trouble. The following is what I've tried.
Taking a .mov/.mkv file and streaming it:
gst-launch -ve gstrtpbin name=rtpbin filesrc location=/home/user/Gstreamer_projects/test_videos/bbb_short_1080p.mkv ! matroskademux ! h264parse ! rtph264pay ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! queue ! udpsink host=192.168.1.21 port=5000 rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.21 port=5001 sync=false async=false udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0
The output on the terminal is:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp
/GstPipeline:pipeline0/GstUDPSink:udpsink1.GstPad:sink: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad3: caps = application/x-rtcp
ERROR: from element /GstPipeline:pipeline0/GstMatroskaDemux:matroskademux0: GStreamer encountered a general stream error.
Additional debug info:
matroska-demux.c(4492): gst_matroska_demux_loop (): /GstPipeline:pipeline0/GstMatroskaDemux:matroskademux0:
stream stopped, reason not-linked
Execution ended after 1096585 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
/GstPipeline:pipeline0/GstUDPSink:udpsink1.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = NULL
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = NULL
/GstPipeline:pipeline0/GstMatroskaDemux:matroskademux0.GstPad:audio_00: caps = NULL
/GstPipeline:pipeline0/GstMatroskaDemux:matroskademux0.GstPad:video_00: caps = NULL
Setting pipeline to NULL ...
Freeing pipeline ...
I converted the above file to YUV and streamed that instead. This works, but very slowly, at 2-3 fps.
If anyone knows how to fix the pipeline to demux the file properly, or how to increase the performance of x264enc in GStreamer, I would be very grateful!
You may try adding a queue element before the h264parse element. Also, specify the video src pad of matroskademux in the pipeline.
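A hedged sketch of what that answer could look like applied to the pipeline above (assuming GStreamer 0.10 pad naming, where the demuxer's video pad is video_00 as in the log; adjust the pad name for your version):
gst-launch -ve gstrtpbin name=rtpbin filesrc location=/home/user/Gstreamer_projects/test_videos/bbb_short_1080p.mkv ! matroskademux name=demux demux.video_00 ! queue ! h264parse ! rtph264pay ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! queue ! udpsink host=192.168.1.21 port=5000 rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.21 port=5001 sync=false async=false udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0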
