Can anyone help me generate a GStreamer pipeline that creates an encrypted HLS stream? The following works well, but I would like to add encryption to the final output.
gst-launch-1.0 -v souphttpsrc location=http://192.168.1.20/1.ts ! tsdemux program-number=1 name=tsmux tsmux.video_0_0044 ! queue ! muxer. tsmux.audio_0_0045 ! queue ! aacparse ! muxer. mpegtsmux name=muxer ! hlssink location="test/hlssink.%05d.ts" playlist-location="test/playlist.m3u8" max-files=6 target-duration=6
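As far as I know, hlssink has no built-in encryption, so one common workaround is to encrypt the finished segments yourself with AES-128 in CBC mode (the mode HLS specifies) and add an #EXT-X-KEY entry to the playlist. A minimal sketch, assuming openssl is available; the paths, key URI, and the stand-in segment are placeholders:

```shell
# Sketch only: hlssink writes plain segments, so encryption happens as a
# post-processing step. Paths and the key URI are placeholders.
set -e
mkdir -p test
KEY=$(openssl rand -hex 16)   # 16-byte AES-128 key, hex-encoded; serve the raw bytes at the key URI
IV=$(openssl rand -hex 16)    # one demo IV; per-segment IVs are also common

# Stand-in for a segment produced by hlssink (e.g. test/hlssink.00000.ts)
printf 'fake ts payload' > test/hlssink.00000.ts

# Encrypt the segment (HLS AES-128 mandates CBC mode)
openssl aes-128-cbc -e -K "$KEY" -iv "$IV" \
    -in test/hlssink.00000.ts -out test/hlssink.00000.ts.enc

# The playlist then needs a key line before the encrypted segments:
printf '#EXT-X-KEY:METHOD=AES-128,URI="hls.key",IV=0x%s\n' "$IV"
```

A real setup would encrypt every segment hlssink emits (for instance from a watcher loop) and serve the key file over HTTPS so players can fetch it.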
Hi, I am trying to open a video file using OpenCV with GStreamer support in Python. The idea is to grab frames from the file and simultaneously pass them to my Python 3 application for processing, while also encoding them to H.264 and sending them to a udpsink. Each of these streams works when run independently, but I run into errors when trying to run them together. This pipeline works if I pull from a web camera instead of a filesrc.
The code I used to open the cv2.VideoCapture is below. I am running this on a TX2 with JetPack 4.3 and a recompiled OpenCV 4.1.1.
video_stream = cv2.VideoCapture("filesrc location=video.mp4 ! \
qtdemux name=demux demux.video_0 ! h264parse ! omxh264dec ! tee name=t \
t. ! queue leaky=downstream ! nvvidconv flip-method=0 ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink \
t. ! queue leaky=downstream ! nvvidconv ! video/x-raw(memory:NVMM), width=(int)320, height=(int)240 ! omxh264enc ! video/x-h264, streamformat=byte-stream ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.1 port=1234")
The error I get is as follows
[ WARN:0] global /usr/local/src/opencv-4.1.1/modules/videoio/src/cap_gstreamer.cpp (1757) handleMessage OpenCV | GStreamer warning: Embedded video playback halted; module demux reported: Internal data stream error.
[ WARN:0] global /usr/local/src/opencv-4.1.1/modules/videoio/src/cap_gstreamer.cpp (886) open OpenCV | GStreamer warning: unable to start pipeline
[ WARN:0] global /usr/local/src/opencv-4.1.1/modules/videoio/src/cap_gstreamer.cpp (480) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created
Any suggestions on how I should proceed? Thanks!
I figured it out. I needed to add another nvvidconv before the tee. I'm not sure exactly why, but it allows the entire pipeline to flow correctly.
video_stream = cv2.VideoCapture("filesrc location=video.mp4 ! \
qtdemux name=demux demux.video_0 ! h264parse ! omxh264dec ! nvvidconv ! tee name=t \
t. ! queue leaky=downstream ! nvvidconv flip-method=0 ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink \
t. ! queue leaky=downstream ! nvvidconv ! video/x-raw(memory:NVMM), width=(int)320, height=(int)240 ! omxh264enc ! video/x-h264, streamformat=byte-stream ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.1 port=1234")
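For launch strings this long, it can help to assemble them in Python before handing them to cv2.VideoCapture. A sketch of the working pipeline above — build_pipeline is a hypothetical helper, not an OpenCV API, and passing cv2.CAP_GSTREAMER makes OpenCV route the string to its GStreamer backend explicitly instead of guessing:

```python
# Sketch: assemble the same launch string in Python so the two tee
# branches stay readable; build_pipeline is a hypothetical helper.
def build_pipeline(video="video.mp4", host="192.168.1.1", port=1234):
    decode = (f"filesrc location={video} ! qtdemux name=demux demux.video_0 "
              "! h264parse ! omxh264dec ! nvvidconv ! tee name=t")
    # Branch 1: BGR frames for the application via appsink
    app = ("t. ! queue leaky=downstream ! nvvidconv flip-method=0 "
           "! video/x-raw, format=(string)BGRx ! videoconvert "
           "! video/x-raw, format=(string)BGR ! appsink")
    # Branch 2: downscale, re-encode, and send over RTP/UDP
    udp = ("t. ! queue leaky=downstream ! nvvidconv "
           "! video/x-raw(memory:NVMM), width=(int)320, height=(int)240 "
           "! omxh264enc ! video/x-h264, stream-format=(string)byte-stream "
           "! h264parse ! rtph264pay config-interval=1 pt=96 "
           f"! udpsink host={host} port={port}")
    return " ".join([decode, app, udp])

# With the explicit backend flag:
#   video_stream = cv2.VideoCapture(build_pipeline(), cv2.CAP_GSTREAMER)
print(build_pipeline())
```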
I'm testing a satellite modem with a USB-to-serial (RS232) converter. I have already tested the connection, and it works: using minicom I can capture data sent from one terminal (running a bash script that echoes random numbers) to another.
To make this modem send things, I must send AT commands to it. What is the best way to do that? Should I just echo the AT commands from my bash script, or is there a better way?
#!/bin/bash
while true; do
    number=$RANDOM
    echo $number > /dev/ttyUSB0
    sleep 4
done
Thank you for your time!
Since you're talking to a modem, the usual approach is to use ModemManager, an application that handles the communication with the modem for you.
If you can't use ModemManager, or you must use bash for some reason, note that commands must end with \r\n to be accepted by the modem. The best way I have found to do that is /bin/echo, as follows:
/bin/echo -n -e "AT\r\n" > /dev/ttyUSB0
This ensures that echo converts the escape sequences into the carriage return and newline appropriately.
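If you want to verify the framing without the modem attached, a pseudo-terminal can stand in for /dev/ttyUSB0. This is just a sketch showing that the CR/LF pair reaches the device end intact:

```python
import os
import tty

master, slave = os.openpty()   # master stands in for the "modem" end of the link
tty.setraw(slave)              # raw mode, so the tty layer doesn't rewrite newlines
os.write(slave, b"AT\r\n")     # exactly the bytes /bin/echo -n -e "AT\r\n" produces
received = os.read(master, 16)
print(received)                # b'AT\r\n' -- both CR and LF survive the trip
```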
I got a working answer for this here:
https://unix.stackexchange.com/questions/97242/how-to-send-at-commands-to-a-modem-in-linux
First, using socat:
echo "AT" | socat - /dev/ttyUSB2,crnl
which worked great.
Then I tried atinout:
echo AT | atinout - /dev/ttyACM0 -
Ultimately I chose socat, because atinout isn't in any repository.
I am working on a script to upgrade firmware on a network switch. Two commands need to be run, but the second has to wait for the first to finish. I have used the after command, but I am just guessing at the timing. Is there a way that, when the switch comes back with 'TFTP to Flash Done.', the script moves on to the next command?
Here is my script so far:
CLI enable
CLI $deviceLogin
CLI $deviceEnablePwd
CLI copy tftp flash $tftpIP kxz10105.bin bootrom
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
after 120000
CLI copy tftp flash $tftpIP ICX64S08030h.bin primary
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
after 180000
CLI show clock
CLI reload after 00:00:01
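If the scripting host is Tcl-based, the fixed after delays can usually be replaced by waiting for the switch's completion string, Expect-style. A sketch only — the send/expect primitives, the spawn context, and the prompt handling are assumptions about your environment:

```tcl
# Sketch in Expect syntax -- adapt to whatever host runs these CLI commands.
set timeout 600                        ;# allow up to 10 minutes per TFTP copy
send "copy tftp flash $tftpIP kxz10105.bin bootrom\r"
expect "TFTP to Flash Done."           ;# block until the switch reports completion
send "copy tftp flash $tftpIP ICX64S08030h.bin primary\r"
expect "TFTP to Flash Done."
send "show clock\r"
send "reload after 00:00:01\r"
```

The key idea is that expect returns as soon as the pattern appears in the switch's output, so the script never waits longer than the copy actually takes (and times out cleanly if it never finishes).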
I'm attempting to stream an h264 encoded video using gstreamer and tcp. The command is:
gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! videoscale ! video/x-raw,width=800,height=600 ! x264enc key-int-max=12 ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink port=5000
The GOP size is set to 12, and the configuration is sent every second. I can't receive this stream with VLC (neither on the same machine nor on another machine). The VLC command is:
vlc rtp://localhost:5000
but nothing shows up. Can anyone help?
Regards
VLC doesn't understand GStreamer's GDP framing, so drop the gdppay and wrap the stream in a container VLC can read, like MPEG-TS:
gst-launch-1.0 -v videotestsrc ! x264enc key-int-max=12 byte-stream=true ! mpegtsmux ! tcpserversink port=8888 host=localhost
Now open it in VLC using tcp://localhost:8888.
I want to take an H.264 video, decode it, re-encode it as MJPEG, and stream it over TCP.
For this, I use a raspivid capture, which gives an H.264 output video, piped into a GStreamer pipeline that decodes, re-encodes, and transmits over TCP:
raspivid -n -t 0 -b 7000000 -fps 25 -o - | \
gst-launch-1.0 fdsrc ! video/x-h264,framerate=25/1,stream-format=byte-stream ! decodebin ! videorate ! video/x-raw,framerate=10/1 ! \
videoconvert ! jpegenc ! tcpserversink host=192.168.24.5 port=5000 &
To receive I use:
gst-launch-1.0 tcpclientsrc host=192.168.24.5 port=5000 ! jpegdec ! autovideosink
On my TCP server the CPU runs at 90% and I get no errors. You might think everything is fine, but...
On my tcp client I have this error:
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0:
streaming task paused, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Do you have any idea why my pipeline is broken?
Have you tried adding a videoconvert before the videosink yet?
Also, you should specify the caps for the TCP source, as the second pipeline needs to know at least the framerate. I'd do something like:
gst-launch-1.0 tcpclientsrc host=192.168.24.5 port=5000 ! image/jpeg, framerate=25/1 ! jpegparse ! jpegdec ! queue ! videoconvert ! autovideosink
If that still doesn't work, a GST_DEBUG=6 log from the receiving pipeline should help pinpoint the issue.