I want to write an app that streams audio and video over the network. As a simple start, I wanted to try out the audio part.
I want to stream vorbis-encoded audio from my microphone over the network. I started with the following pipeline for the receiver:
gst-launch-0.10 tcpserversrc host=localhost port=3000 ! \
  oggdemux ! \
  vorbisdec ! \
  audioconvert ! \
  audio/x-raw-int, endianness="(int)1234", signed="(boolean)true", width="(int)16", depth="(int)16", rate="(int)22000", channels="(int)1" ! \
  alsasink
And for the sender:
gst-launch-0.10 autoaudiosrc ! \
  audio/x-raw-int,rate=22000,channels=1,width=16 ! \
  audioconvert ! \
  vorbisenc ! \
  oggmux ! \
  tcpclientsink host=localhost port=3000
This kind of works, but the audio is choppy ("snatchy").
Can someone give me a hint on how I can Vorbis-encode and stream audio from my microphone smoothly over the network?
EDIT: I used audiotestsrc and made a recording of the output: http://db.tt/oDuQ2O41
I have tried the commands below (slightly modified from those in the original post),
and they solved the "snatchy" sound problem for me.
sender:
gst-launch-0.10 autoaudiosrc ! \
audio/x-raw-int, endianness="(int)1234", signed="(boolean)true", width="(int)16", depth="(int)16", rate="(int)22000", channels="(int)1" ! \
audioconvert ! \
vorbisenc ! \
oggmux max-delay=50 max-page-delay=50 ! \
tcpclientsink host=localhost port=3000
receiver:
gst-launch-0.10 tcpserversrc host=localhost port=3000 ! \
oggdemux ! \
vorbisdec ! \
audioconvert ! \
audio/x-raw-int, endianness="(int)1234", signed="(boolean)true", width="(int)16", depth="(int)16", rate="(int)22000", channels="(int)1" ! \
pulsesink
Change your sender pipeline to:
gst-launch-0.10 autoaudiosrc ! audio/x-raw-int,rate=22000,channels=1,width=16 ! audioconvert ! vorbisenc ! identity silent=true sync=true ! oggmux ! tcpclientsink host=localhost port=3000
The identity element with sync=true throttles the data generation rate to the pipeline clock.
Let me know if this works.
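A rough way to hear the effect of sync=true without involving the microphone is to feed the same sender pipeline from audiotestsrc (the source used for the recording linked in the question). This is only a sketch and simply reuses the host, port and caps from the question:

# Untested sketch: non-live source, throttled to the pipeline clock by identity sync=true.
# Assumes the receiver from the original post is already listening on localhost:3000.
gst-launch-0.10 audiotestsrc ! \
  audio/x-raw-int,rate=22000,channels=1,width=16 ! \
  audioconvert ! vorbisenc ! identity silent=true sync=true ! \
  oggmux ! tcpclientsink host=localhost port=3000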
Related
GStreamer: multiple webcam sources, Picture-in-Picture, to be muxed on a Jetson Nano and used as a pipeline with belabox (belabox.net)
I'm currently trying to pull two different USB webcams into one pipeline and create a Picture-in-Picture composite, but I keep getting an error like this:
GStreamer Error: gstreamer error from v4l2src1
This is the current pipeline I'm working on, which doesn't work:
v4l2src device=/dev/video0 ! image/jpeg,width=1920,height=1080,framerate=60/1 ! nvvidconv ! queue ! comp.sink_0 !
v4l2src device=/dev/video1 ! image/jpeg,width=800,height=448,framerate=60/1 ! videobox left=-4 right=-4 top=-4 bottom=-4 ! nvvidconv ! queue ! comp.sink_1 !
nvcompositor name=comp sink_0::width=1920 sink_0::height=1080 sink_1::width=640 sink_1::height=360 sink_1::xpos=1266 sink_1::ypos=706 sink_1::zorder=2 ! 'video/x-raw(memory:NVMM),format=RGBA,pixel-aspect-ratio=1/1' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' !
queue ! identity name=v_delay signal-handoffs=TRUE !
nvv4l2decoder mjpeg=1 enable-max-performance=true ! nvvidconv !
textoverlay text='' valignment=top halignment=right font-desc="Monospace, 5" name=overlay ! queue !
nvvidconv interpolation-method=5 !
nvv4l2h265enc control-rate=1 qp-range="28,50:0,38:0,50" iframeinterval=60 preset-level=4 maxperf-enable=true EnableTwopassCBR=true insert-sps-pps=true name=venc_bps !
h265parse config-interval=-1 ! queue max-size-time=10000000000 max-size-buffers=1000 max-size-bytes=41943040 ! mux.
alsasrc device=hw:2 ! identity name=a_delay signal-handoffs=TRUE ! volume volume=1.0 !
audioconvert ! opusenc bitrate=320000 ! opusparse ! queue max-size-time=10000000000 max-size-buffers=1000 ! mux.
mpegtsmux name=mux !
appsink name=appsink
Please note: I have two separate pipelines, one per webcam, so I know the cameras work and can be used; I just need them combined into a single pipeline that produces a picture-in-picture composite and works in the belabox environment.
For reference, this is a working single-webcam pipeline that works with belabox:
v4l2src device=/dev/video0 ! image/jpeg,width=1920,height=1080,framerate=60/1 !
identity name=v_delay signal-handoffs=TRUE !
nvv4l2decoder mjpeg=1 enable-max-performance=true ! nvvidconv !
textoverlay text='' valignment=top halignment=right font-desc="Monospace, 5" name=overlay ! queue !
nvvidconv interpolation-method=5 !
nvv4l2h265enc control-rate=1 qp-range="28,50:0,38:0,50" iframeinterval=60 preset-level=4 maxperf-enable=true EnableTwopassCBR=true insert-sps-pps=true name=venc_bps !
h265parse config-interval=-1 ! queue max-size-time=10000000000 max-size-buffers=1000 max-size-bytes=41943040 ! mux.
alsasrc device=hw:2 ! identity name=a_delay signal-handoffs=TRUE ! volume volume=1.0 !
audioconvert ! voaacenc bitrate=128000 ! aacparse ! queue max-size-time=10000000000 max-size-buffers=1000 ! mux.
mpegtsmux name=mux !
appsink name=appsink
Any ideas?
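I can't offer a tested fix, but for comparison, the usual Jetson pattern is to decode each MJPEG camera before the compositor and keep every branch in NVMM memory. The sketch below only illustrates that shape: the device paths, resolutions and compositor pad settings are copied from the question, and fakesink stands in for the real encoder/mux tail, so treat it as a starting point rather than a working belabox pipeline:

# Sketch only: decode first, convert to NVMM RGBA, then composite.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! image/jpeg,width=1920,height=1080,framerate=60/1 ! \
    nvv4l2decoder mjpeg=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_0 \
  v4l2src device=/dev/video1 ! image/jpeg,width=800,height=448,framerate=60/1 ! \
    nvv4l2decoder mjpeg=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_1 \
  nvcompositor name=comp sink_0::width=1920 sink_0::height=1080 \
    sink_1::width=640 sink_1::height=360 sink_1::xpos=1266 sink_1::ypos=706 sink_1::zorder=2 ! \
  'video/x-raw(memory:NVMM),format=RGBA' ! nvvidconv ! fakesink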
I have used the following DeepStream pipeline in a MediaPlayer object in QML. It works fine and puts the video output in a newly created window.
Rectangle
{
MediaPlayer
{
id: cameraVideo
autoPlay: true
source: "gst-pipeline: uridecodebin3
uri=\"rtspt://192.168.1.118:8080/h264_ulaw.sdp\" ! queue !
nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=1 batched-push-timeout=40000
width=800 height=600 live-source=TRUE ! queue ! nvvideoconvert
! queue ! nvinfer config-file-path=\"/opt/nvidia/deepstream
/deepstream/samples/configs/deepstream-app/config_infer_primary_nano.txt\" !
queue ! nvmultistreamtiler ! queue ! nvtracker tracker-width=240 tracker-height=200
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_mot_iou.so
ll-config-file=/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/iou_config.txt ! queue !
nvdsosd process-mode=HW_MODE ! queue ! nvegltransform ! nveglglessink"
}
VideoOutput
{
anchors.fill: parent
source: cameraVideo
}
}
But I want to put the video output in a specific QML Rectangle, which is the parent of the MediaPlayer object. It seems that I must use qtvideosink instead of nveglglessink, but which plugins should I use to transform the video stream for qtvideosink?
By the way, the app is running on an NVIDIA Jetson Nano.
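I don't know the exact qtvideosink chain for this setup, but the general constraint is that a QML-embedded sink needs frames in ordinary system memory, while nvegltransform/nveglglessink consume NVMM buffers. The sketch below only illustrates that NVMM-to-system-memory hand-off with a test source; nvvideoconvert and the RGBA caps are assumptions to adapt, not a confirmed recipe for DeepStream plus Qt:

# Illustration only: in the real pipeline the tail after nvdsosd would end in
# something like "nvvideoconvert ! video/x-raw,format=RGBA ! qtvideosink".
gst-launch-1.0 videotestsrc num-buffers=300 ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM),format=RGBA' ! \
  nvvideoconvert ! 'video/x-raw,format=RGBA' ! videoconvert ! autovideosink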
We are using the Datastore Java client:
compile group: 'com.google.cloud', name: 'google-cloud-datastore', version: '0.11.2-beta'
But we are getting the following error while executing any query:
WARN [2017-04-11 19:43:33,556] [ ] org.eclipse.jetty.util.component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server#bfe47a8: java.lang.NoSuchMethodError: com.google.protobuf.AbstractMessage.newBuilderForType(Lcom/google/protobuf/AbstractMessage$BuilderParent;)Lcom/google/protobuf/Message$Builder;
! java.lang.NoSuchMethodError: com.google.protobuf.AbstractMessage.newBuilderForType(Lcom/google/protobuf/AbstractMessage$BuilderParent;)Lcom/google/protobuf/Message$Builder;
! at com.google.protobuf.SingleFieldBuilderV3.getBuilder(SingleFieldBuilderV3.java:142)
! at com.google.protobuf.RepeatedFieldBuilderV3.addBuilder(RepeatedFieldBuilderV3.java:413)
! at com.google.datastore.v1.Query$Builder.addKindBuilder(Query.java:1713)
! at com.google.cloud.datastore.StructuredQuery.toPb(StructuredQuery.java:945)
! at com.google.cloud.datastore.StructuredQuery.populatePb(StructuredQuery.java:924)
! at com.google.cloud.datastore.QueryResultsImpl.sendRequest(QueryResultsImpl.java:72)
! at com.google.cloud.datastore.QueryResultsImpl.<init>(QueryResultsImpl.java:57)
! at com.google.cloud.datastore.DatastoreImpl.run(DatastoreImpl.java:84)
! at com.google.cloud.datastore.DatastoreImpl.run(DatastoreImpl.java:75)
! at com.nis.userService.core.managed.DeltaContactConsumer.start(DeltaContactConsumer.java:33)
A sample can be found at: https://gist.github.com/Saurabh111191/2afb12dc13e71c361291c1bf3cc0b4f8
We have also tried other client libraries, such as Objectify, but get the same error.
Can someone help us out here?
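For what it's worth, a NoSuchMethodError on com.google.protobuf.AbstractMessage.newBuilderForType is typically a symptom of an older protobuf-java (2.x) being resolved on the classpath ahead of the protobuf 3.x that google-cloud-datastore needs. A hedged first step is to check which protobuf versions Gradle actually resolves, for example (the configuration name is an assumption for an older Gradle build):

# Check which protobuf-java versions are resolved; adjust the configuration name to your build.
./gradlew dependencies --configuration compile | grep -i protobuf
# If a 2.x version shows up next to 3.x, forcing a single version in build.gradle is one option, e.g.
#   configurations.all { resolutionStrategy.force 'com.google.protobuf:protobuf-java:3.x.y' }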
I'm trying to customise the deploy scripts to allow me to deploy each of my four API proxies from the command line. The script looks very similar to the one provided in the samples on GitHub:
#!/bin/bash
if [[ $# -eq 0 ]] ; then
echo 'Must provide proxy name.'
exit 0
fi
dirname=$1
proxyname="teamname-"$dirname
source ./setup/setenv.sh
echo "Enter your password for user $username in the Apigee Enterprise organization $org, followed by [ENTER]:"
read -s password
echo Deploying $proxyname to $env on $url using $username and $org
./tools/deploy.py -n $proxyname -u $username:$password -o $org -h $url -e $env -p / -d ./$dirname
echo "If 'State: deployed', then your API Proxy is ready to be invoked."
echo "Run '$ sh invoke.sh'"
echo "If you get errors, make sure you have set the proper account settings in /setup/setenv.sh"
However, when I run it, I get the following response:
Deploying teamname-gameassets to int on https://api.enterprise.apigee.com using my-email-address and org-name
Writing ./gameassets/teamname-gameassets.xml to ./teamname-gameassets.xml
Writing ./gameassets/policies/Add-CORS.xml to policies/Add-CORS.xml
Writing ./gameassets/proxies/default.xml to proxies/default.xml
Writing ./gameassets/targets/development.xml to targets/development.xml
Writing ./gameassets/targets/production.xml to targets/production.xml
Import failed to /v1/organizations/org-name/apis?action=import&name=teamname-gameassets with status 500:
{
"code" : "messaging.config.beans.ImportFailed",
"message" : "Failed to import the bundle : java.lang.NullPointerException",
"contexts" : [ ],
"cause" : {
"contexts" : [ ]
}
}
How should I go about debugging when I receive errors during the deploy process? Is there some sort of console I can view once logged in to Apigee?
I'm not sure how your proxy ended up this way, but it looks like the top-level directory is named "gameassets". It should be named "apiproxy". If you rename this directory, you should see a successful deployment.
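To make that concrete, the import step expects a bundle rooted at a directory literally called apiproxy. Using the file names from the output above (a sketch of the layout, not Apigee documentation), it would look roughly like:

apiproxy/
    teamname-gameassets.xml
    policies/
        Add-CORS.xml
    proxies/
        default.xml
    targets/
        development.xml
        production.xml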
Also, before you customize too much, please try out "apigeetool," which is a more flexible command-line tool for deploying proxies:
https://github.com/apigee/api-platform-tools
I am working on a C computational function for a Scilab Xcos block. When trying to compile and link the code to Scilab using ilib_for_link('DO13','do13.c',[],"c"), I get:
Generate a loader file
Generate a Makefile
Running the Makefile
Compilation of do13.c
Building shared library (be patient)
!------------- Compile file do13.c -------------- !

IF NOT EXIST Release mkdir Release

cl -D__MSC__ -DFORDLL -D_WIN64 -c -DSTRICT -D_CRT_SECURE_NO_DEPRECATE -D__MAKEFILEVC__ -nologo
   -I"C:/PROGRA~1/SCILAB~1.3/libs/MALLOC/includes" -I"C:/PROGRA~1/SCILAB~1.3/modules/core/includes"
   -I"C:/PROGRA~1/SCILAB~1.3/modules/api_scilab/includes" -I"C:/PROGRA~1/SCILAB~1.3/modules/call_scilab/includes"
   -I"C:/PROGRA~1/SCILAB~1.3/modules/output_stream/includes" -I"C:/PROGRA~1/SCILAB~1.3/modules/jvm/includes"
   -I"C:/PROGRA~1/SCILAB~1.3/modules/localization/includes" -I"C:/PROGRA~1/SCILAB~1.3/modules/dynamic_link/includes"
   -I"C:/PROGRA~1/SCILAB~1.3/modules/mexlib/includes" -I"C:/PROGRA~1/SCILAB~1.3/modules/time/includes"
   -I"C:/PROGRA~1/SCILAB~1.3/modules/windows_tools/includes" -I"C:/PROGRA~1/SCILAB~1.3/libs/f2c"
   -I"C:/PROGRA~1/SCILAB~1.3/libs/hashtable" -I"C:/PROGRA~1/SCILAB~1.3/libs/intl"
   -W3 -Gd -Z7 -O2 -MT /Fo"Release/" /Fd"Release/" -DFORDLL do13.c

do13.c
do13.c(1) : fatal error C1083: Cannot open include file: 'scicos/scicos_block4.h': No such file or directory

NMAKE : fatal error U1077: '"C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\BIN\amd64\cl.EXE"' : return code '0x2'
Stop.

!--error 10000
ilib_compile: Error while executing Makelib.mak.
at line 76 of function ilib_compile called by :
at line 90 of function ilib_for_link called by :
I am not an expert in programming. What could these messages mean?
Just including scicos_block4.h instead of scicos/scicos_block4.h will fix the issue.