I would like to get the playback rate of a playing GStreamer pipeline.
I can get duration of media with:
gst_element_query_duration(pipelinePlayMsg, GST_FORMAT_TIME, &duration);
But there is no corresponding function for the playback speed, such as a hypothetical:
gst_element_query_rate(pipelinePlayMsg, GST_FORMAT_TIME, &rate);
How can I get the playback rate (speed)?
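There is no dedicated rate query, but the current rate travels in the segment, so a segment query can recover it. A minimal sketch, assuming GStreamer 1.x (the `pipeline` parameter name is mine):

```cpp
#include <gst/gst.h>

/* Query the current playback rate from a playing pipeline.
 * The rate is part of the segment, so we issue a segment query
 * and parse the rate back out. Returns 1.0 if the query fails. */
static gdouble
get_playback_rate (GstElement *pipeline)
{
  gdouble rate = 1.0;
  GstQuery *query = gst_query_new_segment (GST_FORMAT_TIME);

  if (gst_element_query (pipeline, query)) {
    GstFormat format;
    gint64 start, stop;
    gst_query_parse_segment (query, &rate, &format, &start, &stop);
  }
  gst_query_unref (query);
  return rate;
}
```

Whether the segment query is answered depends on the elements in the pipeline; the robust fallback is simply to remember the rate you last passed to gst_event_new_seek().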
Kind regards, Thomas
I'm looking to develop an ultrasonic time-of-flight (ToF) sensor and have been looking at high-speed timing circuits. Many designs use the TDC7200, but ST offers the STM32F334, which has a high-resolution timer with an apparent 217 ps resolution.
What I'm wondering is whether this timer can actually be used to measure time with 217 ps between count values (assuming it's run at the maximum clock rate)?
Does anyone have experience using this microcontroller's high-resolution timer like this?
According to the reference manual it should be possible:
High-Resolution Timer (HRTIM): For control and monitoring purposes, the timer also has timing measure capabilities and links to built-in ADC and DAC converters. Last, it features light-load management mode and is able to handle various fault schemes for safe shut-down purposes. --page 626
The timing unit has the capability to capture the counter value, triggered by internal and
external events. The purpose is to:
• measure events arrival timings or occurrence intervals --page 641
I am currently working on a video-to-video application, which uses the vlcj API 2.2.0 for the media streaming.
What I want to do is calculate the dropped frames of the remote video stream. Specifically, I have set an upper limit on the maximum FPS, and thus the calculation should be:
lostFPS = maximumFPS - currentFPS.
I saw in the javadoc of vlcj that the current FPS is provided by the getFPS function, but it always returns 0 for some reason, even though the video is streamed normally (both locally and remotely).
Does anyone know of alternative ways to calculate this loss, or am I missing something?
Best regards,
giannis
libVLC provides statistics about the currently playing media, and vlcj exposes them:
libvlc_media_stats_t stats = mediaPlayer.getMediaStatistics();
int droppedFrames = stats.i_lost_pictures;
I am using DirectShow in my application to capture video from webcams. I have issues when using cameras to preview and capture 1080p video, e.g. Logitech's HD Pro Webcam C910.
The 1080p video preview was very jerky and no HD clarity was observed. I could see that the enumerated device name was "USB Video Device".
Today we installed the Logitech webcam software on these XP machines. In that application, we could see the 1080p video without any jerkiness. We also recorded 1080p video in the Logitech application and it played back in high quality.
But when I test my application, I can see that the enumerated device name has changed to "Logitech Pro Webcam C910" instead of "USB Video Device" as in the previous case.
The CPU eaten up by my application is 20%, but the "System" process eats up 60%+ and the overall CPU hovers around 100%.
Even though the video quality has greatly improved, the jerks are still there, maybe due to the 100% CPU.
When I close my application, the high CPU utilization by the "System" process goes away.
Regarding my application - It uses ICaptureGraphBuilder2::RenderStream to create Preview and Capture streams.
In Capture Stream, I connect Camera filter to NULL renderer with sample grabber as the intermediate filter.
In preview stream, I have
g_pBuild->RenderStream(&PIN_CATEGORY_PREVIEW,&MEDIATYPE_Video,cam,NULL,NULL);
Preview is displayed in a window as specified using the IVideoWindow interface. I use the following:
g_vidWin->put_Owner((OAHWND)(HWND)hWnd);
g_vidWin->put_WindowStyle(WS_CHILD | WS_CLIPSIBLINGS);
g_vidWin->put_MessageDrain((OAHWND)hWnd);
I tried setting the frame rate to different values (AvgTimePerFrame = 500000, i.e. 20 fps; 666667, i.e. 15 fps; etc.).
But all the trials give the same result: clarity has improved, but some jerks remain and the CPU is almost at 100% due to 60+% utilization by "System". When I close my video application, usage by "System" goes back to 1-2%.
Any help on this is most welcome.
Thanks in advance,
Use IAMStreamConfig.SetFormat() to select the frame rate, dimensions, color space, and compression of the output streams (Capture and Preview) from a capture device.
Aside: the comment above, "It doesn't change the source filter's own rate", is completely wrong. The whole purpose of this interface is to define the output format and frame rate of the captured video.
Use IAMStreamConfig.GetStreamCaps() to determine what frame rates, dimensions, color spaces, and compression formats are available. Most cameras provide a number of different formats.
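A sketch of that enumerate-then-set pattern, assuming `pBuild` is your ICaptureGraphBuilder2 and `pCam` the camera filter (error handling trimmed; DeleteMediaType comes from the DirectShow base-class library):

```cpp
#include <dshow.h>

// Enumerate the capture pin's formats and request 1080p at 15 fps (sketch).
HRESULT SelectFormat(ICaptureGraphBuilder2 *pBuild, IBaseFilter *pCam)
{
    IAMStreamConfig *pConfig = NULL;
    HRESULT hr = pBuild->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                       pCam, IID_IAMStreamConfig,
                                       (void**)&pConfig);
    if (FAILED(hr)) return hr;

    int count = 0, size = 0;
    pConfig->GetNumberOfCapabilities(&count, &size);

    for (int i = 0; i < count; i++) {
        VIDEO_STREAM_CONFIG_CAPS caps;
        AM_MEDIA_TYPE *pmt = NULL;
        if (SUCCEEDED(pConfig->GetStreamCaps(i, &pmt, (BYTE*)&caps))) {
            if (pmt->formattype == FORMAT_VideoInfo && pmt->pbFormat) {
                VIDEOINFOHEADER *vih = (VIDEOINFOHEADER*)pmt->pbFormat;
                // Example: take the first 1920x1080 mode, ask for 15 fps.
                if (vih->bmiHeader.biWidth == 1920 &&
                    vih->bmiHeader.biHeight == 1080) {
                    vih->AvgTimePerFrame = 666667;  // 100-ns units -> 15 fps
                    hr = pConfig->SetFormat(pmt);
                    DeleteMediaType(pmt);
                    break;
                }
            }
            DeleteMediaType(pmt);
        }
    }
    pConfig->Release();
    return hr;
}
```

In a real application you would also inspect caps (the min/max frame intervals and sizes) rather than hard-coding a mode, and prefer an uncompressed or MJPEG subtype depending on your USB bandwidth budget.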
It sounds like the fundamental issue you're facing is that USB bandwidth (at least prior to USB3) can't sustain 30fps 1080P without compression. I'm most familiar with the Microsoft LifeCam Studio family of USB cameras, and these devices perform hardware compression to send the video over the wire, and then eat up a substantial fraction of your CPU on the receiving end converting the compressed video from Motion JPEG into a YUV format. Presumably the Logitech cameras work in a similar fashion.
The frame rate that cameras produce is influenced by the additional workload of performing auto-focus, auto-color correction, and auto-exposure in software. Try disabling all these features on your camera if possible. In the era of Skype, camera software and hardware have become less attentive to maintaining a high frame rate in favor of better image quality.
The DirectShow timing model for capture continues to work even if the camera can't produce frames at the requested rate, as long as the camera indicates that frames are missing. It does this using a "dropped frame" count field that rides along with each captured frame. The sum of the dropped frames plus the "real" frames must equal the requested frame rate set via IAMStreamConfig.SetFormat().
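If you want to observe that count yourself, the capture pin typically exposes IAMDroppedFrames. A sketch, using the same FindInterface pattern (the `pBuild`/`pCam` names are mine):

```cpp
#include <dshow.h>

// Read the capture pin's running dropped-frame counter (sketch).
long QueryDroppedFrames(ICaptureGraphBuilder2 *pBuild, IBaseFilter *pCam)
{
    IAMDroppedFrames *pDropped = NULL;
    long nDropped = 0;
    if (SUCCEEDED(pBuild->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                        pCam, IID_IAMDroppedFrames,
                                        (void**)&pDropped))) {
        pDropped->GetNumDropped(&nDropped);  // frames dropped since start
        pDropped->Release();
    }
    return nDropped;
}
```

Polling this while the graph runs tells you whether the camera is genuinely starving (dropped count climbing) or whether the jerkiness comes from downstream rendering load.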
Using the LifeCam Studio on an I7 I have captured at 30fps 720p with preview, compressed to H.264 and written an .mp4 file to disk using around 30% of the CPU, but only if all the auto-focus/color/exposure settings on the camera are disabled.
I am able to extract samples from a video using the ReadSample method. Now how can I play the data present in those samples? Or, how do I play an IMFSample?
An IMFSample is a block of data, such as a video frame or a chunk of an audio sequence. It is too tiny a piece of data to be played alone. The API addresses more sophisticated playback scenarios, where playback is a session in which one or more streams are streamed in sync.
Be sure to check Getting Started with MFPlay on MSDN to see how playback is set up with Media Foundation.
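If the goal is simply to play the file (rather than hand-feeding samples), the MFPlay route from that article boils down to something like this sketch (the file path and window are hypothetical; error handling and shutdown omitted):

```cpp
#include <mfplay.h>

// Minimal MFPlay use: hand the whole file to the player and let it
// build the pipeline internally. You do not feed it IMFSamples yourself.
HRESULT PlayFile(HWND hwndVideo)
{
    IMFPMediaPlayer *pPlayer = NULL;
    HRESULT hr = MFPCreateMediaPlayer(
        L"C:\\example\\clip.mp4",  // hypothetical path
        TRUE,                      // start playback immediately
        0,                         // creation options
        NULL,                      // player callback (none in this sketch)
        hwndVideo,                 // window to render video into
        &pPlayer);
    return hr;
}
```

If you genuinely need to render individual IMFSamples pulled from an IMFSourceReader, you have to assemble a pipeline yourself (e.g. a media session with an audio/video renderer as the sink), which is considerably more involved.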
FMLE = Flash Media Live Encoder 3.0
I have posted this question on the Adobe forum, but I'm not sure they have people there with programming experience.
I am a developer writing a video capture device and an audio capture device. The devices already work in other encoders. The devices are written in DirectShow. I am integrating with FMLE and have encountered this problem.
The audio device doesn't have a usable volume bar in FMLE. The FMLE error is: "The selected audio device "censored (company secret)" doesn't allow setting volume intensity. Disabling the volume slider control."
My audio device implements these interfaces along with the standard DirectShow filter interfaces:
IBasicAudio
IAMAudioInputMixer
I put tracepoints in QueryInterface and found that FMLE queries for (my comments after the interface IDs):
{IID_IUnknown}
{IID_IPersistPropertyBag}
{IID_IBaseFilter}
{IID_IAMOpenProgress}
{IID_IAMDeviceRemoval}
{IID_IMediaFilter}
{IID_IAMBufferNegotiation}
{IID_IAMStreamConfig}
{IID_IPin}
{IID_IReferenceClock}
{IID_IMediaSeeking}
{IID_IMediaPosition}
{IID_IVideoWindow} // why query IVideoWindow on an audio filter?
{IID_IBasicAudio}
{2DD74950-A890-11D1-ABE8-00A0C905F375} // I think this is the async stream interface
What am I missing? Does FMLE not use IAMAudioInputMixer?
Does anyone know the exact interface FMLE uses for volume intensity? I assumed it was IBasicAudio, but it doesn't seem to call any methods there.
Answer provided by Ram Gupta of the Adobe forum:
"FMLE does not query for CLSID_AudioInputMixerProperties interface.
FMLE enumerates all the pin of audio source filter(using EnumPins) and then it extracts each pin info using QueryPinInfo Function.
FMLE searches for the audio filter Pin whose direction is PINDIR_INPUT(using QueryPinInfo) and then it queries for IAMAudioInputMixer interface to set the volume level.
Could you pls chk if the following functions are properly implemented
-->get_enable: it should set its parameter value to true.
-->put_MixLevel
-->QueryPinInfo:"
This solution did work. My problem was that I had never declared an input pin (since I don't have any DirectShow-related input).
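In other words, the mixer interface has to live on an input pin of the filter, not on the filter itself. A skeletal sketch of such a pin, assuming the DirectShow base-class library (class and pin names are hypothetical; only the methods FMLE is reported to care about do real work):

```cpp
#include <streams.h>  // DirectShow base classes

// Dummy input pin whose job is to expose IAMAudioInputMixer,
// which FMLE queries to drive its volume slider.
class CMixerInputPin : public CBaseInputPin, public IAMAudioInputMixer
{
    double m_mixLevel;
public:
    CMixerInputPin(CBaseFilter *pFilter, CCritSec *pLock, HRESULT *phr)
        : CBaseInputPin(NAME("Mixer In"), pFilter, pLock, phr, L"In"),
          m_mixLevel(1.0) {}

    DECLARE_IUNKNOWN
    STDMETHODIMP NonDelegatingQueryInterface(REFIID riid, void **ppv)
    {
        if (riid == IID_IAMAudioInputMixer)
            return GetInterface((IAMAudioInputMixer*)this, ppv);
        return CBaseInputPin::NonDelegatingQueryInterface(riid, ppv);
    }

    // Accept anything; this pin never actually receives data.
    HRESULT CheckMediaType(const CMediaType *) { return S_OK; }

    // FMLE expects the pin to report itself as enabled...
    STDMETHODIMP get_Enable(BOOL *pfEnable) { *pfEnable = TRUE; return S_OK; }
    STDMETHODIMP put_Enable(BOOL) { return S_OK; }

    // ...and sets the volume through put_MixLevel (0.0 to 1.0).
    STDMETHODIMP put_MixLevel(double Level) { m_mixLevel = Level; return S_OK; }
    STDMETHODIMP get_MixLevel(double *pLevel) { *pLevel = m_mixLevel; return S_OK; }

    // Remaining IAMAudioInputMixer methods stubbed out for this purpose.
    STDMETHODIMP put_Mono(BOOL) { return E_NOTIMPL; }
    STDMETHODIMP get_Mono(BOOL *) { return E_NOTIMPL; }
    STDMETHODIMP put_Pan(double) { return E_NOTIMPL; }
    STDMETHODIMP get_Pan(double *) { return E_NOTIMPL; }
    STDMETHODIMP put_Loudness(BOOL) { return E_NOTIMPL; }
    STDMETHODIMP get_Loudness(BOOL *) { return E_NOTIMPL; }
    STDMETHODIMP put_Treble(double) { return E_NOTIMPL; }
    STDMETHODIMP get_Treble(double *) { return E_NOTIMPL; }
    STDMETHODIMP put_Bass(double) { return E_NOTIMPL; }
    STDMETHODIMP get_Bass(double *) { return E_NOTIMPL; }
};
```

The filter then reports this pin from its own EnumPins/QueryPinInfo with direction PINDIR_INPUT, which is what FMLE's enumeration looks for before querying IAMAudioInputMixer.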