I wrote an app for video capture. That app uses the following graph:
As you can see, after the Smart Tee the graph has two branches. I use the first, "Capture", for stream handling, and the second, "Preview", to show the video in the app's window.
Sometimes the user minimizes that window and the Preview branch is not needed. In that case I would like to stop the stream on this branch only.
I can do that by stopping the whole graph and rebuilding it without the Preview branch, but I would rather not stop and rebuild the graph. Does anybody know another way to do this? Any ideas?
It's not possible to just stop a part of a graph. For such a scenario you need multiple graphs (source, grabber, preview) and the GMFBridge.
By the way, why do you need the tee? Can't you connect the video-renderer to the grabber?
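For what it's worth, a conceptual sketch of the two-graph idea follows; it leaves out the actual GMFBridge sink/source filters, and pCaptureGraph/pPreviewGraph are assumed names for the two graphs you would build:

#include <atlbase.h>
#include <dshow.h>

CComPtr<IGraphBuilder> pCaptureGraph;  // source -> Sample Grabber -> bridge sink
CComPtr<IGraphBuilder> pPreviewGraph;  // bridge source -> video renderer

// Stop only the preview when the window is minimized; the capture graph keeps running.
void OnMinimize()
{
    CComQIPtr<IMediaControl> pPreviewControl(pPreviewGraph);
    if (pPreviewControl)
        pPreviewControl->Stop();
}

// Restart the preview when the window is restored.
void OnRestore()
{
    CComQIPtr<IMediaControl> pPreviewControl(pPreviewGraph);
    if (pPreviewControl)
        pPreviewControl->Run();
}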
I have code that generates thumbnails from JPEGs. It pulls an image from S3 and then generates the thumbs.
About one in every 3,000 files ends up looking like this. It happens in batches. The high-res version looks like this, and they are all resized down to low-res. It does not fail on resize, and I can go to my S3 bucket and see that the original file is indeed intact.
I had this code written in Ruby and ported it all to Clojure hoping that would fix my issue, but it's still happening.
What would result in a JPEG that looks like this?
I'm using standard image copying code like so
(with-open [in  (clojure.java.io/input-stream uri)
            out (clojure.java.io/output-stream file)]
  (clojure.java.io/copy in out))
Would there be any way to detect that the transfer didn't go well, in Clojure? ImageMagick? Any other command-line tool?
My guess is that it is one of two possible issues (you know your code, so you can probably rule one out quickly):
You are running out of memory. If the whole batch of processing is happening at once, the first few are probably not being released until the whole process is completed.
You are running out of time. You may be reaching your maximum execution time for the script.
Implementing some logging as the batches are processed could tell you when the issue happens and what the overall state is at that moment.
I'm using the DirectShow SampleGrabber in callback mode to capture video frames from a source file and do some processing. I would also like to maintain the current playback rate of the video, and I need to support random, forward, and backward seeking. For this I'm also doing some local buffering in a different thread.
I'm running the graph with the sync source set to NULL so as to get maximum speed. However, when I pause the graph after a fixed amount of buffering, the SampleGrabber callback still gets called spuriously even though the graph is paused. This is affecting my frame indexing and tracking. I want to resume the graph from exactly the position at which it was paused. If I run the graph with the default clock it works fine, but then my playback is affected, and I want the buffering thread to finish as soon as possible.
How can I make sure that callback is not called when graph is paused? Any thoughts or suggestion would be of great help.
Thanks in advance
Pradeep
A paused graph typically has all the same streaming going on internally (it is an active state), except that the renderers block streaming, in particular as soon as enough data has been received for a preview frame. Since you removed the clock from the graph, your renderer likely does not block execution because it holds no clock to pause against. In your case the problem comes from your intent to reuse the same graph both for quickly parsing through the file and for playback. A separate-graph design looks like it has a better chance of doing well.
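If you keep the single-graph design for now, one application-side workaround (only a sketch, not something the answer above prescribes) is to gate the callback with your own flag, toggled by the same code that calls IMediaControl::Pause and Run; the class and member names here are purely illustrative:

#include <atomic>
#include <dshow.h>
#include <qedit.h>   // ISampleGrabberCB; deprecated header, assumed to be available

class FrameGrabberCB : public ISampleGrabberCB
{
public:
    std::atomic<bool> m_bAcceptSamples{ true };   // cleared by the code that pauses the graph

    STDMETHODIMP BufferCB(double SampleTime, BYTE* pBuffer, long BufferLen)
    {
        if (!m_bAcceptSamples)
            return S_OK;              // drop samples delivered while "paused"
        // ... index and buffer the frame here ...
        return S_OK;
    }

    STDMETHODIMP SampleCB(double, IMediaSample*) { return S_OK; }

    // Minimal, sample-style IUnknown for an object owned by the application.
    STDMETHODIMP QueryInterface(REFIID riid, void** ppv)
    {
        if (riid == IID_IUnknown || riid == IID_ISampleGrabberCB)
        {
            *ppv = static_cast<ISampleGrabberCB*>(this);
            return S_OK;
        }
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef() { return 2; }
    STDMETHODIMP_(ULONG) Release() { return 1; }
};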
I have a DirectShow webcam application. I make use of Sample Grabber to get the buffer callbacks and IVideoWindow to control the display co-ordinates for the Preview. I have Preview and Capture Streams which I run as below.
g_pBuild->RenderStream(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video, cam, g_pGrabberF, pNullRenderer2);
g_pBuild->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video, cam, NULL, NULL);
On certain on-board cameras, IMediaControl::Run followed by IMediaControl::Stop followed by IMediaControl::Run doesn't switch the camera on.
External USB cameras work properly here. How can I diagnose this further? Any pointers, please help.
Maybe it's specific to a hardware issue in that unit.
Do a quick test by adding a sleep of 1 second between the calls.
If it does help, then you need to find a way to know whether the unit is in an idle state or not.
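A minimal version of that test might look like this (pControl is assumed to be the graph's IMediaControl):

#include <windows.h>
#include <dshow.h>

// Quick diagnostic only: insert a delay between Stop and the second Run to see
// whether the on-board camera simply needs time to settle.
void RestartWithDelay(IMediaControl* pControl)
{
    pControl->Stop();
    Sleep(1000);          // 1 second between the calls, as suggested above
    pControl->Run();      // does the camera come back on now?
}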
There are two important parts of the question which you did not provide:
Filter graph topologies
HRESULTs of the method calls
A problem you might be having is that one of the filters in the topology does not handle state transitions well and fails somewhere between states. Presumably your second Run finds it still trying to complete the Stop. You might get an HRESULT there which indicates the issue (better for you), or the filter may fail silently.
The filter graph itself is an unlikely source of the bug. Chances are high that it does everything flawlessly; however, since it internally distributes the calls between the filters, one of the filters is letting you down.
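A sketch of the kind of checking meant here, assuming pControl is the graph's IMediaControl: log the HRESULT of every transition and wait for the previous one to actually complete before issuing the next.

#include <dshow.h>

HRESULT RunAndWait(IMediaControl* pControl)
{
    HRESULT hr = pControl->Run();
    if (FAILED(hr))
        return hr;                           // a filter rejected the transition outright

    OAFilterState state = State_Stopped;
    hr = pControl->GetState(5000, &state);   // wait up to 5 seconds for the transition
    // S_OK: all filters completed the transition; VFW_S_STATE_INTERMEDIATE: still in
    // progress; other values point at the filter that is letting you down.
    return hr;
}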
I was using a callback mechanism to grab the webcam frames in my media application. It worked, but was slow due to certain additional buffer functions that were performed within the callback itself.
Now I am trying the other way to get frames: call a method and grab the frame (instead of a callback). I used a sample on CodeProject which makes use of IVMRWindowlessControl9::GetCurrentImage.
I encountered the following issues.
With a Microsoft webcam, the preview didn't render (only a black screen) on Windows 7, but the same camera rendered the preview on XP.
My doubt here is: do the VMR-specific functionalities depend on the camera drivers on different platforms? Otherwise, how could this difference happen?
Wherever the sample application worked, I observed that the biBitCount member of the resulting BITMAPINFOHEADER structure is 32.
Is this a value set by the application, or a driver setting for VMR operations? How is this configured?
Finally, which is the best method to grab the webcam frames? A callback approach? Or a Direct approach?
Thanks in advance,
IVMRWindowlessControl9::GetCurrentImage is intended for occasional snapshots, not for regular image grabbing.
Quote from MSDN:
This method can be called at any time, no matter what state the filter
is in, whether running, stopped or paused. However, frequent calls to
this method will degrade video playback performance.
This method reads back from video memory, which is slow in the first place. It also does a conversion (that is, slow again) to the RGB color space, because this format is the most suitable for non-streaming apps and gives fewer compatibility issues.
All in all, you can use it for periodic image grabbing; however, this is not what you are supposed to do. To capture at the streaming rate you need to use a filter in the pipeline, or the Sample Grabber with a callback.
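For completeness, an occasional snapshot with this method looks roughly like the sketch below; pWc is assumed to be the IVMRWindowlessControl9 you already obtained from the VMR-9, and the returned buffer is a packed DIB that must be freed with CoTaskMemFree:

#include <dshow.h>
#include <d3d9.h>
#include <vmr9.h>

HRESULT SaveSnapshot(IVMRWindowlessControl9* pWc)
{
    BYTE* pDib = NULL;
    HRESULT hr = pWc->GetCurrentImage(&pDib);
    if (FAILED(hr))
        return hr;

    BITMAPINFOHEADER* pBmi = reinterpret_cast<BITMAPINFOHEADER*>(pDib);
    // pBmi->biBitCount is whatever the mixer composes to (often 32, as noted in the
    // question); the pixel data starts right after the header and any color table.
    // ... write the DIB out as a BMP/JPEG here ...

    CoTaskMemFree(pDib);
    return S_OK;
}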
I'm using a simple DirectShow graph to convert some videos to WMV format, which is working fine. I'm now trying to use a filter based on the Synth Filter sample to supply a silent audio track to the videos and I'm running into some problems.
Essentially, I don't know how to stop the graph when this filter (the synth filter) is connected. I guess because it just provides samples forever until somebody tells it to stop, the usual approach of calling IMediaEvent::WaitForCompletion on the filter graph doesn't work (the graph never stops). What I want it to do of course is stop as soon as the video source filter is finished.
I've tried tracking the position of the graph with IMediaSeeking::GetPositions and then manually stopping the graph when this exceeds the duration of the source file, but the accuracy of the stop time with this approach isn't great.
Can anyone think of a better way to do this? Do I need to have another filter that monitors the output from the video source and also has a pointer to the audio source so it can stop it as soon as the video source delivers EndOfStream? Is there no way to accomplish this from purely application-side code?
I've done something not too different myself in the past. I added support for IMediaSeeking to the silence generator filter, and then you need to make sure that you set start and stop times for the conversion (even if it's just 0 and duration), so that the silence generator can generate the right amount of audio and then send EOS.
G
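A sketch of the application side of that approach, assuming the graph's IMediaSeeking reaches the (now seekable) silence generator and pGraph is the conversion graph you already built:

#include <atlbase.h>
#include <dshow.h>

HRESULT SetConversionRange(IGraphBuilder* pGraph)
{
    CComQIPtr<IMediaSeeking> pSeeking(pGraph);
    if (!pSeeking)
        return E_NOINTERFACE;

    LONGLONG duration = 0;
    HRESULT hr = pSeeking->GetDuration(&duration);    // duration of the video source
    if (FAILED(hr))
        return hr;

    LONGLONG start = 0;
    hr = pSeeking->SetPositions(&start, AM_SEEKING_AbsolutePositioning,
                                &duration, AM_SEEKING_AbsolutePositioning);
    // With start and stop set, the silence generator can produce exactly this much
    // audio and send EndOfStream, so IMediaEvent::WaitForCompletion sees EC_COMPLETE.
    return hr;
}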