Negotiating an allocator between DirectShow filters fails

I'm developing a custom DirectShow source filter to provide decompressed video data to a rendering filter. I've used the PushSource sample from the DirectShow SDK as a basis for my filter, and I'm attempting to connect it to a VideoMixingRenderer9 filter.
When creating the graph I'm calling ConnectDirect():
HRESULT hr = mp_graph_builder->ConnectDirect(OutPin, InPin, &mediaType);
but during this call, calling SetProperties on the downstream filter's allocator (in DecideBufferSize()) fails with D3DERR_INVALIDCALL (0x8876086C):
ALLOCATOR_PROPERTIES actual;
memset(&actual,0,sizeof(actual));
hr = pAlloc->SetProperties(pRequest, &actual);
If I instead let it try to use my own allocator (the one provided by CBaseOutputPin), setting the allocator on the downstream filter fails with E_FAIL (in CBaseOutputPin::DecideAllocator):
hr = pPin->NotifyAllocator(*ppAlloc, FALSE);
Any help would be much appreciated!
Thanks.
EDIT:
This is the media type provided by GetMediaType:
VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER*)pMediaType->AllocFormatBuffer(sizeof(VIDEOINFOHEADER));
if (pvi == 0)
    return E_OUTOFMEMORY;
ZeroMemory(pvi, pMediaType->cbFormat);

pvi->AvgTimePerFrame = m_rtFrameLength;

pMediaType->formattype = FORMAT_VideoInfo;
pMediaType->majortype = MEDIATYPE_Video;
pMediaType->subtype = MEDIASUBTYPE_RGB24;
pMediaType->bTemporalCompression = FALSE;
pMediaType->bFixedSizeSamples = TRUE;

pvi->bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
pvi->bmiHeader.biWidth = (640 / 128 + 1) * 128;
pvi->bmiHeader.biHeight = -480; // negative so top down..
pvi->bmiHeader.biPlanes = 1;
pvi->bmiHeader.biBitCount = 24;
pvi->bmiHeader.biCompression = BI_RGB; // ok if rgb else use MAKEFOURCC(...)
pvi->bmiHeader.biSizeImage = GetBitmapSize(&pvi->bmiHeader);
pvi->bmiHeader.biClrImportant = 0;
pvi->bmiHeader.biClrUsed = 0; // Use max colour depth
pvi->bmiHeader.biXPelsPerMeter = 0;
pvi->bmiHeader.biYPelsPerMeter = 0;

SetRectEmpty(&(pvi->rcSource));
SetRectEmpty(&(pvi->rcTarget));
pvi->rcSource.bottom = 480;
pvi->rcSource.right = 640;
pvi->rcTarget.bottom = 480;
pvi->rcTarget.right = 640;

pMediaType->SetType(&MEDIATYPE_Video);
pMediaType->SetFormatType(&FORMAT_VideoInfo);
pMediaType->SetTemporalCompression(FALSE);

const GUID SubTypeGUID = GetBitmapSubtype(&pvi->bmiHeader);
pMediaType->SetSubtype(&SubTypeGUID);
pMediaType->SetSampleSize(pvi->bmiHeader.biSizeImage);
and this is DecideBufferSize, where pAlloc->SetProperties is called:
HRESULT CPushPinBitmap::DecideBufferSize(IMemAllocator *pAlloc, ALLOCATOR_PROPERTIES *pRequest) {
    HRESULT hr;
    CAutoLock cAutoLock(CBasePin::m_pLock);

    CheckPointer(pAlloc, E_POINTER);
    CheckPointer(pRequest, E_POINTER);

    if (pRequest->cBuffers == 0) {
        pRequest->cBuffers = 2;
    }
    pRequest->cbBuffer = 480 * ((640 / 128 + 1) * 128) * 3;

    ALLOCATOR_PROPERTIES actual;
    memset(&actual, 0, sizeof(actual));
    hr = pAlloc->SetProperties(pRequest, &actual);
    if (FAILED(hr)) {
        return hr;
    }
    if (actual.cbBuffer < pRequest->cbBuffer) {
        return E_FAIL;
    }
    return S_OK;
}
The constants are only temporary!

There is no way you can use your own allocator with the VMR/EVR filters. They insist on their own allocator, which in turn is backed by DirectDraw/Direct3D surfaces.
To connect directly to the VMR/EVR filters you need a different strategy: the allocator is always theirs, and your filter needs to support extended strides. See Handling Format Changes from the Video Renderer on MSDN.
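For illustration, here is a minimal sketch of that documented pattern, written against a CSourceStream-derived pin like the one above (CopyPixelRows is a hypothetical helper, not part of the SDK): in FillBuffer, check whether the renderer attached a new media type to the sample, adopt it, and then write each 640-pixel row using the stride implied by the possibly wider biWidth.
HRESULT CPushPinBitmap::FillBuffer(IMediaSample *pSample)
{
    // The VMR/EVR may attach a new media type to a sample after a dynamic
    // format change; adopt it before writing any pixels.
    AM_MEDIA_TYPE *pmt = NULL;
    if (pSample->GetMediaType(&pmt) == S_OK && pmt != NULL) {
        CMediaType newType(*pmt);
        SetMediaType(&newType); // update the pin's current type (m_mt)
        DeleteMediaType(pmt);
    }

    VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER*)m_mt.pbFormat;
    LONG stride = pvi->bmiHeader.biWidth * 3; // bytes per row for RGB24

    BYTE *pData = NULL;
    HRESULT hr = pSample->GetPointer(&pData);
    if (FAILED(hr)) {
        return hr;
    }

    // Write 640 * 3 payload bytes per row, advancing the destination by the
    // renderer's stride rather than by the payload width.
    CopyPixelRows(pData, stride, 480, 640 * 3); // hypothetical helper
    return S_OK;
}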

Related

STM32F4 HAL ADC DMA Transfer Error

I'm using an STM32F405OG for a project and one of the necessary functions is monitoring 3 analog channels at ~1 Hz. My desired implementation is starting an ADC DMA read of all 3 channels in Scan mode and retrieving the results at a later time after the DMA complete interrupt has occurred.
I'm using ADC1 and have tried both DMA channels 0 and 4, both with the same result: HAL_ADC_ErrorCallback() is invoked after the first call to HAL_ADC_Start_DMA(). At this point, the ADC handle is in an error state (HAL_ADC_STATE_ERROR_DMA) with the error code 0x04 (HAL_ADC_ERROR_DMA). Checking the linked DMA handle yields a DMA error code of HAL_DMA_ERROR_NO_XFER, meaning "Abort requested with no Xfer ongoing."
I'm totally lost as to what's causing this; my code should be consistent with the examples and the "how to use this module" comments at the top of stm32f4xx_hal_adc.c. I've attached my code below.
ADC_HandleTypeDef ADC_hADC =
{
    .Instance = ADC1,
    .Init =
    {
        .ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV8,
        .Resolution = ADC_RESOLUTION_12B,
        .EOCSelection = ADC_EOC_SEQ_CONV, // EOC at end of sequence of channel conversions
        .ScanConvMode = ENABLE,
        .ContinuousConvMode = DISABLE,
        .DiscontinuousConvMode = DISABLE,
        .NbrOfDiscConversion = 0U,
        .ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_NONE,
        .ExternalTrigConv = ADC_SOFTWARE_START,
        .DataAlign = ADC_DATAALIGN_RIGHT,
        .NbrOfConversion = _NUM_ADC_CONV,
        .DMAContinuousRequests = DISABLE
    }
};

DMA_HandleTypeDef _hDmaAdc =
{
    .Instance = DMA2_Stream0,
    .Init =
    {
        .Channel = DMA_CHANNEL_0,
        .Direction = DMA_PERIPH_TO_MEMORY,
        .PeriphInc = DMA_PINC_DISABLE,
        .MemInc = DMA_MINC_ENABLE,
        .PeriphDataAlignment = DMA_PDATAALIGN_WORD,
        .MemDataAlignment = DMA_MDATAALIGN_WORD,
        .Mode = DMA_NORMAL,
        .Priority = DMA_PRIORITY_HIGH,
        .FIFOMode = DMA_FIFOMODE_DISABLE,
        .FIFOThreshold = DMA_FIFO_THRESHOLD_FULL,
        .MemBurst = DMA_MBURST_SINGLE,
        .PeriphBurst = DMA_PBURST_SINGLE
    }
};

void HAL_ADC_MspInit(ADC_HandleTypeDef *h)
{
    if (!h)
    {
        return;
    }
    else if (h->Instance == ADC1)
    {
        __HAL_RCC_ADC1_CLK_ENABLE();
        __HAL_RCC_DMA2_CLK_ENABLE();

        HAL_DMA_Init(&_hDmaAdc);
        __HAL_LINKDMA(h, DMA_Handle, _hDmaAdc);

        HAL_NVIC_SetPriority(ADC_IRQn, IT_PRIO_ADC, 0);
        HAL_NVIC_SetPriority(DMA2_Stream0_IRQn, IT_PRIO_ADC, 0);
        HAL_NVIC_EnableIRQ(ADC_IRQn);
        HAL_NVIC_EnableIRQ(DMA2_Stream0_IRQn);
    }
}

uint32_t _meas[3];

ADC_ChannelConfTypeDef _chanCfg[3] =
{
    // VIN_MON
    {
        .Channel = ADC_CHANNEL_1,
    },
    // VDD_MON
    {
        .Channel = ADC_CHANNEL_8,
    },
    // VDD2_MON
    {
        .Channel = ADC_CHANNEL_2,
    }
};

Bool ADC_Init(void)
{
    ADC_DeInit();
    memset(_meas, 0, sizeof(_meas));

    Bool status = (HAL_ADC_Init(&ADC_hADC) == HAL_OK);
    if (status)
    {
        // Configure each ADC channel
        for (uint32_t i = 0U; i < NELEM(_chanCfg); i++)
        {
            _chanCfg[i].Rank = (i + 1U);
            _chanCfg[i].SamplingTime = ADC_SAMPLETIME_480CYCLES;
            _chanCfg[i].Offset = 0U;
            if (HAL_ADC_ConfigChannel(&ADC_hADC, &_chanCfg[i]) != HAL_OK)
            {
                status = FALSE;
                break;
            }
        }
        _state = ADC_STATE_READY;
    }
    if (!status)
    {
        ADC_DeInit();
    }
    return status;
}

Bool ADC_StartRead(void)
{
    Bool status = TRUE;
    status = (HAL_ADC_Start_DMA(&ADC_hADC, &_meas[0], 3) == HAL_OK);
    return status;
}
After slowing down the conversions via the ClockPrescaler init structure field, increasing the number of ADC sampling cycles, and calling HAL_ADC_Stop_DMA() after each transfer (per the file header comment in stm32f4xx_hal_adc.c), everything is working.
Note that calling HAL_ADC_Stop_DMA() in the DMA transfer-complete ISR caused the aforementioned error conditions as well, so that call has to be made some time after the XferCplt ISR has been invoked.
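A minimal sketch of that arrangement, reusing the ADC_hADC handle and _meas buffer above (the _adcDone flag and ADC_Poll function are placeholder names): the conversion-complete callback only sets a flag, and HAL_ADC_Stop_DMA() is called later from thread context.
volatile uint8_t _adcDone = 0;

// Invoked by the HAL from the DMA transfer-complete interrupt.
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *h)
{
    if (h->Instance == ADC1)
    {
        _adcDone = 1; // defer HAL_ADC_Stop_DMA() to the main loop
    }
}

// Polled from the main loop, outside interrupt context.
void ADC_Poll(void)
{
    if (_adcDone)
    {
        _adcDone = 0;
        HAL_ADC_Stop_DMA(&ADC_hADC); // safe here, after the ISR has returned
        // _meas[0..2] now holds one result per configured channel.
    }
}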

GetMonitorCapabilities returns false

I want to adjust the brightness of my monitor on Windows.
I followed this page:
How to use GetMonitorCapabilities and GetMonitorBrightness functions
but the GetMonitorCapabilities function returns FALSE.
Printing with qDebug shows that hPhysicalMonitor is NULL. Is that the problem?
(The GetPhysicalMonitorsFromHMONITOR function returns TRUE.)
How can I fix this code?
HMONITOR hMonitor = NULL;
DWORD cPhysicalMonitors = 1;
LPPHYSICAL_MONITOR pPhysicalMonitors;
HWND hWnd = GetDesktopWindow();

// Get the monitor handle.
hMonitor = MonitorFromWindow(hWnd, MONITOR_DEFAULTTOPRIMARY);

// Get the number of physical monitors.
BOOL bSuccess = GetNumberOfPhysicalMonitorsFromHMONITOR(hMonitor, &cPhysicalMonitors);
if (bSuccess)
{
    // Allocate the array of PHYSICAL_MONITOR structures.
    pPhysicalMonitors = (LPPHYSICAL_MONITOR)malloc(cPhysicalMonitors * sizeof(PHYSICAL_MONITOR));
    if (pPhysicalMonitors != NULL)
    {
        // Get the array.
        bSuccess = GetPhysicalMonitorsFromHMONITOR(hMonitor, cPhysicalMonitors, pPhysicalMonitors);
        if (bSuccess == FALSE)
        {
            qDebug("GetPhysicalMonitorsFromHMONITOR");
        }

        // Get physical monitor handle.
        HANDLE hPhysicalMonitor = pPhysicalMonitors[0].hPhysicalMonitor;

        DWORD pdwMonitorCapabilities = 0;
        DWORD pdwSupportedColorTemperatures = 0;
        bSuccess = GetMonitorCapabilities(hPhysicalMonitor, &pdwMonitorCapabilities, &pdwSupportedColorTemperatures);
        if (bSuccess == FALSE)
        {
            qDebug("GetMonitorCapabilities");
        }

        DWORD pdwMinimumBrightness = 0;
        DWORD pdwCurrentBrightness = 0;
        DWORD pdwMaximumBrightness = 0;
        bSuccess = GetMonitorBrightness(hPhysicalMonitor, &pdwMinimumBrightness, &pdwCurrentBrightness, &pdwMaximumBrightness);
        if (bSuccess == FALSE)
        {
            qDebug("GetMonitorBrightness");
        }

        qDebug(QByteArray::number(bSuccess));
        qDebug(QByteArray::number((WORD)pdwMinimumBrightness));
        qDebug(QByteArray::number((WORD)pdwCurrentBrightness));
        qDebug(QByteArray::number((WORD)pdwMaximumBrightness));

        // Close the monitor handles.
        bSuccess = DestroyPhysicalMonitors(cPhysicalMonitors, pPhysicalMonitors);
        // Free the array.
        free(pPhysicalMonitors);
    }
}
If you want to adjust the brightness using WmiMonitorBrightnessMethods, use the code below.
public static void SetWmiMonitorBrightness(byte targetBrightness)
{
    ManagementScope scope = new ManagementScope("root\\WMI");
    SelectQuery query = new SelectQuery("WmiMonitorBrightnessMethods");
    using (ManagementObjectSearcher searcher = new ManagementObjectSearcher(scope, query))
    {
        using (ManagementObjectCollection objectCollection = searcher.Get())
        {
            foreach (ManagementObject mObj in objectCollection)
            {
                mObj.InvokeMethod("WmiSetBrightness",
                    new Object[] { UInt32.MaxValue, targetBrightness });
                break;
            }
        }
    }
}
Calling GetMonitorBrightness on the specific monitor I have fails with ERROR_GRAPHICS_I2C_ERROR_TRANSMITTING_DATA, so I used WmiSetBrightness instead; it works fine.
If you have a tip related to ERROR_GRAPHICS_I2C_ERROR_TRANSMITTING_DATA, please share it with me.
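As a first debugging step (a hedged suggestion, not a fix): these DDC/CI functions are documented to set the thread's last-error code, so logging GetLastError() right after the failing call at least tells you whether your FALSE return is the same I2C/DDC-CI failure:
bSuccess = GetMonitorCapabilities(hPhysicalMonitor, &pdwMonitorCapabilities, &pdwSupportedColorTemperatures);
if (bSuccess == FALSE)
{
    // A graphics I2C error here usually means the monitor or driver has no
    // DDC/CI support, in which case the WMI route above is the fallback.
    qDebug("GetMonitorCapabilities failed, GetLastError = 0x%08lX", GetLastError());
}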

Media Foundation Frames in Byte Format Run-Time

I use Media Foundation to capture live webcam video. Is it possible to get the captured frames in a byte-stream format at run time, and to write them as a stream of bits to a file after each time cycle?
I'm not sure whether I can get the stream in byte format (without a container), or whether I can do that at run time.
It's not completely clear what you're asking. If you want to capture the raw frames from a webcam and save them to a file, then the answer is yes, that can be done. The Media Foundation SDK MFCaptureToFile sample does exactly that, although because it uses a SinkWriter you will have to specify a container file type such as mp4 when creating it.
If you really do want to get the raw frames one by one, then you need to dispense with the SinkWriter (or write a custom one). Below is a code snippet that shows getting samples from an IMFSourceReader and converting them into a byte array (and a few other things). You could write the byte array to a file, although unless you do something like put a bitmap header on it, it won't be very useful. The IMFSourceReader and IMFMediaTypes all need to be set up correctly before ReadSample can be called, but hopefully this gives you a rough idea of where to look further.
HRESULT MFVideoSampler::GetSample(/* out */ array<Byte> ^% buffer)
{
    if (_videoReader == NULL) {
        return -1;
    }
    else {
        IMFSample *videoSample = NULL;
        DWORD streamIndex, flags;
        LONGLONG llVideoTimeStamp;

        // Initial read results in a null pSample??
        CHECK_HR(_videoReader->ReadSample(
            //MF_SOURCE_READER_ANY_STREAM, // Stream index.
            MF_SOURCE_READER_FIRST_VIDEO_STREAM,
            0,                  // Flags.
            &streamIndex,       // Receives the actual stream index.
            &flags,             // Receives status flags.
            &llVideoTimeStamp,  // Receives the time stamp.
            &videoSample        // Receives the sample or NULL.
            ), L"Error reading video sample.");

        if (flags & MF_SOURCE_READERF_ENDOFSTREAM)
        {
            wprintf(L"\tEnd of stream\n");
        }
        if (flags & MF_SOURCE_READERF_NEWSTREAM)
        {
            wprintf(L"\tNew stream\n");
        }
        if (flags & MF_SOURCE_READERF_NATIVEMEDIATYPECHANGED)
        {
            wprintf(L"\tNative type changed\n");
        }
        if (flags & MF_SOURCE_READERF_CURRENTMEDIATYPECHANGED)
        {
            wprintf(L"\tCurrent type changed\n");

            IMFMediaType *videoType = NULL;
            CHECK_HR(_videoReader->GetCurrentMediaType(
                (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
                &videoType), L"Error retrieving current media type from first video stream.");
            Console::WriteLine(GetMediaTypeDescription(videoType));

            // Get the frame dimensions and stride
            UINT32 nWidth, nHeight;
            MFGetAttributeSize(videoType, MF_MT_FRAME_SIZE, &nWidth, &nHeight);
            _width = nWidth;
            _height = nHeight;

            //LONG lFrameStride;
            //videoType->GetUINT32(MF_MT_DEFAULT_STRIDE, (UINT32*)&lFrameStride);

            videoType->Release();
        }
        if (flags & MF_SOURCE_READERF_STREAMTICK)
        {
            wprintf(L"\tStream tick\n");
        }

        if (!videoSample)
        {
            printf("Failed to get video sample from MF.\n");
        }
        else
        {
            DWORD nCurrBufferCount = 0;
            CHECK_HR(videoSample->GetBufferCount(&nCurrBufferCount), L"Failed to get the buffer count from the video sample.\n");

            IMFMediaBuffer *pMediaBuffer;
            CHECK_HR(videoSample->ConvertToContiguousBuffer(&pMediaBuffer), L"Failed to extract the video sample into a raw buffer.\n");

            DWORD nCurrLen = 0;
            CHECK_HR(pMediaBuffer->GetCurrentLength(&nCurrLen), L"Failed to get the length of the raw buffer holding the video sample.\n");

            byte *imgBuff;
            DWORD buffCurrLen = 0;
            DWORD buffMaxLen = 0;
            pMediaBuffer->Lock(&imgBuff, &buffMaxLen, &buffCurrLen);

            if (Stride != -1 && Stride < 0) {
                // Bitmap needs to be flipped.
                int bmpSize = buffCurrLen; // ToDo: Don't assume RGB/BGR 24.
                int absStride = Stride * -1;
                byte *flipBuf = new byte[bmpSize];

                for (int row = 0; row < _height; row++) {
                    for (int col = 0; col < absStride; col += 3) {
                        flipBuf[row * absStride + col] = imgBuff[((_height - row - 1) * absStride) + col];
                        flipBuf[row * absStride + col + 1] = imgBuff[((_height - row - 1) * absStride) + col + 1];
                        flipBuf[row * absStride + col + 2] = imgBuff[((_height - row - 1) * absStride) + col + 2];
                    }
                }

                buffer = gcnew array<Byte>(buffCurrLen);
                Marshal::Copy((IntPtr)flipBuf, buffer, 0, buffCurrLen);
                delete[] flipBuf; // array form needed to match new byte[]
            }
            else {
                buffer = gcnew array<Byte>(buffCurrLen);
                Marshal::Copy((IntPtr)imgBuff, buffer, 0, buffCurrLen);
            }

            pMediaBuffer->Unlock();
            pMediaBuffer->Release();
            videoSample->Release();

            return S_OK;
        }

        return S_FALSE; // no sample was available on this call
    }
}
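To cover the "write them to a file" part of the question, here is a minimal usage sketch under the same assumptions (the configured MFVideoSampler instance named sampler is hypothetical); the raw dump is only interpretable if you separately record the negotiated width, height, stride and pixel format:
// Pull one frame and dump its raw bytes to a per-frame file.
array<Byte> ^frame = nullptr;
static int frameIndex = 0;
if (sampler->GetSample(frame) == S_OK && frame != nullptr)
{
    // No container, no header: just the pixel bytes as returned above.
    System::IO::File::WriteAllBytes(
        System::String::Format("frame_{0}.raw", frameIndex++), frame);
}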

Changing resolution using media foundation under win 7

I am writing an app on top of Media Foundation under Win 7. I use IMFMediaSource to query the camera's interfaces to get frames and other properties. It's weird, but I can't find a way to change the resolution. It seems that if I used IMFCaptureSource I could call SetCurrentDeviceMediaType to change the resolution, but that is only supported on Windows 8. So can't we change the resolution under Win 7 using Media Foundation? Is there a way to use DirectShow with IMFMediaSource to change the resolution?
If so, can anyone help with some code sample?
Thanks!
OK, so I found out eventually. I am using an IMFSourceReader to get samples from the IMFMediaSource, so after configuring the SourceReader you can iterate over the native media types that the camera supports like this:
HRESULT nativeTypeErrorCode = S_OK;
HRESULT hr = S_OK;
DWORD count = 0;
UINT32 streamIndex = 0;
UINT32 requiredWidth = 1600;
UINT32 requiredheight = 900;

while (nativeTypeErrorCode == S_OK)
{
    IMFMediaType *nativeType = NULL;
    nativeTypeErrorCode = m_pReader->GetNativeMediaType(streamIndex, count, &nativeType);
    if (nativeTypeErrorCode != S_OK) continue;

    // get the media type
    GUID nativeGuid = {0};
    hr = nativeType->GetGUID(MF_MT_SUBTYPE, &nativeGuid);
    if (FAILED(hr)) return hr;

    UINT32 width, height;
    hr = ::MFGetAttributeSize(nativeType, MF_MT_FRAME_SIZE, &width, &height);
    if (FAILED(hr)) return hr;

    if (nativeGuid == <my type guid> && width == requiredWidth && height == requiredheight)
    {
        // found native config, set it
        hr = m_pReader->SetCurrentMediaType(streamIndex, NULL, nativeType);
        if (FAILED(hr)) return hr;
        SafeRelease(&nativeType); // the reader holds its own reference now
        break;
    }

    SafeRelease(&nativeType);
    count++;
}
This means that I am not creating a new media type with the resolution I require; I get the native media type with the configuration I need and set it on the SourceReader.
Hope it will help a future Media Foundation traveler... :)
You can query a DirectShow interface from the IMFMediaSource, which can change the resolution.
For example, for camera control properties I do this:
IAMCameraControl *m_pCameraControl = NULL;
HRESULT hr = S_OK;
hr = pMediaSource->QueryInterface(IID_PPV_ARGS(&m_pCameraControl));
if (m_pCameraControl == NULL)
{
    return E_FAIL;
}
In the same way, in your case (I am not sure about the exact interface) I guess it would work like this:
IAMStreamConfig *m_pStreamConfig = NULL;
HRESULT hr = S_OK;
hr = pMediaSource->QueryInterface(IID_PPV_ARGS(&m_pStreamConfig));
if (m_pStreamConfig == NULL)
{
    return E_FAIL;
}

c++ builder: convert video to png-snapshots with directshow

Thanks to your help I was able to search for the right words to use DirectShow a bit better.
I found a tutorial on how to use the SampleGrabber object here:
http://msdn.microsoft.com/en-us/library/windows/desktop/dd407288%28v=vs.85%29.aspx
I could implement it and modified it a bit so it isn't just saving the first frame, but every frame to a PNG. For that I use corona.
However, I mostly guessed my way through and don't quite know which buffers contain my data and in which form.
So I basically have 3 questions:
Am I using SavePNG right? The resulting images are upside-down!
Can I replace the BaseFilter for the video with one that is connected to a camera?
Does pBuffer contain my image data, so that I can get RGB byte information by simply writing pBuffer[123]?
I'm using Embarcadero's C++Builder (XE2 16).
Here is the code I found at the website, modified a bit (error handling removed for readability; after each hr = ... there is a FAILED check):
void __fastcall TForm1::btn_kameraClick(TObject *Sender)
{
    HRESULT hr = S_OK;
    IGraphBuilder *pGraph = NULL;
    IMediaControl *pControl = NULL;
    IMediaEventEx *pEvent = NULL;
    IBaseFilter *pGrabberF = NULL;
    ISampleGrabber *pGrabber = NULL;
    IBaseFilter *pSourceF = NULL;
    IEnumPins *pEnum = NULL;
    IPin *pPin = NULL;
    IBaseFilter *pNullF = NULL;
    BYTE *pBuffer = NULL;

    hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pGraph));
    hr = pGraph->QueryInterface(IID_PPV_ARGS(&pControl));
    hr = pGraph->QueryInterface(IID_PPV_ARGS(&pEvent));
    hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pGrabberF));
    hr = pGraph->AddFilter(pGrabberF, L"Sample Grabber");
    hr = pGrabberF->QueryInterface(IID_PPV_ARGS(&pGrabber));

    AM_MEDIA_TYPE mt;
    ZeroMemory(&mt, sizeof(mt));
    mt.majortype = MEDIATYPE_Video;
    mt.subtype = MEDIASUBTYPE_RGB24;
    hr = pGrabber->SetMediaType(&mt);

    hr = pGraph->AddSourceFilter(L"C:/Users/Julian/Desktop/homogenität/1,1x_2,7y.mpg", L"Source", &pSourceF);
    hr = pSourceF->EnumPins(&pEnum);
    while (S_OK == pEnum->Next(1, &pPin, NULL))
    {
        hr = ConnectFilters(pGraph, pPin, pGrabberF);
        SafeRelease(&pPin);
        if (SUCCEEDED(hr)) break;
    }

    hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pNullF));
    hr = pGraph->AddFilter(pNullF, L"Null Filter");
    hr = ConnectFilters(pGraph, pGrabberF, pNullF);
    hr = pGrabber->SetOneShot(TRUE);
    hr = pGrabber->SetBufferSamples(TRUE);

    long evCode = 0;
    long cbBuffer = 0;
    hr = pControl->Run();
    hr = pEvent->WaitForCompletion(INFINITE, &evCode);
    hr = pGrabber->GetCurrentBuffer(&cbBuffer, NULL);
    pBuffer = (BYTE*)CoTaskMemAlloc(cbBuffer);
    hr = pGrabber->GetConnectedMediaType(&mt);

    CComQIPtr<IMediaSeeking, &IID_IMediaSeeking> pSeeking(pGraph);

    // for(int i=0;i<10;i++){
    bool hui = true;
    int i = 0;
    while (hui)
    {
        REFERENCE_TIME Start = i * UNITS;
        hr = pSeeking->SetPositions(&Start, AM_SEEKING_AbsolutePositioning, NULL, AM_SEEKING_NoPositioning);
        // Sleep(10);
        hr = pEvent->WaitForCompletion(INFINITE, &evCode);
        if (hr != 0) hui = false;
        hr = pGrabber->GetCurrentBuffer(&cbBuffer, (long*)pBuffer);

        if ((mt.formattype == FORMAT_VideoInfo) && (mt.cbFormat >= sizeof(VIDEOINFOHEADER)) && (mt.pbFormat != NULL))
        {
            VIDEOINFOHEADER *pVih = (VIDEOINFOHEADER*)mt.pbFormat;
            // hr = WriteBitmap(("hui"+(String)i+".bmp").c_str(), &pVih->bmiHeader, mt.cbFormat - SIZE_PREHEADER, pBuffer, cbBuffer);
            hr = SavePNG(i, pBuffer, pVih->bmiHeader.biWidth, pVih->bmiHeader.biHeight);
        }
        else hr = VFW_E_INVALIDMEDIATYPE;
        i++;
    }
    FreeMediaType(mt);

done:
    CoTaskMemFree(pBuffer);
    SafeRelease(&pPin);
    SafeRelease(&pEnum);
    SafeRelease(&pNullF);
    SafeRelease(&pSourceF);
    SafeRelease(&pGrabber);
    SafeRelease(&pGrabberF);
    SafeRelease(&pControl);
    SafeRelease(&pEvent);
    SafeRelease(&pGraph);
}

bool SavePNG(int i, Byte *m_pImageData, long m_Width, long m_Height)
{
    // Make sure there is image data
    if (!m_pImageData)
        return false;

    stringstream FilePath;
    FilePath << "hui" << i << ".png";

    // Create a corona image
    corona::Image *pImage = corona::CreateImage(m_Width, m_Height, corona::PF_R8G8B8, m_pImageData);
    // Make sure the image was created
    if (!pImage)
        return false;

    // Save the image to a PNG file
    corona::SaveImage(FilePath.str().c_str(), corona::FF_PNG, pImage);
    // Delete the corona image
    delete pImage;

    // Nothing went wrong
    return true;
}
I hope I haven't done anything horribly wrong... I really tried to research everything. ^^
Does somebody know the answers to my 3 questions above?
If you find something really wrong here, I would also appreciate you telling me, so I can fix and improve it.
Regards,
Julian
Now that you have a video frame in 24-bit RGB format, all you need is to compress it to PNG. You have choices here:
libpng
GDI+
WIC
Possibly, C++Builder has native classes to cover PNG as well.
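For example, here is a minimal GDI+ sketch, assuming GdiplusStartup has already been called once at program start (GetEncoderClsid is the standard helper from the GDI+ documentation; SavePngGdiplus is a name made up for this example). Because the SampleGrabber delivers a bottom-up RGB24 DIB, passing a negative stride with scan0 pointing at the last row also answers the upside-down question:
#include <windows.h>
#include <gdiplus.h>
#include <stdlib.h>
#include <string.h>
#pragma comment(lib, "gdiplus.lib")

// Standard helper from the GDI+ docs: find an encoder CLSID by MIME type.
static int GetEncoderClsid(const WCHAR *format, CLSID *pClsid)
{
    UINT num = 0, size = 0;
    Gdiplus::GetImageEncodersSize(&num, &size);
    if (size == 0) return -1;

    Gdiplus::ImageCodecInfo *pInfo = (Gdiplus::ImageCodecInfo*)malloc(size);
    if (!pInfo) return -1;

    Gdiplus::GetImageEncoders(num, size, pInfo);
    for (UINT i = 0; i < num; i++) {
        if (wcscmp(pInfo[i].MimeType, format) == 0) {
            *pClsid = pInfo[i].Clsid;
            free(pInfo);
            return (int)i;
        }
    }
    free(pInfo);
    return -1;
}

// Save a bottom-up RGB24 buffer (as delivered by the SampleGrabber) as PNG.
bool SavePngGdiplus(const wchar_t *path, BYTE *pBuffer, long width, long height)
{
    long stride = (width * 3 + 3) & ~3; // RGB24 DIB rows are DWORD-aligned

    // Negative stride with scan0 at the last row makes GDI+ read the
    // bottom-up DIB from top to bottom, so the PNG is the right way up.
    Gdiplus::Bitmap bmp(width, height, -stride,
                        Gdiplus::PixelFormat24bppRGB,
                        pBuffer + (height - 1) * stride);

    CLSID pngClsid;
    if (GetEncoderClsid(L"image/png", &pngClsid) < 0)
        return false;
    return bmp.Save(path, &pngClsid, NULL) == Gdiplus::Ok;
}
It would slot into the frame loop in place of SavePNG, with pVih->bmiHeader.biWidth and biHeight as the dimensions.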
P.S. The DirectShow API you are using is not DirectX; it is part of the core Windows SDK.
