GDCL Mpeg-4 Multiplexor Problem - directshow

I just created a simple graph:
SourceFilter (*.mp4 file) ---> GDCL MPEG-4 Mux Filter ---> File Writer Filter
It works fine. But when the source is an H.264 elementary stream file:
SourceFilter (*.h264 file) ---> GDCL MPEG-4 Mux Filter ---> File Writer Filter
it records a file, but the recorded file does not play in VLC, QuickTime, BS Player, or Windows Media Player.
What am I doing wrong? Any ideas on how to record an H.264 video source? Do I need an H.264 mux?
Best wishes.
PS: I just want to record video, by the way... why do I need a mux at all?

There are two H.264 formats used by DirectShow filters. One is Byte Stream Format (BSF), in which each NALU is preceded by a 00 00 01 start code. The other is the format used within MP4 files, in which each NALU is instead preceded by a length field (the media type or the MP4 file metadata specifies how many bytes are used for the length field). The problem is that some FOURCCs are used for both formats.
The MP4 mux sample accepts either BSF or length-prefixed data, depending on the subtype given; it does not attempt to work out which one it is. Most likely, when you feed it the H.264 elementary stream, you are giving the mux a FOURCC or media type that it interprets as length-prefixed while actually delivering BSF data. Check TypeHandler::CanSupport.
If you just want to save H.264 video to a file, you can use a Dump filter to write the bits straight to a file; if the data is BSF, the result is a valid H.264 elementary stream file. If you want support from the majority of players, or if you want seeking support, then you will want to write the elementary stream into a container with an index, such as MP4. In that case you need a mux, not for the multiplexing as such, but for the indexing and metadata creation.
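To make the difference concrete, here is an illustrative sketch (not GDCL code) of turning one BSF NALU into the length-prefixed form stored in MP4, assuming a 4-byte length field:

    // Illustrative sketch only: rewrite one BSF NALU as the length-prefixed
    // form stored in MP4 files. 'nalu' points just past the 00 00 01 start code.
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> ToLengthPrefixed(const uint8_t* nalu, uint32_t size)
    {
        std::vector<uint8_t> out;
        out.reserve(size + 4);
        out.push_back(uint8_t((size >> 24) & 0xFF));  // big-endian length field
        out.push_back(uint8_t((size >> 16) & 0xFF));
        out.push_back(uint8_t((size >> 8) & 0xFF));
        out.push_back(uint8_t(size & 0xFF));
        out.insert(out.end(), nalu, nalu + size);     // NALU payload, unchanged
        return out;
    }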
G

Related

How to process RTP data for Microsoft directshow MPEG1 decoder

Starting from the videoprocessing project, I'm trying to build a DirectShow filter that connects to an RTSP server and acts as a source filter for the Windows MPEG-1 decoder (I cannot use other formats or decoders, since the target OS is WinCE).
My filter declares the media type (a sketch of the full declaration follows this list):
- major type: MEDIATYPE_Video
- subtype: MEDIASUBTYPE_MPEG1Payload
- format type: FORMAT_MPEGVideo
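The declaration, roughly (assumed variable names; pSeqHdr/cbSeqHdr are meant to hold the 000001 B3 sequence header bytes):

    // Sketch, assuming a standard AM_MEDIA_TYPE setup for MPEG-1 video.
    AM_MEDIA_TYPE mt = {};
    mt.majortype  = MEDIATYPE_Video;            // major type
    mt.subtype    = MEDIASUBTYPE_MPEG1Payload;  // subtype
    mt.formattype = FORMAT_MPEGVideo;           // format block is MPEG1VIDEOINFO
    mt.bTemporalCompression = TRUE;

    ULONG cb = sizeof(MPEG1VIDEOINFO) + cbSeqHdr;
    MPEG1VIDEOINFO* pvi = (MPEG1VIDEOINFO*)CoTaskMemAlloc(cb);
    ZeroMemory(pvi, cb);
    pvi->hdr.AvgTimePerFrame = 10000000 / 30;   // placeholder frame rate
    pvi->cbSequenceHeader = cbSeqHdr;           // decoder reads the 000001B3 header here
    CopyMemory(pvi->bSequenceHeader, pSeqHdr, cbSeqHdr);

    mt.pbFormat = (BYTE*)pvi;
    mt.cbFormat = cb;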
Currently, when I connect my rtspSource filter to the CLSID_CMpegVideoCodec decoder, the rendered video is black.
However, if I replace the Windows decoder with the CLSID_LAV_VideoDecoderFilter provided by the LAVFilters project, the video is rendered correctly.
After reading "How to process raw UDP packets so that they can be decoded by a decoder filter in a directshow source filter", which deals with the same issue for H.264 and MPEG-4, I also read RFC 2250 and depacketized the data accordingly, but the result is the same.
Currently I'm sending the decoder packets starting with the video stream start code
000001 00 (Picture)
or whole packets starting with
000001 B3 (Sequence Header)
which also contain, within them, the start codes
000001 B2 (User Data)
000001 B8 (Group Of Pictures)
000001 00 (Picture)
000001 01 (Slice)
Still referring to the previous link, which deals with the H.264 and MPEG-4 cases, it speaks about "processing data for the decoder", but it is not clear to me exactly what the CLSID_CMpegVideoCodec filter expects after agreeing on the MEDIASUBTYPE_MPEG1Payload format.
However, when I prepend to each sample either the three bytes 000001 or the four bytes 00000100, the video is rendered, but with images updated only about every 2 seconds and the intermediate images lost.
I performed the tests both by setting the IMediaSample timestamps with
SetTime(NULL, NULL)
and by setting
SetTime(start, start+1)
with:
start = (rtp_timestamp - rtp_timestamp_first_packet) + 300ms
following the answer to "Writing custom DirectShow RTSP/RTP Source push filter - timestamping data coming from live sources", but the results do not change.
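Concretely, the timestamping I am testing looks like this (a sketch with my variable names; MPEG video over RTP uses a 90 kHz clock, while DirectShow reference times are in 100 ns units):

    // Sketch with assumed names: convert the RTP timestamp delta (90 kHz clock)
    // to DirectShow 100 ns units, plus the 300 ms offset mentioned above.
    REFERENCE_TIME start =
        ((REFERENCE_TIME)(rtp_timestamp - rtp_timestamp_first_packet) * 10000000) / 90000
        + 300 * 10000;                      // 300 ms expressed in 100 ns units
    REFERENCE_TIME stop = start + 1;
    pSample->SetTime(&start, &stop);
    pSample->SetSyncPoint(isKeyFrame);      // TRUE for samples starting at a sequence/GOP header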
Any suggestions would be greatly appreciated.
Thanks in advance.

GDCL Mpeg-4 Multiplexor - Filters can't agree on connection

I am attempting to publish some mp4 files with the GDCL Mpeg-4 Multiplexor, but it's not accepting the input from my camera (QuickCam Orbit/Sphere AF).
I see that the subtype is set to MEDIASUBTYPE_NULL.
I can't seem to figure out a set of filters that will successfully link the pins. What do I need to do to adapt from my capture pin to the multiplexor?
GDCL Mpeg-4 Multiplexor multiplexes compressed data, and your camera captures raw (uncompressed) video. You need to insert a compressor in between in order to deliver MPEG-4 compatible video into the multiplexer, that is, an MPEG-4 Part 2 or MPEG-4 Part 10 (AKA H.264) video compressor. The multiplexer filter itself does not do any data compression/encoding.
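A minimal sketch of the resulting graph, assuming you use ICaptureGraphBuilder2 and have some H.264 or MPEG-4 Part 2 encoder filter installed (pCamera, pEncoder and pMux below are placeholders for filters you have already added to the graph):

    // Sketch: capture pin -> compressor -> GDCL MPEG-4 mux, via the capture
    // graph builder; connect the mux output to a File Writer afterwards.
    ICaptureGraphBuilder2* pBuilder = /* CoCreateInstance + SetFiltergraph */;
    HRESULT hr = pBuilder->RenderStream(
        &PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
        pCamera,     // source: the QuickCam capture filter
        pEncoder,    // intermediate: H.264 / MPEG-4 Part 2 compressor
        pMux);       // sink: the GDCL MPEG-4 multiplexor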

Encoding videos for use with Adobe Live Streaming

I have an original video coded at 20 Mbps, 1920x1080, 30 fps and want to convert it down to 640x480 at 30 fps, at a range of (3 different) bitrates, for use by Adobe Live Streaming.
Should I use ffmpeg to resize and encode at the 3 bitrates and then use f4fpackager to create the f4m, f4f and f4x files, or just use ffmpeg to reduce the resolution and then f4fpackager to encode the relevant bitrates?
I've had several tries so far, but once encoded, the videos seem to play at a much higher bitrate than they've been encoded at. For example, if I set up OSMF to play from my web server, I'd expect my best encoded video to play at 1,500 kbps, but it's way above that.
Has anyone had any experience of encoding for use like this?
I'm using the following options with f4fpackager:
--bitrate=1428 --segment-duration 30 --fragment-duration 2
f4fpackager doesn't do any encoding; it does two things:
- fragment the mp4 files (mp4 -> f4f)
- generate a manifest (f4m) file referencing all your fragmented files (f4f)
So the process is:
- transcode your source file into each size/bitrate combination that you want to provide (e.g. 1920x1080 @ 4 Mbps, 1280x720 @ 2 Mbps, etc.)
- use f4fpackager to convert each mp4 to f4f (this is the fragmentation step)
- use f4fpackager to generate the Manifest.f4m referencing the files generated in the previous step
The --bitrate option of f4fpackager should match the value that you used with ffmpeg; this parameter is used to generate the manifest file with the correct bitrate value for each quality.
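As a rough sketch of the pipeline (file names are placeholders, and apart from the options quoted in the question the f4fpackager invocation is an assumption; check your version's documentation):

    # 1) transcode one quality level with ffmpeg (repeat per bitrate)
    ffmpeg -i source.mp4 -s 640x480 -r 30 -b:v 1500k -c:v libx264 out_1500k.mp4
    # 2) + 3) fragment and generate the manifest with f4fpackager
    f4fpackager --input-file=out_1500k.mp4 --bitrate=1500 --segment-duration 30 --fragment-duration 2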

Problem with ASF Writer

I'm trying to encode raw data (both video frames and audio samples) into an .asf file, using the ASF Writer filter in DirectShow.
My filter graph structure:
raw_send_filter -> ASF Writer filter
raw_send_filter implements CBaseFilter and CBaseOutputPin. It acts as a source filter that gets raw data and delivers it to the ASF Writer filter. The process follows these steps (see the sketch after this list):
1. Get a delivery buffer (returned in "sample") using CBaseOutputPin::GetDeliveryBuffer
2. sample->GetPointer(&buffer);
3. Set the time stamp (with frame rate = 30 fps)
4. Deliver the sample
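The delivery path looks roughly like this (a condensed sketch with assumed names; the pin derives from CBaseOutputPin):

    // Condensed sketch of the steps above; CRawSendPin and frameIndex are assumed names.
    HRESULT CRawSendPin::DeliverFrame(const BYTE* pData, long cbData, LONGLONG frameIndex)
    {
        IMediaSample* pSample = NULL;
        // Note: this call can block if the downstream filter stops returning buffers.
        HRESULT hr = GetDeliveryBuffer(&pSample, NULL, NULL, 0);
        if (FAILED(hr)) return hr;

        BYTE* pBuffer = NULL;
        pSample->GetPointer(&pBuffer);
        CopyMemory(pBuffer, pData, cbData);
        pSample->SetActualDataLength(cbData);

        const REFERENCE_TIME rtFrame = 10000000 / 30;   // 30 fps in 100 ns units
        REFERENCE_TIME rtStart = frameIndex * rtFrame;
        REFERENCE_TIME rtStop  = rtStart + rtFrame;
        pSample->SetTime(&rtStart, &rtStop);

        hr = Deliver(pSample);   // CBaseOutputPin::Deliver
        pSample->Release();
        return hr;
    }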
The problem is that after encoding some raw data, I cannot deliver any more.
I can encode an .avi file this way, using the AVI Mux filter. Can you tell me why I cannot deliver samples after encoding some?
Thanks.
Possibly the ASF multiplexer is waiting for more data. Check whether you are sending audio and video at the same rate.

Get FLV length before uploading to server

I'm using the FileReference class to upload FLVs to a server.
Is it possible to check the FLV's length (duration), not its size, before allowing an upload?
Are you targeting Flash Player 10 alone, or lower versions too? Lower versions of Flash Player (9, etc.) do not allow the uploading SWF to read the contents of a file (other than creationDate, creator (the Macintosh creator type of the file), modificationDate, name, size in bytes, and type), so there is no way you will be able to do this on those players.
If you are targeting solely FP10 users, you can load the FLV into a ByteArray in your SWF and either:
- play it using an FLV player and read the duration property from the player; but I couldn't find an FLV player that takes a ByteArray as input, and after reading this thread on SO, it seems that is not possible at all; or
- parse the FLV file and read the duration property from its metadata; the FLV file specification is open, but this isn't going to be easy (a sketch follows the spec excerpt below).
Update to the comment:
Excerpts from the FLV file spec:
onMetaData
An FLV file can contain metadata with an "onMetaData" marker. Various stream properties are available to a running ActionScript program via the NetStream.onMetaData property. The available properties differ depending on the software used. Common properties include:
- duration: a DOUBLE indicating the total duration of the file in seconds
- width: a DOUBLE indicating the width of the video in pixels
- height: a DOUBLE indicating the height of the video in pixels
- videodatarate: a DOUBLE indicating the video bit rate in kilobits per second
- framerate: a DOUBLE indicating the number of frames per second
- videocodecid: a DOUBLE indicating the video codec ID used in the file (see "Video tags" on page 8 for available CodecID values)
- audiosamplerate: a DOUBLE indicating the frequency at which the audio stream is replayed
- audiosamplesize: a DOUBLE indicating the resolution of a single audio sample
- stereo: a BOOL indicating whether the data is stereo
- audiocodecid: a DOUBLE indicating the audio codec ID used in the file (see "Audio tags" on page 6 for available SoundFormat values)
- filesize: a DOUBLE indicating the total size of the file in bytes
Note that the spec says an FLV file can contain metadata; it doesn't say it will. It also says that the available properties vary based on the software used to create the FLV. So there is no guarantee (per the spec) that the duration property will be present. That said, duration is one of the basic properties of an FLV, and it is safe to assume that any reasonable encoder will include it.
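If you do go the parsing route, the byte-level work is the same in any language: find the onMetaData script tag and read the AMF0 "duration" entry (the key followed by a 0x00 number marker and a big-endian IEEE-754 double). A hedged C++-flavored sketch, assuming the metadata sits in the first few kilobytes as is typical:

    // Sketch: scan the start of an FLV for the AMF0 "duration" property.
    #include <cstdio>
    #include <cstring>
    #include <cstdint>

    double ReadFlvDuration(const char* path)
    {
        FILE* f = fopen(path, "rb");
        if (!f) return -1.0;
        unsigned char buf[4096];
        size_t n = fread(buf, 1, sizeof(buf), f);  // metadata is assumed near the start
        fclose(f);

        for (size_t i = 0; i + 8 + 1 + 8 <= n; ++i) {
            // "duration" key followed by AMF0 number marker (0x00) and 8-byte double
            if (memcmp(buf + i, "duration", 8) == 0 && buf[i + 8] == 0x00) {
                uint64_t bits = 0;
                for (int b = 0; b < 8; ++b)
                    bits = (bits << 8) | buf[i + 9 + b];  // assemble big-endian bytes
                double d;
                memcpy(&d, &bits, sizeof d);              // reinterpret bits as a double
                return d;                                 // duration in seconds
            }
        }
        return -1.0;  // no duration entry found in the scanned window
    }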
You can use Netstream.appendBytes to feed FileReference.data (after a call to browse, before a call to upload) to a NetStream used for playing a video. From there, the duration can be taken from the metadata, as described elsewhere on this thread. Note that at least Flash Player 10 is required for this approach.
