flex video player doesn't play until the video is fully loaded - apache-flex

The player uses a VideoDisplay, and I set the source like videoDisplay.source = "sourceStringURL",
but the video doesn't play until it's fully loaded.

If you are trying to play a recorded MP4, it will not play until it is recorded in progressive format; if you can't specify progressive format, you can fix the file afterwards with a program called QTIndexSwapper.
When you record MP4 directly, the recorder doesn't know the final file size, so when you stop recording it writes the MOOV atom (the starting point/index of the MP4) at the end of the file. That atom needs to be at the beginning of the file for Flash Player to detect the length and where to start playback. QTIndexSwapper moves it for you; you will also find plenty of C-based programs that do the same.
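As a rough illustration of what QTIndexSwapper fixes, here is a small sketch in C# (the file name is hypothetical) that lists the top-level boxes ("atoms") of an MP4. If mdat shows up before moov, Flash Player can't start progressive playback until the whole file has arrived:

using System;
using System.IO;
using System.Text;

// Sketch: print the order of top-level MP4 boxes. If "mdat" comes
// before "moov", the file needs a tool such as QTIndexSwapper before
// Flash Player can play it progressively.
class MoovCheck
{
    static void Main()
    {
        using (var fs = File.OpenRead("recorded.mp4"))   // hypothetical path
        using (var br = new BinaryReader(fs))
        {
            while (fs.Position + 8 <= fs.Length)
            {
                long boxStart = fs.Position;
                uint size32 = ReadUInt32BE(br);                          // 32-bit big-endian size
                string type = Encoding.ASCII.GetString(br.ReadBytes(4)); // 4-char box type

                long boxSize = size32;
                if (size32 == 1)        // 64-bit "largesize" follows the type
                    boxSize = (long)ReadUInt64BE(br);
                else if (size32 == 0)   // box extends to the end of the file
                    boxSize = fs.Length - boxStart;

                Console.WriteLine("{0} at offset {1}, size {2}", type, boxStart, boxSize);
                if (boxSize < 8) break; // malformed header, stop scanning
                fs.Position = boxStart + boxSize;
            }
        }
    }

    static uint ReadUInt32BE(BinaryReader br)
    {
        byte[] b = br.ReadBytes(4);
        return (uint)((b[0] << 24) | (b[1] << 16) | (b[2] << 8) | b[3]);
    }

    static ulong ReadUInt64BE(BinaryReader br)
    {
        byte[] b = br.ReadBytes(8);
        ulong v = 0;
        for (int i = 0; i < 8; i++) v = (v << 8) | b[i];
        return v;
    }
}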

Related

C# AForge VideoFileWriter WriteVideoFrame error

I want to make an app that records video from a webcam.
My logic is to grab each frame as a bitmap and store it to a file using
AForge's VideoFileWriter.WriteVideoFrame function.
I open the file using the VideoFileWriter.Open function:
writer.Open(path, VideoWidth, VideoHeight, frameRate, VideoCodec.H264, bitRate);
It is hard to determine the bitRate; when the bitrate is wrong, the whole program dies without any error.
I think the bitrate is related to the video frame width, height, frame rate and bit count as well as the codec,
but I'm not sure of the specific formula to calculate it (a rough heuristic is sketched below).
I want to compress the video using the H.264 codec.
Can anyone help me find a solution?
Thank you very much.
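Not an official AForge formula, but a common rule of thumb is to scale the bitrate with the number of pixels pushed per second times a bits-per-pixel factor (roughly 0.07 to 0.15 for H.264). A minimal sketch in C#, assuming AForge.Video.FFMPEG and a 640x480, 25 fps capture; the numbers are starting points to tune, not fixed values:

using AForge.Video.FFMPEG;

class BitrateSketch
{
    static void Main()
    {
        int videoWidth = 640;        // assumed capture size
        int videoHeight = 480;
        int frameRate = 25;
        double bitsPerPixel = 0.1;   // rough quality factor for H.264; raise for better quality

        // Heuristic: pixels per second * bits per pixel.
        // 640 * 480 * 25 * 0.1 = 768,000 bps (~750 kbps)
        int bitRate = (int)(videoWidth * videoHeight * frameRate * bitsPerPixel);

        var writer = new VideoFileWriter();
        writer.Open("capture.mp4", videoWidth, videoHeight, frameRate,
                    VideoCodec.H264, bitRate);
        // ... call writer.WriteVideoFrame(bitmap) for each captured frame ...
        writer.Close();
    }
}

If the process still dies silently, try a conservative codec/bitrate pair first (e.g. MPEG4 at the value above) to rule out an unsupported codec in the bundled FFmpeg build.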

change recording file programmatically in directshow

I made a console application, using DirectShow, that records from a live source (now a webcam, later a TV capture card), adds the current date and time as an overlay, and then saves audio and video as .asf.
Now I want the output file to change every 60 minutes without stopping the graph. I must not lose any seconds of the live stream.
The graph is something like this one:
http://imageshack.us/photo/my-images/543/graphp.jpg/
I took a look at GMFBridge but I have some compiling problems with their examples.
I am wondering if there is a way to split what exists after the overlay filter and audio source, connect them to another ASF writer (paused), and then switch them every 60 minutes.
The paused ASF writer's file name must change (pp.asf, pp2.asf, pp4.asf ...). Something like this:
http://imageshack.us/photo/my-images/546/graph1f.jpg/
with pp1 paused. I have found some people on the internet who say that the ASF writer deletes the current file if the graph does not go into stop mode.
Well, I have a product (http://www.videophill.com) that does exactly what you described (it's used for broadcast compliance recording), and I found that the only way to do it is this:
create a DirectShow graph that will be used only to capture the audio and video
then, at the end of the graph, insert SampleGrabber filters, both for audio and video
then, use IWMWriter to create and save the WMV file, using samples fetched from the SampleGrabber filters
when the time comes, close one IWMWriter and create another one.
That way, you won't lose a single frame when switching the output files.
Of course, there is also the question of queueing and storing the samples (while switching the writers) and properly re-aligning the audio/video timestamps, but from my research that's the only 'normal' way to do it, and I have used it in practice.
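A rough sketch of the switching step only, in C#. SegmentWriter here is a hypothetical wrapper around the real IWMWriter calls (SetOutputFilename / BeginWriting / WriteSample / EndWriting go through COM and are omitted); the point is when to rotate and how to re-base the timestamps so each file starts near zero:

using System;

class RecorderSketch
{
    static readonly TimeSpan SegmentLength = TimeSpan.FromMinutes(60);

    SegmentWriter current;
    long segmentStart100ns;   // stream time at which the current file began
    int fileIndex;

    // Called from the audio/video SampleGrabber callbacks.
    public void OnSample(byte[] data, long streamTime100ns, bool isVideo)
    {
        if (current == null || streamTime100ns - segmentStart100ns >= SegmentLength.Ticks)
        {
            // Finish the old file and open the next one without dropping
            // the sample we are currently holding.
            if (current != null) current.Finish();
            fileIndex++;
            current = new SegmentWriter("pp" + fileIndex + ".wmv");
            segmentStart100ns = streamTime100ns;
        }

        // Timestamps inside each file must restart near zero.
        current.WriteSample(data, streamTime100ns - segmentStart100ns, isVideo);
    }
}

// Hypothetical stand-in for the IWMWriter-based writer.
class SegmentWriter
{
    public SegmentWriter(string fileName) { /* IWMWriter::SetOutputFilename + BeginWriting */ }
    public void WriteSample(byte[] data, long time100ns, bool isVideo) { /* IWMWriter::WriteSample */ }
    public void Finish() { /* IWMWriter::EndWriting, release */ }
}

TimeSpan.Ticks is already in 100 ns units, which matches DirectShow reference time, so the comparison above needs no conversion.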
The solution is to write a custom DirectShow filter, in your case with two input pins: one for the audio stream and the other for the video stream. Inside that filter (it doesn't have to be inside from an architecture point of view, because you could also use callbacks and do the job somewhere else) you would create the ASF files. While switching files, A/V data would be stored in a cache (e.g. a big enough circular buffer). You can also watch and adjust A/V sync in that filter. For writing ASF files I would recommend the Windows Media Format SDK. You can also add output pins if you'd like to pass A/V data further for preview, parallel streaming, etc.
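A minimal sketch of that cache, independent of DirectShow (the layout and names are just illustrative):

using System.Collections.Generic;

// Bounded sample cache: the filter pushes A/V samples here while the
// ASF file is being switched, and drains them into the new file once
// it is ready. Dropping the oldest sample keeps the graph from blocking.
class SampleCache
{
    private readonly Queue<KeyValuePair<long, byte[]>> queue = new Queue<KeyValuePair<long, byte[]>>();
    private readonly int capacity;

    public SampleCache(int capacity) { this.capacity = capacity; }

    public void Push(long timestamp, byte[] data)
    {
        if (queue.Count == capacity)
            queue.Dequeue();                              // drop the oldest sample
        queue.Enqueue(new KeyValuePair<long, byte[]>(timestamp, data));
    }

    public bool TryPop(out KeyValuePair<long, byte[]> sample)
    {
        if (queue.Count == 0) { sample = default(KeyValuePair<long, byte[]>); return false; }
        sample = queue.Dequeue();
        return true;
    }
}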
GMFBridge is a viable but complicated solution. A more direct approach I have implemented in the past is querying your ASF Writer for the IWMWriterAdvanced2 interface and setting a custom sink. That interface has methods to remove and add sinks on your ASF writer; the sink that is connected automatically will write to the file that you specified. One way to write wherever you want is:
1.) remove all default sinks:
pWriterAdv->RemoveSink(NULL);
2.) register a custom sink:
pWriterAdv->AddSink((IWMWriterSink*)&streamSink);
The custom sink can be a class that implements IWMWriterSink, which requires implementing callback methods that are called e.g. when the ASF header is written (OnHeader(/* [in] */ INSSBuffer *pHeader)) and when a data packet is written (OnDataUnit(/* [in] */ INSSBuffer *pDataUnit)). In your implementation you can then write the data wherever you want; for example, offer additional methods on this class that let you specify the file name to write to.
Note that this solution does not quite get you where you want to be if you need to write out the header information in each of the 60-minute files: after the initial header you will only get ASF packet data. A workaround could be to re-write the initial header before any packet data of each file, but this will produce an unindexed (non-seekable) ASF file.

get flv length before uploading to server

I'm using the FileReference class to upload FLVs to a server.
Is it possible to check the FLV length (not the size) before allowing an upload?
Are you targeting Flash Player 10 alone, or lower versions too? Lower versions of Flash Player (9 etc.) do not allow the uploading SWF to read the contents of the file (other than creationDate, creator (the Macintosh creator type of the file), modificationDate, name, size in bytes and type), so there is no way you are going to be able to do this on those players.
If you are targeting solely FP10 users, you can load the FLV into a ByteArray in your SWF and either:
1. Play it using an FLV player and read the duration property from the player. But I couldn't find an FLV player that takes a ByteArray as input, and after reading this thread on SO, it seems like that is not possible at all.
2. Parse the FLV file and read the duration property from its metadata (a rough sketch of this parse follows the spec excerpt below). The FLV file specification is open, but this isn't going to be easy.
Update to the comment:
Excerpts from the FLV file spec:
onMetaData
An FLV file can contain metadata with an “onMetaData” marker. Various stream properties are available to a running ActionScript program via the NetStream.onMetaData property. The available properties differ depending on the software used.
Common properties include:
duration: a DOUBLE indicating the total duration of the file in seconds
width: a DOUBLE indicating the width of the video in pixels
height: a DOUBLE indicating the height of the video in pixels
videodatarate: a DOUBLE indicating the video bit rate in kilobits per second
framerate: a DOUBLE indicating the number of frames per second
videocodecid: a DOUBLE indicating the video codec ID used in the file (see “Video tags” on page 8 for available CodecID values)
audiosamplerate: a DOUBLE indicating the frequency at which the audio stream is replayed
audiosamplesize: a DOUBLE indicating the resolution of a single audio sample
stereo: a BOOL indicating whether the data is stereo
audiocodecid: a DOUBLE indicating the audio codec ID used in the file (see “Audio tags” on page 6 for available SoundFormat values)
filesize: a DOUBLE indicating the total size of the file in bytes
The spec says an FLV file can contain metadata; it doesn't say it will contain metadata. It also says the available properties can vary based on the software used to create the FLV. So there is no guarantee (as per the spec) that the duration property will be present. That said, duration is one of the basic properties of an FLV, and it is safe to assume that any reasonable software will include it.
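If you go with option 2, the usual shortcut is to look inside the first script-data tag for the "duration" key and read the AMF0 number that follows it. A naive sketch in C# (the same byte-level logic works on a ByteArray in ActionScript, which is big-endian by default); it assumes a well-formed onMetaData tag within the first 64 KB and a little-endian machine for the double conversion:

using System;
using System.IO;
using System.Text;

class FlvDuration
{
    static void Main()
    {
        // Hypothetical file; in the upload scenario this would be the
        // bytes obtained from FileReference.data instead.
        byte[] head = new byte[64 * 1024];
        int read;
        using (var fs = File.OpenRead("movie.flv"))
            read = fs.Read(head, 0, head.Length);

        byte[] key = Encoding.ASCII.GetBytes("duration");
        for (int i = 0; i + key.Length + 9 <= read; i++)
        {
            bool match = true;
            for (int j = 0; j < key.Length; j++)
                if (head[i + j] != key[j]) { match = false; break; }

            // AMF0 number marker (0x00) followed by an 8-byte big-endian double.
            if (match && head[i + key.Length] == 0x00)
            {
                byte[] num = new byte[8];
                Array.Copy(head, i + key.Length + 1, num, 0, 8);
                Array.Reverse(num);   // big-endian -> little-endian
                Console.WriteLine("duration: {0} seconds", BitConverter.ToDouble(num, 0));
                return;
            }
        }
        Console.WriteLine("no duration found in the first 64 KB");
    }
}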
You can use NetStream.appendBytes to feed FileReference.data (after a call to browse, before a call to upload) into a NetStream used for playing a video. From there, the duration can be taken from the metadata, as described elsewhere in this thread. Note that at least Flash Player 10 is required for this approach.

Get mp3 total track time using either javascript or ASP.NET

I am using the jQuery plugin below for playing MP3s:
www.happyworm.com/jquery/jplayer
However, there is a bug in Flash where the total play (track) time won't show up correctly UNTIL AFTER the whole MP3 is completely downloaded.
I wonder if there is a way to work around this and get the correct total time using either JavaScript, another Flash player, or even a backend library in ASP.NET. Any suggestion helps. Thanks.
Are you sure that's a bug? Looking at the header definition for the MP3 format, I don't see any field for the length of the file. Generally, applications that play MP3s have to calculate the time themselves, and that may not be doable until the entire file is downloaded. So the behavior you're seeing from Flash might be expected.
Theoretically if it's a fixed bitrate file (as opposed to VBR) then knowing the bitrate (gotten from the header) and the total size of the file should be enough to calculate it. However, the server would have to report the size of the file in the response headers (and that's not guaranteed to be accurate).
My guess is you'd need some service on the server that could calculate the length and report that to you in a separate request.
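For the CBR case that boils down to fileSizeInBits / bitrate. A tiny server-side sketch in C# (the path and bitrate are assumptions; a real endpoint would read the bitrate from the first MPEG frame header, which this sketch does not do):

using System;
using System.IO;

class Mp3DurationEstimate
{
    static void Main()
    {
        string path = "track.mp3";   // hypothetical file on the server
        int bitrateKbps = 128;       // assumed constant bitrate

        // duration (s) = size in bits / bits per second.
        // ID3 tags add a small error; subtract the tag size for more accuracy.
        long sizeBytes = new FileInfo(path).Length;
        double seconds = (sizeBytes * 8.0) / (bitrateKbps * 1000.0);

        Console.WriteLine("Estimated duration: {0:F1} s", seconds);
    }
}

For VBR files this estimate can be far off; those need the Xing/VBRI header or a full frame scan.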

Generate flv, mpg or some other movie format from an ActionScript movie clip

I am working on a Flex application/game where a lot of UIComponents are moved around on a canvas.
I would like to "record" an FLV movie of the movement on the canvas. Is there any way this can be accomplished?
I essentially want my users to be able to record small FLV videos of their games to be uploaded to YouTube.
Any ideas or suggestions about how to do this?
There is SimpleFlvWriter (for AIR). You may modify it to get a non-AIR version. But memory management will be an issue, since the BitmapData will take up a lot of memory; it may be possible for a few seconds of FLV, but definitely not for several minutes.
Usually we stream things to a Flash media server (e.g. Flash Media Server, Red5) and let the server create the FLV. But you need to find a way to convert the screen captures to a NetStream. Or you may find other server-side technology that can create an FLV from a sequence of BitmapData. Either way it will consume a lot of bandwidth.
An alternative I can think of is to save all the game commands (in XML or another text format) and send them to the server, and then write a server-side program to generate the FLV from the game commands alone. But that would be the most difficult solution to implement.
