How to play an FLV-formatted ByteArray in OSMF? - apache-flex

I am developing a web application in Flex that has a feature for recording the runtime by taking a snapshot of each frame and encoding the snapshots into a ByteArray for video playback.
I am currently using NetStream.appendBytes() to play the FLV ByteArray. It works, but I just found out about OSMF and am thinking about integrating it into my application.
Is it possible to play the FLV ByteArray in OSMF? An example of how it can be done would be great. Thanks!

I am now able to play FLV ByteArrays in OSMF. Beforehand, I had already been able to play a ByteArray by creating a new class that extends NetStream and overriding its play() method to use appendBytes() instead (a rough sketch of that override appears after the usage snippet below). So what I did was make OSMF use it. I did this by creating these classes:
1. ByteStreamElement - media element
2. ByteStreamLoader - extends LoaderBase
3. ByteStreamLoadTrait - extends LoadTrait
and by overriding NetStream's seek()/play() methods:
//manually dispatch seek event since we override seek()
dispatchEvent(new NetStatusEvent(NetStatusEvent.NET_STATUS,false,false, {code:"NetStream.Play.Seek", level:"status"}));
//look for byte position based on _seekTime value
flvStream = _sfw.getFlvStream(false);
_seekTime = parameters[1] * 1000; //netstream time in milliseconds
_flvParser.parse(flvStream, false, flvTagSeeker);
flvStream.position = _flvParserProcessed;
//append flvtag from the new byte position to end of flv byteArray
var tmp:ByteArray = new ByteArray();
flvStream.readBytes(tmp, 0, flvStream.bytesAvailable);
_flvParserProcessed = 0;
this.appendBytesAction(NetStreamAppendBytesAction.RESET_SEEK);
appendBytes(tmp);
And using it like this:
mediaPlayerSprite = new MediaPlayerSprite();
addChild(mediaPlayerSprite);
mediaPlayerSprite.media = new ByteStreamElement();
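For reference, here is a minimal sketch of the play() override mentioned above. The class name ByteStreamNetStream and its constructor are my own (hypothetical); appendBytes() and appendBytesAction() are the standard NetStream data-generation-mode API:
package {
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.NetStreamAppendBytesAction;
    import flash.utils.ByteArray;

    // Hypothetical minimal NetStream subclass that plays an in-memory FLV ByteArray.
    public class ByteStreamNetStream extends NetStream {
        private var _flvBytes:ByteArray;

        public function ByteStreamNetStream(connection:NetConnection, flvBytes:ByteArray) {
            super(connection);
            _flvBytes = flvBytes;
        }

        override public function play(...parameters):void {
            // play(null) puts the NetStream into data generation mode,
            // so all media bytes must be supplied through appendBytes().
            super.play(null);
            appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
            appendBytes(_flvBytes); // the ByteArray must start with a valid FLV header
        }
    }
}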
I'm really not sure, though, whether this is the best way to do it. I don't know if creating new classes was the right approach, or whether I should have written some sort of plugin for OSMF to play ByteArrays. Another thing is that what I really need is to continually append bytes to the player as needed. That's why I'm still not using this; for the time being I'll stick with my custom-made "ByteStream player" until I figure this out.

Related

Flex/AS code for webcam

Can anyone point me to a small piece of Flex/AS code to test the webcam and microphone?
Thanks
Adobe supplies examples in the documentation; read flash.media.Camera and flash.media.Microphone. The examples you are looking for are under the getCamera() and getMicrophone() public methods of the Camera and Microphone classes, respectively.
The smallest amount of code you can use to create a web cam video feed is this:
var cam:Camera = Camera.getCamera();
cam.setMode(640,480,30);
var video:Video = new Video(640, 480);
video.attachCamera(cam);
addChild(video);
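The question also asks about the microphone; a minimal check (this snippet is my addition, not part of the original answer) just grabs the default microphone and loops it back to the speakers so you can hear yourself:
var mic:Microphone = Microphone.getMicrophone();
if (mic != null) {
    mic.setLoopBack(true); // route the mic straight to the speakers for a quick test
    mic.gain = 60;         // 0-100; the default is around 50
}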
Glancing over the docs will give you some additional properties and arguments that you can use to suit your needs.
Update
Here is a link to a working version of this code http://wonderfl.net/c/oGEY

VLC player in Adobe AIR app does not stop/quit

Well, I have an Adobe AIR app, downloaded from the link below; it is a wonderful app:
http://www.adobe.com/devnet/air/flex/articles/air_screenrecording.html
It works fine: it captures my screen and records audio, but it just does not stop or quit, and vlc-player.exe continues to run in Task Manager.
I tried lots of VLC commands, but it just does not stop once it starts capturing screen video.
I need help with this.
I know this is an old thread, but just in case someone wants to know...
You can't use rc-fake-tty because Windows doesn't support a (fake) TTY terminal. On Windows, tell VLC to run with only one instance, then send it the quit command as a separate NativeProcess call.
So, in the linked article, change the stopRecording() method to this:
public function stopRecording():void {
    var startupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
    startupInfo.executable = vlcFile;

    var processArgs:Vector.<String> = new Vector.<String>();
    processArgs.push("-I");
    processArgs.push("rc"); // remote control
    processArgs.push("--one-instance");
    processArgs.push("vlc://quit");
    startupInfo.arguments = processArgs;

    var killSwitch:NativeProcess = new NativeProcess();
    killSwitch.start(startupInfo);
}
And make sure to add this:
processArgs.push("--one-instance");
to the arguments of your initial screen-recording startupInfo in the startRecording() method, as sketched below.
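For illustration only, the relevant part of startRecording() would then look roughly like this; the actual capture/transcode arguments come from the linked article and are not reproduced here:
// Sketch only: the point is that --one-instance must also be passed when recording
// starts, so that the later "vlc://quit" command is routed to this same VLC process.
var processArgs:Vector.<String> = new Vector.<String>();
processArgs.push("--one-instance");
// ... push the article's screen-capture / transcode arguments here ...
startupInfo.arguments = processArgs;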
I quit using VLC for the same reason and started writing my recording application using .NET 4, but I am getting worse performance with C# now.
Edit:
VLC for Windows does not support fake rc control, so setting rc-fake-tty is useless. As a last resort, I want to control it via a socket. If you get it working this way, please let me know.

Wowza + Flex not playing back whole audio

I'm writing a Flex app which must record audio and then play it back. It records just fine, and I can hear the FLV on the server, but on playback it cuts off the end a little bit, and each time I ask it to play again it cuts a little bit more. What could it be? I guess it's something related to buffer management, but I don't know exactly. Any thoughts?
EDIT: Here's the code I'm using for playback. It is called from a mediator:
var streamPlayClient:Object = new Object();
this.stream.client = streamPlayClient;
streamPlayClient.onPlayStatus = function(infoObject:Object):void {
    if (infoObject.code == "NetStream.Buffer.Flush") {
        stopPlayback();
    }
};
this.stream.play("flv:" + this.streamName);
As it turns out, I have to handle the NetStream.Buffer.Empty event instead of NetStream.Play.Complete or NetStream.Buffer.Flush.
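A minimal sketch of that change, reusing the stream and stopPlayback() from the snippet above (buffer codes such as NetStream.Buffer.Empty are dispatched as NetStatusEvent.NET_STATUS events on the NetStream, not through the onPlayStatus client callback):
// Listen on the NetStream itself for buffer status codes.
this.stream.addEventListener(NetStatusEvent.NET_STATUS,
    function(event:NetStatusEvent):void {
        if (event.info.code == "NetStream.Buffer.Empty") {
            stopPlayback(); // stop only once the buffer has fully drained
        }
    });
this.stream.play("flv:" + this.streamName);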

Flex: slow first HTTP request

When I use loader.load(request); for the first time, my Flex app freezes for 10 seconds before posting the data (I can see the web server result in real time).
However, if I redo a similar POST with other data but the same request.url, it's instantaneous.
// Multi form encoded data
variables = new URLVariables();
variables.user = "aaa";
variables.boardjpg = new URLFileVariable(data.boardBytes, "foo.jpg");
request = new URLRequestBuilder(variables).build();
request.url = "http://localhost:8000/upload/";
loader.load(request);
How can I see what is taking so long?
Thanks!
OK, this is an old question; anyway, I found it while searching for other things, so I'm quickly adding this:
Neither URLFileVariables nor URLRequestBuilder is a core AS3 class, so I guess you're using some custom library to build your request. I don't know which library you use, but it seems that its purpose is to serialize some binary data to build a POST. Serializing usually takes some time the first time (lookup initialization and the like) and goes faster afterwards; a well-known example is Remoting in its different flavours.
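To answer the "how can I see what is taking so long" part: one quick way (a sketch of mine, using only the objects from the question plus flash.utils.getTimer()) is to time the build step separately from the network round trip:
import flash.events.Event;
import flash.utils.getTimer;

var t0:int = getTimer();
request = new URLRequestBuilder(variables).build(); // serialization/encoding happens here
trace("build() took", getTimer() - t0, "ms");
request.url = "http://localhost:8000/upload/";

var t1:int = getTimer();
loader.addEventListener(Event.COMPLETE, function(e:Event):void {
    trace("upload round trip took", getTimer() - t1, "ms");
});
loader.load(request);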

Streaming webcam video in Flash using MP4 encoding

One of the features of the Flash app I'm working on is to be able to stream a webcam to others. We're just using the built-in webcam support in Flash and sending it through FMS.
We've had some people ask for higher quality video, but we're already using the highest quality setting we can in Flash (setting quality to 100%).
My understanding is that the newer Flash Players added support for MPEG-4 encoding of video. I created a simple test Flex app to try to compare the video quality of the MP4 vs. FLV encodings. However, I can't seem to get MP4 to work at all.
According to the Flex documentation, the only thing I need to do to use MP4 instead of FLV is prepend "mp4:" to the name of the stream when calling publish:
Specify the stream name as a string with the prefix mp4: with or without the filename extension. The prefix indicates to the server that the file contains H.264-encoded video and AAC-encoded audio within the MPEG-4 Part 14 container format.
When I try this, nothing happens. I don't get any events raised on the client side, no exceptions are thrown, and my logging on the server side doesn't show any streams starting.
Here's the relevant code:
// These are all defined and created within the class.
private var nc:NetConnection;
private var sharing:Boolean;
private var pubStream:NetStream;
private var format:String;
private var streamName:String;
private var camera:Camera;
// called when the user clicks the start button
private function startSharing():void {
    if (!nc.connected) {
        return;
    }
    if (sharing) { return; }

    if (pubStream == null) {
        pubStream = new NetStream(nc);
        pubStream.attachCamera(camera);
    }

    startPublish();
    sharing = true;
}

private function startPublish():void {
    var name:String;
    if (this.format == "mp4") {
        name = "mp4:" + streamName;
    } else {
        name = streamName;
    }
    //pubStream.publish(name, "live");
    pubStream.publish(name, "record");
}
It would be helpful to know the version of FMS you are running.
It seems like you need at least FMS 3.0.2.
Are you sure this applies to live streams and not only to recording? These links (1, 2) suggest that while the player can decode Sorenson, VP6, and H.264, it can only encode in Sorenson.
I'm in a similar situation, so I would like to have this clarified.
Edit: what actually makes me doubt this is that the documentation says FLV and MP4, which aren't codecs but containers; live streaming doesn't use containers, the encoded frames travel directly inside RTMP packets.
Flash Player doesn't encode using H.264, but Flash Media Server can record any codec in the F4V container. Flash Media Live Encoder can encode using H.264.
So basically you can't send H.264 from the web Flash Player (yet?)...
