Can anyone point me to a small piece of Flex/AS code to test the webcam and microphone?
Thanks
Adobe supplies examples in the documentation. Read flash.media.Camera and flash.media.Microphone. The examples you are looking for are under the getCamera() and getMicrophone() public methods of the Camera and Microphone classes respectively.
The smallest amount of code you can use to create a webcam video feed is this:
var cam:Camera = Camera.getCamera();   // default camera, or null if none is available
cam.setMode(640, 480, 30);             // width, height, fps
var video:Video = new Video(640, 480);
video.attachCamera(cam);
addChild(video);
Glancing over the docs will give you some additional properties and arguments that you can use to suit your needs.
Update
Here is a link to a working version of this code: http://wonderfl.net/c/oGEY
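The microphone half is just as small. A minimal sketch (the loopback and the trace listener are only there so you can confirm the mic is working):
var mic:Microphone = Microphone.getMicrophone(); // default microphone, or null if none is available
if (mic != null) {
    mic.setUseEchoSuppression(true);
    mic.setLoopBack(true); // route the mic to the speakers so you can hear it
    mic.addEventListener(ActivityEvent.ACTIVITY, function(e:ActivityEvent):void {
        trace("mic activity level: " + mic.activityLevel);
    });
}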
I am working on a Mono for Android app and want to show a route map between two points. I implemented the Xamarin.Android Map API to get the current location.
I couldn't figure out how to plot a route from the current location to a destination. Is there any library or framework for this?
The Xamarin.Android Map API is based on the Google Maps Android v1 API, which is deprecated as of December 3rd, 2012. If you really, really want to implement Google Maps I would recommend looking at the Xamarin Maps and Location Demo v2, which uses the Google Maps Android v2 API. However, drawing a route is still not an easy task - you can use the Google Directions API to obtain a polyline and use it to draw a shape on the map as described here (BTW, note the limitations of the Directions API's free usage).
I suggest a different approach. If you're OK with using OpenStreetMap, I would definitely go for the OSMDroid library along with the OSMBonusPack, which does all the magic for you. Take a look at the Xamarin OSMDroid Binding sample. To include the Bonus Pack you can add its JAR to the OSMDroid Binding solution and fix two visibility issues by adding the following to Metadata.xml:
<attr path="/api/package[@name='org.osmdroid.bonuspack.routing']/class[@name='RoadLink']" name="visibility">public</attr>
<attr path="/api/package[@name='org.osmdroid.bonuspack.overlays']/class[@name='MapEventsOverlay']/method[@name='draw']" name="visibility">public</attr>
Having this set up, there is a very nice RoadManager that will plot the route for us. Below is a sample code snippet:
public class MainActivity : Activity
{
    private IMapController _mapController;
    private MapView _mapView;

    protected override void OnCreate(Bundle bundle)
    {
        base.OnCreate(bundle);
        SetContentView(Resource.Layout.Main);

        _mapView = FindViewById<MapView>(Resource.Id.mapview);
        _mapView.SetTileSource(TileSourceFactory.DefaultTileSource);
        _mapView.SetMultiTouchControls(true);
        _mapController = _mapView.Controller;

        RoadManager roadManager = new MapQuestRoadManager();
        JavaList<GeoPoint> waypoints = new JavaList<GeoPoint>();
        waypoints.Add(new GeoPoint(51.776625, 19.454834)); // start point
        waypoints.Add(new GeoPoint(51.770839, 19.464962)); // end point

        Road road = roadManager.GetRoad(waypoints);
        PathOverlay roadOverlay = RoadManager.BuildRoadOverlay(road, _mapView.Context);
        _mapView.Overlays.Add(roadOverlay);
        _mapView.Invalidate();
    }
}
More tutorials on this topic are located here.
I'm a new user to the C4iOS framework. I'm working my way through the tutorials/examples and wondering how one goes about playing back audio (as opposed to video, which is covered in the example texts).
Thanks in advance for answering my less-than-advanced 'n00b' question.
-jf
Audio samples are fairly similar to movie objects, although they don't have an option like shouldAutoplay that will get them running as soon as the application loads.
The easiest way to construct a sample is like this:
@implementation C4WorkSpace {
    C4Sample *audioSample;
}
-(void)setup {
    audioSample = [C4Sample sampleNamed:@"C4Loop.aif"];
}
This builds the audio sample object as a variable that you can then reference in other methods. For instance, if you want to play a sound clip when you first touch the screen you would do the following:
-(void)touchesBegan {
    [audioSample play];
}
To toggle the playback for each touch, you would do something like:
-(void)touchesBegan {
    if(audioSample.isPlaying) {
        [audioSample stop];
    } else {
        [audioSample play];
    }
}
A working copy of a C4 app that toggles playback can be found HERE.
There are also a lot of properties for audio samples that let you control things like playback rate, volume, panning and so on.
An example of changing the volume is like this:
audioSample.volume = 0.5; //0 = mute, 1 = full volume
An example of skipping to a specific time in a sample would look like this:
audioSample.currentTime = 1.0f; //this will put the "playhead" to 1.0 second
You can have a look at the C4Sample documentation to see more properties and other aspects of the class. The documentation is also available via the Xcode organizer.
I am developing a web application in Flex which has a feature of recording the runtime by taking a snapshot of each frame and then encoding them into a ByteArray for video playback.
I am currently using NetStream.appendBytes() to play the ByteArray FLV. It works, but I just found out about OSMF and am thinking about integrating it into my application.
Is it possible to play the FLV ByteArray in OSMF? An example of how it can be done would be totally great. Thanks!
I am now able to play FLV ByteArrays in OSMF. Beforehand, I had already been able to play a ByteArray by creating a new class that extends NetStream and overriding its play() method to use appendBytes() instead. So what I did was make OSMF use it. I did this by creating these classes:
1. ByteStreamElement - media element
2. ByteStreamLoader - extends LoaderBase
3. ByteStreamLoadTrait - extends LoadTrait
Overriding NetStream's seek/play method:
//manually dispatch seek event since we override seek()
dispatchEvent(new NetStatusEvent(NetStatusEvent.NET_STATUS,false,false, {code:"NetStream.Play.Seek", level:"status"}));
//look for byte position based on _seekTime value
flvStream = _sfw.getFlvStream(false);
_seekTime = parameters[1] * 1000; //netstream time in milliseconds
_flvParser.parse(flvStream, false, flvTagSeeker);
flvStream.position = _flvParserProcessed;
//append flvtag from the new byte position to end of flv byteArray
var tmp:ByteArray = new ByteArray();
flvStream.readBytes(tmp, 0, flvStream.bytesAvailable);
_flvParserProcessed = 0;
this.appendBytesAction(NetStreamAppendBytesAction.RESET_SEEK);
appendBytes(tmp);
And using it like this:
mediaPlayerSprite = new MediaPlayerSprite();
addChild(mediaPlayerSprite);
mediaPlayerSprite.media = new ByteStreamElement();
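For reference, the appendBytes() play pattern that the custom NetStream relies on looks roughly like this (a simplified sketch; the class name and _flvBytes are placeholders for wherever you keep the recorded FLV bytes):
package {
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.NetStreamAppendBytesAction;
    import flash.utils.ByteArray;

    public class ByteStreamNetStream extends NetStream {
        private var _flvBytes:ByteArray;

        public function ByteStreamNetStream(connection:NetConnection, flvBytes:ByteArray) {
            super(connection);
            _flvBytes = flvBytes;
        }

        override public function play(...parameters):void {
            super.play(null); // null switches the NetStream into data generation (appendBytes) mode
            appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
            appendBytes(_flvBytes); // the complete FLV (header + tags) captured at runtime
        }
    }
}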
I'm really not sure though if this is the best way to do it. I'm not sure whether it was best to create new classes, or whether I should have written some sort of plugin for OSMF to play ByteArrays. Another thing is that what I really need is to continually append bytes into the player as needed. That's why I'm still not using this, and for the meantime I'll stick with my custom-made "ByteStream player" until I figure this out.
Well, I have an Adobe AIR app, downloaded from the link below. It is a wonderful app.
http://www.adobe.com/devnet/air/flex/articles/air_screenrecording.html
It works fine. It captures my screen and records audio, but it just does not stop or quit, as vlc-player.exe continues to run in the Task Manager.
I tried lots of VLC commands, but it just does not stop once it starts capturing screen video.
I need help with it.
I know this is an old thread, but just in case someone wants to know...
You can't use rc-fake-tty because the Windows build of VLC doesn't support it. On Windows, tell VLC to run with only one instance, then send it the quit command as a separate NativeProcess call.
So, in the linked article, change the stopRecording() method to this:
public function stopRecording():void {
    var startupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
    startupInfo.executable = vlcFile;

    var processArgs:Vector.<String> = new Vector.<String>();
    processArgs.push("-I");
    processArgs.push("rc"); // remote control interface
    processArgs.push("--one-instance");
    processArgs.push("vlc://quit");
    startupInfo.arguments = processArgs;

    var killSwitch:NativeProcess = new NativeProcess();
    killSwitch.start(startupInfo);
}
And make sure to add this:
processArgs.push("--one-instance");
to the arguments of your initial screen-recording startupInfo in the startRecording() method.
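Roughly, that part of startRecording() ends up looking like this (a sketch only; the screen:// input and the recordingProcess member are illustrative placeholders - keep whatever capture/transcode arguments the article already builds):
private function startRecording():void {
    var startupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
    startupInfo.executable = vlcFile;

    var processArgs:Vector.<String> = new Vector.<String>();
    processArgs.push("-I");
    processArgs.push("rc");             // same remote control interface as before
    processArgs.push("--one-instance"); // lets the later vlc://quit reach this instance
    processArgs.push("screen://");      // screen capture input (illustrative)
    // ...the article's original transcode/output arguments go here...
    startupInfo.arguments = processArgs;

    recordingProcess = new NativeProcess(); // placeholder member holding the running VLC process
    recordingProcess.start(startupInfo);
}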
I quit using VLC for the same reason and started to write my recording application using .NET 4, but I am getting worse performance with C# now.
Edit:
VLC for Windows does not support fake rc control, so setting rc-fake-tty is useless. As a very last resort, I want to control it via a socket. If you get it working this way, please let me know.
One of the features of the Flash app I'm working on is to be able to stream a webcam to others. We're just using the built-in webcam support in Flash and sending it through FMS.
We've had some people ask for higher quality video, but we're already using the highest quality setting we can in Flash (setting quality to 100%).
My understanding is that the newer Flash Players added support for MPEG-4 encoding for videos. I created a simple test Flex app to try to compare the video quality of the MP4 vs. FLV encodings. However, I can't seem to get MP4 to work at all.
According to the Flex documentation the only thing I need to do to use MP4 instead of FLV is prepend "mp4:" to the name of the stream when calling publish:
Specify the stream name as a string with the prefix mp4: with or without the filename extension. The prefix indicates to the server that the file contains H.264-encoded video and AAC-encoded audio within the MPEG-4 Part 14 container format.
When I try this nothing happens. I don't get any events raised on the client side, no exceptions thrown, and my logging on the server side doesn't show any streams starting.
Here's the relevant code:
// These are all defined and created within the class.
private var nc:NetConnection;
private var sharing:Boolean;
private var pubStream:NetStream;
private var format:String;
private var streamName:String;
private var camera:Camera;
// called when the user clicks the start button
private function startSharing():void {
    if (!nc.connected) {
        return;
    }
    if (sharing) { return; }

    if (pubStream == null) {
        pubStream = new NetStream(nc);
        pubStream.attachCamera(camera);
    }

    startPublish();
    sharing = true;
}

private function startPublish():void {
    var name:String;
    if (this.format == "mp4") {
        name = "mp4:" + streamName;
    } else {
        name = streamName;
    }
    //pubStream.publish(name, "live");
    pubStream.publish(name, "record");
}
It would be helpful to know the version of FMS you are running.
It seems like you need at least FMS 3.0.2.
Are you sure this applies to live streams and not only to recording? These links (1, 2) suggest that while the player can decode Sorenson, VP6 and H.264, it can only encode in Sorenson.
I'm in a similar situation, so I would like to have this clarified.
Edit: What actually makes me doubt this is that the documentation says FLV and MP4, which aren't codecs but containers. Live streaming doesn't use containers; the encoded frames travel directly inside RTMP packets.
Flash Player doesn't encode using H.264, but Flash Media Server can record any codec in the F4V container. Flash Media Live Encoder can encode using H.264.
So basically you can't send H.264 from the web Flash Player (yet?)...
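For what it's worth, the playback side can decode H.264 fine, and the same mp4: prefix is used when subscribing. A minimal sketch, assuming a placeholder FMS application URL and a file already recorded on the server:
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.client = { onMetaData: function(info:Object):void {} }; // avoid async errors from metadata callbacks
        var video:Video = new Video(640, 480);
        video.attachNetStream(ns);
        addChild(video);
        ns.play("mp4:recordedStream.f4v"); // placeholder name of a file recorded on the server
    }
});
nc.connect("rtmp://localhost/myApp"); // placeholder application URL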