I am trying to play from local storage an encrypted video using ExoPlayer.
The command used to encrypt the video using FFMPEG is as follows:
-i /storage/emulated/0/Download/20210125_193031.mp4 -vcodec copy -acodec copy -c:v libx264 -encryption_scheme cenc-aes-ctr -encryption_key b42ca3172ee4e69bf51848a59db9cd13 -encryption_kid 09e367028f33436ca5dd60ffe6671e70 /storage/emulated/0/Download/out_enc.mp4
Here is the source code of my player:
public class PlayerActivity extends AppCompatActivity {

    private SimpleExoPlayer player;
    private DefaultDrmSessionManager drmSessionManager;

    @Override
    protected void onCreate(final Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player);
        // Build the media item.
        PlayerView playerView = findViewById(R.id.video_view);
        player = new SimpleExoPlayer.Builder(this).build();
        playerView.setPlayer(player);
        //player.prepare();
        //FFMPEG command: -i /storage/emulated/0/Download/20210125_193031.mp4 -vf scale=-1:720 -c:v libx264 -encryption_scheme cenc-aes-ctr -encryption_key b42ca3172ee4e69bf51848a59db9cd13 -encryption_kid 09e367028f33436ca5dd60ffe6671e70 /storage/emulated/0/Download/out_enc.mp4
        //base64 keys generated from: https://www.base64encode.org/
        //playVideo("/storage/emulated/0/Download/out_enc.mp4", "MDllMzY3MDI4ZjMzNDM2Y2E1ZGQ2MGZmZTY2NzFlNzA=", "YjQyY2EzMTcyZWU0ZTY5YmY1MTg0OGE1OWRiOWNkMTM=");
        playVideo("/storage/emulated/0/Download/out_enc.mp4", "CeNnAo8zQ2yl3WD/5mcecA", "tCyjFy7k5pv1GEilnbnNEw");
    }

    private void playVideo(String url, String keyID, String keyValue) {
        try {
            drmSessionManager = buildDrmSessionManager(
                    Util.getDrmUuid(C.CLEARKEY_UUID.toString()), true, keyID, keyValue);
        } catch (Exception e) {
            e.printStackTrace();
        }
        player.setMediaSource(buildDashMediaSource(Uri.parse(url)));
        player.prepare();
        player.setPlayWhenReady(true);
    }

    private MediaSource buildDashMediaSource(Uri uri) {
        DefaultDataSourceFactory dashChunkSourceFactory = new DefaultDataSourceFactory(this, "agent");
        return new ProgressiveMediaSource.Factory(dashChunkSourceFactory)
                .setDrmSessionManager(drmSessionManager)
                .createMediaSource(uri);
    }

    private DefaultDrmSessionManager buildDrmSessionManager(UUID uuid, Boolean multiSession, String id, String value) {
        /* String base64Id = Base64.encodeToString(id.getBytes(), Base64.DEFAULT);
           String base64Value = Base64.encodeToString(value.getBytes(), Base64.DEFAULT); */
        String keyString = "{\"keys\":[{\"kty\":\"oct\",\"k\":\"" + value + "\",\"kid\":\"" + id + "\"}],\"type\":\"temporary\"}";
        LocalMediaDrmCallback drmCallback = new LocalMediaDrmCallback(keyString.getBytes());
        FrameworkMediaDrm mediaDrm = null;
        try {
            mediaDrm = FrameworkMediaDrm.newInstance(uuid);
        } catch (UnsupportedDrmException e) {
            e.printStackTrace();
        }
        return new DefaultDrmSessionManager(uuid, mediaDrm, drmCallback, null, multiSession);
    }

    @Override
    protected void onDestroy() {
        player.release();
        super.onDestroy();
    }
}
Here is the link to the encrypted video.
The main issue: the video is playing but is not decrypted. What am I missing?
Looking at the logcat output there does not seem to be any DRM, AES or clearkey errors.
Looking then at the video file itself with ffprobe, it appears to report some issues. However, checking against other sample files encrypted using the same ffmpeg approach you used, they show similar issues, so this appears to be typical output from ffprobe for a file encrypted in this way.
Looking then at the video file structure itself with an MP4 parser to see the individual atoms, or the header blocks, there does not appear to be a PSSH box.
A PSSH box is a header area that contains the data about the encryption for an ISOBMFF mp4 file - this is actually an optional field in the CENC spec so your video is valid even without this.
The obvious question then is: how would a player know the video is encrypted? The answer, according to the CENC spec, is:
Detection
For a stream determined to be in the ISO Base Media File Format [ISOBMFF], this ISO Common Encryption ('cenc') Protection Scheme may be detected as follows.
Protection scheme signaling conforms with [ISOBMFF]. When protection has been applied, the stream type will be transformed to 'encv' for video or 'enca' for audio, with a Protection Scheme Information Box ('sinf') added to the sample entry in the Sample Description Box ('stsd'). The Protection Scheme Information Box ('sinf') will contain a Scheme Type Box ('schm') with a scheme_type field set to a value of 'cenc'
Looking at your video using an MP4 analyser (see below) shows that it does indeed have the stream type shown as 'encv' in the stsd box (checked with the inspect tool listed below).
Testing playback with ffplay itself using the same encryption key shows that the video does actually play successfully:
ffplay out_enc.mp4 -decryption_key b42ca3172ee4e69bf51848a59db9cd13
However, a regular player will not be able to play it unless you provide the decryption key. It would be reasonable to expect the player to flag an error in this case, but that does not appear to be happening with some common players I checked, including VLC, so it is quite possible ExoPlayer on Android is not flagging this either.
Looking at ExoPlayer specifically, as noted by Duna in the comments below and outlined in this GitHub thread https://github.com/google/ExoPlayer/issues/8532#issuecomment-771811707, ExoPlayer does not currently (Feb 2021) read the PSSH box for MP4, only for fragmented MP4. From that thread:
After looking deeper into this, I found that ExoPlayer's Mp4Extractor actually doesn't read pssh boxes. Currently we only read this info in fragmented MP4 files (using FragmentedMp4Extractor). This means even when playing the file with the pssh box, the drmInitData still ends up null - meaning playback fails. I didn't realise this limitation when you initially filed the issue, otherwise I would have flagged it earlier.
However, looking at the Mp4Extractor code, it does check for 'encv' and it does also check for the default key_id, both of which are present in the video file produced when inspected. Again, it would be reasonable for ExoPlayer to flag an error if it either does not find these in a format it can understand, or if it finds them but is not supplied a corresponding key to play the file with.
So how could the video be encrypted and played reliably?
You could use ffplay on Android, although I suspect this would not be too straightforward, based on past experience using ffmpeg on Android.
There are also some easier-looking examples leveraging ExoPlayer that would be worth a look - e.g.:
https://stackoverflow.com/a/62678769/334402
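Since ExoPlayer's FragmentedMp4Extractor does read PSSH boxes (per the thread quoted above), another avenue worth testing is having ffmpeg write a fragmented MP4 instead of a regular one. This is only a sketch - I have not verified that ffmpeg's encryption and fragmentation options combine cleanly - but the individual flags are standard:

ffmpeg -i /storage/emulated/0/Download/20210125_193031.mp4 -c:v libx264 -c:a copy \
  -encryption_scheme cenc-aes-ctr \
  -encryption_key b42ca3172ee4e69bf51848a59db9cd13 \
  -encryption_kid 09e367028f33436ca5dd60ffe6671e70 \
  -movflags frag_keyframe+empty_moov /storage/emulated/0/Download/out_frag_enc.mp4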
You could also look at leveraging DASH. Most media that is streamed to mobile devices these days uses a streaming protocol like DASH or HLS - these formats will nearly always have the encryption data included in the 'manifest' or 'index' file, and this will definitely be recognised by ExoPlayer. There are online tutorials and free tools to allow you to package videos into DASH, including adding encryption. The ExoPlayer team provide information on downloading and playing back such streams (link correct at time of writing):
https://exoplayer.dev/downloading-media.html
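For example, Shaka Packager can apply raw-key (ClearKey-style) encryption using the same key pair from your ffmpeg command and emit a DASH manifest in one step. This is a sketch rather than a verified pipeline - the flag names are taken from the packager's raw-key documentation:

packager in=20210125_193031.mp4,stream=video,output=video_enc.mp4 \
  in=20210125_193031.mp4,stream=audio,output=audio_enc.mp4 \
  --enable_raw_key_encryption \
  --keys label=:key_id=09e367028f33436ca5dd60ffe6671e70:key=b42ca3172ee4e69bf51848a59db9cd13 \
  --mpd_output stream.mpd

ExoPlayer can then be pointed at stream.mpd as a DASH media source, with the key supplied via its ClearKey support.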
If you want to examine the mp4 files yourself in more detail there are various free tools like:
https://gpac.github.io/mp4box.js/test/filereader.html (highlighted by Duna in comments and looks very good)
https://www.bento4.com/documentation/mp4dump/
https://inspect.eyevinn.technology
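For instance, with Bento4 installed, dumping the atom tree of the encrypted file is a one-liner (you would look for the 'encv' sample entry and the 'sinf'/'schm' boxes discussed above):

mp4dump out_enc.mp4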
Related
I am having trouble recording an audio file using AsioOut. When I play back the file, the volume is too low; I can hardly hear the sound. I can't raise the output any higher than the current setting, as the sound from the loudspeaker would then be much louder. Is there a way to make the sound louder when recording the file?
I use interface audio (presonus studio 18|24) with microphone as input device.
This is my code for recording the file.
public void OnAudioAvailable(object sender, AsioAudioAvailableEventArgs e)
{
    if (Samples == null)
        Samples = new float[e.SamplesPerBuffer * e.InputBuffers.Length];
    e.GetAsInterleavedSamples(Samples);
    if (Writer != null)
        Writer.WriteSamples(Samples, 0, Samples.Length);
}
I've found a solution! Based on this answer, I decided to use FFmpeg to increase the volume of the recorded file: Audio Volume Manipulation
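For reference, the volume filter from that guide accepts either a multiplier or a dB value. A minimal sketch (file names are placeholders):

ffmpeg -i quiet_recording.wav -filter:a "volume=10dB" louder_recording.wav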
I'm working on an app that has video streaming functionality. I'm using firebase database and firebase storage. I'm trying to find some documentation on how firebase storage handles video files, but can't really find much.
There's mentioning in the docs that firebase storage works with other google app services to allow for CDN and video streaming, but all searches seem to lead to a dead end. Any advice?
I think there are several types of video streaming, which could change our answer here:
Live streaming (subscribers are watching as an event happens)
Youtube style (post a video and end users watch at their convenience)
Having built a live streaming Periscope-style app using Firebase Storage and the Firebase Realtime Database, I pretty strongly recommend against it - we uploaded three-second chunks and synced them via the Realtime Database. While it worked (surprisingly well), there was ~5 seconds of latency over very good internet, and it also wasn't the most efficient solution (after all, you're uploading and storing that video, plus there wasn't any transcoding). I recommend using something WebRTC-style, built for video transport, and using the Realtime Database for signaling alongside the stream.
On the other side, it's definitely possible to build a mobile YouTube-style app on Firebase features. The trick here is going to be transcoding the video (using something like Zencoder or Bitmovin, more here: https://cloud.google.com/solutions/media/) to chop up your uploaded video into smaller chunks of different resolutions (and different formats; iOS requires HLS for streaming, for instance). Your client can store chunk information in the Realtime Database (chunk name, resolutions available, number of chunks), and can download said chunks from Storage as the video progresses - a hypothetical layout is sketched below.
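As a purely hypothetical sketch (every field name here is invented for illustration), the Realtime Database tree for one video's chunk metadata might look like:

{
  "videos": {
    "videoId123": {
      "chunkCount": 3,
      "resolutions": ["480p", "720p"],
      "chunks": {
        "0": "videoId123_480p_0.ts",
        "1": "videoId123_480p_1.ts",
        "2": "videoId123_480p_2.ts"
      }
    }
  }
}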
If you want to stream a video from Firebase Storage, this is the best way I found. It will depend on the size of your video file - I'm only requesting 10-30 MB files, so this solution works well for me. Just treat the Firebase URL as a regular URL:
String str = "fire_base_video_URL";
Uri uri = Uri.parse(str);
videoViewLandscape.setVideoURI(uri);
progressBarLandScape.setVisibility(View.VISIBLE);
videoViewLandscape.requestFocus();
videoViewLandscape.start();
If you want to loop the video:
videoViewLandscape.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setLooping(true);
    }
});
And if you want to show a progress bar before the video starts do this:
videoViewLandscape.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_INFO_BUFFERING_END) {
            progressBarLandScape.setVisibility(View.GONE);
            return true;
        } else if (what == MediaPlayer.MEDIA_INFO_BUFFERING_START) {
            progressBarLandScape.setVisibility(View.VISIBLE);
            return true;
        }
        return false;
    }
});
This is not the best way of doing things but it works for me for now until I can find a good video streaming service.
2020: Yes, Firebase Storage video streaming is easy and possible.
The other answers suggest that you use a protocol like HLS. However, this is only necessary if you develop an app for the Apple App Store that serves videos that are longer than 10 minutes.
In all other cases, you can simply encode your videos in MP4 and upload them to Firebase. Your clients can then stream the MP4 without a problem. Just make sure that your moov atom is at the beginning of your MP4 file. This allows playback to start immediately, even if the file is not fully loaded (see the ffmpeg one-liner below).
Users can also skip ahead or go back thanks to byte-range requests, which are supported by Firebase Storage.
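If your encoder didn't already place the moov atom up front, ffmpeg can relocate it without re-encoding; a minimal sketch:

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4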
To test it, just upload a video to your firebase storage and open it in your browser.
You can host HLS videos on Firebase Cloud Storage. It works pretty well for me.
The trick is to modify the playlist .m3u8 files to contain the storage folder prefix, and the ?alt=media suffix for each file entry in the playlist:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.760000,
<folder_name>%2F1_fileSequence_0.ts?alt=media
#EXT-X-ENDLIST
You also don't really have to use server-side transcoding; you can have the client that uploads the video do it, and save considerable costs.
I've written a full tutorial with source code here: https://itnext.io/how-to-make-a-serverless-flutter-video-sharing-app-with-firebase-storage-including-hls-and-411e4fff68fa
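If you package the HLS yourself before applying the playlist rewrite described above, a minimal ffmpeg sketch (the 3-second segment length matches the target duration in the sample playlist) could be:

ffmpeg -i input.mp4 -c copy -hls_time 3 -hls_list_size 0 playlist.m3u8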
This is my exact implementation: it starts playing a video from Firebase Storage as soon as the view opens, dismisses the player when playback finishes, and adds a button to replay the video afterwards.
I have a demo link with the key so you can see it works. Any questions, hit me up.
You will just have to create an IBAction if you want the button after the video disappears.
//  BackMuscles.swift
//  Messenger
//
//  Created by Zach Smith on 8/12/21.
//  Copyright © 2021 spaceMuleFitness. All rights reserved.
//

import UIKit
import AVKit
import AVFoundation

class BackMuscles: UIViewController {

    @IBOutlet weak var playv: UIButton!

    let avPlayerViewController = AVPlayerViewController()
    var avPlayer: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.addBackground()
        let movieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let url = movieUrl {
            self.avPlayer = AVPlayer(url: url as URL)
            self.avPlayerViewController.player = self.avPlayer
        }
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avPlayerViewController.player?.currentItem)
        self.present(self.avPlayerViewController, animated: true) { () -> Void in
            self.avPlayerViewController.player?.play() // Do any additional setup after loading the view.
        }
    }

    @objc func playerDidFinishPlaying(note: NSNotification) {
        self.avPlayerViewController.dismiss(animated: true)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playV(sender: UIButton) {
        let amovieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let aurl = amovieUrl {
            self.avPlayer = AVPlayer(url: aurl as URL)
            self.avPlayerViewController.player = self.avPlayer
            self.present(self.avPlayerViewController, animated: true) { () -> Void in
                self.avPlayerViewController.player?.play()
            }
        }
    }
}
If you want to create a YouTube-like app, you can first compress the video. I recommend using a library to manage video compression, such as the one in this link. I've managed to compress a 118 MB video down to 6 MB in under 42 seconds. It also has a great demo app; just follow the example.
After you get the compressed file, upload it to Storage. In your client app, play the video URL using a player like ExoPlayer.
The video below is pretty good; it uses ExoPlayer to stream instead of MediaPlayer or the VideoView approach above:
https://www.youtube.com/watch?v=s_D5C5e2Uu0
try {
    BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
    TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
    simpleExoPlayer = ExoPlayerFactory.newSimpleInstance(this, trackSelector);
    String vid = "https://www.youtube.com/watch?v=s_D5C5e2Uu0";
    Uri uri = Uri.parse(vid);
    DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
    ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
    MediaSource mediaSource = new ExtractorMediaSource(uri, dataSourceFactory,
            extractorsFactory, null, null);
    videoView.setPlayer(simpleExoPlayer);
    simpleExoPlayer.prepare(mediaSource);
    simpleExoPlayer.setPlayWhenReady(true);
} catch (Exception e) {
    // An empty catch block hides setup failures; at minimum, log the exception.
    e.printStackTrace();
}
Below are the dependencies you will need in the app's build.gradle file.
implementation 'com.google.android.exoplayer:exoplayer:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-core:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-dash:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-hls:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-smoothstreaming:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-ui:r2.4.0'
To use Firebase Storage to play videos, all you need is the full URL to the video. You then pass this URL into a VideoView or ExoPlayer (see the sketch below). No full download is needed; the view will stream the content YouTube-style.
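A minimal sketch using the post-2.12 ExoPlayer MediaItem API - the bucket path and token in the URL are placeholders, and context/playerView are assumed to exist:

import android.net.Uri;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;

// Hypothetical download URL copied from the Firebase console or getDownloadUrl().
Uri videoUri = Uri.parse("https://firebasestorage.googleapis.com/v0/b/<bucket>/o/videos%2Fclip.mp4?alt=media&token=<token>");

SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
playerView.setPlayer(player);
player.setMediaItem(MediaItem.fromUri(videoUri));
player.prepare();
player.setPlayWhenReady(true);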
First of all, you need to understand Firebase security rules - for development you can set them to allow access.
Then create a Storage reference for storing videos in your app, select the video you want to upload, and use an UploadTask to upload the video file to Storage.
For retrieving and playing the video, use the ExoPlayer library on Android. A rough upload sketch follows.
You can visit here for more.
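A rough upload sketch, assuming the standard Firebase Storage Android SDK (the storage path and localUri are hypothetical):

import android.net.Uri;
import android.util.Log;
import com.google.firebase.storage.FirebaseStorage;
import com.google.firebase.storage.StorageReference;
import com.google.firebase.storage.UploadTask;

// Reference to where the video will live in the bucket (path is hypothetical).
StorageReference videoRef = FirebaseStorage.getInstance()
        .getReference()
        .child("videos/myVideo.mp4");

// localUri points at the video the user picked on the device.
UploadTask uploadTask = videoRef.putFile(localUri);
uploadTask
        .addOnSuccessListener(taskSnapshot ->
                videoRef.getDownloadUrl().addOnSuccessListener(downloadUri -> {
                    // Hand downloadUri to ExoPlayer for playback.
                }))
        .addOnFailureListener(e -> Log.e("Upload", "Upload failed", e));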
I have a need to create a copy of a Google Doc with a specific ID - not the "friendly" name like MyDocument, but the name that makes it unique in the GoogleSphere - the one like 1x_tfTiA9-b5UwAf3k2fg6y6hyZSYQIvhSNn-saaDs4c.
Here's the scenario why I would like to do this:
I have a newsletter which is in the form of a Google Doc. The newsletter is published on a website by embedding the document in a web page inside an <iframe> element. Also published in the same way is a "large print" version of the newsletter that is the same, apart from the fact that the default font size is 24pt, rather than 11pt.
I am trying to automate the production of the large print version, but in such a way that the unique ID of the large print document doesn't change, so that the embedded <iframe> for it still works.
I have experimented in the past with Google Apps Scripts routines for creating a deep copy of a document but the deep copy functions don't play nicely with images and tables, so I could never get a complete copy. If I could implement a "Save As" function, where the operand was an existing unique ID, I think this would do what I want.
Anyone know how I might do this?
I delved into this, attempting to set the id of the "large print" version of the file in a variety of ways:
via copy(): var copiedFile = Drive.Files.copy(lpFile, spFile.id, options);
which yields the error:
Generated IDs are not currently supported for copy requests
via insert(): var newFile = Drive.Files.insert(lpFile, doc.getBlob(), options);
which yields the error:
Generated IDs are not supported for Google Docs formats
via update(): Drive.Files.update(lpFile, lpFile.id, doc.getBlob(), options);
This method successfully updates the "large print" file from the small print file. This particular line, however, uses the Document#getBlob() method, which has issues with formatting and rich content from the Document. In particular, as you mention, images and tables are not preserved (among other things, like changes to the font, etc.). Compare pre with post.
It seems that - if the appropriate method of exporting formatted byte content from the document can be found - the update() method has the most promise. Note that the update() method in the Apps Script client library requires a Blob input (i.e. doc.getBlob().getBytes() will not work), so the fundamental limitation may be the (lack of) support for rich format information in the produced Blob data. With this in mind, I tried a couple methods for obtaining "formatted" Blob data from the "small print" file:
via Document#getAs(mimetype): Drive.Files.export(lpFile, lpFile.id, doc.getAs(<type>), options);
which fails for seemingly sensible types with the errors:
MimeType.GOOGLE_DOCS: We're sorry, a server error occurred. Please wait a bit and try again.
MimeType.MICROSOFT_WORD: Converting from application/vnd.google-apps.document to application/vnd.openxmlformats-officedocument.wordprocessingml.document is not supported.
These errors do make sense, since the internal Google Docs MimeType is not exportable (you can't "download as" this filetype since the data is kept however Google wants to keep it), and the documentation for Document#getAs(mimeType) indicates that only PDF export is supported by the Document Service. Indeed, attempting to coerce the Blob from doc.getBlob() with getAs(mimeType) fails, with the error:
Converting from application/pdf to application/vnd.openxmlformats-officedocument.wordprocessingml.document is not supported.
using DriveApp to get the Blob, rather than the Document Service:
Drive.Files.update(lpFile, lpFile.id, DriveApp.getFileById(smallPrintId).getBlob(), options);
This has the same issues as doc.getBlob(), and likely uses the same internal methods.
using DriveApp#getAs has the same errors as Document#getAs
Considering the limitation of the native Apps Script implementations, I then used the advanced service to obtain the Blob data. This is a bit trickier, since the File resource returned is not actually the file, but metadata about the file. Obtaining the Blob with the REST API requires exporting the file to a desired MimeType. We know from above that the PDF-formatted Blob fails to be properly imported, since that is the format used by the above attempts. We also know that the Google Docs format is not exportable, so the only one left is MS Word's .docx.
var blob = getBlobViaURL_(smallPrintId, MimeType.MICROSOFT_WORD);
Drive.Files.update(lpFile, lpFile.id, blob, options);
where getBlobViaURL_ implements the workaround from this SO question for the (still-broken) Drive.Files.export() Apps Script method.
This method successfully updates the existing "large print" file with the exact content from the "small print" file - at least for my test document. Given that it involves downloading content instead of using the internal, already-present data available to the export methods, it will likely fail for larger files.
Testing Script:
function copyContentFromAtoB() {
  var smallPrintId = "some id";
  var largePrintId = "some other id";

  // You must first enable the Drive "Advanced Service" before this will work.
  // Get the file metadata of the to-be-updated file.
  var lpFile = Drive.Files.get(largePrintId);

  // View available options on the relevant Drive REST API pages.
  var options = {
    updateViewedDate: false,
  };

  // Ideally this would use Drive.Files.export, but there is a bug in the Apps Script
  // client library's implementation: https://issuetracker.google.com/issues/36765129
  var blob = getBlobViaURL_(smallPrintId, MimeType.MICROSOFT_WORD);

  // Replace the contents of the large print version with that of the small print version.
  Drive.Files.update(lpFile, lpFile.id, blob, options);
}

// Below function derived from https://stackoverflow.com/a/42925916/9337071
function getBlobViaURL_(id, mimeType) {
  var url = "https://www.googleapis.com/drive/v2/files/" + id + "/export?mimeType=" + mimeType;
  var resp = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
  });
  return resp.getBlob();
}
I am having the hardest time figuring this problem out. I have a Silverlight 4 application that loads audio and video files from URLs. The URLs are the same domain as the application is hosted on and it works great for video.
The URLs are actually ASP.NET MVC controllers that are responsible for reading the file from a shared location on the server and serving back a file stream. The URLs look something like this:
http://localhost:31479/CourseMedia?path=\omnisandbox1\ILMSShare2\Demo-Fire+Behavior\media\Disclaim.wma&encrypted=False&id=00000000-0000-0000-0000-000000000000
If I put the URL directly into the browser, the file loads and plays in Windows Media Player just fine, and if I use a separate test Silverlight project to load the URL it also works, but for the life of me I cannot get it to work properly in my main project.
This is the routine I use to actually do the source setting:
protected void SetPlayerURL(MediaElement player, string url)
{
    if (player != null && url.Length > 0)
    {
        player.ClearValue(MediaElement.SourceProperty);
        player.Source = new Uri(this.Packet.GetMediaUrl(url, false, Guid.Empty));
    }
}
and the GetMediaURL function simply builds the URL format seen above:
public string GetMediaUrl(
    string path,
    bool encrypted,
    Guid key)
{
    StringBuilder builder = new StringBuilder();
    builder.AppendFormat("http://{0}/CourseMedia?path={1}&encrypted={2}&id={3}",
        this.Host,
        System.Windows.Browser.HttpUtility.UrlEncode(path),
        encrypted,
        key);
    return builder.ToString();
}
The request to the controller is never made when the media is audio. Seems odd to me, as this exact code works fine for video. The MediaElement state never leaves "Closed", and the CurrentStateChanged, MediaOpened, and MediaFailed events are never triggered.
I am at a loss!
Try setting ScrubbingEnabled of the MediaElement to false; there were some problems with Framework version 3.5 and audio, and the workaround was setting it to false. Might be worth trying.
Also try capturing BufferingStarted, BufferingEnded, MediaEnded along with your MediaFailed and MediaOpened events. I'm curious if it is a buffering issue.
One of the features of the Flash app I'm working on is to be able to stream a webcam to others. We're just using the built-in webcam support in Flash and sending it through FMS.
We've had some people ask for higher quality video, but we're already using the highest quality setting we can in Flash (setting quality to 100%).
My understanding is that in the newer flash players they added support for MPEG-4 encoding for the videos. I created a simple test Flex app to try and compare the video quality of the MP4 vs FLV encodings. However, I can't seem to get MP4 to work at all.
According to the Flex documentation the only thing I need to do to use MP4 instead of FLV is prepend "mp4:" to the name of the stream when calling publish:
Specify the stream name as a string with the prefix mp4: with or without the filename extension. The prefix indicates to the server that the file contains H.264-encoded video and AAC-encoded audio within the MPEG-4 Part 14 container format.
When I try this nothing happens. I don't get any events raised on the client side, no exceptions thrown, and my logging on the server side doesn't show any streams starting.
Here's the relevant code:
// These are all defined and created within the class.
private var nc:NetConnection;
private var sharing:Boolean;
private var pubStream:NetStream;
private var format:String;
private var streamName:String;
private var camera:Camera;

// called when the user clicks the start button
private function startSharing():void {
    if (!nc.connected) {
        return;
    }
    if (sharing) { return; }

    if (pubStream == null) {
        pubStream = new NetStream(nc);
        pubStream.attachCamera(camera);
    }

    startPublish();
    sharing = true;
}

private function startPublish():void {
    var name:String;
    if (this.format == "mp4") {
        name = "mp4:" + streamName;
    } else {
        name = streamName;
    }
    //pubStream.publish(name, "live");
    pubStream.publish(name, "record");
}
It would be helpful to know which version of FMS you are running.
It seems like you need at least FMS 3.0.2.
Are you sure this applies to live streams and not only to recording? These links (1, 2) suggest that while the player can decode Sorenson, VP6 and H.264, it can only encode in Sorenson.
I'm in a similar situation, so I would like to have this clarified.
Edit: what actually makes me doubt this is that the documentation says FLV and MP4, which aren't codecs but containers. Live streaming doesn't use containers; the encoded frames travel directly inside RTMP packets.
Flash Player doesn't encode using H.264, but Flash Media Server can record any codec in the F4V container. Flash Media Live Encoder can encode using H.264.
So basically you can't send H.264 from the web Flash Player (yet?).