One of the features of the Flash app I'm working on is to be able to stream a webcam to others. We're just using the built-in webcam support in Flash and sending it through FMS.
We've had some people ask for higher quality video, but we're already using the highest quality setting we can in Flash (setting quality to 100%).
My understanding is that in the newer flash players they added support for MPEG-4 encoding for the videos. I created a simple test Flex app to try and compare the video quality of the MP4 vs FLV encodings. However, I can't seem to get MP4 to work at all.
According to the Flex documentation the only thing I need to do to use MP4 instead of FLV is prepend "mp4:" to the name of the stream when calling publish:
Specify the stream name as a string with the prefix mp4: with or without the filename extension. The prefix indicates to the server that the file contains H.264-encoded video and AAC-encoded audio within the MPEG-4 Part 14 container format.
When I try this nothing happens. I don't get any events raised on the client side, no exceptions thrown, and my logging on the server side doesn't show any streams starting.
Here's the relevant code:
// These are all defined and created within the class.
private var nc:NetConnection;
private var sharing:Boolean;
private var pubStream:NetStream;
private var format:String;
private var streamName:String;
private var camera:Camera;

// called when the user clicks the start button
private function startSharing():void {
    if (!nc.connected) {
        return;
    }
    if (sharing) { return; }

    if (pubStream == null) {
        pubStream = new NetStream(nc);
        pubStream.attachCamera(camera);
    }

    startPublish();
    sharing = true;
}

private function startPublish():void {
    var name:String;
    if (this.format == "mp4") {
        name = "mp4:" + streamName;
    } else {
        name = streamName;
    }
    //pubStream.publish(name, "live");
    pubStream.publish(name, "record");
}
It would be helpful to know the version of FMS you are running.
It seems like you need at least FMS 3.0.2.
Are you sure this applies to live streams and not only to recording? These links (1, 2) suggest that while the player can decode Sorenson, VP6 and H.264, it can only encode in Sorenson.
I'm in a similar situation, so I would like to have this clarified.
Edit: what actually makes me doubt this is that the documentation says FLV and MP4, which aren't codecs but containers; live streaming doesn't use containers, the encoded frames travel directly inside RTMP packets.
Flash Player doesn't encode using H.264, but Flash Media Server can record any codec in the F4V container. Flash Media Live Encoder can encode using H.264.
So basically you can't send h264 from web flash player (yet?)...
I am trying to play an encrypted video from local storage using ExoPlayer.
The command used to encrypt the video using FFMPEG is as follows:
-i /storage/emulated/0/Download/20210125_193031.mp4 -vcodec copy -acodec copy -c:v libx264 -encryption_scheme cenc-aes-ctr -encryption_key b42ca3172ee4e69bf51848a59db9cd13 -encryption_kid 09e367028f33436ca5dd60ffe6671e70 /storage/emulated/0/Download/out_enc.mp4
Here is the source code of my player:
public class PlayerActivity extends AppCompatActivity {

    private SimpleExoPlayer player;
    private DefaultDrmSessionManager drmSessionManager;

    @Override
    protected void onCreate(final Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player);
        // Build the media item.
        PlayerView playerView = findViewById(R.id.video_view);
        player = new SimpleExoPlayer.Builder(this).build();
        playerView.setPlayer(player);
        //player.prepare();
        //FFMPEG command: -i /storage/emulated/0/Download/20210125_193031.mp4 -vf scale=-1:720 -c:v libx264 -encryption_scheme cenc-aes-ctr -encryption_key b42ca3172ee4e69bf51848a59db9cd13 -encryption_kid 09e367028f33436ca5dd60ffe6671e70 /storage/emulated/0/Download/out_enc.mp4
        //base64 keys generated from: https://www.base64encode.org/
        //playVideo("/storage/emulated/0/Download/out_enc.mp4", "MDllMzY3MDI4ZjMzNDM2Y2E1ZGQ2MGZmZTY2NzFlNzA=", "YjQyY2EzMTcyZWU0ZTY5YmY1MTg0OGE1OWRiOWNkMTM=");
        playVideo("/storage/emulated/0/Download/out_enc.mp4", "CeNnAo8zQ2yl3WD/5mcecA", "tCyjFy7k5pv1GEilnbnNEw");
    }

    private void playVideo(String url, String keyID, String keyValue) {
        try {
            drmSessionManager = buildDrmSessionManager(Util.getDrmUuid(C.CLEARKEY_UUID.toString()), true, keyID, keyValue);
        } catch (Exception e) {
            e.printStackTrace();
        }
        player.setMediaSource(buildDashMediaSource(Uri.parse(url)));
        player.prepare();
        player.setPlayWhenReady(true);
    }

    private MediaSource buildDashMediaSource(Uri uri) {
        DefaultDataSourceFactory dashChunkSourceFactory = new DefaultDataSourceFactory(this, "agent");
        return new ProgressiveMediaSource.Factory(dashChunkSourceFactory)
                .setDrmSessionManager(drmSessionManager)
                .createMediaSource(uri);
    }

    private DefaultDrmSessionManager buildDrmSessionManager(UUID uuid, Boolean multiSession, String id, String value) {
        /* String base64Id = Base64.encodeToString(id.getBytes(), Base64.DEFAULT);
        String base64Value = Base64.encodeToString(value.getBytes(), Base64.DEFAULT); */
        String keyString = "{\"keys\":[{\"kty\":\"oct\",\"k\":\"" + value + "\",\"kid\":\"" + id + "\"}],\"type\":\"temporary\"}";
        LocalMediaDrmCallback drmCallback = new LocalMediaDrmCallback(keyString.getBytes());
        FrameworkMediaDrm mediaDrm = null;
        try {
            mediaDrm = FrameworkMediaDrm.newInstance(uuid);
        } catch (UnsupportedDrmException e) {
            e.printStackTrace();
        }
        return new DefaultDrmSessionManager(uuid, mediaDrm, drmCallback, null, multiSession);
    }

    @Override
    protected void onDestroy() {
        player.release();
        super.onDestroy();
    }
}
Here is the link for the encrypted video.
The main issue: the video is playing but is not decrypted. What am I missing?
Looking at the logcat output there does not seem to be any DRM, AES or clearkey errors.
Looking then at the video file itself with ffprobe, it appears to report some issues. However, checking against other sample files encrypted using the same ffmpeg approach you used, they show similar issues, so this appears to be typical output from ffprobe for a file encrypted in this way.
Looking then at the video file structure itself with an MP4 parser to see the individual atoms, or the header blocks, there does not appear to be a PSSH box.
A PSSH box is a header area that contains the data about the encryption for an ISOBMFF mp4 file - this is actually an optional field in the CENC spec so your video is valid even without this.
The obvious question then is, how would a player know the video is encrypted? The answer, according to the CENC spec, is:
Detection
For a stream determined to be in the ISO Base Media File Format [ISOBMFF], this ISO Common Encryption ('cenc') Protection Scheme may be detected as follows.
Protection scheme signaling conforms with [ISOBMFF]. When protection has been applied, the stream type will be transformed to 'encv' for video or 'enca' for audio, with a Protection Scheme Information Box ('sinf') added to the sample entry in the Sample Description Box ('stsd'). The Protection Scheme Information Box ('sinf') will contain a Scheme Type Box ('schm') with a scheme_type field set to a value of 'cenc'
Looking at your video using an MP4 analyser (see the tools below) shows that it does indeed have the stream type shown as 'encv' in the 'stsd' box.
Testing playback with ffplay itself using the same encryption key shows that the video does actually play successfully:
ffplay out_enc.mp4 -decryption_key b42ca3172ee4e69bf51848a59db9cd13
However, a regular player will not be able to play it unless you provide the decryption key. It would be reasonable to expect the player to flag an error in this case, but that does not appear to be happening with some common players I checked, including VLC, so it is quite possible ExoPlayer on Android is not flagging this either.
Looking at ExoPlayer specifically, as noted by Duna in the comments below and outlined in this GitHub thread https://github.com/google/ExoPlayer/issues/8532#issuecomment-771811707 , ExoPlayer does not currently (Feb 2021) read the PSSH box for MP4, only for fragmented MP4. From that thread:
After looking deeper into this, I found that ExoPlayer's Mp4Extractor actually doesn't read pssh boxes. Currently we only read this info in fragmented MP4 files (using FragmentedMp4Extractor). This means even when playing the file with the pssh box, the drmInitData still ends up null - meaning playback fails. I didn't realise this limitation when you initially filed the issue, otherwise I would have flagged it earlier.
However, looking at the Mp4Extractor code, it does check for 'encv' and it does also check for the default key_id, both of which are present in the video file produced when inspected. Again, it would be reasonable for ExoPlayer to flag an error if it either does not find these in a format it can understand, or if it finds them but is not supplied a corresponding key to play the file with.
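Given that FragmentedMp4Extractor does read the PSSH info, one hypothetical workaround (untested here, so treat it purely as an assumption to verify) would be to ask ffmpeg to emit a fragmented MP4 in the first place, e.g. by adding -movflags frag_keyframe+empty_moov to your encryption command:
-i input.mp4 -vcodec copy -acodec copy -encryption_scheme cenc-aes-ctr -encryption_key b42ca3172ee4e69bf51848a59db9cd13 -encryption_kid 09e367028f33436ca5dd60ffe6671e70 -movflags frag_keyframe+empty_moov out_frag_enc.mp4
Whether ffmpeg's CENC support combines cleanly with fragmented output is something you would need to confirm yourself.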
So how could the video be encrypted and played reliably?
You could use ffplay on android although I suspect this would not be too straightforward, based on past experience using ffmpeg on Android.
There are also some easier-looking examples leveraging ExoPlayer that would be worth a look, e.g.:
https://stackoverflow.com/a/62678769/334402
You could also look at leveraging DASH. Most media that is streamed to mobile devices these days uses a streaming protocol like DASH or HLS; these formats will nearly always have the encryption data included in the 'manifest' or 'index' file, and this will definitely be recognised by ExoPlayer. There are online tutorials and free tools to allow you to package videos into DASH, including adding encryption. The ExoPlayer team provide information on downloading and playing back such streams (link correct at time of writing):
https://exoplayer.dev/downloading-media.html
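As a rough sketch of what the ExoPlayer side could then look like (assuming a DASH manifest such as out_enc.mpd produced by a packager; the method name here is made up, and it reuses the drmSessionManager built in the question's code):
// Sketch only, not the asker's code: play a DASH manifest instead of a raw MP4,
// so ExoPlayer can pick the encryption details up from the manifest.
// Uses com.google.android.exoplayer2.source.dash.DashMediaSource.
private MediaSource buildManifestMediaSource(Uri manifestUri) {
    DefaultDataSourceFactory dataSourceFactory = new DefaultDataSourceFactory(this, "agent");
    return new DashMediaSource.Factory(dataSourceFactory)
            .setDrmSessionManager(drmSessionManager)
            .createMediaSource(manifestUri); // e.g. Uri.parse("file:///.../out_enc.mpd")
}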
If you want to examine the mp4 files yourself in more detail there are various free tools like:
https://gpac.github.io/mp4box.js/test/filereader.html (highlighted by Duna in comments and looks very good)
https://www.bento4.com/documentation/mp4dump/
https://inspect.eyevinn.technology
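For example, with Bento4 you can dump the atom tree from the command line and look for the 'encv' sample entry and the 'sinf'/'schm' boxes yourself:
mp4dump out_enc.mp4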
I'm working on an app that has video streaming functionality. I'm using firebase database and firebase storage. I'm trying to find some documentation on how firebase storage handles video files, but can't really find much.
The docs mention that Firebase Storage works with other Google app services to allow for CDN and video streaming, but all searches seem to lead to a dead end. Any advice?
I think there are several types of video streaming, which could change our answer here:
Live streaming (subscribers are watching as an event happens)
Youtube style (post a video and end users watch at their convenience)
Having built a live streaming Periscope-style app using Firebase Storage and the Firebase Realtime Database, I pretty strongly recommend against it: we uploaded three-second chunks and synced them via the Realtime Database. While it worked (surprisingly well), there was ~5 seconds of latency over very good internet, and it also wasn't the most efficient solution (after all, you're uploading and storing that video, plus there wasn't any transcoding). I recommend using something WebRTC-based, built for video transport, and using the Realtime Database for signaling alongside the stream.
On the other side, it's definitely possible to build mobile YT on Firebase features. The trick here is going to be transcoding the video (using something like Zencoder or Bitmovin, more here: https://cloud.google.com/solutions/media/) to chop up your uploaded video into smaller chunks of different resolutions (and different formats; iOS requires HLS for streaming, for instance). Your client can store chunk information in the Realtime Database (chunk name, resolutions available, number of chunks), and can download said chunks from Storage as the video progresses, as in the sketch below.
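As an illustration of that last point, recording chunk metadata in the Realtime Database might look roughly like this (a sketch only; the "videos" path and all field names are invented for the example):
// Hypothetical sketch: store metadata about a transcoded video so clients
// know which chunks to fetch from Storage as playback progresses.
// Uses the standard com.google.firebase.database API plus java.util collections.
DatabaseReference videoRef = FirebaseDatabase.getInstance()
        .getReference("videos")
        .child(videoId);

Map<String, Object> meta = new HashMap<>();
meta.put("chunkCount", 12);                                // number of chunks
meta.put("chunkPrefix", "videos/" + videoId + "/chunk_");  // Storage path prefix
meta.put("resolutions", Arrays.asList("360p", "720p"));    // resolutions available
videoRef.setValue(meta);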
If you want to stream a video from Firebase Storage, this is the best way I found. This will depend on the size of your video file; I'm only requesting 10-30 MB files, so this solution works well for me. Just treat the Firebase URL as a regular URL:
String str = "fire_base_video_URL";
Uri uri = Uri.parse(str);
videoViewLandscape.setVideoURI(uri);
progressBarLandScape.setVisibility(View.VISIBLE);
videoViewLandscape.requestFocus();
videoViewLandscape.start();
If you want to loop the video:
videoViewLandscape.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setLooping(true);
    }
});
And if you want to show a progress bar before the video starts do this:
videoViewLandscape.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_INFO_BUFFERING_END) {
            progressBarLandScape.setVisibility(View.GONE);
            return true;
        } else if (what == MediaPlayer.MEDIA_INFO_BUFFERING_START) {
            progressBarLandScape.setVisibility(View.VISIBLE);
            return true;
        }
        return false;
    }
});
This is not the best way of doing things but it works for me for now until I can find a good video streaming service.
2020: Yes, firebase storage video streaming is easy and possible.
Other answers suggest that you use a protocol like HLS. However, this is only necessary if you develop an app for the Apple App Store that serves videos longer than 10 minutes.
In all other cases, you can simply encode your videos as MP4 and upload them to Firebase. Your clients can then stream the MP4 without a problem. Just make sure that the moov atom is at the beginning of your MP4 file; this allows the video to start playing immediately, even if it is not fully loaded.
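If your encoder doesn't already place the moov atom up front, ffmpeg can relocate it as a quick re-mux (no re-encoding; file names are placeholders):
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4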
Users can also skip ahead or go back thanks to HTTP range requests, which are supported by Firebase Storage.
To test it, just upload a video to your firebase storage and open it in your browser.
You can host HLS videos on Firebase Cloud Storage. It works pretty well for me.
The trick is to modify the playlist .m3u8 files to contain the storage folder prefix, and the ?alt=media suffix for each file entry in the playlist:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.760000,
<folder_name>%2F1_fileSequence_0.ts?alt=media
#EXT-X-ENDLIST
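As a rough sketch of that rewrite step in Java (assuming the playlist and its .ts segments live in one Storage folder, and that folderName is already URL-encoded):
// Sketch only: prefix each segment entry with the folder name and append
// ?alt=media, producing entries like the playlist above. Error handling omitted.
import java.nio.file.*;
import java.util.*;

List<String> fixed = new ArrayList<>();
for (String line : Files.readAllLines(Paths.get("playlist.m3u8"))) {
    if (!line.isEmpty() && !line.startsWith("#")) {
        line = folderName + "%2F" + line + "?alt=media"; // segment entry
    }
    fixed.add(line);
}
Files.write(Paths.get("playlist_fixed.m3u8"), fixed);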
You also don't really have to use server-side transcoding, you can have the client who uploads the video do it, and save considerable costs.
I've written a full tutorial with source code here: https://itnext.io/how-to-make-a-serverless-flutter-video-sharing-app-with-firebase-storage-including-hls-and-411e4fff68fa
This is my exact implementation: it starts a video playing from Firebase Storage as soon as the view opens, dismisses the player view when the video ends, and adds a button to click afterwards to replay the video.
I have a demo link with the token included so you can see it works. Any questions, hit me up.
You will just have to create an IBAction if you want the button after the video disappears.
// BackMuscles.swift
// Messenger
//
// Created by Zach Smith on 8/12/21.
// Copyright © 2021 spaceMuleFitness. All rights reserved.
//

import UIKit
import AVKit
import AVFoundation

class BackMuscles: UIViewController {

    @IBOutlet weak var playv: UIButton!

    let avPlayerViewController = AVPlayerViewController()
    var avPlayer: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.addBackground()
        let movieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let url = movieUrl {
            self.avPlayer = AVPlayer(url: url as URL)
            self.avPlayerViewController.player = self.avPlayer
        }
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avPlayerViewController.player?.currentItem)
        self.present(self.avPlayerViewController, animated: true) { () -> Void in
            self.avPlayerViewController.player?.play() // Do any additional setup after loading the view.
        }
    }

    @objc func playerDidFinishPlaying(note: NSNotification) {
        self.avPlayerViewController.dismiss(animated: true)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playV(sender: UIButton) {
        let amovieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let aurl = amovieUrl {
            self.avPlayer = AVPlayer(url: aurl as URL)
            self.avPlayerViewController.player = self.avPlayer
            self.present(self.avPlayerViewController, animated: true) { () -> Void in
                self.avPlayerViewController.player?.play()
            }
        }
    }
}
If you want to create a YT-like app, you can first compress the video. To manage video compression, I recommend the library in this link; I've managed to compress a 118 MB video to 6 MB in under 42 seconds. It also has a great demo app, just follow the example.
After you get the compressed file, upload it to Storage; in your client app you will then play the video URL using a player like ExoPlayer.
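For reference, the upload-then-play flow with the standard Firebase Storage API might look roughly like this (a sketch only; file names, paths and variable names are placeholders):
// Sketch only: upload the compressed file, then hand the download URL to a player.
StorageReference ref = FirebaseStorage.getInstance()
        .getReference()
        .child("videos/" + fileName);

ref.putFile(compressedFileUri)
        .continueWithTask(task -> ref.getDownloadUrl())
        .addOnSuccessListener(downloadUri -> {
            // pass downloadUri.toString() to ExoPlayer as the media URL
        });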
The video below is pretty good; it uses ExoPlayer to stream, instead of MediaPlayer or the videoViewLandscape approach above.
https://www.youtube.com/watch?v=s_D5C5e2Uu0
try {
    BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
    TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
    simpleExoPlayer = ExoPlayerFactory.newSimpleInstance(this, trackSelector);
    // Note: this should be a direct URL to a media file (e.g. a Firebase Storage
    // download URL); a YouTube watch page like the one below will not play as-is.
    String vid = "https://www.youtube.com/watch?v=s_D5C5e2Uu0";
    Uri uri = Uri.parse(vid);
    DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
    ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
    MediaSource mediaSource = new ExtractorMediaSource(uri, dataSourceFactory, extractorsFactory, null, null);
    videoView.setPlayer(simpleExoPlayer);
    simpleExoPlayer.prepare(mediaSource);
    simpleExoPlayer.setPlayWhenReady(true);
} catch (Exception e) {
    e.printStackTrace(); // don't swallow errors silently
}
Below are the dependencies you will need in the app-level build.gradle file.
implementation 'com.google.android.exoplayer:exoplayer:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-core:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-dash:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-hls:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-smoothstreaming:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-ui:r2.4.0'
To use Firebase Storage to play videos, all you need is the full URL to the video. You then pass this URL into a VideoView or ExoPlayer. No full download is needed; the VideoView will stream the content YT-style.
First of all, you need to understand Firebase security rules; for development mode you can simply set them to allow access.
Then create a storage reference for storing videos in your app.
Select the video you want to upload, then use an UploadTask to upload the video file to Storage.
For retrieving the video on Android, use the ExoPlayer library.
You can visit here for more.
I am using progressive streaming with VideoDisplay. The HTTP URL provided gets buffered completely, even though I have configured it to start playing the video when the buffering reaches 20%. The trace messages show that playing started (using Mozilla / Flashbug+Firebug), but the video does not show until the buffer counter reaches 100%.
How can I get the video stream to play at 20% of the stream?
Code segment where the check takes place:
var loadedPct:uint = Math.round(100 * (event.bytesLoaded / event.bytesTotal));
trace('waiting...');
mainVideoCanvas.addChild(LoadingImage);
VidLoadingLabel2.text = loadedPct.toString();
mainVideoCanvas.addChild(VidLoadingLabel2);

if (loadedPct >= 20)
{
    trace(event.bytesLoaded);
    trace(loadedPct);
    player.load();
    player.play();
    trace(player.state);
    trace('Playing');
}

if (loadedPct == 100)
{
    trace('Ready to Complete');
    trace(player.state);
    mainVideoCanvas.removeChild(VidLoadingLabel2);
    mainVideoCanvas.removeChild(LoadingImage);
    mainVideoCanvas.addChild(player);
    player.addEventListener(VideoEvent.COMPLETE, completePlay);
}
The web server needs specific support for the variant of HTTP that Flash speaks when it tries to stream a movie. Adobe isn't using bog-standard HTTP for this. If the web server doesn't have this support, you get the behavior you see: complete download before playback begins.
With H.264 and Apache, you can add the support you need for this with CodeShop's mod_h264_streaming plugin.
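From memory of that module's documentation, enabling it is roughly a two-line Apache config change (treat this as an assumption and check the plugin's own docs for the exact module path and handler name):
LoadModule h264_streaming_module modules/mod_h264_streaming.so
AddHandler h264-streaming.extensions .mp4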
I am developing a web application in Flex which has a feature of recording the running app by taking a snapshot of each frame, then encoding the frames into a ByteArray for video playback.
I am currently using NetStream.appendBytes() for playing the ByteArray FLV. It is working, but I just found out about OSMF and am thinking about integrating it into my application.
Is it possible to play the FLV ByteArray in OSMF? An example of how it can be done would be totally great. Thanks!
I am now able to play FLV byte arrays in OSMF. Beforehand, I had already been able to play a ByteArray by creating a new class that extends NetStream and overriding its play method to use appendBytes instead. So what I did was make OSMF use that class. I did this by creating these classes:
1. ByteStreamElement - media element
2. ByteStreamLoader - extends LoaderBase
3. ByteStreamLoadTrait - extends LoadTrait
Overriding NetStream's seek/play method:
//manually dispatch seek event since we override seek()
dispatchEvent(new NetStatusEvent(NetStatusEvent.NET_STATUS,false,false, {code:"NetStream.Play.Seek", level:"status"}));
//look for byte position based on _seekTime value
flvStream = _sfw.getFlvStream(false);
_seekTime = parameters[1] * 1000; //netstream time in milliseconds
_flvParser.parse(flvStream, false, flvTagSeeker);
flvStream.position = _flvParserProcessed;
//append flvtag from the new byte position to end of flv byteArray
var tmp:ByteArray = new ByteArray();
flvStream.readBytes(tmp, 0, flvStream.bytesAvailable);
_flvParserProcessed = 0;
this.appendBytesAction(NetStreamAppendBytesAction.RESET_SEEK);
appendBytes(tmp);
And using it like this:
mediaPlayerSprite = new MediaPlayerSprite();
addChild(mediaPlayerSprite);
mediaPlayerSprite.media = new ByteStreamElement();
I'm really not sure though if this is the best way to do it. I'm not sure whether it was best to create new classes, or whether I should have written some sort of plugin for OSMF to play byte arrays. Another thing is that what I really need is to continually append bytes in the player as needed. That's why I'm still not using this, and for the meantime I'll stick with my custom-made "ByteStream player" until I figure this out.
I am having the hardest time figuring this problem out. I have a Silverlight 4 application that loads audio and video files from URLs. The URLs are the same domain as the application is hosted on and it works great for video.
The URLs are actually asp.net mvc controllers that are responsible for reading the file from a shared location on the server and serving back a filestream. The URLs look something like this:
http://localhost:31479/CourseMedia?path=\omnisandbox1\ILMSShare2\Demo-Fire+Behavior\media\Disclaim.wma&encrypted=False&id=00000000-0000-0000-0000-000000000000
If I put the URL directly into the browser the file loads and plays in windows media player just fine, and if I use a separate test silverlight project to load the url it also works, but for the life of me I can not get it to work properly in my main project.
This is the routine I use to actually do the source setting:
protected void SetPlayerURL(MediaElement player, string url)
{
    if (player != null && url.Length > 0)
    {
        player.ClearValue(MediaElement.SourceProperty);
        player.Source = new Uri(this.Packet.GetMediaUrl(url, false, Guid.Empty));
    }
}
and the GetMediaURL function simply builds the URL format seen above:
public string GetMediaUrl(
    string path,
    bool encrypted,
    Guid key)
{
    StringBuilder builder = new StringBuilder();
    builder.AppendFormat("http://{0}/CourseMedia?path={1}&encrypted={2}&id={3}",
        this.Host,
        System.Windows.Browser.HttpUtility.UrlEncode(path),
        encrypted,
        key);
    return builder.ToString();
}
The request to the controller is never made when the media is audio. Seems odd to me, as this exact code works fine for video. The MediaElement state never leaves "Closed", and the CurrentStateChanged, MediaOpened, and MediaFailed events are never triggered.
I am at a loss!
Try setting ScrubbingEnabled of the MediaElement to false; there were some problems with Framework version 3.5 and audio, and the workaround was setting that to false. Might be worth trying.
Also try capturing BufferingStarted, BufferingEnded, MediaEnded along with your MediaFailed and MediaOpened events. I'm curious if it is a buffering issue.