Flex: Adobe Flash Builder with MXML: HTTP progressive streaming doesn't work! - apache-flex

I am using progressive streaming with VideoDisplay. The HTTP URL provided gets buffered completely even though I have configured playback to start once buffering reaches 20%. The trace messages show that playback has started (checked in Mozilla Firefox with Flashbug + Firebug), but the video does not appear until the buffer counter reaches 100%.
How can I get the video to start playing at 20% of the stream?
Code segment where the check takes place:
// Inside the progress handler that fires while the video URL is loading.
var loadedPct:uint = Math.round(100 * (event.bytesLoaded / event.bytesTotal));
trace('waiting...');
mainVideoCanvas.addChild(LoadingImage);
VidLoadingLabel2.text = loadedPct.toString();
mainVideoCanvas.addChild(VidLoadingLabel2);
if (loadedPct >= 20)
{
    trace(event.bytesLoaded);
    trace(loadedPct);
    player.load();
    player.play();
    trace(player.state);
    trace('Playing');
}
if (loadedPct == 100)
{
    trace('Ready to Complete');
    trace(player.state);
    mainVideoCanvas.removeChild(VidLoadingLabel2);
    mainVideoCanvas.removeChild(LoadingImage);
    mainVideoCanvas.addChild(player);
    player.addEventListener(VideoEvent.COMPLETE, completePlay);
}
Thanks and regards
deadbrain

The web server needs specific support for the variant of HTTP that Flash speaks when it tries to stream a movie. Adobe isn't using bog-standard HTTP for this. If the web server doesn't have this support, you get the behavior you see: complete download before playback begins.
With H.264 and Apache, you can add the support you need for this with CodeShop's mod_h264_streaming plugin.
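If you go the mod_h264_streaming route, the usual Apache setup is just loading the module and registering its handler for .mp4 files. The module path below is an assumption and depends on where your build installs it:
LoadModule h264_streaming_module modules/mod_h264_streaming.so
AddHandler h264-streaming.extensions .mp4
With that in place, the module can serve the file from a ?start= offset in the URL instead of forcing a full download before playback begins.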

Related

Firebase: Is it possible to stream a video? [duplicate]

I'm working on an app that has video streaming functionality. I'm using the Firebase database and Firebase storage. I'm trying to find documentation on how Firebase storage handles video files, but can't really find much.
The docs mention that Firebase storage works with other Google app services to allow for CDN and video streaming, but all searches seem to lead to a dead end. Any advice?
I think there are several types of video streaming, which could change our answer here:
Live streaming (subscribers are watching as an event happens)
Youtube style (post a video and end users watch at their convenience)
Having built a live-streaming, Periscope-style app using Firebase Storage and the Firebase Realtime Database, I pretty strongly recommend against it. We uploaded three-second chunks and synced them via the Realtime Database. It worked (surprisingly well), but there was ~5 seconds of latency over a very good connection, and it wasn't the most efficient solution (after all, you're uploading and storing all that video, and there was no transcoding). I recommend using something WebRTC-style, built for video transport, and using the Realtime Database for signaling alongside the stream; a rough signaling sketch follows.
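A minimal sketch of that signaling idea, assuming the Firebase Realtime Database Android SDK is on the classpath; the calls/{callId} node layout is my own choice, and offerSdp/answerSdp would come from your WebRTC stack:
DatabaseReference call = FirebaseDatabase.getInstance()
        .getReference("calls").child(callId);

// Caller: publish the SDP offer produced by the WebRTC peer connection.
call.child("offer").setValue(offerSdp);

// Callee: wait for the answer to come back the same way.
call.child("answer").addValueEventListener(new ValueEventListener() {
    @Override public void onDataChange(DataSnapshot snapshot) {
        String answerSdp = snapshot.getValue(String.class);
        if (answerSdp != null) {
            // feed answerSdp into the peer connection's setRemoteDescription(...)
        }
    }
    @Override public void onCancelled(DatabaseError error) { }
});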
On the other side, it's definitely possible to build a mobile YouTube-style app on Firebase features. The trick here is going to be transcoding the video (using something like Zencoder or Bitmovin, more here: https://cloud.google.com/solutions/media/) to chop your uploaded video into smaller chunks of different resolutions (and different formats; iOS requires HLS for streaming, for instance). Your client can store chunk information in the Realtime Database (chunk name, resolutions available, number of chunks) and download those chunks from Storage as the video progresses; a sketch of that bookkeeping is below.
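A rough illustration of the chunk bookkeeping, assuming the Firebase Database and Storage Android SDKs; the node layout, chunk file names, and the playChunk helper are placeholders of my own, not something Firebase prescribes:
// Writer side: record how the transcoded video was chopped up.
DatabaseReference videoRef = FirebaseDatabase.getInstance()
        .getReference("videos").child(videoId);
Map<String, Object> meta = new HashMap<>();
meta.put("chunkCount", 12);
meta.put("resolutions", Arrays.asList("360p", "720p"));
videoRef.setValue(meta);

// Player side: resolve chunk N for the chosen resolution and hand it to the player.
StorageReference chunkRef = FirebaseStorage.getInstance()
        .getReference("videos/" + videoId + "/720p/chunk_3.ts");
chunkRef.getDownloadUrl()
        .addOnSuccessListener(uri -> playChunk(uri)); // playChunk is your own player glue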
If you want to stream a video from Firebase Storage, this is the best way I found. It will depend on the size of your video file; I'm only requesting 10-30 MB files, so this solution works well for me. Just treat the Firebase URL as a regular URL:
String str = "fire_base_video_URL";
Uri uri = Uri.parse(str);
videoViewLandscape.setVideoURI(uri);
progressBarLandScape.setVisibility(View.VISIBLE);
videoViewLandscape.requestFocus();
videoViewLandscape.start();
If you want to loop the video:
videoViewLandscape.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setLooping(true);
    }
});
And if you want to show a progress bar before the video starts do this:
videoViewLandscape.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_INFO_BUFFERING_END) {
            progressBarLandScape.setVisibility(View.GONE);
            return true;
        } else if (what == MediaPlayer.MEDIA_INFO_BUFFERING_START) {
            progressBarLandScape.setVisibility(View.VISIBLE);
            return true;
        }
        return false;
    }
});
This is not the best way of doing things but it works for me for now until I can find a good video streaming service.
2020: Yes, Firebase Storage video streaming is easy and possible.
The other answers suggest using a protocol like HLS. However, this is only necessary if you develop an app for the Apple App Store that serves videos longer than 10 minutes.
In all other cases, you can simply encode your videos as MP4 and upload them to Firebase. Your clients can then stream the MP4 without a problem. Just make sure that the moov atom is at the beginning of your MP4 file; this allows playback to start immediately, even if the file is not fully loaded.
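If your encoder writes the moov atom at the end of the file, a common way to move it to the front is to remux with ffmpeg's faststart flag (the file names here are just placeholders):
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4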
Users can also skip ahead or go back thanks to HTTP range requests, which Firebase Storage supports.
To test it, just upload a video to your firebase storage and open it in your browser.
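Or check the range support programmatically. This is just a quick sanity check using java.net.HttpURLConnection; downloadUrl is a placeholder for your own file's download URL, and on Android it needs to run off the main thread inside a try/catch:
HttpURLConnection conn = (HttpURLConnection) new URL(downloadUrl).openConnection();
conn.setRequestProperty("Range", "bytes=0-1023");   // ask for the first kilobyte only
int code = conn.getResponseCode();                  // 206 Partial Content means ranges are honoured
conn.disconnect();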
You can host HLS videos on Firebase Cloud Storage. It works pretty well for me.
The trick is to modify the playlist .m3u8 files to contain the storage folder prefix, and the ?alt=media suffix for each file entry in the playlist:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.760000,
<folder_name>%2F1_fileSequence_0.ts?alt=media
#EXT-X-ENDLIST
You also don't really have to use server-side transcoding; you can have the client that uploads the video do it, and save considerable costs.
I've written a full tutorial with source code here: https://itnext.io/how-to-make-a-serverless-flutter-video-sharing-app-with-firebase-storage-including-hls-and-411e4fff68fa
This is my exact implementation: it starts playing a video from Firebase Storage as soon as the view opens, dismisses the player view when playback finishes, and adds a button to replay the video afterwards.
I have a demo link with the key included so you can see that it works. Any questions, hit me up.
You will just have to create an IBAction if you want the button after the video disappears.
// BackMuscles.swift
// Messenger
//
// Created by Zach Smith on 8/12/21.
// Copyright © 2021 spaceMuleFitness. All rights reserved.
//
import UIKit
import AVKit
import AVFoundation

class BackMuscles: UIViewController {

    @IBOutlet weak var playv: UIButton!

    let avPlayerViewController = AVPlayerViewController()
    var avPlayer: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.addBackground()
        let movieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let url = movieUrl {
            self.avPlayer = AVPlayer(url: url as URL)
            self.avPlayerViewController.player = self.avPlayer
        }
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avPlayerViewController.player?.currentItem)
        self.present(self.avPlayerViewController, animated: true) { () -> Void in
            self.avPlayerViewController.player?.play() // Do any additional setup after loading the view.
        }
    }

    @objc func playerDidFinishPlaying(note: NSNotification) {
        self.avPlayerViewController.dismiss(animated: true)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playV(sender: UIButton) {
        let amovieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let aurl = amovieUrl {
            self.avPlayer = AVPlayer(url: aurl as URL)
            self.avPlayerViewController.player = self.avPlayer
            self.present(self.avPlayerViewController, animated: true) { () -> Void in
                self.avPlayerViewController.player?.play()
            }
        }
    }
}
If you want to create a YouTube-like app, you can first compress the video. I recommend using this library to manage video compression (the one in this link); I've managed to compress a 118 MB video down to 6 MB in under 42 seconds. It also has a great demo app, just follow the example.
After you get the compressed file, upload it to Storage; in your client app you will play the video URL using a player like ExoPlayer.
The video below is pretty good; it uses ExoPlayer to stream instead of MediaPlayer or the VideoView approach above.
https://www.youtube.com/watch?v=s_D5C5e2Uu0
try {
    // Set up a SimpleExoPlayer with adaptive track selection.
    BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
    TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
    simpleExoPlayer = ExoPlayerFactory.newSimpleInstance(this, trackSelector);
    // This should be a direct media URL (e.g. your Firebase download URL);
    // ExoPlayer cannot play a YouTube watch-page URL like the one below.
    String vid = "https://www.youtube.com/watch?v=s_D5C5e2Uu0";
    Uri uri = Uri.parse(vid);
    DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
    ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
    MediaSource mediaSource = new ExtractorMediaSource(uri, dataSourceFactory,
            extractorsFactory, null, null);
    videoView.setPlayer(simpleExoPlayer);
    simpleExoPlayer.prepare(mediaSource);
    simpleExoPlayer.setPlayWhenReady(true);
} catch (Exception e) {
    // At minimum, log the failure instead of swallowing it silently.
    e.printStackTrace();
}
Below are the dependencies you will need in the app-level build.gradle file.
implementation 'com.google.android.exoplayer:exoplayer:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-core:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-dash:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-hls:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-smoothstreaming:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-ui:r2.4.0'
To use Firebase Storage to play videos, all you need is the full URL to the video. You then pass this URL into a VideoView or ExoPlayer. No full download is needed; the VideoView will stream the content YouTube-style.
First of all, you need to understand Firebase Storage rules; set them to allow access while you're in development mode. Then create a storage reference for storing videos in your app, select the video you want to upload, and use an UploadTask to upload the video file to Storage. To retrieve and play the video on Android, use the ExoPlayer library. You can visit here for more; a rough sketch of the upload step is below.
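A minimal sketch of that upload flow, assuming the Firebase Storage Android SDK; the "videos/" path, fileName, and videoUri (from your picker intent) are placeholders:
StorageReference ref = FirebaseStorage.getInstance()
        .getReference().child("videos/" + fileName);

// Upload the picked file, then ask Storage for its download URL.
UploadTask uploadTask = ref.putFile(videoUri);
uploadTask.continueWithTask(task -> {
    if (!task.isSuccessful()) {
        throw task.getException();
    }
    return ref.getDownloadUrl();
}).addOnSuccessListener(downloadUri -> {
    // Save downloadUri somewhere (e.g. the Realtime Database) and hand it to ExoPlayer.
});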

Android Air App Locks up due to RPCDataManager?

I am currently developing an Android app using Flash Builder 4.5 (AIR) and I have almost finished it apart from a few things. One of these is that during testing of the app I noticed an error to do with the RPCDataManager. I believe this may be related to the navigateToURL functions I have in the app, as this is when the error occurs. The two navigateToURL functions in the app are below:
protected function link_icon_clickHandler(event:MouseEvent):void
{
    navigateToURL(new URLRequest(getJByIDResult.lastResult.link));
    //tel, sms, mailto, market, http and https
}

protected function email_icon_clickHandler(event:MouseEvent):void
{
    var urlString:String = "mailto:";
    urlString += "?subject=";
    urlString += getJByIDResult.lastResult.c_name + " Information";
    urlString += "&body=";
    urlString += getJByIDResult.lastResult.j_name + " " + getJByIDResult.lastResult.dl + " " + desc_txt.text + " " + getJByIDResult.lastResult.link;
    navigateToURL(new URLRequest(urlString));
}
These functions are triggered when the user clicks on either a Mail icon or an Internet icon. They actually work and do redirect you to a website or open an e-mail; however, no matter which one you select, an error is triggered which then completely locks up the application and does not allow any further actions (Back, Home, etc.). The error that is produced is shown below:
Error: Requesting : cRPCDataManager:cRPCDataManager:#:1.website_link
at mx.data::DataList/http://www.adobe.com/2006/flex/mx/internal::fetchItemProperty()[C:\depot\DataServices\branches\milestone\lcds45_fb45\frameworks\projects\data\src\mx\data\DataList.as:3609]
at mx.data::ConcreteDataService/fetchItemProperty()[C:\depot\DataServices\branches\milestone\lcds45_fb45\frameworks\projects\data\src\mx\data\ConcreteDataService.as:2540]
at mx.data.utils::Managed$/getProperty()[C:\depot\DataServices\branches\milestone\lcds45_fb45\frameworks\projects\data\src\mx\data\utils\Managed.as:164]
at valueObjects::_Super_Companies/get website_link()[C:\Users\Jack\Documents\Dropbox\Projects\GApp\GApp Final\src\valueObjects\_Super_C.as:132]
at ObjectOutput/writeObject()
at mx.data::DataList/writeExternal()
at mx.data::DataList/writeExternal()[C:\depot\DataServices\branches\milestone\lcds45_fb45\frameworks\projects\data\src\mx\data\DataList.as:727]
at mx.collections::ArrayCollection/writeExternal()[E:\dev\4.5.1\frameworks\projects\framework\src\mx\collections\ArrayCollection.as:161]
at ObjectOutput/writeObject()
at spark.components.supportClasses::ViewDescriptor/writeExternal()[E:\dev\4.5.1\frameworks\projects\mobilecomponents\src\spark\components\supportClasses\ViewDescriptor.as:179]
at ObjectOutput/writeObject()
at spark.components.supportClasses::NavigationStack/writeExternal()[E:\dev\4.5.1\frameworks\projects\mobilecomponents\src\spark\components\supportClasses\NavigationStack.as:238]
Can anyone please help me with this?
Thanks
Dave
It's very hard to help without more code, but my guess is that you're trying to fetch something like "#:1.website_link" through your LCDS service. I don't think this has anything to do with navigateToURL, since the stack trace is all LCDS data-management classes.

Silverlight MediaElement refusing to play audio

I am having the hardest time figuring this problem out. I have a Silverlight 4 application that loads audio and video files from URLs. The URLs are on the same domain the application is hosted on, and it works great for video.
The URLs are actually ASP.NET MVC controllers that are responsible for reading the file from a shared location on the server and serving back a file stream. The URLs look something like this:
http://localhost:31479/CourseMedia?path=\omnisandbox1\ILMSShare2\Demo-Fire+Behavior\media\Disclaim.wma&encrypted=False&id=00000000-0000-0000-0000-000000000000
If I put the URL directly into the browser, the file loads and plays in Windows Media Player just fine, and if I use a separate test Silverlight project to load the URL it also works, but for the life of me I cannot get it to work properly in my main project.
This is the routine I use to actually do the source setting:
protected void SetPlayerURL(MediaElement player, string url)
{
    if (player != null && url.Length > 0)
    {
        player.ClearValue(MediaElement.SourceProperty);
        player.Source = new Uri(this.Packet.GetMediaUrl(url, false, Guid.Empty));
    }
}
and the GetMediaUrl function simply builds the URL format seen above:
public string GetMediaUrl(
    string path,
    bool encrypted,
    Guid key)
{
    StringBuilder builder = new StringBuilder();
    builder.AppendFormat("http://{0}/CourseMedia?path={1}&encrypted={2}&id={3}",
        this.Host,
        System.Windows.Browser.HttpUtility.UrlEncode(path),
        encrypted,
        key);
    return builder.ToString();
}
The request to the controller is never made when the media is audio, which seems odd to me, as this exact code works fine for video. The MediaElement state never leaves "Closed", and the CurrentStateChanged, MediaOpened, and MediaFailed events are never triggered.
I am at a loss!
Try setting ScrubbingEnabled of the MediaElement to false, there were some problems with Framework version 3.5 and audio and the workaround was setting that to false. Might be worth trying.
Also try capturing BufferingStarted, BufferingEnded, MediaEnded along with your MediaFailed and MediaOpened events. I'm curious if it is a buffering issue.

Seeking not working in HTML5 audio tag

I have a lighttpd server running locally. If I load a static file on the server (through an html5 audio tag), it plays and seeks fine.
However, seeking doesn't work when running a dev server (web.py/CherryPy) or if I return the bytes via a defined action url instead of as a static file. It won't load the duration either.
According to the "HTTP byte range requests" section on this Opera page, it's something to do with support for byte range requests/partial content responses. The content is treated as streaming instead.
What I don't understand is:
If the browser has the whole file downloaded surely it can display the duration, and surely it can seek.
What I need to do on the web server to enable byte range requests (for non-static urls).
Any advice would be most gratefully received.
Here's some web.py code to get you started (just happened to need this as well and ran into your question):
## experimental partial content support
## perhaps this shouldn't be enabled by default
range = web.ctx.env.get('HTTP_RANGE')
if range is None:
    return result

total = len(result)
_, r = range.split("=")
partial_start, partial_end = r.split("-")

start = int(partial_start)
if not partial_end:
    end = total - 1
else:
    end = int(partial_end)

chunksize = (end - start) + 1

web.ctx.status = "206 Partial Content"
web.header("Content-Range", "bytes %d-%d/%d" % (start, end, total))
web.header("Accept-Ranges", "bytes")
web.header("Content-Length", chunksize)
return result[start:end+1]
Google tells me you have to use the staticFilter for byte ranges to work in CherryPy - but that is for static files only. Luckily this posting also includes pointers on how to do it for non-static data :-)

Streaming webcam video in Flash using MP4 encoding

One of the features of the Flash app I'm working on is to be able to stream a webcam to others. We're just using the built-in webcam support in Flash and sending it through FMS.
We've had some people ask for higher quality video, but we're already using the highest quality setting we can in Flash (setting quality to 100%).
My understanding is that in the newer flash players they added support for MPEG-4 encoding for the videos. I created a simple test Flex app to try and compare the video quality of the MP4 vs FLV encodings. However, I can't seem to get MP4 to work at all.
According to the Flex documentation the only thing I need to do to use MP4 instead of FLV is prepend "mp4:" to the name of the stream when calling publish:
Specify the stream name as a string with the prefix mp4: with or without the filename extension. The prefix indicates to the server that the file contains H.264-encoded video and AAC-encoded audio within the MPEG-4 Part 14 container format.
When I try this nothing happens. I don't get any events raised on the client side, no exceptions thrown, and my logging on the server side doesn't show any streams starting.
Here's the relevant code:
// These are all defined and created within the class.
private var nc:NetConnection;
private var sharing:Boolean;
private var pubStream:NetStream;
private var format:String;
private var streamName:String;
private var camera:Camera;

// called when the user clicks the start button
private function startSharing():void {
    if (!nc.connected) {
        return;
    }
    if (sharing) { return; }
    if (pubStream == null) {
        pubStream = new NetStream(nc);
        pubStream.attachCamera(camera);
    }
    startPublish();
    sharing = true;
}

private function startPublish():void {
    var name:String;
    if (this.format == "mp4") {
        name = "mp4:" + streamName;
    } else {
        name = streamName;
    }
    //pubStream.publish(name, "live");
    pubStream.publish(name, "record");
}
It would be helpful to know the version of FMS you are running.
It seems like you need at least FMS 3.0.2.
Are you sure this applies to live streams and not only to recording? These links (1, 2) suggest that while the player can decode Sorenson, VP6 and H.264, it can only encode in Sorenson.
I'm in a similar situation, so I would like to have this clarified.
edit: What actually makes me doubt it is that the documentation says FLV and MP4, which aren't codecs but containers; live streaming doesn't use containers, the encoded frames travel directly inside RTMP packets.
Flash Player doesn't encode using H.264, but Flash Media Server can record any codec in the F4V container. Flash Media Live Encoder can encode using H.264.
So basically you can't send H.264 from the web Flash Player (yet?)...
