flutter audio play delay - firebase

I am using the audioplayers package to play MP3 audio files that are stored in Firebase Cloud Storage. There is a significant delay on both Android and iOS, only slightly shorter on Android. I have since moved all my audio files to local assets.
AudioPlayer audioPlayer = AudioPlayer(mode: PlayerMode.LOW_LATENCY);

play(String url) async {
  // In this version of audioplayers, play() resolves to 1 on success.
  int result = await audioPlayer.play(url);
  if (result == 1) {
    // success
    print('success');
  }
}
Just a few days ago, I tested an audio player in iOS Swift and played some audio files from Firebase Cloud Storage, and I didn't encounter any significant buffering delay; it was a lot faster.
I need to find a way around this, as I have many audio files and they need to be stored on the network. Has anyone encountered similar issues, and do you have any good suggestions?

Update
Made a second PR that addresses a few shortcomings of the first, original PR. Both are merged into the master branch of audioplayers.
My PR changes are:
playbackRate is always passed to playImmediatelyAtRate instead of constant values; it is initially set by the library to _defaultPlaybackRate, i.e. 1.0
playImmediatelyAtRate is used in the resume method as well, not just play
Original Solution
This is the final code that helped solve the audio play delay for the OP:
In the play and resume methods:
AVPlayer *player = playerInfo[@"player"];
float playbackRate = [playerInfo[@"rate"] floatValue];

if (@available(iOS 10.0, *)) {
    [player playImmediatelyAtRate:playbackRate];
} else {
    [player play];
}
So calling [player playImmediatelyAtRate:playbackRate] instead of [player play]; seems to fix the issue.
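For anyone reading along in Swift, the equivalent call is playImmediately(atRate:). A minimal sketch (illustrative, not the plugin's actual code; keep a strong reference to the player in real code):

import AVFoundation

// url: the remote audio file to play (placeholder).
func playWithoutBufferWait(url: URL) {
    let player = AVPlayer(url: url)
    if #available(iOS 10.0, *) {
        // Starts playback immediately at the given rate instead of waiting for
        // AVPlayer's automatic initial buffering, the source of the perceived delay.
        player.playImmediately(atRate: 1.0)
    } else {
        player.play()
    }
}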
Originally this hadn't been merged into the pub release and was still an open PR; since then, the first (incomplete) PR has been merged, and the second PR as well.
Original comment:
There's this open pull request that should fix the delay on iOS, but it hasn't reached a release version yet. There's also this discussion on the big initial lag.

Related

seemingly simple Firestore query is very slow [duplicate]

I'm having slow performance issues with Firestore when retrieving basic data stored in a document, compared to the Realtime Database: about a 1:10 ratio.
Using Firestore, it takes an average of 3000 ms on the first call
this.db.collection('testCol')
  .doc('testDoc')
  .valueChanges().forEach((data) => {
    console.log(data); // 3000 ms later
  });
Using the realtime database, it takes an average of 300 ms on the first call
this.db.database.ref('/test').once('value').then(data => {
  console.log(data); // 300 ms later
});
This is a screenshot of the network console: [screenshot not included]
I'm running the JavaScript SDK v4.50 with AngularFire2 v5.0 rc.2. Has anyone experienced this issue?
UPDATE: 12th Feb 2018 - iOS Firestore SDK v0.10.0
Similar to some other commenters, I've also noticed a slower response on the first get request (with subsequent requests taking ~100ms). For me it's not as bad as 30s, but maybe around 2-3s when I have good connectivity, which is enough to provide a bad user experience when my app starts up.
Firebase have advised that they're aware of this "cold start" issue and they're working on a long term fix for it - no ETA unfortunately. I think it's a separate issue that when I have poor connectivity, it can take ages (over 30s) before get requests decide to read from cache.
While Firebase fixes all these issues, I've started using the new disableNetwork() and enableNetwork() methods (available in Firestore v0.10.0) to manually control the online/offline state of Firebase. I've had to be very careful where I use it in my code, though, as there's a Firestore bug that can cause a crash under certain scenarios.
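For reference, here is roughly how that toggling looks on iOS. This is a minimal sketch; both methods exist in the iOS SDK, but the surrounding error handling is illustrative:

import FirebaseFirestore

// Force Firestore to serve reads from the local cache while connectivity is poor.
func goOffline() {
    Firestore.firestore().disableNetwork { error in
        if let error = error {
            print("disableNetwork failed: \(error)")
        }
        // ...perform cache-backed reads here...
    }
}

// Restore live data once you want the network again.
func goOnline() {
    Firestore.firestore().enableNetwork { error in
        if let error = error {
            print("enableNetwork failed: \(error)")
        }
    }
}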
UPDATE: 15th Nov 2017 - iOS Firestore SDK v0.9.2
It seems the slow performance issue has now been fixed. I've re-run the tests described below and the time it takes for Firestore to return the 100 documents now seems to be consistently around 100ms.
Not sure if this was a fix in the latest SDK v0.9.2 or if it was a backend fix (or both), but I suggest everyone updates their Firebase pods. My app is noticeably more responsive - similar to the way it was on the Realtime DB.
I've also discovered Firestore to be much slower than Realtime DB, especially when reading from lots of documents.
Updated tests (with latest iOS Firestore SDK v0.9.0):
I set up a test project in iOS Swift using both RTDB and Firestore and ran 100 sequential read operations on each. For the RTDB, I tested the observeSingleEvent and observe methods on each of the 100 top level nodes. For Firestore, I used the getDocument and addSnapshotListener methods at each of the 100 documents in the TestCol collection. I ran the tests with disk persistence on and off. Please refer to the attached image, which shows the data structure for each database.
I ran the test 10 times for each database on the same device and a stable wifi network. Existing observers and listeners were destroyed before each new run.
Realtime DB observeSingleEvent method:
func rtdbObserveSingle() {
    let start = UInt64(floor(Date().timeIntervalSince1970 * 1000))
    print("Started reading from RTDB at: \(start)")
    for i in 1...100 {
        Database.database().reference().child(String(i)).observeSingleEvent(of: .value) { snapshot in
            let time = UInt64(floor(Date().timeIntervalSince1970 * 1000))
            let data = snapshot.value as? [String: String] ?? [:]
            print("Data: \(data). Returned at: \(time)")
        }
    }
}
Realtime DB observe method:
func rtdbObserve() {
    let start = UInt64(floor(Date().timeIntervalSince1970 * 1000))
    print("Started reading from RTDB at: \(start)")
    for i in 1...100 {
        Database.database().reference().child(String(i)).observe(.value) { snapshot in
            let time = UInt64(floor(Date().timeIntervalSince1970 * 1000))
            let data = snapshot.value as? [String: String] ?? [:]
            print("Data: \(data). Returned at: \(time)")
        }
    }
}
Firestore getDocument method:
func fsGetDocument() {
    let start = UInt64(floor(Date().timeIntervalSince1970 * 1000))
    print("Started reading from FS at: \(start)")
    for i in 1...100 {
        Firestore.firestore().collection("TestCol").document(String(i)).getDocument() { document, error in
            let time = UInt64(floor(Date().timeIntervalSince1970 * 1000))
            guard let document = document, document.exists && error == nil else {
                print("Error: \(error?.localizedDescription ?? "nil"). Returned at: \(time)")
                return
            }
            let data = document.data() as? [String: String] ?? [:]
            print("Data: \(data). Returned at: \(time)")
        }
    }
}
Firestore addSnapshotListener method:
func fsAddSnapshotListener() {
    let start = UInt64(floor(Date().timeIntervalSince1970 * 1000))
    print("Started reading from FS at: \(start)")
    for i in 1...100 {
        Firestore.firestore().collection("TestCol").document(String(i)).addSnapshotListener() { document, error in
            let time = UInt64(floor(Date().timeIntervalSince1970 * 1000))
            guard let document = document, document.exists && error == nil else {
                print("Error: \(error?.localizedDescription ?? "nil"). Returned at: \(time)")
                return
            }
            let data = document.data() as? [String: String] ?? [:]
            print("Data: \(data). Returned at: \(time)")
        }
    }
}
Each method essentially prints the unix timestamp in milliseconds when the method starts executing and then prints another unix timestamp when each read operation returns. I took the difference between the initial timestamp and the last timestamp to return.
RESULTS - Disk persistence disabled: [results image not included]
RESULTS - Disk persistence enabled: [results image not included]
Data Structure: [image not included]
When the Firestore getDocument / addSnapshotListener methods get stuck, it seems to get stuck for durations that are roughly multiples of 30 seconds. Perhaps this could help the Firebase team isolate where in the SDK it's getting stuck?
UPDATE: March 02, 2018
It looks like this is a known issue and the engineers at Firestore are working on a fix. After a few email exchanges and code sharing with a Firestore engineer on this issue, this was his response as of today.
"You are actually correct. Upon further checking, this slowness on getDocuments() API is a known behavior in Cloud Firestore beta. Our engineers are aware of this performance issue tagged as "cold starts", but don't worry as we are doing our best to improve Firestore query performance.
We are already working on a long-term fix but I can't share any timelines or specifics at the moment. While Firestore is still on beta, expect that there will be more improvements to come."
So hopefully this will get knocked out soon.
Using Swift / iOS
After dealing with this for about 3 days, it seems the issue is definitely with get(), i.e. .getDocuments and .getDocument. Things I thought were causing the extreme yet intermittent delays, but which don't appear to be the case:
Not so great network connectivity
Repeated calls via looping over .getDocument()
Chaining get() calls
Firestore Cold starting
Fetching multiple documents (Fetching 1 small doc caused 20sec delays)
Caching (I disabled offline persistence but this did nothing.)
I was able to rule all of these out as I noticed this issue didn't happen with every Firestore database call I was making. Only retrievals using get(). For kicks I replaced .getDocument with .addSnapshotListener to retrieve my data and voila. Instant retrieval each time including the first call. No cold starts. So far no issues with the .addSnapshotListener, only getDocument(s).
For now, I'm simply dropping .getDocument() where time is of the essence and replacing it with .addSnapshotListener, then using
for document in querySnapshot!.documents {
    // do some magical unicorn stuff here with my document.data()
}
... in order to keep moving until this gets worked out by Firestore.
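A minimal sketch of that swap (the collection name is a placeholder, not the poster's actual code):

import FirebaseFirestore

// Instead of: Firestore.firestore().collection("items").getDocuments { ... }
func listenInsteadOfGet() -> ListenerRegistration {
    // Listeners stay active; call .remove() on the returned registration
    // when you no longer need updates.
    return Firestore.firestore().collection("items")
        .addSnapshotListener { querySnapshot, error in
            guard let snapshot = querySnapshot, error == nil else {
                print("Listener error: \(error?.localizedDescription ?? "nil")")
                return
            }
            for document in snapshot.documents {
                // do some magical unicorn stuff here with document.data()
            }
        }
}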
Almost 3 years later, with Firestore well out of beta, I can confirm that this horrible problem still persists ;-(
On our mobile app we use the JavaScript / Node.js Firebase client. After a lot of testing to find out why our app's startup time is around 10 s, we identified what 70% of that time is attributable to... well, to Firebase's and Firestore's performance and cold start issues:
firebase.auth().onAuthStateChanged() fires after approx. 1.5-2 s, which is already quite bad.
If it returns a user, we use its ID to get the user document from Firestore. This is the first call to Firestore, and the corresponding get() takes 4-5 s. Subsequent get() calls of the same or other documents take approx. 500 ms.
So in total the user initialization takes 6-7 s, which is completely unacceptable. And we can't do anything about it. We can't test disabling persistence, since in the JavaScript client there's no such option: persistence is always enabled by default, so not calling enablePersistence() won't change anything.
I had this issue until this morning. My Firestore query via iOS/Swift would take around 20 seconds to complete a simple, fully indexed query, with non-proportional query times for anywhere from 1 item returned all the way up to 3,000.
My solution was to disable offline data persistence. In my case, it didn't suit the needs of our Firestore database - which has large portions of its data updated every day.
iOS & Android users have this option enabled by default, whilst web users have it disabled by default. It makes Firestore seem insanely slow if you're querying a huge collection of documents. Basically it caches a copy of whichever data you're querying (and whichever collection you're querying - I believe it caches all documents within) which can lead to high Memory usage.
In my case, it caused a huge wait for every query until the device had cached the data required - hence the non-proportional query times for the increasing numbers of items to return from the exact same collection. This is because it took the same amount of time to cache the collection in each query.
Offline Data - from the Cloud Firestore Docs
I performed some benchmarking to display this effect (with offline persistence enabled) from the same queried collection, but with different numbers of items returned using the .limit parameter: [benchmark results not included]
Now at 100 items returned (with offline persistence disabled), my query takes less than 1 second to complete.
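For anyone wanting to do the same, disabling offline persistence on iOS is a settings change made before any other Firestore call. A sketch based on the SDK's settings API:

import FirebaseFirestore

// Must run before any other Firestore usage.
func makeFirestore() -> Firestore {
    let settings = FirestoreSettings()
    settings.isPersistenceEnabled = false // enabled by default on iOS and Android
    let db = Firestore.firestore()
    db.settings = settings
    return db
}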
My Firestore query code is below:
let db = Firestore.firestore()
self.date = Date()
let ref = db.collection("collection").whereField("Int", isEqualTo: SomeInt).order(by: "AnotherInt", descending: true).limit(to: 100)
ref.getDocuments() { (querySnapshot, err) in
    if let err = err {
        print("Error getting documents: \(err)")
    } else {
        for document in querySnapshot!.documents {
            let data = document.data()
            // Do things
        }
        print("QUERY DONE")
        let currentTime = Date()
        let components = Calendar.current.dateComponents([.second], from: self.date, to: currentTime)
        let seconds = components.second!
        print("Elapsed time for Firestore query -> \(seconds)s")
        // Benchmark result
    }
}
Well, from what I'm currently doing and researching, using a Nexus 5X in the emulator and a real Android phone (a Huawei P8):
Firestore and Cloud Storage both give me a headache of slow responses
when I do the first document.get() and the first storage.getDownloadUrl().
Each of those requests takes more than 60 seconds to respond. The slow response only happens on the real Android phone, not in the emulator. Another strange thing.
After the first encounter, the remaining requests are smooth.
Here is the simple code where I hit the slow response:
var dbuserref = dbFireStore.collection('user').where('email', '==', email);
const querySnapshot = await dbuserref.get();
// first matching user document (assumes the query returned at least one)
const userDoc = querySnapshot.docs[0];
var url = await defaultStorage.ref(userDoc.data().image_path).getDownloadURL();
I also found a link researching the same issue:
https://reformatcode.com/code/android/firestore-document-get-performance

Firebase: Is it possible to stream a video? [duplicate]

I'm working on an app that has video streaming functionality. I'm using the Firebase database and Firebase Storage. I'm trying to find some documentation on how Firebase Storage handles video files, but can't really find much.
There's mention in the docs that Firebase Storage works with other Google app services to allow for CDN and video streaming, but all searches seem to lead to a dead end. Any advice?
I think there are several types of video streaming, which could change our answer here:
Live streaming (subscribers are watching as an event happens)
Youtube style (post a video and end users watch at their convenience)
Having built a live streaming Periscope-style app using Firebase Storage and the Firebase Realtime Database, I pretty strongly recommend against it: we uploaded three-second chunks and synced them via the Realtime Database. While it worked (surprisingly well), there was ~5 seconds of latency over very good internet, and it also wasn't the most efficient solution (after all, you're uploading and storing that video, plus there wasn't any transcoding). I recommend using something WebRTC-style, built for video transport, and using the Realtime Database for signaling alongside the stream.
On the other hand, it's definitely possible to build a mobile YT on Firebase features. The trick here is going to be transcoding the video (using something like Zencoder or Bitmovin, more here: https://cloud.google.com/solutions/media/) to chop your uploaded video into smaller chunks of different resolutions (and different formats; iOS requires HLS for streaming, for instance). Your client can store chunk information in the Realtime Database (chunk name, resolutions available, number of chunks) and can download those chunks from Storage as the video progresses.
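As an illustration of the "chunk information in the Realtime Database" idea, here's a hedged Swift sketch. The field names and paths are my own invention, not a spec:

import FirebaseDatabase

// Hypothetical metadata for a transcoded video, written where clients can find it.
func writeChunkMetadata(videoID: String) {
    let meta: [String: Any] = [
        "chunkCount": 12,                         // number of chunks
        "resolutions": ["480p", "720p", "1080p"], // resolutions available
        "chunkPrefix": "videos/\(videoID)/chunk_" // Storage path prefix for the chunks
    ]
    Database.database().reference()
        .child("videoMeta")
        .child(videoID)
        .setValue(meta)
}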
If you want to stream a video from Firebase Storage, this is the best way I found. It will depend on the size of your video file; I'm only requesting 10-30 MB files, so this solution works well for me. Just treat the Firebase URL as a regular URL:
String str = "fire_base_video_URL";
Uri uri = Uri.parse(str);
videoViewLandscape.setVideoURI(uri);
progressBarLandScape.setVisibility(View.VISIBLE);
videoViewLandscape.requestFocus();
videoViewLandscape.start();
If you want to loop the video:
videoViewLandscape.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setLooping(true);
    }
});
And if you want to show a progress bar before the video starts do this:
videoViewLandscape.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_INFO_BUFFERING_END) {
            progressBarLandScape.setVisibility(View.GONE);
            return true;
        } else if (what == MediaPlayer.MEDIA_INFO_BUFFERING_START) {
            progressBarLandScape.setVisibility(View.VISIBLE);
            return true;
        }
        return false;
    }
});
This is not the best way of doing things but it works for me for now until I can find a good video streaming service.
2020: Yes, Firebase Storage video streaming is easy and possible.
All the other answers suggest that you use a protocol like HLS. However, this is only necessary if you develop an app for the Apple App Store that serves videos longer than 10 minutes.
In all other cases, you can simply encode your videos as MP4 and upload them to Firebase. Your clients can then stream the MP4 without a problem. Just make sure that the moov atom is at the beginning of your MP4 file; this allows playback to start immediately, even if the file is not fully loaded.
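If you encode with ffmpeg, for example, the faststart flag relocates the moov atom to the front without re-encoding (a common approach; the answer itself doesn't prescribe a tool):

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4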
Users can also skip ahead or go back thanks to byte-range requests, which Firebase Storage supports.
To test it, just upload a video to your firebase storage and open it in your browser.
You can host HLS videos on Firebase Cloud Storage. It works pretty well for me.
The trick is to modify the playlist .m3u8 files to contain the storage folder prefix, and the ?alt=media suffix for each file entry in the playlist:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.760000,
<folder_name>%2F1_fileSequence_0.ts?alt=media
#EXT-X-ENDLIST
You also don't really have to use server-side transcoding, you can have the client who uploads the video do it, and save considerable costs.
I've written a full tutorial with source code here: https://itnext.io/how-to-make-a-serverless-flutter-video-sharing-app-with-firebase-storage-including-hls-and-411e4fff68fa
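Playing the hosted playlist on iOS is then just a matter of pointing AVPlayer at the .m3u8's download URL. A minimal sketch (the URL and presenting view controller are assumptions, not from the tutorial):

import UIKit
import AVKit

// playlistURL: the ?alt=media download URL of the modified .m3u8 in your bucket.
func playHLS(playlistURL: URL, from presenter: UIViewController) {
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: playlistURL)
    presenter.present(controller, animated: true) {
        controller.player?.play()
    }
}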
This is my exact implementation: it starts playing a video from Firebase Storage as soon as the view opens, dismisses the view when playback finishes, and adds a button to click afterwards to replay the video.
I have a demo link with the key so you can see it works. Any questions, hit me up.
You will just have to create an IBAction if you want the button after the video disappears.
// BackMuscles.swift
// Messenger
//
// Created by Zach Smith on 8/12/21.
// Copyright © 2021 spaceMuleFitness. All rights reserved.
//

import UIKit
import AVKit
import AVFoundation

class BackMuscles: UIViewController {

    @IBOutlet weak var playv: UIButton!

    let avPlayerViewController = AVPlayerViewController()
    var avPlayer: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.addBackground() // custom UIView extension
        let movieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let url = movieUrl {
            self.avPlayer = AVPlayer(url: url as URL)
            self.avPlayerViewController.player = self.avPlayer
        }
        // Dismiss the player when the video finishes playing.
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avPlayerViewController.player?.currentItem)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Present from viewDidAppear rather than viewDidLoad: presenting in
        // viewDidLoad can fail because the view is not yet in the window hierarchy.
        self.present(self.avPlayerViewController, animated: true) { () -> Void in
            self.avPlayerViewController.player?.play()
        }
    }

    @objc func playerDidFinishPlaying(note: NSNotification) {
        self.avPlayerViewController.dismiss(animated: true)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playV(sender: UIButton) {
        let amovieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let aurl = amovieUrl {
            self.avPlayer = AVPlayer(url: aurl as URL)
            self.avPlayerViewController.player = self.avPlayer
            self.present(self.avPlayerViewController, animated: true) { () -> Void in
                self.avPlayerViewController.player?.play()
            }
        }
    }
}
If you want to create a YT-like app, you can first compress the video. To manage video compression, I recommend the library in this link. I managed to compress a 118 MB video to 6 MB in under 42 seconds. It also has a great demo app; just follow the example.
After you get the compressed file, upload it to Storage. In your client app, play the video URL using a player like ExoPlayer.
The video below is pretty good; it uses ExoPlayer to stream instead of MediaPlayer or the VideoView approach above.
https://www.youtube.com/watch?v=s_D5C5e2Uu0
try {
    BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
    TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
    simpleExoPlayer = ExoPlayerFactory.newSimpleInstance(this, trackSelector);
    // Note: use a direct media URL here (e.g. a Firebase Storage download URL);
    // a YouTube watch-page URL like this one will not play through ExtractorMediaSource.
    String vid = "https://www.youtube.com/watch?v=s_D5C5e2Uu0";
    Uri uri = Uri.parse(vid);
    DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
    ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
    MediaSource mediaSource = new ExtractorMediaSource(uri, dataSourceFactory, extractorsFactory, null, null);
    videoView.setPlayer(simpleExoPlayer);
    simpleExoPlayer.prepare(mediaSource);
    simpleExoPlayer.setPlayWhenReady(true);
} catch (Exception e) {
    // Don't swallow failures silently; at least log them.
    e.printStackTrace();
}
Below are the dependencies you will need in the app-level build.gradle file:
implementation 'com.google.android.exoplayer:exoplayer:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-core:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-dash:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-hls:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-smoothstreaming:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-ui:r2.4.0'
To use Firebase Storage to play videos, all you need is the full URL to the video. You then pass this URL into a VideoView or ExoPlayer. No full download is needed; the VideoView will stream the content YT-style.
First of all, you need to understand Firebase Storage security rules; for development you can set them to allow access. Then create a storage reference for storing videos in your app, select the video you want to upload, and use an UploadTask to upload the video file to Storage. To retrieve and play the video on Android, use the ExoPlayer library. You can visit here for more.
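The answer above is Android-oriented; as a hedged sketch, the same upload-then-play flow on iOS looks roughly like this (the storage path and local file URL are placeholders):

import FirebaseStorage

// Upload a locally picked video, then fetch a streamable download URL.
func uploadVideo(localFileURL: URL) {
    let ref = Storage.storage().reference().child("videos/myVideo.mp4") // placeholder path
    ref.putFile(from: localFileURL, metadata: nil) { metadata, error in
        guard error == nil else {
            print("Upload failed: \(error!)")
            return
        }
        ref.downloadURL { url, error in
            guard let url = url else { return }
            // Hand this URL to a player (ExoPlayer on Android, AVPlayer on iOS) to stream.
            print("Streamable URL: \(url)")
        }
    }
}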

Altbeacon library not working on Android 5.0

Yesterday I got the update for Android 5.0 on my Nexus 4, and the altbeacon library stopped detecting beacons. It appears that didEnterRegion and didRangeBeaconsInRegion are not even getting called when monitoring and ranging, respectively.
Even the Locate app from Radius Networks behaves differently now: once beacons are detected, their values don't get updated anymore, and it often appears as if the beacons went out of range.
One thing I noticed is that the following line now appears in the logcat: "BluetoothLeScanner: could not find callback wrapper". I went ahead and looked for that class and saw that it was introduced with Android L, but I don't know if that has something to do with it.
It's important to say that before the update I had been working with both the Locate app and the Reference Application without any trouble.
I don't know if this is a generalized problem or not, but if it happened to me I'm sure it could happen to someone else, so any help would really be appreciated.
Thanks in advance!
UPDATE:
After failing to get the library to work, I decided to try the Android L branch of the library. I plugged the new library into the Reference App, but it didn't work as expected either.
The Monitor Activity seems to be working OK, notifying when the device has entered a new region. However, the Ranging Activity doesn't report any beacons: although didRangeBeaconsInRegion is getting called, it always reports zero beacons. Curiously, when the activity is paused (switching momentarily to another app), the logcat shows that didRangeBeaconsInRegion does get called with actual beacons.
I'm kind of stuck right now because I don't know how to get either of the libraries working on Android L, so again, any help would really be appreciated.
I'm using the latest AltBeacon build on 5.0+ and have no problem with it. In fact, I never used it on KitKat, so I'm not really sure I can help, but here is my working code, which listens for iBeacons.
Implement BeaconConsumer:
public class MainActivity implements BeaconConsumer
Initialize the BeaconManager:
beaconManager = BeaconManager.getInstanceForApplication(this);
if (beaconManager != null && !beaconManager.isBound(this)) {
    beaconManager.getBeaconParsers().add(new BeaconParser()
            .setBeaconLayout("m:0-3=4c000215,i:4-19,i:20-21,i:22-23,p:24-24"));
    beaconManager.bind(this);
}
In onBeaconServiceConnect, start the listener:
@Override
public void onBeaconServiceConnect() {
    beaconManager.setRangeNotifier(new RangeNotifier() {
        @Override
        public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
            if (beacons.size() > 0) {
                Beacon firstBeacon = beacons.iterator().next();
            }
        }
    });
    try {
        // startRangingBeaconsInRegion declares RemoteException, so wrap it.
        beaconManager.startRangingBeaconsInRegion(new Region("com.example.app", null, null, null));
    } catch (RemoteException e) {
        e.printStackTrace();
    }
}
This code is working on 3 devices:
Nexus 4 5.0.1
Samsung Galaxy s4 - Stock 5.0.1
Samsung Galaxy s4 - CM12 5.1.1
Old question, but maybe some people will be looking for an answer on newer systems where you have to ask for permissions. You need to request Manifest.permission.ACCESS_FINE_LOCATION at runtime before scanning; at least, that was the problem I hit. In my opinion the library should at least crash in such cases and indicate the problem.

Please suggest a way to store a temp file in Windows Azure

I have a simple feature in an ASP.NET MVC3 app hosted on Azure.
1st step: the user uploads a picture
2nd step: the user crops the uploaded picture
3rd step: the system saves the cropped picture and deletes the temp file, which is the uploaded original picture
Here is the problem I am facing now: where to store the temp file?
I tried somewhere on the Windows file system, and on LocalResources; the problem is that these resources are per instance, so there is no guarantee that the code on the instance that shows the picture to crop will be the code on the same instance that saved the temp file.
Do you have any ideas on this temp file issue?
Normally the file exists just for a while before being deleted.
The temp file needs to be instance-independent.
Better still, the file could have an expiry setting (for example, 1 hour) so it deletes itself, in case the code crashed somewhere.
OK, so what you're after is basically something that is shared storage but expires. Amazon have just announced a rather nice setting called object expiration (https://forums.aws.amazon.com/ann.jspa?annID=1303). Nothing like this for Windows Azure Storage yet, unfortunately, but that doesn't mean we can't come up with some other approach; indeed, we may even come up with a better (more cost-effective) approach.
You say that it needs to be instance-independent, which means using a local temp drive is out of the picture. As others have said, my initial leaning would be towards blob storage, but you will have cleanup effort there. If you are working with large images (>1 MB) or low throughput (<100 rps), then I think blob storage is the only option. If you are working with smaller images AND high throughput, then the transaction costs for blob storage will start to really add up (I have a white paper coming out soon which shows some modelling of this, but some quick thoughts are below).
For a scenario with small images and high throughput, a better option might be to use the Windows Azure Cache as your temporary storage area. At first glance it will be eye-wateringly expensive on a per-GB basis (around $110 per GB/month for Cache versus 12c per GB for Storage). But with Storage your transactions are paid for, whereas with Cache they are 'free' (quotas are here: http://msdn.microsoft.com/en-us/library/hh697522.aspx#C_BKMK_FAQ8). This can really add up; e.g. using 100 KB temp files held for 20 minutes with a system throughput of 1500 rps, using Cache is about $1000 per month versus $15000 per month for storage transactions.
The Azure Cache approach is well worth considering, but to be sure it is the 'best' approach I'd really want to know:
Size of images
Throughput per hour
A bit more detail on the actual client interaction with the server during the crop process. Is it an interactive process where the user pulls the image into their browser and crops visually? Or is it just a simple crop?
Here is what I see as a possible approach:
the user uploads the picture
your code saves it to a blob and keeps a record in some data backend relating the user session to the uploaded image (marking it as a temp image)
display the image in the cropping user interface
when the user is done cropping on the client:
4.1. retrieve the original from the blob
4.2. crop it according to the data sent from the user
4.3. delete the original from the blob and the record in the data backend used in step 2
4.4. save the final image to another blob (the final blob)
Then have one background process checking for "expired" temp images in the data backend (used in step 2) to delete those images and their records.
Please note that even in a WebRole, you still have the RoleEntryPoint descendant, and you can still override the Run method. By implementing an infinite loop in Run() (that method shall never exit!), you can check whether there is anything to delete every N seconds (depending on your Thread.Sleep() in the Run()).
You can use Azure blob storage. Have a look at this tutorial.
The sample below may help you:
https://code.msdn.microsoft.com/How-to-store-temp-files-in-d33bbb10
You have two ways to handle temp files in Azure:
1. You can use the Path.GetTempPath and Path.GetTempFileName functions for the temp file name.
2. You can use Azure blob storage to simulate it.
private long TotalLimitSizeOfTempFiles = 100 * 1024 * 1024;

private async Task SaveTempFile(string fileName, long contentLength, Stream inputStream)
{
    try
    {
        // First, check whether the container exists; if not, create it.
        await container.CreateIfNotExistsAsync();
        // Initialize a blob reference.
        CloudBlockBlob tempFileBlob = container.GetBlockBlobReference(fileName);
        // If the blob already exists, delete the old blob.
        tempFileBlob.DeleteIfExists();
        // Check whether the blobs exceed the size limit; if so, clear some out.
        await CleanStorageIfReachLimit(contentLength);
        // And upload the new file.
        tempFileBlob.UploadFromStream(inputStream);
    }
    catch (Exception ex)
    {
        if (ex.InnerException != null)
        {
            throw ex.InnerException;
        }
        else
        {
            throw;
        }
    }
}

// Check whether the blobs exceed the size limit; if so, delete the oldest until enough space is free.
private async Task CleanStorageIfReachLimit(long newFileLength)
{
    List<CloudBlob> blobs = container.ListBlobs()
        .OfType<CloudBlob>()
        .OrderBy(m => m.Properties.LastModified)
        .ToList();
    // Get the total size of all blobs.
    long totalSize = blobs.Sum(m => m.Properties.Length);
    // Calculate the space available before the upload.
    long realLimitSize = TotalLimitSizeOfTempFiles - newFileLength;
    // Delete oldest-first; once enough space is free, break the loop and stop deleting.
    foreach (CloudBlob item in blobs)
    {
        if (totalSize <= realLimitSize)
        {
            break;
        }
        await item.DeleteIfExistsAsync();
        totalSize -= item.Properties.Length;
    }
}

Wowza + Flex not playing back the whole audio

I'm writing a Flex app which must record audio and then play it back. It records just fine (I can hear the FLV on the server), but on playback it cuts off the end a little bit, and each time I ask it to play again, it cuts off a little bit more. What can it be? I guess it's something related to buffer management, but I don't know exactly. Any thoughts?
EDIT: Here's the code I'm using for playback. It is called from a mediator:
var streamPlayClient:Object = new Object();
this.stream.client = streamPlayClient;
streamPlayClient.onPlayStatus = function(infoObject:Object):void {
    if (infoObject.code == "NetStream.Buffer.Flush") {
        stopPlayback();
    }
};
this.stream.play("flv:" + this.streamName);
As it turns out, I had to handle the NetStream.Buffer.Empty event instead of NetStream.Play.Complete or NetStream.Buffer.Flush.
