Browser HTTP connection limitation and YouTube

I'm developing a service where people can stream multiple audio files at the same time.
Unfortunately, when streaming about 4 streams simultaneously, Chrome's HTTP connection limit seems to kick in: new stream requests only arrive at the server once a previous connection has been closed.
Interestingly enough, I can play 10+ videos at the same time on YouTube.
What kind of technique could YouTube have used here to circumvent the browser's simultaneous http connection limit?

The crucial point here, I suppose, is that YouTube streams are not directly controlled by the browser; they are embedded Flash players whose streams are handled by Flash. If you hand off the streaming process to an external app/library (Flash, Java, etc.) you can circumvent these limitations quite easily.
The other point is that YouTube has a huge CDN, so there is no guarantee you're getting any two videos from the same server, which would also help to circumvent concurrency limitations (to a point, at least).
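To make that concrete in modern, non-Flash terms, here is a minimal sketch of the same idea (domain sharding). The shard hostnames (media1.example.com etc.) are made up, and it assumes the same files are reachable on each of them, e.g. via a CDN or DNS aliases:

```typescript
// Sketch: spread stream requests across several hostnames so that no single
// host hits the browser's per-host connection cap. The shard hostnames are
// hypothetical and must all serve the same content (CDN edge or DNS aliases).
const shards = ["media1.example.com", "media2.example.com", "media3.example.com"];

function shardedUrl(path: string, index: number): string {
  return `https://${shards[index % shards.length]}${path}`;
}

// Start several streams; each counts against a different host's connection pool.
async function startStreams(paths: string[]): Promise<Response[]> {
  return Promise.all(paths.map((path, i) => fetch(shardedUrl(path, i))));
}
```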
I'm not surprised that Chrome stops you after a while, because Google did a load of research and experiments regarding browser concurrency and relative efficiency a while ago, and I remember reading somewhere that they concluded 3-4 concurrent connections to the same server represented the most efficient data transfer architecture over straight HTTP. Annoyingly, I can't find a reputable source to reference for that (although I got it from one in the first place); however, this is related and probably part of the same research program.
This is also the sort of research Facebook gets quite heavily involved in, and you might find some useful information over at http://developers.facebook.com/ if you can be bothered sifting through the rubbish to find it...

Related

Does integrating WebRTC one to one audio/video calls affect the performance of web application

After learning about some great features of WebRTC, I thought of using WebRTC one-to-one audio/video calls in my web application. The web application is for many organizations/entities of a category, which can register and record several entries daily about their internal work and their clients. The clients of these individual organizations/entities also have access to the web application to view their details.
The purpose of using WebRTC is communication between clients and organizations, and also daily inquiries to these organizations from new people about products and services.
While going through articles on Google etc., I found that broadcasting or one-to-many calls require very high bandwidth on the users' side if we don't make use of a media server.
So what is the case for one to one calls?
Will it affect the performance of the web application, or cause any critical situation, if several users are making audio/video calls (one-to-one) to each other simultaneously as a routine?
The number of users will be very large, and users will record several entries daily as their routine work. That part is manageable and the application will run smoothly, but I am not sure about the newer concept of WebRTC. Will it require a very high hosting plan? Is using WebRTC suitable or advisable for this scenario?
WebRTC by its nature is peer-to-peer, meaning that the streaming data is handled CLIENT side. All decoding, encoding, ICE candidate gathering/negotiation, and media encrypting/transmitting happen on the client side and not on the server side. So you will be providing the pages, client-side JS, and some data exchange (session negotiation signalling), but all in all it is not a huge amount of work. It should be easily handled without having to worry about your host machine being overworked.
All that said, here are the only performance concerns that would POSSIBLY affect your hosting server.
Signalling session startup, negotiation, and tear-down. This is very minimal (only some JSON data at the beginning of a session). It should not be too much of a burden, but you should be aware that if 1000 sessions start at the same time, you will have a queue of messages to direct to the needed parties. How you determine the parties, how you forward the messages, and what work you do server side could all affect performance. If written smartly (how you store sessions, how you forward messages, etc.), it should not be a terrible burden. This could easily be done with SignalR since you are on ASP.NET, or you could use a separate server running Node.js (or the same box, it does not matter) if you so desired.
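To give a feel for how little that signalling work is, here is a minimal relay sketch using Node.js and the ws WebSocket library as a stand-in for SignalR; the room concept and the { room, payload } message shape are assumptions, not anything prescribed by WebRTC:

```typescript
// Minimal signalling relay sketch: forwards SDP offers/answers and ICE
// candidates between peers in the same "room". All media flows peer-to-peer
// and never touches this server.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const rooms = new Map<string, Set<WebSocket>>();

wss.on("connection", (socket) => {
  let joined: string | null = null;

  socket.on("message", (data) => {
    const { room, payload } = JSON.parse(data.toString());
    if (!joined) {
      joined = room;
      if (!rooms.has(room)) rooms.set(room, new Set());
      rooms.get(room)!.add(socket);
    }
    // Forward the signalling payload to the other peer(s) in the room.
    for (const peer of rooms.get(room) ?? []) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(JSON.stringify({ payload }));
      }
    }
  });

  socket.on("close", () => {
    if (joined) rooms.get(joined)?.delete(socket);
  });
});
```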
RTP TURN relay, if needed. This will probably be through a different server (or the same one as your hosting server if you want). For SOME connections a TURN server is needed, and any production-ready WebRTC solution should take this into account. Here is a good open source TURN server. Bandwidth usage here could be very high, as RTP packets are sent to this server and then forwarded to the peer in the connection.
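On the client side, pointing at a TURN server is just part of the RTCPeerConnection configuration. A sketch, in which the TURN URL and credentials are placeholders for whatever server you actually run:

```typescript
// Sketch: configure a peer connection with a public STUN server and a TURN
// fallback. turn.example.com and the credentials are placeholders.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:stun.l.google.com:19302" },
    {
      urls: "turn:turn.example.com:3478",
      username: "webrtc-user",
      credential: "secret",
    },
  ],
});
```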
If you are recording the streams, you may have increased hosting traffic depending on how you implement it. Firefox supports client-side recording of the streams but Chrome does not (they say it is in the works currently). You could use existing JS libraries to record the feeds client side and then push them anywhere you want. You could also push all the data through a media server that will mux, demux, and forward the data to be recorded anywhere you like. The Janus-Gateway videoroom is a good lightweight example of a media server.
Client side is a different story.
There are higher-level concerns in the JavaScript. If you use one of the recording JS libraries this is especially evident, as they do canvas captures numerous times a second, which is a heavy hit and would degrade the user experience.
CPU utilization by the browser will increase as the quality of the video being streamed increases. This is rather obvious as HD video frames take more CPU power to encode/decode than SD frames.
Client-side bandwidth usage can also be an issue. Chrome and Firefox try to modify the bitrate of each video/audio feed dynamically, but the video bitrate can go all the way up to 2 Mbps. You can cap this in Chrome (by adding an attribute in the SDP) but not yet in Firefox (last I checked).
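The SDP attribute in question is typically a "b=AS:" (kbps) line added to the video media section before applying the description. A hedged sketch; the 500 kbps figure is arbitrary, and Firefox did not honour b=AS at the time of this answer:

```typescript
// Sketch: cap the video bitrate by injecting a "b=AS:<kbps>" line into the
// video section of the SDP before applying it. Chrome honours b=AS.
function capVideoBitrate(sdp: string, kbps: number): string {
  const out: string[] = [];
  let inVideo = false;
  for (const line of sdp.split("\r\n")) {
    out.push(line);
    if (line.startsWith("m=")) inVideo = line.startsWith("m=video");
    // Bandwidth lines belong right after the c= line of the media section.
    if (inVideo && line.startsWith("c=")) out.push(`b=AS:${kbps}`);
  }
  return out.join("\r\n");
}

async function createCappedOffer(pc: RTCPeerConnection): Promise<void> {
  const offer = await pc.createOffer();
  await pc.setLocalDescription({
    type: "offer",
    sdp: capVideoBitrate(offer.sdp ?? "", 500), // cap video at ~500 kbps
  });
}
```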

Can uploads be too fast?

I'm not sure if this is the right place to ask this, but I'll do it anyway.
I have developed an uploader for the Umbraco CMS that lets people upload a queue of files in one go. It uses a simple Flash app that just calls a .NET ashx to upload the files one at a time. When one is done, the next one starts.
Recently a user hit a problem where 1 or 2 uploads go up fine, but then the rest fail. This happens both for him and for a client of his. After some debugging he thinks he has found the cause, but it seems weird, so I was wondering if anyone else has had this problem.
Both he and his client are on fibre optic broadband connections, so they have really fast upload speeds. When it was tested on a slower broadband connection, all the files uploaded with no problem. According to one of his developer friends, they had apparently come across this before and had to put a slight delay in the upload script to make it work.
Does this sound possible? Has anyone else hit this problem? Is there a known workaround to prevent the uploads from failing?
I have not struck this precise problem before, but I have done a lot of DSL and broadband troubleshooting, so I will do my best to answer this.
There are 2 possible causes for this particular symptom, both generally outside of your network control (I would have thought).
1) Packet loss
Of course, where some links receive a very high volume of traffic they can choose to simply dump a lot of data (e.g. everything over that link's configured maximum), but TCP/IP should be controlling that and should also expect that sort of drop from time to time, so this seems less likely.
2) The receiving server
There may be HTTP bottlenecks into that server, or the receiving server's CPU, RAM, etc. may be at capacity.
From a troubleshooting perspective, even if these symptoms shouldn't (in theory) exist, the fact that they do, and that you have a specific, repeatable case, means there is something real to chase down.
The next step, if you really need to understand how it is all working, might be to get some sort of packet sniffer (like Wireshark) and try to work out at a packet level exactly what is happening.
Also, with socket programming you can often work directly against the TCP/IP sockets, so you would be processing at the lower network layers and seeing the responses, timeouts, etc.
Also if you control the receiving server, then you can do the same from that end, or at least review the error logs to see what is getting thrown up as a problem.
A really basic method could be to run a pathping to the receiving server, if that is possible; that might highlight slow nodes on the way to the server, or packet loss between your local machine and the end server.
The upshot? Put a slow-down function in the upload code, and that should at least make the code work.
Get in touch if you need any analysis of the Wireshark stuff.
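As for the slow-down itself, in today's terms it amounts to keeping the queue strictly sequential and pausing briefly between files. A rough sketch; "/upload.ashx", the field name, and the 500 ms delay are placeholders for whatever your handler actually expects:

```typescript
// Sketch: upload files one at a time with a short pause between them,
// i.e. the "slow down" workaround described above.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function uploadQueue(files: File[], delayMs = 500): Promise<void> {
  for (const file of files) {
    const body = new FormData();
    body.append("file", file, file.name);
    const res = await fetch("/upload.ashx", { method: "POST", body });
    if (!res.ok) throw new Error(`Upload of ${file.name} failed: ${res.status}`);
    await sleep(delayMs); // give the connection/server a moment before the next file
  }
}
```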
I have run into a similar problem with an MVC2 website using Flash uploader and Firefox. The servers were load balanced with a Big-IP load balancer. What we found out in debugging this is that Flash, in Firefox, did not send the session ID on continuation requests and the load balancer would send continuation requests off to another server. Because the user had no session on the new server, the request failed.
If a file could be sent in one chunk, it would upload fine. If it required a second chunk, it failed. Because of this, the upload would fail after an undetermined number of files had been uploaded.
To fix it, I wrote a Silverlight uploader.
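A workaround sometimes used for this kind of cookie-less continuation request is to pass the session identifier explicitly on the upload URL, so the server (or load balancer) can re-associate every chunk with the right session. A rough sketch, where the "sid" parameter name and the endpoint are purely illustrative and the server must be written to accept them:

```typescript
// Sketch: append the session token to the upload URL so continuation
// requests are not anonymous even when cookies are not sent.
function buildUploadUrl(baseUrl: string, sessionId: string): string {
  const url = new URL(baseUrl, window.location.origin);
  url.searchParams.set("sid", sessionId); // illustrative parameter name
  return url.toString();
}

// e.g. buildUploadUrl("/upload.ashx", currentSessionId)
```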

Flash Media Server: Trouble with recording 2 audio/video streams at the same time?

In short
For a project I need an audio/video chat for 2 people, with the ability to record (part of) the session. I am running into issues where the second user's recorded video gets messed up, with massive amounts of (seemingly) skipped frames and/or audio loss, most likely caused by the audio stream (when not recording the audio, the problem doesn't appear to occur).
Overview
For a project I need a setup where two clients can video(+audio, obviously :) chat with one another. Also, the 'host' should be able to record the session when (s)he presses a button, and stop recording in the same way.
While this setup is far from rocket science, I've been experiencing issues that I can't seem to figure out. The clients connect fine, each other's videos show up properly, and they can even hear each other just fine. I chose to re-initialize the connection when the host starts the recording, which also works just fine. The recorded files, however, at times have issues.
The possible cause
Usually the host's recording is great, with audio and video working as they should. The client video shows problems, with the video literally jumping when played back (in a number of applications such as VLC, KMPlayer, Adobe's own media player, etc.). I tried to debug the situation and it seems to be directly related to the audio, though how this happens I'm not sure. When I did tests with no recorded audio, both videos played back fine.
The problem
I'll try to explain what happens in the client video: the video plays back normally, with the audio working fine too. But at seemingly random points, playback suddenly skips seconds of the video (and audio), so a one-minute video sometimes lasts mere seconds (even though the playback bar and such show that the video in fact lasts a minute).
I have not found any logic in how it skips (some videos show only 4 or 5 gaps of a few seconds each, others jump 20-30 seconds ahead), so I'm assuming it's random.
Scenarios
I have tested multiple scenarios, and the problem is consistent enough (as in, it occurs every time, just not when I don't record audio). I have used a local (developer edition) FMS server and the hosted Influxis service, and I have tried two local computers, one local machine (using 2 webcams), and one local plus one (really) remote computer. All setups show the same issues with the recorded (client) video.
In closing
I'm not sure what details you need, so please, ask me for anything you might need to help me find a solution to this. I have searched and debugged like there was no tomorrow, and haven't been able to figure out what is causing this.
Many, many thanks in advance!
-Dave
I've had some pretty extensive experience with FMS, and specifically with Influxis... I also recorded both audio and video, as the system I built needed it for security reasons. However, in all cases the video and audio were recorded on the server, not on the same machine as either client.
The issues I've seen required a vast amount of tweaking of both audio and video quality to find the correct mix for optimal results. That's where I'd start: adjust quality down and see where that gets you.
For a test environment, I'd suggest using different machines so the CPU doesn't become your issue, as the actual environment would have the clients on different boxes with more CPU.
Contact me with additional information, etc. I and several friends have a system working and I'm happy to help.
Looked over my notes: we were able to talk to Influxis about the lagging audio, and there were a couple of server settings they played with, AND it came down to our frames-per-second and other quality settings...
I'm not sure if my problem was the same as yours, but we found a solution.
The Flash piece needed to set the silence level (setSilenceLevel) such that it didn't insert audio packets signalling no audio. Those audio packets caused recording problems that produced random skips and audio drops.
Dave, have you ruled out latency? Are the computers with both cams connected to a high-speed network? I would also check the processor utilisation on the machine that is doing the recording.

Why do requests and responses get lost?

Even on big-time sites such as Google, I sometimes make a request and the browser just sits there. The hourglass will turn indefinitely until I click again, after which I get a response instantly. So, the response or request is simply getting lost on the internet.
As a developer of ASP.NET web applications, is there any way for me to mitigate this problem, so that users of the sites I develop do not experience this issue? If there is, it seems like Google would do it. Still, I'm hopeful there is a solution.
Edit: I can verify, for our web applications, that every request actually reaching the server is served in a few seconds even in the absolute worst case (e.g. a complex report). I have an email notification sent out if a server ever takes more than 4 seconds to process a request, or if it fails to process a request, and have not received that email in 30 days.
It's possible that a request made from the client took a particular path which happened to not work at that particular moment. These things are unavoidable; they're simply a result of the internet, which is built on unreliable components and over which TCP can only provide a certain kind of guarantee.
As someone else said: make sure that when a request hits your server, you'll be ready to reply. Everything else is out of your hands.
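One small thing you can do on the client, though, is not leave the user staring at the hourglass: time the request out and retry it, which mirrors what the user is doing when they click again. A sketch; the attempt count and 5-second timeout are arbitrary choices, not a recommendation:

```typescript
// Sketch: fetch with a timeout and a couple of automatic retries, so a
// request lost in transit is re-issued instead of hanging forever.
async function fetchWithRetry(url: string, attempts = 3, timeoutMs = 5000): Promise<Response> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
      return await fetch(url, { signal: controller.signal });
    } catch (err) {
      lastError = err; // timed out or network error: try again
    } finally {
      clearTimeout(timer);
    }
  }
  throw lastError;
}
```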
They get lost because the internet is a big place and sometimes packets get dropped or servers get overloaded. To give your users the best experience make sure you have plenty of hardware, robust software, and a very good network connection.
You cannot control the pipe from the client all the way to your server. There could be network connectivity issues anywhere along the pipeline, including from your PC to your ISP's router which is a likely place to look first.
The bottom line is if you are having issues bringing Google.com up in your browser then you are guaranteed to have the same issue with your own web application at least as often.
That's not to say an ASP application cannot generate the same sort of downtime experience completely on its own... Test often and code defensively are the key phrases to keep in mind.
Let's not forget browser bugs. They aren't nearly perfect applications themselves...
This problem/situation isn't only ASP-related; it covers the whole concept of keeping your apps up, and it's informally called the "five nines" or "99.999% availability".
The Wikipedia article is here.
If you look up the five nines you'll find tons of useful information, which you can apply as needed to your apps.

Is it possible to downsample an audio stream at runtime with Flash or FMS?

I'm no expert in audio, so if any of you folks are, I'd appreciate your insights on this.
My client has a handful of MP3 podcasts stored at a relatively high bit rate, and I'd like to be able to serve those files to her users at "different" bit rates depending on that user's credentials. (For example, if you're an authenticated user, you might get the full, unaltered stream, but if you're not, you'd get a lower-bit-rate version -- or at least a purposely tweaked lower-quality version than the original.)
Seems like there are two options: downsampling at the source and downsampling at the client. In this case, knowing of course that the source stream would arrive at the client at a high bit rate (and that there are considerations to be made about that, which I realize), I'd prefer to alter the stream at the client somehow, rather than on the server, for several reasons.
Is doing so possible with the Flash Player and ActionScript alone, at runtime (even with a third-party library), or does a scenario like this one require a server-based solution? If the latter, can Flash Media Server handle this requirement specifically? Again, I'd like to avoid using FMS if I can, since she doesn't really have the budget for it, but if that's the only option and it's really an option, I'm open to considering it.
Thanks in advance...
Note: Please don't question the sanity of the request -- I realize it might sound a bit strange, but the requirements are what they are. In that light, for purposes of answering the question, you can ignore the source and delivery path of the bits; all I'm really looking for is an explanation of whether (and ideally how) a Flash client can downsample an MP3 audio stream at runtime, irrespective of whether the audio's arriving over a network connection or being read directly from disk. Thanks much!
I'd prefer to alter the stream at the client somehow, rather than on the server, for several reasons.
Please elucidate the reasons, because resampling on the client end would normally be considered crazy: wasting bandwidth sending the higher-quality version to a user who cannot hear it, and risking a canny user ripping the higher-quality stream as it comes in over the network.
In any case the Flash Player doesn't give you the tools to process audio, only play it.
You shouldn't need FMS to process audio at the server end. You could have a server-side script that loads newly uploaded podcasts and saves them back out as lower-bitrate files, which could then be served to lowly users via a normal web server. For Python, see e.g. PyMedia or py-lame; even a shell script using lame or ffmpeg from the command line should be pretty easy to pull off.
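For instance, a minimal server-side sketch that shells out to ffmpeg to produce a lower-bitrate copy of each uploaded podcast; it assumes ffmpeg is installed and on the PATH, and the file paths and 64 kbps target are illustrative only:

```typescript
// Sketch: create a low-bitrate MP3 copy of an uploaded podcast with ffmpeg.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);

async function makeLowBitrateCopy(input: string, output: string): Promise<void> {
  // -y overwrites an existing output file; -b:a 64k re-encodes at roughly 64 kbps
  await run("ffmpeg", ["-y", "-i", input, "-b:a", "64k", output]);
}

// e.g. makeLowBitrateCopy("podcast.mp3", "podcast-low.mp3")
```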
If storage is at a premium, have you looked into AAC audio? I believe Flash 9 and 10 on desktop browsers will play it. AAC in my experience takes only about half the size of a comparable MP3 (i.e. an 80 kbps AAC will sound about the same as a 160 kbps MP3).
As for playback quality, if I recall correctly there are audio playback settings in the Publish Settings section of the Flash editor. Whether or not the playback bitrate can be changed at runtime is something I'm not sure of.
