Monitoring your network while watching a YouTube live stream (http://youtube.com/live/), you can see that a file is being downloaded to your cache, and this file is actually the live stream.
BitGravity has been delivering its live streams the same way for years (check Twit.tv, for example).
Does anyone know what server-side software is used for this, and how someone could achieve it without using Adobe FMS, Wowza or Red5?
These guys have put together an open source video streaming server, so you can look at the source code and see how they did it.
They wrote it in Java.
The current version is a working prototype, which showcases the main ideas. The main design goal is low resource usage.
There can be many ways to implement streaming, and I don't think Google will tell you exactly how they do it, but it can be done even with plain HTTP: just a single response that keeps sending the video data without honoring the "Range" header, so it simply goes on and on.
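As a very rough sketch of that idea (an assumption about how it could be done, not how YouTube actually does it), here is a minimal Java HTTP handler that answers /live with a single, never-ending response and simply ignores any Range header; the openEncoderOutput() helper is hypothetical and stands in for whatever produces the live video bytes:

    import com.sun.net.httpserver.HttpServer;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;

    public class LiveHttpStream {
        public static void main(String[] args) throws IOException {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/live", exchange -> {
                // Ignore any "Range" header: always answer 200 and keep writing.
                exchange.getResponseHeaders().set("Content-Type", "video/MP2T");
                exchange.sendResponseHeaders(200, 0); // 0 = unknown length -> chunked encoding
                try (OutputStream out = exchange.getResponseBody();
                     InputStream encoder = openEncoderOutput()) {
                    byte[] buf = new byte[64 * 1024];
                    int n;
                    while ((n = encoder.read(buf)) != -1) {
                        out.write(buf, 0, n); // the "file" the client caches just keeps growing
                        out.flush();
                    }
                }
            });
            server.start();
        }

        // Hypothetical helper: in a real setup this would read from the live encoder.
        private static InputStream openEncoderOutput() {
            return InputStream.nullInputStream();
        }
    }

Because the response never declares a total length, the client just sees one file that keeps growing, which matches what you observe in the browser cache.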
I need to implement a screen-sharing application using BFCP but I am not able to find much, so can someone please describe or explain briefly how this can be achieved? There is very little information about this on the internet, and I'm not sure why. SO doesn't even have a tag for BFCP.
I have gone through the following links from Cisco and also found an outdated library for implementing it. Any help is greatly appreciated.
Is there any other way to share the screen in a video SIP call?
SIP supports multiple streams. The number of SDP streams is unlimited, as long as both ends support the new one.
It is certainly possible to send the screen-sharing content as a video stream and send any related extra information (like mouse moves etc.) via SIP MESSAGE. However, such a setup will require a SIP expert on the team.
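For illustration only, an SDP offer for such a call could carry two video m-lines and label the second one as screen content using the a=content attribute from RFC 4796; the addresses, ports and payload types below are placeholders:

    v=0
    o=- 0 0 IN IP4 192.0.2.1
    s=Call with screen sharing
    c=IN IP4 192.0.2.1
    t=0 0
    m=video 49170 RTP/AVP 96
    a=rtpmap:96 H264/90000
    a=content:main
    m=video 49172 RTP/AVP 97
    a=rtpmap:97 H264/90000
    a=content:slides

The far end would render the stream marked content:slides as the shared screen, while anything like mouse coordinates could travel separately, e.g. in SIP MESSAGE bodies as suggested above.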
I am starting to work on a project where I need to stream Twitter data using PowerTrack/GNIP, and I have to be honest when I say I am very, very inexperienced when it comes to networks, and I have absolutely no knowledge of HTTP data streams and how they work.
Are there any resources out there that go through all of this in simple terms? I would love to be able to map out the data-streaming process in my head before I start looking at APIs etc.
Thanks
Take a look at the following two resources, which give a good overview of video streaming. Video streaming probably has more background material available, and it should help you understand the concepts:
https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html
http://www.jwplayer.com/blog/what-is-video-streaming/
In very simple terms, streaming breaks a large file or live stream into chunks, and sends those chunks one after another to a client (e.g. browser). The client can generally request a start point for content which is not a live stream. In the background this generally works by the client sending requests for each individual chunk (rather than just one request with multiple responses).
The advantage of the multiple-request approach is that you know the client is actually still interested (e.g. the user has not browsed to another page etc.), and for video and audio the client can dynamically request different-bandwidth files depending on the current network connection - see: http://en.wikipedia.org/wiki/Adaptive_bitrate_streaming.
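As a hedged sketch of that chunk-by-chunk behaviour (the URL and chunk size are invented for the example), a client can walk through a file with HTTP Range requests like this in Java:

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ChunkedDownload {
        public static void main(String[] args) throws Exception {
            String url = "http://example.com/video.mp4"; // placeholder URL
            int chunkSize = 1024 * 1024;                 // 1 MB per request
            long offset = 0;
            while (true) {
                HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
                // Ask only for the next slice; a server that supports ranges answers 206.
                conn.setRequestProperty("Range", "bytes=" + offset + "-" + (offset + chunkSize - 1));
                int status = conn.getResponseCode();
                if (status != HttpURLConnection.HTTP_PARTIAL && status != HttpURLConnection.HTTP_OK) {
                    break; // out of range or refused: stop requesting
                }
                long received = 0;
                try (InputStream in = conn.getInputStream()) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        received += n; // hand buf off to the player/decoder here
                    }
                }
                if (received < chunkSize) {
                    break;             // short read: reached the end of the file
                }
                offset += received;
            }
        }
    }

An adaptive-bitrate player does essentially the same thing, except that between chunks it can switch to a different-bitrate rendition of the same content.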
Twitter does have a streaming overview page too, but you have probably already seen this:
https://dev.twitter.com/streaming/overview
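The streaming APIs work a bit differently from the video case above: the server keeps one HTTP response open and pushes one record (usually a line of JSON) at a time. As a hedged sketch, with a made-up endpoint and authentication left out, reading such a feed looks like this:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class StreamingFeedReader {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint; a real PowerTrack/GNIP URL also needs authentication.
            URL url = new URL("https://stream.example.com/tweets");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setReadTimeout(0); // 0 = no read timeout: the response intentionally never ends
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    if (!line.isEmpty()) {
                        System.out.println("One JSON record: " + line); // parse/store it here
                    }
                }
            }
        }
    }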
Abstract: There is a page with a player that loads an audio file and plays it. The player used on the web page is jwplayer. I need to find a way to determine whether the audio file is being streamed to the player or not.
Background: In my research I found that if I use an nginx header like X-Accel-Redirect, the file will be streamed. I have set up the web server as an nginx + Apache combination (nginx is a reverse proxy for Apache), and after that I pointed jwplayer at the mp3 file, and it is working. I mean I am able to click anywhere on the audio timeline and it immediately starts playing sound. But since I haven't set that header yet, and given that the player already works, I need to check and know for sure.
Some of my own thoughts: jwplayer itself supports some kind of buffering, so I have no idea whether it just downloads the mp3 file I am testing this on, or whether it receives a stream and plays it back.
Is there a way to check and know for sure? The only idea I have is to check the access logs, but I don't know what to look for, or whether I need a special log format to see the required data.
While I was researching the issue I ran into some download-related topics and something about HTTP headers with "Range" in them, but I am not sure whether that relates to streaming or not.
Please advise.
From the point of view of the server, there is no difference between download and streaming. A server just sends bits; what happens to those bits later is unknown. What you need is a player that sends reports back to the server or to a logging service such as Mixpanel.
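That said, one thing you can check from the outside is whether your server honours Range requests, because that is what lets jwplayer jump to any point on the timeline instantly. A small hedged check in Java (the URL is a placeholder for your mp3):

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RangeSupportCheck {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://example.com/audio.mp3"); // placeholder: your mp3 URL
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestProperty("Range", "bytes=0-1023"); // ask for the first KB only
            System.out.println("Status: " + conn.getResponseCode());
            System.out.println("Content-Range: " + conn.getHeaderField("Content-Range"));
            // 206 Partial Content = the server supports byte ranges (seekable playback);
            // 200 OK = it ignored the Range header and would send the whole file.
        }
    }

In the access logs the same thing typically shows up as several 206 responses with small byte counts for one playback session, rather than a single 200 response for the full file size.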
I have a requirement to build a Flash program or something like that (not necessarily Flash; it can be JavaScript or something in ASP.NET) that allows me to record audio on the client side of a web app and save it to a file on the server side.
I've been searching a lot on Google, and all I've found are old questions, with no answers that address my problem.
Please I need help!
I've found this, but the only thing it does is recognize the microphone. I need the ability to save the audio file. By the way, the server is implemented in ASP.NET.
Possible Duplicate (when tagged with Flash)
How can I record audio using Action script then upload it to server?
The static function Microphone.getMicrophone() returns a reference to a Microphone object for capturing audio. To begin capturing the audio, you must attach the Microphone object to a NetStream object (see NetStream.attachAudio()). There's at least one example in the LiveDocs. Start at flash.media.Microphone.
via #aaaidan
This is like asking everybody to do your entire homework for you.
You need to break the problem down into smaller achievable pieces/goals.
Example:
Record audio
Send to web server
Now you'll most likely get better results when you google it.
I want to record voice online, and I guess I need to use FMS or Red5, but I don't know how to use Red5 with ASP.NET. This is actually my first attempt at handling such a thing, and I am currently a .NET developer.
So could someone please show me a way to handle this and how to use Red5 with ASP.NET?
Thanks in advance.
This is a nice page with very good information about Red5 and ASP.NET: http://www.aspnetajaxchat.com/Deployment_Guide.pdf
http://www.freelancer-job.com/blog/2008/08/13/flash-aspnet-coder-to-integrate-red5-based-audiovideo-chat-module-to-aspnet-website-by-zukinet/
Some more information is available in: are there any ASP.NET with Voice Recording sample codes?
I have successfully written an ASP.NET application to stream multiple users' P2P video using Red5.
Integrating Red5 is actually simple. Once you've got it working on your server/VM, all you have to do is install the oflaDemo, and then you can write a player/streamer in Flash. You just have to set up the NetConnections: one for the incoming stream and another for the outgoing stream. Then you add mic & camera capture and attach them to the outgoing stream. If you want to make your player/streamer more robust, you can use a combination of JavaScript and a web service (AJAX) to control what streams where.
You weren't very detailed about what you wanted to do, otherwise I could probably have assisted you further.
For example code, go to
http://code.google.com/p/red5/source/browse/#svn%2Fflash%2Ftrunk%253Fstate%253Dclosed