I have a streaming server (for pushing data, not video) set up with GraniteDS, and it works great.
I have to include multiple swf files in a web page. Each of these swf files has a data table that displays streaming data (this is a specific requirement, so I really can't combine all the data tables into one huge data table/swf file). All the swf files, however, connect to the same Gravity channel/streaming endpoint.
How many connections are there from the web page to the streaming server? Does each swf file start a new streaming connection? Or do they all share the same connection, since they are just connecting to a single channel?
Regards,
Ravi.
Ah, very good question grasshoppa.
Essentially, each one of them has its own dedicated connection. So if you have 6 swfs, each one has a connection to the streaming server: 6 connections. The problem with this is that if you're using RTMPT, your browser might block (or cycle) the extra connections, since there's a per-domain limit (IE used to have a limit of 2 connections per domain; Firefox's is 10, I believe).
The question, however, is: are they all getting streaming data at the same time? Is the data different from swf to swf? One possible solution would be to make one swf the 'main' swf, which connects to the service, gets all the data, and sends it to the other swfs, either with JavaScript or using LocalConnection (a JavaScript sketch follows below).
But, I don't know enough about your specs or why you have multiple swfs in the first place...
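If you go the JavaScript route, the browser-side glue could look roughly like this (a minimal sketch in TypeScript; the function names, callback names, and element ids are all hypothetical). It assumes the main swf calls ExternalInterface.call("dispatchStreamData", payload) on each update, while every child swf has exposed a method via ExternalInterface.addCallback("receiveData", ...):

    // Page-level dispatcher between the 'main' swf and its siblings.
    const childSwfIds = ["swf2", "swf3", "swf4"]; // ids of the <object>/<embed> tags

    function dispatchStreamData(payload: string): void {
      for (const id of childSwfIds) {
        // addCallback exposes receiveData as a method on the swf's DOM element
        const swf = document.getElementById(id) as any;
        if (swf && typeof swf.receiveData === "function") {
          swf.receiveData(payload);
        }
      }
    }

    // ExternalInterface.call resolves names on window, so export it there.
    (window as any).dispatchStreamData = dispatchStreamData;

That way only the main swf holds a streaming connection, and the other swfs just consume what it pushes.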
After learning about some great features of WebRTC, I am thinking of using WebRTC one-to-one audio/video calls in my web application. The application serves many organizations/entities of a given category, which can register and record many entries daily for their internal work and about their clients. The clients of these individual organizations/entities also have access to the web application to see their details.
The purpose of using WebRTC is communication between clients and organizations, and also daily inquiries from new people asking these organizations about products and services.
While going through articles on Google etc., I found that broadcasting, or one-to-many calling, requires very high bandwidth per user if we don't make use of a media server.
So what is the case for one-to-one calls?
Will it affect the performance of the web application, or cause any critical situation, if several users are making one-to-one audio/video calls to each other simultaneously as a routine?
The number of users will be very large, and users will be recording many entries daily as their routine work. That load is manageable and the application will run smoothly, but I am not sure about the newer WebRTC side of things. Will it require a very high-end hosting plan? Is using WebRTC suitable or advisable for this scenario?
WebRTC is by its nature peer-to-peer, meaning that the streaming data is handled CLIENT side. All decoding, encoding, ICE candidate gathering/negotiation, and media encrypting/transmitting happen on the client side, not on the server side. So you will be providing the pages, the client-side JS, and some data exchange (session negotiation signalling), but all in all it is not a huge amount of work. It should be easily handled without having to worry about your host machine being overworked.
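For a sense of scale, here is a minimal sketch of that client-side work (browser TypeScript; sendToSignallingServer is a placeholder for whatever signalling transport you pick, not a real API, and the STUN URL is an example):

    async function startCall(): Promise<void> {
      // Capture happens entirely in the browser.
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

      const pc = new RTCPeerConnection({
        iceServers: [{ urls: "stun:stun.example.com:3478" }], // placeholder server
      });

      // Encoding and transmission of these tracks is also done client side.
      stream.getTracks().forEach((track) => pc.addTrack(track, stream));

      pc.onicecandidate = (e) => {
        if (e.candidate) sendToSignallingServer({ candidate: e.candidate });
      };

      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      sendToSignallingServer({ sdp: pc.localDescription });
    }

    declare function sendToSignallingServer(msg: unknown): void; // your transport here

The only lines that touch your server at all are the sendToSignallingServer calls.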
All that said, here are the only performance concerns that would POSSIBLY affect your hosting server.
Signalling session startup, negotiation, and tear-down. This is very minimal (only some JSON data at the beginning of a session), so it should not be too much of a burden, but be aware that if 1000 sessions start at the same time, you will have a queue of messages to direct to the right parties. How you determine the parties, forward the messages, and what work you do server side can all affect performance. If written smartly (how you store sessions, how you forward messages, etc.), it should not be a terrible burden. This could easily be done with SignalR, since you are on ASP.NET, or with a separate server running Node.js (or the same box, it does not matter) if you so desired; a minimal relay sketch follows after these points.
RTP TURN relay, if needed. This will probably run on a different server (or on the same one as your hosting server if you want). For SOME connections a TURN server is needed, and any production-ready WebRTC solution should take this into account. Here is a good open source TURN server. Bandwidth usage here can be very high, as RTP packets are sent to this server and then forwarded to the peer in the connection.
If you are recording the streams, you may have increased hosting traffic, depending on how you implement it. Firefox supports client-side recording of the streams, but Chrome does not (they say it is currently in the works). You could use existing JS libraries to record the feeds client side and then push them anywhere you want. You could also push all the data through a media server that will mux, demux, and forward the data to be recorded anywhere you like. The Janus-Gateway videoroom is a good lightweight example of a media server.
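As for the signalling relay mentioned above, here is a minimal Node.js sketch (TypeScript with the 'ws' package; the register/to message fields are my own convention for the sketch, not part of any standard):

    import { WebSocketServer, WebSocket } from "ws";

    const wss = new WebSocketServer({ port: 8080 });
    const clients = new Map<string, WebSocket>();

    wss.on("connection", (socket) => {
      socket.on("message", (raw) => {
        const msg = JSON.parse(raw.toString());
        if (msg.register) {
          // A client's first message names it so peers can address it.
          clients.set(msg.register, socket);
          return;
        }
        // Relay SDP offers/answers and ICE candidates to the addressed peer.
        const peer = clients.get(msg.to);
        if (peer && peer.readyState === WebSocket.OPEN) {
          peer.send(JSON.stringify(msg));
        }
      });
    });

Note that the server never looks inside the SDP or the candidates; it just forwards opaque JSON, which is why the load stays small.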
Client side is a different story.
There are higher-level concerns in the JavaScript. If you use one of the recording JS libraries, this is especially evident, as they do canvas captures numerous times a second, which is a heavy hit and can degrade the user experience.
CPU utilization by the browser will increase as the quality of the video being streamed increases. This is rather obvious as HD video frames take more CPU power to encode/decode than SD frames.
Client-side bandwidth usage can also be an issue. Chrome and Firefox try to modify the bitrate of each video/audio feed dynamically, but the video bitrate can go all the way up to 2 Mbps. You can cap this in Chrome (by adding a bandwidth attribute to the SDP, as sketched below) but not yet in Firefox (last I checked).
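The Chrome-side cap works by injecting a b=AS (kbps) line into the video section of the SDP before calling setLocalDescription. A sketch (the 500 kbps figure is just an example value):

    function capVideoBitrate(sdp: string, kbps: number): string {
      const lines = sdp.split("\r\n");
      const out: string[] = [];
      let inVideo = false;
      for (const line of lines) {
        out.push(line);
        if (line.startsWith("m=")) inVideo = line.startsWith("m=video");
        // b=AS belongs right after the c= line of the media section.
        if (inVideo && line.startsWith("c=")) out.push("b=AS:" + kbps);
      }
      return out.join("\r\n");
    }

    // Usage before setLocalDescription: offer.sdp = capVideoBitrate(offer.sdp, 500);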
I was designing a video management system for multiple clients, in which the recording server records video clips to its local drive and then, whenever a client wants to play back these recorded videos, makes them available to that client. There seem to be two approaches to this:
Use a Windows file-sharing protocol like SMB/CIFS to share the drive with all the clients. Windows permits up to 20 clients; once the drive is shared, every client maps it as a local drive and can open whichever file is to be played back.
Use a streaming server and, based on client requests, stream (HTTP/RTSP streaming) the particular file a client requires (a separate thread listens and streams the required file to each client). In this case, server management becomes tedious as the number of clients increases.
Can anyone let me know the pros and cons of each of these approaches, so that I can design an efficient architecture? Most video management software I have seen seems to use the second approach; are there any specific advantages to doing that?
Regards,
Saurabh Gandhi
I'm trying to write a WP8 app that needs to upload a large amount of data back to my server. My server runs ASP.NET and implements REST using Web API.
I've gotten to the point where I can upload small amounts of data, say 2-5 MB, using a POST and transfer them over to Azure blobs. Now I'm thinking about moving a decent amount of data, say ~40-50 MB, from the phone using the background transfer API described here: http://msdn.microsoft.com/en-us/library/windowsphone/develop/hh202955(v=vs.105).aspx
The phone API supports:
- Over cellular connection: 5 MB
- Over Wi-Fi connection with battery power: 20 MB
- Over Wi-Fi connection with external power: 100 MB
The part that I'm struggling to understand is -
The MSDN KB article recommends that the server implement range requests, which is fair. However, it doesn't say how much can be chunked at a time. Can my upload server config be unbounded for request size?
I would prefer to keep the client as 'dumb' as possible and use the existing transfer APIs on the phone. My concerns are about the performance of my server and how much memory will be available if I start seeing considerable traffic. Can someone give me pointers on server best practices for accepting large amounts of data?
BackgroundTransferRequest does not support the Range header. See here.
Implement your own upload and download client; then you can use Range and also bypass the various size limitations.
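A rough sketch of what "your own client" does differently (TypeScript for illustration; the real WP8 client would be C#, and the upload URL, header handling, and chunk size are all assumptions): each chunk carries a Content-Range header, so the server can append it in order and a failed chunk can be retried without resending the whole file.

    const CHUNK_SIZE = 1024 * 1024; // 1 MB per request keeps server-side buffering bounded

    async function uploadInChunks(file: Blob, url: string): Promise<void> {
      for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
        const end = Math.min(offset + CHUNK_SIZE, file.size) - 1;
        const res = await fetch(url, {
          method: "POST",
          headers: { "Content-Range": `bytes ${offset}-${end}/${file.size}` },
          body: file.slice(offset, offset + CHUNK_SIZE),
        });
        if (!res.ok) throw new Error(`chunk at ${offset} failed: ${res.status}`);
      }
    }

This also addresses the memory worry from the question: the server only ever holds one chunk per request in memory, regardless of total file size.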
I want to set up a video-on-demand server that supports the HTTP protocol. It would be like YouTube: it hosts a lot of videos, and end users can play them from the browser (using Flash or HTML5).
Two quick questions,
For the big video files, should I put them on disk or in memory? How do YouTube and other big video sites do it? I'm not sure whether putting all the videos in memory is too expensive, or whether serving them from disk is too slow.
Is there an open source video hosting server for my purpose? If streaming is supported, that would be great.
Thanks in advance,
George
If you just want an HTML page that links to your video files, no problem, but most browsers will download the entire file before your system even considers playing it.
If you want to stream the files (like YouTube and others do), then you aren't actually using HTTP for the video itself. HTTP is used to get the information about the stream, so your player can stream and play directly without having to download the entire file first.
Streaming video uses RTSP (or some other streaming protocol) for the audio and video data.
The closest the HTTP protocol can get to "streaming" video is to use server push of individual image frames, with each frame flagged to replace the previous frame. Not all browsers can handle this directly; some need an ActiveX control or a Java applet. The original QuickTime did this before the streaming protocols were implemented at the servers.
re: how does YouTube deal with big video files
I suspect they are on disk until they are needed. Moved into memory only as needed. Flushed from memory when no longer needed.
re: is there an open source video server for my purpose
YES! Check out http://www.videolan.org/
-Jesse
Another approach is to use HTTP Live Streaming (HLS). The web server is simply a standard httpd server; the video/audio is preprocessed on the server side into a set of bitrate playlists.
The logic is on the client side, which retrieves the media as a series of roughly 6-second files, based on the bandwidth-appropriate playlist.
So:
- use files, not memory
- there are open source HLS segmenters (ffmpeg); an example invocation is below
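For example, with ffmpeg's HLS muxer (one possible invocation; adjust segment length and codecs to taste):

    # copy the existing streams, cut ~6-second segments, keep all of them in the playlist
    ffmpeg -i input.mp4 -codec copy -hls_time 6 -hls_list_size 0 output.m3u8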
Imagine you have a web site to which you want to send a lot of data, say 40 files totaling the equivalent of 2 hours of upload bandwidth. You expect to have 3 connection losses along the way (think: mobile data connection, WLAN vs. microwave). You can't be bothered to retry again and again; this should be automated. Interruptions should not cause more data loss than necessary, and retrying a complete file is a waste of time and bandwidth.
So here is the question: Is there a software package or framework that
synchronizes a local directory (contents) to the server via HTTP,
is multi-platform (Win XP/Vista/7, MacOS X, Linux),
can be delivered as one self-contained executable,
recovers partially uploaded files after interrupted network connections or client restarts,
can be generated on a server to include authentication tokens and upload target,
can be made super simple to use
or what would be a good way to build one?
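To make the recovery requirement concrete, here is roughly the resume handshake I have in mind (a sketch only, in TypeScript; the endpoints and the JSON shape are hypothetical): the client asks the server how many bytes it already holds, then uploads just the remainder with a Content-Range header.

    async function resumeUpload(file: Blob, name: string): Promise<void> {
      // Server replies with the byte count it has persisted for this file.
      const res = await fetch(`/upload/status?file=${encodeURIComponent(name)}`);
      const { received } = await res.json(); // e.g. { received: 1048576 }
      if (received >= file.size) return; // already complete

      await fetch(`/upload?file=${encodeURIComponent(name)}`, {
        method: "PUT",
        headers: { "Content-Range": `bytes ${received}-${file.size - 1}/${file.size}` },
        body: file.slice(received),
      });
    }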
Options I have found until now:
Neat packaging of rsync. This requires an rsync (server) instance on the server side that is aware of a privilege system.
A custom Flash program. As I understand it, Flash 10 is able to read a local file as a ByteArray (indicated here) and is obviously able to speak HTTP to the originating server. Seen in question 1344131 ("Upload image thumbnail to server, without uploading whole image").
A custom native application for each platform.
Thanks for any hints!
Related work:
HTML5 will allow multiple files to be uploaded, or at least selected for upload, "at once". See here for an example. This is agnostic of the local files and does not offer recovery of a failed upload.
Efficient way to implement a client multiple file upload service basically asks for SWFUpload or YUIUpload (Flash-based multi-file uploaders, otherwise "stupid")
A comment in question 997253 suggests JUpload - I think using a Java applet will at least require the user to grant additional rights so it can access local files
GearsUploader seems great but requires Google Gears - that is going away soon