CIFS/SMB vs HTTP/RTSP streaming to share video files - http

I am designing a video management system for multiple clients in which a recording server records video clips to its local drive; whenever a client wants to play back one of these recordings, the file must be made available to that client. There seem to be two approaches to this:
Use a Windows file-sharing protocol such as SMB/CIFS to share the drive with all the clients. Windows permits up to 20 concurrent clients; once the drive is shared, every client maps it as a local drive and opens whichever file it wants to play back.
Run a streaming server and, based on client requests, stream (over HTTP or RTSP) the particular file each client asks for, with a separate thread listening and streaming for each client. In this case server management becomes tedious as the number of clients grows.
Can anyone outline the pros and cons of each of these approaches, so that I can design an efficient architecture? Most video management software I have seen seems to use the second approach; is there any specific advantage to doing that?
Regards,
Saurabh Gandhi

Related

Does integrating WebRTC one to one audio/video calls affect the performance of web application

After learning about some great features of WebRTC, I am considering adding one-to-one audio/video calls to my web application. The application serves many organizations/entities of a category, which can register and record several entries daily for their internal work and about their clients. The clients of these individual organizations/entities also have access to the web application to view their details.
The purpose of using WebRTC is communication between clients and organizations, and also for daily inquiries from new people about products and services.
While reading articles on Google etc., I found that broadcasting (one-to-many) calls require very high bandwidth on the users' side unless a media server is used.
So what is the case for one to one calls?
Will it affect the performance of the web application, or cause any critical situation, if several users routinely make simultaneous one-to-one audio/video calls?
The number of users will be very large, and users will record several entries daily as routine work. That load is manageable and the application runs smoothly, but I am not sure about WebRTC: will it require a very high hosting plan? Is using WebRTC suitable or advisable for this scenario?
WebRTC is by nature peer-to-peer, meaning the streaming data is handled client side. All decoding, encoding, ICE candidate gathering/negotiation, and media encrypting/transmitting happen on the client side, not the server side. You will be providing the pages, the client-side JS, and some data exchange (session negotiation signalling), but all in all it is not a huge amount of work and should be easily handled without worrying about your host machine being overworked.
All that said, here are the only performance concerns that could possibly affect your hosting server.
Signalling: session startup, negotiation, and tear-down. This is very minimal (only some JSON data at the beginning of a session) and should not be much of a burden, but be aware that if 1000 sessions start at the same time you will have a queue of messages to direct to the relevant parties. How you identify the parties, how you forward the messages, and what work you do server side all affect performance, but if written smartly (how you store sessions, how you forward messages, etc.) it should not be a terrible burden. This could easily be done with SignalR, since you are on ASP.NET, or with a separate server running Node.js (or the same box, it does not matter) if you so desired.
RTP TURN relay, if needed. This will probably run on a different server (or the same one as your hosting server if you want). For some connections a TURN server is required, and any production-ready WebRTC solution should take this into account; good open source TURN servers exist. Bandwidth usage here can be very high, as RTP packets are sent to this server and then forwarded to the peer in the connection.
If you are recording the streams, you may see increased hosting traffic depending on how you implement it. Firefox supports client-side recording of the streams, but Chrome does not (they say it is currently in the works). You could use existing JS libraries to record the feeds client side and then push them wherever you want, or you could push all the data through a media server that will mux, demux, and forward the data to be recorded anywhere you like. The Janus-Gateway videoroom is a good lightweight example of a media server.
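The signalling concern above (storing sessions, forwarding messages) amounts to a small relay. The following is a hedged, in-memory sketch in Python; the class and method names are invented for illustration, and a real deployment would sit behind WebSockets or SignalR rather than plain queues.

```python
# Illustrative signalling relay: stores one outbound queue per registered
# user and forwards SDP offers/answers and ICE candidates between peers.
import json
import queue

class SignallingRelay:
    def __init__(self):
        self.clients = {}  # user id -> outbound message queue

    def register(self, user_id):
        """Register a user and return the queue their messages arrive on."""
        self.clients[user_id] = queue.Queue()
        return self.clients[user_id]

    def forward(self, sender, recipient, payload):
        """Route one signalling message (SDP offer/answer or ICE candidate)."""
        if recipient not in self.clients:
            raise KeyError(f"unknown peer {recipient}")
        self.clients[recipient].put(json.dumps({"from": sender, **payload}))

# Example session startup: alice sends bob an offer.
relay = SignallingRelay()
alice_inbox = relay.register("alice")
bob_inbox = relay.register("bob")
relay.forward("alice", "bob", {"type": "offer", "sdp": "v=0 ..."})
```

Note that the server only touches these small JSON envelopes; the media itself never passes through it, which is why the answer above says signalling is a minimal load.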
Client side is a different story.
There are higher-level concerns in the JavaScript. If you use one of the recording JS libraries this is especially evident, as they perform canvas captures many times a second, which is a heavy hit and degrades the user experience.
CPU utilization by the browser will increase as the quality of the video being streamed increases. This is rather obvious as HD video frames take more CPU power to encode/decode than SD frames.
Client-side bandwidth usage can also be an issue. Chrome and Firefox try to adjust the bitrate of each video/audio feed dynamically, but the video bitrate can go all the way up to 2 Mbps. You can cap this in Chrome (by adding an attribute to the SDP), but not yet in Firefox (last I checked).
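The SDP attribute alluded to above is the `b=AS:` bandwidth line. A minimal sketch of the munge, shown here in Python for brevity (in a browser you would do the same string edit in JS before `setLocalDescription`); the 500 kbps cap is an arbitrary example value:

```python
# Insert a b=AS (Application-Specific maximum, in kbps) line into the
# video media section of an SDP blob to cap the video bitrate in Chrome.
def cap_video_bitrate(sdp: str, kbps: int) -> str:
    out = []
    for line in sdp.split("\r\n"):
        out.append(line)
        if line.startswith("m=video"):
            out.append(f"b=AS:{kbps}")  # placed right after the m=video line
    return "\r\n".join(out)
```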

Live Video Streaming asp.net

I have a scenario with two servers (server 1 and server 2). A web cam/CCTV cam (or any kind of camera) feeds video to server 1; I assume getting the video to server 1 is handled by the camera setup. This live video is then somehow sent to server 2, which eventually broadcasts it to the clients.
The thing is, I need a lead on how to start with the whole process and where a media server's role kicks in (if one is actually needed). I have no idea whatsoever about the whole process and am having trouble making relevant searches. Any advice or help would be much appreciated. Thanks in advance.
Is there any specific reason why two servers are involved? You could easily stream video from server 1, where you get the camera feed.
You could either use a streaming media server (Like adobe media server) or use a standalone application like Windows Media Encoder to give out the live stream for users to view.
Does your server 1 have enough bandwidth to stream the video to multiple users? If 100 people view your stream at 1 Mbps, you will need around 100 Mbps of bandwidth on your server at a minimum, or the video streams may suffer. If you cannot arrange that much bandwidth on your server, you will have to use a CDN-hosted streaming server (there are lots of service providers).
If only a few users will be viewing your stream simultaneously, it may be fine with your existing setup.
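The bandwidth arithmetic above is simply viewers times per-stream bitrate; a tiny helper makes it explicit (the 20% headroom factor is my own assumption, not from the answer):

```python
# Back-of-the-envelope uplink estimate: viewers x per-stream bitrate,
# with optional headroom for protocol overhead and bitrate spikes.
def uplink_mbps(viewers: int, stream_mbps: float, headroom: float = 1.2) -> float:
    return viewers * stream_mbps * headroom
```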
If you are following the two server setup as you mentioned, follow these steps to broadcast.
Set up Adobe Media Server on server 2 (the trial will do for up to 10 simultaneous connection streams).
Install Adobe Media Encoder on server 1, where the video stream is available.
From server 1, push the video stream via Adobe Media Encoder to server 2 (set up a publishing point for live first).
Get the streaming link from the AMS installed on server 2, embed it in any compatible player (Flowplayer or JW Player), and put it on a webpage for public access.
Hope this helps.

Uploading a large file from phone

I'm trying to write a WP8 app that needs to upload a large amount of data back to my server. My server runs on ASP.net and implements REST using WebAPI.
I've gotten to a point where I can upload small amounts of data, say 2-5 MB, using a POST and transfer them over to Azure blobs. Now I'm thinking about moving a decent amount of data, say ~40-50 MB, from the phone using the background transfer API described here: http://msdn.microsoft.com/en-us/library/windowsphone/develop/hh202955(v=vs.105).aspx
The phone API supports:
Over cellular connection - 5 MB
Over Wi-Fi connection with battery power - 20 MB
Over Wi-Fi connection with external power - 100 MB
The part that I'm struggling to understand is -
The MSDN kb article recommends that the server implement range requests, which is fair. However, it doesn't say how much could be chunked at a time. Can my upload server config be unbounded for request size?
I would prefer to keep the client as 'dumb' as possible and use the existing transfer APIs on the phone. My concerns are around performance of my server and how much memory would be available on the server if I start seeing considerable traffic on the server. Can someone give me pointers for server best practices to accept large amounts of data?
BackgroundTransferRequest does not support the Range header; see here.
Implement your own upload and download client instead; then you can use Range and also bypass the various size limitations.
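A language-agnostic sketch of that "roll your own client" advice: split the file into chunks and tag each with a Content-Range header, so an interrupted transfer can resume at the last acknowledged byte. Shown in Python for brevity (the phone side would be C#); the chunk size and the idea of a resumable endpoint are assumptions, not part of the phone API.

```python
# Yield (start, end, header) tuples describing each chunk of a file of
# `total_size` bytes, for a resumable Content-Range chunked upload.
def chunk_ranges(total_size: int, chunk_size: int):
    for start in range(0, total_size, chunk_size):
        end = min(start + chunk_size, total_size) - 1  # inclusive last byte
        yield start, end, f"bytes {start}-{end}/{total_size}"

# Each chunk is then POSTed with its Content-Range header; after a dropped
# connection the client asks the server how many bytes it already holds and
# resumes from that offset instead of restarting the whole file.
```

Keeping chunks bounded (a few MB each) also answers the server-memory concern: the server never has to buffer more than one chunk per in-flight request.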

Efficient reliable incremental HTTP multi-file (or whole directory) upload software

Imagine you have a web site to which you want to send a lot of data; say, 40 files totalling the equivalent of 2 hours of upload bandwidth. You expect 3 connection losses along the way (think mobile data connections, WLAN vs. microwave). You can't be bothered to retry again and again; this should be automated. Interruptions should not cause more data loss than necessary, and retrying a complete file is a waste of time and bandwidth.
So here is the question: Is there a software package or framework that
synchronizes a local directory (contents) to the server via HTTP,
is multi-platform (Win XP/Vista/7, MacOS X, Linux),
can be delivered as one self-contained executable,
recovers partially uploaded files after interrupted network connections or client restarts,
can be generated on a server to include authentication tokens and upload target,
can be made super simple to use
or what would be a good way to build one?
Options I have found until now:
Neat packaging of rsync. This requires an rsync (server) instance on the server side that is aware of a privilege system.
A custom Flash program. As I understand, Flash 10 is able to read a local file as a bytearray (indicated here) and is obviously able to speak HTTP to the originating server. Seen in question 1344131 ("Upload image thumbnail to server, without uploading whole image").
A custom native application for each platform.
Thanks for any hints!
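Whatever tool is chosen, the resume logic in requirements 1 and 4 boils down to comparing what the server already holds against the local directory. A hedged sketch of that planning step (the manifest format and function name are invented; a real client would fetch the remote manifest over HTTP before uploading):

```python
# Given two manifests mapping filename -> bytes present, return the list of
# (filename, resume_offset, last_byte) uploads still needed. Files the
# server already holds in full are skipped; partial files resume mid-way.
def sync_plan(local: dict, remote: dict):
    plan = []
    for name, size in sorted(local.items()):
        have = remote.get(name, 0)  # 0 if the server has never seen the file
        if have < size:
            plan.append((name, have, size - 1))
    return plan
```

Each planned entry then becomes a ranged HTTP upload, so a dropped connection costs at most one partial chunk rather than the whole file.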
Related work:
HTML5 will allow multiple files to be uploaded or at least selected for upload "at once". See here for example. This is agnostic to the local files and does not feature recovery of a failed upload.
Efficient way to implement a client multiple file upload service basically asks for SWFUpload or YUIUpload (Flash-based multi-file uploaders, otherwise "stupid")
A comment in question 997253 suggests JUpload - I think using a Java applet will at least require the user to grant additional rights so it can access local files
GearsUploader seems great but requires Google Gears - that is going away soon

How do I connect a pair of clients together via a server for an online game?

I'm developing a multi-player game and I know nothing about how to connect one client to another via a server. Where do I start? Are there any whizzy open source projects that provide the communication framework into which I can drop my message data, or do I have to write a load of complicated multi-threaded socket code? Does the picture change at all if the clients are running on phones?
I am language agnostic, although ideally I would have a Flash or Qt front end and a Java server, but that may be being a bit greedy.
I have spent a few hours googling, but the whole topic is new to me and I'm a bit lost. I'd appreciate help of any kind - including how to tag this question.
If latency isn't a huge issue, you could just implement a few web services to do message passing. This would not be as slow as you might think, and it is easy to implement across languages. The downside is that the client has to poll the server for updates, so it could take a few hundred ms to get from one client to another.
You can also use the built-in Flex messaging interface, which has provisions for client-to-client interactions.
Typically game engines send UDP packets because of latency. The fact is that TCP is just not fast enough and reliability is less of a concern than speed is.
Web services would compound the latency issues inherent in TCP due to additional overhead. Further, they would eat up memory depending on number of expected players. Finally, they have a large amount of payload overhead that you just don't need (xml anyone?).
There are several ways to go about this. One way is centralized messaging (client/server). This means that you would have a java server listening for udp packets from the clients. It would then rebroadcast them to any of the relevant users.
A second way is decentralized (peer to peer). A client registers with the server to state what game / world it's in. From that it gets a list of other clients in that world. The server maintains that list and notifies the other clients of people who join / drop out.
From that point forward clients broadcast udp packets directly to the other users.
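The centralized option described above can be sketched in a few lines. This is an illustrative toy in Python (a real game server would add authentication, world/room partitioning, and stale-client eviction); it just remembers every address it hears from and rebroadcasts each packet to the others:

```python
# Toy centralized UDP relay: register each sender on first contact and
# rebroadcast every packet to all other registered clients.
import socket

def run_relay(host="0.0.0.0", port=9999):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    clients = set()
    while True:
        data, addr = sock.recvfrom(2048)
        clients.add(addr)        # register the sender on its first packet
        for peer in clients:
            if peer != addr:     # rebroadcast to everyone else
                sock.sendto(data, peer)
```

The peer-to-peer variant would use the same socket code on the clients, with the server reduced to handing out the address list.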
If you are looking for a high-performance communication framework, take a look at the ACE C++ framework (it has Java bindings).
Official web-site is: http://www.cs.wustl.edu/~schmidt/ACE-overview.html
You could also look into Flash Media Interactive Server or, if you want a Java implementation, Wowza or Red5. Those use AMF and provide native functionality for SharedObjects, including syncing of the SharedObjects among connected clients.
Those aren't peer-to-peer though (yet; I hear it's coming soon). They use centralized messaging managed by the server.
Good luck
