I have a website that displays videos. It seems to work OK with 5-minute "webm" and "mp4" videos referenced by the HTML5 video tag. But if I ever use long videos, I'm worried that the website would be overwhelmed. The website uses ASP.NET, and I did find an article on the topic of progressive download of large files with ASP.NET. However, I don't know whether fetching small chunks of the file interferes with the user doing a "seek" (dragging the control bar to a point they are interested in, for instance). I also don't know whether it would stream files that don't need to be streamed.
Is there any way to solve the problem of too much video data being sent over the internet at once, and overwhelming either my server or the user's PC?
This should help you: Streaming MP4 video through .NET HTML5
Seeking and buffering are handled by the player itself, and yes, when you seek to a new point in the video it will automatically request just those parts of the video.
There isn't a way to do adaptive streaming of HTML5 videos, but you shouldn't have to worry about seeking in long videos. Modern web servers allow browsers to request chunks of a file, so if a user is watching a very long video and skips right to the middle, the browser will stop downloading from the beginning and start retrieving the video file from the spot they skipped to. Contrast this with the early days of Flash video, when you had to wait for the whole beginning of a video file to download in sequence before you could get to the middle part you wanted to watch.
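If you want to confirm that your server honours these chunked ("byte range") requests, a quick diagnostic is to ask for a small range yourself and look at the status code. This is only a sketch, and the video URL is a placeholder:
var xhr = new XMLHttpRequest();
xhr.open("GET", "/videos/long-video.mp4");
xhr.setRequestHeader("Range", "bytes=0-1023"); // ask for only the first kilobyte
xhr.onload = function () {
    // 206 Partial Content means byte ranges are supported;
    // 200 means the server ignored the Range header and sent the whole file.
    console.log(xhr.status, xhr.getResponseHeader("Content-Range"));
};
xhr.send();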
There are, however, a few things you can do to conserve bandwidth and keep your server happy.
First, if a user has started watching a video and is done with it, you can make sure the browser doesn't continue to download the rest of that large file. For example, this may happen if you're switching through multiple videos on a single page, or if you're using a framework like Ember.js or Backbone to navigate through multiple "pages" without actually pointing your browser to a new URL.
If you pause a video and remove it from the DOM, the browser might still be downloading it. But you can stop that with code like this:
video.src = "";   // clear the source so no further data is requested
video.load();     // this aborts the in-progress download
Also, consider storing your video files on a CDN. That way, the server that's running your ASP.NET code, communicating with your database and handling the rest of your application logic isn't also responsible for serving up those large files. There are other benefits as well, like serving the videos with separate, leaner headers and having servers in multiple locations that are geographically closer to your users.
HTML5 introduces the MediaStream API, which is meant to capture and stream video and audio data, primarily intended to give access to local media devices, e.g. a webcam. But using the MediaStream object combined with XHR, you should be able to provide video streaming without further plugins. Browser support is not widespread right now, and in not-quite-current versions of Firefox and Chromium you have to enable it via a setting (e.g. about:flags in Chromium).
I don't know of any working implementation for streaming video from a server; I have just played around with webcam data recently. I think it is possible (someday even with broad browser support ;-). Be aware that HTML5 consists of many different APIs, most of which you would normally use via JavaScript.
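For what it's worth, the webcam experiment looks roughly like this in current browsers (the "preview" element id is just an example; older builds needed vendor prefixes and the flag mentioned above):
navigator.mediaDevices.getUserMedia({ video: true, audio: false })
    .then(function (stream) {
        var video = document.getElementById("preview");
        video.srcObject = stream; // attach the live MediaStream to the <video> element
        video.play();
    })
    .catch(function (err) {
        console.log("Could not access the webcam:", err);
    });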
See also: http://www.html5rocks.com/en/tutorials/webrtc/basics/?redirect_from_locale=de#toc-mediastream
Related
I'm currently building my portfolio with WordPress, and I have large video files (each between 2.5 GB and 5 GB) to display. YouTube and Vimeo compress the videos quite a lot, and I'm trying to find an alternative to these websites.
I want to use the media library to display the projects, but I wonder if the files will be watchable. Because of their size, I'm afraid the videos will stall to buffer every 10 seconds. Am I right, or will videos that weigh 5 GB play without issue?
Most good-quality internet video delivery services, including YouTube and Vimeo, will actually create multiple different bit rate versions of your video and then serve the highest quality they can, depending on the device watching the video and the current network conditions. You can see more info, and how to see it in action, here:
https://stackoverflow.com/a/42365034/334402
At the moment this is the 'state of the art' in the industry for delivering the best quality video reliably. You don't have to use YouTube or Vimeo, as other hosting services exist, or you can even run your own streaming server using something like Wowza or GStreamer. It is worth being aware that this domain is complex and ever changing, so you may find it easier to use a hosting service.
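As a rough illustration of what playing one of these adaptive streams looks like on the page, here is a sketch using the open-source hls.js library; the library choice, manifest URL and element id are all just examples, not something the services above require:
var video = document.getElementById("player");
var manifestUrl = "https://cdn.example.com/portfolio/master.m3u8";
if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource(manifestUrl);
    hls.attachMedia(video); // hls.js switches bit rates as network conditions change
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
    video.src = manifestUrl; // Safari and iOS can play HLS natively
}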
I asked Google about this, but didn't find anything useful.
Is there a way to create a web app in Flex that will convert video to FLV after the video is uploaded to the server? Can this be done with Flex, and if it can, do I still need to install ffmpeg on the server?
Thanks for any response.
You should convert server side anyway. This will allow you to validate the upload.
I agree with the other post that you probably should perform the conversion server side. Since you don't have a great grasp of the difference between container formats and video formats yet, creating custom client side code might get difficult since you're going to have to get a little closer to the metal.
Given that, there are some people who are doing conversion on the fly client side for certain video formats to FLV within the flash client. MKVLoader is a pretty nifty project where they use the new appendBytes(bytes) method that is now available as of Flash 10.1 on the NetStream object to convert MKV to FLV in the client. This is a really cool trick, but you'll run into problems as soon as you want to support another format.
ffmpeg supports so many formats, it would be silly not to use it. Since you mention that the video will get to your server anyway, you might as well convert it on the server before storing it.
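For illustration only, here is how a server-side process might hand an uploaded file to ffmpeg once it is on disk. Node.js is used purely as an example and the file paths are made up; the same ffmpeg command line can be launched from any server stack:
var execFile = require("child_process").execFile;

// ffmpeg picks the FLV container from the output file extension.
execFile("ffmpeg", ["-i", "uploads/source.mp4", "uploads/converted.flv"],
    function (error) {
        if (error) {
            console.log("Conversion failed:", error);
        } else {
            console.log("FLV file is ready to store and serve");
        }
    });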
I've got a crossdomain.xml file which allows SWFs running on only a certain few domains to download resources from my domain. However, one simple way around this is for a user to download the SWF to their local machine, and run it there (i.e. by double-clicking on it within Windows Explorer, not by running through http://localhost). It seems that when this happens, the crossdomain.xml file is ignored.
I understand that in my actionscript, I can do this:
import flash.system.Security;
if (Security.sandboxType.indexOf(Security.REMOTE) == -1) {
    // running locally - don't allow
}
However it is incredibly easy for someone to decompile the SWF and simply remove this line.
Is it possible to do something on the server side to stop a locally running SWF from downloading from my site? I tried checking the referrer, but this field often isn't populated. Does anyone have any other ideas?
Thanks, Matt
You will never be able to completely prevent downloads by using crossdomain.xml. If the user just copies and pastes the requested URL to a resource into a blank browser window, the mechanism stops working. Also, the mechanism can be cheated by using a proxy. All it does is raise the bar a little, especially when someone tries to use an SWF video player to stream an FLV video hosted on your site.
If protecting your resources is worth the effort, you should consider adding some sort of authentication / authorization mechanism and/or encryption.
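One common shape for such a mechanism is an expiring, signed URL that the server only honours if the signature checks out. This is a hypothetical sketch (Node.js is used purely for illustration, and the secret and paths are made up):
var crypto = require("crypto");
var SECRET = "replace-with-a-real-secret";

function signUrl(path, expiresAt) {
    // Sign the path plus its expiry time; the server recomputes this and compares.
    var signature = crypto.createHmac("sha256", SECRET)
        .update(path + expiresAt)
        .digest("hex");
    return path + "?expires=" + expiresAt + "&sig=" + signature;
}

console.log(signUrl("/videos/demo.flv", Date.now() + 60000)); // link valid for one minute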
Double-clicking and running a SWF will usually only work if you have a standalone player installed; otherwise it will open in a browser. Does Adobe distribute a standalone player outside of Flash Pro? They didn't use to, although with the Flash Platform tools growing, they may do so now.
Nevertheless, I would expect most users will not have a standalone player installed. In terms of security and protecting content, I suspect you're focusing on the wrong thing.
I understand that file upload speeds are limited by the upload speed of the internet connection among other things. Is it possible to use jquery or some other method to compress the file locally before upload and then upload a file to the server? Any other solutions?
While others have already provided answers, one thing you might be able to do (depending on how your website is set up) is to begin the upload process immediately, as soon as the user has chosen the file. That way, if the user has to fill in additional information about the file (maybe a description, a different name for the server, keywords, etc.), their file is uploading in the meantime, and the information can be provided later.
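A minimal sketch of that idea, assuming a file input with id "fileInput" and an "/upload" endpoint (both placeholders):
document.getElementById("fileInput").addEventListener("change", function () {
    var file = this.files[0];
    if (!file) {
        return;
    }
    var formData = new FormData();
    formData.append("video", file);

    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/upload");
    xhr.onload = function () {
        console.log("Upload finished with status", xhr.status);
    };
    xhr.send(formData); // runs in the background while the user fills in the rest of the form
});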
Other than that, you're SOL.
If upload speed is a concern, perhaps consider a client side application the user has to download.
Or a Flash-based uploader. Using Flash you'd get more control over the upload, and it is consistent across browsers. This is what YouTube does to allow 2 GB video uploads with minimal stress on the user's part. It doesn't make the upload faster if the client's connection is poor, but it helps with the reliability of the upload.
The browser already takes care of all the little optimizations that would make it faster on the client side, so no, you can't really use JavaScript to speed up a file upload. There isn't much you can do if the client's connection is the bottleneck.
No, you can't read the local filesystem in JavaScript. You can't do it with Flash or Java under the default config, either (with the partial exception of Flash 10). Further, there is no standard way to send compressed requests (the way there is for responses).
The upload time will be determined by a variety of factors: network speed, web server response time, upload file size, and so on. Check with your IT department and go over those points. If the issue is file size, there are ways to compress and reduce file size on Android. Refer to this sample code for Android compression and document capture:
https://github.com/ExtrieveTechnologies/QuickCapture
I am considering creating a website with the complexity of Facebook that should be able to scale to millions of users. My question is: is there any reason not to use Adobe Flex for such a large project, apart from the obvious points of requiring everyone to have Flash installed and of having to rely on Adobe? In my view, Adobe Flex would reduce the server load for Facebook, because more of the work could be done on the client side. Do you agree?
Of course Facebook could have been implemented in Flash. But then the question is would it have succeeded? There are reasons big web companies like Google, Facebook and Yahoo only use Flash as sparingly as possible.
The thing I would fear most is alienating users. The Flash plugin isn't the best piece of software out there: it is slow and likely to crash once in a while. If your app gets bigger, you might see loading times that are not acceptable to your users. Also, in my opinion full-Flash sites just don't feel right, because they behave differently from HTML websites. All the great websites like Google, Flickr, Stack Overflow or Facebook feel very light and slick, which is elegant and makes for great usability.
And then, HTML and JavaScript are a lot more flexible. Do you want your website to be available on smartphones? The iPhone has no Flash, and even with phones that do, you have the problem that users will very likely hate a full-Flash site, since those phones don't necessarily scale Flash as nicely as they scale HTML, and Flash drains the battery like crazy. If someone comes up with another revolution like smartphones, you can be sure it will support HTML and JavaScript, but you can't be so sure about Flash.
Then the question is: how would you gain any efficiency? Of course you can write your UI with Flex and just call very lightweight web services, like you would for AJAX, and you can even cache some of the site's content locally so that you don't transmit more data than necessary for user interaction (the UI is transmitted only once). But you can also do that with JavaScript. You can write your UI in HTML and JavaScript, load it once, and then just pull the naked JSON data from the servers and render it using JavaScript. You can also fetch lots of this data in advance to get the number of requests down. But such an approach still has its cons. Did you ever notice that when you type an answer on Stack Overflow and someone else submits an answer, you get notified while typing yours? Such real-time features are very cool, and you might want them at some point, which means more server interaction.
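A rough sketch of that "ship the UI once, then pull naked JSON" approach; the "/api/feed" endpoint and the element id are made up for illustration:
var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/feed");
xhr.onload = function () {
    var items = JSON.parse(xhr.responseText);
    var list = document.getElementById("feed");
    items.forEach(function (item) {
        var li = document.createElement("li");
        li.textContent = item.text; // only the data travelled over the wire, not the markup
        list.appendChild(li);
    });
};
xhr.send();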
But whatever you do, your servers still have to scale if your site grows. Even if you minimize the number of GET requests that hit your servers, they will still grow a lot when your site gets popular, and you will need a lot of servers to handle the load; you will just improve your users-per-server ratio.
The most interesting point, though, is that Flex is much easier to program than AJAX (think about browser incompatibilities, for instance), and yet AJAX was not only invented but the whole world puts up with all the problems that come with it instead of using Flex. I think this says a lot about the value of the result you get when you build a full website in Flash.
Go to Facebook and view the source... do you see all that JavaScript? That all runs client-side.
Johannes is right to point out the difference between server and client. The server-side stuff is what needs to scale.
As an example, the Microsoft Silverlight team has assembled a Facebook client app in Silverlight (using the public Facebook API). My point is, using today's technologies it is entirely possible to write a web application targeting many different kinds of client technologies: classic web browsers (HTML/JavaScript), 'rich internet applications' (Flex, Silverlight), ...
See also the myriad of Twitter clients out there.
The company I work for has a large app in Flash that is used by Governments. It is very hard to maintain and does fail sometimes. The problem is all of the .fla and .as files that have to be altered just to make a small change. Yes, the app could have been built better but even so, it is still harder to maintain than an HTML/JavaScript front end.
While I love writing Flash/Flex apps, I believe they should complement a site and not be the site.
Using a good JavaScript framework like jQuery takes the Browser compatibility question out of the picture (for the most part) and allows a lot of functionality.
Flex is the GUI for the client. You still need server-side storage, and that's what has to scale. The user interface could be in Flex, though most of your users won't like such interfaces.
You will have to do a custom version of your site for the iPad/iPhone.
There are other ways of moving load to the client side. JavaScript will give you porting headaches, but fewer than moving to an entirely different architecture like Flex.
OTOH when you get a million users you'll have the resources to reimplement your site.
I don't think you would see a performance advantage with a site like Facebook, because the content is highly dynamic, comes from many different places, and is created by many independent entities. Flash (and therefore Flex) is better for monolithic apps from a single source that don't need to change very often.
The default in Flash is to build everything into a single .swf file that holds everything. It is possible to break out of this default behavior, of course. You can make web service calls, pull in external components via the SWC mechanism, load static content via HTTP, etc. Nevertheless, it's not the default pattern, which affects how Flash development libraries and tools work. Besides, the more of this you do, the less of the "run everything we possibly can on the client side" benefit you get. It gets soaked up in HTTP connection overhead.
The default on the plain old standards-based web is to store all assets separately and assemble them dynamically at the client. This is one reason the web is slow -- again, all that HTTP connection overhead -- but also why it is flexible and dynamic. It mates well with a site like Facebook which requires constant evolution by a lot of independent developers.
I say this having developed a Flex app, which I am happy with. Only one person -- me -- has to maintain it, and it's naturally a monolithic app. It plays right into Flex's strengths.