Upload file larger than 100 MB in Flex

I want to upload a local file larger than 100 MB and then send it to a remote Java server.
I see these possible alternatives:
Use the FileReference Flex class, which is not recommended for files larger than 100 MB, and hope that the application will not crash.
Slice the given file into small parts, then send them. To me, this seems a pretty crude solution.
The question: is there a Flex library which allows sending files larger than 100 MB?
If the answer is no, are there 3rd-party libraries for the same thing?
If not, are there 3rd-party libraries for 'slicing' files and sending the parts asynchronously to the server?
EDIT: If I slice the file into parts, how large should they be (for a proper hash check)?

First of all, Flex's FileReference does not allow you to split files; it reads all the data at once and does not support streaming. Uploading files larger than 100 MB works well without any third-party library; the problem usually happens on the server side, where an ASP.NET or Tomcat server needs a bigger timeout in order to accept larger files.
On an ASP.NET server (and much the same on a Java-based server), the script execution timeout does not account for upload time, and the request usually times out before the upload finishes. If uploading takes more than 10 minutes and the script execution timeout is less than 10 minutes, then no matter what client-side library you choose, you will never be able to upload the file.
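For example, on ASP.NET those limits live in web.config; a minimal sketch (the values here are illustrative, not from the original answer):

    <system.web>
      <!-- executionTimeout is in seconds; maxRequestLength is in KB (~200 MB here) -->
      <httpRuntime executionTimeout="3600" maxRequestLength="204800" />
    </system.web>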
You can choose Silverlight if you want to break files into smaller parts, and you can use the MD5 hashing that is built into Silverlight.
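On the chunk-hash question from the edit: any fixed chunk size works as long as the client and server agree on it (a few MB per chunk is typical), and you can hash each chunk before sending it. A minimal browser-side sketch; note the Web Crypto API has no MD5, so SHA-256 is used here instead:

    // Hash one chunk (a Blob slice) so the server can verify it after upload.
    // Web Crypto has no MD5, so SHA-256 is used here instead.
    async function hashChunk(chunk) {
      const buffer = await chunk.arrayBuffer();
      const digest = await crypto.subtle.digest('SHA-256', buffer);
      return Array.from(new Uint8Array(digest))
        .map(b => b.toString(16).padStart(2, '0'))
        .join(''); // hex string to send alongside the chunk
    }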

HTTP was not designed for transferring such large files, so I'd recommend looking into alternative methods of file transfer, such as FTP.
I know that someone tried to create an FTP Flex client using sockets, but I believe the project ran into technical limitations which prevented it from being fully completed.
If at all possible, I'd strongly recommend re-thinking your business requirements.

If the Flex FileReference cannot open files larger than 100 MB, you could open them with HTML/JavaScript via the ExternalInterface. Once you have the file you can split it into chunks and send it to Flex bit by bit (possibly via Base64 encoding), or upload it directly from HTML/JS. Although I don't know whether HTML can open files larger than 100 MB.
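In browsers with the File API, the split-and-upload part could look like the following minimal sketch (the /upload-chunk endpoint and the 2 MB chunk size are illustrative assumptions, not from the answer):

    // Split a user-selected file into chunks and POST them one at a time.
    const CHUNK_SIZE = 2 * 1024 * 1024; // 2 MB per chunk (assumed)

    async function uploadInChunks(file) {
      const total = Math.ceil(file.size / CHUNK_SIZE);
      for (let i = 0; i < total; i++) {
        const chunk = file.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE);
        // Send the index so the server can reassemble the chunks in order.
        await fetch('/upload-chunk?name=' + encodeURIComponent(file.name) +
                    '&index=' + i + '&total=' + total,
                    { method: 'POST', body: chunk });
      }
    }

    document.querySelector('input[type=file]')
      .addEventListener('change', e => uploadInChunks(e.target.files[0]));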

Related

What are the technical reasons to use multiple, smaller files instead of one large JS file? [duplicate]

This question already has answers here:
What are the benefits of concatenating all Javascript files into one before sending it to client?
Is there a difference between having one large JavaScript file and having many smaller JavaScript files?
I recently learned that a separate application at my office contains two Javascript files for everything it requires. They're almost 2 MB and contain roughly 40K lines of code each.
From a maintainability standpoint, that is obviously awful. I can't imagine dealing with that in SVN. But does it actually make a difference in the performance of the application?
There is a setting for chunked transfer encoding in IIS, but I know little about it beyond what's mentioned in the article there. The "Rationale" section doesn't seem particularly relevant to Javascript. It seems more important for the "actual" pages in the application and communicating back and forth between the client and server.
Tagged with ASP.NET as the setting is under the "ASP" section of IIS... If that's not actually related please edit and remove the tag or let me know and I can.
Javascript files are often combined in production environments to cut down on server requests and HTTP overhead. Each time you request a resource, it takes a round trip from the client to the server, which affects page load speed.
Each separate request also incurs HTTP overhead: extra data attached to the request/response headers that must be downloaded too. Some of this will change with the adoption of HTTP/2, and smaller files will become more efficient.
From a maintainability perspective, you'd never want to deal with files that large. Ideally, each JS file should be broken up into a logical module and stored independently in SVN. That makes it easier for the developers to work with and keep track of changes. Those small, modular files would then go through a build process to combine and possibly minify/uglify them to get them ready to be served in a production environment.
There are tons of tools you can use to automate this build process like Gulp, Grunt, or npm. Some .NET content management systems like DNN have settings that allow you to do this automatically in production.
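A minimal sketch of such a build step with Gulp (gulp-concat and gulp-uglify are real plugins; the paths are illustrative):

    // gulpfile.js — combine small, modular source files into one minified bundle.
    const gulp = require('gulp');
    const concat = require('gulp-concat');
    const uglify = require('gulp-uglify');

    gulp.task('build', function () {
      return gulp.src('src/**/*.js')  // the small, modular files kept in SVN
        .pipe(concat('bundle.js'))    // combine into a single file
        .pipe(uglify())               // minify for production
        .pipe(gulp.dest('dist'));     // write the production bundle
    });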

Uploading large files in IIS ASP.NET [duplicate]

I've done a good bit of research to find an upload component for .NET that I can use to upload large files, show a progress bar, and resume the upload of large files. I've come across components like AjaxUploader, SlickUpload, and PowUpload, to name a few. Each of these options costs money, and only PowUpload does resumable uploads, but it does so with a Java applet. I'm willing to pay for a component that does those things well, but if I could write it myself, that would be best.
I have two questions:
Is it possible to resume a file upload on the client without using flash/java/Silverlight?
Does anyone have some code or a link to an article that explains how to write a .NET HTTPHandler that will allow streaming upload and an ajax progress bar?
Thank you,
Austin
[Edit]
I realized I do need resumable file uploads for my project; any suggestions for components that can do that?
1) Is it possible to resume a file upload on the client without using flash/java/Silverlight?
No. HTTP itself has no standard mechanism for resuming a partial upload, so even if you did use Flash or Silverlight, you'd still need to use something else, like FTP, on the server.
I've "solved" this problem in the past by writing a custom client application in C# which broke the file down into small chunks (2 MB), transmitted those separately, and then had the server combine them all back together.
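The reassembly step is straightforward once the chunks arrive with an index; a minimal sketch of the idea (in Node.js-style JavaScript rather than the C# the answer used, with assumed <index>.part file names):

    // Rebuild the original file by appending the chunk files in index order.
    const fs = require('fs');
    const path = require('path');

    function combineChunks(chunkDir, outFile, totalChunks) {
      const out = fs.createWriteStream(outFile);
      for (let i = 0; i < totalChunks; i++) {
        // The upload handler is assumed to have saved each chunk as <index>.part.
        out.write(fs.readFileSync(path.join(chunkDir, i + '.part')));
      }
      out.end();
    }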
2) Does anyone have some code or a link to an article that explains how to write a .NET HTTPHandler that will allow streaming upload and an ajax progress bar?
While this doesn't solve the 'resume' problem, I've used SWFUpload on the client side and it worked brilliantly. It provides a smart file browser (where you can prompt the user for only jpeg files, etc) and upload progress tracking, all without needing to modify your server at all.
It's not possible to resume an upload using standard HTML file input control, because the whole file gets submitted with the HTTP request.
I've used NeatUpload in the past, which gives you a progress bar. It's under an LGPL license, so you don't need to pay for it and it's open source.
Nothing more to add about the resume problem.
I used (and keep on using) Telerik RadUpload and I am quite satisfied with it
(it can even be used in medium trust mode, which was quite important for me). The only problem I had (and was not able to fix) is uploading files bigger than 2 GB...
SlickUpload is pretty solid and a lot of big companies use it from what the site says.
This is probably too late for your project, but PowUpload has now implemented auto-resume uploads in its new version. We're about to implement it on our site.

Is there a way to speed up file uploads?

I understand that file upload speeds are limited by the upload speed of the internet connection, among other things. Is it possible to use jQuery or some other method to compress the file locally before upload, and then upload it to the server? Any other solutions?
While others have already provided answers, one thing you might be able to do (depending on how your website is setup) is, once the user has chosen the file, begin the upload process immediately. That way, if the user has to fill in additional information about the file (maybe a description of the file, a different name for the server, keywords, etc), their file is uploading in the meantime, and the information can be provided later.
Other than that, you're SOL.
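A minimal sketch of the "start uploading as soon as the file is chosen" idea (the /upload endpoint and field names are illustrative assumptions):

    // Begin the upload immediately on file selection, so it runs in the
    // background while the user fills in the description and other fields.
    document.querySelector('input[type=file]').addEventListener('change', function (e) {
      const form = new FormData();
      form.append('file', e.target.files[0]);
      fetch('/upload', { method: 'POST', body: form })
        .then(res => res.text())
        .then(id => {
          // Keep the server-side id so the metadata submitted later can be
          // attached to the file that has already been uploaded.
          document.querySelector('input[name=uploadId]').value = id;
        });
    });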
If upload speed is a concern, perhaps consider a client side application the user has to download.
Or a flash based uploader. Using flash you'd get more control over the upload and it is consistent across browsers. This is what YouTube does to allow 2GB video uploads with minimal stress on the user's part. It doesn't make it faster if the client's connection is poor, but it helps with the reliability of the upload.
The browser already takes care of all the little optimizations that would make it faster on the client side, so no, you can't really use JavaScript to speed up a file upload. There isn't much you can do if the client's connection is the bottleneck.
No, you can't read the local filesystem in JavaScript. You can't do it with Flash or Java under the default config, either (with the partial exception of Flash 10). Further, there is no standard way to send compressed requests (the way there is for responses).
The upload time is determined by a variety of factors: network speed, web server response time, upload file size, and so on. Check with your IT department and go over those points. If the issue is file size, there are ways to compress and reduce file size on Android. Refer to this sample code for Android compression and document capture:
https://github.com/ExtrieveTechnologies/QuickCapture

Best Upload for web application FTP or HTTP?

We have a web application where users from the whole world would upload their files at the same time. We want an efficient, robust upload system. The max file size would be 50 MB.
There would be at least 100,000 (1 lakh) users uploading at the same time.
Please suggest which is the better upload system, FTP or HTTP?
Currently we have an HTTP-based upload, where we do get errors like connection problems, session timeouts, timeout errors, etc...
Also suggest any 3rd-party FTP upload tools if you have come across them.
I would suggest you go with HTTP, because it is much more favorable in terms of user convenience.
If you are having critical issues with large file uploads, then please have a look at Darren Johnstone's large file upload library for ASP.NET.
Still, if you have to go with FTP, then I would suggest using some client-side rich technology which runs in the browser, like Flash or Java applets (or maybe Silverlight).
Depends on what you're doing.
Every user I ever met knew how to use a browser, but the standard random user doesn't even have an FTP client installed. So usually HTTP uploads aren't really problematic. I wouldn't want to upload huge files, but 50 MB isn't that bad yet. If you want an FTP upload you would probably go for a Java applet, so your users are guaranteed to have the software needed to upload their files. Any PHP/WebFTP things will just run you into the same problems again.
Sometimes, if I don't know whether something I want to do will work well, it's a good time to look at how others are doing it. Gmail, for example, has a fabulous upload system. ImageShack has millions and millions of users uploading their stuff there; basically that's all the page does, and all of them use "normal" HTTP, with a little bit of JavaScript sugar to display the progress.
Edit: here is an example with PHP (although you seem to be using ASP, it might still help):
http://www.devpro.it/upload_progress/
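On the JavaScript side, that kind of progress display can be sketched minimally with XMLHttpRequest, which exposes upload progress events (the endpoint is an assumption):

    // Upload a file and report percentage progress via a callback.
    function uploadWithProgress(file, onPercent) {
      const xhr = new XMLHttpRequest();
      xhr.upload.onprogress = function (e) {
        if (e.lengthComputable) {
          onPercent(Math.round(100 * e.loaded / e.total));
        }
      };
      xhr.open('POST', '/upload'); // assumed endpoint
      const form = new FormData();
      form.append('file', file);
      xhr.send(form);
    }

    // Usage: uploadWithProgress(input.files[0], p => bar.value = p);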

convert a PDF to a jpg?

In a Flex/AIR application, I need to create snapshots (like big thumbnails) of local PDF files (one per page, if indicated).
Ideally, I would like to do it all on the client side (PDF is a public specification, albeit a REALLY COMPLICATED one).
I have read about an "Adobe plug-in" but I cannot find a specific piece of software that makes the HTMLLoader.pdfCapability report anything but ERROR_CANNOT_LOAD_READER. I hope to load the PDF and then move the bitmap data to an Image in order to save it.
Second choice is sending it off to a web service and getting a set of jpg files back.
Thanks
I would suggest generating the previews on the server side (as the HTMLLoader relies on the client having Adobe Reader installed), unless you can find a component that is capable of generating thumbnails and that does not introduce requirements for specific software to be installed on client machines.
If you're developing an in-house solution for use in your company and you control or can impose requirements for the applications installed on user machines, I guess it won't be that bad. But if you're targeting consumers, there's nothing more annoying than introducing dependencies. And it could turn your customers away from your application.
Also, having it server-side is usually the preferred option, since you can persist the thumbnails and not have to generate them every single time (if that makes sense for your application), and you offload heavy processing from your client application. Processing PDFs with a large number of pages can be a heavy task.
Doing this client-side probably isn't the best idea. It seems overly complicated, could be quite slow, and will require a few unnecessary dependencies. If you do it server-side you can convert the PDFs to images and then send the images down to the client. Converting PDFs to images is a relatively simple process if you use a third-party PDF library.
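If you still want to try it in an HTML/JavaScript context, Mozilla's pdf.js can render a page to a canvas, which can then be exported as a JPEG. A minimal sketch; pdf.js is one third-party option, not what the answers above used:

    // Render one PDF page to a canvas with pdf.js and export it as a JPEG.
    // (pdf.js may also need GlobalWorkerOptions.workerSrc configured.)
    import * as pdfjsLib from 'pdfjs-dist';

    async function pdfPageToJpeg(url, pageNumber) {
      const pdf = await pdfjsLib.getDocument(url).promise;
      const page = await pdf.getPage(pageNumber);
      const viewport = page.getViewport({ scale: 1.5 }); // big-thumbnail size
      const canvas = document.createElement('canvas');
      canvas.width = viewport.width;
      canvas.height = viewport.height;
      await page.render({ canvasContext: canvas.getContext('2d'), viewport }).promise;
      return canvas.toDataURL('image/jpeg', 0.8); // JPEG as a data URL
    }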
