How is file upload progress reported? - asp.net

I have been trying to implement an ajax-style file upload. I was wondering what we must do to report the uploading progress. I am trying to implement this in my asp.net web page.
I understand the mechanism by which we can upload a file, ajax-style, on a web page. I have been googling a lot about how to show a progress bar, but I haven't found a proper explanation. I have come to understand that we have to manage this from the server side somehow. (cf. file upload progress)
Any ideas/code would be appreciated. Thanks in advance.

I'm not sure why you want to roll your own, as there are a number of upload controls using Ajax, Flash, Silverlight, etc. Nonetheless, the concept is all about querying for progress and returning the current state. This guy went through the same process and shows how he built a component using jQuery.
Displaying progress requires the client to get feedback from the server, which means polling the server repeatedly for the current progress and updating the display on the client with that information.
On your server, you have to not only accept the inbound file upload but also respond with the aggregate progress. It's easy if you get the Content-Length header passed along in the request; the problem lies in that you won't be able to rely on having that information. There are strategies for dealing with this, but it requires you to have code that accepts the uploaded file as you're going to have to read the inbound bits.
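As a minimal sketch of that server side, here is one way it could look with a pair of generic handlers; the handler names, the uploadId query-string parameter, and the App_Data temp path are all illustrative, and a real form post would still need its multipart body parsed:

// UploadHandler.ashx.cs -- sketch only: receives the upload and records progress.
using System.IO;
using System.Web;

public class UploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string uploadId = context.Request.QueryString["uploadId"];
        long total = context.Request.ContentLength;   // may be unreliable, as noted above
        long received = 0;

        // Read the inbound bits ourselves so progress can be tracked as they arrive.
        using (Stream input = context.Request.GetBufferlessInputStream())
        using (Stream output = File.Create(context.Server.MapPath("~/App_Data/" + uploadId + ".tmp")))
        {
            var buffer = new byte[8192];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
                received += read;
                // Publish progress (as a percentage) where the polling handler can see it.
                HttpRuntime.Cache.Insert("progress-" + uploadId,
                    total > 0 ? (int)(100 * received / total) : -1);
            }
        }
        context.Response.Write("done");
    }

    public bool IsReusable { get { return false; } }
}

// UploadProgressHandler.ashx.cs -- the endpoint the client polls repeatedly.
public class UploadProgressHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string uploadId = context.Request.QueryString["uploadId"];
        object percent = HttpRuntime.Cache["progress-" + uploadId];
        context.Response.ContentType = "application/json";
        context.Response.Write("{\"percent\": " + (percent ?? -1) + "}");
    }

    public bool IsReusable { get { return false; } }
}

The client would then POST the file to UploadHandler.ashx?uploadId=... and poll UploadProgressHandler.ashx?uploadId=... every second or so to update its progress bar.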
At this point, you have other things to care about that are well beyond the scope of a progress bar, such as dealing with large file uploads in-process of your web server (out-of-process is much less detrimental to your server.)
Because of the complexities involved, I would encourage you to find an existing component instead of creating your own.

Related

Uploading large files in IIS Asp.net [duplicate]

I've done a good bit of research to find an upload component for .NET that lets me upload large files, shows a progress bar, and can resume the upload of large files. I've come across some components like AjaxUploader, SlickUpload, and PowUpload, to name a few. Each of these options costs money and only PowUpload does the resumable upload, but it does it with a Java applet. I'm willing to pay for a component that does those things well, but if I could write it myself that would be best.
I have two questions:
Is it possible to resume a file upload on the client without using flash/java/Silverlight?
Does anyone have some code or a link to an article that explains how to write a .NET HTTPHandler that will allow streaming upload and an ajax progress bar?
Thank you,
Austin
[Edit]
I realized I do need to be able to do resumable file uploads for my project. Any suggestions for components that can do that?
1) Is it possible to resume a file upload on the client without using flash/java/Silverlight?
No. HTTP itself does not support resuming partial uploads, so even if you did use Flash or Silverlight, you'd still need to use something else like FTP on the server.
I've "solved" this problem in the past by writing a custom client application in C# which broke the file down into small chunks (2 MB), transmitted those separately, and then had the server combine them all back together.
2) Does anyone have some code or a link to an article that explains how to write a .NET HTTPHandler that will allow streaming upload and an ajax progress bar?
While this doesn't solve the 'resume' problem, I've used SWFUpload on the client side and it worked brilliantly. It provides a smart file browser (where you can prompt the user for only jpeg files, etc) and upload progress tracking, all without needing to modify your server at all.
It's not possible to resume an upload using standard HTML file input control, because the whole file gets submitted with the HTTP request.
I've used NeatUpload in the past, which gives you a progress bar. It's under an LGPL license, so you don't need to pay for it and it's open source.
Nothing more to add about the resume problem.
I used (and keep using) Telerik RadUpload and I am quite satisfied with it
(it can even be used in medium trust mode, which was quite important for me). The only problem I had (and was not able to fix) is uploading files bigger than 2GB...
SlickUpload is pretty solid and a lot of big companies use it from what the site says.
This is probably too late for your project, but PowUpload has now implemented auto-resume upload in its new version. We're about to implement it on our site.

Is there a way to speed up file uploads?

I understand that file upload speeds are limited by the upload speed of the internet connection, among other things. Is it possible to use jQuery or some other method to compress the file locally before upload and then send it to the server? Any other solutions?
While others have already provided answers, one thing you might be able to do (depending on how your website is setup) is, once the user has chosen the file, begin the upload process immediately. That way, if the user has to fill in additional information about the file (maybe a description of the file, a different name for the server, keywords, etc), their file is uploading in the meantime, and the information can be provided later.
Other than that, you're SOL.
If upload speed is a concern, perhaps consider a client side application the user has to download.
Or a Flash-based uploader. Using Flash you'd get more control over the upload, and it is consistent across browsers. This is what YouTube does to allow 2GB video uploads with minimal stress on the user's part. It doesn't make the upload faster if the client's connection is poor, but it helps with the reliability of the upload.
The browser already takes care of all the little optimizations that would make it faster on the client side, so no, you can't really use JavaScript to speed up a file upload. There isn't much you can do if the client's connection is the bottleneck.
No, you can't read the local filesystem in JavaScript. You can't do it with Flash or Java under the default config, either (with the partial exception of Flash 10). Further, there is no standard way to send compressed requests (the way there is for responses).
The upload time is determined by a variety of factors: network speed, web server response time, upload file size, and so on. Check with your IT department and go over those points. If the issue is file size, there are ways to compress and reduce file size on Android. Refer to this sample code for Android compression and document capture.
https://github.com/ExtrieveTechnologies/QuickCapture

Performing bulk processing in ASP.NET page

We need the ability to send out automatic emails when certain dates occur or when some business conditions are met. We are setting up this system to work with an existing ASP.NET website. I've had a chat with one of the other devs here and had a discussion of some of the issues.
Things to note:
All the information we need is already modelled in the ASP.NET website
There is some business-logic that is required for the email generation which is also in the website already
We decided that the ideal solution was to have a separate executable that is scheduled to run overnight and do the processing and emailing. This solution has 2 main problems:
If the website was updated (business logic or model) but updating the executable was accidentally missed, the executable could stop sending emails or, worse, keep sending them based on outdated logic.
We are hoping to use something like this to template the emails with UserControls, which I don't believe is possible outside of an ASP.NET website
The first problem could have been avoided with build and deployment scripts (which we're looking into at the moment anyway), but I don't think we can get around the second problem.
So the solution we decided on is to have an ASP.NET page that is called regularly by SSIS and to have that do a set amount of processing (say 30 seconds) and then return. I know an ASP.NET page is not the ideal place to be doing this kind of processing but this seems to best meet our requirements. We considered spawning a new thread (not from the worker pool) to do the processing but decided that if we did that we couldn't use the page returned to signify a success or failure. By processing within the page's life-cycle we can use the page content to give an indication of how the processing went.
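A rough sketch of the kind of time-boxed page we have in mind (TryProcessNextItem is a placeholder for our existing model/business logic):

// ProcessEmails.aspx.cs -- sketch of the time-boxed processing page called by SSIS.
using System;
using System.Diagnostics;

public class ProcessEmails : System.Web.UI.Page
{
    private static readonly TimeSpan Budget = TimeSpan.FromSeconds(30);

    protected void Page_Load(object sender, EventArgs e)
    {
        int processed = 0, failed = 0;
        var timer = Stopwatch.StartNew();

        // Do a bounded amount of work, then return; SSIS calls the page again later.
        while (timer.Elapsed < Budget)
        {
            if (!TryProcessNextItem(ref failed)) break;   // nothing left to do
            processed++;
        }

        // The page content itself reports how the run went, so the caller can log it.
        Response.ContentType = "text/plain";
        Response.Write(string.Format("processed={0};failed={1};elapsed={2}s",
            processed, failed, (int)timer.Elapsed.TotalSeconds));
    }

    // Placeholder: pull the next due email from the model, apply the business rules
    // and send it. Returns false when there is nothing left to process.
    private bool TryProcessNextItem(ref int failed)
    {
        return false;   // stub -- the real logic lives in the existing website code
    }
}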
So the question is:
Are there any technical problems we might have with this set-up?
Obviously, if you have tried something like this, any reports of success or failure will be appreciated, as will suggestions for alternative set-ups.
Cheers,
Don't use the ASP.NET thread to do this. If the site is generating some information that you need in order to create or trigger the email send, have the site write that information to a file or database.
Create a Windows service or scheduled process that collects the information it needs from that file or database and runs the email-sending process on a completely separate process/thread.
What you want to avoid is crashing your site or crashing your emailer due to limitations within the process handler. Based on your use of the word "bulk" in the question title, the two need to be independent of each other.
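A minimal sketch of that separate emailer process, assuming the site has written pending messages to a PendingEmails table (the table, columns, connection string, and SMTP host are all illustrative):

// EmailSenderJob.cs -- console app run by Task Scheduler (or wrapped in a Windows service).
using System.Data.SqlClient;
using System.Net.Mail;

class EmailSenderJob
{
    static void Main()
    {
        using (var conn = new SqlConnection("Data Source=.;Initial Catalog=Site;Integrated Security=True"))
        using (var smtp = new SmtpClient("mail.example.com"))
        {
            conn.Open();
            var select = new SqlCommand(
                "SELECT Id, Recipient, Subject, Body FROM PendingEmails WHERE SentAt IS NULL", conn);

            using (var reader = select.ExecuteReader())
            {
                while (reader.Read())
                {
                    smtp.Send(new MailMessage("noreply@example.com",
                        reader.GetString(1), reader.GetString(2), reader.GetString(3)));
                    // A real job would mark the row (by Id) as sent, on a second connection,
                    // and handle send failures individually.
                }
            }
        }
    }
}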
I think you should be fine. We have used a similar approach in our company for several years and haven't had many problems. Sometimes it takes over an hour to finish the process. Recently we moved the second thread (as you said) to a separate server.
Having the emailer and the website coupled together can work, but it isn't really a good design and will be more maintenance for you in the long run. You can get around the problems you state by doing a few things.
Move the common business logic to a web service or common library. Both your website and your executable/WCF service can consume it, and it centralizes the logic. If you're copying and pasting code, you know there's something wrong ;)
If you need a template mailer, it is possible to invoke ASP.NET classes to create pages for you dynamically (see the BuildManager class, and blog posts like this one). If the mailer doesn't rely on Page events (which it doesn't seem to), there shouldn't be any problem for your executable to load a Page class from your website assembly, build it dynamically, and fill in the content; a rough sketch follows below.
This obviously represents a significant amount of work, but would lead to a more scalable solution for you.
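As one illustration of the template-mailer idea, this is roughly how an .ascx template can be rendered to a string for an email body from inside the web application (the template path is made up; from an external executable you would need BuildManager or a hosted ASP.NET application domain instead of a live HttpContext):

// TemplateMailer.cs -- sketch: render a UserControl template to a string.
using System.IO;
using System.Web;
using System.Web.UI;

public static class TemplateMailer
{
    public static string RenderTemplate(string virtualPath)
    {
        // A throwaway Page acts as the host so the control tree has something to render into.
        var host = new Page();
        Control template = host.LoadControl(virtualPath);
        host.Controls.Add(template);

        using (var writer = new StringWriter())
        {
            // Server.Execute runs the host page and captures the markup it produces.
            HttpContext.Current.Server.Execute(host, writer, false);
            return writer.ToString();
        }
    }
}

// Usage (path is illustrative):
// string body = TemplateMailer.RenderTemplate("~/EmailTemplates/Welcome.ascx");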
Sounds like you should be creating a worker thread to do that job.
Maybe you should look at something like https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
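If that approach fits, the core of the trick from that post looks roughly like this (the five-minute interval and the DoWork body are placeholders; Start() would be called once from Application_Start):

// BackgroundTask.cs -- sketch of the cache-expiration background task trick.
using System;
using System.Web;
using System.Web.Caching;

public static class BackgroundTask
{
    private const string Key = "background-task";

    public static void Start()
    {
        // A non-removable cache entry that expires in five minutes; its removal
        // callback is where the background work happens.
        HttpRuntime.Cache.Insert(Key, DateTime.Now, null,
            DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnCacheItemRemoved);
    }

    private static void OnCacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        DoWork();   // placeholder: check dates/business conditions and send the due emails
        Start();    // re-register so the task fires again on the next interval
    }

    private static void DoWork() { }
}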
You can and should build your message body (the templated message body) within your domain logic (that is, your ASP.NET application) when the business conditions are met, and then hand it to an external service whose only job is to send your messages. That way every message carries the proper information.
For the "when certain dates occur" scenario you can use a simple background-task solution (see Craig's answer) and do the same as above: parse the template, build the message, and hand it off to the sending service.
Of course you should do this safely, so that app pool restarts do not break your tasks.

Best Upload for web application FTP or HTTP?

We have a web application where users from all over the world would upload their files at the same time. We want an efficient, robust upload system. The maximum file size would be 50 MB.
There would be at least 100,000 (1 lakh) users uploading at the same time.
Please suggest which is the better upload system, FTP or HTTP?
Currently we have an HTTP-based upload, where we get errors like connection problems, session timeouts, timeout errors, etc...
Please also suggest any third-party FTP upload tools if you have come across them.
I would suggest you go with HTTP, because it is much more favorable in terms of user convenience.
If you are having critical issues with large file uploads, then please have a look at Darren Johnstone's large file upload library for ASP.NET.
Still, if you have to go with FTP, then I would suggest using a rich client-side technology that runs in the browser, like Flash, Java applets, or perhaps Silverlight.
Depends on what you're doing.
Every user I ever met knew how to use a browser, but the average user doesn't even have an FTP client installed, so HTTP uploads usually aren't really problematic. I wouldn't want to upload huge files this way, but 50 MB isn't that bad yet. If you want an FTP upload you would probably go for a Java applet, so your users are guaranteed to have the software needed to upload their files. Any PHP/WebFTP solution will just run you into the same problems again.
Sometimes, if I don't know whether something I want to do will work well, it's a good time to look at how others are doing it. Gmail, for example, has a fabulous upload system. ImageShack has millions and millions of users uploading their files, and that's basically all the page does; all of them use "normal" HTTP, with a little bit of JavaScript sugar to display the progress.
Edit: here is an example with PHP (although you seem to be using ASP.NET, it might still help):
http://www.devpro.it/upload_progress/

Convert a PDF to a JPG?

In a Flex/AIR application, I need to create snapshots (like big thumbnails) of local PDF files (one per page, if indicated).
Ideally, I would like to do it all on the client side (PDF is a public specification, albeit a REALLY COMPLICATED one).
I have read about an "Adobe plug-in" but I cannot find a specific piece of software that makes the HTMLLoader.pdfCapability report anything but ERROR_CANNOT_LOAD_READER. I hope to load the PDF and then move the bitmap data to an Image in order to save it.
Second choice is sending it off to a web service and getting a set of jpg files back.
Thanks
I would suggest generating the previews on the server side (as the HTMLLoader relies on the client having Adobe Reader installed), unless you can find a component that is capable of generating thumbnails and that does not introduce requirements for specific software to be installed on client machines.
If you're developing an in-house solution for use in your company and you control or can impose requirements for the applications installed on user machines, I guess it won't be that bad. But if you're targeting consumers, there's nothing more annoying than introducing dependencies. And it could turn your customers away from your application.
Also, having it server-side is usually the preferred option, since you could persist your thumbnails and not have to generate them every single time (if that makes sense for your application), and you offload heavy processing from your client application. Processing PDFs with a large number of pages can be a heavy task.
Doing this client-side probably isn't the best idea. It seems overly complicated, could be quite slow, and will require a few unnecessary dependencies. If you do it server-side you can convert the PDFs to images and then send down the image to the client-side. Converting PDFs to images is a relatively simple process if you use a third-party PDF library.
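As one example of the server-side route: if Ghostscript is installed on the server, shelling out to it will rasterize each PDF page to a JPEG (the binary name and flags below assume a standard 64-bit Windows Ghostscript install; any PDF rendering library would do the same job):

// PdfToJpeg.cs -- sketch: convert each page of a PDF to a JPEG via Ghostscript.
using System.Diagnostics;

public static class PdfToJpeg
{
    public static void Convert(string pdfPath, string outputPattern, int dpi = 150)
    {
        // An outputPattern like "page-%d.jpg" produces page-1.jpg, page-2.jpg, ...
        var psi = new ProcessStartInfo
        {
            FileName = "gswin64c.exe",
            Arguments = string.Format(
                "-dBATCH -dNOPAUSE -sDEVICE=jpeg -r{0} -sOutputFile=\"{1}\" \"{2}\"",
                dpi, outputPattern, pdfPath),
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            process.WaitForExit();
        }
    }
}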
