I was wondering what's the best practice for serving a large generated file in Classic ASP.
We have an application with an "export to Excel" function that produces 10 MB files. The Excel files are created by simply calling a .asp page that has Response.ContentType set to Excel and contains an HTML table for the data.
The problem is that it takes 4 minutes before the user sees the "Save As..." dialog.
My current idea is to use AJAX to call an .asp page that creates the Excel file on the server and returns the URL of the generated document. Then I can use JavaScript to display the link on the original page.
Is this easy to do with Classic ASP (creating files on the server with some kind of stream) while keeping security in mind? (The URL should not let people guess the location of other files.)
How would I go about deleting the generated files over time? They have to be deleted periodically, as the data changes in real time.
Thanks.
Edit: I realize now that creating the file on the server will probably also take 4 minutes...
I think you are choosing a complex route when the solution is simple enough (though I may be missing some requirements).
If you want to generate an Excel file, just call an ASP page that does the following:
Response.Clear
Response.AddHeader "content-disposition", "attachment; filename=myexcel.xls"
Response.ContentType = "application/vnd.ms-excel"
' Write the content of the file
Response.Write "...."
Response.End
This will start a download in the browser without needing an extra call, JavaScript, or anything else.
See this question for more info on the format you choose to generate the Excel file.
Edit
Since Thomas updated the question and the real problem is that the file takes 4 minutes to generate, the solution could be:
Offer to send the file to the user by email (if this is a workable solution on your server or hosting).
Generate the file asynchronously, and let the user know when the file generation is done (with an AJAX call, like SO does when another user has added an answer; see the sketch below).
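For the asynchronous option, the server-side status check can be a tiny ASP page that the client polls with AJAX. A minimal sketch, assuming the export is written to an /exports folder and the file name is passed on the query string (both are illustrative, and the file name should be validated in real code):

' checkstatus.asp - polled by the client until the export is ready
Dim fso : Set fso = Server.CreateObject("Scripting.FileSystemObject")

Response.ContentType = "text/plain"
' NOTE: validate/sanitise the "file" parameter before using it in a path
If fso.FileExists(Server.MapPath("/exports/" & Request.QueryString("file"))) Then
    Response.Write "ready"
Else
    Response.Write "pending"
End If

Set fso = Nothing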
To generate the file on the server
' You should change this to a random name or something that makes sense
FileName = "C:\temp\myexcel.xls"

' VBScript has no Open/Print # statements, so use the FileSystemObject
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set outFile = fso.OpenTextFile(FileName, 8, True) ' 8 = ForAppending, create if missing

' Generate the content
TheRow = "...."
outFile.WriteLine TheRow

outFile.Close
Set outFile = Nothing
Set fso = Nothing
To delete the temp files generated
I use Empty Temp Folders, a freeware app that I run daily on the server, to take care of the generated temp files. (Again, it depends on your server or hosting.)
About security
Generate the file names using random numbers or GUIDs for light protection. If the data is sensitive, you will need to serve the file through an ASP page, but then I think you are back to the same problem again (waiting 4 minutes for the download).
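A common Classic ASP trick for unguessable names (a sketch only; the folder is illustrative) is to build the file name from a GUID:

' Generate an unguessable file name using a GUID
Dim typeLib : Set typeLib = Server.CreateObject("Scriptlet.TypeLib")
Dim guid : guid = Left(typeLib.Guid, 38)   ' strip the trailing junk characters
Set typeLib = Nothing

guid = Replace(Replace(Replace(guid, "{", ""), "}", ""), "-", "")
FileName = "C:\temp\export_" & guid & ".xls"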
Read the file using FSO.
Set the headers for the Excel file type, name the download according to the file being read, and mark it as an attachment.
Flush the response after the headers are set. The client should display the "Save As" dialogue.
Write the file content out to the response. The client will download the file and see a progress bar.
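A minimal Classic ASP sketch of those steps; note that for a binary-safe read it uses ADODB.Stream instead of FSO, and the path, file name, and chunk size are illustrative:

' Send a previously generated Excel file to the client in chunks
Response.Clear
Response.ContentType = "application/vnd.ms-excel"
Response.AddHeader "content-disposition", "attachment; filename=myexcel.xls"
Response.Flush   ' headers go out now, so the client shows the Save As dialog

Dim stream : Set stream = Server.CreateObject("ADODB.Stream")
stream.Open
stream.Type = 1  ' adTypeBinary
stream.LoadFromFile "C:\temp\myexcel.xls"

Do While Not stream.EOS
    Response.BinaryWrite stream.Read(65536)  ' 64 KB chunks
    Response.Flush                           ' lets the client see download progress
Loop

stream.Close
Set stream = Nothing
Response.End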
How do you plan to generate the Excel? I hope you don't plan to call Excel to do that, as it is unsupported, and generally won't work well.
You should check to see if there are COM components to generate Excel that you can call from Classic ASP. Alternatively, add one ASP.NET page for the purpose. I know for a fact that there are components that can be called from ASP.NET pages to do this. Worse comes to worst, there's an Excel exporter component from Infragistics that works with their UltraWebGrid control to export. The grid need not be visible in order to accomplish this, but styles in the grid translate to styles in the spreadsheet. They also allow you to manipulate the spreadsheet programmatically.
Related
A bit of an odd use-case scenario, but I need a program that allows users to select file(s) and have those full-path file names sent to a SQL database.

I've built said program as an ASP.NET web application written in VB using Visual Studio 17.3.5. The FileUpload control records the user's selected files, and those filenames successfully transfer to the SQL database.

The problem I'm having is that it appears the FileUpload control is doing what it's designed to do: actually uploading the file. I'm not sure where, but I'm assuming to some temp space. I noticed this when I selected a large file to test with; there was a lag in the system, and if the file was larger than 'maxRequestLength' it would crash. Again, my goal is to record and save just the filenames of the selected files, not the actual files themselves.

If it wasn't obvious, I'm very green when it comes to coding; any help is greatly appreciated!
Initially I increased 'maxRequestLength' in the Web.config, but that didn't address the problem, which is that the FileUpload control is actually uploading (or trying to upload) the file somewhere.
Current code related to the FileUpload control (fud_SelectFiles):
*Side note: I’m using a Wizard control and this code is submitted/executed when the user clicks ‘Finish’ at the end of the Wizard.
If fud_SelectFiles.HasFiles Then
    For Each uploadedfile In fud_SelectFiles.PostedFiles
        StrFile_Name += String.Format(Server.MapPath(uploadedfile.FileName)) + ","
        'Trims last comma from file name and places into session variable
        Session("v_File_Name") = StrFile_Name.TrimEnd(",")
    Next
End If
Current situation
I have an ASP.NET web application that renders PDFs for users using MS Report Viewer. The PDF is rendered with this method:
Dim pdfByte As Byte()
pdfByte = ReportViewer.LocalReport.Render("PDF", Nothing, mimeType, encoding, extensions, stream, warning)
And sent to the browser as an attachment with the Response object:
Response.Clear()
Response.ContentType = mimeType
Response.AddHeader("content-disposition", "attachment; filename=myfile." + extension)
Response.BinaryWrite(pdfByte)
Response.Flush()
Response.End()
This works great! The user's browser will get the PDF as a downloadable attachment.
What I am trying to achieve
Render multiple PDFs and send all of them separately to the user's browser. The user will get separate PDF documents. It doesn't matter whether they get them all at once or one by one.
The problem
The problem is that after Response.End() the next line of code is not executed. I have tried storing the pdfByte objects in session, looping through them, and sending them to the user's browser with the Response object, but after the first PDF is sent it stops.
I have also tried removing Response.End(), thinking the code would keep running, but it still stops after the first PDF is sent.
Please advise any workaround or tips. Thanks!
You cannot send multiple files (as separate entities) in a single HTTP response; the protocol does not support it. However, what you can do is archive all the files together and send that single zip (or whatever format you want) to the client.
You can use libraries such as DotNetZip or SharpZipLib to combine (and compress) the files. Depending on the library API, you may need to save the PDF files to disk before adding them to the zip file. Also, do not forget to set an appropriate content type when sending the zip file to the client.
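For example, a rough sketch with DotNetZip (Ionic.Zip), which can add entries straight from byte arrays; pdfDocuments is a hypothetical list of already-rendered PDF byte arrays:

' Requires a reference to Ionic.Zip (DotNetZip); pdfDocuments is hypothetical
Response.Clear()
Response.ContentType = "application/zip"
Response.AddHeader("content-disposition", "attachment; filename=reports.zip")

Using zip As New Ionic.Zip.ZipFile()
    For i As Integer = 0 To pdfDocuments.Count - 1
        ' Each entry is added from its byte array, no temp file needed
        zip.AddEntry("report" & (i + 1).ToString() & ".pdf", pdfDocuments(i))
    Next
    zip.Save(Response.OutputStream)   ' write the archive directly to the response
End Using

Response.End()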
Yet another alternative is to provide the user with a page that has multiple links to download the files. It may mean that you either have to store your PDFs for some time so that they can be served later (via the links) or make each link point to a handler that re-runs the report to get the PDF out of it.
Admittedly the method I'm using doesn't feel very elegant, but here's what I'm doing:
create one IFRAME on the page for each document you want to send to the client (maybe create the IFRAMEs dynamically in server-side code if the number of documents is variable);
create an HttpHandler that generates the PDF documents, depending on a parameter you pass in through the QueryString, just like you're doing above;
set the src on all IFRAMEs to the URL of the HttpHandler with the appropriate parameters attached.
Of course the HttpHandler needs to implement security logic, if required.
This works quite beautifully: If I want to send 3 documents, I create 3 IFRAMEs, set their src, and the user will see 3 "Save As..." dialogs pop up.
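A rough sketch of that setup in VB.NET (System.Web and System.Web.UI.HtmlControls assumed imported); all names are hypothetical, and RenderReport stands in for the ReportViewer.LocalReport.Render call from the question:

' PdfHandler.ashx - generates one PDF per request based on the "id" query string
Public Class PdfHandler
    Implements IHttpHandler

    Public Sub ProcessRequest(context As HttpContext) Implements IHttpHandler.ProcessRequest
        Dim id As String = context.Request.QueryString("id")
        Dim pdfByte As Byte() = RenderReport(id) ' hypothetical helper wrapping LocalReport.Render

        context.Response.ContentType = "application/pdf"
        context.Response.AddHeader("content-disposition", "attachment; filename=report_" & id & ".pdf")
        context.Response.BinaryWrite(pdfByte)
    End Sub

    Public ReadOnly Property IsReusable As Boolean Implements IHttpHandler.IsReusable
        Get
            Return False
        End Get
    End Property
End Class

' On the page: one hidden IFRAME per document (frameHost is a hypothetical PlaceHolder)
For Each id As String In documentIds
    Dim frame As New HtmlGenericControl("iframe")
    frame.Attributes("src") = "PdfHandler.ashx?id=" & id
    frame.Attributes("style") = "display:none"
    frameHost.Controls.Add(frame)
Next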
I am using a file upload mechanism to upload a file for an employee, converting it into a byte[], and passing it to a varbinary(max) column to store in the database.
Now what I have to do is: if a file has already been uploaded for the employee, simply read it from the table and show its file name. I have only one column to store the file, and it is of type varbinary.
Is it possible to get all the file information from the varbinary field?
If there is any other way around this, please let me know.
If you're not storing the filename, you can't retrieve it.
(Unless the file itself contains its filename, in which case you'd need to parse the blob's contents.)
If the name of the file (and any other data about the file that's not part of the file's byte data) needs to be used later, then you need to save that data as well. I'd recommend adding a column for the file name, perhaps one for its type (mime type or something like that for properly sending it back to the client's browser, etc.) and maybe even one for size so you don't have to calculate that on the fly for each file (useful when displaying a grid of files and not wanting to touch the large blob field in the query that populates the grid).
Try to stay away from using the file name for system-internal identity purposes. It's fine for allowing the users to search for a file by name, select it, etc. But when actually making the request to the server to display the file it's better to use a simple integer primary key from the table to actually identify it. (On a side note, it's probably a good idea to put a unique constraint on the file name column.)
If you also need help displaying the file to the user, you'll probably want to take the approach that's tried and true for displaying images from a database. Basically it involves having a resource (generally an .aspx page, but could just as well be an HttpHandler instead) which accepts the file ID as a query string parameter and outputs the file.
This resource would have no UI (remove everything from the .aspx except the Page directive) and would manually manipulate the response headers (this is where you'd set the content type from the file's type), write the byte stream to the client, and end the response. From the client's perspective, something like ~/MyContent/MyFile.aspx?fileID=123 would be the file. (You can suggest a file name to the browser for saving purposes in the response headers, which you'd probably want to do with the file's stored name.)
There's no shortage of quick tutorials (some several years old, it's been around for a while) on how to do this with images. Just remember that there's essentially no difference from the server's perspective if it's an image or any other kind of file. All the server needs to do is send the type in the response headers and write the file's bytes to the client. How the client handles the file is up to the browser. In the vast majority of cases, the browser will know what to do (display an image, display via a plugin a PDF, save a .doc, etc.).
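As a sketch of that approach in VB.NET, with hypothetical table, column, and connection-string names:

' MyFile.aspx code-behind: streams one stored file to the client by ID
Imports System.Data.SqlClient
Imports System.Configuration

Partial Class MyFile
    Inherits System.Web.UI.Page

    Protected Sub Page_Load(sender As Object, e As EventArgs) Handles Me.Load
        Dim fileId As Integer = Integer.Parse(Request.QueryString("fileID"))
        Dim connStr As String = ConfigurationManager.ConnectionStrings("Db").ConnectionString

        Using conn As New SqlConnection(connStr)
            Using cmd As New SqlCommand("SELECT FileName, ContentType, FileData FROM EmployeeFiles WHERE Id = @id", conn)
                cmd.Parameters.AddWithValue("@id", fileId)
                conn.Open()
                Using reader As SqlDataReader = cmd.ExecuteReader()
                    If reader.Read() Then
                        Response.Clear()
                        Response.ContentType = reader.GetString(1)
                        ' Suggest the stored file name to the browser's Save dialog
                        Response.AddHeader("content-disposition", "attachment; filename=" & reader.GetString(0))
                        Response.BinaryWrite(DirectCast(reader("FileData"), Byte()))
                        Response.End()
                    End If
                End Using
            End Using
        End Using
    End Sub
End Class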
I have to create a tab-delimited .txt file from a query.
I want to call an HttpHandler that returns the .txt file as a stream; I don't want to create the file physically.
1st question:
What is the best practice for creating the tab-delimited .txt file from a query result?
Do I have to fetch all the rows and build the file manually?
2nd question:
How to set a timeout for the HttpHandler that creates the file?
Thanks for your time.
I would create a plain old HTTP output stream and change the content type to 'text/plain', which means you don't need to physically create the file on the web server. If you add the content-disposition header to the output and specify an attachment called something like 'report.txt', the user will be prompted to open or save the content rather than just viewing it in the browser like a normal web page.
You can set the script timeout with Server.ScriptTimeout = x by getting hold of the current HttpContext object (see the sketch below).
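A minimal sketch of such a handler in VB.NET; the query, connection string, and handler name are all hypothetical:

Imports System.Data.SqlClient
Imports System.Configuration
Imports System.Web

Public Class TabDelimitedHandler
    Implements IHttpHandler

    Public Sub ProcessRequest(context As HttpContext) Implements IHttpHandler.ProcessRequest
        context.Server.ScriptTimeout = 300   ' seconds; raise for long-running queries

        context.Response.Clear()
        context.Response.ContentType = "text/plain"
        context.Response.AddHeader("content-disposition", "attachment; filename=report.txt")

        Using conn As New SqlConnection(ConfigurationManager.ConnectionStrings("Db").ConnectionString)
            Using cmd As New SqlCommand("SELECT Col1, Col2, Col3 FROM MyTable", conn)
                conn.Open()
                Using reader As SqlDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        ' Join the row's fields with tabs and stream the line straight to the client
                        Dim fields(reader.FieldCount - 1) As String
                        For i As Integer = 0 To reader.FieldCount - 1
                            fields(i) = reader(i).ToString()
                        Next
                        context.Response.Write(String.Join(vbTab, fields) & vbCrLf)
                    End While
                End Using
            End Using
        End Using
    End Sub

    Public ReadOnly Property IsReusable As Boolean Implements IHttpHandler.IsReusable
        Get
            Return True
        End Get
    End Property
End Class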
Hope this helps
As part of a Classic ASP project, the user should be able to download a file - which is dynamically extracted from a zip archive and sent via Response.BinaryWrite() - by simply calling "document.asp?id=[some id here]".
Extracting and sending are not the problem, but I need to delete the extracted file after the download has finished. I have never done any ASP or VBScript before, and I guess that's why I'm stuck here.
I tried deleting the file right after Response.BinaryWrite() using FileSystemObject.DeleteFile(), but this results in a 404 error on the client side.
How can I wait until the download has finished and then perform additional actions?
Edit: This is what my code looks like:
'Unzip a specified file from an archive and put its path in *document*
set stream = Server.CreateObject("ADODB.Stream")
stream.Open
stream.Type = 1 ' binary
stream.LoadFromFile(document)
Response.BinaryWrite(stream.Read)
'Here I want to delete the *document*
I suspect that at the point you are calling the DeleteFile method, the file you are trying to delete is still locked by something else; the question is what?
Try including:-
stream.Close()
after your BinaryWrite. Also make sure you've done a similar thing with the component you used to extract the file. If the component doesn't offer any obvious "close" method, try assigning Nothing to the variables referencing it.
Is it not possible to stream the file into memory and then binary-write the stream to the browser? That way the file is never created on the server and there is no need to delete it.
I found a solution: the extracted files are saved in a special directory, and every time a user runs document.asp it checks this directory for files older than one hour and deletes them.
I think it's the simplest way to manage it, although I would still prefer a solution where the document is deleted right after downloading.
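For reference, a minimal cleanup sketch along those lines; the folder name is illustrative:

' Delete extracted files older than one hour from the download folder
Dim fso : Set fso = Server.CreateObject("Scripting.FileSystemObject")
Dim folder : Set folder = fso.GetFolder(Server.MapPath("/extracted"))

Dim f
For Each f In folder.Files
    If DateDiff("h", f.DateLastModified, Now()) >= 1 Then
        f.Delete
    End If
Next

Set folder = Nothing
Set fso = Nothing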