What should I use for writing a file to the response? There are two different options as I see it. Option one is to read the file into a byte array and then write the bytes to the browser with
Response.BinaryWrite(fileBytes)
Next option is to just write the file from the file system directly with Response.WriteFile. Any advantages/disadvantages with either approach?
Response.TransmitFile is preferred if you have the file on disk and are using at least ASP.NET 2.0.
Response.WriteFile reads the whole file into memory and then writes it to the response, whereas TransmitFile "writes the specified file directly to an HTTP response output stream without buffering it in memory."
The other consideration is whether this is a file which is written once or frequently. If you are writing the file frequently, you might want to cache it, in which case Response.BinaryWrite makes the most sense.
If you already have the content in memory, I would not write it to the file system just to use Response.WriteFile.
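The trade-off is easy to see with plain streams: WriteFile buffers the whole file, while TransmitFile streams it through a small buffer. Here is a minimal sketch of that chunked-copy idea, using a MemoryStream as a stand-in for the response stream (file size, path, and buffer size are all made up for illustration):

```csharp
using System;
using System.IO;

// Write a sample "file" to disk (stand-in for the file you would serve).
string path = Path.GetTempFileName();
File.WriteAllBytes(path, new byte[100_000]);

// Chunked copy: only 8 KB is ever held in memory at once, regardless of
// file size. In a real handler the destination would be the response stream.
using var destination = new MemoryStream();
using (var source = File.OpenRead(path))
{
    var buffer = new byte[8192];
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        destination.Write(buffer, 0, read);
}
File.Delete(path);

Console.WriteLine(destination.Length); // same size as the source file
```

Reading the whole file into a byte array for BinaryWrite, by contrast, costs memory proportional to the file size, which is why it only pays off when you cache the array and reuse it.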
The ChannelSftp class has versions of get() and put() methods not returning anything, or returning InputStream/OutputStream.
What's the use case for using the methods returning streams, and reading/writing the files byte by byte, versus using the easy get() and put() methods where you specify the source and destination paths, and let the program do everything for you?
My guess is that downloading and playing a video/audio file would be one such case, but what if you are just moving files from one server to another? Is there any point in using the streams then?
Here is the documentation:
http://epaul.github.io/jsch-documentation/javadoc/com/jcraft/jsch/ChannelSftp.html#get(java.lang.String,%20java.lang.String)
As with any other I/O interface, the stream variants are useful when you are working not with files but with in-memory data.
For example, you might have produced the content based on user input and you want to upload it. You do not need the local copy in a file. So you stream the in-memory data to SFTP.
Streams are also a useful abstraction.
If you are uploading from a file or downloading to a file, use the overloads that take paths; creating a file stream yourself is just unnecessary overhead in that case.
My ASP.NET application has a FileUpload control. My server doesn't have any antivirus program. If I add a byte to the binary content of the file before saving it, can a virus in the file still affect my server? When serving the file back, I will remove the extra byte from the content.
Thanks for replies.
A virus will only cause you problems if it is run on the server (i.e. the file is opened). You can get around this by renaming all uploaded files with a .resources extension. All requests for this type of file are sent by IIS to ASP.NET, which rejects them. So effectively, the files store the data but can't be opened/run at all. Then you can still serve them back by reading their content in an ASP.NET page/module, and returning the data as a file with the correct extension.
Transforming the data as you suggest will also provide a level of protection, though I'd probably do more than add a byte to the end. Perhaps run the whole stream through a reversible algorithm (e.g. a fast encryption or something).
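As an illustration of that reversible-transform idea (a sketch only, not a security guarantee; the helper and key are made up), even a single XOR pass destroys any file signature, and applying the same pass again restores the original:

```csharp
using System;
using System.Linq;

// Hypothetical helper: XOR every byte with a fixed key. Applying it twice
// restores the original, so the same method both "defangs" and restores.
static byte[] Transform(byte[] data, byte key = 0x5A) =>
    data.Select(b => (byte)(b ^ key)).ToArray();

byte[] original = { 0x4D, 0x5A, 0x90, 0x00 }; // e.g. the start of an EXE header
byte[] stored = Transform(original);           // signature destroyed on disk
byte[] restored = Transform(stored);           // identical to the original again

Console.WriteLine(stored.SequenceEqual(original));   // False
Console.WriteLine(restored.SequenceEqual(original)); // True
```

A real encryption algorithm would be stronger, but the principle is the same: the stored bytes no longer look like a runnable file to anything that might open them.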
Of course, this doesn't protect the client from any virus.
I need to export a large amount of data (~100 MB) from a SQL table to a user via the web. What would be the best solution for doing so? One thought was to export the data to a folder on the DB server, compress it (by some means) and then provide a download link for the user. Are there any other methods? Also, can we compress data from within SQL Server?
Any approaches are welcome.
I wouldn't tie up the database waiting for the user to download 100 MB, even for a high-speed user. When the user requests the file, have them specify an email address. Then call an async process to pull the data, write it to a temp file (you don't want >100 MB in memory, after all), zip the temp file to a storage location, and send the user an email with a link to download the file.
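The background part of that pipeline might be sketched as below, using System.IO.Compression for the zip step (the paths and CSV content are made up, the SQL read and the email are elided; a real job would stream rows from a SqlDataReader into the temp file):

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Hypothetical export: write the query results to a temp file first,
// so the full data set never sits in memory.
string tempCsv = Path.Combine(Path.GetTempPath(), "export.csv");
File.WriteAllText(tempCsv, "field1,field2\r\n1,2\r\n");

// Zip the temp file into the download location, then remove the temp copy.
string zipPath = Path.Combine(Path.GetTempPath(), "export.zip");
if (File.Exists(zipPath)) File.Delete(zipPath);
using (var zip = ZipFile.Open(zipPath, ZipArchiveMode.Create))
    zip.CreateEntryFromFile(tempCsv, "export.csv");
File.Delete(tempCsv);

// Final step (elided): email the user a link pointing at zipPath.
Console.WriteLine(File.Exists(zipPath)); // True
```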
You can respond to a page request with a file:
Response.AddHeader("Content-Disposition",
"attachment; filename=yourfile.csv");
Response.ContentType = "text/plain";
Be sure to turn buffering off, so IIS can start sending the first part of the file while you are building the second:
Response.BufferOutput = false;
After that, you can start writing the file like:
Response.Write("field1,field2,field3\r\n");
When the file is completely written, end the response, so ASP.NET doesn't append a web page to your file:
Response.End();
This way, you don't have to write files on your web servers, you just create the files in memory and send them to your users.
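The steps above can be sketched as a row-writer that works against any Stream; in the real page you would pass the response output stream after setting the headers and BufferOutput shown above (the field names and rows here are made up):

```csharp
using System;
using System.IO;
using System.Text;

// Writes CSV rows straight to an output stream, one row at a time, so the
// whole export is never held in memory at once.
static void WriteCsv(Stream output)
{
    using var writer = new StreamWriter(output, new UTF8Encoding(false), 8192, leaveOpen: true);
    writer.Write("field1,field2,field3\r\n");
    // A real export would iterate a SqlDataReader here; these rows are canned.
    for (int i = 0; i < 3; i++)
        writer.Write($"a{i},b{i},c{i}\r\n");
}

// Stand-in for the response stream:
using var ms = new MemoryStream();
WriteCsv(ms);
Console.WriteLine(Encoding.UTF8.GetString(ms.ToArray()));
```

With buffering off, each chunk goes out to the client as it is written, so the download starts immediately.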
If compression is required, you can write a ZIP file in the same way using a free ZIP library.
Your approach works fine. SSIS + 7zip might be useful for automating the process if you need to do it more than a couple times.
If XML is OK, one approach would be to select the data "FOR XML" like this:
http://www.sqljunkies.ddj.com/Article/296D1B56-8BDD-4236-808F-E62CC1908C4E.scuk
And then spit out the raw XML directly to the browser as content-type: text/xml. Also be sure to set up Gzip compression on your web server for files with XML extensions. http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/502ef631-3695-4616-b268-cbe7cf1351ce.mspx?mfr=true
This will shrink the XML file down to 1/3 or maybe 1/4 the size as it's transferred. This wouldn't be the highest performance option because of the inherent wasted space in XML files, but a lot depends on what format you're looking for in the end.
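That compression claim is easy to check: XML's repeated tags make it highly compressible. A quick gzip round-trip on a synthetic payload (the element names and row count are made up):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

// Build a repetitive XML payload, similar in shape to a FOR XML result.
var sb = new StringBuilder("<rows>");
for (int i = 0; i < 1000; i++)
    sb.Append($"<row id=\"{i}\"><name>customer {i}</name></row>");
sb.Append("</rows>");
byte[] xml = Encoding.UTF8.GetBytes(sb.ToString());

// Gzip it, as IIS would when compression is enabled for .xml responses.
using var compressed = new MemoryStream();
using (var gz = new GZipStream(compressed, CompressionMode.Compress, leaveOpen: true))
    gz.Write(xml, 0, xml.Length);

Console.WriteLine($"{xml.Length} -> {compressed.Length} bytes");
```

On data like this the compressed stream comes out far smaller than a third of the original, so the wire cost of the XML verbosity largely disappears.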
Another option would be to use the free CSharpZipLib to compress the XML (or whatever format you want) into a zip file that the user would download. Along those lines, if this is something that will be used frequently you might want to look into caching and storing the zip file on the web server with some sort of expiration so it's not regenerated for every single request.
The download link is a perfectly valid and reasonable solution. Another would be to automatically redirect the user to that file so they didn't need to click a link. It really depends on your workflow and UI experience.
I would advise against implementing compression in the SQL Server engine. Instead, look at the DotNetZip library (or System.IO.Compression, if you think your users are able to uncompress gzip archives) and implement the compression within the web application.
When is PostedFile.InputStream available when uploading a large file?
I'd like to pass a Stream to another process and I'm hoping that if a large file was being uploaded that I can pass the Stream straight to that new process w/o writing to the file system. Since the process and/or upload could take a while, I'm wondering if I can start reading the InputStream immediately or whether I have to wait for the whole file to be transferred to the server before it can be processed.
I guess a more general question is - what's the lifecycle of a POST request when file upload is involved?
The PostedFile.InputStream isn't available until the entire file has been uploaded. IIS6 caches the file in memory while IIS7 now caches the file to disk before handing off the input stream to your method.
You can get a HttpModule such as NeatUpload which allows you access to the bits while they're uploading.
Is the WriteFile call properly synchronous, and can I delete the file written immediately after the call?
If you're writing a file to the client with Response.WriteFile(), a call to Response.Flush() will make sure it's been fully output to the client. Once that's done you can delete it off of the webserver.
You may want to come up with a more robust system if the file is mission-critical. Say, a client-side script to validate that the file was received OK and then alerts the webserver that the file can be deleted.
That is the solution: after the call to Response.WriteFile(fileName);, add the following lines:
Response.Flush();
System.IO.File.Delete(fullPathFileName);
Response.End();
It is fully synchronous, as you can see by looking at the implementation of HttpResponse.WriteFile with Lutz Reflector. You can delete the file immediately after the call to Response.WriteFile.
You don't have the guarantee that the response stream has been completely transmitted to the client, but calling Response.Flush doesn't give you that guarantee either. So I don't see a need to call Response.Flush before deleting the file.
Avoid loading the file into a MemoryStream, it brings you no benefit, and has a cost in memory usage, especially for large files.
If memory serves, it is synchronous, as are the rest of the Response methods.
TransmitFile
You can also call TransmitFile to let IIS take care of it. The file actually gets sent by IIS outside of your worker process.
Memory Stream
If you are REALLY paranoid, don't send the file. Load it into a memory stream (if the size is reasonable) and transmit that. Then you can delete the file whenever you like. The file on disk will never be touched by IIS.
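A minimal sketch of that approach: once the bytes are in a MemoryStream, the disk file can be deleted immediately and the data stays fully usable (in a real handler you would then copy the stream to the response; the file path and content here are made up):

```csharp
using System;
using System.IO;

string path = Path.GetTempFileName();
File.WriteAllText(path, "payload");

// Copy the file into memory, then release the disk copy right away.
var ms = new MemoryStream(File.ReadAllBytes(path));
File.Delete(path);

// The data is still intact even though the file is gone.
Console.WriteLine(File.Exists(path));                // False
Console.WriteLine(new StreamReader(ms).ReadToEnd()); // prints "payload"
```

The obvious caveat from the answer above applies: this only works when the file is small enough that holding a full copy in memory is reasonable.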