FileUpload virus protection of server - asp.net

My ASP.NET application has a FileUpload control. My server doesn't have any antivirus program. If I add a byte to the binary content of the file before saving it, can a virus in the file still affect my server? When serving the file back, I will remove the extra byte from the content.
Thanks for replies.

A virus will only cause you problems if it is run on the server (i.e. the file is opened). You can get around this by renaming all uploaded files with a .resources extension. All requests for this type of file are sent by IIS to ASP.NET, which rejects them. So effectively, the files store the data but can't be opened/run at all. Then you can still serve them back by reading their content in an ASP.NET page/module, and returning the data as a file with the correct extension.
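For illustration, a serve-back handler along those lines might look like this (a sketch only; the upload folder, the "id" query parameter and the MIME handling are assumptions, not from the answer):
// Hypothetical .ashx handler that serves a stored .resources upload back to the client.
// using System.IO; using System.Web;
public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Path.GetFileName guards against path traversal in the assumed "id" parameter.
        string id = Path.GetFileName(context.Request.QueryString["id"]);
        string path = context.Server.MapPath("~/App_Data/uploads/" + id + ".resources");
        context.Response.ContentType = "application/octet-stream"; // or the MIME type you recorded at upload time
        context.Response.AddHeader("Content-Disposition", "attachment; filename=original-name.ext");
        context.Response.TransmitFile(path); // streams from disk without loading the whole file into memory
    }
    public bool IsReusable { get { return true; } }
}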
Transforming the data as you suggest will also provide a level of protection, though I'd probably do more than add a byte to the end. Perhaps run the whole stream through a reversible algorithm (e.g. a fast encryption or something).
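For instance, a sketch of that idea using AES via CryptoStream (the fileUpload control name and savePath are placeholders, key/IV storage is omitted, and Stream.CopyTo needs .NET 4):
// using System.IO; using System.Security.Cryptography;
// Encrypt the uploaded bytes on the way to disk; serve them back later with CreateDecryptor.
using (var aes = Aes.Create()) // in real code, persist aes.Key and aes.IV somewhere safe
using (FileStream output = File.Create(savePath))
using (var crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
{
    fileUpload.PostedFile.InputStream.CopyTo(crypto);
}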
Of course, this doesn't protect the client from any virus.

Save file directly to disk in ASP.NET without loading it into memory

I have an ASP.NET web application and I want my users to be able to upload large files. However, some files are very large and use too much memory.
In principle it should be possible to receive the request stream and write it directly to a FileStream, removing any need to load the entire file into memory first.
I've tried accessing Request.InputStream and writing it directly to a file. It works, but a test using larger files reveals that Request.InputStream is only available after the entire request has already been loaded into memory.
Can someone tell me an approach I can use to receive a normal Request.InputStream in ASP.NET and write it directly to a file without first loading it into memory?
Note, the file is sent through a normal request in a browser by posting a form with a file field.
(I actually use the blueimp jQuery File Upload plugin, but I don't think that's relevant to this question.)
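For reference, one API aimed at exactly this problem in ASP.NET 4 and later is HttpRequest.GetBufferlessInputStream, which yields the request bytes as they arrive instead of buffering the whole request first. A minimal sketch, with the handler name and target path assumed:
// using System.IO; using System.Web;
public class StreamingUploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Note: this is the raw multipart body, so the multipart framing still has to be parsed out.
        using (Stream input = context.Request.GetBufferlessInputStream())
        using (FileStream output = File.Create(context.Server.MapPath("~/App_Data/upload.tmp")))
        {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);
        }
    }
    public bool IsReusable { get { return true; } }
}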
The process is called byte serving.
Byte Serving:
Byte serving is the process of sending only a portion of an HTTP/1.1 message from a server to a client. Byte serving begins when an HTTP server advertises its willingness to serve partial requests using the Accept-Ranges response header. A client then requests a specific part of a file from the server using the Range request header.
It seems that IIS and ASP.NET are capable of handling Accept-Ranges/Range headers. There is a Range controller sample in Microsoft's repositories on GitHub.
Here is an article that may be useful in configuring IIS to handle these requests.
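A rough sketch of honoring a simple Range request in an ASP.NET handler (the file location is assumed; suffix ranges, multiple ranges and error handling are omitted):
// Handles requests like "Range: bytes=0-1023" against a file on disk.
string path = context.Server.MapPath("~/App_Data/bigfile.bin"); // assumed location
long length = new FileInfo(path).Length;
long start = 0, end = length - 1;
string range = context.Request.Headers["Range"];
if (range != null && range.StartsWith("bytes="))
{
    string[] parts = range.Substring("bytes=".Length).Split('-');
    start = long.Parse(parts[0]);
    if (parts[1] != "") end = long.Parse(parts[1]);
    context.Response.StatusCode = 206; // Partial Content
    context.Response.AddHeader("Content-Range", string.Format("bytes {0}-{1}/{2}", start, end, length));
}
context.Response.AddHeader("Accept-Ranges", "bytes");
context.Response.ContentType = "application/octet-stream";
context.Response.TransmitFile(path, start, end - start + 1); // streams just the requested slice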

Uploading file in web applications

This may be an immature question, but...
When we use the HTML input file control to upload a file, the browser masks the full path of the file for security, i.e.: C:\fakepath\XXXXXX.txt
Why does this security have to be enforced? Since the client is the one uploading the file, he obviously knows the location, so why can't client script just read the full path?
And how does the server get the stream of bytes from the client?
Can somebody explain to me what is happening behind the scenes?
OS: Windows; Browsers: all
The server does not need to know the local path; the browser just sends it a stream of bytes. The local path is displayed only for the user's benefit, nothing else.
If you are asking how the BROWSER knows where the file is, that is a good question, but you didn't say which OS you are using.
You should know that the server is completely separated from the client.
The client application sends the server a message which contains the content of the file and a file name (just the name of the file, not the directory). The change of the actual path to C:\fakepath\* is made only to prevent scripts on the client's side from learning anything about the original location, which may contain sensitive information you don't want to publish.
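Concretely, the browser posts something like the following multipart/form-data message (boundary and values made up for illustration); note that the filename attribute carries only the bare file name, never the client's directory:

POST /upload.aspx HTTP/1.1
Host: example.com
Content-Type: multipart/form-data; boundary=----boundary123

------boundary123
Content-Disposition: form-data; name="file"; filename="XXXXXX.txt"
Content-Type: text/plain

...raw bytes of the file...
------boundary123--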

How does HTTP file upload work internally in IIS?

I'd like to understand what happens under the hood when you do a web upload.
I guess it's one of these:
1. The file is loaded into memory by the browser, sent to the web server's buffer memory, and then the app is notified to collect it.
2. The file is read by the browser and at the same time sent to the web server, which can start to save the bytes progressively.
I've tried uploading a very large file and putting a breakpoint on the first line of the method receiving the upload. I saw the browser take a long time loading... the breakpoint was still not hit, and only after a while was it hit.
I want to understand this because, in the worst-case scenario, if I allow big uploads they could blow up the server's memory at some point.
What happens if I upload a 2 GB file (assuming the web server/app accepts that length)? Would it take 2 GB of server memory?
Cheers.
The documentation for the HttpPostedFile class (which represents a file uploaded to the server in ASP.NET) specifies:
Files are uploaded in MIME multipart/form-data format. By default, all requests, including form fields and uploaded files, larger than 256 KB are buffered to disk, rather than held in server memory.
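Both the request size limit and the disk-buffering threshold are configurable via the httpRuntime element in web.config; a sketch (the values here are just examples; both attributes are in KB):
<configuration>
  <system.web>
    <!-- Allow uploads up to ~2 GB; spill anything over 256 KB to disk instead of memory -->
    <httpRuntime maxRequestLength="2097151" requestLengthDiskThreshold="256" />
  </system.web>
</configuration>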

ASP.net file operations delay

Ok, so here's the problem: I'm reading the stream from a FileUpload control, reading in chunks of n bytes and writing the array in a loop until I reach the stream's end.
Now the reason I do this is because I need to check several things while the upload is still going on (rather than calling Save(), which does the whole thing in one go). Here's the problem: when uploading from the local machine, I can see the file just fine as it's uploading, and its size increases (I had to add a Sleep() call in the loop to actually see the file being written).
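The loop in question presumably looks something like this (a sketch; the fileUpload control name and savePath are placeholders, and the per-chunk checks go inside the loop):
// using System.IO;
using (Stream input = fileUpload.PostedFile.InputStream)
using (FileStream output = new FileStream(savePath, FileMode.Create))
{
    byte[] buffer = new byte[4096];
    int read;
    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, read);
        output.Flush(); // push bytes to disk so progress is visible
        // ...per-chunk checks / progress logging here...
    }
}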
However, when I upload the file from a remote machine, I don't see it until the file has finished uploading. I've also added a call that writes progress to a text file as the upload goes on, and I get the same behaviour. Local: the file updates as the upload goes on; remote: the token file only appears after the upload is done (which is somewhat useless, since I need it while the upload is still happening).
Is there some sort of security setting in IIS (or ASP.NET) that saves files to a temporary location for remote machines, as opposed to the local machine, and then moves them to the specified destination? I would liken this to ASP.NET displaying detailed error messages when browsing from the local machine (even on the public hostname), as opposed to the generic compilation error page/generic exception page shown when browsing from a remote machine (when customErrors is not off).
Any clues on this?
Thanks in advance.
The FileUpload control renders as an <input type="file"> HTML element; your browser will open that file, read ALL of its content, encode it and send it.
Your ASP.NET request only starts after IIS has received all of the browser's data.
That's why you would need to code a client component (Flash, Java applet, Silverlight) to send the file in small chunks and rebuild it on the server side.
EDIT: Some information on MSDN:
To control whether the file to upload is temporarily stored in memory or on the server while the request is being processed, set the requestLengthDiskThreshold attribute of the httpRuntime element. This attribute enables you to manage the size of the input stream buffer. The default is 256 bytes. The value that you specify should not exceed the value that you specify for the maxRequestLength attribute.
I understand that you want to check the content of the file while it is being uploaded.
If this is your requirement, then why not add a textbox and populate it while you are reading the file from HttpPostedFile?

Export large amounts of data to client in asp.net

I need to export a large amount of data (~100 MB) from a SQL table to a user via the web. What would be the best solution for doing so? One thought was to export the data to a folder on the DB server, compress it (by some means) and then provide a download link for the user. Are there any other methods for doing this? Also, can we compress data from within SQL Server?
Any approaches are welcome.
I wouldn't tie up the database waiting for the user to download 100 MB, even for a high-speed user. When the user requests the file, have them specify an email address. Then call an async process to pull the data and write it to a temp file (you don't want >100 MB in memory, after all), zip the temp file to a storage location, and send the user an email with a link to download the file.
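A sketch of the "pull the data to a temp file" step (connectionString, the query and tempPath are placeholders; SqlDataReader streams rows, so nothing close to 100 MB ever sits in memory):
// using System.Data.SqlClient; using System.IO;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Field1, Field2, Field3 FROM BigTable", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (var writer = new StreamWriter(tempPath))
    {
        writer.WriteLine("field1,field2,field3");
        while (reader.Read())
            writer.WriteLine("{0},{1},{2}", reader[0], reader[1], reader[2]);
    }
}
// ...then zip tempPath, move the zip to the download area, and email the link...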
You can respond to a page request with a file:
Response.AddHeader("Content-Disposition", "attachment; filename=yourfile.csv");
Response.ContentType = "text/plain";
Be sure to turn buffering off, so IIS can start sending the first part of the file while you are still building the rest:
Response.BufferOutput = false;
After that, you can start writing the file content like this:
Response.Write("field1,field2,field3\r\n");
When the file is completely written, end the response, so ASP.NET doesn't append a web page to your file:
Response.End();
This way, you don't have to write files on your web servers; you generate the content on the fly and stream it to your users.
If compression is required, you can write a ZIP file in the same way. This is a nice free library to create ZIP files.
Your approach works fine. SSIS + 7-Zip might be useful for automating the process if you need to do it more than a couple of times.
If XML is OK, one approach would be to select the data "FOR XML" like this:
http://www.sqljunkies.ddj.com/Article/296D1B56-8BDD-4236-808F-E62CC1908C4E.scuk
And then spit out the raw XML directly to the browser as content-type: text/xml. Also be sure to set up gzip compression on your web server for files with XML extensions. http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/502ef631-3695-4616-b268-cbe7cf1351ce.mspx?mfr=true
This will shrink the XML file down to 1/3 or maybe 1/4 of its size as it's transferred. This wouldn't be the highest-performance option because of the inherent wasted space in XML files, but a lot depends on what format you're looking for in the end.
Another option would be to use the free SharpZipLib to compress the XML (or whatever format you want) into a zip file that the user would download. Along those lines, if this is something that will be used frequently, you might want to look into caching the zip file on the web server with some sort of expiration, so it's not regenerated for every single request.
The download link is a perfectly valid and reasonable solution. Another would be to automatically redirect the user to that file so they didn't need to click a link. It really depends on your workflow and UI experience.
I would suggest against implementing compression in the SQL Server engine. Instead, look at the DotNetZip library (or System.IO.Compression, if you think your users are able to decompress gzip archives) and implement the compression within the web application.
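For the System.IO.Compression route, a minimal sketch (field names and the output filename are placeholders) that gzips the CSV while streaming it to the client:
// using System.IO; using System.IO.Compression;
Response.BufferOutput = false;
Response.ContentType = "application/gzip";
Response.AddHeader("Content-Disposition", "attachment; filename=export.csv.gz");
using (var gzip = new GZipStream(Response.OutputStream, CompressionMode.Compress))
using (var writer = new StreamWriter(gzip))
{
    writer.WriteLine("field1,field2,field3");
    // ...write one line per row from your data source here...
}
Response.End();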
