ASP.NET - copy file from webserver to USB key

I'm working on a task where I need to create a file server-side and move this file to a USB key.
Is it possible to copy a file from a web server to a USB key?
(Are there any security issues?)
Furthermore, the user needs to indicate the path the file should be saved to. Is there a control like the ASP.NET upload control where the user can browse to the right directory, or is the simple solution to use a textbox where the user can write e.g. "E:\mygeneratedfiles"?
The USB key is on the user's local machine.

From the ASP.NET perspective, you can return the file in the HTTP response, but once the file is sent to the client web browser, you're pretty much out of luck.
There might be something you can do with javascript to streamline the saving process (not my area of expertise), but accessing the client's filesystem directly, especially writing to it, is out of the question. If you want to do that you'll have to write an ActiveX control or similar type of plugin.
Edit:
For returning the file in the HTTP response, load your file in to a 1-dimensional byte array and use the following code pattern:
context.Response.Clear()
context.Response.ContentType = "application/octet-stream"
context.Response.AddHeader("content-disposition", "attachment;filename=" & objFile.FileName)
context.Response.BinaryWrite(objFile.FileImage)
context.Response.End()
In this example objFile.FileName is the file name string and objFile.FileImage is a Byte array containing the file. context is the current HttpContext.

See the code samples on the FileUpload control documentation page.

Related

Uploading file in web applications

This may be an immature question, but...
When we use the HTML input file control to upload a file, the browser replaces the full path of the file with a fake one for security, e.g.: C:\fakepath\XXXXXX.txt
Why does security have to be enforced here? Since the client is the one uploading the file and obviously knows the location, why can't client script just see the full path?
And how does the server get the stream of bytes from the client?
Can somebody explain to me what is happening behind the scenes?
OS: Windows; browsers: all.
The server does not need to know the local path; the browser just sends it a stream of bytes. The local path is cosmetic, for the user's benefit, nothing else.
If you ask how the BROWSER knows where the file is, that is a good question, but you didn't say what your OS is.
You should know that the server is completely separated from the client.
The client application sends the server a message which contains the content of the file and a file name (just the name of the file, not the directory). The replacement of the actual path with C:\fakepath\ is done only to prevent scripts on the client's side from learning anything about the original location, which may contain sensitive information you don't want to publish.
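For illustration, this is roughly what a multipart/form-data upload request looks like on the wire; note that only the bare file name travels to the server, never the client-side path (the boundary, host and field names here are made up):

```http
POST /upload HTTP/1.1
Host: example.com
Content-Type: multipart/form-data; boundary=----boundary123

------boundary123
Content-Disposition: form-data; name="file1"; filename="XXXXXX.txt"
Content-Type: text/plain

(raw bytes of the file go here)
------boundary123--
```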

FileUpload virus protection of server

My ASP.NET application has a FileUpload control. My server doesn't have any antivirus program. If I add a byte to the binary content of the file before saving it, can my server still be affected by a virus? When serving the file back, I will remove the extra byte from the content.
Thanks for the replies.
A virus will only cause you problems if it is run on the server (i.e. the file is opened). You can get around this by renaming all uploaded files with a .resources extension. All requests for this type of file are sent by IIS to ASP.NET, which rejects them. So effectively, the files store the data but can't be opened/run at all. Then you can still serve them back by reading their content in an ASP.NET page/module, and returning the data as a file with the correct extension.
Transforming the data as you suggest will also provide a level of protection, though I'd probably do more than add a byte to the end. Perhaps run the whole stream through a reversible algorithm (e.g. a fast encryption or something).
Of course, this doesn't protect the client from any virus.
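As a minimal example of such a reversible transform, you could XOR every byte with a fixed key (the key value below is an arbitrary example); applying the same method a second time restores the original bytes:

```csharp
// XOR is its own inverse: XorTransform(XorTransform(x, key), key) == x.
// A one-byte XOR is trivial to reverse, so treat this as obfuscation
// (enough to stop the file executing as-is), not as real encryption.
public static byte[] XorTransform(byte[] data, byte key)
{
    byte[] result = new byte[data.Length];
    for (int i = 0; i < data.Length; i++)
        result[i] = (byte)(data[i] ^ key);
    return result;
}
```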

Storing a file on the server AND in the database?

I'm using MVC 3 and SQL Server 2008 R2.
I'm building a File Management tool for my client, so they can store images and pdfs, and then insert them onto pages.
I've been looking into FILESTREAM in SQL Server, and it looks great. It allows me to store the files, as well as keep a backup of them in case something goes wrong.
But I also want the files stored physically somewhere, so that my client can send a link to someone, like: http://www.mysite.com/files/mydoc.pdf
What's the best practice here? Is it safe and/or smart to use both?
If serving up the files via URL is your only reason for storing on the server too, I'd say that you don't want to do this.
In MVC it's trivial to create a controller action that handles that URL, looks up the file in the DB and returns it to the user. The file name is simply an action parameter in this case: your action logic takes that parameter, retrieves the file from the database and returns it with a FileStreamResult:
return File(fileStream, contentType, fileName);
Here's some more info; about halfway down the page there is an example of using FileStreamResult:
http://www.mikesdotnetting.com/Article/125/ASP.NET-MVC-Uploading-and-Downloading-Files
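A minimal sketch of such a controller action (the `StoredFile` type and `IFileRepository` data-access component are hypothetical stand-ins for your own lookup code; only `File(...)` comes from MVC):

```csharp
public class FilesController : Controller
{
    // Hypothetical repository that looks the file up in the database.
    private readonly IFileRepository _repository;

    public FilesController(IFileRepository repository)
    {
        _repository = repository;
    }

    // Handles e.g. GET /files/mydoc.pdf via a route like "files/{fileName}".
    public ActionResult Download(string fileName)
    {
        StoredFile file = _repository.GetByName(fileName);
        if (file == null)
            return HttpNotFound();

        // Streams the bytes back with the right Content-Type and file name.
        return File(file.Content, file.ContentType, file.Name);
    }
}
```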

ASP.net file operations delay

Ok, so here's the problem: I'm reading the stream from a FileUpload control, reading in chunks of n bytes and writing the array in a loop until I reach the stream's end.
Now the reason I do this is because I need to check several things while the upload is still going on (rather than doing a Save(); which does the whole thing in one go). Here's the problem: when doing this from the local machine, I can see the file just fine as it's uploading and its size increases (had to add a Sleep(); clause in the loop to actually get to see the file being written).
However, when I upload the file from a remote machine, I don't get to see it until the file has completed uploading. Also, I've added another call to write the progress to a text file as the upload is going on, and I get the same thing. Local: the file updates as the upload goes on; remote: the token file only appears after the upload's done (which is somewhat useless since I need it while the upload's still happening).
Is there some sort of security setting in IIS (or ASP.NET) that saves files to a temporary location for remote machines, as opposed to the local machine, and then moves them to the specified destination? I would liken this to ASP.NET displaying detailed error messages when browsing from the local machine (even on the public hostname), as opposed to the generic compilation error page/generic exception page that is shown when browsing from a remote machine (when customErrors are not off).
Any clues on this?
Thanks in advance.
The FileUpload control renders as an <input type="file"> HTML element; your browser opens that file, reads ALL of its content, encodes it and sends it.
Your ASP.NET request only starts after IIS has received all of the browser's data.
So to process a file as it arrives, you'll need to code a client component (Flash, Java applet, Silverlight) that sends the file in small chunks, and rebuild it at server-side.
EDIT: Some information on MSDN:
To control whether the file to upload is temporarily stored in memory or on the server while the request is being processed, set the requestLengthDiskThreshold attribute of the httpRuntime element. This attribute enables you to manage the size of the input stream buffer. The default is 256 bytes. The value that you specify should not exceed the value that you specify for the maxRequestLength attribute.
I understand that you want to check the content of the file which is being uploaded.
If this is your requirement, then why not add a textbox and populate it while you are reading the file from HttpPostedFile?
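A sketch of the chunked read the question describes (the buffer size is an arbitrary choice, and `savePath` is a hypothetical destination path you'd supply yourself):

```csharp
// Read the posted file in chunks so each chunk can be inspected
// while the upload is written to disk.
HttpPostedFile posted = FileUpload1.PostedFile;
byte[] buffer = new byte[8192];          // arbitrary chunk size
int bytesRead;

using (FileStream output = File.Create(savePath))  // savePath: hypothetical
{
    while ((bytesRead = posted.InputStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Inspect the chunk here (content checks, progress tracking, ...)
        output.Write(buffer, 0, bytesRead);
    }
}
```

Note that, as the answer above explains, this loop only runs after IIS has already received the whole request; it does not observe the client's upload in real time.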

Export large amounts of data to client in asp.net

I need to export a large amount of data (~100 MB) from a SQL table to a user via the web. What would be the best solution for doing so? One thought was to export the data to a folder on the DB server, compress it (by some means) and then provide a download link for the user. Any other methods for doing so? Also, can we compress data from within SQL Server?
Any approaches are welcome.
I wouldn't tie up the database waiting for the user to download 100 MB, even for a high-speed user. When the user requests the file, have them specify an email address. Then call an async process to pull the data, write it to a temp file (you don't want > 100 MB in memory, after all), zip the temp file to a storage location, and send the user an email with a link to download the file.
You can respond to a page request with a file:
Response.AddHeader("Content-Disposition", "attachment; filename=yourfile.csv");
Response.ContentType = "text/plain";
Be sure to turn buffering off, so IIS can start sending the first part of the file while you are building the second:
Response.BufferOutput = false;
After that, you can start writing the file like:
Response.Write("field1,field2,field3\r\n");
When the file is completely written, end the response, so ASP.NET doesn't append a web page to your file:
Response.End();
This way, you don't have to write files on your web servers, you just create the files in memory and send them to your users.
If compression is required, you can write a ZIP file in the same way; free libraries such as DotNetZip can create ZIP files.
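Putting the steps above together, a sketch of streaming a CSV export straight from a data reader (the connection string, table and column names are hypothetical):

```csharp
Response.Clear();
Response.AddHeader("Content-Disposition", "attachment; filename=yourfile.csv");
Response.ContentType = "text/plain";
Response.BufferOutput = false;   // start sending while rows are still being read

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT Field1, Field2, Field3 FROM BigTable", conn))  // hypothetical query
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        Response.Write("field1,field2,field3\r\n");
        while (reader.Read())
        {
            Response.Write(string.Format("{0},{1},{2}\r\n",
                reader[0], reader[1], reader[2]));
        }
    }
}

Response.End();   // stop ASP.NET from appending page markup to the file
```

A real export would also need to escape commas, quotes and newlines inside field values; this sketch omits that for brevity.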
Your approach works fine. SSIS + 7-Zip might be useful for automating the process if you need to do it more than a couple of times.
If XML is OK, one approach would be to select the data "FOR XML" like this:
http://www.sqljunkies.ddj.com/Article/296D1B56-8BDD-4236-808F-E62CC1908C4E.scuk
And then spit out the raw XML directly to the browser as content-type: text/xml. Also be sure to set up Gzip compression on your web server for files with XML extensions. http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/502ef631-3695-4616-b268-cbe7cf1351ce.mspx?mfr=true
This will shrink the XML file down to 1/3 or maybe 1/4 the size as it's transferred. This wouldn't be the highest performance option because of the inherent wasted space in XML files, but a lot depends on what format you're looking for in the end.
Another option would be to use the free SharpZipLib to compress the XML (or whatever format you want) into a zip file that the user would download. Along those lines, if this is something that will be used frequently, you might want to look into caching and storing the zip file on the web server with some sort of expiration so it's not regenerated for every single request.
The download link is a perfectly valid and reasonable solution. Another would be to automatically redirect the user to that file so they didn't need to click a link. It really depends on your workflow and UI experience.
I would suggest against implementing compression in the SQL Server engine. Instead, look at the DotNetZip library (or System.IO.Compression, if you think your users are able to uncompress gzip archives) and implement the compression within the web application.
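A sketch of zipping a generated export with DotNetZip before returning it (the source file path is hypothetical; `Ionic.Zip.ZipFile` is DotNetZip's main type):

```csharp
using Ionic.Zip;   // DotNetZip

// ... inside a page or HTTP handler:
Response.Clear();
Response.ContentType = "application/zip";
Response.AddHeader("Content-Disposition", "attachment; filename=export.zip");

using (ZipFile zip = new ZipFile())
{
    // Hypothetical path to the previously generated export file.
    zip.AddFile(Server.MapPath("~/App_Data/export.csv"), "");
    zip.Save(Response.OutputStream);   // write the archive straight to the response
}
Response.End();
```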
