How can I "lock" files until purchased by user? - asp.net

I'm building a site in which users can purchase MP3 files which they can download from their user login area.
In previous applications I've developed, I would allow admin to upload the file and it would be stored under "/Uploads/MP3s/filename.mp3". However, I need to make this secure so that users cannot gain access to these files until they have purchased them.
What is the best, and most secure, way of doing this?

You should have a database where you store which user bought which MP3. Uploaded MP3s should not be stored in an openly accessible folder. Store them in a folder outside the HTTP root, but make sure IIS has access to that folder. That way nobody can guess the path to a file, because it isn't under the web root.
Use a download page which checks the download permissions and only then sends the MP3 to the user with Response.WriteFile(filename), the correct MIME type, etc.
Protected Sub ServeMP3(ByVal f As FileInfo)
    Response.Clear()
    ' "audio/mpeg" is the registered MIME type for MP3 files
    Response.ContentType = "audio/mpeg"
    Response.AddHeader("content-disposition", "inline; filename=" & f.Name)
    Response.WriteFile(f.FullName)
    Response.End()
End Sub
Instead of "inline" (stream and play), you can use "attachment" to force a file download

Hide them behind an HTTP handler, module, web service or page that can check the validity of the request, and then stream the file or display an error/redirect to the purchase page.
This also has the advantage of completely abstracting away the real paths to the files... security through obscurity (:
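A rough sketch of that idea as a generic handler (in C#; HasPurchased and the D:\Mp3Store path are hypothetical placeholders for your own purchase lookup and a storage location outside the web root):
using System;
using System.IO;
using System.Web;

public class Mp3Handler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The requested file, e.g. Download.ashx?file=song.mp3
        string fileName = Path.GetFileName(context.Request.QueryString["file"]);

        // Only serve the file if the logged-in user actually bought it
        if (!context.Request.IsAuthenticated ||
            !HasPurchased(context.User.Identity.Name, fileName))
        {
            context.Response.Redirect("~/Purchase.aspx");
            return;
        }

        // The MP3s live outside the web root, so they can never be requested directly
        string fullPath = Path.Combine(@"D:\Mp3Store", fileName);
        context.Response.ContentType = "audio/mpeg";
        context.Response.AddHeader("content-disposition", "attachment; filename=" + fileName);
        context.Response.WriteFile(fullPath);
    }

    public bool IsReusable { get { return false; } }

    private bool HasPurchased(string userName, string fileName)
    {
        // Hypothetical lookup against the purchases table described above
        throw new NotImplementedException();
    }
}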

Asp.Net 6.0 Redirect user to a URL containing json content and start download

I have a scenario where the user passes a fileName to download.
We don't download the file on the server and stream it back to the user, because of bandwidth restrictions.
Instead, we get the path of the file to download and redirect to the location where the JSON file is hosted:
[Route("[controller]/DownloadJsonFile")]
public async Task DownloadJsonFile(string fileName)
{
    //Get the file name
    string fileToDownload = "https://hostedfilelocation/....test.json";
    Response.Redirect(fileToDownload);
}
Currently, this method ends up rendering the JSON content in the browser.
Is there a way to make the browser start downloading the file automatically instead?
That way it wouldn't take so long to render the file in the browser.
P.S. If the file is of type zip or gzip, it is not rendered in the browser but is downloaded automatically.
The application is a .NET 6 ASP.NET MVC application.
I have tried the code below, but the behavior is the same: it renders the JSON in the browser instead of downloading it.
string fileToDownload = "https://hostedfilelocation/....test.json";
HttpResponse response = HttpContext.Response;
response.Clear();
response.ContentType = "application/octet-stream";
response.Headers.Add("Content-Disposition", "attachment; filename=" + fileName);
Response.Redirect(fileToDownload);
The approaches mentioned in this blog post all involve rendering the file in an iframe, but I want the download to happen on the client side:
Download File via browser redirect
If you want to download it directly, add the download attribute:
<a class='download-file-link' target='_blank' href='DownloadJsonFile' download="somefilename">
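Note that browsers only honor the download attribute for same-origin URLs, so if the JSON stays on the external host the anchor alone may not be enough. If proxying the bytes is acceptable despite the bandwidth concern, a sketch of forcing the download from the action instead of redirecting might look like this (the IHttpClientFactory field and the URL are assumptions, not part of the original code):
// Sketch only: streams the remote JSON through the action so that the
// Content-Disposition header (and therefore the download prompt) is under our control.
[Route("[controller]/DownloadJsonFile")]
public async Task<IActionResult> DownloadJsonFile(string fileName)
{
    string fileToDownload = "https://hostedfilelocation/....test.json";

    // _httpClientFactory is assumed to be injected via the controller constructor
    var client = _httpClientFactory.CreateClient();
    var stream = await client.GetStreamAsync(fileToDownload);

    // File(...) adds Content-Disposition: attachment with the given file name
    return File(stream, "application/octet-stream", fileName);
}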

Allow folder access only to admin user

I'm new to ASPX and VB.NET and I'm trying to create two different sets of content for two kinds of users.
All the pages for a normal user are ready, and now I'm working on the admin part. I've created an Administrator folder containing an index.aspx that only users who are logged in and have the role "ADMIN" in the database should be able to access.
The login part is done as follows:
Protected Sub loginBtn_Click(sender As Object, e As EventArgs)
    If UserExists(username.Value, password.Value) Then
        FormsAuthentication.SetAuthCookie(username.Value, False)
        If username.Value = "gab" Then
            Page.Response.Redirect("\Administrator\Index.aspx", True)
        Else
            Page.Response.Redirect("Default.aspx", True)
        End If
    Else
        username.Value = ""
        ClientScript.RegisterStartupScript(Me.[GetType](), "alert", "openModal();", True)
    End If
End Sub
For now I just check whether the username is "gab", but later I'd like a function that SELECTs the role from the database.
The issue is that if a normal user logs in and simply types \Administrator\index.aspx into the address bar, he can access that folder; likewise, an administrator who changes the path to "Default.aspx" can access a normal user's content.
I would like a normal user to see only his own aspx pages and the admin to see only the pages in the Administrator folder, but I need some suggestions on how to do it.
There are a number of ways you can do it, including many not listed here.
You may consider checking the permissions of each user on page load and redirecting them when necessary, as sketched below. This does mean that you are hitting the database again on each page load, so you'll need to take that into consideration.
You may also try using client-side storage, like a cookie, and running the checks client side. You'll want to be careful with what you store on the client side, as it may open up security vulnerabilities.
If I knew more about your project, I might be able to give you more specifics.
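A sketch of that page-load check as a base page (shown in C# for illustration; GetUserRole is a hypothetical helper standing in for the SELECT against your users table):
using System;

public class AdminBasePage : System.Web.UI.Page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        // Note: this queries the database on every page load, as mentioned above
        string role = GetUserRole(Context.User.Identity.Name);
        if (role != "ADMIN")
        {
            Response.Redirect("~/Default.aspx", true);
        }
    }

    private string GetUserRole(string userName)
    {
        // Hypothetical lookup: SELECT role FROM users WHERE username = @userName
        throw new NotImplementedException();
    }
}
Pages inside the Administrator folder would then inherit from AdminBasePage rather than from Page directly.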

Avoiding Protected view when opening streamed Excel documents

We have an ASP.NET application which dynamically generates Excel documents and streams them to the client, using Response.WriteFile, see code below. Note that the document is deleted once the file has been written to the client. Thus, no documents are ever left on the server.
However, my client's users have now all upgraded to Office 2010, and now the documents open in "Protected View". In order to edit a document, the user has to click "Enable Editing" first. This is considered unacceptable by the users.
The reason that this happens is that streamed documents are placed in the Temporary Internet files, and this is considered a "potentially unsafe location". And documents in such locations are opened in protected view. I am just hoping there is some way to work around this.
Theoretically, I could place the document in a folder which is accessible from the client, and redirect to the document.
This solution is not an option, however. Firstly, since the document would be left on the server, it could be accessible to other users, which is a problem since the documents may contain confidential data.
There are other reasons why this is not a viable option.
Another theoretical workaround would be to ask all users to disable the setting "Enable protected view for files located in potentially unsafe locations". Naturally, this is not an option either.
So, in short, is there any way to prevent the documents from being opened in "Protected View" while using the streaming technique described below?
Response.Buffer = true;
Response.Clear();
Response.AddHeader("Pragma", "no-cache");
Response.Expires = 0;
Response.AddHeader("Content-Type", contentType);
Response.AddHeader("Content-Disposition", "attachment; filename=" + proposedFilename);
Response.WriteFile(dstFullPathName);
Response.Flush();
Response.Close();
File.Delete(dstFullPathName);
HttpContext.Current.ApplicationInstance.CompleteRequest();

asp.net 2.0 asp:FileUpload control saving uploaded files to a different server

I'm trying to use an asp:FileUpload control to allow users to upload files (.doc, .gif, .xls, .jpg) to a server that is outside of our DMZ and is not the web server. We want the ability to inspect these files for viruses, structure, etc. prior to saving them into another directory that would allow access by outside users. From what I have read about this control, it will allow files to be uploaded to the web server. Can this control be used to upload files to a server other than the web server? If it can be done, where should I look for this type of functionality, and how do I force it to go to https:\servername\folder name (where server name is not the web server)? Would I have to read the file and then write it to the other server?
Thanks,
Erin
The FileUpload control can only upload data to the web server. If you need to save the file to a different server, you need to handle the POST request, read the data from the FileUpload control and save it to your UNC share.
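A minimal sketch of that idea in the page's code-behind (in C#; \\fileserver\uploads is a hypothetical UNC share that the application pool identity, or an impersonated account, needs write access to):
protected void UploadButton_Click(object sender, EventArgs e)
{
    if (FileUploadControl.HasFile)
    {
        // The bytes still arrive at the web server via the POST,
        // but are written straight out to the UNC share
        string fileName = System.IO.Path.GetFileName(FileUploadControl.FileName);
        FileUploadControl.SaveAs(System.IO.Path.Combine(@"\\fileserver\uploads", fileName));
    }
}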
As far as I know, using the FileUpload control you can only upload content to the web server, which in turn gets rendered to your client (page) when requested; I don't think you can upload files to a server other than the web server, and that shouldn't happen anyway. Take a look at the URLs below for more on file upload if you want:
http://msdn.microsoft.com/en-us/library/aa479405.aspx
http://www.asp.net/data-access/tutorials/uploading-files-cs
Thanks.
This depends on your web server settings and the permissions granted to the application. If it is in a DMZ, I would assume that only very minimal permissions are granted to the application. In that scenario the application will not be able to access any resource other than the web server unless explicit permission is granted to the account running the application to access the network resource (which is not recommended). However, if the network server you are trying to save the file to has FTP enabled, you could write the bytes streamed from the FileUpload control to the network server with an authenticated FTP account that has the necessary permissions.
You may use the below function:
Imports System.Net
Imports System.IO

Public Function Upload(ByVal FileByte() As Byte, ByVal FileName As String, ByVal ftpUserID As String, ByVal ftpPassword As String, ByVal ftpURL As String) As Boolean
    Dim retValue As Boolean = False
    Try
        Dim ftpFullPath As String = ftpURL + "/" + FileName
        ' FtpWebRequest.Create returns a WebRequest, so cast it explicitly
        Dim ftp As FtpWebRequest = CType(FtpWebRequest.Create(New Uri(ftpFullPath)), FtpWebRequest)
        ftp.Credentials = New NetworkCredential(ftpUserID, ftpPassword)
        ftp.KeepAlive = True
        ftp.UseBinary = True
        ftp.Method = WebRequestMethods.Ftp.UploadFile
        ' Write the uploaded bytes straight into the FTP request stream
        Dim ftpStream As Stream = ftp.GetRequestStream()
        ftpStream.Write(FileByte, 0, FileByte.Length)
        ftpStream.Close()
        ftpStream.Dispose()
        retValue = True
    Catch ex As Exception
        ' Rethrow without resetting the stack trace
        Throw
    End Try
    Return retValue
End Function
Function Call:
Upload(FileUploadControl.FileBytes, "filename.ext", "user", "password", "ftppath")

asp.net secure images against static requests from other users?

I work on a site that generates dynamic images for each specific user. Sometimes these images contain depictions of very sensitive data. Lately we have started to see requests for images that belong to a different user, in the form
http://myapp/images/someuid/image1.jpg
Obviously, someone figured out they could access another user's images if they constructed the proper URL. We store the images on the file system to help reduce bandwidth.
How can we protect these - some sort of HTTP handler?
Is there a way of serving the image that takes advantage of caching, without having to write it to the file system and letting IIS do the dirty work?
Use an .ashx:
TimeSpan maxAge = new TimeSpan(0, 15, 0); // 15 minute lifetime
context.Response.ContentType = "image/gif";
context.Response.Cache.SetCacheability(HttpCacheability.Private);
context.Response.Cache.SetExpires(DateTime.UtcNow.Add(maxAge));
context.Response.Cache.SetMaxAge(maxAge);
context.Response.Cache.SetLastModified(lastModified); // last modified date/time of the file
context.Response.WriteFile(filenameofGif);
You can include whatever code checks you need to ensure the correct user is accessing the image.
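For example, the top of the handler's ProcessRequest might start with an ownership check along these lines (UserOwnsImage is a hypothetical lookup against your own data):
// Refuse to serve the image unless it belongs to the logged-in user
string imageId = context.Request.QueryString["id"];
if (!context.Request.IsAuthenticated ||
    !UserOwnsImage(context.User.Identity.Name, imageId))
{
    context.Response.StatusCode = 403;
    context.Response.End();
    return;
}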
I think the best option would be to deny direct access to the images from the web and create an aspx page that checks the user's permissions and returns the right image.
If the images are to be private to a particular user, then you should either store them outside the main application folder or put a web.config in each of those image folders (like someuid) and limit access in that configuration file, either shutting out everyone (deny="*") or allowing access just for the particular user (allow="john").
In both cases you can use a handler to stream the image to the user, but at least you can check permissions now. If the requesting user does not have permission, throw a 401 at him or even display another image like imagenotfound.gif.
However, I am afraid the handler will generate a lot of traffic, as there will be one call per image; I don't know how many images you're displaying per user.
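For the web.config approach, a minimal file dropped into one of those image folders might look like this (a sketch; adjust the allow/deny entries to your own users or roles, and note that it only applies to requests that actually pass through ASP.NET):
<configuration>
  <system.web>
    <authorization>
      <allow users="john" />
      <deny users="*" />
    </authorization>
  </system.web>
</configuration>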
