We have an ASP.NET website which allows users to upload files. Currently we do this with a FileStream: the user clicks a button and we create the folder under site root > downloads > folder-idofuser > filename. Is there a better way than this, i.e. all files in a specific location outside the application? Paths are not stored in the database; for access we just list the files in the relative folder on the file system. Maybe we should do it differently - any advice?
They can upload as many files as they want, and presently we get about 30,000 per month. Due to our folder structure we started experiencing slowness, so we archived folders over 6 months old to a new location on the disk. That was OK at the start, but now, with over 600 GB worth of files, it's a bit of a nightmare, and it still leaves hundreds of thousands of folders under one folder.
I have read lots and lots of articles and questions and I'm still not sure what's the best thing to do; see these two examples:
Looked at https://stackoverflow.com/questions/810215/best-practice-checks-when-allowing-users-to-upload-files-to-a-web-application
and Managing user-uploaded files with an ASP.NET website and Visual Studio
Now we are planning on moving from our current in-house hosting to a server farm. This seems very expensive when you have as many files as we do. Should we have two servers, one for applications/websites and one for files and database, or would one suffice? Obviously money is a big part of the equation, but I'm trying to get a good idea of what's best for the pound. And how should we store the files and access them? Should we use subdomains, and if so, how?
Hope the question's not too boring; thanks all in advance.
Isn't it better to save the files on the server with unique names (GUIDs) and save the path to the database?
This way you only need to query your database, which is really fast, instead of looping through all the files on your file system.
And when someone requests a file, you just hand it to them, which should be really fast too. This way you can also have a script that archives your files every month or day and sets a flag in the record, with the new path to the file, to show it has been archived. This can be done automatically every night.
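To make that concrete, here is a minimal Web Forms sketch of the idea. Everything in it is illustrative - the UserFiles table, connectionString and currentUserId are placeholders - and sharding by the GUID's first two characters is just one way to stop any single folder collecting hundreds of thousands of entries:

// using System.IO; using System.Data.SqlClient;
// Illustrative sketch: save the upload under a GUID name, outside the web
// root, sharded into at most 256 subfolders, and record the path in the DB.
protected void btnUpload_Click(object sender, EventArgs e)
{
    if (!fileUpload.HasFile) return;                  // fileUpload is a FileUpload control

    string storageRoot = @"D:\FileStore";             // outside the application folder
    Guid id = Guid.NewGuid();
    string shard = id.ToString("N").Substring(0, 2);  // e.g. "a3" -> max 256 buckets
    string folder = Path.Combine(storageRoot, shard);
    Directory.CreateDirectory(folder);                // no-op if it already exists

    string storedName = id.ToString("N") + Path.GetExtension(fileUpload.FileName);
    string fullPath = Path.Combine(folder, storedName);
    fileUpload.SaveAs(fullPath);

    // Listing a user's files is now a WHERE clause, not a directory scan.
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO UserFiles (UserId, OriginalName, StoredPath, UploadedOn) " +
        "VALUES (@uid, @name, @path, @on)", conn))
    {
        cmd.Parameters.AddWithValue("@uid", currentUserId);
        cmd.Parameters.AddWithValue("@name", fileUpload.FileName);
        cmd.Parameters.AddWithValue("@path", fullPath);
        cmd.Parameters.AddWithValue("@on", DateTime.UtcNow);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

The nightly archive job then becomes a file move plus an UPDATE of StoredPath (and an archived flag) - the web app never has to enumerate directories at all.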
I'm exporting an Excel file that's created dynamically at run time from DataTables, server-side in an aspx.cs file, using the ClosedXML library. I want to let the user select the download location on the client side; currently the file just goes to the Downloads folder.
You unfortunately cannot do this. This is also why you can never select a local file or location from the server side.
So, the user's local file system is 100% off limits.
And the reason is quite simple. If you come to my web site to look at some cute cat picture? Well, while you are doing that, do you think it would be OK if my web code also starts looking around at your files? Hmm, maybe a file called banking? Maybe a file called passwords? Hmm, how about I steal all your email files? How about I look for an Excel sheet called passwords?
So, when it comes to poking around, looking at, and deciding things like file locations? You cannot on the server side GET ANY information, nor can you even find and define what file to pick for uploading, and the SAME applies to downloading of files. If I could pick a location, then gee, why don't I start overwriting some of your system files - including some that would give me remote access to your computer, right?
So, things like what folder, what file, even the computer name? These things are 100% hidden, off limits, and simply not allowed. Now, it would be possible for someone to come out with a new web browser that allowed local file rights and access. But then again, no one in their right mind would ever use such a browser, as the security hole would be too large. As a result, for reasons of security, such information, and even simple knowledge of the local file system, is not allowed, nor even exposed to the web server.
But then again, the user might be on an iPad or an Android phone, and their file systems and how their folders work are not even the same as, say, a Windows desktop computer anyway.
However, you can see from the above that your ability, or even your options, to mess with or even choose local file locations is not allowed for reasons of security.
So, if your web site provides a file, or even streams down a file, it will go into the download folder as per the user's browser settings. You unfortunately can't change this - it works that way due to security concerns.
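For the ClosedXML case above, that means the most the server can do is stream the bytes with a Content-Disposition header and a suggested file name; the browser takes it from there. A rough aspx.cs sketch, assuming 'table' is the DataTable built at run time:

// using System.IO; using ClosedXML.Excel;
// Sketch: the server only suggests a file name; the browser picks where it lands.
using (var wb = new XLWorkbook())
using (var ms = new MemoryStream())
{
    wb.Worksheets.Add(table, "Report");   // 'table' is the run-time DataTable
    wb.SaveAs(ms);

    Response.Clear();
    Response.ContentType =
        "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
    // "attachment" suggests a name; the save location stays client-side.
    Response.AddHeader("Content-Disposition", "attachment; filename=report.xlsx");
    Response.BinaryWrite(ms.ToArray());
    Response.End();
}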
I've got an archive located at my web site with files in it that I'd like users of my app to be able to access. Files at this location may be changing so I'd like to be able download a file list that is then presented to the users, who would then click on the file(s) to download.
I think I've got the downloading handled with NSURLSession, but I can't find a direct way to get a list of the files located at http://www.example.com/archive/ in Swift. I feel like I'm missing something obvious. By the way, I'm not well versed in most aspects of "web programming" so small words would be appreciated if this involves stuff like POST and GET. Thanks.
I am creating a module for my website where I can display images in "albums", much like Facebook.
For storing/grouping images, I planned on having them in the ~/Images folder inside my application's structure. Is this considered bad practice, or will it open up my application to any security vulnerabilities? I read that you shouldn't place things like this in your site structure, but I don't quite understand why (or if this is the same scenario).
Therefore, albums would be grouped as...
~/Images/album1, ~/Images/album2, etc.
Is this an appropriate thing to put inside App_Data, or is there a more 'preferred' location for things such as this?
Sorry if this is a trivial question.
All three of the answers here are good. There is no preferred storage for uploaded images, it's all up to you based on your requirements.
As Henhealg says, don't store them in App_Data. If you put them here, they will not be accessible from the web. For example, the following would not render an image even if the path was correct:
<img src="/App_Data/album1/image1.png" alt="" />
One option is to have your local ~/Albums directory mapped to a different folder accessible to the web server, like sylon says. This keeps the images out of the directory where your MVC app is served from, but "pretends" that they are there. If you control IIS and can set up a file share, this may be an option for you.
Also, like XToro says, storing them in a SQL database is an option. Storing here is flexible because you don't have to worry about folder or file name collisions. Multiple users can each have albums and files with the same names, yet they won't collide because they don't occupy filesystem space the same way normal files do. If security is important to your app (not showing photos or albums to unauthorized users), having them in a SQL table makes this fairly easy.
However if you are not as worried about security or file naming collisions, you can just as easily store them in your MVC app's ~/Images or ~/Albums directory.
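As a rough illustration of the "SQL makes security easy" point, a gatekeeping MVC action could look like the sketch below. The Images table and its columns (Id, OwnerId, MimeType, Data) are made up for the example:

// using System.Data.SqlClient; using System.Web.Mvc;
// Sketch: only return the image if the signed-in user is allowed to see it.
[Authorize]
public ActionResult Image(int id)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT OwnerId, MimeType, Data FROM Images WHERE Id = @id", conn))
    {
        cmd.Parameters.AddWithValue("@id", id);
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            if (!reader.Read()) return HttpNotFound();
            if ((string)reader["OwnerId"] != User.Identity.Name)
                return new HttpUnauthorizedResult();   // not your album
            return File((byte[])reader["Data"], (string)reader["MimeType"]);
        }
    }
}

A view would then reference it with something like <img src="@Url.Action("Image", new { id = 42 })" />, and unauthorized users get a 401 instead of the bytes.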
Depending on the performance of your server, you may want to consider storing your images in a database as BLOBs:
https://dev.mysql.com/doc/refman/5.0/en/blob.html
Images can be easily sorted, organized, and categorized without the need to worry about folder structures and folder permissions. Simply use PHP/AJAX/the language of your choice to provide the authentication and choose which files you wish to display.
This way, each image can have its own fields (as many as you want), like the user who posted it, the original filename, a caption, the album it belongs in, etc.
Since a user can easily check where the images are stored once the application is in production, where you store the images does not matter as much as what permissions you set on the folder(s) the images are stored in.
I would use the file system as you are saying, but store the files outside of the application folder, since, as you say, keeping them inside it is bad practice. I agree with this: when I do deployments I prefer to delete everything, drop in the new code, and keep the web.config file. That way I always have a clean environment, and it is much easier to start from scratch without having to worry about what I need to back up or bring over from the previous install.
I would use IIS to map the directory into my solution wherever I desire, from network share storage or wherever you want to safely keep your albums.
E.g. map D:\MySafeStorage\Albums\ to your website's ~\Albums\ when your website is in C:\inetpub\MyWebSite\.
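In IIS Manager that is just "Add Virtual Directory" on the site; if you would rather script it, a minimal sketch with Microsoft.Web.Administration looks like this (must run elevated; the site name and paths are examples):

// using Microsoft.Web.Administration;  (reference Microsoft.Web.Administration.dll)
using (var mgr = new ServerManager())
{
    var app = mgr.Sites["MyWebSite"].Applications["/"];
    app.VirtualDirectories.Add("/Albums", @"D:\MySafeStorage\Albums");
    mgr.CommitChanges();   // ~/Albums now serves files from the external folder
}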
I want to back up my existing ASP.NET web app before updating it.
Therefore I create a backup folder inside the website (i.e. at the same level as App_Code and web.config), called something like Backup_20110910.
Then I move all the current website files/folders (excluding web.config, app_data) into the backup folder.
Then I extract the zip of the latest code into the now-clean folder.
Are there any potential problems with this approach? After all, you are increasing the number of C# files in your website folder - could there be conflicts, etc.?
I wouldn't back up within the folder structure; there's a possibility that someone then finds your backup folders and browses to them, running the older code. If you zip it, then you suddenly have files someone can download, too. Even more amusingly: a lot of people, when they change web.config, rename the old one to web.config.bak. A lot of security scanners look for that, because now it can be downloaded - it's no longer a .config file, but a .bak.
Back up outside the web root, not within it, and all of those worries will go away.
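For example, a minimal sketch that snapshots the site to a folder outside the web root before a deploy (the paths are examples, and any scripting tool would do the same job):

// using System; using System.IO; using System.IO.Compression;
// Requires .NET 4.5+ and a reference to System.IO.Compression.FileSystem.
string siteRoot  = @"C:\inetpub\MyWebSite";
string backupDir = @"D:\Backups";          // outside the web root, not browsable
Directory.CreateDirectory(backupDir);
string zipPath = Path.Combine(backupDir,
    "MyWebSite_" + DateTime.Now.ToString("yyyyMMdd_HHmmss") + ".zip");
ZipFile.CreateFromDirectory(siteRoot, zipPath);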
There won't be an issue - except that it might become confusing to have identical folder structures within the current folder structure. It's always wisest to keep backups completely separate from the current build.
I've been messing around a bit with various solutions to what I would see as a fairly common problem, but I've not yet been able to solve it in a satisfactory way.
What I wish to achieve is some kind of functionality where a user can upload new files, or select existing files to reuse them.
What I've been using so far is a combination of the filefield, filefield_sources, imce and ckeditor modules. I guess ckeditor isn't really important to the solution, but I need to be able to embed images from the archive somehow, and this is done with IMCE. Since I do not want everything to be accessible from the file browser, I created a subdirectory and set full access to it in the IMCE settings; let's call it default/files/site.
This worked fine as long as all file handling was done through IMCE, but when I uploaded files directly from the filefield, my files ended up in the default/files root. So I set up folders for my fields, for example default/files/site/movies for a field that allowed the .flv format. This worked fine too, as long as I didn't try to access the files through IMCE. It appears the folders created by filefield are not accessible from IMCE?
I'm also in a position where I need to support large uploads (200 MB+), and from my experience in other projects, allowing file uploads through FTP is usually a life-saver. But from what I understand, IMCE won't support files not uploaded through Drupal in some way, since they are not present in the database (giving the message: The selected file could not be used because the file does not exist in the database.)
I'm aware that I don't really have a clear question for my problem, but somehow I need to figure this out pretty fast. How would I preferably solve this? I'm aware that I'm not the first to have this problem, but I have not yet been able to find a nice and stable solution. What am I missing?
Also check this thread (http://drupal.org/node/438940) and the reference to John Locke's work at: http://www.freelock.com/blog/john-locke/2010-02/using-file-field-imported-files-drupal-drush-rescue
Well, I'm not personally familiar with IMCE off the top of my head, but if you need files that have been uploaded via FTP to be added to the files table, then my impulse would be to write a small module that would allow the user to click a button and start off a batch process. (This is me assuming that you are using Drupal 6, as the batch API doesn't exist in 5.)
Said batch process would then iterate over all of the files in the appropriate directory, which I would assume you had uploaded the files to, use file_copy() (from Drupal's file API) to copy the files to default/files/site, and then add said files to the files table, which is actually quite simple with drupal_write_record().
It might not even need to use the batch API - it somewhat depends on whether you're just uploading 10-30 files, or whether they're really big, 200-300 MB files.
For using the batch API, I'd look at http://drupal.org/node/180528 - it has a fairly basic example of how the batch API works, which basically consists of telling the API that you want to keep calling function_a, and then, inside function_a, setting your progress in the context array until you're done, at which point the batch process finishes. Then you just have whoever uploads the files via FTP hit a button on the website to move and register the files.
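As a very rough Drupal 6 sketch of that move-and-register step (in PHP, since that's what a module would be; the module name, incoming directory and uid are made up, so check file_copy() and the files-table columns against your install):

<?php
// Rough sketch: copy FTP-uploaded files into files/site and register them in
// the {files} table so IMCE can see them. Names here are illustrative.
function mymodule_import_form_submit($form, &$form_state) {
  // Submitting the form kicks off the batch; Form API processes it for us.
  batch_set(array(
    'title' => t('Registering FTP uploads'),
    'operations' => array(array('mymodule_import_files', array())),
  ));
}

function mymodule_import_files(&$context) {
  $incoming = file_directory_path() . '/ftp-incoming';  // where FTP drops files
  $dest = file_directory_path() . '/site';
  foreach (file_scan_directory($incoming, '.*') as $found) {
    $source = $found->filename;              // full path of the FTP upload
    if (file_copy($source, $dest)) {         // on success $source holds the new path
      $file = new stdClass();
      $file->uid = 1;
      $file->filename = basename($source);
      $file->filepath = $source;
      $file->filemime = file_get_mimetype($source);
      $file->filesize = filesize($source);
      $file->status = FILE_STATUS_PERMANENT;
      $file->timestamp = time();
      drupal_write_record('files', $file);   // inserts the row, sets $file->fid
    }
  }
  $context['finished'] = 1;                  // single pass; no per-file progress
}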