Disk IO Performance Limitations based on number of folders/files - Unix

I have an application where users are allowed to upload images to the server. Our web server is a Windows 2008 server, and we have a site (images.mysite.com) that points to a shared drive on a Unix box.
The code used to do the uploading is C# 3.5.
The system currently supports a workflow where, after a threshold is met, a new subfolder can be generated. The question we have is: how many files and/or subfolders can you have in a single folder before performance degrades - both in serving the images through IIS 7 and in reading/writing them through code?

We had a site which hit 350,000 image files in a single directory. The site operates just fine serving those images. The problem comes in when you try to view that directory in Explorer. Explorer is interested in sharing with you more than the file name; it wants to show you an icon, and other properties such as image size which it has to obtain by reading the file itself.
The subfolders are more a convenience for the administrative individuals that have to manage those resources.
You may want to set a file count threshold parameter starting at about 10K per folder and tune it up based on how well that folder can be navigated in Explorer.
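If you adopt that approach, the threshold can be enforced at upload time by rolling over to a new subfolder once the current one fills up. A minimal C# sketch (the numbered-folder naming and the 10,000 limit are illustrative, not something the original system prescribes):

using System.IO;
using System.Linq;

public static class UploadFolders
{
    // Tunable threshold; start around 10K per folder and adjust based on how
    // well the folders can be navigated in Explorer.
    private const int MaxFilesPerFolder = 10000;

    // Returns the folder a new upload should go into, creating a new numbered
    // subfolder (0001, 0002, ...) once the current one reaches the threshold.
    public static string GetUploadFolder(string rootPath)
    {
        Directory.CreateDirectory(rootPath);

        string current = Directory.GetDirectories(rootPath)
                                  .OrderBy(d => d)
                                  .LastOrDefault();

        // Note: GetFiles enumerates the whole folder on every call; cache the
        // count if uploads are frequent.
        if (current == null || Directory.GetFiles(current).Length >= MaxFilesPerFolder)
        {
            int next = Directory.GetDirectories(rootPath).Length + 1;
            current = Path.Combine(rootPath, next.ToString("D4"));
            Directory.CreateDirectory(current);
        }

        return current;
    }
}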

Related

Where to put an embedded database in an AspNet Core application

I have lately re-discovered embedded databases such as Sqlite (sql, relational) and LiteDb (noSql) and I like working with them for small web apps or mobile apps.
However, I cannot find any good answer to where to place them. Where to put them if:
- The web app is likely to be containerized
- The database can grow dynamically
- Changes to the code and new deployments should not risk losing any changes in the database
1. Database file as part of solution (versioned in source control)
I've seen places where the *.db file is placed somewhere in the solution and it's versioned in source control.
I can see how this could be a problem, as the database can be modified outside the context of development (i.e. when the app is up and running in production, the DB may change, and in the next deployment the DB may be overwritten if no backup/restore process is in place).
Sometimes I have seen it inside wwwroot/App_Data. See this for instance. I assume App_Data is some kind of protected folder whose files cannot be served statically by the web server (is it?). Otherwise this is even worse.
2. Database file in binary folder
When testing, it's fine to have the database file generated somewhere in the bin folder, but this causes a similar problem to the previous one: what happens when a new software version is released and the database file in production is overwritten?
So the questions are:
Is there any good practice regarding where to place embedded database files?
Is there any alternative to having backup/restore processes to avoid the described data-loss scenarios?
What happens when the app is containerized and the database file grows once deployed? If the file is inside the container along with the running application, can it grow indefinitely? I don't recall specifying a maximum size for containers anywhere when creating images.
Is having the DB in external storage, such as a cloud blob store, the alternative? I'm guessing the real benefit of embedded databases is gone if the file lives on a different host.
Any good read about this would be appreciated.
PS:
I am asking about AspNet Core apps mainly because I see some projects using the wwwroot folder to place the embedded DB, but the question applies to any technology/framework.
This other question doesn't help either.
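One arrangement that addresses the containerized case is to keep the database file out of the application directory entirely and resolve its location from configuration, so the path can be mapped to a Docker volume and survive redeployments. A minimal ASP.NET Core sketch using EF Core with SQLite, assuming .NET 6+ minimal hosting with implicit usings (the "EmbeddedDb:Path" key, the App_Data fallback, and the AppDbContext type are illustrative assumptions):

using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);

// Resolve the SQLite file from configuration so the same image can run anywhere;
// in a container the configured path would typically point at a mounted volume
// (e.g. docker run -v appdata:/data ... with EmbeddedDb__Path=/data/app.db).
var dbPath = builder.Configuration["EmbeddedDb:Path"]
             ?? Path.Combine(builder.Environment.ContentRootPath, "App_Data", "app.db");

Directory.CreateDirectory(Path.GetDirectoryName(dbPath)!);

builder.Services.AddDbContext<AppDbContext>(o => o.UseSqlite("Data Source=" + dbPath));

var app = builder.Build();
app.Run();

// Minimal placeholder context for the sketch.
public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }
}

Because the file lives on the volume rather than inside the container's writable layer, a new deployment replaces the application but not the data, and the file is never served statically since it sits outside wwwroot.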

Writing a PDF outside ASP.NET MVC website folder

I am trying to write a PDF file outside the ASP.NET MVC website folder and I get the following error:
Cannot use a leading .. to exit above the top directory
We have another static dataset website that must not be affected by website modifications or accidentally deleted. I suppose the problem is the following line of code:
string path = Server.MapPath("../../Data/Invoices");
How can I work around this limitation? I thought about disabling the Web Deploy option "Remove additional files at destination", but that's too risky - we need to keep invoice copies for years, and I'm worried about new dev machine installations, new programmers, etc. We are working on Windows Server 2008 R2 and IIS 7.
Any other creative idea is welcome.
Thanks.
Even though the idea of keeping data on the application server makes me uncomfortable, you can use a virtual directory that points to a location like C:\Alberto\Data\Invoices and map it from your application to http://{server}/albertos/invoices (invoices would be your virtual directory and albertos would be the application name). Then I think you can call Server.MapPath("~/invoices") on it as you wish.
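With the virtual directory set up that way, the write itself would look roughly like this (a sketch; the action, file name, and pdfBytes parameter are placeholders for however the PDF is actually produced):

using System.IO;
using System.Web.Mvc;

public class InvoicesController : Controller
{
    public ActionResult Save(byte[] pdfBytes)
    {
        // "~/invoices" resolves through the virtual directory to its physical
        // target (e.g. C:\Alberto\Data\Invoices), so no leading ".." is needed.
        string folder = Server.MapPath("~/invoices");
        string path = Path.Combine(folder, "invoice-12345.pdf"); // illustrative name

        System.IO.File.WriteAllBytes(path, pdfBytes);
        return new EmptyResult();
    }
}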

Do not upload specific files when publishing website with VS 2008

I use the "publish website" option to directly publish my ASP.NET - website to my ftp-server. This works quite nice.
The problem is the biggest part of my project are DLL-Files in the bin-directory which are external libraries that I only update quite rare.
So I do not want then to be uploaded every time. With my local resources I can select whether these files should be uploaded every time, never or only when changed, but I do not find this options for files in the bin-directory.
Any way to solve this?

Advanced image editing off the web

I'm building an app in ASP.NET that will store some pictures of objects. The pictures will be uploaded by suppliers and downloaded by subscribers. In between, they will have to be edited before becoming available to subscribers.
The editing involves creating a cropping path tightly around the object in the picture, for which I suppose some advanced desktop image software will have to be used.
My problem is in exchanging pictures between my ASP.NET app and the desktop software in a manner that is easy and transparent for the user.
I've done some thinking and I've come up with:
- Manually downloading and uploading the image (Not much user friendly...)
- An image editing program that can upload to a web service (Haven't found yet...)
- Develop a plug-in for an image editing program (Too advanced...)
I'd appreciate any suggestions you may have, thank you!
It sounds like you need some automation to move files between the web server and a file share. I am assuming that the number of images that need to be processed is pretty large, because if it's not, then the overhead of downloading/re-uploading each would not be that much.
So do the following:
1) Create an API for your web app that lists files that are available, or new files since some date/time, or files that have been marked as "new". The API should probably also let you mark a status on them (so you can tell it when you've finished pulling something down and it won't be offered again) if you don't want to trust the date/time as an indicator of a file being new.
2) Write an app (non-web) that runs on a schedule and uses this API to automatically download files to a shared filesystem area in your local network, and marks them as "downloaded"
The app should also monitor these files (the ones it downloaded & saved to your local share) for changes, and if changed, upload them back to your web app. To do this you may need to keep a database of filenames and modification dates/times.
This shouldn't be too hard to write in whatever language you are using for your web app (presumably C# or VB). By "API" I just mean a web page that provides a list in a standardized format (e.g. JSON) that you can parse with your automation application, and another page that allows posting the file back for re-upload.
I'm assuming that the web server is not your own, or generally, you can't simply have it save the file uploads directly to some area where your image editors can access them. Otherwise you could just do that.
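As a rough illustration of the listing endpoint from step 1 above (the route, upload folder, and "since" filtering are assumptions; marking files as downloaded and accepting re-uploads would be similar actions):

using System;
using System.IO;
using System.Linq;
using System.Web.Mvc;

public class ImageFeedController : Controller
{
    // Physical folder where supplier uploads land (illustrative path).
    private static readonly string UploadRoot = @"D:\Uploads\Images";

    // GET /ImageFeed/NewFiles?since=2024-05-01T00:00:00Z
    // Lists files modified after 'since' so the desktop-side agent can pull them.
    public ActionResult NewFiles(DateTime since)
    {
        var files = Directory.GetFiles(UploadRoot)
            .Select(f => new FileInfo(f))
            .Where(f => f.LastWriteTimeUtc > since)
            .Select(f => new { name = f.Name, modifiedUtc = f.LastWriteTimeUtc })
            .ToArray();

        return Json(files, JsonRequestBehavior.AllowGet);
    }
}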
Meanwhile I came up with another possible solution.
I'm thinking of having our own Windows app on the editors' computers. This app would be associated with a custom file extension. When an editor downloads a file (with this extension) for editing, it will be opened in our application, which in turn will open the image in some editor program.
This app will be monitoring the files for changes, and when a change happens it will upload the images.
Any thoughts on this?
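For the monitoring part of that idea, FileSystemWatcher is the usual building block; a minimal sketch (the watched folder and upload URL are placeholders):

using System;
using System.IO;
using System.Net;

class EditWatcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\EditedImages") // placeholder folder
        {
            EnableRaisingEvents = true
        };

        // When the editor saves the image, push it back to the web app.
        // In practice Changed fires several times per save and the editor may
        // still hold a lock on the file, so a short delay/retry is usually needed.
        watcher.Changed += (sender, e) =>
        {
            using (var client = new WebClient())
            {
                client.UploadFile("http://example.com/images/upload", e.FullPath); // placeholder endpoint
            }
        };

        Console.WriteLine("Watching for edited images. Press Enter to exit.");
        Console.ReadLine();
    }
}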

Cannot execute System.IO.File.Move to a network drive from SharePoint Server 2007 (MOSS)

I am using VS2008 C# and MOSS (SharePoint Server 2007).
I have created an ASP.NET web form which appears in a Web Part within a SharePoint site. When the form is submitted, a small .csv file is generated. Ideally I want this file created on a network drive (on another server), but for some reason I cannot do this. I can happily create the file on the hard drive of the SharePoint server I am working on, but it just never appears on any network drive that I choose.
I then thought I'd create the file on my c: drive first (as it works), then use asp.net to 'Move' the file to the network directory. I used:
System.IO.File.Move(sourceFile, destinationFile);
This also failed, at the 'moving' stage - the file itself gets created fine. On the MOSS server I am working on there is a C: drive and a D: drive (partitioned). File creation works fine on both drives, but not on any network drive, even if I avoid using drive mappings, as below:
(e.g. "G:\\Group Files\\" or "\\\\Global\\Group Files\\" )
Obviously, I thought security was an issue, so I ensured the MOSS server and the network server both granted each other 'Full Access' using Active Directory. I even granted MODIFY access to myself as a user, to admin groups, to the ASPNET account, and to the NETWORK SERVICE account (among others). Still no joy. I can PING the network server that I want to create the .csv file on, so it is 'seeing' it.
The work-around that I have done is create the file on the SharePoint server's c: drive, then run a batch file (on schedule) that purely copies the file to the destination G: drive - this works a treat, but I am frustrated that I cannot create the file on the destination server straight away, using code.
I've got a hunch it is SharePoint related, but if anyone can shed light on this matter I'd be extremely grateful!!
Thanks in advance, Ash ;-)
Ah, I had this problem today and found this thread whilst looking for an answer. My problem was that my temp directory didn't have the right permissions! Sure, I could get the file uploaded without issue, but I couldn't move it unless I was using a LAN account. I made sure NETWORK SERVICE and my SharePoint application pool had access, and made sure the user who was uploading the file had permission too. Then it worked.
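One more SharePoint-specific thing worth checking: code in a web part usually runs under the impersonated site visitor, whose credentials generally cannot be passed on to a second server, which fits the symptom of local writes succeeding while writes to a share fail. A sketch of performing the move under the application pool account instead, via SPSecurity.RunWithElevatedPrivileges (this only helps if that account has modify rights on the share, and the target should be the UNC path rather than a mapped drive letter; the file name is illustrative):

using Microsoft.SharePoint;

public static class CsvExport
{
    // Runs the move under the application pool identity instead of the
    // impersonated site visitor.
    public static void MoveToShare(string sourceFile)
    {
        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            System.IO.File.Move(sourceFile, @"\\Global\Group Files\export.csv");
        });
    }
}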
