What is the best architecture for a file management website? - asynchronous

Here is what I think my website should be able to provide to users:
Ability to upload files to the system. Uploads should not block: the user should be able to browse other pages of the website while an upload is in progress, and be notified once the upload completes.
Users should be able to view their uploaded files on the website.
Ability to edit files in the browser using third-party APIs.
The number of users will be around 5,000, and all of them might upload files at the same time, so performance should not degrade.
Where should I store these files? How can I make sure that reads and writes in this directory handle concurrent user requests?
Considering the above points, what is the best way to architect this website?
Are there any existing web frameworks that play along with this type of architecture, like Rails or Express?

If you want to have the ability to browse the site while a file is uploading, you'll want to use something on the front end that overrides anchor tags and asynchronously fetches the next page - there may be a library for this, but it should be easy to implement yourself with jQuery.
To make this easier (and for many other reasons), you'll almost definitely want to structure your site with an MVC (Model View Controller) architecture. Rails is structured this way, as is almost any web framework. It doesn't sound like what you're describing is better suited to Rails over PHP or Python etc., so just use whatever language or framework you (or your developers) feel most comfortable with. You might want to research the available plugins for editing files (it really depends on what type of files you want to edit and how) and let those influence your choice of language as well.
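As a minimal illustration of that split, here is a sketch in C#/ASP.NET MVC (just one of many frameworks that fit; the model and controller names are made up) of the Model and Controller halves for listing a user's uploads:

```csharp
using System.Collections.Generic;
using System.Web.Mvc;

// Model: one row per uploaded file, linking the file to its owner.
public class UploadedFile
{
    public int Id { get; set; }
    public string UserName { get; set; }
    public string StoragePath { get; set; }
}

// Controller: handles "list my files" requests and hands the data to a view.
public class FilesController : Controller
{
    public ActionResult Index()
    {
        // Hypothetical data access; a real app would query its database here.
        var files = new List<UploadedFile>();
        return View(files); // View: Views/Files/Index renders the list
    }
}
```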
With regards to storing files on your server, any logical system should suffice. Perhaps:
/username/year/month/day/myFile.txt
You'll want to do something to ensure filenames don't clash as well. And obviously you'd want a database storing the information linking files to users.
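A small sketch (C#; the root folder and naming scheme are illustrative assumptions) of the clash-proof variant of that layout, prefixing each stored file with a GUID and returning the relative path to record in the database:

```csharp
using System;
using System.IO;

static class FileStore
{
    // Root folder for uploads; in practice this would come from configuration.
    const string Root = @"C:\uploads";

    // Builds username/year/month/day/<guid>_originalName so two uploads of
    // "myFile.txt" can never clash; the returned relative path is what the
    // database row linking the file to the user would store.
    public static string BuildStoragePath(string username, string originalName)
    {
        var now = DateTime.UtcNow;
        var safeName = Guid.NewGuid().ToString("N") + "_" + Path.GetFileName(originalName);
        var relative = Path.Combine(username, now.Year.ToString(),
                                    now.Month.ToString("00"), now.Day.ToString("00"), safeName);
        Directory.CreateDirectory(Path.Combine(Root, Path.GetDirectoryName(relative)));
        return relative;
    }
}
```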

Publish MVC Website as a Single DLL File / Encrypted

Here is my problem:
I developed an MVC site and currently use the standard publish method, which places the files and folders on the server: all DLL files go under the bin folder, and likewise the Content and Views go to the Content and Views folders.
The problem is that this website is an admin panel for a commercial hardware device (embedded Windows OS), so exposing the views and content as plain text files is not an option, since it opens a vulnerability to hijacking and code stealing. Even though the device will be packed in a sealed box, anyone who buys it can break the case open, and once they know the device runs in a Windows environment, any kind of security breach may happen, including stealing the view code to copy or change it for any purpose.
So I need to secure the MVC files. I imagine MVC could be published as secured files, e.g. putting all the content and views inside DLLs.
By default there is an assumption that whoever has access to your views and DLLs is trusted. If they have your files, they can do whatever they want with them.
By the nature of HTML, there is no point in trying to conceal content files such as JavaScript and CSS. These files are served to the client regardless, so they are always retrievable.
If you want to put your views into DLLs, you can look into RazorGenerator.
A Custom Tool for Visual Studio that allows processing Razor files at design time instead of runtime, allowing them to be built into an assembly for simpler reuse and distribution.
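With the companion RazorGenerator.Mvc package, wiring the precompiled views in looks roughly like this (a sketch from memory; the class name is made up, so verify against the package version you install):

```csharp
using System.Web.Mvc;
using RazorGenerator.Mvc;

public static class PrecompiledViewsConfig
{
    // Call once at startup (e.g. from Application_Start): MVC then resolves
    // views from the compiled assembly instead of .cshtml files on disk.
    public static void Register()
    {
        var engine = new PrecompiledMvcEngine(typeof(PrecompiledViewsConfig).Assembly)
        {
            // During development, fall back to a physical view if it is newer.
            UsePhysicalViewsIfNewer = true
        };
        ViewEngines.Engines.Insert(0, engine);
    }
}
```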
Please note that what you're doing is known as security through obscurity.
[ ... ] security through obscurity is the use of secrecy of the design or implementation to provide security
Security through obscurity is discouraged and not recommended by standards bodies.
MVC views should never contain business logic, only formatting logic, and that's it. Moreover, since C# code is compiled into Intermediate Language (IL), anyone can reverse the process and get the source code back.
In that case, you need an obfuscator to mangle the IL and make it harder to read, but even that is not 100% guaranteed to prevent hackers from reverse engineering your IL (DLLs and EXEs).
The best thing to do is to establish a comprehensive way of testing the admin panel and to facilitate a robust update process, so that if anything goes wrong, you can push your updates as quickly as possible.

Is there a solution for a BitTorrent Uploader?

I have a requirement from my client to be able to upload extremely large files.
I'm talking about 7 GB files. The website they are currently running is an ASP.NET 4.0 app, so obviously the standard upload scheme for my web app is not going to work.
I'm tossing around multiple options, trying to figure out the best route to take.
One option I'm considering is a BitTorrent uploader. The end users for this app will typically have the same file on hand, so the idea is that an end user would go to the site and say they wanted to upload a file. At that point, they would pick the file, and the server would immediately mark that person as a seed for that file. Then my web app would go to a preconfigured leech on our side and instruct the leech to download the file. I would expect that at some point during or after this process the torrent would do some magic to find other seeders on the client's network, or wherever, but that's the idea.
Is there any technology out there already that does this? Or am I describing something that I'm going to have to build from the ground up?
It doesn't sound like it's going to be easy to do this with BitTorrent. In order for BT to work, you need torrent files. In order to create a torrent file for a particular file, you need that file (the torrent file basically contains a hash of the file). In general for a torrent, you need a tracker. You could rely on a public one, but that could be a risky dependency. You could operate your own, but that has other challenges (for one, you'd have to make sure it's locked down so it doesn't become a free-for-all for all the latest movies, music & TV).
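To make the "hash of the file" point concrete: BitTorrent v1 metadata stores the concatenated SHA-1 hashes of fixed-size pieces of the file. A minimal sketch in plain C# (no BitTorrent library; 256 KiB is just a typical piece size):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

class PieceHasher
{
    // Returns the "pieces" blob a .torrent would carry: 20 bytes of SHA-1
    // per fixed-size piece, concatenated in order.
    static byte[] HashPieces(string path, int pieceLength = 256 * 1024)
    {
        using (var sha1 = SHA1.Create())
        using (var input = File.OpenRead(path))
        using (var output = new MemoryStream())
        {
            var buffer = new byte[pieceLength];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                var hash = sha1.ComputeHash(buffer, 0, read);
                output.Write(hash, 0, hash.Length);
            }
            return output.ToArray();
        }
    }

    static void Main(string[] args)
    {
        Console.WriteLine("pieces blob: {0} bytes", HashPieces(args[0]).Length);
    }
}
```

This is also why you need the complete file before a torrent for it can even exist.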
Assuming you have a tracker in place, you then need to coordinate the downloading of torrents. Your users are going to have to create the torrent files, which is an extra, complicated step, then presumably upload them via the usual HTTP methods. As well as getting the user to upload the torrent, you'd have to remind them to start seeding it in their client of choice. You'd then want to automatically begin leeching the torrent (again, there's a security issue here - what if a user uploads a completely unrelated torrent for the latest episode of House?). Apart from the security problem, this is probably the easiest part - most torrent clients can be configured to watch a directory and automatically start downloading torrent files placed in it. Once you've started downloading, you have to make sure that the user continues seeding until you've completed the download, otherwise you'll be stuck with a useless partial file.
It could all work, but without a fair bit of customisation work it's going to be a convoluted process at best for your users, and quite possibly beyond them. Obviously I don't know your specific requirements, but I'd be looking at more traditional file transfer protocols, like FTP.

Using SQL for localization instead of RESX files in ASP.NET

I'm thinking of developing the following but wondering if it already exists out there:
I need a SQL based solution for assigning and managing localization text values for an asp.net site instead of using RESX files. This helps maintain text on the site without having to take it down for deployment whenever an update is required.
Thanks.
We actually went down that path and ended up with a really, really slow web site - ripping out the SQL-based translation mechanism and using the ASP.NET resources gave us a significant performance boost. So I can't really recommend you do the same thing (and yes, we were caching and optimizing for throughput and everything, and the SQL-based stuff was still significantly slower).
You get what you pay for - the SQL-based approach was more flexible in terms of being able to "translate" on the fly and fix typos and such. But in the end, in our app (WebForms, .NET 2.0 at the time), using resources proved to be the only viable way to go.
We did this (SQL-based translation) and we are really happy with the result! We developed an interface for translation agencies to perform updates to the pages online. As a side effect, the solution started to serve as a content management system. If you cache your data, performance is not an issue. The downside is that we invested multiple hundreds of hours into our solution (I would guess around 600 hours, but I could check).
We ended up with a hybrid solution where users could edit content in a database, but the application then created a .resx which was deployed manually.
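That hybrid step maps directly onto System.Resources; a sketch (class and method names are made up; ResXResourceWriter lives in the System.Windows.Forms assembly) of emitting a .resx from database rows:

```csharp
using System.Collections.Generic;
using System.Resources;

static class ResxExporter
{
    // Writes key/value pairs (e.g. rows users edited in the database)
    // out as a standard .resx file that can then be deployed as usual.
    public static void Export(string resxPath, IEnumerable<KeyValuePair<string, string>> entries)
    {
        using (var writer = new ResXResourceWriter(resxPath))
        {
            foreach (var entry in entries)
                writer.AddResource(entry.Key, entry.Value);
            writer.Generate();
        }
    }
}
```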
You could also bypass server-side translation altogether and do the translation in jQuery on the client, which is an approach I have used successfully.
I'm not sure about the website restart; at least with .NET MVC I find it very convenient and I haven't noticed that restart problem - and if it occurs, how often do you really need to update the .resx files? For bigger projects I usually create a solution with multiple projects, one of them for localization, something like this:
MyApp.Localization
    Model
    Page
    File1.resx
MyApp.Core
MyApp.Web
Then in the Web project I add a reference to the Localization project and use it like:
#MyApp.Localization.Model.Customer.CustomerName
#MyApp.Localization.Page.About.PageTitle
#MyApp.Localization.File1.Paragraph1
Every time I change the translated text, I either upload an updated .dll or copy the .resx files.
NOTE: You need to set your .resx files' access modifier to Public so they can be accessed as strongly typed resources.
I created a SQL-based translation scheme, but I only load the translations needed for a given page when it is requested, and just the ones for that particular page.
Those get loaded into a dictionary object when the page loads and are cached for the session; text replacement is then just a lookup against that dictionary.
Pretty much all of it is dynamically generated and includes user-defined content that must be translated, so flexibility is key.
Performance is quite fast; the SQL queries to retrieve all the data take much longer (relatively speaking).
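A minimal sketch of that load-once-then-look-up pattern (C#; the cache key and the SQL loader are hypothetical), which is also the caching the earlier answers say keeps performance acceptable:

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;

public static class PageTranslations
{
    // (culture, page) -> (key -> text); loaded from SQL on the first request
    // for that page, then served from memory for every later lookup.
    static readonly ConcurrentDictionary<string, Dictionary<string, string>> Cache =
        new ConcurrentDictionary<string, Dictionary<string, string>>();

    public static string Translate(string culture, string page, string key)
    {
        var table = Cache.GetOrAdd(culture + "|" + page, _ => LoadFromSql(culture, page));
        string text;
        return table.TryGetValue(key, out text) ? text : key; // fall back to the key itself
    }

    // Hypothetical query: SELECT [Key], [Text] FROM Translations
    //                     WHERE Culture = @culture AND Page = @page
    static Dictionary<string, string> LoadFromSql(string culture, string page)
    {
        var table = new Dictionary<string, string>();
        // ... fill from the database here ...
        return table;
    }
}
```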

ASP.NET MVC WAP, SharePoint Designer and SVN

All,
I'm starting a new ASP.NET MVC project which requires some content management capabilities.
The people who will be managing the content prefer to use SharePoint Designer (successor to FrontPage) to modify content. I'd like to allow them to keep doing that.
The issues are:
Since I'd like this to be a WAP, not a website project, how can I allow them to see their changes in action without requiring them to have Visual Studio on their local machines? Can I specify a "default" action for a controller so that given a URL like
/products/new_view_here
Can I let them save pages (views) and see them in the browser without having to go through the check-in/build/deploy process?
I'd like their changes to be stored in SVN; SharePoint Designer seems to only support Visual SourceSafe (ugh) directly.
The ideas I've come up with so far are:
Write an HTTP handler that implements the FrontPage Server Extensions protocol. This sounds time-consuming, but I haven't yet looked at the protocol spec. However, it would allow me to perform whatever operations I want on the server side, including checking files into SVN.
Ditch the WAP in favor of a website project. I do not like having the source present on the server, however. Also, will MVC work in a website project?
Surely someone has tackled this problem before?
This seems pretty complex. If they are going to be making static HTML pages, then another option besides FrontPage Extensions is to use FTP; as I recall, FrontPage worked nicely over FTP. That would smooth over the editing portion of the problem.
I don't know what the exact technology would be, but there are services that will monitor a file system for changes; you could have one automatically commit to SVN.
In this case I would have it commit to a branch, maybe one per designer, and then when they have completed some portion, you or some other team member merges their changes to the trunk, so that there's meaningful history beyond a series of mechanical commits that would be worthless to read.
Use FTP instead of FrontPage Extensions.
Use a file system monitor to mechanically commit saves to an SVN branch (see the sketch below).
When milestones are reached, manually merge those changes to the trunk.
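The monitor-and-commit step could look roughly like this (a sketch assuming a command-line SVN client and an existing working copy; all paths and the message format are made up):

```csharp
using System;
using System.Diagnostics;
using System.IO;

class SvnAutoCommitter
{
    static void Main()
    {
        // Watch a designer's SVN working copy for saves.
        var watcher = new FileSystemWatcher(@"C:\working-copies\designer1")
        {
            IncludeSubdirectories = true,
            EnableRaisingEvents = true
        };
        watcher.Changed += (sender, e) =>
        {
            // Caveats: editors often raise several Changed events per save,
            // and brand-new files would need "svn add" before committing.
            Process.Start("svn", string.Format(
                "commit \"{0}\" -m \"mechanical commit: {1}\"", e.FullPath, e.Name))
                .WaitForExit();
        };
        Console.WriteLine("Watching... press Enter to stop.");
        Console.ReadLine();
    }
}
```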
Also, if not FTP, then WebDAV may be a good option too. You may also need to extend the MVC framework to compile the template on each page view, just for development purposes.
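On the question's "default action" idea, ASP.NET MVC can route a URL segment straight to a view name, so a freshly saved view shows up in the browser without a build/deploy step. A rough sketch (the controller, route name, and view lookup are illustrative):

```csharp
using System.Web.Mvc;
using System.Web.Routing;

// Renders whatever view name appears in the URL.
public class ProductsController : Controller
{
    public ActionResult Show(string viewName)
    {
        return View(viewName); // resolved under Views/Products/
    }
}

public static class DesignerRoutes
{
    public static void Register(RouteCollection routes)
    {
        // /products/new_view_here -> ProductsController.Show("new_view_here")
        routes.MapRoute(
            name: "DesignerPages",
            url: "products/{viewName}",
            defaults: new { controller = "Products", action = "Show" });
    }
}
```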
Good Luck!

ASP.NET XML ObjectDataSource Wrapper Class Examples

I want to use XML instead of SQL Server for a simple website.
Are there any good tutorials, code examples, and/or tools available for making a (preferably VB.NET) wrapper class to handle the basic list, insert, edit, and delete (CRUD) code?
The closest thing I found was a Telerik Trainer video/code sample for their Scheduler component, where they used XML to handle the scheduling data in the demo. They created an ObjectDataSource class. Here is a LINK to that demo if anyone is interested.
In these days of SQL Server Express, I'd say there's really no reason for you not to use a database.
I know this doesn't really answer your question, but I'd hate to see you roll out code that will be a nightmare to maintain and scale.
Maybe you could tell us why you want to use XML files instead of a proper database.
It would make deployment easier for clients that use GoDaddy, where the database isn't in the App_Data folder. Also, backing up those websites would be as simple as FTPing the entire thing.
I have concerns about possible collisions on saving, especially if I add something as simple as a click counter to, say, a list of MP3 files visitors to the site can access.
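On the collision concern: within a single application, writes to a shared XML file can be serialized with a lock. A minimal sketch of the click-counter case (C# rather than the VB.NET the question prefers, but the same APIs exist in both; the file and element names are made up):

```csharp
using System.Linq;
using System.Threading;
using System.Xml.Linq;

public static class MediaStore
{
    const string Path = "mp3s.xml"; // hypothetical data file
    static readonly ReaderWriterLockSlim Lock = new ReaderWriterLockSlim();

    // Writers are serialized, so two simultaneous requests incrementing a
    // counter can't overwrite each other's save. Note this only protects a
    // single process; a web farm would need file locking or a database.
    public static int IncrementClicks(string title)
    {
        Lock.EnterWriteLock();
        try
        {
            var doc = XDocument.Load(Path);
            var track = doc.Root.Elements("track")
                           .First(t => (string)t.Attribute("title") == title);
            var clicks = (int)track.Attribute("clicks") + 1;
            track.SetAttributeValue("clicks", clicks);
            doc.Save(Path);
            return clicks;
        }
        finally
        {
            Lock.ExitWriteLock();
        }
    }
}
```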
