Sharing large files efficiently via a web link - ASP.NET

I would like to provide a link on my web site to download a large file, and it should be done with scale in mind. What is the most efficient way to do this today?
Of course I can do it the classic way:
<a href="//download.myserver.com/largefile.zip" title="Download via HTTP">Download</a>
The problem with this approach is that I don't want download traffic to my server to explode. So I would rather redirect to external hosting for this large file. What is the best way to host the file then?

If you want to avoid download traffic to your server, then I personally suggest using Azure Blob Storage. There is plenty of documentation, and there are client libraries for .NET. It moves download traffic, and the security concerns of hosting files, off your site and into the Azure cloud, which is very secure to say the least.
If you want the files to be publicly available to anyone, make the container public, get the URL of the file you want, and place it in the anchor tag; otherwise you may need to familiarise yourself with shared access signatures (there is plenty of documentation for those too). Like most things it is not free, but the silver lining is that you only pay for what you use.
You can get started here.
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
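To make the "public container + URL in the anchor tag" idea concrete, here is a minimal sketch in Python rather than .NET (the quickstart above covers the C# equivalent). It assumes the `azure-storage-blob` package, a hypothetical storage account `myaccount`, and a container `downloads` already set to public read access.

```python
# Sketch: upload a large file to Azure Blob Storage and link to it directly.
# Account name "myaccount" and container "downloads" are placeholders.

def public_blob_url(account: str, container: str, blob_name: str) -> str:
    """Build the public URL you would place in the <a href="..."> tag."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"

def upload_large_file(connection_string: str, container: str, path: str) -> None:
    # Imported lazily so the URL helper above stays dependency-free.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    client = service.get_container_client(container)
    with open(path, "rb") as data:
        # overwrite=True replaces an existing blob of the same name.
        client.upload_blob("largefile.zip", data, overwrite=True)

print(public_blob_url("myaccount", "downloads", "largefile.zip"))
# https://myaccount.blob.core.windows.net/downloads/largefile.zip
```

The anchor on your page then points at that blob URL, so the download bytes never touch your web server.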
Disclaimer: I do not work for Microsoft, nor do I benefit from this in any way. This is just a personal opinion based on previous experience and projects.

Related

Build an Offline website - Burn it on a CD

I need to build a website that can be downloaded to a CD.
I'd like to use a CMS (WordPress, Kentico, MojoPortal) to set up my site, and then download it to a CD.
There are many programs that can download a website to a local drive, but how to make the search feature work is beyond my understanding.
Any ideas?
The project is supposed to be an index of local community services, for communities without a proper internet connection.
If you need to make something that can be viewed from a CD, the best approach is to use only HTML.
WordPress, for example, needs Apache and MySQL to run. And although somebody could "install" the website on their own computer if you supplied the content via a CD, most of your users will not be knowledgeable enough to do this.
Assuming you are just after the content of the site, in general you should be able to find a tool to "crawl" or mirror most sites and create an offline version that can be burned to a CD (for example, using wget).
This will not produce offline versions of application functionality like search or login, so you would need to design your site with those limitations in mind.
For example:
Make sure your site can be fully navigated without JavaScript (most "crawl" tools will discover pages by following links in the html and will have limited or no JavaScript support).
Include some pages which are directory listings of resources on the site (rather than relying on a search).
Possibly implement your search using a client-side technology like JavaScript that would work offline as well.
Use relative HTML links for images/JavaScript, and between pages. The tool you use to create the offline version of the site should ideally be able to rewrite/correct internal links, but it is best to minimise any need for it to do so.
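The client-side search suggested above needs an index it can load offline. A minimal sketch (file names and layout are hypothetical) that extracts the words from mirrored HTML pages and emits JSON a small JavaScript search box on the CD could consume:

```python
# Sketch: build a JSON search index from mirrored HTML pages, for use by a
# client-side (JavaScript) search that works offline from a CD.
import json
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def index_page(relative_url: str, html: str) -> dict:
    """One index entry: the page URL plus its sorted, deduplicated words."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks)
    words = sorted(set(re.findall(r"[a-z0-9]+", text.lower())))
    return {"url": relative_url, "words": words}

def build_index(pages: dict) -> str:
    """pages maps relative URL -> HTML source; returns the JSON index."""
    return json.dumps([index_page(url, html) for url, html in sorted(pages.items())])

entry = index_page("services/food.html", "<h1>Food Bank</h1><p>Open daily</p>")
```

A build step would walk the mirrored folder, feed each page through `index_page`, and write the result to a file the offline search script loads with a relative link.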
Another approach you could consider is distributing using a clientside wiki format, such as TiddlyWiki.
Blurb from the TiddlyWiki site:
TiddlyWiki allows anyone to create personal SelfContained hypertext
documents that can be published to a WebServer, sent by email,
stored in a DropBox or kept on a USB thumb drive to make a WikiOnAStick.
I think you need to clarify what you would like to be downloaded to the CD. As Stennie said, you could grab the content (and anything else you need to recreate the site) with a "crawler" or TiddlyWiki, but otherwise I think what you want to develop is actually an application, in which case you would need to do more development than standard CMS packages provide. I'm reluctant to, but I would suggest you look into something like the Salesforce platform. It's a cloud-based platform that may facilitate what you're really working towards.
You could create the working CMS on a small web/db server image using VirtualBox and put the virtual disk in a downloadable place. The end user would need the VirtualBox client (free!) and the downloaded virtual disk, but you could configure it to run with minimal effort for the creation, deployment and running phases.

Migrate static content from ASP.NET project to Windows Azure platform

I've got an ASP.NET project that I want to publish on the Azure platform. The project contains various static content: images, JavaScript, CSS, HTML pages and so on. I want to store this content in Azure blob storage. So, my questions are:
1) Is there any way to automate the process of migrating this content from my application to blob storage?
2) How can I use the data retrieved from blob storage? Any examples would be great!
Best regards,
Alexander
First off, what you're trying to do could create cross-site scripting issues (the files will be on different domain names) or security issues (if you're using SSL). So make sure you really want to separate the static files from the rest of your web site.
That said, the simplest approach would be to use any one of a number of Windows Azure Storage management utilities (Storage Explorer or Cerebrata's Storage Studio would both work) to upload the static content to a Windows Azure Storage blob container. Then set the permissions on that container to public read so that anyone with a web browser can access its contents.
Finally, change all references to the content to point to the new URIs in blob storage and deploy your ASP.NET web role.
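To automate the upload step (question 1), a small script can walk the static folder and push each file with a sensible Content-Type. This is a sketch in Python using the `azure-storage-blob` package rather than a .NET tool; the container name `static` is an assumption.

```python
# Sketch: push a local folder of static content (images, CSS, JS, HTML)
# into an Azure blob container, guessing each file's Content-Type.
import mimetypes
import os

def blob_name_for(root: str, path: str) -> str:
    """Blob name relative to the root, with forward slashes (css/site.css)."""
    return os.path.relpath(path, root).replace(os.sep, "/")

def content_type_for(path: str) -> str:
    """Guess the MIME type so browsers render the blob correctly."""
    guessed, _ = mimetypes.guess_type(path)
    return guessed or "application/octet-stream"

def migrate_folder(connection_string: str, root: str, container: str = "static"):
    # Hypothetical container name; assumes the azure-storage-blob package.
    from azure.storage.blob import BlobServiceClient, ContentSettings

    service = BlobServiceClient.from_connection_string(connection_string)
    client = service.get_container_client(container)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as data:
                client.upload_blob(
                    blob_name_for(root, path),
                    data,
                    overwrite=True,
                    content_settings=ContentSettings(content_type=content_type_for(path)),
                )
```

After the upload, the reference-rewriting step above is just swapping each relative path for the container's base URI plus the blob name produced here.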
Again though, if I were you, I'd really look at what you're trying to accomplish with this approach. By putting the content in blob storage, you gain access to a few things (like CDN enablement), but as a trade-off you lose control over many others (like simplified access control via IIS, or request logs that tell you when someone is downloading your image files a trillion times to try and run up your bill). So unless there's a solid NEED for this, I'd generally recommend against it.
Adding a bit to Brent's answer: you'll get a few more benefits when offloading static content to blob storage, such as a reduction in load on your Web Role instances.
I wrote up a more detailed answer on this similar StackOverflow question.
In light of your comment to Brent, you may want to consider uploading the content into Blob storage and then proxying it through a WebRole. You can use something like an HttpModule to accomplish that fairly seamlessly.
This has 2 main advantages:
You can add/modify files without reloading your web roles or losing them on role refresh.
If you're migrating a site, the files can stay at the same URLs they were pre-migration.
The disadvantages:
You're paying the monetary cost for Blob accesses and the performance cost to your web roles.
You can't use the Azure CDN.
Blob storage is generally slower (higher latency) than disk access.
I've got a fairly simple module I wrote to do exactly this. I haven't gotten around to posting it anywhere public, but if you're going to do this I can send you the code or something.
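The proxying idea above boils down to two steps: map the incoming request path to a blob URI, then fetch and relay the bytes. Here is a sketch of that mapping in Python rather than as the ASP.NET HttpModule the answer describes; the account and container names are made up.

```python
# Sketch of the "proxy blobs through the web tier" idea. An HttpModule
# would do the same in ASP.NET: map the request path to a blob URI,
# fetch it, and relay the bytes (plus caching headers) to the client.
from urllib.parse import quote

BLOB_BASE = "https://myaccount.blob.core.windows.net/site-content"  # hypothetical

def blob_uri_for(request_path: str) -> str:
    """Map an incoming path like /images/logo.png to its blob URI."""
    return BLOB_BASE + "/" + quote(request_path.lstrip("/"))

def serve(request_path: str) -> bytes:
    # A real module would also forward Content-Type/caching headers and
    # translate a blob 404 into an HTTP 404 for the client.
    from urllib.request import urlopen

    with urlopen(blob_uri_for(request_path)) as resp:
        return resp.read()
```

Because the mapping preserves the original path, migrated files keep their pre-migration URLs, which is the second advantage listed above.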

Streaming with Amazon S3

My website (developed with ASP.NET) publishes news along with related JPEG files and Flash videos. Right now it is hosted on a dedicated server, but it is becoming hard to manage backups and bandwidth.
Can I host the image, video and audio files in an S3 bucket to resolve my issue? I have seen some articles related to this, but I wanted to check with those who are already doing it well. Do I need to take care of any particular steps to do so?
Please advise.
Yup, S3 works just fine for this. You need to make your buckets public to do so.
Alternatively, you can use CloudFront on top of S3 - http://aws.amazon.com/cloudfront/. It's a bit more expensive, but I've had good success with it.
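As a sketch of the public-bucket approach, here is how an upload plus the resulting media URL might look with `boto3` (the bucket name and region are assumptions; with CloudFront you would point the distribution at the bucket and serve the distribution's domain instead):

```python
# Sketch: upload media to S3 with public-read access and build the URL
# to embed in your pages. Bucket "mysite-media" is a placeholder.

def s3_url(bucket: str, key: str, region: str = "us-east-1") -> str:
    """Virtual-hosted-style URL for a publicly readable object."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def upload_public(bucket: str, key: str, path: str) -> None:
    import boto3  # pip install boto3; credentials come from the environment

    s3 = boto3.client("s3")
    # public-read lets anyone fetch the object by URL, which is what the
    # "make your buckets public" advice above amounts to per-object.
    s3.upload_file(path, bucket, key, ExtraArgs={"ACL": "public-read"})
```

Your news pages then reference `s3_url(...)` directly, so image and video bandwidth comes off your dedicated server entirely.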

Developing an online music store

We need to develop an application to sell music online. Needless to say, everything will be done quite legally, and as part of that we have to plan an interface for paying artists. However, we are confronted with a question: what is the best way to store the music on the server? Should we save it to the server's disk from an HTTP file upload? Should we save it via FTP, or would it be wiser to store it in the database? It goes without saying that we need this to be as safe as possible, so HTTPS is probably required. What do you think is the best way? Do you have other ideas? In the all-HTTP case, uploading songs (for administration) is quite long and boring, but each upload is easily linkable to a song record the admin creates in his web application; by comparison, with FTP we would upload songs to the server and then list the directory in the admin area to link the correct uploaded file to the song information in the database.
I know this may not be quite clear; that's because I'm French. Tell me which parts you don't understand and I will try to explain.
I've used Krystalware's SlickUpload ASP.NET control in the past to take care of the uploading part for you (you can use the built-in control if you want, but this one has a lot of nifty AJAX-style features done for you and is quite cheap).
Edit:
I would not advocate storing the music file itself in the database. It is much better (in my humble opinion) to store only the location of the file in the database. If you use one of the cloud services listed below, then the location might simply be an HTTP link.
I'd also seriously consider using a cloud storage service for the music files, something like Amazon S3 or Rackspace Cloud Files. Cloud Files is good because, if you wish, you can also enable CDN (Content Delivery Network) delivery, which means your users can access the uploaded music tracks much faster than if they were served off your local web server, for instance.
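The "store the location, not the bytes" advice can be sketched as follows; the table and column names are illustrative, not from the question, and SQLite stands in for whatever database the site uses:

```python
# Sketch: the upload handler saves the file to storage (disk or a cloud
# service) and records only its URL/location in the songs table.
import sqlite3

def save_song_record(db: sqlite3.Connection, title: str, location_url: str) -> int:
    """Insert a song row pointing at the stored file; return its id."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS songs "
        "(id INTEGER PRIMARY KEY, title TEXT, location TEXT)"
    )
    cur = db.execute(
        "INSERT INTO songs (title, location) VALUES (?, ?)",
        (title, location_url),
    )
    db.commit()
    return cur.lastrowid

conn = sqlite3.connect(":memory:")
song_id = save_song_record(
    conn, "Demo Track", "https://s3.amazonaws.com/mybucket/demo.mp3"
)
```

The admin UI then links a song record to its uploaded file simply by storing the file's URL at creation time, avoiding the FTP-then-match workflow described in the question.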
Hope this helps,
Richard.

How do I cluster an upload folder with ASP.Net?

We have a situation where users are allowed to upload content, and then separately make some changes, then submit a form based on those changes.
This works fine in a single-server, non-failover environment, however we would like some sort of solution for sharing the files between servers that supports failover.
Has anyone run into this in the past? And what kind of solutions were you able to develop? Obviously persisting to the database is one option, but we'd prefer to avoid that.
At a former job we had a cluster of web servers with an F5 load balancer in front of them. We had a very similar problem in that our applications allowed users to upload content which might include photos and such. These were legacy applications; we did not want to edit them to use a database, and a SAN solution was too expensive for our situation.
We ended up using a file replication service on the two clustered servers. It ran as a service on both machines, using an account that had network access to paths on the opposite server. When a file was uploaded, this backend service synced the file system folders, making the file available to be served from either web server.
Two of the products we reviewed were ViceVersa and PeerSync. I think we ended up using PeerSync.
In our scenario, we have a separate file server that both of our front-end app servers write to; that way either server has access to the same set of files.
The best solution for this is usually to provide the shared area on some form of SAN, which will be accessible from all servers and provide failover.
This also has the benefit that you don't have to provide sticky load balancing, the upload can be handled by one server, and the edit by another.
A shared SAN with failover is a great solution with a great (high) cost. Are there any similar solutions with failover at a reasonable cost? Perhaps something like DRBD for windows?
The problem with a simple shared filesystem is the lack of redundancy (what if the file server goes down?).