We need to develop an application to sell music online. Needless to say, everything will be done perfectly legally, so we also have to plan an interface for paying the artists. However, we are confronted with a question: what is the best way to store the music on the server? Should we save it to the server's disk from an HTTP file upload? Should we save it via FTP, or would it be wiser to store it in the database? It goes without saying that this needs to be as secure as possible, so HTTPS is probably required. What do you think is the best way? Or is there another idea entirely? In the HTTP case, uploading songs (for administration) is slow and tedious, but each upload is easy to link to the song record the admin creates in the web application. By comparison, with an FTP upload we would push the songs to the server and then list the directory in the admin section to link the correct uploaded file to the song information in the database.
I know this may not be entirely clear (I'm French); tell me which parts you don't understand and I will try to explain them.
I've used Krystalware's SlickUpload ASP.NET control in the past to take care of the uploading part for you (you can use the built-in control if you want, but SlickUpload has a lot of the nifty AJAX-style features done for you and is quite cheap).
Edit:
I would not advocate storing the music file itself in the database. It is much better, in my humble opinion, to store only the location of the file in the database. If you use one of the cloud services mentioned below, then the location might simply be an HTTP link.
I'd also seriously consider using a cloud storage service for the music files themselves, something like Amazon S3 or Rackspace Cloud Files. Cloud Files is good because, if you wish, you can also enable CDN (Content Delivery Network) delivery, which means your users can access the uploaded music tracks much faster than if they were served from your local web server.
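As a rough sketch of that pattern (not the definitive implementation), here is what the upload side might look like in Java with the AWS SDK; the bucket name, object key, and file name are all made up for illustration:

    import java.io.File;
    import java.net.URL;

    import com.amazonaws.regions.Regions;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class SongUploader {

        // Hypothetical bucket name; credentials come from the default provider chain.
        private static final String BUCKET = "my-music-store";

        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                    .withRegion(Regions.US_EAST_1)
                    .build();

            // 1. Upload the track itself to object storage, not to the database.
            String key = "tracks/song-123.mp3";
            s3.putObject(BUCKET, key, new File("song-123.mp3"));

            // 2. Store only the resulting location in the song row, e.g. an HTTP link.
            URL location = s3.getUrl(BUCKET, key);
            System.out.println("Save this in the songs table: " + location);
        }
    }

The database row for the song then only carries that URL (plus title, artist, price, and so on), and the file itself never touches your database.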
Hope this helps,
Richard.
I would like to provide a link on my web site to download a large file. This should be done with scale in mind. What is the most efficient way to do this today?
Of course, I could do it the classic way:
<a href="//download.myserver.com/largefile.zip" title="Download via HTTP">Download</a>
The problem with this approach is that I don't want download traffic to my server to explode. So I would rather redirect to external hosting for this large file. What is the best way to host the file, then?
If you want to keep download traffic off your server, then I personally suggest using Azure Blob Storage. There is plenty of documentation, and there are client libraries for .NET. It removes the download traffic from your site, along with the security concerns of hosting the files yourself, and moves them to the Azure cloud, which is very secure to say the least.
If you want the files to be publicly available to anyone, then make a public container, get the URL of the file you want, and place it in the anchor tag; otherwise you may need to familiarise yourself with blob leasing (there is plenty of documentation on that too). Like most things it is not free, but the silver lining is that you only pay for what you use.
You can get started here.
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
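To make that concrete, here is a minimal sketch of the upload side using the Java blob client (the .NET client in the quickstart above follows the same shape); the connection string, container name, and file name are placeholders, and the container is assumed to already exist with public (blob-level) read access:

    import com.azure.storage.blob.BlobClient;
    import com.azure.storage.blob.BlobContainerClient;
    import com.azure.storage.blob.BlobServiceClient;
    import com.azure.storage.blob.BlobServiceClientBuilder;

    public class LargeFileUpload {

        public static void main(String[] args) {
            // Placeholder: in practice the connection string comes from configuration.
            String connectionString = System.getenv("AZURE_STORAGE_CONNECTION_STRING");

            BlobServiceClient service = new BlobServiceClientBuilder()
                    .connectionString(connectionString)
                    .buildClient();

            // Assumes a container named "downloads" with public read access on blobs.
            BlobContainerClient container = service.getBlobContainerClient("downloads");
            BlobClient blob = container.getBlobClient("largefile.zip");

            // Upload the local file, overwriting any existing blob with the same name.
            blob.uploadFromFile("largefile.zip", true);

            // This is the URL to put in the <a href="..."> on your page.
            System.out.println(blob.getBlobUrl());
        }
    }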
Disclaimer: I do not work for Microsoft, nor do I benefit from this in any way. This is just a personal opinion based on previous experience and projects.
The situation: I'm going to implement a digital repository using Alfresco Community 5.1 to manage our university's digital content (software installers, books, theses), which at the moment is stored on several different FTP servers. I intend to use Alfresco as the back end and Orchard CMS as our intranet front end (a non-functional requirement), and to have the two communicate over CMIS. The general idea is a social-networking approach in which every user can modify metadata and add tags in order to improve search, which is the overall objective of my work: to allow searching and downloading of our intranet's digital content, because right now it takes a long time to find anything since it is stored on FTP servers without good cataloguing.
I have already successfully created a custom data model, but when I decided to migrate the content from these FTP servers, I couldn't find any documentation about it. I read about the bulk import tool, but it turns out I would need the data locally on the same machine that runs Alfresco, and, as I said, the data sources are different FTP servers.
So how can I migrate data from several FTP servers into Alfresco? Is it necessary to physically import the files into Alfresco, or can I work with an index pointing at the FTP files (keep the files on the FTP servers and store in Alfresco only a reference to each object; my only functional requirements are search and download)?
Please, I need your guidance, because here in Cuba we don't have experience working with Alfresco and it is very difficult to get access to the internet. If you can point out a way to solve this, or make any recommendation, I will be forever grateful. Thank you, and again, sorry to disturb you.
If this were a one-time thing, you could use an FTP client of some sort to simply drag and drop the files from your FTP server into Alfresco's FTP server. This would copy the files only and would not set any custom metadata.
Alternatively, you could write some Java to do this. Java can read from FTP servers and can write to Alfresco via CMIS. This would give you the opportunity to set some properties on the objects written into Alfresco beyond just the file name, creation date, and modification date.
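As a rough illustration of that Java route, here is a minimal sketch that uses Apache Commons Net for the FTP side and Apache Chemistry OpenCMIS for the Alfresco side. The host names, credentials, target folder path, and MIME type are placeholders, and error handling plus your custom content model are left out:

    import java.io.InputStream;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.chemistry.opencmis.client.api.Folder;
    import org.apache.chemistry.opencmis.client.api.Session;
    import org.apache.chemistry.opencmis.client.api.SessionFactory;
    import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
    import org.apache.chemistry.opencmis.commons.PropertyIds;
    import org.apache.chemistry.opencmis.commons.SessionParameter;
    import org.apache.chemistry.opencmis.commons.data.ContentStream;
    import org.apache.chemistry.opencmis.commons.enums.BindingType;
    import org.apache.chemistry.opencmis.commons.enums.VersioningState;
    import org.apache.commons.net.ftp.FTPClient;
    import org.apache.commons.net.ftp.FTPFile;

    public class FtpToAlfresco {

        public static void main(String[] args) throws Exception {
            // 1. Open a CMIS session against Alfresco (URL, user, and password are placeholders).
            SessionFactory factory = SessionFactoryImpl.newInstance();
            Map<String, String> params = new HashMap<>();
            params.put(SessionParameter.USER, "admin");
            params.put(SessionParameter.PASSWORD, "admin");
            params.put(SessionParameter.ATOMPUB_URL,
                    "http://alfresco.example.edu/alfresco/api/-default-/public/cmis/versions/1.1/atom");
            params.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());
            params.put(SessionParameter.REPOSITORY_ID, "-default-");
            Session session = factory.createSession(params);
            Folder target = (Folder) session.getObjectByPath("/Sites/library/documentLibrary");

            // 2. Read each file from the FTP server and stream it into Alfresco.
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp1.example.edu");
            ftp.login("ftpuser", "ftppass");
            ftp.enterLocalPassiveMode();

            for (FTPFile remote : ftp.listFiles("/books")) {
                if (!remote.isFile()) {
                    continue;
                }
                try (InputStream in = ftp.retrieveFileStream("/books/" + remote.getName())) {
                    Map<String, Object> props = new HashMap<>();
                    props.put(PropertyIds.OBJECT_TYPE_ID, "cmis:document"); // or your custom type
                    props.put(PropertyIds.NAME, remote.getName());

                    ContentStream content = session.getObjectFactory().createContentStream(
                            remote.getName(), remote.getSize(), "application/octet-stream", in);
                    target.createDocument(props, content, VersioningState.MAJOR);
                }
                ftp.completePendingCommand(); // finish the FTP transfer before the next file
            }

            ftp.logout();
            ftp.disconnect();
        }
    }

You would repeat the FTP loop once per source server, and swap cmis:document for your custom type in order to carry the extra metadata you defined in your model.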
However, if you are going to do this regularly, you might want to look at an integration tool. For example, you could use Apache Camel to watch the FTP servers, and when there is a change, it could fetch the file and write it to Alfresco via CMIS.
This will likely take some coding to make it work exactly right, but hopefully this gives you some options to consider.
I've got an ASP.NET project that I want to publish on the Azure platform. My project contains various static content: images, JavaScript, CSS, HTML pages, and so on. I want to store this content in Azure blob storage. So, my questions are:
1) Is there any way to automate the process of migrating this content from my application to blob storage?
2) How can I use the data retrieved from blob storage? Any examples would be great!
Best regards,
Alexander
First off, what you're trying to do could create cross-site scripting issues (the files will be on different domain names) or security issues (if you're using SSL). So make sure you really want to separate the static files from the rest of your web site.
That said, the simplest approach would be to use any one of a number of Windows Azure Storage management utilities (Storage Explorer or Cerebrata's Storage Studio would both work) to upload the static content to a Windows Azure Storage blob container. Then set the permissions on that container to public read so that anyone with a web browser can access its contents.
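If you would rather script the upload than use one of those GUI tools, a minimal sketch of the idea (shown here with the Java blob client; the connection string, container name, and local folder are placeholders) could walk your static-content folder and push each file into the container:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.stream.Stream;

    import com.azure.storage.blob.BlobContainerClient;
    import com.azure.storage.blob.BlobServiceClientBuilder;

    public class StaticContentUpload {

        public static void main(String[] args) throws IOException {
            // Placeholders: connection string from configuration, container named "static".
            BlobContainerClient container = new BlobServiceClientBuilder()
                    .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                    .buildClient()
                    .getBlobContainerClient("static");

            if (!container.exists()) {
                container.create(); // remember to set the container's access level to public read
            }

            Path root = Paths.get("wwwroot"); // hypothetical local folder of images/css/js
            try (Stream<Path> files = Files.walk(root)) {
                files.filter(Files::isRegularFile).forEach(file -> {
                    // Blob name mirrors the relative path, e.g. "css/site.css".
                    String blobName = root.relativize(file).toString().replace('\\', '/');
                    container.getBlobClient(blobName).uploadFromFile(file.toString(), true);
                    System.out.println("Uploaded " + blobName);
                });
            }
        }
    }

In practice you would also set each blob's content type so browsers treat the CSS and JavaScript correctly.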
Finally, change all references to the content so they point to the new URIs in blob storage, and deploy your ASP.NET web role.
Again though, if I were you, I'd really look at what you're trying to accomplish with this approach. By putting the content in blob storage, you do gain access to a few things (like CDN enablement), but as a trade-off you lose control over many others (like simplified access control via IIS, or request logs that tell you when someone is downloading your image files a trillion times to try to run up your bill). So unless there's a solid NEED for this, I'd generally recommend against it.
Adding a bit to Brent's answer: you'll get a few more benefits when offloading static content to blob storage, such as a reduction in load on your Web Role instances.
I wrote up a more detailed answer on this similar StackOverflow question.
In light of your comment to Brent, you may want to consider uploading the content into Blob storage and then proxying it through a WebRole. You can use something like an HttpModule to accomplish that fairly seamlessly.
This has 2 main advantages:
You can add/modify files without reloading your web roles or losing them on role refresh.
If you're migrating a site, the files can stay at the same URLs they were pre-migration.
The disadvantages:
You're paying the monetary cost for Blob accesses and the performance cost to your web roles.
You can't use the Azure CDN.
Blob storage is generally slower (higher latency) than disk access.
I've got a fairly simple module I wrote to do exactly this. I haven't gotten around to posting it anywhere public, but if you're going to do this I can send you the code or something.
I maintain a web application (ASP.NET/IIS7/SQL2K8/Win2K8) that needs to access documents (hundreds of thousands of them, and growing). Currently they are all on a Windows 2K8 Server file share, accessed by UNC path (SMB). The files are in a single flat directory, and I'm trying to plan how best to improve this solution. I don't want to use SQL FILESTREAM, as it would take significant effort to migrate everything into it and would really lock us in to SQL Server. I also need a way to replicate the data for disaster recovery, so perhaps a solution can help with that too.
Options could be:
- Segment files into multiple directories? The application would add metadata recording which directory each file lives in (or segment by some other means); see the sketch after this question.
- Segment files onto separate servers (virtualize)? Backup becomes more complicated, and the application would add metadata recording which server each file is on.
- NAS storage
- SAN storage
- Put a service (WCF) in front of the files and have the app talk to the service, with the bonus of being reusable across many applications.
Assuming I'm going to store on the filesystem and not in the database (I've read those discussions here), which would be the more scalable solution?
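For the first option, here is a minimal sketch of how the application could deterministically map a file name to one of many sub-directories, so that no single directory has to hold hundreds of thousands of files (the UNC root and the fan-out of 256 are made-up values):

    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class DocumentPath {

        private static final Path ROOT = Paths.get("\\\\fileserver\\documents"); // existing UNC share (made up)
        private static final int FAN_OUT = 256; // hypothetical number of sub-directories

        // The same file name always maps to the same sub-directory.
        static Path resolve(String fileName) {
            int bucket = Math.floorMod(fileName.hashCode(), FAN_OUT);
            return ROOT.resolve(String.format("%03d", bucket)).resolve(fileName);
        }

        public static void main(String[] args) {
            System.out.println(resolve("invoice-2011-000123.pdf"));
        }
    }

Because the mapping is a pure function of the name, the directory doesn't strictly need to be stored as metadata, although keeping it in the database makes later re-balancing easier.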
You've got a couple issues:
- managing a large volume of (static?) files
- preparing for backups and disaster recovery of said files
I'll throw this out there even though I'm not a fan of the answer: you might poke around with the free SharePoint 2010 Foundation that's included with Server 2K8. If you're having trouble finding the documents you need (whether by search, taxonomy via tagging, or other metadata), need document expiration, and don't want to buy a full-blown document management system, this might be a solution. Of course, it introduces new problems...
If your only desire is to have these files available to spit out on the web, then a file store like the one you're using now really is the simplest solution. For DR/redundancy purposes, I'd look at a) running them on a RAID/SAN of some sort and b) auto-syncing them with the cloud (either Azure or Amazon). For b) you can get applications that make the cloud appear as a mapped drive, and then use rsync-type software to keep the cloud copy up to date.
If you want to build something new and cool, you might think about moving the entire file archive into the cloud and keeping a table in a database that records each file's name, old location, and new cloud location, plus a redirector that can provide access tokens to requestors.
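As a rough sketch of that last idea, the redirector could look up a document's cloud location in the mapping table and hand back a short-lived pre-signed URL as the "access token"; the bucket name, key, and the in-memory map below are stand-ins for the real database table:

    import java.net.URL;
    import java.util.Date;
    import java.util.HashMap;
    import java.util.Map;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class DocumentRedirector {

        private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Stand-in for the database table: file name -> cloud key. In reality this is a DB lookup.
        private final Map<String, String> cloudKeyByFileName = new HashMap<>();

        public DocumentRedirector() {
            cloudKeyByFileName.put("contract-0001.pdf", "archive/2011/contract-0001.pdf");
        }

        // Returns a URL valid for 15 minutes, acting as the access token for the requestor.
        public URL linkFor(String fileName) {
            String key = cloudKeyByFileName.get(fileName);
            if (key == null) {
                throw new IllegalArgumentException("Unknown document: " + fileName);
            }
            Date expires = new Date(System.currentTimeMillis() + 15 * 60 * 1000);
            return s3.generatePresignedUrl("document-archive", key, expires);
        }

        public static void main(String[] args) {
            System.out.println(new DocumentRedirector().linkFor("contract-0001.pdf"));
        }
    }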
3 different approaches... your choice.
We have a web login feature. We will offer free calls as part of a large campaign.
Scenarios:
Because of the free calls, we will offer a unique file to be downloaded and stored.
After a week or a month we will call them and offer them our desktop application to scan and see how trusted the user is.
If we don't find the same file again, we will never start business, and we will add to our own statistics.
Based on that report we want to do some follow-up campaigns.
We can do this with cookies, but we want user-experience and trust analysis.
Example:
if you play music on youtube.com, without you noticing, the file is actually stored in /tmp/Flash....flv with a lot of data in it.
Question:
How can I do something similar using Flex/Flash from the web browser? Please kindly point me to any link or existing resource.
Thanks in advance.
You can't write files to the client's PC through Flash.
You can use Shared Objects, which are very similar to normal files and will not be deleted in most cases.
http://learn.adobe.com/wiki/display/Flex/Shared+Objects
Claudio
In short, you can't. Flex in the browser doesn't have any filesystem access.
The only ways to store persistent data from the browser with Flex are through cookies, browser hacks like the evercookie and local shared objects.