Large video uploads via a website - Drupal

Some of the problems that can happen are timeouts, disconnections, and uploads that cannot be resumed and have to start again from the beginning. Assuming these files are up to around 5 GB in size, what is the best solution for dealing with this problem?
I'm using a Drupal 6 install for the website.
Some of my constraints due to the server setup I have to deal with:
Shared hosting with a maximum of 200 simultaneous connections (unlimited disk space)
Unable to create users through an API (so I can't automatically generate FTP accounts)
I do have the ability to run cron-type scripts via a Drupal module.
My initial thought was to create FTP users based on Drupal accounts and require users to download an FTP client for their OS of choice. But the lack of an API to auto-create FTP accounts, and the inability to do it from the command line, rather hinder that solution. If anyone can think of a workaround, let me know!
Thanks

Shared hosting usually does not support large file uploads through the browser. One solution is to use a separate file-hosting service for your large uploads. A nice and easy option to integrate is Amazon S3 and its browser-based upload with a form POST.
It can be integrated in a custom module that provides an upload form protected by Drupal's access control. If you require the files to be hosted on the Drupal server, you can use cron (either Drupal's or an external one) to move the files from S3 to your own hosting.
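For illustration, here is a minimal Python sketch (using boto3; the bucket name, key prefix, size limit and expiry are assumptions) of how the signed policy for such a browser-based POST form could be generated. The returned URL and fields would be embedded in the HTML form that the custom module renders for authenticated users:

```python
# Minimal sketch: generate the signed policy for an S3 browser-based POST upload.
# The bucket name, key prefix, size limit and expiry below are placeholders.
import boto3

s3 = boto3.client("s3")

post = s3.generate_presigned_post(
    Bucket="my-upload-bucket",                              # hypothetical bucket
    Key="videos/${filename}",                               # S3 fills in the uploaded file's name
    Conditions=[["content-length-range", 0, 5 * 1024**3]],  # allow uploads up to ~5 GB
    ExpiresIn=3600,                                         # the signed form is valid for an hour
)

# post["url"] is the form's action attribute; post["fields"] are the hidden
# inputs (key, policy, signature, ...) to include alongside the file input.
print(post["url"])
print(post["fields"])
```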

You're kind of limited in what you can do on a shared host. Your best option is probably to install SWFUpload and hope there aren't a lot of mid-upload errors.
Better options that you probably can't use on a shared host include the upload progress PHP extension (which Drupal automatically uses when it's installed) and, as you said, associating FTP accounts with Drupal accounts.

Related

How to migrate data to Alfresco from FTP servers as data sources?

The situation: I'm going to implement a digital repository using Alfresco Community 5.1 to manage our university's digital content, which is currently stored on different FTP servers (software installers, books, theses). I intend to use Alfresco as the backend and Orchard CMS as our intranet frontend (a non-functional requirement), with the two communicating via CMIS. The general idea is a social-networking approach in which every user can modify metadata and add tags in order to improve search, which is the main objective of my work (enabling search and download of our intranet's digital content, because right now it takes a long time to find anything since it is stored on FTP servers without proper cataloguing).
I have already created a custom data model successfully, but when I decided to migrate the content from these FTP servers, I couldn't find any documentation about it. I read about the bulk import tool, but it turns out I would need the data locally on the same machine that runs Alfresco, and as I said, the data sources are different FTP servers.
So how can I migrate data from different FTP servers, as data sources, into Alfresco? Is it necessary to physically import the files into Alfresco, or can I work with an index pointing to the FTP files (keep the files on the FTP servers and store only a reference to each object in Alfresco; my only functional requirements are search and download)?
Please, I need your guidance, because here in Cuba we don't have experience working with Alfresco and it is very difficult to get access to the internet. If you can point out a way of fixing this, or make any recommendation, I will be forever grateful. Thank you, and again, sorry to disturb you.
If this were a one-time thing, you could use an FTP client of some sort to simply drag and drop the files from your FTP server into Alfresco's FTP server. This would copy the files only and would not set any custom metadata.
Alternatively, you could write some Java to do this. Java can read from FTP servers and can write to Alfresco via CMIS. This would give you the opportunity to set some properties on the objects written into Alfresco beyond just the file name, creation date, and modification date.
However, if you are going to do this regularly, you might want to look at an integration tool. For example, you could use Apache Camel to watch the FTP servers, and when there is a change, it could fetch the file and write it to Alfresco via CMIS.
This will likely take some coding to make it work exactly right, but hopefully this gives you some options to consider.
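As an illustration of the FTP-to-CMIS approach described above (the answer suggests Java; this sketch uses Python with ftplib and Apache Chemistry's cmislib instead, and the endpoint URL, credentials and folder paths are all placeholders):

```python
# Minimal sketch: copy files from an FTP server into Alfresco over CMIS.
# Host names, credentials and paths are placeholders; error handling is omitted.
import io
from ftplib import FTP

from cmislib import CmisClient  # Apache Chemistry cmislib

# Connect to Alfresco's CMIS AtomPub endpoint and pick a target folder.
client = CmisClient(
    "http://alfresco.example.edu/alfresco/api/-default-/public/cmis/versions/1.1/atom",
    "admin", "admin",
)
repo = client.defaultRepository
target = repo.getObjectByPath("/Sites/library/documentLibrary/Installers")

# Pull each file from the FTP server and create a CMIS document for it.
ftp = FTP("ftp.example.edu")
ftp.login("reader", "secret")
for name in ftp.nlst():
    buf = io.BytesIO()
    ftp.retrbinary("RETR " + name, buf.write)
    buf.seek(0)
    # Custom properties from your data model could be passed here as well.
    target.createDocument(name, contentFile=buf)
ftp.quit()
```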

Build an offline website - burn it to a CD

I need to build a website that can be downloaded to a CD.
I'd like to use a CMS (WordPress, Kentico, MojoPortal) to set up my site, and then download it to a CD.
There are many programs that can download a website to a local drive, but how to make the search work is beyond my understanding.
Any ideas?
The project is supposed to be an index of local community services, for communities without a proper internet connection.
If you need to make something that can be viewed from a CD, the best approach is to use only HTML.
WordPress, for example, needs Apache and MySQL to run. And although somebody could "install" the website on their own computer if you supplied the content via a CD, most of your users will not be knowledgeable enough to do this.
Assuming you are just after the content of the site, in general you should be able to find a tool to "crawl" or mirror most sites and create an offline version that can be burned to a CD (for example, wget with its --mirror, --convert-links and --page-requisites options).
This will not produce offline versions of application functionality like search or login, so you would need to design your site with those limitations in mind.
For example:
Make sure your site can be fully navigated without JavaScript (most "crawl" tools will discover pages by following links in the html and will have limited or no JavaScript support).
Include some pages which are directory listings of resources on the site (rather than relying on a search).
Possibly implement your search using a client-side technology like JavaScript that works offline as well (see the sketch after this list for one way to pre-build a search index).
Use relative html links for images/javascript, and between pages. The tool you use to create the offline version of the site should ideally be able to rewrite/correct internal links for the site, but it would be best to minimise any need to do so.
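As one possible way to support that kind of client-side search, here is a minimal Python sketch (the folder name is a placeholder) that pre-builds a JSON index from the mirrored pages; a small script shipped on the CD could then load and filter this index in the browser without any server:

```python
# Minimal sketch: build a search index for an offline, CD-based HTML site.
# It walks the mirrored site, strips markup, and writes search-index.json,
# which a small client-side script on the CD could load and search.
# The folder name "offline-site" is a placeholder.
import json
import os
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects the <title> and the visible text of one HTML page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.chunks = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.chunks.append(data)


index = []
for root, _, files in os.walk("offline-site"):
    for name in files:
        if not name.endswith((".html", ".htm")):
            continue
        path = os.path.join(root, name)
        parser = TextExtractor()
        with open(path, encoding="utf-8", errors="ignore") as fh:
            parser.feed(fh.read())
        index.append({
            "url": os.path.relpath(path, "offline-site"),
            "title": parser.title.strip(),
            # Collapse whitespace and trim very long pages.
            "text": " ".join(" ".join(parser.chunks).split())[:2000],
        })

with open(os.path.join("offline-site", "search-index.json"), "w", encoding="utf-8") as fh:
    json.dump(index, fh)
```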
Another approach you could consider is distributing using a clientside wiki format, such as TiddlyWiki.
Blurb from the TiddlyWiki site:
TiddlyWiki allows anyone to create personal SelfContained hypertext documents that can be published to a WebServer, sent by email, stored in a DropBox or kept on a USB thumb drive to make a WikiOnAStick.
I think you need to clarify what you would like to be downloaded to the CD. As Stennie said, you could download the content and anything else you need to create the site with either a "crawler" or TiddlyWiki, but otherwise I think what you want to develop is actually an application, in which case you would need to do more development than standard CMS packages provide. I'm reluctant to, but I would suggest you look into something like the Salesforce platform. It's a cloud-based platform that may facilitate what you're really working towards.
You could create the working CMS on a small web/db server image using VirtualBox and put the virtual disk in a downloadable place. The end user would need the VirtualBox client (free!) and the downloaded virtual disk, but you could configure it to run with minimal effort for the creation, deployment and running phases.

How to update an ASP.NET Azure application

I am new to Windows Azure. I've created a simple HelloWorld ASP.NET Azure application and published it. I know I can republish the whole application in Visual Studio by right-clicking the project and then publishing it. But is it possible to update only one file (an aspx page, a picture, etc.)?
Thanks!
Regards, Alexander.
I think if you're just learning Windows Azure, the most helpful answer is "You can't." The way Windows Azure works is that to update an application, you create the full package and deploy it again.
This isn't to say that David's answer isn't also correct. I just wanted to directly answer the question of "How do I change just one file after I deploy?"
If you want to update individual files such as images, one thing you can do is store all images (and css, javascript, and any other static content) in Blob storage. This has several advantages:
Easy to upload new files individually, with both free and paid tools. For instance, Cloudberry Explorer is a free app and Cerebrata Cloud Storage Studio is a paid app, both of which let you manage containers and blobs individually; you can also script the upload (see the sketch after this list).
Smaller deployment package, because you've removed images and other large files
Less load on IIS, since image requests go directly to blob storage, not to your role instances
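A minimal sketch of such a scripted upload, assuming the azure-storage-blob Python package and a storage-account connection string (the container, blob and file names are placeholders):

```python
# Minimal sketch: push one updated image straight to Azure blob storage,
# so it can be replaced without redeploying the application package.
# The connection string, container and file names are placeholders.
from azure.storage.blob import BlobServiceClient

conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
service = BlobServiceClient.from_connection_string(conn_str)
blob = service.get_blob_client(container="images", blob="logo.png")

with open("logo.png", "rb") as fh:
    blob.upload_blob(fh, overwrite=True)  # overwrite the existing blob in place
```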
You can't store your aspx files in blobs, though you can store static content like html in blobs. To update aspx, you're basically updating the deployment. You can now do this as an "upgrade" which doesn't disrupt your IP address and, if you have multiple instances, doesn't take down your service during upgrade.
You can either use webdeploy (which should do a selective update of all files) or connect via remote desktop and update certain files yourself.
As the comment and MSDN say, neither of these two approaches is recommended or usable for production deployments. They are only meant as a shortcut for certain development scenarios.

WordPress deployment solution, ideas?

I develop a WordPress site on a local machine and I'm now looking for a mechanism to deploy it easily and quickly. I'm thinking about a DEV environment (located on my local machine), a STAGING environment (a subdomain on the client's site, maybe staging.example.com) and of course a LIVE environment (example.com)!
My current workaround:
Since I work with Aptana, I'm able to sync my changed files with the deploy mechanism the IDE provides. Export my local database, find/replace the permalinks, import the whole thing - done! To deploy live, I have to replace all the staging files with the live files.
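For illustration only, the find/replace step boils down to something like this rough sketch (file names and URLs are placeholders; note that WordPress stores some values as PHP-serialized strings whose embedded lengths break if the replacement changes the URL length, which is why serialization-aware tools such as wp-cli's search-replace command are safer for those):

```python
# Rough sketch of the find/replace step on an exported database dump.
# File names and URLs are placeholders. Caveat: PHP-serialized values in the
# dump only survive a plain text replacement if the old and new URLs have the
# same length; use a serialization-aware tool for the general case.
OLD = "http://localhost/mysite"
NEW = "http://staging.example.com"

with open("dev-dump.sql", encoding="utf-8") as src, \
        open("staging-dump.sql", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(line.replace(OLD, NEW))
```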
This should be easier! Is there anyone out there with a better workflow?
I'm open and really excited about your ideas!
Thanks a lot
greetings
Yep, it's frustrating and completely insane that WordPress requires this process because it puts absolute URLs in the database. I develop in a similar fashion using multiple staging sites for QA and client review. After my first deployment with WordPress I almost gave up on the platform entirely. All of the solutions recommended by core developers and others simply didn't work.
So I wrote a plugin: http://wordpress.org/extend/plugins/root-relative-urls/
that fixes the problem. With this plugin you don't need to do a search & replace on your content, and there are no hosts-file hacks or DNS tricks. You can access the site via IP address, computer name, or any type of forwarded host, and since the plugin converts URLs to root-relative form before they enter the database, you won't have to worry about them working across the different domain formats. And since the URLs don't hard-code the scheme (http/s), you won't have to worry about the 520 or so SSL-related bugs reported in the WordPress Trac database.
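(The plugin itself is PHP; purely as an illustration of the transformation it describes, converting an absolute URL to a root-relative one looks like this:)

```python
# Illustration only: turning an absolute URL into a root-relative one,
# i.e. dropping the scheme and host but keeping path, query and fragment.
from urllib.parse import urlsplit, urlunsplit

def to_root_relative(url):
    parts = urlsplit(url)
    return urlunsplit(("", "", parts.path or "/", parts.query, parts.fragment))

print(to_root_relative("https://staging.example.com/2011/05/post-name/?preview=1"))
# -> /2011/05/post-name/?preview=1
```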
It's a staple for any WordPress project I work on these days. And I have written a couple of other plugins to deal with idiosyncrasies in the platform, which you can check out here: http://wordpress.org/extend/plugins/profile/marcuspope
Hope that answers your problem.
I use Capistrano https://github.com/capistrano/capistrano/wiki/ for all my deployment needs and it is a really good solution. You can simply script anything and it just works.
It could work for your deployment scheme too.
I also use Capistrano for both WordPress and Drupal deployments. I typically install modules locally for testing, then push to the test and production environments. For uploads, etc., I add custom tasks to manage syncing the files stored in SCM and those that are not. Here is a simple guide I put together:
http://www.celerify.com/deploy-wordpress-drupal-using-capistrano

How do I cluster an upload folder with ASP.Net?

We have a situation where users are allowed to upload content, and then separately make some changes, then submit a form based on those changes.
This works fine in a single-server, non-failover environment, however we would like some sort of solution for sharing the files between servers that supports failover.
Has anyone run into this in the past? And what kind of solutions were you able to develop? Obviously persisting to the database is one option, but we'd prefer to avoid that.
At a former job we had a cluster of web servers with an F5 load balancer in front of them. We had a very similar problem in that our applications allowed users to upload content, which might include photos and such. These were legacy applications that we did not want to edit to use a database, and a SAN solution was too expensive for our situation.
We ended up using a file replication service on the two clustered servers. This ran as a service on both machines, using an account that had network access to paths on the opposite server. When a file was uploaded, this backend service synced the data in the file-system folders, making it available to be served from either web server.
Two of the products we reviewed were ViceVersa and PeerSync. I think we ended up using PeerSync.
In our scenario, we have a separate file server that both of our front-end app servers write to; that way either server has access to the same set of files.
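The idea is simply that the web tier writes uploads to the shared location rather than to a local folder; a minimal sketch (in Python for illustration, since the real code here would be ASP.NET, and the share path is a placeholder):

```python
# Sketch: store uploads on a shared file server instead of the local disk,
# so every web server behind the load balancer can read them.
# The UNC share path is a placeholder.
import os
import shutil

SHARED_UPLOADS = r"\\fileserver01\uploads"

def store_upload(temp_path, original_name):
    """Move a just-received upload to the shared location and return its path."""
    dest = os.path.join(SHARED_UPLOADS, original_name)
    shutil.move(temp_path, dest)
    return dest
```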
The best solution for this is usually to provide the shared area on some form of SAN, which will be accessible from all servers and contain failover.
This also has the benefit that you don't have to provide sticky load balancing, the upload can be handled by one server, and the edit by another.
A shared SAN with failover is a great solution at a great (high) cost. Are there any similar solutions with failover at a reasonable cost? Perhaps something like DRBD for Windows?
The problem with a simple shared filesystem is the lack of redundancy (what if the file server goes down?).
