Build an offline website - burn it to a CD - asp.net

I need to build a website that can be downloaded to a CD.
I'd like to use a CMS (WordPress, Kentico, MojoPortal) to set up my site, and then download it to a CD.
There are many programs that can download a website to a local drive, but how to make the search work is beyond my understanding.
Any ideas?
The project is supposed to be an index of local community services, for communities without a proper internet connection.

If you need to make something that can be viewed from a CD, the best approach is to use only HTML.
WordPress, for example, needs Apache and MySQL to run. And although somebody could "install" the website on their own computer if you supply the content via a CD, most of your users will not be knowledgeable enough to do this.

Assuming you are just after the content of the site, in general you should be able to find a tool to "crawl" or mirror most sites and create an offline version that can be burned to a CD (for example, using wget).
This will not produce offline versions of application functionality like search or login, so you would need to design your site with those limitations in mind.
For example:
Make sure your site can be fully navigated without JavaScript (most "crawl" tools discover pages by following links in the HTML and have limited or no JavaScript support).
Include some pages which are directory listings of resources on the site (rather than relying on a search).
Possibly implement your search using a client-side technology like JavaScript so that it also works offline (see the sketch after this list).
Use relative HTML links for images/JavaScript, and between pages. The tool you use to create the offline version of the site should ideally be able to rewrite/correct internal links, but it is best to minimise any need for it to do so.
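For the client-side search idea, one way to approach it (a rough sketch, not tied to any particular CMS; the mirror/ folder and the search-index.json file name are assumptions for illustration) is to generate a small JSON index of page titles and text when you build the CD image, and then have a few lines of JavaScript on the disc load and filter that file in the browser, so no server is needed:

```python
# Sketch: build a simple JSON search index from the mirrored HTML pages.
# Assumes the offline copy lives in ./mirror and that a small script shipped
# on the CD will load search-index.json and filter it client-side.
import json
from pathlib import Path
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the <title> and visible text of one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.text = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.text.append(data)

index = []
for page in Path("mirror").rglob("*.html"):
    parser = TextExtractor()
    parser.feed(page.read_text(encoding="utf-8", errors="ignore"))
    index.append({
        "url": str(page.relative_to("mirror")),  # relative link, works from the CD
        "title": parser.title.strip(),
        "text": " ".join(parser.text)[:2000],    # truncate to keep the index small
    })

Path("mirror/search-index.json").write_text(json.dumps(index), encoding="utf-8")
```

The index file sits on the disc next to the pages, so a simple substring match in the browser gives you a working search entirely offline.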
Another approach you could consider is distributing the content in a client-side wiki format, such as TiddlyWiki.
Blurb from the TiddlyWiki site:
TiddlyWiki allows anyone to create personal SelfContained hypertext
documents that can be published to a WebServer, sent by email,
stored in a DropBox or kept on a USB thumb drive to make a WikiOnAStick.

I think you need to clarify what you would like to be downloaded to the CD. As Stennie said, you could download the content and anything else you would need to create the site either with a "crawler" or TiddlyWiki, but otherwise I think what you want to develop is actually an application, in which case you would need to do more development than standard CMS packages provide. I'm reluctant to, but I would suggest you look into something like the Salesforce platform. It's a cloud-based platform that may facilitate what you're really working towards.

You could create the working CMS on a small web/DB server image using VirtualBox and put the virtual disk somewhere downloadable. The end user would need the VirtualBox client (free!) and the downloaded virtual disk, but you could configure it to run with minimal effort across the creation, deployment and running phases.

Related

How do you make an R-coded program running on AWS accessible from your website?

For a simulation we built for our company, we coded the entire thing in R. It runs on AWS, and consumers have been given links that route to the AWS page. Our website, however, currently runs on WordPress. For our customers to be able to access the product, we need to find a way to connect the product to the website. We would therefore like to replace the current site with a new one that lets users access our simulation from the website.
The only option we’ve come up with is to create a separate domain that has the interface built into the R program, and have a link to that domain from the current website. However, we would prefer to have a more direct solution.
Do any of you have any suggestions as to how we might achieve this?
Thanks for your time!
This answer very much depends on your code, but I think you have several options.
Run on external website
Pros:
Full control over code, easy to update without risking changes to main site
Easily accessible either by linking to it directly or by embedding it with an <iframe> (HTML) on your main website; no WordPress support required!
Cons:
Separate domain, some extra costs (?)
Shinyapps.io
Pros:
Easily publishable, often free
Cons:
Apps are accessible to pretty much everyone, which might not be ideal in a business situation
Less control over the platform
EDIT: I wanted to add that you can host your own Shiny applications and build the front-end using HTML. This gives you some more control.
AWS
Pros:
You should be able to set up an instance where the simulation runs on a subdomain that is not directly tied to WordPress, e.g. outside of the main WordPress folder.
As I said, the ideal solution depends on your code. Does it take user input, does it need to save files often? What kind of access control do you need?

Sharing large files efficiently on web link

I would like to provide a link on my website to download a large file. This should be done with scale in mind. What is the most efficient way to do this today?
Of course, I could do it the classic way:
<a href="//download.myserver.com/largefile.zip" title="Download via HTTP" >
The problem with this approach is that I don't want traffic to my server to explode with downloads. So I would rather redirect to external hosting for this large file. What is the best way to host this file, then?
If you want to avoid download traffic to your server, then I personally suggest using Azure Blob Storage. There is lots of documentation, and there are client libraries for .NET. It moves the download traffic, and the security concerns of hosting files, off your site and into the Azure cloud, which is very secure to say the least.
If you want the files to be publicly available to anyone, make a public container, get the URL of the file you want, and place it in the anchor tag; otherwise you may need to familiarise yourself with blob leasing (there is plenty of documentation for that too). Like most things it is not free, but the silver lining is that you only pay for what you use.
You can get started here.
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
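That quickstart uses the .NET SDK; purely as an illustration of the same flow, here is a minimal sketch using the Python SDK (azure-storage-blob), assuming a public container named "downloads" and a connection string stored in the AZURE_STORAGE_CONNECTION_STRING environment variable (both are placeholders):

```python
# Sketch: upload a large file to a public Azure Blob Storage container and
# print the URL your <a href="..."> download link should point at.
# Assumes: pip install azure-storage-blob, an existing container named
# "downloads" with blob-level public access, and the connection string in an env var.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client("downloads")

with open("largefile.zip", "rb") as data:
    container.upload_blob(name="largefile.zip", data=data, overwrite=True)

# This is the public URL to place in the anchor tag on your site.
print(container.get_blob_client("largefile.zip").url)
```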
Disclaimer: I do not work for Microsoft, nor do I benefit from this. It is just a personal opinion based on previous experience and projects.

How to migrate data to Alfresco from FTP servers as data sources?

The situation: I'm going to implement a digital repository using Alfresco Community Edition 5.1 to manage our university's digital content, which is currently stored on different FTP servers (software installers, books, theses). I intend to use Alfresco as a backend and Orchard CMS as our intranet frontend (a non-functional requirement), and to have the two communicate via CMIS. The general idea is to use a social networking approach in which every user can modify metadata and add tags in order to improve search, which is the general objective of my work (to allow searching and downloading of our intranet's digital content, because right now it takes a lot of time to find anything, since it is stored on FTP servers without good cataloguing).
I already successfully created a custom data model, but when I decided to migrate the content from these FTPs, I didn't find any documentation about it. I read about the bulk import tool, but it turns out I would need the data locally, on the same computer that runs Alfresco, and as I said, the data sources are different FTP servers.
So how can I migrate data from different FTP servers, as data sources, into Alfresco? Is it necessary to physically import the files into Alfresco, or can I work with an index pointing to the FTP files (keep the files on the FTPs and have in Alfresco a reference to each object; I only have search and download as functional requirements)?
Please, I need your guidance, because here in Cuba we don't have experience working with Alfresco and it is very difficult to get access to the internet. So if you can point out a way to fix this, or make any recommendation, I will be forever grateful. Thank you, and again, sorry to disturb you.
If this were a one-time thing, you could use an FTP client of some sort to simply drag and drop the files from your FTP server into Alfresco's FTP server. This would copy the files only and would not set any custom metadata.
Alternatively, you could write some Java to do this. Java can read from FTP servers and can write to Alfresco via CMIS. This would give you the opportunity to set some properties on the objects written into Alfresco beyond just the file name, creation date, and modification date.
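As a rough illustration of that FTP-to-CMIS pattern (the answer suggests Java; this sketch uses Python's ftplib and Apache Chemistry's cmislib instead, and the host names, credentials, paths and target folder are all placeholders):

```python
# Sketch: copy one file from an FTP server into Alfresco via CMIS.
# Hosts, credentials and paths are placeholders. Setting richer metadata than
# the file name depends on your content model (custom types/aspects).
import io
from ftplib import FTP
from cmislib import CmisClient

# Pull the file from the FTP server into memory (use a temp file for big files).
ftp = FTP("ftp.example.edu")
ftp.login("ftpuser", "ftppassword")
buffer = io.BytesIO()
ftp.retrbinary("RETR installers/tool-1.0.zip", buffer.write)
ftp.quit()
buffer.seek(0)

# Connect to Alfresco's CMIS 1.1 AtomPub endpoint and create the document.
client = CmisClient(
    "http://alfresco.example.edu:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom",
    "admin", "admin")
repo = client.defaultRepository
folder = repo.getObjectByPath("/Sites/library/documentLibrary/Installers")
folder.createDocument("tool-1.0.zip",
                      contentFile=buffer,
                      contentType="application/zip")
```

Looping over the results of ftp.nlst() for each server would let you migrate whole directories the same way.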
However, if you are going to do this regularly, you might want to look at an integration tool. For example, you could use Apache Camel to watch the FTP servers, and when there is a change, it could fetch the file and write it to Alfresco via CMIS.
This will likely take some coding to make it work exactly right, but hopefully this gives you some options to consider.

Is the source code of an app visible to the user?

If I write a desktop app in TideSDK or TideKit, will it be possible for users to read my source code, just like from an ordinary web page, or not?
Yes, if the user knows where to look. It's not viewable by right-clicking the window and selecting "View Source", but if they browse to the install directory, all the HTML and related files are there in broad daylight.
You could come up with some strategies to protect them, either using encryption or just providing a bootstrapper application which downloads the rest of the source from a server on startup, or something like that... but if this is a huge concern of yours, you're probably better off using a different platform.

Making an entire website available offline?

Consider this scenario:
I could use a CMS, say WordPress, to create a product catalogue, where my products are effectively tagged and categorised for ease of navigation. For employees and customers, this would provide an effective and visual means to browse a catalogue of products.
The problem with this is that it requires a connection to the internet to serve up the information. There could be many situations where the users of this catalogue are not connected to the internet, but still need to browse the catalogue - like field sales staff, for example.
How then, is it possible to make this entire site available for viewing (and distributing) offline? It would need to function exactly as the internet-connected version, serving up the same information and images.
Is it possible!?
I guess the limitation is that the WP database serves up the info, and that would require everyone to have a MAMP-type installation with WordPress on their machines?
You could create a static mirror of the site, e.g. with wget -km http://DOMAIN. Package that into an archive and get them to install a new archive whenever it has been updated.
If you need it to function exactly as it does online, as you mentioned, you might want to check out XAMPP. It is a package containing the Apache web server, MySQL, Perl and PHP. It does not need to be installed before being used, but it does require starting the components, which could probably be scripted.
The downside is that you will need to customise this version unless you want to include all your information in the catalogues. Also, since your current server likely has different modules than what comes standard with XAMPP, this could lead to having to maintain essentially two versions of the site.
If you don't need the databases to sync (e.g. portable POS systems), MAMP is a great solution. I've implemented this several times in cases where field agents required web-based promo materials. Easy to update, maintenance free, small learning curve. MAMP all the way.
I'm developing a WordPress site, mirrored locally under http://localhost. I'm able to transfer the database with a simple plugin that handles backup; then, before I load it locally, I remap the URI strings inside the SQL. Because the data is PHP-serialized, some care is needed to keep the string sizes aligned, i.e. change each occurrence of s:N:"...http://your_site/" to s:M:"...http://localhost/your_site/", with M = N + 10.
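If you would rather script that remapping than do it by hand, here is a rough sketch (the URLs are the same placeholders as above, and the regex assumes the serialized strings in the dump contain no escaped double quotes) that recomputes the s:N: length prefixes while swapping the URL in a SQL dump:

```python
# Sketch: replace a URL inside a WordPress SQL dump while fixing the
# s:N:"..." length prefixes of PHP-serialized strings so unserialize()
# still works after the move. Placeholder URLs; simplistic regex.
import re

OLD = "http://your_site/"
NEW = "http://localhost/your_site/"

def _fix(match):
    value = match.group(1).replace(OLD, NEW)
    # PHP stores the byte length, so measure the UTF-8 encoded value.
    return 's:%d:"%s"' % (len(value.encode("utf-8")), value)

def remap(dump):
    # First rewrite serialized strings that mention the old URL (fixing lengths),
    # then replace any remaining plain occurrences.
    dump = re.sub(r's:\d+:"([^"]*)"',
                  lambda m: _fix(m) if OLD in m.group(1) else m.group(0),
                  dump)
    return dump.replace(OLD, NEW)

with open("site.sql", encoding="utf-8") as f:
    fixed = remap(f.read())
with open("site-localhost.sql", "w", encoding="utf-8") as f:
    f.write(fixed)
```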
