I wanted to know: is there some special requirement for a website to make use of a CDN?
I mean, is there some special scheme (or at least some considerations) around which your website must be built right from the start in order to make use of a CDN (content delivery network)?
Is there anything that can stop a website from making use of a CDN, for example the way it references its content files, its static file paths, or anything else conceivable?
Thanks
It depends.
You have two kinds of CDN services:
Services like AWS CloudFront that require you to upload the files to some special place that they read from (e.g. AWS S3). In this case you need to have a step in your build process to upload the files correctly and handle the addresses somehow inside your application.
Services like Akamai that just need you to tweak your DNS records so they serve the requests to your users instead of you. In this case you would have two domains (image.you.com and image2.you.com), with image.you.com pointing to Akamai and image2.you.com pointing to the original source of the file. Whenever a user requested an image from Akamai, it would come to you through the "back door" (image2.you.com), fetch the file, and from then on keep serving it from its own cache.
If you use the second approach it's really simple to have a CDN supporting your application.
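With that second approach, the only real requirement on the site itself is that static assets are referenced through a hostname you can point at the CDN. A minimal sketch of the markup, reusing the hypothetical hostnames above (image.you.com is the name that points at the CDN, image2.you.com is your origin):

    <!-- image.you.com resolves to the CDN; on a cache miss the CDN pulls the file from image2.you.com -->
    <img src="http://image.you.com/logo.png" alt="Logo" />
    <link rel="stylesheet" href="http://image.you.com/css/site.css" />
    <script src="http://image.you.com/js/app.js"></script>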
There are a whole bunch of concerns when dealing with CDN solutions.
The first one is that a CDN can't serve a dynamic page, i.e. a page that is unique to every user. Typically, that includes PHP, ASPX, JSP, Ruby on Rails, etc., so if you're hoping to support lots of users on a dynamic site, you have to come up with another solution. Some CDN providers support "Edge Side Includes", which lets you glue dynamic pages together with cached content on the CDN, but this creates quite a complex application.
Of course, even for a dynamic application, a CDN can still serve the static files: images, stylesheets, JavaScript files, videos, etc.
@Tucaz explains the two major options here (actually, Akamai also provides a "filestore" CDN option). If you select the second option, where the CDN effectively becomes a caching reverse proxy in front of your website, it makes sense to tweak the cache headers on your HTTP server and tell the CDN to honour those. Make sure you set your .ASPX files to not cache!
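As a rough sketch of that last point: in an ASP.NET code-behind you can emit no-cache headers on dynamic pages so that a CDN or proxy in front of you will not store the response. This is only a minimal example, not a complete caching policy:

    // e.g. in Page_Load of a dynamic .aspx page:
    // tell downstream caches (including a CDN acting as a reverse proxy) not to store this response
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
    Response.Cache.SetNoStore();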
Haven't found many resources for this: if I wanted to use jQuery in my app, for example, would it be more beneficial to download jQuery into my project's directory, or to link to the Google CDN?
CDN - less latency
Using a CDN helps bring resources closer to the user by caching them in multiple locations around the world. Once those resources are cached, a user's request only needs to travel to the closest Point of Presence to retrieve that data instead of going back to the origin server each time.
Without CDN - offline testing on localhost
You can test your website on your local machine without network connectivity if you serve your libraries locally while in development.
Without CDN - monkey patching
You can modify and fix certain issues in a library that cause breaking issues in your software, and host the patched copy yourself. If you use a CDN you will have to use the original library's code instead, thus losing those fixes.
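A common middle ground is to reference the CDN copy and fall back to a local copy when the CDN is unreachable, which also covers offline development. A minimal sketch; the jQuery version and the local path are just examples:

    <!-- Try the Google CDN first -->
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
    <!-- If the CDN failed, window.jQuery is undefined, so load the copy bundled with the project -->
    <script>
        window.jQuery || document.write('<script src="/scripts/jquery-1.7.2.min.js"><\/script>');
    </script>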
I need to host media files on one server (with a different domain name) and have my website files on the other. My websites are all WordPress-based, and I need all the current media files to be moved to the other domain/server. I cannot do this manually as there are over 10,000 media files in total. Is there any plugin that allows this? Or any other way to do it? I am doing this to reduce the average CPU load / memory requirement. Thanks
If you are having performance issues with WordPress, my first recommendation would be to make sure you are using a caching plugin such as WP Super Cache or W3 Total Cache (I happen to use the latter). You will need to use a persistent caching option as well for the best performance, such as Memcached.
I can only speak to W3TC, but it does have an option to serve your static content via a CDN such as Rackspace Cloud Files. When configured properly it will move files from your media library to the CDN and replace the links in your content with the proper URLs.
If performance is your main interest, you should also look at serving your site via Nginx and php-cgi, managed through something like spawn-fcgi. There are some caveats to using Nginx vs Apache, but once tuned the performance is far superior. You can find a guide for the configuration on the WordPress site.
Lastly you can set up a reverse proxy from your front end server to point to static files hosted on a different server - the content just passes through your front end server. This can be accomplished using Apache or Nginx, but the performance will be better in the latter. Please see my site for an example of using an Nginx reverse proxy - you would just want to proxy requests for your static files location to a different back-end server.
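A minimal Nginx sketch of that reverse-proxy idea (the hostname and the uploads path are placeholders; adjust them to wherever your media actually lives):

    # In the server block of the front-end site: anything under /wp-content/uploads/
    # is fetched from the separate media server and streamed back through this server.
    location /wp-content/uploads/ {
        proxy_pass http://media.example.com;
        proxy_set_header Host media.example.com;
    }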
How can I make sure that static content (images, css, javascript) is cached? What is the best approach?
I recommend you go through this tutorial to understand how caching happens on the web (HTTP) in general.
Simply speaking, the web server needs to generate the appropriate HTTP headers while sending content to the client in order to control client-side caching. In an ASP.NET/IIS environment, it's IIS that typically handles static file content, and therefore you must configure IIS appropriately to control caching of static files as per your needs. See the links below for more information about configuring IIS caching for static content:
http://www.iis.net/ConfigReference/system.webServer/staticContent/clientCache
How to configure static content cache per folder and extension in IIS7?
EDIT: As you have asked about the best approach, the most prevalent approach that I see nowadays is to version static content (say, by appending some version identifier to the end of the file name or URL). Once versioned, you can treat it as immutable and emit cache headers that cache it for an effectively infinite duration. In an ASP.NET application, you can probably append the assembly version (or product version) to each static content URL. So essentially, you will be invalidating the cache on every build (or every product release).
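For example, on IIS 7 the clientCache element from the first link can send a far-future Cache-Control header for all static files, and a version token in the URL then takes care of invalidation. A sketch only; the one-year max-age and the query-string versioning scheme are just one way to do it:

    <!-- web.config: cache static content on the client for one year -->
    <system.webServer>
      <staticContent>
        <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
      </staticContent>
    </system.webServer>

    <!-- In the page: append a version token so each release effectively busts the cache -->
    <link rel="stylesheet" href="/css/site.css?v=1.2.0.0" />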
You can also make use of the HTML5 offline web application manifest. It allows you to set up a manifest in which you define which files will be cached locally.
It is a nice, easy to understand, and broadly implemented way of avoiding having to learn about IIS and HTTP caching.
http://www.w3schools.com/html/html5_app_cache.asp
(you should totally read up about those things)
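A minimal sketch of what that looks like (the file names are just examples). The page opts in via the manifest attribute:

    <!DOCTYPE html>
    <html manifest="site.appcache">
      ...
    </html>

and site.appcache lists the files the browser should cache locally (the first line must be exactly CACHE MANIFEST):

    CACHE MANIFEST
    # v1 - change this comment to force clients to refresh the cached files
    CACHE:
    /css/site.css
    /js/jquery.min.js
    /images/logo.png
    NETWORK:
    *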
I use both the Telerik and Microsoft CDNs, for their respective AJAX toolkits. Both work great 99% of the time. However, I was working out of two different cafes recently and went to visit my site: the first cafe did not permit the Telerik CDN, while the second did not allow requests to the Microsoft CDN. I can actually see the status bar in IE show "ajax.microsoft.com" as the file being retrieved while I am waiting for the website to load.
Lack of CDN access seems to be a very unusual problem. In fact, I cannot fathom why such URL requests would be blocked when the cafes seem to permit pretty much everything else. Any reason? Could this be an availability issue at the respective CDNs themselves (i.e. how reliable are these CDNs)? And of course, is there a recommended fix, apart from discarding CDN use?
Update: I can now connect to my app. So my lack of access to ajax.microsoft.com was most likely a temporary lack of MS CDN availability, and not any domain blocking.
All you need to do is implement a fallback to your local server, explained here: http://happyworm.com/blog/2010/01/28/a-simple-and-robust-cdn-failover-for-jquery-14-in-one-line/
The Telerik online demos use the CDN by default, but fall back to embedded resources if the Amazon cloud service is unavailable. If you have the RadControls for ASP.NET AJAX installed locally, then you can see the source of the demo site. The files that you need to review are ~/Common/Footer.ascx and its code file ~/App_Code/QuickStart/Footer.cs, as well as ~/App_Code/QuickStart/QsfCdnConfigurator.cs and ~/App_Code/QuickStart/HeadTag.cs. The Footer files set a cookie using JavaScript, depending on whether the CDN is available, and the last two files provide support for reading the cookie on the server side and setting the appropriate configuration for the script manager.
We have a web site in the domain, let's call it http://website.com. It is necessary to implement the same look-and-feel on another web site (https://custom.website.com). As you can see, the 2nd site is in a sub-domain of the 1st one, but it is secured (it uses https).
To achieve the same look-and-feel, the same DLLs are used in both web sites (these DLLs contain functionality for menus, JavaScript, etc.). But the 2nd web site uses images and some CSS files from the 1st one. For example, in order to display "Logo.png", instead of the usual "~/Images/Logo.png" the following path is rendered into the HTML: "http://website.com/Images/Logo.png".
Everything was done in the local environment and worked perfectly (http://localhost/ referred to http://website.com).
BUT, when the web site was deployed to the 'real' (development) environment, we got a surprise. IE notifies:
webpage contains content that will not be delivered using a secure HTTPS connection
I see one option to resolve the issue: we could include the images in the secured web site and use them locally, but in that case we would need to redeploy whenever something changes on the main web site.
Question: is there any workaround so that, from the secured web site, we can use images that are located on the non-secured one?
Thanks. Any thoughts are welcome.
P.S. I am using ASP.NET 3.5, web sites are hosted under Windows 2008
You need to host your images, CSS and scripts (more generally, whatever is loaded by the web page) on your HTTPS site too, to avoid mixed content.
Depending on the level of security and isolation you could set up a shared virtual directory for the two websites that point to the same physical location.
For example, create a directory at C:\inetpub\shared-static and create a virtual directory /static under each website pointing to C:\inetpub\shared-static. From there, both websites can refer to an image like ~/static/logo.png as necessary for shared content. When a new (or replaced) file is placed in the directory, both websites will refer to the same file.
If you can enable https support on the main web site, you could use https for the image URLs instead of http.
An alternative, as others have suggested, is to sync the images, or to use a shared location when serving them.
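If the main site can serve the same files over HTTPS, a minimal sketch of the two URL forms that avoid the mixed-content warning (reusing the Logo.png example from the question):

    <!-- absolute https URL: always loaded securely, whichever site renders the page -->
    <img src="https://website.com/Images/Logo.png" alt="Logo" />

    <!-- protocol-relative URL: uses whatever scheme the current page was loaded with -->
    <img src="//website.com/Images/Logo.png" alt="Logo" />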
I see a workaround:
on the 2nd web site, implement functionality that will check (once per day or per hour) whether its own copies of the images are the latest, and update them when necessary...
That is some work, but with such a solution the web site will be easier to support.
If you see a better option, please let me know.
Thanks.