CDN or download into directory?

Haven't found many resources on this. If I wanted to use jQuery in my app, for example, would it be more beneficial to download jQuery into my project's directory, or to link to the Google CDN?

CDN - Less Latency
Using a CDN helps bring resources closer to the user by caching them in multiple locations around the world. Once those resources are cached, a user's request only needs to travel to the closest Point of Presence to retrieve that data, instead of going back to the origin server each time.

Without CDN - Offline testing on localhost
If you serve your libraries locally while in development, you can test your website on your local machine without network connectivity.

Without CDN - Monkey patching
You can modify a library to fix issues that would otherwise break your software, and host the patched copy yourself. If you use a CDN, you have to use the original library's code, thus losing those fixes.
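
For concreteness, the two options look like this in a page (the jQuery version and the local path are illustrative):

    <!-- Option 1: reference jQuery from the Google CDN -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>

    <!-- Option 2: reference a copy downloaded into the project's directory -->
    <script src="/js/jquery-3.6.0.min.js"></script>

A fallback pattern that combines the two appears under "Lack of CDN availability" below.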

Related

Will Multiple CDNs Speed Up Page Load?

I noticed the DJI Store website uses multiple CDN domains to serve static assets.
Web page:
https://store.dji.com/?site=brandsite&from=nav
CDNs:
https://asset2.djicdn.com/assets/v2/common/14292283_1302296159810439_4324228009709332653_n.jpg
https://asset4.djicdn.com/assets/v2/build/app-0f0a05d6b0cd030cf68ca92e67816241.css
https://product2.djicdn.com/uploads/sku/covers/31314/small_55e19eff-2d6a-4d75-8e63-b9b5822fd298.png
Just wondering, what is the purpose of using more than one CDN domain? More parallel downloads?
If so, how many domains should I use?
This is no longer a recommended way to load assets from a CDN. It's better to use a single CDN and load as many resources through it as possible, so that the HTTP/2 connection can be reused and the page has to create fewer connections.
Back in the HTTP/1.1 era, it was common practice to load resources over multiple hosts to parallelize their download. This was a helpful practice at the time and could significantly speed up rich web pages for users with more bandwidth. The technique was called Domain Sharding.
With HTTP/2, it is no longer required and is now considered bad practice. The above store seems to have been built in the HTTP/1.1 era and optimized for the browsers of that time.
There is another term, "Incidental Domain Sharding", which means web development practices have resulted in developers unnecessarily relying on more and more hosts to deliver their content. For example, sites these days load fonts from Google Fonts, public libraries from some JavaScript CDN, and host their private content on a private CDN. This forces the browser to open several connections that could otherwise be avoided, and prevents it from taking advantage of HTTP/2 multiplexing. There are possible solutions like PageCDN and EasyFonts that can collectively help achieve maximum performance out of the available technologies, since they load all the page's resources over a single CDN.
If you want to see Incidental Domain Sharding in action, have a look at the source code of http://www.piston.rs/dyon-tutorial/ They are loading resources over 5 CDNs, and their private content (website CSS and JS files) still needs a private CDN.
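
To make the contrast concrete, here is a rough sketch of consolidating sharded resources onto one host (all hostnames are illustrative):

    <!-- Before: incidental domain sharding - three hosts, three connections -->
    <link rel="stylesheet" href="https://fonts.example-fontcdn.com/roboto.css">
    <script src="https://libs.example-jscdn.com/jquery.min.js"></script>
    <img src="https://asset2.example-privatecdn.com/hero.jpg">

    <!-- After: one CDN host, so a single HTTP/2 connection is multiplexed -->
    <link rel="stylesheet" href="https://cdn.example.com/fonts/roboto.css">
    <script src="https://cdn.example.com/libs/jquery.min.js"></script>
    <img src="https://cdn.example.com/img/hero.jpg">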

2 Servers for website and media files (WordPress Plugin Needed)

I need to host media files on one server (with a different domain name) and have my website files on the other. All my websites are WordPress-based, and I need all current media files moved to the other domain/server. I cannot do this manually, as there are over 10,000 media files in total. Is there any plugin that allows this? Or any other way to do it? I am doing this to reduce the average CPU load / memory requirement. Thanks
If you are having performance issues with WordPress, my first recommendation would be to make sure you are using a caching plugin such as WP Super Cache or W3 Total Cache (I happen to use the latter). You will need to use a persistent caching option as well for the best performance, such as Memcached.
I can only speak to W3TC, but it does have an option to serve your static content via a CDN such as Rackspace Cloud Files. When configured properly, it will move files from your media library to the CDN and replace the links in your content with the proper URLs.
If performance is your main interest, you should also look at serving your site via Nginx and php-cgi, managed through something like spawn-fcgi. There are some caveats to using Nginx vs. Apache, but once tuned, the performance is far superior. You can find a guide to the configuration on the WordPress site.
Lastly, you can set up a reverse proxy from your front-end server to point to static files hosted on a different server - the content just passes through your front-end server. This can be accomplished using Apache or Nginx, but performance will be better with the latter. Please see my site for an example of an Nginx reverse proxy - you would just want to proxy requests for your static files location to a different back-end server.
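
As a rough illustration, an Nginx reverse-proxy block for that last suggestion might look like this (hostnames and paths are hypothetical):

    # Front-end server: proxy requests for media files to a second server,
    # so visitors only ever talk to the main domain.
    server {
        listen 80;
        server_name www.example.com;
        root /var/www/html;

        # The WordPress media library lives on the other server.
        location /wp-content/uploads/ {
            proxy_pass http://media.example.com;
            proxy_set_header Host media.example.com;
        }
    }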

Dependencies that must be done away with for using CDN

I wanted to know: is there some special requirement for a website to make use of a CDN?
I mean, is there some special scheme (or at least some considerations) around which your website must be built right from the start to make use of a CDN (content delivery network)?
Is there anything that can stop a website from making use of a CDN, for example the way it references content files, static file paths, or anything else conceivable?
Thanks
It depends.
You have two kinds of CDN services:
Services like AWS CloudFront that require you to upload the files to some special place that they read from (e.g. AWS S3) - in this case you need to have a step in your build process to correctly upload the files, and to handle the addresses somehow inside your application.
Services like Akamai that just need you to change and tweak your DNS records so they serve requests to your users instead of you - in this case you would have two domains (image.you.com and image2.you.com), with image.you.com pointing to Akamai and image2.you.com pointing to the original source of the file. Whenever a user requested an image, Akamai would come to you through the "back door", fetch it, and start serving that file from then on.
If you use the second approach it's really simple to have a CDN supporting your application.
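
For illustration, the DNS records for the second approach might look roughly like this (names and addresses are hypothetical):

    ; CDN-fronted hostname: a CNAME handing traffic to the CDN's edge network
    image.you.com.    IN  CNAME  you.com.edgesuite.net.

    ; Origin hostname: points straight at your own server, used by the CDN to fetch files
    image2.you.com.   IN  A      203.0.113.10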
There are a whole bunch of concerns when dealing with CDN solutions.
The first one is that a CDN can't serve a dynamic page - i.e. a page that is unique to every user. Typically, that includes PHP, ASPX, JSP, Ruby on Rails, etc. - so if you're hoping to support lots of users for a dynamic site, you have to come up with another solution. Some CDN providers support "Edge Side Includes" - this allows you to glue dynamic pages together with cached content on the CDN, but it creates quite a complex application.
Of course, even on a dynamic application, a CDN can still serve static files - images, stylesheets, javascript files, videos etc.
@Tucaz explains the two major options here (actually, Akamai also provides a "filestore" CDN option). If you select the second option - effectively, the CDN becomes a caching reverse proxy in front of your website - it makes sense to tweak the cache headers on your HTTP server and tell the CDN to honour those. Make sure you set your .ASPX files to not cache!
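
As a rough sketch, the origin's cache headers for such a setup might look like this (paths and lifetimes are illustrative):

    # Static assets: let the CDN cache them for a week
    GET /styles/site.css   ->  Cache-Control: public, max-age=604800
    GET /js/app.js         ->  Cache-Control: public, max-age=604800

    # Dynamic pages: tell the CDN not to cache them at all
    GET /checkout.aspx     ->  Cache-Control: private, no-cache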

Storing CSS files on Windows Azure

I'm working on my first Windows Azure application and I'm wondering how people go about managing CSS & JS files within their apps?
At the moment my CSS and JS are just part of my cloud app so every time I make a small CSS change the app needs to be redeployed which isn't ideal. Is it best practice to remove those components from the cloud app and deploy them elsewhere? If that is the case where is the best place to store them? Inside a cloud storage account using blobs or something else?
Bear in mind that if you put your assets in storage, each time there is a page request that includes a link to storage, it counts as a storage transaction. Currently, they are priced at $0.01 per 10,000, so it would take a while to be costly. But if you have 2 CSS files, 2 JS files and 4 images on a given page, that's 8 transactions per page request.
If you get 1,000 page requests per day, that's 8,000 transactions per day, or 240,000 per month over 30 days; 240,000 / 10,000 * $0.01 = $0.24. Not a big deal if your page requests stay low. But if your site gets even remotely higher traffic, it can start to add up quickly.
Yeah, throw your assets into a public container in storage and build absolute URLs to the storage account container from the web app (use a helper method). This way you can just sparsely upload assets as they change.
Next step would be to expose the container over the CDN to get the distributed edge caching too.
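
For example, a page might then reference its assets along these lines (the account name, container, and CDN endpoint are hypothetical):

    <!-- Served straight from blob storage -->
    <link rel="stylesheet" href="https://myaccount.blob.core.windows.net/assets/site.css">

    <!-- Served through the CDN once the container is exposed over it -->
    <link rel="stylesheet" href="https://myendpoint.azureedge.net/assets/site.css">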
We store our JS and CSS in blobs with the Azure CDN and it works great.
A completely different 'solution' might be to check out:
http://blogs.msdn.com/b/windowsazure/archive/2011/07/12/now-available-windows-azure-accelerator-for-web-roles.aspx
I personally haven't used them yet but they're supposed to let you alter/update your web role projects without needing to redeploy the entire thing.
I am not sure this will work as easily as you might expect for CSS files if they are referenced from a different domain.
CSS files that are hosted on a different domain might be blocked by the browser. See Cross-Origin Resource Sharing: http://www.w3.org/TR/cors/ However, I am not sure whether this is widely implemented.
An alternative might be to use a handler that forwards requests for the CSS files on your server to the blob.
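
A hedged sketch of such a handler in ASP.NET (the class name, blob account, and paths are all hypothetical):

    // Forwards requests for CSS files to blob storage, so the stylesheet
    // appears to come from the site's own domain.
    using System.Net;
    using System.Web;

    public class CssProxyHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // Map the local request path onto the blob container (illustrative).
            string blobUrl = "https://myaccount.blob.core.windows.net/assets"
                             + context.Request.Path;

            using (var client = new WebClient())
            {
                byte[] css = client.DownloadData(blobUrl);
                context.Response.ContentType = "text/css";
                context.Response.BinaryWrite(css);
            }
        }
    }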

Lack of CDN availability

I use both the Telerik and Microsoft CDNs for their respective AJAX toolkits, and both work great 99% of the time. However, I was working out of two different cafes recently and went to visit my site: the first cafe did not permit the Telerik CDN, while the second did not allow the Microsoft CDN as a URL request. I could actually see "ajax.microsoft.com" in the IE status bar as the file being retrieved while I waited for the website to load.
Lack of CDN access seems to be a very unusual problem. In fact, I cannot fathom why such URL requests would be blocked when the cafes seem to permit pretty much everything else. Any reason? Could this be an availability issue at the respective CDNs themselves (i.e. how reliable are these CDNs)? And of course, is there a recommended fix, apart from discarding CDN use?
Update: I can now connect to my app. So my lack of access to ajax.microsoft.com was most likely a temporary lack of MS CDN availability, and not any domain blocking.
All you need to do is implement a fallback to your local server, explained here: http://happyworm.com/blog/2010/01/28/a-simple-and-robust-cdn-failover-for-jquery-14-in-one-line/
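
The linked technique boils down to a one-line check after the CDN script tag (the version and local path here are illustrative):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
    <script>
      // If the CDN request failed, window.jQuery is undefined; load the local copy.
      window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
    </script>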
The Telerik online demos use the CDN by default, but fall back to embedded resources if the Amazon cloud service is unavailable. If you have the RadControls for ASP.NET AJAX installed locally, you can see the source of the demo site. The files to review are ~/Common/Footer.ascx and its code file ~/App_Code/QuickStart/Footer.cs, plus ~/App_Code/QuickStart/QsfCdnConfigurator.cs and ~/App_Code/QuickStart/HeadTag.cs. The Footer files set a cookie using JavaScript depending on whether the CDN is available, and the last two files provide support for reading the cookie on the server side and setting the appropriate configuration for the script manager.
