Optimizing Page Load: multiple CDN files or single CDN file? - pagespeed

I'm trying to optimize my page loads. Currently, I have multiple resources being pulled from various CDNs (e.g. jQuery). In all, I have about 10 different JS files and 10 different CSS files. Around 50-75% are available on CDNs.
When I run PageSpeed/YSlow on it (via GTmetrix), it complains that I have too many resources and that I should combine the files. I combined the JS files into a single file and did the same for the CSS files (later, I will serve these from a CDN). When I re-ran the tests, my page load time went from 2.19s to 1.87s. It would appear that combining the files and serving them locally is faster than separate files served from CDNs.
I could not find any definitive tests showing that combining files and serving them locally is superior to serving separate files from a CDN. I can only guess at this point that once I put the combined file on a CDN, things will get even faster.
Is combining files a superior approach?

You have two dimensions for improvement:
Combine Resources
In general fewer requests mean fewer roundtrips and better page load speeds. (Ref: Yahoo: Minimize HTTP Requests - Best Practices for Speeding Up Your Web Site)
Serve Resources via CDN
A content delivery network will typically have multiple data centers and can serve content from locations closer to your site visitors. (Ref: Use a Content Delivery Network - Best Practices for Speeding Up Your Web Site)
You can apply both principles and test to see how your site improves. I would definitely start with combining resources; you'll probably see a good gain there.
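To make the "combine" step concrete, here is a minimal sketch of a build script in Node.js; the file names are placeholders for your own resources, and in practice you would also minify the result before serving it locally or from a CDN.
    // combine.js - minimal sketch of a build step that concatenates JS files into one bundle.
    // The input and output paths below are placeholders; substitute your own resources.
    const fs = require('fs');
    const inputs = ['js/jquery.plugin.js', 'js/menu.js', 'js/forms.js'];
    const bundle = inputs
        .map((file) => fs.readFileSync(file, 'utf8'))
        .join(';\n'); // the semicolon guards against files that omit their trailing one
    fs.writeFileSync('js/bundle.js', bundle);
    console.log('Wrote js/bundle.js (' + bundle.length + ' bytes) from ' + inputs.length + ' files');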

Related

Will Multiple CDN Speed Up Page Load

I noticed the DJI Store website uses multiple CDN domains to serve static elements.
Web page:
https://store.dji.com/?site=brandsite&from=nav
CDNs:
https://asset2.djicdn.com/assets/v2/common/14292283_1302296159810439_4324228009709332653_n.jpg
https://asset4.djicdn.com/assets/v2/build/app-0f0a05d6b0cd030cf68ca92e67816241.css
https://product2.djicdn.com/uploads/sku/covers/31314/small_55e19eff-2d6a-4d75-8e63-b9b5822fd298.png
I'm just wondering: what is the purpose of using more than one CDN domain? More parallel downloads?
If so, how many domains should I use?
This is no longer a recommended way to load assets from a CDN. It's better to use a single CDN and load as many resources through it as possible, so that the HTTP/2 connection can be reused and the page has to open fewer connections.
Back in HTTP/1.1 times, it was common practice to load resources over multiple hosts to parallelize their download. This was helpful at the time and could significantly speed up rich web pages for users with enough bandwidth. The technique was called Domain Sharding.
With HTTP/2, it is no longer required and is now seen as a bad practice. The above store seems to have been built in the HTTP/1.1 era and optimized for the browsers of that time.
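To see the connection reuse this answer relies on, here is a rough sketch using Node's built-in http2 module; the host and paths are placeholders for any HTTP/2-capable CDN.
    // multiplex.js - rough illustration of HTTP/2 multiplexing: several requests, one connection.
    // The host and paths are placeholders; point them at any server that speaks HTTP/2.
    const http2 = require('http2');
    const client = http2.connect('https://cdn.example.com');
    client.on('error', (err) => console.error(err));
    const paths = ['/app.css', '/app.js', '/logo.png'];
    let pending = paths.length;
    for (const path of paths) {
        // Each request becomes a stream on the same TCP/TLS connection - no extra handshakes.
        const req = client.request({ ':path': path });
        req.on('response', (headers) => console.log(path, headers[':status']));
        req.on('close', () => { if (--pending === 0) client.close(); });
        req.end();
    }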
There is another term, "Incidental Domain Sharding", which means that web development practices have resulted in developers unnecessarily relying on more and more hosts to deliver their content. For example, sites these days load fonts from Google Fonts, public libraries from some JavaScript CDN, and host their private content on a private CDN. This requires the browser to open several connections that could otherwise be avoided, and prevents it from taking full advantage of HTTP/2 multiplexing. There are possible solutions like PageCDN and EasyFonts that collectively can help get the maximum performance out of the available technologies, since they load all the page resources over a single CDN.
If you want to see Incidental Domain Sharding in action, have a look at the source code of http://www.piston.rs/dyon-tutorial/ - they are loading resources from 5 CDNs, and their private content (the site's CSS and JS files) still needs a private CDN.

Is it feasible to use SSI (Combine CSS/JS files using htaccess) to speedup website?

I went through this article today
http://isitvivid.com/blog/combining-your-cssjs-files-with-htaccess
and found that we can use an .htaccess file to combine CSS and JS files, and that this technique is called Server Side Includes (SSI).
I am already using gzip compression, Keep-Alive, and caching for my website. Will using SSI to combine CSS/JS help speed up the website?
I'm sure you could speed up your site a little bit by combining assets, but unless you're loading hundreds of things synchronously you shouldn't expect more than a micro-optimization. You're more likely to get speed benefits by using the defer attribute on your script tags and minifying your CSS files.
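This is not the .htaccess/SSI mechanism from the article, but the same combining idea sketched as a small Node/Express route (the file paths and route name are invented, and Express is assumed to be installed): one request returns the concatenation of several CSS files with a cache header.
    // combined-css.js - sketch of server-side concatenation, an alternative to the SSI approach above.
    // File paths and the route are hypothetical.
    const express = require('express');
    const fs = require('fs');
    const app = express();
    const cssFiles = ['css/reset.css', 'css/layout.css', 'css/theme.css'];
    app.get('/combined.css', (req, res) => {
        const body = cssFiles.map((f) => fs.readFileSync(f, 'utf8')).join('\n');
        res.set('Content-Type', 'text/css');
        res.set('Cache-Control', 'public, max-age=86400'); // let browsers keep the combined file for a day
        res.send(body);
    });
    app.listen(3000);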

Dependencies that must be done away with for using CDN

I wanted to know: is there some special requirement for a website to make use of a CDN?
I mean, is there some special scheme (or at least considerations) on which your website must be built right from the start to make use of a CDN (content delivery network)?
Is there anything that can stop a website from making use of a CDN, for example the way it references content files, static file paths, or anything else conceivable?
Thanks
It depends.
You have two kinds of CDN services:
Services like AWS CloudFront that require you to upload the files to some special place that they read from (e.g. AWS S3). In this case you need to have a step in your build process to correctly upload the files and handle the addresses somehow inside your application (a small sketch of such an upload step follows below).
Services like Akamai that just need you to tweak your DNS records so they serve the requests to your users instead of you. In this case you would have two domains (image.you.com and image2.you.com), with image.you.com pointing to Akamai and image2.you.com pointing to the original source of the file. Whenever a user requests an image that Akamai doesn't have yet, it comes to you through the "back door" (image2.you.com), fetches it, and keeps serving that file from then on.
If you use the second approach it's really simple to have a CDN supporting your application.
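For the first kind of service, the "step in your build process" could be as simple as the sketch below; the bucket name and file list are invented, the aws-sdk package is assumed to be installed, and credentials come from your normal AWS configuration.
    // upload-assets.js - sketch of a build-step upload for a CloudFront/S3-style CDN.
    const fs = require('fs');
    const AWS = require('aws-sdk');
    const s3 = new AWS.S3();
    const files = ['js/bundle.js', 'css/bundle.css']; // placeholder asset list
    for (const file of files) {
        s3.putObject({
            Bucket: 'my-site-assets', // hypothetical bucket the CDN reads from
            Key: file,
            Body: fs.readFileSync(file),
            CacheControl: 'public, max-age=31536000',
        }, (err) => console.log(err ? 'failed: ' + file : 'uploaded: ' + file));
    }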
There are a whole bunch of concerns when dealing with CDN solutions.
The first one is that a CDN can't serve a dynamic page - i.e. a page that is unique to every user. Typically, that includes PHP, ASPX, JSP, RubyOnRails etc. - so if you're hoping to support lots of users for a dynamic site, you have to come up with another solution. Some CDN providers support "Edge Side Includes" - this allows you to glue dynamic pages together with cached content on the CDN, but this creates quite a complex application.
Of course, even on a dynamic application, a CDN can still serve static files - images, stylesheets, javascript files, videos etc.
@Tucaz explains the two major options here (actually, Akamai also provides a "filestore" CDN option). If you select the second option - effectively, the CDN becomes a caching reverse proxy in front of your website - it makes sense to tweak the cache headers on your HTTP server and tell the CDN to honour those. Make sure you set your .ASPX files to not cache!
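A rough sketch of that header tweaking, using plain Node at the origin (the URL prefix and values are assumptions): long-lived caching for static assets the CDN may keep, and an explicit no-store for dynamic pages so the CDN passes them through.
    // cache-headers.js - sketch of origin cache headers for a CDN acting as a caching reverse proxy.
    // The '/static/' prefix is a placeholder for wherever your static assets live.
    const http = require('http');
    http.createServer((req, res) => {
        if (req.url.startsWith('/static/')) {
            // Static assets: safe for the CDN and browsers to cache for a long time.
            res.setHeader('Cache-Control', 'public, max-age=31536000');
        } else {
            // Dynamic, per-user pages (e.g. .aspx): tell the CDN not to cache them at all.
            res.setHeader('Cache-Control', 'private, no-store');
        }
        res.end('hello');
    }).listen(8080);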

Javascript Script Combining and Caching

I am building an AJAX-intensive web application (using ASP.NET, jQuery, and WCF web services) and am looking into building an HTTP handler that handles script combining and compression for my JavaScript files and my CSS files. I know not combining the scripts is generally a less preferred approach, and I'm sure combining is probably the way I will end up going, but my question is this...
Since so many of my JS files are due to the controls I use, don't they get cached by the browser after the first load anyway? Since so many of these controls can be found on many of the pages of my web application, is it actually faster to combine all of my scripts and serve that one file (which will vary for every page), or to serve the individual files, which will get cached? I guess what I'm getting at is: by enabling script combining, am I now losing part of the caching ability of the browser? I know I can cache the combined script, but the combined script will be different for every page, whereas with the individual control scripts each one will be cached and the number of new scripts will be minimal for each page call.
Does this make any sense? Thoughts?
The fewer JS files you serve, the faster your pages will be, due to a smaller number of round trips to the server. I would manually put all the common JS code into one file (or as few files as possible), all the CSS code into one file, etc., and not worry about using a handler to combine the files. The handler is going to take processing time to combine the files, so you are going to pay that penalty as well. You can turn on gzip compression in IIS and have it handle that for you. I would run something like YUI Compressor on the JavaScript files used in production.
If the handler changes the file contents from page to page, browsers won't be able to cache it. If you are using SSL this point will be moot though as the browser won't cache the files anyway.
EDIT
I've been corrected: some browsers (like FF) can cache SSL content, but not all.
As others mentioned: minify, gzip, and turn on caching (set an expiry time and make sure you support ETags) on the one static JS file and the one static CSS file you have. On top of this, it's recommended to load your CSS files as early as possible and your JS files as late as possible (JS file loading blocks other downloads, and the browser renders the page faster if it gets the CSS as soon as possible). Sprites also help if you have many small images/icons. Loading static content from subdomains will let the browser make more simultaneous downloads, and you could drop all your cookies for those subdomains to lower the HTTP header size.
You could consult YSlow for performance analysis, it's a great tool!
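A compact sketch of part of that checklist, assuming the combined script is called bundle.js (minification is left to a separate tool, and a production server would also check the Accept-Encoding header before sending gzip):
    // static-js.js - serve one combined JS file gzipped, with an ETag and a far-future cache lifetime.
    const http = require('http');
    const fs = require('fs');
    const zlib = require('zlib');
    const crypto = require('crypto');
    const source = fs.readFileSync('bundle.js'); // placeholder for your combined script
    const gzipped = zlib.gzipSync(source);
    const etag = '"' + crypto.createHash('md5').update(source).digest('hex') + '"';
    http.createServer((req, res) => {
        if (req.headers['if-none-match'] === etag) {
            res.writeHead(304); // the browser already has this exact version
            return res.end();
        }
        res.writeHead(200, {
            'Content-Type': 'application/javascript',
            'Content-Encoding': 'gzip',
            'ETag': etag,
            'Cache-Control': 'public, max-age=31536000',
        });
        res.end(gzipped);
    }).listen(8080);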
generally a less preferred
Is it for live JS? I'm thinking of JavaScriptMVC, which compresses all the code into one file when compiled for production, not development... It's a heavyweight, I believe.
Usually it's better to combine all scripts. In this case you'll reduce HTTP overhead. Minified control scripts are usually quite small. In the rare case that you are using a quite large control, you could leave it out of the combined main JS file.

will the use of multiple subdomains speed up my website?

I am considering moving my images to a subdomain of my website, and I read somewhere that moving the scripts to a different one would make it even faster! Is it really true? Or should I just leave things as they are if I am not considering a real CDN?
Yes and no. The site itself won't be faster, but it may load faster in most browsers and therefore seem faster.
The reason is that most browsers limit themselves to a set maximum of concurrent connections per domain. Say you have your site on www.mysite.com. When your browser tries to download your HTML, CSS, scripts, and images, it may need to download 20-30 files from the server. Since the browser limits itself to, say, 4 concurrent connections to your domain, it will only download 4 files at a time.
Now if you serve your CSS files from a separate subdomain css.mysite.com, your images from images.mysite.com, and your scripts from scripts.mysite.com, your browser can open 4 concurrent connections to each of the domains. Hence it can download up to 16 files at the same time. If your bandwidth allows it, this may cause the page to load faster.
So your site may appear to be faster for the visitor, but the reason will be loading times, not any speedup of code or database access.
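The per-host connection cap is easy to mimic outside a browser; this small Node sketch (the host and request count are arbitrary) caps an http.Agent at 4 sockets, so the remaining requests queue the same way asset downloads do against a single hostname.
    // host-limit.js - sketch of a per-host connection cap, similar to what browsers enforce.
    const http = require('http');
    // Cap connections to a single host at 4, roughly what older browsers did per hostname.
    const agent = new http.Agent({ keepAlive: true, maxSockets: 4 });
    for (let i = 0; i < 12; i++) {
        http.get({ host: 'example.com', path: '/?n=' + i, agent }, (res) => {
            res.resume(); // discard the body; we only care about scheduling
            console.log('finished request', i);
        });
    }
    // Only 4 requests run at a time; the other 8 wait in the agent's queue.
    // Spreading assets across extra hostnames used to be a way around exactly this limit.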

Resources