I noticed that the DJI Store website uses multiple CDN domains to serve static assets.
Web page:
https://store.dji.com/?site=brandsite&from=nav
CDNs:
https://asset2.djicdn.com/assets/v2/common/14292283_1302296159810439_4324228009709332653_n.jpg
https://asset4.djicdn.com/assets/v2/build/app-0f0a05d6b0cd030cf68ca92e67816241.css
https://product2.djicdn.com/uploads/sku/covers/31314/small_55e19eff-2d6a-4d75-8e63-b9b5822fd298.png
Just wondering: what is the purpose of using more than one CDN domain? More parallel downloads?
If so, how many domains should I use?
This is no longer the recommended way to load assets from a CDN. It's better to use a single CDN and load as many resources through it as possible, so that the HTTP/2 connection can be reused and the page has to open fewer connections.
Back in the HTTP/1.1 days, it was common practice to load resources over multiple hosts to parallelize their download. This was helpful at the time and could significantly speed up rich web pages for users with more bandwidth. The technique was called Domain Sharding.
With HTTP/2, it is no longer required and is now considered a bad practice. The store above seems to have been built in the HTTP/1.1 era and optimized for the browsers of that time.
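If you want to check whether an asset host already speaks HTTP/2 before deciding to consolidate, here is a minimal Node.js/TypeScript sketch that asks the server which protocol it negotiates via ALPN. The hostname is taken from the question; this is just one way to probe it, not an official tool.

```typescript
// Minimal sketch: ask a host which protocol it negotiates via ALPN.
// "h2" means HTTP/2, so sharding across extra hostnames buys little.
import * as tls from "node:tls";

function negotiatedProtocol(host: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const socket = tls.connect(
      { host, port: 443, servername: host, ALPNProtocols: ["h2", "http/1.1"] },
      () => {
        resolve(socket.alpnProtocol || "none"); // "h2", "http/1.1", or "none"
        socket.end();
      }
    );
    socket.on("error", reject);
  });
}

// Example: one of the asset hosts from the question.
negotiatedProtocol("asset2.djicdn.com").then((proto) => console.log(proto));
```

If every asset host already negotiates "h2", sharding across them mostly costs extra DNS lookups and TLS handshakes.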
There is another term, "Incidental Domain Sharding", which means that web development practices have resulted in developers unnecessarily relying on more and more hosts to deliver their content. For example, sites these days load fonts from Google Fonts, public libraries from some JavaScript CDN, and their private content from a private CDN. This requires the browser to open several unnecessary connections that could otherwise be avoided, and prevents it from taking full advantage of HTTP/2 multiplexing. There are possible solutions like PageCDN and EasyFonts that, used together, can help get the maximum performance out of the available technologies, since they load all of the page's resources over a single CDN.
If you want to see Incidental Domain Sharding in action, have a look at the source code of http://www.piston.rs/dyon-tutorial/. They are loading resources over five CDNs, and their private content (the site's CSS and JS files) still needs a private CDN.
Related
I'm trying to optimize my page loads. Currently, I have multiple resources being pulled from various CDNs (e.g. jQuery, etc.). In all, I have about 10 different JS files and 10 different CSS files. Around 50-75% are available on CDNs.
When I run PageSpeed/YSlow on it (via GTmetrix), it complains that I have too many resources and that I should combine the files. I combined the JS files into a single file and did the same with the CSS files (later, I will serve these from a CDN). When I re-ran the tests, my page load time went from 2.19s to 1.87s. It would appear that combining the files and serving them locally is faster than serving separate files from CDNs.
I could not find any definitive tests showing that combining files and serving them locally is superior to serving separate files from a CDN. I can only guess at this point that once I put the combined file on a CDN, things will get even faster.
Is combining files a superior approach?
You have two dimensions for improvement:
Combine Resources
In general fewer requests mean fewer roundtrips and better page load speeds. (Ref: Yahoo: Minimize HTTP Requests - Best Practices for Speeding Up Your Web Site)
Serve Resources via CDN
A content delivery network will typically have multiple data centers and can serve content closer to your site visitor. (Ref: Use a Content Delivery Network - Best Practices for Speeding Up Your Web Site)
You can apply both principles and test to see how your site improves. I would definitely start with combining resources; you'll probably see a good gain there.
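For illustration, the "combine resources" step can be as simple as a tiny build script that concatenates the files into one bundle. This is only a sketch with hypothetical file names; a real build would also minify and fingerprint the output.

```typescript
// Sketch of a "combine resources" build step: concatenate several JS files
// into one bundle so the page makes a single request. File names are hypothetical.
import { readFileSync, writeFileSync } from "node:fs";

const sources = ["jquery.plugins.js", "carousel.js", "analytics.js"];

const bundle = sources
  .map((file) => `/* ${file} */\n` + readFileSync(file, "utf8"))
  .join(";\n"); // the extra semicolons guard against files that omit a trailing one

writeFileSync("site.bundle.js", bundle);
// Reference site.bundle.js (ideally minified and served from your CDN) instead
// of the ten individual files.
```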
I wanted to know: is there some special requirement for a website to make use of a CDN?
I mean, is there some special scheme (or at least considerations) around which your website must be built right from the start to make use of a CDN (content delivery network)?
Is there anything that can stop a website from making use of a CDN, for example the way it references content files, static file paths, or anything else conceivable?
Thanks
It depends.
You have two kinds of CDN services:
Services like AWS CloudFront that require you to upload the files to some special place that they read from (e.g. AWS S3). In this case you need a step in your build process to correctly upload the files and handle the addresses somehow inside your application.
Services like Akamai that just need you to tweak your DNS records so they serve the requests to your users instead of you. In this case you would have two domains (image.you.com and image2.you.com), with image.you.com pointing to Akamai and image2.you.com pointing to the original source of the file. Whenever a user requests an image that Akamai doesn't have yet, Akamai comes to your origin through the "back door" (image2.you.com), fetches it, and keeps serving that file from then on.
If you use the second approach it's really simple to have a CDN supporting your application.
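As a rough sketch of the first ("push") kind of service, a build step might upload a fingerprinted asset to S3 so CloudFront can serve it. The bucket name, region, paths, and cache settings below are assumptions for illustration, not anything AWS requires.

```typescript
// Sketch of the "push" style: a build step uploads a fingerprinted asset to S3
// so CloudFront can serve it. Bucket, region, and paths are hypothetical.
import { readFileSync } from "node:fs";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

async function pushAsset(localPath: string, key: string): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-static-assets", // hypothetical bucket that CloudFront reads from
      Key: key,                   // e.g. a fingerprinted name like "build/app-0f0a05d6.css"
      Body: readFileSync(localPath),
      ContentType: "text/css",
      CacheControl: "public, max-age=31536000, immutable",
    })
  );
}

pushAsset("dist/app.css", "build/app-0f0a05d6.css");
```

The application then has to reference the uploaded key from your CDN domain, which is the "handle the addresses inside your application" part.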
There are a whole bunch of concerns when dealing with CDN solutions.
The first one is that a CDN can't cache a dynamic page - i.e. a page that is unique to every user. Typically, that includes PHP, ASPX, JSP, Ruby on Rails, etc. - so if you're hoping to support lots of users on a dynamic site, you have to come up with another solution. Some CDN providers support "Edge Side Includes" - this allows you to glue dynamic pages together with cached content on the CDN, but it creates quite a complex application.
Of course, even on a dynamic application, a CDN can still serve static files - images, stylesheets, JavaScript files, videos, etc.
@Tucaz explains the two major options here (actually, Akamai also provides a "filestore" CDN option). If you select the second option - effectively, the CDN becomes a caching reverse proxy in front of your website - it makes sense to tweak the cache headers on your HTTP server and tell the CDN to honour those. Make sure you set your .ASPX files to not cache!
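As a rough sketch of that cache-header split, here is a bare Node.js server that sends long cache lifetimes for static assets and "do not store" for dynamic pages. The /static/ prefix, port, and lifetimes are assumptions; the header values are the point, whatever server actually generates your pages.

```typescript
// Rough sketch: long cache lifetimes on static assets, no caching on dynamic pages.
// A pull-through CDN that honours origin headers will cache the static branch
// and pass the dynamic branch straight to the origin.
import * as http from "node:http";

http
  .createServer((req, res) => {
    if (req.url && req.url.startsWith("/static/")) {
      // Safe for the CDN (and browsers) to cache aggressively.
      res.setHeader("Cache-Control", "public, max-age=86400");
      res.end("/* static asset body */");
    } else {
      // Per-user page: tell the CDN and browser not to store it.
      res.setHeader("Cache-Control", "private, no-store");
      res.end("<html><!-- dynamic page --></html>");
    }
  })
  .listen(8080);
```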
This discussion around gzipping, minifying, and combining the two for maximum compression made me wonder whether I can get a net performance gain by placing the jQuery library directly inside my HTML file.
I already do this for icon sprites, which I base64-encode and then load via CSS data URIs, in an effort to minimize the number of HTTP requests.
Now I ask: do you know if loading the jQuery library in its raw min+gzip form, as available on the official production download link, would be more efficient compared to loading it via the Google AJAX Libraries API hosted on Google's CDN?
TL;DR:
You should always keep common items (like a JS library) in external files, because that allows for caching on your server, in the browser, and at many nodes along the way. The gains from caching dramatically outweigh any compression or request-time savings when looking at the overall speed of a page.
Longer Version:
There are a number of things to consider for this question. Here are the ones that came off the top of my head.
1. Minifying scripts and styles is always good for speed. I don't think there's much to discuss here.
2. Compression (and decompression in the browser) has an overhead. Firing up the gzip module takes time, so for small files or binary files (images, etc.) it's usually better to skip gzip. Something large and consisting mostly of ASCII (like a JS library), however, is a good candidate for compression, so that checks out. (A small sketch of this appears at the end of this answer.)
3. Reducing HTTP requests is generally a good thing to do. Note, though, that ever since HTTP/1.1 (supported in all major browsers), a new HTTP request does not mean a new socket connection. With keep-alive, the same socket connection can serve the web page and all the affiliated files in one go. The only extra overhead on a request is the request and response headers, which can be negligible unless you have a lot of small images/scripts. Just make sure these files are served from a cookie-free domain.
4. Related to point 3, embedding content in the page does have the benefit of reducing HTTP requests, but it also has the drawback of adding to the size of the page itself. For something large like a JS library, the size of the library vastly outweighs the size of the HTTP overhead needed for an extra file.
5. Now here's the kicker: if the library is embedded in the page, it will almost never be cached. For starters, most (all?) major browsers will NOT cache the main HTML page by default. You can add a Cache-Control meta tag to the page header if you want, but sometimes that's not an option. Furthermore, even if that page is cached, the rest of your pages (which are probably also using the library) will now need to be cached as well. Suddenly you have your whole site cached on the user's machine, including many copies of the JS library, and that's probably not what you want to do to your users. On the other hand, if you have the JS library as an external file, it can be cached by itself just once, especially (as @Barmar says) if the library is loaded from a common CDN like Google's.
Finally, caching doesn't only happen in the client's browser. If you're running an enterprise-level service, you may have multiple server pools with shared and individual caches, CDN caches that have blinding speed, proxy caches on the way to your domain, and even LAN caches on some routers and other devices. So now I direct you back to the TL;DR at the top, to say that the speed-up you can gain by caching (especially in the browser) trumps any other consideration you are likely to encounter.
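To make point 2 concrete, here is a small Node.js sketch comparing gzip savings on a large, text-heavy file versus a small binary. The file names are hypothetical and the exact ratios will vary.

```typescript
// Sketch for point 2: gzip pays off on large, text-heavy files (a JS library)
// but buys little on small or already-compressed binaries. File names are hypothetical.
import { readFileSync } from "node:fs";
import { gzipSync } from "node:zlib";

for (const file of ["jquery.min.js", "icon-sprite.png"]) {
  const raw = readFileSync(file);
  const zipped = gzipSync(raw);
  const saved = (1 - zipped.length / raw.length) * 100;
  console.log(`${file}: ${raw.length} -> ${zipped.length} bytes (${saved.toFixed(1)}% saved)`);
}
```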
Hope that helps.
I have used Drupal 6 for my multilingual website. I am having site performance issues: I have enabled Drupal's cache settings, but it is still slow when moving from one page to another.
I have also used the Boost module, but it does not work well for a multilingual website.
Please tell me any other ways I can improve performance. Thanks in advance.
Your question has a whole host of answers. Instead of listing them all (which I would have to copy from several sites), here are a few of the main ones, plus some resources I'd recommend:
-> Turn off all modules you are not using
-> Turn on caching
-> Try using the memcache module
-> Enable Drupal's JS and CSS aggregation, so there are fewer files to load, meaning fewer HTTP requests
-> Use a CDN
-> GZIP contents
-> Minify JavaScript (a small sketch of this follows the list)
-> Avoid Redirects
-> Reduce Duplicate Scripts
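As a sketch of the "Minify JavaScript" item, here is one way to do it outside Drupal using the terser package. The file paths are hypothetical, and Drupal's own aggregation can handle minification for you as well.

```typescript
// Sketch of minifying a script with the terser package (one of several minifiers).
// File paths are hypothetical.
import { readFileSync, writeFileSync } from "node:fs";
import { minify } from "terser";

async function minifyFile(input: string, output: string): Promise<void> {
  const source = readFileSync(input, "utf8");
  const result = await minify(source, { compress: true, mangle: true });
  writeFileSync(output, result.code ?? source); // fall back to the original if minification fails
}

minifyFile("sites/all/themes/mytheme/scripts.js", "sites/all/themes/mytheme/scripts.min.js");
```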
Some Resources I would Recommend
http://wimleers.com/article/improving-drupals-page-loading-performance
http://drupal.org/node/326504
http://groups.drupal.org/node/85979
http://groups.drupal.org/node/195218
http://www.bootstrappingindependence.com/technology/how-to-improve-website-performance-with-drupal-php-mysql-and-apache/
http://www.vmirgorod.name/blog/tuning-drupal-performance
http://pronovix.com/blog/my-favorite-drupal-performance-hacks
http://fenix-solutions.com/blog/2009/12/09/tips-for-improving-drupal-performance/
http://drupalst.com/blog/improving-drupal-performance
I faced the same issue in one of my applications. I worked through the following steps, and they increased performance, which I verified with YSlow and Google PageSpeed.
If you are using Apache, replace it with NGINX as the web server for your Drupal site. This improves performance and reduces memory utilization when thousands of connections run concurrently. (Apache allocates memory to every additional connection, so it tends to start swapping to disk as concurrent connections increase.)
Implement a reverse proxy server. NGINX is a very popular reverse proxy server for Drupal sites. Implementing a reverse proxy server removes the burden of handling Internet traffic from your application server and allows other performance-enhancing steps: caching of static files and the use of multiple load-balanced application servers. (A toy sketch of the idea follows this list.)
Implement a CDN (content delivery network).
Implement browser-level caching by sending the appropriate caching headers from the server.
Enable compression of images, CSS, and JavaScript files.
You can use Akamai for node/page-level caching; as a result it will increase performance.
Use image sprites (css3embed), and avoid iframes if you are using them.
Index database tables (use dbtuner).
Add an Expires header.
Disable the DbLog module if not in use.
Move your script assets to the bottom of the page.
Reduce DNS lookups
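To illustrate what the reverse-proxy step buys you, here is a toy Node.js sketch of a caching proxy in front of an application server. In production you would use NGINX or Varnish as described above; the port numbers and cache rules here are assumptions for illustration only.

```typescript
// Toy caching reverse proxy: static responses from the application server
// (assumed to listen on port 8080) are kept in memory and replayed on later
// requests, so the backend only handles dynamic pages.
import * as http from "node:http";

const cache = new Map<string, { headers: http.IncomingHttpHeaders; body: Buffer }>();
const STATIC = /\.(css|js|png|jpg|gif|svg|woff2?)$/;

http
  .createServer((clientReq, clientRes) => {
    const key = clientReq.url ?? "/";

    const hit = STATIC.test(key) ? cache.get(key) : undefined;
    if (hit) {
      clientRes.writeHead(200, hit.headers); // served from the proxy, backend never sees it
      clientRes.end(hit.body);
      return;
    }

    const upstream = http.request(
      { host: "127.0.0.1", port: 8080, path: key, method: clientReq.method, headers: clientReq.headers },
      (upstreamRes) => {
        const chunks: Buffer[] = [];
        upstreamRes.on("data", (c: Buffer) => chunks.push(c));
        upstreamRes.on("end", () => {
          const body = Buffer.concat(chunks);
          if (STATIC.test(key) && upstreamRes.statusCode === 200) {
            cache.set(key, { headers: upstreamRes.headers, body }); // cache static files only
          }
          clientRes.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
          clientRes.end(body);
        });
      }
    );
    upstream.on("error", () => clientRes.writeHead(502).end());
    clientReq.pipe(upstream);
  })
  .listen(80);
```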
Hope it helps.
Thanks
You need really fast hosting; I recommend using NGINX with PHP-FPM. Even without any variable or opcode cacher, your performance will be just incredible. On my VPS I have sped up a big Drupal site to about 200 ms per page.
And of course, you must audit your site for slow queries (big Views, many blocks, or custom PHP code).
What is the benefit of using a subdomain for images on my website?
Is it about faster downloading or server caching? I noticed that big sites like Facebook and others do something like that for images, videos, etc.
It's a common recommendation for high-load, high-performance websites: serving images off different servers can improve your overall performance significantly.
Browsers generally only keep 2-5 connections to the same server, so if your webpage has 10 images, the browser will download 2 to 5 of them simultaneously, while the rest waits. Using different subdomains will "trick" the browser, and it will open additional connections, which reduces the overall page load time.
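A quick back-of-the-envelope sketch of that effect: with a fixed number of connections per hostname, images download in "waves", and a second hostname doubles the parallelism. The numbers below are illustrative only (modern browsers allow around six connections per host, and HTTP/2 removes the limitation entirely).

```typescript
// Illustrative arithmetic only: per-host connection limits make images download
// in "waves"; adding a second hostname doubles the available parallelism.
const images = 12;
const connectionsPerHost = 6; // roughly what modern browsers allow per hostname

for (const hosts of [1, 2]) {
  const parallel = hosts * connectionsPerHost;
  const waves = Math.ceil(images / parallel);
  console.log(`${hosts} host(s): up to ${parallel} parallel downloads -> ${waves} wave(s)`);
}
```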
Using a CDN for images, stylesheets, and other static content can also take some load off your web server, which is another benefit.