CloudFront CDN (No GZip) vs Webserver / Local for all CSS / JS - WordPress

I am currently in the process of running some optimizations for a WordPress site we are soon deploying. We are using W3 Total Cache for optimization on the site. I recently signed up for CloudFront as a CDN, and I feel it has definitely helped, but we are losing gzip compression on the CSS/JS files served this way.
I wanted to see if anyone has opinions on a good way to handle this. Currently I am hosting all JS/CSS from the web server, and that seems to give better performance (measured with the PageSpeed Insights add-on for Google Chrome).
This is more of a best practice question to help me get some insight on this scenario.
Anyone have any recommendations?
Thank you in advance.

You can pre-gzip the JavaScript and CSS files, upload them to S3 with a Content-Encoding: gzip header (keeping the normal Content-Type), then have CloudFront serve them out.
If you automate the process as part of your build/deployment workflow, it's pretty easy to maintain.
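For example, a small build step could handle the pre-compression and upload. The sketch below uses boto3; the bucket name, file list, and cache lifetime are placeholders rather than anything taken from the question:

```python
# pre_gzip_upload.py - gzip CSS/JS and push to S3 so CloudFront can serve them compressed.
# Assumes boto3 is installed, AWS credentials are configured, and a placeholder bucket name.
import gzip
import shutil
import boto3

BUCKET = "my-cdn-origin-bucket"  # placeholder bucket
ASSETS = {
    "style.min.css": "text/css",
    "app.min.js": "application/javascript",
}

s3 = boto3.client("s3")

for filename, content_type in ASSETS.items():
    gz_name = filename + ".gz"
    # Compress the file on disk.
    with open(filename, "rb") as src, gzip.open(gz_name, "wb") as dst:
        shutil.copyfileobj(src, dst)
    # Upload under the original key, marking the body as gzip-encoded so
    # browsers (via CloudFront) decompress it transparently.
    s3.upload_file(
        gz_name,
        BUCKET,
        filename,
        ExtraArgs={
            "ContentType": content_type,
            "ContentEncoding": "gzip",
            "CacheControl": "public, max-age=31536000",
        },
    )
```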

After spending some time testing, I have found that the best approach for our workflow is to gzip the CSS/JS assets on the server rather than serve them from CloudFront.
It is simply much faster to serve the gzipped files from the server than to serve non-gzipped files from CloudFront.
I feel the suggestion from @Ryan Parman would make perfect sense if we were not editing the CSS/JS very often. I suppose there are ways to script a sync of the CSS/JS after editing, but for the current project the approach above works like a charm and is fast!
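For reference, server-side gzip of CSS/JS usually comes down to a single compression rule. A minimal sketch for Apache, assuming mod_deflate is available (nginx has an equivalent gzip module):

```apache
# .htaccess - compress text assets on the fly (requires mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/css application/javascript text/javascript
</IfModule>
```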

Related

How to enable CDN for a subdomain with external web server?

My landing page is a statically built website hosted on Netlify.
Then I have a subdomain, which is an A record in DNS (set in Netlify).
This subdomain points to an external web server (nginx + Django), which provides a REST API and serves static content.
The question is: how can I make this subdomain use Netlify's CDN when serving static content from the Django API?
Are there any approaches to do it?
Netlify advises against putting any CDN in front of it, as it already provides one; there would also be a problem with the SSL certificate.
I tried my best to look for such a question, but I couldn’t pick the right terms, I suppose, so I didn’t find it.
I would be very grateful for any advice!
Thank you and best regards!
You can use Netlify's proxy rewrites (https://docs.netlify.com/routing/redirects/rewrites-proxies/) to add some level of CDN to your site. However, it is entirely up to Netlify to decide how long your content stays in its cache, and if the content isn't in Netlify's cache there won't be much of a performance boost, so I'd advise against it unless absolutely necessary.
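As a sketch, a proxy rewrite in Netlify's _redirects file would look something like the following; the path and origin hostname are hypothetical, not taken from the question:

```
# Proxy /static/* on the Netlify site to the external nginx+Django origin.
# The 200 status makes this a rewrite (proxied), not a redirect.
/static/*  https://api.example.com/static/:splat  200
```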

Serve static assets with an efficient cache policy

Help needed!
I am struggling to set the cache policy on my WordPress website.
I have tried setting the expirations manually in .htaccess and I have tried several plugins.
Nevertheless, Google PageSpeed Insights keeps displaying the message "Serve static assets with an efficient cache policy".
Is it possible to manually add a cache policy (e.g. via .htaccess) for a specific line item in Google PageSpeed Insights?
Any help or suggestions will be much appreciated!
"Serve static assets with an efficient cache policy".
Most likely you are getting this message for resources which are not hosted by your website but come from external sites, such as the Google Analytics code or other third-party scripts. If that is the case you won't be able to do much, and you can simply ignore the "Serve static assets with an efficient cache policy" suggestion for those entries.
If the suggestion is for your own hosted content and you are not able to resolve it even after making changes to the .htaccess file, the problem could lie with your web hosting and you will need to contact them for a resolution.
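For the assets you do host yourself, the manual .htaccess approach mentioned in the question usually looks roughly like this (a sketch, assuming mod_expires and mod_headers are enabled on the host; adjust the types and lifetimes as needed):

```apache
# Long-lived caching for static assets (requires mod_expires / mod_headers)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
</IfModule>
<IfModule mod_headers.c>
    <FilesMatch "\.(css|js|png|jpe?g|gif|svg|woff2?)$">
        Header set Cache-Control "public, max-age=31536000"
    </FilesMatch>
</IfModule>
```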

Host complete WordPress website in Amazon CloudFront or any other CDN

What I would like to do
According to an Amazon Senior Architect to whom I spoke recently, it is possible to cache dynamic website content in Amazon CloudFront.
The way I understand this could work is that, in addition to keeping a page cache of each page that has been accessed, the page would also be cached in CloudFront.
What I have tried
I have experimented a lot with W3 Total Cache and its settings but did not find a solution to this problem. I also tried to set up CloudFront directly in the AWS console but did not find a way to cache the static result (HTML?) of WordPress's PHP calls.
Question
How would you tackle the issue?
How can I cache a static version of WordPress's dynamic pages in CloudFront or any other CDN?
Here is a concept plugin which aims to do this:
https://github.com/PeterBooker/wp-cloudfront-helper
Most CDNs advertise that they can cache dynamic web sites. Unless they have very specific information about the pages they need to serve, they can't. There are situations where the "penalty" of a CDN for a dynamic page isn't that bad, but adding an extra hop between the web server and the end user can only be faster in some very specific situations. The main reason is the absence of a Last-Modified header on the generated pages.
From my experience working with (and at) CDN providers, the most performant way to include a CDN is to use a different hostname for the static assets and point that hostname to the CDN, with your web server as the origin server.
Jan

WordPress Development Domain

We are developing WordPress sites and, due to issues with some plugins once the domain is changed (from dev.domain.com to www.domain.com), we have adopted the practice of editing hosts files in-house so that we can develop the site on the www.domain.com domain.
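For context, the in-house workaround is just a hosts-file override along these lines, where the IP address is a placeholder for the development server:

```
# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
# Point the production hostname at the development server.
203.0.113.10    www.domain.com
```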
This solution works fine for us in-house; however, when a client wishes to see their site in development we have to walk them through editing their hosts file to see the development site, and then back again so they can see their live site.
Does anyone have a better idea or solution? Is there a way to make this work at the Apache level? Maybe a Chrome app? It needs to be super simple and automated (scripted) if possible.
Thanks
You could create a VPN connection to your "house" so that the fake www.domain.com would be available to them.
So, the issue is that some plugins think they're on www (or will only behave correctly on www)?
Then quite possibly your best quick bet is a Chrome/Firefox extension, although obviously that does require the client to have them installed.
I'd recommend looking at HostAdmin for Chrome and/or fire-hostadmin for Firefox. These should allow you to amend the hosts file without too much hassle or direct editing of protected system files.
It is always worth noting that, depending on the security software on the client's computer, they may run into read-only issues (I believe Spybot Search & Destroy and similar tools lock the hosts file by default).

Image optimization for websites - mod_pagespeed vs bulk optimize and replace source (Trimage)

I have a couple of WordPress websites running on nginx with ngx_pagespeed (the nginx port of mod_pagespeed). It optimizes CSS, JS, HTML and images and serves the optimized resources from a RAM cache.
I am now considering moving all media to another domain/server (Amazon S3). The problem is that I will then lose the ngx_pagespeed optimization.
What are my options? What do you think about optimizing the images from the command line and replacing the source files before moving them to S3? Maybe a tool like Trimage would do the trick.
Another problem is that these websites are fed content by their owners, so I cannot control image optimization before upload. All I can do is either optimize with mod_pagespeed or bulk-optimize before moving everything to S3.
What do you think? Has anyone come across a similar problem before?
Best regards.
One solution that gets you the best of both worlds is to use a CDN that supports origin pull, like Amazon CloudFront, and to configure the ModPagespeedMapRewriteDomain option in mod_pagespeed (see the section on Mapping Rewrite Domains).
It works like this: when you configure the MapRewriteDomain option, mod_pagespeed rewrites the URLs of optimized resources (images, JS, etc.) to use the CDN's domain. When the CDN receives a request for a resource it does not yet have, it fetches it from the origin domain and caches it (this is the origin-pull feature). That way you get the benefits of both a CDN for your static resources and mod_pagespeed's resource-optimization features.
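For illustration, the directive looks roughly like this; the CDN and origin hostnames below are placeholders:

```
# Apache (mod_pagespeed): rewrite optimized resource URLs from the origin to the CDN domain
ModPagespeedMapRewriteDomain cdn.example.com www.example.com

# nginx (ngx_pagespeed) equivalent
pagespeed MapRewriteDomain cdn.example.com www.example.com;
```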
