Help needed!
I am struggling to set the cache policy on my WordPress website.
I have tried setting the expirations manually in .htaccess, and I have tried several plugins.
Nevertheless, Google PageSpeed Insights keeps displaying the message "Serve static assets with an efficient cache policy".
Is it possible to manually add a cache policy (e.g. via .htaccess) for a specific resource that PageSpeed Insights flags?
Any help or suggestions will be much appreciated!
"Serve static assets with an efficient cache policy".
Most likely you are getting this message for resources that are not hosted on your website but come from external sites, such as the Google Analytics code or other third-party scripts. If that is the case, you won't be able to do much, and you can simply skip the "Serve static assets with an efficient cache policy" suggestion.
If the suggestion is for your own hosted content and you are not able to resolve it even after making changes to your .htaccess file, the problem could be with your web hosting, and you need to contact them for a resolution.
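For your own hosted assets, a minimal .htaccess sketch using Apache's mod_expires looks roughly like this (the MIME types and lifetimes here are examples to adapt, not definitive values):

<IfModule mod_expires.c>
# Enable mod_expires and set long lifetimes for static assets
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
</IfModule>

If the warning persists for these files after adding this, check the response headers in your browser's developer tools to confirm the Cache-Control/Expires headers are actually being sent.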
My landing page is a statically built website on Netlify.
Then I have a subdomain, which is an A record in DNS (set in Netlify).
This subdomain points to an external web server (nginx + Django), which provides a REST API and serves static content.
The question is: how can I make this subdomain use Netlify's CDN when serving static content from the Django API?
Are there any approaches to do it?
Netlify advises against using any CDN in front of it, as it already provides a CDN. There would also be a problem with the SSL certificate.
I tried my best to look for such a question, but I couldn’t pick the right terms, I suppose, so I didn’t find it.
I would be very grateful for any advice!
Thank you and best regards!
You can use Netlify's Proxy Rewrites: https://docs.netlify.com/routing/redirects/rewrites-proxies/ to add some level of CDN to your site. However, it will be entirely up to Netlify to decide how long your content stays in its cache. If it's not in Netlify's cache, there won't be much of a performance boost by doing this - so I'd advise against it, unless absolutely necessary.
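For reference, a proxy rewrite lives in a _redirects file (or netlify.toml) on the Netlify site; a sketch, where the path and origin hostname are placeholders for your nginx + Django server:

/api/*  https://api.example.com/:splat  200

The 200 status code tells Netlify to proxy the request to the origin rather than redirect the browser, so responses pass through Netlify's edge.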
I'm using WP Super Cache, as my hosting provider recommends.
I guess that WP Super Cache creates a cached file the first time somebody visits a page of the website. I want to cache all the files at once, so my idea is to open all the pages of the site.
I wonder: if I download the entire website with wget, will all the pages be cached?
WP Super Cache has a "preload" mode, which you can find under its settings. It's exactly what you're searching for and does what you describe: it simply "simulates" a page visit.
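For completeness, the wget idea would also warm the cache, since every fetched page counts as a visit. A sketch (the domain is a placeholder, and the one-second wait is just there to avoid hammering your own server):

wget --recursive --level=inf --delete-after --no-directories --wait=1 https://example.com/

--delete-after discards the downloaded files once fetched, since the point is only to trigger the page cache, not to keep a local copy.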
What I would like to do
According to an Amazon Senior Architect to whom I spoke recently, it is possible to cache dynamic website content in Amazon CloudFront.
The way I understand this could work is that, in addition to keeping a page cache of each page that has been accessed, each page would also be cached in CloudFront.
What I have tried
I have experimented a lot with W3 Total Cache and its settings but did not find a solution to this problem. I have also tried to set up CloudFront directly in the AWS control panel but did not find a way to cache the static result (HTML?) of WordPress's PHP calls.
Question
How would you tackle the issue?
How can I cache a static version of WordPress's dynamic pages in CloudFront or any other CDN?
Here is a concept plugin which aims to do this:
https://github.com/PeterBooker/wp-cloudfront-helper
Most CDNs advertise that they can cache dynamic web sites. Unless they have very specific information about the pages they need to serve, they can't. There are situations where the "penalty" of a CDN for a dynamic page isn't that bad, but having an additional hop between the web server and the end user can only be faster in some very specific situations. The main reason is the absence of a Last-Modified header for the generated pages.
From my experience working with (at) CDN providers, the most performant way to include a CDN is to use a different hostname for the static assets and point that hostname to the CDN with your web server as the origin server.
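In practice that setup is mostly DNS plus asset URLs: you create a hostname for static files, point it at the CDN, and configure the CDN to pull from your server as the origin. A sketch, where the hostnames are placeholders (the CloudFront distribution domain in particular is invented for illustration):

static.example.com.  CNAME  d1234abcdef.cloudfront.net.

Your pages then reference assets as https://static.example.com/wp-content/... while the CDN fetches cache misses from www.example.com.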
Jan
I've done things a bit backwards developing my new site, don't ask me why! I built the site on the live server it will be hosted on first, and the other day I created a sub-domain to hold a copy of the website so I can use it as a sandbox environment to test new plugins, get PayPal working, etc.
I followed this tutorial
So it all worked fine! I have a copy of my site on a subdomain working. I had the infamous admin-login-redirects-to-itself issue, but I sorted that; the reason it wasn't working was because I had my caches disabled in Magento before I copied the site, so I had to enable them again in order to gain access. (If anyone knows why this is, please share.)
So my problem now is: I am updating the design of my website using the CSS and images in the skin folder. I update something in the CSS and load it onto my server, into the subdomain's skin folder, but nothing changes on the frontend until about 15 minutes later, after clearing all caches hundreds of times! I really don't understand what's happening.
The links to my CSS/JS and image folders are all correct in the head of the website. It's just like there is a time delay between me changing something in the CSS and the website updating itself.
Any information would be greatly appreciated.
Kind regards
Tom
Have you also disabled the caches in the Magento Admin? Perhaps you can try reloading the site in a non-caching browser session (e.g. Incognito Mode in Google Chrome).
Your browser is also caching the external CSS files, which is generally good: it saves server bandwidth and reduces page load time. But for development purposes, you need to avoid CSS caching. In Firefox or IE, you can use Ctrl+F5 to reload a webpage without the cached CSS.
If you do not want to use Ctrl+F5, you can also add a timestamp to your CSS file as a URL parameter.
For example, style.css?<?php echo time();?>
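A common refinement is to key the parameter to the file's modification time rather than the current time, so browsers re-download only when the file actually changes. A sketch, assuming style.css sits next to the executing script (adjust the path to your skin folder):

<link rel="stylesheet" type="text/css" href="style.css?v=<?php echo filemtime('style.css'); ?>" />

With time(), every page load produces a new URL and the browser never caches the stylesheet at all; filemtime() keeps caching working between edits.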
You can also use the Apache mod_expires module to expire the caching.
.htaccess
# ExpiresActive must be on for ExpiresByType to take effect
ExpiresActive On
ExpiresByType text/css "access plus 1 second"
http://httpd.apache.org/docs/2.2/mod/mod_expires.html
I am currently in the process of running some optimizations for a WordPress site we are soon deploying. We are using W3 Total Cache for optimizations on the site. I have recently signed up for CloudFront as a CDN, and I feel this has definitely helped, but we are losing gzip compression for CSS/JS files served this way.
I wanted to see if anyone had any opinions on a good way to handle this. Currently I am hosting all JS/CSS from the web server, and it seems to give better performance (using the PageSpeed Insights add-on for Google Chrome).
This is more of a best practice question to help me get some insight on this scenario.
Anyone have any recommendations?
Thank you in advance.
You can pre-gzip the JavaScript and CSS files, upload them to S3 with a Content-Encoding: gzip header (keeping the original Content-Type), then have CloudFront serve them out.
If you automate the process as part of your build/deployment workflow, it's pretty easy to maintain.
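A sketch of that step with the AWS CLI, where the bucket name and file paths are placeholders:

# Compress the asset, then upload it with the headers CloudFront should pass through
gzip -9 -c app.js > app.js.gz
aws s3 cp app.js.gz s3://my-bucket/assets/app.js --content-encoding gzip --content-type application/javascript

Browsers then receive the compressed body and decompress it transparently, because the Content-Encoding response header tells them to.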
After spending some time testing, I have found the best way for our workflow is to have the CSS/JS assets gzipped on the server rather than served from CloudFront.
It just seems way faster to serve the gzipped files from the server side than to serve non-gzipped files from CloudFront.
I feel the suggestion @Ryan Parman made would make perfect sense if we were not editing the CSS/JS very often. I suppose there are ways to script the CSS/JS sync after editing, but for the current project, what I have suggested above works like a charm and is fast!