I've been using Google PageSpeed Insights to try and improve my site's performance, and so far it's proven extremely successful. Deferring scripts worked beautifully: since I already had an in-house version of jQuery's .ready() to defer scripts until the page had fully loaded, all I had to do was inline that particular function and move the full scripts to the end of the page. That worked great.
But now I find myself glaring at the one remaining yellow dot on the checklist: "Eliminate render-blocking CSS in above-the-fold content".
The way my CSS is set up is to have one global _.css file containing styles that apply to the page structure in general, or are used in more than one or two places across the site. Most pages then have an associated CSS file (for instance, party.php has party.css) containing styles specific to that particular page. All CSS files are cached indefinitely, as I append /t=FILEMTIME to filenames (and strip it again with .htaccess) to guarantee that files are re-fetched whenever they change.
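For the curious, one way to implement that cache-busting scheme (paths and names are illustrative, not necessarily the asker's exact code): generate the link tag with PHP, and strip the timestamp suffix again with a rewrite rule:

<link rel="stylesheet" href="/css/party.css/t=<?= filemtime('css/party.css') ?>">

# .htaccess: map /css/party.css/t=12345678 back to /css/party.css
RewriteEngine On
RewriteRule ^(.+)/t=\d+$ $1 [L]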
So anyway, Google recommends inlining critical styles needed for above-the-fold content. Trouble is... well, take a look at this screenshot: http://prntscr.com/1qt49e
As you can see... ALL of the content is above-the-fold! People hate scrolling, especially on a game that involves loading many pages. So I designed the site to fit on one screen (assuming a good enough resolution). So that means... ALL of the styles apply to above-the-fold content! So... is there any solution? Or am I stuck with that yellow mark on an otherwise near-perfect score?
A related question has been asked before: What is “above-the-fold content” in Google Pagespeed?
Firstly, you have to notice that this is all about 'mobile pages'.
So if I interpret your question and screenshot correctly, this does not apply to your site!
On the contrary - doing some of the things Google advises in their guidelines will make things worse rather than better for 'normal' websites.
And not everything that comes from Google is the "holy grail" just because it comes from Google. And they themselves are not a good role model if you have a look at their HTML markup.
The best advice I could give you is:
Set width and height on replaced elements in your CSS, so that the browser can lay out the elements and doesn't have to wait for the replaced content!
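For example (selectors and sizes are illustrative):

/* reserve layout space for replaced elements up front */
img.avatar { width: 48px; height: 48px; }
iframe.map { width: 300px; height: 200px; }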
Additionally, why do you use several CSS files rather than just one?
The additional request costs more than the small amount of extra data. And after the first request the CSS file is cached anyway.
The things one should always take care of are:
reduce the number of requests as much as possible
keep your overall page weight as low as possible
And don't rack your brain over how to get 100% in Google's PageSpeed Insights tool ...! ;-)
Addition 1: Here is the page where Google shows what they recommend for Optimize CSS Delivery.
As said before, I think this is neither realistic nor sensible for a "normal" website! Mainly because with a responsive web design you are almost certainly using media queries and other layout styles. So if you don't load your CSS first, and in a blocking manner, you'll get a FOUT (Flash Of Unstyled Text). I really do not believe that this is "better" than a few more milliseconds to render the page!
Imho Google is starting a new "hype" (judging by all the questions about it here on Stack Overflow) ...!
How I got a 99/100 on Google Page Speed (for mobile)
TLDR: Compress and embed your entire CSS between your <style></style> tags.
I've been chasing down that elusive 100/100 score for about a week now. Like you, the last remaining item was eliminating "render-blocking css for above the fold content."
Surely there was an easy solution? Nope. I tried out Filament Group's loadCSS solution. Too much .js for my liking.
What about async attributes for CSS (like JS)? They don't exist.
I was ready to give up. Then it dawned on me: if linking the stylesheet was blocking the render, what if I embedded my entire CSS in the head instead? That way there would be nothing to block.
It seemed absolutely WRONG to embed 1263 lines of CSS in my style tag. But I gave it a whirl. I compressed it (and prefixed it) first using:
postcss -u autoprefixer --autoprefixer.browsers 'last 2 versions' -u cssnano --cssnano.autoprefixer false *.css -d min/
See the NPM postcss package.
Now it was just one LONG line of space-less css. I plopped the css in <style>your;great-wall-of-china-long;css;here</style> tags on my home page. Then I re-analyzed with page speed insights.
I went from 90/100 to 99/100 on mobile!!!
This goes against everything in me (and probably you). But it SOLVED the problem. I'm just using it on my home page for now and including the compressed css programmatically via a PHP include.
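For the record, that include can be as small as this one-liner (the path is illustrative):

<style><?php include 'min/styles.min.css'; ?></style>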
YMMV (your mileage may vary) depending on the length of your CSS. Google may ding you for too much above-the-fold content. But don't assume; test!
Notes
I'm only doing this on my home page for now so people get a FAST render on my most important page.
Your css won't get cached. I'm not too worried though. The second they hit another page on my site, the .css will get cached (see Note 1).
A few tips that may help:
I came across this article on CSS optimization yesterday:
CSS profiling for ... optimization
A lot of useful info on CSS and on which CSS causes the most performance drains.
I saw the following presentation at jQueryUK on "hidden secrets" in Google Chrome (Canary) DevTools:
DevTools Can do that.
Check out the sections on Time to First Paint, repaints and costly CSS.
Also, if you are using a loader like RequireJS you could have a look at one of the CSS loader plugins, called require-css, which uses CSSO - an optimizer that also does structural optimization, e.g. merging blocks with identical properties. I've used it a few times and it can save quite a lot of CSS from case to case.
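For reference, require-css loads stylesheets through its css! plugin prefix; the module path here is illustrative:

require(['css!styles/main'], function () {
    // runs once styles/main.css has been applied
});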
Off the question:
I second @Enzino in creating a sprite for all the small icons you are loading. The file sizes are so small that they do not really warrant a server roundtrip each. Also keep in mind the total number of concurrent HTTP requests a browser can make, so requests for a large number of small icons are "render-blocking" as well. Although it is an almost empty page compared to yours, I like how DuckDuckGo loads, for example.
Please have a look at the following page: https://varvy.com/pagespeed/render-blocking-css.html
This helped me to get rid of render-blocking CSS. I used the following code to remove it, and now Google PageSpeed Insights no longer reports a render-blocking CSS issue.
<!-- loadCSS -->
<script src="https://cdn.rawgit.com/filamentgroup/loadCSS/6b637fe0/src/cssrelpreload.js"></script>
<script src="https://cdn.rawgit.com/filamentgroup/loadCSS/6b637fe0/src/loadCSS.js"></script>
<script src="https://cdn.rawgit.com/filamentgroup/loadCSS/6b637fe0/src/onloadCSS.js"></script>
<script>
/*!
loadCSS: load a CSS file asynchronously.
*/
function loadCSS(href) {
    var ss = window.document.createElement('link'),
        ref = window.document.getElementsByTagName('head')[0];
    ss.rel = 'stylesheet';
    ss.href = href;
    // temporarily, set media to something non-matching to ensure it'll
    // fetch without blocking render
    ss.media = 'only x';
    ref.parentNode.insertBefore(ss, ref);
    setTimeout(function () {
        // set media back to `all` so that the stylesheet applies once it loads
        ss.media = 'all';
    }, 0);
}
loadCSS('styles.css');
</script>
<noscript>
<!-- Let's not assume anything -->
<link rel="stylesheet" href="styles.css">
</noscript>
I too have struggled with this new PageSpeed metric.
Although I have found no practical way to get my score back up to 100%, there are a few things I have found helpful.
Combining all CSS into one file helped a lot. All my sites are back up to 95%-98%.
The only other thing I could think of was to inline all the necessary CSS (which appears to be most of it, at least for my pages) on the first page to get the sweet high score. Although it may help your speed score, it will probably make your page load slower overall.
The 2019 optimal solution for this is HTTP/2 Server Push.
You do not need any hacky JavaScript solutions or inline styles. However, you do need a server that supports HTTP/2 (any modern server version will), which in practice requires your server to run SSL. But with Let's Encrypt there's no reason not to be using SSL anyway.
My site https://r.je/ has a 100/100 score for both mobile and desktop.
The reason for these errors is that the browser gets the HTML, then has to wait for the CSS to be downloaded before the page can be rendered. Using HTTP/2 you can send both the HTML and the CSS at the same time.
You can use HTTP/2 push by setting the Link header.
Apache example (.htaccess):
Header add Link "</style.css>; as=style; rel=preload, </font.css>; as=style; rel=preload"
For NGINX you can add the header to your location tag in the server configuration:
location = / {
    add_header Link "</style.css>; as=style; rel=preload, </font.css>; as=style; rel=preload";
}
With this header set, the browser receives the HTML and CSS at the same time which stops the CSS from blocking rendering.
You will want to tweak it so that the CSS is only sent on the first request, but the Link header is the most complete and least hacky solution to "Eliminate Render Blocking JavaScript and CSS". A sketch of that tweak follows.
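One hedged way to do that first-request-only tweak in Apache 2.4 (the cookie name is illustrative): only add the Link header when the visitor does not yet carry a marker cookie, then set that cookie:

SetEnvIf Cookie "h2pushed=1" h2pushed
Header add Link "</style.css>; as=style; rel=preload" env=!h2pushed
Header add Set-Cookie "h2pushed=1; Path=/" env=!h2pushed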
For a detailed discussion, take a look at my post here: Eliminate Render Blocking CSS using HTTP/2 Push
Consider using a package to automatically generate inline styles from your CSS files. A good one is Grunt Critical, or Critical CSS for Laravel.
Related
I've got this web app where the favicon is inlined in the HTML, e.g.,
<link rel="icon" href="data:image/x-icon;base64,A VERY VERY LONG STRING...">
However, I can definitely see that both Chrome and Firefox (latest versions as of this date) issue a request to favicon.ico at the root of my website anyway, e.g. http://example.com/favicon.ico
In case it matters:
The base64-encoded string embedded in the href attribute is quite big.
The favicon <link> tag is managed by react-helmet
The website itself isn't particularly slow. (Consistent good Apdex score throughout.)
I can only assume that the developers at the time (all gone now) wanted to inline the favicon to avoid an HTTP request and therefore wrote some "infrastructure" to support that: namely using a Webpack plugin to automatically base64 encode all assets imported as JavaScript modules (e.g. import favicon from './assets/favicon.ico').
Clearly this isn't working as it was intended, but what strikes me the most is that the actual base64 string weighs more than the favicon.ico file itself (20k vs 15k). So I'm not sure where the benefit is (if any).
While I don't know any better than you why the original developers designed it that way, it makes sense for offline rendering of a simple all-in-one HTML file.
I actually just looked this up, because I am building a SUPER small all-in-one HTML file. I don't have to include an extra file if the icon is base64-encoded into the single HTML file.
Here's my last two days of reading in a few minutes.
As of 2021, 93% of browsers in use can display an SVG favicon:
https://en.wikipedia.org/wiki/Usage_share_of_web_browsers
https://caniuse.com/link-icon-svg
.ICO is an outdated way to create favicons and requires you to make multiple small sizes of your image, whereas .PNG can scale down from any size; it's easily the best lazy option for a quick icon. Because icons are viewed at such small sizes, any complex picture is indistinguishable, making very simple designs optimal.
This is where .SVG shines.
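For reference, wiring one up looks roughly like this (file names are illustrative); browsers that understand SVG favicons should pick the SVG, and the PNG serves as a fallback for those that don't:

<link rel="icon" type="image/svg+xml" href="/favicon.svg">
<link rel="icon" type="image/png" href="/favicon.png">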
https://www.iconfinder.com/
find image > inspect > open in new page > save image as
Paint 3D's Magic Select is a free tool worth mentioning.
This is by far the most informative and straightforward video on automatic SVG conversion:
https://www.youtube.com/watch?v=10m_2bPXa1s
Now we're left with 4-8 KB of data, which could be a fifth of your .PNG's size.
Next we'll want to optimize it:
https://www.youtube.com/watch?v=iVzW3XuOm7E
So we could skip an extra request by having all the data in the head, but that leads us here:
https://css-tricks.com/probably-dont-base64-svg/
Now say we're creating a Single Page Application and care about SEO. Not only do we score higher and reduce our load times, but we offer a better experience for users with the slowest internet speeds.
I'm trying to improve the speed of my website. I'm using PageSpeed Insights to check my site's performance and it told me to remove render-blocking JavaScript and CSS. I did, and now it's causing problems with my website's design. What should I do to remove render-blocking resources without breaking my website's design?
Render Blocking CSS
Render-blocking CSS will always show up in Google PageSpeed Insights if you are using external resources for your CSS.
What you need to do is inline all of your 'above the fold' styles in <style></style> tags in the head of your web page.
I will warn you, this is NOT easy and plugins that claim to do this often do not work, it requires effort.
To explain what is happening:
1. A user navigates to your site and the HTML starts downloading.
2. As the HTML downloads, the browser tries to work out how to render that HTML correctly, and it expects styling on those elements.
3. Once the HTML has downloaded, if the browser hasn't found styles for the elements that appear above the fold (the initial part of the visible page), it cannot render anything yet.
4. The browser looks for your style sheets, and only once they have downloaded can it render the page.
Point 4 is the render blocking: those resources are stopping the page from rendering the initial view.
To achieve the fix, you need to work out every element that displays without scrolling the page, then find all the styles associated with those elements and inline them, as sketched below.
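A stripped-down illustration of the end result (selectors and paths are placeholders; the preload pattern is the one the loadCSS project popularized):

<head>
    <style>
        /* critical, above-the-fold styles inlined here */
        header, .hero { margin: 0; background: #fff; }
    </style>
    <!-- full stylesheet loaded without blocking render -->
    <link rel="preload" href="/css/main.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
    <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>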
Render Blocking JS
This one is simpler to fix.
If you are able to use the async attribute on your external JS then use it.
However, be warned that in a lot of cases this will break your site if it was not designed for async loading in the first place.
This is because async will download and execute your JS files as fast as possible. If a script requires another script to function (i.e. you are using jQuery) and it loads before that other script, it will throw an error. (i.e. your main.js file uses jQuery but downloads before it; you call $('#element') and get a '$ is undefined' error because jQuery is not downloaded yet.)
The better attribute to use, if you do not have the knowledge required to implement async without errors, is defer.
A deferred script will not execute until the HTML has finished parsing, but it still downloads in parallel, and deferred scripts execute in the order specified in the HTML, as in the sketch below.
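For example (file names illustrative): both scripts download in parallel without blocking parsing, but jquery.min.js is guaranteed to execute before main.js:

<script src="/js/jquery.min.js" defer></script>
<script src="/js/main.js" defer></script>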
Add async to the script tag, and put the CSS and JS at the end of the page.
So I am wondering whether it is better, as far as page speed goes, to precompile my Less style sheets instead of using less.js. After testing this via Google PageSpeed, I noticed I actually went down a couple of points after precompiling. As far as I know it should be less demanding on the user. But I guess I am wrong? Is there something else I should take into account? Minifying the CSS is also an option, but I don't think that will make a considerable difference.
Thanks.
I think you should change the title of your question to something like "does precompiling Less generate a better page speed score?"
It probably depends on the size of your Less code. In general PageSpeed wants you to first load something readable, and to load non-critical CSS and JavaScript afterwards.
When you compile all your Less code into one big CSS file, PageSpeed will complain that you should load the critical CSS first (preferably inline, in the source, so there is no extra HTTP request).
When compiling client-side with less.js, the compiled CSS code is inserted into the source, but this requires two HTTP requests (the compiler and the Less file). Less code can be smaller than the compiled CSS, but you will have to load the compiler (possibly from a CDN).
But overall you should realize that the client-side compiler has to compile your Less code again for every page request. Client-side compiling takes time and so creates a bad user experience in most situations.
Minifying the css is also an option, but I don't think that will make a considerable difference.
Minifying reduces the number of bytes that have to be sent and so always helps make your website load faster.
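To illustrate with a trivial, made-up rule:

/* before minification */
body {
    margin: 0;
}

/* after: same rule, fewer bytes */
body{margin:0}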
Some tests
When I load a simple page with:
<link href="css/bootstrap.min.css" rel="stylesheet">
I get a PageSpeed score of 95/100 and have to fix:
Optimize CSS Delivery of the following:
http://example.com/css/bootstrap.min.css
When I load the same page with:
<link rel="stylesheet/less" type="text/css" href="less/bootstrap.less" />
<script src="//cdnjs.cloudflare.com/ajax/libs/less.js/2.0.0/less.min.js"></script>
I get a PageSpeed score of 91/100 and have to fix:
Remove render-blocking JavaScript:
http://cdnjs.cloudflare.com/ajax/libs/less.js/2.0.0/less.min.js
Although the second situation requires many HTTP requests (to load all the Less code) and runs the compiler client-side, the score is not much lower (and still good enough).
If you don't optimize the CSS in the first situation - for instance, minify the code and set proper caching headers and so on - the PageSpeed score will go down further.
So in summary: yes, you should precompile your Less code and minify the CSS for a better user experience; PageSpeed is not always a good predictor of the user experience.
I have a site whose stylesheets are becoming overwhelming, and a full 50% to 90% or so is not used on certain pages. Rather than have 23 separate blocking CSS sheets, I'd like to find out which are being used on the page I'd like to target, and have those exported into one sheet.
I have seen several questions that recommend "Dust-Me Selectors" or a similar add-on which will tell you which selectors are and are not being used; but that's not what I want. I need to be able to export all used styles from all sheets for that particular page into one new sheet that can replace the 23 others. The solution should be able to support a responsive website (media queries). The website page I'm targeting is: http://tripinary.com.
I've found https://unused-css.com, but this is a paid service and I need free.
The next closest thing I've come across is http://www.csstrashman.com/, but this does not look at stylesheets. In fact, it completely ignores them, and ultimately I'm having trouble with the responsiveness of the site. Many times, as well, this site just crashes.
I don't mind a programmatic solution if someone has had to do this before and can recommend a direction.
(deleted my comment on RwwL's answer to make it a thorough answer)
UnCSS, whether via Node.js or as a Grunt or Gulp task, is able to list the CSS rules used by an array of pages across an array of media queries.
uncss: https://github.com/giakki/uncss
grunt-uncss: https://github.com/addyosmani/grunt-uncss
gulp-uncss: https://github.com/ben-eb/gulp-uncss
Multipage:
You can pass files as an argument to any of the 3 plugins, like:
var uncss = require('uncss'); // npm install uncss

var files = ['my', 'array', 'of', 'HTML', 'files'],
    options = { /* (…) */ };

uncss(files, options, function (error, output) {
    console.log(output);
});
Avoid:
urls (Array):
array of URLs to load with Phantom (on top of the files already passed if any).
NOTE: this feature is deprecated, you can pass URLs directly as arguments.
Media Queries and responsive are taken into account:
media (Array):
By default UnCSS processes only stylesheets with media query "all", "screen", and those without one. Specify here which others to include.
You can add stylesheets, ignore some of them, add inline CSS, and use many other options like htmlroot; see the sketch below.
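A hedged sketch of an options object (values are illustrative; check the UnCSS README for the exact semantics of each key):

var options = {
    ignore: ['.ie9', /^\.lte-ie9/], // selectors UnCSS must not remove
    media: ['(min-width: 700px)'],  // extra media queries to process
    stylesheets: ['css/extra.css'], // stylesheets to process (see README)
    htmlroot: 'public'              // root for root-relative paths
};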
Remaining problems:
1/ Conditional classes, if you use them for IE9 and below. They obviously won't be matched in a WebKit PhantomJS environment!
HTML:
<!--[if IE 9]><html class="ie9 lte-ie9" lang="en"><![endif]--> <!-- the conditional comment/class -->
CSS:
.ie9 .some-class { property: value; } /* Only matched in IE9, not WebKit PhantomJS */
Should they be added by hand or script to the html element in testing environment? (how it renders is of no importance)
Is there an option in uncss?
As long as you don't style with :not(.ie9) (weird), it should be fine.
EDIT: you can use the ignore option with a pattern to force UnCSS to "provide a list of selectors that should not be removed by UnCSS". They won't be tested, though.
2/ Scripts that detect resolution (viewport width) and adapt content to it by removing/adding content or adding a class on a container. They will execute in PhantomJS at desktop resolution, I guess, and thus won't do their job, so you'll need to modify the calls to PhantomJS or something like that... or dig into the options or GitHub issues of the 3 projects (I didn't).
Other tools I've heard of, not tested (or barely, or couldn't test), no idea about the media query part:
in grunt-uncss readme, the Coverage part
ucss from Opera (there's already an answer here; I couldn't make it work)
Helium
CSSESS
mincss
Addy Osmani has countless presentations of 100+ slides presenting awesome tools like this one: https://speakerdeck.com/addyosmani/automating-front-end-workflow (you'll regret even more that days are made only of 24 hours and not 48 err wait 72 ^^)
How about the CSS Usage plugin for Firebug?
Steps:
Visit your page in Firefox
Click "CSS Usage" tab in Firebug
Click the Scan button
Click the bold file name
Save page of CSS selectors to disk
Here are some screenshots and a walkthrough. Not sure about media queries or whether it'll work on your site, and it'll probably not keep -webkit prefixes etc., but maybe it'll get you part of the way there.
Opera Software released a CSS crawler on GitHub that claims it can find unused and duplicate selectors. It might do the trick if you're comfortable with a command-line tool. https://github.com/operasoftware/ucss
You can check in Google Chrome by inspecting an element (F12): unused CSS rules are shown with a line through them.
If you wanted, you could build a script that runs on a (non-production) server and goes through every CSS rule: remove it from the stylesheet, load the page using something like PhantomJS, and check whether anything changed from the last time it loaded the page. If so, put the CSS rule back; if not, leave it out and move on to the next rule. It would take a while to run, but it would work. You would also have to set up an instance of your server that does not use caching for it to run on. A rough sketch of the idea follows.
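Here is one hedged way to script that idea in Node, swapping PhantomJS for Puppeteer since it is easier to drive today. The URL, file names, and the naive rule splitting are all illustrative assumptions; a serious version would use a real CSS parser (e.g. postcss) and a tolerant image comparison, and this assumes deterministic rendering (no animations or ads).

/* npm install puppeteer */
const fs = require('fs');
const puppeteer = require('puppeteer');

const PAGE_URL = 'http://localhost:8000/'; // non-production, caching disabled
const CSS_FILE = 'styles.css';

// Render the page with only the given CSS applied, and screenshot it.
async function render(page, css) {
    await page.goto(PAGE_URL, { waitUntil: 'networkidle0' });
    await page.evaluate(body => {
        document.querySelectorAll('link[rel="stylesheet"], style')
            .forEach(el => el.remove());
        const style = document.createElement('style');
        style.textContent = body;
        document.head.appendChild(style);
    }, css);
    return page.screenshot();
}

(async () => {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();

    // Naive split on '}' - breaks on @media blocks, hence "use a real parser".
    const rules = fs.readFileSync(CSS_FILE, 'utf8')
        .split('}').map(r => r.trim()).filter(Boolean).map(r => r + ' }');

    let current = rules.slice();
    const baseline = await render(page, current.join('\n'));

    for (const rule of rules) {
        const idx = current.indexOf(rule);
        if (idx === -1) continue;
        const trial = current.slice(0, idx).concat(current.slice(idx + 1));
        const shot = await render(page, trial.join('\n'));
        // Unchanged screenshot: the rule had no visible effect, so drop it.
        if (shot.equals(baseline)) current = trial;
    }

    fs.writeFileSync('used.css', current.join('\n'));
    await browser.close();
})();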
Try using this tool, which is just a simple JS script:
https://github.com/shashwatsahai/CSSExtractor/
This tool helps get the CSS from a specific page, listing all sources for active styles, and saves it to a JSON file with the source as key and the rules as value.
It loads all the CSS from the href links and reports all the styles applied from them.
You can modify the code to save all the CSS into a .css file, thereby combining all your CSS.
We've reached the end of our tether here trying to overcome a nasty and intermittent FOUC in Firefox 3.5.x+ for a new release we're working on.
We've tried:
Disabling Javascript in FF
Using Quirks mode rendering by removing the DOCTYPE
Moving from @import for additional CSS to <link>
Switching concatenation on and off
Removing CSS files from the concat, one at a time
Switching the local cache off in Firefox
etc
Our previous release never exhibited any FOUC issues, so it's something we've done to this release. Changes we've made so far include:
Using Base64-encoded images as data URIs for all decorative imagery, served via CSS.
Separating 'framework'-related CSS files from page-specific CSS and bundling them as two separate CSS files
To recreate the problem... use Firefox 3.5.x or 3.6.x, then:
Head on over to: http://my.publisher-subdomain.env.yola.net/
Log in with username: 'stack#yola.com' and password: 'stackoverflow'
Once logged-in, you should be at http://my.publisher-subdomain.env.yola.net/sites/
Click the Account link in the main nav.
The Account page should load, and you should see a FOUC. If the FOUC does not occur, clear your cache and reload the page.
Your help would be greatly appreciated! :)
UPDATE:
The dev environment is still exhibiting the FOUC, but only if Firefox is running low on memory or has a lot of extensions installed. Latency and rendering speed definitely affect the visibility of this FOUC.
Although this is a very old question, I found it when I was searching for a solution to the same problem, so I wanted to post the solution for future reference. I just needed to move the reference to my CSS files above the references to the external JavaScript that needed to be in my header, roughly as below.
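In other words (file names illustrative), the head went from script-first to stylesheet-first:

<head>
    <!-- stylesheets first, so rendering isn't held up by the scripts -->
    <link rel="stylesheet" href="/css/site.css">
    <!-- external scripts that must stay in the head come after the CSS -->
    <script src="/js/widgets.js"></script>
</head>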
I may be wrong, but this could be a concurrent-connections issue. According to my Firebug "Net" tab, the HTML page simply takes a lot of time to load - maybe also because it is on a development server? - and the style sheet gets loaded after the HTML page.
I can't claim to entirely understand what's happening here, but I would try putting the style sheet onto a different domain as a first measure. That should make Firefox establish a connection straight away.
It would probably also be a good idea to go back to normal images instead of data: URIs - that would reduce the size of the style sheet, and data: URIs won't work at all in IE < 8.