Do I necessarily need to remove "Render Blocking CSS"?

I put my homepage through Google's PageSpeed test and it gave me a score of 69 for Mobile and 95 for Desktop. The one and only issue flagged is render-blocking CSS.
Now, all the pages on my website are above the fold, i.e. there is no scrolling involved anywhere. Given this, I personally feel I shouldn't be doing anything special, since the CSS is required to view my page the way I designed it, right from the get-go.
If I do asynchronous loading or something similar, it'll end up showing the content on a black-and-white, disorganised page just before the intended output.
Do I ignore Google? It would mean that I'd never score 100/100, and wouldn't that affect my SEO chances?

TL;DR — No, you don't have to. But in most cases, it helps, indirectly.
Render blocking is in place to prevent FOUC (a flash of unstyled content).
Ideally you should only load the CSS responsible for rendering the "above the fold" of your page as render blocking and all the rest of your styles using async methods.
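One common async method (a minimal sketch, assuming a hypothetical /css/below-the-fold.css stylesheet) is the media-toggle trick: request the stylesheet with a non-matching media type so the download doesn't block rendering, then switch it on once it has loaded:

// Load a non-critical stylesheet without blocking the first render.
// The stylesheet path is a placeholder, not something from the question above.
function loadStylesheetAsync(href: string): void {
  const link = document.createElement("link");
  link.rel = "stylesheet";
  link.href = href;
  link.media = "print"; // non-matching media, so the download does not block rendering
  link.onload = () => { link.media = "all"; }; // apply the styles once they have arrived
  document.head.appendChild(link);
}

loadStylesheetAsync("/css/below-the-fold.css");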
However, most sites load all their CSS as render blocking. Why? Because most websites cannot afford a CSS specialist to tailor their CSS loading to their specific case. They'll sometimes pay for a theme, but that's it.
Themes are typically not optimized from this point of view, because there is no way to know in advance which elements a given site will place above the fold.
Is this a huge problem?
NO.
First of all, this only matters when a user loads the very first page of your website. Every other page will use the stylesheets already cached from that first visit (unless you load different stylesheets for different pages).
Second of all, the general idea that Google lowers your page's SEO score for having render-blocking CSS is, technically, wrong. They do penalize for a lot of other reasons (like accessibility, readability and responsiveness issues), but not for having render-blocking CSS.
However, there is an indirect correlation between the two.
Google PageSpeed is a tool that tells you how to improve the loading speed of your page, or at least how to leave the impression that it loads faster:
if you fix the problems it identifies, the page will load faster, or at least it will seem to load faster
if your page is, or feels, faster, there are fewer chances that users will hit the back button while waiting for it to load
THIS user behavior is where the SEO penalty comes in. Google registers it as a general "the user did not find what they were looking for on that website" and lowers the page's SEO score for whatever the user searched for.
Any method that keeps users from hitting the back button in the first 30 seconds after they click through to your website (i.e. that keeps your bounce rate down) is a good way to fight SEO penalties.
And... it's true: one of the most efficient methods is to make your page load faster.
Others include:
make the loading process look professional (place correctly sized placeholders for images, so the page doesn't jump around when loading);
keep FOUC as close to 0 as possible
render something, rather than nothing
if possible, give users a general idea of how much of the page has loaded (in %)
have longer pages load up with a basic outline of what's on them; users will read the outline to figure out whether they're on the right page and won't notice the loading time, since you've given them something to do while waiting
cut the filler and try to be honest about whatever your page has to offer
I can't emphasize this enough: it really pays off to be honest. There is a huge difference in results, SEO-wise:
If your page is about A, but you want to show this to users looking for B, do not tell them you've got B and don't hide it from them. Just tell them:
"Look, this is not B, it's A, but here are a few reasons why you should consider A instead of B."
Most users will read those reasons, especially if they're well written, address real problems, and don't look like they're just there to buy time.
A very good idea is to place your strongest argument second or third in the list (second if first is rather long, third if first two are not so long).
The reasoning is: if you place it further down, many users don't read past three weak arguments - they label the entire list as unconvincing and go back.
Also, if you place the arguments strictly in order of importance, the user will notice; as soon as they reach two arguments that are not convincing, they'll assume it only gets worse further down the list and, again, they'll hit the back button.
But if the second or third argument is stronger than the ones before it, they will read through the entire list hoping to find another strong one.
Now, if your arguments are compelling, the user will go for A instead of B => Win.
If not, they will still go for B, but at least they'll do it later (after they read your reasons), and the penalty will be much smaller, if any (the longer time a user spends on your page, the less the penalty, should they press back) => No loss.
If you can keep the user occupied for more than 30 seconds, you're typically in the clear, SEO-wise. And that's the really important SEO issue at hand, not render-blocking per se.
In the end, it is totally possible to create a page with a very low score on Page Speed while having a very high SEO score. It's unlikely, but totally possible.

Related

Lowering Largest Contentful Paint, Over A Full Second After FCP On Mobile

I've spent quite a bit of time on performance optimization for my site and I'm right there at the finish line for getting all the green good scores in search console for mobile usability and core web vitals. The last outstanding thing is getting my LCP under 2.5 seconds for my blog posts.
I'm consistently getting 2.6-2.7 and I can't quite figure out why. The LCP element is the <h1> tag on my blog posts and there are no above-the-fold images.
Example URL With Higher LCP
Let me say that I am using a Beaver Builder website and some marketing tags like GA, GTM, etc but I've spent a lot of time analyzing my script and style loads to create an optimal experience with preconnects and preloads of various resources. I know it's a big ask to try to get a site like this down in load time. I'm on Kinsta hosting with low TTFB, full-page caching and CDN. I am also using the Perfmatters plugin to control various aspects of load times.
I've done everything I can think of to get the LCP down. The <h1> tag seems to be visible almost immediately but may be repainted later, towards the end of the page load, and I can't figure out the cause of this.
Anyone out there feeling generous that could take a look?
Page Speed Insights URL
As far as I can see, PageSpeed is showing the aggregated real-user experience of your entire domain, not of this specific page but of all pages on the domain.
The DOM content load time for your page on mobile is around 3 seconds, which means your LCP has to be greater than that.
You need to reduce your DOM content load time first: prioritize your network calls so that redundant dependencies are minimized.
Also, on the desktop page you are lazy-loading an above-the-fold image, which is not considered good user experience; it is probably contributing to the LCP of the desktop version of your page and can also negatively impact your SEO.
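If it helps with debugging the repaint issue mentioned above, here is a rough sketch (using the standard PerformanceObserver API, run in the browser console) that logs every largest-contentful-paint candidate the browser reports, so you can see which element wins and when:

// Log each LCP candidate as the browser reports it: the timestamp and the element.
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // "element" is part of the LargestContentfulPaint entry, not the base PerformanceEntry type
    const lcp = entry as PerformanceEntry & { element?: Element };
    console.log(`LCP candidate at ${entry.startTime.toFixed(0)} ms:`, lcp.element);
  }
}).observe({ type: "largest-contentful-paint", buffered: true });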

Very slow "portfolio" section of Wordpress

A client of mine is adding content to his WordPress site, www.airsolid.ca.
He uses "portfolios" to add his different boat models. All seems fine, except that when we click "all boat models", the section that lists all the items takes up to 30 seconds to load.
Here is the direct link to the section:
http://www.airsolid.ca/bateaux/
Any idea what I could change to make it load in under 3-5 seconds? I have a feeling it loads all the images at once... and since there are many, it takes way too much time. Ironically, he doesn't even want the images to show when he lists the models.
Use https://tools.pingdom.com to monitor what's loading, how long it's loading, etc. You can see if images or scripts are holding it up.
Since the screen is white while it's loading, I'd imagine it's a query issue. You can use the Query Monitor plugin to help determine the cause.
I used pingdom and got these results: https://tools.pingdom.com/#!/ekJpVY/http://www.airsolid.ca/bateaux/
It had 1 request until the ~22 second mark at which point the CSS/JS/Image requests came in, which means that it's not being held up by scripts or images.
The page is only ~2mb, which means it's not loading all the images either. I'd start with Query Monitor - it's definitely something server side, probably a faulty WP_Query or other issue in a PHP loop.

Finding click-counter for NFP website, written in iframe

I am a non-programmer working for a church. We have no tech staff. Our website is based upon a template that doesn't provide a widget for counting clicks. We'd like to add one (or preferably two) jpg image(s) with a counter(s) to track the number of times clicked, and display the cumulative total next to the jpg(s). Church members will go to the page and click each time they participate in one or both of two different church objectives.
Our web host says to do this I must find, write, or purchase 3rd party code written in iframe, to embed into one of our pages.
I googled the issue and am only finding hit counters which track visitors to a page, rather than clicks of an image. We'd prefer two different jpgs to track two different objectives, but if necessary I can change from two jpgs to one, if having two counters on the same page is a problem.
Can anyone point me to where I could get code like this either for free, or for pay, and what it would cost?
There is a lot of good information in the question linked below. It covers the issue of the iframe receiving the click versus you recording it, and if you keep reading there is a way to work around it. Hope this helps!
Look here: Detect Click into Iframe using JavaScript
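For reference, the trick described in that linked question boils down to watching the parent window lose focus to the iframe; a minimal sketch of the idea (the counting and logging here are illustrative only - an actual counter would still need server-side storage):

// When a click lands inside a cross-origin <iframe>, the parent window fires "blur"
// and document.activeElement becomes that iframe.
let clickCount = 0;
window.addEventListener("blur", () => {
  setTimeout(() => { // let document.activeElement update before inspecting it
    const active = document.activeElement;
    if (active instanceof HTMLIFrameElement) {
      clickCount += 1;
      console.log(`iframe "${active.id}" has been clicked ${clickCount} times`);
      window.focus(); // take focus back so the next click into the iframe can be detected
    }
  }, 0);
});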

Standard way to persist data between requests in ASP.NET-MVC

What is the most standard or best way to persist data between requests?
Should I use cookies or session variables? I'm interested in keeping data like sort order, sort column, and page number (for pagination).
I'm coming from a webforms background so normally this type of thing was automatically handled for me in the viewstate of the controls I was using.
update
I like the querystring idea for searching and for more meaningful URLs; however, I'm working on an "index/list" view, which consists of a view with a header and "control" options (like DDLs for filtering) plus a partial view that renders the table of data.
The DDLs use $.load() to call an ActionResult on the controller, which returns the partial view, with the parameters passed in the querystring; but since these are AJAX requests, the main page URL in the user's browser does not get updated.
Is there a best-practice for taking querystrings off the main-page URL and using them in ajax requests to other ActionResults?
If you want it to survive only through one request/redirect, TempData is your friend.
However, for things like your pagination, URL is the best method, for the ability to share links alone.
A standard way is to pass those sort of things via URL Query Parameters. You can modify your routing to expect certain URL variables. That way the pages become more search engine friendly as well.
It depends on how permanent you want the information to be:
Things like the page number should indeed be in the URL (as others have pointed out) - this helps with bookmarking, etc, but remember that if you add more content to the list, then that bookmarked result set will not always be what the user wanted...
If you're happy for these values to be lost when a session times out (by default around 20 minutes), then put them in Session.
If you think that sessions are going to time out before the next request, or you want to save the values across visits, then you should store them in either cookies or a profile (potentially allowing "Anonymous" profiles, which rely on the user's cookies, so the values would be lost across machines).
Personally, I'd think very carefully about putting sort order and columns in the URL; if you do, you could actually end up really confusing search engines:
Lots of pages with very similar content (page 1 sorted by date descending, page 1 sorted by date ascending, etc.) - search engines don't like duplicate content, and nor should you, as Google (for instance) will only show two pages from your site in a default result set; you want those to be valid pages, not duplicates.
Search engines will spend lots more time crawling your site, and potentially give up - If on every page they find links to "Sort by this column", they will attempt to follow them, resulting in more work on the server, higher bandwidth use, etc.
These problems can be mitigated with a robots.txt file denying access to the sorted versions of the page, but if those URLs are generated dynamically that will be very complex to maintain going forward.
In response to your update, a nice way to achieve that for paging would be to output real "Previous" and "Next" links (or, better yet, links to all the pages in the list) with the page numbers in their URLs, and then hide them with JavaScript.
This way users should see your nice, AJAXy behaviour, and search engines (and users without JavaScript - mobile, or those using older screen readers for instance) will still be able to get access to all your pages - this will help your pages to degrade gracefully, or use "Progressive Enhancement".
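As a rough sketch of that approach (the element id, the data-page attribute and the URLs here are invented for the example, and it uses fetch plus history.pushState instead of $.load()): intercept the real pagination links, fetch the partial view, and push the same querystring into the address bar so the URL stays bookmarkable:

// Fetch a page of results and keep the browser's address bar in sync.
async function loadPage(url: string): Promise<void> {
  const response = await fetch(url, { headers: { "X-Requested-With": "XMLHttpRequest" } });
  const html = await response.text();
  document.getElementById("results")!.innerHTML = html; // the partial view's markup
  history.pushState(null, "", url); // the querystring stays visible and shareable
}

// Progressive enhancement: the plain <a href="?page=2&sort=name"> links work without JavaScript;
// with JavaScript we intercept them and load just the partial view instead.
document.getElementById("results")!.addEventListener("click", (event) => {
  const link = (event.target as HTMLElement).closest("a[data-page]");
  if (link instanceof HTMLAnchorElement) {
    event.preventDefault();
    void loadPage(link.href);
  }
});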
Things that were previously in viewstate should probably be put back in the client's hands via either hidden fields or cookies.
Session is "too" easy. In a dev environment it works great, pretty much no matter what you put in it. In production, scalability and persistence become a problem. In-process session is likely to disappear unexpectedly if you have a crashing bug in your site, and it requires server affinity when load balancing. Out-of-process session fixes the durability and affinity issues, but can still be a performance bottleneck if too much stuff is put in it. A VERY common problem is that each page puts one or two items into session but never takes them out again when it is done. And even if a page removes its session data when it is no longer needed, the data can still get orphaned if a user starts a process and never completes it.
Cookies are a fast and simple way to persist data between requests, and you can also make them live only for a limited time, depending on your needs.
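For the sort/page values specifically, here is a small sketch (client-side, using document.cookie; in MVC you could equally set these on the Response server-side) of a cookie with a limited lifetime; the names and values are just examples:

// Remember the user's sort preference for 7 days.
const expires = new Date(Date.now() + 7 * 24 * 60 * 60 * 1000).toUTCString();
document.cookie = `sortColumn=Name; expires=${expires}; path=/`;
document.cookie = `sortOrder=asc; expires=${expires}; path=/`;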
Session is easiest.

Bust iFrames accurately when implementing DiggBar or FacebookBar?

Understanding all the security and UI concerns with iFrames, I am implementing a toolbar similar to the DiggBar or FacebookBar.
A top bar persists across the top 30 pixels of the screen, and an iFrame displaying external content fills up the remainder of the page.
When users close the toolbar, and thereby exit my little site to go directly to the third-party site, how can I bust the iFrame properly and display the right page? If the user clicks on even one link in the iFrame, I end up showing the wrong page.
Given my understanding of browser security, and coupled with how DiggBar and FacebookBar fail to do this accurately, I'm guessing it cannot be done.
But I was hoping the Stackoverflow coders are smarter and might have an answer? :)
Thanks!
You can't. Because of the browser's cross-site security restrictions (the same-origin policy), your bar, which sits in its own frame, cannot access any other frames to determine their URLs.
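A small sketch of why (the frame id is a placeholder): the best the toolbar page can do is fall back to the URL it originally framed, because reading the current location of a cross-origin frame throws a security error.

const frame = document.getElementById("framed-content") as HTMLIFrameElement; // placeholder id
let target = frame.src; // the URL we originally loaded is all we reliably know
try {
  // this only works while the framed page is same-origin with the toolbar page
  target = frame.contentWindow!.location.href;
} catch {
  // cross-origin access is blocked by the browser, so we keep frame.src
}
window.location.href = target; // "bust" out to the best URL we have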
Not to mention that you'll be sued by website owners for numerous things and that you'll piss off every hacker out there.
This is the last thing you want to do if you'd like to NOT be known around your office as that one guy who wanted to include everyone else's web site in his own without the owners' permission.
I wouldn't speak up at any of the conventions either.
I've also added the question: "Have you ever written code or worked on code that frames other sites?" to my list of questions to use to weed out job applicants.
