Chrome makes multiple requests for the same asset if capitalization differs - asp.net

I'm working on a large asp.net web project that has had a number of different developers/consultants making changes to it over the last few years. I've noticed that depending on the developer, paths to images and other static content may contain the correct casing, all lower case, or something completely random. The browser appears to be making multiple requests for the same asset due to the differences in casing. For example:
<!DOCTYPE html>
<html>
<head>
<title></title>
</head>
<body>
<img src="http://cdn.sstatic.net/stackoverflow/img/sprites.png" />
<img src="http://cdn.sstatic.net/stackoverflow/Img/sprites.png" />
</body>
</html>
Aside from searching for every image in the project and normalizing the casing, is there anything that can be done here? Perhaps something I can put in the page response headers to tell the browser to ignore casing, etc.

Well, the browser has to do this (it's not just Chrome; any browser that didn't would be buggy), because there's no way for it to know that you happen to be using a case-insensitive mapping, so <http://cdn.sstatic.net/stackoverflow/img/sprites.png> and <http://cdn.sstatic.net/stackoverflow/Img/sprites.png> are completely different URIs.
There are a few things you can do.
First, find-and-replace those references that are:
Particularly commonly used.
Particularly heavy files.
Particularly commonly mis-spelt.
Not likely to result in you find-replacing something that ruins unrelated code.
Another thing you can do is force canonicalisation of case in a handler: when it is invoked for a URI that doesn't match your case-canonicalisation rules, it 301s to the form that does. This means that rather than grabbing 3 different 10kB images, you'll grab one 10kB image and incur 2 or 3 redirects of a couple hundred bytes each. That said, below a certain file size the cost of the extra request outweighs the saving.
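For illustration, here is that redirect idea as a minimal sketch, written as a Node/Express-style middleware since the logic is the same on any server (in ASP.NET it would live in an HTTP module or handler; the /img route and the all-lowercase rule are assumptions):

// redirect any asset URI that isn't in canonical (here: all-lowercase) form
const express = require('express');
const app = express();

app.use('/img', (req, res, next) => {
  const canonical = req.path.toLowerCase();
  if (req.path !== canonical) {
    // 301 so browsers and caches remember the canonical URI
    return res.redirect(301, '/img' + canonical);
  }
  next();
});

app.use('/img', express.static('img')); // serve the actual files
app.listen(8080);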
Finally, you can use a filter (a stream object that Response.Filter is set to, which writes to the previous value of Response.Filter) or code in the PreRender step that scans for local URIs (if you changed the case of URIs on other sites you could cause 404s) and outputs them in canonical case.

Related

What is the benefit of base64 encoding a favicon?

I've got this web app where the favicon is inlined in the HTML, e.g.,
<link rel="icon" href="data:image/x-icon;base64,A VERY VERY LONG STRING...">
However, I can definitely see that both Chrome and Firefox (latest versions as of this writing) issue a request to favicon.ico at the root of my website anyway, e.g. http://example.com/favicon.ico.
In case it matters:
The base64-encoded string embedded in the href attribute is quite big.
The favicon <link> tag is managed by react-helmet
The website itself isn't particularly slow. (Consistent good Apdex score throughout.)
I can only assume that the developers at the time (all gone now) wanted to inline the favicon to avoid an HTTP request and therefore wrote some "infrastructure" to support that: namely using a Webpack plugin to automatically base64 encode all assets imported as JavaScript modules (e.g. import favicon from './assets/favicon.ico').
Clearly this isn't working as intended, but what strikes me the most is that the actual base64 string weighs more than the favicon.ico file itself (20k vs 15k). So I'm not sure where the benefit is (if any).
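For reference, that kind of blanket inlining is typically wired up with a webpack rule along these lines (a sketch of the approach, not this app's actual config; the test pattern is an assumption):

// webpack.config.js - inline matching assets as base64 data URIs
module.exports = {
  module: {
    rules: [
      {
        test: /\.ico$/,
        type: 'asset/inline' // webpack 5: emit a data URI instead of a separate file
      }
    ]
  }
};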
While I don't know any better than you why the original developers designed it that way, it does make sense for offline rendering of a simple all-in-one HTML file.
I actually just looked this up, because I am building a SUPER small all-in-one HTML file. I don't have to ship an extra file if the icon is base64-encoded into the single HTML file.
Here's my last two days of reading condensed into a few minutes.
As of 2021, 93% of browsers in use could display an SVG as a favicon:
https://en.wikipedia.org/wiki/Usage_share_of_web_browsers
https://caniuse.com/link-icon-svg
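Referencing an SVG favicon is a one-liner (the file name here is just an example):

<link rel="icon" type="image/svg+xml" href="/favicon.svg">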
.ICO is an outdated way to create favicons and requires you to make multiple small sizes of your image, whereas .PNG can scale down from any size, which makes it easily the best lazy option for a quick icon. Because favicons are displayed so small, any complex picture becomes indistinguishable, so very simple designs are optimal.
This is where .SVG shines.
https://www.iconfinder.com/
find image > inspect > open in new page > save image as
Paint 3D's Magic Select is a free tool worth mentioning.
This is by far the most informative and straightforward video on automatic SVG conversion:
https://www.youtube.com/watch?v=10m_2bPXa1s
Now we're left with 4-8 KB of data, which could be a fifth of your .PNG's size.
Next we'll want to optimize it:
https://www.youtube.com/watch?v=iVzW3XuOm7E
So we could skip an extra request by having all the data in the head, but that leads us here:
https://css-tricks.com/probably-dont-base64-svg/
Now say we're creating a Single Page Application and care about SEO. Not only do we score higher and reduce our load times, but we also offer a better experience for users on the slowest connections.

Are website response files sent to the browser in a single round trip?

I was asked a question in an interview: "if a website request is made in a browser, does its response, including HTML, images and JS files, come to the browser in a single round trip or in multiple internal round trips with the server?" The interviewer told me it is done through multiple round trips (internally).
However, I am not convinced, because everywhere I search I find the answer described as a single response. Any help to understand this better?
If you look inside an HTML file you'll find references to external resources like:
<img src="{name of image file etc}"/>
<link rel="stylesheet" href="[filename of stylesheet]" />
<script src="..."></script>
These are some of the elements within an HTML file that trigger additional requests.
So although a request for a web page may appear to produce a single response, it's actually an aggregate: the browser parses the HTML from the first round trip, then issues a separate request (and round trip) for each referenced resource, such as stylesheets, images and JavaScript files.
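You can see this for yourself with a minimal server that logs every request it receives (a sketch, assuming Node.js; the file names are made up):

// serve a page that references two external resources and log each request;
// loading the page in a browser logs "/", then "/style.css" and "/logo.png"
const http = require('http');

http.createServer((req, res) => {
  console.log('request:', req.url);
  if (req.url === '/') {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<link rel="stylesheet" href="/style.css"><img src="/logo.png">');
  } else {
    res.writeHead(404); // the assets don't exist here; the point is the extra requests
    res.end();
  }
}).listen(8080);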

"Eliminate render-blocking CSS in above-the-fold content"

I've been using Google PageSpeed Insights to try and improve my site's performance, and so far it's proven extremely successful. Things like deferring scripts worked beautifully: since I already had an in-house version of jQuery's .ready() to defer scripts until the page had loaded fully, all I had to do was inline that particular function and move the full scripts to the end of the page. That worked great.
But now I find myself glaring at the one remaining yellow dot on the checklist: "Eliminate render-blocking CSS in above-the-fold content".
The way my CSS is set up is to have one global _.css file containing styles that apply to the page structure in general, or are used in more than one or two places across the site. Most pages then have an associated CSS file (for instance, party.php has party.css) containing styles specific to that particular page. All CSS files are cached indefinitely, as I append /t=FILEMTIME to filenames (and later remove them with .htaccess) in order to guarantee that files are updated when they are changed.
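For reference, the .htaccess side of that scheme is a small rewrite rule, roughly like this (a sketch; the project's actual rule may differ):

RewriteEngine On
# strip the trailing /t=<mtime> tag so the real file is served
RewriteRule ^(.+)/t=\d+$ $1 [L]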
So anyway, Google recommends inlining critical styles needed for above-the-fold content. Trouble is... well, take a look at this screenshot: http://prntscr.com/1qt49e
As you can see... ALL of the content is above-the-fold! People hate scrolling, especially on a game that involves loading many pages. So I designed the site to fit on one screen (assuming a good enough resolution). So that means... ALL of the styles apply to above-the-fold content! So... is there any solution? Or am I stuck with that yellow mark on an otherwise near-perfect score?
A related question has been asked before: What is “above-the-fold content” in Google Pagespeed?
Firstly you have to notice that this is all about 'mobile pages'.
So if I interpret your question and your screenshot correctly, this does not apply to your site!
On the contrary - doing some of the things Google advises in these guidelines would make things worse rather than better for 'normal' websites.
And not everything that comes from Google is the "holy grail" just because it comes from Google. They themselves are not a good role model if you have a look at their own HTML markup.
The best advice I could give you is:
Set width and height on replaced elements in your CSS, so that the browser can lay out the elements and doesn't have to wait for the replaced content!
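For example (a sketch; the selector and dimensions are made up):

img.header-logo {
    /* reserve the final box up front so the browser doesn't reflow
       the page when the image finally arrives */
    width: 200px;
    height: 100px;
}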
Additionally, why do you use different CSS files rather than just one?
The additional request costs more than the small amount of extra data volume. And after the first request the CSS file is cached anyway.
The things one should always take care of are:
reduce the number of requests as much as possible
keep your overall page weight as low as possible
And don't puzzle your brain about how to get 100% in Google's PageSpeed Insights tool...! ;-)
Addition 1: Here is the page where Google shows us what they recommend for Optimize CSS Delivery.
As said before, I think this is neither realistic nor sensible for a "normal" website! Mainly because with a responsive web design you are almost certainly using media queries and other layout styles. So if you don't load your CSS first, and in a blocking manner, you'll get a FOUC (Flash Of Unstyled Content). I really do not believe that this is "better" than a few extra milliseconds to render the page!
Imho Google is starting a new "hype" here (judging by all the questions about it on Stack Overflow)...!
How I got a 99/100 on Google Page Speed (for mobile)
TL;DR: Compress your entire stylesheet and embed it between your <style></style> tags.
I've been chasing down that elusive 100/100 score for about a week now. Like you, the last remaining item was eliminating "render-blocking CSS for above-the-fold content."
Surely there is an easy solution?? Nope. I tried out Filament Group's loadCSS solution. Too much .js for my liking.
What about async attributes for css (like js)? They don't exist.
I was ready to give up. Then it dawned on me: if linking the stylesheet was blocking the render, what if I embedded my entire CSS in the head instead? That way there would be nothing to block.
It seemed absolutely WRONG to embed 1263 lines of CSS in my style tag. But I gave it a whirl. I compressed it (and prefixed it) first using:
postcss -u autoprefixer --autoprefixer.browsers 'last 2 versions' -u cssnano --cssnano.autoprefixer false *.css -d min/
See the npm postcss package.
Now it was just one LONG line of space-less css. I plopped the css in <style>your;great-wall-of-china-long;css;here</style> tags on my home page. Then I re-analyzed with page speed insights.
I went from 90/100 to 99/100 on mobile!!!
This goes against everything in me (and probably you). But it SOLVED the problem. I'm just using it on my home page for now and including the compressed css programmatically via a PHP include.
YMMV (your mileage may vary) depending on the length of your CSS. Google may ding you for too much above-the-fold content. But don't assume; test!
Notes
I'm only doing this on my home page for now, so people get a FAST render on my most important page.
Your CSS won't get cached. I'm not too worried though. The second they hit another page on my site, the .css will get cached (see Note 1).
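If you'd rather not paste the CSS in by hand, the same inlining can be done as a build step; here is a minimal sketch in Node (the file and placeholder names are made up):

// read the compressed CSS and splice it into the page template at build time
const fs = require('fs');

const css = fs.readFileSync('min/styles.css', 'utf8');       // the cssnano output
const html = fs.readFileSync('home.template.html', 'utf8');  // contains <!-- inline-css -->
fs.writeFileSync('home.html',
  html.replace('<!-- inline-css -->', '<style>' + css + '</style>'));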
A few tips that may help:
I came across this article on CSS optimization yesterday:
CSS profiling for ... optimization
A lot of useful info on CSS and on which CSS causes the most performance drains.
I saw the following presentation at jQuery UK on "hidden secrets" in Google Chrome (Canary) DevTools:
DevTools Can do that.
Check out the sections on Time to First Paint, repaints and costly CSS.
Also, if you are using a loader like RequireJS you could have a look at one of the CSS loader plugins, called require-css, which uses CSSO - an optimizer that also does structural optimization, e.g. merging blocks with identical properties. I used it a few times and it can save quite a lot of CSS from case to case.
Off the question:
I second @Enzino in creating a sprite for all the small icons you are loading. The file sizes are so small that they don't really warrant a server round trip for each icon. Also keep in mind the total number of concurrent HTTP requests a browser can make; requests for a large number of small icons are "render-blocking" as well. Although it's an almost empty page compared to yours, I like how DuckDuckGo loads, for example.
Please have a look at the following page: https://varvy.com/pagespeed/render-blocking-css.html
It helped me get rid of render-blocking CSS. I used the following code, and PageSpeed Insights no longer reports an issue with render-blocking CSS.
<!-- loadCSS -->
<script src="https://cdn.rawgit.com/filamentgroup/loadCSS/6b637fe0/src/cssrelpreload.js"></script>
<script src="https://cdn.rawgit.com/filamentgroup/loadCSS/6b637fe0/src/loadCSS.js"></script>
<script src="https://cdn.rawgit.com/filamentgroup/loadCSS/6b637fe0/src/onloadCSS.js"></script>
<script>
/*!
loadCSS: load a CSS file asynchronously.
*/
function loadCSS(href){
    var ss = window.document.createElement('link'),
        head = window.document.getElementsByTagName('head')[0];
    ss.rel = 'stylesheet';
    ss.href = href;
    // temporarily set media to something non-matching to ensure it'll
    // fetch without blocking render
    ss.media = 'only x';
    // append the link to <head>
    head.appendChild(ss);
    setTimeout(function(){
        // set media back to `all` so that the stylesheet applies once it loads
        ss.media = 'all';
    }, 0);
}
loadCSS('styles.css');
</script>
<noscript>
<!-- Let's not assume anything -->
<link rel="stylesheet" href="styles.css">
</noscript>
I too have struggled with this new PageSpeed metric.
Although I have found no practical way to get my score back up to 100%, there are a few things I have found helpful.
Combining all CSS into one file helped a lot. All my sites are back up to 95%-98%.
The only other thing I could think of was to inline all the necessary CSS (which appears to be most of it, at least for my pages) on the first page to get the sweet high score. Although that may help your speed score, it will probably make your page load slower.
The 2019 optimal solution for this is HTTP/2 Server Push.
You do not need any hacky JavaScript solutions or inline styles. However, you do need a server that supports HTTP/2 (any modern server version will), which itself requires your server to run SSL. With Let's Encrypt, though, there's no reason not to be using SSL anyway.
My site https://r.je/ has a 100/100 score for both mobile and desktop.
The reason for these errors is that the browser gets the HTML, then has to wait for the CSS to be downloaded before the page can be rendered. Using HTTP/2 you can send both the HTML and the CSS at the same time.
You can use HTTP/2 push by setting the Link header.
Apache example (.htaccess):
Header add Link "</style.css>; as=style; rel=preload, </font.css>; as=style; rel=preload"
For NGINX you can add the header to your location block in the server configuration:
location = / {
add_header Link "</style.css>; as=style; rel=preload, </font.css>; as=style; rel=preload";
}
With this header set, the browser receives the HTML and CSS at the same time which stops the CSS from blocking rendering.
You will want to tweak it so that the CSS is only pushed on the first request, but the Link header is the most complete and least hacky solution to "Eliminate render-blocking JavaScript and CSS".
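That "first request only" gating can be as simple as a cookie check before setting the header; here is a minimal sketch as a plain Node server (the cookie name is made up, and the same gate can be expressed in Apache or NGINX config instead):

// send the preload Link header only to visitors who don't have the cookie yet
const http = require('http');

http.createServer((req, res) => {
  const seen = (req.headers.cookie || '').includes('css_pushed=1');
  if (!seen) {
    res.setHeader('Link', '</style.css>; as=style; rel=preload');
    res.setHeader('Set-Cookie', 'css_pushed=1; Max-Age=86400');
  }
  res.setHeader('Content-Type', 'text/html');
  res.end('<link rel="stylesheet" href="/style.css">...');
}).listen(8080);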
For a detailed discussion, take a look at my post here: Eliminate Render Blocking CSS using HTTP/2 Push
Consider using a package to automatically generate inline styles from your css files. A good one is Grunt Critical or Critical css for Laravel.

CSS files that have numbers in their query string? [duplicate]

I was browsing the html of my favorite site...ahem...and I saw this in the markup:
<link href="/Content/all.min.css?d=20090107" rel="stylesheet" type="text/css" />
What does "?d=20090107" do? I'm assuming it's a date of some kind, but I'm not sure why it's in the path to the file. Any ideas?
That is there to add some uniqueness to the URL, so that when they change the CSS file, they can change the extra bit to be totally sure that every client will reload the CSS rather than use a cached version.
The web server will ignore the query string and serve /Content/all.min.css as normal.
Note: while it's possible the CSS is dynamically generated, this is a common idiom for ensuring a reload, and given the parameter is a date, that seems quite likely.
Edit: Podcast 38 mentioned this...
"We've been using the Expires or Cache-Control header since we launched. This saves the browser round-trips when getting infrequently changing items, such as images, javascript, or css. The downside is that, when you do actually change these files, you have to remember to change the filenames. A part of our build process now 'tags' these files with a version number so we no longer have to remember to do this manually."
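The tagging itself is tiny; here is a sketch in Node of deriving the value from the file's modification time (the original site may well compute it server-side instead):

// build a cache-busting link tag whose ?d= value changes only when the file does
const fs = require('fs');

const mtime = fs.statSync('Content/all.min.css').mtime;
const d = mtime.toISOString().slice(0, 10).replace(/-/g, ''); // e.g. 20090107
console.log('<link href="/Content/all.min.css?d=' + d + '" rel="stylesheet" type="text/css" />');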
It's to "clear the cache" every time the style is updated. I would speculate that whoever is responsible for those styles increments it every time there is a change. Because the browser sees a different URL in the link tag, it will grab the latest version, even though the file is technically in the same place on the server.
As helpfully pointed out in the comments, CSS files often have their expiry set well into the future; this method is a nice way to sidestep cache-related headers.
Quite a useful trick.
It is to make the browser think it is a new file each time, so that it refreshes its cache.
Very useful when your stylesheets change regularly...