Efficiency of data:image in CSS

Was wondering if this
background:#092542 url('data:image/png;base64, ....') center top no-repeat;
is more efficient than
background:#092542 url('myimage.png') center top no-repeat;
I'm trying to cut down on load times and if this would help out, great.

If the image is only used once, on one page, then inlining it might give a very slight (probably not noticeable) improvement in load time. However, it also means the browser can't cache the image on its own, so if it's used on another page it will have to be transferred again, which means double (triple, etc.) the data is being transferred.
Generally, it's not worth it, although on a micro site it might make things slightly easier - but not by much.

Data URI images ARE cached. However, the caching of the image depends on how the file it is contained in is cached. If the stylesheet that includes the data URI image is cached, then the image will not be downloaded again until the stylesheet itself is re-downloaded.
I'd also recommend this post explaining data URIs further, especially the section on performance.
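If you want to try the inline approach, here is a minimal Node.js sketch for generating such a rule (the file name is just an example):

// generate-css.js - print a CSS rule with the image inlined as a base64 data URI
const fs = require('fs');
const b64 = fs.readFileSync('myimage.png').toString('base64');
console.log(`background: #092542 url('data:image/png;base64,${b64}') center top no-repeat;`);

Keep in mind that base64 encoding inflates the payload by roughly a third, so this mostly pays off for small images.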

Related

Why is page using CSS columns reflowing on load?

I have a single page using CSS columns that reflows on load.
Even when I remove all IMGs and iFrames, so it is fetching no
external resources, it reflows.
I can't figure out what is causing the reflow/repaint. Any CSS experts
out there able to figure this one out?
https://github.com/treenotation/dumbdown/issues/8
There's too much content in the document.
The browser displays the content gradually, that is, the content involved in rendering gradually increases, which affects the layout calculation.
You can add a "loading state" style (see the sketch below). When the window.onload event fires, change the style to "load complete".
Or use a 'Masonry' layout.
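A minimal sketch of that loading-state idea (the is-loading class name and the .columns selector are assumptions):

/* CSS: keep the column layout hidden while content is still arriving */
.is-loading .columns {
  visibility: hidden;
}

// JS: the body starts with class="is-loading"; remove it once everything has loaded
window.onload = function () {
  document.body.classList.remove('is-loading');
};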
The reason for the reflow is: a huge amount of content but NO strategy to handle it. Indeed there are many things you can avoid/do/change ...
REASON WHY - just to ground the answer ...
The reflow is caused by this mechanic: at first the text (HTML code) is downloaded and rendered. But there are still a lot of elements (mostly images, but YouTube videos and iframes as well) that are still downloading. As the browser doesn't know the size of those elements, it does not reserve space for them.
Now: after the download and rendering of each element has finished, the browser injects the element into the content, and all of the following content is pushed down and, in your case, into the next column ... reflow.
STRATEGY: MULTIPLE ACTIONS
To your question: there is not just one cause of the many long downloads. So there is no simple single answer and NOT A SINGLE SOLUTION. The strategy you need is to optimize the page with a bundle of multiple actions. But I believe that by doing so you can reduce the reflow to an acceptable amount ... and maybe there is a chance to eliminate it.
THINGS YOU CAN DO
1. Change the layout
Change the layout to a more current web technique: that means don't use columns (flowing left to right) but a style that arranges the page flow from top to bottom. Then you can asynchronously load the needed elements as the user scrolls down. The technique is called infinite scrolling: How to do an infinite scroll in plain Javascript
But I assume, as the special layout has its charm, this won't be an option for you!?
2. Images which are not shown - remove unneeded elements from the download
On your page I found images which are downloaded but never shown on the page. (Example: 3.png, an INCREDIBLY USELESS 659KB.) Remove such elements from your content.
3. Reduce unneeded element sizes
Additionally, a lot of the images shown on your page have incredibly large file sizes that are not needed.
Example: devices.png
image size: 692x285px - actual size
layout size: 287x118px - needed size
file size downloaded: 110KB
file size needed: ~4KB - if (losslessly) optimized
And think about it: many little file downloads add up to a big amount ... and you have a lot of downloads! Do the math for 10 images: your way it is 1.1MB; it can be done with 40KB.
Additionally:
if you need higher resolutions, use srcset attributes (see the sketch after this list)
sometimes there is a practical problem with the editors' knowledge: teach them how to losslessly reduce images and advise them which sizes to use for the images in the layout
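A minimal srcset sketch (file names and widths are illustrative; the browser picks the smallest file that suffices for the layout):

<img src="devices-287.png"
     srcset="devices-287.png 287w, devices-574.png 574w"
     sizes="287px"
     alt="Devices">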
4. Use a faster server for images
It seems the download rate from your server is not the fastest. That's normal with most providers. As you have a lot of images, think about loading the images from a faster server - if possible. Example: the pure download service from AWS (Amazon Web Services) is incredibly fast. As you just need a bucket for downloading, it is not that expensive ... try it out.
5. Use placeholders for elements
As you have a lot of elements, I think you may not be able to avoid the late injection that causes the reflow. But you can use placeholders for your elements so the needed space is reserved and the reflow does not happen for those elements.
Just define the HTML structure and the possible sizes in your layout. That additionally helps the editors, as they know what image sizes they can use. Then size the placeholders with CSS and initiate an AJAX image download with JS.
In case of a late download the user may see a placeholder at first, but no reflow. You can do that with a few lines of code. I attach an example at the end of this post.
NOTE: You can do this with (YouTube) videos or iframes in a similar way ;-)
6. Use vanilla JS instead of jQuery
As I saw, the download of jQuery has an incredible impact on your download time. Really. (That's the reason I assume your server is a slow one.) Have a look at the download times of your elements: jQuery is one of the elements that takes the most time and blocks your other elements from rendering.
jQuery is an old dog. Modern pages use vanilla JS ... and as far as I can see there is nothing complicated on your page that you could not do in vanilla JS. So the recommendation is to remove it from your page (if possible) and you will gain a huge speed advantage.
7. Use CDNs for downloads when possible
Downloading frameworks and fonts from your own server makes pages slow and blocks time from the download of other elements. Use a CDN instead.
As far as I have seen, your fonts are already loaded from a CDN, but jQuery still comes from your server. If you don't want to change to vanilla JS, load it from a CDN instead.
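For example (the version number is just illustrative), jQuery loaded from its official CDN:

<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>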
8. Check whether YouTube can be loaded more simply
YouTube is loaded onto your page through several actions. In this case I AM NOT SURE, as I have not worked with YouTube for a longer time. But I believe (not sure if I am right) that there is a more direct way to include YouTube videos on a page. Maybe you would like to check it.
But nevertheless: work with placeholders for the video players as well. That is almost just a few additional lines of CSS.
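For reference, a minimal sketch of the standard iframe embed (VIDEO_ID is a placeholder); the explicit width and height reserve space just like an image placeholder does:

<iframe width="560" height="315" src="https://www.youtube.com/embed/VIDEO_ID"
        title="YouTube video" frameborder="0" allowfullscreen></iframe>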
9. Optimize the user experience: think about a preloader
Reflow is not a new phenomenon on web pages. A lot of pages still use preloaders to create a better user experience, even though today's technique is AJAX loading...
I don't know if the described techniques will avoid the reflow completely. As there are many elements, the download time cannot be reduced to zero. But optimizing the page will reduce it dramatically. If a little still remains ... maybe you'd like to think about the older technique: a nice, well-designed preloader symbol can indeed improve the user experience. On mobile views with medium data speeds there may be no other chance...?
But that is just an additional possibility to think about ...
[update]
10. Combine placeholders with infinite scroll
If you are using placeholders you can/should combine them with the infinite-scroll technique.
That means: all media (particularly images, but maybe videos and iframes as well) are pre-positioned by sized placeholders. That takes effect immediately, so there should be no more reflow. Then load the media asynchronously via AJAX based on their position on the screen; images that are in view are loaded immediately.
As you don't have many media elements in the starting viewport (most are below the fold), that should work as if it were a page with a 'normal number' of pictures/media.
All the others load afterwards, as scrolling brings them into view, like on an 'infinite scroll' page. (Note: this works only if the file sizes of the images are not too large, so optimizing the images still has to be done.)
This has the additional advantage of making sure the images are loaded in the sequence they are needed ... which saves a lot of time.
This could be done in JavaScript:
Place images/media using the placeholder technique
On window.onload, check which images/media are in the viewport and load them. Don't forget images which are only partly visible.
On window.onscroll, check whether an image has come into the viewport and load it
Note: I am not quite sure whether there are anchor links on your page to the single articles. I believe not. But if you use them, the starting viewport can be anywhere on the page when the user opens an article. In that case window.onscroll has to work not only when scrolling down but when scrolling up too.
I am not quite sure whether there is a ready-made script available, but I would be surprised if not. Otherwise it would not be too tricky to write it on your own; see the sketch below. That has the charm that such scripts mostly have less and cleaner code than pre-built ones ...
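A minimal sketch of those three steps, reusing the data-lazy-url placeholder markup from the example at the end of this answer (the partial-visibility check also covers scrolling up):

// load every placeholder image that is at least partly inside the viewport
function loadVisibleImages() {
  var lazyImages = document.querySelectorAll('img[data-lazy-url]');
  for (var i = 0; i < lazyImages.length; i++) {
    var rect = lazyImages[i].getBoundingClientRect();
    if (rect.bottom > 0 && rect.top < window.innerHeight) {
      lazyImages[i].src = lazyImages[i].getAttribute('data-lazy-url');
      lazyImages[i].removeAttribute('data-lazy-url'); // never load twice
    }
  }
}
window.onload = loadVisibleImages;
window.onscroll = loadVisibleImages;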
[end update]
... ... ...
I am not quite sure whether the described issues are complete. Usually more possibilities to optimize a page are found once you start the process. But having taken a closer look at your page, these are the most important opportunities.
EXAMPLE: LAZY IMAGE LOAD WITH PLACEHOLDER EFFECT
Just an EASY AND SIMPLIFIED example of lazy image loading. Please adapt it to your needs.
// new HTML for the image: the real URL goes in a data attribute
<img class="placeholder-size" src="path/placeholder.jpg" data-lazy-url="https://url-to-your/imag.png" alt="Image Description">
// CSS: reserve the final layout size so the late load causes no reflow
.placeholder-size {
  width: 200px;
  height: 100px;
}
// JS for the lazy load
// older code but works; modernize if needed
window.onload = function () {
  var lazyImages = document.querySelectorAll('img[data-lazy-url]');
  for (var i = 0; i < lazyImages.length; i++) {
    lazyImages[i].src = lazyImages[i].getAttribute('data-lazy-url');
  }
};

Most effective way to import large images?

I am interested in adding a landscape footer on my website, but the image size is 115KB and it will load on every page... Is there any effective way to load a huge image such as this one:
http://gyazo.com/5b1b7312ec4370873368df0181e41b13.png
Here's a few things that may help you:
EDIT: I tried the second tip in the list below (tinypng.com) and it reduced the size of your image by 71%, to 39.1 KB. So that's a big win.
Make sure to set the cache headers on your webserver so that the browser can cache the file. Also use the same URL everywhere else you use the image. Doing these two simple things will ensure that the image only gets downloaded the first time the browser requests it. All other times it will be loaded from the browser's cache.
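A minimal sketch of such cache headers, here with Node.js/Express (the /images path and the one-year lifetime are assumptions):

// serve images with long-lived cache headers
const express = require('express');
const app = express();
app.use('/images', express.static('images', { maxAge: '365d', immutable: true }));
app.listen(3000);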
Make sure to check that the image is as small as it can be. If you use a PNG, then use tools like https://tinypng.com/ to squash all metadata out of the image. If you use a JPEG, then maybe lower its quality. If you use Photoshop, make sure to "Save for Web". This will also reduce the size. For photographs you are usually better off using JPEGs; for text or other images that need to be lossless, use PNG or GIF.
Loading images will not really slow down your page that much. Not like JavaScript, anyway. Loading JavaScript blocks the rendering of the page until the JS file is downloaded, unless you use special loading techniques. That is not the case for images: the page will continue being rendered and the user can start using the page.
If the image is placed using an IMG tag, then make sure to set the width and the height of the image in the CSS (or using the img width and height attributes). That ensures the browser does not need to reflow the page when the image is downloaded: it knows what size the image needs to be even before it arrives.
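For example (the file name and dimensions are illustrative):

<img src="footer.png" width="1200" height="115" alt="Landscape footer">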
There is a maximum number of parallel requests per domain that the browser will make. If the image has a very low priority, you could postpone its loading and wait for the onLoad event. This makes sure the other resources (with a higher priority) are downloaded first. It requires some JavaScript, but not that much (use an image lazy loader; there are many).
As I said in the previous item, there is a maximum number of requests PER DOMAIN. This is why you could also create a new (sub)domain and load some content from there. It will increase the total number of resources that can be downloaded in parallel. Using a CDN will also help here, because it is a separate domain (and CDNs are optimised as well).
If you want to read some REALLY GOOD books about this kind of optimising, read these:
http://www.amazon.com/High-Performance-Web-Sites-Essential/dp/0596529309
http://www.amazon.com/Even-Faster-Web-Sites-Performance/dp/0596522304

Pseudo-Random image via CSS-sprite (random location)

I want a header image in my HTML to be random. I have accomplished this by using this PHP file, however I would like to do something different. I want the random images to be part of one sprite. That way the images all load at the same time and the user won't see the images load when navigating to different pages. I would also like to choose the random factor, i.e. show this picture 50% of the time and the others 10% of the time each (if there are 6 pictures). Is this even possible, and where would an amateur start? Is this the best way to implement my scenario so that the user sees as little image loading as possible?
You could have the sprited header style declared in CSS as you normally would, and then simply adjust the x/y background position of the sprite in the HTML. You would be able to recycle some of the logic you already have and manipulate individual header probability; see the sketch below.
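A minimal sketch of that idea (the .header class, the sprite file name, the offsets, and the weights are all assumptions; here the first image shows 50% of the time and the other five 10% each):

/* assumed CSS: the header is sized to one sprite cell; images stacked 200px apart */
.header {
  width: 900px;
  height: 200px;
  background: url('header-sprite.png') no-repeat 0 0;
}

// JS: pick a vertical sprite offset with weighted probability
var offsets = [0, -200, -400, -600, -800, -1000]; // one entry per image
var weights = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1];     // must sum to 1
var r = Math.random(), sum = 0, pick = offsets.length - 1;
for (var i = 0; i < weights.length; i++) {
  sum += weights[i];
  if (r < sum) { pick = i; break; }
}
// assumes an element with class="header" exists in the page
document.querySelector('.header').style.backgroundPosition = '0 ' + offsets[pick] + 'px';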
I'd stick with individual assets though, especially if they are the size of a google doodle. Easier to extend that way and, assuming the rest of your static content has already been cached, any overhead of downloading a new image would be negligible.

Trade off for CSS sprites: how many images to combine into one

I wonder what the best trade-off for combining images for CSS sprites is.
Say I have 50 images, but each page only needs 5. The total size of a sprite image is 100kb.
There are obviously a dozen parameters: how many pages are visited per visit, the users' connection speeds, the latency. I'm not looking for a mathematical formula to compute the best trade-off, since I cannot estimate these parameters precisely enough.
So, do you have any experience-based rules of thumb for when to combine images into a sprite and when not to? (Actually, the "not" is the more interesting part, IMHO.)
Do you put all images on a sprite that "could" be needed for a single page, but anything that will only be needed by a second page on a separate sprite?
In my opinion, only group images together in a sprite if they are relevant.
If you have a menu based on images, I'd make one image with all of the different elements of the menu. If you have a list with icons that appear on hover, make a spritesheet with all of those icons. It does you no good performance-wise to create one huge image when the same content can be split into three or four smaller, logically grouped images.
It also helps to have only related images together - it makes your CSS references to the files easier to follow, and you have less complex x/y coordinate references.
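For example, a menu sprite might be referenced like this (the file name and offsets are illustrative):

.menu-icon { background: url('menu-sprite.png') no-repeat; width: 24px; height: 24px; display: inline-block; }
.menu-icon-home   { background-position: 0 0; }
.menu-icon-search { background-position: -24px 0; }
.menu-icon-cart   { background-position: -48px 0; }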
Here are my two cents... I usually try to sprite images for a given path that I expect to be hit harder than others. For example, if my site's critical path is: the user logs on, goes to the home page, checks out today's deals, purchases one, and logs out, then I would like most of the common images on this path sprited (logically grouped if needed). Having the sprites there will eliminate a lot of extra requests.
If you go to google.com you will see a sprite (nav_logo99.png) that has most of the common images you will need on the very likely next page(s).
Also, to answer your "when not to sprite": background-repeat and CSS sprites do not blend well, so I would stay away from those.
The thing I take into account is how many pages my users will visit. Remember that your sprite will normally be cached, so it is only ever loaded once.
If your site's users move around a few pages then it's fine to have a larger file. But you are correct: if most of your users only visit your home page, you're not going to want to load all your site's images in one sprite.
It's best to just go on your own feel for what is best to optimize into sprites.

Sprite/PNG graphics-heavy site, oh my!

I need to create a site that is very graphics-heavy (torn paper backgrounds with transparent shadows over textured graphics, etc.) One way that I was thinking of saving on file size was to drop all my background elements into one PNG. The issue is that this file is now 180k. If I break it up into various GIFs and a couple PNGs then it would be closer to 70k.
Does it really matter? What is "too large" these days for file size? Will anyone notice if the file is 180k or 70k?
If your users have fast access to your site (like on an intranet), 180k is hardly a problem. If, on the other hand, the site is used by The Generic Older Person With A Humorously Slow Connection, it's probably going to be a problem. If your users use GPRS but have endless patience, it's probably not going to be a problem. If the site gives out a million dollars to whoever has the patience to wait out the load time, transfer speeds are not an issue. And so on.
What I'm saying is, it really depends on your requirements and constraints. This requires you to know (and subsequently tell us, for us to be more helpful) many things before you can get it close to right.
To avoid those pesky downvotes for very-valid-answers-but-simply-doesn't-please-someone, here's my answer:
180k divided by a standard ADSL modem transfer rate = 180kB / 100kB/s = 1.8s = endurable.
Is there a reason not to use the smaller images? It sounds like you've already broken it up, so why not go with the smaller, faster method?
In purely relative terms, 70k will take only about 38% of the download time that 180k would. If you're expecting high traffic or want fast load times, every bit helps.
You have to compare the time it takes to request all the separate images and the time it takes to download the one large one. The issue is with HTTP requests.
I suggest you run some tests with Google's Firefox extension, Page Speed, to see if there is a huge difference between the large PNG and the separate ones.
One benefit I can think of, besides fewer HTTP requests is that your site will load all at once instead of gradually as all the graphics are downloaded. The bottom line however, as Henrik said is that it depends on your requirements.
I'm sure you're aware that splitting into multiple images means additional connections to the server to retrieve them, with associated lag on each, and the additional size of the request and response headers.
Since browsers restrict the number of active connections to each server (browser version dependent) this may end up taking longer than retrieving a single image. The usual workaround to lift the limit is to use a separate "images" server, or a DNS alias that maps to the same host.
And unless you require animation, I'd always recommend PNG over GIF.
Make sure that the site looks fine with images disabled first (so alt text, widths and heights set, correct colours used), and then split the images up into groups. Group all of your buttons into one image if possible (using CSS sprite sheets), and all of the borders into another. Keep large images in separate files (site background, headers, and so on).
The more images you have, the more the browser can parallelize the requests. However, if you split them up too much then different images will load at different times, making parts of the site pop in. It's a bit of a trade-off, but that's the joy of programming :)
The better your site looks before the images are visible, the less the user will mind the speed of downloading the images.
