I need to create a site that is very graphics-heavy (torn paper backgrounds with transparent shadows over textured graphics, etc.) One way that I was thinking of saving on file size was to drop all my background elements into one PNG. The issue is that this file is now 180k. If I break it up into various GIFs and a couple PNGs then it would be closer to 70k.
Does it really matter? What is "too large" these days for file size? Will anyone notice if the file is 180 or 70k?
If your users have fast access to your site (like, on an intranet), 180k is hardly a problem. If, on the other hand, the site is used by The Generic Older Person With A Humorously Slow Connection, it's probably going to be a problem. If your users use GPRS, but have endless patience, it's probably not going to be a problem. If the site gives out a million dollars to whoever has the patience to wait out the load time, transfer speeds are not an issue. And so on.
What I'm saying is that it really depends on your requirements and constraints. You need to know (and subsequently tell us, for us to be more helpful) many things before you can get this close to right.
To avoid those pesky downvotes for valid-answers-that-simply-don't-please-someone, here's my answer:
180k divided by a standard ADSL modem transfer rate = 180kB / 100kB/s = 1.8s = endurable.
Is there a reason not to use the smaller images? It sounds like you've already broken it up, so why not go with the smaller, faster method?
From a purely relative point of view, 70k will take only about 39% of the download time that 180k would (70/180 ≈ 0.39, so roughly 0.7s instead of 1.8s at the rate above). If you're expecting high traffic or want fast load times, every bit helps.
You have to compare the time it takes to request all the separate images and the time it takes to download the one large one. The issue is with HTTP requests.
I suggest you run some tests with Google's Firefox extension, PageSpeed, to see if there is a big difference between the single large PNG and the separate images.
One benefit I can think of, besides fewer HTTP requests, is that your site will load all at once instead of gradually as the graphics are downloaded. The bottom line, however, as Henrik said, is that it depends on your requirements.
I'm sure you're aware that splitting into multiple images means additional connections to the server to retrieve them, with associated lag on each, and the additional size of the request and response headers.
Since browsers restrict the number of active connections to each server (browser version dependent) this may end up taking longer than retrieving a single image. The usual workaround to lift the limit is to use a separate "images" server, or a DNS alias that maps to the same host.
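As a sketch of that workaround (host names are made up; the alias just needs to resolve to the same server):

<!-- images.example.com is a DNS alias for www.example.com, so the
     browser's per-host connection cap applies to each name separately -->
<img src="http://images.example.com/bg/torn-paper.png" alt="" />
<img src="http://images.example.com/bg/shadow.png" alt="" />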
And unless you require animation, I'd always recommend PNG over GIF.
Make sure that the site looks fine with images disabled first (so alt tags, widths and heights set, correct colours used) and then split the images up into groups. Group all of your buttons into one image if possible (using CSS sprite sheets, as sketched below), and all of the borders into another. Keep large images in separate files (so site background, headers).
The more images you have, the more the browser can parallelize the requests. However, if you split them up too much then different images will load at different times, making parts of the site pop in. It's a bit of a trade-off, but that's the joy of programming :)
The better your site looks before the images are visible, the less the user will mind the speed of downloading the images.
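For reference, a minimal CSS sprite sketch of the button grouping mentioned above (file name, sizes and offsets are all invented for illustration):

/* One sprite sheet holds every button; each class just shifts the
   background to the right region of the shared image. */
.btn        { background: url(buttons-sprite.png) no-repeat; width: 80px; height: 24px; }
.btn-save   { background-position: 0 0; }
.btn-cancel { background-position: -80px 0; } /* second button, 80px to the right */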
I'm optimizing my website and attempting to improve the Largest Contentful Paint, but the only item that appears to require work is the optimized CSS files made with W3Speedster; here is a link to the Google speed test I performed. I want the LCP to be under 2.5 seconds.
Any advice would be greatly welcomed; thank you!
That does not mean it takes over 3 seconds to render the content.
The rendering of your page from start render to document complete is only about 0.100 seconds.
Your server is your biggest problem. It's slow.
The best thing you could do for your pages is either eliminate shareaholic.js or get it to load sooner. There appears to be some sort of lazy load; it does not load until 1.8 seconds in, which is likely the reason the page rendering starts too late. That, and too many font and CSS files: the browser cannot begin rendering until it has all the CSS and font files. shareaholic.js is your 15th file being loaded, and there are 5 images being prefetched before this JS.
You need to get the page to start rendering sooner. You could join some of the CSS files together to reduce the number of files being retrieved.
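For example (file names are illustrative), going from

<!-- before: three render-blocking stylesheets, three requests -->
<link rel="stylesheet" href="base.css" />
<link rel="stylesheet" href="layout.css" />
<link rel="stylesheet" href="fonts.css" />

to a single combined file:

<!-- after: one request; the browser can start rendering sooner -->
<link rel="stylesheet" href="combined.css" />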
It's just your ads. :)
You'll most likely not be able to make it any faster. Maybe a little bit. I reduced it from 4 seconds on Cloudways Nginx servers to under 2 seconds on WPX Apache servers.
The problem is that when your ads load, a bidding war takes place first; only after this can the ads themselves be loaded. So it'll always take time.
If I were you, I'd change the host to something like WPX, they will also help you make it as optimized as possible, with ads. If not changing the host, I'd focus on the CLS. Much bigger factor and easier to do something about.
I have a huge image that is going to be uploaded to my server. The width and height are, for example, 2000x2000. I have multiple places in my code where I need the image to be 1000x1000, 250x110, or 100x50. When the image is uploaded, should I make ASP automatically resize it to these dimensions and save them on my webserver as image-250x110, image-1000x1000, image-original, and image-100x50, or is it fine to just take the original image, resize it on the fly in code, and show that to the users? I'm asking because I'm worried that if too many people visit the website, resizing the original huge image all the time would take a lot of processing power and slow down the app. Or would something like this be fine: http://www.hanselman.com/blog/NuGetPackageOfWeek11ImageResizerEnablesCleanClearImageResizingInASPNET.aspx
You certainly want to resize the image when it's uploaded, since resizing a huge image of 2000 by 2000 pixels can easily take 2 seconds, even on a fast server. That's a delay your users are unlikely to appreciate.
What you posted, ImageResizer, is going to be your best solution, for many reasons, including:
There are a lot of potential problems when writing your own resizer, as detailed in 20 Image Resizing Pitfalls, including performance and stability issues.
If you redesign your site and want to change those image dimensions... it will be painful.
With ImageResizer, you only have one image - the nice big original image. All subsequent versions are served up (and cached) via the URL call. Note that the cacheable-version of ImageResizer requires a license.
ImageResizer will likely do a better job resizing images than yours will, both in quality and in file size, and also comes with filters, watermarking, and other features.
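To illustrate the URL call mentioned above (the file name is made up; width/height/mode are ImageResizer's standard querystring parameters, though check the current docs before relying on them):

<!-- one original upload, three on-the-fly variants -->
<img src="/uploads/photo.jpg?width=1000&height=1000" alt="large" />
<img src="/uploads/photo.jpg?width=250&height=110&mode=crop" alt="banner" />
<img src="/uploads/photo.jpg?width=100&height=50&mode=crop" alt="thumbnail" />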
Resizing on the fly is a really bad option as it will eat up CPU.
If you can't pay for the cacheable version, you CAN use ImageResizer to resize the images on upload and save off the 3 versions or so of the images that you want.
I've used both methods -- written my own and used ImageResizer... ImageResizer is the way to go in my opinion.
I'm looking for good articles around image resolutions, size and quality for web pages, especially around how this affects web sites currently.
I'm working on a web site for a client, and as an honours graduate in arts and design, the client is insistent that her 7MB - 10MB images are suitable for her web site, totalling almost 400MB. I've tried arguing bandwidth limitations and performance, but these are not holding ground.
The standard for web images is 72dpi, no larger than a standard screen resolution (1024x768), and no more than about 1MB in size (which is already too large in my opinion). My argument is that loading 7MB+ files into a gallery on page load will seriously hinder the user's experience if they have to wait a long time for 7 - 10 images to stream into the cache before the page loads; even testing this with lazy-loading and late-loading plug-ins (we don't want to use Flash) has performance penalties.
Does anyone here have any recommendations around image size, resolution and quality? We don't want to lose the HD quality of the images when users navigate the gallery (obviously we'll have to thumbnail them first).
I have read guidelines before (when we still used 1Mbps connections or less) and have been following them until now:
High-resolution images should not be bigger than 1.5 - 2MB; at that size a single image is larger than the rest of the page content itself. Try checking http://deviantart.com to see how they place big photos on their site, and check the image properties using the EXIF data if any.
Dimensions should be small enough to be viewable in the browser (and avoid scrolling).
Compression has to be tested case by case; no single setting works for everyone. High quality with high compression and no visible quality loss is standard practice in web design.
JPEG is best for photos, PNG for the layout, and GIF for icons.
Try loading images in the background with JavaScript while the browser is idle, so they are in the cache before the user knows it (see the sketch after this list).
On the page design itself, avoid heavy graphics so the site is fast and the images are the only thing we wait for.
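A minimal sketch of that background-preloading idea (file names are made up):

// Preload gallery images once the page itself has finished loading,
// so they are already in the browser cache when the user clicks through.
window.addEventListener('load', function () {
    var urls = ['photo-1.jpg', 'photo-2.jpg', 'photo-3.jpg'];
    urls.forEach(function (url) {
        var img = new Image(); // never inserted into the DOM
        img.src = url;         // setting src triggers the download
    });
});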
If you really boil it down you don't have a choice.
You are talking about HUGE file sizes which are not realistic.
You need to serve a smaller version first. After that you can progressively download versions with increased quality, or offer the full image on mouseover or click.
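A minimal sketch of the click-for-full-image idea (the data-full attribute is invented for illustration):

// Swap in the full-resolution file only when the user asks for it.
document.querySelectorAll('img[data-full]').forEach(function (img) {
    img.addEventListener('click', function () {
        img.src = img.getAttribute('data-full'); // e.g. photo-full.jpg
    });
});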
Some general guidelines:
Thumbnails (of course)
Offer multiple image sizes (small, medium, large). While I understand the UX implications of giant images, some people do have fast connections and large displays and/or will be willing to wait for a high-resolution version. But it shouldn't be the only option.
Try different compression levels to see what works best for different sizes. Using one compression level across the board doesn't always work. Again (depending on the source material), there may be a need for near-lossless compression at the high end. For example, images for print, CAD drawings with fine detail, etc.
Use sequential loading techniques if applicable. For example, if you have ten images to load (optimized or not), make sure that the first visible one is the first one actually requested from the server.
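A sketch of that sequential-loading idea (file names are illustrative):

// Load gallery images one at a time, in page order, so the first
// visible image is always the first one requested from the server.
function loadSequentially(urls, index) {
    if (index >= urls.length) return;
    var img = new Image();
    img.onload = function () { loadSequentially(urls, index + 1); };
    img.src = urls[index]; // request the next image only after this one finishes
}
loadSequentially(['hero.jpg', 'photo-2.jpg', 'photo-3.jpg'], 0);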
When it comes down to it, your client is under the impression that asking her to shrink her images represents a 'compromise' that only damages the quality of the image the user receives.
The truth is, of course, that an 8-10MB image is so large that it would take most users many seconds to download, creating a horrible user experience that will increase bounce rates.
Show your client a side-by-side demo of her website loading a handful of web-optimized images, and show her a site loading 8-10MB images, then let her decide. Ultimately, your job as professional is to assist her in making good choices, but she's free to make bad ones if she insists upon it (it's her brand, money, and right).
Something else you can potentially do is detect the size of the window and load larger images if the user is on an ultra-high-resolution monitor or if the window appears to be especially large.
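A rough sketch of that detection (the breakpoint and attribute name are arbitrary):

// Swap in a high-resolution version when the window is large enough.
if (window.innerWidth >= 1920) {
    document.querySelectorAll('img[data-hires]').forEach(function (img) {
        img.src = img.getAttribute('data-hires'); // e.g. photo-full.jpg
    });
}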
Best of luck!
I wonder what the best trade-off for combining images for CSS sprites is.
Say I have 50 images, but each page only needs 5. The total size of a sprite image is 100kb.
There are obviously a dozen parameters: how many pages are visited per visit, the users' connection speeds, the latency. I'm not looking for a mathematical formula to compute the best trade-off, since I cannot estimate these parameters precisely enough.
So, do you have any experience-based values on when to combine images into a sprite and when not to? (Actually, the "not" is more interesting, IMHO.)
Do you put every image that "could" be needed by a single page on one sprite, but anything that will only be needed by a second page on a separate sprite?
In my opinion, only group images together in a sprite if they are related.
If you have a menu based on images, I'd make one image with all of the different elements of the menu. If you have a list with icons that appear on hover, make a sprite sheet with all of those icons. It does you no good performance-wise to create one huge image when the same graphics can be split across three or four smaller, related images.
It also helps to have only related images together - it keeps your CSS references to the files easier to follow and you have less complex x/y coordinate references.
Here are my two cents... I usually try to sprite images for a given path that I expect to be hit harder than others. For example, if what I call my site's critical path is: user logs on, goes to the home page, checks out today's deals, purchases one, and logs out, I would want most of the common images on this path sprited (logically grouped if needed). Having the sprites here will eliminate a lot of extra requests.
If you go to google.com you will see a sprite (nav_logo99.png) that has most of the common images you will need on the very likely next page(s).
Also, to answer your "when not to sprite": background-repeat and CSS sprites do not blend well (a tile inside a sprite cannot be repeated without its neighbours bleeding in), so I would stay away from those.
The thing I take into account is how many pages my users will visit. You have to remember that your sprite will normally be cached, so it is only ever loaded once.
If your site's users move around a few pages, then it's fine to have a larger file; but you are correct that if most of your users only visit your home page, you're not going to want to load all of your site's images in one sprite.
It's best to just go on your own feel for what is best to optimize into sprites.
I'm asking if you know if there is a ready-made solution, not really how to do it.
I'm quite sure I can pull it off myself, even if I have never touched the bytes of a JPEG manually. If you'd like a crack at it, you're invited to do so ;)
The basic idea is that you have a site with a few JPEG images, but you want to reduce the load as much as possible for mobile users.
So you ensure that all of your JPEGs are progressive, send only the low-frequency bits first, idle the TCP connection, and wait for the client to report how big the available space in the browser window is.
Or alternatively, you have some sort of browsercaps.ini or similar, and rely on that to get the initial resolution -- and then have the reporter report a correction if necessary.
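The client-side half of that reporting could be as small as this (the endpoint is hypothetical):

// Report the available display area so the server can decide
// how much of each progressive JPEG to send.
var xhr = new XMLHttpRequest();
xhr.open('POST', '/report-viewport'); // hypothetical endpoint
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.send(JSON.stringify({ width: window.innerWidth, height: window.innerHeight }));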
I actually need this for two entirely separate environments: one uses PHP and the other uses node.js (the latter is of more importance).
I'm quite sure picasaweb is doing this stuff already, or at least did. You could view an image and it loaded progressively -- then you could enlarge it, and it got blocky but continued to load in progressively. I remember being quite impressed by that!
(And it's unfair that Google keeps the cool stuff for themselves; remember their motto {°«°] )
Why not send the client a list of images that could be used for a specific img tag, then have the client determine which one it should use?
It is possible to determine the screen size of the device (e.g. document.write(screen.width + 'x' + screen.height);) or the size of the browser window. And instead of adding a src attribute to each image, you can add the possible sources to an HTML5 data- attribute like so:
<img data-img="mobile:some-img.jpg,desktop:other-img.jpg" />
JavaScript (with jQuery):
$('img').each(function () {
    var sources = $(this).attr('data-img').split(','); // ["mobile:some-img.jpg", "desktop:other-img.jpg"]
    // Pick the desktop variant on wide screens, otherwise the mobile one
    var pick = screen.width > 768 ? 1 : 0;
    $(this).attr('src', sources[pick].split(':')[1]);
});