Is it possible to configure NGINX to compress PNG files it serves? For example, I have some PNG files generated by a third-party tool, and they are uncompressed (they don't use PNG's internal compression). I would like to get them compressed before serving them through NGINX. I do not mean gzip, but real PNG compression. Is it even possible?
As @Tarun mentioned, Google's PageSpeed module could help with that. I have used it in the past with some level of success. However, I would advise against it unless you really optimize NGINX's caching as well: you don't want to be compressing images on the fly every time your server receives a request. Instead, I would compress the PNG images before they even get to your server. I personally use kraken.io for this, but there are plenty of great compression tools. Then you are compressing each image only once, not performing the same compression on the fly every time a visitor requests it.
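That said, if you do want NGINX to handle it, a minimal ngx_pagespeed sketch would look something like this. It assumes the module is compiled into your NGINX build, and the cache path is an arbitrary example:

    # Minimal ngx_pagespeed sketch -- assumes NGINX was built with the module.
    # The cache path is an example; point it at any writable directory.
    pagespeed on;
    pagespeed FileCachePath /var/ngx_pagespeed_cache;

    # Re-encode oversized PNGs on the fly; results are cached,
    # but the first request for each image pays the compression cost.
    pagespeed EnableFilters recompress_png;

Even with the result cache, the first hit on each image is expensive, which is why the one-time offline approach above is usually the better trade.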
Related
I have a WordPress website and its waiting time is too high. I have done optimization using Autoptimize and added the gzip compression code to the .htaccess file.
Please help me out: is this an issue on my end, or is it a server issue?
I am placing the link to a recent test done using GTmetrix for my website:
https://gtmetrix.com/reports/unicommerce.com/vein49uQ
Please help me out with suggestions for optimizing the speed and scores.
Your server sends the data with an outdated HTTP protocol and is quite slow generally.
HTTP/1.1 200 OK
Try to implement the HTTP/2 protocol on your server, because it can multiplex many requests over a single connection. Then tricks like combining CSS files and scripts, moving them to the footer, and so on become obsolete.
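If your host runs Apache 2.4.17 or newer with mod_http2, enabling it is a one-line change. Note that this directive only works in the main server config or a virtual host, not in .htaccess, so you may need to ask your hosting provider:

    # httpd.conf / SSL virtual host -- requires Apache >= 2.4.17 with mod_http2.
    # Browsers only speak HTTP/2 over TLS, so enable it in the HTTPS vhost.
    Protocols h2 http/1.1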
What kind of hosting do you have?
And there are some typical problems to eliminate, like render-blocking resources and so on:
https://developers.google.com/speed/pagespeed/insights/?hl=en&url=https%3A%2F%2Funicommerce.com
Regards Tom
You need to compress JS, CSS, and HTML files using the .htaccess file (see the sketch after this list)
Combine your CSS into one file so that multiple CSS files don't load separately
Block unnecessary JS and CSS files from loading
Add a browser caching plugin
Use the EWWW plugin for image compression
Visit https://samaxes.com/2008/04/htaccess-gzip-and-cache-your-site-for-faster-loading-and-bandwidth-saving/
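A minimal .htaccess sketch covering the compression and caching steps above, assuming mod_deflate and mod_expires are enabled on your host:

    # Requires mod_deflate and mod_expires to be enabled by the host.
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType text/css "access plus 1 month"
        ExpiresByType application/javascript "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
    </IfModule>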
The first thing you can check is which files take the longest to load, then apply the defer or async attribute to those script tags so your loading can improve.
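For example (the script names are placeholders):

    <!-- defer: download in parallel, execute in order after the HTML is parsed -->
    <script src="analytics.js" defer></script>
    <!-- async: download in parallel, execute as soon as it arrives -->
    <script src="widget.js" async></script>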
This discussion around gzipping, minifying, and combining the two for maximum compression made me wonder whether I can get a net performance gain by placing the jQuery library directly inside my HTML file.
I already do this for icon sprites, which I base64-encode and then load via CSS data URIs, in an effort to minimize the number of HTTP requests.
Now I ask: do you know whether loading the jQuery library in its raw min+gzip form, as available from the official production download link, would be more efficient than loading it via the Google AJAX Libraries CDN?
TL;DR:
You should always keep common items (like a JS library) in external files, because that allows for caching on your server, in the browser, and at many nodes along the way. The gains from caching dramatically outweigh compression or request times when looking at the overall speed of a page.
Longer Version:
There are a number of things to consider for this question. Here are the ones that came off the top of my head.
Minifying scripts and styles is always good for speed. I don't think there's much to discuss here.
Compression (and decompression in the browser) has an overhead. Firing up the gzip module takes time, so for small files or already-compressed binary files (images, etc.) it's usually better to skip gzip. Something large and consisting mostly of ASCII (like a JS library), however, is a good candidate for compression, so that checks out.
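As an illustration, a typical server configuration compresses only text-like types above a minimum size. An nginx-flavoured sketch (the values are arbitrary examples):

    # Compress only text-like content above a minimum size.
    gzip on;
    gzip_min_length 1024;   # tiny files: overhead outweighs the savings
    gzip_types text/css application/javascript application/json;
    # PNG/JPEG/GIF are already compressed, so they are left alone here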
Reducing HTTP requests is generally a good thing to do. Note, though, that ever since HTTP/1.1 (supported in all major browsers), a new HTTP request does not mean a new socket connection. With keep-alive, the same socket connection can serve the webpage and all the affiliated files in one go. The only extra overhead per request is the request and response headers, which is negligible unless you have a lot of small images/scripts. Just make sure these files are served from a cookie-free domain.
Related to point 3, embedding content in the page does have the benefit of reducing HTTP requests, but it also has the drawback of increasing the size of the page itself. For something large like a JS library, the size of the library vastly outweighs the HTTP overhead of requesting an extra file.
Now here's the kicker: if the library is embedded in the page, it will almost never be cached. For starters, most (all?) major browsers will NOT cache the main HTML page by default. You can add a Cache-Control meta tag to the page header if you want, but sometimes that's not an option. Furthermore, even if that page is cached, the rest of your pages (which are probably also using the library) would need to be cached as well. Suddenly you have your whole site cached on the user's machine, including many copies of the JS library, and that's probably not what you want to do to your users. On the other hand, if you keep the JS library in an external file, it can be cached by itself just once, especially (as @Barmar says) if the library is loaded from a common CDN like Google's.
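For example, the external approach usually looks like this (the version number and fallback path are illustrative):

    <!-- Served from Google's CDN: cached across all sites that use the same URL -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
    <!-- Optional local fallback in case the CDN is unreachable -->
    <script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');</script>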
Finally, caching doesn't only happen in the client's browser. If you're running an enterprise-level service, you may have multiple server pools with shared and individual caches, CDN caches that have blinding speed, proxy caches on the way to your domain, and even LAN caches on some routers and other devices. So now I direct you back to the TL;DR at the top, to say that the speed-up you can gain by caching (especially in the browser) trumps any other consideration you are likely to encounter.
Hope that helps.
How can I transfer huge (or not so huge) files from FTP/HTTP to HTTP (or vice versa) directly, without downloading them to my computer?
My example: I have an FTP server where I upload a lot of videos (every video has an HTTP link, by the way). I want to transfer a couple of videos to another website, like YouTube, but I don't want to download them first. How can I do it directly, without downloading?
You can't, unless you have shell access to one of the remote servers. There are file transfer protocols that can do this (FXP, for server-to-server FTP transfers, is one example), but they aren't common.
I think I understand the question, but I don't think what you want to do is possible. You have to have a client talking the protocol in order to make the transfer.
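If you do have shell access to the machine that stores the files, a minimal sketch looks like this (host names, credentials, and paths are placeholders):

    # Log in to the server holding the file and push it straight to the
    # destination; nothing is downloaded to your own computer.
    ssh user@storage-host 'curl -T /videos/clip.mp4 ftp://user:pass@dest-host/uploads/'

Note that a site like YouTube only accepts uploads through its own upload flow or API, so a direct push like this only works for destinations that accept plain FTP/HTTP uploads.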
I turned on dynamic compression and static compression in the IIS7 manager. HTML, CSS, and JS content is compressed fine, and I see Content-Encoding: gzip in the headers, but not for image formats: JPEG, GIF, and even BMP.
Generally speaking, you wouldn't want to. JPEG, GIF, and most (internet) image formats are already compressed. Compressing them a second time would add a lot of server overhead for very little size gain (and possibly a loss). (BMP is the exception, since it is typically uncompressed, but the better fix there is to convert it to PNG rather than gzip it.)
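This is also why IIS skips them: it only compresses the MIME types listed in its configuration. A sketch of the relevant section, which normally lives in applicationHost.config (illustrative, not a drop-in copy of the defaults):

    <httpCompression>
      <staticTypes>
        <add mimeType="text/*" enabled="true" />
        <add mimeType="application/javascript" enabled="true" />
        <!-- images are deliberately excluded: they are already compressed -->
        <add mimeType="image/*" enabled="false" />
        <add mimeType="*/*" enabled="false" />
      </staticTypes>
    </httpCompression>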
I'm trying to find the best way to speed up delivery of the static images that make up the design of an ASP.NET MVC site. The images are not gzipped, nor cached on the server or on the client (no content expiration). The options are:
Find out why the images are not cached and gzipped directly by IIS6
Write a specialized HTTP handler
Register a special route for static images and write a binary ActionResult method
What could be the best solution in terms of performance?
The best solution is to let IIS do it.
IIS6 compression: most likely you need to specify the file types to be compressed, like the .jpg, .png, .gif types, etc.
Caching will come from making sure the correct headers are sent to the client from code, and I believe there is a setting in IIS that enables it for static content, but I'm not sure on that one.
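A minimal sketch of setting those caching headers from code in an ASP.NET MVC action (the 30-day lifetime is an arbitrary example):

    // Inside a controller action (ASP.NET MVC on .NET Framework).
    // Tells browsers and proxies they may cache the response for 30 days.
    Response.Cache.SetCacheability(HttpCacheability.Public);
    Response.Cache.SetExpires(DateTime.UtcNow.AddDays(30));
    Response.Cache.SetMaxAge(TimeSpan.FromDays(30));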
Surely the gain from gzipping most images is negligible, since they're already compressed?
Maybe you have some really badly compressed PNG files or something?
You might want to check out Yahoo's performance advice site, which includes some useful tips on optimizing images, including links to utilities such as pngcrush.
It's much better to use an image-optimizing utility ONCE than to rely on IIS to compress the images (possibly inefficiently) on the fly.
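For example, a one-time offline pass with pngcrush (the file paths are placeholders):

    # Try pngcrush's reduction strategies once, offline, and keep the smaller file.
    pngcrush -reduce -brute design/header.png design/header-crushed.png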
There's a nice library up on the MSDN Code Gallery that does this. It's called FastMVC.