I am trying to enable gzip compression for components of my website. I have an Ubuntu 11.04 server running nginx 1.2.
In the nginx configuration for the website, I have this:
gzip on;
#gzip_min_length 1000;
gzip_http_version 1.1;
gzip_vary on;
gzip_comp_level 6;
gzip_proxied any;
gzip_types text/plain text/html text/css application/json application/javascript application/x-javascript text/javascript text/xml application/xml application/rss+xml application/atom+xml application/rdf+xml;
#it was gzip_buffers 16 8k;
gzip_buffers 128 4k; # my page size is 4k
gzip_disable "MSIE [1-6]\.(?!.*SV1)";
YSlow and Google PageSpeed are both advising me to use gzip to reduce transmission over the network.
Now when I run curl -I against my JS file, I get:
curl -I http://www.albawaba.com/sites/default/files/js/js_367664096ca6baf65052749f685cac7b.js
HTTP/1.1 200 OK
Server: nginx/1.2.0
Date: Sun, 14 Apr 2013 13:15:43 GMT
Content-Type: application/x-javascript
Content-Length: 208463
Connection: keep-alive
Last-Modified: Sun, 14 Apr 2013 10:58:06 GMT
Vary: Accept-Encoding
Expires: Thu, 31 Dec 2037 23:55:55 GMT
Cache-Control: max-age=315360000
Pragma: public
Cache-Control: public
Accept-Ranges: bytes
Any idea what I have done wrong, or what I should do to get compressed content?
As others have written, it's not enough to enable gzip compression on your server -- the client also needs to ask for it in its requests via the Accept-Encoding: gzip header (or a superset thereof). Modern browsers include this header automatically, but for curl you'll need to include one of the following in your command:
-H "Accept-Encoding: gzip" : You should see the Content-Encoding: gzip header in the response (you might need to output headers with curl's -v flag), as well as some seemingly garbled output for the body: the actual gzip stream.
--compressed : You should still see Content-Encoding: gzip in the response headers, but curl knows to decompress the content before outputting it.
I can't find anything obviously wrong with your config; usually gzip on plus gzip_types application/x-javascript would be enough to get you going. If everything is working right, you'll get a Content-Encoding: gzip header back.
PLEASE KEEP IN MIND: I get much more consistent results with GOOGLE DEVELOPER TOOLS (curl just doesn't behave the way a browser does).
In Chrome, right-click and choose "Inspect Element", then go to "Network" (reload the page if you have to), then click on a resource and check the Headers tab. The output should look like this (notice the content-encoding is gzip, yay):
Request URL:https://ssl.gstatic.com/gb/js/sem_a3becc1f55aef317b63a03a400446790.js
Request Method:GET
Status Code:200 OK (from cache)
Response Headers
age:199067
cache-control:public, max-age=691200
content-encoding:gzip
content-length:19132
content-type:text/javascript
date:Fri, 12 Apr 2013 06:32:58 GMT
expires:Sat, 20 Apr 2013 06:32:58 GMT
last-modified:Sat, 23 Mar 2013 01:48:21 GMT
server:sffe
status:200 OK
vary:Accept-Encoding
version:HTTP/1.1
x-content-type-options:nosniff
x-xss-protection:1; mode=block
Anyway, if you are SURE your content is not getting gzipped, I normally get up and running pretty quickly with the following:
## Compression
gzip on;
gzip_buffers 16 8k;
gzip_comp_level 4;
gzip_http_version 1.0;
gzip_min_length 1280;
gzip_types text/plain text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript image/x-icon image/bmp;
gzip_vary on;
You could try this as a replacement for your config, and/or tweak your values one at a time to help localize the issue.
Remember to restart or reload nginx after changing the config.
It may also be useful to check your logs for anything interesting should you still be stuck.
I just changed gzip_http_version 1.1; to gzip_http_version 1.0; and then it worked.
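For what it's worth, my understanding of why this helps: gzip_http_version sets the minimum request protocol version for which nginx will compress, and proxied requests often arrive as HTTP/1.0 (nginx's own proxy module forwards as 1.0 by default), so with the default of 1.1 those requests are never compressed. A sketch of the two ways to handle this:

```nginx
# Either lower the minimum version on the backend, so proxied
# HTTP/1.0 requests still get compressed...
gzip_http_version 1.0;

# ...or, on the proxying nginx, forward requests as HTTP/1.1
# (proxy_http_version defaults to 1.0):
# proxy_http_version 1.1;
```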
I had to enable gzip in my /etc/nginx/nginx.conf configuration:
gzip on;
gzip_disable "msie6";
gzip_types text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript;
Please note that I had to add application/javascript to the standard gzip_types configuration.
You need to run:
curl -I --compressed my_js_file
to make curl send an Accept-Encoding header for gzip - the server will only compress content if the client sends a header saying it will accept it.
NB: you can write:
gzip_disable "msie6";
rather than using a regex to disable gzip for IE 5.5 and 6, and you needn't specify text/html as a type because it is always compressed as long as gzip is activated.
I am just taking a guess here, but I think you may have to increase your gzip buffer size.
Here are the files that the browser pulls down from the domain. The number on the right is the file download size.
You may not be able to tell from the screenshot, but all of the text content files ARE gzipped, except for the JS file you mention in your question. In the screenshot the JS file is the one in green, about 200 KB. That file size is greater than the 128 KB you had with the original gzip_buffers 16 8k setting.
The gzip module docs do not really indicate what the gzip buffers are used for (whether they hold uncompressed or compressed data). However, the following post suggests the buffer size should be greater than the uncompressed file size: Large files with NGINX, GZip, and SSL
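If that theory is right, a sketch of a larger setting, sized to cover the ~200 KB file (the directive is real; the specific numbers are just my guess), would be:

```nginx
# 32 buffers x 16k = 512 KB total, comfortably above the ~204 KB JS file
gzip_buffers 32 16k;
```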
Here is my nginx configuration, and it works:
gzip on;
gzip_min_length 1000;
gzip_buffers 4 8k;
gzip_http_version 1.0;
gzip_disable "msie6";
gzip_types text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript;
gzip_vary on;
I think the key points are gzip, gzip_disable and gzip_types.
Just like Alaa, I had to add gzip_http_version 1.0; (no version was previously specified) for it to work (I tried with Firefox 27.0.0).
I've experienced the same problem as Alaa, and in my case it was caused by the antivirus software installed on my computer.
Proxy servers and anti-virus software can disable compression when files are downloaded to a client machine. So if you are running web site in a browser on a client machine that is using such anti-virus software, or that sits behind an intermediate proxy server (many proxies are transparent, and you may not even be aware of a proxy intervening between your client and web server), they may be the cause of this issue.
Disabling the antivirus solved my problem in the browsers, and you don't even need to set gzip_http_version to 1.0.
Hope that helps you.
I want to enable gzip compression on my virtual host with nginx. My control panel is Plesk 17, but I have root access to the server. I found the vhost nginx config file in this directory:
/etc/nginx/plesk.conf.d/vhosts
and added this code to the server block to enable gzip:
gzip on;
gzip_disable msie6;
gzip_proxied any;
gzip_buffers 16 8k;
gzip_types text/plain application/javascript application/x-javascript text/javascript text/xml text/css;
gzip_vary on;
After all that and restarting nginx, when I check the gzip status, it still looks disabled!
For your information, I also have these comments at the top of my config file:
#ATTENTION!
#
#DO NOT MODIFY THIS FILE BECAUSE IT WAS GENERATED AUTOMATICALLY,
#SO ALL YOUR CHANGES WILL BE LOST THE NEXT TIME THE FILE IS GENERATED.
What's wrong? How can I enable gzip?
To enable gzip compression for a particular domain, open Domains > example.com > Apache & nginx Settings > Additional nginx directives and add the directives to that section.
If you want to enable it server-wide, just create a new file /etc/nginx/conf.d/gzip.conf, add the content there, and restart nginx.
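A minimal sketch of such a gzip.conf (assuming your stock nginx.conf has the usual include /etc/nginx/conf.d/*.conf; line; the exact type list is up to you):

```nginx
# /etc/nginx/conf.d/gzip.conf -- picked up at the http level
gzip on;
gzip_proxied any;
gzip_vary on;
gzip_types text/plain text/css application/javascript application/json;
```

Then reload with nginx -s reload, or restart via your service manager.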
I have a system behind a Google Cloud HTTP(S) load balancer, and I have a problem with gzip for static files. On each web server, I enabled gzip with the config below:
gzip on;
gzip_comp_level 2;
gzip_min_length 1000;
gzip_proxied any; # expired no-cache no-store private auth;
gzip_types text/plain application/x-javascript text/xml text/css application/xml image/svg+xml;
gzip_disable "MSIE [1-6]\.";
Then when I access the home page, gzip is applied, but when I access a CSS file, gzip is not applied. Everything I see when I request the CSS file is:
accept-ranges:bytes
alt-svc:clear
cache-control:max-age=31536000
content-length:614170
content-type:text/css
date:Sat, 22 Jul 2017 04:14:08 GMT
etag:"5967cbd7-95f1a"
expires:Sun, 22 Jul 2018 04:14:08 GMT
last-modified:Thu, 13 Jul 2017 19:36:55 GMT
server:nginx
status:200
via:1.1 google
x-host:instance2
Please help me find the reason and how to fix it. Thanks.
If I understand correctly, it's better not to gzip small resources, as they might actually get bigger while still incurring a CPU cost.
So using the gzip_min_length directive is an obvious solution to that.
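To convince myself the premise is right, here's a quick local demo (just printf, gzip and wc; no server involved). gzip's header and trailer alone add roughly 20 bytes, so a tiny body comes out larger than it went in:

```shell
# A 2-byte JSON body...
printf '{}' | wc -c        # -> 2
# ...comes out larger once gzip adds its header and trailer:
printf '{}' | gzip -c | wc -c
```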
However, when trying this on a server that runs a REST API I'm working on, it doesn't seem to work.
When I receive an empty json response, or a very small one, the Content-Encoding header is still present and reading "gzip".
HTTP Response headers
My question is: why is this setting not being respected by nginx, and what can I do to fix it?
The API is built on the Lumen microframework.
I have attached the Gzip setting I'm using in my nginx.conf:
# Compression
# Enable Gzip compressed.
gzip on;
# Enable compression both for HTTP/1.0 and HTTP/1.1.
gzip_http_version 1.1;
# Compression level (1-9).
# 5 is a perfect compromise between size and cpu usage, offering about
# 75% reduction for most ascii files (almost identical to level 9).
gzip_comp_level 5;
# Don't compress anything that's already small and unlikely to shrink much
# if at all (the default is 20 bytes, which is bad as that usually leads to
# larger files after gzipping).
gzip_min_length 1000;
# Compress data even for clients that are connecting to us via proxies,
# identified by the "Via" header (required for CloudFront).
gzip_proxied any;
# Tell proxies to cache both the gzipped and regular version of a resource
# whenever the client's Accept-Encoding capabilities header varies;
# Avoids the issue where a non-gzip capable client (which is extremely rare
# today) would display gibberish if their proxy gave them the gzipped version.
gzip_vary on;
# Compress all output labeled with one of the following MIME-types.
gzip_types
application/atom+xml
application/javascript
application/json
application/rss+xml
application/vnd.ms-fontobject
application/x-font-ttf
application/x-web-app-manifest+json
application/xhtml+xml
application/xml
font/opentype
image/svg+xml
image/x-icon
text/css
text/plain
text/x-component;
# text/html is always compressed by HttpGzipModule
Confirming my note above, this does seem to correspond to the note in the NGINX gzip module documentation stating "The length is determined only from the “Content-Length” response header field."
With gzip_min_length 1000;, my JSON responses were being gzip'ed, even if they were only 100 bytes.
I changed my application to add the Content-Length: 100 header and NGINX sends the JSON response without using the gzip encoding.
If I change the configuration to gzip_min_length 80; with the same 100-byte Content-Length, then NGINX applies the gzip encoding as expected.
Short story: you need to apply the Content-Length header for NGINX to properly handle the gzip_min_length check.
I'm trying to improve page speed on a site, using "YSlow" and "Page Speed" to monitor it. Both tell me to "compress components with gzip" and list a number of CSS and JavaScript files, for example:
/css/styles.css?v=6.5.5
/jquery.flexslider.js
/4878.js
/6610.js
/homepage.css?v=6.5.5
Our hosting has informed us that nginx is doing the gzip compression on ALL assets, even when it reverse proxies back to Apache, and the following values from the nginx sites-enabled files, enabled at the virtual-host level, confirm this:
gzip on;
gzip_disable msie6;
gzip_static on;
gzip_comp_level 9;
gzip_proxied any;
gzip_types text/plain text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript;
Is there a reason these tools are not picking up the compression, or is it in fact that the files are not being compressed at all and we need our hosting to add something extra?
Your hosting provider claims that the requests leave nginx compressed. That leaves these potential causes:
there's a proxy/cache/virusscanner somewhere on the network path between the nginx server and your client that strips out the compression.
your browser has saved an uncompressed version of the asset, and YSlow/PageSpeed ends up using that (if so, retrying with an empty browser cache should fix it)
your hosting provider's claim is false (but the posted config bit seems OK to me)
Some things to try:
Try checking the URLs with an online gzip checker like http://www.whatsmyip.org/http-compression-test/ or http://www.dnsqueries.com/en/check_http_gzip.php
check locally what the result of curl --compressed --head <your-asset-url> is (you should see a Content-Encoding: gzip header if the response coming in is compressed)
Setup:
IIS7 serving classic ASP VBScript code which generates a dynamic VCS page/file, with headers set to trigger a download.
Response.ContentType = "text/x-vCalendar"
Response.Expires = -1
Response.Buffer = True
Response.Clear
Response.AddHeader "Content-Disposition", "filename=" & strFileName & ".vcs;"
Response.Write strFileContent
Our IIS7 servers are behind an nginx reverse proxy. Everything is working fine except this file download.
Problem:
When using IE and going through the reverse proxy (load balancer), the file does not download as a .vcs; IE instead tries to download the .asp file/page.
When using other browsers through the reverse proxy (load balancer) it works fine.
When using IE and bypassing the reverse proxy (load balancer), going straight to the IIS server, it works fine.
Assumption:
Sounds like an HTTP header issue. The only differences I could find between the responses were these additional response headers:
Connection: keep-alive
Vary: Accept-Encoding
Header Responses (first direct from IIS, second via the nginx proxy):
HTTP/1.1 200 OK
Cache-Control: private
Content-Length: 1431
Content-Type: text/x-vCalendar
Expires: Fri, 09 Jul 2010 13:26:38 GMT
Server: Microsoft-IIS/7.5
Content-Disposition: filename=2507541_16268.vcs;
X-Powered-By: ASP.NET
backend: iis1
Date: Fri, 09 Jul 2010 13:27:37 GMT
HTTP/1.1 200 OK
Connection: keep-alive
Vary: Accept-Encoding
Cache-Control: private
Content-Length: 1431
Content-Type: text/x-vCalendar
Expires: Fri, 09 Jul 2010 13:26:19 GMT
Server: nginx
Content-Disposition: filename=2507541_16268.vcs;
X-Powered-By: ASP.NET
backend: iis1
Date: Fri, 09 Jul 2010 13:27:15 GMT
Request:
Is there any light anyone can shed on this issue? Are there nginx settings to change, or ASP code to add?
So I finally figured this out; thought I'd post it for anyone else who needs the assist.
I commented out the gzip_vary line from my nginx.conf file. That seemed to fix things, but I chose not to set it to "off" because I didn't want to forcefully remove the Vary header for other browsers where things were working; commenting it out worked.
# commenting this out seemed to work, but I could have set to: off
# gzip_vary on;
Additionally, I also disabled gzip for IE6.
Gotcha: I found that specifying an Expires header also caused problems. I suggest you comment out any expires directives while testing, then figure out how to filter them back in as needed.
So, for good measure, here's the updated compression part of my nginx conf:
## Compression
gzip on;
gzip_buffers 16 8k;
gzip_comp_level 6;
gzip_http_version 1.0;
gzip_min_length 0;
gzip_types text/plain text/css image/x-icon text/html text/xml application/x-javascript;
#gzip_vary on;
gzip_disable "msie6";
gzip_proxied any;