I want to use GZIP files to reduce page load time.
I have converted all my JS files to GZIP files.
Now I want to know how to set up IIS (I am using IIS 7) so that this will work.
Second, how do I reference the GZIP files in my ASP.NET pages?
Thanks in advance
I have converted all my JS files to GZIP files.
Not necessary. Just enable GZIP compression in IIS settings. This article gives details on how to do so. This SO discussion talks about how to validate that the settings were successfully applied.
Second, how do I reference the GZIP files in my ASP.NET pages?
Compression can be enabled for static and/or dynamic files. ASPX pages are considered a dynamic type; enabling dynamic content compression will cause them to be GZIP'd before they are served to a compatible user agent.
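At the site level, for example, this can be switched on with a web.config entry along these lines (a minimal sketch; it assumes the IIS static and dynamic compression modules are installed):

<configuration>
  <system.webServer>
    <!-- Enable gzip for both static files (JS/CSS) and dynamic content (ASPX). -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>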
To validate, you can use a tool like http://gtmetrix.com/ which will warn about uncompressed resources.
Related
We use HttpContext.RewritePath() to rewrite the paths of static files, so that we can virtualize the location of the static files on the server.
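For concreteness, the rewrite looks roughly like this (a minimal sketch; the module name, URL prefix, and target folder are made up for illustration):

using System;
using System.Web;

// Hypothetical module: expose files under /assets/... while storing them
// in a different (virtualized) folder on the server.
public class StaticFileVirtualizationModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var ctx = ((HttpApplication)sender).Context;
            if (ctx.Request.Path.StartsWith("/assets/", StringComparison.OrdinalIgnoreCase))
            {
                // Map the public URL onto the file's real location.
                ctx.RewritePath("/content/v2" + ctx.Request.Path.Substring("/assets".Length));
            }
        };
    }

    public void Dispose() { }
}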
When we do so, IIS sometimes behaves strangely: it does not compress the content of the file, but still puts gzip in the response's Content-Encoding header. When this happens, browsers fail to parse the static file, since they try to decompress cleartext data.
We have looked around and found a few threads discussing this, but we could not find a satisfying answer as to how to avoid the problem, or why rewriting paths to static files should not be allowed.
Shouldn't it be possible to rewrite the path of a static file?
Here are a few pages I found about the issue:
GZip compression in IIS7 not working, but content-encoding header is set to gzip
HttpContext.RewritePath breakes buildin IIS gzip
If, instead of using mod_deflate or mod_gzip, I manually gzipped a CSS or JS file, obtaining something like:
base.css.gz
and renamed it to:
base.css
and then loaded it from an HTML page, would it work?
This could be useful in environments with very limited memory and resources, such as wireless access points in mesh networks.
I also wanted to ask whether it would make sense to do this on normal websites to save the server's CPU resources. At the moment I use mod_deflate; I think with this method the content gets gzipped on the fly at every request. Is that so? Isn't that a bit of a waste of resources?
I am answering my own question because no one else addressed it.
It is possible to serve manually gzipped files, for example CSS, but they need to be served by the web server with the correct content-type and compression headers; just loading a gzipped file from a <link> or <script> tag won't work.
Here's an example in PHP:
<?php
// Serve the pre-compressed file only to clients that advertise gzip support;
// decompress it on the fly as a fallback for everyone else.
header("Content-Type: text/css");
header("Vary: Accept-Encoding"); // keep caches from mixing the two variants
if (strpos($_SERVER['HTTP_ACCEPT_ENCODING'] ?? '', 'gzip') !== false) {
    header("Content-Encoding: gzip");
    readfile("base.min.css.gz");
} else {
    echo gzdecode(file_get_contents("base.min.css.gz"));
}
// no closing tag, to prevent accidentally printing stray characters
HTTP servers, if set to use compression on static files, cache the compressed file for you, so don't worry about it.
IIS documentation is here.
I'm not too up on Apache, but mod_deflate and mod_cache work together: https://serverfault.com/a/220418/7869
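For reference, a minimal Apache sketch of that combination might look like this (assuming mod_deflate, mod_cache, and mod_cache_disk are loaded):

# Compress text assets on the fly with mod_deflate...
AddOutputFilterByType DEFLATE text/css application/javascript
# ...and let mod_cache keep the compressed responses on disk.
CacheEnable disk /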
I'm creating a robots.txt file for my website, but looking through my project structure, I'm not sure what to disallow.
Do I need to disallow standard .NET MVC directories and files like /App_Data, /web.config, /Controllers, /Models, and /Global.asax? Or will those not be indexed anyway?
What about directories like /bin and /obj?
If I want to disallow a page, do I disallow /Views/MyPage/Index.cshtml, or /MyPage?
Also, when specifying the sitemap in the robots.txt file, can I use my Web.sitemap, or does it need to be a different xml file?
'robots.txt' refers to paths as they are publicly seen by Web crawlers.
There's nothing particularly special about a crawler: it merely uses HTTP to request pages from your site, exactly like a user does.
So, given that your MVC site is properly configured, files like /web.config or the other paths you mention won't be visible to the outside world, as neither IIS nor your application is configured to serve them. Even if a spider were pointed at those files, it would receive a 404 Not Found and move on.
Similarly, your .cshtml or .aspx content files won't be seen with those extensions. Rather, a Web crawler will see precisely what you'll show to users.
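So a robots.txt for a site like yours might look like the following (the /MyPage path and sitemap URL are placeholders; note that the Sitemap directive expects a sitemaps.org-format XML file, not ASP.NET's Web.sitemap):

User-agent: *
Disallow: /MyPage

Sitemap: http://www.example.com/sitemap.xml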
I have a web application running on JBoss, and I am using IIS 7 for load balancing the JBoss instances. Static files (e.g., CSS, JS) are served from IIS. I am using the mod_jk ISAPI filter to bridge IIS and JBoss.
I have enabled static compression in IIS. However, the CSS files served from IIS were not getting gzip-compressed (I checked this by examining the response headers; there is no Content-Encoding: gzip header).
After this, I enabled dynamic compression in IIS, and the CSS files were then compressed with gzip. I checked my uriworkermap.properties file, and it is not routing CSS file requests to JBoss. I am puzzled as to why IIS wouldn't compress CSS files with static compression enabled and only compresses them when dynamic compression is enabled.
Thanks,
Kishor
This is probably a result of IIS deciding not to compress the content as it's not considered "frequently hit". If you request the file twice within 10 seconds (make sure you're not hitting a cache, ctrl-F5), does it then compress it?
If so, setting the frequentHitThreshold attribute to 1 on the system.webServer/serverRuntime node in the applicationHost.config file should do the trick, as documented at http://www.iis.net/ConfigReference/system.webServer/serverRuntime.
You can do this by executing the following command as an administrator:
%windir%\system32\inetsrv\appcmd set config /section:serverRuntime /frequentHitThreshold:1 /commit:apphost
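That command results in a setting equivalent to this applicationHost.config excerpt (a sketch; frequentHitTimePeriod is shown with its documented default of 10 seconds):

<system.webServer>
  <serverRuntime frequentHitThreshold="1" frequentHitTimePeriod="00:00:10" />
</system.webServer>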
A word of warning - the "frequent hit" concept does not seem specific to compression. I have no idea whether there are other consequences as a result of setting this!
When a page is served directly, e.g., http://some.tld/FAQ.aspx, dynamic compression works just fine; the aspx extension is registered as a file type to compress.
But when using routes.MapPageRoute to rewrite the URL to some.tld/FAQ, IIS will not compress the page. There's no extension to register here.
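For reference, the route in question is registered roughly like this in Global.asax (a sketch using the FAQ names from above):

using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    void Application_Start(object sender, System.EventArgs e)
    {
        // Map the extensionless URL /FAQ onto the physical FAQ.aspx page.
        RouteTable.Routes.MapPageRoute("faq", "FAQ", "~/FAQ.aspx");
    }
}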
Does anyone know how to get IIS to compress pages when using RouteCollections?