I have installed static and dynamic compression for IIS7, and set the two web.config values at my application's virtual folder level. As I understand it, I no longer need to enable compression at the server or site level, and I can manage it on a per-folder basis using my web.config file.
I have two settings in my .config file that I have set to customize gzip for my app:
<httpCompression dynamicCompressionDisableCpuUsage="90"
                 dynamicCompressionEnableCpuUsage="0">
    <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
    <dynamicTypes>
        <remove mimeType="*/*" />
        <add mimeType="*/*" enabled="true" />
    </dynamicTypes>
</httpCompression>
<urlCompression doDynamicCompression="true"
                dynamicCompressionBeforeCache="true" />
However, when I run the application, I can clearly see that gzip is not being used, because my page sizes are unchanged. I am also using YSlow for Firefox, which likewise confirms that my pages are not being gzipped.
What am I missing here? In IIS6 it was a simple matter of specifying the file types and setting the compression level between 0 and 10. I don't see any documented need to specify the file types or compression level here, since the defaults seem to cover the file types, and I'm not seeing a level setting anywhere.
There was a thread on forums.iis.net about this during the IIS 7 beta. It turned out that poster didn't have the modules installed, but it sounds like you've ruled that out in your opening sentence.
Microsoft's key advice for him was to enable Failed Request Tracing to find out what was going wrong. This is possibly one of the most under-appreciated features of IIS7, but certainly one of the most powerful.
1. Open IIS Manager.
2. Go to your site, and in the Actions pane (the very far right), click 'Failed Request Tracing...' under the 'Configure' section.
3. Click 'Enable'.
4. Then, in Features View, click 'Failed Request Tracing Rules'. Click Add, then Next; enter 200 for the status code; click Next, then Finish.
If you don't see "Failed Request Tracing" in the actions pane, you'll need to add the feature to the server - either using the "Add Role Services" wizard (Health and Diagnostics\Tracing) or through the Web Platform Installer (Products\Server\IIS: Tracing), and then close and re-open IIS Manager.
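If you'd rather script the rule than click through the UI, the same thing can be expressed in web.config. This is just a sketch using the standard system.webServer tracing schema; the 'Compression' trace area keeps the log focused on what we care about here:
<system.webServer>
    <tracing>
        <traceFailedRequests>
            <add path="*">
                <traceAreas>
                    <add provider="WWW Server" areas="Compression" verbosity="Verbose" />
                </traceAreas>
                <failureDefinitions statusCodes="200" />
            </add>
        </traceFailedRequests>
    </tracing>
</system.webServer>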
Next, rerun your test. This will generate some log info for us to examine.
Look in c:\inetpub\logs\FailedReqLogFiles\w3svcx. You will see a bunch of files named fr000xx.xml. Open up any one of them in your browser. (By the way, if you copy these files anywhere, make sure freb.xsl is there. Also, don't delete freb.xsl - if you do, just delete the whole directory or copy it from another location, as IIS only creates it once per folder.)
Click the 'request details' tab and select 'complete request trace'. Search the page for 'compress' - you should find it in several areas; once for static content, and once for dynamic content.
If you don't find either of them, IIS isn't configured correctly. If you do find them, you should see each followed by a compression_success and a compression_do entry. Success is self-explanatory; the 'do' indicates what it did - in my case, it showed "OriginalSize 1462784 CompressedSize 179482".
Since yours isn't working, hopefully you will see something different that helps you solve the problem.
Make sure you turn this off when you're done by disabling failed request tracing in the actions pane for your website.
We had a similar problem, and it turns out that IIS7 does some dynamic CPU-based throttling here.
http://www.iis.net/ConfigReference/system.webServer/httpCompression
dynamicCompressionDisableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization at which dynamic compression will be disabled.
Note: This attribute acts as an upper CPU limit at which dynamic compression is turned off. When CPU utilization falls below the value specified in the dynamicCompressionEnableCpuUsage attribute, dynamic compression will be reenabled.
The default value is 90.
dynamicCompressionEnableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization below which dynamic compression will be enabled. The value must be between 0 and 100. Average CPU utilization is calculated every 30 seconds.
Note: This attribute acts as a lower CPU limit below which dynamic compression is turned on. When CPU utilization rises above the value specified in the dynamicCompressionDisableCpuUsage attribute, dynamic compression will be disabled.
The default value is 50.
Note the defaults -- if your IIS7 hits 90% CPU usage, it will disable all dynamic gzipped content until CPU usage dips back below 50%!
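If you want to take the throttling out of the equation while you test, you can pin both thresholds so compression never switches off (a sketch; 100 effectively means "never disable"). Note, too, that if I'm reading the doc text above correctly, the question's dynamicCompressionEnableCpuUsage="0" means compression is only re-enabled when CPU falls below 0% - i.e. never, once it has been disabled:
<httpCompression dynamicCompressionDisableCpuUsage="100"
                 dynamicCompressionEnableCpuUsage="100">
    <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
</httpCompression>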
There are also some great recommendations and benchmarks here on the real CPU cost of gzip:
http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx
Long story short, unless you regularly have dynamic pages well in excess of 200kb, it's a non-issue.
Following the excellent advice of JohnW, I too enabled logging to find the culprit, though the reason for the failure turned out to be different:
STATIC_COMPRESSION_NOT_SUCCESS
Reason 14
Reason NOT_FREQUENTLY_HIT
In short, it appears that if you don't hit the page frequently enough, IIS7 will not deem it worthy of compressing, which seems a little odd to me. Nonetheless, it makes sense in this case, because I was just trying to test it on a local machine.
According to this page, the default appears to be that a page has to be hit 2 times within 10 seconds to count as a "frequent hit". If you really want to, you can override the default in applicationHost.config (%windir%\System32\inetsrv\config). At least for me it's a locked attribute, so you won't be able to override it in your own web.config.
<serverRuntime frequentHitThreshold="1" />
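Alternatively, appcmd can make the same change at the applicationHost level (this should be the standard appcmd attribute syntax, but treat it as a sketch):
%windir%\system32\inetsrv\appcmd.exe set config /section:system.webServer/serverRuntime /frequentHitThreshold:1 /commit:apphost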
Also, I note now that SO already had this answer here: In IIS7, gzipped files do not stay that way.
I solved my problem by installing dynamic compression via Add/Remove Programs.
In the system.webServer section of your Web.config file, add the following lines (they belong inside the staticContent element):
<staticContent>
    <remove fileExtension=".js" />
    <mimeMap fileExtension=".js" mimeType="application/x-javascript" />
</staticContent>
The compression scheme in IIS7 is enabled by default, but it maps only a single JavaScript MIME type for compression, application/x-javascript. The lines above tell IIS to give all your .js files that MIME type, which in turn makes the compression work.
Turn on static compression. Dynamic compression is for dynamic pages, like asp, php, aspx, etc.
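In web.config terms, that means enabling the static flag alongside (or instead of) the dynamic one, e.g.:
<urlCompression doStaticCompression="true" doDynamicCompression="true" />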
Here's a link to the IIS config reference for compression:
http://www.iis.net/ConfigReference/system.webServer/httpCompression
For me it turned out to be the setting
noCompressionForProxies
as we are behind a proxy here. I took myself off the proxy and, voila, compression.
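For reference, that attribute lives on the same httpCompression element quoted above; since httpCompression is typically only settable at the applicationHost.config level, a sketch of the change would be:
<httpCompression noCompressionForProxies="false">
    <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
</httpCompression>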
I know that we're supposed to be careful about using RAMMFAR (runAllManagedModulesForAllRequests), such as the advice in this Hanselman post. The reason is that it sends all requests through every managed module.
But I recently had to create an HttpModule, and I noticed that it gets all requests anyway (because that's what a module is for), and now I'm wondering if there's any performance difference between setting RAMMFAR=true and simply having a managed module, if that module is going to get all requests regardless.
To put this another way, are managed modules considered harmful to performance? If I test the url and ignore requests I don't care about, will that hurt my scalability at all?
Edit: By all requests, I mean that I see requests for static content, such as css, js, and jpg files that are on disk.
Edit: The module is registered like this:
<modules>
<add name="MyModule" type="MyNamespace.MyModule, MyAssembly"/>
</modules>
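For reference, here is a minimal sketch of the kind of module I mean; the early-exit URL test (the .aspx check) is just an illustrative assumption:
using System;
using System.Web;

namespace MyNamespace
{
    public class MyModule : IHttpModule
    {
        public void Init(HttpApplication context)
        {
            context.BeginRequest += OnBeginRequest;
        }

        private static void OnBeginRequest(object sender, EventArgs e)
        {
            var app = (HttpApplication)sender;

            // Bail out early for requests we don't care about (static files etc.).
            if (!app.Request.Path.EndsWith(".aspx", StringComparison.OrdinalIgnoreCase))
                return;

            // ...actual per-request work goes here...
        }

        public void Dispose() { }
    }
}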
There are a few questions in your post. I'll try to address them individually.
Does having a dummy (no-op) managed module impact performance / throughput?
Yes, from the perspective that IIS and ASP.NET need to coordinate the native <-> managed code transition, and this transition incurs some overhead. For the overwhelming majority of applications this overhead is dwarfed by the actual application logic. The types of applications where this tends to show up in profiles are sites which are serving tens of thousands or hundreds of thousands of requests per second. At that point we generally recommend paying very close attention to which modules are included in the application and trimming them down as much as possible.
Why is my module running for static files?
Because you don't have a managedHandler precondition on the module. If this precondition is present on the module declaration, this module will only run if the request is destined for a managed endpoint (.aspx, .axd, extensionless, and so on). If this precondition is not present, the module always runs.
To specify the managedHandler precondition:
<modules>
<add name="..." type="..." preCondition="managedHandler" />
</modules>
Note: if you're on IIS 7.0 or 7.5, you might need to install the patch http://support.microsoft.com/kb/980368 to get IIS to see extensionless URLs as "managed" endpoints.
What does RAMMFAR actually do?
In a nutshell, it causes the preconditions specified in module registrations to be ignored. It does what its name implies: it runs all managed modules for all requests, even if those requests weren't destined for a managed endpoint.
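For reference, RAMMFAR is just an attribute on the modules element in system.webServer, so the scenario being compared looks like this (reusing the registration from the question):
<system.webServer>
    <modules runAllManagedModulesForAllRequests="true">
        <add name="MyModule" type="MyNamespace.MyModule, MyAssembly" />
    </modules>
</system.webServer>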
I set up IIS to handle .exe files with the ASP.NET 2.0 ISAPI filter to enable dynamic URL replacement.
For this I set up the extension mapping in IIS and added the following line to web.config. It works fine so far.
<add path="*.exe" verb="*" type="System.Web.StaticFileHandler" />
The problem is that from that point on, the w3wp process holds several handles on these files, most likely because someone is downloading them at the moment.
Is there a way to tell IIS/ASP.NET not to put an exclusive handle on the file? I want to be able to replace the files even while the site is running. This works as long as ASP.NET does not handle these files.
I don't think there is a way to do this. You are telling IIS to handle this file type, so the server assumes a certain exclusivity. This behavior is intended, in my opinion, because it prevents corrupted files in a production environment.
Maybe it helps to limit the requests to some selected HTTP verbs (POST/GET), but I don't see any other options.
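For example, the mapping from the question could be narrowed from verb="*" to just what's actually needed:
<add path="*.exe" verb="GET" type="System.Web.StaticFileHandler" />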
And the other question is: why would you want to replace a file that a user is currently downloading? It will corrupt his download, forcing him to start all over again.
Trying to upload a large file (20MB), I set maxRequestLength to a high enough level (and the expiration time too) in the httpRuntime entry of web.config. The Event Log stopped reporting that the post size exceeds the allowed limits, but I still get the same behavior in the browser (IE or FF): "The connection to the server was reset while the page was loading." I'm using VS2008 and the built-in web server, not IIS. I've read the Q&A on this topic and even put the check for the exception (IsMaxRequestExceededException) in the application global handler, as referenced in another StackOverflow thread on this topic. Has anyone seen anything similar, or have any ideas?
Thanks,
Bob
I've not come across this myself with the built-in server; however, you could try increasing the timeout too in the web.config file:
<httpRuntime executionTimeout="****" maxRequestLength="****" />
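Keep the units in mind here: maxRequestLength is measured in kilobytes and executionTimeout in seconds, so for a 20MB upload with a one-hour window you'd want something like:
<httpRuntime executionTimeout="3600" maxRequestLength="20480" />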
Figured out my problem. I put the entry into the wrong web.config file -- I have two web apps in the same VS solution. doh!
HTTP uploads are notoriously unreliable. You should look into other options that utilize Flash or Silverlight for file uploads.
I'm building a static ASP.NET site (using Masterpages and a few forms) and I'm about to release it onto my production server.
I know about changing <compilation debug="true"> to false, but I'm wondering what other things I can do to obtain the highest speed possible. There is no data access in the site, it's all static content.
Does anyone have a checklist they run through or know of a good resource for setting up sites in a production environment, with a focus on performance?
Checklist so far (feel free to edit this yourself with any worthy additions):
Make sure <compilation debug="false" /> is actually set to false in Web.Config
Make sure <trace enabled="false" /> is actually set to false in Web.Config
Set necessary read/write/modify folder permissions for site
Enable GZIP in IIS (reduces size of pages/css/javascript dramatically)
Have you considered OutputCaching for any pages / controls?
Consider setting up web tests (e.g. WatiN for .NET) to make sure the functionality on your site is still working OK
Make sure it isn't Friday afternoon!
If you're writing any log or output files, make sure the proper folder permissions are setup in the production environment. Typically debug/test environments are much more lax on file read/write permissions than production.
Don't deploy on Friday afternoons! This is guaranteed to mess up your head for the weekend.
Also, don't forget to check the gzip settings in IIS. Compressing output will make things travel across the wire much faster.
There is actually a very good checklist on how to perform a security deployment review provided on MSDN.
If it's all static content, you'll want to use aggressive output caching.
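For example, a page-level directive at the top of an .aspx page that caches the rendered output for an hour:
<%@ OutputCache Duration="3600" VaryByParam="None" %>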
If your site uses a database and only presents information, make the database read-only. That takes away all locking overhead and speeds up access a great deal.
If you have a back end that updates the data, make it a separate database, and schedule periodic updates to the read-only database - once a day, or whatever the application needs.
If you just present news and other small things on a company website that doesn't change very often, then this solution is probably for you, even if it's a site with gigabytes of data. The key question is: how often do we update the data?
From what I see in daily business, no one really thinks about this solution, because everything has to be "real time", but there are plenty of cases where it would be a perfect fit.
Review your web.config:
Check debug settings (web.config / *.svc), tracing, ...
Update debug values to production values:
email addresses
(web) service addresses
log file locations
You should have some sort of test to verify the various functions of your site, and the permissions. For instance, once you publish, walk through a checklist: can I access x if I don't have permission? Do x, y, and z work in the application? I do this after every publish, because small changes can have a big impact.
You should read this:
https://stackoverflow.com/questions/72394/what-should-a-developer-know-before-building-a-public-web-site
It's currently the 9th highest voted question on SO and in the top 3 most favorited. The caveat is that it's platform agnostic, so it's missing some ASP.Net-specific items.
Thoroughly test the site outside of your corporate firewall / proxy after clearing your browser cache. This will help to ensure that all resources are publicly accessible (and are not on a local server or cached). For instance, you might find that you have used absolute URLs to include, say, JavaScript or CSS files. These work fine in your development environment, but as soon as the site goes live they are inaccessible. Or you have a CSS file in your cache that has subsequently been deleted, but you don't notice.
Ensure that any products / applications you use that have keys tied to a domain will work on your live site. This includes things like Google Maps keys or commercial third-party applications. It also includes automatically generated hyperlinks sent out in, say, emails. You wouldn't want a user registration to have a link back to http://localhost/confirm.aspx or the like, would you?
Platform: IIS 6, ASP.Net 2.0 (.Net 3.5), Server 2003.
I'm building an application that accepts files from a user, processes them, and returns a result. The file is uploaded using HTTP POST to an ASP.Net web form. The application is expecting some large files (hundreds of MB).
I'm using SWFUpload to accomplish the upload with a nice progress bar, but that's not contributing to the issue, because when I bypass it using a standard HTML form pointing at my upload accepter page, I get the exact same error. When using the progress bar, the upload continues to 100%, then fails. With a standard form, the behavior appears to be the same.
I'm having a problem right now uploading a file that's about 150MB. I've changed every setting I can find, but still no luck.
Here's a summary of what I've changed so far:
In Web.config:
Added this inside system.web:
<httpRuntime executionTimeout="3600" maxRequestLength="1536000"/>
In machine.config:
Inside system.web, changed:
<processModel autoConfig="true" />
to:
<processModel autoConfig="true" responseDeadlockInterval="00:30:00" responseRestartDeadlockInterval="00:30:00" />
and in MetaBase.xml:
Changed:
AspMaxRequestEntityAllowed="204800"
to:
AspMaxRequestEntityAllowed="200000000"
When the upload fails, I get a 404 error from IIS. My web form does not begin processing, or at least, it doesn't make it to the Page_Load event. I threw an exception at the beginning of that handler, and it doesn't execute at all on large files.
Everything works fine with smaller files (I've tested up to about 5.5MB). I'm not exactly sure what file size is the limit, but I know that my limit needs to be higher than 150MB, since this is not the largest file that the client will need to upload.
Can anyone help?
UrlScan was active on all websites, and it has its own request entity length limit. I wasn't aware that UrlScan was running on our server, because it was a global ISAPI filter, not one running on my individual website.
Note: to locate global ISAPI filters, right-click the Web Sites folder in IIS Admin, click Properties, then go to the ISAPI Filters tab.
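If you need to keep UrlScan but raise its cap, the limit lives in UrlScan.ini under [RequestLimits] (UrlScan 3.x; the value is in bytes - shown here as a sketch for a roughly 200MB cap):
[RequestLimits]
MaxAllowedContentLength=200000000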
(A note for googlers:)
For IIS7, add the following to web.config (I added it above <system.serviceModel>):
<system.webServer>
    <security>
        <requestFiltering>
            <!-- maxAllowedContentLength is in bytes; it defaults to 30,000,000 -->
            <requestLimits maxAllowedContentLength="262144000" />
        </requestFiltering>
    </security>
</system.webServer>
When we ran into this issue we had to increase the buffer size limit according to this KB article:
http://support.microsoft.com/kb/944886/en-us
I know this mentions ASP, but I believe it worked for ASP.NET as well.
Edit: Here is a link that might be more relevant to your issue and provide other options:
http://weblogs.asp.net/jgalloway/archive/2008/01/08/large-file-uploads-in-asp-net.aspx
404 and missing Page_Load: IIS can only process the request once the complete POST is on the server. Therefore, if the POST fails (due to its size), it cannot fire the page's events.
You might try NeatUpload http://www.brettle.com/neatupload.
From the Manual: "By default, NeatUpload does not directly limit the size of uploads."
You can also try Velodoc XP Edition which has several advantages over NeatUpload including the fact that it uses ASP.NET Ajax extensions. See also the Velodoc web site for more information.
You say:
But 1536000 is only 1.5MB?
(Note: maxRequestLength is measured in kilobytes, so 1536000 is roughly 1.5GB; 1536000 bytes would indeed be about 1.5MB, but bytes are not the unit here.)