I have put my web site through Google's PageSpeed test and it has told me to "Leverage browser caching".
I did some research into this and found that I needed to enable content expiration in IIS 6. I did this and set the content to expire every 30 days. I then put my web site through the PageSpeed test again and it still came up with the "Leverage browser caching" recommendation.
I have also put the web site through http://web-sniffer.net to see what's coming back, and the response has Cache-Control: private.
I then tried <%@ OutputCache Duration="30" VaryByParam="none" %> in the web form, and now it's coming back with Cache-Control: public, max-age=30, which I guess is along the right lines, but the Google PageSpeed test still returns a list of resources (mostly images) on my web page that have no expiration set.
I'm quite confused on this subject. I was under the impression that the web site would inherit the IIS settings, but this wasn't the case until I turned on the OutputCache directive on the page. Is there a way I can get the web site to use the IIS settings, or does it have to be done on a page-by-page basis?
Try adding the following to the code-behind in your form:
// Cache the response on the server only; pick the HttpCacheability value that fits (see the list below)
Response.Cache.SetCacheability(HttpCacheability.Server);
// Expire the cached copy at 6:00 PM
Response.Cache.SetExpires(DateTime.Parse("6:00:00 PM"));
http://msdn.microsoft.com/en-us/library/system.web.httpcachepolicy.aspx
HttpCacheability enumeration:
NoCache
Private
Server
ServerAndNoCache
Public
ServerAndPrivate
http://msdn.microsoft.com/en-us/library/system.web.httpcacheability.aspx
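For the browser-caching goal in the original question, a minimal sketch along the same lines (the 30-day window mirrors the IIS setting mentioned above and is otherwise illustrative, not a recommendation):

// Allow browsers and shared caches to store this response for 30 days
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetMaxAge(TimeSpan.FromDays(30));
Response.Cache.SetExpires(DateTime.UtcNow.AddDays(30));

Note that this only affects the page response itself, not the static images PageSpeed lists.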
PageSpeed is probably referring to static resources such as .js, .css, .png and .gif files.
By default, IIS content expiration does not apply to those files; you need to edit the IIS metabase manually.
http://www.galcho.com/Blog/post/2008/02/27/IIS7-How-to-set-cache-control-for-static-content.aspx
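For reference, on IIS 7 and later the same setting can go in web.config instead of the metabase (roughly what the linked article covers); a minimal sketch, with the 30-day value purely illustrative:

<system.webServer>
  <staticContent>
    <!-- Emit Cache-Control: max-age for static files; 30.00:00:00 = 30 days -->
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="30.00:00:00" />
  </staticContent>
</system.webServer>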
In my quest to get the best performance for an ASP.NET/IIS based web application, I would like to have static content served up from a cookieless domain, as suggested by Google.
I have followed the discussion "Stopping cookies being set from a domain (aka "cookieless domain") to increase site performance" and understand how it would work.
What I fail to understand is how to have the image/js/css files' src point to the new domain name instead of the one resolved by the browser when served up from the original web application.
Here is what I mean -
Original Web Application in IIS at myapp.mydomain.com
Cookieless Web Application in IIS at static.mydomain.com
An img tag in a web page is served to the browser from the original web application as src="Images/someimage.jpg", which the browser automatically resolves to myapp.mydomain.com/Images/someimage.jpg.
The problem for which I am looking for a simple and smart solution:
Across the application, how do I get the img and asp:Image tags to use absolute URLs instead of relative URLs, i.e. have the img tag get the absolute URL as src="//static.mydomain.com/image/someimage.jpg"?
PS: I have also referred to "How do I setup IIS with a cookieless domain to improve performance?" and lots of similar content, but all of these tell you what is to be done, not how to implement it at the ground level.
You will have to provide the fully qualified URL:
src="//static.mydomain.com/image/someimage.jpg"
or
src="http://static.mydomain.com/image/someimage.jpg"
I don't see any other way around it.
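If you don't want to hard-code the host in every page, one approach is a small helper that prepends a configurable base URL. A minimal sketch, assuming an appSetting named StaticBaseUrl; the StaticContent class and the key name are hypothetical, not an existing API:

using System.Configuration;

// Hypothetical helper (not a framework API): builds absolute URLs for static assets,
// e.g. Url("Images/someimage.jpg") -> "//static.mydomain.com/Images/someimage.jpg"
// given <add key="StaticBaseUrl" value="//static.mydomain.com" /> in web.config.
public static class StaticContent
{
    public static string Url(string relativePath)
    {
        string baseUrl = ConfigurationManager.AppSettings["StaticBaseUrl"] ?? string.Empty;
        return baseUrl.TrimEnd('/') + "/" + relativePath.TrimStart('/');
    }
}

In markup you would then write src='<%= StaticContent.Url("Images/someimage.jpg") %>', and for asp:Image controls set ImageUrl from code-behind the same way.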
We have a Sitecore/WebForms based website that we'd like to run behind the Akamai CDN, however we're having an issue with ViewState MAC validation on our postbacks.
We've worked around this for most of the core forms on the site (by taking them out of the CDN cache and serving them direct for every user), but we're left with a simple form in the footer of every page that posts back to the server.
Currently we're seeing errors:
Validation of viewstate MAC failed.
I believe this is caused by the CDN caching the ViewState fields from the original request, which (obviously) don't match for other users.
As we are running this site on multiple servers, we already have the machineKey correctly configured (we've been able to use postBackUrl settings to post back to other pages/SSL instances/etc.) since before we added Akamai.
As we're running ASP.NET 4.5.2, there's no way we can even attempt to disable ViewState MAC, even if we thought that was a good idea.
Setting ViewStateMode=Disabled still leaves us with a tiny ViewState field (presumably the MAC), which still causes problems.
Is there any way we can remove the session dependence from the ViewState?
The basic steps we can use to replicate this:
Request page from Browser A - Akamai caches page.
Submit form from Browser A - Success!
Request page from Browser B - Akamai serves cached page.
Submit form from Browser B - ERROR!
No, the Akamai CDN never caches POST requests. But it's a good idea to try adding the forms to the do-not-cache list and then try to replicate the issue.
My setup: ASP.NET 4.0 Web Site running on IIS 6.0.
I have a site with many landing pages with virtual URLs, all being served by a single physical landingpage.aspx file (via ASP.NET 4.0 Routing), in which I use the OutputCache directive, caching the page on the server for 24 hours.
Recently, I started to experience a problem where the __doPostBack JavaScript block was missing from some of the generated pages. There is a LinkButton on the page (inside a web user control), so the JS block should be there, but sometimes it isn't. This naturally leads to JS errors in the browser upon clicking the LinkButton.
My suspicion is that the first visit to a given URL handled by the above-mentioned physical .aspx file might have come from a client (a browser or a search bot) that ASP.NET considered a down-level browser, so the __doPostBack block was not output into the generated cached version of the page, and then this wrong cached version is served to all subsequent visitors...? On the other hand, I'd say ASP.NET is smart enough to generate different cached versions for different levels of browsers, isn't it?
Nevertheless, my question is: can I find the cached files that are served to the visitors somewhere on the server, and somehow check if my assumptions are correct? Also, I would like to disable this ASP.NET recognition of browsers altogether and serve the same content to every browser, just as it would serve to any modern browser.
Thanks a lot for any advice and help.
Answering my own question: I confirmed that the website was sending back HTML without __doPostBack() for unrecognized browsers. Adding ClientTarget="uplevel" to the @ Page directive at the top of the .aspx page in question solved the problem, and the __doPostBack() block is now always present.
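For reference, a sketch of the directives involved (everything other than ClientTarget and the 24-hour duration is illustrative):

<%@ Page Language="C#" ClientTarget="uplevel" CodeFile="landingpage.aspx.cs" Inherits="LandingPage" %>
<%@ OutputCache Duration="86400" VaryByParam="none" %>

An alternative, if you did want per-browser output, is VaryByCustom="browser" on the OutputCache directive, which caches a separate copy per browser name and major version rather than forcing up-level output for everyone.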
I have a situation where I want to catch 404 errors fired by HTML pages (not just .aspx pages), but I only have access to the web.config of the root folder of my website and all subdirectories (note: I don't have access to the actual IIS server and I cannot create applications or change settings).
So I did try the web.config customErrors element on a subdirectory, and it does work, but for ASPX pages only, not HTML pages. Does anyone know why?
Note that the two answers above are correct for the usual case. However, IIS 6.0 and below can be configured to process HTML pages, or anything else, through ASP.NET. Also, IIS 7 has changed things radically: in effect, the ASP.NET pipeline is the IIS pipeline now, so any piece of content is processed through the registered HttpModules.
Thus, in IIS 7 and above, anything you can configure for ASPX pages, you can configure for HTML pages.
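For example, in IIS 7+ you can usually handle 404s for any content type straight from web.config; a minimal sketch, where the /404.aspx path is illustrative and the httpErrors section must not be locked at the server level:

<system.webServer>
  <httpErrors errorMode="Custom" existingResponse="Replace">
    <!-- Replace the default 404 response for every extension, not just .aspx -->
    <remove statusCode="404" subStatusCode="-1" />
    <error statusCode="404" path="/404.aspx" responseMode="ExecuteURL" />
  </httpErrors>
</system.webServer>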
You could have a look at the new routing capabilities for ASP.NET: http://msdn.microsoft.com/en-us/library/cc668201.aspx.
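A minimal sketch of what that looks like in Global.asax (the route and page names are illustrative):

using System;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    void Application_Start(object sender, EventArgs e)
    {
        // Extensionless URLs like /about are handled by ASP.NET and served by About.aspx,
        // so your web.config error handling applies to them as well.
        RouteTable.Routes.MapPageRoute("About", "about", "~/About.aspx");
    }
}

Bear in mind that on IIS 6 extensionless URLs still need a wildcard mapping to reach ASP.NET in the first place.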
HTML pages are not processed by ASP.NET and are therefore not affected by web.config settings. I am not aware of any way around this without configuring the settings in IIS.
To be a bit more specific than what Jeremy said, IIS maps different file extensions to different executables. By default it will be configured to let the .NET runtime handle .aspx files (in which case your web.config will be loaded and used), but it will serve .html pages directly itself (and therefore fall back on its own 404 error handling).
Annoying, but I don't think there's much you can do beyond either getting control of IIS or turning your flat HTML pages into ASPX pages (even though they contain no actual server-side content) to trick IIS into letting .NET handle them.
I set up a 404 handler page in web.config, but it works ONLY when the URL's extension is .aspx (or another extension handled by ASP.NET).
I know I can set up a static HTML page in the website options, but I want to have an ASPX page.
Is there any option to assign the ASPX handler to all request extensions in IIS?
The direct question was whether or not there are options to assign the ASPX handler to all request extensions: Yes, there is. I'll discuss how to do that shortly.
First, I think the "hidden" question -- the answer you really want -- is whether or not there's a way to redirect all 404 errors for pages other than ASPX, ASMX, etc. Yes, there is, and this is the better choice if it'll solve the issue you're having.
To redirect all 404s in IIS 6, right click your web application root (whether it be its own site or a virtual directory in the main site), and choose "Properties." From there, choose the "Custom Errors" tab. Find 404 in the list and change it to the redirect you want.
Now, if that won't suffice -- and I really hope it does -- yes, you can run every page through the ASPX handler. However, doing so comes at a fairly high cost in terms of efficiency -- raw HTML/image serving is considerably faster than anything dynamic.
To do this, right click your web application root and choose "Properties." Choose the "Home Directory" tab. Click "Configuration"; a new window will pop up. Copy the executable path from one of the existing ASP.NET extension mappings, and then use it for a wildcard application map.
Bear in mind, again, this is the wrong answer most of the time. It will negatively impact your performance, and is the equivalent of using a chainsaw to carve a turkey. I highly recommend the first option over this one, if it will work out for you.
For information:
This is one of the several nice things that IIS 7 brings: all pages are routed through the ASP.NET pipeline, so you can do custom 404s and, usefully, directory- and file-level security for any file (based on the same web.config settings that applied only to ASP.NET files prior to IIS 7).
So notionally "use IIS 7" is an answer (and will be "the" answer in time), but of course it's not a terribly practical one if you're not hosting/being hosted on W2k8 (or higher).
The web.config can only set up error pages for pages controlled by its web site. If you have any other pages outside the purview of the ASP.NET application, then you set up handling for them in IIS. There's an option in there for configuring the 404 page, where you can point it to your custom page.
The only other thing I can think of is passing ALL extensions to ASP.NET.
This way all types of files get processed by ASP.NET and your custom error page will work.
In the IIS application configuration, you can set a wildcard mapping (".*") to C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_isapi.dll
You can set up a wildcard mapping in IIS (Application configuration/Mappings/Wildcard mappings - just set aspnet_isapi.dll as the executable and uncheck the "Verify that file exists" box) that will route all incoming requests to your app, so you can control the behavior directly from it.
You don't have to set up a static page in your IIS application settings. IMHO, you should be able to set a valid URL from your app (e.g. /error_handler.aspx) that will be used as the landing page in case of a specific server error.
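A minimal sketch of the web.config side of that, reusing the /error_handler.aspx URL from above:

<system.web>
  <!-- Send 404s (and other unhandled errors) to the app's own error page -->
  <customErrors mode="On" defaultRedirect="~/error_handler.aspx">
    <error statusCode="404" redirect="~/error_handler.aspx" />
  </customErrors>
</system.web>

Remember this only fires for requests that actually reach ASP.NET, which is why the wildcard mapping described above is needed for other extensions.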
In IIS you can set a Custom Error for 404 errors and direct it to a URL in the site properties.
It shows a static HTML page by default:
C:\WINDOWS\help\iisHelp\common\404b.htm
You can change it to a relative URL on your site.