I'm building a web app that uses a lot of CSS3 and session storage. My goal is to deny access to users of older browsers by redirecting them to a "we don't support your browser" page. I'm thinking of using an HTTP module that looks at the incoming request. In the client page I would encode a hidden field that contains the user agent version.
I want this to work for both regular .aspx requests and AJAX requests to .asmx files. What would be a good method to do this?
You may use the HttpRequest.Browser property to detect the browser type in ASP.NET; also have a look at the article by Scott Mitchell, Performing Browser Detection Using ASP.NET.
I don't think you would need to encode anything onto the page. Create a module that detects the browser, or (for example) the version of JavaScript the browser supports, using the HttpBrowserCapabilities exposed by Request.Browser.
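Something along these lines could work as a starting point. This is a minimal sketch, assuming the IIS integrated pipeline; the BrowserSupportModule name, the IE 9 cut-off and the ~/Unsupported.aspx URL are placeholders, not anything from the question:

    using System;
    using System.Web;

    public class BrowserSupportModule : IHttpModule
    {
        public void Init(HttpApplication application)
        {
            application.BeginRequest += OnBeginRequest;
        }

        private static void OnBeginRequest(object sender, EventArgs e)
        {
            HttpApplication app = (HttpApplication)sender;
            HttpBrowserCapabilities browser = app.Context.Request.Browser;

            string path = app.Context.Request.AppRelativeCurrentExecutionFilePath;
            bool isPageOrService = path.EndsWith(".aspx", StringComparison.OrdinalIgnoreCase)
                                || path.EndsWith(".asmx", StringComparison.OrdinalIgnoreCase);

            // Example rule only: treat anything older than IE 9 as unsupported.
            bool unsupported = browser != null
                            && browser.Browser == "IE"
                            && browser.MajorVersion < 9;

            if (isPageOrService && unsupported
                && !path.EndsWith("Unsupported.aspx", StringComparison.OrdinalIgnoreCase))
            {
                // Redirect without throwing, then short-circuit the pipeline.
                app.Context.Response.Redirect("~/Unsupported.aspx", false);
                app.CompleteRequest();
            }
        }

        public void Dispose() { }
    }

The module still has to be registered under system.webServer/modules in web.config (or system.web/httpModules on the classic pipeline). For the .asmx AJAX calls you would probably want to return an error status instead of a redirect, since XMLHttpRequest follows the redirect silently rather than showing the unsupported-browser page.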
We have a Sitecore/WebForms based website that we'd like to run behind the Akamai CDN; however, we're having an issue with ViewState MAC validation on our postbacks.
We've worked around this for most of the core forms on the site (by taking them out of the CDN cache and serving them direct for every user), but we're left with a simple form in the footer of every page that posts back to the server.
Currently we're seeing errors:
Validation of viewstate MAC failed.
I believe this is caused by the CDN caching the viewstate fields from the original request and these (obviously) not matching for other users.
As we are running this site on multiple servers, we already had the machineKey correctly configured before we added Akamai (we've been able to use postBackUrl settings to post back to other pages, SSL instances, etc.).
As we're running ASP.NET 4.5.2, there's no way we can even attempt to disable viewstate MAC, even if we thought that was a good idea.
Setting ViewStateMode=Disabled still leaves us with a tiny viewstate (presumably the MAC) which still causes problems.
Is there any way we can remove the session dependence from the viewstate?
The basic steps we can use to replicate this:
Request page from Browser A - Akamai caches page.
Submit form from Browser A - Success!
Request page from Browser B - Akamai serves cached page.
Submit form from Browser B - ERROR!
Nope, the Akamai CDN never caches POST requests. But it's a good idea to try adding the forms to the do-not-cache list and try to replicate the issue.
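One other thing that might be worth checking, purely as an assumption since the question doesn't show the page code: the MAC becomes user-specific when ViewStateUserKey is derived from the session, which is exactly the kind of session dependence that breaks behind a shared CDN cache. A minimal sketch of the pattern to look for, and a non-session alternative:

    // Hypothetical sketch - the question doesn't show this code, but a
    // session-derived ViewStateUserKey is a common reason the viewstate
    // MAC differs per user and therefore fails behind a shared CDN cache.
    protected void Page_Init(object sender, EventArgs e)
    {
        // Pattern that ties the MAC to the session (fails for cached pages):
        // ViewStateUserKey = Session.SessionID;

        // If the key is only there as a CSRF token, a value that is stable
        // across users avoids the per-user dependence - at the cost of
        // weakening that CSRF protection, so treat this as a trade-off.
        ViewStateUserKey = "footer-form-shared-key";
    }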
My setup: ASP.NET 4.0 Web Site running on IIS 6.0.
I have a site with many landing pages with virtual URLs, all being served by a single physical landingpage.aspx file (via ASP.NET 4.0 Routing), in which I use the OutputCache directive, caching the page on the server for 24 hours.
Recently, I started to experience a problem that the __doPostBack JavaScript block was missing from some of the generated pages. There is a LinkButton in the page (inside a webusercontrol), so the JS block should be there, but sometimes it isn't. This is naturally leading to JS errors in the browser upon clicking the LinkButton.
My suspicion is that the first visit to a given URL handled by the above-mentioned physical .aspx file might have come from a client (a browser or a search bot) that ASP.NET considered a down-level browser, so the __doPostBack block was not output into the cached version of the page, and this wrong cached version is then served to all subsequent visitors...? On the other hand, I'd say ASP.NET is smart enough to generate different cached versions for different levels of browsers, isn't it?
Nevertheless, my question is: can I find the cached files that are served to the visitors somewhere on the server, and somehow check if my assumptions are correct? Also, I would like to disable this ASP.NET recognition of browsers altogether and serve the same content to every browser, just as it would serve to any modern browser.
Thanks a lot for any advice and help.
Answering my own question: confirmed that the website was sending back HTML without __doPostBack() for unrecognized browsers. Adding ClientTarget="uplevel" to the @ Page directive at the top of the .aspx page in question solved the problem, and the __doPostBack() block is now always present.
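For anyone hitting the same thing, the change amounts to one attribute in the directive at the top of landingpage.aspx; the CodeBehind and Inherits values below are made-up placeholders:

    <%@ Page Language="C#" ClientTarget="uplevel"
        CodeBehind="landingpage.aspx.cs" Inherits="MySite.LandingPage" %>

If I recall correctly, the same setting can also be applied site-wide via the clientTarget attribute of the pages element in web.config, but the per-page directive is what solved it here.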
I work on a web application in ASP.NET and HTML5. I have a simple page, Default.aspx. In its Page_Load handler I call Response.Redirect("xxx.aspx"). I also defined a manifest file, Default.appcache, as I want my application to work offline (in that case JavaScript methods are used for redirection). The browser cached the page as expected, but a problem occurred: even though the server is online, the browser uses the cached page. When the user navigates to Default.aspx, no request is sent to the server. How can I prevent this behavior? I would like the browser to send a normal request to IIS if it is online and use the cached page only when the server doesn't respond.
I would be grateful for all suggestions.
You can't, pages in the cache are always served from the cache. The only way to update them is update the manifest and force new versions to be downloaded.
If you want one page to be served when online and a different one when offline then you should investigate the FALLBACK section of the manifest. Note that the page which references the manifest is always cached, so you need to set the fallback up on a different pair of pages.
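A minimal sketch of what that FALLBACK pairing could look like; DefaultOffline.aspx is a made-up name for the offline copy, and Default.aspx itself must not carry the manifest attribute, otherwise it becomes a master entry and is always served from the cache:

    CACHE MANIFEST
    # v1 - bump this comment to force clients to re-download

    FALLBACK:
    # Served from the network when online, from the cached copy when not
    Default.aspx DefaultOffline.aspx

    NETWORK:
    *

The manifest still has to be referenced (via the manifest attribute on the html element) from some page the user actually visits while online, just not from Default.aspx itself, which is the "different pair of pages" point above.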
I am redesigning a web project that requires a lot of involved data entry. I would like to make use of ASP.NET's ajax functionality to improve the user experience. But a large portion of my user base is still using Internet Explorer 7, which has caused problems for us in the past when it comes to AJAX functionality. We cannot request they upgrade, and not supporting them is not an option.
Is there an effective way to disable AJAX functionality for those users on Internet Explorer 7 and provide the full ajax experience for users on more compliant browsers?
If you use Modernizr, it will accurately detect which features the browser supports and set corresponding CSS classes on the html element of the page. You can then use jQuery (or just the DOM API) to check those classes (or the browser version) and set a flag that disables your AJAX.
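A rough sketch of that flag, assuming a Modernizr build that includes the localstorage and canvas detects (both of which IE7 fails); the exact class names depend on which detects your build includes:

    // Decide once, up front, whether to use the AJAX path.
    // Assumes Modernizr has already run and added its classes to <html>.
    var htmlClasses = document.documentElement.className;
    var isDownlevel = /\bno-localstorage\b/.test(htmlClasses) ||
                      /\bno-canvas\b/.test(htmlClasses);

    // The rest of the page's script can branch on this flag and fall back
    // to full postbacks / plain form submits instead of AJAX calls.
    window.useAjax = !isDownlevel;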
Many hosting companies let you define which page will be shown to the user if the user goes to a page that does not exist. If you define some .aspx page then it will execute and be shown.
My question is, why not use this for routing, since I can parse the user's URL and then do a Server.Transfer to the page I want? I checked, and there is no redirect sent to the client and the HTTP headers are HTTP/1.1 200 OK.
So, is this a good idea for servers that don't have ASP.NET 3.5 SP1 or if you are not using MVC?
Thanks
You "can" do that, but why not just create an HttpModule and handle the routing there? That's how most of the URL rewriting systems work (in actuality, it's also how the MVC routing works since global.asax is just a pre-build HttpModule with a few extras).
The big thing with relying on the kind of server handling you describe is that you really aren't in control of it, and it is a hackish mechanism... by that I mean you are taking a function of the web server that has a specific purpose and design and laying a different meaning and function on top of it... which means you now have no built-in handling for an actual 404 error. Plus, the mechanism you are contemplating is harder to adapt to your purpose than just using the other options available to you.
Unless you need something special from routing, you should consider using an existing routing component such as Mod-Rewrite or one of the dozen or so other popular URL rewriters that were built before the MVC routing engine was implemented and work fine in older versions of ASP.NET. There are also numerous tutorials on using HttpModules, HttpHandlers, and various other techniques to do routing in ASP.NET WebForms environments.
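For the HttpModule route, here is a minimal sketch of the rewriting idea; the /products/ URL pattern and the Product.aspx target are invented for illustration:

    using System;
    using System.Web;

    // Minimal sketch of routing in an HttpModule on older ASP.NET versions.
    public class SimpleRoutingModule : IHttpModule
    {
        public void Init(HttpApplication application)
        {
            application.BeginRequest += (sender, e) =>
            {
                var app = (HttpApplication)sender;
                string path = app.Context.Request.Path;

                // e.g. /products/widget  ->  /Product.aspx?name=widget
                const string prefix = "/products/";
                if (path.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
                {
                    string name = path.Substring(prefix.Length);
                    // RewritePath keeps the original URL in the browser, sends
                    // no redirect, and leaves the 404 machinery free for real 404s.
                    app.Context.RewritePath("~/Product.aspx",
                                            null,
                                            "name=" + HttpUtility.UrlEncode(name));
                }
            };
        }

        public void Dispose() { }
    }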