ViewState field completely missing in Safari - ASP.NET

This is happening in multiple versions of Safari, including 5.x
It will post __EVENTTARGET=&__EVENTARGUMENT= but nothing for __VIEWSTATE=
This is only happening in Safari, and only on one page of our site.
I can't reproduce it - we've spent days trying to.
The viewstate isn't overly large on this page.
Thanks!

We ran into a lot of viewstate problems with Safari 3. Safari limits the amount of data that can appear in any one field that gets posted back to the server.
The way we got around our problems was to set viewstate to span multiple input controls.
You can do this in the system.web / pages section of the web.config. For example:
<system.web>
  <pages maxPageStateFieldLength="500" />
</system.web>
You might have to play with the value. I can't remember what the limits are for the various versions of Safari. A few people have said 1 KB, but if I remember correctly from our testing, some versions were only passing around 500 bytes.
Another option is to store viewstate server side. You can see an example of this here. You should also read this blog about potential issues. We did try this path and eventually abandoned it, as it conflicted with some other encryption work we were doing.
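For the server-side route, ASP.NET ships with a built-in SessionPageStatePersister. A minimal sketch, assuming a standard WebForms code-behind (the page class name here is made up):

using System.Web.UI;

public partial class SurveyPage : Page
{
    // Store viewstate in session; only a small token travels
    // in the __VIEWSTATE hidden field.
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }
}

Note that this moves the storage burden to session, with all the scalability caveats that implies.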

(taking a different tack from my previous answer)
To sum up what we know thus far:
only Safari
only a particular page
there is a device called StrangeLoop in the mix which removes viewstate on the way out and puts it back in when the page is posted back. It does so through some type of token value.
A couple of questions:
First, is this limited to just a particular customer or set of people? I ask because it might be important that it's "only" Safari.
Second, does the StrangeLoop device have some type of timeout value or traffic limit after which its token cache is garbage collected?
I can envision a scenario where a particular client goes to this page and sits for a while (10 minutes, or longer?). In the meantime, either a timeout is hit or the amount of traffic forces the StrangeLoop device to throw out the viewstate for that particular client. Then, when they post back, the device has no viewstate to inject back into the HTML stream.
It seems to me that for you to have no viewstate at all, the device itself must not be injecting it. The only reasons I can come up with for that are that the token value wasn't sent by Safari (unlikely, as it has to be quite small) or that the device couldn't locate a match in its cache table.
Does the device have any sort of logging or metrics where you can see if it can't match an incoming token value?
A similar idea applies if this page has some AJAX going on. Does the device send a different token back for each request, or does a single client browser retain the same token for the entire browsing session? If it sends a different token, then it might be that Safari isn't properly updating the token value client side. Although this path ought to be pretty easy to duplicate.

Related

stylesheet linked with question mark and numeric value

I can see this: site.com/assets/css/screen.css?954d46d92760d5bf200649149cf28ab453c16e2b. What is this random alphanumeric value after the question mark? I don't think it's a value that actually gets used, so what is it about?
Edit: also, on refreshing the page, the alphanumeric value stays the same.
It is there to prevent the browser from caching the CSS. Once a CSS file has been requested, some browsers, specifically Internet Explorer, will keep serving their local copy of it.
When a request is given to a server as:
site.com/assets/css/screen.css?skdjhfk
site.com/assets/css/screen.css?5sd4f65
site.com/assets/css/screen.css?w4rtwgf
site.com/assets/css/screen.css?helloWd
The server at site.com sees only:
site.com/assets/css/screen.css
And gives the latest version. But when the HTML page asks the browser to fetch the CSS as site.com/assets/css/screen.css, the browser fetches it from the site.com server only the first time; the content may well have changed by the time the next request is sent. So programmers generally append a ?and-some-random-text, which is called a query string. Changing that value forces the browser to get a fresh copy from the server.
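The 40 hex characters in your example look like a SHA-1 hash, probably of the file contents or a source-control revision, so the value only changes when the file does. A sketch of how such a fingerprint might be generated server-side (the helper class and its names are ours, not from the question):

using System;
using System.IO;
using System.Security.Cryptography;

public static class CssVersion
{
    // Append a content hash so the URL (and thus the browser's cache key)
    // changes exactly when the file's bytes change.
    public static string Fingerprint(string physicalPath, string url)
    {
        using (var sha1 = SHA1.Create())
        using (var stream = File.OpenRead(physicalPath))
        {
            byte[] hash = sha1.ComputeHash(stream);
            string hex = BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
            return url + "?" + hex; // e.g. screen.css?954d46d9...
        }
    }
}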
Some more detailed explanation:
It is a well known problem that IE caches too much HTML, even when given a Cache-Control: no-cache or Last-Modified header on every page.
This behaviour is really troublesome when working with query strings to get dynamic information, as IE considers it to be the same page (i.e.: http://example.com/?id=10) and serves the cached version.
I've solved it by adding either a random number or a time string to the query string (as others have done), like this: http://example.com/?id=10&t=2009-08-06_13:12:56, which I just ignore server-side.
Is there a better option? Is there another, cleaner way to accomplish this? I'm aware that POST isn't cached, but it is semantically correct to use GET here.
Reference: Random Querystring to avoid IE caching

Error trapping on the Response.Redirect

We are using Response.Redirect to send users to a web site to take a questionnaire. We have a database that stores information about each user's eligibility to take a survey; if they are eligible, a 'Take Survey' button appears on their home page and a variable stores the URL for the survey.
On the TakeSurvey_Click event, the code was originally the following:
FormsAuthentication.SignOut();
Response.Redirect(TheURL);
Pretty straightforward, and it worked great for years. Recently, we changed the web site to which the user is redirected. There have been no issues for many thousands of users; however, for a reasonably significant group (2-3%), nothing happens when they press the 'Take Survey' button.
After searching the internet, I am reasonably certain that I have tried all of the recommended methods for handling this situation, but none really does what I want. What I would like is this: if Take Survey doesn't send the user to the link (BTW, we have checked the links generated in the non-working cases, and they are good links), I want an informational page to appear telling them that we are having issues, and that some information about their environment would be useful in fixing the problem. Seems simple enough, but no matter what I try, I either can't stop the page from displaying, or, if I use the overload and pass false, it never displays and never redirects.
Anyone have any ideas?
Response.Redirect sends a "302 Moved" response to the browser. The browser is responsible for navigating to the destination URL.
Thus, once you've issued the Response.Redirect, it's out of your hands and there is no easy way to detect that the browser has not successfully navigated to your chosen URL.
Now, there are a few different ways (that I can think of) to deal with this:
You might be able to set document.location from client-side JavaScript.
The page that issues the redirect could refresh itself periodically (using JavaScript or meta refresh). If it finds itself still on the same page after it should have gone somewhere else, then it can issue a warning.
Or, more simply, the page that issues the redirect could just have instructions stating "We're sending you to take the survey. If, after 30 seconds, you're still looking at this message, something went wrong.".
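A sketch of that last option, assuming the button handler from the question (the markup, wording, and 30-second figure are ours):

// Instead of a bare 302, emit a tiny page that navigates client-side and
// leaves instructions visible if the navigation never happens.
protected void TakeSurvey_Click(object sender, EventArgs e)
{
    FormsAuthentication.SignOut();
    Response.Write(
        "<html><head>" +
        "<meta http-equiv='refresh' content='0;url=" + TheURL + "' />" +
        "</head><body>" +
        "<p>We're sending you to take the survey. If you're still looking " +
        "at this message after 30 seconds, something went wrong - please " +
        "contact us with details about your browser.</p>" +
        "</body></html>");
    Response.End();
}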

Cross Tab Browser Caching, Forcing Refresh

I have a JSON resource, let's call it /game/1, which is being publicly cached with a long duration. Based on some client-side logic, I occasionally want to refresh this resource (for instance, when I know something should be happening server-side - a game ending, in my case).
Once refreshed, I would like all downstream caches to update with the new content, so any requests to /game/1 will fetch the refreshed content. Appending a querystring with a random parameter won't work in this case.
I have tried adding the following headers on the request, which seems to work in a temperamental fashion in browsers other than IE:
headers['Cache-Control'] = 'max-age=0, no-cache';
headers['Pragma'] = 'no-cache';
Using these headers, Chrome seems to sometimes refresh the content, presumably based on some internal heuristics.
Does anyone have any better ideas for what I'm trying to achieve?
Try setting meta http-equiv="expires" content to zero.
Setting the 'expires' meta tag to zero should force the browser to reload everything on each page visit. Forcing constant cache deletion will obviously slow down page loading (if all browsers obey it!), but maybe that's an acceptable trade-off. This won't help with downstream caches, however, so it's far from a complete solution.
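An alternative worth sketching (our suggestion, not part of the answer above): have the server mark /game/1 as cacheable but always revalidated. This trades the long cache duration for a cheap conditional GET on every use, so a forced refresh reliably picks up new content:

using System;
using System.Web;

public static class GameCacheHeaders
{
    // Public and cacheable, but must-revalidate with max-age=0 means every
    // use checks freshness; an unchanged ETag turns that into a 304.
    public static void Apply(HttpResponse response, string etag)
    {
        response.Cache.SetCacheability(HttpCacheability.Public);
        response.Cache.SetMaxAge(TimeSpan.Zero);
        response.Cache.AppendCacheExtension("must-revalidate");
        response.Cache.SetETag(etag);
    }
}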

ASP.NET fragment cache -- control is null second time round

I have an ascx control which works just fine. It is contained in a larger aspx page. I want to put it in the fragment cache, so I added the appropriate OutputCache directive at the top. However, now the control variable in the underlying aspx.cs file is null the second time the page loads. I found a few places on the web saying this would happen, but I didn't find a solution for accessing the control.
What am I missing?
Also, can I control where it is cached? Can I make it cache in the browser cache rather than at the server?
Question #1: Output caching only stores the HTML result on the server. If you want to interact with the user control or run any code in it at all, you cannot use full output caching. You may want to look into lower-level caching, perhaps database or object caching; or embed another user control within this one that uses full output caching itself, while the outer user control no longer does.
Question #2: "Can I control where it is cached?" If you use output caching, then no. That always means cache on the server. However, there are lots of different levels of caching. You can only cache a full HTTP response at the browser: a single HTML page, a CSS file, etc. If you want to cache only part of a page at the browser, but have the rest of the page dynamic, you would have to do it with some kind of JavaScript. Either HTML5 local storage, or via AJAX that has appropriate caching headers or responds with a 304 Not Modified response.
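A rough sketch of that AJAX-plus-304 idea, assuming an IHttpHandler serving the fragment (all names are illustrative, and real code would need careful UTC/second-precision date handling):

using System;
using System.Globalization;
using System.Web;

public class FragmentHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        DateTime lastModified = GetFragmentLastModified(); // your own lookup

        // If the browser's copy is still current, answer 304 with no body.
        DateTime since;
        string header = context.Request.Headers["If-Modified-Since"];
        if (header != null &&
            DateTime.TryParseExact(header, "r", CultureInfo.InvariantCulture,
                                   DateTimeStyles.None, out since) &&
            since >= lastModified)
        {
            context.Response.StatusCode = 304;
            return;
        }

        context.Response.Cache.SetLastModified(lastModified);
        context.Response.ContentType = "text/html";
        context.Response.Write(RenderFragment()); // your own rendering
    }

    private DateTime GetFragmentLastModified() { return DateTime.UtcNow.Date; }
    private string RenderFragment() { return "<div>cached fragment</div>"; }
}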
Side note: what you call a "fragment cache" is more often referred to as "partial caching" in the ASP.NET world.
SO tip: these are two questions, and should really be asked as two individual questions in the future.
Also, there are many ways to solve your problems here; if you provided more context about what you are doing and the performance problem you are trying to solve, we could offer more specific answers.

Standard way to persist data between requests in ASP.NET-MVC

What is the most standard or best way to persist data between requests?
Should I use cookies or session variables? I'm interested in keeping data like sort order, sort column, and page number (for pagination).
I'm coming from a webforms background so normally this type of thing was automatically handled for me in the viewstate of the controls I was using.
update
I like the querystring idea for searching and more meaningful URLs; however, I'm working on an "index/list" view, which consists of a View with a header and "control" options, like DDLs for filtering, plus a partial view that renders the table of data.
The DDLs use a $.load() to call an ActionResult on the controller, which returns the partial view, passing parameters there in the querystring; but since these are AJAX requests, the main page URL in the user's browser does not get updated.
Is there a best-practice for taking querystrings off the main-page URL and using them in ajax requests to other ActionResults?
If you want it to survive only through one request/redirect, TempData is your friend.
However, for things like your pagination, the URL is the best method, if only for the ability to share links.
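A tiny sketch of the TempData pattern, assuming a classic post-redirect-get flow (controller and key names are ours):

using System.Web.Mvc;

public class SurveyController : Controller
{
    [HttpPost]
    public ActionResult Save()
    {
        TempData["message"] = "Saved!"; // survives exactly one redirect
        return RedirectToAction("Index");
    }

    public ActionResult Index()
    {
        ViewData["message"] = TempData["message"]; // read it once, then it's gone
        return View();
    }
}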
A standard way is to pass those sorts of things via URL query parameters. You can modify your routing to expect certain URL variables. That way the pages become more search-engine friendly as well.
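For instance, a minimal sketch of the query-parameter approach in ASP.NET MVC, assuming C# 4 optional parameters (the controller, action, and parameter names are ours):

using System.Web.Mvc;

public class GamesController : Controller
{
    // MVC binds ?page=2&sortColumn=name&sortOrder=asc straight to the
    // parameters; the defaults apply when the querystring omits a value.
    public ActionResult List(int page = 1, string sortColumn = "date", string sortOrder = "desc")
    {
        var model = LoadPage(page, sortColumn, sortOrder); // your own data access
        return View(model);
    }

    private object LoadPage(int page, string column, string order)
    {
        return null; // placeholder for the real query
    }
}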
It depends on how permanent you want the information to be:
Things like the page number should indeed be in the URL (as others have pointed out) - this helps with bookmarking, etc., but remember that if you add more content to the list, a bookmarked result set will not always be what the user wanted...
If you're happy for these values to be lost when a session times out (by default around 20 minutes), then put them in Session.
If you think that sessions are going to time out before the next request, or you want to save the values across visits, then you should store them in either cookies or a profile (potentially allowing "Anonymous" profiles, which work with the user's cookies, so they would lose them across machines).
Personally, I'd think very carefully about putting sort order and columns in the URL; if you do, you could actually end up really confusing search engines:
Lots of pages with very similar content (page 1 sorted by date desc, page 1 sorted by date asc, etc.) - search engines don't like duplicate content, and nor should you, as Google (for instance) will only show two pages from your site in a default result set; you want those to be valid pages, not duplicates.
Search engines will spend much more time crawling your site, and may potentially give up - if on every page they find links to "sort by this column", they will attempt to follow them, resulting in more work on the server, higher bandwidth use, etc.
These can be mitigated through the use of a Robots.txt file denying access to sorted versions of the page, but if this is generated almost dynamically that will be very complex to maintain going forward.
In response to your update, a nice way to achieve that for pages would be to output links to the "Previous" and "Next" pages of results (or better yet, a list of all pages) with their page numbers, and then hide them with JavaScript.
This way users see your nice AJAXy behaviour, while search engines (and users without JavaScript - on mobile, or using older screen readers, for instance) can still reach all your pages - this helps your pages degrade gracefully, or use "Progressive Enhancement".
Things that were previously in viewstate should probably be put back in the client's hands via either hidden fields or cookies.
Session is "too" easy. In a dev environment it works great, pretty much no matter what you put in it. In production scalability and persistence become a problem. In-process session is likely to disappear unexpectedly if you have crashing bug in your site, and requires server affinity when load balancing. Out-of process session fixes the durability and affinity issues, but can still be a performance bottle neck if too much stuff is put in session. A VERY common problem is that each page will put 1 or 2 items into session but never take them out again when they are done. And even if a page removes it session data when it is no longer needed, the data can still get orphaned if a user starts a process and never completes it.
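A sketch of the "take it out when you're done" pattern that paragraph describes: read a value and remove it in the same step, so abandoned flows don't leave orphaned data behind (the helper name is ours):

using System.Web;

public static class SessionHelpers
{
    // Read-and-remove in one step; returns default(T) if the key is absent
    // or holds a value of a different type.
    public static T Pop<T>(HttpSessionStateBase session, string key)
    {
        object value = session[key];
        session.Remove(key);
        return value is T ? (T)value : default(T);
    }
}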
Cookies are a fast and simple way to persist data between requests, and you can also make them live only for a limited time, depending on your needs.
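For example, a sketch of a short-lived cookie for sort/page state (the cookie name, format, and one-week lifetime are arbitrary choices of ours):

using System;
using System.Web;

public static class ListStateCookie
{
    // A cookie that outlives the browser session but expires on its own
    // after a week.
    public static void Save(HttpResponseBase response, int page, string sort)
    {
        var cookie = new HttpCookie("listState", "page=" + page + "&sort=" + sort)
        {
            Expires = DateTime.Now.AddDays(7)
        };
        response.Cookies.Add(cookie);
    }
}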
Session is easiest.