Cross Tab Browser Caching, Forcing Refresh

I have a JSON resource, let's call it /game/1, which is being publicly cached with a long duration. Based on some client-side logic, I occasionally want to refresh this resource (for instance, when I know something should be happening server-side - a game ending, in my case).
Once refreshed, I would like all downstream caches to update with the new content, so any requests to /game/1 will fetch the refreshed content. Appending a querystring with a random parameter won't work in this case.
I have tried adding the following headers on the request, which seems to work in a temperamental fashion in browsers other than IE:
headers['Cache-Control'] = 'max-age=0, no-cache';
headers['Pragma'] = 'no-cache';
Using these headers, Chrome seems to sometimes refresh the content, presumably based on some internal heuristics.
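For reference, a simplified sketch of issuing such a request with XMLHttpRequest (illustrative only; the exact client code isn't part of the question):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/game/1');
xhr.setRequestHeader('Cache-Control', 'max-age=0, no-cache');
xhr.setRequestHeader('Pragma', 'no-cache');
xhr.onload = function () {
    var game = JSON.parse(xhr.responseText); // hopefully the fresh copy
};
xhr.send();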
Does anyone have any better ideas for what I'm trying to achieve?

Try setting a meta http-equiv="expires" tag with its content set to zero:
<meta http-equiv="expires" content="0">
Setting the expires meta tag to zero should force the browser to reload everything on each page visit. Forcing constant cache deletion will obviously slow down page loading (if all browsers even obey it!), but maybe that's an acceptable trade-off. This won't help with downstream caches, however, so it's far from a complete solution.

Related

stylesheet linked with question mark and numeric value

I can see this on a site: site.com/assets/css/screen.css?954d46d92760d5bf200649149cf28ab453c16e2b. What is this random alphanumeric value after the question mark? I don't think the server is taking some value from it to use, so what is it about?
Edit: also, on refreshing the page, the alphanumeric value stays the same.
It is for preventing the browser from caching the CSS. When a stylesheet is requested, some browsers, specifically Internet Explorer, will keep using a local copy of the CSS.
When a request is given to a server as:
site.com/assets/css/screen.css?skdjhfk
site.com/assets/css/screen.css?5sd4f65
site.com/assets/css/screen.css?w4rtwgf
site.com/assets/css/screen.css?helloWd
The server at site.com sees only:
site.com/assets/css/screen.css
And gives the latest version. But when the HTML page asks the browser to fetch the CSS as site.com/assets/css/screen.css, the browser only fetches it from the site.com server the first time; after that it may serve its cached copy, even though the content might well have changed in the meantime. So programmers generally append a ?and-some-random-text, which is called a query string. Because the URL is now different, the browser is forced to get a new copy from the server.
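To illustrate the pattern, here is a rough sketch of how a version token could be baked into the stylesheet URL with JavaScript (in practice the server usually templates this value into the HTML; the hash below is simply the one from the question):
// a deploy-time token, e.g. a VCS revision hash; a new deploy means a new
// URL, so every cache along the way treats the stylesheet as a new resource
var version = '954d46d92760d5bf200649149cf28ab453c16e2b';
var link = document.createElement('link');
link.rel = 'stylesheet';
link.href = '/assets/css/screen.css?' + version;
document.head.appendChild(link);
Note that such a token only changes when the content changes, which is why it stays the same across refreshes, unlike the per-request random values discussed next.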
Some more detailed explanation:
It is a well-known problem that IE caches too much HTML, even when giving a Cache-Control: no-cache or Last-Modified header to every page.
This behaviour is really troubling when working with query strings to get dynamic information, as IE considers it to be the same page (i.e.: http://example.com/?id=10) and serves the cached version.
I've solved it by adding either a random number or a time string to the query string (as others have done), like this: http://example.com/?id=10&t=2009-08-06_13:12:56, which I just ignore server-side.
Is there a better option? Is there another, cleaner way to accomplish this? I'm aware that POST isn't cached, but it is semantically correct to use GET here.
Reference: Random Querystring to avoid IE caching
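A minimal sketch of that timestamp trick (the helper name is made up; the server simply ignores the extra t parameter):
function bustCache(url) {
    // append a throwaway parameter so IE treats every request as a new URL
    var sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + 't=' + new Date().getTime();
}
// bustCache('http://example.com/?id=10') -> 'http://example.com/?id=10&t=1249557176000'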

Knowing the status of a form POST that is protected by X-Frame-Options

I am going to describe my specific case below, but this might be useful for a number of web mashups.
My web application POSTs to Twitter by filling a form and then submitting it (via javascript). The target of the form is set to an iframe which has an onload trigger. When the onload trigger is called the application knows that the POST was completed.
This used to work fine until Chrome version 11, which now respects the X-Frame-Options: SAMEORIGIN header sent by Twitter in the POST response. The POST goes through, but the iframe's onload is not called anymore.
It still works in Firefox 4, but I suppose that's a bug that will eventually get fixed.
Is there any other way to know the status of the POST? I understand that knowing the contents of the POST response would violate the security policy, but I am not interested in the contents. I would just like the app to be notified when the POST is completed.
If you just need to know when the POST was submitted, and not necessarily whether it succeeded or not, you could poll the iframe's contentWindow and contentWindow.document on an interval. When you can no longer access one of those objects, or when the document has an empty body, that means that the iframe has loaded a page with X-Frame-Options restrictions, which likely means that the submission went through. It's hacky, but it looks like it will work for this purpose. (You'll probably have to go through a few combinations to figure out what the contents of restricted iframes look like in your target browsers.)
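A rough sketch of that polling approach (the element id, interval, and callback are illustrative, not from the question):
// start polling right after calling form.submit(); while the response is
// still loading, the frame holds the original same-origin (blank) document
function watchPostTarget(frame, done) {
    var timer = setInterval(function () {
        try {
            // this access throws once a cross-origin response (such as
            // Twitter's X-Frame-Options-protected page) has loaded
            var doc = frame.contentWindow.document;
            if (!doc || !doc.body) { throw new Error('inaccessible'); }
        } catch (e) {
            clearInterval(timer);
            done(); // the POST has, most likely, completed
        }
    }, 100);
}
watchPostTarget(document.getElementById('post-target'), function () {
    // notify the application that the submission went through
});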
You can do it by getting the headers of the page. In PHP it would look like this:
$url = 'http://www.google.com';
print_r(get_headers($url));

viewstate field completely missing in safari

This is happening in multiple versions of Safari, including 5.x
It will post __EVENTTARGET=&__EVENTARGUMENT= but nothing for __VIEWSTATE=
This is only happening in Safari, and only on one page of our site.
I can't reproduce it - we've spent days trying to.
The viewstate isn't overly huge on this page.
Thanks!
We ran into a lot of viewstate problems with version 3. Safari sets limits to the amount of data that can appear in any one field that gets posted back to the server.
The way we got around our problems was to set viewstate to span multiple input controls.
You can do this in the system.web / pages section of the web.config. For example:
<system.web>
<pages maxPageStateFieldLength="500" />
</system.web>
You might have to play with the value. I can't remember what the limits are for the various versions of Safari. A few people have said 1k, but if I remember correctly from our testing, some versions were only passing around 500 bytes.
Another option is to store viewstate server side. You can see an example of this here. You should also read this blog about potential issues. We did try this path and eventually abandoned it as it conflicted with some other encryption things we were doing.
(taking a different tack from the previous answer)
To sum up what we know thus far:
only Safari
only a particular page
there is a device called StrangeLoop in the mix which removes viewstate on the way out and puts it back in when the page is posted back. It does so through some type of token value.
A couple of questions:
First, is this limited to just a particular customer or set of people? I ask because it might be important that it's "only" Safari.
Second, does the StrangeLoop device have some type of timeout value or traffic limit after which its token cache is garbage collected?
I can envision a scenario where a particular client goes to this page and sits for a while (10 minutes, or longer?). In the meantime, either a timeout is reached or the amount of traffic you have forces the StrangeLoop device to throw out the viewstate for this particular client. Then, when they go ahead and post back, the device has no viewstate to inject back into the HTML stream.
It seems to me that in order for you to have no viewstate at all, the device itself must not be injecting it. The only reason I can come up with for that would be if the token value wasn't sent by Safari (unlikely, as it has to be quite small) or the device couldn't locate a match in its cache table.
Does the device have any sort of logging or metrics where you can see if it can't match an incoming token value?
A similar idea: if this page has some AJAX going on, does the device send a different token back for each request, or does a single client browser retain the token for the entire browsing session? If it sends a different token, then it might be that Safari isn't properly updating itself client-side with the new token value. Although this path ought to be pretty easy to duplicate.

http cookies and embedded images

I'm setting a cookie during the HTTP GET request for .html pages with embedded images. I'm expecting the browser to return the cookie when fetching all the embedded images, but apparently this does not happen for the first embedded image.
Is this how it's supposed to work or am I missing something ?
Make sure the domain name matches your domain and that you've set a valid expiration date/time for it. These are the two most common mistakes.
It would help if we knew how you were setting the cookies. Note that NRNR's response is a bit misleading - he/she's right about the domain, but there's no requirement to set an expiration. However you will get varying results unless you explicitly set a path too - even if it's just '/'.
Browsers do vary a lot in how they handle all sorts of things, including cookies - so I wouldn't be too surprised if there are browsers out there which start retrieving embedded content before the response headers of the referencing HTML page have been processed. That's not how it's supposed to work, though.
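To make that advice concrete, a sketch of setting a cookie with explicit Domain and Path attributes, shown from a hypothetical Node.js handler (the names and values are illustrative):
// an explicit Path (and, if needed, Domain) ensures the cookie applies to
// the follow-up image requests as well as the HTML page itself
res.setHeader('Set-Cookie', 'visitorId=abc123; Domain=.example.com; Path=/; Max-Age=86400');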

Standard way to persist data between requests in ASP.NET-MVC

What is the most standard or best way to persist data between requests?
Should I use cookies or session variables? I'm interested in keeping data like sort order, sort column, and page number (for pagination).
I'm coming from a WebForms background, so normally this type of thing was handled for me automatically in the viewstate of the controls I was using.
Update
I like the querystring idea for searching and more meaningful URLs; however, I'm working on an "index/list" view, which consists of a view with a header, "control" options such as drop-down lists (DDLs) for filtering, and a partial view that renders the table of data.
The DDLs use a $.load() to call an ActionResult on the controller, which returns the partial view, passing parameters there in the querystring; but since these are AJAX requests, the main page URL in the user's browser does not get updated.
Is there a best-practice for taking querystrings off the main-page URL and using them in ajax requests to other ActionResults?
If you want it to survive only through one request/redirect TempData is your friend.
However, for things like your pagination, the URL is the best method, if only for the ability to share links.
A standard way is to pass those sort of things via URL Query Parameters. You can modify your routing to expect certain URL variables. That way the pages become more search engine friendly as well.
It depends on how permanent you want the information to be:
Things like the page number should indeed be in the URL (as others have pointed out) - this helps with bookmarking, etc, but remember that if you add more content to the list, then that bookmarked result set will not always be what the user wanted...
If you're happy for these values to be lost when a session times out (by default around 20 minutes), then put them in Session.
If you think that sessions are going to time out before the next request, or you want the values to survive across visits, then you should store them in either cookies or a profile (potentially allowing "anonymous" profiles, which work off the user's cookies, so the values would be lost across machines).
Personally, I'd think very carefully before putting sort order and columns in the URL; if you do, you could actually end up really confusing search engines:
Lots of pages with very similar content (page 1 sorted by date desc, page 1 sorted by date asc, etc.) - search engines don't like duplicate content, and nor should you, as Google (for instance) will only show two pages from your site in a default result set; you want those to be valid pages, not duplicates.
Search engines will spend lots more time crawling your site, and potentially give up - If on every page they find links to "Sort by this column", they will attempt to follow them, resulting in more work on the server, higher bandwidth use, etc.
These problems can be mitigated through a robots.txt file denying access to sorted versions of the page, but if that file is generated almost dynamically, it will be very complex to maintain going forward.
In response to your update, a nice way to achieve that for paging would be to output links to the "Previous" and "Next" pages of results (or better yet, a list of all the pages), with the page numbers, directly in the page, and then hide them with JavaScript.
This way users get your nice, AJAXy behaviour, while search engines (and users without JavaScript - on mobile, or using older screen readers, for instance) can still access all your pages; this helps your pages degrade gracefully, or use "progressive enhancement".
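A rough sketch of that pattern with jQuery (the selectors are made up; $.load() is the same helper mentioned in the update, and this assumes the ActionResult returns just the partial view for these requests):
// plain pager links are rendered in the HTML for search engines and
// non-JavaScript users; with script enabled we hijack them for AJAX
$('#pager a').click(function () {
    $('#results').load(this.href); // swap in the partial without navigating
    return false;                  // keep the browser from following the link
});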
Things that were previously in viewstate should probably be put back in the client's hands via either hidden fields or cookies.
Session is "too" easy. In a dev environment it works great, pretty much no matter what you put in it. In production, scalability and persistence become a problem. In-process session is likely to disappear unexpectedly if you have a crashing bug in your site, and it requires server affinity when load balancing. Out-of-process session fixes the durability and affinity issues, but can still be a performance bottleneck if too much is put in session. A VERY common problem is that each page puts one or two items into session but never takes them out again when it is done with them. And even if a page removes its session data when it is no longer needed, the data can still be orphaned if a user starts a process and never completes it.
Cookies are a fast and simple way to persist data between requests, and you can also make them live only for a limited time, depending on your needs.
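For the sort-order example from the question, a minimal client-side sketch of the cookie route (the cookie names are made up):
// remember the current sort settings for 30 days
document.cookie = 'sortColumn=name; path=/; max-age=' + 60 * 60 * 24 * 30;
document.cookie = 'sortOrder=asc; path=/; max-age=' + 60 * 60 * 24 * 30;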
Session is easiest.
