I have an ASP.NET web page generated by the server (IIS 8.5); it displays some graphs based on data stored in the back end. I manually updated the database (bulk inserted some data) and refreshed the browser, but the page does not show the new data.
I think it is a caching problem, since when I press Ctrl+F5 the new data appears. How should I solve this? Should I do something on the web server?
You can control browser caching via the Expires, Cache-Control, Last-Modified and ETag headers.
Take a look at these two Google Developers pages.
If you want to disable caching at any cost, include a unique tag in your image URLs that changes every time the image content changes, for example:
http://example.test/path/to/image/graph1.png?version=2014-3-19
with the version changing every time you update the image. Since each change produces a new URL, it is guaranteed not to be served from the cache. Be careful with this technique, though: using it when it is not really needed leads to longer loading times, since you have effectively disabled caching of the images.
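If the graphs come from an ASP.NET page or handler you control, you can also send those headers from the code-behind. This is only a rough sketch (the page and event names are assumed), but it shows the idea:

protected void Page_Load(object sender, EventArgs e)
{
    // Tell the browser not to reuse a stale copy on a normal refresh
    Response.Cache.SetCacheability(HttpCacheability.NoCache);   // Cache-Control: no-cache
    Response.Cache.SetNoStore();                                 // Cache-Control: no-store
    Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(-1));   // Expires header in the past
}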
With caching headers I can either make the client not check for updates for a certain period of time, or make it check ETags on every request. What I do not know is whether I can do both: use the cached version first, but meanwhile check for an update in the background. If there is a new version, it would be used the next time the page is opened.
For a page that is completely static except when the user changes it themselves, this would be much more efficient than blocking on an ETag check every time.
One workaround I thought of is using JavaScript: set headers to cache the page indefinitely and have some JavaScript make a request with an If-Modified-Since header or something similar, which could then dynamically update the page. The big issue with this is that it cannot invalidate the existing cache, so it would have to keep dynamically updating the page, theoretically forever. I'd also prefer to keep it pure HTTP (or HTML, if there is some tag that can do this), but I cannot find any relevant hits online.
A related question mentions "the two rules of caching": never cache HTML and cache everything else forever. Just to be clear, I do mean to cache the HTML. The whole purpose of the thing I am building is for it to be very fast on very slow connections (high latency, low throughput, like EDGE). Every round trip saved is a second or two shaved off the loading time.
Update: after reading more caching resources, it seems the Vary: Cookie header might do the trick in my case. I would still like to know if there is a more general solution, though, and I haven't really dug into the Vary header yet, so I don't know whether it actually works.
Solution 1 (HTTP)
There is a Cache-Control extension, stale-while-revalidate, which describes exactly what you want:
When present in an HTTP response, the stale-while-revalidate Cache-Control extension indicates that caches MAY serve the response in which it appears after it becomes stale, up to the indicated number of seconds.

If a cached response is served stale due to the presence of this extension, the cache SHOULD attempt to revalidate it while still serving stale responses (i.e., without blocking).
Cache-Control: max-age=60, stale-while-revalidate=86400
When the browser first requests the page, the result is cached for 60 seconds. During that 60-second period, requests are answered from the cache without contacting the origin server. During the next 86400 seconds, content is served from the cache while a fresh copy is fetched from the origin server in the background. Only when both periods (60s + 86400s) have expired will the cache stop serving cached content and instead wait for the origin server to return fresh data.
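If your HTML is generated by ASP.NET (as elsewhere in this thread), a minimal sketch of emitting such a header (here also marked public) from the code-behind would be:

// In Page_Load (or similar) of the page whose HTML you want cached this way
Response.Cache.SetCacheability(HttpCacheability.Public);              // Cache-Control: public
Response.Cache.SetMaxAge(TimeSpan.FromSeconds(60));                   // max-age=60
Response.Cache.AppendCacheExtension("stale-while-revalidate=86400");  // the extension itself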
This solution has only one drawback: I was not able to find any browser or intermediate cache that currently supports this Cache-Control extension.
Solution 2 (JavaScript)
Another solution is to use Service Workers, which can construct custom responses to requests. In combination with the Cache API, that is enough to provide the requested behavior.
The problem is that this solution works only in browsers (not in intermediate caches or other HTTP services), and not all browsers support Service Workers and the Cache API.
IIS has a feature to set "Output Caching" on ASP.NET sites. I would like to know what the benefit of this type of caching is compared to the caching done by the browser.
I am wondering because, if the browser can already cache content (such as JS/CSS/images), why would .NET implement a feature such as output caching?
Imagine a page that takes a lot of server-side resources to create -- maybe database calls, heavy computation, etc.
If one user requests that page, and it gets cached by the browser, then the next time that user requests the same page, it will already be on their machine -- so it won't have to be generated by the server or transferred over the network again.
Next, imagine that a second user requests the same page. The fact that a copy of the page was cached by the first user's browser doesn't help. Without output caching, the server will need to perform those time-consuming operations all over again to generate the page.
If the page used output caching, then the results from the first time it was created would be stored in memory on the server, so the cached results could be sent in response to subsequent requests -- which saves time and server-side resources.
Think of it with multiple users, let's say 100.
Without output caching, IIS would have to process and generate the page for each user's request, so the page is processed 100 times.
With output caching, IIS processes the page once (for the first user requesting it), caches it, and returns the same version to the other 99 users.
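In WebForms the usual way to switch this on is the <%@ OutputCache Duration="60" VaryByParam="None" %> page directive; a rough programmatic equivalent (a sketch, placed in something like Page_Load) looks like this:

// Cache the rendered output in memory on the server for 60 seconds, so
// subsequent requests from any user are served without re-running the page.
Response.Cache.SetCacheability(HttpCacheability.Server);    // server-side output cache only
Response.Cache.SetExpires(DateTime.UtcNow.AddSeconds(60));
Response.Cache.SetValidUntilExpires(true);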
I have an ASP.NET application which runs perfectly in IE 7.0, but due to session sharing in IE 8.0 (also in the case of a new window), the application shows unexpected behavior because the session can be modified by another window.
Some quick facts
I know about the -NoCache option and the "New Session" item in IE 8's File menu.
I just want to know whether there is any option to disable this session-sharing behavior in a new window through ASP.NET code (by detecting the browser), or whether there is any other solution.
I would also like your suggestions for future web application development: what do we need to take care of to avoid session-sharing issues?
Session sharing has always been there and is not unique to Internet Explorer 8. New tabs and Ctrl-N in any browser (IE 5/6/7, FF 1/2/3, Opera 6-10, etc.) share the session data of the global process. It just received a fancy name because tabs can now run in multiple processes on the computer (not new either) yet still "share" the sessions. And that's kinda "new".
It is good that you're aware of this, but it's not so good if you're trying to take this "experience" or "feature" away from the user. If you want that, I'd look into JScript/JavaScript solutions instead and issue a warning when a user tries to open several sessions, but I doubt you'd get a good "prohibit sharing sessions across windows" solution. Even notable banks have already given up on this (they never liked this session-sharing thing).
From a design perspective: on the server side, it is rather simple. Just always assume that the session may have changed. This can, for instance, mean that on one screen the user is not logged in, while on another he is. That's OK. If he refreshes or goes to another page, you'll show him the correct view: the logged-in version of the same page.
Just make sure that you check for data invalidated as the result of a changed session in another window (i.e., request). But that's general advice: be liberal in what you accept, but make sure you validate any input.
EDIT: On extra sessions: just treat them as such. It has always been possible for users to open more than one session for the same account (two different browsers), just as it has always been possible to change a session through another tab or window of the same browser.
On the "solving" side: Configure the session as cookieless. This places the session in the URL query params. Any old window not having the SESSIONID in the URL will not be considered part of the session. However, a warning is in place: this approach eventually causes more trouble then it solves (i.e., now you have to worry about with and without session requests from same user, same browser, same ip and it's still possible to "copy" a session by copying the URL or tab).
Moving some of your information from Session to ViewState may help you solve the issues you are having.
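For example, a value that only matters to one page can live in ViewState, so each window or tab carries its own copy instead of sharing one Session slot (the property name here is hypothetical):

// A page property stored in ViewState (round-tripped with this page only)
// rather than in Session (shared by every window of the same browser session).
protected int SelectedCustomerId
{
    get { return (int)(ViewState["SelectedCustomerId"] ?? 0); }
    set { ViewState["SelectedCustomerId"] = value; }
}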
We are talking about Classic ASP and NOT ASP.NET!
Let's start from the top. We are using ISAPI_Rewrite and we would like to let our customers dynamically control the rewriting of URLs (giving them httpd.ini is not an option). We were thinking that all unknown URL requests (we define this in httpd.ini) would be handled by one ASP file which makes a GET request to the selected URL (customers maintain a key -> value table). Now, we can make a request to another page and just print the output, but we cannot make a request to our own server. As far as I am aware, ASP doesn't offer this.
We could write a .NET extension to control this, but we are looking for other options. I know that declining .NET is a stupid thing, but it's a long story...
Is there a solution to this problem in ASP?
Have a look at Server.Execute; it allows dynamic (run-time) inclusion of other ASP files. An added bonus is that it's treated as part of the original request, so Session and cookies are all available in the included file. HOWEVER, variables defined in the calling page are not available to the included page. You can circumvent this using temporary Session variables, though:
Session("variable") = "value";
Server.Execute(url);
Session.Abandon;
Response.end;
Session.Abandon will clear ALL session variables; you might want to clear them individually instead.
You can make a request to your own server, but the page making the request needs to NOT have session state enabled in the page declaration right at the top of the page (i.e. <%@ EnableSessionState=False %>).
Each page locks the session object, and it's that which stops you making a request to your own server. If you declare that you are not going to use the session in the calling script, then it won't lock it, and you can call your own server again using an XMLHTTP request, passing whatever you like on the query string, as POST data, and the session cookies too, so the session etc. will all still exist.
My team is working on a crappy old website and most of the pages are still ASP classic. However, we've recently migrated to forms authentication using ASP.NET and wildcard mapping. Everything works surprisingly well except for one thing: logged in users are timing out too quickly. After looking in the logs it appears people are timing out exactly after 20 minutes (which is the specified timeout due to inactivity).
So, our hypothesis is that the ASP classic pages are not tripping whatever mechanism in the forms authentication framework that resets the inactivity timer. I've googled around and even read the wildcard mapping post by the Great Gu but still can't find anyone else who is having this problem. So, 1) Have you ever seen this problem? and 2) What's the best workaround? (other than manually placing a hidden frame in every janky ASP page that loads a dumb .NET page in the background)
Update: slidingExpiration is set to true
Also: We can't use perpetual sessions because we need the application to time out after 20 minutes of inactivity. Also, this terrible site was written so that the interface is usually stored in the page. There's no simple piece of interface code I could slip the JavaScript into. We tried to put some js into an include file that was called by about 80% of our pages but it's caused some esoteric problems with file download buffers so we may have to try a different tack. Thanks.
Create a perpetual session.
Essentially you end up emitting some JavaScript and an image tag in your master page or navigation user controls (whatever you're using for consistent navigation). On some interval, this JavaScript changes the source of the image tag to an HTTP handler endpoint (some .aspx or .ashx) which returns a 1x1-pixel clear GIF as the image response. The constant requests ensure that idle pages keep the session alive.
As long as a browser window is open to your page your ASP.NET session will never time out.
Often the JavaScript will tack on a random number to the request so that the browser doesn't cache the request.
A decent walkthrough is available here.
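The handler side of this can be very small. Here is a rough sketch (type and file names are assumed); the key detail is implementing IRequiresSessionState, which is what actually touches the session and resets the inactivity timer:

using System;
using System.Web;
using System.Web.SessionState;

// Keep-alive endpoint, e.g. registered as KeepAlive.ashx
public class KeepAliveHandler : IHttpHandler, IRequiresSessionState
{
    // 1x1 transparent GIF
    private static readonly byte[] Pixel = Convert.FromBase64String(
        "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7");

    public void ProcessRequest(HttpContext context)
    {
        context.Session["KeepAlive"] = DateTime.UtcNow;                    // touch the session
        context.Response.Cache.SetCacheability(HttpCacheability.NoCache);  // discourage caching
        context.Response.ContentType = "image/gif";
        context.Response.BinaryWrite(Pixel);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}

The JavaScript then just points the image at this handler every few minutes, with the random number mentioned above tacked onto the query string.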
I am assuming that you have manually created the cookie, in which case your timeout value in code is probably overriding your timeout value in the configuration.
First, if possible (which it probably isn't), don't create the cookie manually; it will save you from not only this headache but dozens of others.
If you must create the cookie manually, make sure that the timeout you are using actually reads the timeout value you have set in the configuration file, and that sliding expiration is set to true (which you have said it is).
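If you do end up building the ticket yourself, here is a sketch of reading the configured timeout instead of hard-coding one (the method and variable names are assumed):

using System;
using System.Web;
using System.Web.Configuration;
using System.Web.Security;

// Build the forms-auth cookie from the timeout configured in <forms timeout="20">
// so the value in code can never drift from the value in web.config.
void IssueAuthCookie(HttpResponse response, string userName)
{
    var auth = (AuthenticationSection)WebConfigurationManager.GetSection("system.web/authentication");
    TimeSpan timeout = auth.Forms.Timeout;

    var ticket = new FormsAuthenticationTicket(
        1,                          // version
        userName,                   // authenticated user's name
        DateTime.Now,               // issued
        DateTime.Now.Add(timeout),  // expires
        false,                      // not persistent
        String.Empty);              // user data

    response.Cookies.Add(new HttpCookie(
        FormsAuthentication.FormsCookieName,
        FormsAuthentication.Encrypt(ticket)));
}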
That said, we still have occasional strange timeout problems when the cookies are manually created. Where I work we implemented a solution which allowed the cookies to be created automatically, and timeouts were no longer a problem; however, it did create other issues and we were forced to switch back.