Fresh response vs Cached response in ASP.NET

With my recent development work, I need a way to determine whether the current response received is from the cache or whether the server has sent a fresh response. This is because there is some JavaScript code that needs to be executed for every fresh response and NOT for every fresh user.
You may all agree that showing the JavaScript code which will be executed on every fresh response won't add anything meaningful to my question, since it's totally irrelevant and not connected with the way a server response is sent.
So, is there any way to differentiate whether the response came from the cache or is a fresh copy sent by the server?

You should consider creating a custom OutputCacheProvider that extends the built-in output cache provider used in MVC.
Some links that might help:
MSDN Article: Building and Using Custom OutputCache Providers in ASP.NET
Creating a Custom Output Cache Provider in ASP.NET 4
Custom Output Caching with MVC3 and .NET 4.0 – Done Properly!
Within your provider, you can use the same functionality as the regular output cache provider. In the Get() method, you can add something to the item returned from the cache that indicates it was in fact retrieved from the cache (you will want to experiment with this, making sure that you only add it to the items you want, and in a way that doesn't mess up the output).
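The linked articles walk through this in detail; as a minimal sketch, assuming a simple in-memory store (the class name and the dictionary-backed store are illustrative, and the "mark it as cached" step in Get() is the part you would need to experiment with):

using System;
using System.Collections.Concurrent;
using System.Web.Caching;

// Register in web.config under <caching><outputCache> as the default provider.
public class MarkerOutputCacheProvider : OutputCacheProvider
{
    private static readonly ConcurrentDictionary<string, Tuple<object, DateTime>> Store =
        new ConcurrentDictionary<string, Tuple<object, DateTime>>();

    public override object Get(string key)
    {
        Tuple<object, DateTime> entry;
        if (Store.TryGetValue(key, out entry) && entry.Item2 > DateTime.UtcNow)
        {
            // The item is being served from cache: this is where you would mark it
            // (e.g. set a per-request flag that your script-emitting code can inspect).
            return entry.Item1;
        }
        return null;
    }

    public override object Add(string key, object entry, DateTime utcExpiry)
    {
        // Add must return the existing entry if one is already cached for this key.
        return Store.GetOrAdd(key, _ => Tuple.Create(entry, utcExpiry)).Item1;
    }

    public override void Set(string key, object entry, DateTime utcExpiry)
    {
        Store[key] = Tuple.Create(entry, utcExpiry);
    }

    public override void Remove(string key)
    {
        Tuple<object, DateTime> removed;
        Store.TryRemove(key, out removed);
    }
}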

Related

API caching with Symfony2

How can I cache my API responses built with Symfony?
I started to dig into FosCacheBundle and the SymfonyHttpCache, but I'm not sure about my use case.
You can only access the API with a token in the header, and every user gets the same data in the response for the same URL called (and with the same GET parameters).
I would like to have a cache entry for each of my URLs (including GET parameters),
and also, is it possible to reorder my GET parameters before the request is processed by my cache system (so that the system doesn't create multiple cache entries for "URL?foo=bar&foz=baz" and "URL?foz=baz&foo=bar", which return the same data)?
Well there are multiple ways.
But the simplest is this:
If the biggest problem is database access, then just caching the compiled result in memcache or similar will go a long way. Plus, this way you stick to your already-working authentication.
In your current controller action, after authentication and before payload creation, check whether there's an entry in memcache. If not, build the payload, save it into memcache, and then return it. When the next request comes along there will be no DB access, as the payload will be returned from memcache. Just don't forget to refresh the cache however often you need.
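Although the question is about Symfony, the two ideas here (normalising the GET parameters into a single cache key, and the cache-aside check itself) can be sketched as follows. The sketch is in C# for consistency with the rest of this page; MakeCacheKey, buildPayload and the 10-minute lifetime are illustrative assumptions, not Symfony APIs.

using System;
using System.Collections.Specialized;
using System.Linq;
using System.Runtime.Caching;

static class ApiCacheSketch
{
    // Sort the GET parameters so "?foo=bar&foz=baz" and "?foz=baz&foo=bar"
    // map to the same cache entry.
    public static string MakeCacheKey(string path, NameValueCollection query)
    {
        var parts = query.AllKeys
                         .OrderBy(k => k, StringComparer.Ordinal)
                         .Select(k => k + "=" + query[k]);
        return path + "?" + string.Join("&", parts);
    }

    // Cache-aside: check the cache first; build (and store) the payload only on a miss.
    public static string GetPayload(string cacheKey, Func<string> buildPayload)
    {
        var cache = MemoryCache.Default;
        var cached = cache.Get(cacheKey) as string;
        if (cached != null)
            return cached;                                   // no DB access on a hit

        string payload = buildPayload();                     // expensive work happens only here
        cache.Set(cacheKey, payload, DateTimeOffset.UtcNow.AddMinutes(10));
        return payload;
    }
}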
Note:
"Early optimization is the root of all evil" and "Servers are cheaper than programmer hours" are to things to keep in mind. Don't complicate your life with really advanced caching methods if you don't need to.

Strategy for developing a multi-function ASP.NET web application

I'm about to start a new project and want some advice on how to implement.
I need a web application which contains a booking module for reserving timeslots, and a time management module which will enable employees to clock in / clock out.
If I am writing an update to the time management module, I don't want to disrupt the booking engine's availability by releasing a new solution containing both modules.
To make things more difficult, there is some shared functionality, like common users, roles and security.
Here's a suggestion I've gotten, which sounds a bit cruddy, but may be functional.
Write a 'container' web application which consists of basically a frame and authentication/security features. This then has links which will load the two independently built and released web applications into the frame.
I can see that, say, if I wanted to update the time management module, I would only need to build and release it separately, and the rest of the solution would be untouched.
Any better alternatives?
Unless I am missing something, if you run ASP.NET (v2, 3, whatever) you can replace the ASPX files (including any class files) on the fly and the web server will automagically "do the right thing."
So if you wrap your "modules" in classes, you can replace those files on a whim without harming the functionality of the other classes (the ones not modified).
As I re-read this I am getting convinced that I am misunderstanding your goal...
Sounds like what you want is something along the lines of the Composite Application Block, but for a web application (the CAB is for smart client applications).
One of the main things you would want to do is reduce and abstract the coupling between the modules as much as possible.
Keeping the session in the database would go a long way toward helping your ability to dynamically load modules into the application.
This would allow you to have the time management in one server and the booking engine in another. When you update the functionality of one you simply update one server while the other keeps on serving the user.
Add two class libraries to your web application: one for the "booking module" and one for the "time management" module.
After compiling, you will have one DLL for each module; put them in the bin folder of the web app (Visual Studio will do this for you), and then you can replace them separately when you need to.
Maybe you know this already:
Sessions are at the heart of problems on the web if misunderstood.
HTTP is a connectionless protocol, which means neither side of the connection cares about the flow of the communication: a request simply has a single response. Without tracking a client, how can web applications work? Assume we log in to Yahoo Mail. A single request (the filled-in login page) is sent to the server and a single response (the inbox page) returns; then what if we want to see the "Draft" folder?
To give HTTP some notion of state, a simple mechanism was added, which we know as "cookies".
Cookies are simple pieces of text sent with each request to a specified server. So with the login page the Yahoo server sends the response together with some other text (the cookie), which the client (browser) remembers and sends with every new request. This way the Yahoo server (web application) can keep track of the sequence of requests. This is also why we should not simply close the browser window when we are done with Yahoo, and should log out instead: on logout the Yahoo server forgets that cookie, and any subsequent requests carrying it are not accepted. Because Yahoo cannot find out that we closed the browser, "connectionless" is a fitting name.
How does ASP.NET handle this?
Simply put, ASP.NET uses a "session cookie" for any new request (a request without the cookie) and lets you put your variables in the "Session" object on the server side. As long as we are in the same application we can use the same session variables. What ASP.NET does behind the scenes is keep a table mapping "session ID" cookies to your "session variables". This is transparent to the ASP.NET programmer. We simply put a value in a session variable like this: Session("Age") = 19 and read it when we need it. ASP.NET takes care of the rest with session cookies this way: you create a session variable (here "Age"), so ASP.NET should keep track of this request; whatever the response is, ASP.NET adds a "session cookie" to it. The "session cookie" is a unique text which the client sends back on subsequent requests until it expires (usually after 20 minutes in ASP.NET). Use Firefox with the "web developer" add-on to see and manipulate cookies.
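In C#, the same session usage inside a page's code-behind is just this (a minimal sketch; "Age" is only the example value from above):

// Storing a value makes ASP.NET issue a session cookie with the response.
Session["Age"] = 19;

// A later request carrying that cookie sees the same value again.
int age = (int)Session["Age"];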
Related concepts: session cookies vs permanent cookies, cookie properties (domain, expiration date, ...)
how the server deals with cookies (keeping them in memory, storing them in a database, ...)

Stop Direct Page Calls to Ajax Pages

Is there a "clever" way of stopping direct page calls in ASP.NET? (Page functionality, not the page itself)
By clever, I mean not having to add hashes between pages to stop AJAX pages being called directly. In a nutshell, this is stopping users from accessing the AJAX pages without the request coming from one of your website's pages in a legitimate way. I understand that nothing is impossible to break; I am simply interested in seeing what other interesting methods there are.
If not, is there any way that one could do it without using sessions/cookies?
Have a look at this question: Differentiating Between an AJAX Call / Browser Request
The best answer from the above question is to check for a requested-by or custom header.
Ultimately, your web server is receiving requests (including headers) made up of whatever the client sends you - all data that can be spoofed. If a user is determined, then any request can look like an AJAX request.
I can't think of an elegant method to prevent this (there are inelegant and probably imperfect methods whereby you provide a hash of some sort of request counter shared between AJAX and non-AJAX requests).
Can I ask why your application is so sensitive to "ajax" pages being called directly? Could you design around this?
You can check the request headers to see if the call was initiated by AJAX. Usually, you should find that x-requested-with has the value XMLHttpRequest. Or, in the case of ASP.NET AJAX, check whether ScriptManager.IsInAsyncPostBack == true. However, I'm not sure about preventing the request in the first place.
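A sketch of both checks in a Web Forms code-behind follows; the page class name and the 403 rejection are illustrative, and remember that the header is client-supplied and can be forged.

using System;
using System.Web.UI;

public partial class MyAjaxPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Header set by most JavaScript libraries (jQuery, etc.) - client-controlled, so spoofable.
        bool isAjax = string.Equals(Request.Headers["X-Requested-With"],
                                    "XMLHttpRequest",
                                    StringComparison.OrdinalIgnoreCase);

        // ASP.NET AJAX (UpdatePanel) equivalent.
        ScriptManager sm = ScriptManager.GetCurrent(this);
        bool isAsyncPostBack = sm != null && sm.IsInAsyncPostBack;

        if (!isAjax && !isAsyncPostBack)
        {
            Response.StatusCode = 403;   // reject a direct browser navigation
            Response.End();
        }
    }
}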
Have you looked into header authentication? If you only want your app to be able to make ajax calls to certain pages, you can require authentication for those pages...not sure if that helps you or not?
Basic Access Authentication
or the more secure
Digest Access Authentication
Another option would be to append some sort of identifier to your URL query string in your application before requesting the page, and have some sort of authentication method on the server side.
I don't think there is a way to do it without using a session. Even if you use an Http header, it is trivial for someone to create a request with the exact same headers.
Using session with ASP.NET Ajax requests is easy. You may run into some problems, like session expiration, but you should be able to find a solution.
With sessions you will be able to guarantee that only logged-in users can access the Ajax services. When servicing an Ajax request simply test that there is a valid session associated with it. Of course a logged-in user will be able to access the service directly. There is nothing you can do to avoid this.
If you are concerned that a logged-in user may try to contact the service directly in order to steal data, you can add a rate limit to the service. For example, do not allow users to access the service more often than once a minute (or whatever rate is needed for the application to work properly).
See what Google and Amazon are doing for their web services. They allow you to contact them directly (even providing APIs to do this), but they impose limits on how many requests you can make.
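A rough sketch of that kind of per-session limit at the top of the AJAX handler (the session key name and the one-minute window are illustrative assumptions):

// Reject the call if this session already hit the service within the last minute.
object last = Session["LastServiceCall"];
if (last is DateTime && DateTime.UtcNow - (DateTime)last < TimeSpan.FromMinutes(1))
{
    Response.StatusCode = 429;   // too many requests
    Response.End();              // stops further processing of this request
}
Session["LastServiceCall"] = DateTime.UtcNow;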
I do this in PHP by declaring a variable in a file that's included everywhere, and then check if that variable is set in the ajax call file.
This way, you can't directly call the file ever because that variable will never have been defined.
This is the "non-trivial" way, hence it's not too elegant.
The only real idea I can think of is to keep track of every link (as in, everything does a postback and then a Response.Redirect). In this way you could keep a static List<> or something of IP addresses (and possibly browser ID and such) that says which pages that visitor is allowed to access at the moment, along with a time-out to keep them from going straight to a page 3 days from now.
I recommend rethinking your design to be sure that this is really needed, though. Also note that IPs and such can be spoofed.
Also, if you follow this route, be sure to read up on when static variables get disposed and such. You wouldn't want one of those annoying "your session has expired" messages when someone has been using the site for 10 minutes.

Is there conditional caching in ASP.NET?

Is there a built-in ASP.NET way to conditionally serve pages? For example, I want the following logic:
If there is session data I generate a page; if there is no session data I serve the cached page.
I am only interested in knowing about a built-in ASP.NET mechanism for this. If it does not exist, I will probably simply cache my page manually and decide whether to serve it or not for each request, based on whether session data is available.
I don't think there is built-in support (like varyByParam) for generating fresh output for users with Session Data.
As you suggest, I would recommend manually caching the pages. I would probably determine the user's Session state in the PreRequestHandlerExecute event handler in the Global.asax and then maybe set:
HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache);
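A sketch of that Global.asax approach (the "UserData" session key is a placeholder for whatever marks a user as having session data):

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_PreRequestHandlerExecute(object sender, EventArgs e)
    {
        // Session state has already been acquired at this point (when the handler supports it).
        if (Context.Session != null && Context.Session["UserData"] != null)
        {
            // User has session data: bypass caching and generate a fresh page.
            Context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        }
    }
}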
At the risk of karmabombing, I really don't like this approach to caching.
For me if a GET request is made, then a server should respond to that in good faith. Caching at a page level should be controlled by http headers because the primary goal is not to get the redundant request at all - you don't want to allocate server/bandwidth resources full stop.
Caching objects which are resources involved in making up a page I can totally get behind, but I can't see great arguments for caching a page wholesale.
Respect the headers.
You might want to look at the Substitution control, new in .NET 2.0; however, it might not be exactly what you are after.
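For reference, the Substitution control relies on post-cache substitution: one fragment stays dynamic while the rest of the output-cached page is served from cache. A minimal sketch (the callback name is illustrative; the callback must be static and thread-safe):

using System;
using System.Web;

public static class SubstitutionCallbacks
{
    // Re-evaluated on every request, even when the surrounding page comes from the output cache.
    public static string CurrentTime(HttpContext context)
    {
        return DateTime.Now.ToString("HH:mm:ss");
    }
}

// In the page: call Response.WriteSubstitution(SubstitutionCallbacks.CurrentTime) while rendering,
// or use an <asp:Substitution> control whose MethodName points at a static callback defined on the page.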

ASP Classic GET request without multithreading

We are talking about Classic ASP and NOT ASP.NET!
Let's start from the top. We are using ISAPI_Rewrite and we would like to dynamically let our customers control the rewriting of URLs (giving them httpd.ini is not an option). We were thinking that all unknown URL requests (we define this in httpd.ini) are handled by one ASP file which makes a GET request to the selected URL (customers maintain a key -> value table). Now, we can make a request to another page and just print the output, but we cannot make a request to our own server. As far as I am aware, ASP doesn't offer this.
We could write a .NET extension to handle this, but we are looking for other options. I know that declining .NET is a stupid thing, but it's a long story...
Is there a solution to this problem in ASP?
Have a look at Server.Execute; it allows dynamic (run-time) inclusion of other ASP files. An added bonus is that it's treated as part of the original request, so SESSION and COOKIE are all available in the included file. HOWEVER, variables defined in the master are not available to the included page. You can circumvent this using temporary Session variables, though. For example, in VBScript:
Session("variable") = "value";
Server.Execute(url);
Session.Abandon;
Response.end;
Session.Abandon will clear ALL session variables; you might want to clear them individually instead.
You can make a request to your own server but the page making the request needs to NOT have session enabled in the page declaration right at the top of the page:
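In Classic ASP that declaration is the EnableSessionState attribute of the @ directive at the top of the page, something like this (assuming VBScript; use whatever language the page already declares):

<%@ Language="VBScript" EnableSessionState=False %>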
Each page locks the session object, and it is that lock which stops you making a request to your own server. If you declare that you are not going to use session in the calling script, then it won't take the lock, and you can call your own server using an XMLHTTP request, passing whatever you like on the query string, as post data, and even the session cookies, so session state etc. will all still exist.
