Caching across requests with HttpModule - asp.net

I have written an HttpModule that accepts the request, processes it (via a database lookup), and outputs results (html) back to the response. (Note that the HttpModule actually ends the request after it is done, so there is no normal ASP.NET processing of the request.)
As the database lookup can be expensive/time-consuming, I would like to store the results in-memory so that subsequent (identical) requests can be served the same content without going to the database.
Where can I store (in-memory) data so that it is available for subsequent invocations of the HttpModule?

You could store it in the application-wide ASP.NET cache (HttpRuntime.Cache, also exposed as HttpContext.Cache); the result would then be available to every request in the application. Be sure to check for "new data" every now and then if necessary.
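For illustration, here is a minimal sketch of that idea inside an HttpModule, assuming the module is registered in web.config and that LookupModule and LookupFromDatabase are placeholder names standing in for the real module and database work:

    using System;
    using System.Web;
    using System.Web.Caching;

    public class LookupModule : IHttpModule
    {
        public void Init(HttpApplication context)
        {
            context.BeginRequest += OnBeginRequest;
        }

        private void OnBeginRequest(object sender, EventArgs e)
        {
            var app = (HttpApplication)sender;
            string key = "lookup:" + app.Request.RawUrl;

            // HttpRuntime.Cache is shared by all requests in the worker process,
            // so it survives between invocations of the module.
            string html = HttpRuntime.Cache[key] as string;
            if (html == null)
            {
                html = LookupFromDatabase(app.Request);              // the expensive part
                // Keep the result for 10 minutes; tune this to how fresh the data must be.
                HttpRuntime.Cache.Insert(key, html, null,
                    DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
            }

            app.Response.ContentType = "text/html";
            app.Response.Write(html);
            app.CompleteRequest();   // end the request, as the question describes
        }

        private string LookupFromDatabase(HttpRequest request)
        {
            // Placeholder for the real lookup + rendering.
            return "<html><body>...</body></html>";
        }

        public void Dispose() { }
    }

Note the cache is per worker process: it is emptied on app-pool recycles and is not shared across servers in a web farm.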

How does the response size compare to the data you fetch from the database? And how expensive is rendering once you have that data in memory? If rendering is cheap, I would just cache the data and render it on each request. If rendering takes a lot of CPU, then you should cache the full response and serve it directly from the cache.
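As a rough fragment of the first option (cache the data, render per request), using the same module shape as the sketch above; ClientData, LoadDataFromDatabase and RenderHtml are illustrative placeholders:

    private void ServeFromDataCache(HttpApplication app, string key)
    {
        // Cache only the data; rendering still runs on every request.
        var data = HttpRuntime.Cache[key] as ClientData;
        if (data == null)
        {
            data = LoadDataFromDatabase();                           // expensive query
            HttpRuntime.Cache.Insert(key, data, null,
                DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);
        }
        app.Response.Write(RenderHtml(data));                        // CPU cost paid per request
        app.CompleteRequest();
    }

If that per-request rendering shows up in profiling, switch to caching the finished HTML string instead, as in the previous sketch.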

Related

ASP.NET Web API split response in chunks

Is it possible to split a response from Web API into chunks, as follows?
I have a WinForms app that can handle 100 KB of data at a time.
When this client makes a request to my ASP.NET Web API for some data, let's assume the Web API response is 2 MB. Can I somehow cache this response, split it into 100 KB chunks, and return the first chunk to my app, with that response containing a link/token to the next chunk, and so forth? Does this sound crazy? Is it feasible?
One more question: when we are talking about request/response content length (size), what does that mean: that the content itself cannot be bigger than 100 KB, or the content plus headers and so on? I want to know whether headers are included in the content length.
Example: if my response is 99 KB and the headers are 10 KB (109 KB total), will this pass if the limit is 100 KB?
Pagination is a pretty common solution to large data sets in webservices. Facebook, for example, will paginate API results when they exceed a certain number of rows. Your idea for locally caching the full result is a good optimization, though you can probably get away with not caching it as a first implementation if you are unsure of whether or not you will keep this as your final solution.
Without caching, you can just pass the page number and total number of pages back to your client, and it can then re-make the call with a specific page number in mind for the next set. This makes for an easy loop on the client, and your data access layer is marginally more complicated since it will only re-serialize certain row numbers depending on the page parameter, but that should still be pretty simple.
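A minimal paging sketch in ASP.NET Web API 2 style; PagedResult, PageSize and the in-memory AllRows list are illustrative stand-ins for the real data access layer:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Http;

    public class PagedResult
    {
        public int Page { get; set; }
        public int TotalPages { get; set; }
        public IList<string> Rows { get; set; }
    }

    public class RecordsController : ApiController
    {
        private const int PageSize = 100;   // sized so one page stays under the client's limit

        // Stand-in for the real data source.
        private static readonly List<string> AllRows =
            Enumerable.Range(1, 1000).Select(i => "row " + i).ToList();

        // GET api/records?page=2
        public PagedResult Get(int page = 1)
        {
            return new PagedResult
            {
                Page = page,
                TotalPages = (int)Math.Ceiling(AllRows.Count / (double)PageSize),
                Rows = AllRows.Skip((page - 1) * PageSize).Take(PageSize).ToList()
            };
        }
    }

The client just keeps calling Get with page + 1 until Page == TotalPages.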
Web API has access to the same HttpRuntime.Cache object as the rest of the ASP.NET project types do, so it should be easy to write a wrapper around your data access call and stick the result of the larger query into that cache. Use a token as the key to that value in the cache and pass the key, likely a GUID, back to the client along with the current page number. On subsequent calls, skip your normal persistence method (DB, file, etc.) and instead look up the GUID key in HttpRuntime.Cache and pull out the appropriate rows. One wrinkle is if you have multiple web servers hosting your service: the HttpRuntime.Cache will exist only on the machine that took the first call, so unless your load balancer has IP affinity or you have a distributed caching layer, this will be more difficult to implement.
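And a hedged sketch of the token approach described above, assuming IIS hosting (so HttpRuntime.Cache is available) and using illustrative names such as ChunkedController and RunExpensiveQuery:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Net;
    using System.Web;
    using System.Web.Caching;
    using System.Web.Http;

    public class ChunkResponse
    {
        public Guid Token { get; set; }
        public int Page { get; set; }
        public int TotalPages { get; set; }
        public IList<string> Rows { get; set; }
    }

    public class ChunkedController : ApiController
    {
        private const int PageSize = 100;

        // First call: run the expensive query once, park the full result in
        // HttpRuntime.Cache under a GUID token, and return page 1 plus the token.
        public ChunkResponse Get()
        {
            List<string> rows = RunExpensiveQuery();
            Guid token = Guid.NewGuid();
            HttpRuntime.Cache.Insert(token.ToString(), rows, null,
                DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
            return BuildPage(rows, token, 1);
        }

        // Later calls: skip the database and page out of the cached result.
        // GET api/chunked?token=...&page=2
        public ChunkResponse Get(Guid token, int page)
        {
            var rows = HttpRuntime.Cache[token.ToString()] as List<string>;
            if (rows == null)
                throw new HttpResponseException(HttpStatusCode.Gone);   // expired, or a different server answered
            return BuildPage(rows, token, page);
        }

        private ChunkResponse BuildPage(List<string> rows, Guid token, int page)
        {
            return new ChunkResponse
            {
                Token = token,
                Page = page,
                TotalPages = (int)Math.Ceiling(rows.Count / (double)PageSize),
                Rows = rows.Skip((page - 1) * PageSize).Take(PageSize).ToList()
            };
        }

        private List<string> RunExpensiveQuery()
        {
            // Placeholder for the real 2 MB query.
            return Enumerable.Range(1, 2000).Select(i => "row " + i).ToList();
        }
    }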

Benefit of Output Caching Compared to Caching by Browser

IIS has a feature to enable "Output Caching" for ASP.NET sites. I would like to know what the benefit of this type of caching is compared to the caching done by our browser.
I am wondering because, if our browser has the power to cache content (such as JS/CSS/images), why would .NET implement a feature such as output caching?
Imagine a page that takes a lot of server-side resources to create -- maybe database calls, heavy computation, etc.
If one user requests that page, and it gets cached by the browser, then the next time that user requests the same page, it will already be on their machine -- so it won't have to be generated by the server or transferred over the network again.
Next, imagine that a second user requests the same page. The fact that a copy of the page was cached by the first user's browser doesn't help. Without output caching, the server will need to perform those time-consuming operations all over again to generate the page.
If the page used output caching, then the result from the first time it was created would be stored in memory on the server, so the cached result could be sent in response to subsequent requests -- which saves time and server-side resources.
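As a small illustration of that server-side behavior, here is a minimal Web Forms page with the OutputCache directive (the five-minute duration is chosen arbitrarily):

    <%@ Page Language="C#" %>
    <%@ OutputCache Duration="300" VaryByParam="none" %>
    <html>
    <body>
        Generated at <%= DateTime.Now.ToString("HH:mm:ss") %>
        <!-- With the directive above, this timestamp changes at most every five minutes:
             the server stores the rendered HTML and replays it for every user, instead of
             re-executing the page for each request. -->
    </body>
    </html>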
Think of it for multiple users, let's say 100.
Without Output Caching, IIS would have to process and generate the page for each user request, so the page is processed 100 times.
With Output Caching IIS would have to process the page once (for the first user requesting it), then cache it and return the same version for the other 99 users.

What is output caching, is it always good to use output caching to improve web app's performance?

What is output caching for a web application? Is it always good to use output caching to improve a web app's performance? Besides output caching, are there other caching techniques?
Output caching stores a rendered page/control and spits back the stored HTML instead of having to generate it again for each request. Typically you do output caching for a specified period of time, for example 60 seconds.
On the first request the output is cached; subsequent requests within that 60-second window use the cached page instead of generating the HTML again. If this control is database-intensive, then every subsequent request in that window saves the database calls, etc., and page loads for those requests should be much faster.
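Fragment caching works the same way for a single control; a minimal sketch of a user control cached for the 60 seconds mentioned above (the markup is illustrative):

    <%@ Control Language="C#" %>
    <%@ OutputCache Duration="60" VaryByParam="none" %>
    <%-- Only this control's rendered HTML is cached for 60 seconds;
         the hosting page still executes on every request. --%>
    <p>Report generated at <%= DateTime.Now.ToString("HH:mm:ss") %></p>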
Information on Output Caching is readily available on Google.
Other caching techniques would include, but are definitely not limited to:
Browser Caching
Object Caching
Query Caching
Have a read of this:
http://msdn.microsoft.com/en-us/library/aa478965.aspx
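"Object Caching" from the list above usually means parking data in the ASP.NET cache; here is a hedged sketch with a sliding expiration (ProductCache and LoadProductsFromDatabase are illustrative names):

    using System;
    using System.Collections.Generic;
    using System.Web;
    using System.Web.Caching;

    public static class ProductCache
    {
        public static IList<string> GetProducts()
        {
            var products = HttpRuntime.Cache["products"] as IList<string>;
            if (products == null)
            {
                products = LoadProductsFromDatabase();               // expensive call
                // Sliding expiration: the entry stays alive while it keeps being used
                // and drops out after 10 idle minutes.
                HttpRuntime.Cache.Insert("products", products, null,
                    Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(10));
            }
            return products;
        }

        private static IList<string> LoadProductsFromDatabase()
        {
            // Placeholder for the real query.
            return new List<string> { "widget", "gadget" };
        }
    }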

How long should I cache an object which can be changed at any time?

I'm in the process of making a fairly complex application. It is expected to run across multiple web servers, otherwise this would be easy.
Basically, we have a set of Client records. Each Client record has an XML column which contains all of the "real" data, such as the client's name and other fields which are made dynamically. Our users can update a client's record at any time. Also, we have Application records; each application is tied to multiple clients, usually more than three. Each client's XML data is usually greater than 5 KB of text.
In some profiling I've done, obtaining and deserializing this XML data is a fairly expensive operation. One portion of our web application must have very low latencies (related). During this portion, our web application is a JSON web service, and when a request is made to it, usually every client record will be needed (in full, due to how it's currently coded). I'm trying to make as few database hits as possible in this portion.
How long should I cache the Client records' XML objects? Knowing the user can change them at any time, I'm not sure if I should cache them at all, but can users live with slightly stale data?
Instead of refreshing the cache on any kind of schedule, just compare the last modified date of any critical records with the cached value when accessed, which should be a very inexpensive operation. Then update the cache only when needed.
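A hedged sketch of that approach; CachedClient, GetLastModifiedFromDb and LoadClientXmlFromDb are illustrative names, and the only schema assumption is a LastModified column on the Client table:

    using System;
    using System.Web;

    public class CachedClient
    {
        public DateTime LastModified { get; set; }
        public string Xml { get; set; }
    }

    public static class ClientCache
    {
        public static string GetClientXml(int clientId)
        {
            string key = "client:" + clientId;
            var cached = HttpRuntime.Cache[key] as CachedClient;

            // Cheap query: only the LastModified column, no XML, no deserialization.
            DateTime lastModified = GetLastModifiedFromDb(clientId);

            if (cached == null || cached.LastModified < lastModified)
            {
                // Expensive path: fetch the full XML only when it actually changed.
                cached = new CachedClient
                {
                    LastModified = lastModified,
                    Xml = LoadClientXmlFromDb(clientId)
                };
                HttpRuntime.Cache.Insert(key, cached);
            }
            return cached.Xml;
        }

        // Placeholders for the real queries.
        private static DateTime GetLastModifiedFromDb(int clientId) { return DateTime.MinValue; }
        private static string LoadClientXmlFromDb(int clientId) { return "<client/>"; }
    }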
You could store a hash of the xml in the database that the clients validate their cached XML against.
Then, if it doesn't match up, invalidate your cache and retrieve the new version.
When the XML is updated, update the hash with it and then your clients will notice and update their cache.
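A rough sketch of the hashing side of that idea; the XmlHash name and the column references in the comments are illustrative:

    using System;
    using System.Security.Cryptography;
    using System.Text;

    public static class XmlHash
    {
        public static string Compute(string xml)
        {
            using (var sha = SHA256.Create())
            {
                byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(xml));
                return Convert.ToBase64String(hash);
            }
        }
    }

    // On update: write the XML and XmlHash.Compute(xml) to the Client row together.
    // On read:   select only the stored hash; if it differs from the hash kept with
    //            the cached copy, reload the XML and refresh the cache.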
Maybe you should use an SqlCacheDependency to ensure the data is removed from the cache and reloaded from the database whenever it changes.
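A hedged sketch of that, in polling mode; it assumes the database and the Client table have been enabled for notifications (for example with aspnet_regsql or SqlCacheDependencyAdmin) and that web.config has a matching <sqlCacheDependency> database entry named "ClientDb" -- all of those names are illustrative:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class ClientXmlCache
    {
        public static string Get(int clientId)
        {
            string key = "client-xml:" + clientId;
            var xml = HttpRuntime.Cache[key] as string;
            if (xml == null)
            {
                xml = LoadClientXmlFromDb(clientId);                 // expensive query
                var dependency = new SqlCacheDependency("ClientDb", "Client");
                // The entry is evicted automatically when the Client table changes.
                HttpRuntime.Cache.Insert(key, xml, dependency);
            }
            return xml;
        }

        private static string LoadClientXmlFromDb(int clientId)
        {
            // Placeholder for the real query.
            return "<client/>";
        }
    }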

Cache and outputcache

What is the difference between the Cache property and the OutputCache directive?
Cache is where you can put data - stuff coming from the database, or as the result of an expensive calculation, for example. Anything in cache should be available to all users.
OutputCache caches HTML - an entire page, or the output from a user control.
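A small contrast sketch in page code-behind (ProductsPage and LoadTopProducts are illustrative names): Cache holds data your code reads back later, while OutputCache -- the <%@ OutputCache %> directive shown in the answers above -- stores the rendered HTML itself:

    using System;
    using System.Collections.Generic;
    using System.Web.UI;

    public partial class ProductsPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Cache: data, shared by every user of the application.
            var products = Cache["top-products"] as List<string>;
            if (products == null)
            {
                products = LoadTopProducts();                        // expensive call
                Cache.Insert("top-products", products, null,
                    DateTime.UtcNow.AddMinutes(5),
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }
            // ...bind products to controls as usual; the HTML is still generated
            // on every request unless OutputCache is used as well.
        }

        private List<string> LoadTopProducts()
        {
            return new List<string> { "widget", "gadget" };          // placeholder
        }
    }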
You can think of the two as quite different things.
Most of the time, Cache is used to store data or business results, so you only have to run the business logic or hit the database the first time. You'll find it very efficient when that processing takes a long time, and you can use it in any of your layers: the data layer, the business layer, and so on.
OutputCache tells the IIS server, a proxy, or the client to cache the response itself, which is especially beneficial for dynamic pages. Once a page has been requested, the server can answer later requests with the cached result.
