ASP.NET Caching Asynchronously - asp.net

First off, I think I should link to this article, which pretty much accomplishes what I want.
Here's my problem:
I have a user control on my site that needs to cache some data for at least 15 minutes and then pull the data again from the DB. The problem is that the pull takes about 7-10 seconds to get the result from the DB.
My thought is that I can set the cache expiration to something like two hours, then have a property on the cached object that says when it was loaded (let's call this the LoadDate property). I would then have the code pull the cached object.
If it's null, I have no choice but to pull the data synchronously and then load my user control
If it's not null, I want to go ahead and load the data onto my user control from the cached object. I would then check the LoadDate property. If it's been 15 minutes or more, then set up an asynchronous process to reload the cache.
There needs to be a process to lock the cache object while it's being updated.
I need an if statement that says: if the object is locked, just forget about updating it. This would be for subsequent page loads by other users - the first user would already be updating the cache, and I don't want to update the cache over and over again; it should just be updated by the first call. Remember, I'm already loading my user control before I even do the cache check.
In the article I linked to before, the answer set up the cache updating perfectly, but I don't believe it is asynchronous. The question started with doing it asynchronously using Page.RegisterAsyncTask. [Question 1] I can't seem to find any information on whether this would allow an asynchronous process to continue even if the user leaves the page.
[Question 2] Anybody have a good idea on how to do this? I have some code, but it has grown extremely long and still doesn't seem to be working correctly.

Question 1 (RegisterAsyncTask)
Very important thing to remember: from the client/user/browser perspective, this does NOT make the request asynchronous. If the Task you're registering takes 30 seconds to complete, the browser will still be waiting for 30+ seconds. The only thing RegisterAsyncTask does, is to free up your worker thread back to IIS for the duration of the asynchronous call. Don't get me wrong - this is still a valuable and important technique. However, for the user/browser making that particular request, it does not have a noticeable impact on response time.
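For reference, here's a minimal sketch of registering a task this way, assuming .NET 4.5+; the page class, handler name, and simulated delay are placeholders, not anything from the original question:

```
using System;
using System.Threading;
using System.Threading.Tasks;
using System.Web.UI;

// Requires Async="true" in the @Page directive.
public partial class ReportPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Frees the IIS worker thread while LoadDataAsync runs, but the browser
        // still waits until the task finishes before it gets a response.
        RegisterAsyncTask(new PageAsyncTask(LoadDataAsync));
    }

    private async Task LoadDataAsync(CancellationToken token)
    {
        // Stand-in for the slow 7-10 second DB pull.
        await Task.Delay(TimeSpan.FromSeconds(10), token);
    }
}
```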
Question 2
This may not be the best solution to your problem, but it's something I've used in the past that might help: when you add your item to the cache, specify an absoluteExpiration, and an onRemoveCallback. Create a function which gets fresh data, and puts it into the cache: this is the function you should pass as the onRemoveCallback. That way, every time your cached data expires, a callback occurs to put fresh data back into the cache; because the callback occurs as a result of a cache expiration event, there isn't a user request waiting for the 7-10 seconds it takes to cache fresh data.
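In case it helps, here's a minimal sketch of that pattern; GetFreshData() is a placeholder for the 7-10 second DB pull, and the 15-minute window is an assumption. I'm using HttpRuntime.Cache rather than HttpContext.Current.Cache because the removal callback fires on a background thread where there is no request context:

```
using System;
using System.Web;
using System.Web.Caching;

public static class ReportCache
{
    private const string CacheKey = "MyData";

    public static void LoadCache()
    {
        object freshData = GetFreshData();               // the 7-10 second DB pull

        HttpRuntime.Cache.Insert(
            CacheKey,
            freshData,
            null,                                        // no dependencies
            DateTime.UtcNow.AddMinutes(15),              // absoluteExpiration
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnCacheItemRemoved);                         // onRemoveCallback
    }

    private static void OnCacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // Re-populate the cache as soon as the old entry expires.
        LoadCache();
    }

    private static object GetFreshData()
    {
        return new object();                             // placeholder for the slow DB query
    }
}
```

You could call LoadCache() from Application_Start to handle the initial load mentioned in the caveats below.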
This isn't a perfect design. Some caveats:
How do you load the cache initially? The easiest way would be to call your cache-loader function from Application_Start.
Every 15 minutes, there will be a 7-10 second window where the cache is empty. Any requests during that time will need to go get the data themselves. Depending on your system usage patterns, that may be an acceptable small window, and only very few requests will occur during it.
The cache callback is not guaranteed to happen precisely when your cached item expires. If the system is under extremely heavy load, there could be a delay before the callback is triggered and the cache re-loaded. Again, depending on your system's usage, this may be a non-issue, or a significant concern.
Sorry I don't have a bulletproof answer for you (I'm going to keep an eye on this thread - maybe another SO'er does!). But as I said, I've used this approach with some success, and unless your system is extremely high-load, it should help address the question.
Edit: A slight twist on the above approach, based on OP's comment
You could cache a dummy value, and use it purely to trigger your refreshCachedData function. It's not especially elegant, but I've done it before, as well. :)
To elaborate: keep your actual cached data in the cache with a key of "MyData", no expiration, and no onRemoveCallback. Every time you cache fresh data in "MyData" you also add a dummy value to your cache: "MyDataRebuildTrigger", with a 15-minute expiration, and with an onRemoveCallback that rebuilds your actual cached data. That way, there's no 7-10 second gap when "MyData" is empty.
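A rough sketch of that variant, reusing the same assumed GetFreshData() placeholder; the key names match the ones described above:

```
using System;
using System.Web;
using System.Web.Caching;

public static class TriggeredReportCache
{
    private const string DataKey = "MyData";
    private const string TriggerKey = "MyDataRebuildTrigger";

    public static void RebuildCache()
    {
        // The real data: no expiration, so there is never a gap while rebuilding.
        HttpRuntime.Cache.Insert(DataKey, GetFreshData());

        // A dummy value whose only job is to expire in 15 minutes and fire the callback.
        HttpRuntime.Cache.Insert(
            TriggerKey,
            new object(),
            null,
            DateTime.UtcNow.AddMinutes(15),
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            (key, value, reason) => RebuildCache());
    }

    private static object GetFreshData()
    {
        return new object();                             // placeholder for the slow DB query
    }
}
```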

Related

Asp.Net MVC weird error when filling a really big form

On a new website, I have a huge form (really big - it needs at least 15-20 minutes to complete) that configures the whole website for one client for the next year.
It's spread across several tabs (it's a wizard). Every time we go to the next tab, it makes a regular (non-AJAX) call to the server that generates the next "page". The previous information is stored in the session (an object with a custom binder).
Everything was working fine until we tested it today with real data. Real data requires thought and work to find the correct elements, and that takes time.
The problem we have is that the View receives a partially empty Model. The session duration is set to 1440 minutes (in IIS too). What I know so far is that I get a null reference exception the first time I try to access the Model in my view.
I've been checking the controller for about an hour, and it seems impossible that it returns a null model. If I enter all the data very quickly, I don't have any problem (but that's with random data).
So far I've only managed to reproduce this problem on the IIS server, and I'm checking ELMAH logs to debug it, so it's not easy to reproduce.
Do you have any idea how I should debug this? I'm a little lost here.
I think you should assume the session does not offer reliable persistence. I'm not sure about the details, but I suspect it starts freeing elements when it exceeds its memory limit.
You will be safer if you use a database to store that information, or you could introduce your own implementation for persisting state.
In addition to the answer provided by #Ufuk:
You can easily send an AJAX request every minute that does nothing on the server; doing this keeps the session from expiring, so the site will continue to work during extended periods of inactivity.
The problem was that the session didn't have enough space, I think. I temporarily resolved the problem by restarting the application pool. I'm still searching for a solution that won't require changing all this code. Maybe another session state mode, but then I would need to make my models serializable.

What can I use: Session or Cache?

I am using a master page in which the menu is generated dynamically in code according to the user's role. The same menu is used across the whole application for a particular user until they log out. So instead of recreating it each time, I need to reuse the same menu throughout the application. The menu is built in a StringBuilder, which is very large. Is Session or the data Cache better and less memory-consuming in my situation, and why? Please suggest.
I want to improve the performance of the master page.
Thanks
I think the Cache will be better, as you will have only one instance created per role, whereas Session will create as many instances as there are users accessing the site, and you sometimes have to wait for the session timeout to free the memory.
If every user gets the same menu:
You should consider putting it in the Application "Cache" - Application["MyMenu"] or a static field on one of your objects.
The main reason for this is lifetime. If you put it in an application level object, then it will last for the lifetime of the application. Putting it in a session level object will cause it to be lost when that session ends - as a session is started per user, then you will soon find yourself recaching the data.
On the other hand... if it's unique per user:
The session provides a handy place to put this data, as it is unique to that user, and will not live long beyond that user leaving the site.
Also think about:
If you really think memory is going to be an issue, or you want to define exactly how long to keep it for:
Put it in the Cache. You can determine the amount of time it lives in the cache, and, additionally, the cache will start to dump objects when it gets short on memory - so it is more sensitive to load than the other options.
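If you do go the Cache route, a minimal sketch of caching the generated menu per role might look like the following; BuildMenuHtml, the key format, and the 30-minute window are assumptions, not anything from the question:

```
using System;
using System.Web;
using System.Web.Caching;

public static class MenuCache
{
    public static string GetMenu(string role)
    {
        string cacheKey = "Menu_" + role;
        string menuHtml = HttpRuntime.Cache[cacheKey] as string;

        if (menuHtml == null)
        {
            menuHtml = BuildMenuHtml(role);              // the expensive StringBuilder work
            HttpRuntime.Cache.Insert(
                cacheKey,
                menuHtml,
                null,
                DateTime.UtcNow.AddMinutes(30),          // tune to taste
                Cache.NoSlidingExpiration);
        }

        return menuHtml;
    }

    private static string BuildMenuHtml(string role)
    {
        return "<ul><!-- menu for " + role + " --></ul>"; // placeholder
    }
}
```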
There is a good discussion of Session vs Cache on SO already
Additionally
Are you sure your menu is that big? If it is, you might want to consider alternatives - just how big are you talking?

What perfmon counters are useful for identifying ASP.NET bottlenecks?

Given the chart here, what should I be looking at to identify the bottleneck? As you can see, requests are averaging nearly 14 seconds under load and the bulk of that time is attributed to the CLR in New Relic's profiling data. In the performance breakdown for a particular page, it attributes the bulk of the time to the WebTransaction/.aspx page.
I see that the database is also being read (the orange), and it seems that one page delays the rest of the pages because of the lock the session takes on each page.
You can also read:
Replacing ASP.Net's session entirely
My suggestion is to remove the session calls entirely, and if that is not possible, find another way to save that data somewhere in a database yourself.
Actually, in my pages I have used all three possible options: 1. I call the page without session. 2. I have made a totally custom session, with values connected to the user's cookie. 3. I have made threads that run outside the session and do their calculations in the background; when they finish, I show the results.
In some cases the calculations are done in an iframe that calls a page without session, and I show the results there later.
In New Relic's Pro version, you can use Transaction Traces, which help pinpoint exactly where the issue is happening.

Caching issue with javascript and asp.net

I asked a question a while back on here regarding caching data for a calendar/scheduling web app, and got some good responses. However, I have now decided to change my approach and start caching the data in JavaScript.
I am directly caching the HTML for each day's column in the calendar grid inside the $('body').data() object, which gives very fast page load times (almost unnoticeable).
However, problems start to arise when the user requests data that is not yet in the cache. This data is created by the server using an ajax call, so it's asynchronous, and takes about 0.2s per week's data.
My current approach is simply to block for 0.5s when the user requests information from the server, and to cache 4 weeks on either side in the initial page load (plus 1 extra week per page-change request); however, I doubt this is the optimal method.
Does anyone have a suggestion as to how to improve the situation?
To summarise:
Each week takes 0.2s to retrieve from the server, asynchronously.
Performance must be as close to real-time as possible. (however the data is not needed to be fully real-time: most appointments are added by the user and so we can re-cache after this)
Currently 4 weeks are cached on either side of the initial week loaded: this is not enough.
Caching 1 year takes ~21s, which is too slow for an initial load.
As I read your description, I thought of 2 things: Asynchrony and Caching.
First, Asynchrony
Why would you block for 0.5s? Why not use an ajax call, and in the callback, update the page with the retrieved info. There is no blocking for a set time, it is done asynchronously. You'd have to suppress multiple clicks though, while a request is outstanding, but that shouldn't be a problem at all.
You can also pre-load the in-page cache in the background, using setInterval or better, setTimeout. Especially makes sense if the cost to compute or generate the calendar is long and the data size is relatively small - in other words, small enough to store months in the in-page cache even if it is never used. Sounds like you may be doing this anyway and only need to block when the user jumps out of the range of cached data.
Intelligent Caching
I am imagining the callback function - the one that is called when the ajax call completes - will check if the currently selected date is on the "edge" of the cached data - either the first week in cache or the last week (or whatever). If the user is on the edge, then the callback can send out an additional request to optimistically pre-load the cache up to the 4 week limit, or whatever time range makes sense for your 80% use cases.
You may also consider caching the generated calendar data on the server side, on a per-user basis. If it is CPU- and time-intensive to generate these things, then it should be a good trade to generate once and keep it in the server-side cache, invalidating only when the user makes an update. With x64 servers and cheap memory, this is probably very feasible. Depending on the use cases, it may make for a much more usable interaction, the 2nd time a user connects to the app. You could even consider pre-loading the server-side cache on a per-user basis, before the user requests any calendar.
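A rough sketch of that server-side, per-user cache; BuildCalendarHtml, the key format, and the one-hour sliding window are placeholders, not anything from the question:

```
using System;
using System.Web;
using System.Web.Caching;

public static class CalendarCache
{
    public static string GetWeek(string userId, DateTime weekStart)
    {
        string key = "Calendar_" + userId + "_" + weekStart.ToString("yyyyMMdd");
        string html = HttpRuntime.Cache[key] as string;

        if (html == null)
        {
            html = BuildCalendarHtml(userId, weekStart); // the expensive generation step
            HttpRuntime.Cache.Insert(
                key,
                html,
                null,
                Cache.NoAbsoluteExpiration,
                TimeSpan.FromHours(1));                  // sliding expiration
        }

        return html;
    }

    public static void Invalidate(string userId, DateTime weekStart)
    {
        // Call this whenever the user adds or edits an appointment in that week.
        HttpRuntime.Cache.Remove("Calendar_" + userId + "_" + weekStart.ToString("yyyyMMdd"));
    }

    private static string BuildCalendarHtml(string userId, DateTime weekStart)
    {
        return "<div><!-- calendar for " + userId + " --></div>"; // placeholder
    }
}
```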

Best way to keep track of current online users

I have a requirement that my site always display the number of users currently online. For example, "35741 Users Currently Online". This is not based on a log in, simply how many users are currently on my site. I have tried using Session Start/Session End for this, however session end is not reliable. Therefore I get inflated numbers, as my session start adds numbers but session end doesn't remove them because it doesn't fire.
There is no additional information to be gathered from this (reporting, etc), it's simply requested that the number show up. Very simple request that's turning into a huge deal. Any help is appreciated.
EDIT:
I should specify that I have also tried using a database for this. Simple table that contains a session ID and a last activity column. With each page hit, I check to see if the session is in my database. If not, insert. If so, update with activity time. Then I run a procedure that sweeps the database looking for sessions with no activity in the last 20 minutes. This approach seemed to kill my SQL server and/or IIS. Had to restart the site.
The best way is what you're already doing, but time it out via activity. If a given session hasn't accessed a page within 5 minutes or so, you may consider that user no longer active.
If you're using ASP.Net membership, take a look at GetNumberOfUsersOnline.
For every user action that you can record, you need to consider them "online" for a certain window of time. Depending on the site, you may set that to 5 minutes. The actual web request should take less than a second. You have to make some assumption about how long they might stay on that page and do nothing but be considered online.
This approach requires that you keep track of the time of each users last activity.
Use Performance Counters:
State Server Sessions Active: the number of active user sessions.
Expanding on what silky said in his answer: since HTTP is stateless, to determine whether a user is currently 'online' you can really only track how long it has been since the user last accessed your site, and decide how long between requests you consider a user to still be active.
Since you stated that this isn't based on users logging in, maybe it's as simple as counting how many different IP addresses you received requests from in the past 5 minutes (or however long you consider the 'online' timeout to be).
Don't use sessions for this unless you also need sessions for something else; it's overkill otherwise.
Assuming a single-server installation, do something like this (a sketch follows the list):
For each user, issue a cookie that contains a unique ID
Maintain a static table of unique IDs and their last access time
In an HttpModule (or Global.asax), enter new users into the table and update their access times (use appropriate locking to prevent race conditions)
Periodically, either from a background thread or in-line with a user request, remove entries from the table that haven't made a request within the last N minutes. You might also want to support an explicit "log out" feature.
Report the number of people online as the size of the table
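Here's a minimal sketch of that approach; the cookie name, five-minute timeout, and module name are assumptions, and a ConcurrentDictionary stands in for the "static table with appropriate locking":

```
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Web;

public class OnlineUsersModule : IHttpModule
{
    // visitor id -> last activity time (UTC)
    private static readonly ConcurrentDictionary<string, DateTime> Visitors =
        new ConcurrentDictionary<string, DateTime>();

    private static readonly TimeSpan Timeout = TimeSpan.FromMinutes(5);

    // Report this number as "users currently online".
    public static int OnlineCount
    {
        get
        {
            DateTime cutoff = DateTime.UtcNow - Timeout;
            return Visitors.Count(v => v.Value >= cutoff);
        }
    }

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpContext ctx = app.Context;

            // Issue a unique id cookie if the visitor doesn't have one yet.
            HttpCookie cookie = ctx.Request.Cookies["visitor-id"];
            if (cookie == null)
            {
                cookie = new HttpCookie("visitor-id", Guid.NewGuid().ToString("N"));
                ctx.Response.Cookies.Add(cookie);
            }

            // Record the latest activity for this visitor.
            Visitors[cookie.Value] = DateTime.UtcNow;

            // Opportunistic in-line cleanup of stale entries.
            DateTime staleCutoff = DateTime.UtcNow - Timeout;
            foreach (var stale in Visitors.Where(v => v.Value < staleCutoff).ToList())
            {
                DateTime removed;
                Visitors.TryRemove(stale.Key, out removed);
            }
        };
    }

    public void Dispose() { }
}
```

You'd register the module in web.config (or wire the same logic into Global.asax) and render OnlineUsersModule.OnlineCount wherever the count is displayed.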
If you do use sessions, you can use the Session ID as the unique identifier. However, keep in mind that Session IDs aren't issued until you store something in the Session dictionary, unless you have a Session_Start() event configured.
In a load balanced or web garden scenario, it gets a little more complicated, but you can use the same basic idea, just persisting the info in a database instead of in memory.
When the user logs in write his user name into the HttpContext.Current.Cache with a sliding expiration (say 20 minutes).
Then, in Global.asax.cs, in Application_PreRequestHandlerExecute, "touch" the cache entry for the current user so that it resets the sliding expiration.
When a user explicitly logs out, remove his username from HttpContext.Current.Cache.
If you do this, at any given time HttpContext.Current.Cache.Count will give you the # of current users.
Note: this is assuming you aren't using the Cache for other purposes.
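A minimal sketch of that approach, assuming (as noted) that the Cache isn't used for anything else; the 20-minute window comes from the answer, while the class and method names are mine:

```
using System;
using System.Web;
using System.Web.Caching;

public static class OnlineUserTracker
{
    private static readonly TimeSpan Window = TimeSpan.FromMinutes(20);

    // Call at login and from Application_PreRequestHandlerExecute;
    // re-inserting with the same key resets the sliding expiration.
    public static void Touch(string userName)
    {
        HttpContext.Current.Cache.Insert(
            userName,
            DateTime.UtcNow,
            null,
            Cache.NoAbsoluteExpiration,
            Window);
    }

    // Call when the user explicitly logs out.
    public static void SignOut(string userName)
    {
        HttpContext.Current.Cache.Remove(userName);
    }

    // Because nothing else is stored in the Cache, Count is the number of users online.
    public static int OnlineCount
    {
        get { return HttpContext.Current.Cache.Count; }
    }
}
```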
