Caching issue with JavaScript and ASP.NET

I asked a question a while back on here regarding caching data for a calendar/scheduling web app, and got some good responses. However, I have now decided to change my approach and start caching the data in JavaScript.
I am directly caching the HTML for each day's column in the calendar grid inside the $('body').data() object, which gives very fast page load times (almost unnoticeable).
However, problems start to arise when the user requests data that is not yet in the cache. This data is generated by the server in response to an ajax call, so it's asynchronous, and takes about 0.2s per week of data.
My current approach is simply to block for 0.5s when the user requests information from the server, and to cache 4 weeks either side in the initial page load (plus 1 extra week per page change request), but I doubt this is the optimal method.
Does anyone have a suggestion as to how to improve the situation?
To summarise:
Each week takes 0.2s to retrieve from the server, asynchronously.
Performance must be as close to real-time as possible (the data does not need to be fully real-time, however: most appointments are added by the user, so we can re-cache after each change).
Currently 4 weeks are cached on either side of the initial week loaded: this is not enough.
Caching 1 year takes ~21s, which is too slow for an initial load.

As I read your description, I thought of 2 things: Asynchrony and Caching.
First, Asynchrony
Why would you block for 0.5s? Why not fire an ajax call and, in the callback, update the page with the retrieved info? There is no blocking for a set time; it is done asynchronously. You'd have to suppress multiple clicks while a request is outstanding, but that shouldn't be a problem at all.
You can also pre-load the in-page cache in the background, using setInterval or, better, setTimeout. This especially makes sense if the cost to compute or generate the calendar is high and the data size is relatively small - in other words, small enough to store months in the in-page cache even if it is never used. It sounds like you may be doing this anyway and only need to block when the user jumps out of the range of cached data.
Intelligent Caching
I am imagining the callback function - the one that is called when the ajax call completes - will check if the currently selected date is on the "edge" of the cached data - either the first week in cache or the last week (or whatever). If the user is on the edge, then the callback can send out an additional request to optimistically pre-load the cache up to the 4 week limit, or whatever time range makes sense for your 80% use cases.
You may also consider caching the generated calendar data on the server side, on a per-user basis. If it is CPU- and time-intensive to generate these things, then it should be a good trade to generate once and keep it in the server-side cache, invalidating only when the user makes an update. With x64 servers and cheap memory, this is probably very feasible. Depending on the use cases, it may make for a much more usable interaction, the 2nd time a user connects to the app. You could even consider pre-loading the server-side cache on a per-user basis, before the user requests any calendar.
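To illustrate, here's a minimal sketch of that server-side, per-user cache, assuming a hypothetical GenerateCalendarHtml method and key scheme; the real generation and invalidation logic would be your app's own:

```csharp
// Minimal sketch: per-user server-side cache for generated calendar HTML.
// GenerateCalendarHtml and the key scheme are hypothetical placeholders.
using System;
using System.Web;
using System.Web.Caching;

public static class CalendarCache
{
    public static string GetWeekHtml(string userId, DateTime weekStart)
    {
        string key = "cal:" + userId + ":" + weekStart.ToString("yyyy-MM-dd");
        string cached = HttpRuntime.Cache[key] as string;
        if (cached != null)
            return cached;

        string html = GenerateCalendarHtml(userId, weekStart); // the expensive step
        HttpRuntime.Cache.Insert(key, html, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromHours(1));
        return html;
    }

    // Invalidate when the user adds or edits an appointment in that week.
    public static void Invalidate(string userId, DateTime weekStart)
    {
        HttpRuntime.Cache.Remove("cal:" + userId + ":" + weekStart.ToString("yyyy-MM-dd"));
    }

    private static string GenerateCalendarHtml(string userId, DateTime weekStart)
    {
        // Placeholder for the real (CPU-intensive) generation logic.
        return "<div class='week'>...</div>";
    }
}
```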

Related

What perfmon counters are useful for identifying ASP.NET bottlenecks?

Given the chart here, what should I be looking at to identify the bottleneck? As you can see, requests are averaging nearly 14 seconds under load and the bulk of that time is attributed to the CLR in New Relic's profiling data. In the performance breakdown for a particular page, it attributes the bulk of the time to the WebTransaction/.aspx page.
I see that the database is also being read (the orange), and it seems that one page can delay all the others because of the lock that the session takes on each page.
You can also read:
Replacing ASP.Net's session entirely
My suggestion is to remove the session calls entirely or, if that is not possible, to find another way to save the values somewhere in the database yourself.
In my own pages I have actually implemented all three possible options: 1. I call the page without session. 2. I have made a totally custom session, with values keyed to the user's cookie. 3. I have made threads that run outside the session and do the calculations in the background; when they finish, I show the results.
In some cases the calculations are done in an iframe that calls a page without session, and I show the results there later.
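As an illustration of option 2, here's a minimal sketch of a custom per-user store keyed off a cookie rather than ASP.NET session; the "uid" cookie name is hypothetical, and the static dictionary stands in for the database table suggested above:

```csharp
// Minimal sketch: a cookie-keyed store that avoids the ASP.NET session lock.
// The "uid" cookie name is hypothetical; a real version would persist to a
// database table instead of an in-memory static dictionary.
using System;
using System.Collections.Concurrent;
using System.Web;

public static class CookieSession
{
    private static readonly ConcurrentDictionary<string, ConcurrentDictionary<string, object>>
        Store = new ConcurrentDictionary<string, ConcurrentDictionary<string, object>>();

    private static string GetUserKey(HttpContext ctx)
    {
        HttpCookie cookie = ctx.Request.Cookies["uid"];
        if (cookie == null)
        {
            cookie = new HttpCookie("uid", Guid.NewGuid().ToString("N"));
            ctx.Response.Cookies.Add(cookie);
        }
        return cookie.Value;
    }

    public static object Get(HttpContext ctx, string name)
    {
        ConcurrentDictionary<string, object> bag;
        object value;
        if (Store.TryGetValue(GetUserKey(ctx), out bag) && bag.TryGetValue(name, out value))
            return value;
        return null;
    }

    public static void Set(HttpContext ctx, string name, object value)
    {
        var bag = Store.GetOrAdd(GetUserKey(ctx),
            _ => new ConcurrentDictionary<string, object>());
        bag[name] = value;
    }
}
```

Unlike the built-in session, nothing here serializes concurrent requests from the same user, which is exactly what removes the lock contention.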
In New Relic's Pro version, you can use Transaction Traces, which help pinpoint exactly where the issue is happening.

ASP.NET Caching Asynchronously

First off, I think I should link to this article, which pretty much accomplishes what I want.
Here's my problem:
I have a user control on my site that needs to cache some data for at least 15 minutes and then pull the data again from the DB. The problem is that the pull takes about 7-10 seconds to get the result from the DB.
My thought is that I can set the cache to like two hours, then have a property in the cached object to say when the object was loaded (let's call this the LoadDate property). I would then have the code pull the cached object.
If it's null, I have no choice but to pull the data synchronously and then load my user control
If it's not null, I want to go ahead and load the data onto my user control from the cached object. I would then check the LoadDate property. If it's been 15 minutes or more, then set up an asynchronous process to reload the cache.
There needs to be a process to lock the cache object for this while it's updating
I need an if statement that says if the object is locked, then just forget about updating it. This would be for subsequent page loads by other users - the first user would already be updating the cache, and I don't want to update the cache over and over again; it should just be updated by the first call. Remember, I'm already loading my user control before I even do the cache check.
In the article I linked to before, the answer sets up the cache updating perfectly, but I don't believe it is asynchronous. The question started with doing it asynchronously using Page.RegisterAsyncTask. [Question 1] I can't seem to find any information on whether this would allow an asynchronous process to continue even if the user left the page.
[Question 2] Anybody have a good idea on how to do this? I have some code, but it has grown extremely long and still doesn't seem to be working correctly.
Question 1 (RegisterAsyncTask)
Very important thing to remember: from the client/user/browser perspective, this does NOT make the request asynchronous. If the Task you're registering takes 30 seconds to complete, the browser will still be waiting for 30+ seconds. The only thing RegisterAsyncTask does, is to free up your worker thread back to IIS for the duration of the asynchronous call. Don't get me wrong - this is still a valuable and important technique. However, for the user/browser making that particular request, it does not have a noticeable impact on response time.
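For reference, here's a minimal sketch of what registering such a task looks like, using the Task-based overload added in .NET 4.5 (LoadDataAsync is a hypothetical placeholder for the slow call):

```csharp
// Minimal sketch of RegisterAsyncTask; requires Async="true" in the @Page
// directive. LoadDataAsync is a hypothetical placeholder for the slow DB pull.
using System;
using System.Threading.Tasks;
using System.Web.UI;

public partial class MyPage : Page
{
    protected object[] Data; // consumed later in the page lifecycle

    protected void Page_Load(object sender, EventArgs e)
    {
        // Frees the IIS worker thread while the task runs, but the browser
        // still waits for the task to finish before the response is sent.
        RegisterAsyncTask(new PageAsyncTask(async () =>
        {
            Data = await LoadDataAsync();
        }));
    }

    private Task<object[]> LoadDataAsync()
    {
        // Placeholder for the real asynchronous DB call.
        return Task.FromResult(new object[0]);
    }
}
```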
Question 2
This may not be the best solution to your problem, but it's something I've used in the past that might help: when you add your item to the cache, specify an absoluteExpiration, and an onRemoveCallback. Create a function which gets fresh data, and puts it into the cache: this is the function you should pass as the onRemoveCallback. That way, every time your cached data expires, a callback occurs to put fresh data back into the cache; because the callback occurs as a result of a cache expiration event, there isn't a user request waiting for the 7-10 seconds it takes to cache fresh data.
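Here's a minimal sketch of that pattern, assuming a hypothetical GetFreshData method standing in for the 7-10 second DB pull; per the caveat below, the easiest way to prime it is to call LoadCache from Application_Start:

```csharp
// Minimal sketch of the expiration-callback pattern.
// GetFreshData is a hypothetical placeholder for the 7-10 second DB pull.
using System;
using System.Web;
using System.Web.Caching;

public static class DataCache
{
    // Call this once from Application_Start to prime the cache.
    public static void LoadCache()
    {
        object data = GetFreshData(); // the expensive call
        HttpRuntime.Cache.Insert(
            "MyData",
            data,
            null,                            // no dependency
            DateTime.Now.AddMinutes(15),     // absoluteExpiration
            Cache.NoSlidingExpiration,
            CacheItemPriority.Default,
            OnCacheRemoved);                 // onRemoveCallback
    }

    private static void OnCacheRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // Fired by the cache itself, not by a user request, so no request
        // waits out the 7-10 second reload.
        LoadCache();
    }

    private static object GetFreshData()
    {
        // Placeholder: the 7-10 second DB query goes here.
        return new object();
    }
}
```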
This isn't a perfect design. Some caveats:
How to load the cache initially? Easiest way would be to call your cache-loader function from the Application_Start function.
Every 15 minutes, there will be a 7-10 second window where the cache is empty. Any requests during that time will need to go get the data themselves. Depending on your system usage patterns, that may be an acceptable small window, and only very few requests will occur during it.
The cache callback is not guaranteed to happen precisely when your cached item expires. If the system is under extremely heavy load, there could be a delay before the callback is triggered and the cache re-loaded. Again, depending on your system's usage, this may be a non-issue, or a significant concern.
Sorry I don't have a bulletproof answer for you (I'm going to keep an eye on this thread - maybe another SO'er does!). But as I said, I've used this approach with some success, and unless your system is extremely high-load, it should help address the question.
Edit: A slight twist on the above approach, based on OP's comment
You could cache a dummy value, and use it purely to trigger your refreshCachedData function. It's not especially elegant, but I've done it before, as well. :)
To elaborate: keep your actual cached data in the cache with a key of "MyData", no expiration, and no onRemoveCallback. Every time you cache fresh data in "MyData" you also add a dummy value to your cache: "MyDataRebuildTrigger", with a 15-minute expiration, and with an onRemoveCallback that rebuilds your actual cached data. That way, there's no 7-10 second gap when "MyData" is empty.
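A minimal sketch of that twist, with the same hypothetical GetFreshData as above:

```csharp
// Minimal sketch of the dummy-trigger variant: the real data never expires;
// a throwaway key does, and its removal callback rebuilds both entries,
// so "MyData" is never empty while the reload runs.
public static void LoadCache()
{
    object data = GetFreshData(); // hypothetical slow DB pull

    // The real data: no expiration, no callback.
    HttpRuntime.Cache.Insert("MyData", data);

    // The trigger: expires in 15 minutes and rebuilds everything.
    HttpRuntime.Cache.Insert(
        "MyDataRebuildTrigger",
        new object(),
        null,
        DateTime.Now.AddMinutes(15),
        Cache.NoSlidingExpiration,
        CacheItemPriority.NotRemovable,
        (key, value, reason) => LoadCache());
}
```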

very large viewstate breaking web app

I have a web app, that consumes a web service. The main page runs a search - by passing parameters to a particular web service method, and I bind the results to a gridview.
I have implemented sorting and paging on the grid by putting the datatable that the grid is bound to in the viewstate, then reading/sorting/filtering it as necessary and rebinding to the grid.
As the amount of data coming back from the web service has increased dramatically, when I try to page/sort etc. I receive the following error:
The connection was reset
The connection to the server was reset while the page was loading.
I have searched around a bit, and it seems that a very large viewstate is to blame for this.
But surely my only options are to:
Limit the results
Stick the datatable in the session rather than the viewstate
Something else I am unaware of
Previously I did have the datatable in the session, as some of this data needed to persist from page to page (not being posted, however, so viewstate was not an option). As the amount of data rose and the necessity to persist it was removed, I used the viewstate instead, thinking this was a better option than the session because of the amount of data the session would have to hold and the number of users using the app.
It appears maybe not.
I thought that when the viewstate got very big, .NET split it over more than one hidden viewstate field, but it seems all I'm getting is one mammoth viewstate that I have trouble viewing in the source.
Can anyone enlighten me as to how to avoid the error I'm getting, if it is indeed to do with the amount of data in the viewstate?
It sounds like you're caching the whole dataset for all pages even though you are only presenting one page of that data. I would change your pagination to only require the data for the current page the user is on.
If the query is heavy and you don't want to be constantly calling it over and over because there is a lot of paging back and forth (you should test the typical usage pattern), then I would implement some type of caching on the web service end to cache page by page (by specific user if the data is specific to a user) and have it expire rather quickly (e.g. a few minutes); see the sketch below.
I think you need to limit the total amount of data you're dealing with. Changing your code to not pass back extra data that might never be needed is a good place to start.
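A minimal sketch of that page-by-page caching (FetchPage and the key scheme are hypothetical):

```csharp
// Minimal sketch: cache one page of results per user, expiring quickly.
// FetchPage is a hypothetical placeholder for the heavy web service call.
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class PageCache
{
    public static DataTable GetPage(string userId, int pageIndex, int pageSize)
    {
        string key = string.Format("results:{0}:{1}:{2}", userId, pageIndex, pageSize);
        DataTable page = HttpRuntime.Cache[key] as DataTable;
        if (page == null)
        {
            page = FetchPage(userId, pageIndex, pageSize); // the heavy query
            HttpRuntime.Cache.Insert(key, page, null,
                DateTime.Now.AddMinutes(2),      // expire rather quickly
                Cache.NoSlidingExpiration);
        }
        return page;
    }

    private static DataTable FetchPage(string userId, int pageIndex, int pageSize)
    {
        // Placeholder: call the web service for just this page of data.
        return new DataTable();
    }
}
```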
EDIT: Based on your comments:
You can't change the web service
The user can manipulate the query by filtering or sorting
There is a large amount of data returned by the web service
The data is user specific
Well, I think you have a perfect case for using the Session then. This can be taxing on the server with large amounts of users and data, so you might want to implement some logic to clear the data from the Session rather than wait for it to expire (e.g. on certain landing pages you know the user will visit when they are done, clear the session data).
You really want to get it out of the ViewState because it is a huge bandwidth hog. Just look at your physical page size: that data is being passed back and forth with every action. Moving it to the Session would eliminate that bandwidth usage and allow you to do everything you need.
You could also look at the data the web service is bringing back and store it in a custom object that you make as 'thin' as possible. If you're storing a DataSet or a DataTable in your Session, those objects have some extra overhead you probably don't need, so store the data as an array of some custom thin object and just bind to that. You would need to map the result from the WS to your custom object, but this is a good option to cut down on memory usage.
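A minimal sketch of the thin-object idea (ResultRow and its field names are hypothetical):

```csharp
// Minimal sketch: map the web service's DataTable rows to a thin custom
// object and keep only that array in Session. Field names are hypothetical.
using System;
using System.Data;
using System.Linq;

[Serializable]
public class ResultRow
{
    public int Id;
    public string Name;
    public DateTime Date;
}

public static class ResultMapper
{
    public static ResultRow[] ToThinRows(DataTable table)
    {
        // Keep only the columns the grid actually displays.
        return table.AsEnumerable()
            .Select(r => new ResultRow
            {
                Id = r.Field<int>("Id"),
                Name = r.Field<string>("Name"),
                Date = r.Field<DateTime>("Date")
            })
            .ToArray();
    }
}

// Usage (hypothetical): Session["Results"] = ResultMapper.ToThinRows(dataTable);
// then bind the grid to (ResultRow[])Session["Results"].
```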
Let me know if there is something else I am missing.
I wouldn't put the data in either the view state or the session. Instead, store the bare minimum information needed to re-request the dataset from the web service, and store that (in either view state or session, or even on the URL). Then call the web service using that data and re-fetch the data on each request. If necessary, look to use some form of caching (memCache) to improve performance.
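A minimal sketch of that approach, keeping only the search parameters in ViewState (SearchCriteria and CallWebService are hypothetical placeholders):

```csharp
// Minimal sketch: keep only what's needed to re-run the search in ViewState,
// and re-fetch the results on every postback. SearchCriteria and
// CallWebService are hypothetical placeholders.
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

[Serializable]
public class SearchCriteria
{
    public string Keyword;
    public string SortColumn;
    public int PageIndex;
}

public partial class SearchPage : Page
{
    protected GridView ResultsGrid; // declared in the page markup

    private SearchCriteria Criteria
    {
        get { return (SearchCriteria)(ViewState["Criteria"] ?? new SearchCriteria()); }
        set { ViewState["Criteria"] = value; }
    }

    private void BindGrid()
    {
        // A few bytes of criteria travel in ViewState instead of the dataset.
        ResultsGrid.DataSource = CallWebService(Criteria);
        ResultsGrid.DataBind();
    }

    private object CallWebService(SearchCriteria criteria)
    {
        // Placeholder: re-request this page of data from the web service.
        return new object[0];
    }
}
```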

How to reduce processing time on a web form

I have a webform with quite a few fields (between 15 and 40, based on user options). When the user finishes filling in the form, I block it with jQuery.blockUI, and then on the server side I process the form, packing it into an XML document and calling a new page. But the transition between pages usually takes about 1 or 2 seconds, and I want to reduce it.
It's possible to do all the processing on the next page, as the data is then sent to external web services and we wait for a response. That takes up to 2 minutes, so 1 or 2 seconds are less noticeable there.
So, is there a simple way to do all the data processing and still reduce the transition time?
Thanks in advance
UPDATE: I'm pretty sure that would be the better approach. But right now time is the top priority, and I'm convinced that I know the bottleneck but have little idea how to solve it or speed up the parsing of the data into an XML document that has nearly 200 fields (about 50 come from the form, the rest from queries or code).
On a side note, those 2 seconds come not only from data parsing, but also from our slow outbound connection on the development server, and connection speeds in Spain in general. I'm 80% sure that it won't be as slow on the production server, but I don't want to run the risk of assuming that nothing can be sped up.
Then, the couple of minutes spent querying external web services is out of my hands. It contacts a provider's web service that links to a couple of car insurance companies, which take the data and return a list of insurance quotes. And as this is dead time anyway, I think I can hide the two seconds of XML construction there.
The only thing I don't know is how to send form values from the form to the results page, which loads the data with Ajax.
I think you need to focus on why it takes so long to process 40 fields. What are the potential bottlenecks on the backend? What queries are you performing that take so long? If you can reduce the processing time to less than 10 seconds, you can get away with your page handling the processing; otherwise you need a different architecture, like REST or NServiceBus, to offload the long-running execution and somehow notify the client when you are done.
You could try to do the processing in a different thread: just take in the string, spin off the thread, and return the result. Unfortunately, thread programming doesn't qualify as "simple". Btw, "now" is typically perceived as anything below 3 seconds.
I re-read your question, and sorry for not thinking to ask first: do you have to parse the form back to XML? Is it possible to serialize your data to JSON, pass it up to the server, de-serialize it, and make the web request? The JSON format is much "lighter" than XML, and you can serialize and de-serialize with a library such as JSON.Net. This should eliminate some of your processing overhead.
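For instance, a minimal sketch with Json.NET (the FormData shape is hypothetical):

```csharp
// Minimal sketch: round-trip the form data as JSON with Json.NET
// instead of hand-building a 200-field XML document.
// FormData and its fields are hypothetical placeholders.
using Newtonsoft.Json;

public class FormData
{
    public string Name;
    public string PolicyType;
    // ... remaining form fields
}

public static class FormSerializer
{
    public static string ToJson(FormData data)
    {
        return JsonConvert.SerializeObject(data);
    }

    public static FormData FromJson(string json)
    {
        return JsonConvert.DeserializeObject<FormData>(json);
    }
}
```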
With respect to the web service you call, is the data new on each request? Is there any way of requesting less data, or storing portions of the data and refreshing periodically? Potentially you could run a messaging server such as MSMQ and refresh your data on a scheduled basis, then only request what you need once you have the user-specific data. 30 seconds is 30 seconds.
I keep thinking about the data - you say you have over 200 fields. I am unclear as to whether you have to perform queries or calculations. If you have numerous records, have you considered a different type of schema that might make your retrievals faster? Can you pull static lookups into a shared memory so you don't have to hit the disk?

Ajax data update. Extjs

I need to keep certain data (in a grid) up to date,
and was going to poll the server every 15 seconds or so to get the data and refresh the grid. However, it feels a bit dirty (the grid will show the loading icon every 15 seconds), which doesn't look great...
Another option is to check if there is new data, compare it with the current data, and only refresh the grid if there are any changes (I would have to do this client-side though, because maintaining the current state of every logged-in user also seems like overkill).
I'm sure there are better solutions and would love to hear about them.
I've heard about COMET, but it seems to be a bit of an overkill.
BTW, I'm using ASP.NET MVC on the server side.
I'd like to hear what people have to say for or against continuous polling with JS.
Cheers
Sounds like COMET is indeed the solution you're looking for. In that scenario, you don't need to poll, nor do comparisons, as you can push out only the "relevant" changed data to your grid.
Check out WebSync, it's a nice comet server for .NET that'll let you do exactly what you've described.
Here's a demo using ExtJS and ASP.NET that pushes a continuous stream of stock ticker updates. The demo is a little more than you need, but the principle is identical.
Every time you get the answer from the server, check if something has changed.
Do a request. Let the user know that you are working, with a spinner; don't hide it. Schedule the next request in 15 seconds. The next request executes; if nothing has changed, schedule the next one in 15 + 5 seconds. The next request executes; if nothing has changed, schedule the next one in 15 + 5 + 5 seconds. And so on. The next request executes; if something has indeed changed, reset the interval to 15 seconds.
Prototype can do this semi-automatically with Ajax.PeriodicalUpdater but you probably need stuff that is more customized to your needs.
Anyway, just an idea.
As for continuous polling in general: it's bad only if you hit a different site (using a PHP "bridge" or something like that). If you're using your own resources, you just have to make sure you don't deplete them. Set decent intervals with a decay.
I suggest Comet is not overkill if updates need to be constant. 15 seconds is very frequent; is your site visited by many? Your server may be consumed serving these requests while starving others.
I don't know what your server-side data source looks like, or what kind of data you're serving, but one solution is to serve your data with a timestamp, and send the timestamp of the last poll with every subsequent request.
Poll the server, sending the timestamp of when the service was last polled (eg: lastPollTime).
The server uses the timestamp to determine what data is new/updated and returns only that data (the delta), decreasing your transmission size and simplifying your client-side code.
It may be empty, it may be a few cells, it may be the entire grid, but the client always updates data that is returned to it because it is already known to be new.
The benefits of this method are that it simplifies your client side code (which is less code for the client to load), and decreases your transmission size for subsequent polls that have no new data for the user.
Also, this keeps server-side state manageable because you don't have to save a state for each individual user. You just have one state, the state of the current data, and requests are differentiated by access time.
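Since the question mentions ASP.NET MVC, here's a minimal sketch of such a delta endpoint (GetRowsChangedSince and the tick-based timestamp are hypothetical choices):

```csharp
// Minimal sketch of a delta-polling action in ASP.NET MVC.
// GetRowsChangedSince is a hypothetical placeholder for the real query.
using System;
using System.Web.Mvc;

public class GridController : Controller
{
    public JsonResult Updates(long lastPollTime)
    {
        DateTime since = new DateTime(lastPollTime, DateTimeKind.Utc);
        object[] changed = GetRowsChangedSince(since); // only the delta

        return Json(new
        {
            serverTime = DateTime.UtcNow.Ticks, // client sends this back on the next poll
            rows = changed                      // may be empty, a few cells, or the whole grid
        }, JsonRequestBehavior.AllowGet);
    }

    private object[] GetRowsChangedSince(DateTime since)
    {
        // Placeholder: query your data source for rows modified after 'since'.
        return new object[0];
    }
}
```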
I think checking if there is any new data is a good option.
I would count the number of rows in the database and compare that with the number of rows in your (HTML) table. If they're not the same, get the difference in rows.
Say you have 12 table rows and there are 14 database rows when you check: get the latest (14 - 12) = 2 rows.
