Ajax data update with ExtJS and ASP.NET

I need to keep certain data (in a grid) up to date,
and I was going to poll the server every 15 seconds or so to get the data and refresh the grid. However, it feels a bit dirty (the grid will show its loading icon every 15 seconds), which doesn't look great.
Another option is to check if there is new data, compare it with the current data, and only refresh the grid if anything has changed. (I would have to do this client-side, though, because maintaining the current state of every logged-in user also seems like overkill.)
I'm sure there are better solutions and would love to hear about them.
I've heard about Comet, but it seems to be a bit of an overkill.
BTW, I'm using ASP.NET MVC on the server side.
I'd like to hear what people have to say for or against continuous polling with JS.
Cheers

Sounds like COMET is indeed the solution you're looking for. In that scenario, you don't need to poll, nor do comparisons, as you can push out only the "relevant" changed data to your grid.
Check out WebSync, it's a nice comet server for .NET that'll let you do exactly what you've described.
Here's a demo using ExtJS and ASP.NET that pushes a continuous stream of stock ticker updates. The demo is a little more than you need, but the principle is identical.

Every time you get the answer from the server, check whether something has changed.
Do a request, and let the user know you're working with a spinner; don't hide it. Schedule the next request in 15 seconds. If that request finds nothing has changed, schedule the next one in 15 + 5 seconds; if that one also finds no changes, schedule the next in 15 + 5 + 5 seconds, and so on. As soon as a request finds that something has indeed changed, reset the interval to 15 seconds.
Prototype can do this semi-automatically with Ajax.PeriodicalUpdater but you probably need stuff that is more customized to your needs.
Anyway, just an idea.
As for continuous polling in general: it's bad only if you hit a different site (through a PHP "bridge" or something like that). If you're using your own resources, you just have to make sure you don't deplete them. Set decent intervals with a decay.
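The decay schedule described above can be sketched as a small helper plus a simulated run (the 15s/5s constants come from the answer; everything else is illustrative, not a specific library's API):

```javascript
// A sketch of the decay schedule: 15s base interval, +5s after every
// poll whose data is unchanged, reset to 15s as soon as data changes.
const BASE_INTERVAL = 15000; // ms
const DECAY_STEP = 5000;     // ms

// Pure helper: compute the delay before the next poll.
function nextInterval(current, dataChanged) {
  return dataChanged ? BASE_INTERVAL : current + DECAY_STEP;
}

// Simulate a run: one change, three quiet polls, then another change.
let interval = BASE_INTERVAL;
const schedule = [];
for (const changed of [true, false, false, false, true]) {
  interval = nextInterval(interval, changed);
  schedule.push(interval);
}
// schedule is [15000, 20000, 25000, 30000, 15000]
```

In the real page you would call `setTimeout(poll, interval)` inside the Ajax callback, recomputing `interval` from each response.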

I suggest Comet is not overkill if updates need to be constant. Fifteen seconds is very frequent; is your site visited by many users? Your server may be consumed serving these polling requests while starving others.

I don't know what your server-side data source looks like, or what kind of data you're serving, but one solution is to serve your data with a timestamp and send the timestamp of the last poll with every subsequent request.
Poll the server, sending the timestamp of when the service was last polled (eg: lastPollTime).
The server uses the timestamp to determine what data is new/updated and returns only that data (the delta), decreasing your transmission size and simplifying your client-side code.
It may be empty, it may be a few cells, it may be the entire grid, but the client always updates data that is returned to it because it is already known to be new.
The benefits of this method are that it simplifies your client side code (which is less code for the client to load), and decreases your transmission size for subsequent polls that have no new data for the user.
Also, this saves you from maintaining per-user state on the server side, because you don't have to save a state for each individual user. You just have one state, the state of the current data, and each client's view of it is differentiated by access time.
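A minimal sketch of the timestamp/delta scheme above, with the server faked as an in-memory array (in practice `getDelta` would be an MVC action filtering rows by a `lastPollTime` query parameter; all names here are illustrative):

```javascript
// Fake server-side data store: rows tagged with an update timestamp.
const serverRows = [
  { id: 1, updated: 100, value: 'a' },
  { id: 2, updated: 250, value: 'b' }
];

// "Server side": return only rows updated after lastPollTime.
function getDelta(lastPollTime) {
  return serverRows.filter(function (r) { return r.updated > lastPollTime; });
}

// "Client side": poll, apply whatever comes back, advance the clock.
let lastPollTime = 0;
function poll(now) {
  const delta = getDelta(lastPollTime);
  lastPollTime = now;
  return delta; // always safe to apply: it is known to be new
}

const first = poll(300);                                // initial load: both rows
serverRows.push({ id: 3, updated: 400, value: 'c' });   // a new row arrives
const second = poll(500);                               // delta: only row 3
const third = poll(600);                                // nothing new: empty delta
```

The client simply merges whatever rows come back into the grid; empty deltas cost almost nothing on the wire.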

I think checking if there is any new data is a good option.
I would count the number of rows in the database and compare that with the number of rows in your (HTML) table. If they're not the same, get the difference in rows.
Say you have 12 table rows and there are 14 database rows when you check: get the latest (14 - 12) = 2 rows.
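The arithmetic above amounts to a one-line helper (names are illustrative):

```javascript
// Sketch of the row-count comparison: the number of rows to fetch is
// the database count minus what the HTML table already shows.
function rowsToFetch(dbRowCount, tableRowCount) {
  return Math.max(0, dbRowCount - tableRowCount);
}

const missing = rowsToFetch(14, 12); // fetch the 2 latest rows
```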

Related

Can we set a refresh rate for SignalR?

I have implemented SignalR and it is working fine. My concern is that I only want to push data at most every 5 seconds; I don't want to refresh the data if it changes within 5 seconds of the last refresh. Is that possible?
You can simply store the messages somewhere in your app and broadcast them whenever you choose to. It's that simple. However, I question the validity of using SignalR for this; there are other ways to do it that are simpler and less resource-intensive.
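The "store, then broadcast on your own schedule" idea can be sketched in plain JavaScript (the real version would live server-side, e.g. inside your SignalR host; the buffer, the 5-second minimum, and the `sent` stand-in for connected clients are all illustrative):

```javascript
// Buffer incoming messages and flush them at most once per 5 seconds.
const MIN_INTERVAL = 5000;  // ms between broadcasts
let buffer = [];
let lastBroadcast = -Infinity;
const sent = [];            // stand-in for pushes to connected clients

function enqueue(message, now) {
  buffer.push(message);
  if (now - lastBroadcast >= MIN_INTERVAL) {
    sent.push(buffer.slice()); // broadcast the whole batch at once
    buffer = [];
    lastBroadcast = now;
  }
}

enqueue('a', 0);    // first message: broadcast immediately
enqueue('b', 2000); // within 5s of last broadcast: buffered
enqueue('c', 6000); // 6s after last broadcast: flush ['b', 'c']
```

A timer-based flush would work equally well; the point is only that nothing forces you to broadcast on every change.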

What perfmon counters are useful for identifying ASP.NET bottlenecks?

Given the chart here, what should I be looking at to identify the bottleneck? As you can see, requests are averaging nearly 14 seconds under load and the bulk of that time is attributed to the CLR in New Relic's profiling data. In the performance breakdown for a particular page, it attributes the bulk of the time to the WebTransaction/.aspx page.
I can see that the database is also being read (the orange), and it seems that one page delays all the others because of the lock that session state places on the pages.
You can also read:
Replacing ASP.Net's session entirely
My suggestion is to remove the session calls entirely, and if that's not possible, find another way to save the values somewhere in the database yourself.
In my own pages I have actually used all three possible options: 1. I call the page without session. 2. I have made a totally custom session, with values connected to the user's cookie. 3. I have made threads that run outside the session and do calculations in the background; when they finish, I show the results.
In some cases the calculations are done in an iframe that calls a page without session, and I show the results there afterwards.
In the Pro version, you can use Transaction Traces, which help pinpoint exactly where the issue is happening.

Viewstate or Session or Database Call

I have a this asp.net page which upon first time load:
1: Make a DB call and get the data as an XML string (this chunk can go beyond 100 KB). This DB call is a bit expensive and takes about 5-10 seconds.
2: I loop through this XML and create a custom collection with values from the XML, then bind it to a Repeater control.
Now the Repeater control has one text input per row. The user is free to enter values in one, several, or all of the textboxes, or leave them all blank, and then hit the Save button.
On the Save postback, I will have to loop through all rows in the Repeater, collect all the rows that have some input, generate an XML document using the initial values plus this new input, and save it to the DB.
Problem:
So I will need a reference to all the initial XML values. I can think of these options and am looking for input on selecting the best one.
1: ViewState: store my collection or XML string in ViewState. I'm sure it will be too huge.
2: Session: use Session to store the collection or XML string. Again, probably too huge.
3: DB call: make a DB call to get the data again. As I said, it's an expensive call, and my DBA is asking me to avoid it.
4: HiddenField: store the essential data from the XML in hidden fields and use that for the Save manipulation, i.e. in each Repeater item find all the hidden fields.
Which one is best in terms of better request response and less resource utilization on server?
Or is there a better way I am missing?
PS: Have to use ASP.NET 2.0 WebForms only.
Update1:
I tried the following with ViewState:
1: Storing the entire XML string: ViewState length = 97484, and Firebug shows a page size of 162 KB.
2: Storing a stripped-down version of the collection with only the required data: ViewState length = 27372, and Firebug shows a page size of 94 KB; with gzip compression it reduces to 13 KB.
For comparison, Firebug shows the existing website at 236 KB.
So option 2 is definitely better, and my new version is better than the current website.
So any inputs?
A quick question - who is your target audience for this page? If it's an internal website for a company then just storing the data in viewstate might be acceptable. If it's for external people, e.g. potential customers, then speed and performance probably matter to you more.
Viewstate - have you tried adding your XML to viewstate? How much did it increase the page size by? If you're gzipping all of your pages rather than sending them over the wire uncompressed then you could see about a 70% reduction in size - 30kb isn't that much these days...
Session - it is worth remembering that the server can and will drop data from sessions if it runs out of space. They can also expire. Do you trust your users not to log in again in a new tab and then submit the page that they've had open for the last 10 hours? While using session results in less data on the wire you might need to re-pull the data from the db if the value does end up being dropped for whatever reason. Also, if you're in a web farm environment etc there are complications involving synchronizing sessions across servers.
DB Call - can the query be optimised in any way? Are the indices on all the fields that need them? Maybe you and your DBA can make it less painful to pull. But then again, if the data can change between you pulling it the first time and the user submitting their changes then you wouldn't want to re-pull it, I suspect.
Hidden Fields - With these you'd be saving less data than if you put the whole string in Viewstate. The page wouldn't be depending on the state of the webserver like with session and nor would you be racing against other users changing the state of the db.
On the whole, I think 4 is probably the best bet if you can afford to slow your pages down a little. Use Firebug/YSlow and compare how much data is transmitted before and after implementing 4.
One final thought - how are things like this persisted between postbacks in the rest of your webapp? Assuming that you haven't written the whole thing on your own/only just started it you might be able to find some clues as to how other developers in a similar situation solved the problem.
Edit:
there is a load-balancer, not sure how it will play with Session
If you have a load balancer then you need to make sure that session state is stored in a state server or similar and not in the process ("inproc"). If the session is stored on the webserver then option 2 will play very badly with the load balancer.
While I'm not a huge fan of overusing session, this will probably be your best bet as it will be your fastest option from the user's standpoint.
Since session state does have its own inherent issues, you could load the data you need into session, and if your session drops for whatever reason, just do another database hit and reload it.
I would really stay away from options 1 and 4 just because of the amount of unnecessary data you will be sending to the client, and potentially slowing down their experience.
Option 3 will also slow down the user experience, so I would stay away from that if at all possible unless you can speed up your query time.

Caching issue with javascript and asp.net

I asked a question on here a while back regarding caching data for a calendar/scheduling web app, and got some good responses. However, I have now decided to change my approach and start caching the data in JavaScript.
I am directly caching the HTML for each day's column in the calendar grid inside the $('body').data() object, which gives very fast page load times (almost unnoticeable).
However, problems start to arise when the user requests data that is not yet in the cache. This data is created by the server using an ajax call, so it's asynchronous, and takes about 0.2s per week's data.
My current approach is simply to block for 0.5 s when the user requests information from the server, and to cache 4 weeks on either side in the initial page load (plus 1 extra week per page-change request), but I doubt this is the optimal method.
Does anyone have a suggestion as to how to improve the situation?
To summarise:
Each week takes 0.2s to retrieve from the server, asynchronously.
Performance must be as close to real-time as possible (however, the data does not need to be fully real-time: most appointments are added by the user, so we can re-cache after that).
Currently 4 weeks are cached on either side of the initial week loaded; this is not enough.
Caching 1 year takes ~21 s, which is too slow for an initial load.
As I read your description, I thought of 2 things: Asynchrony and Caching.
First, Asynchrony
Why would you block for 0.5 s? Why not use an ajax call and, in the callback, update the page with the retrieved info? There is no blocking for a set time; it is done asynchronously. You'd have to suppress multiple clicks while a request is outstanding, but that shouldn't be a problem at all.
You can also pre-load the in-page cache in the background, using setInterval or better, setTimeout. Especially makes sense if the cost to compute or generate the calendar is long and the data size is relatively small - in other words, small enough to store months in the in-page cache even if it is never used. Sounds like you may be doing this anyway and only need to block when the user jumps out of the range of cached data.
Intelligent Caching
I am imagining the callback function - the one that is called when the ajax call completes - will check if the currently selected date is on the "edge" of the cached data - either the first week in cache or the last week (or whatever). If the user is on the edge, then the callback can send out an additional request to optimistically pre-load the cache up to the 4 week limit, or whatever time range makes sense for your 80% use cases.
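A sketch of that edge-of-cache prefetch: the cache maps week numbers to HTML, `fetchWeek` stands in for the 0.2s ajax call, and the 4-week window matches the question (all names and the window size are illustrative):

```javascript
const PREFETCH_WINDOW = 4;
const cache = {};                 // week number -> day-column HTML

// Stand-in for the async ajax call that renders one week server-side.
function fetchWeek(week) {
  return Promise.resolve('<div>week ' + week + '</div>');
}

// Fetch a week only if it is not already cached.
function ensureCached(week) {
  if (cache[week]) return Promise.resolve(cache[week]);
  return fetchWeek(week).then(function (html) {
    cache[week] = html;
    return html;
  });
}

// After showing `week`, optimistically warm the cache on both sides.
function prefetchAround(week) {
  const wanted = [];
  for (let d = 1; d <= PREFETCH_WINDOW; d++) {
    wanted.push(week - d, week + d);
  }
  return Promise.all(wanted.map(ensureCached));
}
```

Calling `prefetchAround` from the ajax callback keeps the user's navigation inside cached territory without ever blocking the UI.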
You may also consider caching the generated calendar data on the server side, on a per-user basis. If it is CPU- and time-intensive to generate these things, then it should be a good trade to generate once and keep it in the server-side cache, invalidating only when the user makes an update. With x64 servers and cheap memory, this is probably very feasible. Depending on the use cases, it may make for a much more usable interaction, the 2nd time a user connects to the app. You could even consider pre-loading the server-side cache on a per-user basis, before the user requests any calendar.

How to reduce processing time on a web form

I have a web form with quite a few fields (between 15 and 40, depending on user options). When the user finishes filling in the form, I block it with jQuery.blockUI, then on the server side I process the form, pack it into an XML document, and call a new page. The transition between pages usually takes 1 or 2 seconds, and I want to reduce that.
It's possible to do all the processing on the next page, as the data is then sent to external web services while we wait for a response. That takes up to 2 minutes, so 1 or 2 seconds are less noticeable there.
So, is there a simple way to keep all the data processing and still reduce the transition time?
Thanks in advance
UPDATE: I'm pretty sure that would be the better approach. But right now time is the top priority, and while I'm convinced I know the bottleneck, I have little idea how to solve or speed up the parsing of the data into an XML document with nearly 200 fields (about 50 come from the form, the rest from queries or code).
On a side note, those 2 seconds come not only from data parsing but also from the slow outbound connection on our development server, and connection speeds in Spain in general. I'm 80% sure it won't be as slow on the production server, but I don't want to run the risk of assuming that nothing can be sped up.
Then, the couple of minutes spent querying external web services is out of my hands. It contacts a provider's web service that links to a couple of car insurance companies, which take the data and return a list of insurance quotes. And since that is lost time anyway, I think I can hide those two seconds of XML construction there.
The only thing I don't know is how to send form values from the Form to the Results page, that loads the data with Ajax.
I think you need to focus on why it takes so long processing 40 fields. What are the potential bottlenecks on the backend? What queries are you performing that take so long? If you can reduce the processing time to less than 10 seconds you can get away with your page handling the processing otherwise you need a different architecture like REST or NServiceBus to off load the long running execution and somehow notify the client that you are done.
You could try to do the processing in a different thread: take in the string, spin off the thread, and return the result. Unfortunately, thread programming doesn't qualify as "simple". By the way, "now" is typically perceived as anything below 3 seconds.
I re-read your question and sorry for not thinking of asking first: do you have to parse the form back to XML? Is it possible to serialize your data to JSON, pass it up to the server, de-serialize and make the web request? The JSON format is much "lighter" than XML and you serialize and de-serialize with a library such as JSON.Net. This should eliminate some of your processing overhead.
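To illustrate the size difference that suggestion is getting at, here is a toy comparison of the same three fields as JSON versus hand-built XML (the field names are invented; in the real app you would use `JSON.stringify` client-side and a library like JSON.Net on the server):

```javascript
const form = { name: 'Ana', age: 34, city: 'Madrid' };

// JSON serialization of the form data.
const json = JSON.stringify(form);

// Hand-built XML equivalent, for size comparison only.
const xml =
  '<form><name>Ana</name><age>34</age><city>Madrid</city></form>';

// JSON is noticeably smaller because it has no closing tags,
// and the gap widens as the number of fields grows.
const saved = xml.length - json.length;
```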
With respect to the web service you call: is the data new on each request? Is there any way of requesting less data, or storing portions of the data and refreshing them periodically? Potentially you could run a messaging server such as MSMQ, refresh your data on a schedule, and then only request what you need once you have the user-specific data. 30 seconds is 30 seconds.
I keep thinking about the data - you say you have over 200 fields. I am unclear as to whether you have to perform queries or calculations. If you have numerous records, have you considered a different type of schema that might make your retrievals faster? Can you pull static lookups into a shared memory so you don't have to hit the disk?