Client data cache in isolated storage - asp.net - jquery - ajax - silverlight

I have a couple of customer websites that have been in use for 4+ years. One of the pages on both sites contains drop downs that now hold 3000+ items. I have attempted initial solutions to this problem by adding a new page to each site: one uses silverlight and the other uses ajax.
The silverlight page currently performs better than the ajax page once the control has loaded, but it requires the user to have silverlight or the permissions to install it. The ajax version still requires an initial download of all of the data to populate the drop downs when the site is first loaded.
The ajax version still uses view state and sends 400k+ to the server on every request.
What I would like to do is use the ajax version but cache the drop down data on the client and only download the data once a day.
If I save the data using asp.net to isolated storage (I have that part sorted) is it possible to access it using client side code such as jquery?

Personally, there's no way I would use a dropdown for 3000+ items. Not only is there a problem with data transfer & viewstate, but it's also a pain for any user to scroll through that many items to find the option they require.
Have you considered something like this:
http://jquery.bassistance.de/autocomplete/demo/
You have a textbox that says something like 'start typing'... the second the user types the first letter of what they are looking for, an ajax query is made grabbing all the entries that begin with that letter.
Given that there are 26 letters in the alphabet, you will return on average about 4% of the data, i.e. roughly 120 entries rather than 3000! Also, as users get to know your system, they can type more and more letters and find what they are looking for much faster. It beats scrolling through a list of 3000 entries, and makes your application easier to use, more responsive and more network friendly!
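As a rough illustration of the server side of that approach, a generic ASP.NET handler could return only the entries matching the typed prefix. The table, column and connection string name below are made up, and this assumes the bassistance plugin's convention of sending the typed text as the q parameter and expecting one suggestion per line:

    // PrefixLookup.ashx - hypothetical handler behind the autocomplete textbox.
    using System;
    using System.Data.SqlClient;
    using System.Text;
    using System.Web;

    public class PrefixLookup : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string prefix = (context.Request.QueryString["q"] ?? string.Empty).Trim();
            StringBuilder result = new StringBuilder();

            string connStr = System.Configuration.ConfigurationManager
                .ConnectionStrings["MyDb"].ConnectionString;   // hypothetical name

            using (SqlConnection conn = new SqlConnection(connStr))
            using (SqlCommand cmd = new SqlCommand(
                "SELECT TOP 50 Name FROM DropDownItems WHERE Name LIKE @prefix + '%' ORDER BY Name",
                conn))
            {
                cmd.Parameters.AddWithValue("@prefix", prefix);
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        result.AppendLine(reader.GetString(0));   // one suggestion per line
                }
            }

            context.Response.ContentType = "text/plain";
            context.Response.Write(result.ToString());
        }

        public bool IsReusable { get { return true; } }
    }

With this in place the page never has to ship all 3000+ items; each keystroke pulls down at most 50 matching rows.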

Related

load large amount of data in asp.net web page

I have a table in SQL Server which contains more than 12000 rows. When I load all the rows into an asp.net web page it takes 10 to 15 minutes.
Please help me load the data in seconds.
Which part of the process is taking the most time? There are many distinct stages to this action:
Query execution time in SQL Server.
Transfer time from SQL Server to your data reader (ADO.NET? across a network?)
Binding the data to the grid in ASP.NET.
Transferring the rendered HTML to the client.
Only when you know exactly what is slow, can you properly optimise.
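One rough way to find out is to time the stages separately, for example with a Stopwatch around the query and the bind. RunQuery below is just a stand-in for your own data access call:

    System.Diagnostics.Stopwatch sw = System.Diagnostics.Stopwatch.StartNew();
    DataTable table = RunQuery();               // stand-in for your ADO.NET call
    long queryMs = sw.ElapsedMilliseconds;

    sw = System.Diagnostics.Stopwatch.StartNew();
    GridView1.DataSource = table;
    GridView1.DataBind();                       // binding the 12000 rows to the grid
    long bindMs = sw.ElapsedMilliseconds;

    System.Diagnostics.Debug.WriteLine(
        string.Format("query: {0} ms, bind: {1} ms", queryMs, bindMs));

If the query itself is fast, the time is almost certainly going into binding and rendering 12000 rows of HTML, which points you at paging rather than SQL tuning.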
Solutions:
You can use gzip compression. It won't bring load times down to seconds on its own, but you will see a good improvement in load time.
Write a handler which returns results in sets of 100 and display them asynchronously using jQuery, as on Facebook, where more stories are fed in as you scroll down. You can do a similar thing for your website, or you can have a "Load More" button which does the same thing except that the user has to press the button instead.
Note: Implementing the above is not that hard.
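A sketch of such a handler is below; it returns one block of 100 rows as JSON for a given page index, so the jQuery side can append rows as the user scrolls or clicks "Load More". All names here are hypothetical:

    // PagedRows.ashx - hypothetical handler called with ?page=0, ?page=1, ...
    using System.Web;
    using System.Web.Script.Serialization;

    public class PagedRows : IHttpHandler
    {
        private const int PageSize = 100;

        public void ProcessRequest(HttpContext context)
        {
            int page;
            int.TryParse(context.Request.QueryString["page"], out page);

            // GetRows is a placeholder for a query that uses ROW_NUMBER() (or
            // OFFSET/FETCH) so that only 100 rows ever leave SQL Server.
            object[] rows = GetRows(page * PageSize, PageSize);

            context.Response.ContentType = "application/json";
            context.Response.Write(new JavaScriptSerializer().Serialize(rows));
        }

        private object[] GetRows(int skip, int take)
        {
            return new object[0];   // replace with real data access
        }

        public bool IsReusable { get { return true; } }
    }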
You probably need to find a third-party grid control which has on-demand virtual paging support.

Viewstate or Session or Database Call

I have an asp.net page which, on first-time load:
1: Makes a DB call and gets the data as an XML string (this chunk can go beyond 100kb). This DB call is a bit expensive and takes about 5-10 secs.
2: Loops through this XML and creates a custom collection with the values from the XML, then binds it to a Repeater control.
Now each repeater item has one text input. The user is free to enter values in one, some, or all of the textboxes, or leave them all blank, and then hit the Save button.
On Save postback, I will have to loop through all rows in the Repeater, collect all the rows that have some input in them, generate an XML document using the initial values and the new input values, and save it to the DB.
Problem:
So I will need a reference to all the initial XML values. I can think of these options and am looking for input on selecting the best one.
1: ViewState: Store my collection or XML string in ViewState - I'm sure it will be too huge.
2: Session: Use Session to store the collection or XML string - again, the same size concern.
3: DB Call: Make a DB call to get the data again - as I said, it is an expensive call and my DBA is asking me to avoid this.
4: HiddenField: Store the essential data from the XML in hidden fields and use that for the Save manipulation, i.e. in each repeater item find all the hidden fields.
Which one is best in terms of request/response time and server resource utilization?
Or is there a better way I am missing?
PS: Have to use ASP.NET 2.0 WebForms only.
Update1:
I tried the following with ViewState:
1: Store the entire xml string: ViewState length = 97484, and Firebug shows a page size of 162Kb.
2: Store a stripped-down version of the collection with only the required data: ViewState length = 27372, Firebug shows a page size of 94Kb, and with gzip compression it reduces to 13Kb.
With the existing website, Firebug shows a size of 236Kb.
So option 2 is definitely better, and my new version is better than the current website.
So any inputs?
A quick question - who is your target audience for this page? If it's an internal website for a company then just storing the data in viewstate might be acceptable. If it's for external people, e.g. potential customers, then speed and performance probably matter to you more.
Viewstate - have you tried adding your XML to viewstate? How much did it increase the page size by? If you're gzipping all of your pages rather than sending them over the wire uncompressed then you could see about a 70% reduction in size - 30kb isn't that much these days...
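If IIS-level compression isn't available to you, a common code-level approach is to wrap the response in a GZipStream from Global.asax. This is only a sketch; the naive version can interfere with partial-postback and script resources, so IIS compression is usually the cleaner option:

    // Global.asax.cs
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        string acceptEncoding = app.Request.Headers["Accept-Encoding"];

        if (!string.IsNullOrEmpty(acceptEncoding) && acceptEncoding.Contains("gzip"))
        {
            // Compress the outgoing HTML (including the viewstate) on the fly.
            app.Response.Filter = new System.IO.Compression.GZipStream(
                app.Response.Filter, System.IO.Compression.CompressionMode.Compress);
            app.Response.AppendHeader("Content-Encoding", "gzip");
        }
    }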
Session - it is worth remembering that the server can and will drop data from sessions if it runs out of space. They can also expire. Do you trust your users not to log in again in a new tab and then submit the page that they've had open for the last 10 hours? While using session results in less data on the wire you might need to re-pull the data from the db if the value does end up being dropped for whatever reason. Also, if you're in a web farm environment etc there are complications involving synchronizing sessions across servers.
DB Call - can the query be optimised in any way? Are there indices on all the fields that need them? Maybe you and your DBA can make it less painful to pull. But then again, if the data can change between you pulling it the first time and the user submitting their changes, then you wouldn't want to re-pull it, I suspect.
Hidden Fields - with these you'd be saving less data than if you put the whole string in Viewstate. The page wouldn't depend on the state of the webserver as with session, nor would you be racing against other users changing the state of the db.
On the whole, I think 4 is probably the best bet if you can afford to slow your pages down a little. Use Firebug/YSlow and compare how much data is transmitted before and after implementing 4.
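If you do go with option 4, the Save handler might look roughly like this, reading each row's original value back out of a hidden field. Control and helper names here are hypothetical:

    // On Save postback: pair each user-entered value with the original value
    // carried down in a hidden field, and rebuild the XML from those pairs.
    foreach (RepeaterItem item in rptValues.Items)
    {
        if (item.ItemType != ListItemType.Item &&
            item.ItemType != ListItemType.AlternatingItem)
            continue;

        TextBox input = (TextBox)item.FindControl("txtNewValue");
        HiddenField original = (HiddenField)item.FindControl("hfOriginalValue");

        if (!string.IsNullOrEmpty(input.Text))
            AppendToXml(original.Value, input.Text);   // hypothetical helper that builds the XML to save
    }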
One final thought - how are things like this persisted between postbacks in the rest of your webapp? Assuming that you haven't written the whole thing on your own/only just started it you might be able to find some clues as to how other developers in a similar situation solved the problem.
Edit:
there is a load-balancer, not sure how it will play with Session
If you have a load balancer then you need to make sure that session state is stored in a state server or similar and not in the process ("inproc"). If the session is stored on the webserver then option 2 will play very badly with the load balancer.
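In web.config terms that usually means switching the sessionState mode away from InProc, along these lines (the connection string shown is just the ASP.NET State Service default; adjust for your environment):

    <!-- Keep session out of the worker process so any server behind the
         load balancer can see it. -->
    <system.web>
      <sessionState mode="StateServer"
                    stateConnectionString="tcpip=loopback:42424"
                    timeout="20" />
      <!-- or mode="SQLServer" with sqlConnectionString="..." for a SQL-backed store -->
    </system.web>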
While I'm not a huge fan of overusing session, this will probably be your best bet as it will be your fastest option from the user's standpoint.
Since session state does have its own inherent issues, you could load the data you need into session, and if your session drops for whatever reason, just do another database hit and reload it.
I would really stay away from options 1 and 4 just because of the amount of unnecessary data you will be sending to the client, and potentially slowing down their experience.
Option 3 will also slow down the user experience, so I would stay away from that if at all possible unless you can speed up your query time.
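A minimal sketch of that "reload it if the session dropped it" pattern (method and key names are hypothetical):

    // Try Session first; if the value has been dropped or expired, go back to the DB.
    private string GetSourceXml()
    {
        string xml = Session["SourceXml"] as string;
        if (xml == null)
        {
            xml = LoadXmlFromDatabase();   // the expensive 5-10 second call
            Session["SourceXml"] = xml;
        }
        return xml;
    }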

How to load all data on client

I'm working on an asp.net website with telerik controls. I'm using multiple conditional grids (showing data based on the selection in a grid).
Every time I make a new selection it is kinda slow (I'm using an ajax call). Is it possible to preload all the data to the client and then show it to the user instantly?
I mean, is there any simple way of doing so?
There is a good chance that the slowness comes from the amount of data being rendered on your page. Keep in mind that AJAX still goes through the entire life-cycle of the page; the savings come from not having to render the entire page, just the updated parts.
Are your AJAX settings correctly updating the controls, or do you have a massive 'pnlAllControls' updating 'pnlAllControls'?
For example -- if you have Grid1, Grid2, Grid3; and Grid1 updates (Grid2, Grid3) while Grid2 updates only (Grid3), you should set your AJAX accordingly.
If your controls are getting data from the server, it does not make sense to cache the data on the client. I am not sure how much control you have over configuring them [controls].
You can store/cache data on server side instead (e.g. Cache, Session etc.). The data retrieval from there should be fast unless you send tons of data back and forth. But, caching data (on either client or server) should only be considered if the amount is 1)predictable and 2)relatively small.
Another technique to consider is paging/sorting on the server side. From my experience, you can get a real performance boost using just this.
There is no simple answer to your question. It all depends on data volume and security requirements. Moreover, your controls may not be able to pull data from the client side.

How to make ajaxtoolkit autocomplete super fast and bind it on clientside

I have used the ajax toolkit's autocomplete on a page which gets data from a web service. This autocomplete is slow; at the moment I only have 10 to 20 records in the table and it takes about 3 to 5 seconds to search and show results in the autocomplete. Users have to wait about 4 seconds on average to see the data.
I can't work out how to make it super fast, so please guide me. Is it possible to bind the autocomplete on the client side? My idea is to get the data from the server on page load, put it in an array in JavaScript, and have the autocomplete read from that client-side data as the user types.
The problem may be that you are getting the data from the web service, which may take a few seconds. Why don't you cache the data on the server itself (in a hash map or a list) and then periodically (say every 2 mins) call the web service and get the latest data?
Hence, when your autocomplete plugin requests the latest autocomplete data, you will return the cached values and not the actual values from the web service.
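A sketch of that idea, caching the list in memory and refreshing it every couple of minutes. LoadFromBackend and the cache key are hypothetical; the prefixText/count signature is the usual one for the AutoCompleteExtender's service method:

    // In the web service (or page method) that the AutoCompleteExtender calls.
    // usings: System, System.Collections.Generic, System.Web, System.Web.Caching, System.Web.Services
    [WebMethod]
    public string[] GetCompletionList(string prefixText, int count)
    {
        List<string> all = HttpRuntime.Cache["AutoCompleteData"] as List<string>;
        if (all == null)
        {
            all = LoadFromBackend();   // hypothetical: the slow call you make today
            HttpRuntime.Cache.Insert("AutoCompleteData", all, null,
                DateTime.UtcNow.AddMinutes(2), Cache.NoSlidingExpiration);
        }

        List<string> matches = new List<string>();
        foreach (string s in all)
        {
            if (s.StartsWith(prefixText, StringComparison.OrdinalIgnoreCase))
            {
                matches.Add(s);
                if (matches.Count >= count)
                    break;
            }
        }
        return matches.ToArray();
    }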
I've noticed that some sites will store the hash map/list on another page and reference that page from the autocomplete function. Therefore the loading of the main page will not be impacted, and the autocomplete will be extremely fast (practically instantaneous). Also, you can maintain that list at your leisure once a minute/hour/day/month/year and it will be completely independent of the user's experience.

very large viewstate breaking web app

I have a web app that consumes a web service. The main page runs a search by passing parameters to a particular web service method, and I bind the results to a gridview.
I have implemented sorting and paging on the grid by putting the datatable that the grid is bound to in the viewstate, then reading/sorting/filtering it as necessary and rebinding it to the grid.
As the amount of data coming back from the web service has increased dramatically, when I try to page/sort etc. I receive the following error:
The connection was reset
The connection to the server was reset while the page was loading.
I have searched around a bit, and it seems that a very large viewstate is to blame for this.
But surely the only other options are to:
Limit the results
Stick the datatable in the session rather than the viewstate
Something else I am unaware of
Previously I did have the datatable in the session, as some of this data needed to persist from page to page (it wasn't being posted, however, so viewstate was not an option). As the amount of data rose and the necessity to persist it was removed, I used the viewstate instead, thinking this was a better option than the session because of the amount of data the session would have to hold and the number of users using the app.
It appears maybe not.
I thought that when the viewstate got very big, .net split it over more than one hidden viewstate field, but it seems all I'm getting is one mammoth viewstate that I have trouble viewing in the source.
Can anyone enlighten me as to how to avoid the error I'm getting? If it is indeed to do with the amount of data in the viewstate?
It sounds like you're caching the whole dataset for all pages even though you are only presenting one page of that data. I would change your pagination to only require the data for the current page the user is on.
If the query is heavy and you don't want to be constantly calling it over and over because there is a lot of paging back and forth (you should test the typical usage pattern), then I would implement some type of caching on the web service end to cache page by page (by specific user if the data is specific to a user) and have it expire rather quickly (e.g. a few minutes).
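For example, the page (or a thin layer over the web service) could keep each page of results per user in the ASP.NET cache for a couple of minutes. The key scheme and names here are made up:

    // Cache one page of results per user so paging back and forth doesn't
    // keep re-running the heavy web service call.
    private DataTable GetResultsPage(string userId, int pageIndex)
    {
        string key = string.Format("results:{0}:{1}", userId, pageIndex);
        DataTable page = HttpRuntime.Cache[key] as DataTable;
        if (page == null)
        {
            page = CallWebServiceForPage(pageIndex);   // hypothetical wrapper around the WS call
            HttpRuntime.Cache.Insert(key, page, null,
                DateTime.UtcNow.AddMinutes(2),
                System.Web.Caching.Cache.NoSlidingExpiration);
        }
        return page;
    }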
I think you need to limit the total amount of data you're dealing with. Changing your code so it doesn't pass back extra data that might never be needed is a good place to start.
EDIT: Based on your comments:
You can't change the web service
The user can manipulate the query by filtering or sorting
There is a large amount of data returned by the web service
The data is user specific
Well, I think you have a perfect case for using the Session then. This can be taxing on the server with large amounts of users and data, so you might want to implement some logic to clear the data from the Session rather than waiting for it to expire (for example, on certain landing pages you know the user will go to when they are done, clear the session data).
You really want to get it out of the ViewState because it is a huge bandwidth hog. Just look at your physical page size; all of that data is being passed back and forth with every action. Moving it to the Session would eliminate that bandwidth usage and allow you to do everything you need.
You could also look at the data the web service is bringing back and store it in a custom object that you make as 'thin' as possible. If you're storing a DataSet or a DataTable in your Session, those objects have some extra overhead you probably don't need, so store the data as an array of some custom thin object and just bind to that. You would need to map the result from the WS to your custom object, but this is a good option to cut down on memory usage.
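A sketch of that "thin object" idea: keep only the columns the grid actually displays, and store an array of those in Session instead of the full DataSet. Class, property and key names here are hypothetical:

    // Only the fields the grid needs, marked Serializable in case session state
    // ever moves out of process.
    [Serializable]
    public class ResultRow
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public DateTime Modified { get; set; }
    }

    // After the web service call, map once and keep just the thin array:
    // ResultRow[] rows = MapFromServiceResult(serviceData);   // hypothetical mapping step
    // Session["SearchResults"] = rows;
    // grid.DataSource = rows;   // sort/page/rebind against this array on postback
    // grid.DataBind();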
Let me know if there is something else I am missing.
I wouldn't put the data in either the view state or the session. Instead, store the bare minimum information needed to re-request the dataset from the web service, and keep that (in either view state or session, or even on the URL). Then call the web service using that information and re-fetch the data on each request. If necessary, look to use some form of caching (e.g. memcached) to improve performance.
