I have a table in SQL Server which contains more than 12,000 rows. When I load all the rows into an ASP.NET web page, it takes 10 to 15 minutes for them all to load.
Please help me load the data in seconds.
Which part of the process is taking the most time? There are several distinct stages to this action:
Query execution time in SQL Server.
Transfer time from SQL Server to your data reader (ADO.NET? across a network?)
Binding the data to the grid in ASP.NET.
Transferring the rendered HTML to the client.
Only when you know exactly what is slow can you properly optimise.
Solutions:
You can use gzip compression. It won't bring load times down to seconds on its own, but you will see a good improvement in load time.
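For instance, here is a minimal sketch of turning on gzip from Global.asax, assuming you can't simply enable compression at the IIS level (which is usually the better option):

```csharp
// Global.asax.cs -- a hedged sketch: compress responses for clients that
// advertise gzip support. Enabling compression in IIS itself is usually
// preferable to doing it in code.
using System;
using System.IO.Compression;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        string acceptEncoding = Request.Headers["Accept-Encoding"] ?? string.Empty;
        if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            // Wrap the output stream so everything written to the response
            // is gzip-compressed on the way out.
            Response.Filter = new GZipStream(Response.Filter, CompressionMode.Compress);
            Response.AppendHeader("Content-Encoding", "gzip");
        }
    }
}
```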
Write a handler which returns results in sets of 100 and display them asynchronously using jQuery, as on Facebook, where more stories are fetched as you scroll down. You can do a similar thing for your website, or you can have a "Load More" button which does the same thing, except the user has to press the button to fetch the next set; see the sketch after the note below.
Note: Implementing the above is not that hard.
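For illustration, a rough sketch of such a paging handler, assuming an .ashx handler, SQL Server 2005+'s ROW_NUMBER() for paging, and placeholder table, column, and connection-string names:

```csharp
// RowsHandler.ashx.cs -- hypothetical paging handler: returns one page of
// 100 rows as JSON; the client asks for ?page=0, ?page=1, ... as the user
// scrolls or presses "Load More". Table, column and connection-string
// names are placeholders.
using System.Configuration;
using System.Data.SqlClient;
using System.Text;
using System.Web;

public class RowsHandler : IHttpHandler
{
    private const int PageSize = 100;

    public void ProcessRequest(HttpContext context)
    {
        int page;
        int.TryParse(context.Request.QueryString["page"], out page);

        var sb = new StringBuilder("[");
        using (var conn = new SqlConnection(
            ConfigurationManager.ConnectionStrings["Db"].ConnectionString))
        using (var cmd = new SqlCommand(
            @"SELECT Id, Name FROM
                  (SELECT Id, Name, ROW_NUMBER() OVER (ORDER BY Id) AS rn
                   FROM MyTable) AS t
              WHERE rn BETWEEN @from AND @to", conn))
        {
            cmd.Parameters.AddWithValue("@from", page * PageSize + 1);
            cmd.Parameters.AddWithValue("@to", (page + 1) * PageSize);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                bool first = true;
                while (reader.Read())
                {
                    if (!first) sb.Append(',');
                    // NOTE: real code should JSON-encode the name properly.
                    sb.AppendFormat("{{\"id\":{0},\"name\":\"{1}\"}}",
                        reader.GetInt32(0), reader.GetString(1));
                    first = false;
                }
            }
        }
        sb.Append(']');

        context.Response.ContentType = "application/json";
        context.Response.Write(sb.ToString());
    }

    public bool IsReusable
    {
        get { return true; }
    }
}
```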
You could also look for a third-party grid control that has on-demand virtual paging support.
Related
I am working on ASP.NET WebForms. I have a user control with around 50-60 input fields. The data from these fields is currently saved and loaded on a Save button click, which takes a lot of time.
I wanted to know: if I save the data from these input fields using AJAX, calling a web service on the focusout or blur event of each field, is that a better alternative? And will it cause problems if multiple users call this web service so many times? Any other solutions to my problem will be appreciated :)
No, this is not a good method, as people might end up sending a lot of blank data, which is just a waste of bandwidth. Remember, sending the data from all the text fields takes almost as long as sending it from one text field. Validating the data could also be a problem, and you would end up with a lot of code redundancy in the extra validation and AJAX calls. On the other hand, if you use an intermediate server, it can feel a lot more responsive, assuming the latency between the user and the intermediate is low and the latency between the intermediate and the back-end server is also low and under your control.
Other than that, I don't know of any better method than sending all the data on a form submit.
I have an ASP.NET page which, on first load:
1: Makes a DB call and gets the data as an XML string (this chunk can go beyond 100 KB). This DB call is a bit expensive and takes about 5-10 secs.
2: Loops through this XML, creates a custom collection with values from the XML, then binds it to a Repeater control.
Now each repeater row has one text input. The user is free to enter values in one, some, or all of the textboxes, or leave them all blank, and then hit the Save button.
On the Save postback, I will have to loop through all the rows in the Repeater, collect all the rows that have some input in them, generate an XML document from the initial values plus the new input values, and save it to the DB.
Problem:
So I will need a reference to all the initial XML values. I can think of these options and am looking for input on selecting the best one:
1: ViewState: Store my collection or XML string in ViewState - I'm sure it will be too huge.
2: Session: Use Session to store the collection or XML string - again, likely too huge.
3: DB Call: Make a DB call to get the data again - as I said, it is a kind of expensive call and my DBA is asking me to avoid it.
4: HiddenField: Store the essential data from the XML in hidden fields and use those for the Save manipulation, i.e. in each repeater item find all the hidden fields.
Which one is best in terms of faster request/response and lower resource utilization on the server?
Or is there a better way I am missing?
PS: Have to use ASP.NET 2.0 WebForms only.
Update1:
I tried the following with ViewState:
1: Store the entire XML string: ViewState length = 97484 and Firebug shows page size 162 KB.
2: Store a stripped-down version of the collection with only the required data: ViewState length = 27372 and Firebug shows page size 94 KB; with gzip compression it reduces to 13 KB.
With the existing website, Firebug shows a size of 236 KB.
So option 2 is definitely better, and my new version is better than the current website.
So any inputs?
A quick question - who is your target audience for this page? If it's an internal website for a company then just storing the data in viewstate might be acceptable. If it's for external people, e.g. potential customers, then speed and performance probably matter to you more.
Viewstate - have you tried adding your XML to viewstate? How much did it increase the page size by? If you're gzipping all of your pages rather than sending them over the wire uncompressed then you could see about a 70% reduction in size - 30kb isn't that much these days...
Session - it is worth remembering that the server can and will drop data from sessions if it runs out of space. They can also expire. Do you trust your users not to log in again in a new tab and then submit the page that they've had open for the last 10 hours? While using session results in less data on the wire, you might need to re-pull the data from the db if the value does end up being dropped for whatever reason. Also, if you're in a web farm environment etc., there are complications involving synchronizing sessions across servers.
DB Call - can the query be optimised in any way? Are the indices on all the fields that need them? Maybe you and your DBA can make it less painful to pull. But then again, if the data can change between you pulling it the first time and the user submitting their changes then you wouldn't want to re-pull it, I suspect.
Hidden Fields - With these you'd be saving less data than if you put the whole string in Viewstate. The page wouldn't be depending on the state of the webserver like with session and nor would you be racing against other users changing the state of the db.
On the whole, I think 4 is probably the best bet if you can afford to slow your pages down a little. Use Firebug/YSlow and compare how much data is transmitted before and after implementing 4.
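For illustration, a sketch of what option 4 might look like in the code-behind; the control IDs and helper methods are hypothetical:

```csharp
// Code-behind sketch for option 4 -- control IDs and method names are
// hypothetical. The Repeater's ItemTemplate is assumed to contain:
//   <asp:HiddenField ID="hfOriginal" runat="server" />
//   <asp:TextBox ID="txtValue" runat="server" />
protected void btnSave_Click(object sender, EventArgs e)
{
    foreach (RepeaterItem item in rptValues.Items)
    {
        HiddenField original = (HiddenField)item.FindControl("hfOriginal");
        TextBox input = (TextBox)item.FindControl("txtValue");

        if (input.Text.Trim().Length > 0)
        {
            // Combine the stored initial value with the new input and
            // append it to the XML that will be saved to the database.
            AppendToOutputXml(original.Value, input.Text);
        }
    }
    SaveOutputXmlToDb();   // hypothetical save call
}
```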
One final thought - how are things like this persisted between postbacks in the rest of your webapp? Assuming that you haven't written the whole thing on your own/only just started it you might be able to find some clues as to how other developers in a similar situation solved the problem.
Edit:
there is a load-balancer, not sure how it will play with Session
If you have a load balancer then you need to make sure that session state is stored in a state server or similar and not in the process ("inproc"). If the session is stored on the webserver then option 2 will play very badly with the load balancer.
While I'm not a huge fan of overusing session, this will probably be your best bet as it will be your fastest option from the user's standpoint.
Since session state does have its own inherent issues, you could load the data you need into session, and if your session drops for whatever reason, just do another database hit and reload it.
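That "session first, database fallback" pattern might look something like this in the page's code-behind (the key name and LoadFromDatabase() are placeholders):

```csharp
// Sketch of "session first, database as fallback" -- lives in the page's
// code-behind; the "PageData" key and LoadFromDatabase() are hypothetical.
private DataSet PageData
{
    get
    {
        DataSet data = Session["PageData"] as DataSet;
        if (data == null)
        {
            // Session expired or the entry was dropped: re-pull from the
            // database and cache it in session again.
            data = LoadFromDatabase();
            Session["PageData"] = data;
        }
        return data;
    }
}
```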
I would really stay away from options 1 and 4 just because of the amount of unnecessary data you will be sending to the client, and potentially slowing down their experience.
Option 3 will also slow down the user experience, so I would stay away from that if at all possible unless you can speed up your query time.
I'm working on an ASP.NET website with Telerik controls. I'm using multiple conditional grids (they show data based on the selection in a grid).
Every time I make a new selection it is kind of slow (I'm using an AJAX call). Is it possible to preload all the data to the client and then show it to the user instantly?
I mean, is there any simple way of doing so?
There is a good chance that the slowness comes from the amount of data being rendered on your page. Keep in mind that AJAX still goes through the entire life-cycle of the page; the savings come from not having to render the entire page, just the updated parts.
Are your AJAX settings correctly updating the controls, or do you have a massive 'pnlAllControls' updating 'pnlAllControls'?
For example -- if you have Grid1, Grid2, Grid3; and Grid1 updates (Grid2, Grid3) while Grid2 updates only (Grid3), you should set your AJAX accordingly.
If your controls are getting data from the server, it does not make sense to cache the data on the client. I am not sure how much control you have over configuring them [controls].
You can store/cache the data on the server side instead (e.g. Cache, Session, etc.). Data retrieval from there should be fast unless you send tons of data back and forth. But caching data (on either the client or the server) should only be considered if the amount is 1) predictable and 2) relatively small.
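For example, a sketch of server-side caching with the ASP.NET Cache object; the cache key, the five-minute expiry, and QueryDatabase() are placeholders:

```csharp
// Server-side caching sketch: the first request pays for the query, later
// requests within five minutes are served from memory. Names are placeholders.
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class GridDataProvider
{
    public static DataTable GetGridData()
    {
        Cache cache = HttpContext.Current.Cache;
        DataTable data = cache["GridData"] as DataTable;
        if (data == null)
        {
            data = QueryDatabase();                // hypothetical expensive query
            cache.Insert("GridData", data, null,
                DateTime.UtcNow.AddMinutes(5),     // absolute expiry
                Cache.NoSlidingExpiration);
        }
        return data;
    }

    private static DataTable QueryDatabase()
    {
        // Placeholder for the real data-access code.
        return new DataTable();
    }
}
```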
Another technique to consider is paging/sorting on the server side. From my experience, you can get a real performance boost using just this.
There is no simple answer to your question. It all depends on the data volume and the security requirements. Moreover, your controls may not be able to pull data from the client side.
I have used the AJAX Control Toolkit's AutoComplete on a page which gets data from a web service. This autocomplete is slow: at the moment I only have 10 to 20 records in the table, yet it takes about 3 to 5 seconds to search and show results in the autocomplete. Users have to wait about 4 seconds on average to see the data.
I can't figure out how to make it fast, so please guide me. Is it possible to bind the autocomplete on the client side? My idea is to get the data from the server on page load, put it in an array in JavaScript, and have the autocomplete read from that client-side data as the user types.
The problem may be that you are getting the data from the web service, which can take a few seconds. Why don't you cache the data on the server itself (in a HashMap or a List) and then periodically (say every 2 minutes) call the web service to get the latest?
Then, when your autocomplete plugin requests the latest autocomplete data, you return the cached values rather than the actual values from the web service.
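A sketch of that idea as an autocomplete web service, assuming the AJAX Control Toolkit's usual (string prefixText, int count) completion-method signature; LoadAllValues() stands in for the slow back-end call:

```csharp
// Web service sketch: serve completions from an in-memory cache that is
// refreshed at most every two minutes, instead of hitting the slow back
// end on every keystroke. LoadAllValues() is hypothetical.
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;
using System.Web.Services;

[WebService(Namespace = "http://example.com/")]
[System.Web.Script.Services.ScriptService]
public class AutoCompleteService : WebService
{
    [WebMethod]
    public string[] GetCompletionList(string prefixText, int count)
    {
        List<string> all = HttpContext.Current.Cache["AutoCompleteData"] as List<string>;
        if (all == null)
        {
            all = LoadAllValues();   // the slow call, now paid at most every 2 minutes
            HttpContext.Current.Cache.Insert("AutoCompleteData", all, null,
                DateTime.UtcNow.AddMinutes(2), Cache.NoSlidingExpiration);
        }

        List<string> matches = new List<string>();
        foreach (string s in all)
        {
            if (s.StartsWith(prefixText, StringComparison.OrdinalIgnoreCase))
            {
                matches.Add(s);
                if (matches.Count == count) break;
            }
        }
        return matches.ToArray();
    }

    private List<string> LoadAllValues()
    {
        // Placeholder for the real web-service / database call.
        return new List<string>();
    }
}
```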
I've noticed that some sites will store the hash map/list on another page and reference that page from the autocomplete function. Therefore the loading of the said page will not be impacted, and the autocomplete will be extremely fast (practically instantaneous). Also, you can maintain that list at your leisure once a minute/hour/day/month/year and it will be completely independent of the user's experience.
I have a couple of customer websites that have been in use for 4+ years. One of the pages on both sites contain drop downs that now contain 3000+ items. I have attempted initial solutions to this problem by adding new pages to both sites where one site is using silverlight and the other is using ajax.
The Silverlight page currently performs better than the AJAX page once the control has loaded, but requires the user to have Silverlight or the permissions to install it. The AJAX version still requires an initial download of all of the data to populate the drop downs when the site is first loaded.
The AJAX version also still uses ViewState and sends 400 KB+ to the server on every request.
What I would like to do is use the ajax version but cache the drop down data on the client and only download the data once a day.
If I save the data using asp.net to isolated storage (I have that part sorted) is it possible to access it using client side code such as jquery?
Personally, there's no way I would use a dropdown for 3000+ items. Not only is there a problem with data transfer and ViewState, but it's also a pain for any user to scroll through that many items to find the option they require.
Have you considered something like this:
http://jquery.bassistance.de/autocomplete/demo/
You have a textbox that says something like 'start typing'... the second the user types the first letter of what they are looking for, an AJAX query is made grabbing all the entries that begin with that letter.
Given that there are 26 letters in the alphabet, you will return on average about 4% of the data, i.e. 120 entries rather than 3000! Also, as users get to know your system, they can type more and more letters and find what they are looking for much faster. It beats scrolling through a list of 3000 entries, and makes your application easier to use, more responsive, and more network-friendly!
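The server-side lookup behind that textbox can be a simple parameterised prefix query; here is a sketch with placeholder table and column names (it needs System.Collections.Generic and System.Data.SqlClient):

```csharp
// Prefix lookup sketch: only the handful of entries starting with what the
// user typed ever leave the database. Table/column names are placeholders.
public static List<string> GetMatches(string prefix, string connectionString)
{
    List<string> matches = new List<string>();
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "SELECT TOP 20 Name FROM Entries " +
        "WHERE Name LIKE @prefix + '%' ORDER BY Name", conn))
    {
        cmd.Parameters.AddWithValue("@prefix", prefix);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                matches.Add(reader.GetString(0));
        }
    }
    return matches;
}
```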