Performance improvement in getting paged data for an ASP.NET grid

I am encountering a performance problem in the development of the current UI. The problem, I suspect, is a general one, however.
I have a page with a simple ASP.NET grid. The grid displays data from a table based on certain search criteria, and it has a fixed page size (say 10). There is a pager at the bottom which can be used to navigate between pages. On the back end, whenever the search button is pressed, a stored procedure is called which returns the desired data.
The stored procedure has parameters like currentpageIndex, pagesize, other search criteria, etc. Here is pseudocode for the SP:
-- SP begins
-- calculate the range of row numbers to return, using the current page index and page size
-- query the table in a CTE and do all the filtering; also calculate row numbers so that
-- the correct record range can be returned
-- use the CTE to return the correct records based on the row numbers calculated in it
-- SP ends
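For concreteness, here is a minimal sketch of that pattern (table, column, and procedure names are placeholders, not the actual schema):

CREATE PROCEDURE dbo.GetPagedItems
    @CurrentPageIndex INT,    -- zero-based page index
    @PageSize         INT,
    @Search           NVARCHAR(100)
AS
BEGIN
    ;WITH Filtered AS (
        -- all filtering happens here, along with the row numbering
        SELECT  i.*,
                ROW_NUMBER() OVER (ORDER BY i.ItemID) AS RowNum
        FROM dbo.Items AS i
        WHERE i.Name LIKE @Search + '%'
    )
    SELECT *
    FROM Filtered
    WHERE RowNum BETWEEN @CurrentPageIndex * @PageSize + 1
                     AND (@CurrentPageIndex + 1) * @PageSize
    ORDER BY RowNum;
END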
I have the following problems/questions with this approach:
When the DB table is large (say 10 million records), performance degrades and this approach becomes impractical.
Is using a table variable or a temporary table more efficient?
Is there any other efficient way to get paged data from database?
Hi Dan, the article provided new insight into calculating the total row count. Really helpful, thanks.
But is there still a better way than using a CTE when the data set is large?
Update: I have since found a few other performant approaches for efficiently getting paged records.

There's a good article on SqlServerCentral called SQL Server 2005 Paging – The Holy Grail that shows a few techniques for server-side paging. You will need to register to view it, though.
I know that for really large result sets, software like Google will simply estimate how many rows will be returned, bypassing the need to count all the rows.
Sorry I can't give more help.
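For anyone who can't access it: the core trick in that article is to return the total row count in the same pass as the page itself, rather than running a separate COUNT query. A minimal sketch (table and column names are placeholders; the sort key must be unique so the two numberings line up):

;WITH Numbered AS (
    SELECT  i.*,
            ROW_NUMBER() OVER (ORDER BY i.ItemID ASC)  AS RowAsc,
            ROW_NUMBER() OVER (ORDER BY i.ItemID DESC) AS RowDesc
    FROM dbo.Items AS i
)
SELECT *, RowAsc + RowDesc - 1 AS TotalRows  -- position from front + position from back - 1 = total
FROM Numbered
WHERE RowAsc BETWEEN 11 AND 20               -- second page with a page size of 10
ORDER BY RowAsc;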

Related

SQL Server: inserting lots of data from ASP.NET?

I have this application where there is a parent-child table structure, and customers can order products. The whole structure is too complex to post here, but suffice to say there is one Order table and one OrderDetails table for storing the orders. Currently what we do is INSERT one record into the Order table, and then, for each item the customer added, insert that item in a loop into the OrderDetails table. The solution is not scalable for obvious reasons. It works fine for 100 or so items, but if the user goes over 1000 items, or a quantity of 1000 for an item, one starts to notice the unresponsiveness of the application.
There are a couple of solutions that come to mind, but I am not sure which one would scale well. One is to use BulkInsert from my ASP.NET application to insert into the OrderDetails table. The second is to generate XML and pass that to a SQL proc which extracts/inserts the data into OrderDetails, but that has the associated overhead of the memory consumed by the generated XML. I know I could benchmark and see for myself what suits my application best, but I would like to know the most common strategy and which would scale better compared to the other. Also, if there is another technique I could use instead of these two that performs better (I know performance is a subjective word, but let me narrow it down to speed), I could use that. Which is generally used the most? What do you use in your application?
You could consider using a table-valued parameter in the database. You will have to create a table type object whose structure mimics that of the OrderDetails table. The stored proc for inserting the data will accept an input parameter of this type (such parameters are always READONLY).
In your server-side code, you can construct a DataTable object containing all the order-details data, which will be mapped to the input parameter of the stored proc. Ensure that the order of columns in the DataTable object exactly matches the order in the table-valued parameter. Upon executing the query, all the data will be inserted in one shot. This saves you from looping over each row of data, and also avoids the overhead of XML parsing. This approach does, however, involve passing an entire object over the network.
You can read more about it here: MSDN Table Valued Parameters
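A minimal sketch of the database side (type, proc, and column names below are illustrative, not the actual schema):

-- Table type mirroring the OrderDetails columns.
CREATE TYPE dbo.OrderDetailsType AS TABLE (
    OrderID   INT,
    ProductID INT,
    Quantity  INT,
    UnitPrice DECIMAL(10, 2)
);
GO
CREATE PROCEDURE dbo.InsertOrderDetails
    @Details dbo.OrderDetailsType READONLY   -- table-valued parameters must be READONLY
AS
BEGIN
    -- One set-based insert replaces the row-by-row loop.
    INSERT INTO dbo.OrderDetails (OrderID, ProductID, Quantity, UnitPrice)
    SELECT OrderID, ProductID, Quantity, UnitPrice
    FROM @Details;
END

On the C# side, you fill a DataTable with the same columns in the same order and attach it to a SqlParameter with SqlDbType.Structured before executing the proc.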
1000 items for an order does seem quite excessive!
Would it be feasible to introduce a limit of 100 items per order into the business logic of the application?

Recommended strategy for ASP.NET paging with large data, and optional sorting

The project I am currently working on requires retrieving/searching a large amount of data. The flow is as below:
Enter a keyword and search across about 500,000 members.
Retrieve only the top 6 members.
Allow sorting based on member country or gender.
Requirement: using EF 5.0.
The data is currently displayed using a UserControl, data-bound via a Repeater, and will be updated through an UpdatePanel with next/previous buttons, etc. EF 5.0 is preferred but not required; I am open to other options (e.g. SqlDataReader) and casting the results back to the member objects manually.
My current solution calls the entity set with Skip/Take based on the page number, i.e.
members = context.Members.Where(m => /* conditions here */)
    .OrderBy(m => m.MemberID)  // EF requires an explicit ordering before Skip/Take; key name assumed
    .Skip(pageNumber * pageSize)
    .Take(pageSize);
My question is: is my strategy the common, industry-standard way of doing this? Can anyone with similar experience share advice, in terms of performance/optimization, or a better way to do it?
I got really good performance using a stored procedure instead of a LINQ query; this saves the cost of query metadata generation and SQL translation. If you are returning a large result set, disabling change tracking (AsNoTracking in EF) is a good option too.
I am using database paging based on SET ROWCOUNT; see https://web.archive.org/web/20211020131201/https://www.4guysfromrolla.com/webtech/042606-1.shtml . It gives really good performance with 200,000 records, including sorting and paging.
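The technique in that article boils down to the SET ROWCOUNT trick; a minimal sketch (table and key names are placeholders, and the key is assumed unique):

DECLARE @PageNum INT, @PageSize INT, @FirstRow INT, @FirstID INT;
SELECT @PageNum = 3, @PageSize = 10;
SET @FirstRow = (@PageNum - 1) * @PageSize + 1;

SET ROWCOUNT @FirstRow;   -- scan just far enough to reach the first key of the page
SELECT @FirstID = MemberID FROM dbo.Members ORDER BY MemberID;

SET ROWCOUNT @PageSize;   -- then return exactly one page starting from that key
SELECT * FROM dbo.Members WHERE MemberID >= @FirstID ORDER BY MemberID;

SET ROWCOUNT 0;           -- reset so later statements are unaffected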

Display Paging In ASP.NET GridView Even with a single Page

In my database I have 10,000,000 records. In the GridView I am initially showing the first 10 records. To see the next records, the user needs to click the page numbers (1, 2, 3, ... 10000). But since I'm retrieving only 10 records the first time, GridView paging is not showing.
Is there any way to show paging in an ASP.NET GridView statically?
For so many records, I wouldn't recommend paging. You could show the top 20 recently added records and provide options for filtering: the user enters keywords, and you re-query and re-bind the GridView with the new result set.
You might also consider using PetaPoco, a micro-ORM, which will help you fetch paged results.
With so many records, you really need to take into account the exact queries being run to pull back the data.
There are numerous ways of doing data paging ( http://beyondrelational.com/modules/2/blogs/28/posts/10434/sql-server-server-side-paging-with-rownumber-function.aspx ). However, the "best" way is dependent on the exact version of SQL Server you are running.
Essentially, the solutions boil down to passing a page number and the number of records per page into some type of query, usually a stored procedure, as the query can get quite messy.
Once there, you have an option. Either send the total record count back as an OUT parameter in your query and the result set back normally, or you send the total record count back as a column. There are definitely efficiency concerns with both options as one way requires the query to be run twice and the other requires an extra column of data which increases network traffic.
Once you have that solved then you can figure out exactly how you want the UI to work.
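As an illustration of the OUT-parameter variant, on SQL Server 2012 or later where OFFSET/FETCH is available (object names are placeholders):

CREATE PROCEDURE dbo.GetRecordsPage
    @PageNum   INT,
    @PageSize  INT,
    @TotalRows INT OUTPUT
AS
BEGIN
    SELECT @TotalRows = COUNT(*) FROM dbo.Records;  -- this is where the "query runs twice" cost lives

    SELECT *
    FROM dbo.Records
    ORDER BY RecordID
    OFFSET (@PageNum - 1) * @PageSize ROWS
    FETCH NEXT @PageSize ROWS ONLY;
END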

Strategy for handling variable time queries?

I have a typical scenario that I'm struggling with from a performance standpoint. The user selects a value from a dropdown and clicks a button. A stored procedure takes that value as an input parameter, executes, and returns the results to a grid. For just one of the values ('All'), the query runs for roughly 2.5 minutes; for the rest of the values it runs in under 1 ms.
Obviously, having the user wait for 2.5 minutes just isn't going to fly. So, what are some typical strategies to handle this?
Some of my own thoughts:
New table that stores the information for the 'All' value and is generated nightly
Cache the data on the caching server
Any help is appreciated.
Thanks!
Update
A little bit more info:
The SP returns two result sets. The first is a GROUP BY ROLLUP summary, and the second is the same data disaggregated (roughly 80,000 rows).
I would first look at whether you have the proper indexes in place. Using the Query Analyzer and the Database Engine Tuning Advisor is a simple and often effective way of seeing which indexes might help.
If you still have performance problems after creating the appropriate indexes, you might then look at adding tables/views to speed things up. If your query does a lot of joins, you might consider creating an indexed view that allows you to do a select with no joins on the denormalized data. Since indexed views are persisted, you can see big gains from their use.
You can read up on indexed views here:
http://msdn.microsoft.com/en-us/library/dd171921%28v=sql.100%29.aspx
and read about the Database Engine Tuning Advisor here:
http://msdn.microsoft.com/en-us/library/ms166575.aspx
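A minimal indexed-view sketch (object names are placeholders; it assumes Quantity is NOT NULL, since indexed views disallow SUM over nullable expressions):

CREATE VIEW dbo.vOrderSummary
WITH SCHEMABINDING                      -- required for an indexed view
AS
SELECT  o.CustomerID,
        COUNT_BIG(*)    AS OrderCount,  -- COUNT_BIG(*) is required when using GROUP BY
        SUM(d.Quantity) AS TotalQty
FROM dbo.Orders AS o
JOIN dbo.OrderDetails AS d ON d.OrderID = o.OrderID
GROUP BY o.CustomerID;
GO
-- The unique clustered index is what actually persists (and indexes) the view's data.
CREATE UNIQUE CLUSTERED INDEX IX_vOrderSummary ON dbo.vOrderSummary (CustomerID);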
Also, how many records does "All" return? I have seen people get hung up on the "All" scenario before, but if it returns 1 million records or something then the data is not usable to a person anyways...
Caching data is a good thing, but.... if the SP is inherently flawed, then you might want to actually fix it instead of trying to bandage it with caching.
You might also want to (since you didn't mention it here) look at the number of rows "All" returns compared to the other selections, and think about your indexes.
Also, in your SP, does "All" cause it to run a different set of T-SQL (as in a CASE or an IF), or is it running the same code just with a different WHERE?
It might simply be that "All" just returns A LOT of records. You may want to implement paging and partial data-set return using Ajax (e.g. return the first 1000 records early so they can be displayed, and show a throbber on the screen while the rest of the data set is returned).
These are all options... if the number of records really isn't that different between "All" and the others, then it probably has something to do with the query/index/program flow.
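If it does turn out to be the same statement with a catch-all filter, one common culprit is a single cached plan serving both the selective values and "All" (parameter sniffing). A sketch of the pattern and the usual quick fix (names are hypothetical):

SELECT *
FROM dbo.Sales
WHERE (@Region = 'All' OR Region = @Region)
OPTION (RECOMPILE);  -- compile per execution so each case gets a plan suited to its row count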

Autocomplete optimization for large data sets

I am working on a large project where I have to provide an efficient way for a user to enter data into a form.
Three of the fields of that form require a value from a subset of a common data source (SQL Table). I used JQuery and JQuery UI to build an autocomplete, which posts to a generic HttpHandler.
Internally the handler uses LINQ to SQL to grab the data required from that specific table. The table has about 10 different columns, and the LINQ expression uses SqlMethods.Like() to match the single search term against each of those 10 fields.
The problem is that the table contains some 20K rows. The autocomplete works flawlessly, except that the sheer volume of data introduces delays, in the vicinity of 6 seconds or so (when debugging on my local machine), before the results show up.
The jQuery UI autocomplete has a delay of 0 and queries on the third keystroke, and the results of the post are shown as Facebook-style multi-row selectable options. (I almost had to rewrite the autocomplete plugin...)
So the problem is data vs. speed. Any thoughts on how to speed this up? The only two thoughts I had were to cache the data (how/where?) or to use a straight-up SqlDataReader for data access.
Any ideas would be greatly appreciated!
Thanks,
<bleepzter/>
I would look at only returning the first X rows using the .Take(10) LINQ method. That should translate into a sensible SQL call, which will put much less load on your database. As the user types they will find fewer and fewer matches, so they will only see the data they require.
I normally reckon 10 items is enough for the user to understand what is going on and still get to the data they need quickly (see the amazon.com search bar for an example).
Obviously if you can sort the data in a meaningful fashion then the 10 results will be much more likely to give the user what they are after quickly.
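For reference, the SQL a Take(10) should roughly produce is sketched below (placeholder names). One related caveat: SqlMethods.Like() with a '%term%' pattern has a leading wildcard, which rules out index seeks and forces a scan of all 20K rows on every request; a prefix-only pattern can seek an index on the column.

SELECT TOP (10) CustomerID, CustomerName
FROM dbo.Customers
WHERE CustomerName LIKE @SearchTerm + '%'  -- no leading %, so an index seek is possible
ORDER BY CustomerName;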
Returning the top N results is a good idea for sure. We found (querying a potential list of 270K) that returning the top 30 is a better bet for the user finding what they're looking for, but that COMPLETELY depends on the data you are querying.
Also, you REALLY should drop the delay to something sensible like 100-300 ms. When you set delay to ZERO, once you hit the 3-character trigger, effectively EVERY. SINGLE. KEY. STROKE. is sent as a new query to your server. This could easily have the unintended and unwelcome effect of slowing down the response even MORE.
