I'm following the example provided here:
http://msdn.microsoft.com/en-us/library/bb497936.aspx
The top half shows a stored procedure that returns just the records needed: for example, if I have 100 records and want to display 10 per page, I can tell the proc the maximum number of rows I want and the pageIndex to begin on. Is there a way to do the same thing without ObjectDataSource, i.e. using a custom DataAccess layer? I'm able to get back the 10 records I want, but then I don't get paging, because the GridView sees only 10 rows coming back from the proc. Is there a way to tell it I have 100 but am only displaying these 10?
As far as your DAL is concerned, if the stored procedure returns 10 records, then that's all it will know. Your DAL has no way of knowing how many records are in the underlying data unless you tell it.
If you want the stored procedure to return, say, 10 of 100 rows and you want the DAL to know there were 100 rows in total, then you need to pass that value out separately - in an output variable, perhaps?
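For illustration, here is a minimal DAL sketch of that idea. The procedure name usp_GetRecordsPaged and the @PageIndex/@PageSize/@TotalRowCount parameters are invented for this example - substitute whatever your own stored procedure exposes:

// Requires: using System.Data; using System.Data.SqlClient;
public static DataTable GetPage(string connectionString, int pageIndex, int pageSize, out int totalRowCount)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.usp_GetRecordsPaged", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@PageIndex", pageIndex);
        cmd.Parameters.AddWithValue("@PageSize", pageSize);

        var totalParam = cmd.Parameters.Add("@TotalRowCount", SqlDbType.Int);
        totalParam.Direction = ParameterDirection.Output;

        var page = new DataTable();
        new SqlDataAdapter(cmd).Fill(page);      // Fill opens and closes the connection itself

        totalRowCount = (int)totalParam.Value;   // the "100" the DAL otherwise never sees
        return page;
    }
}

With the total in hand you can drive the pager yourself (or, if I remember correctly, on ASP.NET 4.5+ you can feed it to the GridView's AllowCustomPaging/VirtualItemCount properties).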
If you are doing paging in your stored procedure, you may lose the performance you were trying to gain if you also evaluate the total number of rows affected.
It sounds as if your solution could be architected better, but we would need to better understand your requirements and what you are trying to achieve before providing specific architectural advice.
We have developed an ASP.NET web application and are using an ASP.NET GridView to display and edit the records.
We have 5000 rows and 23 columns in a single grid, and binding is taking a long time. Our client refuses the pagination option. How can we make the binding faster with 5000 to 7000 records?
Please advise.
Thanks
Mayil.M
Where does your data come from? Is it a database or some other external resource?
You can use caching so you don't load the whole dataset from the external resource but from the memory. Please note this solution will not work if your data changes often.
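A rough sketch of the caching idea, using ASP.NET's built-in cache; LoadAllRecords() is a placeholder for whatever call currently hits your database or external resource:

// Requires: using System; using System.Data; using System.Web; using System.Web.Caching;
public static DataTable GetRecords()
{
    // Return the cached copy if we already loaded it.
    var cached = HttpRuntime.Cache["AllRecords"] as DataTable;
    if (cached != null)
        return cached;

    // Otherwise load once and keep it in memory for 10 minutes.
    DataTable data = LoadAllRecords();   // placeholder for your existing data access
    HttpRuntime.Cache.Insert("AllRecords", data, null,
        DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
    return data;
}

Pick the expiration to match how often the data actually changes.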
Another approach would be to use some kind of partial loading mechanism, for example via Ajax. This will however require changing your approach, as I am not sure the GridView control supports this. You would have to create a custom control and then make sequential requests (using, for example, Ajax) for smaller chunks (e.g. 200 records) of data and display them as they arrive. Eventually you will have the complete set, but the data will be available faster.
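Server-side, that could be as simple as a generic handler that returns one chunk at a time; the client calls it repeatedly with an increasing offset and appends the rows. The handler name, the offset/count parameters and the GetRecords() call below are all invented for illustration:

// Requires: using System.Linq; using System.Web; using System.Web.Script.Serialization;
// Called as e.g. /ChunkHandler.ashx?offset=200&count=200
public class ChunkHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        int offset = int.Parse(context.Request["offset"] ?? "0");
        int count  = int.Parse(context.Request["count"]  ?? "200");

        // Placeholder: a DAL call that returns only the requested slice of records.
        var chunk = GetRecords(offset, count);

        context.Response.ContentType = "application/json";
        context.Response.Write(new JavaScriptSerializer().Serialize(chunk));
    }

    public bool IsReusable { get { return true; } }

    private static object GetRecords(int offset, int count)
    {
        return Enumerable.Empty<object>();   // replace with your data access code
    }
}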
Finally you can combine both, to make it even faster.
You should implement your own paging mechanism. The problem is that DataBind retrieves all 7000 records although only, say, 20 are displayed/rendered. Create, for example, a stored procedure that fetches only the selected range of records (if you are on page 2 and your page size is 20, you only need rows 21 to 40). Use a SQL Server CTE with ROW_NUMBER() to number the rows on the server side and filter the range with BETWEEN. The stored procedure then returns only the records you actually need; change your GridView to get its data from this stored procedure.
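The query inside such a procedure could look roughly like this, shown here inlined in a DAL method as a sketch; dbo.Orders, OrderDate and the zero-based @PageIndex are placeholders for your own schema and conventions:

// Requires: using System.Data; using System.Data.SqlClient;
public static DataTable GetPage(string connectionString, int pageIndex, int pageSize)
{
    const string sql = @"
        WITH Numbered AS
        (
            SELECT *, ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS RowNum
            FROM   dbo.Orders
            -- WHERE <your filtering goes here>
        )
        SELECT *
        FROM   Numbered
        WHERE  RowNum BETWEEN (@PageIndex * @PageSize) + 1
                      AND     (@PageIndex + 1) * @PageSize;";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@PageIndex", pageIndex);
        cmd.Parameters.AddWithValue("@PageSize", pageSize);

        var page = new DataTable();
        new SqlDataAdapter(cmd).Fill(page);
        return page;   // bind this to the GridView instead of the full table
    }
}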
You can also load data on scroll, like the Facebook wall.
The project I am currently working on requires retrieving/searching a large amount of data; the flow is as below:
Enter a keyword and search from about 500,000 members
Retrieve only top 6 members.
Allow sorting based on the member country or gender.
Requirements: Using EF5.0
The data is currently displayed using a UserControl and data-bound using a Repeater, and will be updated through an UpdatePanel with next/previous buttons, etc. EF 5.0 is preferred but not required, and I am open to other options (e.g. a SqlDataReader) and casting the results back to the member objects manually.
My current solution calls the entities with Skip/Take based on the page number, i.e.
members = context.Members.Where(m => /* conditions here */).OrderBy(m => m.MemberId)  // Skip/Take need an explicit OrderBy in EF
    .Skip(pageNumber * pageSize).Take(pageSize);
My question is: is my strategy the standard/common way of doing this? Can anyone with similar experience share their thoughts on performance/optimization? Is there a better way to do it?
I got really good performance using a stored procedure instead of a LINQ query, since it avoids the cost of query metadata generation and SQL translation. If you are returning a large result set, disabling change tracking is a good option too.
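Both ideas, sketched against the question's Member entity (the proc name, its parameters and the Country/MemberId properties are assumptions; requires using System.Data.SqlClient; and EF 5):

// 1. Call a paging stored procedure directly and materialize Members:
var members = context.Database
    .SqlQuery<Member>("EXEC dbo.usp_SearchMembers @Keyword, @PageIndex, @PageSize",
        new SqlParameter("@Keyword", keyword),
        new SqlParameter("@PageIndex", pageIndex),
        new SqlParameter("@PageSize", pageSize))
    .ToList();

// 2. Or keep the LINQ query but turn off change tracking for read-only results:
var page = context.Members
    .AsNoTracking()              // entities are not attached to the context
    .Where(m => m.Country == country)
    .OrderBy(m => m.MemberId)    // Skip/Take require an explicit ordering
    .Skip(pageIndex * pageSize)
    .Take(pageSize)
    .ToList();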
I am using database paging which uses ROWCOUNT - see https://web.archive.org/web/20211020131201/https://www.4guysfromrolla.com/webtech/042606-1.shtml - and it gives really good performance with 200,000 records, including sorting and paging.
My database has 10,000,000 records. In the GridView I am initially showing the first 10 records. To see the next records, the user needs to click the page numbers (1, 2, 3, ... 10000). But as I'm only retrieving 10 records the first time, the GridView paging is not showing.
Is there any way to show paging in an ASP.NET GridView statically?
For so many records, I wouldn't recommend paging. You can show the top 20 recently added records and provide options to filter the records: the user enters keywords, and you then re-query and re-bind the GridView with the new result set.
You might also consider using PetaPoco, a micro-ORM, which will help you fetch paged results.
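If I remember PetaPoco's API correctly, a paged fetch looks roughly like this (the connection string name, the Record POCO, the SQL and the GridView1/txtFilter controls are all assumptions - check the current PetaPoco documentation):

string keyword = txtFilter.Text;                       // the user's filter text
var db = new PetaPoco.Database("MyConnectionString");

// Page 2, 20 items per page; PetaPoco rewrites the query to fetch only that page.
var result = db.Page<Record>(2, 20,
    "SELECT * FROM Records WHERE Title LIKE @0 ORDER BY CreatedDate DESC",
    keyword + "%");

GridView1.DataSource = result.Items;   // only the rows for this page
long totalRows = result.TotalItems;    // total matching rows, for the pager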
With so many records, you really need to take into account the exact queries being run to pull back the data.
There are numerous ways of doing data paging ( http://beyondrelational.com/modules/2/blogs/28/posts/10434/sql-server-server-side-paging-with-rownumber-function.aspx ). However, the "best" way is dependent on the exact version of SQL Server you are running.
Essentially, the solutions boil down to passing a page number and the number of records per page into some type of query - usually a stored procedure, as the query can get quite messy.
Once there, you have an option: either send the total record count back as an OUT parameter and the result set back normally, or send the total record count back as an extra column. There are efficiency concerns with both options, as one way requires the query to be run twice and the other adds an extra column of data, which increases network traffic.
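For the extra-column flavour, one common approach on SQL Server 2005+ is a windowed COUNT(*) OVER(), so a single round trip returns both the page and the total. A hedged sketch, with invented table, column and parameter names:

// Requires: using System.Data; using System.Data.SqlClient;
public static DataTable GetPage(string connectionString, string status,
                                int startRow, int endRow, out int totalRows)
{
    const string sql = @"
        WITH Numbered AS
        (
            SELECT o.*,
                   ROW_NUMBER() OVER (ORDER BY o.CreatedDate DESC) AS RowNum,
                   COUNT(*)     OVER ()                            AS TotalRows
            FROM   dbo.Orders o
            WHERE  o.Status = @Status            -- your search criteria
        )
        SELECT * FROM Numbered
        WHERE  RowNum BETWEEN @StartRow AND @EndRow;";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@Status", status);
        cmd.Parameters.AddWithValue("@StartRow", startRow);
        cmd.Parameters.AddWithValue("@EndRow", endRow);

        var page = new DataTable();
        new SqlDataAdapter(cmd).Fill(page);

        // Every returned row carries the same TotalRows value.
        totalRows = page.Rows.Count > 0 ? (int)page.Rows[0]["TotalRows"] : 0;
        return page;
    }
}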
Once you have that solved then you can figure out exactly how you want the UI to work.
I am working on a large project where I have to present an efficient way for a user to enter data into a form.
Three of the fields of that form require a value from a subset of a common data source (SQL Table). I used JQuery and JQuery UI to build an autocomplete, which posts to a generic HttpHandler.
Internally the handler uses LINQ to SQL to grab the data required from that specific table. The table has about 10 different columns, and the LINQ expression uses SqlMethods.Like() to match the single search term against each of those 10 fields.
The problem is that the table contains some 20K rows. The autocomplete works flawlessly, except that the sheer volume of data introduces delays, in the vicinity of 6 seconds or so (when debugging on my local machine), before the results show up.
The jQuery UI autocomplete has 0 delay, fires the query on the third keystroke, and the result of the post is rendered as Facebook-style multi-row selectable options. (I almost had to rewrite the autocomplete plugin...)
So the problem is data vs. speed. Any thoughts on how to speed this up? The only two thoughts I had were to cache the data (how/where?) or to use a straight-up SqlDataReader for data access.
Any ideas would be greatly appreciated!
Thanks,
<bleepzter/>
I would look at only returning the first X rows using the .Take(10) LINQ method. That should translate into a sensible SQL call, which will put much less load on your database. As the user types, there will be fewer and fewer matches, so they will only see the data they require.
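In your LINQ to SQL handler that might look roughly like this (MyDataContext, Entries and the column names are guesses at your model; requires using System.Linq; and using System.Data.Linq.SqlClient;):

using (var db = new MyDataContext())
{
    string pattern = "%" + term + "%";   // term = the search text posted by the autocomplete

    var matches = db.Entries
        .Where(e => SqlMethods.Like(e.Name, pattern)
                 || SqlMethods.Like(e.Code, pattern)
                 || SqlMethods.Like(e.Description, pattern))
        .OrderBy(e => e.Name)   // a meaningful order makes the top 10 more useful
        .Take(10)               // translates to SELECT TOP (10) ... on SQL Server
        .ToList();
}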
I normally reckon 10 items is enough for the user to understand what is going on and still get to the data they need quickly (see the amazon.com search bar for an example).
Obviously if you can sort the data in a meaningful fashion then the 10 results will be much more likely to give the user what they are after quickly.
Returning the top N results is a good idea for sure. We found (querying a potential list of 270K) that returning the top 30 is a better bet for the user finding what they're looking for, but that COMPLETELY depends on the data you are querying.
Also, you REALLY should drop the delay to something sensible like 100-300 ms. When you set delay to ZERO, once you hit the 3-character trigger, effectively EVERY. SINGLE. KEY. STROKE. is sent as a new query to your server. This could easily have the unintended and unwelcome effect of slowing down the response even MORE.
I am encountering a performance problem in the development of the current UI. The problem, I suppose, is a general one though.
I have a page with a simple ASP.NET grid. The grid displays data from a table based on certain search criteria and has a fixed page size (say 10). There is a pager at the bottom which can be used to navigate between pages. On the back end, whenever the search button is pressed, a stored procedure is called which returns the desired data.
The stored procedure has parameters like currentPageIndex, pageSize, other search criteria, etc. Here is pseudocode for the SP:
-- SP begins
-- calculate the page index range to return required using current page index and page size
-- query the table in a CTE and do all filtering. Also calculate row numbers so that
-- correct record range can be returned.
-- use the cte to return the correct record based on the row number calculated in CTE
-- SP ends
I have the following problems/queries with this approach:
When the DB table is large (say 10 million records), performance degrades and this approach becomes impractical.
Is using a table variable or a temporary table more useful?
Is there any other efficient way to get paged data from database?
Hi Dan, the article provided new insight into calculating the total number of rows. Really helpful. Thanks.
But still, is there a better way than using a CTE when the data is large?
Update: I have found a few other performant approaches for efficiently getting paged records.
There's a good article on SqlServerCentral called SQL Server 2005 Paging – The Holy Grail that shows a few techniques for server-side paging. You will need to register to view it, though.
I know that for really large result sets, software like Google will simply estimate how many rows will be returned, bypassing the need to count all the rows.
Sorry, if I can't give more help.