I am using the GridView control in .NET Framework 4.0. My list contains thousands of rows, which I am binding to the GridView on each postback. As a result, my page takes a long time to load, and I want to speed it up. Is there another control available that would improve performance, or is there some other way to achieve this?
All I want is faster performance.
You might be able to use faster controls, such as the Repeater, but it depends on what features you really need. Are you only displaying the data, or is it editable?
With such a large amount of data, you can also look at optimizing the HTML you use for rendering; you may be able to cut the page size in half...
The first step should be to do a performance check to find out exactly what is slow.
Check where in the code things take time; it could be one of many things.
1) If the control uses JavaScript, perhaps the users are on an old browser version with a slow JavaScript engine.
2) Perhaps the issue is bandwidth?
3) Perhaps it's a missing SQL index.
and on it goes.
Don't guess at what is wrong; find out for sure what is taking the time, and solve the issues one at a time.
As Forgotten Semicolon suggested, loading via Ajax might be a solution; it would give the user a better idea of what is going on.
Other than that, I would heavily suggest caching if possible; you can use the built-in Cache to store the DataTable.
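A minimal sketch of that idea in the page's code-behind, assuming a hypothetical LoadGridDataFromDatabase() data-layer call and an arbitrary 10-minute expiration:

    private DataTable GetGridData()
    {
        // Reuse the application-wide cache so postbacks (and other users)
        // don't hit the database again while the entry is still fresh.
        DataTable data = Cache["GridData"] as DataTable;
        if (data == null)
        {
            data = LoadGridDataFromDatabase();   // assumed data-layer method
            Cache.Insert("GridData", data, null,
                         DateTime.UtcNow.AddMinutes(10),                  // absolute expiration
                         System.Web.Caching.Cache.NoSlidingExpiration);
        }
        return data;
    }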
In addition to the point about not binding on every postback, I am assuming the data source is a SQL database. You should probably check how fast the query runs, and make sure you have properly indexed your tables.
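On the "not binding on every postback" point, the usual pattern is to bind only on the first load and let view state carry the grid through postbacks; a sketch, with BindGrid() as a made-up helper:

    protected void Page_Load(object sender, EventArgs e)
    {
        // Bind the expensive data only on the first request for the page;
        // on postbacks the GridView is rebuilt from view state instead.
        if (!IsPostBack)
        {
            BindGrid();   // hypothetical helper that sets GridView1.DataSource and calls DataBind()
        }
    }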
I have a textbox on an ASP.NET WebForms page used to enter a city. As you type, it should provide suggestions of matching results, just like Facebook does.
I tried these two methods to implement this.
1. I first used the OnTextChanged event with AJAX and found out that it only fires when the textbox loses focus, whereas I wanted a solution that works as you type. The advantage of this approach was that I could use a database and it would be fast, because no XML files would be transferred in the process.
2. I used AJAX on the client side with JavaScript. The problem is that the XML containing cities, their states, and countries is a massive 30 MB file, so it was impossible to use. I then thought of splitting it into 26 small XML files, one per first letter, but those would still be too big to use in practice. Now I am planning to use 26*26 files containing the cities that share the same first two letters, but I think that is an ineffective way to do what I want.
Is there any other efficient way of accomplishing it?
The best way would be to use a database, if I could.
You need to use onkeypress and/or onkeyup events instead.
Did you know that there are plug-and-play auto-complete components out there that are free? For example http://jqueryui.com/demos/autocomplete/
Use JSON! It's much more compact. You'll probably save 30-40% on the size of that data.
Did you know that you don't need to pass the whole data set for that to work? You can have it live on the server (e.g. in the database, or cached on the web server for faster access and less DB traffic) and have clients only pull a small set of data at a time, based on the characters they type. The jQuery UI Autocomplete supports that feature.
If you cannot use jQuery and jQuery UI (not wanting to would be an unacceptable answer), then I'm pretty sure there are other free alternatives, including this one: http://www.asp.net/ajaxLibrary/AjaxControlToolkitSampleSite/AutoComplete/AutoComplete.aspx
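To illustrate the server-side half of that approach, here is a rough sketch of a generic handler that the jQuery UI Autocomplete could call; the handler name and the CityRepository class are made up, but the "term" query-string parameter and the JSON array response are what the widget expects from a remote source:

    using System;
    using System.Web;
    using System.Web.Script.Serialization;

    public class CityAutoComplete : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // jQuery UI Autocomplete sends the typed text as ?term=...
            string term = context.Request.QueryString["term"] ?? string.Empty;

            // Assumed data-layer call that returns at most 10 matching city names.
            string[] matches = CityRepository.FindByPrefix(term, 10);

            context.Response.ContentType = "application/json";
            context.Response.Write(new JavaScriptSerializer().Serialize(matches));
        }

        public bool IsReusable { get { return true; } }
    }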
I've received a project for internal use. My application has to store about 100 rows of metadata about games, and each row has about 15 fields at most. Fields include game name, game category, maker, source code path, etc. I will most likely have to join about 5-10 tables for each row. Only a few people will be using it, and it will receive very few hits, so speed is not much of an issue. The rows of data I present must be sortable and searchable.
My current solution is to use ASP.NET's GridView control with an ASP.NET AJAX UpdatePanel to give it that AJAX feel. I'm thinking of using LINQ to SQL as my data access layer. I'm also considering building my own custom search, but if there's an existing control that already has this feature, I would prefer to use that; does anyone know if such a control exists? Anyway, what do you think?
Update #1:
I'm looking into creating a Dynamic Data website. Does anyone have thoughts on that?
Use Ext JS!
Look at the grid samples; it's a very shallow learning curve and gives you amazing results in little to no time.
http://extjs.com/products/extjs/
Basically, you expose your data via a web service (ASMX or WCF, your choice), throw the Ext JS grid onto your HTML/ASPX page, and point it at your web service. Configure the control for things like sorting/searching/expanding/grouping/paging, etc. (use the API reference: http://extjs.com/deploy/dev/docs/).
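As a rough sketch of the web-service side only (GamesService, GameRow, and GameRepository are made-up names, and the grid's proxy/reader configuration is left out), an ASMX script service might look like this:

    using System.Collections.Generic;
    using System.Web.Services;
    using System.Web.Script.Services;

    public class GameRow
    {
        public int GameId { get; set; }
        public string Name { get; set; }
        public string Category { get; set; }
    }

    [WebService(Namespace = "http://example.com/")]
    [ScriptService]   // lets ASP.NET serialize the return value to JSON for script callers
    public class GamesService : WebService
    {
        [WebMethod]
        public List<GameRow> GetGames(int start, int limit, string sortField)
        {
            // Apply paging and sorting here so the grid only receives the slice it asked for.
            return GameRepository.GetPage(start, limit, sortField);   // assumed data-layer call
        }
    }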
ASP.NET Dynamic Data looks really cool, particularly for sites where you've got:
lots of data
not a lot of worries about performance
no / little desire to skin / design the site
no / little desire to extend existing / write new functionality.
So I'd say that's a good match for your project.
GridView is your best bet. It's very powerful if you know how to use it correctly. It does automatic sorting, and if you can code reasonably well you can make the data filterable (if that's a word). It also makes the connection to the database for you... so in my opinion, you can't beat the GridView when it comes to reports like that.
I'm creating an ASP.NET website that displays large amounts of data. The data is served to me through a data access layer. From the data I'm getting, I'm building up large DataTables and then displaying them using either GridViews or dynamically created web controls.
The problem I'm finding is that the website is slow when a lot of data is passed to it. I've read that data readers are the way to go, but I can't use a DataReader directly against the SQL table because I need to go through the data access layer.
I also don't have the option of partially filling the DataTable, as I need to apply a lot of sorting to the data to display what I need.
Any suggestions of ways to speed up DataTables, or should I perhaps use something else that's more efficient?
Since you are “...building up large DataTables and then displaying them using either GridViews or dynamically created web controls”, the network can be a bottleneck. See the answers to the similar SO question; they may be helpful.
Are you absolutely certain that the bottleneck is in the Web Application?
The first thing I would do is guess which SQL query on the slow page takes the longest to execute, then see how it runs in the query browser.
If it's slow, work on optimizing that.
Pulling 'large' amounts of data into a web application and doing the sorting/filtering there is always going to be slow, depending on your definition of 'large'. If you can apply any sorting/filtering on the database server before you pull the data into your web application, that should speed things up. You say you don't have the option to do this, but sorting is something database servers are made for; are you sure you can't make this work?
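For example, if your data access layer can expose an IQueryable (say via LINQ to SQL), the filter, sort, and row limit below are translated to SQL and executed on the database server; the context, column, and control names are invented for illustration:

    // SalesDataContext, Orders, Status, OrderDate and OrdersGrid are assumed names.
    using (var db = new SalesDataContext())
    {
        var rows = db.Orders
                     .Where(o => o.Status == "Open")             // filter on the server
                     .OrderByDescending(o => o.OrderDate)        // sort on the server
                     .Take(100)                                  // only pull what the page shows
                     .ToList();

        OrdersGrid.DataSource = rows;
        OrdersGrid.DataBind();
    }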
You can use a distributed cache to cache your data, e.g. Memcached (http://www.danga.com/memcached/) or Microsoft's Velocity distributed cache (http://www.microsoft.com/downloads/details.aspx?FamilyId=B24C3708-EEFF-4055-A867-19B5851E7CD2&displaylang=en).
The first thing you will want to do is pinpoint exactly which part of the process is slow. It might not be where you think it is. Do code profiling, or time the different parts, to determine exactly how much time each piece of code consumes during a request. In our case we found that the data layer (executing readers, populating object models) was really fast (with a couple of exceptions that were taken care of by indexes in the database), while we had some JavaScript on the client that was really slow.
So, start with measuring, then decide where to optimize.
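A crude but effective way to get those timings in a WebForms page is a Stopwatch plus page tracing; LoadData() and ResultsGrid are placeholder names, and Trace="true" must be set in the @ Page directive for the output to appear:

    protected void Page_Load(object sender, EventArgs e)
    {
        var sw = System.Diagnostics.Stopwatch.StartNew();

        DataTable data = LoadData();                      // assumed data-layer call
        Trace.Write("Timing", "Load: " + sw.ElapsedMilliseconds + " ms");

        sw = System.Diagnostics.Stopwatch.StartNew();
        ResultsGrid.DataSource = data;
        ResultsGrid.DataBind();
        Trace.Write("Timing", "Bind: " + sw.ElapsedMilliseconds + " ms");
    }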
I have to build a search module with an auto-suggest feature in ASP.NET.
The search criterion is training name, and there is a table in the database that stores trainings. The table could hold as many as 30,000 trainings, so I have to be very careful in selecting an approach with performance in mind.
There could be about 3,000 users logged into the system simultaneously. When a user starts typing a training name, the system should auto-suggest matches.
The approaches that came to my mind are as follows:
Cache object - There would be a database hit after the user types 3 characters (e.g. "saf"); the system would search the activity table for all trainings starting with "saf" and cache them. Subsequent requests would go through this cache.
But the problem with this approach is that if there are 3,000 concurrent users and they all search for different three-letter combinations, the cache would just blow up.
Client-side caching - I have not thought much about this. The only drawback I see is that we might have to purge the Temporary Internet Files folder periodically.
Using Session - I ruled this out completely, as I thought it would hurt performance.
Can you please suggest the best approach, or any other approach I could take here? I am looking for any information/ideas you have on this.
Thank you so much
Deepa.
My favourite jQuery plug-in for this (if you intend to use jQuery) is Flexbox.
It has a really impressive list of features.
You could use the jQuery Auto Complete plugin, which has caching features built in.
    // Wire up the plugin once the DOM is ready; OnResultSelected handles the chosen suggestion.
    $(document).ready(function ()
    {
        $(".landingpage").autocomplete('/AutoSuggestHandler.ashx',
        {
            minChars: 1,
            matchSubset: 1,
            autoFill: false,
            delay: 10,
            scroll: false
        }).result(OnResultSelected);
    });
Furthermore, you could specify output caching on the generic handler to accommodate the need for caching across users.
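Since the OutputCache directive applies to pages and user controls rather than .ashx handlers, one way to get that effect is to set the cache policy on the response inside the handler; a sketch, where "q" stands in for whatever query-string parameter carries the typed text:

    public void ProcessRequest(HttpContext context)
    {
        // Let browsers and proxies cache each distinct query for a couple of minutes,
        // so repeated lookups for the same prefix never reach the server.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(2));
        context.Response.Cache.VaryByParams["q"] = true;   // "q" is the assumed parameter name

        // ... build and write the suggestion list here, as before ...
    }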
I think your first approach will work.
Make sure there is an index on the field - you probably won't need to index the whole field. This should give the database a decent boost. You may need to look at full-text indexing depending on how your search works, or even use an external library like Lucene for the index if performance is an issue.
Cache the object, or even the resulting XML/JSON from the queries, to improve performance.
You should also set the HTTP headers so that browsers cache the XML/JSON as well.
Your posting really contains two questions:
How can I get autocomplete on my webpage?
I am concerned about performance due to a large number of queries hitting my database at the same time.
My answers...
1: We've found the ASP.NET AJAX AutoComplete Extender works well on all modern browsers, provides a slick user experience and is pretty easy to implement.
In your web application you need to create a web service that has a method with a specific signature (covered in the documentation linked to above; there is a sketch of it after point 2).
2: Have you proven that you actually have a performance bottleneck with this part of your project? I'd recommend setting up a test harness and hitting your database with a large number of autocomplete queries to see how much it can take. Be wary of premature optimization.
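For point 1: from memory, the service method the extender looks for has roughly this shape (the service and repository names here are made up; check the documentation for the exact contract):

    using System.Web.Services;
    using System.Web.Script.Services;

    [WebService(Namespace = "http://example.com/")]
    [ScriptService]
    public class TrainingService : WebService
    {
        // The extender passes the text typed so far and the maximum number
        // of suggestions to return, and expects an array of strings back.
        [WebMethod]
        public string[] GetCompletionList(string prefixText, int count)
        {
            return TrainingRepository.FindNamesStartingWith(prefixText, count);   // assumed data-layer call
        }
    }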
It seems like a common issue, so I am surprised I didn't find a solution already; maybe someone can help me out.
I have a GridView that displays a list of the app's users. This list is very big and takes forever to load. Otherwise, the data is paged through, and once loaded everything works fine. To help admins, I made a search box, and that works well.
The only issue is the initial load of data; it seems that ASP.NET is retrieving all the records up front.
Is there a way to get only the records for the current page? Maybe there is a GridView setting I am missing, or I am doing something else wrong.
Thank you in advance for suggestions.
Zeljko
You're going to have to do your own custom paging. Depending on your data source, you have to pass the page index and page size so that only the current page's results are returned.
I've used this walkthrough before:
https://web.archive.org/web/20210510021915/http://aspnet.4guysfromrolla.com/articles/031506-1.aspx
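The core idea from that article, sketched here with invented table and column names, is to number the rows in SQL Server and return only the slice for the requested page:

    using System.Data;
    using System.Data.SqlClient;

    // Fetch just one page of users; pageIndex is zero-based.
    public static DataTable GetUsersPage(string connectionString, int pageIndex, int pageSize)
    {
        const string sql = @"
            SELECT UserId, UserName, Email
            FROM (
                SELECT UserId, UserName, Email,
                       ROW_NUMBER() OVER (ORDER BY UserName) AS RowNum
                FROM Users
            ) AS Numbered
            WHERE RowNum BETWEEN @First AND @Last;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
            cmd.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);

            var table = new DataTable();
            new SqlDataAdapter(cmd).Fill(table);   // Fill opens and closes the connection itself
            return table;
        }
    }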
It sounds like maybe you're using a DataTable/DataSet when you really want a DataReader. Alternatively, maybe I'm not understanding what you mean by "initial", and what you're experiencing is that you hit the JIT compiler every time you do a new deployment.
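For completeness, a minimal sketch of the DataReader approach (the connection string, query, and column positions are assumed); rows stream through one at a time instead of being buffered into a DataTable:

    using System.Data.SqlClient;

    string connectionString = "...";   // your connection string here

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT UserId, UserName FROM Users", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Process each row as it arrives; nothing is held in memory
                // beyond the current row.
                string userName = reader.GetString(1);
            }
        }
    }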