I'm building an ASP.NET web application with lots and lots of controls and huge volumes of data. My application is very slow, and it takes a long time to load the data into .NET controls like grids, tree views, etc. I also have some AJAX-enabled pages and controls in my application. I want to reduce the page load time on each postback.
What standards/best practices should be followed when developing large ASP.NET applications?
Thank you.
NLV
Cache certain data, either in the application or in the database (this breaks normalization, but that's okay here).
Retrieve the minimum subset of data you really need. Don't pull 10,000 records from the database into a grid just to display 50. Query for exactly 50.
Minimize the number of server controls and the amount of dynamically created markup. Replace what you can with plain HTML elements.
Switch off view state, which can potentially inflate pages to many megabytes in size. Pull the data from the database on each request instead, in combination with the caching strategies above (see the sketch after this list).
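To make the second and fourth points concrete, here is a minimal sketch, assuming a hypothetical QueryPage data-access helper that issues a paged SELECT for exactly the rows the grid displays:

```csharp
// A sketch of points 2 and 4 together: view state off, and the grid
// bound on every request to exactly the rows it will display.
protected void Page_Load(object sender, EventArgs e)
{
    // With view state off, the grid no longer serializes its rows
    // into the page, so we rebind on postbacks as well.
    GridView1.EnableViewState = false;

    // QueryPage is an assumed data-access call that asks the database
    // for exactly the 50 rows the grid will show, not the whole table.
    GridView1.DataSource = QueryPage(0, 50);
    GridView1.DataBind();
}
```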
You can use jQuery to call a JSON web service and retrieve the data that way, which is much lighter-weight than UpdatePanel-style AJAX. Check this: http://www.codeproject.com/KB/webservices/JsonWebServiceJQuery.aspx
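For reference, the service side the linked article wires up looks roughly like this; DataService, Customer, and CustomerRepository are illustrative names, not from the article:

```csharp
using System.Web.Services;
using System.Web.Script.Services;

// [ScriptService] lets ASP.NET serialize the response as JSON so that
// jQuery's $.ajax can consume it directly.
[WebService(Namespace = "http://tempuri.org/")]
[ScriptService]
public class DataService : WebService
{
    [WebMethod]
    public Customer[] GetCustomers(int pageIndex, int pageSize)
    {
        // Query only the requested page from the database.
        return CustomerRepository.GetPage(pageIndex, pageSize);
    }
}
```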
I'm relatively new to .NET and the controls it contains. I realize that this question is really open to opinion, but would appreciate the input of those of you who've had to support .NET pages where your database record sets were in the millions of records.
I have an Oracle table that will grow to several hundred thousand rows and will likely exceed 5-10 million over the next few years. The current system this is replacing holds 6 million records. Not massive, but enough to impact page performance. We want to display the data via an ASPX page with typical Next Page/Previous Page navigation. The user would be able to search/filter the data to limit their record set, but could still end up with two to three pages (100-300 records) of results.
It is my understanding that the ListView object in .NET retrieves all of the records in your data source's SELECT statement, and that the filtering/paging controls then manipulate the results on the .NET server rather than at the database (until you issue a new DataBind()). This seems to be a potential performance issue with larger record sets, and we are already seeing it with just a few thousand records: the filter controls don't appear to filter until the entire underlying table's records are returned to .NET. If this is an incorrect assumption or understanding, please correct me.
Given the potential growth of our system, we're concerned about performance down the road. We're considering a SELECT ... FROM ... WHERE rownum > 1 AND rownum <= 50 style query in our databind, forcing the pagination to happen on the database side and executing on every page change, so that we're only working with a few hundred records at a time on the .NET side. Is this a false assumption regarding performance? Is there a performance issue with .NET performing the paging on large record sets?
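One caveat if you go this route: in Oracle, a bare WHERE rownum > 1 predicate returns no rows, because ROWNUM is assigned only as rows pass the filter, so the usual pattern nests the ordered query. A rough sketch, assuming ODP.NET and illustrative table/column names:

```csharp
using System.Data;
using Oracle.DataAccess.Client;

// Classic Oracle pagination: ROWNUM is assigned in the inner queries,
// then the outer query filters on the aliased rn column.
static DataTable GetPage(string connectionString, int pageIndex, int pageSize)
{
    const string sql = @"
        SELECT * FROM (
            SELECT t.*, ROWNUM rn FROM (
                SELECT id, name FROM orders ORDER BY id
            ) t
            WHERE ROWNUM <= :lastRow
        )
        WHERE rn > :firstRow";

    using (var conn = new OracleConnection(connectionString))
    using (var cmd = new OracleCommand(sql, conn))
    {
        cmd.BindByName = true;
        cmd.Parameters.Add("lastRow", (pageIndex + 1) * pageSize);
        cmd.Parameters.Add("firstRow", pageIndex * pageSize);

        var table = new DataTable();
        using (var adapter = new OracleDataAdapter(cmd))
        {
            adapter.Fill(table); // only pageSize rows come back
        }
        return table;            // bind this to the ListView
    }
}
```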
My final question: if we're better off doing our pagination in the database rather than in .NET, would we be better off storing our initial search results in a session-specific temporary table and using that as the basis of our paginated data, rather than running the above query repeatedly against the master table as we move through the records?
Thank you in advance for everyone's input.
I'm working on an ASP.NET website with Telerik controls. I'm using multiple conditional grids (each grid shows data based on the selection in another grid).
Every time I make a new selection it is somewhat slow (I'm using an AJAX call). Is it possible to preload all the data to the client and then show it to the user instantly?
I mean, is there any simple way of doing so?
There is a good chance that the slowness comes from the amount of data being rendered on your page. Keep in mind that AJAX still goes through the entire life cycle of the page; the savings come from rendering only the updated parts instead of the entire page.
Are your AJAX settings updating only the controls that need it, or do you have a massive 'pnlAllControls' updating 'pnlAllControls'?
For example, if you have Grid1, Grid2, and Grid3, and Grid1 updates (Grid2, Grid3) while Grid2 updates only (Grid3), you should set your AJAX settings accordingly.
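With Telerik's RadAjaxManager that mapping can be wired up in the code-behind; a minimal sketch, assuming grids named Grid1-Grid3 as above:

```csharp
using Telerik.Web.UI;

protected void Page_Load(object sender, EventArgs e)
{
    RadAjaxManager manager = RadAjaxManager.GetCurrent(Page);

    // Grid1 triggers partial updates of Grid2 and Grid3 only.
    manager.AjaxSettings.AddAjaxSetting(Grid1, Grid2);
    manager.AjaxSettings.AddAjaxSetting(Grid1, Grid3);

    // Grid2 triggers a partial update of Grid3 only.
    manager.AjaxSettings.AddAjaxSetting(Grid2, Grid3);
}
```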
If your controls are getting their data from the server, it does not make sense to cache the data on the client. I am not sure how much control you have over configuring them [the controls].
You can store/cache data on the server side instead (e.g. Cache, Session, etc.). Data retrieval from there should be fast unless you send tons of data back and forth. But caching data (on either the client or the server) should only be considered if the amount is (1) predictable and (2) relatively small.
Another technique to consider is paging/sorting on the server side. In my experience, you can get a real performance boost from this alone.
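As a rough sketch of what server-side paging looks like with a Telerik RadGrid (GetTotalCount and GetPage are assumed data-access helpers):

```csharp
using Telerik.Web.UI;

protected void RadGrid1_NeedDataSource(object sender, GridNeedDataSourceEventArgs e)
{
    // Tell the grid how many rows exist in total, but hand it only
    // the rows for the page it is about to render.
    RadGrid1.AllowCustomPaging = true;
    RadGrid1.VirtualItemCount = GetTotalCount();
    RadGrid1.DataSource = GetPage(RadGrid1.CurrentPageIndex, RadGrid1.PageSize);
}
```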
There is no simple answer to your question. It all depends on data volume and security requirements. Moreover, your controls may not be able to pull data from the client side.
I have a web app that consumes a web service. The main page runs a search by passing parameters to a particular web service method, and I bind the results to a GridView.
I have implemented sorting and paging on the grid by putting the DataTable that the grid is bound to in the view state and then reading/sorting/filtering it as necessary and rebinding to the grid.
As the amount of data coming back from the web service has increased dramatically, when I try to page/sort etc. I receive the following errors:
The connection was reset
The connection to the server was reset while the page was loading.
I have searched around a bit, and it seems that a very large viewstate is to blame for this.
But surely the only other options are to:
Limit the results
Stick the datatable in the session rather than the viewstate
Something else I am unaware of
Previously I did have the DataTable in the session, as some of this data needed to persist from page to page (it was not being posted, however, so view state was not an option). As the amount of data rose and the need to persist it went away, I used the view state instead, thinking this was a better option than the session because of the amount of data the session would have to hold and the number of users using the app.
It appears maybe not.
I thought that when the view state got very big, .NET split it across more than one hidden view state field, but it seems all I'm getting is one mammoth view state field that I have trouble even viewing in the source.
Can anyone enlighten me as to how to avoid the error I'm getting, if it is indeed caused by the amount of data in the view state?
It sounds like you're caching the whole dataset for all pages even though you are only presenting one page of that data. I would change your pagination to request only the data for the current page the user is on.
If the query is heavy and you don't want to keep calling it over and over because there is a lot of paging back and forth (you should test your typical usage pattern), then I would implement some type of caching on the web service end to cache page by page (keyed by user if the data is specific to a user) and have it expire rather quickly (e.g. a few minutes).
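Something along these lines, as a hedged sketch; FetchPageFromService stands in for the real data call:

```csharp
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

// Cache each page per user with a short absolute expiry, so paging
// back and forth doesn't re-run the heavy query every time.
private DataTable GetPageCached(string userId, int pageIndex, int pageSize)
{
    string key = string.Format("results:{0}:{1}:{2}", userId, pageIndex, pageSize);

    DataTable page = HttpRuntime.Cache[key] as DataTable;
    if (page == null)
    {
        page = FetchPageFromService(userId, pageIndex, pageSize);

        // Expire quickly so stale results don't linger.
        HttpRuntime.Cache.Insert(key, page, null,
            DateTime.UtcNow.AddMinutes(2), Cache.NoSlidingExpiration);
    }
    return page;
}
```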
I think you need to limit the total amount of data you're dealing with. Changing your code so it doesn't pass back extra data that might never be needed is a good place to start.
EDIT: Based on your comments:
You can't change the web service
The user can manipulate the query by filtering or sorting
There is a large amount of data returned by the web service
The data is user specific
Well, I think you have a perfect case for using the Session then. This can tax the server with large numbers of users and large amounts of data, so you might want to implement some logic to clear the data from the Session rather than waiting for it to expire (e.g. on certain landing pages where you know the user is done, clear the session data).
You really want to get it out of the ViewState because it is a huge bandwidth hog. Just look at your physical page size: that data is being passed back and forth with every action. Moving it to the Session would eliminate that bandwidth usage and still let you do everything you need.
You could also look at the data the web service brings back and store it in a custom object that you make as 'thin' as possible. If you're storing a DataSet or a DataTable in your Session, those objects carry some extra overhead you probably don't need, so store the data as an array of some custom thin object and just bind to that. You would need to map the result from the WS to your custom object, but this is a good option to cut down on memory usage (see the sketch below).
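A minimal sketch of that mapping; SearchRow and the column names are hypothetical, so match them to whatever the web service actually returns:

```csharp
using System;
using System.Data;
using System.Linq; // plus a reference to System.Data.DataSetExtensions

// The 'thin' object: just the fields you actually bind, nothing else.
[Serializable]
public class SearchRow
{
    public int Id;
    public string Name;
    public decimal Amount;
}

// Flatten the heavy DataTable into a lightweight array for the Session.
private SearchRow[] ToThinRows(DataTable table)
{
    return table.AsEnumerable()
        .Select(r => new SearchRow
        {
            Id = r.Field<int>("Id"),
            Name = r.Field<string>("Name"),
            Amount = r.Field<decimal>("Amount")
        })
        .ToArray();
}

// Usage: Session["SearchResults"] = ToThinRows(resultTable);
```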
Let me know if there is something else I am missing.
I wouldn't put the data in either the view state or the session. Instead, store the bare minimum information needed to re-request the dataset from the web service (in view state, session, or even on the URL), then call the web service with that data and rebuild the dataset on each request. If necessary, look at some form of caching (e.g. memcached) to improve performance.
How much difference will there be if I bind data to a GridView compared to looping through the data and building out the HTML myself?
I am currently using an HTML table in the ItemTemplate of a GridView, with <%#Eval("ID")%> in that table to bind the data from an IQueryable.
What if I loop through the IQueryable and build out the HTML from the code-behind instead? How big is the performance difference, if someone has done that comparison or has good knowledge of which way to go?
Thanks.
I am using ASP.NET/C#.
Generally speaking, the performance benefit of avoiding complex controls and data binding is not measurable at the individual page level, and is thus inconsequential. The developer time saved by using existing controls and simpler APIs, like data binding, greatly outweighs the small performance hit.
In our main application, we use complex controls and data binding throughout the ASP.NET page. The data binding portion of the full page life cycle is under 2% of the time to process the whole page. It's much less than the I/O for the page and particularly the DB calls.
One exception is reports. The reporting engine we use supports either setting data directly in a loop or using data binding. Data binding is much easier. However, with some reports hitting 200+ pages and over 300,000 bound data items, the performance hit of data binding was noticeable in that case. In our reports, we don't use data binding.
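For comparison, the hand-rolled approach from the question would look roughly like this; items, Item, and litResults (an asp:Literal) are illustrative names:

```csharp
using System.Linq;
using System.Text;
using System.Web;

// Build the table markup once in a StringBuilder instead of binding
// a templated GridView; HTML-encode anything user-visible.
void RenderRows(IQueryable<Item> items)
{
    var sb = new StringBuilder("<table>");
    foreach (Item item in items)
    {
        sb.Append("<tr><td>")
          .Append(HttpUtility.HtmlEncode(item.ID.ToString()))
          .Append("</td></tr>");
    }
    sb.Append("</table>");
    litResults.Text = sb.ToString();
}
```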
I have a couple of customer websites that have been in use for 4+ years. One page on both sites contains drop-downs that now hold 3000+ items. My initial attempts at a solution have been to add new pages to both sites, one using Silverlight and the other using AJAX.
The Silverlight page currently performs better than the AJAX page once the control has loaded, but it requires the user to have Silverlight or the permissions to install it. The AJAX version still requires an initial download of all of the data to populate the drop-downs when the site is first loaded.
The AJAX version also still uses view state and sends 400 KB+ to the server on every request.
What I would like to do is use the AJAX version but cache the drop-down data on the client and only download it once a day.
If I save the data using ASP.NET to isolated storage (I have that part sorted), is it possible to access it using client-side code such as jQuery?
Personally, there's no way I would use a drop-down for 3000+ items. Not only is there a problem with data transfer and view state, but it's also a pain for any user to scroll through that many items to find the option they require.
Have you considered something like this:
http://jquery.bassistance.de/autocomplete/demo/
You have a textbox that says something like 'start typing'... the second the user types the first letter of what they are looking for, an AJAX query is made grabbing all the entries that begin with that letter.
Given that there are 26 letters in the alphabet, you will return on average about 4% of the data, i.e. roughly 120 entries rather than 3000! Also, as users get to know your system, they can type more and more letters and find what they are looking for even faster. It beats scrolling through a list of 3000 entries, and it makes your application easier to use, more responsive, and more network-friendly!
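The server side of that lookup can be as small as a page method; a sketch, where LookupEntries is an assumed data-access call that filters by prefix in the database:

```csharp
using System.Linq;
using System.Web.Services;

public partial class SearchPage : System.Web.UI.Page
{
    // A static page method that jQuery can call via $.ajax/PageMethods.
    [WebMethod]
    public static string[] GetSuggestions(string prefix)
    {
        // Send back only the handful of entries matching what the user
        // has typed so far, never the full 3000+ item list.
        return LookupEntries(prefix).Take(20).ToArray();
    }
}
```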