ASP.Net Data Driven Website Efficiency

I'm creating an ASP.Net website that displays large amounts of data. The data is served to me through a data access layer. From that data I build up large DataTables and then display them using either GridViews or dynamically created web controls.
The problem I'm finding is that the website is slow when a lot of data is passed to it. I've read that data readers are the way to go, but I can't use a DataReader directly against the SQL table because everything has to go through the data access layer.
I also don't have the option of partially filling the DataTable, as I need to apply a lot of sorting to the data to display what I need.
Any suggestions of ways to speed up DataTables? Or perhaps something else that's more efficient?

Since you are “...building up large data tables and then displaying these using either gridview's or dynamically created web controls”, the network can be a bottleneck. See the answers to this similar SO question, which may be helpful.

Are you absolutely certain that the bottleneck is in the Web Application?
The first thing I would do is take the SQL query I'd guess is the longest-running on a slow page and see how it runs in the query browser.
If it's slow, work on optimizing that.

Pulling 'large' amounts of data into a web application and doing sorting/filtering there is always going to be slow, depending on your definition of 'large'. If you can apply any sorting/filtering on the database server before you pull the data to your web application, that should speed things up. You say you don't have the option to do this, but sorting is something that database servers are made for. Are you sure you can't make this work?
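As a minimal sketch of that idea (the table, columns, and connection details here are hypothetical), the data-access method can take the filter as a parameter so the server does the WHERE and ORDER BY, and only the rows you actually display ever cross the network:

using System.Data;
using System.Data.SqlClient;

public static class OrderData
{
    // Let SQL Server filter and sort; only the result rows come back.
    public static DataTable GetActiveOrders(string connectionString, int statusId)
    {
        const string sql =
            @"SELECT OrderId, CustomerName, Total
              FROM dbo.Orders
              WHERE StatusId = @statusId -- filter on the server
              ORDER BY Total DESC";      // sort on the server, not in the page

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        using (var adapter = new SqlDataAdapter(command))
        {
            command.Parameters.AddWithValue("@statusId", statusId);
            var table = new DataTable();
            adapter.Fill(table); // Fill opens and closes the connection itself
            return table;
        }
    }
}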

You can use a distributed cache to cache your data, such as Memcached (http://www.danga.com/memcached/) or Microsoft's Velocity distributed cache (http://www.microsoft.com/downloads/details.aspx?FamilyId=B24C3708-EEFF-4055-A867-19B5851E7CD2&displaylang=en).

The first thing you will want to do is pinpoint exactly which part of the process is slow. It might not be where you think it is. Profile your code, or time the different parts, to determine exactly how much time each piece of code consumes during a request. In our case we found that the data layer (executing readers, populating object models) was really fast (with a couple of exceptions that were taken care of by indexes in the database), while some JavaScript on the client was really slow.
So, start with measuring, then decide where to optimize.
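A minimal sketch of that kind of measurement in WebForms (the grid, column name, and data-access stub are placeholders for your own code): wrap each phase of the request in a Stopwatch so you can see whether the fetch, the in-memory sort, or the data binding is eating the time.

using System;
using System.Data;
using System.Diagnostics;

public partial class ReportPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        var sw = Stopwatch.StartNew();
        DataTable data = GetLargeDataTable(); // stands in for your data layer call
        long fetchMs = sw.ElapsedMilliseconds;

        sw.Restart();
        data.DefaultView.Sort = "Total DESC"; // the in-memory sorting step
        long sortMs = sw.ElapsedMilliseconds;

        sw.Restart();
        ResultsGrid.DataSource = data.DefaultView; // ResultsGrid: a GridView in the markup
        ResultsGrid.DataBind();
        long bindMs = sw.ElapsedMilliseconds;

        // Shows up in the page trace / trace.axd when tracing is enabled.
        Trace.Write(string.Format("fetch={0}ms sort={1}ms bind={2}ms",
                                  fetchMs, sortMs, bindMs));
    }

    private DataTable GetLargeDataTable()
    {
        // Stub standing in for the real data access layer.
        var table = new DataTable();
        table.Columns.Add("Total", typeof(decimal));
        return table;
    }
}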


Entity Framework MVC Slow Page Loads

I have a web application running on Windows Azure.
It is built with ASP.Net 4.0, MVC 3, Entity Framework 4.1, SQL Server 2008 using the Repository pattern.
The app was performing very well until recently. Most of our customers have a few hundred rows of data but one of them is starting to reach 2000+. This has dramatically slowed their page load times (15 - 20 secs on some pages).
We just started using MiniProfiler, which indicates we have a very chatty app, with duplicate SQL calls.
In an effort to give as much detail as possible, and to figure out how we can do things better, I'll explain some of the stuff we are doing.
We have a base controller that has two protected objects (CurrentUser and CurrentCompany). We use these quite a bit in our actions, but each use hits the DB. So I am assuming we need to store these objects in session the first time. Is there a big overhead in carrying these objects around? What about accessing their relationships later (CurrentCompany.Offices.First(), etc.)?
We get the data for each page based on these objects: selecting, ordering, filtering their relationships, for instance:
CurrentCompany.Employees.Where(r => r.StatusId == Enums.Statuses.Active);
Here 'CurrentCompany.Employees' returns an EntityCollection, but the Where changes it to IEnumerable. I heard that IQueryable is the way to go?
I have also read that EF is great for getting things up and running quickly, but that you must do some tweaking to make sure it performs well with lots of data. From what I have read, it will bring back the entire row even if you only asked for 2 columns?
So, with all that in mind, can someone point out what I should be doing to make this scale for bigger clients? 2000+ rows is not that much, after all. Should we be using views/stored procs?
There are plenty of resources out there explaining how to get EF set up with basic selects, etc., but nothing really on scaling it once the data set gets bigger.
Any help would be greatly appreciated.
Brian
Strangely enough, the clue might be in the 2000 rows. SQL Server changes the way it accesses data once a query touches more than about 0.1% of the dataset. You do not say whether you have appropriate indexes on the tables.
http://www.sqlteam.com/article/sql-server-indexes-the-basics may help.
If you run SQL Server Management Studio, there is also a missing-index feature:
http://msdn.microsoft.com/en-us/library/ms345524.aspx
See also:
http://blogs.msdn.com/b/sqlazure/archive/2010/08/19/10051969.aspx
HTH
How much eager loading are you doing? That was one thing we found to be a major performance hit. To solve it, we started caching the properties using the Entlib Caching Application Block, and then merging them into the object from the cache instead of getting them from the DB.
Based on my own experience, EF is very slow because it uses LINQ (very slow).
I had a personal web site (a home page which displays multiple kinds of data, so I had to make multiple requests to the database server in LINQ). It was very slow. I tried to tweak the EF .edmx with lazy-loading options, etc. The only solution I found was to get rid of LINQ and rewrite it in ADO.NET. It takes more time to code, but I got rid of the speed problems.
LINQ requests must be translated to SQL, so there are a lot of steps before anything gets to the database, and when the data comes back (select *), EF translates it into a list of object models.
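If you do stay on EF, the "select *" concern raised above has a standard workaround worth noting here. A minimal sketch (the context, entity, and property names are hypothetical): querying through the context keeps the expression IQueryable, so a Select projection makes EF generate a two-column SELECT instead of pulling whole entities back and filtering them in memory the way CurrentCompany.Employees.Where(...) does.

using System.Linq;

// 'context' is assumed to be your EF ObjectContext/DbContext instance.
var rows = context.Employees                       // IQueryable<Employee>
    .Where(e => e.StatusId == (int)Enums.Statuses.Active)
    .OrderBy(e => e.LastName)
    .Select(e => new { e.EmployeeId, e.LastName }) // translated to a 2-column SELECT
    .ToList();                                     // the single SQL round trip happens here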

SQL Server database or XML, what to choose for asp.net app?

I have a database with about 10,000 records. Each record has one text field (200 chars or so) and about 30 numeric fields.
The asp.net app only does some searching and sorting and displaying data in a grid. No data sharing between users, read only operations (no database updating), very little calculation.
Should I go with an XML file and use Linq or should I use an SQL Server database? Please give me explanations of your choice.
"Should I go with an XML file and use Linq or should I use an SQL Server database?"
TOTAL non-issue - SQL.
"The asp.net app only does some searching and sorting"
Read up in a beginner SQL book on what an "INDEX" is. XML files have none, so SQL databases are a lot more efficient at sorting and filtering.
It really depends on your needs. Ask yourself the following questions:
Is your data set going to increase?
Is speed one of the most desired qualities of your app?
Are you going to run complex queries?
Is the schema of your data going to change?
If the answer to most of the questions above is 'no', then feel free to use XML. SQL provides lots of features and is mainly intended for data storage and retrieval, while with XML you can store data, but I would say its main use is data interoperability and exchange.
If your data set increases, then SQL should be the choice, because you can create indexes on your data set, which will increase the speed of retrieval; files are usually read serially and thus are slower for ad-hoc data searches.
I think you'll find SQL to be much easier to develop and maintain. XML is great in some scenarios, but I've found it often presents a steady stream of headaches in the long term.
From a performance perspective alone, it's hard to say which approach would be better without knowing the details of your queries and schema. In general, though, SQL tends to win, since it's built for searching and sorting, whereas XML is not.
For a read-only dataset of that size, you might be able to read the whole thing into memory on the web server side and do your searching and sorting there.
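A minimal sketch of that last suggestion (the record shape and the loader are hypothetical): load the ~10,000 read-only rows once into a static list, then search and sort with LINQ-to-Objects on each request.

using System;
using System.Collections.Generic;
using System.Linq;

public static class RecordStore
{
    // Loaded once, on first use; Lazy<T> is thread-safe by default.
    private static readonly Lazy<List<Record>> _all =
        new Lazy<List<Record>>(LoadFromDatabase);

    public static List<Record> Search(string term)
    {
        return _all.Value
            .Where(r => r.Text.IndexOf(term, StringComparison.OrdinalIgnoreCase) >= 0)
            .OrderBy(r => r.Text)
            .ToList();
    }

    private static List<Record> LoadFromDatabase()
    {
        // ... fill from the database (or the XML file) here ...
        return new List<Record>();
    }
}

public class Record
{
    public string Text { get; set; }    // the ~200-char text field
    public decimal Value1 { get; set; } // one of the ~30 numeric fields
}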

Querying the Cache using Linq in asp.net

Search is the most used feature on our website, and the search query is the most CPU-intensive, complex and frequent query that executes on our DB, causing heavy CPU usage on the DB server. To reduce the load on the DB we have been looking at various caching strategies. For now, we intend to use the ASP.NET Cache.
The idea is to have an in-memory DB of the most frequently/recently created/accessed objects in the cache, and then query the in-memory DB using LINQ to come up with search results. My initial thought was to cache a List of the Users and then query or modify this List using LINQ. But given the complexities of multiple threads accessing or trying to modify a List, I was looking at other options.
Which is when I thought that, instead of caching a List, I could cache the individual User objects with their Ids as keys, and try to query the Cache. At http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx I see that the Cache has an extension method AsQueryable, but I am not sure what this means. The Cache is a key-value store, so with AsQueryable will I be able to query the keys and get a set of User objects, or will I be able to query the User objects and get my desired result?
Before you start this you really need to have some measurability in place around it -- there is no way to figure out if your changes help or hurt without good, solid data to make that judgement on. Performance, especially performance at scale, isn't something you can think or guess your way through. You have to measure your way through it.
As for your solution, I think you might well make the problem worse, or at least create another problem here. Your database server is theoretically designed to handle arbitrary user queries across vast information sets efficiently. LINQ is awesome, but it is not really meant to be an ad-hoc search engine -- it doesn't have the sorts of indexing capabilities one really expects from a search engine. Just because it can expose things as an IQueryable doesn't mean you should treat it that way. And even if you've got a way to efficiently search the cache, you've got another problem to get past -- how do you identify what is most frequently used? And how do you manage the ASP.NET cache so it doesn't start ejecting things when it gets low on memory?
You would probably be better served here by:
Starting with some good old fashioned database tuning -- why are your queries so slow and expensive? Are you missing an index somewhere?
Looking at caching the results page output, especially if your search URLs are GET-able, as that is pretty easy to manage (see the sketch after this list). This is a great short-term solution if the site is melting.
Looking at building the search bits properly. Using LIKE %whatever% is not a proper search. Full-text indexes in your database are a good start. Something like Lucene.NET is probably better.
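On the second point, a minimal sketch of output caching for a WebForms search page (the query-string parameter name "q" is an assumption): every distinct search term gets its own cached copy of the rendered page, so repeated popular searches never touch the database.

using System;
using System.Web;

public partial class SearchResultsPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // The programmatic equivalent of an <%@ OutputCache %> directive.
        Response.Cache.SetCacheability(HttpCacheability.Server); // keep copies server-side
        Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(5)); // short TTL
        Response.Cache.SetValidUntilExpires(true);
        Response.Cache.VaryByParams["q"] = true; // one cache entry per search term

        // ... run the search and bind the results as usual ...
    }
}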
No, I could not use AsQueryable to query the User objects and get the desired result I was looking for. So now I will be using a static List for the time being, though I know I will have to change sooner rather than later.

How to improve page loading delays?

I am using the GridView control of .NET Framework 4.0. My list contains 1000s of rows which I am binding to a GridView on each postback. Hence, my page is taking time to load, and I want to speed up the system. Is there any other control available which can enhance the performance, or is there any other way to achieve this?
All I want is faster performance.
You might be able to use faster controls, such as the Repeater, but it depends on what features you really need. Are you only displaying the data, or is it editable?
With such a large amount of data, you can also look at optimizing the HTML you use for rendering; you may be able to cut the page size in half...
The first step should be to do a performance check to find out what exactly is slow.
Check where in the code things take time; it could be one of many things.
1) If the control uses JavaScript, perhaps the users are on an old version of their browser with a slow JavaScript engine.
2) Perhaps the issue is bandwidth?
3) Perhaps it's a missing SQL index.
And on it goes.
Don't guess at what is wrong; find out for sure what is taking the time, and solve the issues one at a time.
Like Forgotten Semicolon suggested, loading via Ajax might be a solution; it would give the user a better idea of what is going on.
Other than that, I would heavily suggest caching if possible; you can use the built-in Cache options to cache the DataTable.
In addition to the point about not binding on every postback: I am assuming the data source is an SQL database. You should probably check how fast the query runs, and make sure you have properly indexed your tables.
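Putting those two suggestions together, a minimal sketch (the grid, cache key, and data-access stub are hypothetical): bind only on the first load, and keep the DataTable in the built-in ASP.NET cache so neither postbacks nor other users' requests hit the database again.

using System;
using System.Data;
using System.Web.Caching;

public partial class ListPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack) // don't rebind on every postback
        {
            BindGrid();
        }
    }

    private void BindGrid()
    {
        var data = (DataTable)Cache["RowList"];
        if (data == null)
        {
            data = LoadRowsFromDatabase();
            Cache.Insert("RowList", data, null,
                         DateTime.UtcNow.AddMinutes(10), // absolute expiry
                         Cache.NoSlidingExpiration);
        }
        RowsGrid.DataSource = data; // RowsGrid: the GridView in the markup
        RowsGrid.DataBind();
    }

    private DataTable LoadRowsFromDatabase()
    {
        // ... fill from your data layer here ...
        return new DataTable();
    }
}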

Ajaxified auto suggest

I have a search module with an Auto Suggest feature to build in ASP.Net.
The search criterion is Training Name, and there is a table in the database that stores trainings. There could be as many as 30,000 trainings in the table, so I have to be very careful in selecting the approach, keeping performance in mind.
There could be about 3000 users logged into the system simultaneously. When a user starts typing a training name, the system should auto-suggest.
The approaches that came to my mind were as follows:
Cache object - There would be a database hit after the user types 3 characters (e.g. 'saf'), and the system would search the activity table for all trainings starting with 'saf' and cache them. Subsequent requests would go through this cache.
The problem with this approach is that if there are 3000 concurrent users using the system, and they all search for different combinations of 3 letters, the cache would just blow up.
Client-side caching - Did not think much on this. The only drawback I see here is that we might have to purge the temporary internet folder periodically.
Using Session - I thought to rule this out completely, as I thought it would hurt performance.
Can you please suggest the best, or any other, approach I can take here? I am looking for all the information/ideas you have on this.
Thank you so much
Deepa.
My favourite jQuery plug-in for doing that (if you intend to use jQuery) is Flexbox.
It has a really impressive list of features.
You could use the jQuery Auto Complete plugin, which has caching features built in.
// Wire up autocomplete against a generic handler once the DOM is ready.
$(document).ready(function () {
    $(".landingpage").autocomplete('/AutoSuggestHandler.ashx', {
        minChars: 1,     // start requesting suggestions after one character
        matchSubset: 1,  // reuse cached results client-side for longer prefixes
        autoFill: false,
        delay: 10,       // milliseconds to wait before firing the request
        scroll: false
    }).result(OnResultSelected); // callback invoked when an item is chosen
});
Furthermore, you could specify output caching on the generic handler to accommodate the need for caching across users.
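A minimal sketch of what that could look like inside the handler (the handler name matches the jQuery snippet above; the lookup method and the "q" parameter the plugin sends are assumptions): the response is cached once per prefix and shared across all users.

using System;
using System.Web;

public class AutoSuggestHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string prefix = context.Request.QueryString["q"];

        // One cached response per prefix, shared across users and browsers.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(10));
        context.Response.Cache.VaryByParams["q"] = true;

        context.Response.ContentType = "text/plain";
        context.Response.Write(LookUpTrainings(prefix)); // hypothetical lookup

    }

    private static string LookUpTrainings(string prefix)
    {
        // ... query (or read from cache) the trainings table here ...
        return string.Empty;
    }

    public bool IsReusable { get { return true; } }
}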
I think your first approach will work.
Make sure there is an index on the field - you probably won't need to index the whole field. This should give the database a decent boost. You may need to look at full-text indexing depending on how your search works, or even use an external library like Lucene for the index if performance is an issue.
Cache the object, or even the resulting XML/JSON from the queries, to improve performance.
You should also set the HTTP headers so that browsers cache the XML/JSON as well.
Your posting really contains two questions:
How can I get autocomplete on my webpage?
I am concerned about performance due to a large number of queries hitting my database at the same time.
My answers...
1: We've found the ASP.NET AJAX AutoComplete Extender works well on all modern browsers, provides a slick user experience and is pretty easy to implement.
In your web application you need to create a web service that has a method with a specific signature (covered in the documentation linked to above).
2: Have you proven that you actually have a performance bottleneck with this part of your project? I'd recommend setting up a test harness and hitting your database with a large number of autocomplete queries to see how much it can take. Be wary of premature optimization.
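A minimal sketch of such a harness (the URL, prefixes, and request count are all made up): fire a batch of concurrent autocomplete-style requests and time them, so you have real numbers before deciding anything.

using System;
using System.Diagnostics;
using System.Net;
using System.Threading.Tasks;

class AutoCompleteLoadTest
{
    static void Main()
    {
        const string url = "http://localhost/AutoSuggestHandler.ashx?q=";
        string[] prefixes = { "saf", "net", "sql", "man", "dev" };

        var sw = Stopwatch.StartNew();
        Parallel.For(0, 1000, i =>
        {
            using (var client = new WebClient())
            {
                client.DownloadString(url + prefixes[i % prefixes.Length]);
            }
        });
        sw.Stop();

        Console.WriteLine("1000 requests in {0} ms", sw.ElapsedMilliseconds);
    }
}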
