Entity Framework MVC Slow Page Loads - asp.net

I have a web application running on Windows Azure.
It is built with ASP.Net 4.0, MVC 3, Entity Framework 4.1, SQL Server 2008 using the Repository pattern.
The app was performing very well until recently. Most of our customers have a few hundred rows of data but one of them is starting to reach 2000+. This has dramatically slowed their page load times (15 - 20 secs on some pages).
We just started using MiniProfiler, which indicates we have a very chatty app, with duplicate SQL calls.
In an effort to give as much detail as possible and to figure out how we can do things better, I'll explain some of the stuff we are doing.
We have a base controller that has two protected objects (CurrentUser and CurrentCompany). We use these quite a bit in our Actions, but they hit the DB each time. So, I am assuming we need to store these objects in session the first time. Is there a big overhead in lugging around these objects? What about accessing their relationships later (CurrentCompany.Offices.First(), etc.)?
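For illustration, here is a minimal sketch of what caching one of these objects in session might look like. BaseController and CurrentCompany come from the question above; ICompanyRepository and its GetByUserName method are made-up stand-ins for whatever the repository layer actually exposes:

public abstract class BaseController : Controller
{
    // Hypothetical repository dependency; the real lookup may differ.
    private readonly ICompanyRepository _companyRepository;

    protected BaseController(ICompanyRepository companyRepository)
    {
        _companyRepository = companyRepository;
    }

    protected Company CurrentCompany
    {
        get
        {
            // Check session first so the database is hit once per session,
            // not once per action.
            var company = Session["CurrentCompany"] as Company;
            if (company == null)
            {
                company = _companyRepository.GetByUserName(User.Identity.Name);
                Session["CurrentCompany"] = company;
            }
            return company;
        }
    }
}

One caveat: an entity cached this way is detached from the context that loaded it, so lazy loading of relationships such as CurrentCompany.Offices will throw once the original context is disposed. Eager-load the relationships you need before caching, or cache plain DTOs instead.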
We get the data for each page based on these objects: selecting, ordering, filtering their relationships, for instance:
CurrentCompany.Employees.Where(r => r.StatusId == Enums.Statuses.Active);
Here 'CurrentCompany.Employees' returns an EntityCollection but the Where changes it to IEnumerable. I heard that IQueryable is the way to go?
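To make the difference concrete, here is a sketch; the context and property names are assumptions based on the question. Filtering the EntityCollection runs in memory after every employee has been loaded, while filtering the context's set stays IQueryable so the filter is translated to SQL:

// Loads ALL employees for the company, then filters in memory (LINQ to Objects):
var active = CurrentCompany.Employees
    .Where(e => e.StatusId == (int)Enums.Statuses.Active);

// Stays IQueryable, so the Where is translated to SQL and only matching
// rows come back from the database:
var activeQuery = context.Employees
    .Where(e => e.CompanyId == CurrentCompany.Id
             && e.StatusId == (int)Enums.Statuses.Active);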
I have also read that EF is great to get things up and running quickly but you must do some tweaking to make sure it performs well with lots of data. From what I have read, it will bring back the entire row even if you only asked for 2 columns?
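That is the default behavior: querying for entities materializes every mapped column. A projection avoids it. In this sketch (property names assumed), only the two selected columns appear in the generated SELECT:

var employeeNames = context.Employees
    .Where(e => e.CompanyId == companyId)
    .Select(e => new { e.Id, e.Name })  // anonymous type, not a tracked entity
    .ToList();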
So, with all that in mind, can someone point out what I should be doing to make this scale for bigger clients? 2000+ rows is not that much, after all. Should we be using views/stored procs?
There are plenty of resources out there explaining how to get EF set up with basic selects, etc., but nothing much on scaling it once the data set gets bigger.
Any help would be greatly appreciated.
Brian

Strangely enough, the clue might be in the 2000 rows. SQL Server can change the way it accesses data once a query selects more than a small fraction of the dataset (roughly 0.1% in some cases), for instance abandoning an index seek in favor of a scan. You do not say whether you have appropriate indexes on the tables.
http://www.sqlteam.com/article/sql-server-indexes-the-basics
may help
If you run SQL Server Management Studio, there is a missing-index feature:
http://msdn.microsoft.com/en-us/library/ms345524.aspx
HTH
See also:
http://blogs.msdn.com/b/sqlazure/archive/2010/08/19/10051969.aspx

How much eager loading are you doing? That was one thing we found to be a major performance hit. To solve it, we started caching the properties using the EntLib Caching Application Block, and then merging them into the object from cache instead of getting them from the DB.
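Roughly, the pattern looked like the sketch below. The key scheme and the loadFromDb delegate are invented for illustration, assuming the Caching Application Block's CacheFactory.GetCacheManager / GetData / Add API:

using Microsoft.Practices.EnterpriseLibrary.Caching;

public Office GetOffice(int officeId, Func<int, Office> loadFromDb)
{
    var cache = CacheFactory.GetCacheManager();
    string key = "Office:" + officeId;

    // Serve from cache when possible; hit the database only on a miss.
    var office = cache.GetData(key) as Office;
    if (office == null)
    {
        office = loadFromDb(officeId);
        cache.Add(key, office);
    }
    return office;
}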

Based on my own experience, EF was very slow for me because of the LINQ layer.
I had a personal web site (a home page which displays multiple kinds of data, so I had to make multiple requests to the database server in LINQ). It was very slow. I tried to tweak the EF .edmx with lazy-loading options, etc. The only solution I found was to get rid of LINQ and rewrite it in ADO.NET; it takes more time to code, but I got rid of the speed problems.
LINQ queries must be translated to SQL, so there are a lot of steps before a query gets to the database, and when the data comes back (SELECT *), EF translates it into a list of object models.
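For comparison, the ADO.NET version of one such query looks roughly like this; the table, column, and variable names are invented for the example:

using System.Collections.Generic;
using System.Data.SqlClient;

var posts = new List<Post>();
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT Id, Title FROM Posts WHERE AuthorId = @authorId", conn))
{
    cmd.Parameters.AddWithValue("@authorId", authorId);
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // Read only the two columns the page needs, straight off the wire.
        while (reader.Read())
        {
            posts.Add(new Post { Id = reader.GetInt32(0), Title = reader.GetString(1) });
        }
    }
}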
Thanks.


Is it the right way to do the project with ASP.NET MVC 4 and stored procedures?

Can you please spend some time on this and help me? It is a big project, and if we make mistakes now we will repeat the same mistakes for the next 3 years.
We are starting a new big project in ASP.NET MVC 4, HTML5 and CSS3 (we prefer these because our application should target tablet, desktop, and mobile).
Because it's new, I feel we can start with the code-first approach (the latest): first creating classes, then preparing the DB (SQL Express) and deploying. We also have the support of EF Migrations, so I think that way the project becomes more successful and easier to maintain.
But there is a small concern from the manager. I don't know whether he is correct or I am wrong; he has good working experience, but he is also new to MVC. Below is my query; please look into this:
- The database with tables is ready.
- Now we have to start the project. (I think we can do code-first reverse engineering and start.)
But for that, our manager asked our team to prefer writing stored procedures and to use ASP.NET MVC 4.
My question is: is it the right way to do the project with that combination? The reason I am asking is that the videos/tutorials I have been watching online never show samples with that combination; all of them avoid stored procedures. Is there a problem, such as losing performance because we use stored procedures through EF? Or, as our PM says, will we get performance and easier maintenance because we use stored procedures in the backend?
I am totally confused. Please help me with this if you have a solution.
There is no issue using stored procedures in MVC; you can go for it. This would be worth reading regarding your performance question.
If the controller does not depend on the implementation of the data access layer, then no matter what you use, whether EF or stored procedures, you are good to go.
Darin Dimitrov has already explained it here.
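A minimal sketch of that decoupling (all type and member names here are hypothetical): the controller sees only an interface, so an EF-backed implementation can later be swapped for a stored-procedure-backed one without the controller changing:

public interface IEmployeeRepository
{
    IEnumerable<Employee> GetActive();
}

// EF-backed implementation; a stored-procedure-backed class could
// implement the same interface without the controller noticing.
public class EfEmployeeRepository : IEmployeeRepository
{
    public IEnumerable<Employee> GetActive()
    {
        using (var db = new CompanyContext())
            return db.Employees.Where(e => e.IsActive).ToList();
    }
}

public class EmployeesController : Controller
{
    private readonly IEmployeeRepository _repository;

    // Injected, e.g. by an IoC container registered with MVC's DependencyResolver.
    public EmployeesController(IEmployeeRepository repository)
    {
        _repository = repository;
    }

    public ActionResult Index()
    {
        return View(_repository.GetActive());
    }
}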
Those videos don't show examples of using stored procedures because they don't need stored procedures! All the results from SPs can be achieved with EF easily, with much less effort and wasted time.
However, you should consider 3 things:
1) It seems that you are all new to EF, so using EF is a potential technical risk for your project!
2) Keep in mind that EF adds some overhead to your app; if you're creating an app which works with huge amounts of data or has to handle many requests at once, maybe it's better for you to keep going with SPs.
3) On the other hand, if you get familiar with EF, it will help your project progress fast and easily because of its useful abstractions and conventions.
So, first know your project, then decide which way you should go...
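It is also worth knowing the two are not mutually exclusive: EF's DbContext can materialize entities from a stored procedure call, so a mostly code-first project can still use SPs for hot spots. A sketch, with the procedure, context, and entity names invented:

using System.Data.SqlClient;

using (var db = new CompanyContext())
{
    // Database.SqlQuery maps the procedure's result set onto the entity type.
    var orders = db.Database.SqlQuery<Order>(
        "EXEC dbo.GetOrdersForCustomer @customerId",
        new SqlParameter("@customerId", customerId)).ToList();
}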

Getting a handle on a huge classic asp project

I've been working with an ASP Classic site with over 500 files, some of which are used and some of which aren't, along with a database with hundreds of procs, functions and tables that is in the same shape.
I need a way to get a grip on the project so I can eventually migrate it. I don't have time yet to walk through every single page and look at the SQL (stored procedures are in the database and are called properly within the ASP pages), so I'm at a loss as to how to get a handle on this.
My immediate thought is to make ASP classes and put them into the pages as I go - they'd pretty much be used for getting and setting fields, validation, and sending recordsets into display functions.
Is this a reasonable approach? Am I missing some strategy?
How would you approach this? A migration to another platform is being considered, but is not feasible in the short term (the next couple of months).
You can try to compile the project using http://aspclassiccompiler.codeplex.com/, or you can migrate to ASP.NET MVC one page at a time (when needed), using a mix of both in the meantime.
My simple advice is: stop thinking about code. Spend more time with the UI, actually using it, and spend time examining the database schema in detail.
Edit
If you are trying to determine which pages are active, use IIS logging to harvest the distinct pages hit. Also do some scripting to collect the names of the files in the site and text-search all the files for any occurrence of those file names. This information should identify the parts of the site that are rarely used or dead.
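As a starting point, the scripting can be as blunt as the sketch below (written in C# here; the site path is a placeholder): list every .asp file and flag the ones whose name never appears in any other file, then cross-check the survivors against the distinct pages harvested from the IIS logs:

using System;
using System.IO;
using System.Linq;

class FindUnreferencedPages
{
    static void Main()
    {
        var root = @"C:\inetpub\wwwroot\legacy-site";   // placeholder path
        var files = Directory.GetFiles(root, "*.asp", SearchOption.AllDirectories);
        var contents = files.ToDictionary(f => f, f => File.ReadAllText(f));

        foreach (var file in files)
        {
            var name = Path.GetFileName(file);
            // A page is "possibly dead" if no other file ever mentions it.
            bool referenced = contents.Any(kv =>
                kv.Key != file &&
                kv.Value.IndexOf(name, StringComparison.OrdinalIgnoreCase) >= 0);

            if (!referenced)
                Console.WriteLine("Possibly dead: " + name);
        }
    }
}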
However, in all probability there will be considerable content in the "active" files which is also dead. Let me reiterate: do not actually add classes or refactor the code at this stage; concentrate on understanding what it does, not how it does it. Understanding the DB schema is a vital step, and so is understanding which UI interactions bring about specific changes in the DB.

Querying the Cache using Linq in asp.net

Search is the most used feature on our website, and the search query is the most CPU-intensive, complex and frequent query that executes on our db, causing heavy CPU usage on the db server. To reduce the load on the db we have been looking at various caching strategies. For now, we intend to use the ASP.NET Cache.
The idea is to have an in-memory db of the most frequently/recently created/accessed objects in the cache and then query that in-memory db using LINQ to come up with search results. My initial thought was to cache a List of the Users and then query or modify this List using LINQ. But given the complexities of multiple threads accessing or trying to modify a List, I was looking at other options.
Which is when I thought that instead of caching a List, I could cache the individual User objects with their Id as the key and try to query the Cache. At http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx I see that the Cache has an extension method AsQueryable, but I am not sure what this means. The Cache is a key-value store, so with AsQueryable will I be able to query the keys and get a set of User objects, or will I be able to query the User objects and get my desired result?
Before you start this you really need to have some measurability in place around it: there is no way to figure out whether your changes help or hurt without good, solid data to make that judgement on. Performance, especially performance at scale, isn't something you can think or guess your way through. You have to measure your way through it.
As for your solution, I think you might well make the problem worse, or at least create another problem here. Your database server is designed to handle arbitrary user queries across vast information sets efficiently. LINQ is awesome, but it is not really meant to be an ad-hoc search engine; it doesn't have the sorts of indexing capabilities one really expects from a search engine. Just because it can expose things as an IQueryable doesn't mean you should treat it that way. And even if you've got a way to efficiently search the cache, you've got another problem to get past: how do you identify what is most frequently used? And how do you manage the ASP.NET cache so it doesn't start ejecting things when it gets low on memory?
You would probably be better served here by:
Starting with some good old-fashioned database tuning: why are your queries so slow and expensive? Are you missing an index somewhere?
Caching the results page output, especially if your search URLs are GET-able, as that is pretty easy to manage. This is a great short-term solution if the site is melting (see the sketch after this list).
Building the search bits properly. Using LIKE %whatever% is not a proper search; full-text indexes in your database are a good start, and something like Lucene.Net is probably better.
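For the output-caching suggestion above, a minimal MVC sketch; the duration, parameter names, and ISearchService are illustrative assumptions. Identical GET searches within the cache window are served without touching the database:

public class SearchController : Controller
{
    private readonly ISearchService _searchService;   // hypothetical service

    public SearchController(ISearchService searchService)
    {
        _searchService = searchService;
    }

    // Cache each distinct (q, page) combination for 60 seconds.
    [HttpGet]
    [OutputCache(Duration = 60, VaryByParam = "q;page")]
    public ActionResult Index(string q, int page = 1)
    {
        return View(_searchService.Search(q, page));
    }
}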
No, I could not use AsQueryable to query the User objects and get the desired result I was looking for. So for now I will be using a static List, though I know I will have to change sooner rather than later.
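For reference, the non-generic Cache can still be queried with LINQ; the catch is that it enumerates as key/value DictionaryEntry pairs and every search scans every cached item, with none of a database's indexing. A sketch, where User and IsActive are assumed names:

using System.Collections;
using System.Linq;
using System.Web;

var activeUsers = HttpRuntime.Cache
    .Cast<DictionaryEntry>()          // the cache enumerates key/value pairs
    .Select(entry => entry.Value)
    .OfType<User>()                   // keep only the cached User objects
    .Where(u => u.IsActive)
    .ToList();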

ASP.NET MVC and Linq, when to use?

I just started working on an ASP.NET / C# application that is going to call a number of different procedures. The only thing these procedures do is create database table views, and the only thing I want to do is store the information in variables, pick out which columns I want to convert to JSON, and then build a JSON string. I've actually written code for that in C# already, which is smaller, but since I switched to ASP.NET MVC I'm a little unsure whether I should keep it or go with the whole LINQ thing.
I checked out the LINQ-to-SQL drag-and-drop functionality, and it instantly created about 200 lines of code with get and set methods and everything.
So my question is, is it still worth using Linq even for just extracting data? Eventually this data will be fed to a javascript timeline, which is where I was told MVC would be highly useful with regards to Ajax functionality.
Since you are only using LINQ-to-SQL for data retrieval, I can't think of a single reason NOT to fully utilize it. I've been working on an MVC 1.0 project since last April. During that time, I've had to quickly become familiar with a number of technologies, LINQ-to-SQL being one of them. Get comfortable with it, and also look at the repository pattern...you will be very content and things will go relatively smoothly.
Now, when you get to INSERTs and UPDATEs, things are going to get a little more sticky. LINQ-to-SQL is still up to the job, but you'll need to understand how things work internally a little better. I highly recommend "Pro LINQ (Language Integrated Query) in C# 2008" by Joseph C. Rattz, Jr. The sections covering LINQ-to-SQL easily take up over a third of the book with detailed examples.
As far as the JSON objects go, LINQ-to-SQL's biggest contribution is that it allowed me not to have to worry about creating specialized views or stored procedures just to handle those one-off-types of data retrievals. My current project has a database of 65 tables...NO stored procedures. I can do filters, unions, multi-level joins...and it's all maintained in the application. Sweet...
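For instance, a projection can be shaped to exactly what the timeline needs and handed straight to MVC's JSON serializer; the DataContext, table, and column names below are invented for illustration:

public ActionResult TimelineData()
{
    using (var db = new TimelineDataContext())
    {
        var events = db.Events
            .Where(e => e.IsVisible)
            .Select(e => new { e.Title, e.Start, e.End })  // only these columns are selected
            .ToList();

        // AllowGet is required for GET requests on MVC 2 and later.
        return Json(events, JsonRequestBehavior.AllowGet);
    }
}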
Yes, it's totally worth it!
LINQ to SQL provides you a great foundation to retrieve and save data with.
However, you'll need to implement your own Repository Pattern as you dig deeper into ASP.NET MVC.
And during the implementation of the repository and the required (and even custom, web-app-state-based) queries, you'll be very glad to have all the power that LINQ provides available at your fingertips.
LINQ only adds to the available toolkit you can lean on when creating code. Even if you are using LINQ in a trivial way now, implement it, get familiar with it, and take advantage of the power it gives you both on this project and future projects.
LINQ to SQL is quite good for creating select queries. As it appears that you need to create JSON objects from database views, it will be quite useful.

ASP.Net Data Driven Website Efficiency

I'm creating an ASP.NET website that displays large amounts of data. The data is served to me through a data access layer. From the data I'm getting, I'm building up large data tables and then displaying these using either GridViews or dynamically created web controls.
The problem I'm finding is that the website is slow when a lot of data is passed to it. I've read that data readers are the way to go, but I can't use a data reader directly against the SQL tables because I need to go through the data access layer.
I also don't have the option of partially filling the datatable as I need to apply a lot of sorting methods to the data to display what I need.
Any suggestions of ways to speed up data tables? or perhaps use something else that's more efficient?
Since you are "building up large data tables and then displaying these using either gridview's or dynamically created web controls", the network can be a bottleneck. See the answers to this similar SO question, which may be helpful.
Are you absolutely certain that the bottleneck is in the Web Application?
The first thing I would do would be to guess what the longest SQL query you execute on a slow page would be, then see how it runs in the query browser.
If it's slow, work on optimizing that.
Pulling 'large' amounts of data into a web application and doing sorting / filtering there is always going to be slow, depending on your definition of 'large'. If you can apply any sorting / filtering on the database server before you pull the data to your web application that should speed things up. You say you don't have the option to do this but sorting is something that database servers are made for, are you sure you can't make this work?
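If the data access layer can expose an IQueryable, or at least accept sort and page parameters, something like the following keeps the work on the database server and sends only one page of rows over the wire (all names are illustrative):

var page = db.Orders
    .Where(o => o.CustomerId == customerId)
    .OrderByDescending(o => o.CreatedOn)   // sorted by the database, not the web server
    .Skip((pageIndex - 1) * pageSize)
    .Take(pageSize)
    .ToList();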
You can use a distributed cache to cache your data: Memcached (http://www.danga.com/memcached/) or Velocity, the Microsoft distributed cache (http://www.microsoft.com/downloads/details.aspx?FamilyId=B24C3708-EEFF-4055-A867-19B5851E7CD2&displaylang=en).
The first thing you will want to do is pinpoint exactly which part of the process is slow. It might not be where you think it is. Do code profiling, or time the different parts, to determine exactly how much time each piece of code consumes during a request. In our case we found that the data layer (executing readers, populating object models) was really fast (with a couple of exceptions that were taken care of by indexes in the database), while we had some JavaScript on the client that was really slow.
So, start with measuring, then decide where to optimize.
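The timing itself can be as simple as a Stopwatch around each stage; the DAL call and shaping step below are placeholders:

using System.Diagnostics;

var sw = Stopwatch.StartNew();
var rows = dataAccessLayer.GetReportRows();          // placeholder DAL call
Trace.WriteLine("Data access: " + sw.ElapsedMilliseconds + " ms");

sw.Restart();
var table = BuildDataTable(rows);                    // placeholder shaping step
Trace.WriteLine("Shaping: " + sw.ElapsedMilliseconds + " ms");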
