When populating a grid in the UI layer, the UI asks the BI layer for a list of results. EF returns the entities, each of which is cast into a DTO that pulls in some additional information; the DTOs are then gathered into a list and returned to the UI layer.
The performance is impossibly slow: EF is creating a new context and hitting the database for each individual result. This is because the DTO class creates a new DbContext each time it is initialized if one is no longer open/active, and finalization of the class closes the context out. I believe this is what is killing performance.
Is there any way to batch something like this? In SQL I would perform a JOIN on the tables I need so the resulting data is loaded into a dataset. In EF, when I create the DTO I access the mapped object's relations and pull data from the other objects that way.
How should I access a large number of records via EF to be returned to a UI-layer grid when I need some information that is not stored in that particular entity object? (An example of this would be a relation between users and customers via the customer_userID -> userID PK: once I have the Customer object and want to display the user's name, I then have to query the User object for the name matching that ID.)
Does anyone have any articles that can point me in the right direction?
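The usual fix is to project straight into the DTO in a single query, so EF translates the navigation-property access into one SQL JOIN instead of a query per row. A minimal sketch, where the Customer/User entities, MyDbContext, and CustomerGridDto are assumed names rather than the real model:

    // Assumed names throughout: MyDbContext, Customer, and a Customer.User navigation property.
    public class CustomerGridDto
    {
        public int CustomerId { get; set; }
        public string CustomerName { get; set; }
        public string UserName { get; set; }   // comes from the related User row
    }

    public IList<CustomerGridDto> GetCustomersForGrid()
    {
        using (var context = new MyDbContext())
        {
            // The whole projection runs as one SQL statement with a JOIN,
            // so no per-row contexts or round trips are created.
            return context.Customers
                .Select(c => new CustomerGridDto
                {
                    CustomerId = c.CustomerId,
                    CustomerName = c.Name,
                    UserName = c.User.Name
                })
                .ToList();
        }
    }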
Functions that passed a large amount of data to the UI layer were causing the issues, often because the object from the DB layer had to have some operations performed on it before it could be passed up. In essence, some of the list-generating functions were causing the store to create a new context for each individual request.
Just-in-time paging was one performance boost: the UI requests only a starting offset and a record count. What is important to note is that we also had to create functions that simply return the total counts, so the UI grids knew how many records they were dealing with.
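Roughly what the paging functions looked like, reusing the projected CustomerGridDto sketched earlier (names are illustrative; EF requires an OrderBy before Skip/Take):

    public IList<CustomerGridDto> GetCustomerPage(int startRowIndex, int pageSize)
    {
        using (var context = new MyDbContext())
        {
            return context.Customers
                .OrderBy(c => c.Name)          // a stable ordering is required before Skip/Take
                .Skip(startRowIndex)
                .Take(pageSize)
                .Select(c => new CustomerGridDto
                {
                    CustomerId = c.CustomerId,
                    CustomerName = c.Name,
                    UserName = c.User.Name
                })
                .ToList();
        }
    }

    // Separate count function so the grid knows the total record count.
    public int GetCustomerCount()
    {
        using (var context = new MyDbContext())
        {
            return context.Customers.Count();
        }
    }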
The next fix was to the functions that apply BI rules to the objects before they are passed to the UI layer. In these cases we open a new context and pass it into the function, so the function uses that context and we close it only after all the results are complete.
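A sketch of that shared-context pattern (method and rule names are made up): the caller owns the single context and disposes it only after everything is done.

    public IList<CustomerGridDto> GetCustomersWithRules()
    {
        using (var context = new MyDbContext())
        {
            // Pass the already-open context in; the called function must not create its own.
            return context.Customers
                .ToList()
                .Select(c => ApplyBusinessRules(c, context))
                .ToList();
        }   // the context is closed once, after all the results are complete
    }

    private CustomerGridDto ApplyBusinessRules(Customer customer, MyDbContext context)
    {
        // Hypothetical rule: fall back to the related user's name when the
        // customer has no display name of its own.
        var name = string.IsNullOrEmpty(customer.Name)
            ? context.Users
                .Where(u => u.UserId == customer.CustomerUserId)
                .Select(u => u.Name)
                .FirstOrDefault()
            : customer.Name;

        return new CustomerGridDto
        {
            CustomerId = customer.CustomerId,
            CustomerName = name
        };
    }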
I am trying to develop a three-tier architecture.
Data Access Layer: Will have methods for the connection string, executing stored procedures, executing SELECT queries, etc. In most cases this will return a DataSet.
Business Layer: Accesses the DataSet from the Data Access Layer and provides sorting and filtering of the data to the Web Forms.
Presentation Layer: Will have all web pages and user controls (if any). This layer can access only the Business Layer.
This worked OK for me while the requirement was limited to displaying records. However, when it comes to paging or sorting, I have to bind fresh data each time, resulting in unnecessary database hits. To avoid this, I have stored the DataSet in the cache and typecast it back to a DataSet object when needed. Are there any alternatives for this?
I ended up binding the grid to List objects rather than to a DataSet, initializing the list the first time to get the required data.
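In Web Forms, one alternative to caching the DataSet is to keep that list in the ASP.NET application cache and only hit the database when the cache is empty; sorting and paging then work against the in-memory list. A minimal sketch in a page's code-behind, where Product, GetProductsFromBusinessLayer, and ProductGrid are assumed names:

    private const string ProductsCacheKey = "AllProducts";

    private List<Product> GetProducts()
    {
        // Check the application cache first so postbacks for sorting/paging
        // don't hit the database again.
        List<Product> products = Cache[ProductsCacheKey] as List<Product>;
        if (products == null)
        {
            products = GetProductsFromBusinessLayer();   // hypothetical BLL call
            Cache.Insert(ProductsCacheKey, products, null,
                         DateTime.UtcNow.AddMinutes(10),                   // absolute expiration
                         System.Web.Caching.Cache.NoSlidingExpiration);
        }
        return products;
    }

    protected void BindGrid(string sortExpression)
    {
        // Copy the list so the cached version is never re-ordered in place.
        List<Product> view = new List<Product>(GetProducts());
        if (sortExpression == "Name")
        {
            view.Sort(delegate(Product x, Product y) { return string.Compare(x.Name, y.Name); });   // sort in memory, no DB hit
        }

        ProductGrid.DataSource = view;
        ProductGrid.DataBind();
    }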
I have a simple ASP.NET MVC list view that takes a custom-built model object. The first time through I need to go out to the server, return a list of objects, and display them in the view.
I am building the view to allow sorting by different columns, searching, and paging, and I have written all the code for this. However, every time I am going back to the server and pulling the data again.
Can I cut down on these DB round trips by reusing the list that I obtained the first time?
If so, how do I pass it from the view back to the controller?
ViewData, TempData, or perhaps pass the FormCollection?
Take a look at http://www.knockoutjs.com; this will give you a lot of functionality for manipulating the list in the browser and keeping the view in sync.
But it really depends on how big your list of objects is. If the quantity of data is large, the way you have already implemented it is actually the more practical solution.
Actually, if you go back to your controller you'll be going back to the server.
I suppose you really mean that you don't want to re-query your database to obtain filtered, sorted, and paginated data, and would like instead to sort or paginate the data already on the view using your view model classes.
Keep in mind that this type of operation isn't always better than re-querying your database: you'll be sending more data over the network back to the server, and programmatically sorting list-like structures is usually less optimized than a sorted retrieval from the database.
The critical trade-off here is between the cost of your database query and the size of your list view. If your query is light and returns (or can return) many results, sorting in memory will be more expensive than re-querying; if your query is complex and usually returns few results, it will be more efficient to sort the data without re-querying the database.
Try creating a new controller method for the sort. This method will receive your list view model class as a parameter, and you'll need to send that information back to your server somehow. I usually use AJAX calls that pass the data as JSON to the controller.
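Roughly how the controller side of that can look when the view model is posted back (the model and action names here are made up; on older MVC versions a JSON value provider is needed before the binder can read a JSON body):

    public class ItemListViewModel
    {
        public string SortColumn { get; set; }
        public List<ItemRow> Items { get; set; }
    }

    public class ItemRow
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    [HttpPost]
    public ActionResult Sort(ItemListViewModel model)
    {
        // Sort the rows that were already loaded instead of querying the database again.
        model.Items = model.SortColumn == "Name"
            ? model.Items.OrderBy(i => i.Name).ToList()
            : model.Items.OrderBy(i => i.Id).ToList();

        return PartialView("_ItemList", model);
    }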
I am writing an ASP.NET application with a SQL Server database in which I have to calculate rates for members of my application. These calculations affect more than 10 database tables.
I believe I have two options :
In the data access layer, fetch the data for a member from the first table in the database and return it to the business layer. Once there, perform some calculation on the fetched data to generate new data to be saved in a second table. Finally, call back into the data access layer to save the result. Afterwards, repeat this whole process, pulling information from the second table to calculate results to be saved in a third table, and keep doing this for all necessary tables.
Use a stored procedure to encapsulate calculating and saving the new rates for a member in the correct tables within a database transaction.
Which option do you think is best and why? If I use the first option, how should I perform all of the inserts and updates in one ADO.NET transaction from the business logic layer? I am currently barred from using an ADO.NET transaction outside of the data access layer.
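If the first option is chosen, one common way to make the separate save calls atomic from the business layer, without passing an ADO.NET transaction around, is System.Transactions.TransactionScope: each data access call enlists in the ambient transaction automatically. A rough sketch with made-up repository and calculation names:

    using System.Transactions;

    // rateRepository and the Calculate* methods below are hypothetical stand-ins
    // for the existing data access layer and business calculations.
    public void RecalculateRates(int memberId)
    {
        using (TransactionScope scope = new TransactionScope())
        {
            // Each DAL call opens its own connection but enlists in the same
            // ambient transaction, so every insert/update commits or rolls back together.
            var baseData = rateRepository.GetBaseData(memberId);

            var firstStageRates = CalculateFirstStage(baseData);
            rateRepository.SaveFirstStage(memberId, firstStageRates);

            var secondStageRates = CalculateSecondStage(firstStageRates);
            rateRepository.SaveSecondStage(memberId, secondStageRates);

            scope.Complete();   // if this is never reached, Dispose rolls everything back
        }
    }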
IMO it depends on how much priority needs to be given to performance versus modular design.
If this were a trading application where I had to calculate the prices instantaneously from different tables, I would use a stored procedure.
It gives better performance up to a certain amount of load, but when there are too many queries a distributed database becomes essential.
One disadvantage is that if you want to move to a different database in the future (in most cases people don't), you have to port the stored procedures to the new one.
The other option would be to keep most of the values in a distributed cache (like memcached) and do the calculations in the business layer.
The values will be pre-populated into the cache and the cache will be refreshed as and when there are changes.
This gives flexibility in terms of removing the database dependency.
But it seems to me that your operations are quite heavily DB-dependent in functionality, which suggests the stored procedure route. If you think the second option of caching is feasible, it is worth a try.
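If the cache route is attempted, the usual shape is cache-aside with an explicit refresh on change. A sketch using a deliberately hypothetical cache interface rather than any specific memcached client API:

    // Hypothetical abstraction; in practice this would wrap a memcached (or similar) client.
    public interface IRateCache
    {
        bool TryGetRate(int memberId, out decimal rate);
        void SetRate(int memberId, decimal rate);
    }

    public class RateService
    {
        private readonly IRateCache cache;

        public RateService(IRateCache cache)
        {
            this.cache = cache;
        }

        public decimal GetRate(int memberId)
        {
            decimal rate;
            // Cache-aside: use the pre-populated value when present,
            // otherwise calculate in the business layer and store it.
            if (cache.TryGetRate(memberId, out rate))
            {
                return rate;
            }

            rate = CalculateRate(memberId);   // hypothetical business-layer calculation
            cache.SetRate(memberId, rate);
            return rate;
        }

        public void OnRateInputsChanged(int memberId)
        {
            // Refresh the cached value whenever the underlying tables change.
            cache.SetRate(memberId, CalculateRate(memberId));
        }

        private decimal CalculateRate(int memberId)
        {
            // Placeholder for the real calculation across the 10+ tables.
            return 0m;
        }
    }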
I have a business logic layer object for my customers. It has a lot of different fields, around 100: Id, Name, and so on.
I'm displaying the customers in a data grid (RadGrid). Obviously, in the grid I'm only showing a few of the fields.
The question is: if the business logic layer object has too many fields, is it going to slow down the page even if I don't show all of them in the data grid? Do you think it'd be a good idea to create a separate object for my customers just for the lists?
Thanks
It will take extra time to populate the server-side list, but the key consideration is the amount of data passed to the client, especially as a copy of the data will probably go into view state as well, and when submitting the form that data will come back (through view state).
Rather than create a new object, you could just use LINQ on your business objects to reduce the amount of data to pass forward.
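For example, projecting just the columns the grid actually shows before binding keeps both the bound data and the view state small (the field names and RadGrid1 are assumptions):

    // "customers" is the already-loaded list of full business objects.
    var gridRows = customers
        .Select(c => new { c.Id, c.Name, c.City })   // City is just an example third column
        .ToList();

    RadGrid1.DataSource = gridRows;
    RadGrid1.DataBind();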
I am developing an ASP.NET 2.0 website. I have created data access and business logic layers. Now in the presentation layer I am returning data from the business layer as a dataset.
My question is whether to use a dataset or object collection (for example a Category object representing the Category table in my database). I have defined all classes that are mapped to database tables (Common objects). But there are situations where I need all of the records from the category table in the presentation layer. I am just confused. What should I do?
You don't want to return datasets, you want to return objects.
Generally, when you have a data access layer and a business logic layer you will also want an entity layer. The entity layer is an in-memory representation of the database result set. If you return one row from the database, you load one entity object. If you return more than one row, you load an entity for each row and return a collection of entities to be consumed by the presentation layer.
If you're using .NET 2.0 or above, for example, you can create generic collections of the entity type and easily bind different types of controls to these collections.
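A minimal sketch of that approach, assuming a Category table with Id and Name columns (the connection string and query are illustrative):

    using System.Collections.Generic;
    using System.Data.SqlClient;

    public class Category
    {
        private int id;
        private string name;

        public int Id { get { return id; } set { id = value; } }
        public string Name { get { return name; } set { name = value; } }
    }

    public List<Category> GetAllCategories(string connectionString)
    {
        List<Category> categories = new List<Category>();

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("SELECT Id, Name FROM Category", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                // One entity per row, collected into a generic list for the presentation layer.
                while (reader.Read())
                {
                    Category category = new Category();
                    category.Id = reader.GetInt32(0);
                    category.Name = reader.GetString(1);
                    categories.Add(category);
                }
            }
        }

        return categories;
    }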
Hope this is helpful for you.
I would recommend returning objects, or an IEnumerable/IList etc. of objects.
Depending on your DB access, you can populate a list of category objects manually, or very quickly using something like LINQ to SQL or the ADO.NET Entity Framework, and cache them if required.
I've used both methods depending on the situation. If you only need to display the data from a table in a grid, the DataSet (or DataTable) method isn't terrible, because if fields are added to the table they will automatically appear in the grid... assuming you are auto-generating the grid's columns. I look at this as more of a quick-and-dirty method.
I would not return datasets at all. You are tightly coupling your service to your database. In the future, when your database evolves, you will be forced to change anything that uses the datasets. You want to create an abstraction layer between the database and the service.
I'd go with an object/collection solution: if you are returning one row from a table you have one object, and if you are returning several you'd use a generic collection. A generic collection will also avoid a boxing/unboxing hit.
[edit]seems I was beaten to it[/edit]
You should create an entity layer with classes representing each table in the database, and then return lists of those classes.