How Many DbContexts Should I Have? - asp.net

Using Entity Framework, I currently have a single DbContext that contains every table.
I'm wondering if that's what everyone does, or whether you have a context per module, for example. To me, the DbContext is a mapping of the models to a database, and since there is only one database, I only need one.
Before I get too far along, I want to see if that's appropriate.
So: one DbContext per database, or many?

I went through this same process recently and found some great resources on the subject. Here are a couple that were very helpful:
Shrink EF Models with DDD Bound Contexts.
How to decide on a lifetime for your ObjectContext.
I was building a Desktop app, and I ended up using multiple contexts so that I could keep the lifetime tied to the module rather than the application. This has worked out very well for me, and I like that my DbContext isn't swamped with DbSets and is limited to the ones that are relevant for the current module.
In an ASP.NET MVC app, it is different since the DbContext will only live as long as the request, and in those cases, I usually use a single DbContext to simplify things unless the database is very large. With a large database, I would probably break it apart into multiple DbContexts just to limit the overhead and the clutter, and keep things compartmentalized.
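To make the trade-off concrete, here is a minimal sketch of module-scoped contexts over a single database. All of the type names and the "AppDb" connection-string name are assumptions for illustration, not anything from the posts above:

    using System.Data.Entity;

    // Hypothetical entity types, kept trivial for the sketch.
    public class Invoice  { public int Id { get; set; } }
    public class Payment  { public int Id { get; set; } }
    public class Shipment { public int Id { get; set; } }

    // Two small contexts pointing at the same database ("AppDb" is an
    // assumed connection-string name); each module sees only its own sets.
    public class BillingContext : DbContext
    {
        public BillingContext() : base("name=AppDb") { }
        public DbSet<Invoice> Invoices { get; set; }
        public DbSet<Payment> Payments { get; set; }
    }

    public class ShippingContext : DbContext
    {
        public ShippingContext() : base("name=AppDb") { }
        public DbSet<Shipment> Shipments { get; set; }
    }

Nothing stops both contexts from mapping some of the same underlying tables; the split is purely about keeping each module's surface small.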

Currently, EF isn't really designed to be broken down into different DbContexts. Here's a great talk about it.
What we have done in this case is to create a project separate from our MVC website just for the database generation, and then keep separate DbContexts for every requirement.
This way our DbContexts never get large and are easy to maintain.
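A hedged sketch of that layout, reusing the hypothetical entity types from the previous sketch: one full context in the database project is the only one that generates the schema, while each per-requirement context opts out of initialization so it never tries to reshape the shared database.

    using System.Data.Entity;

    // Lives in the separate database-generation project.
    public class FullModelContext : DbContext
    {
        public FullModelContext() : base("name=AppDb") { }
        public DbSet<Invoice> Invoices { get; set; }
        public DbSet<Shipment> Shipments { get; set; }
        // ... every other table
    }

    // A requirement-specific context in the MVC site; the null initializer
    // keeps it from ever trying to create or migrate the shared schema.
    public class InvoicingContext : DbContext
    {
        static InvoicingContext()
        {
            Database.SetInitializer<InvoicingContext>(null);
        }
        public InvoicingContext() : base("name=AppDb") { }
        public DbSet<Invoice> Invoices { get; set; }
    }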

Related

Sharing stored procedures across multiple apps

Team A has an enterprise app that uses ADO.NET for data access, executing stored procedures. The data access is encapsulated in its own project (let's call it DAL.dll).
Team B is creating another, unrelated app that reuses the stored procedures from the enterprise app. This app currently uses the MS application block for data access. The issue we run into is that whenever Team A makes any change to the input/output params of the stored procedures, Team B's app fails at runtime and has to be updated to accommodate the added (or removed) params. Most of these changes go unnoticed until a user complains. At the very least, we would like the app to throw a compilation error so that the build process warns us of the changes.
One way to do this is to have Team B's project add a reference to the DAL.dll
I'd like to know if there are any other cleaner ways of solving the issue. We are ready to replace Team B's MS Data application block to use a different technology (Entity Framework?) if necessary.
Among the other answers, I'd strongly suggest getting those stored procedures into source control, in a Database Project. You then may be able to use the features of your source control system to do several things:
Lock some of the code so that it cannot be changed
Give you notifications if the code is changed
Warn you if the stored procedures change in a way that would prevent them from being called
Branch the stored procedures so that each team can have their own version of changed code, while keeping the unchanged stored procedures common. You of course will need to separate the different versions in the database.
I agree with the other posters on this thread that you should not share stored procedures across different .NET DLLs; that is just a recipe for disaster. I would also shy away from ORMs like Entity Framework if you are doing anything at all complicated with your database schema, because ORMs excel at translating a simple object model from your .NET application classes into SQL tables and SPs, but traditionally do poorly at optimizing them for performance on the database side. There will be people who claim otherwise, and they may have a valid point if they are experts at wrangling an ORM to do what they want, but chances are you are not, and it will cause you headaches in the long run.
A shared data access layer might work, but conceptually you are then just changing the implementation of the dependency from some code that a DBA wrote to some code that a .NET programmer wrote. Yes, you can use integration tests to achieve better verifiability, but the same case could be made for SQL with tools like Red Gate's SQL Test. I would shy away from this approach if the two applications are already experiencing some sort of pain from sharing SPs. That is an indication that the dependency should simply be done away with.
If it were up to me, I'd just make a new schema for Team B's app. You can read more about schemas in SQL Server here: MSDN Schema description for 2008 R2. You can think of them as namespaces for SQL Server but with some additional bells and whistles like permission and access control. Separating out your different applications into separate schemas on the same shared database will probably make for the most flexible implementation in the long run.
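If EF did end up in the picture anyway, mapping one application's tables into its own schema is a one-liner in code-first. This is only an illustrative sketch; the "teamB" schema name and the types are assumed:

    using System.Data.Entity;

    public class Order { public int Id { get; set; } }

    public class TeamBContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // Put Team B's tables in their own schema on the shared database.
            modelBuilder.Entity<Order>().ToTable("Orders", "teamB");
        }
    }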
unrelated app that's reusing the stored procedures in the enterprise app
If these two applications are really unrelated, why are they sharing stored procedures, or even the same database? I know this is a long read, but I recommend you read this: A Better Path to Enterprise Architectures
The partitioning concept in there relates to the Bounded Context pattern in Domain-Driven Design:
Multiple models are in play on any large project. Yet when code based on distinct models is combined, software becomes buggy, unreliable, and difficult to understand. Communication among team members becomes confusing. It is often unclear in what context a model should not be applied.
Therefore: Explicitly define the context within which a model applies. Explicitly set boundaries in terms of team organization, usage within specific parts of the application, and physical manifestations such as code bases and database schemas. Keep the model strictly consistent within these bounds, but don’t be distracted or confused by issues outside.
It is to be expected that you end up with problems when you don't explicitly deal with this. You're lucky you're seeing early failures; these can turn into problems that are much harder to find in the long run.
Analyze the problem again with the above in mind. Consider if you're missing some explicit context where this common functionality should live.
My question is: which team owns the stored procedures and the shared database? As a matter of good architecture/design, you usually should not have two different apps sharing the same database/procedures.
A better way to share data/functionality between two different applications is through a service or API; that way, the team who owns the functionality is responsible for maintaining it.
Also, good communication between both teams is highly recommended.
Depending on the owner of the DAL project, you could host web services and share the API. That way, you separate the Data Access Layer from the business logic, which allows anyone to use the same DAL without having to publish it to each different location.
From my point of view, it looks like both Team A and Team B should share the same core model and look at Multitier architecture as a possible solution.
It sounds like it would make sense to create a shared DAL that both applications consume.
I would add unit tests (or really integration tests) to make sure the DAL stays compatible with the apps after changes. That way, your tests would fail if incompatible changes have been made.
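A minimal sketch of such a test against plain ADO.NET: SqlCommandBuilder.DeriveParameters asks SQL Server for a procedure's actual parameter list, so the test fails as soon as the signature drifts. The connection string, procedure name, and expected parameters here are all hypothetical:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    public static class SprocSignatureTest
    {
        public static void AssertGetCustomerSignature(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.GetCustomer", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                conn.Open();

                // Pull the procedure's real parameter list from the server.
                SqlCommandBuilder.DeriveParameters(cmd);

                // DeriveParameters includes @RETURN_VALUE as the first entry.
                var expected = new[] { "@RETURN_VALUE", "@CustomerId" };
                if (cmd.Parameters.Count != expected.Length)
                    throw new InvalidOperationException("GetCustomer signature changed.");
                for (int i = 0; i < expected.Length; i++)
                    if (cmd.Parameters[i].ParameterName != expected[i])
                        throw new InvalidOperationException("GetCustomer signature changed.");
            }
        }
    }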
"I'd like to know if there are any other cleaner ways of solving the issue."
The cleanest way is for Team B to sit down with Team A and encapsulate the relevant business logic into a shared API. It doesn't matter so much how you implement that API; what does matter is that the API's interface is documented and versioned so everyone knows what to expect.
One reasonable mechanism for this in a .NET environment is Microsoft's ASP.NET Web API.
In short, the question of "how do we share a stored procedure?" is most likely looking at the wrong level of abstraction.
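As a rough illustration of what that API surface could look like with Web API (all names here are invented, and the body would delegate to Team A's real logic):

    using System.Web.Http;

    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class CustomersController : ApiController
    {
        // GET api/customers/5
        public CustomerDto Get(int id)
        {
            // Delegate to the owning team's business logic here.
            return new CustomerDto { Id = id, Name = "example" };
        }
    }

Team B then codes against the documented, versioned HTTP contract instead of against the stored procedures themselves.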

EF and customer data separation

Is it possible to build an ASP.NET website using EF where each customer logging in has separately stored data? We have customers demanding that their data won’t be stored in the same tables as other customers’ data.
I’ve read that EF can’t work with several databases, but is it possible to switch databases at runtime depending on input parameters? I have a feeling it won’t be possible, since the migration features are tightly connected to the database being used, but I'm not sure.
One solution could be to have a separate website deployment and database for each customer. They’ll get separate domains to access, but that’s not a problem. This solution feels a bit clumsy if you have many customers, though, especially around deployment and future upgrades.
Am I missing some smart ways of solving this or is this a very tricky issue?
Is the structure (of the db) the same?
If so, you could switch connections - not without issues, but it should work. For details on how that can be done, check the long discussion we've had here (and the linked previous questions, etc.):
Code first custom connection string and migrations without using IDbContextFactory
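The core of the trick is just the DbContext(string) constructor. Here is a hedged sketch, assuming every tenant database has an identical schema; the tenant-to-connection-string lookup is hypothetical:

    using System.Data.Entity;

    public class Customer { public int Id { get; set; } }

    public class TenantContext : DbContext
    {
        static TenantContext()
        {
            // Never create or migrate a tenant database from this context.
            Database.SetInitializer<TenantContext>(null);
        }

        public TenantContext(string connectionString) : base(connectionString) { }

        public DbSet<Customer> Customers { get; set; }
    }

    // Usage, once per request (the tenant lookup is assumed):
    // var cs = System.Configuration.ConfigurationManager
    //              .ConnectionStrings[tenantName].ConnectionString;
    // using (var db = new TenantContext(cs)) { /* query db.Customers */ }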

Entity Framework Best Practices in ASP.Net

I have just started working on entity framework in an ASP.net application and I was wondering if someone could point in the right direction with respect to best practices. I have some questions in particular which I have listed below.
First of all, I am using Entity Framework 4.0. I already have my database created, so I created a class library and generated the entity model from the database. I had been using GUIDs generated by the database, so I had to modify the SSDL to include the attribute StoreGeneratedPattern="Identity". Is there a way to do this automatically, or do I have to manually edit the SSDL every time I update the database and the model? (For those of you who are facing a problem with GUIDs or want to know why I am doing this, this is a clear article on the problem with auto-generated GUIDs.)
I was planning on using a single file in the class library to hold all the database queries. Is this good practice? How would I ensure different programmers don't rewrite the same queries over and over?
I was planning on using a unique context per method. Is this the right way to go? I read through Rick Strahl's post on context lifetime management. But I am still not sure if a unique context per method is the right way to go.
Can I have my database queries as static methods since they do not make use of any instance variables?
If I use a unique context per method as mentioned in 3, and I wish to modify an entity object returned by one context, what would be the best practice? Do I use the attach functionality to attach the object to a new context and save the changes? I haven't tried this, but I have read a couple of articles; it seems fairly straightforward, and I wanted to know if there are any alternatives.
If you have any suggestions on how you use Entity Framework in an ASP.NET application, I could definitely use the help. This is my first ASP.NET/Entity Framework application, so any tips will help.
This was an issue in the initial version of VS 2010. In some cases it should already work correctly once you have VS 2010 SP1 installed. If it doesn't, install this KB.
You can easily end up with a huge class containing a lot of static methods. Try to use some separation by the entity type you are querying. You will never fully ensure that another programmer won't create the same query again; this comes down to consistent query naming following the same naming policy, documentation, and communication among programmers.
A unique context "per method" is usually not needed. In most cases you should be happy with a unique context per logical (business) transaction - in a web application, a logical operation is in most cases the processing of a single request, i.e., one context per request.
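For completeness, a minimal context-per-request sketch; the generic parameter would be your generated context type, and disposal is assumed to happen in Application_EndRequest in Global.asax:

    using System;
    using System.Web;

    // Generic per-request cache; T stands in for the generated EF context.
    public static class RequestScoped<T> where T : class, IDisposable, new()
    {
        private static readonly string Key = "__" + typeof(T).FullName;

        // Lazily creates exactly one instance per HTTP request.
        public static T Current
        {
            get
            {
                var item = (T)HttpContext.Current.Items[Key];
                if (item == null)
                {
                    item = new T();
                    HttpContext.Current.Items[Key] = item;
                }
                return item;
            }
        }

        // Call from Application_EndRequest so the instance lives one request.
        public static void DisposeCurrent()
        {
            var item = (T)HttpContext.Current.Items[Key];
            if (item != null)
            {
                item.Dispose();
                HttpContext.Current.Items.Remove(Key);
            }
        }
    }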
If you pass a context instance to your queries, the answer is yes. Once you stop making them static and let them take the context instance from their class instance, you will be very close to the repository pattern.
This is exactly the problem with a context per method, and it is hard to solve, because to make it work you must first detach the entity from the first context and attach it to the second. If your entity also has related entities loaded, all those relations will be nulled during detaching (unless you use a deep clone instead of detaching, i.e., create a second instance of the entity).
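When you do have to move an entity between contexts, the EF 4 ObjectContext pattern looks roughly like this sketch; MyEntities and Customer stand in for the generated model types:

    // Hedged sketch: saving a detached entity with a fresh EF 4 ObjectContext.
    static void UpdateDetached(Customer customer)
    {
        using (var ctx = new MyEntities())
        {
            // Attach() puts the entity in the Unchanged state...
            ctx.Customers.Attach(customer);

            // ...so mark it Modified explicitly before saving.
            ctx.ObjectStateManager.ChangeObjectState(
                customer, System.Data.EntityState.Modified);

            ctx.SaveChanges();
        }
    }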

Entity Framework considerations for ASP.NET applications

I've created a business layer with a database model to be used in an ASP.NET application. I've used LINQ to SQL classes in Windows Forms before, but using ORMs in per-request web applications is foreign to me. I have a few things I couldn't figure out, and I'd appreciate it if anyone could give me some insight.
My BLL has static methods like GetRecord() or UpdateRecord(). Each one of these methods creates a new ObjectContext instance, destroyed after the unit of work. I don't have any HttpContext.Current.Items cache implementation.
I'm using the .NET 3.5 version of EF.
I've created a pre-generated view (Model.View.cs) and added it to my solution. Is this all I have to do to use it? Also, do I need to publish the CSDL, MSL, and SSDL files with my DLL?
Is precompiling queries bad for ASP.NET applications? I have only one or two queries on any ASPX page, and very rarely is a select query used twice on the same page. Will it slow down the application if I precompile my queries? I wonder if a precompilation made by Session A would be useful for Session B?
I've created the following method to update a record in ASP.NET page and I wonder if it is a good way to do it:
ASP.NET gets the record (entity) using BLL.GetRecord()
Updates any values
Sends updated record to BLL.Update()
BLL.Update() checks if the record exists
Uses context.ApplyPropertyChanges() to update the record
I've read a few Entity Framework performance charts, and every one of them shows two different statistics for queries: the first run and the second run. Since I work with a unit-of-work type of design, will my queries never see second runs?
Thanks.
You need the CSDL, etc., either as files or as resources. View pre-generation helps with performance, but doesn't relieve you of the need to include the EDMX in some form.
No. (See the compiled-query sketch after this list.)
OK as far as it goes. Hard to say more without seeing code.
It depends. This post should help.
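On the precompiled-query question, here is a sketch against the EF 3.5 ObjectContext API; MyEntities and Customer are hypothetical generated types, and CustomerId is an assumed property. The compiled delegate is static, so the LINQ-to-Entities translation cost is paid once per application domain rather than per session, which is why Session A's precompilation does benefit Session B:

    using System;
    using System.Data.Objects;
    using System.Linq;

    static class Queries
    {
        // Compiled once per application domain, shared by all sessions.
        public static readonly Func<MyEntities, int, IQueryable<Customer>> CustomerById =
            CompiledQuery.Compile((MyEntities ctx, int id) =>
                ctx.Customers.Where(c => c.CustomerId == id));
    }

    // Usage: Queries.CustomerById(context, 42).FirstOrDefault();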

Need advice on selecting a data access method

I am in the early stages of planning a conversion of a large classic ASP database application to ASP.Net and I'm having trouble picking out which data access method to use. I have played around with Linq To SQL, Dynamic Data, strongly typed datasets, Enterprise Library (Data Access Application Blocks), and a tiny bit with Entity Framework, but none of them have jumped out to me as "the one". There are just too many choices - my head is swimming, help me choose!
Perhaps it would help to give some background on the application that I am converting along with the priorities...
The back end is Microsoft SQL Server (2005 or later) and we are committed to that, so I don't need to worry about ever supporting a different database platform.
The database is very mature and contains a great deal of the business logic. It is highly normalized and makes extensive use of stored procedures, triggers, and views. I would rather not reinvent two wheels at the same time, so I'd like to make as few changes to the database as possible. So, I need to choose a data access method that is flexible enough to let me work around any quirks in the database.
The application has many data entry forms and extensive searching and reporting capabilities (reports are another beast which I will tackle later).
The application needs to be flexible enough to deal with minor changes to the database structure. The application (and database) may be installed at different sites where minor custom modifications are made to the database. Ideally the application could identify the database extensions and react appropriately. In other words, if I need to store an O/R mapping in the application, I need to be able to swap that out (or refresh it easily) when installing the application and database at a new site.
Rapid application development is critical. Since the database is already done and the user interface is going to closely match the existing application, I'm hoping to find something where we can crank this out fairly quickly. I am willing to sacrifice not using the absolute latest and greatest technology if it will save time in development. In other words, if there is a steep learning curve to using something like Entity Framework, I'm fine with going something like strongly typed Datasets and a custom DAL if it will speed up the process.
I am a total newbie to ASP.Net but am intimately familiar with Classic ASP, T-SQL and the old ADO (e.g. disconnected recordsets). If any of the data access methods is better suited for someone coming from my background, I might lean in that direction.
Thanks for any advice that you can offer!
Look at all three articles in this series:
High Performance Data Access Layer Architecture Part 1
Great advice.
You may want to look at decoupling the database layer from the ASP.NET layer; that not only gives you more flexibility in making the decision, but when you have to make changes to a customer's database you can just swap in a new DLL without changing anything else.
By using dependency injection, you can use XML to tell the framework which concrete class to use for an interface.
The advantage to doing this is that you can then go with one database approach, and if you later decide to change to another, then you can just change the dll and go on without making any changes to other layers.
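A small sketch of the shape this takes; every name here is invented, and the container (Unity, Castle Windsor, Spring.NET, etc.) is whichever one you pick:

    public class Customer { public int Id { get; set; } }

    // The application codes against this interface only.
    public interface ICustomerRepository
    {
        Customer GetById(int id);
    }

    // Shipped as its own DLL; swap this assembly to change data access.
    public class SqlCustomerRepository : ICustomerRepository
    {
        public Customer GetById(int id)
        {
            // ADO.NET / ORM code for this customer's database goes here.
            return null;
        }
    }

    // The XML configuration maps ICustomerRepository to SqlCustomerRepository,
    // so consuming code never names the concrete class:
    // ICustomerRepository repo = container.Resolve<ICustomerRepository>();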
Since you are more familiar with it, why not just go directly to the database for the moment by making your own connections? Then you can port the rest of your code, and along the way decide which of the myriad technologies to use.
For a new application I am working on, I am starting with LINQ to SQL, mainly because development will be quicker; later, if I decide it won't meet my needs, I will just swap it out.
NHibernate might be a good fit. You can store the mappings in external configuration files, which would address your need to adapt to per-site database changes. Another option might be ActiveRecord, which is built on top of NHibernate.
NHibernate has a neat feature you might find helpful: the dynamic component, which is basically a name-value pair collection populated by pulling the column names from the mapping file. So when you add a column at a client site, you update the mapping file and can access the new data through a collection on the object.

Resources