We are evaluating EF for a new ASP.NET project.
Have you used, or are you currently using, EF v1 on a project?
How was your experience with EF in a web or desktop application?
Thank you.
We used EF for a medium-sized internal project. It was an n-tier app with a server containing the business logic and a self-contained EF layer. The client was a WPF app (connecting to the server via WCF).
There is a lot to like about EF, and it can make some aspects of your DA layer very quick to write, but one thing I will say is that it currently does not have very good support for disconnected applications. It works fantastically if your application is very self-contained, has a direct connection to the database and uses one data context throughout: the data context manages your data objects, pulling data from and pushing updates to the database as appropriate.
As soon as you try to disconnect your client in any form of n-tier structure, though, things get harder. You either have to manage detaching and reattaching your entities from the data contexts, or you have to somehow serialize your data context across to the client. You also have to use multiple data contexts (partly because ours was a stateless server anyway, but also because you'd get into a massive mess trying to use a single data context for multiple clients), and it all becomes trickier to manage. Part of our solution was to have separate "business objects" that were created from the lower-level "EF data objects". The EF would manage the data objects (persisting them and loading them from the database, etc.), but our own BLL would manage the business objects. Both saving and loading required translation from the higher-level objects to the lower-level ones, or vice versa.
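A rough sketch of that kind of translation, with hypothetical types standing in for the generated EF entities and our actual business objects (this is an illustration of the pattern, not our real code):

```csharp
using System.Runtime.Serialization;

// Stand-in for the EF-generated entity (normally produced by the designer/.edmx).
public class CustomerEntity
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

// The separate "business object" sent to the WPF client over WCF.
[DataContract]
public class CustomerDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

// Translation in both directions, done in the BLL on the server.
public static class CustomerTranslator
{
    public static CustomerDto ToDto(CustomerEntity entity)
    {
        return new CustomerDto { Id = entity.CustomerId, Name = entity.Name };
    }

    public static CustomerEntity ToEntity(CustomerDto dto)
    {
        return new CustomerEntity { CustomerId = dto.Id, Name = dto.Name };
    }
}
```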
All in all things have worked OK, but in hindsight I would say that EF v1 is not fully ready for serious enterprise-level development. I have heard that the next version of EF, in .NET 4.0, has much better support for disconnected and n-tier apps, but I have not tried it out personally.
I'm looking for some advice on a strategy for migrating from ASP.NET WebForms to MVC. I currently have a solution of approximately 60 projects in the following format:
Solution
ProjectA.DataModel
ProjectA.Business
ProjectA.Web
ProjectB.DataModel
ProjectB.Business
ProjectB.Web
Framework.Core
Framework.Common
…etc
All data models are Entity Framework 6 using database first (.edmx) and T4 templates. Data access code is mixed between the Business and Web (WebForms) projects. The code base has grown organically, initially from a sole developer through to a small team, which explains some of the bad practice in terms of separation of concerns; however, we want to take this opportunity to rectify that.
I want to start moving towards a full MVC solution and feel the first thing to do is ensure any data access logic currently residing in the Web projects is pushed to the Business layer to get separation of concerns. Researching best practice for this has taken me towards the Unit of Work and Repository patterns; however, further reading seems to suggest this is overkill.
What would be the best approach to refactoring my Business and Data layers within the current WebForms model in readiness for MVC? Secondly, is the accepted approach for migrating to MVC to bring Views into the existing WebForms solution to create a hybrid, or to create a new MVC project, reference my existing BAL and DAL, and build the application UI from scratch?
In regard to Entity Framework, everything appears to be moving towards a Code First approach. Is this something we need to be planning for if we want to follow a best-practice approach?
We are currently a very small team and want to make the best possible re-use of our existing projects, refactoring as much as possible so that we are in a position to begin moving towards MVC.
Any thoughts on how I can begin to approach this are appreciated.
Thanks
Nasty app you have there. Anyway, the first thing to remember is that MVC is a UI pattern, so in a properly designed app the switch from WebForms to MVC would mean changing just the UI layer.
I want to start moving towards a full MVC solution and feel the first thing to do is ensure any data access logic currently residing in the Web projects is pushed to the Business layer to get Separation of Concerns
No! Data access logic should be in the DAL (hint: it's an acronym), not the BL, not the UI. Persistence only. The BL and UI ask the DAL to save/retrieve their objects via a repository. And by the way, EF deals with the db only. Don't make the mistake of building your business objects on top of EF entities. One models business concepts and behaviour, the other models database access. They're usually not compatible. When dealing with anything but persistence, ignore that you have a db or an ORM.
Researching best practice for this has taken me towards the Unit of Work and Repository patterns; however, further reading seems to suggest this is overkill.
It's overkill ONLY if you have a very simple app that you don't care about maintaining. I know it's hard to believe, but probably over 80% (a more or less random number) of devs still don't understand how to properly implement the Repository pattern, which is why it becomes useless for them. In a nutshell, the repo uses EF but is not built on top of it. The repo 'transforms' business/UI objects to EF entities and vice versa.
The repo interface should never expose IQueryable or the fact that you're using a db in the first place. So, no generic repositories and no exposing of EF entities. Also, the BL/UI shouldn't create queries (that would mean they know how the data is stored, which is a DAL implementation detail); that's the repository's job. The higher layers just tell the repository what they want, never how to do it.
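A minimal sketch of what that can look like, assuming a hypothetical Order business object and an EF6 context (none of these names come from the question; it's an illustration of the idea, not a definitive implementation):

```csharp
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Business object owned by the BL; note it is NOT the EF entity.
public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
    public bool IsOverdue { get; set; }
}

// The contract the BL/UI sees: intent-revealing methods, no IQueryable, no EF types.
public interface IOrderRepository
{
    Order GetById(int id);
    IReadOnlyList<Order> GetOverdueOrders();   // the "what", never the "how"
    void Save(Order order);
}

// DAL implementation: EF entities and the actual queries stay behind the interface.
public class EfOrderRepository : IOrderRepository
{
    private readonly OrdersContext _db;

    public EfOrderRepository(OrdersContext db)
    {
        _db = db;
    }

    public Order GetById(int id)
    {
        var entity = _db.Orders.Find(id);
        return entity == null ? null : Map(entity);
    }

    public IReadOnlyList<Order> GetOverdueOrders()
    {
        var now = DateTime.UtcNow;
        return _db.Orders
                  .Where(e => !e.Paid && e.DueDate < now)   // the query is a DAL detail
                  .ToList()
                  .Select(Map)
                  .ToList();
    }

    public void Save(Order order)
    {
        var entity = _db.Orders.Find(order.Id);
        if (entity == null)
        {
            entity = new OrderEntity { Id = order.Id };
            _db.Orders.Add(entity);
        }
        entity.Total = order.Total;   // business object -> EF entity
        _db.SaveChanges();
    }

    private static Order Map(OrderEntity e)
    {
        return new Order { Id = e.Id, Total = e.Total, IsOverdue = e.DueDate < DateTime.UtcNow };
    }
}

// Hypothetical EF entity and context; in the asker's setup these would come from the .edmx/T4 output.
public class OrderEntity
{
    public int Id { get; set; }
    public decimal Total { get; set; }
    public DateTime DueDate { get; set; }
    public bool Paid { get; set; }
}

public class OrdersContext : DbContext
{
    public DbSet<OrderEntity> Orders { get; set; }
}
```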
Secondly, is the accepted approach for migrating to MVC to bring Views into the existing WebForms solution to create a hybrid, or to create a new MVC project, reference my existing BAL and DAL, and build the application UI from scratch?
Although you can mix WebForms and MVC in the same project, it's better not to (fewer headaches). Start the MVC app from scratch, then port the WebForms pages to it.
Team A has an enterprise app that uses ADO.NET for data access and executes stored procedures. The data access is encapsulated in its own project (let's call it DAL.dll).
Team B is creating another unrelated app that's reusing the stored procedures in the enterprise app. This app is currently using the MS application block for data access. The issue we run into is that whenever Team A makes any change to the input/output params of the stored procedures, there is a runtime error in Team B's app, and that app needs to be updated to accommodate the additional params (or the params that were removed). Most of these changes go unnoticed until a user complains. At the very least, we would like to have the app throw a compilation error so that the build process warns us of the changes.
One way to do this is to have Team B's project add a reference to the DAL.dll
I'd like to know if there are any other cleaner ways of solving the issue. We are ready to replace Team B's MS Data application block to use a different technology (Entity Framework?) if necessary.
Among the other answers, I'd strongly suggest getting those stored procedures into source control, in a Database Project. You then may be able to use the features of your source control system to do several things:
Lock some of the code so that it cannot be changed
Give you notifications if the code is changed
Warn you if the stored procedures change in a way that would prevent them from being called
Branch the stored procedures so that each team can have their own version of changed code, while keeping the unchanged stored procedures common. You of course will need to separate the different versions in the database.
I agree with the other posters on this thread that you should not share stored procedures across different .NET DLLs; that is just a recipe for disaster. I would also shy away from ORMs like Entity Framework if you are doing anything at all complicated with your database schema, because ORMs excel at translating a simple object model from your .NET application classes into SQL tables and stored procedures, but traditionally do poorly at optimizing them for performance on the database side. There will be people who claim otherwise, and they may have a valid point if you are an expert at wrangling an ORM to do what you want like they are, but chances are you are not, and it will cause you headaches in the long run.
A shared data access layer might work, but conceptually you are then just changing the implementation of the dependency from some code that a DBA wrote to some code that a .NET programmer wrote. Yes, you can use integration tests to achieve better verifiability, but the same case could be made for SQL with tools like Red Gate's SQL Test. I would shy away from this approach if the two applications are already experiencing some sort of pain from sharing SP's. That is an indication that the dependency just should be done away with.
If it were up to me, I'd just make a new schema for Team B's app. You can read more about schemas in SQL Server here: MSDN Schema description for 2008 R2. You can think of them as namespaces for SQL Server but with some additional bells and whistles like permission and access control. Separating out your different applications into separate schemas on the same shared database will probably make for the most flexible implementation in the long run.
unrelated app that's reusing the stored procedures in the enterprise app
If these two applications are really unrelated, why are they sharing stored procedures, or even the same database? I know this is a long read, but I recommend reading this: A Better Path to Enterprise Architectures
The partitioning concept in there relates to the bounded context concept in Domain-Driven Design:
Multiple models are in play on any large project. Yet when code based on distinct models is combined, software becomes buggy, unreliable, and difficult to understand. Communication among team members becomes confusing. It is often unclear in what context a model should not be applied.
Therefore: Explicitly define the context within which a model applies. Explicitly set boundaries in terms of team organization, usage within specific parts of the application, and physical manifestations such as code bases and database schemas. Keep the model strictly consistent within these bounds, but don’t be distracted or confused by issues outside.
It is to be expected that you end up with problems when you don't deal with this explicitly. You're lucky you're seeing early failures, as this can turn into problems that are much harder to find in the long run.
Analyze the problem again with the above in mind. Consider if you're missing some explicit context where this common functionality should live.
My question is: which team owns the stored procedures and the shared database? As a matter of good architecture/design, you should not have two different apps sharing the same database/procedures.
A better way to share data/functionality between two different applications is through a service or API, so the team that owns the functionality is responsible for maintaining it.
Also, good communication between both teams is highly recommended.
Depending on the owner of the DAL project, you could host web services and share the API. That way, you separate the Data Access Layer from the business logic, which allows anyone to use the same DAL without having to publish it to each different location.
From my point of view, it looks like both Team A and Team B should share the same core model and look at Multitier architecture as a possible solution.
It sounds like it would make sense to create a common DAL that both applications can share.
I would add unit tests (or, really, integration tests) to make sure the DAL is compatible with the apps after changes. That way your tests would fail if incompatible changes have been made.
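For example, a rough sketch of such an integration test (xUnit-style; the stored procedure name, parameter and column names, and connection string here are hypothetical):

```csharp
using System.Data;
using System.Data.SqlClient;
using Xunit;

// Calls the stored procedure the DAL depends on and fails fast if its
// parameters or result shape drift.
public class GetCustomerByIdSprocTests
{
    private const string ConnectionString =
        "Server=.;Database=EnterpriseDb;Integrated Security=true"; // adjust for your environment

    [Fact]
    public void GetCustomerById_still_accepts_expected_parameters_and_columns()
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand("dbo.GetCustomerById", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerId", 1);

            conn.Open();
            // If Team A adds a required parameter, ExecuteReader throws and the test fails.
            using (var reader = cmd.ExecuteReader())
            {
                // If a column is renamed or removed, GetOrdinal throws and the test fails.
                Assert.True(reader.GetOrdinal("CustomerId") >= 0);
                Assert.True(reader.GetOrdinal("Name") >= 0);
            }
        }
    }
}
```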
"I'd like to know if there are any other cleaner ways of solving the issue."
The cleanest way is for Team B to sit down with Team A and encapsulate the relevant business logic into a shared API. It doesn't matter so much how you implement that API; what does matter is that the API's interface is documented and versioned so everyone knows what to expect.
One reasonable mechanism for this in a .NET environment is to use Microsoft's WebAPI.
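For illustration, a minimal ASP.NET Web API 2 sketch of such a shared, versioned API (all names here are hypothetical):

```csharp
using System.Web.Http;

// Team A owns and versions this HTTP endpoint; Team B calls it instead of
// calling the stored procedures directly.
public class CustomersController : ApiController
{
    private readonly ICustomerService _customers;   // wraps Team A's DAL internally

    public CustomersController(ICustomerService customers)
    {
        _customers = customers;
    }

    // GET api/customers/5
    public IHttpActionResult Get(int id)
    {
        var customer = _customers.GetById(id);
        if (customer == null)
        {
            return NotFound();
        }
        return Ok(customer);
    }
}

// The documented, versioned contract both teams agree on.
public interface ICustomerService
{
    CustomerContract GetById(int id);
}

public class CustomerContract
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```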
In short, the question of "how do we share a stored procedure?" is most likely looking at the wrong level of abstraction.
What do you think of using modern data access technologies in legacy apps? Not replacing the data access layer with a new layer, but having a mix of data access methods in the same layer.
Say the current data access layer in my legacy app uses DataSet, SqlDataAdapter, SqlCommand and stored procedures to access data from the database.
Are there any real reasons not to include LINQ to SQL (.dbml) or Entity Framework (.edmx) classes in the DAL? Is there any harm in having a mix of data access methods in the DAL, or even in the same class?
Generally there is no harm, but unless you plan a slow upgrade where you replace parts of the legacy application as you do new development, I would not do it. It will turn the whole application into one big mess of many technologies, it will be harder to maintain, and it can also muddle its design/architecture.
The exception can be implementing a new component of the application that is isolated from the rest. In that case you can probably design it from scratch and use the newer technology, but for the support/maintenance team it can still be a nuisance.
I am developing a mid-size application and want to implement a proper application architecture. I've read some architecture books and approaches and am thinking about:
AAFN (Application Architecture for .NET) presented by Microsoft
SOA
SDLM
SDO
MVC
and so on...
This is a web application that will be extended with some other small applications (think of something like an MIS with one or two cores).
Which projects should I have? I'm thinking about:
Common // to use in all projects
Framework // main framework
DAO // data access object (Entity Framework or NHibernate)
UI // will be available in two variants: web and Windows (WPF) interfaces
BusinessEntities // all sub-application project logic will go here
ApplicationNameProject // each application has its own logic (in BusinessEntities)
ApplicationUnit // each application's entities will be placed here
ApplicationNameProject // each application's data entities (in ApplicationUnit)
Services // WCF services go here, shared by all applications
This is the architecture I have in mind. I am not forced to use it; I want to know what fits me best, and I can change all of it, add other projects, or remove these.
Any help is appreciated.
There is no "best small or mid-size application architecture" as a silver bullet to fit any project, so drop that idea right now or you'll be in for a world of pain down the road.
The architecture for any given project should fit the purpose of that project. In some cases ASP.NET WebForms with direct queries into the database will be the most appropriate architecture; in some cases MVC will be the right architecture; and in some cases it will be a Windows Forms application built on top of a web service that connects to a relational database through an ORM like LINQ to SQL or NHibernate.
You can't decide on a one-architecture-fits-all approach, it just doesn't work. Each architecture has its merits and weaknesses and thus projects for which it is well suited and projects for which it should be avoided. You should pick the approach that makes the most sense for the current project/scenario.
Given that however, I tend to take a fairly uniform approach.
If I need a quick utility project that does a very specific thing and is highly unlikely to be needed for anything else, I might use a console application with queries against my database hardcoded.
If I need a common set of queries that I'm likely to need from multiple projects, I'll write them as stored procedures to get the performance benefits, and build a data access layer that leverages those stored procedures to give me standardized business objects, in a standard DAL (data access layer) / BOL (business object layer) / BLL (business logic layer) approach. This is advantageous because it means that once I've got this set of libraries built, I can float any application over the top - for instance a WebForms or MVC application.
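A rough sketch of that DAL/BOL shape (the stored procedure, column names and types are placeholders, not a definitive implementation):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// BOL: a plain business object any front end (WebForms, MVC, ...) can consume.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// DAL: wraps the stored procedure and hands standardized business objects upward.
public static class CustomerData
{
    public static List<Customer> GetCustomers(string connectionString)
    {
        var customers = new List<Customer>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.GetCustomers", conn) { CommandType = CommandType.StoredProcedure })
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    customers.Add(new Customer
                    {
                        Id = reader.GetInt32(reader.GetOrdinal("CustomerId")),
                        Name = reader.GetString(reader.GetOrdinal("Name"))
                    });
                }
            }
        }
        return customers;
    }
}
```

A BLL class would then sit on top of this, applying business rules before the objects reach whichever UI is floated over the top.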
MVC is advantageous because of separation of concerns - your controller can interact with your business library simply to access the data it needs and your views are really just that - a view of the data that the user can interact with. The views do nothing more than take the current data view to the user and transport any data changes back from the user to the controller - no logic is held in the view and as such it means that it's far easier to unit test and make changes to components without affecting the rest of the application.
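For illustration, a minimal sketch of that separation (hypothetical names; the controller only talks to the business library and hands the result to the view):

```csharp
using System.Collections.Generic;
using System.Web.Mvc;

// The controller asks the business library for data and passes it to a dumb view.
public class ProductsController : Controller
{
    private readonly IProductService _products;   // business library interface

    public ProductsController(IProductService products)
    {
        _products = products;
    }

    public ActionResult Index()
    {
        var model = _products.GetActiveProducts();  // no data access logic here
        return View(model);                         // the view only renders it
    }
}

// Business library contract; easy to fake in a unit test of the controller.
public interface IProductService
{
    IList<ProductSummary> GetActiveProducts();
}

public class ProductSummary
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```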
The drawback to a multi-tiered or multi-layered approach like this though is that it takes time to architect it properly and if you're only after a throw-away utility application like they demonstrate on stage at developer conferences then this is complete overkill and I wouldn't bother with it.
Think of it like this: Every layer, every library, every component requires justification. If there is less justification for than against, then don't do it. The key is not to do something without reason - anything you do is correct providing that you have a well thought out reason for it, and by well thought out, I mean that you've found very good reasons for and against and you've made an educated decision, you've not made a decision based on half thoughts, or worse, no thought at all.
Anything but the most trivial .NET application should have several projects: a UI layer, some kind of business logic layer, a persistence (storage) layer and accompanying test projects. Each project should interact loosely through interfaces.
In general you should create the minimum number of layers you need to make your code testable and easy to understand.
To figure out what the minimum is that you need it can be a good idea to let your tests drive the internal design of the system. Each layer should have tests in its own right, with (possibly) the exception of the top HTML layer and the bottom SQL layer.
With that in mind it helps to separate concerns as far as possible. For example SQL queries should almost never be in the same block of code as HTML support: split things into multiple layers that each do one and only one thing. This makes changes easier.
Be aware of the difference between systems architecture (where loosely coupled Web services using e.g. REST interact) and the internal design of the system. It's a good idea to decouple the Web service interfaces (as consumer or provider) in their own layers as this is an area that often changes.
These designs are an art that's best learned by practice. With good unit tests you should find refactoring an application design fairly swift, so it's a good idea to look at technologies like Spring.NET or other inversion of control containers to make this easy.
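As a small sketch of that kind of loose coupling through interfaces (the names are hypothetical), which an inversion of control container can then wire together:

```csharp
// Each layer depends on an interface, so tests can substitute a fake and an IoC
// container (Spring.NET or similar) can wire up the real implementation at runtime.
public interface IOrderStore                 // persistence-layer contract
{
    void Save(Order order);
}

public class OrderService                    // business-logic layer
{
    private readonly IOrderStore _store;

    public OrderService(IOrderStore store)   // dependency injected, not new'ed up
    {
        _store = store;
    }

    public void PlaceOrder(Order order)
    {
        if (order.Total <= 0)
            throw new System.ArgumentException("Order total must be positive.");
        _store.Save(order);
    }
}

public class Order
{
    public decimal Total { get; set; }
}

// In a unit test, IOrderStore can be replaced with an in-memory fake, so the
// business rule is exercised without touching the SQL layer at all.
```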
I've been reading the book Pro ASP.NET MVC Framework and I'm getting really confused with a lot of things. I've been trying to do some research, but with so many different approaches and concepts being thrown at me it's just making things worse. So I have a few questions:
I know MVC is supposed to split the functionality into three main things: Model -> Controller -> View. Is the MVC a different approach than the three-tier architecture? Or am I still supposed to be thinking of creating a Data Access Layer and a Business Logic Layer in my project?
What exactly are repositories? Do they act as my Data Access Layer? Where/how do repositories fit into MVC?
The book talks about using LINQ to SQL to interact with the database but yet it states that LINQ to SQL will not be supported in the future and that Microsoft is dropping it for the Entity Framework. Where does the Entity Framework fit into the MVC and how do I interact with it?
Thanks in advance for your help!
Matt
MVC is mostly a pattern for the presentation layer, and it focuses on the interaction between the view and the controller. The model can be considered to be the components of the application that are responsible for maintaining state, including persistence.
In a simple application the model might just be a LINQ to SQL model. In a large enterprise application the model might contain a data access layer, a business layer, and a domain layer. ASP.NET MVC does not restrict how the M should be implemented.
The Repository pattern is one way to implement the persistence part of the M. Active Record is another. Which pattern to choose depends on the complexity of the application and on your preferences.
Take a look at Step 3 of the NerdDinner tutorial where they create a simple repository using Linq to SQL.
LINQ to SQL is not dead. Microsoft will still improve the core and add customer requests where it makes sense, but Entity Framework will be the primary focus. Take a look at this post for LINQ to SQL changes in .NET 4.0.
EF can be used in a similar way to LINQ to SQL, but it is also more flexible, so it can be used in other ways. For example, EF4 will more or less support persistence of your own POCO objects, in a more Domain-Driven Design style.
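For example, a hedged sketch of the kind of plain POCO that EF4-style POCO support is meant to persist (the class itself is hypothetical):

```csharp
using System;

// A plain object with no EF base class and no generated code.
public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
    public DateTime CreatedOn { get; set; }

    // Domain behaviour can live directly on the object.
    public bool IsNew()
    {
        return CreatedOn > DateTime.Now.AddDays(-30);
    }
}
```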
Yes, I think MVC is a different approach than "the" 3-tier architecture that I think you mean here (the architecture where you create mainly three projects: DAL, BL, and UI). The main idea behind MVC is the separation of concerns between each of its components (Model, View and Controller). The controller is the component responsible for handling user requests, and in most cases it cooperates with the Model component in order to display the desired view as a response to the user's request. The difference between this and the traditional 3-tier architecture is that the DAL and the BL are now grouped together and called the Model, and yes, you still need to create these components.
What are repositories?
Martin Fowler defines a repository as something that "mediates between the domain and data mapping layers using a collection-like interface for accessing domain objects". Repositories are part of your data access layer; they don't access data by themselves, they mediate between the domain and the data mapping layers, and of course they should be placed in your Model folder/project.
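A minimal sketch of such a collection-like repository interface, assuming a hypothetical Dinner domain object (a LINQ to SQL or EF-backed class would implement it and hide the data mapping behind it):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical domain object.
public class Dinner
{
    public int DinnerId { get; set; }
    public string Title { get; set; }
    public DateTime EventDate { get; set; }
}

// Collection-like interface over domain objects, kept in the Model.
public interface IDinnerRepository
{
    Dinner FindById(int id);
    IEnumerable<Dinner> FindUpcoming();
    void Add(Dinner dinner);
    void Remove(Dinner dinner);
    void Save();   // persist pending changes
}
```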
Will Linq to SQL be deprecated?
No, and the same book states so. Also, Damien Guard (a developer on the ADO.NET team) mentioned in one of his blog posts that LINQ to SQL will be included in .NET 4.0.
How to interact with EF?
Much as you would with LINQ to SQL. Like LINQ to SQL, Entity Framework will provide your mapping entities, and they will reside in the Model project as well.
Hope this helps!
I guess you're a bit confused over these things, and they are confusing, so let's go over them slowly.
N-Tiered Architecture and MVC are different, but intertwined. N-Tier usually talks about separating Data Access, Business Logic and the User Interface. However, some people may argue that it is impossible to totally separate BLLs from the UI; MVC addresses that, in such a way that there is a corresponding Controller talking to your BLL, and to your View, as opposed to having your View talk directly to your BLL.
Yes, having repositories is one approach to having a DAL. There are many ways of doing this, and you should not limit yourself to what is discussed in the book.
The book only uses LINQ to SQL to demonstrate ASP.NET MVC in the fastest way possible, but it is NOT the only way. Stop thinking about LINQ to SQL for a minute; ASP.NET MVC can be used whether you use an ORM like NHibernate or plain ADO.NET plus a DAL factory or whatever -- what you're not going to be able to use are those ASP.NET ObjectDataSources that you drag and drop onto your UI.
As for Entity Framework, Brad Abrams wrote a nice guide on how to use Entity Framework with ASP.NET MVC, that should cover your last question.
HTH
Yes, you still need to create the data access and business logic layers yourself. Some may argue that the Controller layer IS the business logic, but I personally prefer the separation between real business logic (e.g. pricing calculation) and screen business logic (e.g. the event handler for the "OK" button). You then call these from your Controller class. The controller class controls the logic for your screen and manages the translation from your data/business logic layer to screen values.
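For illustration, a hypothetical sketch of that split between real business logic and screen logic:

```csharp
using System.Web.Mvc;

// Real business logic: pricing lives in its own class, independent of any screen.
public class PricingService
{
    public decimal CalculatePrice(decimal basePrice, int quantity)
    {
        var total = basePrice * quantity;
        return quantity >= 10 ? total * 0.9m : total;   // example volume discount
    }
}

// Screen logic: the controller adapts the business result for display.
public class QuoteController : Controller
{
    private readonly PricingService _pricing = new PricingService();

    public ActionResult Show(decimal basePrice, int quantity)
    {
        var model = new QuoteViewModel
        {
            // translate the business result into what the screen shows
            DisplayPrice = _pricing.CalculatePrice(basePrice, quantity).ToString("C")
        };
        return View(model);
    }
}

public class QuoteViewModel
{
    public string DisplayPrice { get; set; }
}
```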
The ASP.NET MVC framework puts no restriction on the "Model" layer, which means you can use whatever you want, including NHibernate, LINQ to SQL or Entity Framework. I use LINQ to SQL because it's simple.
Not sure, never read that book. I just downloaded Scott Hanselman's Nerddinner project from codeplex and use that as a guide for writing ASP.NET MVC websites.