What do you think of using modern data access technologies in legacy apps? Not replacing the data access layer with a new layer, but having a mix of data access methods in the same layer.
Say the current Data Access Layer in my legacy app uses DataSet, SqlDataAdapter, SqlCommand, and stored procedures to access data from the database.
Are there any real reasons not to include LINQ to SQL (dbml) or Entity Framework (edmx) classes in the DAL? Is there any harm in having a mix of data access methods in the DAL, or in the same class?
Generally there is no harm, but unless you plan a slow upgrade, replacing parts of the legacy application as you do new development, I would not do it. It will turn the whole application into one big mess of many technologies, make maintenance worse, and it can also muddy its design/architecture.
The exception can be implementing a new component of the application that is isolated from the rest. In such a case you can probably design it from scratch and use newer technology, but it can still be a nuisance for the support/maintenance team.
I have the main website that uses a database to store and access user accounts. I'm using EF to manage the schema. I also defined site-specific POCOs and have migrated them to the database.
Now, what if I want a separate website, for example, a resource server (Web API) that would expose (with authorization) the same data set up on the main website?
Do I create the same POCOs and derived DbContext on the resource server again? That seems like duplicating work, though.
What if I wanted to create new POCOs on the resource server and reflect them onto that same database? Wouldn't that conflict with the current migration (which is saved on the database), then subsequently mess up the EF setup on the main website?
I've seen the suggestion of putting the POCOs and DbContexts in a library and having multiple projects reference that same library. This seems viable; however, I'd have to hard-code the connection string, which seems dirty to me.
I'm starting to think that EF is probably not recommended for this kind of setup. It seems like a database-first approach plays better here, though I would have to manually re-edit the data contexts (most likely LINQ to SQL) for every database schema change.
Are there any lesser-known capabilities, facts, practices, etc., for/about EF that would help in this situation?
Generally, you can avoid duplication by having one API serve both sites, versioning its resources for each site if needed. On the other hand, if you choose a reuse-and-extend approach, creating additional EF code-first entities should not interfere with the other site's data layer if they are modeled and mapped carefully. And the DbContext connection string does not have to be hard-coded.
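For example, a code-first context in a shared library can take its connection string (or the name of a web.config connection-string entry) from whichever site is hosting it. A minimal sketch, with placeholder entity and key names:

```csharp
using System.Data.Entity;

// Shared library referenced by both the main site and the resource server.
public class UserAccount
{
    public int Id { get; set; }
    public string UserName { get; set; }
}

public class AccountsContext : DbContext
{
    // Each host passes its own connection string, or the name of a
    // <connectionStrings> entry in its own web.config, so nothing in
    // the shared library is hard-coded.
    public AccountsContext(string nameOrConnectionString)
        : base(nameOrConnectionString) { }

    public DbSet<UserAccount> UserAccounts { get; set; }
}
```

Each website then constructs the context with something like new AccountsContext("MainSiteDb"), where "MainSiteDb" is a connection-string entry in that site's own config, and the shared library stays connection-agnostic.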
Team A has an enterprise app that uses ADO.NET for data access and executes stored procedures. The data access is encapsulated in its own project (let's call it DAL.dll).
Team B is creating another, unrelated app that reuses the stored procedures in the enterprise app. This app currently uses the MS application block for data access. The issue we run into is that whenever Team A makes any change to the input/output params of the stored procedures, Team B's app hits a runtime error and needs to be updated to accommodate the added (or removed) params. Most of these changes go unnoticed until a user complains. At the very least, we would like the app to throw a compilation error so that the build process warns us of the changes.
One way to do this is to have Team B's project add a reference to the DAL.dll
I'd like to know if there are any other cleaner ways of solving the issue. We are ready to replace Team B's MS Data application block to use a different technology (Entity Framework?) if necessary.
Among the other answers, I'd strongly suggest getting those stored procedures into source control, in a Database Project. You then may be able to use the features of your source control system to do several things:
Lock some of the code so that it cannot be changed
Give you notifications if the code is changed
Warn you if the stored procedures change in a way that would prevent them from being called
Branch the stored procedures so that each team can have their own version of changed code, while keeping the unchanged stored procedures common. You of course will need to separate the different versions in the database.
I agree with the other posters on this thread that you should not share stored procedures across different .NET DLLs; that is just a recipe for disaster. I would also shy away from ORMs like Entity Framework if you are doing anything at all complicated with your database schema, because ORMs excel at translating a simple object model from your .NET application classes into SQL tables and SPs, but traditionally do poorly at optimizing them for performance on the database side. There will be people who claim otherwise, and they may have a valid point if, like them, you are an expert at wrangling an ORM to do what you want; chances are you are not, and it will cause you headaches in the long run.
A shared data access layer might work, but conceptually you are then just changing the implementation of the dependency from some code that a DBA wrote to some code that a .NET programmer wrote. Yes, you can use integration tests to achieve better verifiability, but the same case could be made for SQL with tools like Red Gate's SQL Test. I would shy away from this approach if the two applications are already experiencing some sort of pain from sharing SPs. That is an indication that the dependency should just be done away with.
If it were up to me, I'd just make a new schema for Team B's app. You can read more about schemas in SQL Server here: MSDN Schema description for 2008 R2. You can think of them as namespaces for SQL Server, but with some additional bells and whistles like permissions and access control. Separating your different applications into separate schemas on the same shared database will probably make for the most flexible implementation in the long run.
unrelated app that's reusing the stored procedures in the enterprise app
If these two applications are really unrelated, why are they sharing stored procedures, or even the same database? I know it's a long read, but I recommend this: A Better Path to Enterprise Architectures
The partitioning concept in there relates to the bounded context in Domain-Driven Design:
Multiple models are in play on any large project. Yet when code based on distinct models is combined, software becomes buggy, unreliable, and difficult to understand. Communication among team members becomes confusing. It is often unclear in what context a model should not be applied.
Therefore: Explicitly define the context within which a model applies. Explicitly set boundaries in terms of team organization, usage within specific parts of the application, and physical manifestations such as code bases and database schemas. Keep the model strictly consistent within these bounds, but don’t be distracted or confused by issues outside.
You can expect to end up with problems when you don't deal with this explicitly. You're lucky you're seeing early failures; this can turn into problems that are much harder to find in the long run.
Analyze the problem again with the above in mind. Consider if you're missing some explicit context where this common functionality should live.
My question is: which team owns the shared stored procedures and database? Usually, as a matter of good architecture/design, you should not have two different apps sharing the same database/procedures.
A better way to share data/functionality between two different applications is through a service or API, so that the team that owns the functionality is responsible for maintaining it.
Also, good communication between the two teams is highly recommended.
Depending on the owner of the DAL project, you could host web services and share the API. That way, you separate the Data Access Layer from the business logic, which allows anyone to use the same DAL without having to publish it to each different location.
From my point of view, it looks like both Team A and Team B should share the same core model and look at Multitier architecture as a possible solution.
It sounds like it would make sense to create a shared DAL that both applications can share.
I would add unit tests (or really integration tests) to make sure the DAL remains compatible with the apps after changes. That way your tests would fail if incompatible changes had been made.
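For instance, a minimal sketch assuming NUnit and a hypothetical OrderDal class in the shared DAL (all names invented for illustration):

```csharp
using NUnit.Framework;

[TestFixture]
public class OrderDalIntegrationTests
{
    // Runs against a dedicated test database. If Team A changes a stored
    // procedure's parameters, this fails during the build instead of in
    // front of a user.
    [Test]
    public void GetOrderById_ReturnsSeededOrder()
    {
        var dal = new OrderDal("Server=testdb;Database=SalesTest;Integrated Security=true");
        var order = dal.GetOrderById(1); // assumes a seeded test row with Id = 1

        Assert.IsNotNull(order);
        Assert.AreEqual(1, order.Id);
    }
}
```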
"I'd like to know if there are any other cleaner ways of solving the issue."
The cleanest way is for Team B to sit down with Team A and encapsulate the relevant business logic into a shared API. It doesn't matter so much how you implement that API; what does matter is that the API's interface is documented and versioned so everyone knows what to expect.
One reasonable mechanism for this in a .NET environment is to use Microsoft's WebAPI.
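A rough sketch of what that could look like, assuming Web API 2 with attribute routing enabled and a hypothetical OrderDal wrapping the stored procedures:

```csharp
using System.Web.Http;

// Owned by the team that owns the data. The versioned route makes the
// contract explicit: parameter changes become a visible API change
// rather than a runtime surprise in the consuming app.
public class OrdersController : ApiController
{
    private readonly OrderDal _dal = new OrderDal();

    [HttpGet]
    [Route("api/v1/orders/{id}")]
    public IHttpActionResult GetOrder(int id)
    {
        var order = _dal.GetOrderById(id);
        if (order == null)
            return NotFound();
        return Ok(order);
    }
}
```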
In short, the question of "how do we share a stored procedure?" is most likely looking at the wrong level of abstraction.
I am developing a mid-size application and want to implement an application architecture. I've read some architecture books and approaches, and I'm thinking about:
AAFN (Application Architecture for .NET), presented by Microsoft
SOA
SDLM
SDO
MVC
and so on ...
This is a web application that will be extended with some other small applications (think of something like an MIS with one (or two) cores).
Which projects should I have? I'm thinking about:
Common // used in all projects
Framework // main framework
DAO // data access objects (Entity Framework or NHibernate)
UI // will be available in two variants: web and Windows (WPF) interfaces
BusinessEntities // all sub-application project logic goes here
ApplicationNameProject // each application has its own logic (in BusinessEntities)
ApplicationUnit // each application's entities go here
ApplicationNameProject // each application's data entities (in ApplicationUnit)
Services // WCF services go here, shared by all applications
This is the architecture I'm thinking about. I'm under no obligation to use it; I want to know what's the best fit for me, and I can change all of it, add other projects, or remove some of these.
Any help appreciated.
There is no "best small or mid-size application architecture" as a silver bullet to fit any project, so drop that idea right now or you'll be in for a world of pain down the road.
The architecture for any given project should fit the purpose of that project. In some cases, ASP.NET WebForms with direct queries against the database will be the most appropriate architecture; in some cases, MVC will be the right architecture; and in some cases it will be a Windows Forms application built on top of a web service that connects to a relational database through an ORM like LINQ to SQL or NHibernate.
You can't decide on a one-architecture-fits-all approach, it just doesn't work. Each architecture has its merits and weaknesses and thus projects for which it is well suited and projects for which it should be avoided. You should pick the approach that makes the most sense for the current project/scenario.
Given that however, I tend to take a fairly uniform approach.
If I need a quick utility project that does a very specific thing and is highly unlikely to be needed for anything else, I might use a console application with queries against my database hardcoded.
If I need a common set of queries that I'm likely to need from multiple projects, I'll write them as stored procedures to get the performance benefits and build a data access layer that will leverage these stored procedures to give me standardized business objects, in a standard DAL (data access layer)/BOL (business object layer)/BLL (business logic layer) approach. This is advantageous because it means that once I've got this set of libraries built I can float any application over the top - for instance a webforms or MVC application.
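In outline, that looks something like this (a sketch only; the Customer object, the dbo.GetCustomerById procedure, and the column names are all placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

// BOL: a plain business object any front end can consume.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// DAL: wraps the stored procedure and hands back standardized objects.
public class CustomerDal
{
    private readonly string _connectionString;

    public CustomerDal(string connectionString)
    {
        _connectionString = connectionString;
    }

    public Customer GetCustomerById(int id)
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand("dbo.GetCustomerById", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@Id", id);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (!reader.Read()) return null;
                return new Customer
                {
                    Id = (int)reader["Id"],
                    Name = (string)reader["Name"]
                };
            }
        }
    }
}
```

The BLL then works purely in terms of Customer objects, so a webforms or MVC front end never touches SQL directly.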
MVC is advantageous because of separation of concerns - your controller can interact with your business library simply to access the data it needs and your views are really just that - a view of the data that the user can interact with. The views do nothing more than take the current data view to the user and transport any data changes back from the user to the controller - no logic is held in the view and as such it means that it's far easier to unit test and make changes to components without affecting the rest of the application.
The drawback to a multi-tiered or multi-layered approach like this though is that it takes time to architect it properly and if you're only after a throw-away utility application like they demonstrate on stage at developer conferences then this is complete overkill and I wouldn't bother with it.
Think of it like this: every layer, every library, every component requires justification. If there is less justification for it than against it, then don't do it. The key is not to do anything without reason. Anything you do is correct provided you have a well-thought-out reason for it; by well thought out, I mean that you've weighed good reasons for and against and made an educated decision, not a decision based on half-thoughts or, worse, no thought at all.
Anything but the most trivial .NET application should have several projects: a UI layer, some kind of business logic layer, a persistence (storage) layer and accompanying test projects. Each project should interact loosely through interfaces.
In general you should create the minimum number of layers you need to make your code testable and easy to understand.
To figure out what the minimum is that you need it can be a good idea to let your tests drive the internal design of the system. Each layer should have tests in its own right, with (possibly) the exception of the top HTML layer and the bottom SQL layer.
With that in mind it helps to separate concerns as far as possible. For example SQL queries should almost never be in the same block of code as HTML support: split things into multiple layers that each do one and only one thing. This makes changes easier.
Be aware of the difference between systems architecture (where loosely coupled Web services using e.g. REST interact) and the internal design of the system. It's a good idea to decouple the Web service interfaces (as consumer or provider) in their own layers as this is an area that often changes.
These designs are an art that's best learned by practice. With good unit tests you should find refactoring an application design fairly swift, so it's a good idea to look at technologies like Spring.NET or other inversion of control containers to make this easy.
We are considering/evaluating EF for a new ASP.NET project.
Please tell us if you are using, or have used, EF v1 on a project.
Please also tell us what your experience with EF was like in web or desktop applications.
Thank you.
We used EF for a medium-sized internal project. It was an n-tier app with a server containing business logic and a self-contained EF layer. The client was a WPF app (connecting to the server via WCF).
There is a lot to like about EF, and it can make some aspects of your DA layer very quick to write, but one thing I will say is that it currently does not have very good support for disconnected applications. It works fantastically if your application is very self-contained, has a direct connection to the database, and uses one data context throughout; the data context manages your data objects, pulling data from and pushing updates to the database appropriately.
As soon as you try to disconnect your client in any form of n-tier structure, though, things get harder. You either have to manage detaching and reattaching your entities from the data contexts, or you have to somehow serialize your data context across to the client. You have to use multiple data contexts (partly because ours was a stateless server anyway, but also because you'd get into a massive mess trying to use a single data context for multiple clients), and it all becomes a bit trickier to manage. Part of our solution was to have separate "business objects" that were created from the lower-level "EF data objects". EF would then manage the data objects (persisting them to and loading them from the database, etc.), but our own BLL layer would manage the business objects. Both saving and loading required translation from the higher-level objects to the lower-level ones, or vice versa.
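Conceptually, that translation step looks something like the sketch below (an illustrative example with invented names, not the actual project code):

```csharp
// Lower-level data object that EF manages (a stand-in for a generated entity).
public class CustomerEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Higher-level business object that the BLL manages and the WPF client sees.
public class CustomerBo
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerTranslator
{
    // Loading: EF pulls the entity from the database, and the BLL hands
    // a business object across the WCF boundary to the client.
    public static CustomerBo ToBusinessObject(CustomerEntity entity)
    {
        return new CustomerBo { Id = entity.Id, Name = entity.Name };
    }

    // Saving: the business object comes back from the client and is
    // translated into an entity for a fresh data context to persist.
    public static CustomerEntity ToEntity(CustomerBo bo)
    {
        return new CustomerEntity { Id = bo.Id, Name = bo.Name };
    }
}
```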
All in all things have worked OK, but in hindsight I would say that EF is not fully ready for serious enterprise-level development. I have heard that the next version of EF in .NET 4.0 has much better support for disconnected and n-tier apps, but I have not tried it out personally.
I am in the early stages of planning a conversion of a large classic ASP database application to ASP.NET, and I'm having trouble picking a data access method. I have played around with LINQ to SQL, Dynamic Data, strongly typed DataSets, Enterprise Library (Data Access Application Blocks), and a tiny bit with Entity Framework, but none of them have jumped out at me as "the one". There are just too many choices; my head is swimming, help me choose!
Perhaps it would help to give some background on the application that I am converting along with the priorities...
The back end is Microsoft SQL Server (2005 or later) and we are committed to that, so I don't need to worry about ever supporting a different database platform.
The database is very mature and contains a great deal of the business logic. It is highly normalized and makes extensive use of stored procedures, triggers, and views. I would rather not reinvent two wheels at the same time, so I'd like to make as few changes to the database as possible. So, I need to choose a data access method that is flexible enough to let me work around any quirks in the database.
The application has many data entry forms and extensive searching and reporting capabilities (reports are another beast which I will tackle later).
The application needs to be flexible enough to deal with minor changes to the database structure. The application (and database) may be installed at different sites where minor custom modifications are made to the database. Ideally the application could identify the database extensions and react appropriately. In other words, if I need to store an O/R mapping in the application, I need to be able to swap that out (or refresh it easily) when installing the application and database at a new site.
Rapid application development is critical. Since the database is already done and the user interface is going to closely match the existing application, I'm hoping to find something where we can crank this out fairly quickly. I am willing to sacrifice the absolute latest and greatest technology if it will save time in development. In other words, if there is a steep learning curve to something like Entity Framework, I'm fine with going with something like strongly typed DataSets and a custom DAL if it will speed up the process.
I am a total newbie to ASP.NET but am intimately familiar with classic ASP, T-SQL, and the old ADO (e.g. disconnected recordsets). If any of the data access methods is better suited for someone coming from my background, I might lean in that direction.
Thanks for any advice that you can offer!
Look at all three articles in this series:
High Performance Data Access Layer Architecture Part 1
Great advice.
You may want to look at decoupling the database layer from the ASP layer, so that you not only gain more flexibility in making the decision, but also, when you have to make changes to a customer's database, you can just swap in a new DLL without changing anything else.
By using dependency injection you can use XML to tell the framework which concrete class to use for an interface.
The advantage to doing this is that you can go with one database approach, and if you later decide to change to another, you can just change the DLL and carry on without making changes to the other layers.
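A minimal sketch of that idea, assuming the concrete type's name is kept in an appSettings entry (all names here are placeholders):

```csharp
using System;
using System.Configuration;

// The rest of the application depends only on this interface.
public interface ICustomerRepository
{
    string GetCustomerName(int id);
}

public static class RepositoryFactory
{
    // web.config decides which concrete class (and therefore which DLL)
    // is used, e.g.:
    //   <add key="CustomerRepositoryType"
    //        value="MyApp.Data.LinqToSql.CustomerRepository, MyApp.Data.LinqToSql" />
    public static ICustomerRepository Create()
    {
        string typeName = ConfigurationManager.AppSettings["CustomerRepositoryType"];
        Type type = Type.GetType(typeName, throwOnError: true);
        return (ICustomerRepository)Activator.CreateInstance(type);
    }
}
```

Swapping the data access technology then means deploying a different DLL and editing one config entry; nothing else in the application changes.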
Since you are more familiar with it, why not just go directly to the database for the moment by making your own connections? Then you can move the rest of your code over, and along the way decide which of the myriad technologies to use.
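Coming from classic ADO's disconnected recordsets, the plain ADO.NET equivalent is short. A sketch with placeholder table and column names:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class CustomerQueries
{
    // Familiar territory coming from classic ADO: fill a disconnected
    // DataSet, bind it to the page, and pick an ORM later if needed.
    public static DataSet GetCustomers(string connectionString)
    {
        using (var adapter = new SqlDataAdapter(
            "SELECT Id, Name FROM dbo.Customers", connectionString))
        {
            var ds = new DataSet();
            adapter.Fill(ds, "Customers");
            return ds;
        }
    }
}
```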
For a new application I am working on, I am starting with LINQ to SQL, mainly because development will be quicker; later, if I decide it won't meet my needs, I will just swap it out.
NHibernate might be a good fit. You can store the mappings in external configuration files, which would address your needs. Another option might be ActiveRecord, which is built on top of NHibernate.
NHibernate has a neat feature which you might find helpful: the dynamic component, which is basically a name/value pair collection populated by pulling the column names from the mapping file. So when you add a column at a client site, you update the mapping file and you can access the new data through a collection on the object.
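On the C# side that looks roughly like the sketch below; the dynamic-component element in the NHibernate mapping file lists the site-specific columns (class and property names here are invented):

```csharp
using System.Collections;

public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }

    // Mapped with NHibernate's <dynamic-component>: each dictionary entry
    // corresponds to a column listed in the mapping file, so site-specific
    // columns become available without recompiling the application.
    public virtual IDictionary Attributes { get; set; }
}
```

After loading, something like customer.Attributes["Region"] would hold the value of a site-specific Region column that exists only in that site's mapping file.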