I am in the early stages of planning a conversion of a large classic ASP database application to ASP.Net and I'm having trouble picking out which data access method to use. I have played around with Linq To SQL, Dynamic Data, strongly typed datasets, Enterprise Library (Data Access Application Blocks), and a tiny bit with Entity Framework, but none of them have jumped out to me as "the one". There are just too many choices - my head is swimming, help me choose!
Perhaps it would help to give some background on the application that I am converting along with the priorities...
The back end is Microsoft SQL Server (2005 or later) and we are committed to that, so I don't need to worry about ever supporting a different database platform.
The database is very mature and contains a great deal of the business logic. It is highly normalized and makes extensive use of stored procedures, triggers, and views. I would rather not reinvent two wheels at the same time, so I'd like to make as few changes to the database as possible. So, I need to choose a data access method that is flexible enough to let me work around any quirks in the database.
The application has many data entry forms and extensive searching and reporting capabilities (reports are another beast which I will tackle later).
The application needs to be flexible enough to deal with minor changes to the database structure. The application (and database) may be installed at different sites where minor custom modifications are made to the database. Ideally the application could identify the database extensions and react appropriately. In other words, if I need to store an O/R mapping in the application, I need to be able to swap that out (or refresh it easily) when installing the application and database at a new site.
Rapid application development is critical. Since the database is already done and the user interface is going to closely match the existing application, I'm hoping to find something where we can crank this out fairly quickly. I am willing to give up using the absolute latest and greatest technology if it will save time in development. In other words, if there is a steep learning curve to something like Entity Framework, I'm fine with going with something like strongly typed DataSets and a custom DAL if it will speed up the process.
I am a total newbie to ASP.Net but am intimately familiar with Classic ASP, T-SQL and the old ADO (e.g. disconnected recordsets). If any of the data access methods is better suited for someone coming from my background, I might lean in that direction.
Thanks for any advice that you can offer!
Look at all three articles in this series:
High Performance Data Access Layer Architecture Part 1
You may want to look at decoupling the database layer from the ASP layer. That not only gives you more flexibility in making the decision, but also means that when you have to make changes to a customer's database you can just swap in a new DLL without changing anything else.
By using dependency injection, you can use XML to tell the framework which concrete class to use for an interface.
The advantage of doing this is that you can go with one database approach now, and if you later decide to change to another, you can just change the DLL and carry on without touching the other layers.
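A minimal sketch of that idea, using plain .NET configuration instead of a full DI container (all names here, such as ICustomerRepository, are hypothetical):

```csharp
using System;
using System.Configuration;

// The interface the ASP.NET layer codes against.
public interface ICustomerRepository
{
    string GetCustomerName(int id);
}

public static class DataAccessFactory
{
    // Reads the concrete type name from web.config, e.g.
    // <appSettings>
    //   <add key="CustomerRepositoryType"
    //        value="MyApp.Data.SqlCustomerRepository, MyApp.Data" />
    // </appSettings>
    public static ICustomerRepository CreateCustomerRepository()
    {
        string typeName = ConfigurationManager.AppSettings["CustomerRepositoryType"];
        Type type = Type.GetType(typeName, throwOnError: true);
        return (ICustomerRepository)Activator.CreateInstance(type);
    }
}
```

Swapping in a customer-specific DAL then means deploying a different DLL and changing one config value; a container such as Unity or Castle Windsor gives you the same switch with more features.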
Since you are more familiar with it, why not just go directly to the database for now by making your own connections? Then you can port the rest of your code, and along the way you can decide which of the myriad technologies to use.
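If you go that route, the ADO.NET version of what you did in classic ASP looks roughly like this (the connection string and procedure name are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

public static class OrderData
{
    // Much like the classic ADO pattern, with the DataTable playing the
    // role of a disconnected recordset.
    public static DataTable GetOrders(int customerId)
    {
        using (var conn = new SqlConnection("your-connection-string"))
        using (var cmd = new SqlCommand("dbo.GetOrders", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerId", customerId);

            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(table);   // opens and closes the connection itself
            }
            return table;
        }
    }
}
```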
For a new application I am working on, I am starting with LINQ to SQL, mainly because development will be quicker; if I later decide it won't meet my needs, I will just swap it out.
NHibernate might be a good fit. You can store the mapping in external configuration files, which would solve your needs. Another option might be ActiveRecord, which is built on top of NHibernate.
NHibernate has a neat feature you might find helpful, called a dynamic component: basically a name/value pair collection populated by pulling the column names from the mapping file. So when you add a column at a client site, you update the mapping file and you can access the new data through a collection on the object.
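As a rough sketch of that feature (NHibernate's mapping element is called dynamic-component; the entity, property, and column names below are invented):

```csharp
using System.Collections;

// Entity with a dynamic component mapped in Customer.hbm.xml, e.g.:
//   <dynamic-component name="SiteExtensions">
//     <property name="LoyaltyCode" column="LoyaltyCode" type="String" />
//   </dynamic-component>
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }

    // Name/value pairs for site-specific columns; add a column at a client
    // site, update the mapping file, and it shows up in this collection.
    public virtual IDictionary SiteExtensions { get; set; }
}

// Reading a site-specific column without recompiling:
// var code = (string)customer.SiteExtensions["LoyaltyCode"];
```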
Team A has an enterprise app that uses ADO.NET for data access, executing stored procedures. The data access is encapsulated in its own project (let's call it DAL.dll).
Team B is creating another, unrelated app that reuses the stored procedures from the enterprise app. This app currently uses the MS application block for data access. The issue we run into is that whenever Team A makes any change to the input/output params in the stored procedures, there is a runtime error in Team B's app, and that app then needs to be updated to accommodate the additional params (or params that were removed). Most of these go unnoticed until a user complains. At the very least, we would like the app to throw a compilation error so that the build process warns us of the changes made.
One way to do this is to have Team B's project add a reference to the DAL.dll
I'd like to know if there are any other cleaner ways of solving the issue. We are ready to replace Team B's MS Data application block to use a different technology (Entity Framework?) if necessary.
Among the other answers, I'd strongly suggest getting those stored procedures into source control, in a Database Project. You then may be able to use the features of your source control system to do several things:
Lock some of the code so that it cannot be changed
Give you notifications if the code is changed
Warn you if the stored procedures change in a way that would prevent them from being called
Branch the stored procedures so that each team can have their own version of changed code, while keeping the unchanged stored procedures common. You of course will need to separate the different versions in the database.
I agree with the other posters on this thread that you should not share stored procedures across different .NET DLLs; that is just a recipe for disaster. I would also shy away from ORMs like Entity Framework if you are doing anything at all complicated with your database schema, because ORMs excel at translating a simple object model from your .NET application classes into SQL tables and SPs, but traditionally do poorly at optimizing them for performance on the database side. There will be people who claim otherwise, and they may have a valid point if you are an expert at wrangling an ORM to do what you want like they are, but chances are you are not, and it will cause you headaches in the long run.
A shared data access layer might work, but conceptually you are then just changing the implementation of the dependency from code that a DBA wrote to code that a .NET programmer wrote. Yes, you can use integration tests to achieve better verifiability, but the same case could be made for SQL with tools like Red Gate's SQL Test. I would shy away from this approach if the two applications are already experiencing pain from sharing SPs; that is an indication that the dependency should just be done away with.
If it were up to me, I'd just make a new schema for Team B's app. You can read more about schemas in SQL Server here: MSDN Schema description for 2008 R2. You can think of them as namespaces for SQL Server, but with some additional bells and whistles like permissions and access control. Separating your different applications into separate schemas on the same shared database will probably make for the most flexible implementation in the long run.
"unrelated app that's reusing the stored procedures in the enterprise app"
If these two applications are really unrelated, why are they sharing stored procedures, or even the same database? I know this is a long read, but I recommend you read this: A Better Path to Enterprise Architectures
The partitioning concept in there relates to the bounded context in domain-driven design:
Multiple models are in play on any large project. Yet when code based on distinct models is combined, software becomes buggy, unreliable, and difficult to understand. Communication among team members becomes confusing. It is often unclear in what context a model should not be applied.
Therefore: Explicitly define the context within which a model applies. Explicitly set boundaries in terms of team organization, usage within specific parts of the application, and physical manifestations such as code bases and database schemas. Keep the model strictly consistent within these bounds, but don’t be distracted or confused by issues outside.
You can expect problems when you don't deal with this explicitly. You're lucky you're seeing failures early; this can turn into problems that are much harder to find in the long run.
Analyze the problem again with the above in mind. Consider if you're missing some explicit context where this common functionality should live.
My question is: which team owns the stored procedures and the shared database? As a matter of good architecture/design, you should not have two different apps sharing the same database/procedures.
A better way to share data/functionality between two different applications is through a service or API, so that the team who owns the functionality is responsible for maintaining it.
Also, good communication between both teams is highly recommended.
Depending on the owner of the DAL project, you could host web services and share the API. That way, you separate the Data Access Layer from the business logic, which allows anyone to use the same DAL without having to publish it to each different location.
From my point of view, it looks like both Team A and Team B should share the same core model and look at Multitier architecture as a possible solution.
It sounds like it would make sense to create a shared DAL that both applications can share.
I would add unit tests (or really integration tests) to make sure the DAL is compatible with the apps after changes. That way your tests would fail if incompatible changes were made.
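A sketch of what one such test could look like with NUnit (the procedure name and connection string are hypothetical); it asks SQL Server for the procedure's current parameter list and fails the build's test stage instead of letting a user find the break:

```csharp
using System.Data;
using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class StoredProcedureContractTests
{
    // Hypothetical connection string to a test database.
    private const string ConnectionString = "connection string to a test database";

    [Test]
    public void GetCustomerOrders_ParameterListIsUnchanged()
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand("dbo.GetCustomerOrders", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();

            // Pulls the procedure's actual parameters from the server.
            SqlCommandBuilder.DeriveParameters(cmd);

            // @RETURN_VALUE plus the one parameter the app was built against.
            Assert.AreEqual(2, cmd.Parameters.Count);
            Assert.IsTrue(cmd.Parameters.Contains("@CustomerId"));
        }
    }
}
```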
"I'd like to know if there are any other cleaner ways of solving the issue."
The cleanest way is for Team B to sit down with Team A and encapsulate the relevant business logic into a shared API. It doesn't matter so much how you implement that API; what does matter is that the API's interface is documented and versioned so everyone knows what to expect.
One reasonable mechanism for this in a .NET environment is to use Microsoft's WebAPI.
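For example, a minimal endpoint with ASP.NET Web API might look like this (the controller and DTO names are hypothetical, and routing configuration is omitted):

```csharp
using System.Web.Http;

// DTO that defines the documented contract; changing it is a visible,
// versioned API change rather than a silent stored-procedure edit.
public class OrderDto
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

// Routed as e.g. GET /api/orders/42.
public class OrdersController : ApiController
{
    public OrderDto Get(int id)
    {
        // Internally this can still call Team A's stored procedures;
        // Team B only ever sees the published contract.
        return new OrderDto { OrderId = id, Total = 0m };
    }
}
```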
In short, the question of "how do we share a stored procedure?" is most likely looking at the wrong level of abstraction.
My company has a fairly old fat client application written in Delphi. We are very interested in replacing it with a shiny new web application. This will make maintenance a breeze and many clients want a web application.
The application is extremely rich in domain knowledge, some of which is out of our control. Our clients use the program to manage their own clients and report them to the government. So an inaccurate program is a pretty big thing. The old program has no tests. We are not sure yet if we will implement automated testing with the new one.
We first planned to basically start from scratch. But we are short-handed and want to get everyone onto the web as soon as possible. So instead of starting from scratch, we've decided to try to make use of the legacy fat-client database.
The database is SQL Server and can be used in SQL Server 2008 easily. It is very rich in stored procedures and functions, has a few triggers, and has lots of tables with over 80 columns... but it is decently normalized. We want both the web application and the fat client to be able to use the same database, so that if something breaks badly in the web application, our clients can still use the fat client and connect to our servers. After the web application is considered "stable", we'd deprecate the fat client.
Has anyone else done this? What tips can you give? After getting everyone onto the website, we want to slowly change the database structure to take care of some design deficiencies. What is the best way to isolate this in a data access layer so that later changes are easy?
And what about actually making the screens? Is there any way easier than just rewriting an 80-field form in ASP.NET? Are there any tools that can make this easier?
The current plan is to use ASP.Net WebForms (.Net 3.5). I'd really like to use MVC, but no one on the team knows it including me.
"We are not sure yet if we will implement automated testing with the new one."
Implement automated testing. What's the point in replacing one buggy program with another?
Good question, but "slowly changing" the DB structure after getting everyone onto the website sounds like a joke...
I would rather take the opportunity to create a fresh DB structure, write a bulletproof migration script for your DB that you can try out and rewrite a zillion times without any side effects for your clients, and then build whatever you want (fat/web) on the new DB, have it tested, and migrate everyone when it's ready.
I have a couple suggestions:
1) Create a service layer to abstract away the dependence on the DAL. In a situation like the one you describe, having a layer of indirection for the UI and BLL to rely on makes DB changes much safer (see the sketch after this list).
2) Create automated tests (both unit and integration), especially if you plan on making fairly significant changes to the domain or persistence layers (BLL/DAL). To make this really easy you should always try to program to an interface. This makes your code more flexible, as well as letting you use mocking frameworks (Moq is one I like) to ensure your tests truly are unit tests and not integration tests.
3) Take a look at DDD (http://domaindrivendesign.org/), as it seems to fit pretty well with the given scenario. At the very least there are some very useful patterns that can help make your application more flexible.
4) MVC isn't very hard to learn at all; it is, however, an easy way to get unit testing set up for the UI, as a result of the MVC architecture (testing the controller and not the view). That said, there is no reason you couldn't unit test web forms; it's just a bit more work. MVC really is just a UI framework/design pattern (more Model2, but we can ignore that for now). It gets you closer to the metal, so to speak, as you will be writing a lot more HTML and using a Model (the 'M') for passing data around.
For DDD take a look at Eric Evans book: http://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215/ref=sr_1_1?s=books&ie=UTF8&qid=1317333430&sr=1-1
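A small sketch of points 1 and 2 together, with a hypothetical service interface and a Moq-based unit test (all names are invented):

```csharp
using Moq;
using NUnit.Framework;

// Point 1: the UI and BLL depend on this interface, not on the DAL
// directly, so the persistence layer can change underneath it.
public interface IClientRepository
{
    bool Exists(int clientId);
}

// A thin service the UI calls; it only knows about the interface.
public class ReportService
{
    private readonly IClientRepository _clients;
    public ReportService(IClientRepository clients) { _clients = clients; }

    public bool CanSubmitReport(int clientId)
    {
        return _clients.Exists(clientId);
    }
}

[TestFixture]
public class ReportServiceTests
{
    // Point 2: Moq stands in for the DB-backed repository, keeping this a
    // true unit test with no database involved.
    [Test]
    public void CanSubmitReport_IsTrue_WhenClientExists()
    {
        var repo = new Mock<IClientRepository>();
        repo.Setup(r => r.Exists(42)).Returns(true);

        var service = new ReportService(repo.Object);
        Assert.IsTrue(service.CanSubmitReport(42));
    }
}
```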
Hope that helps
ASP.NET Web Forms is a non-starter; it is completely inappropriate for something like this. I recommend starting with something like Creating an OData API for StackOverflow including XML and JSON in 30 minutes, then building your web app on top of that (i.e. push it to the client, use jQuery/Silverlight).
I hope you can help me out with any advice, it'd be greatly appreciated.
I've been using EF 4.0 for a while now, using the following ObjectContext management technique: http://dotnetslackers.com/articles/ado_net/Managing-Entity-Framework-ObjectContext-lifespan-and-scope-in-n-layered-ASP-NET-applications.aspx . I have a fairly simple setup with a web project connecting to a BLL connecting to a DAL. The web and BLL projects reference the DAL entity objects. It's been functioning fine, but it seems very slow. It's an ASP.NET WebForms application that uses an existing database model (i.e. not code first) and points to a SQL Server 2005 DB.
Anyway, I'm now revisiting the architecture as we're getting complaints over screen to screen performance. I've done most of the UI enhancements possible but I think it's just the Save, Redirect and Load using EF that's the sluggish point now.
The site is a series of quote pages for car insurance. I'm now hoping to create the relevant objects in session: on page 1, create the quote object and populate its fields; on page 2, add X additional drivers; on page 3, add claims/convictions; and then save the objects to the database. Users will be able to save and exit at any point in the quote process, so the save won't always be at the end. We also need to be able to load page 1's info and update it in memory when they click Next, and finally update the DB when they're ready to finish the quote.
At the moment, we're doing the retrieves and saves on the same page. How would you advise we move to storing in session with a final commit?
I've trawled through various MSDN pages and I'm having trouble putting it all together into the latest 'best practice' for 4.1 for this fairly simple application. I've looked at Julie Lerman's videos, but her n-tier example only did a simple retrieve and 'Add', suspiciously leaving out the Update section, which I suspect is not straightforward. Do you think I should use Self-Tracking Entities (I've read that people are having multiple issues with these, but maybe their architecture is more complex?) or some other way of storing the EF objects in session and making the changes?
Any help/ideas greatly appreciated.
Storing data in Session can be an effective method to minimise IO, but it's not a simple decision. You shouldn't put large amounts of data in Session State and we can't tell from your description what kind of volume of data you're talking about, or the number of concurrent active sessions.
There's also the question of your Session State provider. If you're using a single server, you'll probably use in-process Session State, which is quick, but if you have lots of users and lots of data, you can soon run into memory pressure issues.
If you're using a web farm, you'll have to use shared Session State, possibly using the SQL Server Session State provider, so you'll end up reading from and writing to a database on every interaction, which could be worse.
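If you do go the session route, a minimal sketch of the detached-object flow with an EF 4 ObjectContext might look like this (the context, entity, and property names are hypothetical, and this deliberately avoids Self-Tracking Entities):

```csharp
using System.Data;                 // EntityState
using System.Web.SessionState;     // HttpSessionState

// Hypothetical page-flow helpers; QuoteContext and Quote stand in for the
// ObjectContext and entity generated from the existing model.
public static class QuoteSessionFlow
{
    // Page 1: create the quote in memory and stash it in session.
    public static void StartQuote(HttpSessionState session)
    {
        session["CurrentQuote"] = new Quote();
    }

    // Pages 2..n: edit the detached object without touching the database.
    public static void AddDriver(HttpSessionState session, Driver driver)
    {
        var quote = (Quote)session["CurrentQuote"];
        quote.Drivers.Add(driver);
    }

    // Save-and-exit or final page: attach the detached object and persist.
    public static void CommitQuote(HttpSessionState session)
    {
        var quote = (Quote)session["CurrentQuote"];
        using (var context = new QuoteContext())
        {
            if (quote.QuoteId == 0)
            {
                context.Quotes.AddObject(quote);    // brand-new quote
            }
            else
            {
                context.Quotes.Attach(quote);       // previously saved quote
                context.ObjectStateManager.ChangeObjectState(
                    quote, EntityState.Modified);
            }
            context.SaveChanges();
        }
    }
}
```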
However, step 1 is to make sure you understand the problem. Don't make assumptions about where your performance problems are and try to redesign those. Use profiling or instrumentation techniques to identify the real bottlenecks and concentrate your efforts on those.
You might be surprised as to where your problems lie. It may well simply be a database optimisation issue.
I have an ASP.NET project and I want to know what is better to use.
An ODBC connection with Server Explorer (drag and drop to make a DataSet and modify it), or some DBconnect class with a connection to the database and queries, used to feed a GridView?
When I use Server Explorer, I don't have a good feeling about it, because all the logic sits on the .aspx page and the logic layer is not separated from the application layer.
It will be a large application; the database (PostgreSQL) has 18 tables and difficult constraints, and the application has to generate some documents, etc.
"Better" depends entirely on your situation. Is the purpose to get something done as quickly as possible for internal users at your company, or is this going to be a commercial site that will need to be highly extensible and needs to be as easy as possible to maintain? Will you need to integrate with other platforms possibly built using other languages at some point? The answers to all of these questions should affect your decision.
If you're looking to separate your project into distinct layers, then I would recommend an ORM such as NHibernate or Entity Framework (there are other commercially available ORM products out there, but these are the ones I'm familiar with and which you can easily get help with on this site).
Create a DataSource with LINQ to Entities. It gives you the liberty of LINQ with the peace of mind that when you change something, it will break your build, so you will be able to debug more efficiently.
Well, if you have total flexibility, I would recommend using C# ASP.NET 4 with MVC 3 Razor for the UI and application code, and Entity Framework 4.1 code first for the data access layer.
This way you will always work with real objects that you create, and with List<RealType> instead of the total mess that comes with DataSets.
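A minimal sketch of that with the EF 4.1 DbContext API (the entity and property names are made up, and this assumes your PostgreSQL ADO.NET provider supports Entity Framework):

```csharp
using System.Collections.Generic;
using System.Data.Entity;   // EF 4.1 DbContext API
using System.Linq;

// Hypothetical code-first entity.
public class Document
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class AppContext : DbContext
{
    public DbSet<Document> Documents { get; set; }
}

public class DocumentQueries
{
    // Returns real objects as a List<Document> instead of a DataSet.
    public static List<Document> GetByTitle(string titlePart)
    {
        using (var db = new AppContext())
        {
            return db.Documents
                     .Where(d => d.Title.Contains(titlePart))
                     .OrderBy(d => d.Title)
                     .ToList();
        }
    }
}
```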
I am going to make a web application where a lot of users will input data into SQL Server, using ASP.NET 3.5. There will be no heavy load of data sent to the client, as data will be paged at the database level. Stored procedures are used.
I am asking you guys with experience in Web 2.0, a.k.a. AJAX, jQuery and other client technologies (no postbacks), about performance and responsiveness. I have also looked into ASP.NET MVC, but most examples are shown either with LINQ to SQL or with the Entity Framework. LINQ to SQL seems to perform slower than ordinary ADO.NET. I prefer to load data into objects.
Insert and edit forms are to be opened on the same page with JavaScript, either through a modal pop-up or in an area reserved for it.
Preferably a solution with minimal coding.
What do you suggest?
Reading your post I see the following requirements/desires...
System will be under decent to heavy load, minimal coding, Stored Procedures, load data to objects.
Sounds like an ORM will be a great solution. It may perform slower than raw ADO.NET calls, BUT you will greatly minimize coding, you can use stored procedures in both L2S and Entity Framework, and both can work well under stress. For example, this site uses L2S. :)
The use of the ORM should also reduce your development time since you won't have to write all of the database access code.
You can still load data to objects by keeping the L2S or Entity Framework as a layer in your application that just does the raw DB access. You then create another layer that calls this to populate your objects with the data but you can control how to design those objects and how they work. In fact this is a recommended approach. Here's a link that shows how you can create a tiered approach. :)
http://weblogs.asp.net/dwahlin/archive/2008/02/28/building-an-n-layer-asp-net-application-with-linq-lambdas-and-stored-procedures.aspx
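A rough sketch of that tiered shape (StoreDataContext stands in for a designer-generated L2S DataContext, and Customers for one of its mapped tables; all names are hypothetical):

```csharp
using System.Collections.Generic;
using System.Linq;

// Domain object you control, independent of the L2S generated classes.
public class CustomerSummary
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Business layer that calls the L2S layer and shapes the results into
// your own objects, so the rest of the app never sees L2S types.
public class CustomerService
{
    public List<CustomerSummary> GetActiveCustomers()
    {
        using (var db = new StoreDataContext())
        {
            return db.Customers
                     .Where(c => c.IsActive)
                     .Select(c => new CustomerSummary
                     {
                         Id = c.CustomerId,
                         Name = c.Name
                     })
                     .ToList();
        }
    }
}
```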
As for your client technology with MVC, AJAX, jQuery, etc., they are fine choices, and with MVC you have complete control over the HTML and no ViewState to worry about, as compared to Web Forms.
There's a lot here to answer, and not everything has a definitive "you should do X and not Y". Let me try to break it down.
ASP.NET MVC vs. WebForms (standard ASP.NET)
You can make a decent data entry application using either platform. WebForms has been around longer and definitely has more coverage through tutorials, but ASP.NET MVC is just as capable. MVC is going to be leaner and meaner, which is good if you are going for pure responsiveness, but it's possible to do that with WebForms too; it just takes more work (turning off ViewState and SessionState, minimizing postbacks, etc.) and removes some of the benefits of WebForms.
Data Access
If you have already decided on using stored procedures as your primary data access method, you aren't going to get much of anything from any ORM (Linq2Sql, Linq2Entities, NHibernate, Subsonic, etc). If you really want to leverage the benefits of ORM you will have to give up stored procedures for your primary data interface.
However, Linq2Sql is considered plenty fast. Linq2Entities is a bit slower, but that will probably improve. NHibernate and Subsonic are slower still. It's not very useful to compare any of them to ADO.NET since they do very different things (that happen to revolve around talking to a database). But all of that is pretty meaningless as the slowest part of any system is going to be sending data across the internet back and forth to the user.
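That said, keeping stored procedures doesn't rule out L2S entirely: a DataContext can expose a sproc as a typed method, which is exactly what the designer generates. A hand-written sketch of the pattern (the procedure, result, and context names are hypothetical):

```csharp
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Reflection;

// Shape of one result row; column names are invented.
public class OrderResult
{
    [Column] public int OrderId { get; set; }
    [Column] public decimal Total { get; set; }
}

public partial class AppDataContext : DataContext
{
    public AppDataContext(string connection) : base(connection) { }

    // Maps the method onto the stored procedure, so callers get typed
    // results without hand-written plumbing.
    [Function(Name = "dbo.GetOrdersByCustomer")]
    public ISingleResult<OrderResult> GetOrdersByCustomer(
        [Parameter(DbType = "Int")] int customerId)
    {
        IExecuteResult result = this.ExecuteMethodCall(
            this, (MethodInfo)MethodBase.GetCurrentMethod(), customerId);
        return (ISingleResult<OrderResult>)result.ReturnValue;
    }
}
```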
Have you checked out the ASP.NET Dynamic Data project? It makes it rather quick to go from a start to a working app; you would then tweak the things that you need to change. But you may have to get to know some new technologies to get it done, and maybe SPs won't be in your end solution.
Definitely minimal coding involved.