I'm about to start a new .NET web project.
Previously, for my n-layer web apps I've used the "Microsoft Data Access Application Block" (SqlHelper.cs) for data access, then an interface class to interface with the object classes. I'm aware this technique is a bit dated and I was looking to use something a little more with the times.
I've looked into LINQ to SQL for data access but was put off by its lack of support for many-to-many relationships.
The Entity Framework is a whole different approach that appears to have too large a learning curve.
Would there be anything wrong with using the sqlhelper.cs class to handle my data access?
Not at all. It supports the creation of multi-tier layers that separate your data access from your logic. I usually have business and data classes in separate folders and include the SqlHelper class with my data access (DALC) class.
I'm looking forward to the move towards LINQ and the use of generics. That's my next step, and I think the use of SqlHelper promotes good coding practice in the meantime.
I first started using it when I "borrowed" it from the Enterprise Library, which was huge. Ditto for the Entity Framework, which I have yet to come to grips with in the workplace. But all in good time :-)
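To make the layering concrete, here is a rough sketch of the kind of DALC class I mean, assuming the DAAB's SqlHelper.cs is compiled into the data project. The stored procedure, parameter, and connection string names are placeholders for illustration.

// Data access class that wraps SqlHelper; the business layer only sees this API.
using System.Data;
using System.Data.SqlClient;

namespace MyApp.Data
{
    public static class CustomerDalc
    {
        private const string ConnectionString =
            @"Server=.;Database=Shop;Integrated Security=true";

        // Runs a stored procedure through SqlHelper and returns the results as a DataSet.
        public static DataSet GetCustomersByRegion(string region)
        {
            return SqlHelper.ExecuteDataset(
                ConnectionString,
                CommandType.StoredProcedure,
                "usp_GetCustomersByRegion",
                new SqlParameter("@Region", region));
        }
    }
}

The business layer then calls CustomerDalc.GetCustomersByRegion without knowing anything about ADO.NET.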
You can find a simple, good example of a SqlHelper class at http://followprogrammers.blogspot.com/
For the last 8 months I've been using LINQ and it works great for all the little jobs. The strong typing and drag-and-drop development make it fantastically easy and super quick.
Before that, and still whenever LINQ doesn't seem right, I use SqlHelper. It cuts out all the donkey work of ADO.NET. I don't see any problem using it.
I am creating my application in ASP.NET 3.5. I have to build my data access layer, and I am doing it the traditional way of fetching/updating data: SqlConnection, then SqlCommand, then SqlDataAdapter.
Is there any other way I can create my DAL more easily?
Specification:
My website is small, approx. 7-10 pages.
The database has around 80 tables.
What I know:
LINQ to SQL - I don't want to use it because I am not fully familiar with LINQ syntax and I need to develop the application really fast [3 days :-( ]. Also, there is a 100% chance that the table structure will be altered in the future.
Enterprise Library: It will take too much time for me to integrate it into my application.
Any other suggestions to create my data layer, quick ... fast ... and "NOT" dirty?
Thanks in advance.
How about using CodeSmith (free version 2.6) to generate a simple set of data access objects off your database? Given the small number of DB objects you need to model, I think this would be a quick and easy way of achieving your goal within your time constraints.
I would have recommended LINQ to SQL. But since that is a no from you, the only other option I would suggest is strongly typed DataSets and TableAdapters generated by Visual Studio. They are old, but decent enough to work in any modern application.
They are fast to create. They provide type safety. They are quite flexible for configuration and customization. And since they are generated by Visual Studio, any changes made to the database can be reflected quickly and easily.
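For illustration, consuming the generated code looks roughly like the sketch below. NorthwindDataSet and CustomersTableAdapter stand in for whatever the DataSet designer generates in your project, so this only compiles once those designer files exist.

// Hypothetical designer-generated types: NorthwindDataSet and its Customers
// TableAdapter come from adding a typed DataSet and dragging a Customers table onto it.
using System;

class TypedDataSetDemo
{
    static void Main()
    {
        var adapter = new NorthwindDataSetTableAdapters.CustomersTableAdapter();

        // Strongly typed table and rows - no string-based column lookups.
        NorthwindDataSet.CustomersDataTable customers = adapter.GetData();
        foreach (NorthwindDataSet.CustomersRow row in customers)
        {
            Console.WriteLine(row.CompanyName);
        }

        // Change a value and push it back through the same adapter.
        customers[0].CompanyName = "Contoso Ltd.";
        adapter.Update(customers);
    }
}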
Being a LINQ beginner myself, I would recommend taking the plunge and going with LINQ to SQL or Entity Framework. I can't say for certain without knowing your requirements, but there's a good chance that taking the time to learn basic LINQ for this project would speed up development overall.
You may also want to consider SubSonic. It's relatively easy to implement and is fairly intuitive to use. Used it for the first time recently on a small project, and despite some initial configuration problems getting it to work with MySQL, it handled data access pretty well.
I have an existing web app with a data layer and a BLL that calls the data layer. The data layer is ADO.NET calling stored procedures.
I created another project in VS.NET for LINQ to SQL and dragged all my tables over.
Would it be wise to just start using LINQ, or should I spend the time re-writing all the DB logic in LINQ so I don't have any issues from having two data layers?
If it ain't broken, don't fix it.
Why would you want to completely rewrite your perfectly working data layer? ADO.NET + stored procedures is a great choice. Keep it. At the same time, you can start playing with LINQ.
In any case, you will need some practice with LINQ to see what it can and cannot do before you can decide on a new data layer architecture. There are some situations LINQ cannot handle right out of the box, so you will need to use tricks or substitute the default implementation with your own queries. At the end of the day you may decide it was not worth it.
My suggestion is to gain some experience with it separately, and not to start rewriting everything completely just because LINQ is cool.
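If you want a feel for it without the designer, a minimal LINQ to SQL query can be written with attribute mapping alone. The Customer class, Customers table, and connection string below are assumptions purely for illustration.

// Minimal LINQ to SQL sketch using attribute mapping (no .dbml designer file).
using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)] public int CustomerId;
    [Column] public string Name;
}

class LinqToSqlDemo
{
    static void Main()
    {
        using (var db = new DataContext(@"Server=.;Database=Shop;Integrated Security=true"))
        {
            // The query is strongly typed and translated to SQL when enumerated.
            var result = from c in db.GetTable<Customer>()
                         where c.Name.StartsWith("A")
                         orderby c.Name
                         select c;

            foreach (var c in result)
                Console.WriteLine("{0}: {1}", c.CustomerId, c.Name);
        }
    }
}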
Unless your current data layer is broken for some reason, don't just start implementing a new one, just because you can.
Although if the data layer currently consists of stored procedures and that becomes cumbersome to maintain, switching to L2S (or any other O/RM for that matter) might be a valid reason. Just don't think it'll only be a matter of dragging some columns onto a canvas and being done. Depending on whether there's any logic in the sprocs, that logic has to exist somewhere...
I'd say that until you can justify the cost of switching your data layer entirely, stick with your current implementation.
To be clear: there is a major difference between Linq and LinqToSql. Linq is great and you should be using it if at all possible. LinqToSql is not great and has many problems:
Do not use the Visual Studio 2008 LinqToSql O/R Designer
The drawbacks of adopting Linq To Sql
To use Linq against your database, you need an ORM of some sort. You have many options for ORMs in the .NET world. If you like what LinqToSql offers, you may be most comfortable using SubSonic. In the long run, NHibernate is the best choice for a .NET ORM right now. I wrote a lot more about choosing a .NET ORM here:
.NET and ORM - Decisions, decisions
In the end, there is no reason you can't have two or more different data layer technologies in the same application. There are good reasons not to do this however and so it should be avoided if at all possible.
Also, here's a compelling write-up against using stored procedures:
Stored procedures are bad, m'kay?
I'm thinking through data access for an ASP.NET application. Coming from a company that uses a lot of Windows applications with client DataSets, there is a natural tendency towards a DataSet approach for dealing with data.
I'm more keen on a Business Object approach, and I don't like the idea of caching a DataSet in the session and then applying an update.
Does anyone have any experience / help to pass on about the pros and cons of both approaches?
You are smart to be thinking of designing a Data Layer in your app. In an ASP.NET application this will help you standardize and pretty dramatically simplify your data access. You will need to learn how to create and use ObjectDataSources but this is quite straightforward.
The other advantage of a data access layer (built using a separate project/DLL) is that it makes Unit testing much simpler. I'd also encourage you to build a Business Layer to do much of the processing of data (the business layer, for example, would be responsible for pulling ObjectDataSources from the DAL to hand to the UI code). Not only does this let you encapsulate your business logic, it improves the testability of the code as well.
You do not want to be caching DataSets (or DAL objects, for that matter) in the session! You will build a Web app so that record modifications work through a Unique ID (or other primary key spec) and feed changes directly to the DAL as they are made. If you were to cache everything you would dramatically reduce the scalability of your app.
Update: Others on this thread are promoting the idea of using ORMs. I would be careful about adopting a full-blown ORM for reasons that I have previously outlined here and here. I do agree, though, that it would be wise to avoid DataSets. In my own work, I make extensive use of DataReaders to fill my ObjectDataSources (which is trivial due to the design of my DAL) and find it to be very efficient.
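As a sketch of that pattern (not Mark's actual code), a DAL method can use a SqlDataReader to fill plain business objects, and an ObjectDataSource on the page can then point at it. The Products table and connection string are assumptions for illustration.

// DAL method that fills simple business objects from a SqlDataReader.
using System.Collections.Generic;
using System.Data.SqlClient;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class ProductDal
{
    private const string ConnectionString =
        @"Server=.;Database=Shop;Integrated Security=true";

    public static List<Product> GetProducts()
    {
        var products = new List<Product>();
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM Products", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    products.Add(new Product
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return products;
    }
}

On the page, an ObjectDataSource with TypeName="ProductDal" and SelectMethod="GetProducts" binds a grid to this method directly.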
DataSets can be incredibly inefficient compared even to other ADO.NET objects like DataReaders. I would suggest going the BO/ORM route based on what you are saying.
If you're going to follow Microsoft's direction, then the trend is definitely towards LINQ (ORM) over DataSets. When DataSets came into being (ASP.NET 1.0), LINQ wasn't even possible. With LINQ you get type safety and built-in functions to Create / Update / Delete from the database.
Microsoft has even tried to make the transition easier through LINQ to DataSet.
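For anyone who hasn't seen it, LINQ to DataSet lets you run LINQ queries over an ordinary DataTable via the AsEnumerable() and Field<T>() extensions. A tiny sketch, with made-up table and column names:

// Querying a DataTable with LINQ to DataSet (requires a reference to
// System.Data.DataSetExtensions).
using System;
using System.Data;
using System.Linq;

class LinqToDataSetDemo
{
    static void Main()
    {
        var orders = new DataTable("Orders");
        orders.Columns.Add("OrderId", typeof(int));
        orders.Columns.Add("Total", typeof(decimal));
        orders.Rows.Add(1, 20m);
        orders.Rows.Add(2, 250m);

        // LINQ over rows without leaving the DataSet world.
        var bigOrders = from row in orders.AsEnumerable()
                        where row.Field<decimal>("Total") > 100m
                        select row.Field<int>("OrderId");

        foreach (int id in bigOrders)
            Console.WriteLine(id);
    }
}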
We're about to do a big update to an existing ASP.NET app that used DataSet objects heavily; although I am not looking forward to the pain, I am going to insist on going down the BO route. Just the thought of trying to make DataSets work now causes me to break out in a sweat.
I think we are going to go down the LINQ route and use lightweight entity objects.
The company where I work makes heavy use of DataSets too, alongside a business layer; the BL mainly loads DataSets from the DB.
I personally dislike this approach. There is also a practice of directly modifying the DataSets after load/before save to meet some immediate need here and there. To me it really violates the idea of business objects, but that's how it is done.
ORM frameworks can really save you a great deal of time, especially in enterprise applications with lots of views with similar buttons and operations.
But it's also easy to lose control, and from that point it will slowly turn into a mess.
Both options are good when used in the right cases. Just don't mix them. Decide to do it one way and follow it.
Mark Brittingham's answer is accurate for two-tier applications. But what if I want to use a service tier? DataSets are serializable. Typed DataSets save time over hand-coding your own objects. Typed DataSets are extensible. LINQ to Entities has performance issues, and LINQ to SQL is now dead. LINQ to DataSet will always be an option.
I will use typed DataSets and a multi-layered architecture to save time and organize code. I've tried hand-coded BOs and the extra development and maintenance time is not worth it.
Getting data from a database table to an object in code has always seemed like mundane code. There are two ways I have found to do it:
1. have a code generator that reads a database table and creates the class and controller to map the data fields to the class fields, or
2. use reflection to take the database field and find it on the class.
The problems I see with the above two methods are:
Method 1 seems like I'm missing something, because I have to create a controller for every table.
Method 2 seems too labor intensive once you get into heavy data access code.
Is there a third route that I should try to get data from a database onto my objects?
You normally use OR (Object-Relational) mappers in such situations. A good framework providing OR functionality is Hibernate. Does this answer your question?
I think the answer to this depends on the available technologies for the language you are going to use.
I for one am very successful with the use of an ORM (NHibernate) so naturally I may recommend option one.
There are other options that you may wish to take though:
If you are using .NET, you may opt to use attributes on your class properties to serve either as a mapping within the class, or as data that can be reflected over.
If you are using .NET, Fluent NHibernate will make it quite easy to make type-safe mappings within your code.
You can use generics so that you will not need to make a controller for every table, although I admit it is likely that you will do that anyway. However, the generic base can contain most of the CRUD methods that are common to all tables, and you will only need to code the specific quirks.
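As a sketch of that generics point, one class can cover the CRUD operations common to every table. The example below sits on top of an NHibernate ISession, but any ORM session or context would work the same way; the entity type is left generic.

// Generic repository: common CRUD methods shared by all mapped entity types.
using System.Collections.Generic;
using NHibernate;

public class Repository<T> where T : class
{
    private readonly ISession _session;

    public Repository(ISession session)
    {
        _session = session;
    }

    public T Get(object id)
    {
        return _session.Get<T>(id);
    }

    public void Save(T entity)
    {
        _session.SaveOrUpdate(entity);
    }

    public void Delete(T entity)
    {
        _session.Delete(entity);
    }

    public IList<T> GetAll()
    {
        return _session.CreateCriteria(typeof(T)).List<T>();
    }
}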
I use reflection to map data back and forth and it works well even under heavy data access. The "third route" is to do everything by hand, which may be faster to run but really slow to write.
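For what it's worth, a bare-bones version of that reflection approach looks something like this; it assumes column names match property names and skips type conversion, which a real mapper would handle (ideally with the PropertyInfo lookups cached).

// Map rows from any IDataReader onto matching public properties of T.
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

public static class ReflectionMapper
{
    public static List<T> MapAll<T>(IDataReader reader) where T : new()
    {
        var results = new List<T>();
        PropertyInfo[] props = typeof(T).GetProperties();

        while (reader.Read())
        {
            var item = new T();
            foreach (PropertyInfo prop in props)
            {
                int ordinal;
                try
                {
                    // GetOrdinal throws if the column does not exist; skip those properties.
                    ordinal = reader.GetOrdinal(prop.Name);
                }
                catch (IndexOutOfRangeException)
                {
                    continue;
                }

                if (!reader.IsDBNull(ordinal))
                    prop.SetValue(item, reader.GetValue(ordinal), null);
            }
            results.Add(item);
        }
        return results;
    }
}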
I agree with lewap, an ORM (object-relational mapper) really helps in these situations. You may also want to consider the Active Record pattern (discussed in Fowler's Patterns of Enterprise Application Architecture). It can really speed up creation of the DAL in simple apps.
I have looked at NHibernate and EntitySpaces and they both seem to work differently.
In EntitySpaces, you define the database tables and table relationships and the classes are generated for you.
In NHibernate, you define the classes and the table relationships are generated for you. This is what I am looking for.
Are there any other ASP.NET ORMs that generate tables from classes like NHibernate?
Any recommendations?
DataObjects.Net also uses a "code first" (model first) approach.
See http://wiki.dataobjects.net/index.php?title=Features
Linq to SQL can create the database table structures and relationships from the classes, with the dataContext.CreateDatabase() method.
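A small sketch of what that looks like with attribute mapping; the Customer class, ShopDataContext, and connection string are made up for illustration.

// LINQ to SQL generating the schema from the classes via CreateDatabase().
using System.Data.Linq;
using System.Data.Linq.Mapping;

[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int CustomerId;

    [Column(CanBeNull = false)]
    public string Name;
}

public class ShopDataContext : DataContext
{
    // Each Table<T> member tells LINQ to SQL which mapped classes are part of the model.
    public Table<Customer> Customers
    {
        get { return GetTable<Customer>(); }
    }

    public ShopDataContext(string connection) : base(connection) { }
}

class SchemaDemo
{
    static void Main()
    {
        var db = new ShopDataContext(@"Server=.;Database=ShopDemo;Integrated Security=true");

        // Emits CREATE DATABASE / CREATE TABLE statements derived from the attributes.
        if (!db.DatabaseExists())
            db.CreateDatabase();
    }
}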
Mindscape LightSpeed offers this ability as part of complete schema round-tripping.
Hope this helps
http://www.mindscape.co.nz/blog/index.php/2008/06/17/schema-round-tripping-in-the-lightspeed-designer/
I prefer an approach where I have full control to generate what I need as well. In the case of ORMs, I get classes that match my tables. I believe that using my own domain objects, derived from my business rather than the underlying data store, is the right way to go. My class hierarchies that represent my business data should be 100% independent of the data store.
LightSpeed has a really good Visual Studio designer that supports both generating .NET entity classes from the database and updating the database from your .NET entities.
This is something that NHibernate does.
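Specifically, NHibernate's hbm2ddl tooling can emit and run the DDL from your mappings. A rough sketch (exact overloads vary a little between NHibernate versions):

// Generate the database schema from NHibernate mappings with SchemaExport.
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

class ExportSchema
{
    static void Main()
    {
        var cfg = new Configuration();
        cfg.Configure();                                  // reads hibernate.cfg.xml
        cfg.AddAssembly(typeof(ExportSchema).Assembly);   // picks up embedded *.hbm.xml mappings

        // First flag: write the script to stdout; second flag: execute it against the database.
        new SchemaExport(cfg).Create(true, true);
    }
}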
And on the subject that Draemon started: my personal view is that unless performance is your absolute first priority and all other things must suffer to make that happen (e.g. when writing software for a manufacturing fab), you will be better off working on the domain model first.
My reasoning: you spend a lot more time coding against the domain than you do against the database itself -- especially when using an ORM. So spend that time wisely.
I had fairly good success working with the Genome ORM. It does many jobs for you. You can design your domain model first and then generate the DB scripts from it. Besides this, Genome generates DTOs for you. It is pretty good at that and saves developers a lot of time.
http://www.genom-e.com