I have just started reading up on using the repository and unit of work patterns. I am currently trying to use them with Entity Framework in an ASP.NET Web Forms application. I do, however, have a question which I am not sure I can explain in an easy way.
From what I understand, a unit of work is used to encapsulate a business transaction. From the examples I have seen, a uow is used in the following manner:
businessMethod1()
{
    uow u = new uow();                      // instantiate unit of work
    repository1 rep1 = new repository1(u); // instantiate repository1
    repository2 rep2 = new repository2(u); // instantiate repository2
    rep1.dowork();
    rep2.dowork();
    u.save(); // save the changes made to the database
              // (effectively saving the changes made in both repository classes)
}
Now suppose I have a businessMethod2() which is similar to the method described above, and suppose I want to use businessMethod1() from within businessMethod2(). What would be the best practice? I would want to share the unit of work, so should I pass it as an argument, i.e. change the method mentioned above to:
businessMethod1(uow u)
{
    bool isNew = false;
    if (u == null)
    {
        u = new uow();
        isNew = true;
    }
    repository1 rep1 = new repository1(u); // instantiate repository1
    repository2 rep2 = new repository2(u); // instantiate repository2
    rep1.dowork();
    rep2.dowork();
    if (isNew)
        u.save(); // save the changes made to the database
}
Is this a proper way of working with this?
I was thinking a better way would be to use a per-request ("singleton") uow: on every page request a new instance of the uow is created and shared by all the business methods, and on the next request a different instance is created. Using such a uow would mean I won't have to pass it to any of my business methods and can at the same time share it between all of them.
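Roughly what I have in mind is something like this (just a sketch; the helper name and the use of HttpContext.Current.Items are only meant to illustrate the per-request idea):

using System.Web;

// Hypothetical helper: one uow per HTTP request, shared by all business methods.
public static class UowPerRequest
{
    private const string Key = "CurrentUow";

    public static uow Current
    {
        get
        {
            var items = HttpContext.Current.Items;
            if (items[Key] == null)
                items[Key] = new uow();   // first use in this request
            return (uow)items[Key];
        }
    }

    // called once at the end of the request (e.g. from Global.asax EndRequest)
    public static void Complete()
    {
        var u = (uow)HttpContext.Current.Items[Key];
        if (u != null)
            u.save();
    }
}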
Are there any drawbacks of doing this? Also is there a better way to implement this?
One way to solve this problem is to use Dependency Injection. Usually constructor injection is used alongside a single point of entry to resolve dependencies.
public class MyBusinessService
{
    public MyBusinessService(Repository1 repository1, Repository2 repository2, uow u)
    {
        // assign the params to fields
    }

    public void businessMethod1()
    {
    }

    public void businessMethod2()
    {
    }
}
There are many popular DI frameworks out there. Pick what you think works for you.
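As a rough sketch with Unity, for example (purely illustrative; the types are the ones from the question and any other container would look much the same):

using Microsoft.Practices.Unity;

// Composition root: register the dependencies once...
var container = new UnityContainer();
container.RegisterType<uow>(new HierarchicalLifetimeManager()); // one uow per scope/child container
container.RegisterType<Repository1>();
container.RegisterType<Repository2>();

// ...then resolve the service; the uow and repositories are injected automatically.
var service = container.Resolve<MyBusinessService>();
service.businessMethod1();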
This is about the usage of the UoW. If you place the UoW handling inside BusinessMethod1, you are saying that it is a top-level business abstraction (a business facade). It should probably not be used by other business operations, because that would break its "top level" role. So if you need to reuse the logic from BusinessMethod1 in another BusinessMethod2, adding logic that decides whether a UoW already exists is not the correct approach: it breaks separation of concerns. A BusinessMethod should handle your application logic, not UoW creation. The simplest solution is to refactor BusinessMethod1 and expose the shared functionality as a new method without any dependency on the UoW:
public void BusinessMethod1()
{
    uow u = new uow();
    DoSomeWork(u);
    u.save();
}

private void DoSomeWork(uow u)
{
    repository1 rep1 = new repository1(u); // instantiate repository1
    repository2 rep2 = new repository2(u); // instantiate repository2
    rep1.dowork();
    rep2.dowork();
}
Sure, this is only a very simple example, because your methods still don't follow separation of concerns: they do both logic and object creation. You should handle UoW and repository creation elsewhere and pass the created objects in. You can use the approach mentioned by @Eranga, but this refactoring will still be applicable if your method2 wants to call something from method1.
This refactoring approach can also be modeled as low-level business services plus a business facade, but that is only needed in big projects. In small projects you can instead move the interaction with the UoW into your "controller" (probably the code-behind in web forms), because the controller drives the application logic and knows which business methods it wants to call in a single unit of work.
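For example, in a web forms code-behind the interaction could look roughly like this (a sketch only; BusinessService is a made-up name standing in for a class that receives its dependencies from the outside):

protected void SaveButton_Click(object sender, EventArgs e)
{
    // The "controller" (code-behind) owns the unit of work and passes it down.
    uow u = new uow();
    var service = new BusinessService(new repository1(u), new repository2(u));

    service.BusinessMethod1();
    service.BusinessMethod2();   // both methods share the same unit of work

    u.save();                    // one commit for the whole user action
}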
I've developed a CQRS-style database access framework based on Tripod and other inspirations, but targeting .NET Standard and simplified for easier use. I want to split the IoC into separate integration packages so consumers can easily get the type registration I'm currently doing internally, without being locked into a specific IoC container. My issue is that I've only really worked closely with Simple Injector, so I'm not familiar with other systems and their nuances around how they handle specific scenarios. I have an imminent need to support Autofac, so I thought I'd try here to see if anyone can translate.
I have the following Simple Injector CompositionRoot static class:
public static void RegisterDatabase(this Container container,
    DbContextOptions<EntityDbContext> dbContextOptions, params Assembly[] assemblies)
{
    var scopedLifeStyle = container.Options.DefaultScopedLifestyle;

    //container.Register<ICreateDbModel, DefaultDbModelCreator>(scopedLifeStyle);

    container.RegisterInitializer<EntityDbContext>(
        handlerToInitialise => handlerToInitialise.ModelCreator = new DefaultDbModelCreator()
    );

    // Setup DbContext
    var ctxReg = scopedLifeStyle.CreateRegistration(
        () => new EntityDbContext(dbContextOptions),
        container);

    container.AddRegistration<IUnitOfWork>(ctxReg);
    container.AddRegistration<IReadEntities>(ctxReg);
    container.AddRegistration<IWriteEntities>(ctxReg);
}
In ASP.NET Core solutions I invoke the above from Startup.Configure(...) with:
var optionsBuilder = new DbContextOptionsBuilder<EntityDbContext>()
    //.UseInMemoryDatabase("Snoogans")
    .UseSqlServer(_config.GetConnectionString("DefaultConnection"));

container.RegisterDatabase(optionsBuilder.Options);
which allows me to switch out to an in-memory database for unit testing if needed. EntityDbContext contains all my unit-of-work methods for calling onto the context without having to specify an explicit DbSet for each table. The IUnitOfWork, IReadEntities and IWriteEntities interfaces all define methods on the EntityDbContext.
So I'm not sure how I'd go about writing an Autofac module that allows scoped registration of the DbContext with the passed-in DbContextOptions, followed by multiple interface registrations against that same registration.
Does anyone know how this can be achieved?
I worked out the process and now have an Autofac module. I was able to register the module by passing an instance of the class, and also pass in the options when I instantiate it. Because EntityDbContext implements the three interfaces I was registering separately in the Simple Injector scenario, Autofac has the convenience of being able to just infer them and register them with AsImplementedInterfaces().
public class EntityFrameworkModule : Module
{
    private readonly DbContextOptions<EntityDbContext> _dbContextOptions;

    public EntityFrameworkModule(DbContextOptions<EntityDbContext> dbContextOptions)
    {
        _dbContextOptions = dbContextOptions;
    }

    protected override void Load(ContainerBuilder builder)
    {
        // If the calling code hasn't already registered a custom
        // ICreateDbModel then register the internal DefaultDbModelCreator
        builder.RegisterType<DefaultDbModelCreator>()
               .IfNotRegistered(typeof(ICreateDbModel))
               .As<ICreateDbModel>();

        // Expecting IUnitOfWork, IReadEntities and IWriteEntities to be registered with this call
        builder.Register(c => new EntityDbContext(_dbContextOptions)
               {
                   ModelCreator = c.Resolve<ICreateDbModel>()
               })
               .AsImplementedInterfaces()
               .InstancePerLifetimeScope();
    }
}
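Registering the module then looks something like this (standard Autofac bootstrapping; only the options wiring is specific to my setup):

var optionsBuilder = new DbContextOptionsBuilder<EntityDbContext>()
    .UseSqlServer(_config.GetConnectionString("DefaultConnection"));

var builder = new ContainerBuilder();
builder.RegisterModule(new EntityFrameworkModule(optionsBuilder.Options));
var container = builder.Build();

// Within a lifetime scope, IUnitOfWork, IReadEntities and IWriteEntities all
// resolve to the same EntityDbContext instance.
using (var scope = container.BeginLifetimeScope())
{
    var uow = scope.Resolve<IUnitOfWork>();
}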
I'm trying to implement the Identity system in an ASP.NET Core app (RC2 libraries) and there is a particular hangup that is driving me crazy.
First of all, I am not using Entity Framework. I'm not even using SQL. I'm backing it with RavenDB, so I need the implementation to be very specific to that, which isn't a problem.
So I designed a RavenUserStore class, and it looks like this;
public class RavenUserStore<TUser> :
    IUserStore<TUser>,
    IUserLoginStore<TUser>,
    IUserPasswordStore<TUser>,
    IUserRoleStore<TUser>,
    IUserSecurityStampStore<TUser>,
    IUserClaimStore<TUser>,
    IUserLockoutStore<TUser>,
    IUserTwoFactorStore<TUser>,
    IUserEmailStore<TUser>
{
    // ...
}
Works great on its own. I've implemented all the methods, etc. It's wonderful. Very clean and efficient.
Now, I go over to my web application and wire things up;
services.AddTransient<ILookupNormalizer>(s => new LowerInvariantLookupNormalizer());
services.AddTransient<IPasswordHasher<Member>>(s => new PasswordHasher<Member>());
services.AddTransient<IUserStore<Member>, RavenUserStore<Member>>();

services.AddIdentity<Member, Role>(o =>
{
    o.Password.RequiredLength = 6;
    o.Password.RequireDigit = true;
    o.Password.RequireLowercase = false;
    o.Password.RequireUppercase = false;
})
.AddUserStore<RavenUserStore<Member>>()
.AddRoleStore<RavenRoleStore<Role>>();
So I go make a controller to use this, per all the samples I've seen and the very core sample from the Identity framework GitHub repository:
//... [PROPERTIES]...//
public AccountController(UserManager<Member> userManager, SignInManager<Member> signInManager) {
// ... [attach constructor parameters to properties] ...//
}
Alright, so I inspect the classes carefully.
UserManager<T> has a property Store, which is of type IUserStore<T>.
So theoretically.. if the dependency injection resolves types of IUserStore<T> to RavenUserStore<T> when they are injected through a constructor.. shouldn't that mean that the UserManager<T> gets a RavenUserStore<T> as its Store property?
I thought it would too; but when I call methods on the UserManager, it DOES NOT call the ones on my RavenUserStore. Why is this? What can I do?
Do I really have to ALSO make a custom UserManager class and do all of those methods AGAIN?
You need to add your own custom providers before calling services.AddIdentity(). Internally, AddIdentity uses TryAddScoped() which only adds the default items if they don't already exist in the services container.
So just putting the call to AddIdentity() after you registered all your custom implementations should mean that they will take precedence as you expect.
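With the registrations from the question, the ordering would look roughly like this (a sketch; the types are the question's own):

// custom implementations first...
services.AddTransient<ILookupNormalizer>(s => new LowerInvariantLookupNormalizer());
services.AddTransient<IPasswordHasher<Member>>(s => new PasswordHasher<Member>());
services.AddTransient<IUserStore<Member>, RavenUserStore<Member>>();

// ...then AddIdentity; its internal TryAdd* calls keep the registrations above.
services.AddIdentity<Member, Role>(o => { /* password options */ })
    .AddUserStore<RavenUserStore<Member>>()
    .AddRoleStore<RavenRoleStore<Role>>();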
The development is limited to Visual Studio 2010 (client-approved software). We need to access the data through stored procedures. I want to avoid making it too complex given an aggressive schedule. Most of the designs I see involve EF and LINQ; I'm not sure how to design for stored procedures.
I want to create a separate code library project (used by the Web UI):
Application.Domain
- interacts with the stored procedures (get/put) and the entities
Application.Web
- contains the Web UI (jQuery, AJAX) and the WCF service
Can anyone give me sample code on how to approach the Application.Domain?
Examples, I have read:
http://www.developer.com/net/dependency-injection-best-practices-in-an-n-tier-modular-application.html
http://www.kenneth-truyers.net/2013/05/12/the-n-layer-myth-and-basic-dependency-injection/
DAL\AppDAL.cs:
public static IEnumerable<TasCriteria> GetTasCriterias()
{
    using (var conn = new SqlConnection(_connectionString))
    {
        var com = new SqlCommand();
        com.Connection = conn;
        com.CommandType = CommandType.StoredProcedure;
        com.CommandText = "IVOOARINVENTORY_GET_TASCRITERIA";

        var adapt = new SqlDataAdapter();
        adapt.SelectCommand = com;

        var dataset = new DataSet();
        adapt.Fill(dataset);

        var types = (from c in dataset.Tables[0].AsEnumerable()
                     select new TasCriteria()
                     {
                         TasCriteriaId = Convert.ToInt32(c["TasCriteriaId"]),
                         TasCriteriaDesc = c["CriteriaDesc"].ToString()
                     }).ToList<TasCriteria>();
        return types;
    }
}
Models\TasCriteria.cs:
public class TasCriteria
{
    public int TasCriteriaId { get; set; }
    public string TasCriteriaDesc { get; set; }
}
Service\Service.svc:
[OperationContract]
[WebInvoke(ResponseFormat = WebMessageFormat.Json,
    BodyStyle = WebMessageBodyStyle.WrappedRequest, Method = "GET")]
public List<TasCriteria> GetTasCriteriaLookup()
{
    var tc = InventoryDAL.GetTasCriterias();
    return tc.ToList();
}
If you:
- are running on a tight schedule
- have most of the business logic already on the DB side via sprocs/views
- have not worked with EF before
I suggest you take a look at the Microsoft Enterprise Library, especially the Data Access Application Block. It will simplify ALL of your DAL functionality (without using any ORM framework), and it follows the dependency inversion principle with the help of Unity, which is a dependency injection container from Microsoft.
Some helpful Data Access Application Block concepts:
Output Mapper
An output mapper takes the result set returned from a database (in the
form of rows and columns) and converts the data into a sequence of
objects.
// Create and execute a sproc accessor that uses default parameter and output mappings
var results = db.ExecuteSprocAccessor<Customer>("CustomerList", 2009, "WA");
Read the whole Retrieving Data as Objects topic.
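As a rough sketch, the GetTasCriterias method from the question could then shrink to something like this (assuming Enterprise Library 5.0 and that the columns returned by the sproc match the TasCriteria property names; otherwise you plug in a custom output/row mapper):

using System.Collections.Generic;
using Microsoft.Practices.EnterpriseLibrary.Data;

public static IEnumerable<TasCriteria> GetTasCriterias()
{
    // Uses the default connection string from configuration.
    Database db = DatabaseFactory.CreateDatabase();

    // Default output mapping: one TasCriteria instance per result row.
    return db.ExecuteSprocAccessor<TasCriteria>("IVOOARINVENTORY_GET_TASCRITERIA");
}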
Parameter Mapper
A parameter mapper takes the set of objects you want to pass to a
query and converts each one into a DbParameter object.
// Use a custom parameter mapper and the default output mappings
IParameterMapper paramMapper = new YourCustomParameterMapper();
var results = db.ExecuteSprocAccessor<Customer>("Customer List", paramMapper, yourCustomParamsArray);
For entity generation I would try to use this tool. It builds a POCO class from the result set returned by a sproc. I have not tried this tool yet and maybe there are better alternatives, but it is something to get you started, so you don't have to do this by hand.
If you are using .NET 3.5 then you have to work with Enterprise Library 5.0.
I hope this will steer you in the right direction.
First and foremost, make sure you abstract your DAL using dependency injection, such as Ninject or Unity (or many others freely available). It is quite possible to have your DAL loosely coupled, so that if you decide later on that EF (or any other ORM) is not the best course, changing it won't cost blood...
You do NOT want to have an AppDAL class with static methods to call the stored procedures. At the very least add an interface and use injection, if only for the sake of unit testing.
Whether you use EF or NHibernate or any other ORM, that decision should be encapsulated in your DAL and not leak into other layers. The domain layer should use interfaces for the repository classes from the DAL (and those contain the references to the chosen ORM or data access classes).
These repositories will call the stored procedures and return your model classes (POCOs).
In one of my recent projects we had this interface to provide basic CRUD operations:
public interface IRepository<T> where T : DomainEntity
{
    T Get(Int64 id);
    void SaveOrUpdate(T entity);
    void Delete(T entity);
    IQueryable<T> Find();
}
DomainEntity is a very simple class that all model classes inherit from.
In the rare cases where we needed to use stored procedures, I'd create an additional interface that provides a GetXXXEntity method (one or more) that does the actual call to the stored procedure.
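For example (just a sketch; the entity and method names here are made up):

using System.Collections.Generic;

// A sproc-backed repository contract; its implementation calls the stored
// procedure and maps the result set to model classes (POCOs).
public interface IPersonReportRepository
{
    IList<Person> GetPersonsByDepartment(Int64 departmentId);
}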
For the common case, when I need to get an entity from the DB using its Id, it looks like:
_diWrapper.GetRepository<Person>().Get(id);
_diWrapper.GetRepository<Order>().Get(id);
_diWrapper is my wrapper for the dependency injection container (Ninject in this case). I used a wrapper so I could easily replace Ninject with something else if needed.
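The wrapper itself is tiny. Roughly (a sketch; the wrapper names are made up, only the Ninject calls are real):

using Ninject;

public interface IDependencyWrapper
{
    IRepository<T> GetRepository<T>() where T : DomainEntity;
}

// Ninject-backed implementation; swapping containers means touching only this class.
public class NinjectDependencyWrapper : IDependencyWrapper
{
    private readonly IKernel _kernel;

    public NinjectDependencyWrapper(IKernel kernel)
    {
        _kernel = kernel;
    }

    public IRepository<T> GetRepository<T>() where T : DomainEntity
    {
        return _kernel.Get<IRepository<T>>();
    }
}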
In common cases where I need to use LINQ:
_diWrapper.GetRepository<Person>().Find().Where(q => q.Name == "Jack").ToList();
The important thing was that I could replace NHibernate with anything else rather quickly.
I strongly recommend you look at Fluent NHibernate, as it provides a simple solution that does not require much coding.
EDIT: here's an example of the repository class implementing the IRepository interface:
public class NhibernateRepository<T> : IRepository<T> where T : DomainEntity, new()
{
    private ISession _session;

    public NhibernateRepository()
    {
        _session = BaseNHibernateHelper<NHibernateHelper>.GetCurrentSession();
    }

    public T Get(Int64 id)
    {
        return _session.Get<T>(id);
    }

    public void SaveOrUpdate(T entity)
    {
        _session.SaveOrUpdate(entity);
    }

    public void Delete(T entity)
    {
        _session.Delete(entity);
    }

    public IQueryable<T> Find()
    {
        return _session.Query<T>();
    }
}
Note that in the constructor I use another NHibernate helper I created that wraps the session factory. This is where I have a dependency on NHibernate.
If I ever wanted to replace NH with another ORM, I would need to modify only the repository class (and the underlying supporting classes). You can see that NH does not leak outside the repository class, and no one that uses it is aware of the usage of NH.
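The helper behind GetCurrentSession isn't shown above; as a rough sketch it can be as simple as the following (a real implementation would typically tie the session to the current request or use NHibernate's contextual sessions):

using NHibernate;
using NHibernate.Cfg;

// Simplified stand-in for the session helper: build the factory once, hand out sessions.
public static class NHibernateHelper
{
    private static readonly ISessionFactory Factory =
        new Configuration().Configure().BuildSessionFactory();   // reads hibernate.cfg.xml

    public static ISession GetCurrentSession()
    {
        return Factory.OpenSession();
    }
}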
I noticed that most people spoke of implementation and technology, but no one mentioned the application or thrust of domain-driven design. DDD is not necessarily something you can achieve by just adding in Dapper/EF/Enterprise Library blocks. These can help, as can SOLID and things like CQS (command/query separation), but they are merely enablers; there are more considerations and questions which need to be asked. Take a look at "Domain Driven Design Quickly" on InfoQ for a few more ideas.
I'm in the middle of a significant effort to introduce NHibernate into our code base. I figured I would have to use some kind of a DI container, so I can inject dependencies into the entities I load from the database. I chose Unity as that container.
I'm considering using Unity's interception mechanism to add a transaction aspect to my code, so I can do e.g. the following:
class SomeService
{
    [Transaction]
    public void DoSomething(CustomerId id)
    {
        Customer c = CustomerRepository.LoadCustomer(id);
        c.DoSomething();
    }
}
and the [Transaction] handler will take care of creating a session and a transaction, committing the transaction (or rolling back on exception), etc.
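A rough sketch of what I imagine that handler looking like (the attribute, the handler, and the way the session would actually reach the repositories are all assumptions on my part):

using Microsoft.Practices.Unity;
using Microsoft.Practices.Unity.InterceptionExtension;
using NHibernate;

// Marks methods that should run inside an NHibernate transaction.
public class TransactionAttribute : HandlerAttribute
{
    public override ICallHandler CreateHandler(IUnityContainer container)
    {
        return new TransactionCallHandler(container.Resolve<ISessionFactory>());
    }
}

public class TransactionCallHandler : ICallHandler
{
    private readonly ISessionFactory _sessionFactory;

    public TransactionCallHandler(ISessionFactory sessionFactory)
    {
        _sessionFactory = sessionFactory;
    }

    public int Order { get; set; }

    public IMethodReturn Invoke(IMethodInvocation input, GetNextHandlerDelegate getNext)
    {
        using (var session = _sessionFactory.OpenSession())
        using (var tx = session.BeginTransaction())
        {
            // Invoke the intercepted method; how the session is made available to
            // the repositories (e.g. a current-session context) is left out here.
            IMethodReturn result = getNext()(input, getNext);

            if (result.Exception != null)
                tx.Rollback();
            else
                tx.Commit();

            return result;
        }
    }
}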
I'm concerned that using this kind of interception will bind me to using Unity pretty much everywhere in the code. If I introduce aspects in this manner, then I must never, ever call new SomeService(), or I will get a service that doesn't have transactions. While this is acceptable in production code, it seems too much overhead in tests. For example, I would have to convert this:
void TestMethod()
{
    MockDependency dependency = new MockDependency();
    dependency.SetupForTest();
    var service = new SomeService(dependency);
    service.DoSomething();
}
into this:
void TestMethod()
{
    unityContainer.RegisterType<MockDependency>();
    unityContainer.RegisterType<IDependency, MockDependency>();
    MockDependency dependency = unityContainer.Resolve<MockDependency>();
    dependency.SetupForTest();
    var service = unityContainer.Resolve<SomeService>();
    service.DoSomething();
}
This adds 2 lines for each mock object that I'm using, which leads to quite a bit of code (our tests use a lot of stateful mocks, so it is not uncommon for a test class to have 5-8 mock objects, and sometimes more.)
I don't think standalone injection would help here: I have to set up injection for every class that I use in the tests, because it's possible for aspects to be added to a class after the test is written.
Now, if I drop the use of interception I'll end up with:
class SomeService
{
    public void DoSomething(CustomerId id)
    {
        Transaction.Run(
            () => {
                Customer c = CustomerRepository.LoadCustomer(id);
                c.DoSomething();
            });
    }
}
which is admittedly not as nice, but doesn't seem that bad either.
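Transaction.Run here would just be a small static helper, roughly like this (a sketch; how the session factory gets there is up to the bootstrapping code):

using System;
using NHibernate;

public static class Transaction
{
    // Assumed to be assigned once at application startup.
    public static ISessionFactory SessionFactory { get; set; }

    public static void Run(Action action)
    {
        using (var session = SessionFactory.OpenSession())
        using (var tx = session.BeginTransaction())
        {
            try
            {
                action();
                tx.Commit();
            }
            catch
            {
                tx.Rollback();
                throw;
            }
        }
    }
}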
I can even set up my own poor man's interception:
class SomeService
{
    [Transaction]
    public void DoSomething(CustomerId id)
    {
        Interceptor.Intercept(
            MethodInfo.GetCurrentMethod(),
            () => {
                Customer c = CustomerRepository.LoadCustomer(id);
                c.DoSomething();
            });
    }
}
and then my interceptor can process the attributes for the class, but I can still instantiate the class using new and not worry about losing functionality.
Is there a better way of using Unity interception, that doesn't force me to always use it for instantiating my objects?
If you want to use AOP but are concerned about Unity, then I would recommend you check out PostSharp. It implements AOP as a post-compile step and requires no changes to how you use the code at runtime.
http://www.sharpcrafters.com/
They have a free community edition that has a good feature set, as well as professional and enterprise versions that have significantly enhanced feature sets.
The ValidationManager has a public Dictionary for storing UI components that implement the IValidatable interface.
I am testing a command class that needs an instance of ValidationManager and I want it to fail the validations. So I override the ValidationManager's "validateItem()" method like so:
var validationManagerRepos:ValidationManager = ValidationManager(mockRepository.createStub(ValidationManager));
var validationItem:IValidatable = IValidatable(mockRepository.createStub(IValidatable));
var validatableItems:Dictionary = new Dictionary();
validatableItems[validationItem] = false;
SetupResult.forCall(validationManagerRepos.validateItem(validationItem)).returnValue(false);
My problem is in the execute method of the command. It checks whether the validationItem is both a DisplayObject (isVisible) and IValidatable. Any slick way to stub a typed object AND an interface? Or do I just need to create an instance of some existing object that already satisfies both?
for (var iVal:Object in validationManager.validatableItems)
{
    if (isVisible(DisplayObject(iVal)))
    {
        passed = validationManager.validateItem(IValidatable(iVal));
        eventDispatcher.dispatchEvent(new ValidationEvent(ValidationEvent.VALIDATE_COMPLETED, IValidatable(iVal), passed));
        if (!passed)
        {
            allPassed = false;
        }
    }
}
I'm fairly sure you can't do both within asMock. It's a limitation of the Flash Player because of lack of polymorphism.
I believe what you'll have to do is create a testing object that does both (extends DisplayObject and implements IValidatable) and create a mock object of that.
The concept of a "multimock" is certainly possible, but floxy (the framework that asMock uses to generate dynamic proxies) doesn't support it. I previously considered adding support for it, but it would be difficult to expose via the various Mock metadata, and there'd be other issues to worry about (like method name clashes).
I agree with J_A_X's recommendation of creating a custom class and then mocking that.