NUnit Rollback After Test - asp.net

I am pretty new to NUnit (and automated testing in general). I recently did some Ruby on Rails work and noticed that when my test suite creates objects (such as a new user) and commits them during a run, they are never actually committed to the database, so I can run the tests over and over without worrying about that user already existing.
I am now trying to accomplish the same thing in NUnit, but I am not quite sure how to go about doing it. Do I create a transaction in the Setup and Teardown blocks? Thanks.

Why would you talk to the database during unit tests? That turns your unit tests into integration tests by default. Instead, create wrappers for all database communication and stub/mock them during unit tests. Then you don't have to worry about database state before and after.
Now, if you are not willing to go to that level of refactoring: the problem with transactions is that you need an open connection. So, if the method you are testing handles all database communication on its own, it is really difficult to inject a transaction that you can create in SetUp and roll back in TearDown.
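To illustrate the wrapper idea, here is a minimal sketch. The IUserStore, StubUserStore and UserService names are hypothetical, not from the question; the point is that the code under test depends on an interface, so the test never touches a real database:
using System.Collections.Generic;
using NUnit.Framework;

// Hypothetical wrapper around the real database access.
public interface IUserStore
{
    bool UserExists(string userName);
    void AddUser(string userName);
}

// In-memory stub used only by the tests; no database is touched.
public class StubUserStore : IUserStore
{
    private readonly HashSet<string> _users = new HashSet<string>();
    public bool UserExists(string userName) { return _users.Contains(userName); }
    public void AddUser(string userName) { _users.Add(userName); }
}

// Code under test depends on the interface, not on SqlConnection.
public class UserService
{
    private readonly IUserStore _store;
    public UserService(IUserStore store) { _store = store; }

    public bool Register(string userName)
    {
        if (_store.UserExists(userName)) return false;
        _store.AddUser(userName);
        return true;
    }
}

[TestFixture]
public class UserServiceTests
{
    [Test]
    public void Register_rejects_duplicate_user()
    {
        var service = new UserService(new StubUserStore());
        Assert.IsTrue(service.Register("alice"));
        Assert.IsFalse(service.Register("alice")); // state lives only in memory
    }
}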

Maybe you can use this. It is ugly, but perhaps it can work for you:
namespace SqlServerHandling
{
    [TestFixture]
    public sealed class TestTransactionRollBacks
    {
        private readonly string _connectionString = "Data Source = XXXDB; Initial Catalog = XXX; User Id = BLABLA; Password = BLABLA";
        private SqlConnection _connection;
        private SqlTransaction _transaction;

        [SetUp]
        public void SetUp()
        {
            _connection = new SqlConnection(_connectionString);
            _connection.Open();                     // the connection must be open before a transaction can start
            _transaction = _connection.BeginTransaction();
        }

        [TearDown]
        public void TearDown()
        {
            _transaction.Rollback();                // undo everything the test wrote
            _connection.Dispose();
        }

        [Test]
        public void Test()
        {
            Foo foo = new Foo(_connection);
            object result = foo.Bar();
        }
    }

    internal class Foo
    {
        private readonly SqlConnection _connection;
        private readonly object someObject = new object();

        public Foo(SqlConnection connection)
        {
            _connection = connection;
        }

        public object Bar()
        {
            // Do your stuff here - any commands must use _connection (and its transaction)
            return someObject;
        }
    }
}

I agree with Morten's answer, but you might want to look at this very old MSDN Magazine article on the subject: Know Thy Code: Simplify Data Layer Unit Testing using Enterprise Services

I use SQLite for unit tests with NHibernate. Even if you're not using NHibernate it should be possible to do. SQLite has an in-memory mode, where you can create a database in memory and persist data there. It is fast, works well, and you can simply throw away and recreate the schema for each test or fixture as you see fit.
You can see the example from Ayende's blog for an overview of how it's done. He is using NHibernate, but the concept should work with other ORMs or a straight DAL as well.
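If you are not using an ORM at all, the same trick works with plain ADO.NET. A minimal NUnit sketch, assuming the Microsoft.Data.Sqlite package (System.Data.SQLite behaves the same way for this purpose):
using Microsoft.Data.Sqlite;
using NUnit.Framework;

[TestFixture]
public class InMemorySqliteTests
{
    private SqliteConnection _connection;

    [SetUp]
    public void SetUp()
    {
        // The database lives only as long as this connection stays open.
        _connection = new SqliteConnection("Data Source=:memory:");
        _connection.Open();

        using (var create = _connection.CreateCommand())
        {
            create.CommandText = "CREATE TABLE Users (Id INTEGER PRIMARY KEY, Name TEXT NOT NULL)";
            create.ExecuteNonQuery();
        }
    }

    [TearDown]
    public void TearDown()
    {
        // Closing the connection throws the whole database away.
        _connection.Dispose();
    }

    [Test]
    public void Can_insert_and_read_back_a_user()
    {
        using (var insert = _connection.CreateCommand())
        {
            insert.CommandText = "INSERT INTO Users (Name) VALUES ('alice')";
            insert.ExecuteNonQuery();
        }

        using (var count = _connection.CreateCommand())
        {
            count.CommandText = "SELECT COUNT(*) FROM Users";
            Assert.AreEqual(1L, (long)count.ExecuteScalar());
        }
    }
}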

Related

Re-instantiate a singleton with Prism in Xamarin Forms

How can I dispose and re-instantiate a singleton with Prism/DryIoC in Xamarin Forms?
I'm working with Azure Mobile Apps for offline data. Occasionally, I need to delete the local sqlite database and re-initialize it. Unfortunately the MobileServiceClient occasionally holds the db connection open and there's no method exposed to close it. The suggested solution (https://github.com/Azure/azure-mobile-apps-net-client/issues/379) is to dispose of MobileServiceClient. The only problem is that it is registered with DryIoC as a singleton.
I'm not overly familiar with DryIoC, or Prism and Forms for that matter... But for the life of me, I can't see a way to do this.
I did cook up a pretty elaborate scheme that almost worked.
In my ViewModel method, when I needed the db freed up, I fired off an event -
_eventAggregator.GetEvent<RegisterDatabaseEvent>().Publish(false);
Then in App.xaml.cs, I wired up a listener and a handler like so -
_eventAggregator.GetEvent<RegisterDatabaseEvent>().Subscribe(OnRegisterDatabaseEventPublished);
private void OnRegisterDatabaseEventPublished(bool register)
{
    Container.GetContainer().Unregister<IAppMobileClient>();
    Container.GetContainer().Unregister<IMobileServiceClient>();
    Container.GetContainer().Register<IMobileServiceClient, AppMobileClient>(new SingletonReuse());
    Container.GetContainer().Register<IAppMobileClient, AppMobileClient>(new SingletonReuse());
    _eventAggregator.GetEvent<RegisterDatabaseCompletedEvent>().Publish(register);
}
Lastly, back in the ViewModel constructor, I had a final listener that handled the event coming back from App.xaml and finished processing.
_eventAggregator.GetEvent<RegisterDatabaseCompletedEvent>().Subscribe(OnRegisterDatabaseCompletedEventPublished);
So the amazing thing is that this worked. The database was able to be deleted and all was good. But then I navigated to a different page and BOOM. DryIoC said it couldn't wire up the ViewModel for that page. I assume the unregister/register jacked up DryIoC for all injection... So how can I accomplish what needs to be done?
FINAL SOLUTION
Thanks so much to dadhi for taking the time to help. You are certainly a class act and I'm now considering using DryIoC elsewhere.
For anyone who stumbles on this, I'm posting the final solution below. I'll be as verbose as I can to avoid any confusion.
First, in my App.xaml.cs, I added a method for registering my database.
public void RegisterDatabase(IContainer container)
{
    container.RegisterMany<AppMobileClient>(Reuse.Singleton,
        setup: Setup.With(asResolutionCall: true),
        ifAlreadyRegistered: IfAlreadyRegistered.Replace,
        serviceTypeCondition: type =>
            type == typeof(IMobileServiceClient) || type == typeof(IAppMobileClient));
}
I simply add a call to that method in RegisterTypes in place of registering the types in there directly.
protected override void RegisterTypes(IContainerRegistry containerRegistry)
{
    containerRegistry.GetContainer().Rules.WithoutEagerCachingSingletonForFasterAccess();
    ...
    RegisterDatabase(containerRegistry.GetContainer());
    ...
}
Note also the added rule that turns off eager singleton caching, per dadhi.
Later on when I need to release the database in the ViewModel... I kick things off by resetting my local db variable and sending an event to App.xaml.cs
_client = null;
_eventAggregator.GetEvent<RegisterDatabaseEvent>().Publish(true);
In App.xaml.cs, I have subscribed to that event and tied it to the following method.
private void OnRegisterDatabaseEventPublished(bool register)
{
    RegisterDatabase(Container.GetContainer());
    _eventAggregator.GetEvent<RegisterDatabaseCompletedEvent>().Publish(register);
}
Here I just call RegisterMany again, exactly the same as I do when the app starts up. No need to unregister anything. With the setup and ifAlreadyRegistered arguments (thanks, dadhi!), DryIoC allows the object to be replaced. Then I raise an event back to the VM letting it know the database has been released.
Finally, back in the ViewModel, I'm listening for the completed event. The handler for that event updates the local copy of the object like so.
_client = ((PrismApplication)App.Current).Container.Resolve<IAppMobileClient>();
And then I can work with the new object, as needed. This is key. Without setting _client to null above and resolving it again here, I actually ended up with 2 copies of the object and calls to methods were being hit 2x.
Hope that helps someone else looking to release their Azure Mobile Apps database!
I am not sure how exactly XF handles these things.
But in DryIoc, in order for a service to be fully deleted or replaced, it needs to be registered with setup: Setup.With(asResolutionCall: true). Read here for more details: https://bitbucket.org/dadhi/dryioc/wiki/UnregisterAndResolutionCache#markdown-header-unregister-and-resolution-cache
Update
Here are two options and considerations that work in pure DryIoc but may not work in XF. They may still help you toward a solution.
public class Foo
{
    public IBar Bar { get; private set; }
    public Foo(IBar bar) { Bar = bar; }
}

public interface IBar { }
public class Bar : IBar { }
public class Bar2 : IBar { }

[Test]
public void Replace_singleton_dependency_with_asResolutionCall()
{
    var c = new Container(rules => rules.WithoutEagerCachingSingletonForFasterAccess());

    c.Register<Foo>();
    //c.Register<Foo>(Reuse.Singleton); // !!! If the consumer of the replaced dependency is a singleton, it won't work
                                        // cause the consumer singleton should be replaced too

    c.Register<IBar, Bar>(Reuse.Singleton,
        setup: Setup.With(asResolutionCall: true)); // required

    var foo = c.Resolve<Foo>();
    Assert.IsInstanceOf<Bar>(foo.Bar);

    c.Register<IBar, Bar2>(Reuse.Singleton,
        setup: Setup.With(asResolutionCall: true), // required
        ifAlreadyRegistered: IfAlreadyRegistered.Replace); // required

    var foo2 = c.Resolve<Foo>();
    Assert.IsInstanceOf<Bar2>(foo2.Bar);
}

[Test]
public void Replace_singleton_dependency_with_UseInstance()
{
    var c = new Container();

    c.Register<Foo>();
    //c.Register<Foo>(Reuse.Singleton); // !!! If the consumer of the replaced dependency is a singleton, it won't work
                                        // cause the consumer singleton should be replaced too

    c.UseInstance<IBar>(new Bar());

    var foo = c.Resolve<Foo>();
    Assert.IsInstanceOf<Bar>(foo.Bar);

    c.UseInstance<IBar>(new Bar2());

    var foo2 = c.Resolve<Foo>();
    Assert.IsInstanceOf<Bar2>(foo2.Bar);
}

Injecting DbContext in constructor of Web api 2 controller

I am creating a small proof-of-concept ASP.NET Web API 2 service with Entity Framework Code First. The controller looks like this:
public class AccountController : ApiController
{
    private readonly DbContext context;

    public AccountController(DbContext _context)
    {
        context = _context;
    }

    public AccountController()
    {
        context = new ApplicationContext();
    }
}
I need to unit test my controllers. How can I mock the DbContext class? Is there a simple way of doing this? I want to avoid the full repository pattern with lots of interfaces, because it would be way overkill for this prototype.
It's usually something like this if you use NUnit and Moq:
[TestFixture]
public class AccountControllerTest
{
    private Mock<DbContext> mockContext;
    private Account account;
    private AccountController sut;

    [SetUp]
    public void TestSetup()
    {
        mockContext = new Mock<DbContext>();
        account = new Account() { Id = 123, Name = "Test Account" };
        // GetAccountOnContext() stands in for whatever virtual member your context exposes
        mockContext.Setup(x => x.GetAccountOnContext()).Returns(account);
        sut = new AccountController(mockContext.Object) { Request = new HttpRequestMessage() };
        sut.Request.Properties.Add(HttpPropertyKeys.HttpConfigurationKey, new HttpConfiguration());
    }

    [Test]
    public void ControllerMethod_GetLogin_Test()
    {
        // assuming GetLogin calls GetAccount on the DbContext
        var response = sut.GetLogin(account);
        Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
        mockContext.Verify();
    }
}
You basically want to mock out your external dependencies and test just what the SUT (System Under Test) is supposed to do. I would also strongly encourage you to look at fakes instead of mocks; in general, fakes result in less brittle tests.
So in this case, you could have a FakeDbContext() that you can pass to the tests. The FakeDbContext() will behave much like the actual DbContext() but will do all those operations in-memory, so that your tests don't have a dependency on a real database.
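For example, if the controller is changed to depend on a small interface instead of DbContext directly, a fake becomes trivial. This is only a sketch; IAccountContext, its members and the Account shape are hypothetical names, not part of Entity Framework:
using System.Collections.Generic;
using System.Linq;

public class Account
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Hypothetical abstraction over the EF context; the real ApplicationContext would implement it too.
public interface IAccountContext
{
    void AddAccount(Account account);
    Account FindAccount(int id);
    int SaveChanges();
}

// Fake used in tests: behaves like the real context but keeps everything in memory.
public class FakeAccountContext : IAccountContext
{
    private readonly List<Account> _accounts = new List<Account>();

    public void AddAccount(Account account)
    {
        _accounts.Add(account);
    }

    public Account FindAccount(int id)
    {
        return _accounts.FirstOrDefault(a => a.Id == id);
    }

    public int SaveChanges()
    {
        return 0; // nothing to persist in memory
    }
}
The controller would then take IAccountContext in its constructor instead of DbContext, and each test constructs it with a FakeAccountContext pre-loaded with whatever data that test needs.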
Depending on the database you use, you can also look at starting an embedded version of the real database as part of your tests. Just make sure to stop it and clean up the test database records in the TearDown() method after the test run is complete.

unit testing a class that uses linq to sql

I want to write unit tests for a class that contains LINQ to SQL code. Inside each method I create a new DbContext and do the database work there.
I searched the web. At first I was going to use the Repository and Unit of Work patterns, but I figured out that the DbContext itself is a unit of work and its DbSets act as repositories. Another point is that I think there is no need to test the LINQ part, because it works as it should (it was tested by the .NET team); I want to test the logic I have added on top. So I decided to create an interface with the necessary methods and two implementations: one uses LINQ to SQL while the other is just a mock. Something like this:
public interface IDbManager
{
    bool Insert(MyEntity newEntity);
}

public class RealDbManager : IDbManager
{
    public bool Insert(MyEntity newEntity)
    {
        using (DbDataContext db = new DbDataContext())
        {
            db.MyEntities.InsertOnSubmit(newEntity);
            db.SubmitChanges();
            return true;
        }
    }
}

public class MockDbManager : IDbManager
{
    public bool Insert(MyEntity newEntity)
    {
        return true;
    }
}
Is the whole idea correct? If so, is this a correct implementation?
Is it possible to define the DbDataContext as a class variable instead of creating a new instance inside each method?
You have the right general idea for a start. Your mock Insert method should save the entity to some in-memory store, so that subsequent queries return the inserted information, as would be expected. But the very basic idea of having an interface with a 'real' and a 'mock' implementation is there.
Remember that when using your Mock in tests, you are testing your other code that uses the mock - not the mock itself.
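For example, the mock from the question could be extended into a small fake that actually remembers what was inserted. This is a sketch built on the IDbManager interface and MyEntity type shown above:
using System.Collections.Generic;

// Fake implementation: Insert stores the entity in memory so that
// later checks in the same test can see it.
public class FakeDbManager : IDbManager
{
    public readonly List<MyEntity> Inserted = new List<MyEntity>();

    public bool Insert(MyEntity newEntity)
    {
        Inserted.Add(newEntity);
        return true;
    }
}
A test can then assert against Inserted to check that the code under test called Insert with the entity it was expected to save.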
As for defining the DataContext as a member variable: you could use the IDisposable pattern for it, like so:
public class RealDbManager : IDbManager, IDisposable
{
    private readonly DbDataContext db = new DbDataContext();

    public bool Insert(MyEntity newEntity)
    {
        db.MyEntities.InsertOnSubmit(newEntity);
        db.SubmitChanges();
        return true;
    }

    public void Dispose()
    {
        db.Dispose();
    }
}
You would just have to be sure to dispose of your DbManager, then.
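For example, a minimal usage sketch of the disposable manager above:
// Dispose deterministically so the underlying DataContext is released as soon as the work is done.
using (var manager = new RealDbManager())
{
    manager.Insert(new MyEntity());
}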
Yes. The only thing I would avoid is writing the mocked class by hand (in that case it should really be called a fake); use a mocking engine instead.
In your question you mention two kinds of tests. The first is testing the behavior of your class; the second is testing its integration. They seem the same, but they're not.
In the first case you need to mock your dependency to test how your class interacts with the other classes, this way (using Moq):
[Test]
public void Test()
{
    var entity = new Entity();
    var mocked = new Mock<IDbManager>();
    // you are telling the Moq engine that every time it finds an invocation of your repository
    // it should return true, as your mocked class did
    mocked.Setup(x => x.Insert(entity)).Returns(true);

    var classUnderTest = new ClassUnderTest(mocked.Object);
    // in this method you invoke your repository
    var ret = classUnderTest.DoSomething(entity);

    // assertions ("something" stands for whatever result you expect)
    Assert.AreEqual(something, ret);
    // eventually you can verify that your repository has been hit once
    mocked.Verify(x => x.Insert(It.IsAny<Entity>()), Times.Once);
}
In the latter, as you correctly state, you have nothing to test in the LINQ itself (Microsoft did that for us), but in case you need to verify the correctness of your LINQ you can only do it against a real database (or by using the repository pattern against a fake repository). That is an integration test, and it has nothing to do with mocking.
To decouple your class from the DbContext you could use the repository pattern. Have a look at this article: http://dotnetspeak.com/index.php/2011/03/repository-pattern-with-entity-framework/
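As a rough sketch of that idea (hypothetical names, Entity Framework style as in the linked article, not the only way to shape it):
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Minimal repository abstraction; the linked article develops the idea further.
public interface IRepository<T> where T : class
{
    void Add(T entity);
    IEnumerable<T> GetAll();
}

public class EfRepository<T> : IRepository<T> where T : class
{
    private readonly DbContext _context;

    public EfRepository(DbContext context)
    {
        _context = context;
    }

    public void Add(T entity)
    {
        _context.Set<T>().Add(entity);
        _context.SaveChanges();
    }

    public IEnumerable<T> GetAll()
    {
        return _context.Set<T>().ToList();
    }
}
Your business class then depends on IRepository<T>, which a test can replace with a fake or a mock.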

Best practice for managing life time of business layer, data layer, datacontext instance asp.net website

Our ASP.NET web application uses LINQ to SQL (stored procs are dragged and dropped onto the dbml file to create classes) and a 3-tier architecture similar to the one below. I have sketched rough methods to give the reader a proper idea of the setup so that they can answer well.
namespace MyDataLayer
{
    public class MyDataAccess
    {
        // global instance of datacontext  (#1)
        MyDataModelDataContext myDB = new MyDataModelDataContext();

        public void GetUserIDByUsername(string sUserName, ref int iUserID)
        {
            int? iUserIDout = 0;
            // this will make call to SP in SQL DB
            myDB.USP_RP_GETUSERIDBYUSERNAME(sUserName, "", ref iUserIDout);
            iUserID = (int)iUserIDout;
        }

        public List<USP_APP_USERDETAILSResult> GetUserDetails(string sUserIDs)
        {
            // this will make call to SP in SQL DB
            return myDB.USP_APP_USERDETAILS(sUserIDs).ToList();
        }

        ...
        ... // several CRUD methods
    }
}

namespace MyBusinessLayer
{
    public class SiteUser
    {
        // global DataAccess instance  (#2)
        MyDataLayer.MyDataAccess myDA = new MyDataAccess();

        public void GetUserIDByUsername(string sUserName, ref int iUserID)
        {
            myDA.GetUserIDByUsername(sUserName, ref iUserID);
        }

        public List<USP_APP_USERDETAILSResult> GetUserDetails(string sUserIDs)
        {
            // this will make call to SP in SQL DB
            return myDA.GetUserDetails(sUserIDs);
        }

        ...
        ... // several CRUD methods
    }
}

namespace MyWebApplication
{
    public class BaseWebPage : System.Web.UI.Page
    {
        // static business layer instance  (#3)
        public static MyBusinessLayer.SiteUser UserBLInstance = new SiteUser();
        ...
    }
}

// Index.aspx.cs code fragment
namespace MyWebApplication
{
    public class Index : BaseWebPage
    {
        public void PopulateUserDropDown()
        {
            // using static business layer instance declared in BaseWebPage
            List<USP_APP_USERDETAILSResult> listUsers = UserBLInstance.GetUserDetails("1,2,3");
            // do databinding and so on ...
        }
        ...
    }
}
Questions
(Ref. #1) Is having a global datacontext in DataAccess a good approach? Yes/no, and why?
If your suggestion is a datacontext per request, what is the best practice for that?
(Ref. #2) Is having a global DataAccess instance in the BusinessLayer a good approach? Yes/no, and why?
If your suggestion is a DataAccess instance per request, what is the best practice for that?
(Ref. #3) Is the static business layer instance declared in BaseWebPage a good approach? Yes/no, and why?
What is the best approach to managing the lifetime of the BL and DL instances in general?
We are facing a periodic InvalidCastException on the production server for a very simple method which works fine again if I restart the application from IIS. While the problem is occurring, we can still access the same database from SQL Management Studio and execute the same SP.
Our prime suspect for this issue is poor DataContext management. I have read many articles on the net about managing the lifetime of a DataContext, but I am now confused by the various approaches.
That's why I have elaborated my questions, so that others in the same situation can get a clear idea of the problem and the answers.
(Ref. #1) Is having a global datacontext in DataAccess a good approach? Yes/no, and why?
Yes.
However, creating it manually inside the DataAccess class means that you can't control the lifetime of the datacontext. Instead, make it a constructor parameter so that it is injected into the data access class.
(Ref. #2) Is having a global DataAccess instance in the BusinessLayer a good approach? Yes/no, and why?
Yes. But refer to 1. - make it injectable via the constructor.
(Ref. #3) Is the static business layer instance declared in BaseWebPage a good approach? Yes/no, and why?
No. Avoid statics for complex objects, as such objects usually have non-trivial state, and a lot of nasty issues can happen when you share such objects in a concurrent environment.
To summarize:
public class DataAccess
{
    public DataAccess( DataContext context ) { ... }
}

public class BusinessLayer
{
    public BusinessLayer( DataAccess access ) { ... }
}

public class MyPage : Page
{
    ...
    var ctx = TheDataContext.Current;
    var bl = new BusinessLayer( new DataAccess( ctx ) );
}
with data context shared in a request scope:
public partial class TheDataContext
{
    // Allow the datacontext to be shared in a request-scope
    public static TheDataContext Current
    {
        get
        {
            if ( HttpContext.Current.Items["context"] == null )
                HttpContext.Current.Items.Add( "context", new TheDataContext() );
            return (TheDataContext)HttpContext.Current.Items["context"];
        }
    }
}
In your sample, what you call MyDataLayer would usually be named a Repository. It is definitely good to keep the DataContext instances inside the repositories and not use them outside. That way only the repositories depend on LINQ to SQL, which means you can create stub objects for those repositories and easily test the other parts of your application.
You should definitely dispose of your DataContext instances; a DataContext holds on to too many objects to just keep them alive and wait for the GC to collect them. Notice that you don't create any transaction objects when working with DataContexts, so LINQ to SQL seems to be built around the idea of one context per transaction (you can also handle transactions manually, but do you really want to?). Disposing of the DataContext inside each repository method is a good approach, because it also prevents you from relying on the favorite feature of all ORM frameworks, lazy loading. If you try lazy loading you will like it, but it is usually one of the main causes of performance degradation.
You should definitely keep a DataContext alive for no longer than a single request. Don't try to use a "long session" (keeping a DataContext across more than one HTTP request); it is just a pain, nothing else. If you want to read more about this, look up a couple of articles about the Long Running Session pattern in Hibernate. I tried it with NHibernate - don't do this at home ;)
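A minimal sketch of the per-method disposal suggested above, reusing the generated types from the question (the repository shape itself is hypothetical):
using System.Collections.Generic;
using System.Linq;

public class UserRepository
{
    public List<USP_APP_USERDETAILSResult> GetUserDetails(string userIds)
    {
        // create, use and dispose the DataContext within a single call;
        // nothing outside this method can accidentally trigger lazy loading on it
        using (var db = new MyDataModelDataContext())
        {
            return db.USP_APP_USERDETAILS(userIds).ToList();
        }
    }
}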

How to pass unit of work container into constructor of repository using dependency injection

I'm trying to work out how to complete my implementation of the Repository pattern in an ASP.NET web application.
At the moment, I have a repository interface per domain class defining methods for e.g. loading and saving instances of that class.
Each repository interface is implemented by a class which does the NHibernate stuff. Castle Windsor sorts out the DI of the class into the interface according to web.config. An example of an implemented class is provided below:
public class StoredWillRepository : IStoredWillRepository
{
    public StoredWill Load(int id)
    {
        StoredWill storedWill;
        using (ISession session = NHibernateSessionFactory.OpenSession())
        {
            storedWill = session.Load<StoredWill>(id);
            NHibernateUtil.Initialize(storedWill);
        }
        return storedWill;
    }

    public void Save(StoredWill storedWill)
    {
        using (ISession session = NHibernateSessionFactory.OpenSession())
        {
            using (ITransaction transaction = session.BeginTransaction())
            {
                session.SaveOrUpdate(storedWill);
                transaction.Commit();
            }
        }
    }
}
As pointed out in a previous thread, the repository class needs to accept a unit of work container (i.e. ISession) rather than instantiating one in every method.
I anticipate that the unit of work container will be created by each aspx page when needed (for example, in a property).
How do I then specify that this unit of work container instance is to be passed into the constructor of StoredWillRepository when Windsor is creating it for me?
Or is this pattern completely wrong?
Thanks again for your advice.
David
I have a persistence framework built on top of NHibernate that is used in a few Web apps. It hides the NH implementation behind an IRepository and IRepository<T> interface, with the concrete instances provided by Unity (thus I could in theory swap out NHibernate for, say, Entity Framework fairly easily).
Since Unity doesn't (or at least the version I'm using doesn't) support the passing in of constructor parameters other than those that are dependency injections themselves, passing in an extant NH ISession isn't possible; but I do want all objects in the UOW to share the same ISession.
I solve this by having a controlling repository class that manages access to the ISession on a per-thread basis:
public static ISession Session
{
    get
    {
        lock (_lockObject)
        {
            // if a cached session exists, we'll use it
            if (PersistenceFrameworkContext.Current.Items.ContainsKey(SESSION_KEY))
            {
                return (ISession)PersistenceFrameworkContext.Current.Items[SESSION_KEY];
            }
            else
            {
                // must create a new session - note we're not caching the new session here...
                // that's the job of BeginUnitOfWork().
                return _factory.OpenSession(new NHibernateInterceptor());
            }
        }
    }
}
In this example, PersistenceFrameworkContext.Current.Items accesses a keyed item collection that is stored in a [ThreadStatic] field when not in a Web context, or within HttpContext.Current.Items when in a Web context (to avoid thread-pool problems). The first call to the property instantiates the ISession from the stored factory instance; subsequent calls just retrieve it from storage. The locking slows things down slightly, but not as much as locking a single appdomain-scoped static ISession instance.
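A rough reconstruction of what such a context class might look like; this is hypothetical, not the author's actual code, and a string-keyed dictionary is assumed because the surrounding code indexes Items by SESSION_KEY:
using System;
using System.Collections.Generic;
using System.Web;

public class PersistenceFrameworkContext
{
    private const string ItemsKey = "PersistenceFrameworkContext.Items";

    private static readonly PersistenceFrameworkContext _current = new PersistenceFrameworkContext();

    // one item store per worker thread when running outside of a web request
    [ThreadStatic]
    private static Dictionary<string, object> _threadItems;

    public static PersistenceFrameworkContext Current
    {
        get { return _current; }
    }

    public IDictionary<string, object> Items
    {
        get
        {
            if (HttpContext.Current != null)
            {
                // inside a web request: store the items per request to avoid thread-pool reuse problems
                var items = (Dictionary<string, object>)HttpContext.Current.Items[ItemsKey];
                if (items == null)
                {
                    items = new Dictionary<string, object>();
                    HttpContext.Current.Items[ItemsKey] = items;
                }
                return items;
            }

            // outside a web request: fall back to thread-local storage
            return _threadItems ?? (_threadItems = new Dictionary<string, object>());
        }
    }
}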
I then have BeginUnitOfWork and EndUnitOfWork methods to take care of the UOW - I have specifically disallowed nested UOWs because frankly they were a pain to manage.
public void BeginUnitOfWork()
{
    lock (_lockObject)
    {
        if (PersistenceFrameworkContext.Current.Items.ContainsKey(SESSION_KEY))
            EndUnitOfWork();

        ISession session = Session;
        PersistenceFrameworkContext.Current.Items.Add(SESSION_KEY, session);
    }
}

public void EndUnitOfWork()
{
    lock (_lockObject)
    {
        if (PersistenceFrameworkContext.Current.Items.ContainsKey(SESSION_KEY))
        {
            ISession session = (ISession)PersistenceFrameworkContext.Current.Items[SESSION_KEY];
            PersistenceFrameworkContext.Current.Items.Remove(SESSION_KEY);
            session.Flush();
            session.Dispose();
        }
    }
}
Finally, a pair of methods provide access to the domain-type-specific repositories:
public IRepository<T> For<T>()
    where T : PersistentObject<T>
{
    return Container.Resolve<IRepository<T>>();
}

public TRepository For<T, TRepository>()
    where T : PersistentObject<T>
    where TRepository : IRepository<T>
{
    return Container.Resolve<TRepository>();
}
(Here, PersistentObject<T> is a base class providing ID and Equals support.)
Access to a given repository is thus in the pattern
NHibernateRepository.For<MyDomainType>().Save();
This is then facaded over such that you can use
MyDomainType.Repository.Save();
Where a given type needs a specialised repository (i.e. needs more than it can get from IRepository<T>), I create an interface deriving from IRepository<T>, an extended implementation inheriting from my IRepository<T> implementation, and in the domain type itself I override the static Repository property using new:
new public static IUserRepository Repository
{
    get
    {
        return MyApplication.Repository.For<User, IUserRepository>();
    }
}
(MyApplication [which is called something less noddy in the real product] is a facade class which takes care of supplying the Repository instance via Unity so you have no dependency on the specific NHibernate repository implementation within your domain classes.)
This gives me full pluggability via Unity for the repository implementation, easy access to the repository in code without jumping through hoops, and transparent, per-thread ISession management.
There's lots more code than just what's above (and I've simplified the example code a great deal), but you get the general idea.
MyApplication.Repository.BeginUnitOfWork();
User user = User.Repository.FindByEmail("wibble@wobble.com");
user.FirstName = "Joe"; // change something
user.LastName = "Bloggs";
// you *can* call User.Repository.Save(user), but you don't need to, because...
MyApplication.Repository.EndUnitOfWork();
// ...causes session flush which saves the changes automatically
In my Web app, I have session-per-request, so BeginUnitOfWork and EndUnitOfWork get called in BeginRequest and EndRequest respectively.
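A minimal Global.asax sketch of that session-per-request wiring, reusing the MyApplication.Repository facade from the answer above (the exact hook-up in the real app may differ):
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // one unit of work (and therefore one ISession) per HTTP request
        MyApplication.Repository.BeginUnitOfWork();
    }

    protected void Application_EndRequest(object sender, EventArgs e)
    {
        // flushes the session, saving any pending changes, then disposes it
        MyApplication.Repository.EndUnitOfWork();
    }
}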
I have a pretty similar structure to yours, and here's how I solve your question:
1) To provide the container to each method, I have a separate class ("SessionManager") which I then invoke via a static property. Here's an example using my Save implementation:
private static ISession NHibernateSession
{
    get { return SessionManager.Instance.GetSession(); }
}

public T Save(T entity)
{
    using (var transaction = NHibernateSession.BeginTransaction())
    {
        ValidateEntityValues(entity);
        NHibernateSession.Save(entity);

        transaction.Commit();
    }

    return entity;
}
2) My container is not created on each ASPX page. I instantiate all of my NHibernate goodness on the global.asax page.
** A few more things spring to mind **
3) You don't need a helper to initialize the loaded entity. You might as well use Get instead of Load. More information at: Difference between Load and Get.
4) Using your current code, you would have to repeat pretty much the same code for each domain object you need (StoredWillRepository, PersonRepository, CategoryRepository, etc.), which seems like a drag. You could very well use a generic class to operate over NHibernate, like:
public class Dao<T> : IDao<T>
{
    public T SaveOrUpdate(T entity)
    {
        using (var transaction = NHibernateSession.BeginTransaction())
        {
            NHibernateSession.SaveOrUpdate(entity);
            transaction.Commit();
        }

        return entity;
    }
}
In my implementation, I could then use something like:
Service<StoredWill>.Instance.SaveOrUpdate(will);
Technically, the answer to my question is to use the overload of container.Resolve which allows you to specify the constructor argument as an anonymous type:
IUnitOfWork unitOfWork = [Code to get unit of work];
_storedWillRepository = container.Resolve<IStoredWillRepository>(new { unitOfWork = unitOfWork });
But let's face it, the answers provided by everyone else have been much more informative.
