Entity Framework ObjectContext re-usage - asp.net

I'm learning EF now and have a question regarding the ObjectContext:
Should I create an instance of ObjectContext for every query (function) that accesses the database?
Or is it better to create it once (as a singleton) and reuse it?
Before EF I was using the Enterprise Library Data Access Block and created a DataAccess instance for every data access function...

I think the most common way is to use it per request. Create it at the beginning of the request, do what you need (most of the time these are operations that require a common ObjectContext), and dispose of it at the end. Most DI frameworks support this scenario, but you can also use an HttpModule to create the context and place it in HttpContext.Current.Items. Here is a simple example:
public class MyEntitiesHttpModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        application.BeginRequest += ApplicationBeginRequest;
        application.EndRequest += ApplicationEndRequest;
    }

    private static void ApplicationBeginRequest(object source, EventArgs e)
    {
        // Create one context per request and stash it where any code
        // in the request can reach it.
        var context = new MyEntities();
        HttpContext.Current.Items[@"MyEntities"] = context;
    }

    private void ApplicationEndRequest(object sender, EventArgs e)
    {
        // Dispose the per-request context, if one was created.
        if (HttpContext.Current.Items[@"MyEntities"] != null)
            ((MyEntities)HttpContext.Current.Items[@"MyEntities"]).Dispose();
    }
}
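With the module registered in web.config, any code running during the request can pull the shared context back out of HttpContext.Current.Items. A minimal accessor sketch (the MyEntitiesProvider name is mine, not part of the example above):

public static class MyEntitiesProvider
{
    // Returns the per-request context created by MyEntitiesHttpModule.
    // Assumes the module is registered, so the entry always exists by
    // the time page code runs.
    public static MyEntities Current
    {
        get { return (MyEntities)HttpContext.Current.Items[@"MyEntities"]; }
    }
}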

Definitely create one for every query. It's a lightweight object, so there isn't much cost in creating one each time you need it.
Besides, the longer you keep an ObjectContext alive, the more cached objects it accumulates as you run queries against it. This may cause memory problems, which makes a singleton ObjectContext a particularly bad idea: as your application is used, you load more and more entities into the singleton ObjectContext until you eventually have the entire database in memory (unless you detach entities when you no longer need them).
And then there's a maintainability issue. One day you try to track down a bug but can't figure out where the data was loaded that caused it.
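In practice "one per query" just means a short-lived context per operation, e.g. (MyEntities and its Orders set stand in for your generated context):

public Order GetOrder(int orderId)
{
    // One short-lived context per operation; construction is cheap
    // because EF caches the metadata per AppDomain.
    using (var context = new MyEntities())
    {
        return context.Orders.Single(o => o.Id == orderId);
    }
}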

Don't use a singleton. Everyone using your app will share it, and all sorts of crazy things will happen while that ObjectContext is tracking entities for every user at once.
I would add it as a private member instead.

As Luke says, this question has been asked numerous times on SO.
For a web application, per request cycle seems to work best. Singleton is definitely a bad idea.
Per request works well because one web page has a User, maybe some Projects belonging to that user, maybe some Messages for that user. You want the same ObjectContext so you can traverse User.Messages to get them, maybe mark some messages as read, maybe add a Project, and then either commit or abandon the whole object graph at the end of the page cycle.
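A sketch of that page cycle over one shared context (the User/Message/Project names and the GetPerRequestContext helper are hypothetical):

// Everything below runs against the same per-request ObjectContext,
// so the object graph stays consistent until the single commit.
var context = GetPerRequestContext(); // e.g. from HttpContext.Current.Items
var user = context.Users.First(u => u.Id == currentUserId);

foreach (var message in user.Messages.Where(m => !m.IsRead))
    message.IsRead = true;                    // mark messages as read

user.Projects.Add(new Project { Name = "New project" });

context.SaveChanges(); // commit the whole graph, or never call it to abandon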

Late post here by 7 months. I am currently tackling this question in my app, and I'm leaning towards the @LukLed solution of creating a single ObjectContext for the duration of the HttpRequest. In my architecture, several controls go into building a page, and each control has its own data concerns that pull read-only data from the EF layer. It seems wasteful for each of them to create and use its own ObjectContext. Besides, there are a few situations where one control may pull data into the context that could be reused by other controls. For instance, the header in my master page shows user information that can be reused by the other controls on the page.
My only worry is that I may pull entities into the context that will affect the queries of other controls. I haven't seen that happen yet, but I don't know if I'm asking for trouble. I guess we'll see!

public class DBModel
{
    private const string _PREFIX = "ObjectContext";

    // Usage: DBModel.GetInstance<MyEntities>();
    public static T GetInstance<T>() where T : ObjectContext, new()
    {
        var key = CreateKey<T>();
        // Create the context on first use in this request, then reuse it.
        HttpContext.Current.Items[key] = HttpContext.Current.Items[key] ?? new T();
        return (T)HttpContext.Current.Items[key];
    }

    private static string CreateKey<T>()
    {
        return string.Format("{0}_{1}", _PREFIX, typeof(T).Name);
    }
}
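Note that, unlike the HttpModule example earlier, nothing here disposes the contexts this helper creates. A matching EndRequest handler could sweep them up; a sketch, assuming the same key prefix:

private void ApplicationEndRequest(object sender, EventArgs e)
{
    // Dispose every ObjectContext the DBModel helper stashed for this request.
    var keys = HttpContext.Current.Items.Keys.OfType<string>().ToList();
    foreach (var key in keys)
    {
        if (key.StartsWith("ObjectContext_"))
            ((ObjectContext)HttpContext.Current.Items[key]).Dispose();
    }
}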

Related

Best way for creating and reusing datacontext in ASP .NET MVC application

I have read this post and I'm confused about the best way to create and consume a DataContext.
I always thought you should use a singleton like the one below (plus locking):
private static ModelDataContext dataContext = null;

protected static ModelDataContext DataContext
{
    get
    {
        if (dataContext == null)
            dataContext = new ModelDataContext();
        return dataContext;
    }
}
However, in the article the author says, in short:
LINQ DataContexts cache some of the data and changes you make – you can quickly eat up memory if each instance isn’t disposed fairly quickly. TableAdapters hold open SQLConnections for reuse – so if you use enough classes of TableAdapters, you can have enough different static vars to tie up all of your db connections.
and suggests using the model below instead:
protected static ModelDataContext DataContext
{
    get
    {
        if (System.Web.HttpContext.Current.Items["ModelDataContext"] == null)
            System.Web.HttpContext.Current.Items["ModelDataContext"] = new ModelDataContext();
        return (ModelDataContext)System.Web.HttpContext.Current.Items["ModelDataContext"];
    }
}
What are your thoughts?
I highly recommend using a Dependency Injection container, such as Ninject, to instantiate a new DataContext for each web request.
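With Ninject, for example, the per-request binding is a one-liner (the module name is illustrative; InRequestScope comes from the Ninject.Web.Common package):

public class DataModule : NinjectModule
{
    public override void Load()
    {
        // One ModelDataContext per web request; Ninject disposes it
        // when the request ends.
        Bind<ModelDataContext>().ToSelf().InRequestScope();
    }
}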

EF, UoW and Repository - When to dispose the UnitOfWork in WebForms?

Recently I started digging into the concepts of the Repository pattern and Unit of Work, together with exploring the Entity Framework.
I made my own implementation based on an MVC example, where they dispose the UnitOfWork from the Controller like so:
protected override void Dispose(bool disposing)
{
    unitOfWork.Dispose();
    base.Dispose(disposing);
}
I'm not into MVC at all, and I'm pretty new to WebForms as well, but I assume they override the Controller's Dispose method in order to dispose the UnitOfWork when "everything" else is disposed.
Basically, I'd like to implement the same concept in my ASP.NET WebForms website and dispose the UnitOfWork used behind a page's code together with the disposal of the page itself.
I considered doing this in the Page_Unload event of the page life cycle, but I wasn't sure if that is the proper way, as I haven't dealt with such things before. My idea is as follows:
protected void Page_Unload(object sender, EventArgs e)
{
    unitOfWork.Dispose();
    base.Dispose();
}
How can I achieve that safely, and am I on the right track?
First of all: don't reinvent the wheel.
Use a Dependency Injection framework such as StructureMap, Ninject, Unity, etc.
Your UoW should be started at the beginning of a web request and disposed when the request ends.
In other words: the EF DataContext should be initialized when a request starts. You can then store it somewhere (Session, HttpContext.Current.Items, ...) and reuse it for the rest of that request: one instance of the DataContext per request.
If you try to do this all yourself you are going the wrong way; a dependency injection framework makes it easy.
The framework can handle the lifetime of your DataContext (UoW) for you.
You should divide your software into layers and components...
UI --> Controller --> Service --> DAL
E.g.
UI.Submit --> Controller.SaveUserInfo --> UsersManagementService.SaveUserInfo -->
public void SaveUserInfo(User user)
{
    using (var uow = new Uow())
    {
        var dbUser = uow.Users.GetByKey(user.Id);
        if (dbUser == null)
        {
            // TODO:
        }

        dbUser.Name = user.Name;
        dbUser.Address = user.Address;
        // Or use a framework for injecting properties

        uow.SaveAndAcceptChanges();
    }
}
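The Uow type above is assumed rather than shown. One possible shape, supporting both uow.Users.GetByKey(...) and handing uow.Users to query composers as an IQueryable<User> (MyEntities is a placeholder for the generated context):

public class Uow : IDisposable
{
    private readonly MyEntities _context = new MyEntities();

    public Uow()
    {
        Users = new UserRepository(_context.Users);
    }

    public UserRepository Users { get; private set; }

    public void SaveAndAcceptChanges()
    {
        _context.SaveChanges(); // accepts the changes on success by default
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}

// Thin wrapper so the unit of work exposes both key lookups and
// composable LINQ queries over the same entity set.
public class UserRepository : IQueryable<User>
{
    private readonly IQueryable<User> _set;

    public UserRepository(IQueryable<User> set) { _set = set; }

    public User GetByKey(int id)
    {
        return _set.SingleOrDefault(u => u.Id == id);
    }

    public Type ElementType { get { return _set.ElementType; } }
    public System.Linq.Expressions.Expression Expression { get { return _set.Expression; } }
    public IQueryProvider Provider { get { return _set.Provider; } }
    public IEnumerator<User> GetEnumerator() { return _set.GetEnumerator(); }
    System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator() { return GetEnumerator(); }
}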
You can even accept query logic in certain service methods, e.g.:
public IList<User> GetUsersMatching(Func<IQueryable<User>, IQueryable<User>> query)
{
    using (var uow = new Uow())
    {
        // For this to work with POCOs you may need to disable proxy
        // creation, since the entities outlive the context.
        var users = query(uow.Users).ToList();
        return users;
    }
}
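A hypothetical consumer then supplies only the filter (IsAdmin is an assumed property); the service keeps ownership of the context's lifetime:

var service = new UsersManagementService();
// The lambda receives the IQueryable<User> from the unit of work and
// narrows it; the query executes inside the service's using block.
var admins = service.GetUsersMatching(
    users => users.Where(u => u.IsAdmin).OrderBy(u => u.Name));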
However, this makes testing more complex and requires consumers of the service to understand the limitations and best practices of LINQ to Entities.
I also recommend not using the pure academic repository-per-base-type pattern: efficiently consuming different types of data from the same domain often requires cross-"repository" queries, either with join clauses in your LINQ or by combining and reshaping the results of multiple queries before returning one clean result.
Instead, treat your services as domain-specific repositories from which you can get various types of output. In other words, use SOA under your MVC and above your EF.

ASP.NET Repository Pattern/Service Layer Caching

I'm beginning to work on the caching infrastructure for my ASP.NET MVC site. The problem is, I can't seem to find a reasonable place for data caching (other than "everywhere").
Right now my architecture looks like this:
Controller -> Service Layer -> Repository. The repository uses Linq to SQL for data access.
The repository exposes generic methods like Insert, GetById, and GetQueryable, which returns an IQueryable that the service layer can further refine.
I like the idea of putting caching in the repository layer, since the service layer shouldn't really care where the data comes from. The problem though is with cache invalidation. The service layer has more information about when data becomes stale than the repository. For instance:
Suppose we have a Users table and an Orders table (the canonical example). The service layer offers methods like GetOrder(int id), which would call the repository layer:
public Order GetOrder(int id)
{
    using (var repo = _repoFactory.Create<Order>())
    {
        return repo.GetById(id);
    }
}
or
repo.GetQueryable(order => order.Id == id && order.HasShipped == false).Single();
If we cache in the repository layer, it seems like the repository would be very limited in knowing when that order data has changed. Suppose the user was deleted, causing all their orders to be deleted with a CASCADE. The service layer could invalidate the Orders cache, since it knows the user was just removed. The repository, though (since it's a Unit of Work), wouldn't be aware. (Ignore the fact that we shouldn't be querying orders for a deleted user; it's just an example.)
There are other situations where I think this shows up. Suppose we want to fetch all of a user's orders:
repo.GetQueryable(order => order.UserId == userId).ToList()
The repository can cache the results of this query, but if another order is added, the cached result is no longer valid. Only the service layer is aware of this, though.
It's also possible that my understanding of the repository layer is wrong. I sort of view it as a facade around the data source (i.e. if we change from L2SQL to EF to whatever, the service layer is unaware of the underlying source).
Realistically, you will need another layer: the data caching layer. Your service layer will use it when requesting data. Upon such a request, the caching layer decides whether it has the data in cache or needs to query the appropriate repository. Likewise, your service layer can tell this data caching layer about an invalidation (the deletion of a particular user, etc.).
What this can mean for your architecture is that your data caching layer implements the same interface(s) your repositories do. A fairly simple implementation would cache the data by entity type and key. However, if you are using a more sophisticated ORM behind the scenes (NHibernate, EF 4, etc.), it should have caching as an option for you.
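A sketch of that idea as a decorator that shares the repository interface and caches by entity type and key (IRepository<T> and the key scheme are assumptions for illustration):

public interface IRepository<T> where T : class
{
    T GetById(int id);
}

public class CachedRepository<T> : IRepository<T> where T : class
{
    private readonly IRepository<T> _inner;

    public CachedRepository(IRepository<T> inner) { _inner = inner; }

    public T GetById(int id)
    {
        // Cache key: entity type + primary key.
        string key = typeof(T).Name + ":" + id;
        var cached = HttpRuntime.Cache[key] as T;
        if (cached != null)
            return cached;

        var entity = _inner.GetById(id);
        if (entity != null)
            HttpRuntime.Cache.Insert(key, entity);
        return entity;
    }

    // The service layer calls this when it knows the data is stale
    // (e.g. after the cascading user delete described above).
    public void Invalidate(int id)
    {
        HttpRuntime.Cache.Remove(typeof(T).Name + ":" + id);
    }
}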
You could put an event on the objects returned by your repositories and have the repository subscribe a cache-invalidation handler to it.
For example,
public class SomethingRepository
{
    public Something GetById(int id)
    {
        var something = _table.Single(x => x.Id == id);
        // Let the entity tell the repository when its data changes.
        something.DataChanged += this.InvalidateCache;
        return something;
    }

    public void InvalidateCache(object sender, EventArgs e)
    {
        // invalidate your cache
    }
}
And your Something object needs a DataChanged event and some public method your service layer can call to trigger it. Like this:
public class Something
{
    private int _id;

    public int Id
    {
        get { return _id; }
        set
        {
            if (_id != value)
            {
                _id = value;
                OnDataChanged();
            }
        }
    }

    public event EventHandler DataChanged;

    public void OnDataChanged()
    {
        if (DataChanged != null)
            DataChanged(this, EventArgs.Empty);
    }
}
So all your service layer needs to know is that the data has changed, and the repository handles the cache invalidation.
I also suggest you take ventaur's advice and put the cache invalidation logic in a separate service. You don't need to go so far as to create a separate "data caching layer", but the logic would be cleaner if kept in a different class.

Do all instances of the same ASPX page share the same static field?

Let's consider this page's code-behind:
public partial class Products : Page
{
    private static SomeClass SharedField;

    public Products() // the constructor must match the class name
    {
        // ... Some logic
    }
}
Do all instances of the Products page share the same SharedField? I know this is basic static-field behavior, but in this case, really? All users access the same static field at the website level (and can't have their own instance of it)?
If so, in what ways would a web developer use this? Or is it a non-recommended practice?
Yes, there will be a single instance of that static field for all users, but only within a single worker process. If you have web farms/web gardens, they will each have their own static instance. If the worker process restarts, you'll get a new static instance.
You'll have to use locking around that shared field to ensure thread safety.
As for why to use that, I'm not sure, I never do it. The best example I can give you is the built-in static HttpContext.Current, which gives you access to the Request, Response, etc.
SharedField will be available in one instance for the entire life-cycle of the web site.
To read a bit more about it, see this answer.
A better practice would be to store your object in the Application state.
Application["MyObject"] = new SomeClass();

What is the best way to reuse pages from one website in another?

I'm developing a new ASP.NET website which is effectively a subset of the pages in another site we've just released. Two or three of the pages will need minor tweaks, but nothing significant.
The obvious answer is simply to copy all of the code and markup files into the new project, make the aforementioned tweaks, and consider the job done. However, I'm not keen on this at all because of the amount of duplicated code it would create.
My next idea was to move the code for the pages (i.e. the code-behind files) into a separate assembly which can then be referenced from both sites. This is a little awkward, however, because if you don't take the designer file with it, you get a lot of build errors relating to missing controls. I don't think moving the designer file is a good idea, though, as it will need to be regenerated each time the markup is altered.
Does anyone have any suggestions for a clean solution to this problem?
You might want to take a look at the MVP pattern. Since you are probably using WebForms, it would be hard to migrate to ASP.NET MVC, but you could implement MVP pretty easily in existing apps.
At a basic level, you would move all the business logic into a Presenter class that has a View representing some sort of interface:
public class SomePresenter
{
    public ISomeView View { get; set; }

    public void InitializeView()
    {
        // Set up all the stuff on the view the first time
        View.Name =   // load from database
        View.Orders = // load from database
    }

    public void LoadView()
    {
        // Handle all the stuff that happens each time the view loads
    }

    public Int32 AddOrder(Order newOrder)
    {
        // Code to update orders and then update the view
    }
}
You would define your interface to hold the atomic types you want to display:
public interface ISomeView
{
    String Name { get; set; }
    IList<Order> Orders { get; set; }
}
Once those are defined you can now simply implement the interface in your form:
public partial class SomeConcreteView : System.Web.UI.Page, ISomeView
{
    public SomePresenter Presenter { get; set; }

    public SomeConcreteView()
    {
        Presenter = new SomePresenter();
        // Use the current page as the view instance
        Presenter.View = this;
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            Presenter.InitializeView();
        }
        Presenter.LoadView();
    }

    // Implement the interface members to bind to actual UI elements
    public String Name
    {
        get { return lblName.Text; }
        set { lblName.Text = value; }
    }

    public IList<Order> Orders
    {
        get { return (IList<Order>)ordersGrid.DataSource; }
        set
        {
            ordersGrid.DataSource = value;
            ordersGrid.DataBind();
        }
    }

    // Respond to UI events and forward them to the presenter
    protected virtual void addOrderButton_OnClick(object sender, EventArgs e)
    {
        Order newOrder = // get the order from the UI
        Presenter.AddOrder(newOrder);
    }
}
As you can see, your code-behind is now extremely simple, so code duplication is not a big deal. Since the core business logic is all wrapped up in a DLL somewhere, you don't have to worry about functionality getting out of sync. Presenters can be used in multiple views, so you get high reuse, and you are free to change the UI without affecting the business logic as long as you adhere to the contract.
This same pattern can apply to user controls as well, so you can get as modular as you need to. It also opens up the possibility of unit testing your logic without having to run a browser :)
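That unit-testing claim is easy to demonstrate with a hand-rolled fake view (the names here are illustrative):

// The presenter runs against any ISomeView, so a test can use a plain
// object instead of a page and assert on what the presenter pushed in.
public class FakeView : ISomeView
{
    public String Name { get; set; }
    public IList<Order> Orders { get; set; }
}

// In a unit test:
var presenter = new SomePresenter { View = new FakeView() };
presenter.InitializeView();
// ...assert on presenter.View.Name and presenter.View.Orders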
The patterns and practices group has a nice implementation of this: WCSF
However, you don't have to use their framework to implement this pattern. I know this may look a little daunting at first, but it will solve many of the problems (in my opinion) you are running into.
Create user controls (widgets) or templates to tweak what you want to achieve.
It might also be possible to achieve the same with CSS styles or JavaScript.
Why not create user controls (or custom controls) from the pages which you wish to share? You can then re-use these across both sites.
What we use in our project (JSP, not ASP, but when it comes to builds and files it surely isn't an issue?) is a base folder of common files, plus another ("instance") folder of additional files and overwrites; our build script (in ANT; Maven should be fine too) first copies the base folders and then, based on a supplied parameter, selects which instance's files to copy across as well.
Thus we can change a file in the base and have the change apply across all instances.
One issue is that changing a base file will not update any instance file that overwrites it, but at least you can make a process for these updates. Presumably you could also use the SVN (etc.) revision to flag a build error if an instance file is older than a base file, but we haven't implemented anything that clever yet.
In addition, your back-end code (Struts actions in our case) will end up handling all cases rather than any particular instance's cases only. But at least all the code is in one place, and the logic should be clear ("if (instance == FooInstance) { doFooInstanceStuff(...); }").
