I have recently inherited an ASP.NET app using Linq2SQL. Currently, it has its DataContext objects declared as static in every page, and I create them the first time I find they are null (a singleton, sort of).
I'd like comments on whether this is good or bad, both in situations where I only need to read from the DB and in situations where I need to write as well.
How about having just one DataContext instance for the entire application?
One DataContext per application would perform badly, I'm afraid. The DataContext isn't thread safe, for starters, so even using one as a static member of a page is a bad idea. As asgerhallas mentioned, it is ideal to use the context for a unit of work - typically a single request. Anything else and you'll start to find all of your data is in memory and you won't be seeing updates without an explicit refresh. Here are a couple of posts that talk about those two subjects: Identity Maps and Units of Work
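To make that concrete, here is a rough sketch (not from the posts above) of scoping one DataContext to each request by parking it in HttpContext.Items; MyDataContext stands in for whatever your generated context class is called:

    using System;
    using System.Web;

    // A minimal sketch of one DataContext per request. MyDataContext is a
    // hypothetical name for the designer-generated context. Keeping it in
    // HttpContext.Items means it is created lazily, never shared between
    // requests or threads, and can be disposed when the request ends.
    public static class DataContextFactory
    {
        private const string Key = "__dataContext";

        public static MyDataContext Current
        {
            get
            {
                var items = HttpContext.Current.Items;
                var context = items[Key] as MyDataContext;
                if (context == null)
                {
                    context = new MyDataContext();
                    items[Key] = context;
                }
                return context;
            }
        }

        // Call this from Application_EndRequest in Global.asax.
        public static void DisposeCurrent()
        {
            var context = HttpContext.Current.Items[Key] as IDisposable;
            if (context != null) context.Dispose();
        }
    }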
I usually have one DataContext per request, but it depends on the scenarios you're facing.
I think the point with L2S was to use it with the unit of work pattern, where you have a context per ... well unit of work. But it doesn't work well in larger applications as it's pretty hard to reattach entities to a new context later.
Rick Strahl has a real good introduction to the topic here:
http://www.west-wind.com/weblog/posts/246222.aspx
One thing I have had problems with in the past is using one context for both read and write scenarios. The change tracking done in the DataContext is quite an overhead when you are just reading, which is what most web apps tend to do most of the time. You can make the DataContext read-only and it will speed things up quite a bit - but then you'll need another context for writing.
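For example, something along these lines is how I'd set up the read-only context (MyDataContext, the Products table and the IsActive column are made-up names):

    using System.Collections.Generic;
    using System.Linq;

    // A minimal sketch of a read-only DataContext. With object tracking off
    // the context skips change tracking, which helps read-heavy pages, but
    // SubmitChanges() cannot be used on this instance.
    public static class ProductReader
    {
        public static List<Product> GetActiveProducts()
        {
            using (var db = new MyDataContext())
            {
                db.ObjectTrackingEnabled = false;  // set before the first query
                return db.Products.Where(p => p.IsActive).ToList();
            }
        }
    }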
I found something funny; I noticed it by luck while I was debugging something else. I was applying the MVP pattern and I made a singleton controller to be shared among all presenters.
Suddenly I figured out that some event handler is called once after the first postback, twice if there are two postbacks, 100 times if there are 100 postbacks.
This is because the singleton is based on a static variable which holds the instance, the static variable lives across postbacks, and I wired the event assuming it would be wired once - but it was actually rewired on every postback.
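Roughly, the situation looks like this (a simplified sketch, all names made up):

    using System;

    // The controller is a static singleton, so it survives across postbacks;
    // subscribing in Page_Load adds ANOTHER handler on every postback.
    public class Controller
    {
        public static readonly Controller Instance = new Controller();
        public event EventHandler SomethingHappened;

        public void RaiseSomething()
        {
            var handler = SomethingHappened;
            if (handler != null) handler(this, EventArgs.Empty);
        }
    }

    public partial class MyPage : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Bug: this runs on every request, but Controller.Instance is the
            // same object for the whole application, so after N postbacks the
            // handler is attached N times (and the old page instances are
            // kept alive by the event).
            Controller.Instance.SomethingHappened += OnSomethingHappened;
        }

        private void OnSomethingHappened(object sender, EventArgs e) { /* ... */ }
    }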
I think we should think twice before applying a singleton in a web application - or am I missing something?
Thanks
I would think twice about using a Singleton anywhere.
Many consider Singleton an anti-pattern.
Some consider it an anti-pattern, judging that it is overused, introduces unnecessary limitations in situations where a sole instance of a class is not actually required, and introduces global state into an application.
There are lots of references on Wikipedia that discuss this.
It is very rare to need a singleton and personally I hold them in the same light as global variables.
You should think twice any time you are using static objects in a multi-threaded application (not only the singleton pattern) because of the shared state. Proper locking mechanisms should be applied in order to synchronize access to the shared state. Failing to do so, some very difficult-to-find bugs can appear.
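For example, a minimal sketch of guarding shared static state (the dictionary here is just an illustration):

    using System.Collections.Generic;

    // Without the lock, concurrent requests mutating the dictionary can
    // corrupt it or produce hard-to-reproduce bugs.
    public static class SharedCounters
    {
        private static readonly object Sync = new object();
        private static readonly Dictionary<string, int> Counts =
            new Dictionary<string, int>();

        public static void Increment(string key)
        {
            lock (Sync)
            {
                int current;
                Counts.TryGetValue(key, out current);
                Counts[key] = current + 1;
            }
        }

        public static int Get(string key)
        {
            lock (Sync)
            {
                int value;
                Counts.TryGetValue(key, out value);
                return value;
            }
        }
    }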
I've been using Singletons in my web apps for quite some time and they have always worked out quite well for me, so to say they're a bad idea is really a pretty difficult claim to believe. The main idea, when using Singletons, is to keep all the session-specific information out of them, and to use them more for global or application data. To avoid them because they are "bad" is really not too smart because they can be very useful when applied correctly.
I'm new to .NET and was wondering if there is a performance gain from keeping an instance of, for example, a DAL object in scope?
Coming from the ColdFusion world, I would instantiate a component and store it in the application scope, so that every time my code needed to use that component it would not have to be instantiated over and over again, affecting performance.
Is there any benefit to doing this in ASP.Net apps?
Unless you are actually experiencing a performance problem, you need not worry yourself with optimizations like this.
Solve the business problems first, and use good design. As long as you have a decent abstraction layer for your data access code, then you can always implement a caching solution later down the road if it becomes a problem.
Remember that any caching solution increases complexity dramatically.
NO. In the multi-tier world of ASP.NET this would be considered a case of "premature optimization". Once a site's suite of stubs, scripts and programs has scaled up and been running for a few months, you can look at logs and traces to see what might be cached, spawned or rewritten to improve performance. And as the infamous Jeff Atwood says, most web server performance work benefits more from money spent on new and improved hardware than from tweaking code for hours and hours.
Yes indeed you can and probably should. Oftentimes the storage for this is in the Session; you store data that you want for the user.
If it's a global thing, you may load it in the Application_Start event and place it somewhere, possibly the HttpCache.
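For example, a rough sketch (the repository name and cache key are made up) of loading something global at startup and parking it in the cache:

    using System;
    using System.Web;
    using System.Web.Caching;

    // Global.asax code-behind: load reference data once and cache it.
    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            var countries = CountryRepository.LoadAll();   // hypothetical DAL call

            HttpRuntime.Cache.Insert(
                "Countries",
                countries,
                null,                           // no cache dependency
                Cache.NoAbsoluteExpiration,
                TimeSpan.FromHours(1));         // expire if unused for an hour
        }
    }

    // Reading it back later (reload if the cache evicted it):
    // var countries = HttpRuntime.Cache["Countries"] as IList<Country>
    //                 ?? CountryRepository.LoadAll();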
And just a note, some people use "Premature Optimisation" to avoid optimising at all; this is nonsense. It is reasonable to cache in this case.
It is very important to do a cost-benefit analysis before caching any object; one must consider all the factors, like:
Performance advantage
Frequency of use
Hardware
Scalability
Maintainability
Time available for delivery (one of the most important factors)
Finally, it is always useful to cache objects which are very costly to create or which you use very frequently, e.g. table data (from the DB) or XML data.
Does the class you are considering this for have state? If not (and DAL classes often do not have state, or do not need state), then you should make its methods static, and then you don't need to instantiate it at all. If the only state it holds is a connection string, you can also make that property or field static, and avoid the requirement of instantiating it that way.
Otherwise, take a look at the design pattern called Flyweight.
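To illustrate the stateless, static DAL idea above, here is a rough sketch with made-up names; since every call opens and disposes its own connection, the static class stays thread-safe:

    using System.Configuration;
    using System.Data.SqlClient;

    // Static methods plus a static connection string, so nothing needs to be
    // instantiated. The connection string name, table and query are made up.
    public static class CustomerDal
    {
        private static readonly string ConnectionString =
            ConfigurationManager.ConnectionStrings["Main"].ConnectionString;

        public static int CountCustomers()
        {
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand("SELECT COUNT(*) FROM Customers", connection))
            {
                connection.Open();
                return (int)command.ExecuteScalar();
            }
        }
    }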
I'm looking at some vb.net code I just inherited, and cannot fathom why the original developer would do this.
Basically, each "Domain" class is a collection of properties. And each one implements IDisposable.Dispose, and overrides Finalize(). There is no base class, so each just extends Object.
Dispose sets each private var to Nothing, or calls _private.Dispose when the property is another domain object. There's a private var that tracks the disposed state, and the final thing in Dispose is GC.SuppressFinalize(Me).
Finalize just calls Me.Dispose and MyBase.Finalize.
Is there any benefit to this? Any harm? There are no un-managed resources, no db connections, nothing that would seem to need this.
That strikes me as being a VB6 pattern.
I would bet the guy was coming straight from VB6, maybe in the earlier days of .NET when these patterns were not widely understood.
There is also one case where setting an internal reference to Nothing is useful in a call to Dispose: when the member is marked as WithEvents.
Without that, you risk having an uncollected object handling events when it really should not be doing that anymore.
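In C# terms, the equivalent is to explicitly unsubscribe the handler in Dispose; a small sketch with made-up types:

    using System;

    // The publisher's event list holds a reference to the subscriber, so a
    // "disposed" subscriber would otherwise be kept alive and keep reacting.
    public class Subscriber : IDisposable
    {
        private readonly Publisher _publisher;

        public Subscriber(Publisher publisher)
        {
            _publisher = publisher;
            _publisher.Changed += OnChanged;
        }

        private void OnChanged(object sender, EventArgs e) { /* react */ }

        public void Dispose()
        {
            // Equivalent of setting a WithEvents field to Nothing in VB:
            _publisher.Changed -= OnChanged;
        }
    }

    public class Publisher
    {
        public event EventHandler Changed;

        public void RaiseChanged()
        {
            var handler = Changed;
            if (handler != null) handler(this, EventArgs.Empty);
        }
    }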
It would seem to me that this is something that is NOT needed at all, especially without un-managed resources and data connections.
If you happen to be able to sanitize and post the code we might be able to get a bit more insight, but realistically I can't see a need for it.
Depending on the size of the objects, and how often they are created/destroyed, it could be to ensure GC can happen as early as possible.
It may be that this pattern was used in other projects and it continues on without understanding why it was used in the first place. Monkey Gardeners
The only reason that I could see for this -- and this is dubious at best -- is if these things are being created and disposed of higher in the "food chain" and there is a potential for some of these domain classes to have either a limited or unmanaged resource at some point.
Even that is sketchy...it sounds like someone came from an unmanaged background and was looking for the .NET equivalent to managing your memory and came across the IDisposable interface.
I read Rick Strahl's post about DataContext lifetime management, and some of the other related questions on Stackoverflow. If they contained an answer to my question, I must have missed it.
I generally follow the atomic approach and instantiate a DataContext for a unit of work when it is needed, and dispose it afterwards. This worked well until I hit a scenario with a complex page that contains a multi-view control with several grids and popup panels that all represent one unit of work. The data is in memory (I actually stuff the root object into the session so that the entire hierarchy is available across post-backs). Obviously, the DataContext is long gone by the time the user clicks on "Save".
Tom Brune's comment caught my eye at first, because it seemed like such an elegant approach - to use reflection to "wet" a fresh copy of the object and to update the database using a new DataContext. However, Rick's concerns about this approach are valid, and since my data structures are complex and hierarchical, I don't think I will try this.
So I am left with a few options, as far as I can see:
either use Rick's suggestion to deserialize/serialize the object and re-attach it to a new context
or hand-code the logic that compares and updates a fresh copy of the object
Which one should I follow, and is there a third option, i.e. can I keep the DataContext around between post-backs? If that's feasible, it would require the least amount of coding, as my root object has about a dozen children.
My suggestion would be to go with your first bullet point there and deserialize/serialize the object and then re-attach it to a new context.
I've used that approach in the past and it has worked well for me. I think you'll run in to less issues and have an easier implementation ahead.
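For illustration, a rough sketch of that approach, assuming your entities were generated with serialization enabled and using made-up type names:

    using System.IO;
    using System.Runtime.Serialization;

    // Clone via DataContract serialization; MyDataContext and Order are
    // placeholders for your own generated types.
    public static class EntityCloner
    {
        public static T Clone<T>(T source)
        {
            var serializer = new DataContractSerializer(typeof(T));
            using (var stream = new MemoryStream())
            {
                serializer.WriteObject(stream, source);
                stream.Position = 0;
                return (T)serializer.ReadObject(stream);
            }
        }
    }

    // On "Save", clone the object that lived in session and attach the copy
    // to a fresh context as modified (LINQ to SQL generally wants a
    // timestamp/version column, or the original values, to build the update):
    //
    // var copy = EntityCloner.Clone(orderFromSession);
    // using (var db = new MyDataContext())
    // {
    //     db.Orders.Attach(copy, true);
    //     db.SubmitChanges();
    // }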
In a web application, would it be OK to declare the context of an Entity Framework model as static? Is it OK or not recommended? Why?
Thanks!
Almost definitely not.
ObjectContexts get bigger and bigger as more Objects are queried / saved.
Also, sharing an ObjectContext between threads, as you would be doing, is not recommended because of the locking issues and non-deterministic side effects you would have to deal with.
I wrote a tip on this topic a while back.
Tip 18 - How to decide on a lifetime for your ObjectContext
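If it helps, the usual alternative to a static context is simply to create one per unit of work and dispose it; a rough sketch with made-up names:

    using System.Linq;

    // One context per unit of work instead of a static one. MyEntities,
    // Customer and the property names are placeholders for your model.
    public class CustomerService
    {
        public Customer GetCustomer(int id)
        {
            using (var context = new MyEntities())
            {
                return context.Customers
                              .Where(c => c.CustomerId == id)
                              .FirstOrDefault();
            }
        }

        public void RenameCustomer(int id, string newName)
        {
            using (var context = new MyEntities())
            {
                var customer = context.Customers
                                      .Where(c => c.CustomerId == id)
                                      .First();
                customer.Name = newName;
                context.SaveChanges();
            }
        }
    }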
This answer sort of answers your question, as does this one. I certainly wouldn't consider having it as static!
Rick Strahl has an in-depth article on lifecycle management.
I had done this the first time I implemented Entity Framework. The problem was that the whole application was getting "completed" events, so I had to write a lot of code to figure out where each call came from.
I decided to refactor so each page would have an instance of the context. I like it much better now.
/my experience