I need my LINQ to SQL DataContext to be available across my business/data layer so that all of my repository objects can access it. However, since this is a web app, I want to create and destroy it per request. I'm wondering whether a singleton class that lazily creates the DataContext and attaches it to the current HttpContext would work. My question is: would the DataContext get disposed automatically when the request ends? Below is the code for what I'm thinking. Would this accomplish my purpose: a thread-safe DataContext instance that is lazily available and automatically disposed when the request ends?
public class SingletonDC
{
    public static NorthwindDataContext Default
    {
        get
        {
            NorthwindDataContext defaultInstance = (NorthwindDataContext)System.Web.HttpContext.Current.Items["datacontext"];
            if (defaultInstance == null)
            {
                defaultInstance = new NorthwindDataContext();
                System.Web.HttpContext.Current.Items.Add("datacontext", defaultInstance);
            }
            return defaultInstance;
        }
    }
}
What you imagine makes sense - using the HTTP request context to store stuff - but no, disposable objects stored in the current HttpContext will not auto-magically be disposed when the request ends. You will have to handle that yourself, somehow.
There is an "End Request" event that you can hook into easily, for example using code that you drop into Global.asax.cs. In your Application_EndRequest() method, you can call Dispose() manually on each object in the list that requires it.
One way to do it is to iterate through each item in the context, test for IDisposable, and then call Dispose if appropriate.
protected void Application_EndRequest(Object sender, EventArgs e)
{
    foreach (var key in HttpContext.Current.Items.Keys)
    {
        var disposable = HttpContext.Current.Items[key] as IDisposable;
        if (disposable != null)
        {
            disposable.Dispose();
            HttpContext.Current.Items[key] = null;
        }
    }
}
I think that oughtta do it. ASP.NET doesn't do this for you automatically. Of course you need protection from exceptions and so on before using this code in a real app.
Keith Craig of Vertigo wrote a relevant post on the topic a while ago, describing what you want to do as a pattern - in other words, a way of doing things that ought to be repeated. He provides a class to help out with that: it lazy-loads the DB context and drops it into the current context. There are some pitfalls with the approach - you can read about them in the comment discussion on that post. A number of related articles are also cited in the comments.
Cheeso's code will generate an InvalidOperationException "Collection was modified; enumeration operation may not execute" because it is trying to modify the HttpContext items that it's iterating over.
You can use a copy of the list to prevent this.
protected void Application_EndRequest(Object sender, EventArgs e)
{
    // Copy the keys first (ArrayList is in System.Collections) so we can
    // null out entries without modifying the collection being enumerated.
    var keys = new ArrayList(HttpContext.Current.Items.Keys);
    foreach (var key in keys)
    {
        var disposable = HttpContext.Current.Items[key] as IDisposable;
        if (disposable != null)
        {
            disposable.Dispose();
            HttpContext.Current.Items[key] = null;
        }
    }
}
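Putting the two answers together, here is a sketch (an illustrative variant, not from either original answer) that snapshots the keys and also adds the exception protection mentioned above. It goes in Global.asax.cs and needs using System.Collections for ArrayList:

protected void Application_EndRequest(Object sender, EventArgs e)
{
    // Snapshot the keys so nulling out entries cannot invalidate the enumeration.
    var keys = new ArrayList(HttpContext.Current.Items.Keys);
    foreach (var key in keys)
    {
        var disposable = HttpContext.Current.Items[key] as IDisposable;
        if (disposable == null)
            continue;

        try
        {
            disposable.Dispose();
        }
        catch (Exception)
        {
            // Log and carry on so one failing Dispose doesn't stop the others.
        }
        finally
        {
            HttpContext.Current.Items[key] = null;
        }
    }
}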
I have a problem accessing the HttpContext.Current value in the Application_Start method.
There are lots of discussions about this topic, but I want to share my business logic and get some advice on how to handle the problem.
Let's look at this business logic together, step by step:
1 - I want to design a "custom object" that has a static global List property, so that any user can add a LogObj object to this list wherever an action occurs.
public class MyLog
{
    public static List<LogObj> LogObjList { get; set; }

    static MyLog()
    {
        LogObjList = new List<LogObj>();
    }
}
2 - I have a System.Timers.Timer object that checks the static global list every X milliseconds and performs some action defined in the code:
public static void init()
{
    System.Timers.Timer t = new System.Timers.Timer();
    t.Elapsed += T_Elapsed;
    t.Interval = 3000;
    t.Start();
}

private static void T_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    // perform some code.
    var s = HttpContext.Current.Session["test"];
    var logObj = MyLog.LogObjList[0] + s;
    // save as text file ...
}
3 - When I start the init() method in the Application_Start event in Global.asax, I get an "object reference ..." error on the line that reads HttpContext.Current.Session.
So:
If I do not access any of HttpContext.Current's properties, I have no problem.
But if I do need to access a property of HttpContext.Current in the Timer.Elapsed handler, I have this problem.
So I need your advice, or an alternative way of implementing my algorithm.
Thank you
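One way around this (a hedged sketch, not from the original thread; the Log helper and the LogObj properties Message and SessionValue are assumptions) is to read everything the timer needs from HttpContext while the request is still alive, store it on the log entry, and let the timer work only with that captured data - there is simply no current request or session on a timer thread:

using System.Web;

public static class RequestLogger
{
    // Called from request code, where HttpContext.Current is guaranteed to exist.
    public static void Log(string message)
    {
        string sessionValue = null;
        var ctx = HttpContext.Current;
        if (ctx != null && ctx.Session != null)
            sessionValue = ctx.Session["test"] as string;

        // Copy the values now; the timer callback never touches HttpContext.Current.
        MyLog.LogObjList.Add(new LogObj { Message = message, SessionValue = sessionValue });
    }
}

Note that LogObjList itself is shared between request threads and the timer thread, so access to it also needs to be synchronized (a lock or a concurrent collection).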
I have the following code example, used in an ASP.NET MVC application.
The purpose of this code is to create a "fire and forget" request that queues a long-running operation.
public JsonResult SomeAction() {
    HttpContext ctx = HttpContext.Current;
    Task.Run(() => {
        HttpContext.Current = ctx;
        //Other long running code here.
    });
    return Json("{ 'status': 'Work Queued' }");
}
I know this is not a good way to handle HttpContext.Current in asynchronous code, but our current implementation does not allow us to do anything else.
I would like to understand how dangerous this code is.
The question: is it theoretically possible that setting HttpContext.Current inside Task.Run will attach the context to a totally different request?
I think yes, but I'm not sure. As I understand it:
Request1 is handled by Thread1 from the thread pool; then, while Thread1 is handling an entirely different request (Request2), the code inside Task.Run sets Request2's current context to the one from Request1.
Maybe I am wrong, but my knowledge of ASP.NET internals doesn't let me work this out.
Thanks!
Let me bump a little internals on you:
public static HttpContext Current
{
    get { return ContextBase.Current as HttpContext; }
    set { ContextBase.Current = value; }
}

internal class ContextBase
{
    internal static object Current
    {
        get { return CallContext.HostContext; }
        set { CallContext.HostContext = value; }
    }
}

public static object HostContext
{
    get
    {
        var executionContextReader = Thread.CurrentThread.GetExecutionContextReader();
        object hostContext = executionContextReader.IllogicalCallContext.HostContext;
        if (hostContext == null)
        {
            hostContext = executionContextReader.LogicalCallContext.HostContext;
        }
        return hostContext;
    }
    set
    {
        var mutableExecutionContext = Thread.CurrentThread.GetMutableExecutionContext();
        if (value is ILogicalThreadAffinative)
        {
            mutableExecutionContext.IllogicalCallContext.HostContext = null;
            mutableExecutionContext.LogicalCallContext.HostContext = value;
            return;
        }
        mutableExecutionContext.IllogicalCallContext.HostContext = value;
        mutableExecutionContext.LogicalCallContext.HostContext = null;
    }
}
So
var context = HttpContext.Current;
is equal to (pseudocode)
var context = CurrentThread.HttpContext;
and inside your Task.Run something like this happens
CurrentThread.HttpContext = context;
Task.Run will start a new task on a thread from the thread pool. So you're saying that the new thread's "HttpContext property" should reference the starter thread's "HttpContext property" - so far so good (well, apart from all the NullReference/Dispose exceptions you'll be facing once the starter thread's request finishes). The problem is if, inside your
//Other long running code here.
You have a statement like
var foo = await Bar();
Once you hit the await, your current thread is returned to the thread pool, and after the IO finishes you grab a new thread from the thread pool - wonder what its "HttpContext property" is, right? I don't know :) Most probably you'll end up with a NullReferenceException.
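A minimal sketch of that failure mode (illustrative only; Bar() stands in for any awaited IO call, and this is an assumption about where it breaks rather than verified output):

Task.Run(async () =>
{
    HttpContext.Current = ctx;            // set on the pool thread that starts the work

    var foo = await Bar();                // continuation may resume on a different pool thread

    // HttpContext.Current is stored per thread (it does not flow with the logical
    // call context), so here it may well be null again - and ctx itself may already
    // be disposed because the original request has completed.
    var user = HttpContext.Current.User;  // likely NullReferenceException
});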
The issue you will run into here is that the HttpContext will be disposed when the request completes. Since you aren't awaiting the result of the Task.Run, you are essentially creating a race condition between the disposal of the HttpContext and its usage within the task.
I'm pretty sure that the only issue your task will run into is a NullReferenceException or an ObjectDisposedException. I don't see any way where you could accidentally steal another request's context.
Also, unless you are handling & logging exceptions within your task, your fire and forget will throw and you'll never know about it.
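A minimal sketch of that, assuming some hypothetical Logger is available:

Task.Run(() =>
{
    try
    {
        // Other long running code here.
    }
    catch (Exception ex)
    {
        // Without this, the exception disappears with the task (and on older
        // framework versions an unobserved task exception could even tear down
        // the process when the task is finalized).
        Logger.Error("Background work failed", ex);  // Logger is a placeholder
    }
});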
Check out HangFire or consider using a message queue for processing backend jobs from a separate process.
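For example, with HangFire the work is handed to a background job server instead of a raw thread-pool task (a rough sketch; SomeLongRunningJob and GetIdFromRequest are hypothetical):

using Hangfire;

public JsonResult SomeAction()
{
    // Capture plain data from the request; the job must not depend on HttpContext.
    int someId = GetIdFromRequest();  // hypothetical helper

    // HangFire serializes the call and executes it on its own background server,
    // outside the lifetime of this request.
    BackgroundJob.Enqueue(() => SomeLongRunningJob(someId));

    return Json("{ 'status': 'Work Queued' }");
}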
OK, first things first. This is some exception information provided by the support team. I know the line and the code where it happens: it happens in a FirstOrDefault call over a dictionary obtained from the cache.
1) Exception Information
*********************************************
Exception Type: System.InvalidOperationException
Message: Collection was modified; enumeration operation may not execute.
Data: System.Collections.ListDictionaryInternal
Now I wanted to simulate the problem and I could do it in a simple ASP.net application.
My page has 2 Buttons - Button_Process and Button_Add
The code behind is as follows:
public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            var data = Cache["key"];
            if (data == null)
            {
                var dict = new Dictionary<int, string>();
                for (int i = 0; i < 10; i++)
                {
                    dict.Add(i, "i");
                }
                Cache["key"] = dict;
            }
        }
    }

    protected void ButtonProcess_Click(object sender, EventArgs e)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            foreach (var d in data.Values) // In actual code there is FirstOrDefault here
            {
                Thread.Sleep(1000);
                if (d.Contains("5"))
                {
                    // some operation
                }
            }
        }
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            data.Add(new Random().Next(), "101");
            Cache["key"] = data;
        }
    }
}
Now assume there are two requests:
Request 1 - someone clicks Button_Process and some operation on the cached object is taking place.
Request 2 - someone clicks Button_Add, and the first person gets the "Collection was modified" exception.
I understand the problem - it happens because both requests work on the same object in memory. I have two solutions in mind:
1. Use a for loop instead of the foreach (to replace the FirstOrDefault in the actual code). I don't know how efficient this will be after the change, but since I never delete items from the cache, I was considering it.
2. Put a lock around the cache object, or something along those lines - but I don't know exactly where and how I should lock it.
Please help me with this. I am not able to figure out an efficient solution. What is the best way to handle such situations?
This happens because you're working directly with the object that lives in the cache. Good practice, to avoid these exceptions and other weird behavior (such as accidentally modifying the cached object), is to work with a copy of the cached data. There are several ways of achieving that, such as cloning or some kind of deep copy. What I prefer is keeping objects in the cache serialized (in any format you like - JSON/XML/binary or whatever else), since (de)serialization produces a deep copy of the object. The following small code snippet should clarify things:
public static class CacheManager
{
    private static readonly Cache MyCache = HttpRuntime.Cache;

    public static void Put<T>(T data, string key)
    {
        MyCache.Insert(key, Serialize(data));
    }

    public static T Get<T>(string key)
    {
        var data = MyCache.Get(key) as string;
        if (data != null)
            return Deserialize<T>(data);
        return default(T);
    }

    private static string Serialize(object data)
    {
        // this is the Newtonsoft.Json serializer, but you can use the one you like
        return JsonConvert.SerializeObject(data);
    }

    private static T Deserialize<T>(string data)
    {
        return JsonConvert.DeserializeObject<T>(data);
    }
}
And usage:
var myObj = new Dictionary<int, int>();
CacheManager.Put(myObj, "myObj");
//...
var anotherObj = CacheManager.Get<Dictionary<int, int>>("myObj");
Check out the Task Parallel Library for .NET 3.5. It has concurrent collections such as ConcurrentStack, ConcurrentQueue and ConcurrentDictionary.
http://www.nuget.org/packages/TaskParallelLibrary
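A sketch of how that would change the repro above (assuming ConcurrentDictionary is available - System.Collections.Concurrent on .NET 4.0+, or via the package above on 3.5):

// Store a thread-safe dictionary in the cache instead of a plain Dictionary.
var dict = new ConcurrentDictionary<int, string>();
Cache["key"] = dict;

// Writer: TryAdd is safe under concurrency and returns false instead of throwing on duplicates.
var data = Cache["key"] as ConcurrentDictionary<int, string>;
data.TryAdd(new Random().Next(), "101");

// Reader: enumerating a ConcurrentDictionary is safe while other threads add items;
// it will not throw "Collection was modified".
foreach (var d in data.Values)
{
    // some operation
}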
The problem is that the cache object is global to the app domain, so the data stored in it is shared between all requests.
One solution is to take a lock whenever you access the collection and release it afterwards (https://msdn.microsoft.com/en-us/library/vstudio/c5kehkcz%28v=vs.100%29.aspx).
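A minimal sketch of that inside the page class from the repro above, assuming a single shared lock object guards every read and write of the cached dictionary:

private static readonly object CacheLock = new object();

protected void ButtonProcess_Click(object sender, EventArgs e)
{
    lock (CacheLock)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            foreach (var d in data.Values)
            {
                // some operation
            }
        }
    }
}

protected void Button2_Click(object sender, EventArgs e)
{
    lock (CacheLock)
    {
        var data = Cache["key"] as Dictionary<int, string>;
        if (data != null)
        {
            data.Add(new Random().Next(), "101");
            Cache["key"] = data;
        }
    }
}

Note that holding the lock for the duration of a slow loop (the Thread.Sleep in the repro) serializes those requests; that is the trade-off of this approach.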
I'm using EF 5 with Web Forms (ASP.NET 4.5), with the "one DbContext instance per request" approach.
But this situation is a bit more complicated: I have a multi-step create/edit screen, so I store the current entity in Session, manipulate it, and in the final step commit it to the database.
Creating a new instance was fine, but I can't for the life of me edit an existing entity. Because it's another request, my original DbContext instance is gone, and when I attach the entity to a new one I get the "An entity object cannot be referenced by multiple instances of IEntityChangeTracker" error.
My code is far too complex to post here, but I'll try and summarize it accurately:
My DbContext:
public class AppContext : DbContext
{
    // DbSet declarations...

    public static AppContext Current
    {
        get
        {
            var context = HttpContext.Current.Items["Contexts.AppContext"] as AppContext;
            if (context == null)
            {
                context = new AppContext();
                HttpContext.Current.Items["Contexts.AppContext"] = context;
            }
            return context;
        }
    }
}
An example of what the page code looks like:
protected void Page_Load(object sender, EventArgs e)
{
    int? id = null; // After this, I try to get it from the QueryString, parse it, etc. Omitted for the sake of brevity.

    // If I have an ID, it means I'm editing...
    Session["Product"] = id.HasValue ? AppContext.Current.Products.Find(id) : new Product();

    MethodToPopulateFields(); // Internally, it uses the Session variable
}

protected void Step1() // through n
{
    // Manipulates Session["Product"] based on page input...
}

protected void Save()
{
    var product = Session["Product"] as Product;
    if (product.ID == 0)
        product = AppContext.Current.Products.Add(product);

    // throws an exception:
    // AppContext.Current.Entry(product).State = EntityState.Modified;
    // this too:
    // AppContext.Current.Products.Attach(product);

    AppContext.Current.SaveChanges();
}
I know I can get the old entity from the database, update it manually and save, all in the last step, but I really don't want to do that...
Thank you.
Try calling
AppContext.Current.Entry(product).State = EntityState.Detached;
in the first method.
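In other words, something like this (a sketch of how that could fit into the page code above; whether Attach plus setting the state is enough depends on how the entity is modified, so treat it as an illustration rather than a drop-in fix):

protected void Page_Load(object sender, EventArgs e)
{
    int? id = null; // parsed from the QueryString as before

    if (id.HasValue)
    {
        var product = AppContext.Current.Products.Find(id);
        // Detach it from this request's context so it isn't tracked by anyone
        // while it lives in Session between requests.
        AppContext.Current.Entry(product).State = EntityState.Detached;
        Session["Product"] = product;
    }
    else
    {
        Session["Product"] = new Product();
    }

    MethodToPopulateFields();
}

protected void Save()
{
    var product = Session["Product"] as Product;

    if (product.ID == 0)
    {
        AppContext.Current.Products.Add(product);
    }
    else
    {
        // The entity is no longer tracked by any context, so attaching it to the
        // current request's context no longer trips the IEntityChangeTracker check.
        AppContext.Current.Products.Attach(product);
        AppContext.Current.Entry(product).State = EntityState.Modified;
    }

    AppContext.Current.SaveChanges();
}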
I want an ASP.NET cache item to be recycled when a specific file is touched, but the following code is not working:
HttpContext.Current.Cache.Insert(
"Key",
SomeObject,
new CacheDependency(Server.MapPath("SomeFile.txt")),
DateTime.MaxValue,
TimeSpan.Zero,
CacheItemPriority.High,
null);
"SomeFile.txt" does not seem to be checked when I'm hitting the cache, and modifying it does not cause this item to be invalidated.
What am I doing wrong?
Problem Solved:
This was a unique and interesting problem, so I'm going to document the cause and solution here as an Answer, for future searchers.
Something I left out in my question was that this cache insertion was happening in a service class implementing the singleton pattern.
In a nutshell:
public class Service
{
    private static readonly Service _Instance = new Service();

    static Service() { }
    private Service() { }

    public static Service Instance
    {
        get { return _Instance; }
    }

    // The expensive data that this service exposes
    private someObject _data = null;

    public someObject Data
    {
        get
        {
            if (_data == null)
                loadData();
            return _data;
        }
    }

    private void loadData()
    {
        _data = GetFromCache();
        if (_data == null)
        {
            // Get the data from our datasource
            _data = ExpensiveDataSourceGet();
            // Insert into Cache
            HttpContext.Current.Cache.Insert(etc);
        }
    }
}
It may be obvious to some, but the culprit here is lazy loading within the singleton pattern. I was so caught up thinking that the cache wasn't being invalidated, that I forgot that the state of the singleton would be persisted for as long as the worker process was alive.
Cache.Insert has an overload that lets you specify an event handler for when the cache item is removed; my first test was to create a dummy handler and set a breakpoint in it. Once I saw that the cache entry really was being removed, I realized that _data was not being reset to null, so the next request to the singleton got the stale, lazily loaded value.
In a sense I was caching twice; the singleton's copy was very short-lived, but long enough to be annoying.
The solution?
HttpContext.Current.Cache.Insert(
"Key",
SomeObject,
new CacheDependency(Server.MapPath("SomeFile.txt")),
DateTime.MaxValue,
TimeSpan.Zero,
CacheItemPriority.High,
delegate(string key, object value, CacheItemRemovedReason reason)
{
_data = null;
}
);
When the cache is cleared, the state within the singleton must also be cleared...problem solved.
Lesson learned here? Don't put state in a singleton.
Is ASP.NET running under an account with the proper permissions for the file specified in the CacheDependency? If not, then this might be one reason why the CacheDependency is not working properly.
I think you'll need to specify a path:
var d = new CacheDependency(Server.MapPath("SomeFile.txt"));
Prepend ~/App_Data as needed.
Your code looks fine to me. However, beyond this snippet, anything could be going on.
Are you re-inserting on every postback by any chance?
Try making your cache dependency a class field, and checking it on every postback. Modify the file in between and see if it ever registers as "Changed". e.g.:
public partial class _Default : System.Web.UI.Page
{
    // Static so the dependency survives across requests; an instance field would be
    // recreated (and null) on every postback.
    static CacheDependency dep;

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            dep = new CacheDependency(Server.MapPath("SomeFile.txt"));
            HttpContext.Current.Cache.Insert(
                "Key",
                new Object(),
                dep,
                DateTime.MaxValue,
                TimeSpan.Zero,
                CacheItemPriority.High,
                null);
        }

        if (dep != null && dep.HasChanged)
            Response.Write("changed!");
        else
            Response.Write("no change :(");
    }
}
The only way I am able to reproduce this behavior is if the path provided to the constructor of CacheDependency does not exist. The CacheDependency will not throw an exception if the path doesn't exist, so it can be a little misleading.
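A quick way to rule that out (a small defensive sketch, not part of the original answer):

using System.IO;

string path = Server.MapPath("SomeFile.txt");

// CacheDependency silently accepts a non-existent path, so verify it up front.
if (!File.Exists(path))
    throw new FileNotFoundException("Cache dependency file not found", path);

HttpContext.Current.Cache.Insert(
    "Key",
    SomeObject,
    new CacheDependency(path),
    DateTime.MaxValue,
    TimeSpan.Zero,
    CacheItemPriority.High,
    null);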