Does Cache.Insert override the previous AbsoluteExpiration?

If I have an item that already exists in my ASP.NET Cache ... and I just wish to update the value of that cached item .. not the Absolute Expiry value, or Cache Dependencies, etc. .. nothing else BUT the value ... can I use Cache.Insert?
If not, is there any way I can retrieve all those values for the cached item .. and then re-use them when I do the Cache.Insert?
Cheers :)

You can create functions to handle adding a fresh value or updating an existing one as follows:
private static Cache cachingControl = HttpRuntime.Cache; // use the ASP.NET application cache

public void UpdateToCache(string key, object updateValue)
{
    try
    {
        if (key != null)
        {
            cachingControl.Remove(key);
            AddToCache(key, updateValue);
        }
    }
    catch (Exception ex)
    {
        //**ToDo[Logging]** Code for logging
    }
}

public void AddToCache(string key, object saveValue)
{
    try
    {
        if (key != null)
        {
            cachingControl.Insert(key, saveValue, null,
                System.Web.Caching.Cache.NoAbsoluteExpiration,
                TimeSpan.FromMinutes(30));
        }
    }
    catch (Exception ex)
    {
        //**ToDo[Logging]** Code for logging
    }
}
Here you can use the AddToCache function to insert new values and the UpdateToCache function to update the value for an existing key. (This basically involves removing the existing key and adding it again with the updated value.)
There is no direct way to update the existing values.

The Cache.Insert overload that takes just the key and object will simply use default values for the caching behaviour.
From MSDN:
Inserts an item into the Cache object with a cache key to reference its location, using default values provided by the CacheItemPriority enumeration.
You'd be best to create your own helper class to store the values for you as I don't believe there's a way to get at the cached item's behavioural properties.
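If it helps, here is a minimal sketch of such a helper. Everything in it (PolicyCache, UpdateValue, and so on) is made up for illustration, not an existing API; it simply records the expiration settings next to the key at insert time so a later "value only" update can re-insert with the same policy. Dependencies are left out because a CacheDependency generally cannot be reused for a second insert.
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class PolicyCache
{
    private class Policy
    {
        public DateTime AbsoluteExpiration;
        public TimeSpan SlidingExpiration;
        public CacheItemPriority Priority;
    }

    // Not thread-safe; real code would need locking around this dictionary.
    private static readonly Dictionary<string, Policy> _policies =
        new Dictionary<string, Policy>();

    public static void Insert(string key, object value,
        DateTime absoluteExpiration, TimeSpan slidingExpiration,
        CacheItemPriority priority)
    {
        _policies[key] = new Policy
        {
            AbsoluteExpiration = absoluteExpiration,
            SlidingExpiration = slidingExpiration,
            Priority = priority
        };
        HttpRuntime.Cache.Insert(key, value, null,
            absoluteExpiration, slidingExpiration, priority, null);
    }

    // Re-insert a new value using whatever policy was recorded for the key.
    public static void UpdateValue(string key, object newValue)
    {
        Policy p;
        if (_policies.TryGetValue(key, out p))
            HttpRuntime.Cache.Insert(key, newValue, null,
                p.AbsoluteExpiration, p.SlidingExpiration, p.Priority, null);
        else
            HttpRuntime.Cache[key] = newValue; // falls back to default behaviour
    }
}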

Related

Objectify, the @Ignore annotation and caching

I'm using Objectify and have a base entity that all others descend from. This base entity has a non-persisted updated flag that is set by descendant classes when there is a change to an object property value that needs to be saved. I do this to prevent needless writes to the datastore when syncing data with clients.
The base entity looks like this:
@Cache
public abstract class WordBuzzEntity {
    @Ignore
    private boolean updated = false;

    public boolean isUpdated() {
        return updated;
    }

    public void setUpdated() {
        updated = true;
    }

    public void save(boolean async) {
        if (!updated)
            return;
        if (async)
            ofy().save().entity(this);
        else
            ofy().save().entity(this).now();
    }
}
I noticed when loading users with
User user = ofy().load().type(User.class).id(LoginTest.TEST_ID).now();
that the updated flag was sometimes set to true at the point of load.
Is this because of the Objectify session cache or memcache? Are ignored properties cached in this instance when an object is reloaded?
Adding a line to set updated back to false at the point of save resolves my issue, but I'd like to understand what's going on.
The Objectify session cache stores object instances, so ignored fields are not actually ignored when an item is loaded from the cache; it just hands back the last instance rather than creating a new one.

Why is my object not updated in LINQ?

I have a method where I READ objects from DB, for instance:
public Object getProduct(int categoryId, int productId)
{
    DataClassesDataContext db = new DataClassesDataContext(Settings.getDefaultConnectionStringName());
    switch (categoryId)
    {
        case CCategorii.CARTI_ID:
        {
            IEnumerable<Carti> product = (from c in db.Cartis
                                          where c.Carti_id == productId
                                             && c.Vizibil == true
                                          select c);
            if (product.Count() != 0)
                return product.First();
            break;
        }
        //so on
    }
    return null; // no matching product found
}
Now I have another method where I do the update:
public void updateProduct()
{
    Object productToBeUpdated = getProduct(1, 1);
    DataClassesDataContext db = new DataClassesDataContext(Settings.getDefaultConnectionStringName());
    //update some properties of the product
    productToBeUpdated.setQuantity(productToBeUpdated.getQuantity() + 1);
    db.SubmitChanges();
}
Well, the product was successfully read by the previous method, but the changes were not written to the DB.
I think the cause is that I do this READ-UPDATE across two different DataContexts... If this is the cause, how do you treat these situations?
Oh yeah, I could read the product and update it in the same method, but that means duplicating the method I use for reading and adding the update logic to it... and I would like to avoid that.
I would assume it's because you are using a different context for the read and write. Try moving your DataClassesDataContext variable to class level.
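A rough sketch of what that could look like (the class name ProductService and the Quantity property are made up for illustration, not from the question): one context instance held in a field serves both the read and the later SubmitChanges, so the entity stays attached.
using System.Linq;

public class ProductService
{
    // One DataContext shared by the read and the update, so the entity
    // stays attached and SubmitChanges can see the modification.
    private readonly DataClassesDataContext db =
        new DataClassesDataContext(Settings.getDefaultConnectionStringName());

    public Carti GetProduct(int productId)
    {
        return db.Cartis.FirstOrDefault(c => c.Carti_id == productId && c.Vizibil == true);
    }

    public void IncrementQuantity(int productId)
    {
        Carti product = GetProduct(productId);
        if (product != null)
        {
            product.Quantity += 1; // hypothetical property name
            db.SubmitChanges();
        }
    }
}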
One option is: use a common data context, and pass it to your getXXX methods as a parameter:
public Object getProduct(DataClassesDataContext db, int categoryId, int productId)
{
    switch (categoryId)
    {
        case CCategorii.CARTI_ID:
        {
            IEnumerable<Carti> product = (from c in db.Cartis
                                          where c.Carti_id == productId
                                             && c.Vizibil == true
                                          select c);
            if (product.Count() != 0)
                return product.First();
            break;
        }
        //so on
    }
    return null; // no matching product found
}
and then:
public void updateProduct()
{
    using (DataClassesDataContext db = new DataClassesDataContext(Settings.getDefaultConnectionStringName()))
    {
        Object productToBeUpdated = getProduct(db, 1, 1);
        //update some properties of the product
        productToBeUpdated.setQuantity(productToBeUpdated.getQuantity() + 1); // THX @AVD, didn't notice that.
        db.SubmitChanges();
    }
}
You are using two different instances of your DataContext.
When implementing a web app, the best option is usually to align the lifetime of your DataContext with the lifetime of one HTTP request. The lifetime you are using is just too short.
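One common way to get that per-request lifetime without a container is to park the context in HttpContext.Current.Items; this is just a sketch, and DataContextFactory and the Items key are invented names, not from the answer.
using System.Web;

public static class DataContextFactory
{
    private const string ItemsKey = "Request.DataClassesDataContext"; // arbitrary key

    // One DataContext per HTTP request, created lazily and reused for
    // every read/write within that request.
    public static DataClassesDataContext Current
    {
        get
        {
            var items = HttpContext.Current.Items;
            var db = items[ItemsKey] as DataClassesDataContext;
            if (db == null)
            {
                db = new DataClassesDataContext(Settings.getDefaultConnectionStringName());
                items[ItemsKey] = db;
            }
            return db;
        }
    }
}
Disposing the stored context in Application_EndRequest (e.g. from Global.asax) keeps connections from lingering past the request.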
Another option is to attach the object to the write DataContext:
db.Cartis.Attach(yourReadObject);
updateProperties(yourReadObject);
db.SubmitChanges();
EDIT
OK, you have to detach the object from your other context first. See this article on how to do it.
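For example, a sketch of how that could look, assuming the same Carti type as above (the Quantity property name is a guess): reading with ObjectTrackingEnabled set to false keeps the entity detached from the read context, so it can be attached cleanly to the write context.
Carti product;
using (var readDb = new DataClassesDataContext(Settings.getDefaultConnectionStringName()))
{
    readDb.ObjectTrackingEnabled = false; // entities come back detached
    product = readDb.Cartis.First(c => c.Carti_id == 1 && c.Vizibil == true);
}

using (var writeDb = new DataClassesDataContext(Settings.getDefaultConnectionStringName()))
{
    writeDb.Cartis.Attach(product);   // attach as unmodified
    product.Quantity += 1;            // hypothetical property; changes made after Attach are tracked
    writeDb.SubmitChanges();
}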
But I really would recommend using a single DataContext object and extending its lifetime to the HTTP request scope.
This can be done really nicely with an IoC container like Autofac.
You can't use the ++ operator there, and you need to use the same context to update the object. Try this:
productToBeUpdated.setQuantity(productToBeUpdated.getQuantity()+1);
As soon as your DataContext goes out of scope, your entity becomes detached from it. That means it's no longer being tracked by your context, and the context can't save the changes you make to it.
You could share the context so the entity doesn't get detached, or you could reattach it to the second context (DataContext.Attach).

HttpRuntime Close does not remove items from the Cache as advertised

I've created my own cache manager for a web site I'm developing and I was looking to find the best way to clear the cache under certain circumstances.
I found many articles saying the proper way to clear the cache is to call HttpRuntime.Close()
However, in my unit test setup I call the encapsulated HttpRuntime.Close() function and the cache is NOT being cleared out.
I expected it to perform something similar to
foreach (DictionaryEntry cacheItem in HttpRuntime.Cache)
{
    HttpRuntime.Cache.Remove(cacheItem.Key.ToString());
}
The foreach loop works great in my encapsulated function, but the Close() never works right.
Am I misunderstanding the purpose of HttpRuntime.Close() or is there something more sinister going on here?
Don't use Close, it does more than the docs say. And the docs also say not to use it while processing normal requests...
This is the reflected source of Close():
[SecurityPermission(SecurityAction.Demand, Unrestricted=true)]
public static void Close() {
    if (_theRuntime.InitiateShutdownOnce()) {
        SetShutdownReason(ApplicationShutdownReason.HttpRuntimeClose, "HttpRuntime.Close is called");
        if (HostingEnvironment.IsHosted) {
            HostingEnvironment.InitiateShutdown();
        } else {
            _theRuntime.Dispose();
        }
    }
}
Also, you cannot iterate over a collection and remove items from it at the same time, as this renders the enumeration invalid.
So, try this instead, which doesn't change what it loops over:
List<string> toRemove = new List<string>();
foreach (DictionaryEntry cacheItem in HttpRuntime.Cache) {
    toRemove.Add(cacheItem.Key.ToString());
}
foreach (string key in toRemove) {
    HttpRuntime.Cache.Remove(key);
}
That being said, really, you should try to use cache dependencies to have the invalid cache entries cleared automatically for you, and then all this becomes unnecessary.
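If the goal is simply "clear everything I've cached" on demand, one lightweight variant of that idea is a sentinel cache key that all of your entries depend on; removing the sentinel evicts everything that depends on it. The key names below are illustrative only.
const string SentinelKey = "CacheSentinel"; // illustrative key name
object someValue = "whatever you are caching";

// Make sure the sentinel exists before inserting dependent entries.
if (HttpRuntime.Cache[SentinelKey] == null)
    HttpRuntime.Cache.Insert(SentinelKey, new object());

// Each entry depends on the sentinel key (no file dependency, hence the null file array).
HttpRuntime.Cache.Insert(
    "SomeEntry",
    someValue,
    new CacheDependency(null, new[] { SentinelKey }));

// Later: removing the sentinel invalidates every dependent entry at once.
HttpRuntime.Cache.Remove(SentinelKey);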
I understand the issue with enumeration but for some reason the Cache doesn't seem to have a problem removing an item while walking through the list.
If you drill down into the implementation details, you will find that the enumerator is created by CacheSingle.CreateEnumerator, where a new Hashtable instance is created for the enumeration.
That's why you can do the remove inside a foreach loop.
You could simply implement your own cache class; check the one below:
public sealed class YourCache<T>
{
    private Dictionary<string, T> _dictionary = new Dictionary<string, T>();

    private YourCache()
    {
    }

    public static YourCache<T> Current
    {
        get
        {
            string key = "YourCache|" + typeof(T).FullName;
            YourCache<T> current = HttpContext.Current.Cache[key] as YourCache<T>;
            if (current == null)
            {
                current = new YourCache<T>();
                HttpContext.Current.Cache[key] = current;
            }
            return current;
        }
    }

    public T Get(string key, T defaultValue)
    {
        if (string.IsNullOrWhiteSpace(key))
            throw new ArgumentNullException("key", "key should not be NULL or white space");

        T value;
        if (_dictionary.TryGetValue(key, out value))
            return value;

        return defaultValue;
    }

    public void Set(string key, T value)
    {
        if (key == null)
            throw new ArgumentNullException("key");

        _dictionary[key] = value;
    }

    public void Clear()
    {
        _dictionary.Clear();
    }
}
You can retrieve items from the cache or even clear them using the following:
// put something in this intermediate cache
YourCache<ClassObject>.Current.Set("myKey", myObj);
// clear this cache
YourCache<ClassObject>.Current.Clear();

ASP.NET Cache and File Dependencies

I want a ASP.NET cache item to be recycled when a specific file is touched, but the following code is not working:
HttpContext.Current.Cache.Insert(
    "Key",
    SomeObject,
    new CacheDependency(Server.MapPath("SomeFile.txt")),
    DateTime.MaxValue,
    TimeSpan.Zero,
    CacheItemPriority.High,
    null);
"SomeFile.txt" does not seem to be checked when I'm hitting the cache, and modifying it does not cause this item to be invalidated.
What am I doing wrong?
Problem Solved:
This was a unique and interesting problem, so I'm going to document the cause and solution here as an Answer, for future searchers.
Something I left out in my question was that this cache insertion was happening in a service class implementing the singleton pattern.
In a nutshell:
public class Service
{
    private static readonly Service _Instance = new Service();
    static Service() { }
    private Service() { }

    public static Service Instance
    {
        get { return _Instance; }
    }

    // The expensive data that this service exposes
    private someObject _data = null;
    public someObject Data
    {
        get
        {
            if (_data == null)
                loadData();
            return _data;
        }
    }

    private void loadData()
    {
        _data = GetFromCache();
        if (_data == null)
        {
            // Get the data from our datasource
            _data = ExpensiveDataSourceGet();
            // Insert into Cache
            HttpContext.Current.Cache.Insert(etc);
        }
    }
}
It may be obvious to some, but the culprit here is lazy loading within the singleton pattern. I was so caught up thinking that the cache wasn't being invalidated, that I forgot that the state of the singleton would be persisted for as long as the worker process was alive.
Cache.Insert has an overload that allows you to specify an event handler for when the cache item is removed; my first test was to create a dummy handler and set a breakpoint within it. Once I saw that the cache was being cleared, I realized that "_data" was not being reset to null, so the next request to the singleton loaded the lazy-loaded value.
In a sense, I was double caching, though the singleton cache was very short lived, but long enough to be annoying.
The solution?
HttpContext.Current.Cache.Insert(
    "Key",
    SomeObject,
    new CacheDependency(Server.MapPath("SomeFile.txt")),
    DateTime.MaxValue,
    TimeSpan.Zero,
    CacheItemPriority.High,
    delegate(string key, object value, CacheItemRemovedReason reason)
    {
        _data = null;
    }
);
When the cache is cleared, the state within the singleton must also be cleared...problem solved.
Lesson learned here? Don't put state in a singleton.
Is ASP.NET running under an account with the proper permissions for the file specified in the CacheDependency? If not, then this might be one reason why the CacheDependency is not working properly.
I think you'll need to specify a path:
var d = new CacheDependency(Server.MapPath("SomeFile.txt"));
Prepend with ~\App_Data as needed.
Your code looks fine to me. However, beyond this snippet, anything could be going on.
Are you re-inserting on every postback by any chance?
Try making your cache dependency a class field, and checking it on every postback. Modify the file in between and see if it ever registers as "Changed". e.g.:
public partial class _Default : System.Web.UI.Page
{
    CacheDependency dep;

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            dep = new CacheDependency(Server.MapPath("SomeFile.txt"));
            HttpContext.Current.Cache.Insert(
                "Key",
                new Object(),
                dep,
                DateTime.MaxValue,
                TimeSpan.Zero, CacheItemPriority.High, null);
        }

        if (dep.HasChanged)
            Response.Write("changed!");
        else
            Response.Write("no change :(");
    }
}
The only way I am able to reproduce this behavior is if the path provided to the constructor of CacheDependency does not exist. The CacheDependency will not throw an exception if the path doesn't exist, so it can be a little misleading.
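An easy way to guard against that (a small sketch, not from the original answer) is to verify the file exists before wiring up the dependency; SomeObject here is the cached value from the question.
string path = Server.MapPath("SomeFile.txt");
if (!System.IO.File.Exists(path))
    throw new System.IO.FileNotFoundException("CacheDependency file is missing", path);

HttpContext.Current.Cache.Insert("Key", SomeObject, new CacheDependency(path));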

Session State removing and adding overhead

The following is how I usually handle objects in Session State: I have a const string as the session key and then a property with a get and set for the object.
What I was wondering was whether the Session.Remove() call is necessary (to keep things clean and tidy) and whether there is significant overhead in doing this removal.
I have the Session.Remove there basically because it makes me feel better (OCD, I know) and makes me feel like the session is cleaner, but I would like to know if it isn't needed.
private const string sesMyObject = "{C2CC72C3-1466-42D4-8579-CAA11F261D55}";
public MyObject MyObjectProperty
{
    get
    {
        return Session[sesMyObject] as MyObject;
    }
    set
    {
        Session.Remove(sesMyObject);
        Session.Add(sesMyObject, value);
    }
}
EDIT
Per the answers below, I have changed my property to the following:
private const string sesMyObject = "{C2CC72C3-1466-42D4-8579-CAA11F261D55}";
public MyObject MyObjectProperty
{
    get
    {
        return Session[sesMyObject] as MyObject;
    }
    set
    {
        Session[sesMyObject] = value;
    }
}
thanks!
If you really want to be safe, try converting the object to an IDisposable, and if it succeeds, call Dispose.
IDisposable sesDispObj = Session[sesMyObject] as IDisposable;
if (sesDispObj != null)
    sesDispObj.Dispose();
Other than that,
Session[sesMyObject] = value
is pretty much the same as
Session.Remove(sesMyObject);
Session.Add(sesMyObject, value);
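Putting the two together, the setter could look something like this (just a sketch of the idea, not something you need):
set
{
    // Dispose the old value if it owns resources, then overwrite in place.
    IDisposable old = Session[sesMyObject] as IDisposable;
    if (old != null && !ReferenceEquals(old, value))
        old.Dispose();
    Session[sesMyObject] = value;
}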
It's overkill. Referring to MSDN:
If the name parameter refers to an existing session state item, the existing item is overwritten with the specified value.
Session[sesMyObject] = value;
is shorter, simpler to read, and should have slightly better performance, but unless this code is being repeated very many times in succession, it shouldn't make a difference.
