Can someone help me here? I have the following code to store and retrieve a cache entry; however, it doesn't work. The cache expires within minutes even though I set the sliding expiration to 14 days. Thanks in advance!
public static List<ReplyDTO> VideoCommentList()
{
    List<ReplyDTO> replyList = new List<ReplyDTO>();
    if (HttpRuntime.Cache["videoComment"] == null)
    {
        HttpRuntime.Cache.Remove("videoComment");
        HttpRuntime.Cache.Insert("videoComment", replyList, null, Cache.NoAbsoluteExpiration, TimeSpan.FromDays(14));
    }
    else
    {
        replyList = (List<ReplyDTO>)HttpRuntime.Cache["videoComment"];
    }
    if (replyList.Count > 8)
    {
        replyList = replyList.OrderByDescending(x => x.DateCreated).Take(8).ToList();
    }
    else
    {
        replyList = replyList.OrderByDescending(x => x.DateCreated).ToList();
    }
    return replyList;
}

public static List<ReplyDTO> AddVideoComment(ReplyDTO replyDTO)
{
    List<ReplyDTO> replyList = new List<ReplyDTO>();
    replyList = VideoCommentList();
    replyList.Add(replyDTO);
    HttpRuntime.Cache.Insert("videoComment", replyList, null, Cache.NoAbsoluteExpiration, TimeSpan.FromDays(14));
    if (replyList.Count > 8)
    {
        replyList = replyList.OrderByDescending(x => x.DateCreated).Take(8).ToList();
    }
    else
    {
        replyList = replyList.OrderByDescending(x => x.DateCreated).ToList();
    }
    return replyList;
}
The ASP.NET cache is in-memory, so if your IIS process or application pool recycles, it will be cleared. You can check the following things, any of which can cause the process to recycle:
If you modify web.config, IIS shuts down the old instance and gradually moves traffic to a new one; during this hand-over the in-memory cache is discarded. How to check this: you can detect the situation by checking AppDomain.IsFinalizingForUnload and logging it during the removal callback.
Application pool recycling: IIS has a setting that recycles the worker process after it has been idle for a specified time. You can check this on the server and increase the timeout, or disable recycling altogether.
Every process is limited in how much memory it can consume; if you add too many objects to the cache, the memory consumption of IIS grows, and when memory runs critically low the process will be recycled.
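One way to confirm that a recycle is the culprit: ASP.NET records why the application was shut down. A minimal sketch (assuming a Global.asax and a `Log` helper of your own) that captures the shutdown reason:

```csharp
// In Global.asax.cs - logs why the AppDomain is being torn down.
protected void Application_End()
{
    // ShutdownReason reports values such as ConfigurationChange or IdleTimeout.
    var reason = System.Web.Hosting.HostingEnvironment.ShutdownReason;
    Log("Application_End: " + reason);   // Log(...) is your own logging helper
}
```

If this fires with IdleTimeout or ConfigurationChange around the times your cache empties, you have found the cause.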
EDIT
In your program you add replyList to the cache and then perform the .Take() operation. Because replyList is a reference to the cached object, any modification you make to it is reflected in the cache as well. So anywhere in your program that mutates replyList (or the list you read back from the cache) silently changes the cached item too.
So modify your code like this and try:
public static List<ReplyDTO> VideoCommentList()
{
    List<ReplyDTO> replyList = new List<ReplyDTO>();
    if (HttpRuntime.Cache["videoComment"] == null)
    {
        //Call to .Remove is not required
        //HttpRuntime.Cache.Remove("videoComment");
        HttpRuntime.Cache.Insert("videoComment", replyList, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromDays(14));
    }
    else
    {
        //No need to check count > 8, Take will handle it for you
        replyList = ((List<ReplyDTO>)HttpRuntime.Cache["videoComment"])
            .OrderByDescending(x => x.DateCreated)
            .Take(8).ToList();
    }
    return replyList;
}

public static List<ReplyDTO> AddVideoComment(ReplyDTO replyDTO)
{
    //Read from cache
    List<ReplyDTO> replyList = ((List<ReplyDTO>)HttpRuntime.Cache["videoComment"]);
    if (replyList == null)
        replyList = VideoCommentList();
    replyList.Add(replyDTO);
    HttpRuntime.Cache.Insert("videoComment", replyList, null, Cache.NoAbsoluteExpiration, TimeSpan.FromDays(14));
    //Here you are creating a new list, and not referencing the one in the cache
    return replyList.OrderByDescending(x => x.DateCreated).Take(8).ToList();
}
IMPORTANT SUGGESTION
If you want to check when and why your object is removed from the cache, you can use the CacheItemRemovedCallback option at insertion time. Using this together with the CacheItemRemovedReason argument, you can log the reason the object was removed from the cache. The possible reasons are:
Removed - your code removed the item from the cache by calling the Insert or Remove method.
Expired - the item was removed from the cache because it expired.
Underused - the system ran low on memory and freed memory by removing the item from the cache.
DependencyChanged - the item was removed because the cache dependency associated with it changed. (Not applicable in your case.)
Hope this information helps you.
In order to track down WHY your item is being removed from cache, I'd recommend using a different overload of the HttpRuntime.Cache.Insert method that allows you to specify a CacheItemRemovedCallback callback function.
Cache.Insert Method (String, Object, CacheDependency, DateTime, TimeSpan, CacheItemPriority, CacheItemRemovedCallback)
Aside from that, your caching code looks good. Once you change it to specify a callback, log the ejection reason; that will probably give you a better understanding of why your cached item is getting cleared.
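As a concrete sketch of that overload for the "videoComment" item (adapt the logging call to whatever framework you actually use):

```csharp
private static void OnVideoCommentRemoved(string key, object value, CacheItemRemovedReason reason)
{
    // reason is one of: Removed, Expired, Underused, DependencyChanged.
    System.Diagnostics.Trace.WriteLine(
        string.Format("Cache item '{0}' removed, reason: {1}", key, reason));
}

// At insertion time, pass the callback via the longer Insert overload:
HttpRuntime.Cache.Insert("videoComment", replyList, null,
    Cache.NoAbsoluteExpiration, TimeSpan.FromDays(14),
    CacheItemPriority.Default, OnVideoCommentRemoved);
```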
Like most of the other answers, I suspect that your app is getting recycled/reset for any number of reasons. I think that most apps on a production machine recycle at least once a day, especially in a shared hosting environment. So I'd guess that your data will stay cached for a day at most.
The cache is in-memory and will expire when your application recycles. I'm guessing you're evaluating this on a development machine where either the low amount of traffic, or your file edits causes the application to recycle.
Your cache objects can get trimmed due to multiple reasons...
AppDomain recycle.
Memory pressure.
Crash.
Use of Web Garden.
Load balancing.
and so on...
This post should clarify a bit more...
http://blogs.msdn.com/b/praveeny/archive/2006/12/11/asp-net-2-0-cache-objects-get-trimmed-when-you-have-low-available-memory.aspx
In the AddVideoComment() method, change the cache insert line to:
HttpRuntime.Cache.Insert("videoComment", replyList, null, Cache.NoAbsoluteExpiration, TimeSpan.FromDays(14), CacheItemPriority.NotRemovable, null);
And in the VideoCommentList() method, use:
if (HttpRuntime.Cache["videoComment"] == null)
{
    replyList = VideoCommentList();
    HttpRuntime.Cache.Insert("videoComment", replyList, null, Cache.NoAbsoluteExpiration, TimeSpan.FromDays(14), CacheItemPriority.NotRemovable, null);
}
There is no need to call HttpRuntime.Cache.Remove("videoComment"); since HttpRuntime.Cache.Insert(...) replaces the existing cache item.
Related
I have a web site running in its own Application Pool (IIS 8). Settings for the pool are default i.e. recycle every 29 hours.
Our web server only has 8gb RAM and I have noticed that the worker process for this web site regularly climbs to 6gb RAM and slows the server to a crawl. This is the only site currently on the web server.
I also have SQL Express 2016 installed as well. The site is using EF version 6.1.3.
The MVC site is very straightforward. It has a GETPDF controller which finds a row in a table, gets PDF info stored in a field then serves it back to the browser as follows :-
using (eBillingEntities db = new eBillingEntities())
{
    try
    {
        string id = model.id;
        string emailaddress = Server.HtmlEncode(model.EmailAddress).ToLower().Trim();
        eBillData ebill = db.eBillDatas.ToList<eBillData>().Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress).FirstOrDefault<eBillData>();
        if (ebill != null)
        {
            // update the 'LastDownloaded' field.
            ebill.LastDownloaded = DateTime.Now;
            db.eBillDatas.Attach(ebill);
            var entry = db.Entry(ebill);
            entry.Property(en => en.LastDownloaded).IsModified = true;
            db.SaveChanges();
            // Find out from the config record whether the bill is stored in the table or in the local pdf folder.
            Config cfg = db.Configs.ToList<Config>().Where(c => c.Account == ebill.Account).FirstOrDefault<Config>();
            bool storePDFDataInEBillTable = true;
            if (cfg != null)
            {
                storePDFDataInEBillTable = cfg.StorePDFDataInEBillDataTable;
            }
            // End of Modification
            byte[] file;
            if (storePDFDataInEBillTable)
            {
                file = ebill.PDFData;
            }
            else
            {
                string pathToFile = "";
                if (string.IsNullOrEmpty(cfg.LocalPDFDataFolder))
                    pathToFile = cfg.LocalBackupFolder;
                else
                    pathToFile = cfg.LocalPDFDataFolder;
                if (!pathToFile.EndsWith(@"\"))
                    pathToFile += @"\";
                pathToFile += ebill.PDFFileName;
                file = System.IO.File.ReadAllBytes(pathToFile);
            }
            MemoryStream output = new MemoryStream();
            output.Write(file, 0, file.Length);
            output.Position = 0;
            HttpContext.Response.AddHeader("content-disposition", "attachment; filename=ebill.pdf");
            return new FileStreamResult(output, "application/pdf");
        }
        else
            return View("PDFNotFound");
    }
    catch
    {
        return View("PDFNotFound");
    }
}
Are there any memory leaks here?
Will the file byte array and the memory stream get freed up?
Also, is there anything else I need to do concerning clearing up the entity framework references?
If the code looks OK, where would be a good place to start looking?
Are there any memory leaks here?
No.
Will the file byte array and the memory stream get freed up?
Eventually, yes. But that may be the cause of your excessive memory use.
Also, is there anything else I need to do concerning clearing up the entity framework references?
No.
If the code looks OK, where would be a good place to start looking?
If this code is the cause of your high memory use, it's because you are loading files into memory, and you're keeping two copies of each file: once in the byte[] and once more when copying it into a MemoryStream.
There's no need to do that.
To eliminate the second copy of the file use the MemoryStream(byte[]) constructor instead of copying the bytes from the byte[] to an empty MemoryStream.
To eliminate the first copy in memory, you can stream the data into a temporary file that will be the target of your FileStreamResult, or initialize the FileStreamResult using a ADO.NET stream.
See https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/sqlclient-streaming-support
If you go with ADO.NET streaming, your DbContext will need to be scoped to your Controller instead of a local variable, which is a good practice in any case.
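To make the second point concrete, here is a minimal sketch of the action's tail using the MemoryStream(byte[]) constructor, with the same variables as in the question's code:

```csharp
// Wrap the existing buffer instead of copying it into a fresh stream;
// writable: false because the response only reads from it.
var output = new MemoryStream(file, writable: false);
HttpContext.Response.AddHeader("content-disposition", "attachment; filename=ebill.pdf");
return new FileStreamResult(output, "application/pdf");
```

This removes the Write/Position dance entirely and halves the per-request allocation for each PDF.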
In addition to David's advice, I noticed that I was doing the following:
db.eBillDatas.ToList<eBillData>()
so I was pulling the entire table from the database and then filtering it in memory with the Where clause. I didn't notice the problem until the database started to fill up. I removed the ToList() call and now the IIS worker process sits at about 100 MB.
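For reference, the fix amounts to dropping the ToList() so that the filter is translated to SQL, along these lines:

```csharp
// Where/FirstOrDefault now execute in the database, returning
// at most one row instead of materializing the whole table.
eBillData ebill = db.eBillDatas
    .Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress)
    .FirstOrDefault();
```

The same change applies to the db.Configs.ToList<Config>() lookup in the original code.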
This is a follow-up to this question, which contains contradictory answers. I am also interested in an answer related to a more recent version of ASP.NET.
My application uses HttpRuntime.Cache to cache some lists of models that should never expire. They are loaded on application warmup and they are changed quite rarely, but read quite often.
My code is the following:
private void ReportRemovedCallback(string key, object value, CacheItemRemovedReason reason)
{
    if (!ApplicationPoolService.IsShuttingDown())
    {
        var str = $"Removed cached item with key {key} and count {(value as IDictionary)?.Count}, reason {reason}";
        LoggingService.Log(LogLevel.Info, str);
    }
}

private IDictionary<int, T> ThreadSafeCacheAccessAction(Action<IDictionary<int, T>, bool> action = null)
{
    // refresh cache if necessary
    var dict = HttpRuntime.Cache[CacheDictKey] as IDictionary<int, T>;
    bool invalidated = false;
    if (dict == null)
    {
        lock (CacheLockObject)
        {
            // getting expiration times from model attribute
            var cacheInfo = typeof(T).GetCustomAttributes(typeof(UseInCachedRepositoryAttribute), inherit: true).FirstOrDefault() as UseInCachedRepositoryAttribute;
            int absoluteExpiration = cacheInfo?.AbsoluteExpiration ?? Constants.Cache.IndefiniteRetention;
            int slidingExpiration = cacheInfo?.SlidingExpiration ?? Constants.Cache.NoSlidingExpiration;
            dict = _modelRepository.AllNoTracking.ToList().Where(item => item.PkId != 0).ToDictionary(item => item.PkId, item => item);
            HttpRuntime.Cache.Insert(CacheDictKey, dict, dependencies: null,
                absoluteExpiration: DateTime.Now.AddMinutes(absoluteExpiration),
                slidingExpiration: slidingExpiration <= 0 ? Cache.NoSlidingExpiration : TimeSpan.FromMinutes(slidingExpiration),
                priority: CacheItemPriority.NotRemovable,
                onRemoveCallback: ReportRemovedCallback);
            invalidated = true;
        }
    }
Based on the documentation provided here, I have also included the following markup within the web.config:
<caching>
<cache disableExpiration="true" disableMemoryCollection="true" />
</caching>
However, from time to time, ReportRemovedCallback is called for item removal. My feeling is that caching configuration from web.config is ignored (the documentation clearly states it is outdated) and that CacheItemPriority.NotRemovable means only "a very high priority", not "never remove".
Question: is there a way to convince HttpRuntime.Cache to never remove some items? Or should I consider another caching mechanism?
OK, so I dug some more and there is no definitive answer, but the following configuration from the old documentation seems to apply regardless of any attempt to disable expiration:
The following default cache element is not explicitly configured in
the machine configuration file or in the root Web.config file, but is
the default configuration returned by application in the .NET
Framework version 2.0.
<cache disableMemoryCollection="false"
disableExpiration="false" privateBytesLimit="0"
percentagePhysicalMemoryUsedLimit="90"
privateBytesPollTime="00:02:00" />
So, if physical memory usage climbs above 90%, the runtime will evict cache items. Since the OS tends to use almost all physical memory for the system cache (as reported by Task Manager), this is not as unlikely as it sounds.
Alternative
I took MemoryCache for a spin, since it is very similar to HttpRuntime.Cache. It provides similar functionality but lacks the CacheEntryUpdateCallback (you can set it, but Add complains with an ArgumentException if it is anything other than null).
Now my code is the following:
var dict = MemoryCache.Default.Get(CacheDictKey) as IDictionary<int, T>;
if (dict == null)
{
    lock (CacheLockObject)
    {
        // getting expiration times from model attribute
        var cacheInfo = typeof(T).GetCustomAttributes(typeof(UseInCachedRepositoryAttribute), inherit: true).FirstOrDefault() as UseInCachedRepositoryAttribute;
        int absoluteExpiration = cacheInfo?.AbsoluteExpiration ?? Constants.Cache.IndefiniteRetention;
        int slidingExpiration = cacheInfo?.SlidingExpiration ?? Constants.Cache.NoSlidingExpiration;
        dict = _modelRepository.AllNoTracking.ToList().Where(item => item.PkId != 0).ToDictionary(item => item.PkId, item => item);
        var cacheItemPolicy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTime.Now.AddMinutes(absoluteExpiration),
            SlidingExpiration = slidingExpiration <= 0 ? ObjectCache.NoSlidingExpiration : TimeSpan.FromMinutes(slidingExpiration),
            Priority = System.Runtime.Caching.CacheItemPriority.NotRemovable,
            // throws ArgumentException
            // UpdateCallback = CacheEntryUpdateCallback
        };
        MemoryCache.Default.Add(CacheDictKey, dict, cacheItemPolicy);
    }
}
After some tests there were no extra removals, and the memory consumption of w3wp.exe rose as expected.
More details are provided within this answer.
I found this behaviour by accident: I return the count of items in the session in an error message and found that some sessions had as many as 120 items in them (they should have 1!). On further investigation I found that every request seems to add an item to the session. They are all negative integers, like -710, -140, -528. I can't see a pattern in which numbers come up.
I have checked my code for any interactions with the Session object and as far as I can tell it is not me. I store one item in the session which is my own object which has a number of other properties on it. My session state is SQL server, and I am only serialising a certain set of values that need to be kept.
Has anyone seen anything like this or has any advice on where I can troubleshoot further?
Thank you in advance.
-- Edit, as requested - first where I count the items in the session - this is done in the page load event of my master page. I loop through so I could inspect using the debugger.
int itemCount = Session.Count;
for (int i = 0; i < itemCount; i++)
{
    object o = Session[i];
}
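To see which keys (not just values) are piling up, the same loop can dump the key names as well, which makes the rogue entries much easier to identify. A small diagnostic sketch:

```csharp
for (int i = 0; i < Session.Count; i++)
{
    // Session.Keys[i] is the name under which the item was stored.
    System.Diagnostics.Debug.WriteLine(
        string.Format("{0} = {1}", Session.Keys[i], Session[i]));
}
```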
-- here is where I add my custom object to the session. This is called at session start and in my master page. It runs on a "get, but if not there, create" principle.
HttpSessionState Session = HttpContext.Current.Session;
HttpRequest Request = HttpContext.Current.Request;
if (Session == null)
    return null;
SessionData sessionData = (SessionData)Session[StaticNames.SESSION_NAME];
if (sessionData == null)
{
    sessionData = new SessionData();
    Session.Add(StaticNames.SESSION_NAME, sessionData);
}
I also have this constructor, which is used to rebuild the SessionData object when it is deserialized from the session:
public SessionData(SerializationInfo info, StreamingContext ctxt)
{
    this.IsManualLogin = (bool)info.GetValue("IsManualLogin", typeof(bool));
    this.HasAskedUserForLocation = (bool)info.GetValue("HasAskedUserForLocation", typeof(bool));
    // ... etc, more items for all users here
    int? loginID = null;
    try
    {
        loginID = info.GetInt32("LoginID");
    }
    catch
    {
        return;
    }
    this.LoginID = loginID.Value;
    // ... etc, more items for logged in users only
}
There is also an equivalent method for adding this data to the SerializationInfo used for SqlSessionState.
Credit to the modest jadarnel27.
It turns out the Ajax Control Toolkit NoBot control adds an integer into your session on every request. My website has an auto 40 second refresh, similar to facebook, so this probably would have brought the whole thing crashing down at some point and I am lucky to find it now. Should anyone else consider using the NoBot control, be warned about this behaviour!
We discovered recently that our ASP.NET application has a lot of lines like this:
Label lbXXX = (Label)FormView.FindControl("lbXXX");
The same goes for TextBox, Panel, Image, DropDownList...
Could this be the cause of a memory leak?
Is it as bad as I think?
This is not likely to be causing a memory leak.
To find the cause of a memory leak you should use a memory profiler and find out what is holding on to references that it shouldn't be.
The most common cause of memory leaks in .NET is event handlers that have not been unregistered, though this tends not to be a problem in ASP.NET due to its request-per-thread model.
If you suspect a memory leak (how have you determined that you have one?), profile in order to find the reason - don't assume.
A memory leak is practically impossible here. That said, heavy use of FindControl is not recommended: since you are seeing so many of these calls, it would be better to resolve them and clean up your code. FindControl is also costing you performance right now.
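One cheap cleanup for repeated lookups is to resolve each control once and keep the reference, instead of calling FindControl every time the control is needed. A sketch (the control IDs here are hypothetical):

```csharp
// Resolve once and reuse; each FindControl call walks the control tree.
Label lbStatus = (Label)FormView.FindControl("lbStatus");   // hypothetical ID
TextBox tbName = (TextBox)FormView.FindControl("tbName");   // hypothetical ID

if (lbStatus != null)
{
    lbStatus.Text = "loaded";
}
```

In templated controls where the IDs are known at design time, strongly typed fields or the generated designer references avoid FindControl altogether.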
It's unlikely that it causes memory issues. However, it doesn't come without cost.
As you can see below, the first action is EnsureChildControls, which calls CreateChildControls if the child controls have not been created yet. That might be the source of your performance/memory issue.
Called by the ASP.NET page framework to notify server controls that
use composition-based implementation to create any child controls they
contain in preparation for posting back or rendering.
In that case it would not be FindControl itself causing the issue, but your custom control(s).
This is the implementation (from ILSpy):
protected virtual Control FindControl(string id, int pathOffset)
{
this.EnsureChildControls();
if (!this.flags[128])
{
Control namingContainer = this.NamingContainer;
if (namingContainer != null)
{
return namingContainer.FindControl(id, pathOffset);
}
return null;
}
else
{
if (this.HasControls())
{
this.EnsureOccasionalFields();
if (this._occasionalFields.NamedControls == null)
{
this.EnsureNamedControlsTable();
}
}
if (this._occasionalFields == null || this._occasionalFields.NamedControls == null)
{
return null;
}
char[] anyOf = new char[]
{
'$',
':'
};
int num = id.IndexOfAny(anyOf, pathOffset);
string key;
if (num == -1)
{
key = id.Substring(pathOffset);
return this._occasionalFields.NamedControls[key] as Control;
}
key = id.Substring(pathOffset, num - pathOffset);
Control control = this._occasionalFields.NamedControls[key] as Control;
if (control == null)
{
return null;
}
return control.FindControl(id, num + 1);
}
}
I have an ASP.NET application that caches some business objects. When a new object is saved, I call remove on the key to clear the objects. The new list should be lazy loaded the next time a user requests the data.
Except there is a problem with different views of the cache in different clients.
Two users are browsing the site
A new object is saved by user 1 and the cache is removed
User 1 sees the up to date view of the data
User 2 is also using the site but for some reason does not see the newly cached data after user 1 saves a new object - they continue to see the old list.
This is a shortened version of the code:
public static JobCollection JobList
{
    get
    {
        if (HttpRuntime.Cache["JobList"] == null)
        {
            GetAndCacheJobList();
        }
        return (JobCollection)HttpRuntime.Cache["JobList"];
    }
}

private static void GetAndCacheJobList()
{
    using (DataContext context = new DataContext(ConnectionUtil.ConnectionString))
    {
        var query = from j in context.JobEntities
                    select j;
        JobCollection c = new JobCollection();
        foreach (JobEntity i in query)
        {
            Job newJob = new Job();
            ....
            c.Add(newJob);
        }
        HttpRuntime.Cache.Insert("JobList", c, null, Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration, CacheItemPriority.Default, null);
    }
}

public static void SaveJob(Job job, IDbConnection connection)
{
    using (DataContext context = new DataContext(connection))
    {
        JobEntity ent = new JobEntity();
        ...
        context.JobEntities.InsertOnSubmit(ent);
        context.SubmitChanges();
        HttpRuntime.Cache.Remove("JobList");
    }
}
Does anyone have any ideas why this might be happening?
Edit: I am using Linq2SQL to retrieve the objects, though I am disposing of the context.
I would ask you to make sure you do not have multiple production servers for load-balancing purposes. In that case you will have to use some external dependency mechanism for invalidating/removing the cache items across servers.
That's because you don't synchronize the cache operations. You should lock when writing your list to the cache (possibly even build the list inside the lock) and also when removing it from the cache. Otherwise, even if reading and writing are individually synchronized, nothing prevents the old list from being stored again right after your call to Remove. Let me know if you need a code example.
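A sketch of what that could look like for the code in the question (assuming a private lock object; the double-check keeps reads cheap once the list is cached):

```csharp
private static readonly object CacheLock = new object();

public static JobCollection JobList
{
    get
    {
        var jobs = (JobCollection)HttpRuntime.Cache["JobList"];
        if (jobs == null)
        {
            lock (CacheLock)
            {
                // Re-check: another thread may have rebuilt the list
                // while this one was waiting for the lock.
                jobs = (JobCollection)HttpRuntime.Cache["JobList"];
                if (jobs == null)
                {
                    GetAndCacheJobList();   // inserts into the cache
                    jobs = (JobCollection)HttpRuntime.Cache["JobList"];
                }
            }
        }
        return jobs;
    }
}

// SaveJob should take the same lock around its Remove call, so a stale
// list cannot be re-inserted between the Remove and the next rebuild.
```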
I would also check, if you haven't already, that the old data they're seeing hasn't been somehow cached in ViewState.
You have to make sure that user 2 sent a new request. Maybe the content they see comes from their browser's cache, not the cache on your server.