ASP.NET cache Application Variable

I am wondering about potential issues with, and alternatives to, caching using an application variable. To preface: I have a custom application object that stores all application data in ONE application variable. The object organizes the data as an XML cloud and stores that cloud in a single variable for performance.
I am aware of DataSet caching, but all of my DAL objects select (read only) into data readers for performance, so if the solution involves DataSets I have a ton of refactoring to do, which is not ideal. Record counts are low to medium, involving website data and small to medium project management apps; we're not talking half a million records.
Here is the structure of the function I intend to use in my DAL classes (Select All):
if (AppSetting.GetSetting(_CacheSettingName) == "")
{
    //load cache: query the table as XML and store the result in the application variable
    SqlHelper objSqlHelper = new SqlHelper();
    XmlReader objReader = objSqlHelper.GetXmlReaderByCmd("Select * From FAQ FOR XML PATH('item'), root('" + _CacheSettingName + "')");
    StringBuilder _xml = new StringBuilder();
    objReader.Read();
    while (objReader.ReadState != ReadState.EndOfFile)
    {
        //ReadOuterXml returns the current node's markup and advances the reader
        _xml.Append(objReader.ReadOuterXml());
    }
    objReader.Close(); //close the reader as well, not just the helper
    objSqlHelper.Dispose();
    AppSetting.SaveSetting(_CacheSettingName, _xml.ToString());
}
//we have cache loaded
//now load the object list from the XML cloud from the application cache
List<FAQBLL> objFAQList = new List<FAQBLL>();
XmlDocument oXmlDocument = new XmlDocument();
oXmlDocument.LoadXml(AppSetting.GetSetting(_CacheSettingName));
foreach (XmlNode oNode in oXmlDocument.FirstChild.ChildNodes)
{
    FAQBLL objFAQ = new FAQBLL();
    objFAQ.ID = Convert.ToInt32(oNode.SelectSingleNode("ID").InnerXml);
    objFAQ.Question = oNode.SelectSingleNode("Question").InnerXml;
    objFAQ.Answer = oNode.SelectSingleNode("Answer").InnerXml;
    objFAQList.Add(objFAQ);
}
return objFAQList.Count > 0 ? objFAQList : null;
So my cache returns everything to the calling proc, and then I LINQ over the result to filter the objects (by active, by location). On insert, update, and delete I have one line of code to clear the cache. Again, I have the advantage of using read-only data readers for performance, and my application object organizes the XML into one application variable. My thinking was that this is low overhead, since I don't have to trade my data readers for DataSets when I'm just reading data.
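For illustration, the filtering step looks something like this (Active, LocationID, and locationID are hypothetical names; the real properties aren't shown above):
//hypothetical filter over the cached list
var objActiveFAQList = objFAQList
    .Where(f => f.Active && f.LocationID == locationID)
    .ToList();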
Thoughts? Opinions? Are you traumatized by this?

Yes, it is a little traumatizing. The problem with using application variables as a cache is that you forfeit some of the best features of real caches. For example, the ASP.NET cache provides some awesome cache invalidation features. These ensure that a) the cache does not take up more and more resources over time, and b) the cache remains fresh (for example, if sourced from a database).
You should really think about using the right tool for the job.
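As a minimal sketch of what that could look like, the FAQ lookup above could go through HttpRuntime.Cache instead of an application variable. The cache key, the ten-minute expiration, and LoadFaqsFromDatabase are illustrative; FAQBLL is the type from the question:
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class FaqCache
{
    private const string CacheKey = "FAQ_All";

    public static List<FAQBLL> GetAll()
    {
        // Try the ASP.NET cache first; unlike an Application variable,
        // entries can expire and are evicted under memory pressure.
        var cached = HttpRuntime.Cache[CacheKey] as List<FAQBLL>;
        if (cached != null)
            return cached;

        List<FAQBLL> faqs = LoadFaqsFromDatabase();

        HttpRuntime.Cache.Insert(
            CacheKey,
            faqs,
            null,                            // a SqlCacheDependency could go here
            DateTime.UtcNow.AddMinutes(10),  // absolute expiration (example value)
            Cache.NoSlidingExpiration);

        return faqs;
    }

    public static void Invalidate()
    {
        // Call this from the insert/update/delete paths.
        HttpRuntime.Cache.Remove(CacheKey);
    }

    private static List<FAQBLL> LoadFaqsFromDatabase()
    {
        // Placeholder for the existing data-reader "select all" from the question.
        throw new NotImplementedException();
    }
}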

Related

Downsides of streaming large JSON or HTML content to a browser in ASP.NET MVC

I am working with a legacy ASP.NET Framework MVC application that is experiencing memory problems, such as occasional bursts of OutOfMemoryExceptions across a variety of operations. The application frequently operates on large lists of objects (sometimes tens to hundreds of megabytes) and then serializes them to JSON to return to the client. We are not totally sure what the source of the OutOfMemoryExceptions is, but a likely candidate is memory fragmentation caused by too many large objects going onto the Large Object Heap.
We think a quick win is to refactor some of the controller endpoints to serialize their JSON content using a stream writer (as outlined in the JSON.NET documentation) and to stream the content back to the browser. This won't eliminate the memory load of the data lists prior to serialization, but in theory it should cut down the overall amount of data going onto the LOH.
The code is written to send the results in chunks of less than 85 KB (the LOH threshold):
public async Task<ActionResult> MyControllerMethod()
{
    var data = GetData();
    Response.BufferOutput = false;
    Response.ContentType = "application/json";
    var serializer = JsonSerializer.Create();
    // Buffer size is kept just below the 85,000-byte LOH threshold
    using (var sw = new StreamWriter(Response.OutputStream, Encoding.UTF8, 84999))
    {
        sw.AutoFlush = false;
        serializer.Serialize(sw, data);
        await sw.FlushAsync();
    }
    return new EmptyResult();
}
I am aware of a few downsides to this approach, but don't consider them showstoppers:
Unit tests are more complex to implement due to the EmptyResult returned by the controller.
I have read there is a small overhead from a P/Invoke call whenever data is flushed (in practice I haven't noticed this).
The content cannot be post-processed, e.g. by an HttpHandler.
A Content-Length header cannot be set, which may be useful for the client in some cases.
What other downsides or potential problems exist with this approach?

Slowdown issue in web project

I just need a suggestion in this case. There is a PIN code field in my ASP.NET project, and I have around 50,000 PIN codes stored in a SQL Server database. The page has a drop-down that gets its values from the database, and when I run the project on localhost it slows down. I think this is because a huge amount of data is being rendered into the HTML: when I click View Source at run time, I can see all the PIN codes inside it.
Moreover, I have done the same thing for selecting CITY and STATE from the database.
I would really appreciate any logic or technique to lessen this slowdown.
If you are using all the PIN codes on a single page, you have multiple options to optimize this slowdown. If the project is still in its initial phase, try a NoSQL database such as MongoDB, or go for Solr or Redis, which give fast access to the data. If you are not able to use these, you can optimize with eager loading and caching the data.
If the codes are not all needed on a single page, break them into batches by paginating the PIN codes; a sketch follows.
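A minimal sketch of that pagination idea, assuming an MVC action that the drop-down (or an autocomplete widget) calls as the user types; all names here are illustrative:
using System.Linq;
using System.Web.Mvc;

public class PinCodeController : Controller
{
    private const int PageSize = 20;

    // Called from the client, e.g. /PinCode/Search?term=40&page=0,
    // instead of rendering all 50,000 codes into the page at once.
    public JsonResult Search(string term, int page = 0)
    {
        var matches = GetAllPinCodes()
            .Where(p => p.StartsWith(term ?? ""))
            .OrderBy(p => p)
            .Skip(page * PageSize)
            .Take(PageSize)
            .ToList();

        return Json(matches, JsonRequestBehavior.AllowGet);
    }

    private IQueryable<string> GetAllPinCodes()
    {
        // Placeholder for the real database query.
        throw new System.NotImplementedException();
    }
}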
This is a common problem with any website that deals with a large amount of data. To be frank, there is no code-level solution for this; you need to select one of the following approaches for faster retrieval.
Caching -
Use Redis or Memcached: in simple terms, on the first request the cache manager will read your data from SQL Server and store it. Subsequent requests are served from the cache.
Also, don't forget to make a provision to invalidate the data when new PIN codes are added.
Edit: You can also use the object caching provided by the .NET Framework. Refer: object caching.
The code will be something like:
DataTable pinCodeObject;
if (Cache["key_pincodes"] == null)
{
    // No object in the cache: read the data into a DataTable and cache it
    // with a sliding expiry time of 10 minutes
    pinCodeObject = GetPinCodesFromDatabase();
    Cache.Insert("key_pincodes", pinCodeObject, null,
        Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(10));
}
else
{
    // PIN codes are cached: don't make a database call, read from the cache
    pinCodeObject = (DataTable)Cache["key_pincodes"];
}
// bind it to your dropdown
NoSQL database -
MongoDB, or even XML or text files, could be used to read the data. It will take much less time than a database hit.

Cache in ServiceStack web services

I am new to caching and am trying to understand how it works in general. Below is a code snippet from the ServiceStack website.
public object Get(CachedCustomers request)
{
    //Manually create the Unified Resource Name "urn:customers".
    return base.RequestContext.ToOptimizedResultUsingCache(base.Cache, "urn:customers", () =>
    {
        //Resolve the service in order to get the customers.
        using (var service = this.ResolveService<CustomersService>())
            return service.Get(new Customers());
    });
}

public object Get(CachedCustomerDetails request)
{
    //Create the Unified Resource Name "urn:customerdetails:{id}".
    var cacheKey = UrnId.Create<CustomerDetails>(request.Id);
    return base.RequestContext.ToOptimizedResultUsingCache(base.Cache, cacheKey, () =>
    {
        using (var service = this.ResolveService<CustomerDetailsService>())
        {
            return service.Get(new CustomerDetails { Id = request.Id });
        }
    });
}
My doubts are:
I've read that cached data is stored in RAM on the same/distributed server. So how much data can it handle? In the first method, if the customer count is more than 1 million, doesn't it occupy too much memory?
In the general case, do we apply caching only for GET operations and invalidate when the data gets UPDATE'd?
Please suggest any tool to check the memory consumption of caching.
I think you can find the answers to your questions here: https://github.com/ServiceStack/ServiceStack/wiki/Caching
I've read that cached data is stored in RAM on the same/distributed server...
There are several ways to 'persist' cached data. Again, see https://github.com/ServiceStack/ServiceStack/wiki/Caching. 'InMemory' is the option you seem to be questioning. The other options don't have the same impact on RAM.
In the general case, do we apply caching only for GET operations and invalidate when the data gets UPDATE'd?
In ServiceStack you can manually clear/invalidate the cache or have a time-based expiration. If you manually clear the cache, I would recommend doing so on DELETEs and UPDATEs. You are free to choose how you manage/invalidate the cache; you just want to avoid having stale data in it. As far as 'applying caching' goes, you would return cached data on GET operations, but your system can access cached data just like any other data store. Again, you just need to recognize that the cache may not have the most recent set of data.
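A minimal sketch of manual invalidation, assuming the same v3-era API as the snippet above; UpdateCustomer, UpdateCustomerResponse, and the Put call are hypothetical, and RemoveFromCache is the wiki's helper for clearing the serialized/compressed variants that ToOptimizedResultUsingCache stores:
public object Post(UpdateCustomer request)
{
    // Write the change first... (hypothetical update service)
    using (var service = this.ResolveService<CustomersService>())
        service.Put(request);

    // ...then drop the cached entry so the next GET repopulates it.
    base.RequestContext.RemoveFromCache(base.Cache, "urn:customers");

    return new UpdateCustomerResponse();
}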

Does any asp.net data cache support background population of cache entries?

We have a data driven ASP.NET website which has been written using the standard pattern for data caching (adapted here from MSDN):
public DataTable GetData()
{
    string key = "DataTable";
    DataTable item = Cache[key] as DataTable;
    if (item == null)
    {
        item = GetDataFromSQL();
        Cache.Insert(key, item, null, DateTime.Now.AddSeconds(300), TimeSpan.Zero);
    }
    return item;
}
The trouble with this is that the call to GetDataFromSQL() is expensive and the use of the site is fairly high. So every five minutes, when the cache drops, the site becomes very 'sticky' while a lot of requests are waiting for the new data to be retrieved.
What we really want is for the old data to continue to be served while new data is periodically reloaded in the background. (The fact that someone might therefore see data that is six minutes old isn't a big issue; the data isn't that time sensitive.) This is something I could write myself, but it would be useful to know if any alternative caching engines (I know names like Velocity and memcached) support this kind of scenario. Or am I missing some obvious trick with the standard ASP.NET data cache?
You should be able to use the CacheItemUpdateCallback delegate, which is the 6th parameter of the 4th overload of Insert on the ASP.NET Cache:
Cache.Insert(key, value, dependency, absoluteExpiration,
    slidingExpiration, onUpdateCallback);
The following should work:
Cache.Insert(key, item, null, DateTime.Now.AddSeconds(300),
    Cache.NoSlidingExpiration, itemUpdateCallback);

private void itemUpdateCallback(string key, CacheItemUpdateReason reason,
    out object value, out CacheDependency dependency, out DateTime expiration,
    out TimeSpan slidingExpiration)
{
    // Do your SQL call here and store it in 'value'
    expiration = DateTime.Now.AddSeconds(300);
    value = FunctionToGetYourData();
    // All out parameters must be assigned before the callback returns
    dependency = null;
    slidingExpiration = Cache.NoSlidingExpiration;
}
From MSDN:
When an object expires in the cache, ASP.NET calls the CacheItemUpdateCallback method with the key for the cache item and the reason you might want to update the item. The remaining parameters of this method are out parameters. You supply the new cached item and optional expiration and dependency values to use when refreshing the cached item. The update callback is not called if the cached item is explicitly removed by using a call to Remove(). If you want the cached item to be removed from the cache, you must return null in the expensiveObject parameter. Otherwise, you return a reference to the new cached data by using the expensiveObject parameter. If you do not specify expiration or dependency values, the item will be removed from the cache only when memory is needed. If the callback method throws an exception, ASP.NET suppresses the exception and removes the cached value.
I haven't tested this, so you might have to tinker with it a bit, but it should give you the basic idea of what you're trying to accomplish.
I can see that there's a potential solution to this using AppFabric (the cache formerly known as Velocity) in that it allows you to lock a cached item so it can be updated. While an item is locked, ordinary (non-locking) Get requests still work as normal and return the cache's current copy of the item.
Doing it this way would also allow you to separate out your GetDataFromSQL method to a different process, say a Windows Service, that runs every five minutes, which should alleviate your 'sticky' site.
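A rough sketch of that locking pattern, assuming the AppFabric client library (Microsoft.ApplicationServer.Caching), a cache named "default", and the GetDataFromSQL call from the question; the key and timeout are illustrative:
using System;
using Microsoft.ApplicationServer.Caching;

public static class CacheRefresher
{
    // Run this periodically, e.g. from a Windows Service
    public static void RefreshCacheEntry()
    {
        DataCache cache = new DataCacheFactory().GetCache("default");
        DataCacheLockHandle lockHandle;
        try
        {
            // GetAndLock blocks other GetAndLock callers, but plain Get
            // calls still return the current (old) copy of the item
            cache.GetAndLock("DataTable", TimeSpan.FromSeconds(30), out lockHandle);
            object fresh = GetDataFromSQL();
            cache.PutAndUnlock("DataTable", fresh, lockHandle);
        }
        catch (DataCacheException)
        {
            // Another updater holds the lock; skip this refresh cycle
        }
    }

    private static object GetDataFromSQL()
    {
        // Placeholder for the expensive query from the question
        throw new NotImplementedException();
    }
}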
Or...
Rather than just caching the data for five minutes at a time regardless, why not use a SqlCacheDependency object when you put the data into the cache, so that it'll only be refreshed when the data actually changes. That way you can cache the data for longer periods, so you get better performance, and you'll always be showing the up-to-date data.
(BTW, a top tip for making your intention clearer when you're putting objects into the cache: the Cache has a NoSlidingExpiration (and a NoAbsoluteExpiration) constant available that's more readable than your TimeSpan.Zero.)
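A rough sketch of the SqlCacheDependency route, assuming polling-based notifications have been enabled for the database and table (for example with aspnet_regsql) and that a matching <sqlCacheDependency> entry exists in web.config; "MyDatabase" and "MyTable" are illustrative names, and GetDataFromSQL is the call from the question:
public DataTable GetData()
{
    const string key = "DataTable";
    DataTable item = Cache[key] as DataTable;
    if (item == null)
    {
        item = GetDataFromSQL();
        // The first argument must match the name registered under
        // <caching><sqlCacheDependency> in web.config; the table
        // must be enabled for change notifications
        var dependency = new SqlCacheDependency("MyDatabase", "MyTable");
        Cache.Insert(key, item, dependency,
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration);
    }
    return item;
}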
First, put the data you actually need in a lean class (also known as a POCO) instead of that DataTable hog.
Second, use a cache plus a hash table, so that when your time dependency expires you can spawn an async delegate to fetch new data while your old data is still safe in a separate Hashtable (not a Dictionary; a Dictionary is not safe for multi-reader, single-writer threading).
Depending on the kind of data and the time/budget to restructure the SQL side, you could potentially fetch only the rows whose LastWrite is younger than your update window. You will need a 2-step update (you have to copy data from the hash-kept object into a new object; the stuff in the hash is strictly read-only for any use, or all hell will break loose).
Oh, and SqlCacheDependency is notorious for being unreliable and can make your system break into mad updates.

Thread Safety on my method

Is this code considered thread safe even though multiple threads may be polling the directory for files on the webserver at once?
Thanks,
Mike
//Get a testimonial
virtualRoot = HostingEnvironment.ApplicationVirtualPath;
configuration = WebConfigurationManager.OpenWebConfiguration(virtualRoot);
pathToNewsDirectory = configuration.AppSettings.Settings["VanHinoWeb.news.dir"].Value;
fileListing = Directory.GetFiles(pathToNewsDirectory, "*xml");
int index = generator.Next(0, fileListing.Length);

//Put it into XML and get data into string array to return
testimonialReader = new XmlTextReader(fileListing[index]);
testimonialReader.ReadToFollowing("quote");
testimonialData[0] = testimonialReader.ReadString();
testimonialReader.ReadToFollowing("author");
testimonialData[1] = testimonialReader.ReadString();
testimonialReader.ReadToFollowing("authorTitle");
testimonialData[2] = testimonialReader.ReadString();
return testimonialData;
Most of the calls under the Get a testimonial comment should be thread safe. Accessing the hosting environment, app settings, and listing directory contents should be fine.
However, it's unclear where generator, testimonialReader, and testimonialData are created, and whether they are shared across threads. If they are shared, the code is not thread safe; in particular, a shared System.Random (presumably what generator is) is not safe for concurrent use.
As you are only reading files, it looks like this is thread safe. You do, however, need to close the XmlTextReader (testimonialReader) when you are finished with it.
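A small sketch of that cleanup, assuming the reader and the result array can be locals (the using block guarantees the reader is closed even if an exception is thrown):
string[] testimonialData = new string[3];
using (var testimonialReader = new XmlTextReader(fileListing[index]))
{
    testimonialReader.ReadToFollowing("quote");
    testimonialData[0] = testimonialReader.ReadString();
    testimonialReader.ReadToFollowing("author");
    testimonialData[1] = testimonialReader.ReadString();
    testimonialReader.ReadToFollowing("authorTitle");
    testimonialData[2] = testimonialReader.ReadString();
}
return testimonialData;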
