Cache in ServiceStack web services - ASP.NET

I am new to caching and trying to understand how it works in general. Below is a code snippet from the ServiceStack website.
public object Get(CachedCustomers request)
{
    // Manually create the Unified Resource Name "urn:customers".
    return base.RequestContext.ToOptimizedResultUsingCache(base.Cache, "urn:customers", () =>
    {
        // Resolve the service in order to get the customers.
        using (var service = this.ResolveService<CustomersService>())
            return service.Get(new Customers());
    });
}

public object Get(CachedCustomerDetails request)
{
    // Create the Unified Resource Name "urn:customerdetails:{id}".
    var cacheKey = UrnId.Create<CustomerDetails>(request.Id);
    return base.RequestContext.ToOptimizedResultUsingCache(base.Cache, cacheKey, () =>
    {
        using (var service = this.ResolveService<CustomerDetailsService>())
        {
            return service.Get(new CustomerDetails { Id = request.Id });
        }
    });
}
My questions are:
I've read that cached data is stored in RAM on the same or a distributed server. How much data can it handle? In the first method, if the customer count is more than 1 million, doesn't that occupy too much memory?
In the general case, do we apply caching only to GET operations and invalidate the cached entry when the data is updated?
Please suggest a tool to check the memory consumption of the cache.

I think you can find the answers to your questions here - https://github.com/ServiceStack/ServiceStack/wiki/Caching
I've read that cached data is stored in RAM on the same or a distributed server...
There are several ways to 'persist' cached data. Again, see here - https://github.com/ServiceStack/ServiceStack/wiki/Caching. 'InMemory' is the option you seem to be questioning. The other options don't have the same impact on RAM.
In the general case, do we apply caching only to GET operations and invalidate the cached entry when the data is updated?
In ServiceStack you can manually clear/invalidate the cache or use time-based expiration. If you manually clear the cache, I would recommend doing so on DELETEs and UPDATEs, as in the sketch below. You are free to choose how you manage/invalidate the cache; you just want to avoid serving stale data. As far as 'applying caching' goes, you would return cached data on GET operations, but your system can access cached data just like any other data store. Again, you just need to recognize that the cache may not have the most recent set of data.
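A minimal sketch of manual invalidation on a write, reusing the key scheme from the question (the UpdateCustomer DTO and the Put call on CustomersService are hypothetical; RequestContext.RemoveFromCache is the helper the linked wiki page documents for clearing the serialized/compressed variants that ToOptimizedResultUsingCache stores):
public object Put(UpdateCustomer request)
{
    using (var service = this.ResolveService<CustomersService>())
    {
        var response = service.Put(request); // hypothetical update call
        // Invalidate the list entry and the per-customer entry so the
        // next GET repopulates the cache with fresh data.
        base.RequestContext.RemoveFromCache(base.Cache, "urn:customers");
        base.RequestContext.RemoveFromCache(base.Cache,
            UrnId.Create<CustomerDetails>(request.Id));
        return response;
    }
}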

Related

Does calling a service in a foreach loop slow down ASP.NET Core 3?

I would like to ask a question. I want to call a service inside a foreach loop, but I think that when I call the service to fetch menu items from my database, it may open new connections. If so, what should I use instead? What happens after 1000 items? Can the page slow down because of this?
@foreach (var parent in Model)
{
    var menuItem = _uow.Menu.GetById(parent.Id);
    if (menuItem != null)
    {
        <span>@menuItem.Title</span>
    }
}
The service interface:
T GetById(int id);
The Entity Framework service:
public T GetById(int id)
{
    return _context.Set<T>().Find(id);
}
Thank you.
Individually retrieving 1000 items performs 1000 calls to the database, which is slower than a single query performing one call (it's similar to making 1000 API calls vs. a single one).
A solution to your issue would be to change your service so that you can query multiple ids at once:
var menuItems = _uow.Menu.GetByIds(Model.Select(parent => parent.Id));
and the service:
// Assumes T exposes an Id property, e.g. via a "where T : IEntity" constraint.
public IEnumerable<T> GetByIds(IEnumerable<int> ids)
{
    // Translates to a single SQL query with an IN clause.
    return _context.Set<T>().Where(t => ids.Contains(t.Id)).ToList();
}
From your code's point of view, each loop iteration accesses the database once. If the model contains 1000 items, the foreach accesses the database 1000 times.
Assuming each database call takes 0.01 seconds, 1000 iterations spend 10 seconds in the database. This is clearly unreasonable, because the time grows in proportion to the number of iterations.
Suggestion
If a certain kind of data needs to be accessed frequently, we generally store it in a cache, such as a Redis cache or MemoryCache. This can optimize speed.
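For illustration, a minimal sketch using ASP.NET Core's built-in IMemoryCache, registered via services.AddMemoryCache() (the MenuService class, the IUnitOfWork abstraction, and the five-minute lifetime are assumptions for the sketch; MenuItem is the entity implied by the question's markup):
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Caching.Memory;

public class MenuItem
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class MenuService
{
    private readonly IMemoryCache _cache;
    private readonly IUnitOfWork _uow; // hypothetical unit of work from the question

    public MenuService(IMemoryCache cache, IUnitOfWork uow)
    {
        _cache = cache;
        _uow = uow;
    }

    public List<MenuItem> GetMenuItems(IReadOnlyList<int> ids)
    {
        // Cache the batch result for five minutes so repeated page renders
        // hit the database at most once per window, not once per item.
        return _cache.GetOrCreate("menu:" + string.Join(",", ids), entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return _uow.Menu.GetByIds(ids).ToList();
        });
    }
}
This pairs well with the batched GetByIds from the first answer: one database call fills the cache, and subsequent renders within the window hit memory only.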

Amazon DynamoDBMapper.delete method does not delete item

I used the AWS DynamoDBMapper Java class to build a repository class to support CRUD operations. In my unit test, I created an item, saved it to the DB, loaded it, and then deleted it. Then I queried the DB with the primary key of the deleted item; the query returned an empty list, so everything looked correct. However, when I check the table in the AWS console, the deleted item is still there, and another client on a different session can still find it. What did I do wrong? Is there any other configuration or setup required to ensure the "hard delete" happens as expected? My API looks like this:
public void deleteObject(Object obj) {
    Object objToDelete = load(obj);
    delete(obj);
}

public Object load(Object obj) {
    return MAPPER.load(Object.class, obj.getId(),
        ConsistentReads.CONSISTENT.config());
}

private void save(Object obj) {
    MAPPER.save(obj, SaveBehavior.CLOBBER.config());
}

private void delete(Object obj) {
    MAPPER.delete(obj);
}
Any help/hint/tip is much appreciated.
DynamoDB is eventually consistent by default. Creating -> reading -> deleting immediately will not always behave as expected.
Eventually Consistent Reads (Default) – the eventual consistency option maximizes your read throughput. However, an eventually consistent read might not reflect the results of a recently completed write. Consistency across all copies of data is usually reached within a second. Repeating a read after a short time should return the updated data.

Strongly Consistent Reads — in addition to eventual consistency, Amazon DynamoDB also gives you the flexibility and control to request a strongly consistent read if your application, or an element of your application, requires it. A strongly consistent read returns a result that reflects all writes that received a successful response prior to the read.

Does any ASP.NET data cache support background population of cache entries?

We have a data driven ASP.NET website which has been written using the standard pattern for data caching (adapted here from MSDN):
public DataTable GetData()
{
    string key = "DataTable";
    DataTable item = Cache[key] as DataTable;
    if (item == null)
    {
        item = GetDataFromSQL();
        Cache.Insert(key, item, null, DateTime.Now.AddSeconds(300), TimeSpan.Zero);
    }
    return item;
}
The trouble with this is that the call to GetDataFromSQL() is expensive and the use of the site is fairly high. So every five minutes, when the cache drops, the site becomes very 'sticky' while a lot of requests are waiting for the new data to be retrieved.
What we really want to happen is for the old data to remain current while new data is periodically reloaded in the background. (The fact that someone might therefore see data that is six minutes old isn't a big issue - the data isn't that time sensitive). This is something that I can write myself, but it would be useful to know if any alternative caching engines (I know names like Velocity, memcache) support this kind of scenario. Or am I missing some obvious trick with the standard ASP.NET data cache?
You should be able to use the CacheItemUpdateCallback delegate, which is the sixth parameter of the fourth overload of Insert on the ASP.NET Cache:
Cache.Insert(key, value, dependency, absoluteExpiration,
    slidingExpiration, onUpdateCallback);
The following should work:
Cache.Insert(key, item, null, DateTime.Now.AddSeconds(300),
    Cache.NoSlidingExpiration, itemUpdateCallback);

private void itemUpdateCallback(string key, CacheItemUpdateReason reason,
    out object value, out CacheDependency dependency, out DateTime expiration,
    out TimeSpan slidingExpiration)
{
    // Do your SQL call here and store the result in 'value'.
    value = FunctionToGetYourData();
    expiration = DateTime.Now.AddSeconds(300);
    // All out parameters must be assigned before the method returns.
    dependency = null;
    slidingExpiration = Cache.NoSlidingExpiration;
}
From MSDN:
When an object expires in the cache, ASP.NET calls the CacheItemUpdateCallback method with the key for the cache item and the reason you might want to update the item. The remaining parameters of this method are out parameters. You supply the new cached item and optional expiration and dependency values to use when refreshing the cached item.

The update callback is not called if the cached item is explicitly removed by using a call to Remove().

If you want the cached item to be removed from the cache, you must return null in the expensiveObject parameter. Otherwise, you return a reference to the new cached data by using the expensiveObject parameter. If you do not specify expiration or dependency values, the item will be removed from the cache only when memory is needed.

If the callback method throws an exception, ASP.NET suppresses the exception and removes the cached value.
I haven't tested this, so you might have to tinker with it a bit, but it should give you the basic idea of what you're trying to accomplish.
I can see that there's a potential solution to this using AppFabric (the cache formerly known as Velocity) in that it allows you to lock a cached item so it can be updated. While an item is locked, ordinary (non-locking) Get requests still work as normal and return the cache's current copy of the item.
Doing it this way would also allow you to separate out your GetDataFromSQL method to a different process, say a Windows Service, that runs every five minutes, which should alleviate your 'sticky' site.
Or...
Rather than just caching the data for five minutes at a time regardless, why not use a SqlCacheDependency object when you put the data into the cache, so that it'll only be refreshed when the data actually changes. That way you can cache the data for longer periods, so you get better performance, and you'll always be showing the up-to-date data.
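A minimal sketch of that approach, assuming SQL Server query notifications are available and SqlDependency.Start(connectionString) was called at application startup (the connection string and table/column names are illustrative):
public DataTable GetData()
{
    const string key = "DataTable";
    DataTable item = Cache[key] as DataTable;
    if (item == null)
    {
        using (var connection = new SqlConnection(connectionString))
        // Query-notification rules require an explicit column list
        // and two-part table names (no SELECT *).
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM dbo.MyTable", connection))
        {
            // The dependency must be created before the command executes.
            var dependency = new SqlCacheDependency(command);
            connection.Open();
            item = new DataTable();
            item.Load(command.ExecuteReader());
            // No timed expiration: the entry is evicted when the data changes.
            Cache.Insert(key, item, dependency);
        }
    }
    return item;
}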
(BTW, a top tip for making your intention clearer when putting objects into the cache: the Cache class has a NoSlidingExpiration (and a NoAbsoluteExpiration) constant available that's more readable than your TimeSpan.Zero.)
First, put the data you actually need in a lean class (also known as a POCO) instead of that DataTable hog.
Second, use the cache plus a hash table, so that when your time dependency expires you can spawn an async delegate to fetch new data while your old data stays safe in a separate Hashtable (not a Dictionary, which is not safe for multi-reader/single-writer threading); see the sketch below.
Depending on the kind of data and the time/budget for restructuring the SQL side, you could potentially fetch only rows whose LastWrite is younger than your update window. You will need a two-step update: you have to copy data from the hash-kept object into a new object, because anything in the hash is strictly read-only for any use, or all hell will break loose.
Oh, and SqlCacheDependency is notorious for being unreliable and can make your system break into mad updates.
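A minimal sketch of the snapshot-swap idea above (all names are illustrative; it assumes the data set fits comfortably in memory and that data one refresh interval old is acceptable):
using System;
using System.Collections.Generic;
using System.Threading;

public static class CustomerCache
{
    // Readers always see a complete snapshot; the reference swap is atomic.
    // Concurrent reads are safe because a published snapshot is never mutated.
    private static volatile Dictionary<int, CustomerPoco> _snapshot =
        new Dictionary<int, CustomerPoco>();

    private static readonly Timer _refreshTimer =
        new Timer(_ => Refresh(), null, TimeSpan.Zero, TimeSpan.FromMinutes(5));

    public static Dictionary<int, CustomerPoco> Current
    {
        get { return _snapshot; }
    }

    private static void Refresh()
    {
        // Build the replacement completely before publishing it,
        // so readers never observe a half-filled cache.
        var fresh = new Dictionary<int, CustomerPoco>();
        foreach (var customer in LoadCustomersFromSql())
            fresh[customer.Id] = customer;
        _snapshot = fresh;
    }

    private static IEnumerable<CustomerPoco> LoadCustomersFromSql()
    {
        yield break; // placeholder for the real data-reader based load
    }
}

public class CustomerPoco
{
    public int Id { get; set; }
    public string Name { get; set; }
}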

ASP.NET cache Application Variable

I am wondering about potential issues with (and alternatives to) caching using an application variable. To preface: I have a custom application object that stores all application data in ONE application variable; the object organizes data in an XML cloud and stores the cloud in one variable for performance.
I am aware of DataSet caching, but all of my DAL objects select (read only) into data readers for performance, so if the solution involves DataSets I have a ton of refactoring to do – not ideal. Record count is low to medium, involving website data and small to medium project management apps; we're not talking a half-million records.
Here is the structure of the function I intend to use in my DAL classes (Select All):
if (AppSetting.GetSetting(_CacheSettingName) == "")
{
    SqlHelper objSqlHelper = new SqlHelper();
    XmlReader objReader = objSqlHelper.GetXmlReaderByCmd(
        "Select * From FAQ FOR XML PATH('item'), root('" + _CacheSettingName + "')");
    // Load the cache.
    StringBuilder _xml = new StringBuilder();
    objReader.Read();
    while (objReader.ReadState != ReadState.EndOfFile)
    {
        _xml.Append(objReader.ReadOuterXml());
    }
    objSqlHelper.Dispose();
    AppSetting.SaveSetting(_CacheSettingName, _xml.ToString());
}
// We have the cache loaded;
// now load the object list from the XML cloud in the application cache.
List<FAQBLL> objFAQList = new List<FAQBLL>();
FAQBLL objFAQ;
XmlDocument oXmlDocument = new XmlDocument();
oXmlDocument.LoadXml(AppSetting.GetSetting(_CacheSettingName));
foreach (XmlNode oNode in oXmlDocument.FirstChild.ChildNodes)
{
    objFAQ = new FAQBLL();
    objFAQ.ID = Convert.ToInt32(oNode.SelectSingleNode("ID").InnerXml);
    objFAQ.Question = oNode.SelectSingleNode("Question").InnerXml;
    objFAQ.Answer = oNode.SelectSingleNode("Answer").InnerXml;
    objFAQList.Add(objFAQ);
    objFAQ = null;
}
return objFAQList.Count > 0 ? objFAQList : null;
So my cache returns everything to the calling proc, and then I use LINQ to filter the objects (by active, by location). On insert, update, and delete I have one line of code to clear the cache. Again, I have the advantage of using read-only data readers for performance, and my application object organizes the XML into one application variable. My thinking was that this is low overhead, since I don't have to trade data readers for DataSets when I'm just reading data.
Thoughts? Opinions? Are you traumatized by this?
Yes, it is a little traumatizing. The problem with using application variables as a cache is that you forfeit some of the best features of real caches. For example, the ASP.NET cache provides some awesome cache-invalidation features. These ensure that a) the cache does not take up more and more resources, and b) the cache remains fresh (for example, if it's sourced from a database).
You should really think about using the right tool for the job; a sketch of one alternative follows.
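For illustration, the FAQ list from the question could go straight into the ASP.NET cache as objects, skipping the XML round-trip entirely (a sketch reusing the question's FAQBLL type; LoadFAQsFromSql stands in for the existing data-reader code, and the five-minute expiration is an arbitrary choice):
public List<FAQBLL> GetFAQs()
{
    const string key = "FAQList";
    var list = HttpContext.Current.Cache[key] as List<FAQBLL>;
    if (list == null)
    {
        list = LoadFAQsFromSql(); // existing data-reader based load
        HttpContext.Current.Cache.Insert(key, list, null,
            DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);
        // On insert/update/delete, HttpContext.Current.Cache.Remove(key)
        // keeps the one-line invalidation the question already relies on.
    }
    return list;
}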

ASP.NET cache objects read-write

What happens if one user is trying to read HttpContext.Current.Cache[key] while another is trying to remove the object with HttpContext.Current.Cache.Remove(key) at the same time?
Just think about hundreds of users reading from the cache while others are trying to clean up some cache objects at the same time. What happens, and is it thread-safe?
Is it possible to create database-aware business objects in the cache?
The built-in ASP.Net Cache object (http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx) is thread-safe, so insert/remove actions in multi-threaded environments are inherently safe.
Your primary requirement for putting any object in the cache is that it must be serializable. So yes, your DB-aware business object can go in the cache.
If the code is unable to get the object, then nothing/null is returned.
Why would you bother to cache an object if you would have the chance of removing it so frequently? It's better to set an expiration time and reload the object if it's no longer in the cache.
Can you explain "DB-aware object"? Do you mean a SQL cache dependency, or just an object that has information about a DB connection?
EDIT:
Response to comment #3.
I think we are missing something here. Let me explain what I think you mean, and you can tell me if it's right:
1. UserA checks for an object in cache ("resultA") and does not find it.
2. UserA runs a query. Results are cached as "resultA" for 5 minutes.
3. UserB checks for an object in cache ("resultA") and does find it.
4. UserB uses the cached object "resultA".
If this is the case, then you don't need a SQL cache dependency.
Well, I have this code to populate the cache:
string cacheKey = GetCacheKey(filter, sort);
if (HttpContext.Current.Cache[cacheKey] == null)
{
    reader = base.ExecuteReader(SelectQuery);
    HttpContext.Current.Cache[cacheKey] =
        base.GetListByFilter(reader, filter, sort);
}
return HttpContext.Current.Cache[cacheKey] as List<CurrencyDepot>;
and when the table is updated, the cleanup code below executes:
private void CleanCache()
{
    IDictionaryEnumerator enumerator =
        HttpContext.Current.Cache.GetEnumerator();
    while (enumerator.MoveNext())
    {
        if (enumerator.Key.ToString().Contains(_TableName))
        {
            try
            {
                HttpContext.Current.Cache.Remove(enumerator.Key.ToString());
            }
            catch (Exception) { }
        }
    }
}
Does this usage cause any trouble?
