Oracle Coherence CacheFactory.getCache() usage across threads - oracle-coherence

We have a multi-threaded application that uses Oracle Coherence 3.5 L1/L2 caching heavily (1k requests/second), and performance is critical...
Do I need to synchronize access to CacheFactory.getCache()?
Should I reuse the NamedCache result for subsequent requests?
Currently it does the following to minimize calls to the CacheFactory and to synchronize access to it...
static ConcurrentHashMap<String, NamedCache> cacheMap = new ConcurrentHashMap<String, NamedCache>();

protected static synchronized NamedCache getCache(String cacheName)
{
    NamedCache cache = cacheMap.get(cacheName);
    if (cache == null)
    {
        cache = CacheFactory.getCache(cacheName);
        cacheMap.put(cacheName, cache);
    }
    return cache;
}
UPDATE: after poking around a bit, this seems unnecessary, since the Coherence APIs are supposed to be thread-safe... it seems I could simplify to just this, correct?
protected static NamedCache getCache(String cacheName)
{
    return CacheFactory.getCache(cacheName);
}

After some performance testing, it seemed that reusing the NamedCache did prove slightly faster, so here is where I ended up: removed synchronized and used putIfAbsent() instead.
protected static NamedCache getCache(String cacheName)
{
    NamedCache cache = cacheMap.get(cacheName);
    if (cache == null)
    {
        cache = CacheFactory.getCache(cacheName);
        NamedCache existing = cacheMap.putIfAbsent(cacheName, cache);
        if (existing != null)
            return existing;
    }
    return cache;
}

Related

DNX Core: Encrypt/Decrypt?

I'm porting a website to dnx core/aspnet5/mvc6. I need to store passwords to 3rd party sites in the database (it's essentially an aggregator).
In earlier versions of mvc, I did this using classes like RijndaelManaged. But those don't appear to exist in dnx core. In fact, I haven't been able to find much documentation on any general purpose encryption/decryption stuff in dnx core.
What's the recommended approach for encrypting/decrypting single field values in an mvc6 site? I don't want to encrypt the entire sql server database.
Or should I be looking at a different approach for storing the credentials necessary to access a password-protected 3rd party site?
See the DataProtection API documentation
Their guidance on using it for persistent data protection is a little hedged, but they say there is no technical reason you can't do it. Basically, to store protected data persistently, you need to be willing to unprotect it with expired keys, since a key could expire after you protect the data.
To me it seems reasonable to use it and I am using it in my own project.
Since IPersistedDataProtector only provides methods that take byte arrays, I made a couple of extension methods to convert back and forth from string.
public static class DataProtectionExtensions
{
    public static string PersistentUnprotect(
        this IPersistedDataProtector dp,
        string protectedData,
        out bool requiresMigration,
        out bool wasRevoked)
    {
        bool ignoreRevocation = true;
        byte[] protectedBytes = Convert.FromBase64String(protectedData);
        byte[] unprotectedBytes = dp.DangerousUnprotect(protectedBytes, ignoreRevocation, out requiresMigration, out wasRevoked);
        return Encoding.UTF8.GetString(unprotectedBytes);
    }

    public static string PersistentProtect(
        this IPersistedDataProtector dp,
        string clearText)
    {
        byte[] clearBytes = Encoding.UTF8.GetBytes(clearText);
        byte[] protectedBytes = dp.Protect(clearBytes);
        string result = Convert.ToBase64String(protectedBytes);
        return result;
    }
}
I also created a helper class specifically for protecting certain properties on my SiteSettings object before it gets persisted to the db.
using cloudscribe.Core.Models;
using Microsoft.AspNet.DataProtection;
using Microsoft.Extensions.Logging;
using System;

namespace cloudscribe.Core.Web.Components
{
    public class SiteDataProtector
    {
        public SiteDataProtector(
            IDataProtectionProvider dataProtectionProvider,
            ILogger<SiteDataProtector> logger)
        {
            rawProtector = dataProtectionProvider.CreateProtector("cloudscribe.Core.Models.SiteSettings");
            log = logger;
        }

        private ILogger log;
        private IDataProtector rawProtector = null;
        private IPersistedDataProtector dataProtector
        {
            get { return rawProtector as IPersistedDataProtector; }
        }

        public void Protect(ISiteSettings site)
        {
            if (site == null) { throw new ArgumentNullException("you must pass in an implementation of ISiteSettings"); }
            if (site.IsDataProtected) { return; }
            if (dataProtector == null) { return; }

            if (site.FacebookAppSecret.Length > 0)
            {
                try
                {
                    site.FacebookAppSecret = dataProtector.PersistentProtect(site.FacebookAppSecret);
                }
                catch (System.Security.Cryptography.CryptographicException ex)
                {
                    log.LogError("data protection error", ex);
                }
            }
            // ....
            site.IsDataProtected = true;
        }

        public void UnProtect(ISiteSettings site)
        {
            bool requiresMigration = false;
            bool wasRevoked = false;
            if (site == null) { throw new ArgumentNullException("you must pass in an implementation of ISiteSettings"); }
            if (!site.IsDataProtected) { return; }

            if (site.FacebookAppSecret.Length > 0)
            {
                try
                {
                    site.FacebookAppSecret = dataProtector.PersistentUnprotect(site.FacebookAppSecret, out requiresMigration, out wasRevoked);
                }
                catch (System.Security.Cryptography.CryptographicException ex)
                {
                    log.LogError("data protection error", ex);
                }
                catch (FormatException ex)
                {
                    log.LogError("data protection error", ex);
                }
            }

            site.IsDataProtected = false;
            if (requiresMigration || wasRevoked)
            {
                log.LogWarning("DataProtection key wasRevoked or requires migration, save site settings for " + site.SiteName + " to protect with a new key");
            }
        }
    }
}
If the app will need to migrate to other machines after data has been protected, then you also want to take control of the key location. The default puts the keys on the OS keyring of the machine, as I understand it, so it is a lot like machineKey in the past, where you would override it in web.config to make it portable.
Of course, protecting the keys is on you at this point. I have code like this in the startup of my project:
// If you change the key persistence location, the system will no longer automatically encrypt keys
// at rest since it doesn’t know whether DPAPI is an appropriate encryption mechanism.
services.ConfigureDataProtection(configure =>
{
    string pathToCryptoKeys = appBasePath + Path.DirectorySeparatorChar
        + "dp_keys" + Path.DirectorySeparatorChar;
    // these keys are not encrypted at rest
    // since we have specified a non-default location
    // that also makes the keys portable so they will still work if we migrate to
    // a new machine (will they work on a different OS? I think so)
    // this is a similar server migration issue as the old machineKey
    // where we specified a machinekey in web.config so it would not change if we
    // migrated to a new server
    configure.PersistKeysToFileSystem(new DirectoryInfo(pathToCryptoKeys));
});
So my keys are stored in appRoot/dp_keys in this example.
If you want to do things manually:
Add a reference to System.Security.Cryptography.Algorithms.
Then you can create instances of each algorithm type via the Create method. For example:
var aes = System.Security.Cryptography.Aes.Create();
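From there, a minimal sketch of round-tripping a single field value; this is my own illustration, not part of the DataProtection API. FieldCrypto is a made-up name, and you must supply and safeguard the key yourself:
using System;
using System.Security.Cryptography;
using System.Text;

public static class FieldCrypto
{
    // Encrypts a single string field with AES (CBC by default), prepending the
    // random IV to the ciphertext so it can be recovered on decrypt.
    public static string Encrypt(string clearText, byte[] key)
    {
        using (var aes = Aes.Create())
        {
            aes.Key = key;      // e.g. a 256-bit key you store securely yourself
            aes.GenerateIV();   // fresh IV per value
            using (var encryptor = aes.CreateEncryptor())
            {
                byte[] clearBytes = Encoding.UTF8.GetBytes(clearText);
                byte[] cipherBytes = encryptor.TransformFinalBlock(clearBytes, 0, clearBytes.Length);
                byte[] result = new byte[aes.IV.Length + cipherBytes.Length];
                Buffer.BlockCopy(aes.IV, 0, result, 0, aes.IV.Length);
                Buffer.BlockCopy(cipherBytes, 0, result, aes.IV.Length, cipherBytes.Length);
                return Convert.ToBase64String(result);
            }
        }
    }

    public static string Decrypt(string protectedData, byte[] key)
    {
        byte[] data = Convert.FromBase64String(protectedData);
        using (var aes = Aes.Create())
        {
            aes.Key = key;
            byte[] iv = new byte[aes.BlockSize / 8];    // IV was prepended on encrypt
            Buffer.BlockCopy(data, 0, iv, 0, iv.Length);
            aes.IV = iv;
            using (var decryptor = aes.CreateDecryptor())
            {
                byte[] clearBytes = decryptor.TransformFinalBlock(data, iv.Length, data.Length - iv.Length);
                return Encoding.UTF8.GetString(clearBytes);
            }
        }
    }
}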

Should I use session or cache in this scenario? Or something else?

I have to create a mechanism to store and read preferences (control default values and settings) per user. I have a problem related to network traffic, as the database is accessed over the internet and the application server is sometimes connected over a poor 512 Kbps internet connection.
My application can have around 50 simultaneous users, and each page/form can have up to 50 items (preferences). The number of pages is around 80.
So, considering a performance perspective, which should I choose to decrease network traffic? Session or cache?
UPDATE
I've created two sample pages, one using cache and another using session.
Load test: 90 users
Stored content: 1000 elements, 20 characters in each element's value
Here are the results from each test case:
MemoryCache
330,725,246 total bytes allocated
Functions Allocating Most Memory (Bytes %):
System.Runtime.Caching.MemoryCache.Set(string,object,class System.Runtime.Caching.CacheItemPolicy,string): 34.74
System.Web.UI.Page.ProcessRequest(class System.Web.HttpContext): 18.39
System.String.Concat(string,string): 12.65
System.String.Join(string,string[]): 5.31
System.Collections.Generic.Dictionary`2.Add(!0,!1): 4.42
Source code:
protected void Page_Load(object sender, EventArgs e)
{
    outputPanel.Text = String.Join(System.Environment.NewLine, ReadEverything().ToArray());
}

private IEnumerable<String> ReadEverything()
{
    for (int i = 0; i < 1000; i++)
    {
        yield return ReadFromCache(i);
    }
}

private string ReadFromCache(int p)
{
    String saida = String.Empty;
    ObjectCache cache = MemoryCache.Default;
    Dictionary<int, string> cachedItems = cache["user" + Session.SessionID] as Dictionary<int, string>;
    if (cachedItems == null)
    {
        cachedItems = new Dictionary<int, string>();
    }
    if (!cachedItems.TryGetValue(p, out saida))
    {
        saida = Util.RandomString(20);
        cachedItems.Add(p, saida);
        CacheItemPolicy policy = new CacheItemPolicy();
        policy.AbsoluteExpiration = DateTimeOffset.Now.AddSeconds(30);
        cache.Set("user" + Session.SessionID, cachedItems, policy);
    }
    return saida;
}
Session
111,625,747 total bytes allocated
Functions Allocating Most Memory (Bytes %):
System.Web.UI.Page.ProcessRequest(class System.Web.HttpContext): 55.19
System.String.Join(string,string[]): 15.93
System.Collections.Generic.Dictionary`2.Add(!0,!1): 6.00
System.Text.StringBuilder.Append(char): 5.93
System.Linq.Enumerable.ToArray(class System.Collections.Generic.IEnumerable`1): 4.46
Source code:
protected void Page_Load(object sender, EventArgs e)
{
    outputPanel.Text = String.Join(System.Environment.NewLine, ReadEverything().ToArray());
}

private IEnumerable<String> ReadEverything()
{
    for (int i = 0; i < 1000; i++)
    {
        yield return ReadFromSession(i);
    }
}

private string ReadFromSession(int p)
{
    String saida = String.Empty;
    Dictionary<int, string> cachedItems = Session["cachedItems"] as Dictionary<int, string>;
    if (cachedItems == null)
    {
        cachedItems = new Dictionary<int, string>();
    }
    if (!cachedItems.TryGetValue(p, out saida))
    {
        saida = Util.RandomString(20);
        cachedItems.Add(p, saida);
        Session["cachedItems"] = cachedItems;
    }
    return saida;
}
I forgot to mention that I'm creating a solution to work with ASP.NET and WPF projects; however, if Session is far better than the MemoryCache option, I can have different solutions for each platform.
Both are really the same: they live in memory. If you are using a DB-backed session and you have a poor connection, then you should read from the cache if present, load from the DB if not, and then cache the result.
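To illustrate, here is a minimal cache-aside sketch of that read path; this is my own sketch, not the questioner's code, and LoadPreferencesFromDb and the key format are hypothetical:
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class PreferenceReader
{
    // Cache-aside: try the in-memory cache first, fall back to the database,
    // then cache the result so later requests skip the slow connection.
    public Dictionary<int, string> GetPreferences(string userId)
    {
        ObjectCache cache = MemoryCache.Default;
        string cacheKey = "prefs-" + userId;

        var prefs = cache[cacheKey] as Dictionary<int, string>;
        if (prefs == null)
        {
            prefs = LoadPreferencesFromDb(userId);   // hypothetical DB call
            var policy = new CacheItemPolicy
            {
                SlidingExpiration = TimeSpan.FromMinutes(20)
            };
            cache.Set(cacheKey, prefs, policy);
        }
        return prefs;
    }

    private Dictionary<int, string> LoadPreferencesFromDb(string userId)
    {
        // Placeholder for the real (slow, remote) database read.
        return new Dictionary<int, string>();
    }
}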
I would consider the session a cache mechanism, unlike the others it is just specific to a specific browser session. My approach would consider the following questions.
Is this site load balanced? If it is, using the session will force persistent sessions, possibly causing issues if you take down a server.
Is this data user specific? If it is, the session is an easy way to segregate data without lots of key manipulation. It also has the benefit that when a user's session times out it automatically gets cleaned up. If it isn't, I'd recommend using the MemoryCache class added in .NET 4.0; it supports expiration.
How will this cache become outdated? If user A can modify data cached for user B, you're now serving dirty data with session cache. This prompts a shared cache mechanism.
Edit: Once these questions have been answered, you should be able to decide what type of cache mechanism is appropriate for your situation. Then you can evaluate that subset for performance.

Multiple instances use a co-located caching but fail to access, good named caching implementation required

We have been transferring our services and MVC4 website to the cloud; overall this process went fine.
Except for caching: since we have moved to Azure, it would also be wise to use the caching Azure provides. We chose the co-located / dedicated caching role, which has the advantage that the cache is shared across all the instances.
Setting up the caching worked fine. I've got a named cache client which I only initialize when it's required; it is set up in an inherited layer of the controllers. As soon as one of the functions is called, it checks whether the connection to the data cache still exists, or creates it. This all seems to work fine, but I'm building a module to retrieve prices, and multiple ajax inserts (views which get inserted into the page with JavaScript) use these functions, some of them called at the same time by multiple ajax views. Some of these views then return either a 404 or 500 error, and I can't explain where these are coming from except non-working caching, or something alike.
Can someone help me with a good implementation of the named caching (co-located or dedicated)? All I can find are examples illustrating the initialization of the DataCacheFactory, but not the data insertion and retrieval.
Below is the code as I have it now; I've tried other approaches with locking etc., but this one has worked best so far.
private static object magicStick = new object();
private static DataCacheFactory dcf = null;
private static DataCache priceCache = null;

protected void CreateCacheFactory()
{
    dcf = new DataCacheFactory();
}

protected void CreatePricesCache()
{
    if (dcf == null)
    {
        CreateCacheFactory();
    }
    priceCache = dcf.GetCache("Prices");
}

protected PriceData GetPrices(int productID)
{
    if (priceCache == null)
    {
        CreatePricesCache();
    }
    string cacheKey = "something";
    lock (magicStick)
    {
        PriceData datas = priceCache.Get(cacheKey) as PriceData;
        if (datas == null)
        {
            lock (magicStick)
            {
                Services svc = new Services();
                PriceData pData = svc.PriceService.GetPrices(productID);
                if (pData != null && pData.Offers != null && pData.Offers.Count() > 0)
                {
                    datas = pData;
                    datas.Offers = datas.Offers.OrderBy(pr => (pr.BasePrice + pr.ShippingCosts)).ToArray();
                    priceCache.Add(cacheKey, datas, new TimeSpan(0, cachingTimePricesKK, 0));
                }
            }
        }
        return datas;
    }
}
As soon as I get to a page with price lists and the function above is called multiple times with the same arguments, there is a 5-10% chance that it returns an error rather than the results. Can anybody help me? I'm totally stuck on this, a week now, and it's eating me up inside.
First, I'd move your cache and cache-factory instantiation out of your GetPrices method. Also, evaluate your need for the lock; it may be causing timeouts. Another VERY important observation: you are using a constant cache key, so you save and retrieve data for every productID under the same key. You should be using a cache key like var cacheKey = string.Format("priceDataByProductId-{0}", productID);. Set some breakpoints and examine exactly what you are caching and retrieving from the cache. The code as written will save the first product's data to the cache and then keep returning that data regardless of the productID.
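To make that concrete, here is a minimal sketch of the questioner's read path with those changes applied: per-product key, factory and cache created once, no lock. It assumes the questioner's Services, PriceData, and cachingTimePricesKK, and uses Put rather than Add so concurrent writers overwrite instead of throwing:
// Sketch of the suggested fix, not production code.
private static readonly DataCacheFactory dcf = new DataCacheFactory();
private static readonly DataCache priceCache = dcf.GetCache("Prices");

protected PriceData GetPrices(int productID)
{
    // one key per product, so different products no longer collide
    string cacheKey = string.Format("priceDataByProductId-{0}", productID);
    PriceData datas = priceCache.Get(cacheKey) as PriceData;
    if (datas == null)
    {
        Services svc = new Services();
        PriceData pData = svc.PriceService.GetPrices(productID);
        if (pData != null && pData.Offers != null && pData.Offers.Count() > 0)
        {
            datas = pData;
            datas.Offers = datas.Offers.OrderBy(pr => pr.BasePrice + pr.ShippingCosts).ToArray();
            // Put overwrites an existing entry; Add would throw if two requests raced
            priceCache.Put(cacheKey, datas, new TimeSpan(0, cachingTimePricesKK, 0));
        }
    }
    return datas;
}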
Here is a full working example we use in production using the "default" named cache in dedicated cache roles:
public static class MyCache
{
    private static DataCacheFactory _cacheFactory = null;

    private static DataCache ACache
    {
        get
        {
            if (_cacheFactory == null)
            {
                try
                {
                    _retryPolicy.ExecuteAction(() => { _cacheFactory = new DataCacheFactory(); });
                    return _cacheFactory == null ? null : _cacheFactory.GetDefaultCache();
                }
                catch (Exception ex)
                {
                    ErrorSignal.FromCurrentContext().Raise(ex);
                    return null;
                }
            }
            return _cacheFactory.GetDefaultCache();
        }
    }

    public static void FlushCache()
    {
        ACache.Clear();
    }

    // Define your retry strategy: retry 3 times, 1 second apart.
    private static readonly FixedInterval _retryStrategy = new FixedInterval(3, TimeSpan.FromSeconds(1));

    // Define your retry policy using the retry strategy and the Windows Azure storage
    // transient fault detection strategy.
    private static RetryPolicy _retryPolicy = new RetryPolicy<StorageTransientErrorDetectionStrategy>(_retryStrategy);

    // Private constructor to prevent instantiation
    // and force consumers to use the Instance property
    static MyCache()
    { }

    /// <summary>
    /// Add an item to the cache with a key and set an absolute expiration on it
    /// </summary>
    public static void Add(string key, object value, int minutes = 90)
    {
        try
        {
            _retryPolicy.ExecuteAction(() => { ACache.Put(key, value, TimeSpan.FromMinutes(minutes)); });
        }
        catch (Exception ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
        }
    }

    /// <summary>
    /// Add the object with the specified key to the cache if it does not exist, or replace the object
    /// if it does exist, and set an absolute expiration on it (only valid for Azure caching)
    /// </summary>
    public static void Put(string key, object value, int minutes = 90)
    {
        try
        {
            _retryPolicy.ExecuteAction(() => { ACache.Put(key, value, TimeSpan.FromMinutes(minutes)); });
        }
        catch (Exception ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
        }
    }

    /// <summary>
    /// Get a strongly typed item out of cache
    /// </summary>
    public static T Get<T>(string key) where T : class
    {
        try
        {
            object value = null;
            _retryPolicy.ExecuteAction(() => { value = ACache.Get(key); });
            if (value != null) return (T)value;
            return null;
        }
        catch (DataCacheException ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
            return null;
        }
        catch (Exception ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
            return null;
        }
    }

    /// <summary>
    /// Microsoft's suggested method for cleaning up resources such as this in a static class
    /// to ensure connections and other consumed resources are returned to the resource pool
    /// as quickly as possible.
    /// </summary>
    public static void Uninitialize()
    {
        if (_cacheFactory == null) return;
        _cacheFactory.Dispose();
        _cacheFactory = null;
    }
}
Note: this is also using the Transient Fault Handling block from the Enterprise Library for transient exception fault handling.

w3wp using between 700mb-1.2gb of memory not under load

I have an MVC3 app using EF/Autofac and memcached (3 front-end servers). When starting up, the app will quickly jump to 300mb with one user. On our production site, where traffic is low (fewer than 10 users at any one time), it can easily reach over 1.2gb of memory. This seems like a very large number to me.
I have tried Redgate Memory Profiler but I just can't seem to find anything that looks out of the ordinary.
The only thing I can think of is that perhaps the following code is causing some kind of memory leak. What do you think?
private Dictionary<string, object> cacheDictionary = new Dictionary<string, object>();
private IMemcachedClient _client;

public MemcachedManager(IMemcachedClient client)
{
    this._client = client;
}

#region ICacheManager Members

public T Get<T>(string key)
{
    object data;
    // check the per-instance dictionary first, fall back to memcached;
    // note: entries added here are never evicted from the dictionary
    if (!cacheDictionary.TryGetValue(key, out data))
    {
        byte[] byteData = _client.Get<byte[]>(key);
        data = CommonHelper.Deserialize<T>(byteData);
        cacheDictionary.Add(key, data);
    }
    return (T)data;
}

public void Set(string key, object data, int cacheTime)
{
    if (!cacheDictionary.ContainsKey(key)) // if memcache is turned off this should still work
        cacheDictionary.Add(key, data);
    data = CommonHelper.Serialize(data);
    bool setCache = _client.Store(StoreMode.Set, key, data, DateTime.Now.AddMinutes(cacheTime));
    if (!setCache)
    {
        log.ErrorFormat("Failed to set the cache item, key {0}", key);
    }
}

ASP.NET Clear Cache

If you cache pages in an HttpHandler with
_context.Response.Cache.SetCacheability(HttpCacheability.Public);
_context.Response.Cache.SetExpires(DateTime.Now.AddSeconds(180));
is it possible to clear a certain page from the cache?
is it possible to Clear a certain page from the Cache?
Yes:
HttpResponse.RemoveOutputCacheItem("/pages/default.aspx");
You can also use Cache dependencies to remove pages from the Cache:
this.Response.AddCacheItemDependency("key");
After making that call, if you modify Cache["key"], it will cause the page to be removed from the Cache.
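A minimal sketch of that pattern; the key name "pages-version" is my own choice:
// In the handler that produces the cacheable response:
_context.Response.Cache.SetCacheability(HttpCacheability.Public);
_context.Response.Cache.SetExpires(DateTime.Now.AddSeconds(180));
_context.Response.AddCacheItemDependency("pages-version");

// At application startup, seed the dependency entry
// (a dependency on a missing key invalidates the cached response immediately):
HttpRuntime.Cache.Insert("pages-version", DateTime.Now);

// Later, to evict every cached page that depends on it, then re-seed:
HttpRuntime.Cache.Remove("pages-version");
HttpRuntime.Cache.Insert("pages-version", DateTime.Now);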
In case it might help, I cover caching in detail in my book: Ultra-Fast ASP.NET.
A simpler one...
public static void ClearCache()
{
    Cache cache = HttpRuntime.Cache;
    List<string> keys = new List<string>();
    IDictionaryEnumerator enumerator = cache.GetEnumerator();
    // collect the keys first; removing while enumerating would invalidate the enumerator
    while (enumerator.MoveNext())
    {
        keys.Add(enumerator.Key.ToString());
    }
    foreach (string key in keys)
    {
        cache.Remove(key);
    }
}
The following code will remove all keys from the cache:
public void ClearApplicationCache()
{
    List<string> keys = new List<string>();
    // retrieve application Cache enumerator
    IDictionaryEnumerator enumerator = Cache.GetEnumerator();
    // copy all keys that currently exist in Cache
    while (enumerator.MoveNext())
    {
        keys.Add(enumerator.Key.ToString());
    }
    // delete every key from cache
    for (int i = 0; i < keys.Count; i++)
    {
        Cache.Remove(keys[i]);
    }
}
Modifying the second loop to check the value of the key before removing it should be trivial.
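For instance, a sketch with a caller-supplied predicate; ClearCacheWhere is a made-up name:
// Requires System, System.Collections, System.Collections.Generic,
// System.Web and System.Web.Caching.
public static void ClearCacheWhere(Func<object, bool> shouldRemove)
{
    Cache cache = HttpRuntime.Cache;
    List<string> keys = new List<string>();
    IDictionaryEnumerator enumerator = cache.GetEnumerator();
    while (enumerator.MoveNext())
    {
        // inspect the value before deciding to remove its key
        if (shouldRemove(enumerator.Value))
            keys.Add(enumerator.Key.ToString());
    }
    foreach (string key in keys)
        cache.Remove(key);
}

// Usage: remove only cached strings
// ClearCacheWhere(value => value is string);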
Hope this helps.
