I have an MVC3 app using EF, Autofac, and memcached across three front-end servers. On startup, with a single user, the app quickly climbs to 300 MB. On our production site, where traffic is low (fewer than 10 concurrent users), it can easily exceed 1.2 GB of memory. That seems very large to me.
I have tried Redgate Memory Profiler but I just can't seem to find anything that looks out of the ordinary.
The only thing I can think of is that the following code is causing some kind of memory leak. What do you think?
private Dictionary<string, object> cacheDictionary = new Dictionary<string, object>();
private IMemcachedClient _client;

public MemcachedManager(IMemcachedClient client)
{
    this._client = client;
}

#region ICacheManager Members

public T Get<T>(string key)
{
    object data;
    if (!cacheDictionary.TryGetValue(key, out data))
    {
        // On a local miss, pull from memcached and keep a copy in the
        // in-process dictionary. Note: the dictionary is never trimmed,
        // and a null result is cached as well.
        byte[] byteData = _client.Get<byte[]>(key);
        data = CommonHelper.Deserialize<T>(byteData);
        cacheDictionary.Add(key, data);
    }
    return (T)data;
}

public void Set(string key, object data, int cacheTime)
{
    // Keep a local copy so reads still work if memcached is turned off.
    if (!cacheDictionary.ContainsKey(key))
        cacheDictionary.Add(key, data);

    data = CommonHelper.Serialize(data);
    bool setCache = _client.Store(StoreMode.Set, key, data, DateTime.Now.AddMinutes(cacheTime));
    if (!setCache)
    {
        log.ErrorFormat("Failed to set the cache item, key {0}", key);
    }
}

#endregion
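If that dictionary is the problem, what I have in mind is a bounded variant along these lines; a minimal sketch using MemoryCache with an absolute expiration (the names here are mine, not what we currently run):
// Sketch of a bounded local layer; BoundedLocalCache and fetchFromMemcached are assumed names.
using System;
using System.Runtime.Caching;

public class BoundedLocalCache
{
    private readonly MemoryCache _local = MemoryCache.Default;

    public T Get<T>(string key, Func<T> fetchFromMemcached, int cacheMinutes)
    {
        object data = _local.Get(key);
        if (data != null)
            return (T)data;

        T fetched = fetchFromMemcached();
        if (fetched != null)
        {
            // Entries expire, so the in-process copy cannot grow without bound.
            _local.Set(key, fetched, DateTimeOffset.Now.AddMinutes(cacheMinutes));
        }
        return fetched;
    }
}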
I have an in-memory cache method that I want to reimplement on top of Redis. The in-memory version relies on a SemaphoreSlim per key, so I have gone through documentation such as the Redis page on distributed locking, and tried different locking mechanisms in Redis such as RedLock.net, redlock-cs, etc., but I haven't been able to carry the technique through: even though I can recreate the method, its running time for a large number of concurrent processes against Redis turns out to be huge.
The code that I am trying to recreate is as follows:
private readonly ConcurrentDictionary<string, Lazy<SemaphoreSlim>> _semaphores =
    new ConcurrentDictionary<string, Lazy<SemaphoreSlim>>();

public async ValueTask<TValue> GetMethod<TValue>(string key, Func<Task<TValue>> get, int time)
{
    return await SemaphoreMethod(key, async () =>
    {
        var value = await get();
        if (value != null)
            cache.Set(key, value, time);
        return value;
    });
}

private async ValueTask<TValue> SemaphoreMethod<TValue>(string key, Func<Task<TValue>> get)
{
    // Fast path: no locking when the value is already cached.
    if (cache.TryGetValue(key, out TValue result))
        return result;

    var semaphore = _semaphores.GetOrAdd(key,
        k => new Lazy<SemaphoreSlim>(() => new SemaphoreSlim(1, 1))).Value;

    await semaphore.WaitAsync();
    try
    {
        // Double-check: another caller may have populated the cache
        // while we were waiting on the semaphore.
        if (cache.TryGetValue(key, out TValue result2))
            return result2;

        return await get();
    }
    finally
    {
        // Release in a finally block so an early return or an exception
        // from get() cannot leave the semaphore held forever.
        _semaphores.TryRemove(key, out _);
        semaphore.Release();
    }
}
And the test method I run against it to check its behavior and efficiency compared to the in-memory cache is as follows:
public async Task GetValueAsync_Tasks()
{
    var callCount = 0;

    async Task<string> Get()
    {
        Interlocked.Increment(ref callCount);
        return "value";
    }

    const int count = 1000;
    var valueTasks = new List<ValueTask<string>>(count);
    for (var i = 0; i < count; i++)
        valueTasks.Add(GetMethod("key", Get, 30)); // 30 = arbitrary cache time for the test

    var tasks = valueTasks.Select(vt => vt.AsTask()).ToArray();
    Task.WaitAll(tasks);

    callCount.Should().Be(1);
}
One other thing I am not able to understand is which part of the test method actually runs all of the tasks that were created: this line
valueTasks.Add(GetMethod("key", Get, 30));
or this one
Task.WaitAll(tasks);
I am using the StackExchange.Redis library and Azure Cache for Redis to store the data.
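For reference, a rough sketch of the kind of Redis-side locking I've been trying, using StackExchange.Redis's LockTake/LockRelease primitives (the JSON serialization via Newtonsoft.Json, the retry delay, and the field names are placeholders, not my exact code):
// _db comes from ConnectionMultiplexer.GetDatabase(); sketch of single-node locking, not full RedLock.
private readonly IDatabase _db;

private async Task<TValue> GetWithRedisLockAsync<TValue>(string key, Func<Task<TValue>> get, TimeSpan ttl)
{
    RedisValue cached = await _db.StringGetAsync(key);
    if (cached.HasValue)
        return JsonConvert.DeserializeObject<TValue>((string)cached);

    string lockKey = "lock:" + key;
    string lockToken = Guid.NewGuid().ToString();

    // Spin until we own the lock or the value appears.
    while (!await _db.LockTakeAsync(lockKey, lockToken, TimeSpan.FromSeconds(10)))
    {
        await Task.Delay(50);
        cached = await _db.StringGetAsync(key);
        if (cached.HasValue)
            return JsonConvert.DeserializeObject<TValue>((string)cached);
    }

    try
    {
        // Double-check under the lock, mirroring the SemaphoreSlim version.
        cached = await _db.StringGetAsync(key);
        if (cached.HasValue)
            return JsonConvert.DeserializeObject<TValue>((string)cached);

        TValue value = await get();
        if (value != null)
            await _db.StringSetAsync(key, JsonConvert.SerializeObject(value), ttl);
        return value;
    }
    finally
    {
        await _db.LockReleaseAsync(lockKey, lockToken);
    }
}
Even in this form, every cache miss costs several network round-trips, which may be part of why the Redis version looks so much slower than SemaphoreSlim at high call counts.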
I'm porting a website to dnx core/aspnet5/mvc6. I need to store passwords to 3rd party sites in the database (it's essentially an aggregator).
In earlier versions of MVC I did this using classes like RijndaelManaged, but those don't appear to exist in DNX Core. In fact, I haven't been able to find much documentation on general-purpose encryption/decryption in DNX Core at all.
What's the recommended approach for encrypting/decrypting single field values in an mvc6 site? I don't want to encrypt the entire sql server database.
Or should I be looking at a different approach for storing the credentials necessary to access a password-protected 3rd party site?
See the DataProtection API documentation
Their guidance on using it for persistent data protection is a little hedged, but they say there is no technical reason you can't do it. Essentially, to store protected data persistently you must be willing to unprotect it with expired keys, since keys can expire after the data was protected.
To me it seems reasonable to use it and I am using it in my own project.
Since IPersistedDataProtector only provides byte-array methods, I made a couple of extension methods to convert the bytes back and forth from string.
public static class DataProtectionExtensions
{
    public static string PersistentUnprotect(
        this IPersistedDataProtector dp,
        string protectedData,
        out bool requiresMigration,
        out bool wasRevoked)
    {
        bool ignoreRevocation = true;
        byte[] protectedBytes = Convert.FromBase64String(protectedData);
        byte[] unprotectedBytes = dp.DangerousUnprotect(protectedBytes, ignoreRevocation, out requiresMigration, out wasRevoked);
        return Encoding.UTF8.GetString(unprotectedBytes);
    }

    public static string PersistentProtect(
        this IPersistedDataProtector dp,
        string clearText)
    {
        byte[] clearBytes = Encoding.UTF8.GetBytes(clearText);
        byte[] protectedBytes = dp.Protect(clearBytes);
        string result = Convert.ToBase64String(protectedBytes);
        return result;
    }
}
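Usage then looks something like this (the purpose string is arbitrary, and this assumes the concrete protector implements IPersistedDataProtector, which the default implementation does):
// Illustrative usage: resolve a protector, then round-trip a string.
var protector = dataProtectionProvider.CreateProtector("MyApp.FieldProtection")
    as IPersistedDataProtector;

string protectedValue = protector.PersistentProtect("3rd party password");

bool requiresMigration, wasRevoked;
string clearValue = protector.PersistentUnprotect(protectedValue, out requiresMigration, out wasRevoked);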
I also created a helper class specifically for protecting certain properties on my SiteSettings object before it gets persisted to the db.
using cloudscribe.Core.Models;
using Microsoft.AspNet.DataProtection;
using Microsoft.Extensions.Logging;
using System;

namespace cloudscribe.Core.Web.Components
{
    public class SiteDataProtector
    {
        public SiteDataProtector(
            IDataProtectionProvider dataProtectionProvider,
            ILogger<SiteDataProtector> logger)
        {
            rawProtector = dataProtectionProvider.CreateProtector("cloudscribe.Core.Models.SiteSettings");
            log = logger;
        }

        private ILogger log;
        private IDataProtector rawProtector = null;
        private IPersistedDataProtector dataProtector
        {
            get { return rawProtector as IPersistedDataProtector; }
        }

        public void Protect(ISiteSettings site)
        {
            if (site == null) { throw new ArgumentNullException(nameof(site), "you must pass in an implementation of ISiteSettings"); }
            if (site.IsDataProtected) { return; }
            if (dataProtector == null) { return; }

            if (site.FacebookAppSecret.Length > 0)
            {
                try
                {
                    site.FacebookAppSecret = dataProtector.PersistentProtect(site.FacebookAppSecret);
                }
                catch (System.Security.Cryptography.CryptographicException ex)
                {
                    log.LogError("data protection error", ex);
                }
            }

            // ....

            site.IsDataProtected = true;
        }

        public void UnProtect(ISiteSettings site)
        {
            bool requiresMigration = false;
            bool wasRevoked = false;
            if (site == null) { throw new ArgumentNullException(nameof(site), "you must pass in an implementation of ISiteSettings"); }
            if (!site.IsDataProtected) { return; }

            if (site.FacebookAppSecret.Length > 0)
            {
                try
                {
                    site.FacebookAppSecret = dataProtector.PersistentUnprotect(site.FacebookAppSecret, out requiresMigration, out wasRevoked);
                }
                catch (System.Security.Cryptography.CryptographicException ex)
                {
                    log.LogError("data protection error", ex);
                }
                catch (FormatException ex)
                {
                    log.LogError("data protection error", ex);
                }
            }

            site.IsDataProtected = false;

            if (requiresMigration || wasRevoked)
            {
                log.LogWarning("DataProtection key wasRevoked or requires migration, save site settings for "
                    + site.SiteName + " to protect with a new key");
            }
        }
    }
}
If the app will ever need to migrate to another machine after data has been protected, you also want to take control of the key location. As I understand it, the default stores the keys on the machine's OS key ring, so this is much like the machineKey of the past, which you would override in web.config to make it portable.
Of course, protecting the keys is then up to you. I have code like this in my project's startup:
// If you change the key persistence location, the system will no longer automatically
// encrypt keys at rest, since it doesn't know whether DPAPI is an appropriate
// encryption mechanism.
services.ConfigureDataProtection(configure =>
{
    string pathToCryptoKeys = appBasePath + Path.DirectorySeparatorChar
        + "dp_keys" + Path.DirectorySeparatorChar;

    // These keys are not encrypted at rest since we have specified a non-default location.
    // That also makes the keys portable, so they will still work if we migrate to a new
    // machine (will they work on a different OS? I think so). This is a similar server
    // migration issue as the old machineKey, where we specified a machineKey in web.config
    // so it would not change if we migrated to a new server.
    configure.PersistKeysToFileSystem(new DirectoryInfo(pathToCryptoKeys));
});
So my keys are stored in appRoot/dp_keys in this example.
If you want to do things manually:
Add a reference to System.Security.Cryptography.Algorithms
Then you can create instances of each algorithm type via the Create method. For example:
var aes = System.Security.Cryptography.Aes.Create();
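A minimal field-level encrypt/decrypt round-trip with Aes might then look like this (a sketch only; generating, storing, and safeguarding the key and IV is up to you):
using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;

public static class FieldCrypto
{
    public static byte[] Encrypt(string clearText, byte[] key, byte[] iv)
    {
        using (var aes = Aes.Create())
        {
            aes.Key = key; // 16, 24, or 32 bytes
            aes.IV = iv;   // 16 bytes
            using (var ms = new MemoryStream())
            using (var cs = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
            {
                byte[] clearBytes = Encoding.UTF8.GetBytes(clearText);
                cs.Write(clearBytes, 0, clearBytes.Length);
                cs.FlushFinalBlock(); // emit the final padded block
                return ms.ToArray();
            }
        }
    }

    public static string Decrypt(byte[] cipherBytes, byte[] key, byte[] iv)
    {
        using (var aes = Aes.Create())
        {
            aes.Key = key;
            aes.IV = iv;
            using (var ms = new MemoryStream())
            using (var cs = new CryptoStream(ms, aes.CreateDecryptor(), CryptoStreamMode.Write))
            {
                cs.Write(cipherBytes, 0, cipherBytes.Length);
                cs.FlushFinalBlock();
                return Encoding.UTF8.GetString(ms.ToArray());
            }
        }
    }
}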
I have to create a mechanism to store and read per-user preferences (default values and settings for controls). I have a network-traffic concern: the database is accessed over the internet, and the application server is sometimes on a poor 512 kbps connection.
My application can have around 50 simultaneous users, each page/form can have up to 50 items (preferences), and there are around 80 pages.
So, from a performance perspective, which should I choose to reduce network traffic: Session or cache?
UPDATE
I've created two sample pages, one using cache and another using session.
Load test: 90 users
Stored content: 1,000 elements, 20 characters in each element's value
Here are the results from each test case:
MemoryCache
330,725,246 total bytes allocated
Functions allocating the most memory (% of bytes allocated):
System.Runtime.Caching.MemoryCache.Set(string, object, System.Runtime.Caching.CacheItemPolicy, string): 34.74%
System.Web.UI.Page.ProcessRequest(System.Web.HttpContext): 18.39%
System.String.Concat(string, string): 12.65%
System.String.Join(string, string[]): 5.31%
System.Collections.Generic.Dictionary`2.Add(!0, !1): 4.42%
Source code:
protected void Page_Load(object sender, EventArgs e)
{
    outputPanel.Text = String.Join(System.Environment.NewLine, ReadEverything().ToArray());
}

private IEnumerable<String> ReadEverything()
{
    for (int i = 0; i < 1000; i++)
    {
        yield return ReadFromCache(i);
    }
}

private string ReadFromCache(int p)
{
    String saida = String.Empty;
    ObjectCache cache = MemoryCache.Default;
    Dictionary<int, string> cachedItems = cache["user" + Session.SessionID] as Dictionary<int, string>;

    if (cachedItems == null)
    {
        cachedItems = new Dictionary<int, string>();
    }

    if (!cachedItems.TryGetValue(p, out saida))
    {
        saida = Util.RandomString(20);
        cachedItems.Add(p, saida);

        CacheItemPolicy policy = new CacheItemPolicy();
        policy.AbsoluteExpiration = DateTimeOffset.Now.AddSeconds(30);
        cache.Set("user" + Session.SessionID, cachedItems, policy);
    }

    return saida;
}
Session
111,625,747 total bytes allocated
Functions allocating the most memory (% of bytes allocated):
System.Web.UI.Page.ProcessRequest(System.Web.HttpContext): 55.19%
System.String.Join(string, string[]): 15.93%
System.Collections.Generic.Dictionary`2.Add(!0, !1): 6.00%
System.Text.StringBuilder.Append(char): 5.93%
System.Linq.Enumerable.ToArray(System.Collections.Generic.IEnumerable`1): 4.46%
Source code:
protected void Page_Load(object sender, EventArgs e)
{
    outputPanel.Text = String.Join(System.Environment.NewLine, ReadEverything().ToArray());
}

private IEnumerable<String> ReadEverything()
{
    for (int i = 0; i < 1000; i++)
    {
        yield return ReadFromSession(i);
    }
}

private string ReadFromSession(int p)
{
    String saida = String.Empty;
    Dictionary<int, string> cachedItems = Session["cachedItems"] as Dictionary<int, string>;

    if (cachedItems == null)
    {
        cachedItems = new Dictionary<int, string>();
    }

    if (!cachedItems.TryGetValue(p, out saida))
    {
        saida = Util.RandomString(20);
        cachedItems.Add(p, saida);
        Session["cachedItems"] = cachedItems;
    }

    return saida;
}
I forgot to mention that I'm creating a solution to work with both ASP.NET and WPF projects; however, if Session turns out to be far better than the MemoryCache option, I can use different solutions for each platform.
Both are really the same: they live in memory. If you are using database-backed session state over a poor connection, then you should read from the cache when the item is present, load from the database when it is not, and then cache the result.
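In code, that read-through pattern looks roughly like this (UserPreferences and LoadPreferencesFromDatabase are placeholders for your own type and data-access call):
// Cache-aside: return the cached copy when present, otherwise hit the DB once and cache.
private UserPreferences GetPreferences(string userId)
{
    string cacheKey = "prefs:" + userId;
    var prefs = MemoryCache.Default.Get(cacheKey) as UserPreferences;
    if (prefs == null)
    {
        prefs = LoadPreferencesFromDatabase(userId); // the expensive remote call
        MemoryCache.Default.Set(cacheKey, prefs, DateTimeOffset.Now.AddMinutes(20));
    }
    return prefs;
}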
I would consider the session a caching mechanism; unlike the others, it is simply scoped to a specific browser session. My approach would consider the following questions.
Is this site load balanced? If it is, using the session will force persistent (sticky) sessions, possibly causing issues if you take down a server.
Is this data user specific? If it is, the session is an easy way to segregate data without lots of key manipulation, and it has the benefit that when a user's session times out it is automatically cleaned up. If it isn't, I'd recommend the MemoryCache feature added in .NET 4.0, which supports expiration.
How will this cache become outdated? If user A can modify data cached for user B, you're now serving dirty data with session-based caching. That calls for a shared cache mechanism.
Edit: Once these questions have been answered, you should be able to decide what type of cache mechanism is appropriate for your situation. Then you can evaluate that subset for performance.
We have been moving our services and MVC4 website to the cloud; overall the process went fine.
Except for caching: since we have moved to Azure, it seemed wise to use the caching Azure provides. We chose the co-located / dedicated caching role, which has the advantage that the cache is shared across all instances.
Setting up the caching worked fine. I have a named cache client that I only initialize when it's required; it is set up in an inherited layer of the controllers. As soon as one of the functions is called, it checks whether the connection to the data cache still exists, or creates it. This all seems to work, but I'm building a module to retrieve prices, and multiple AJAX inserts (views inserted into the page with JavaScript) use these functions, some of them called at the same time by multiple AJAX views. Some of those views then return either a 404 or 500 error, and I can't explain where they come from except a non-working cache or something similar.
Can someone help me with a good implementation of the named caching (co-located or dedicated)? All I can find are examples illustrating the initialization of the DataCacheFactory, but not data insertion and retrieval.
Below is the code as I have it now. I've tried more approaches, with locking etc., but this one has worked best so far.
private static object magicStick = new object();
private static DataCacheFactory dcf = null;
private static DataCache priceCache = null;

protected void CreateCacheFactory()
{
    dcf = new DataCacheFactory();
}

protected void CreatePricesCache()
{
    if (dcf == null)
    {
        CreateCacheFactory();
    }
    priceCache = dcf.GetCache("Prices");
}

protected PriceData GetPrices(int productID)
{
    if (priceCache == null)
    {
        CreatePricesCache();
    }

    string cacheKey = "something";
    lock (magicStick)
    {
        PriceData datas = priceCache.Get(cacheKey) as PriceData;
        if (datas == null)
        {
            lock (magicStick)
            {
                Services svc = new Services();
                PriceData pData = svc.PriceService.GetPrices(productID);
                if (pData != null && pData.Offers != null && pData.Offers.Count() > 0)
                {
                    datas = pData;
                    datas.Offers = datas.Offers.OrderBy(pr => (pr.BasePrice + pr.ShippingCosts)).ToArray();
                    priceCache.Add(cacheKey, datas, new TimeSpan(0, cachingTimePricesKK, 0));
                }
            }
        }
        return datas;
    }
}
As soon as I get to a page with price lists, where the function above is called multiple times with the same arguments, there is a 5-10% chance that it returns an error rather than the results. Can anybody help me? I've been totally stuck on this for a week now and it's eating me up inside.
First, I'd move your cache and cache factory instantiation out of your GetPrices method. Also, evaluate your need for the lock; it may be causing timeouts. Another VERY important observation: you are using a constant cache key, saving and retrieving the data for every productId under the same key. You should be using a cache key like: var cacheKey = string.Format("priceDatabyProductId-{0}", productId);. You need to set some breakpoints and examine exactly what you are caching and retrieving from the cache. As written, the code will save the first productId's data to the cache and then keep returning that data regardless of the productId.
Here is a full working example we use in production using the "default" named cache in dedicated cache roles:
public static class MyCache
{
    private static DataCacheFactory _cacheFactory = null;

    private static DataCache ACache
    {
        get
        {
            if (_cacheFactory == null)
            {
                try
                {
                    _retryPolicy.ExecuteAction(() => { _cacheFactory = new DataCacheFactory(); });
                    return _cacheFactory == null ? null : _cacheFactory.GetDefaultCache();
                }
                catch (Exception ex)
                {
                    ErrorSignal.FromCurrentContext().Raise(ex);
                    return null;
                }
            }
            return _cacheFactory.GetDefaultCache();
        }
    }

    public static void FlushCache()
    {
        ACache.Clear();
    }

    // Define your retry strategy: retry 3 times, 1 second apart.
    private static readonly FixedInterval _retryStrategy = new FixedInterval(3, TimeSpan.FromSeconds(1));

    // Define your retry policy using the retry strategy and the Windows Azure storage
    // transient fault detection strategy.
    private static RetryPolicy _retryPolicy = new RetryPolicy<StorageTransientErrorDetectionStrategy>(_retryStrategy);

    // Static constructor (type initializer); a static class cannot be
    // instantiated, so consumers must go through the static members.
    static MyCache()
    { }

    /// <summary>
    /// Add an item to the cache with a key and set an absolute expiration on it
    /// </summary>
    public static void Add(string key, object value, int minutes = 90)
    {
        try
        {
            _retryPolicy.ExecuteAction(() => { ACache.Put(key, value, TimeSpan.FromMinutes(minutes)); });
        }
        catch (Exception ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
        }
    }

    /// <summary>
    /// Add the object with the specified key to the cache if it does not exist, or replace
    /// the object if it does exist, and set an absolute expiration on it
    /// (only valid for Azure caching)
    /// </summary>
    public static void Put(string key, object value, int minutes = 90)
    {
        try
        {
            _retryPolicy.ExecuteAction(() => { ACache.Put(key, value, TimeSpan.FromMinutes(minutes)); });
        }
        catch (Exception ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
        }
    }

    /// <summary>
    /// Get a strongly typed item out of cache
    /// </summary>
    public static T Get<T>(string key) where T : class
    {
        try
        {
            object value = null;
            _retryPolicy.ExecuteAction(() => { value = ACache.Get(key); });
            if (value != null) return (T)value;
            return null;
        }
        catch (DataCacheException ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
            return null;
        }
        catch (Exception ex)
        {
            ErrorSignal.FromCurrentContext().Raise(ex);
            return null;
        }
    }

    /// <summary>
    /// Microsoft's suggested method for cleaning up resources such as this in a static class,
    /// to ensure connections and other consumed resources are returned to the resource pool
    /// as quickly as possible.
    /// </summary>
    public static void Uninitialize()
    {
        if (_cacheFactory == null) return;
        _cacheFactory.Dispose();
        _cacheFactory = null;
    }
}
Note: this is also using the Transient Fault Handling block from the Enterprise Library for transient exception fault handling.
Currently, I use custom written authentication code for my site, which is built on .NET. I didn't take the standard Forms Auth route, as all the examples I could find were tightly integrated with WebForms, which I do not use. For all intents and purposes, I have all static HTML, and any logic is done through Javascript and web service calls. Things like logging in, logging off, and creating a new account are done without even leaving the page.
Here's how it works now: In the database, I have a User ID, a Security ID, and a Session ID. All three are UUIDs, and the first two never change. Each time the user logs on, I check the user table for a row that matches that username and hashed password, and I update the Session ID to a new UUID. I then create a cookie that's a serialized representation of all three UUIDs. In any secure web service call, I deserialize that cookie and make sure there's a row in the users table with those 3 UUIDs. It's a fairly simple system and works well; however, I don't really like that a user can only be logged on with one client at a time. It's going to cause issues when I create mobile and tablet apps, and already creates issues if they have multiple computers or web browsers. For this reason, I'm thinking about throwing away this system and going with something new. Since I wrote it years ago, I figure there might be something much more recommended.
I've been reading up on the FormsAuthentication class in the .NET Framework, which handles auth cookies, and runs as an HttpModule to validate each request. I'm wondering if I can take advantage of this in my new design.
It looks like the cookies are stateless, and sessions don't have to be tracked in the database. This works because cookies are encrypted with a private key on the server, which can also be shared across a cluster of web servers. If I do something like:
FormsAuthentication.SetAuthCookie("Bob", true);
Then in later requests, I can be assured that Bob is indeed a valid user, as the cookie would be very difficult, if not impossible, to forge.
Would I be wise to use the FormsAuthentication class to replace my current authentication model with? Rather than have a Session ID column in the database, I'd rely on encrypted cookies to represent valid sessions.
Are there third party/open source .NET authentication frameworks that might work better for my architecture?
Will this authentication mechanism cause any grief with code running on mobile and tablet clients, such as an iPhone app or Windows 8 Surface app? I would assume this would work, as long as these apps could handle cookies. Thanks!
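To make the FormsAuthentication option concrete: a ticket can carry arbitrary data in its UserData field, which seems like it would cover my extra UUIDs. A sketch (userId and securityId stand in for the values from my user table, and the pipe-delimited payload is just an idea):
// Issue a forms-auth cookie whose UserData carries extra identifiers.
var ticket = new FormsAuthenticationTicket(
    1,                           // version
    "Bob",                       // user name
    DateTime.Now,                // issued
    DateTime.Now.AddDays(100),   // expires
    true,                        // persistent
    userId + "|" + securityId);  // arbitrary payload

string encrypted = FormsAuthentication.Encrypt(ticket);
Response.Cookies.Add(new HttpCookie(FormsAuthentication.FormsCookieName, encrypted)
{
    HttpOnly = true,
    Expires = ticket.Expiration
});

// On later requests:
var authCookie = Request.Cookies[FormsAuthentication.FormsCookieName];
if (authCookie != null)
{
    FormsAuthenticationTicket t = FormsAuthentication.Decrypt(authCookie.Value);
    string[] parts = t.UserData.Split('|'); // back to userId / securityId
}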
Since I didn't get any responses, I decided to take a shot at this myself. First, I found an open source project that implements session cookies in an algorithm agnostic way. I used this as a starting point to implement a similar handler.
One issue I had with the built in ASP.NET implementation, which is a similar restriction in the AppHarbor implementation, is sessions are only keyed by a string username. I wanted to be able to store arbitrary data to identify a user, such as their UUID in the database as well as their logon name. As much of my existing code assumes this data is available in the cookie, it would take a lot of refactoring if this data were no longer available. Plus, I like the idea of being able to store basic user information without having to hit the database.
Another issue with the AppHarbor project, as pointed out in this open issue, is that the encryption algorithm isn't verified. That is not exactly true, since AppHarbor is algorithm agnostic; however, it was requested that the sample project show how to use PBKDF2. For that reason, I decided to use this algorithm (implemented in the .NET Framework through the Rfc2898DeriveBytes class) in my code.
Here's what I was able to come up with. It's meant as a starting point for someone looking to implement their own session management, so feel free to use it for whatever purpose you see fit.
using System;
using System.IO;
using System.Linq;
using System.Runtime.Serialization.Formatters.Binary;
using System.Security;
using System.Security.Cryptography;
using System.Security.Principal;
using System.Web;

namespace AuthTest
{
    [Serializable]
    public class AuthIdentity : IIdentity
    {
        public Guid Id { get; private set; }
        public string Name { get; private set; }

        public AuthIdentity() { }

        public AuthIdentity(Guid id, string name)
        {
            Id = id;
            Name = name;
        }

        public string AuthenticationType
        {
            get { return "CookieAuth"; }
        }

        public bool IsAuthenticated
        {
            get { return Id != Guid.Empty; }
        }
    }

    [Serializable]
    public class AuthToken : IPrincipal
    {
        public IIdentity Identity { get; set; }

        public bool IsInRole(string role)
        {
            return false;
        }
    }

    public class AuthModule : IHttpModule
    {
        static string COOKIE_NAME = "AuthCookie";

        //Note: Change these two keys to something else (VALIDATION_KEY is 64 bytes, ENCRYPTION_KEY is 72 bytes)
        static string VALIDATION_KEY = @"MkMvk1JL/ghytaERtl6A25iTf/ABC2MgPsFlEbASJ5SX4DiqnDN3CjV7HXQI0GBOGyA8nHjSVaAJXNEqrKmOMg==";
        static string ENCRYPTION_KEY = @"QQJYW8ditkzaUFppCJj+DcCTc/H9TpnSRQrLGBQkhy/jnYjqF8iR6do9NvI8PL8MmniFvdc21sTuKkw94jxID4cDYoqr7JDj";

        static byte[] key;
        static byte[] iv;
        static byte[] valKey;

        public void Dispose()
        {
        }

        public void Init(HttpApplication context)
        {
            context.AuthenticateRequest += OnAuthenticateRequest;
            context.EndRequest += OnEndRequest;

            byte[] bytes = Convert.FromBase64String(ENCRYPTION_KEY); //72 bytes (8 for salt, 64 for key)
            byte[] salt = bytes.Take(8).ToArray();
            byte[] pw = bytes.Skip(8).ToArray();
            Rfc2898DeriveBytes k1 = new Rfc2898DeriveBytes(pw, salt, 1000);
            key = k1.GetBytes(16);
            iv = k1.GetBytes(8);

            valKey = Convert.FromBase64String(VALIDATION_KEY); //64 byte validation key to prevent tampering
        }

        public static void SetCookie(AuthIdentity token, bool rememberMe = false)
        {
            //Serialize the token, encrypt it, then Base64 encode it
            var formatter = new BinaryFormatter();
            MemoryStream stream = new MemoryStream();
            formatter.Serialize(stream, token);
            byte[] buffer = stream.ToArray(); //ToArray (not GetBuffer) so unused slack space is excluded
            byte[] encryptedBytes = EncryptCookie(buffer);
            string str = Convert.ToBase64String(encryptedBytes);

            var cookie = new HttpCookie(COOKIE_NAME, str);
            cookie.HttpOnly = true;
            if (rememberMe)
            {
                cookie.Expires = DateTime.Today.AddDays(100);
            }
            HttpContext.Current.Response.Cookies.Add(cookie);
        }

        public static void Logout()
        {
            HttpContext.Current.Response.Cookies.Remove(COOKIE_NAME);
            HttpContext.Current.Response.Cookies.Add(new HttpCookie(COOKIE_NAME, "")
            {
                Expires = DateTime.Today.AddDays(-1)
            });
        }

        private static byte[] EncryptCookie(byte[] rawBytes)
        {
            TripleDES des = TripleDES.Create();
            des.Key = key;
            des.IV = iv;
            MemoryStream encryptionStream = new MemoryStream();
            CryptoStream encrypt = new CryptoStream(encryptionStream, des.CreateEncryptor(), CryptoStreamMode.Write);
            encrypt.Write(rawBytes, 0, rawBytes.Length);
            encrypt.FlushFinalBlock();
            encrypt.Close();
            byte[] encBytes = encryptionStream.ToArray();

            //Add validation hash (compute hash on unencrypted data)
            HMACSHA256 hmac = new HMACSHA256(valKey);
            byte[] hash = hmac.ComputeHash(rawBytes);

            //Combine encrypted bytes and validation hash
            byte[] ret = encBytes.Concat<byte>(hash).ToArray();
            return ret;
        }

        private static byte[] DecryptCookie(byte[] encBytes)
        {
            TripleDES des = TripleDES.Create();
            des.Key = key;
            des.IV = iv;

            //Split the payload back into ciphertext and validation hash
            HMACSHA256 hmac = new HMACSHA256(valKey);
            int valSize = hmac.HashSize / 8;
            int msgLength = encBytes.Length - valSize;
            byte[] message = new byte[msgLength];
            byte[] valBytes = new byte[valSize];
            Buffer.BlockCopy(encBytes, 0, message, 0, msgLength);
            Buffer.BlockCopy(encBytes, msgLength, valBytes, 0, valSize);

            MemoryStream decryptionStreamBacking = new MemoryStream();
            CryptoStream decrypt = new CryptoStream(decryptionStreamBacking, des.CreateDecryptor(), CryptoStreamMode.Write);
            decrypt.Write(message, 0, msgLength);
            decrypt.FlushFinalBlock(); //FlushFinalBlock (not Flush) so the last block is actually decrypted
            byte[] decMessage = decryptionStreamBacking.ToArray();

            //Verify the hash matches
            byte[] hash = hmac.ComputeHash(decMessage);
            if (valBytes.SequenceEqual(hash))
            {
                return decMessage;
            }

            throw new SecurityException("Auth Cookie appears to have been tampered with!");
        }

        private void OnAuthenticateRequest(object sender, EventArgs e)
        {
            var context = ((HttpApplication)sender).Context;
            var cookie = context.Request.Cookies[COOKIE_NAME];

            if (cookie != null && cookie.Value.Length > 0)
            {
                try
                {
                    var formatter = new BinaryFormatter();
                    MemoryStream stream = new MemoryStream();
                    var bytes = Convert.FromBase64String(cookie.Value);
                    var decBytes = DecryptCookie(bytes);
                    stream.Write(decBytes, 0, decBytes.Length);
                    stream.Seek(0, SeekOrigin.Begin);
                    AuthIdentity auth = formatter.Deserialize(stream) as AuthIdentity;

                    AuthToken token = new AuthToken() { Identity = auth };
                    context.User = token;

                    //Renew the cookie for another 100 days (TODO: Should only renew if cookie was originally set to persist)
                    context.Response.Cookies[COOKIE_NAME].Value = cookie.Value;
                    context.Response.Cookies[COOKIE_NAME].Expires = DateTime.Today.AddDays(100);
                }
                catch { } //Ignore any errors with bad cookies
            }
        }

        private void OnEndRequest(object sender, EventArgs e)
        {
            var context = ((HttpApplication)sender).Context;
            var response = context.Response;

            if (response.Cookies.Keys.Cast<string>().Contains(COOKIE_NAME))
            {
                response.Cache.SetCacheability(HttpCacheability.NoCache, "Set-Cookie");
            }
        }
    }
}
Also, be sure to include the following module in your web.config file:
<httpModules>
<add name="AuthModule" type="AuthTest.AuthModule" />
</httpModules>
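If the site runs under the IIS 7+ integrated pipeline, the module is registered under system.webServer instead:
<system.webServer>
  <modules>
    <add name="AuthModule" type="AuthTest.AuthModule" />
  </modules>
</system.webServer>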
In your code, you can look up the currently logged-on user with:
var id = HttpContext.Current.User.Identity as AuthIdentity;
And set the auth cookie like so:
AuthIdentity token = new AuthIdentity(Guid.NewGuid(), "Mike");
AuthModule.SetCookie(token, false);