Is this piece of code, where I lock a part of the function, correct? Or can it have drawbacks when multiple sessions ask concurrently for the same Exam?
The purpose is that the first client to ask for the Exam will assemble it; all subsequent clients will get the cached version.
public Exam GetExamByExamDto(ExamDTO examDto, int languageId)
{
    Log.Warn("GetExamByExamDto");
    lock (LockString)
    {
        if (!ContainsExam(examDto.id, languageId))
        {
            Log.Warn("Assembling ExamDto");
            var examAssembler = new ExamAssembler();
            var exam = examAssembler.createExam(examDto);
            if (AddToCache(exam))
            {
                _examDictionary.Add(examDto.id + "_" + languageId, exam);
            }
            Log.Warn("Returning non cached ExamDto");
            return exam;
        }
    }
    Log.Warn("Returning cached ExamDto");
    return _examDictionary[examDto.id + "_" + languageId];
}
I have a feeling that this isn't the way to do it.
Never lock on strings: string literals are interned, so code elsewhere that locks on the same string value is actually locking on the very same object, and you may end up blocking your whole application.
Just use a new object as your lock:
private readonly object Padlock = new object();
See this blog post by Tess Ferrandez.
The LockString variable: is it really a string? You shouldn't lock on a string, as string interning means two seemingly independent locks can end up contending on the same instance.
The other thing is that there is quite a lot of code inside the lock, meaning that you hold it for longer than you may have to. If you can, move as much code as possible out of the lock; see the sketch below for one way to avoid the explicit lock entirely.
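For illustration, here is a minimal sketch (not from the original answer; it assumes .NET 4+ and uses ConcurrentDictionary from System.Collections.Concurrent). GetOrAdd makes the add atomic per key, and Lazy&lt;T&gt; guarantees the assembly delegate runs at most once per key, so the first caller builds the Exam and everyone else gets the cached one:
private readonly ConcurrentDictionary<string, Lazy<Exam>> _examCache =
    new ConcurrentDictionary<string, Lazy<Exam>>();

public Exam GetExamByExamDto(ExamDTO examDto, int languageId)
{
    string key = examDto.id + "_" + languageId;

    // GetOrAdd is atomic per key; Lazy<T> defers (and deduplicates)
    // the expensive createExam call.
    return _examCache.GetOrAdd(
        key,
        _ => new Lazy<Exam>(() => new ExamAssembler().createExam(examDto))).Value;
}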
It looks basically OK but you should not lock on a string. Interning makes it hard to control which instance is which. Just create a separate object to lock on:
private object padLock = new object();
Also, you can use double-checked locking (after the exam is cached, Monitor.Enter won't be invoked at all):
public Exam GetExamByExamDto(ExamDTO examDto, int languageId)
{
    Log.Warn("GetExamByExamDto");
    if (!ContainsExam(examDto.id, languageId))
    {
        lock (padLock) // A private locking object, not a string.
        {
            if (!ContainsExam(examDto.id, languageId))
            {
                Log.Warn("Assembling ExamDto");
                var examAssembler = new ExamAssembler();
                var exam = examAssembler.createExam(examDto);
                if (AddToCache(exam))
                {
                    _examDictionary.Add(examDto.id + "_" + languageId, exam);
                }
                Log.Warn("Returning non cached ExamDto");
                return exam;
            }
        }
    }
    Log.Warn("Returning cached ExamDto");
    return _examDictionary[examDto.id + "_" + languageId];
}
I have been scratching my head over this for the last week; I will really appreciate any help here, so here we go:
I have a website running on DNN 7.4 and I want to add caching functionality to it. On the first attempt it should read from the database, and from the second attempt onwards it should read from the cache. The problem is that the first attempt reads as expected and the second attempt reads from the cache, but then on the third attempt it fetches from the database again. Here's my simple code:
if (Cache["Something"] != null)
{
Response.Write("Cache" + Cache["Something"]);
}
else
{
Cache["Something"] = "Cool";
Response.Write("Latest" + Cache["Something"]);
}
DataCache implementation:
using DotNetNuke.Common.Utilities;
string dCache = DataCache.GetCache("test") as string;
if (String.IsNullOrEmpty(dCache))
{
    DataCache.SetCache("test", "cache content");
    Response.Write("From Database: cache content");
}
else
{
    Response.Write("From Cache: " + dCache);
}
You do not show what kind of cache you are using or what policies apply to the cached items. Here is a sample of how it might work with a MemoryCache:
ObjectCache cache = MemoryCache.Default;
string something = cache["Something"] as string;
if (something == null)
{
    CacheItemPolicy policy = new CacheItemPolicy();
    // Configure your expiration policy here
    cache.Set("Something", "Cool", policy);
    something = "Cool";
}
return something;
Now you only have to configure the policy for the cache item handling.
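For example, an absolute expiration policy would look like this (a sketch; the 30-minute window is an arbitrary value for illustration):
CacheItemPolicy policy = new CacheItemPolicy();
// Evict the entry 30 minutes after it was added (absolute expiration)...
policy.AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30);
// ...or, alternatively, keep it alive only while it is being used:
// policy.SlidingExpiration = TimeSpan.FromMinutes(30);
cache.Set("Something", "Cool", policy);
Note that a single policy should set either absolute or sliding expiration, not both.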
I suggest using DataCache from DotNetNuke.Common.Utilities:
//Get Cache
string str = DataCache.GetCache("something") as string;
//Remove cache
DataCache.RemoveCache("something");
// Set Cache
DataCache.SetCache("something", "value of something");
I have been trying to find a way to make this task more efficient. I am consuming a REST-based web service and need to update information for over 2500 clients.
I am using Fiddler to watch the requests, and I'm also updating a table with an update time when each one completes. I'm getting about 1 response per second. Are my expectations too high? I'm not even sure what I would define as 'fast' in this context.
I am handling everything in my controller and have tried running multiple web requests in parallel based on examples around the place, but it doesn't seem to make a difference. To be honest, I don't understand it well enough and was just trying to get it to build. I suspect it is still waiting for each request to complete before firing the next one.
I have also increased connections in my web config file as per another suggestion with no success:
<system.net>
  <connectionManagement>
    <add address="*" maxconnection="20" />
  </connectionManagement>
</system.net>
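(For reference, the same limit can also be set in code at application startup; this line is equivalent to the config entry above:)
// e.g. in Application_Start:
System.Net.ServicePointManager.DefaultConnectionLimit = 20;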
My Controllers action method looks like this:
public async Task<ActionResult> UpdateMattersAsync()
{
    // Only get matters we haven't synced yet
    List<MatterClientRepair> repairList = Data.Get.AllUnsyncedMatterClientRepairs(true);

    // Take the next 500
    List<MatterClientRepair> subRepairList = repairList.Take(500).ToList();

    FinalisedMatterViewModel vm = new FinalisedMatterViewModel();

    using (ApplicationDbContext db = new ApplicationDbContext())
    {
        int jobCount = 0;
        foreach (var job in subRepairList)
        {
            // If not yet synced - it shouldn't ever be!!
            if (!job.Synced)
            {
                jobCount++;

                // set up some Authentication fields
                var oauth = new OAuth.Manager();
                oauth["access_token"] = Session["AccessToken"].ToString();
                string uri = "https://app.com/api/v2/matters/" + job.Matter;

                // prepare the json object for the body
                MatterClientJob jsonBody = new MatterClientJob();
                jsonBody.matter = new MatterForUpload();
                jsonBody.matter.client_id = job.NewClient;
                string jsonString = jsonBody.ToJSON();

                // Send it off. It returns the whole object we updated - we don't actually do anything with it
                Matter result = await oauth.Update<Matter>(uri, oauth["access_token"], "PUT", jsonString);

                // update our entities
                var updateJob = db.MatterClientRepairs.Find(job.ID);
                updateJob.Synced = true;
                updateJob.Update_Time = DateTime.Now;
                db.Entry(updateJob).State = System.Data.Entity.EntityState.Modified;

                if (jobCount % 50 == 0)
                {
                    // save every 50 changes
                    db.SaveChanges();
                }
            }
        }

        // if there are remaining files to save
        if (jobCount % 50 != 0)
        {
            db.SaveChanges();
        }

        return View("FinalisedMatters", Data.Get.AllMatterClientRepairs());
    }
}
And of course the Update method itself, which handles the web request:
public async Task<T> Update<T>(string uri, string token, string method, string json)
{
    var authzHeader = GenerateAuthzHeader(uri, method);

    // prepare the token request
    var request = (HttpWebRequest)WebRequest.Create(uri);
    request.Headers.Add("Authorization", authzHeader);
    request.Method = method;
    request.ContentType = "application/json";
    request.Accept = "application/json, text/javascript";

    byte[] bytes = System.Text.Encoding.ASCII.GetBytes(json);
    request.ContentLength = bytes.Length;

    System.IO.Stream os = request.GetRequestStream();
    os.Write(bytes, 0, bytes.Length);
    os.Close();

    WebResponse response = await request.GetResponseAsync();
    using (var reader = new System.IO.StreamReader(response.GetResponseStream()))
    {
        return JsonConvert.DeserializeObject<T>(reader.ReadToEnd());
    }
}
If it's not possible to do more than 1 request per second, then I'm interested in looking at an Ajax solution so I can give the user some feedback while it is processing. In my current solution I can't give the user any feedback until the action method reaches 'return', can I?
Okay, it's taken me a few days (and a LOT of trial and error), but I've worked this out. Hopefully it can help others. I finally found my silver bullet, and it was probably the place I should have started:
MSDN: Consuming the Task-based Asynchronous Pattern
In the end, the following line of code is what brought it all to light.
string [] pages = await Task.WhenAll(from url in urls select DownloadStringAsync(url));
I substituted a few things to make it work for a Put request as follows:
HttpResponseMessage[] results = await Task.WhenAll(from p in toUpload select client.PutAsync(p.uri, p.jsonContent));
'toUpload' is a List of MyClass:
public class MyClass
{
    // the URI should be relative to the base path
    // (ie: /api/v2/matters/101)
    public string uri { get; set; }

    // a string in JSON format, being the body of the PUT request
    public StringContent jsonContent { get; set; }
}
The key was to stop trying to put my PutAsync method inside a loop. My new line of code IS still blocking until ALL responses have come back, but that is what I wanted. Also, learning that I could use this LINQ-style expression to create a task list on the fly was immeasurably helpful. I won't post all the code (unless someone wants it) because it's not as nicely refactored as the original, and I still need to check whether the response of each item was 200 OK before I record it as successfully saved in my database; a sketch of that check follows. So how much faster is it?
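As a sketch of that remaining check (the names here are illustrative, not from my actual code), pairing each task with its source item makes the 200 OK filtering straightforward:
// Pair each upload with its task so responses can be matched
// back to the items that produced them.
var pairs = toUpload
    .Select(p => new { Item = p, Task = client.PutAsync(p.uri, p.jsonContent) })
    .ToList();

await Task.WhenAll(pairs.Select(x => x.Task));

// Keep only the items whose PUT came back with a success status.
var succeeded = pairs
    .Where(x => x.Task.Result.IsSuccessStatusCode)
    .Select(x => x.Item)
    .ToList();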
Results
I tested a sample of 50 web service calls from my local machine. (There is some saving of records to a SQL Database in Azure at the end).
Original Synchronous Code: 70.73 seconds
Asynchronous Code: 8.89 seconds
That's gone from 1.4146 seconds per request down to a mind-melting 0.1778 seconds per request, if you average it out!
Conclusion
My journey isn't over. I've just scratched the surface of asynchronous programming and am loving it. I now need to work out how to save only the results that have returned 200 OK. I can deserialize the HttpResponse, which returns a JSON object (with a unique ID I can look up, etc.), OR I could use the Task.WhenAny method and experiment with interleaving.
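For what it's worth, the interleaving pattern from that MSDN article looks roughly like this (a sketch: it processes each response as soon as it arrives instead of waiting for all of them):
// Kick everything off, then handle each response as it completes.
List<Task<HttpResponseMessage>> pending =
    toUpload.Select(p => client.PutAsync(p.uri, p.jsonContent)).ToList();

while (pending.Count > 0)
{
    // WhenAny completes as soon as ANY one of the tasks finishes.
    Task<HttpResponseMessage> finished = await Task.WhenAny(pending);
    pending.Remove(finished);

    HttpResponseMessage response = await finished;
    if (response.IsSuccessStatusCode)
    {
        // Record this item as saved while the rest are still in flight.
    }
}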
Is there any method for storing global variables without using cookies or Session[""] in ASP.NET MVC?
I know that cookies and Session[""] have some disadvantages, and I want to use the best method if one exists.
If they are indeed global variables, you should implement the singleton pattern and have an Instance globally accessible that holds your variables.
Here is a simple example:
public sealed class Settings
{
    private static Settings instance = null;
    static readonly object padlock = new object();

    // initialize your variables here. You can read from database for example
    Settings()
    {
        this.prop1 = "prop1";
        this.prop2 = "prop2";
    }

    public static Settings Instance
    {
        get
        {
            lock (padlock)
            {
                if (instance == null)
                {
                    instance = new Settings();
                }
                return instance;
            }
        }
    }

    // declare your global variables here
    public string prop1 { get; set; }
    public string prop2 { get; set; }
}
Then you can use them in your code like this:
var globalvar1 = Settings.Instance.prop1;
This class with its variables will be initialized only once (lazily, on the first access to Instance) and will be available globally in your application.
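As an aside (a sketch assuming .NET 4+), Lazy&lt;T&gt; gives the same thread-safe, initialize-once behavior without the explicit lock:
public sealed class Settings
{
    // Lazy<T> is thread-safe by default; the factory runs at most once.
    private static readonly Lazy<Settings> instance =
        new Lazy<Settings>(() => new Settings());

    public static Settings Instance
    {
        get { return instance.Value; }
    }

    Settings()
    {
        this.prop1 = "prop1";
        this.prop2 = "prop2";
    }

    public string prop1 { get; set; }
    public string prop2 { get; set; }
}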
Basically you have the following options:
Cookies - valid for as long as you set, must be allowed by the client's browser, can be deleted by the user, stored on the user's PC.
Session - valid for all requests, not just a single redirect; stored on the server.
ViewData - cleared after a redirect (lives only during a single request).
TempData - useful for passing short messages to the view; a value is deleted once it has been read.
ViewBag - available only during the current request; if a redirection occurs its value becomes null; it is dynamic, so you get no IntelliSense and errors may occur only at runtime.
Here you can find a fantastic article which describes them: http://www.dotnet-tricks.com/Tutorial/mvc/9KHW190712-ViewData-vs-ViewBag-vs-TempData-vs-Session.html
Sure: HttpContextBase.Application (no expiration) or HttpContextBase.Cache (with expiration). You can access the HttpContextBase instance through the HttpContext property of the Controller class.
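A minimal usage sketch inside a controller action (the key names are made up for illustration):
// No expiration: lives for the lifetime of the application domain.
HttpContext.Application["SiteName"] = "My Site";

// With expiration: evicted 10 minutes after being inserted.
HttpContext.Cache.Insert(
    "Quote", "The daily quote", null,
    DateTime.UtcNow.AddMinutes(10),
    System.Web.Caching.Cache.NoSlidingExpiration);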
So... HACK ALERT... There is no good way that I have found (yet) to do an MVC 5 or 6 web app using session variables. MVC doesn't support Session variables or the Cookies they are tracked with, and global variables will be set for ALL users, which is not how Session variables work.
However, you can store "session variables" keyed on User.Identity.Name or the underlying User.Identity.Claims AspNet.Identity.SecurityStamp in a database along with a timestamp, and voila! You have implemented primitive session variables. I had a very specific need to save two weeks of programming by not interfering with the GUI that our user interface specialist had written. So I returned NoContent() instead of the normal View(), and I saved my hacky session variable keyed on the user's login name.
Am I recommending this for most situations? No. You can use ViewBag or return View(model) and it will work just fine. But if you need to save session variables in MVC for whatever reason, the code below works and is in production.
To retrieve the data...
string GUID = merchdata.GetGUIDbyIdentityName(User.Identity.Name);
internal string GetGUIDbyIdentityName(string name)
{
    string retval = string.Empty;
    try
    {
        using (var con = new SqlConnection(Common.DB_CONNECTION_STRING_BOARDING))
        {
            con.Open();
            using (var command = new SqlCommand("select GUID from SessionVariablesByIdentityName md where md.IdentityName = '" + name + "' and LastSaved > getdate() - 1", con))
            {
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        retval = reader["GUID"].ToString();
                    }
                }
            }
        }
    }
    catch (Exception ex)
    {
    }
    return retval;
}
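A safer variant of the same lookup uses a SQL parameter instead of string concatenation (a sketch; it guards against SQL injection, since name arrives from identity data):
using (var con = new SqlConnection(Common.DB_CONNECTION_STRING_BOARDING))
using (var command = new SqlCommand(
    "select GUID from SessionVariablesByIdentityName md " +
    "where md.IdentityName = @name and LastSaved > getdate() - 1", con))
{
    // The parameter keeps user-controlled input out of the SQL text.
    command.Parameters.AddWithValue("@name", name);
    con.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            retval = reader["GUID"].ToString();
        }
    }
}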
To save the data...
merchdata.SetGUIDbyIdentityName(User.Identity.Name, returnedGUID);
internal void SetGUIDbyIdentityName(string name, string returnedGUID)
{
    RunSQL("exec CRUDSessionVariablesByIdentityName @GUID='" + returnedGUID + "', @IdentityName = '" + name + "'");
}
internal void RunParameterizedSQL(SqlConnection cn, SqlCommand cmd, object sqlStr)
{
    try
    {
        cn.Open();
        cmd.ExecuteNonQuery();
        cn.Close();
    }
    catch (Exception ex)
    {
        // Errors are swallowed here, as in the lookup above.
    }
}
BTW: the SQL table (named SessionVariablesByIdentityName here) is fairly straightforward and can store lots of other things too. I have a LastSaved datetime field in there so I don't bother retrieving stale data from yesterday, for example.
I have to create a mechanism to store and read preferences (control default values and settings) per user. I have a problem related to network traffic, as the database can be accessed over the internet and the application server is sometimes connected over a poor 512 Kbps internet connection.
My application can have around 50 simultaneous users, and each page/form can have up to 50 items (preferences). The number of pages is around 80.
So, from a performance perspective, which should I choose to decrease network traffic: Session or cache?
UPDATE
I've created two sample pages, one using cache and another using session.
Load test: 90 users
Stored content: 1000 elements, with a 20-character value in each element
Here are the results from each test case:
MemoryCache
330,725,246 total bytes allocated
Functions Allocating Most Memory (Name / Bytes %):
System.Runtime.Caching.MemoryCache.Set(string,object,class System.Runtime.Caching.CacheItemPolicy,string)  34.74
System.Web.UI.Page.ProcessRequest(class System.Web.HttpContext)  18.39
System.String.Concat(string,string)  12.65
System.String.Join(string,string[])  5.31
System.Collections.Generic.Dictionary`2.Add(!0,!1)  4.42
Source code:
protected void Page_Load(object sender, EventArgs e)
{
    outputPanel.Text = String.Join(System.Environment.NewLine, ReadEverything().ToArray());
}

private IEnumerable<String> ReadEverything()
{
    for (int i = 0; i < 1000; i++)
    {
        yield return ReadFromCache(i);
    }
}

private string ReadFromCache(int p)
{
    String saida = String.Empty;
    ObjectCache cache = MemoryCache.Default;
    Dictionary<int, string> cachedItems = cache["user" + Session.SessionID] as Dictionary<int, string>;
    if (cachedItems == null)
    {
        cachedItems = new Dictionary<int, string>();
    }
    if (!cachedItems.TryGetValue(p, out saida))
    {
        saida = Util.RandomString(20);
        cachedItems.Add(p, saida);
        CacheItemPolicy policy = new CacheItemPolicy();
        policy.AbsoluteExpiration = DateTimeOffset.Now.AddSeconds(30);
        cache.Set("user" + Session.SessionID, cachedItems, policy);
    }
    return saida;
}
Session
111,625,747 total bytes allocated
Functions Allocating Most Memory (Name / Bytes %):
System.Web.UI.Page.ProcessRequest(class System.Web.HttpContext)  55.19
System.String.Join(string,string[])  15.93
System.Collections.Generic.Dictionary`2.Add(!0,!1)  6.00
System.Text.StringBuilder.Append(char)  5.93
System.Linq.Enumerable.ToArray(class System.Collections.Generic.IEnumerable`1)  4.46
Source code:
protected void Page_Load(object sender, EventArgs e)
{
    outputPanel.Text = String.Join(System.Environment.NewLine, ReadEverything().ToArray());
}

private IEnumerable<String> ReadEverything()
{
    for (int i = 0; i < 1000; i++)
    {
        yield return ReadFromSession(i);
    }
}

private string ReadFromSession(int p)
{
    String saida = String.Empty;
    Dictionary<int, string> cachedItems = Session["cachedItems"] as Dictionary<int, string>;
    if (cachedItems == null)
    {
        cachedItems = new Dictionary<int, string>();
    }
    if (!cachedItems.TryGetValue(p, out saida))
    {
        saida = Util.RandomString(20);
        cachedItems.Add(p, saida);
        Session["cachedItems"] = cachedItems;
    }
    return saida;
}
I forgot to mention that I'm creating a solution to work with both ASP.NET and WPF projects; however, if Session turns out to be far better than the MemoryCache option, I can have different solutions for each platform.
Both are really the same; they are in memory. If you are using DB-backed session state and you have a poor connection, then you should read from the cache when present, load from the DB when not, and then cache the result.
I would consider the session a cache mechanism; unlike the others, it is specific to a single browser session. My approach would consider the following questions.
Is this site load balanced? If it is, using the session will force sticky sessions, possibly causing issues if you take down a server.
Is this data user specific? If it is, the session is an easy way to segregate data without lots of key manipulation. It also has the benefit that when a user's session times out, it automatically gets cleaned up. If it isn't, I'd recommend using the MemoryCache feature added in .NET 4.0. It supports expiration.
How will this cache become outdated? If user A can modify data cached for user B, you're now serving dirty data with the session cache. This calls for a shared cache mechanism.
Edit: Once these questions have been answered, you should be able to decide what type of cache mechanism is appropriate for your situation. Then you can evaluate that subset for performance.
So I just fixed a bug in a framework I'm developing. The pseudo-pseudocode looks like this:
myoldObject = new MyObject { someValue = "old value" };
cache.Insert("myObjectKey", myoldObject);
myNewObject = cache.Get("myObjectKey");
myNewObject.someValue = "new value";
if (myoldObject.someValue != cache.Get("myObjectKey").someValue)
    myoldObject.SaveToDatabase();
So, essentially, I was getting an object from the cache and later comparing the original object to the cached object to see if I needed to save it to the database in case it had changed. The problem arose because the original object is a reference, so changing someValue also changed the referenced cached object, which meant it would never save back to the database. I fixed it by cloning the object off of the cached version, severing the reference and allowing me to compare the new object against the cached one.
My question is: is there a better way to do this, some pattern, that you could recommend? I can't be the only person that's done this before :)
Dirty tracking is the normal way to handle this, I think. Something like:
class MyObject {
    private string _someValue;

    public string SomeValue {
        get { return _someValue; }
        set {
            if (value != _someValue) {
                IsDirty = true;
                _someValue = value;
            }
        }
    }

    public bool IsDirty {
        get;
        private set;
    }

    // Assumes a base type that implements the actual persistence.
    void SaveToDatabase() {
        base.SaveToDatabase();
        IsDirty = false;
    }
}
myoldObject = new MyObject { SomeValue = "old value" };
cache.Insert("myObjectKey", myoldObject);
myNewObject = cache.Get("myObjectKey");
myNewObject.SomeValue = "new value";
if (myNewObject.IsDirty)
    myNewObject.SaveToDatabase();
I've done similar things, but I got around it by cloning too. The difference is that I had the cache do the cloning. When you put an object into the cache, the cache clones it first and stores the cloned version (so you can mutate the original object without poisoning the cache). When you get an object from the cache, the cache returns a clone of the object instead of the stored object (again, so the caller can mutate the object without affecting the cached/canonical object).
I think that this is perfectly acceptable as long as the data you're storing/duping is small.
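A minimal sketch of that idea (assuming the cached types implement ICloneable; the class name is made up):
public class CloningCache
{
    private readonly Dictionary<string, ICloneable> _store =
        new Dictionary<string, ICloneable>();

    public void Insert(string key, ICloneable value)
    {
        // Store a private copy so later mutations of the caller's
        // object cannot poison the cached version.
        _store[key] = (ICloneable)value.Clone();
    }

    public object Get(string key)
    {
        // Hand back a copy so the caller cannot mutate the canonical version.
        return _store[key].Clone();
    }
}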
A little improvement on Mark's answer when using LINQ:
When using LINQ, fetching entities from the DB will mark every object as IsDirty.
I made a workaround for this by not setting IsDirty when the value has not been initialized yet; in this instance, when it is null. For ints, I set the original value to -1 and then checked for that. This will not work, however, if a legitimately saved value is the same as the uninitialized value (null in my example).
private string _name;

[Column]
public string Name
{
    get { return _name; }
    set
    {
        if (value != _name)
        {
            if (_name != null)
            {
                IsDirty = true;
            }
            _name = value;
        }
    }
}
This could probably be improved further by only setting IsDirty after initialization has finished; a sketch of that idea follows.
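One way (a sketch; I have not tested this against LINQ to SQL's materialization order) is to suppress dirty tracking until the entity has been fully loaded:
private bool _tracking;

// Call this once the entity has been fully materialized.
public void BeginTracking()
{
    _tracking = true;
}

[Column]
public string Name
{
    get { return _name; }
    set
    {
        if (value != _name)
        {
            if (_tracking)
            {
                IsDirty = true;
            }
            _name = value;
        }
    }
}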