I have a function called on every single page:
/// <summary>
/// Gets the date of the latest blog entry
/// </summary>
public static DateTime GetNewestBlogDate()
{
DateTime ReturnDate = DateTime.Now.AddDays(30);
using (var db = new DataClassesDataContext())
{
var q = (from d in db.tblBlogEntries orderby d.date descending select new {d.date}).FirstOrDefault();
if (q != null)
ReturnDate = q.date;
}
return ReturnDate;
}
It works like this website does: it gets the latest blog entry date, and if that's newer than the value in the user's cookie, it displays a "new" icon next to the blog link.
It seems rather wasteful to call this function on every single page request, one query per request. Say you have 30,000 page views per day; that's 1,250 database queries per hour.
Is there any way I can cache this result and have it expire, say, every hour?
I'm aware it's a bit of a micro-optimisation, but given 10 or so similar functions per page it might add up to something worthwhile. You could denormalise it all into a single table and return it in one go, but I'd rather cache if possible, as it's easier to manage.
Since the result isn't based on the user (the cookie is, but the query isn't), you can just use the standard ASP.NET Cache.
Just insert the result with an expiration of one hour. If you like, you can even use the removal callback to automatically refresh the cache.
Assuming the data is stored in MS SQL Server, you could even use a SqlCacheDependency to invalidate the entry when new data is inserted. Or, if your inserting code is well-factored, you could manually invalidate the cache there.
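A minimal sketch of the SqlCacheDependency approach, assuming SQL cache notifications have been enabled for the database and table with aspnet_regsql.exe, and that a matching <sqlCacheDependency> entry (hypothetically named "BlogDb" here) exists in web.config:
using System;
using System.Web;
using System.Web.Caching;

public static void CacheNewestBlogDate(DateTime newestDate)
{
    // "BlogDb" must match the <sqlCacheDependency> database entry in
    // web.config; tblBlogEntries must be notification-enabled.
    var dependency = new SqlCacheDependency("BlogDb", "tblBlogEntries");

    // The cached entry is evicted automatically whenever rows in
    // tblBlogEntries change, so the next read re-queries the database.
    HttpContext.Current.Cache.Insert("NewestBlogDate", newestDate, dependency);
}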
Just use the ASP.NET Cache object with an absolute expiration of 1 hour. Here's an example of how you might implement this:
public static DateTime GetNewestBlogDate()
{
HttpContext context = HttpContext.Current;
DateTime returnDate = DateTime.Now.AddDays(30); // fallback when nothing is cached and the table is empty
string key = "SomeUniqueKey"; // You can use something like "[UserName]_NewestBlogDate"
object cacheObj = context.Cache[key];
if (cacheObj == null)
{
using (var db = new DataClassesDataContext())
{
var q = (from d in db.tblBlogEntries orderby d.date descending select new { d.date }).FirstOrDefault();
if (q != null)
{
returnDate = q.date;
context.Cache.Insert(key, returnDate, null, DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
}
}
}
else
{
returnDate = (DateTime)cacheObj;
}
return returnDate;
}
You haven't indicated what is done with the returned value. If it's displayed the same way on each page, why not place the code, along with the markup that displays the result, in a user control (ASCX) file? You can then cache the control's output (fragment caching).
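For example, a minimal fragment-caching sketch (hypothetical control; the OutputCache directive caches the rendered markup for an hour):
<%@ Control Language="C#" %>
<%@ OutputCache Duration="3600" VaryByParam="none" %>
<%-- markup that renders the "new" icon / newest blog date goes here --%>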
Make it a WebMethod with a CacheDuration? Note the duration is specified in seconds.
[WebMethod(CacheDuration=60)]
public static DateTime GetNewestBlogDate()
Scenario: deactivate any user whose last login date is more than 42 days before today. I have a user whose last login date is 1/22/2020 (US date format; 22/1/2020 elsewhere) 5:12 pm. I wrote a batch Apex class to do the deactivation. The code executed successfully and the batch status is Completed, but the user record is not deactivated.
Here is the code:
global class User_Deactivation implements Database.Batchable<SObject>
{
dateTime dt = date.today()-42;
public String query = 'SELECT Name, LastLoginDate, Id From User WHERE IsActive = true AND LastLoginDate=:dt ';
global Database.querylocator start(Database.BatchableContext bc)
{
return Database.getQueryLocator(query);
}
global void execute(Database.BatchableContext bc,List<User> scope)
{
List<User> userList = new List<User>();
for(User s:scope)
{
User u =(user)s;
userList.add(u);
}
if(userList.size() > 0)
{
for(User usr : userList)
{
usr.isActive = false;
}
}
update userList;
}
global void finish(Database.BatchableContext bc)
{
AsyncApexJob a = [SELECT Id, Status, NumberOfErrors, JobItemsProcessed, TotalJobItems, CreatedBy.Email
FROM AsyncApexJob
WHERE Id = :BC.getJobId()];
Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
String[] toAddresses = new String[] {a.CreatedBy.Email};
mail.setToAddresses(toAddresses);
mail.setSubject('Apex Job Status: ' + a.Status);
mail.setPlainTextBody('The batch Apex job processed ' + a.TotalJobItems + ' batches with '+ a.NumberOfErrors + ' failures.');
Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
}
}
Please help me out with this.
Multiple things you can improve here, where do I begin...
Initialisation(?) piece
dateTime dt = date.today()-42;
String query = 'SELECT Name, LastLoginDate, Id From User WHERE IsActive = true AND LastLoginDate=:dt';
Do you need a Date or a DateTime match? The way you wrote it, it'll match only people who logged in exactly at midnight: System.debug(dt); would say 2020-01-23T00:00:00.000Z. It shouldn't be an equals sign; it should be "less than" or "less than or equal".
Or even better, you can make it a bit more clear what you want to do, a bit more "semantic", so the poor soul who ends up maintaining it can understand it without extra comments. This reads more naturally and uses SOQL date literals, special "constants" that simplify your logic: SELECT Id, LastLoginDate FROM User WHERE IsActive = true AND LastLoginDate != LAST_N_DAYS:42
What is this section of code anyway? It's not really static variables, and it's not a constructor... I think it'll behave as a constructor. Be very, very careful with constructors for batches: the state of the class at the end of the constructor gets saved (serialised) and restored every time the class is scheduled to run. It's tempting to put initialisation code into the constructor, maybe read some custom settings, precalculate stuff... but then you'll be in for a nasty surprise when an admin adds a new custom setting and the batch doesn't pick it up. In your case it's even worse: I suspect it'll serialise dt, and your today() will be frozen in time, not what you expected. To be safe, move all initialisation logic into start().
And I'd even say whoever gave you the requirement didn't think it through. When you create a new user, they get a link they need to click within the next 72 hours. If they didn't do it (maybe it was sent late on Friday and they want to log in on Monday), this thing will dutifully kill their access on Friday night without giving them any chance to log in. You need some "grace period", maybe something like WHERE IsActive = true AND (LastLoginDate < :x OR (LastLoginDate = null AND CreatedDate < :x))
start()
Queries in strings work, and that's how a lot of batch documentation is written, but they are poor practice. Where possible, use a compiled query, in brackets. You get a minimal improvement in execution (precompiled), you get compile-time errors when you mess up (better than a runtime error you might not notice if you don't monitor jobs), and most importantly, if somebody wants to delete a field, Salesforce will detect the dependency and stop them. Use return Database.getQueryLocator([SELECT ...]); wherever you can.
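Putting those two points together, a minimal sketch of start() with the cutoff computed at run time and a compiled query (the comparison follows the "less than" fix from above):
global Database.QueryLocator start(Database.BatchableContext bc)
{
    // Evaluated fresh on every run, not frozen at serialisation time.
    DateTime cutoff = DateTime.now().addDays(-42);
    return Database.getQueryLocator([
        SELECT Id, LastLoginDate
        FROM User
        WHERE IsActive = true AND LastLoginDate < :cutoff
    ]);
}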
execute()
Your scope is already a list of users, so why the extra casts to User? Why add them to a helper list? Why two loops?
for(User u : scope){
u.isActive = false;
}
update scope;
and you're done?
P.S. Why "global" all over the place?
I need some clarity about session state and how to add objects to it, because I think I'm doing it the wrong way.
First I create a session to hold a list of Products:
Session["ShoppingCart"] = new List<Products>();
To add Products to the list, I do like this:
Session["ShoppingCart"] = new Products { ID = productId, Name = name };
I guess this isn't the right way?
I guess this isn't the right way?
Yes, this isn't the right way (please skip to the last paragraph of my answer for what I consider the correct way, which is not to use ASP.NET session at all). The correct way here is to first get the object you stored inside the session by casting it to the same .NET type you stored:
var products = Session["ShoppingCart"] as List<Products>;
and then, if this item is not null, add the corresponding product to the list. We should of course check that the session actually contained a value with the specified key and that this value is of the expected type:
if (products != null)
{
var product = new Products { ID = productId, Name = name };
products.Add(product);
}
Of course, we are using object references here, which will only work if you are storing your session in memory (sessionState mode="InProc"), which is a terrible idea and something you should never do in production. In a production environment you are probably persisting your session in a state server or even SQL Server, aren't you? In that case it is more than obvious that working with object references is a recipe for disaster, so once you have added the new product you should set the new list value back into the session, which will serialize the object instance to the corresponding data store:
if (products != null)
{
var product = new Products { ID = productId, Name = name };
products.Add(product);
Session["ShoppingCart"] = products;
}
Now, after all this being said, I must admit that using ASP.NET Session is probably one of the biggest mistakes you can commit in a real-world application. So basically, every time you use Session["xxx"] you are doing it wrong. Simply search the entire solution for the Session keyword and get rid of it.
In order to add items to an existing list in the Session, you must first retrieve the list, add the object to it, then store it back. Here's an example (the first line is the one-time initialization, e.g. in Session_Start):
Session["ShoppingCart"] = new List<Products>(); // once, at session start
List<Products> productsList = (List<Products>)Session["ShoppingCart"];
productsList.Add(new Products { ID = productId, Name = name });
Session["ShoppingCart"] = productsList;
I found this behaviour by accident: I return the count of items in a session in an error message and found that some sessions had as many as 120 items in them (they should have one!). On further investigation I found that every request seems to add an item to the session. They are all negative integers, like -710, -140, -528. I can't see a pattern in which numbers come up.
I have checked my code for any interactions with the Session object, and as far as I can tell it is not me. I store one item in the session, which is my own object with a number of properties on it. My session state is SQL Server, and I am only serialising the specific values that need to be kept.
Has anyone seen anything like this or has any advice on where I can troubleshoot further?
Thank you in advance.
-- Edit, as requested. First, where I count the items in the session; this is done in the Page_Load event of my master page. I loop through so I can inspect each item in the debugger.
int itemCount = Session.Count;
for (int i = 0; i < itemCount; i++)
{
object o = Session[i];
}
-- here is where I add my custom object to the session. This is called at session start and in my master page. It runs on a "get, but if not there, create" principle.
HttpSessionState Session = HttpContext.Current.Session;
HttpRequest Request = HttpContext.Current.Request;
if (Session == null)
return null;
SessionData sessionData = (SessionData)Session[StaticNames.SESSION_NAME];
if (sessionData == null)
{
sessionData = new SessionData();
Session.Add(StaticNames.SESSION_NAME, sessionData);
}
I also have this deserialization constructor, which rebuilds the SessionData object from the session store:
public SessionData(SerializationInfo info, StreamingContext ctxt)
{
this.IsManualLogin = (bool)info.GetValue("IsManualLogin", typeof(bool));
this.HasAskedUserForLocation = (bool)info.GetValue("HasAskedUserForLocation", typeof(bool));
// ... etc, more items for all users here
int? loginID = null;
try
{
loginID = info.GetInt32("LoginID");
}
catch
{
// No LoginID was serialized (user not logged in), so the
// remaining logged-in-only values are skipped.
return;
}
this.LoginID = loginID.Value;
// ... etc, more items for logged in users only
}
There is also an equivalent method for adding this data to the SerializationInfo used for SqlSessionState.
Credit to the modest jadarnel27.
It turns out the Ajax Control Toolkit NoBot control adds an integer to your session on every request. My website auto-refreshes every 40 seconds, similar to Facebook, so this would probably have brought the whole thing crashing down at some point; I am lucky to have found it now. Should anyone else consider using the NoBot control, be warned about this behaviour!
I'm reading an ASPX file as a string and using the returned HTML as the source for an email message. This is the code:
public string GetEmailHTML(int itemId)
{
string pageUrl = "HTMLEmail.aspx";
StringWriter stringWriter = new StringWriter();
HttpRuntime.ProcessRequest(new SimpleWorkerRequest(pageUrl, "ItemId=" + itemId.ToString(), stringWriter));
stringWriter.Flush();
stringWriter.Close();
return stringWriter.ToString();
}
HTMLEmail.aspx uses the ItemId query string variable to load data from a DB and populate the page with results. I need to secure the HTMLEmail.aspx page so a manipulated query string isn't going to allow just anybody to see the results.
I store the current user like this:
public User AuthenticatedUser
{
get { return Session["User"] as User; }
set { Session["User"] = value; }
}
Because the page request isn't made directly by the browser, but rather by the SimpleWorkerRequest, no SessionId is posted, and therefore HTMLEmail.aspx cannot access any session variables. At least, I think that's the problem.
I've read the overview on session variables here: http://msdn.microsoft.com/en-us/library/ms178581.aspx
I'm wondering if I need to implement a custom session identifier. I can get the current SessionId inside the GetEmailHTML method and pass it as a query string parameter into HTMLEmail.aspx. If I have the SessionId inside HTMLEmail.aspx, I could maybe use the custom session identifier to get access to the session variables.
That fix sounds messy. It also removes the encryption layer ASP.NET automatically applies to the SessionId.
Anyone have a better idea?
As far as I can see, your best bet is to pass all the values HTMLEmail.aspx needs via query parameters, just like you do with ItemId.
Apart from that, you can probably get away with just passing in the UserId and having the page hit the DB (or wherever you are storing your users) to load the User object, instead of trying to read it off the session.
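A minimal sketch of that, assuming your User object exposes an Id property (a hypothetical name):
public string GetEmailHTML(int itemId)
{
    string pageUrl = "HTMLEmail.aspx";
    // Pass everything the page needs on the query string; the worker
    // request carries no cookies, so session state is unavailable.
    string query = "ItemId=" + itemId + "&UserId=" + AuthenticatedUser.Id;
    StringWriter stringWriter = new StringWriter();
    HttpRuntime.ProcessRequest(new SimpleWorkerRequest(pageUrl, query, stringWriter));
    stringWriter.Flush();
    stringWriter.Close();
    return stringWriter.ToString();
}
HTMLEmail.aspx would still need to verify that the given UserId may view the given ItemId, since query strings can be manipulated.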
Edit:
Why don't you use:
public string GetEmailHTML(int itemId)
{
string pageUrl = "HTMLEmail.aspx";
StringWriter stringWriter = new StringWriter();
Server.Execute(pageUrl, stringWriter);
stringWriter.Flush();
stringWriter.Close();
return stringWriter.ToString();
}
instead? As far as I can see, Server.Execute runs the page within the current HTTP request, so it shares the same context, including session state.
I have an ASP.NET application that caches some business objects. When a new object is saved, I call remove on the key to clear the objects. The new list should be lazy loaded the next time a user requests the data.
Except there is a problem with different views of the cache in different clients.
Two users are browsing the site
A new object is saved by user 1 and the cache is removed
User 1 sees the up to date view of the data
User 2 is also using the site but, for some reason, does not see the new cached data after user 1 has saved a new object; they continue to see the old list.
This is a shortened version of the code:
public static JobCollection JobList
{
get
{
if (HttpRuntime.Cache["JobList"] == null)
{
GetAndCacheJobList();
}
return (JobCollection)HttpRuntime.Cache["JobList"];
}
}
private static void GetAndCacheJobList()
{
using (DataContext context = new DataContext(ConnectionUtil.ConnectionString))
{
var query = from j in context.JobEntities
select j;
JobCollection c = new JobCollection();
foreach (JobEntity i in query)
{
Job newJob = new Job();
....
c.Add(newJob);
}
HttpRuntime.Cache.Insert("JobList", c, null, Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration, CacheItemPriority.Default, null);
}
}
public static void SaveJob(Job job, IDbConnection connection)
{
using (DataContext context = new DataContext(connection))
{
JobEntity ent = new JobEntity();
...
context.JobEntities.InsertOnSubmit(ent);
context.SubmitChanges();
HttpRuntime.Cache.Remove("JobList");
}
}
Does anyone have any ideas why this might be happening?
Edit: I am using Linq2SQL to retrieve the objects, though I am disposing of the context.
I would ask you to make sure you do not have multiple production servers for load-balancing purposes. Each server has its own in-memory cache, so in that case you will have to use some external dependency architecture for invalidating/removing the cache items across all of them.
That's because you don't synchronize cache operations. You should lock when writing your list to the cache (possibly even get the list inside the lock) and also when removing it from the cache. Otherwise, even if reading and writing are individually synchronized, there's nothing to prevent storing the old list right after your call to Remove.
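A minimal double-checked-locking sketch of that idea, based on the JobList property from the question (GetAndCacheJobList stays as posted):
private static readonly object _jobListLock = new object();

public static JobCollection JobList
{
    get
    {
        JobCollection list = (JobCollection)HttpRuntime.Cache["JobList"];
        if (list == null)
        {
            lock (_jobListLock)
            {
                // Re-check inside the lock: another thread may have
                // repopulated the cache while this one was waiting.
                list = (JobCollection)HttpRuntime.Cache["JobList"];
                if (list == null)
                {
                    GetAndCacheJobList();
                    list = (JobCollection)HttpRuntime.Cache["JobList"];
                }
            }
        }
        return list;
    }
}
SaveJob should take the same lock around its Remove call, so a stale list can't be inserted between the removal and the next rebuild.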
I would also check, if you haven't already, that the old data they're seeing hasn't been somehow cached in ViewState.
You have to make sure that User 2 actually sent a new request. Maybe the content they saw came from their browser's cache, not from the cache on your server.
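If that turns out to be the cause, a quick way to rule it out is to disable client-side caching on the pages that render the list (a minimal sketch):
// In Page_Load (or an HTTP module) of the pages showing the job list:
Response.Cache.SetCacheability(HttpCacheability.NoCache);
Response.Cache.SetNoStore();
Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(-1));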