Session is timing out on the drop down selected index change

Hello Everyone,
I am facing a weird problem here. I have a report page with a drop-down list of years. When the user selects, say, year 2009, I display the report for 2009 data. The code is given below. The website is now live on our web server. The page accesses heavy data, so it sometimes takes a minute or more to load the report for the selected year, and when that happens my session expires and the user is redirected to the default page. The same thing works fine in the solution on my machine and on one of our local servers; it just does not work on our live server. Please help me by posting solutions if you know any.
I have also placed this line in my web.config, but it is not helping.
Code:
protected void ddlYear_SelectedIndexChanged(object sender, EventArgs e)
{
    if (Session["UserId"] != null)
    {
        Session["IsDetailedReportLoaded"] = false;
        Session["IsScoreCardLoaded"] = false;
        Session["IsChartLoaded"] = false;
        Session["IsReportLoaded"] = false;

        string strYear = ddlYear.SelectedValue;
        LoadReport(Convert.ToInt16(strYear));

        lblYear.Text = strYear;
        lblAsOf.Text = strYear;

        ddlYearDetail.SelectedValue = ddlYear.SelectedValue;
        ddlYearScorecard.SelectedValue = ddlYear.SelectedValue;
        ddlYearGraph.SelectedValue = ddlYear.SelectedValue;

        mpeLoading.Hide();
    }
    else
    {
        Response.Redirect("Default.aspx");
    }
}
Thanks,
Satish k.

One possible problem could be that the web server is running out of memory and forcing the app pool to recycle, which would flush the InProc session memory. You could try using SQL Server session state instead and see if that resolves the problem. Try monitoring the web-server processes to see whether they are recycling frequently.
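For reference, a minimal web.config sketch of switching to SQL Server session state (the server name is a placeholder, and the ASPState database has to be created first with the aspnet_regsql.exe tool):

```xml
<system.web>
  <!-- Out-of-process session state survives app-pool recycles -->
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=MySessionDbServer;Integrated Security=SSPI"
                timeout="20" />
</system.web>
```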

You can place an
if (Session.IsNew)
check in your code and redirect or stop code execution appropriately.
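As a sketch, such a guard at the top of the report page's Page_Load might look like this (the session keys follow the original post; this is an illustration, not the poster's actual code):

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // A brand-new session means the old one (and keys like "UserId") was lost,
    // e.g. after a timeout or an app-pool recycle, so stop before using Session.
    if (Session.IsNew || Session["UserId"] == null)
    {
        Response.Redirect("Default.aspx");
        return;
    }
    // ... normal page logic ...
}
```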

I would check the Performance tab in IIS to see whether a bandwidth threshold is set:
Right-click the website in IIS
Open the Performance tab
Check the "Bandwidth throttling" limit
If a threshold is set, you might be hitting the maximum bandwidth (KB per second) limit. Either disable bandwidth throttling or increase the limit.

Related

ASP.NET Session variable expires much faster

I set a session variable at login:
HttpContext.Current.Session["user_key"] = res; //being some string eg: "asd"
HttpContext.Current.Session.Timeout = 60;
Just in case I also have:
<system.web>
  <sessionState timeout="60"></sessionState>
</system.web>
Then I need to check for the user and get some data for their ID on pretty much every page, in every Page_Load:
if (HttpContext.Current.Session["user_key"] != null)
{
    sesvar = (string)HttpContext.Current.Session["user_key"];
}
else
{
    HttpContext.Current.Response.Redirect("/login/");
}
This works for the most part, but it is definitely not 60 minutes. I'd get "kicked" (redirected to login) every now and then and couldn't figure out why.
Also, the project is worked on and maintained through Dreamweaver. Being a WebSite project, it is not compiled in any way and is live on an IIS server.
It turned out to be a function in our database running every hour which "cleaned" the login table, where it shouldn't have.

Understanding the JIT; slow website

First off, this question has been covered a few times (I've done my research), and, for example, on the right side of the SO webpage is a list of related items... I have been through them all (or as many as I could find).
When I publish my pre-compiled .NET web application, it is very slow to load the first time.
I've read up on this; it's the JIT, which I understand (sort of).
The problem is, after the home page loads (up to 20 seconds), many other pages load very fast.
However, it would appear that the only reason they load quickly is that the resources have already been loaded (or that they share the same compiled DLLs). Even so, some pages still take a long time.
This indicates that maybe the JIT needs to compile different pages in different ways? If so, and using a contact form as an example (where the Thank You page needs to be compiled by the JIT and first time is slow), the user may hit the send button multiple times whilst waiting for the page to be shown.
After I load all these pages which use different models or different shared HTML content, the site loads quickly as expected. I assume this issue is a common problem?
Please note, I'm using .NET 4.0 but, there is no database, XML files etc. The only IO is if an email doesn't send and it writes the error to a log.
So, assuming my understanding is correct, what is the approach to not have to manually go through the website and load every page?
If the above is a little too broad, then can this be resolved in the settings/configuration in Visual Studio (2012) or the web.config file (excluding adding compilation debug=false)?
In this case, there were two problems.
As per rene's comments, review http://msdn.microsoft.com/en-us/library/ms972959.aspx. The helpful part was to add the following code to the global.asax file:
const string sourceName = ".NET Runtime";
const string serverName = ".";
const string logName = "Application";
const string uriFormat = "\r\n\r\nURI: {0}\r\n\r\n";
const string exceptionFormat = "{0}: \"{1}\"\r\n{2}\r\n\r\n";

void Application_Error(Object sender, EventArgs ea)
{
    StringBuilder message = new StringBuilder();
    if (Request != null)
    {
        message.AppendFormat(uriFormat, Request.Path);
    }
    if (Server != null)
    {
        for (Exception e = Server.GetLastError(); e != null; e = e.InnerException)
        {
            message.AppendFormat(exceptionFormat, e.GetType().Name, e.Message, e.StackTrace);
        }
    }
    if (!EventLog.SourceExists(sourceName))
    {
        EventLog.CreateEventSource(sourceName, logName);
    }
    EventLog log = new EventLog(logName, serverName, sourceName);
    log.WriteEntry(message.ToString(), EventLogEntryType.Error);
    // Server.ClearError(); // uncomment this to cancel the error
}
The server was maxing out while sending the email! My code was fine, but Task Manager showed memory hitting 100%...
The solution was to monitor the errors surfaced by point 1 and fix them, then find out why the server was being throttled when sending an email!

Profiling ASP.net applications over the long term?

What is the accepted way to instrument a web-site to record execution statistics?
How long it takes to X
For example, I want to know how long it takes to perform some operation, e.g. validating the user's credentials with the Active Directory server:
authenticated = CheckCredentials(Login1.UserName, Login1.Password);
A lot of people will suggest using tracing of various kinds to output, log, or record the interesting performance metrics:
var sw = new System.Diagnostics.Stopwatch();
sw.Start();
authenticated = CheckCredentials(Login1.UserName, Login1.Password);
sw.Stop();
//write a number to a log
WriteToLog("TimeToCheckCredentials", sw.ElapsedTicks);
Not an X; all X
The problem with this is that I'm not interested in how long it took to validate one user's credentials against Active Directory. I'm interested in how long it took to validate thousands of users' credentials against Active Directory:
var sw = new System.Diagnostics.Stopwatch();
sw.Start();
authenticated = CheckCredentials(Login1.UserName, Login1.Password);
sw.Stop();

timeToCheckCredentialsSum += sw.ElapsedTicks;
timeToCheckCredentialsCount += 1;
if ((sw.ElapsedTicks < timeToCheckCredentialsMin) || (timeToCheckCredentialsMin == 0))
    timeToCheckCredentialsMin = sw.ElapsedTicks;
if ((sw.ElapsedTicks > timeToCheckCredentialsMax) || (timeToCheckCredentialsMax == 0))
    timeToCheckCredentialsMax = sw.ElapsedTicks;

oldMean = timeToCheckCredentialsAverage;
newMean = timeToCheckCredentialsSum / timeToCheckCredentialsCount;
timeToCheckCredentialsAverage = newMean;
if (timeToCheckCredentialsCount > 2)
{
    timeToCheckCredentialsVariance =
        ((timeToCheckCredentialsCount - 2) * timeToCheckCredentialsVariance
         + (sw.ElapsedTicks - oldMean) * (sw.ElapsedTicks - newMean))
        / (timeToCheckCredentialsCount - 1);
}
else
{
    timeToCheckCredentialsVariance = 0;
}
Which is a lot of boilerplate code that can easily be abstracted away into:
var sw = new System.Diagnostics.Stopwatch();
sw.Start();
authenticated = CheckCredentials(Login1.UserName, Login1.Password);
sw.Stop();
//record the sample
Profiler.AddSample("TimeToCheckCredentials", sw.ElapsedTicks);
Which is still a lot of boilerplate code that can be abstracted into:
Profiler.Start("TimeToCheckCredentials");
authenticated = CheckCredentials(Login1.UserName, Login1.Password);
Profiler.Stop("TimeToCheckCredentials");
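A minimal in-memory implementation of that Profiler API might look like the following. This is a sketch, not an existing library; it keeps count, min, max, mean, and variance per sample name, using the same running-variance update as the boilerplate above, with a lock to guard against concurrent requests:

```csharp
using System.Collections.Generic;
using System.Diagnostics;

public static class Profiler
{
    private class Stats
    {
        public long Count, Min, Max;
        public double Mean, Variance;
        public Stopwatch Watch = new Stopwatch();
    }

    private static readonly Dictionary<string, Stats> samples = new Dictionary<string, Stats>();
    private static readonly object sync = new object();

    public static void Start(string name)
    {
        lock (sync)
        {
            if (!samples.ContainsKey(name)) samples[name] = new Stats();
            samples[name].Watch.Restart();
        }
    }

    public static void Stop(string name)
    {
        lock (sync)
        {
            Stats s = samples[name];
            s.Watch.Stop();
            AddSample(s, s.Watch.ElapsedTicks);
        }
    }

    private static void AddSample(Stats s, long ticks)
    {
        s.Count++;
        if (ticks < s.Min || s.Count == 1) s.Min = ticks;
        if (ticks > s.Max) s.Max = ticks;

        // Running mean/variance, same recurrence as the boilerplate above
        double oldMean = s.Mean;
        s.Mean += (ticks - oldMean) / s.Count;
        s.Variance = s.Count > 1
            ? ((s.Count - 2) * s.Variance + (ticks - oldMean) * (ticks - s.Mean)) / (s.Count - 1)
            : 0;
    }
}
```

Note that keeping one Stopwatch per sample name means overlapping requests timing the same operation would interfere with each other; a real implementation would hand back a per-call token from Start instead.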
Now I have some statistics sitting in memory. I can let the web-site run for a few months, and at any time I can connect to the server and look at the profiling statistics. This is much like SQL Server's ability to present its own running history in various reports.
But ASP.NET kills apps without warning
The problem is that this is an ASP.net web-site/application. Randomly throughout the course of a year, the web-server will decide to shut down the application, by recycling the application pool:
perhaps it has been idle for 3 weeks
perhaps it reached the maximum recycle time limit (e.g. 24 hours)
perhaps a date on a file changed, and the web-server has to recompile the application
When the web-server decides to shut down, all my statistics are lost.
Are there any ASP.net performance/instrumentation frameworks that solve this problem?
Try persisting to SQL Server
I thought about storing my statistics in SQL Server. Much like ASP.NET session state can be stored in SQL Server after every request completes, I could store my values in SQL Server every time:
void AddSample(String sampleName, long elapsedTicks)
{
using (IDbConnection conn = CreateDatabaseConnection())
{
ExecuteAddSampleStoredProcedure(conn, sampleName, elapsedTicks);
}
}
Except now I've introduced a huge latency into my application. This profiling code is called many thousands of times a second. When the math is performed only in memory it takes a few microseconds; now it takes a few dozen milliseconds (a factor of 1,000, a noticeable delay). That's not going to work.
Save only on application shutdown
I have considered registering my static helper class with the ASP.NET hosting environment by implementing IRegisteredObject:
public class ShutdownNotification : IRegisteredObject
{
public void Stop(Boolean immediate)
{
Profiler.SaveStatisticsToDatabase();
}
}
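For IRegisteredObject to take effect, the object also has to be registered with the hosting environment, and Stop should unregister it when done. A sketch (SaveStatisticsToDatabase is the method from the snippet above):

```csharp
using System.Web.Hosting;

public class ShutdownNotification : IRegisteredObject
{
    public void Stop(bool immediate)
    {
        Profiler.SaveStatisticsToDatabase();
        // Tell ASP.NET we're done, otherwise Stop is called again with immediate = true
        HostingEnvironment.UnregisterObject(this);
    }
}

// At startup (e.g. in Application_Start):
// HostingEnvironment.RegisterObject(new ShutdownNotification());
```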
But I'm curious what the right way to solve this problem is. Smarter people than me must have added profiling to ASP.NET before.
We use Microsoft's Application Performance Monitoring for this. It captures page load times, DB call times, API call times, etc. When a page load is unexpectedly slow, it also alerts us and provides the stack trace along with the timings of various calls that impacted the load time. It's somewhat rudimentary but it does the trick and allowed us to verify that we didn't have any variations that were not performing as expected.
Advance warning: the UI only works in IE.
http://technet.microsoft.com/en-us/library/hh457578.aspx

Get ASP.NET Session Last Access Time (or Time-to-Timeout)

I'm trying to determine how much time is left in a given ASP.NET session until it times out.
If there is no readily available time-to-timeout value, I could also calculate it from its last access time (but I didn't find this either). Any idea how to do this?
If you are at the server, processing the request, then the timeout has just been reset so the full 20 minutes (or whatever you configured) remain.
If you want a client-side warning, you will need to create some javascript code that will fire about 20 minutes from "now". See the setTimeout method.
I have used that to display a warning, 15 minutes after the page was requested. It pops up an alert like "your session will expire on {HH:mm}, please save your work". The exact time was used instead of "in 5 minutes" as you never know when the user will see that message (did he return to his computer 10 minutes after the alert fired?).
For a multi-page solution, one could save the last request time in a cookie, and javascript could use this last access time to decide when to show the warning message or perform the logout action.
I have just implemented a solution like the one asked about here and it seems to work. I have an MVC application and have this code in my _Layout.cshtml page, but it could work in a Web Forms app by placing it in the master page, I would think. I am using local session storage via the amplify.js plugin, because, as Mr Grieves says, there could be a situation where a user is accessing the application in a way that does not cause a page refresh or redirect but still resets the session timeout on the server.
$(document).ready(function () {
    var sessionTimeout = parseInt('@Session.Timeout', 10); // from server at startup
    amplify.store.sessionStorage("sessionTimeout", sessionTimeout);
    amplify.store.sessionStorage("timeLeft", sessionTimeout);
    setInterval(checkSession, 60000); // run checkSession every 1 minute

    function checkSession() {
        var timeLeft = amplify.store.sessionStorage("timeLeft");
        timeLeft--; // decrement by 1 minute
        amplify.store.sessionStorage("timeLeft", timeLeft);
        if (timeLeft <= 10) {
            alert("You have " + timeLeft + " minutes before session timeout.");
        }
    }
});
Then in a page where users never cause a page refresh but still hit the server thereby causing a reset of their session I put this on a button click event:
$('#MyButton').click(function (e) {
    // Some code that causes a session reset but not a page refresh here
    amplify.store.sessionStorage("sessionTimeout", 60); // default session timeout
    amplify.store.sessionStorage("timeLeft", 60);
});
Using local session storage allows my _Layout.cshtml code to see that the session has been reset, even though the page never got refreshed or redirected.
You can get the timeout in minutes from:
Session.Timeout
Isn't this enough to provide the information, given that the timeout is reset on every request? I don't know how you would display this without making a request.
Anyhow, the best way is to set a Session variable with the last access time on every request. That should provide the info remotely.
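That suggestion might be sketched as follows (the "LastAccess" key and base-page name are invented for illustration):

```csharp
using System;
using System.Web.UI;

public class TrackedPage : Page // hypothetical shared base page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);
        // Every request resets the timeout clock, so record "now" as last access.
        Session["LastAccess"] = DateTime.UtcNow;
    }

    // Estimate how long remains before the session would time out.
    protected TimeSpan TimeToTimeout()
    {
        object last = Session["LastAccess"];
        DateTime lastAccess = last == null ? DateTime.UtcNow : (DateTime)last;
        TimeSpan remaining = TimeSpan.FromMinutes(Session.Timeout) - (DateTime.UtcNow - lastAccess);
        return remaining < TimeSpan.Zero ? TimeSpan.Zero : remaining;
    }
}
```

As noted above, within the request that stored the stamp this always reports the full timeout; it only becomes useful when the stamp is read from somewhere else, such as a later request or a store visible outside the worker process.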

ASP.NET - Log User Session Start/End Times for Audit Trail - Global.ASAX?

My ASP.NET intranet web application uses Windows Authentication, and I would like to record the following details:
1) Windows ID
2) Session Start Time
3) Session Stop Time
4) URL being browsed to (optional)
I've got some basic code set up in the "Session_Start" method of the Global.ASAX to log session start times (seen below), but that's it so far. I have the feeling this is a primitive approach and there are "better" ways of doing this. So I really have two questions:
1) Is this the right way to go about doing this? If not what are some other options?
2) If this is the right way, do I just need to drop some code in the "Session_End" method to record the time they exit, and that's a complete solution? Does this method always get called when they close the browser tab that has the site open, or do they have to close the entire browser (I don't have logout functionality)? Is there any way users can skip over this Session_End method (or Session_Start, for that matter)?
Dim connsql As New System.Data.SqlClient.SqlConnection(ConfigurationManager.ConnectionStrings("MyConnectionstring").ConnectionString)
Dim cmdsql As System.Data.SqlClient.SqlCommand = connsql.CreateCommand()
cmdsql.CommandText = "BeginUserSession"
cmdsql.CommandType = Data.CommandType.StoredProcedure
Try
    cmdsql.Parameters.Add("@windowsid", System.Data.SqlDbType.VarChar, 30, "windowsid")
    cmdsql.Parameters("@windowsid").Value = Session("UserInfo").Identity.Name
    If connsql.State <> System.Data.ConnectionState.Open Then connsql.Open()
    cmdsql.ExecuteNonQuery()
Catch ex As Exception
    ' The exception is swallowed here; consider logging it
Finally
    If connsql.State <> Data.ConnectionState.Closed Then connsql.Close()
End Try
'Stored proc records start time
Session_End is not reliable.
What I would suggest is on Session_Start you create a record that notes the time the Session was created, and in Session_End you update the record with the time it was ended.
To handle the majority of sessions which are passively abandoned, use Application_BeginRequest to update the record to note when the user was "last seen".
You will then need to determine a way of marking sessions that have been passively abandoned. This will be site/app specific. It could be as simple as picking a number of minutes that must pass before the session is considered abandoned - like 10 minutes.
So then you have a query:
SELECT Username,
SessionStart,
SessionEnd,
LastSeenOn,
DATEDIFF(mi, SessionStart, ISNULL(SessionEnd, LastSeenOn)) DurationMinutes
FROM SessionAudit
WHERE SessionEnd IS NOT NULL
OR DATEDIFF(mi, LastSeenOn, getdate()) > 10
Which will bring back your session audit log.
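The "last seen" update might be sketched like this in Global.asax. One caveat: Session is not populated yet in Application_BeginRequest, so PreRequestHandlerExecute is used here; ExecuteSql is a hypothetical helper wrapping a SqlCommand, and the table follows the query above:

```csharp
protected void Application_PreRequestHandlerExecute(object sender, EventArgs e)
{
    // Stamp the audit row so passively abandoned sessions can later be
    // detected by a stale LastSeenOn value.
    var session = System.Web.HttpContext.Current.Session;
    if (session != null)
    {
        ExecuteSql(
            "UPDATE SessionAudit SET LastSeenOn = GETDATE() WHERE SessionId = @id",
            session.SessionID); // ExecuteSql is a hypothetical helper
    }
}
```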
Your approach could be described as simple, but that could be totally fine - it comes down to what the requirements are. If you need to log a full suite of application errors and warnings, look at implementing something like Log4Net. Otherwise I wouldn't say there is anything wrong with what you are doing.
Sessions are ended when there has been no user activity for the amount of time specified in the timeout value, or when you explicitly call Session.Abandon() in your code. Because of the stateless nature of HTTP, there is no way to tell if a user has left your site, closed the browser or otherwise stopped being interactive with their session.
I am not sure you can catch the end of the session accurately because
The user can close their browser and that will not necessarily end the session.
They can then go back to your site and thus may have multiple sessions.
You can try messing with settings in IIS to kill the session very quickly after inactivity, but it's not a good idea.
Also... if the users are not all on an internal network, you will have no control over whether they have a "Windows ID" or not.