I have a strange problem:
I have a site which has an administration system.
In the system there is a way to edit my inputs. To make this easier, I can search for their ids to find them.
The problem is that when the page posts back, the session variable I hold the "logged in" value in becomes Nothing, and I get kicked out. Why is that happening? Session variables should be kept for at least 20 minutes unless something else is configured, shouldn't they?
EDIT: It works for a coworker, but not for me. Also, it only happens on certain inputs.
EDIT2: Turns out I get an exception, but it doesn't say what; it only says "property evaluation failed". I get it on this code row:
Response.Redirect("./admin.aspx?search=" + u.FirstOrDefault.ProductID.ToString, False)
And since it gets past this statement, it's not an "object reference not set to an instance of an object" problem:
If u.Any Then
If you experience an exception, your session might be restarted.
Create a Global.asax and set breakpoints in the Application_Error, Session_End and Session_Start events to track down the issue.
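If it helps, here is a minimal Global.asax code-behind sketch (shown in C#; instead of breakpoints it just writes trace output, and the Trace calls are placeholders for whatever logging you already use):

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Error(object sender, EventArgs e)
    {
        // The exception that is about to bubble up out of the page
        Exception ex = Server.GetLastError();
        System.Diagnostics.Trace.WriteLine("Application_Error: " + ex);
    }

    protected void Session_Start(object sender, EventArgs e)
    {
        System.Diagnostics.Trace.WriteLine("Session_Start: " + Session.SessionID);
    }

    protected void Session_End(object sender, EventArgs e)
    {
        // Note: Session_End only fires with InProc session state
        System.Diagnostics.Trace.WriteLine("Session_End: " + Session.SessionID);
    }
}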
Edit based on your update:
Make sure that u contains exactly one element. (Strictly speaking, FirstOrDefault does not throw when the count is > 1; that is SingleOrDefault. FirstOrDefault returns Nothing for an empty sequence, and accessing ProductID on Nothing is what would throw.)
Have a look at this link http://support.microsoft.com/kb/312629/EN-US/ which should cover your original problem
http://forums.asp.net/t/1296202.aspx/1
If you have a Web Garden enabled (multiple worker processes for an application pool), that can explain such behavior; been there.
Either run a single worker process for the pool, or switch to database-driven session state.
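For the second option, a minimal web.config sketch (the connection string is a placeholder, and the session-state database has to be prepared first, e.g. with aspnet_regsql.exe -ssadd):

<system.web>
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=.;Integrated Security=True"
                timeout="20" />
</system.web>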
Do you have cookies enabled? If not, that would explain this behavior. Especially since you state it works for the co-worker, you can deduce it's a client-side issue.
I have a table in an MSSQL database and an ASPX page, and I need to push all new rows to the page in descending order. I found an awesome tutorial that uses SignalR and SqlDependency, but it shows only the last row, discarding the previous rows added while I'm online. It does that because it uses a span element to show the data and overwrites that span every time, so I modified the JavaScript code to append the new data instead, and it works fine.
The problem now is that when I refresh the page, I get each new row twice; if I refresh the page again, I get it three times, and so on.
The only solution is to close the application and reopen it, which is effectively like resetting IIS.
So, what can I do to avoid the duplicated data in the live view?
It is not a SignalR issue; it happens because the mentioned tutorial has a series of mistakes, the most evident being that it continuously creates SqlDependency instances and then throws them away without ever unsubscribing from the OnChange event. You should start by adding something like this:
SqlDependency dependency = sender as SqlDependency;
dependency.OnChange -= dependency_OnChange;
before calling SendNotifications inside your event handler. Check this for some inspiration.
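For illustration, the whole handler could then look roughly like this (C#; SendNotifications() is the method from the tutorial, and your handler's exact signature may differ):

// requires System.Data.SqlClient
private void dependency_OnChange(object sender, SqlNotificationEventArgs e)
{
    // Detach this handler so the old dependency stops firing and can be collected
    SqlDependency dependency = sender as SqlDependency;
    dependency.OnChange -= dependency_OnChange;

    // Re-query and push the new rows; this re-registers a fresh dependency
    SendNotifications();
}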
UPDATE (previous answer not fully accurate but kept in place for context)
The main problem here is that this technique creates a sort of self-regenerating, infinite sequence of SqlDependency instances from inside Web Forms page instances, and those instances become unreachable as soon as the page has finished rendering. This means that once the page lifecycle is complete and the page is rendered, the chain of dependencies stays alive and keeps working even though the page instance that created it has finished its cycle. The event handler also keeps the unreachable page instance alive, causing a memory leak.
The only way you can control this is to create these chains somewhere else, for example inside a static type that you call with some unique identifier (maybe a combination of page name and user name? that depends on your logic). On the first call it does what your current code does, but any subsequent call with the same parameters does nothing, so the previously created chain remains the only one sending notifications and there are no duplicate calls.
It's just a suggestion; there are many possible solutions. But you need to understand the original problem: it is practically impossible to remove those chains of self-regenerating dependencies unless you keep track of them and create them only when necessary. I hope that part is clear.
PS: this behavior is very similar to what you sometimes get with leaked event handlers keeping alive objects that should have been collected, which is what fooled me in the previous answer. It is in a way a similar problem (leaked objects), but with a totally different cause. The tutorial you followed does not make that clear and leads you into this situation where phantom code keeps executing and memory is lost.
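To make the suggestion a bit more concrete, here is one possible, purely illustrative shape for such a static type; the class name, the key, and the delegate you pass in are all hypothetical and need to be adapted to your own code:

using System;
using System.Collections.Generic;

public static class NotificationRegistry
{
    private static readonly HashSet<string> _started = new HashSet<string>();
    private static readonly object _sync = new object();

    // Call this instead of starting the dependency chain directly;
    // the chain is created only the first time a given key is seen.
    public static void EnsureStarted(string key, Action startDependencyChain)
    {
        lock (_sync)
        {
            if (_started.Contains(key))
                return;                  // a chain for this key is already running

            startDependencyChain();      // first call: start the SqlDependency chain
            _started.Add(key);
        }
    }
}

From Page_Load you would then call something like NotificationRegistry.EnsureStarted(pageName + userName, SendNotifications), so refreshing the page no longer spawns a second chain.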
I got it. Although I really don't like this approach, I declared a static member in the Global.asax file and checked its value in the Page_Load event: if it is true, don't start a new SqlDependency chain, otherwise start one.
// Only start the dependency chain once per application lifetime
if (!Global.PageIsFired)
{
    Global.PageIsFired = true;
    SqlDependency.Stop(ConfigurationManager.ConnectionStrings["db"].ConnectionString);
    SqlDependency.Start(ConfigurationManager.ConnectionStrings["db"].ConnectionString);
    SendNotifications();
}
Dear @Wasp,
Your last update helped me a lot to understand the problem, so thank you so much for your time and prompt support.
Dear @dyatchenko,
Thanks a lot for your comments; they were very useful too.
On a new website, I have a huge form (really big: it needs at least 15-20 minutes to fill in) that configures the whole website for one client for the next year.
It is split across several tabs (it's a wizard). Every time we go to the next tab, a regular (non-AJAX) call is made to the server, which generates the next "page". The information entered so far is stored in the session (an object with a custom binder).
Everything was working fine until we tested it today with real data. Real data requires thought and work to find the correct elements, and that takes time.
The problem we get is that the view receives a partially empty model. The session duration is set to 1440 minutes (and in IIS too). For now, all I know is that I get a NullReferenceException the first time I try to access the model in my view.
I've been checking the controller for about an hour, but I can't see how it could return a null model. If I enter all the data very quickly, I don't have any problem (but that's with random data).
So far I've only managed to reproduce the problem on the IIS server, and I'm checking the elmah logs to debug it, so it's not easy to reproduce.
Do you have any ideas about how I should debug this? I'm a little lost here.
I think you should assume the session does not offer reliable persistence. I'm not sure about the details, but I suspect it starts evicting items when it exceeds its memory limit.
You will be safer if you use a database to store that information, or you could introduce your own implementation for persisting state.
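Purely as a sketch of the database option (the WizardState table, the "db" connection string name, and the use of JavaScriptSerializer are assumptions for illustration, not taken from your code):

using System.Configuration;
using System.Data.SqlClient;
using System.Web.Script.Serialization;

public static class WizardStateStore
{
    // Persist the current wizard model for a user outside the session
    public static void Save(string userName, object model)
    {
        string json = new JavaScriptSerializer().Serialize(model);
        string cs = ConfigurationManager.ConnectionStrings["db"].ConnectionString;

        using (var conn = new SqlConnection(cs))
        using (var cmd = new SqlCommand(
            "UPDATE WizardState SET StepData = @data WHERE UserName = @user; " +
            "IF @@ROWCOUNT = 0 INSERT INTO WizardState (UserName, StepData) VALUES (@user, @data);",
            conn))
        {
            cmd.Parameters.AddWithValue("@user", userName);
            cmd.Parameters.AddWithValue("@data", json);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}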
In addition to the answer provided by @Ufuk:
You can send an AJAX request every minute that does nothing in particular; this keeps the session from expiring, so the site continues to work over extended periods.
The problem was that the session didn't have enough space, I think. I temporarily resolved the problem by restarting the application pool. I'm still looking for a solution that doesn't involve changing all this code; maybe another session-state mode, but then I need to make my models serializable.
In a web application I need the SessionID for certain purposes, so I save it in the database. The application does two redirects before it starts, and I found that it acquires two SessionIDs; the second one is the one that remains.
Why does this happen? Any ideas? How can I prevent it, so that only one record is saved in the DB?
Sorry, I can't post the code; it's intertwined with other logic.
There are a million possible reasons for losing / restarting your session. Without your code, it's difficult to offer advice.
One thing you can try on your own is to use the Session_End event in your global.asax file, as long as you're using InProc session state. Put some code in there to tell you when the session ends, so you can track down what's causing it to end in your application.
Another thing you can look at is your method of redirecting. Make sure you're preserving the session, like this: Response.Redirect("~/default.aspx", false).
Another possibility is that you might not be putting anything into the session on the first request. By default, you'll get a new session ID on every request until you actually store something in the session.
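One common way to deal with that (a sketch in C#, with an arbitrary key name) is to touch the session in Session_Start in global.asax, so ASP.NET keeps the same ID across the initial redirects:

protected void Session_Start(object sender, EventArgs e)
{
    // Storing anything makes ASP.NET keep the session (and its ID) instead of
    // issuing a new ID on every request; the key name here is arbitrary.
    Session["__init"] = DateTime.UtcNow;
}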
In a recent project we are currently getting 12031 errors. Here is the complete error:
Sys.WebForms.PageRequestManagerServerErrorException: the status code returned from the server was 12031
The problem is that this doesn't happen all the time, and we are unable to reproduce the error in the development environment.
We use AJAX in our application, and this exception happens once in a while on every page.
I found a post on SO with the same problem and tried changing maxRequestLength to "1" to see if I would then get the same error consistently, but I don't. Instead, I'm getting
Maximum request length exceeded.
So I'm starting to think it is not related to maxRequestLength, and I'm actually out of ideas. I have a ScriptManager in my MasterPage with AsyncPostBackTimeout="240", which is roughly the same amount of time (give or take): I get the 12031 error after about 3.5 minutes of "nothing". I'm logging one of the pages; by logging, I mean logging every section of the page, like "Page_Load is called", "xyz is called", etc., and I have about 15 such spots on the page. After the user clicks a button and the ScriptManager tries to do its job, no postback occurs and no logging happens. It is as if the page wants to do a postback but is too old to do it; it tries for around 3.5 minutes and then fails with the given error.
Please, if you have any ideas, HELP ME OUT.
Thank you
That error almost certainly has nothing to do with the size of the response, AsyncPostBackTimeout, or maxRequestLength.
Connection resets are usually indicative of poor network connectivity or a server loaded down to its capacity limits. A few things you could try:
Inspect the Windows Event Log during the time(s) that the affected clients are known to have received the error. Look for any relevant errors or warnings.
If feasible, ask the users at the affected machines to use something like Pingtest to check the quality of their local network connection at the time they receive the error in your app.
Use a service like Pingdom to ensure that the server itself isn't intermittently losing connectivity.
This error may be due to the HTTP runtime limit on maxRequestLength. The default value is 4096 (KB).
Try adding (or editing) the following entry in your Web.config:
<httpRuntime maxRequestLength="8192" /> (effectively allowing 8 MB of data to be transmitted, instead of the default 4 MB).
Please note: you can set the value to whatever your maximum request requires; 8192 is not an upper limit. You also need to add Page.Form.Attributes.Add("enctype", "multipart/form-data"); in the Page_Load event of the page.
You'll want to enter this in the system.web configuration section.
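For clarity, the element sits inside system.web, roughly like this:

<configuration>
  <system.web>
    <httpRuntime maxRequestLength="8192" />
  </system.web>
</configuration>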
How do I avoid getting a PageRequestManagerParserErrorException?
To start with, don't do any of the things that cause this error in the first place! Here's a matching list of how to avoid each one (when possible):
Calls to Response.Write():
Place a Label (or similar) control on your page and set its Text property. The added benefit is that your pages will be valid HTML. When using Response.Write() you typically end up with pages that contain invalid markup.
Response filters:
The fix might just be to not use the filter. They're not used very often anyway. If possible, filter things at the control level and not at the response level.
HttpModules: Same as response filters.
Server trace is enabled:
Use some other form of tracing, such as writing to a log file, the Windows event log, or a custom mechanism.
Calls to Server.Transfer():
I'm not really sure why people use Server.Transfer() at all. Perhaps it's a legacy thing from Classic ASP. I'd suggest using Response.Redirect() with query string parameters or cross-page posting.
Another way to avoid the parse error is to do a regular postback instead of an asynchronous postback. For example, if you have a button that absolutely must do a Server.Transfer(), make it do regular postbacks. There are a number of ways of doing this:
The easiest is to simply place the button outside of any UpdatePanels. Unfortunately the layout of your page might not allow for this.
Add a PostBackTrigger to your UpdatePanel that points at the button. This works great if the button is declared statically through markup on the page.
Call ScriptManager.RegisterPostBackControl() and pass in the button in question. This is the best solution for controls that are added dynamically, such as those inside a repeating template.
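As a rough sketch of that last option (C# code-behind; myButton is a placeholder for your dynamically added control):

protected void Page_Load(object sender, EventArgs e)
{
    // Tell the ScriptManager this control should do a full postback
    // instead of an asynchronous one, even when it sits inside an UpdatePanel.
    ScriptManager sm = ScriptManager.GetCurrent(this.Page);
    if (sm != null)
    {
        sm.RegisterPostBackControl(myButton);
    }
}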
Good luck!
I have received this error before, when we had a Barracuda device sitting in front of our website. It was a maximum request length issue, because the Barracuda protects against oversized requests. We removed the device temporarily and that solved the problem. Not sure if this is your problem.
I have an application variable that is populated on application start (in this case it is an array). Ideally I need to rebuild this array every 3 hours; what is the best way of going about this?
Thanks, R.
Save the time you last refreshed the variable contents.
On every request, check the current time against the saved time. If there's a three hour difference, lock and refresh the variable.
As long as there are no requests, the variable also needs no refreshing.
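Sketched below in ASP.NET/C# terms purely for illustration (the question itself is classic ASP, where Application.Lock/Unlock and Application_OnStart play the corresponding roles; BuildArray() and the key names are placeholders), the check-and-refresh pattern looks roughly like this:

void EnsureArrayFresh(HttpApplicationState app)
{
    // "LastRefresh" is assumed to be set when the array is first built at application start
    var last = (DateTime)app["LastRefresh"];
    if ((DateTime.UtcNow - last).TotalHours >= 3)
    {
        app.Lock();
        try
        {
            // Re-check inside the lock in case another request refreshed it first
            last = (DateTime)app["LastRefresh"];
            if ((DateTime.UtcNow - last).TotalHours >= 3)
            {
                app["MyArray"] = BuildArray();
                app["LastRefresh"] = DateTime.UtcNow;
            }
        }
        finally
        {
            app.UnLock();
        }
    }
}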
If your application variable must remain "in process" with the rest of the site's code, the way suggested by Tomalak may be your only way of achieving this.
However, if it's possible that the application variable could effectively reside "out of process" of the website's ASP code (although still accessible by it), you may be able to utilise a different (and perhaps slightly better) approach.
Please see "ASP 101: Getting Scripts to Run on a Schedule" for the details.
Tomalak's method is effectively Method 1 in the article, whilst Methods 2 and 3 offer different ways of achieving what is effectively something happening on a schedule, and they avoid the potentially redundant check on every HTTP request.