Is there a way using ASP.NET to always run some server side code when a user leaves a page? - asp.net

I was wondering if there is any way to always run some server side code when a user leaves a page in ASP.NET. The page Unload event is no good because that doesn't get called if someone clicks on a link. Ideally I'd also like the code to run even if the user closes the browser.
I suspect what I'm asking isn't possible, but it doesn't hurt to ask.

The problem is that HTTP is a stateless protocol, so once the page has been served, you won't know whether the user is still on it or not.
The only way to achieve this would be a hidden piece of JavaScript that constantly pings the server with its session ID, or another similar mechanism. When the pings stop arriving, you can reasonably assume the page is no longer being viewed by the user.
Here is a diagram that explains traditional HTTP message flow.
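A minimal sketch of that ping mechanism on the server side, assuming a generic handler; the file name Ping.ashx and the helper names are illustrative, not an established API:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.SessionState;

// Ping.ashx: each AJAX ping from the hidden JavaScript lands here and records
// "this session was still viewing the page at this time".
public class PingHandler : IHttpHandler, IRequiresSessionState
{
    private static readonly Dictionary<string, DateTime> LastSeen = new Dictionary<string, DateTime>();
    private static readonly object Sync = new object();

    public void ProcessRequest(HttpContext context)
    {
        lock (Sync)
        {
            LastSeen[context.Session.SessionID] = DateTime.UtcNow;
        }
        context.Response.ContentType = "text/plain";
        context.Response.Write("OK");
    }

    // Called from elsewhere (e.g. a timer): no ping for longer than the
    // threshold is taken to mean the user has left the page.
    public static bool HasLeft(string sessionId, TimeSpan threshold)
    {
        lock (Sync)
        {
            DateTime last;
            if (!LastSeen.TryGetValue(sessionId, out last))
                return true;
            return DateTime.UtcNow - last > threshold;
        }
    }

    public bool IsReusable { get { return true; } }
}

The client side would then call this with something as simple as a setInterval plus an XMLHttpRequest every few seconds.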

I'm not really sure if you can do that, but I have a workaround in mind.
There is an event in the DOM called onbeforeunload. It gets called every time a user leaves a page. You can try sending an AJAX request to the server from that handler.
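On the server side, the target of that onbeforeunload request could be a plain handler that runs your "user is leaving" code. This is only a hedged sketch (the Logout.ashx name and class are made up); note that onbeforeunload also fires on ordinary navigation within your own site, and the request isn't guaranteed to complete before the browser goes away.

using System.Web;
using System.Web.SessionState;

// Logout.ashx: best-effort target for the AJAX request sent from onbeforeunload.
public class LeavePageHandler : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        // Run whatever "user is leaving" logic you need, then drop the session.
        context.Session.Abandon();
        context.Response.ContentType = "text/plain";
        context.Response.Write("bye");
    }

    public bool IsReusable { get { return true; } }
}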

The closest you can come without creating too messy a solution is to enable ASP.NET sessions. This will create a session on the server for each visitor, identified by a cookie.
After a certain amount of inactivity from the visitor, the session will be closed and a Session_End event will be raised, which you can hook into in the Global.asax file.
I would not recommend this, however, because HTTP is by definition a sessionless protocol; server-based sessions work against that fact and are often problematic. Many solutions that use server-based sessions run into trouble when the user presses the browser's back button and resubmits a form, because the content of the submitted form no longer corresponds to the data in the server session.
Also, enabling server-based sessions seriously hurts the scalability of the application.
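For reference, the Session_End hook described above looks roughly like this in Global.asax. Note that it is only raised when session state runs InProc (not with the StateServer or SQL Server modes), and the cleanup shown is just a placeholder:

using System;
using System.Web;

public class Global : HttpApplication
{
    // Fires after the inactivity timeout elapses for a session (InProc mode only).
    protected void Session_End(object sender, EventArgs e)
    {
        string endedSessionId = Session.SessionID;
        // ... "user has gone away" cleanup: release resources, write an audit record, etc. ...
    }
}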

Not that I know of. You'll need to use javascript for that, and call a web service on the server side.

Related

How to dynamically keep track on button click count on ASP .NET

I want to make a form where people can sign up for a course. The number of places for a course is limited. I want to make a page where a user can see how many places are still available, with that number updated dynamically, so that if another user signs up for a course, the first one sees the change. When the number of available places reaches 0, the signup button should be disabled. Such a task should be easy to implement, but I am afraid it is not. I suppose some AJAX will be involved, but how do I handle the counting on the server side? Web services? I am having trouble designing the logic behind all of this.
The technology/technique you're looking for is called Server Push.
Basic idea: Client should respond to some events happening on Server.
Possible solutions:
Polling some server action via AJAX at regular intervals;
Keeping a long-running AJAX request open on the server side until a timeout occurs or the event happens, then processing the result on the client (determining whether it was a server event or just a timeout) and re-establishing the connection from the client if necessary;
and a couple of other solutions that are basically variations of the above two. The solution will also depend heavily on the server-side technology you're using.
Google has a short yet very informative article on what this technique is and how it can be implemented here. It's (almost) technology agnostic so it should help you to understand concepts and possible solutions.
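A very rough sketch of option 2 (long polling) applied to this problem, assuming a hypothetical handler name and a stand-in lookup; it holds a worker thread per waiting client, which is one of the trade-offs of this approach:

using System;
using System.Threading;
using System.Web;

// SeatCountLongPoll.ashx: hold the request open until the count changes or a timeout passes.
public class SeatCountLongPoll : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        int lastSeen;
        int.TryParse(context.Request.QueryString["lastCount"], out lastSeen);
        DateTime deadline = DateTime.UtcNow.AddSeconds(25);

        int current = GetAvailablePlaces();
        while (current == lastSeen && DateTime.UtcNow < deadline)
        {
            Thread.Sleep(1000);              // crude wait; ties up a worker thread per client
            current = GetAvailablePlaces();
        }

        context.Response.ContentType = "text/plain";
        context.Response.Write(current);     // client reads the count, then reconnects
    }

    private static int GetAvailablePlaces()
    {
        return 5;                            // stand-in for the real lookup (see the database answer below)
    }

    public bool IsReusable { get { return true; } }
}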
I'd use a database on the server. For the "courses" table, have an associated table containing the "bookings". Add them up in a SQL query.
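A hedged sketch of that approach, assuming hypothetical Courses(Id, Capacity) and Bookings(CourseId) tables, a connection string named "db", and a page method that the client polls (via ScriptManager page methods or a jQuery POST):

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Web.Services;
using System.Web.UI;

public partial class Signup : Page
{
    // Returns how many places are left; the client polls this and disables
    // the signup button when it reaches 0.
    [WebMethod]
    public static int GetPlacesLeft(int courseId)
    {
        string cs = ConfigurationManager.ConnectionStrings["db"].ConnectionString;
        using (SqlConnection conn = new SqlConnection(cs))
        using (SqlCommand cmd = new SqlCommand(
            @"SELECT c.Capacity - COUNT(b.CourseId)
              FROM Courses c LEFT JOIN Bookings b ON b.CourseId = c.Id
              WHERE c.Id = @id
              GROUP BY c.Capacity", conn))
        {
            cmd.Parameters.AddWithValue("@id", courseId);
            conn.Open();
            object result = cmd.ExecuteScalar();
            return result == null ? 0 : Convert.ToInt32(result);
        }
    }
}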

Stop Direct Page Calls to Ajax Pages

Is there a "clever" way of stopping direct page calls in ASP.NET? (Page functionality, not the page itself)
By clever, I mean not having to add hashes between pages to stop AJAX pages being called directly. In a nutshell, this is about stopping users from accessing the AJAX pages unless the request comes from one of your website's pages in a legitimate way. I understand that nothing is impossible to break; I am simply interested in seeing what other interesting methods there are.
If not, is there any way that one could do it without using sessions/cookies?
Have a look at this question: Differentiating Between an AJAX Call / Browser Request
The best answer from the above question is to check for a requested-by or custom header.
Ultimately, your web server only receives what the client sends it, headers included - all of which can be spoofed. If a user is determined, then any request can look like an AJAX request.
I can't think of an elegant method to prevent this (there are inelegant and probably imperfect methods whereby you pass a hash of some sort of request counter between AJAX and non-AJAX requests).
Can I ask why your application is so sensitive to "ajax" pages being called directly? Could you design around this?
You can check the request headers to see if the call was initiated by AJAX. Usually, you should find that X-Requested-With has the value XMLHttpRequest. Or, in the case of ASP.NET AJAX, check whether ScriptManager.IsInAsyncPostBack == true. However, I'm not sure about preventing the request in the first place.
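A rough sketch of both checks in Page_Load (the page class name is arbitrary); remember that neither is tamper-proof, since any client can set that header itself:

using System;
using System.Web.UI;

public partial class AjaxOnlyPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Header set by jQuery, Prototype, etc. for XMLHttpRequest calls.
        bool looksLikeAjax = string.Equals(
            Request.Headers["X-Requested-With"], "XMLHttpRequest",
            StringComparison.OrdinalIgnoreCase);

        // ASP.NET AJAX partial postbacks can be detected via the ScriptManager.
        ScriptManager sm = ScriptManager.GetCurrent(this);
        bool isAsyncPostBack = sm != null && sm.IsInAsyncPostBack;

        if (!looksLikeAjax && !isAsyncPostBack)
        {
            Response.StatusCode = 403;   // refuse direct browser requests
            Response.End();
        }
    }
}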
Have you looked into header authentication? If you only want your app to be able to make ajax calls to certain pages, you can require authentication for those pages...not sure if that helps you or not?
Basic Access Authentication
or the more secure
Digest Access Authentication
Another option would be to append some sort of identifier to your URL query string in your application before requesting the page, and have some sort of authentication method on the server side.
I don't think there is a way to do it without using a session. Even if you use an Http header, it is trivial for someone to create a request with the exact same headers.
Using session with ASP.NET Ajax requests is easy. You may run into some problems, like session expiration, but you should be able to find a solution.
With sessions you will be able to guarantee that only logged-in users can access the Ajax services. When servicing an Ajax request simply test that there is a valid session associated with it. Of course a logged-in user will be able to access the service directly. There is nothing you can do to avoid this.
If you are concerned that a logged-in user may try to contact the service directly in order to steal data, you can add a rate limit to the service. For example, do not allow users to access the service more than once per minute (or whatever rate is needed for the application to work properly).
See what Google and Amazon are doing for their web services. They allow you to contact them directly (even providing APIs to do this), but they impose limits on how many requests you can make.
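A hedged sketch combining those two ideas in a plain handler; the "UserId" and "LastServiceCall" session keys are illustrative, not a convention:

using System;
using System.Web;
using System.Web.SessionState;

public class DataService : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        // Only serve requests that carry a valid logged-in session.
        if (context.Session == null || context.Session["UserId"] == null)
        {
            context.Response.StatusCode = 401;
            return;
        }

        // Simple per-user rate limit: at most one call per minute.
        DateTime? lastCall = context.Session["LastServiceCall"] as DateTime?;
        if (lastCall.HasValue && DateTime.UtcNow - lastCall.Value < TimeSpan.FromMinutes(1))
        {
            context.Response.StatusCode = 429;
            return;
        }
        context.Session["LastServiceCall"] = DateTime.UtcNow;

        context.Response.ContentType = "application/json";
        context.Response.Write("{\"data\":\"...\"}");
    }

    public bool IsReusable { get { return true; } }
}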
I do this in PHP by declaring a variable in a file that's included everywhere, and then checking whether that variable is set in the file the AJAX call targets.
This way, you can't directly call the file ever because that variable will never have been defined.
This is the "non-trivial" way, hence it's not too elegant.
The only real idea I can think of is to keep track of every link (as in, everything does a postback and then a Response.Redirect). That way you could keep a static List<> or something of IP addresses (and possibly browser ID and such) that says which pages that visitor is currently allowed to access, along with a timeout to keep them from going straight to a page three days from now.
I recommend rethinking your design to be sure this is really needed, though. Also note that IPs and such can be spoofed.
If you do follow this route, be sure to read up on when static variables get disposed and such. You wouldn't want one of those annoying "your session has expired" messages when the user has been using the site for 10 minutes.
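If you do go this route, a rough sketch of the bookkeeping might look like the following (all names are made up, and the same caveats about spoofed IPs, static lifetime, and app-pool recycles apply):

using System;
using System.Collections.Generic;

public static class AjaxAccessTracker
{
    private class Grant { public string Page; public DateTime Expires; }

    // Keyed by something identifying the visitor, e.g. IP address plus user agent.
    private static readonly Dictionary<string, List<Grant>> Grants = new Dictionary<string, List<Grant>>();
    private static readonly object Sync = new object();

    // Call from the page that legitimately leads to the AJAX page.
    public static void Allow(string visitorKey, string page, TimeSpan ttl)
    {
        lock (Sync)
        {
            List<Grant> list;
            if (!Grants.TryGetValue(visitorKey, out list))
                Grants[visitorKey] = list = new List<Grant>();
            list.Add(new Grant { Page = page, Expires = DateTime.UtcNow + ttl });
        }
    }

    // Call at the top of the AJAX page; expired grants are dropped as we go.
    public static bool IsAllowed(string visitorKey, string page)
    {
        lock (Sync)
        {
            List<Grant> list;
            if (!Grants.TryGetValue(visitorKey, out list))
                return false;
            list.RemoveAll(g => g.Expires < DateTime.UtcNow);
            return list.Exists(g => g.Page == page);
        }
    }
}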

Can a single asp.net user make more than one request at a time if the Session is in use?

I am not able to make more than one request at a time in asp.net while the session is active. Why does this limitation exist? Is there a way to work around it?
This issue can be demonstrated with a WebForms app with just 3 simple aspx pages (although the limitation still applies in asp.net mvc).
Create an asp.net 3.5 web application.
There should be just three pages:
NoWait.aspx, Wait.aspx, and SessionStart.aspx
NoWait.aspx has this single nugget added between the default div tags: <%=DateTime.Now.Ticks %>. The code-behind for this page is the default (empty).
Wait.aspx looks just like NoWait.aspx, but it has one line added to Page_Load in the code-behind: Thread.Sleep(3000); //wait 3 seconds
SessionStart.aspx also looks just like NoWait.aspx, but it has this single line in its code-behind: Session["Whatever"] = "Anything";
Open a browser and go to NoWait.aspx. It properly shows a number in the response, such as: "633937963004391610". Keep refreshing and it keeps changing the number. Great so far! Create a new tab in the same browser and go to Wait.aspx. It sits for 3 seconds, then writes the number to the response. Great so far! Now try this: go to Wait.aspx and while it's spinning, quickly tab over to NoWait.aspx and refresh. Even while Wait.aspx is sleeping, NoWait.aspx WILL provide a response. Great so far. You can continue to refresh NoWait.aspx while Wait.aspx is spinning, and the server happily sends a response each time. This is the behavior I expect.
Now is where it gets weird.
In a 3rd tab, in the same browser, visit SessionStart.aspx. Next, tab over to Wait.aspx and refresh. While it's spinning, tab over to NoWait.aspx and refresh. NoWait.aspx will NOT send a response until Wait.aspx is done running!
This proves that while a session is active, you can't make concurrent requests with the same user. Requests are all queued up and served synchronously. I do not expect or understand this behavior. I have tested this on Visual Studio 2008's built in web server, and also IIS 7 and IIS 7.5.
So I have a few questions:
1) Am I correct that there is indeed a limitation here, or is my test above invalid because I am doing something wrong?
2) Is there a way to work around this limitation? In my web app, certain things take a long time to execute, and I would like users to be able to do things in other tabs while they wait for a big request to complete. Can I somehow configure the session to allow "dirty reads"? That could prevent it from being locked during the request.
3) Why does this limitation exist? I would like to gain a good understanding of why this limitation is necessary. I think I'd be a better developer if I knew!
Here is a link talking about session state and locking. It does perform an exclusive lock.
The easiest way around this is to make the long-running tasks asynchronous. You can run them on a separate thread, or use an asynchronous delegate, and return a response to the browser immediately. The client-side page can send requests to the server to check whether the work is done (through AJAX, most likely), and when the server says it's finished, notify the user. That way, although the server has to handle the requests one at a time, it doesn't look like that to the user.
This does have its own set of problems, and you'll have to make sure you account for the HTTP context closing, as that will dispose of certain functionality in the ASP.NET session. One thing you'll probably have to account for is releasing the lock on the session, if that is actually occurring.
It isn't too surprising that this is a limitation. Each browser has its own session, and before the advent of AJAX, postback requests were synchronous. Making the same session handle concurrent requests could get really ugly, and I can see how that wouldn't have been a priority for the IIS and ASP.NET teams.
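A minimal sketch of that "return immediately, poll for completion" pattern using page methods; the static dictionary is only there to keep the example self-contained (a database or cache would be more robust), and all names are illustrative:

using System;
using System.Collections.Generic;
using System.Threading;
using System.Web.Services;
using System.Web.UI;

public partial class LongTaskPage : Page
{
    private static readonly Dictionary<Guid, string> JobStatus = new Dictionary<Guid, string>();
    private static readonly object Sync = new object();

    // Kick off the long task on a worker thread and return to the browser immediately.
    [WebMethod]
    public static Guid StartJob()
    {
        Guid jobId = Guid.NewGuid();
        lock (Sync) { JobStatus[jobId] = "running"; }

        ThreadPool.QueueUserWorkItem(delegate
        {
            Thread.Sleep(10000);                     // stand-in for the real long-running work
            lock (Sync) { JobStatus[jobId] = "done"; }
        });

        return jobId;
    }

    // The client polls this (via AJAX) until it reports "done".
    [WebMethod]
    public static string GetJobStatus(Guid jobId)
    {
        lock (Sync)
        {
            string status;
            return JobStatus.TryGetValue(jobId, out status) ? status : "unknown";
        }
    }
}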
For reasons Kevin described, users can't access two pages that might write to their session state at the same time - the framework itself can't exert fine-grained control over the locking of the session store, so it has to lock it for entire requests.
To work around this, pages that only read session data can declare that they do so. ASP.NET won't obtain a session state write lock for them:
<%@ Page EnableSessionState="ReadOnly" %>
(or EnableSessionState="False" if the page doesn't need access to session state at all)

How do I force expiration of an ASP.Net session when a user leaves the site?

We have a scenario in which we'd like to detect when the user has left our site and immediately expire their .NET session. We're using Forms Authentication. We're not talking about a session timeout, which we already have. We would like to know when a user has browsed away from our site, whether via a link, by typing in an address, or by following a bookmark. If they return to our site, even right away, they will have to log back in (I understand this is not great usability - it is a security requirement we've been given by our client).
My initial instinct is that this is either not possible, or that any solutions will be extremely unreliable. The only solutions we've come up with are:
Add a JavaScript onBlur event handler that tells the server to log out the session when the user leaves the site.
Once the user has logged in, check the HTTP referrer to ensure that the user has navigated from within the site.
Add AJAX polling back to the server to keep the session refreshed, possibly on a 10-second interval. When the call isn't received on time the session would end.
The onBlur handler seems like the easiest, but possibly least reliable, method - I'm not sure it would even work. There are also issues with the referrer method, since the user could type in an address within the site rather than follow a link. The AJAX method seems like it would work, but it's complicated - I'm not even sure how to handle it on the back end. I'm thinking there might also be scenarios in which it wouldn't always work.
Any ideas would be appreciated. Thanks.
I have gone for a heartbeat-type scenario like you describe above, using either AJAX polling or an IFRAME. When the user closes the browser and a certain timeout elapses (10 seconds?), you can log them out.
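One hedged way to wire up the server side of that heartbeat (handler and key names are invented): each ping refreshes a short-lived cache marker, the marker would also need to be set at login, and a base page or HttpModule forces a fresh login once the marker has expired.

using System;
using System.Web;
using System.Web.Caching;

// Heartbeat.ashx: the page pings this every few seconds while it is open.
public class HeartbeatHandler : IHttpHandler
{
    private static readonly TimeSpan HeartbeatTimeout = TimeSpan.FromSeconds(10);

    public void ProcessRequest(HttpContext context)
    {
        if (context.User.Identity.IsAuthenticated)
        {
            // Refresh a short-lived "still here" marker for this user.
            context.Cache.Insert("alive:" + context.User.Identity.Name, DateTime.UtcNow,
                                 null, DateTime.UtcNow.Add(HeartbeatTimeout),
                                 Cache.NoSlidingExpiration);
        }
        context.Response.ContentType = "text/plain";
        context.Response.Write("OK");
    }

    public bool IsReusable { get { return true; } }
}

// In a base page or HttpModule, something along these lines signs the user out
// when the heartbeat has been missing for longer than the timeout:
//   if (User.Identity.IsAuthenticated && Cache["alive:" + User.Identity.Name] == null)
//   {
//       System.Web.Security.FormsAuthentication.SignOut();
//       System.Web.Security.FormsAuthentication.RedirectToLoginPage();
//   }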
Another alternative would be to have the site run entirely on AJAX. Thus there is only one "URL" that a user can visit and all content is loaded dynamically. Of course you break all sorts of usability stuff this way, but at least you achieve your goal.
If the user closes their browser, or types in a different URL (including selecting a favourite) there is not much for you to detect.
For links on your site, you could create links that forward via your site (i.e. rather than linking to http://example.com/foo you link to http://mysite.com/forwarder?dest=http://example.com/foo).
Just be careful to only forward to sites you intend to, otherwise you can open up security issues with "universal forwarding" being used for phishing etc..
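A sketch of such a forwarder with a whitelist check (the handler name, query parameter, and allowed hosts are examples only); it also signs the user out, which ties in with the original requirement:

using System;
using System.Web;
using System.Web.Security;

// Forwarder.ashx?dest=... : only redirect to destinations you explicitly allow.
public class Forwarder : IHttpHandler
{
    private static readonly string[] AllowedHosts = { "example.com", "www.example.com" };

    public void ProcessRequest(HttpContext context)
    {
        string dest = context.Request.QueryString["dest"];
        Uri uri;
        bool allowed = Uri.TryCreate(dest, UriKind.Absolute, out uri)
                       && Array.IndexOf(AllowedHosts, uri.Host.ToLowerInvariant()) >= 0;

        if (!allowed)
        {
            context.Response.StatusCode = 400;   // refuse "universal forwarding"
            return;
        }

        FormsAuthentication.SignOut();           // the user is leaving the site: end their login
        context.Response.Redirect(uri.AbsoluteUri, false);
    }

    public bool IsReusable { get { return true; } }
}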
You absolutely, positively need to tell the client that this is not possible. They are having a basic misunderstanding of how the Web works. Be diplomatic, obviously... hell, it's probably someone else's job... but it needs to be done.
Your suggestions, or a combination of them, may work in a simple proof of concept... but they will bring you nothing but support nightmares and will not work consistently. Worse, you will undoubtedly also create situations where users cannot use the application at all because the security hacks misfire on them.
Javascript has an onUnload event, which is triggered when the browser is told to leave the page. You can see this on StackOverflow when you try to press the back button or click a link while editing an answer.
You may use this event to trigger an auto-logoff for your site.
I am unsure, however, whether this will handle cases where the browser is deliberately closed or the browser process is externally terminated (I'm guessing it doesn't fire in the second case).
If all navigation within your site is done through .NET postbacks (no plain HTML links or JavaScript open statements), you can automatically log off and redirect to the login page whenever a page load is not a postback. This does not end the session on exit, but it looks like it does, because it forces a login whenever someone navigates to your web app manually. To get this behavior on all pages, you can use a Master page that does this in Page_Load.
private void Page_Load(object sender, System.EventArgs e)
{
    if (!IsPostBack)
    {
        System.Web.Security.FormsAuthentication.SignOut();
        System.Web.Security.FormsAuthentication.RedirectToLoginPage();
    }
}

Notifying the user after a long Ajax task when they might be on a different page

I have an Ajax request to a web service that typically takes 30-60 seconds to complete. In some cases it could take as long as a few minutes. During this time the user can continue working on other tasks, which means they will probably be on a different page when the task finishes.
Is there a way to tell that the original request has been completed? The only thing that comes to mind is to:
wrap the web service with a web service of my own
use my web service to set a flag somewhere
check for that flag in subsequent page requests
Any better ways to do it? I am using jQuery and ASP.Net, if it matters.
You could add another method to your web service that allows you to check the status of a previous request. Then you can use ajax to poll the web service every 30 seconds or so. You can store the request id or whatever in Session so your ajax call knows what request ID to poll no matter what page you're on.
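A hedged sketch of that idea as a script-enabled web service; the service name, session key, and LookUpStatus helper are all placeholders for however your wrapped service actually reports progress:

using System.Web.Script.Services;
using System.Web.Services;

[WebService(Namespace = "http://tempuri.org/")]
[ScriptService]
public class TaskStatusService : WebService
{
    // Call this when the long request is kicked off, so any later page can find it.
    [WebMethod(EnableSession = true)]
    public void RememberPendingRequest(string requestId)
    {
        Session["PendingRequestId"] = requestId;
    }

    // Poll this from any page every 30 seconds or so.
    [WebMethod(EnableSession = true)]
    public string CheckPendingRequest()
    {
        string requestId = Session["PendingRequestId"] as string;
        if (requestId == null)
            return "none";
        return LookUpStatus(requestId);
    }

    private static string LookUpStatus(string requestId)
    {
        // ... ask the wrapping web service / database for this request's state ...
        return "running";
    }
}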
I would say you'd have to poll once in a while to see if the request has ended and then show a notification, like this site does with badges, for example.
First, make your request return immediately with something like "Started processing...". Then use a different request to poll for the result. It is not good for either the server or the client's browser to keep long HTTP requests open. Moreover, the user should be informed that they are starting a request that could take some time to complete.
To display the result you could have a "notification area" in all of your web pages. Alternatively, you could have a dedicated page for this and instruct the user to navigate there. As others have suggested, you could use polling to get the result.
You could use frames on your site, and perform all your long AJAX requests in an invisible frame. Frames add a certain level of pain to development, but might be the answer to your problems.
The only other way I could think of doing it is to actually load the other pages via an AJAX request, such that there are no real page reloads - this would mean that the AJAX requests aren't interrupted, but may cause issues with breaking browser functionality (back/forward, bookmarking, etc).
Since the web is stateless (you can't set a trigger/event on the server to update the client), the viable strategy is to set up a status function that you can call intermittently with a JavaScript timer to check whether your code has finished executing. When it finishes, you can update your view.
