Prevent Postback when user clicks browser's back button - asp.net

I have a web page that sends email to multiple users (online distribution list). After the submit button is clicked and the email is sent, a status page is shown listing how many emails were sent, errors, and other information. If the user clicks the back button, the email is resent. How can I prevent this?
NOTE: The browser DOES prompt the user to "resubmit" or "resend" data to the page before actually sending the email, but that does not stop my users from clicking it and then wondering why two copies of the email were sent out.
Environment:
Server: C#, ASP.NET 2.0, IIS6
Client: any browser (I don't want an IE-specific solution like SmartNavigation)

In your code, right after the point where e-mails are sent, do a 302 redirect to a confirmation page:
protected void btnSend_Click(object sender, EventArgs e)
{
    SendManyEmails();
    Response.Redirect("confirmation.aspx");
}
With this kind of code, the POST to the original page will not end up in the browser history.
This common pattern is known as the Post/Redirect/Get pattern.
Bonus info about keeping state when doing Post/Redirect/Get
The main drawback of this pattern is that all state from the handling of the POST request is lost when the user is redirected, since the redirect starts a new request context. In ASP.NET this includes members of the Page and all Control objects, as well as everything stored in the ViewState.
If you generate some kind of "status object" - maybe a log of sent mail messages - while handling the POST request, you will need some way to save this object for the following GET request. Some web frameworks have functionality specifically for this: RoR has flash, ASP.NET MVC has TempData. ASP.NET Web Forms has no such concept built in, so you will have to figure something out yourself.
Saving the object to the Session on the POST, reading and deleting it on the following GET would be one way to solve this. You can build an abstraction around this if you use it in several places, or you could search the web for existing implementations of flash/TempData for ASP.NET forms.
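A minimal sketch of that Session approach, assuming SendManyEmails() is changed to return some kind of status object and using a key name of your own choosing:
protected void btnSend_Click(object sender, EventArgs e)
{
    object report = SendManyEmails();        // hypothetical: now returns a status/report object
    Session["SendReport"] = report;          // stash it for the request that follows the redirect
    Response.Redirect("confirmation.aspx");
}

// In confirmation.aspx.cs: read the object once, then remove it.
protected void Page_Load(object sender, EventArgs e)
{
    object report = Session["SendReport"];
    Session.Remove("SendReport");
    // ...bind the report to the page here
}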

Related

Making fake post back through Http Poster or Http Requester in Firefox (ASP.NET)

I want to send a fake POSTBACK request to an .aspx page hosted on an IIS 7.5 server with .NET Framework 4.
When I click a button on the page (seemingly the login button), the browser simply posts the form data along with
__EVENTVALIDATION, __VIEWSTATE and myButtonId with the value Login (the caption of my button).
After the Login button is hit, the page sends back a result such as an "Incorrect username or password" error message inside a tag.
But I want this login operation done within a single POST request.
What have I tried so far?
I know that the __EVENTVALIDATION parameter contains an encrypted validation of the page's controls, which the server checks to detect tampering.
So I passed __EVENTVALIDATION and __VIEWSTATE (both values obtained from the previous request made to the server), along with the username, password and button ID, roughly like this (a sketch; the URL and control names below are placeholders for the real ones):
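// Sketch of the single replayed POST described above, using WebClient.
// __VIEWSTATE and __EVENTVALIDATION come from a prior GET of the page;
// the URL and control names ("txtUsername" etc.) are placeholders.
using System.Collections.Specialized;
using System.Net;
using System.Text;

static string PostLogin(string viewState, string eventValidation)
{
    NameValueCollection form = new NameValueCollection
    {
        { "__VIEWSTATE", viewState },
        { "__EVENTVALIDATION", eventValidation },
        { "txtUsername", "myUser" },
        { "txtPassword", "myPassword" },
        { "btnLogin", "Login" }   // the submit button's own name/value pair
    };

    using (WebClient client = new WebClient())
    {
        byte[] response = client.UploadValues("http://myserver/Login.aspx", "POST", form);
        return Encoding.UTF8.GetString(response);
    }
}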
But it all fails: the page just loads as if it were the first request.
The server can't identify it as a POSTBACK request.
What should I do? I've been cracking my head over this for days.
Is there anyone here who can solve my problem?
And one more request: please don't mark this as a duplicate. I didn't get an acceptable answer from any of the previous Stack Overflow questions, which is why I'm posting this...

Server.Transfer from ASP.NET to ASP

Not a duplicate of “Server.Transfer from ASP to ASP.Net” ;-)
On an IIS web server (running Classic ASP), I have a local URL that a user is remotely redirected to. Presumably, this call is made with data in the query string or transmitted through POST data. When this request is made, I need to remove this data (especially the query string) server-side, so none will be visible to the client.
For example, the user is led to http://example.com/dir/?data=payload. This is what is requested, and this is what the user’s browser will display. Now I need the requested resource to strip the QueryString and Form data, so that the user ends up at e.g. http://example.com/dir/.
On MSDN, they have HttpServerUtility.Transfer, which adds a boolean to the classic Server.Transfer method allowing you to preserve or clear the data. However, when I try this in an aspx file transferring to an asp file, I get a 0x80004005 HTTP exception (“No http handler was found for request type 'GET'”).
Is it possible at all to “redirect” from an ASP.NET file to a Classic one?
Is there another, better way to remove request data server-side?
My options would be:
Use a redirect to the page without query strings, via Response.Redirect(). This will clear POST data as well.
Do an HTTP request to scrape the HTML of the other page, and render it in your current page.
I would probably go with option #1.
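A rough sketch of option #1, assuming the ASP.NET page is the one that first receives the request with the query string:
protected void Page_Load(object sender, EventArgs e)
{
    if (Request.QueryString.Count > 0 || Request.Form.Count > 0)
    {
        // GetLeftPart(UriPartial.Path) drops the query string, so the
        // browser ends up at the bare URL, e.g. http://example.com/dir/
        Response.Redirect(Request.Url.GetLeftPart(UriPartial.Path));
    }
}
Any POST data is lost in the redirect, so if the target page still needs some of it, stash it server-side (for example in Session) before redirecting.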

Call and receive URL data in asp.net

In ASP.NET (VB), I have to call a page that only authenticates a user, returning the user data if the logon succeeds. I would like to implement this sequence:
A blank page (mine) requests the authenticator page (3rd party) automatically on load;
The user logs in on the authenticator page;
My page reads the authenticator page's results and performs the appropriate actions.
I'm a complete beginner in ASP.NET, and I'm using VB.NET to code the page's OnLoad event. I'm trying to use Response.Redirect("url") to call the authenticator's page, but that way, obviously, I can't receive the result. How can I implement this sequence?
Problem solved. I discovered that when I call the authenticator server I have to pass a parameter with the URL it should redirect to (my page, in this case). That way, the authenticator's page itself will request my page after the user logs in, and I can then read the URL parameters containing the user data. Thanks.
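For reference, a sketch (in C#) of the flow described; the parameter names and the authenticator URL are assumptions, since the real ones depend on the 3rd-party service:
protected void Page_Load(object sender, EventArgs e)
{
    if (Request.QueryString["user"] == null)
    {
        // First visit: hand off to the authenticator and tell it where to come back.
        string returnUrl = Server.UrlEncode("http://mysite.example.com/MyPage.aspx");
        Response.Redirect("http://authenticator.example.com/login.aspx?returnUrl=" + returnUrl);
    }
    else
    {
        // The authenticator redirected back here with the user data appended.
        string user = Request.QueryString["user"];
        // ...act on the authenticated user
    }
}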

Prevent multiple users on a page at a time

What would be the best way to prevent multiple users from being on a page at the same time?
For example, if a user is on the page "Home.aspx", no other user should be allowed to go there.
I'm using ASP.NET on the server and the JavaScript framework jQuery on the client side.
The easy part is only allowing one user to access a page. You can for example store a session id in an application variable to keep track of who's on the page.
The hard part is to know when the user leaves the page. The HTTP protocol only handles requests, so the server only knows when a user enters the page. There is no concept of "being on" a page in the protocol.
You can use the onunload event in client code to catch when a user goes somewhere else, however this will not always work. If the user loses the internet connection, there is no way to communicate back to the server that the user leaves the page. If the browser or computer crashes, there will naturally be no onunload event.
You can keep requesting data from the server, by for example reloading an image on the page. That way the server can know if the user is still on the page at certain intervals. However, if the user loses the internet connection, the server will think that the user has left, while the user thinks that he/she is still on the page.
Another problem is browser history and cache. A user might leave the page, then go back to the page again. You have to make sure that the page is not cached, or the browser will just use the cached page and the server has no idea that the user thinks that he/she is on the page again.
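One way to keep the page out of the cache is to send no-cache headers from the server; a minimal sketch:
protected void Page_Load(object sender, EventArgs e)
{
    // Ask browsers and proxies not to cache this page, so returning to it
    // (for example via the Back button) produces a request the server can see.
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
    Response.Cache.SetNoStore();
    Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(-1));
}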
Agreed with Guffa: you cannot be sure whether the browser is still on the page, you can only check whether it has recently connected to that page.
You can do a sort of "ping", but it's more a trick than a 100% working solution, and it requires JavaScript to be enabled.
I haven't done it myself, but I would look at XMLHttpRequest and onreadystatechange to handle this:
1) On page load, the browser (client) initiates an XMLHttpRequest to the web site (server), then waits for the callback via the onreadystatechange event.
2) The web site receives the request and "marks" the page as "in use" with the current DateTime.Now.
3) The web site then sends the response.
4) The onreadystatechange handler gets the response, and after 1 minute the event code re-requests the server, repeating step 2.
5) If another client requests the page, the server checks the DateTime mark: if the mark is more than 1 minute old, the first client did not respond to the request and may no longer be on the page.
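A server-side sketch of that mark-and-expire idea; the Application keys and the one-minute window are illustrative choices, not a prescribed API:
protected void Page_Load(object sender, EventArgs e)
{
    Application.Lock();
    try
    {
        DateTime? lastPing = Application["HomeLastPing"] as DateTime?;
        string owner = Application["HomeOwner"] as string;

        bool heldByOther = owner != null
            && owner != Session.SessionID
            && lastPing.HasValue
            && (DateTime.Now - lastPing.Value) < TimeSpan.FromMinutes(1);

        if (heldByOther)
        {
            Response.Redirect("NoAccess.aspx", false);
            return;
        }

        // Claim (or refresh) the page for the current session.
        Application["HomeOwner"] = Session.SessionID;
        Application["HomeLastPing"] = DateTime.Now;
    }
    finally
    {
        Application.UnLock();
    }
}
The one-minute ping from step 4 would then hit an endpoint that simply refreshes Application["HomeLastPing"] in the same way.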
Not sure why you would want to do this, because it flies in the face of web usability. You could implement a locking mechanism on each page in server-side code (write user name, page and time to a DB), which is freed up when the user goes to another page. You would then check in the page load event whether anyone currently has that page locked. However, and this is a big however: have you considered what happens if somebody just shuts their browser down, or walks off and leaves it on a page? You would also seriously need to consider a timeout to free up stale locks. That would need to be a background service, either global code in global.asax or a separate process.
Maybe use static variables to hold the IP of the first user to access the page, and then check whether other requests come from the same IP; otherwise display a "no access" page.
Make sure you lock access to them (the lock object must be static, or each request would lock a different object):
private static readonly object thisLock = new object();

lock (thisLock)
{
    // read or update the static variables here
}
You should also use the Session_End method in global.asax to remove the IP address in case the user leaves your website without clicking the logout button.

PageMethods security

I'm trying to 'AJAX-ify' my site in order to improve the UI experience. In terms of performance, I'm also trying to get rid of the UpdatePanel. I've come across a great article over at Encosia showing a way of posting using PageMethods. My question is, how secure are page methods in a production environment? Being public, can anyone create a JSON script to POST directly to the server, or are there cross-domain checks taking place? My PageMethods would also write the data into the database (after filtering).
I'm using Forms Authentication in my pages and, on page load, it redirects unauthenticated users to the login page. Would the Page Methods on this page also need to check authentication if the user POSTs directly to the method, or is that authentication inherited for the entire page? (Essentially, does the entire page cycle occur even if a user has managed to post only to the PageMethod)?
Thanks
PageMethods are as secure as the handler in which they reside.
FormsAuthentication will protect everything except the Login page.
On an unprotected handler, like login, you should expose only methods that 1) are not sensitive or 2) validate the user.
EDIT: in response to comments and other answers regarding CSRF and XSS please see http://weblogs.asp.net/scottgu/archive/2007/04/04/json-hijacking-and-how-asp-net-ajax-1-0-mitigates-these-attacks.aspx
You're trying to protect against CSRF attacks.
These attacks can be prevented by requiring an authorization code in the POST parameters, and supplying the auth code in the initial page load. (The auth code should be per-IP address and per-user, and should expire quickly)
For added security, you can make each auth-code only usable once, and have each request return a new auth-code. (However, if any request fails, you'll need to reload the page)
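A minimal sketch of that idea applied to a page method; the token name, the Session storage and the lack of expiry handling are simplifying assumptions, not a built-in ASP.NET feature:
// On the normal page load, issue a token and emit it to the page so the
// client-side script can send it back with every PageMethods call.
protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        Session["AuthCode"] = Guid.NewGuid().ToString("N");
        ClientScript.RegisterHiddenField("authCode", (string)Session["AuthCode"]);
    }
}

// The page method takes the token as an extra argument and rejects
// requests that do not carry the expected value.
[WebMethod(EnableSession = true)]
public static string SaveData(string data, string authCode)
{
    string expected = HttpContext.Current.Session["AuthCode"] as string;
    if (expected == null || authCode != expected)
        throw new InvalidOperationException("Invalid request token.");

    // ...filter the data and write it to the database here
    return "ok";
}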
I am working on a project that heavily utilizes ASP.NET WebForms Page Methods, which I talk to using Ajax. This is much more convenient for me than writing all my code in JavaScript.
However, securing the page methods became an issue that troubled me. I saw that I could access the page methods via Postman and Fiddler, which lets attackers play with your APIs.
My solution, which I discovered accidentally, was quite simple: adding a cookie read to the static page method makes it return an error for any app that is NOT the website.
[WebMethod]
[ScriptMethod(UseHttpGet = false, ResponseFormat = ResponseFormat.Json)]
public static string GetAnything(object dat)
{
    // If the cookie is missing (e.g. a Postman or Fiddler call), myguid is null
    // and the .Value access throws, so the caller only gets a generic error.
    HttpCookie myguid = HttpContext.Current.Request.Cookies.Get(Constants.Session.PreventHacking);
    var hackguid = myguid.Value ?? "";

    // ...other page method contents

    return "anything";
}
A Postman request to this method would return:
{
    "Message": "There was an error processing the request.",
    "StackTrace": "",
    "ExceptionType": ""
}
A more detailed error would show when running on localhost.
I understand there are browser add-ons that can intercept API calls by sitting right alongside the website. I have not tested this; a separate security fix would have to be built for that, however.
I'll update here once I perform some tests.
Think of page methods as a mini web service local to the page. The fact is they have no extra checks and verifications in place except those that apply to the entire website and those you choose to put in.
Using page methods is a smart idea from the point of view of encapsulation, and if you're going to use them it doesn't hurt to put some extra security measures in place.
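For example, a page method can re-check authentication explicitly, since it is reachable over HTTP like any other handler (a sketch, assuming Forms Authentication as in the question):
[WebMethod]
public static string GetSensitiveData()
{
    // Forms Authentication populates the user; reject anonymous callers here
    // even though the hosting page would normally have redirected them already.
    if (!HttpContext.Current.User.Identity.IsAuthenticated)
        throw new System.Security.SecurityException("Not authenticated.");

    // ...return data only for signed-in users
    return "secret";
}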
