I need to open my page in two different browser tabs and work on both of them simultaneously. I maintain a session in my code which holds all the data from the page. When I modify data in one tab, the change is reflected in the second tab. How can I get two different sessions for the same page?
You cannot do it the way you ask. Each browser has its own cookie to connect with the user, and all tabs in that browser share it.
What alternatives do you have? The point here is to connect the same user (no matter what browser they use) to the same data.
You force the user to log in (and then you have the same data, based on the user).
You use some variable in the URL that must be the same in all browsers to get the same data. E.g., in the first browser you have a URL like http://www.test.com?s=123, and you ask the user to copy-paste either that URL or the code into the second browser to connect the two.
In either case the session does not do the work, and you need some other database to connect the data of the two pages.
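A minimal sketch of the second option, with a static dictionary standing in for the shared database the answer mentions (the page class, field names, and 6-character code are all invented for illustration):

    using System;
    using System.Collections.Generic;

    public partial class TestPage : System.Web.UI.Page
    {
        // Process-local stand-in for the real shared store keyed by the ?s= code.
        private static readonly Dictionary<string, string> SharedData =
            new Dictionary<string, string>();

        protected void Page_Load(object sender, EventArgs e)
        {
            string code = Request.QueryString["s"];
            if (string.IsNullOrEmpty(code))
            {
                // No code yet: mint one and put it in the URL so the user can
                // copy-paste it into the second browser.
                code = Guid.NewGuid().ToString("N").Substring(0, 6);
                Response.Redirect(Request.Path + "?s=" + code);
            }

            // Any browser presenting the same code reads and writes the same data.
            string value;
            SharedData.TryGetValue(code, out value);
            // ... render or update "value" keyed on "code" ...
        }
    }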
The tabs in a browser share the same set of cookies, which includes the session cookie - so the pages use the same session.
So you need to find some other way to distinguish those two tabs.
You could add a "tab" parameter to the URL that you use on that page, with some tab-specific value. You need to maintain that value in the URL when you redirect; you can't store it in Session or a cookie.
The processing must then be done in a tab-specific manner.
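A rough sketch of that approach in ASP.NET code-behind: the tab ID itself travels only in the URL, and the page's data is keyed on it in Session so the two tabs stop overwriting each other (the class name and key scheme are made up):

    using System;
    using System.Collections.Generic;

    public partial class MyPage : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            string tabId = Request.QueryString["tab"];
            if (string.IsNullOrEmpty(tabId))
            {
                // First hit from this tab: assign an ID and put it in the URL.
                // Every link and redirect from here on must carry it along.
                tabId = Guid.NewGuid().ToString("N");
                Response.Redirect(Request.Path + "?tab=" + tabId);
            }

            // Store the per-tab state under a tab-specific session key.
            string key = "pageData_" + tabId;
            var data = Session[key] as Dictionary<string, string>
                       ?? new Dictionary<string, string>();
            Session[key] = data;
        }
    }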
If it is just for testing, you could use "private tabs" - they don't leak their cookies into regular tabs. But you do not want to force regular users to use this.
I am a little new to programming (especially to web design). I have learned that the World Wide Web is based upon a protocol called HTTP, and that each and every item (I mean web pages, images, CSS & JS files, etc.) is transferred according to HTTP requests. So my problem is this.
When we fill in a web form (especially a login form like Facebook's) and click the OK, Login, or Submit button, what happens next? Does it send another HTTP request, or does it use some special technique?
Is it safe, or can anyone steal our usernames and passwords while those requests are traveling through the internet?
It actually depends on the person who made it. They can create output which shows the values entered, or the values can be stored in a database for later use. There are so many things that can be done, and it really depends on the needs of the application.
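To make the first part of the question concrete: clicking Submit simply makes the browser send another HTTP request (normally a POST) whose body carries the field values. A sketch of the equivalent request issued from C#, with an invented URL and field names:

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;

    class FormPostDemo
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // The same url-encoded body a browser builds from the form fields.
                var fields = new Dictionary<string, string>
                {
                    { "username", "alice" },
                    { "password", "s3cret" },
                };

                HttpResponseMessage response = await client.PostAsync(
                    "https://www.example.com/login",
                    new FormUrlEncodedContent(fields));

                Console.WriteLine(response.StatusCode);
            }
        }
    }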
Added for 2nd question:
There are a number of ways to encrypt this data to avoid it being intercepted. If you use a very basic technique to transfer the values you submit (plain HTTP, for example), there is a real possibility that it can be captured in transit. But not to worry, as there are plenty of ways to be safe; the standard one is to serve and submit the form over HTTPS, so the traffic is encrypted.
My website is used to administer customer accounts. If I access CustomerA's account through the website, then open a new tab and access CustomerB's account, the session holding the customer ID updates to think I'm now working on CustomerB. Then if I click back to CustomerA's tab and start editing that page, I am in fact editing the database record for CustomerB. This has happened and caused all sorts of problems, so I need to find a foolproof way of stopping it. I don't want to put the customer ID in the URL as this will make it open to abuse.
Session is not a place to hold information like this, exactly because of the problems you're describing. You need to pass the customer ID along with the page itself (either in a hidden field or in the URL), so when you post back the form, the server knows exactly which record you are working on. Session won't protect you nor add any extra security; you need to determine whether the user has the correct permissions either way, so you should focus on that aspect.
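A sketch of the hidden-field approach in WebForms code-behind; the control and the helper methods (CustomerIdField, GetSelectedCustomerId, CurrentUserCanEdit, SaveCustomer) are hypothetical names:

    // Assumes the .aspx markup contains:
    //   <asp:HiddenField ID="CustomerIdField" runat="server" />
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // The ID travels with the page itself, not with the session,
            // so each tab keeps editing the customer it was opened for.
            CustomerIdField.Value = GetSelectedCustomerId().ToString();
        }
    }

    protected void SaveButton_Click(object sender, EventArgs e)
    {
        int customerId = int.Parse(CustomerIdField.Value);

        // A posted hidden field can be tampered with, so re-check
        // permissions on the server before touching the record.
        if (!CurrentUserCanEdit(customerId))
            throw new UnauthorizedAccessException();

        SaveCustomer(customerId);
    }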
Is there a way to make a web app handle separate sessions for different browser windows/tabs, other than having the session ID inside the URL?
In general, what are the ways of storing a session ID other than cookies and URL parameters?
I think you could try to do it using hidden fields in forms, but you would have to create a hidden form and make all links act as submit buttons (with some short JavaScript code setting the proper value in the fields responsible for the target). IMHO it is not the best solution, because every request would then use the POST method and inherit all of its disadvantages.
And as for the question about other methods, here is a short list.
In short, no. You don't get any tab information from the client; a new browser window is just another client. The only way to differentiate clients is via cookies or parameters. Maybe you could create a tab-named cookie based on the JavaScript window object ID or something, but I kind of doubt it.
HTML5 (advert click-through, sorry) has some per-tab local storage options (sessionStorage), but that probably won't help you right now.
Not sure if you searched before posting, but I found another question like yours - unanswered, but some good advice in the suggestions.
What is the most standard or best way to persist data between requests?
Should I use cookies or session variables? I'm interested in keeping data like sort order, sort column, and page number (for pagination).
I'm coming from a WebForms background, so normally this type of thing was handled for me automatically in the ViewState of the controls I was using.
Update:
I like the querystring idea for searching and more meaningful URLs; however, I'm working on an "index/list" view, which consists of a view with a header, "control" options like DDLs for filtering, and a partial view that renders the table of data.
The DDLs use a $.load() to call an ActionResult on the controller, which returns the partial view, with the parameters passed in the querystring; but since these are AJAX requests, the main page URL in the user's browser does not get updated.
Is there a best practice for taking querystrings off the main-page URL and using them in AJAX requests to other ActionResults?
If you want it to survive only through one request/redirect, TempData is your friend.
However, for things like your pagination, the URL is the best method, for the ability to share links alone.
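For completeness, a sketch of the TempData pattern in ASP.NET MVC; the value survives exactly one subsequent request, which is what makes it work across a redirect (the controller, action, and key names are illustrative):

    using System.Web.Mvc;

    public class CustomerController : Controller
    {
        [HttpPost]
        public ActionResult Save(/* model omitted */)
        {
            // ... persist changes ...
            TempData["StatusMessage"] = "Customer saved."; // lives for one request
            return RedirectToAction("Index");
        }

        public ActionResult Index()
        {
            // Read after the redirect; gone again on the following request.
            ViewBag.StatusMessage = TempData["StatusMessage"];
            return View();
        }
    }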
A standard way is to pass this sort of thing via URL query parameters. You can modify your routing to expect certain URL variables. That way the pages become more search-engine friendly as well.
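A sketch of such a route in ASP.NET MVC (the route name, URL pattern, and defaults are only an example):

    using System.Web.Mvc;
    using System.Web.Routing;

    // In Global.asax.cs or App_Start/RouteConfig.cs:
    public static void RegisterRoutes(RouteCollection routes)
    {
        // e.g. /products/page2/sort-Name binds page=2, sortColumn="Name"
        routes.MapRoute(
            name: "ProductList",
            url: "products/page{page}/sort-{sortColumn}",
            defaults: new { controller = "Products", action = "Index",
                            page = 1, sortColumn = "Name" });
    }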
It depends on how permanent you want the information to be:
Things like the page number should indeed be in the URL (as others have pointed out) - this helps with bookmarking, etc., but remember that if you add more content to the list, then that bookmarked result set will not always be what the user wanted...
If you're happy for these values to be lost when a session times out (by default around 20 minutes), then put them in Session.
If you think that sessions are going to time out before the next request, or you want to save the settings across visits, then you should store them in either cookies or a profile (potentially allowing "Anonymous" profiles, which work with the user's cookies, so they would lose them across machines).
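If you go the cookie route, here is a sketch of saving and reading the list settings with a lifetime that outlives the session (the cookie name and the 30-day expiry are arbitrary choices):

    // Writing the settings, for example after the user changes the sort:
    var cookie = new HttpCookie("listSettings");
    cookie.Values["sortColumn"] = "Name";
    cookie.Values["sortOrder"] = "asc";
    cookie.Expires = DateTime.Now.AddDays(30); // outlives the session timeout
    Response.Cookies.Add(cookie);

    // Reading them back on a later request, falling back to a default:
    HttpCookie saved = Request.Cookies["listSettings"];
    string sortColumn = (saved != null && saved.Values["sortColumn"] != null)
                        ? saved.Values["sortColumn"]
                        : "Name";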
Personally, I'd think very carefully about putting sort order and columns in the URL; if you do, you could actually end up really confusing search engines:
Lots of pages with very similar content (page 1 sorted by date desc, page 1 sorted by date asc, etc.) - search engines don't like duplicate content, and nor should you, as Google (for instance) will only show two pages from your site in a default result set; you want them to be valid, not duplicates.
Search engines will spend lots more time crawling your site, and potentially give up - if on every page they find links to "Sort by this column", they will attempt to follow them, resulting in more work on the server, higher bandwidth use, etc.
These can be mitigated through the use of a robots.txt file denying access to sorted versions of the page, but if that is generated almost dynamically, it will be very complex to maintain going forward.
In response to your update, a nice way to achieve that for pages would be to have links to "Previous" and "Next" pages of results (or better yet, a list of all pages in the list), output on the page, with the page numbers, that you then hide with JavaScript.
This way users should see your nice, AJAXy behaviour, and search engines (and users without JavaScript - mobile, or those using older screen readers for instance) will still be able to get access to all your pages - this will help your pages to degrade gracefully, or use "Progressive Enhancement".
Things that were previously in ViewState should probably be put back in the client's hands via either hidden fields or cookies.
Session is "too" easy. In a dev environment it works great, pretty much no matter what you put in it. In production scalability and persistence become a problem. In-process session is likely to disappear unexpectedly if you have crashing bug in your site, and requires server affinity when load balancing. Out-of process session fixes the durability and affinity issues, but can still be a performance bottle neck if too much stuff is put in session. A VERY common problem is that each page will put 1 or 2 items into session but never take them out again when they are done. And even if a page removes it session data when it is no longer needed, the data can still get orphaned if a user starts a process and never completes it.
Cookies are a fast and simple way to persist data between requests, and you can also make them live only for a limited time, depending on your needs.
Sessions are easiest.
A site has hundreds of pages, following a certain sitemap. A user can navigate to page2.aspx from page1.aspx. But if the user goes to page2.aspx directly, say through a bookmarked URL, the user should be redirected to page1.aspx.
Edit: I don't want to go in and add code to every page that needs to fulfill this requirement.
Note: This is not a cross-page postback scenario.
You might consider something that is based off WorkFlow, such as this: http://blogs.msdn.com/mwinkle/archive/2007/06/07/introducing-the-pageflow-sample.aspx
The WCSF team also included a pageflow application block that you can use as a standalone add-on to your application.
I guess you could check the referrer, and if there isn't one, or it isn't page1.aspx, then you could redirect back to page1.aspx.
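A sketch of that check in page2.aspx's code-behind; as the next answer points out, the Referer header can be faked or stripped, so treat it as a convenience rather than as security:

    protected void Page_Load(object sender, EventArgs e)
    {
        Uri referrer = Request.UrlReferrer; // null if the header was not sent

        if (referrer == null ||
            !referrer.AbsolutePath.EndsWith("page1.aspx",
                                            StringComparison.OrdinalIgnoreCase))
        {
            Response.Redirect("~/page1.aspx");
        }
    }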
As another answerer mentioned, you could use the Referrer header, but that can be faked by the client.
Since you don't want to modify each page, you could do something with an IHttpModule. Assuming you have some way of describing the valid page navigations, you could do something like this in a per-request event handler (session state is not yet available in BeginRequest, so PostAcquireRequestState is the event to hook):
Check the session for a list of valid pages (using a default list for first visit if none are in the session).
If this request is for an invalid page, redirect to the place the user should be.
Based on this request, set up the list of valid pages and redirect page in the session so it's ready for the next request.
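A sketch of such a module, with a hard-coded page map standing in for whatever description of valid navigations you have (the map and the "ValidPages" session key are hypothetical):

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web;

    public class PageFlowModule : IHttpModule
    {
        // Which pages may be visited after a given page.
        private static readonly Dictionary<string, string[]> NextPages =
            new Dictionary<string, string[]>(StringComparer.OrdinalIgnoreCase)
            {
                { "~/page1.aspx", new[] { "~/page1.aspx", "~/page2.aspx" } },
                { "~/page2.aspx", new[] { "~/page2.aspx", "~/page3.aspx" } },
            };

        public void Init(HttpApplication app)
        {
            app.PostAcquireRequestState += OnPostAcquireRequestState;
        }

        private static void OnPostAcquireRequestState(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;
            if (context.Session == null) return; // static resources, etc.

            string current = context.Request.AppRelativeCurrentExecutionFilePath;
            var valid = context.Session["ValidPages"] as string[]
                        ?? new[] { "~/page1.aspx" }; // first visit: entry page only

            if (!valid.Contains(current, StringComparer.OrdinalIgnoreCase))
            {
                context.Response.Redirect("~/page1.aspx"); // invalid jump
                return;
            }

            // Remember where the user may go next, ready for the next request.
            if (NextPages.TryGetValue(current, out string[] next))
                context.Session["ValidPages"] = next;
        }

        public void Dispose() { }
    }

The module is registered once in web.config, so no individual page needs to change, which matches the requirement in the question.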
I recently worked with real code that checked whether the referrer was blank and used that as a step in authorization. The idea was that users wouldn't be able to fake a referrer; in reality, you don't need a custom browser to fake one. Users can bookmark your page to Delicious, and then delicious.com is the referrer (and not blank).
I've had real arguments about how sophisticated a user needs to be to do certain hacks - i.e., if users don't know how to set the referrer, then you can trust it. While it's true your users are unlikely to write a custom browser, there are already Firefox add-ons to set headers, referrers, etc., and they're easy to use.
Josh has the best answer: on page2 you should check the page hit log and see if the user has recently visited page1.
I like a lot of the answers above (specifically the workflow one).
Another option is creating each page as a user control and having page1.aspx control which user control gets loaded. This has the advantage of storing your workflow in a single place instead of on each page.
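A sketch of that host-page idea; the PlaceHolder, the control paths, and the "Step" session key are all hypothetical:

    // page1.aspx code-behind; assumes the markup contains
    //   <asp:PlaceHolder ID="StepPlaceholder" runat="server" />
    // and one .ascx user control per step of the workflow.
    protected void Page_Load(object sender, EventArgs e)
    {
        string step = Session["Step"] as string ?? "Step1";

        // The host decides which view to show; the URL never changes,
        // so deep links into the middle of the flow aren't possible.
        Control view = LoadControl("~/Controls/" + step + ".ascx");
        StepPlaceholder.Controls.Add(view);
    }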
However, I don't think there's a magic bullet out there. It sounds like this security problem is an afterthought, or possibly reported as a bug, and you have been tasked with fixing it quickly and efficiently.
I would start weighing the answers here by their associated cost in hours. I suspect the quickest solution will be to check referrer addresses on each page. Although hackable, it is obscure, and if that risk is acceptable to you, it may be the appropriate solution.