What options do I have to work around disabled cookies for session management?
In the page, in a hidden form field
In the query string
In the HTTP header
You can append an SID variable to every link you output to the user. PHP has some built-in support for this (the session.use_trans_sid setting).
Well, all a cookie does is hold on to the big ugly string your system generated as that user's session identifier (SID). If you don't have cookies, the goal is to get that SID sent in with every request from that specific user.
Creating a hidden form field with the SID in it is necessary when you are accepting input from the user. You should probably read up a bit on Cross-Site Scripting vulnerabilities - might as well head these off while you're monkeying with your forms anyway.
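As a rough illustration in ASP.NET (the form target and field names here are hypothetical), the hidden field just carries the SID back with every form post:
<form method="post" action="process.aspx">
  <!-- Hypothetical sketch: the hidden field returns the SID on submit -->
  <input type="hidden" name="sid" value="<%= Server.HtmlEncode(Session.SessionID) %>" />
  <input type="text" name="comment" />
  <input type="submit" value="Send" />
</form>
Encoding the value on the way out is the same habit the XSS reading will recommend for anything you echo into a page.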
Adding data to links (via the query string) is typically called "URL Rewriting", so just look that up for details. The upshot is that every time you output a link it must have the SID as one of the parameters in the query string.
For example: "http://mysite.com/action?SID=da83fdec49ebfafe4"
Some frameworks can handle this URL rewriting semi-transparently.
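For example, ASP.NET can embed the session ID in the URL for you when you turn on cookieless sessions in web.config (a sketch; other frameworks have similar switches):
<configuration>
  <system.web>
    <!-- Embeds the session ID in every URL instead of using a cookie -->
    <sessionState cookieless="UseUri" />
  </system.web>
</configuration>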
I have a website that has been experiencing errors because of null references due to poorly coded logic around the user agent. Basically, there has been a slew of incoming requests that contain no user agent, which leads to null reference exceptions in the user agent tracking (it contained a call to Request.UserAgent.ToLower()). I am correcting this logic to avoid the error condition. Since I'm certain these requests are coming from specialized tools and not ordinary users, I'm also blocking empty user agents via URL rewrite rules.
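For reference, the correction is essentially a null guard before the ToLower() call; a minimal sketch (TrackUserAgent is a stand-in for the real tracking code):
// Request.UserAgent is null when the client sends no User-Agent header,
// so guard it before calling ToLower().
string userAgent = Request.UserAgent;
if (!string.IsNullOrEmpty(userAgent))
{
    TrackUserAgent(userAgent.ToLower()); // hypothetical tracking call
}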
I need to test both of these changes. However, I can't seem to find a user agent spoofer that will let me generate a simple GET request with NO USER AGENT. All of the tools that I have tried will allow a custom agent string, but they won't let that string be left empty, and there are no options that I can find to tell them to send no user agent.
So my question is, what tools are available, for a Windows-based system, that I can use to emulate a browser request with NO USER AGENT so that I can verify that my changes are working properly?
I believe that value is coming from the request headers. If so, just try Fiddler. Go to the Composer tab - by default it adds a User-Agent header to the request; however, when you delete it in the Composer, it is omitted from the request entirely.
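If you would rather script the test, plain .NET code works too: HttpWebRequest sends no User-Agent header at all unless you set one (a small console sketch; the URL is a placeholder):
using System;
using System.Net;

class NoUserAgentTest
{
    static void Main()
    {
        // No UserAgent property is set, so this GET goes out with no User-Agent header.
        var request = (HttpWebRequest)WebRequest.Create("http://localhost/your-test-page");
        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine((int)response.StatusCode);
            }
        }
        catch (WebException ex)
        {
            // A blocked request (e.g. from the URL rewrite rule) surfaces here as an error status.
            var blocked = ex.Response as HttpWebResponse;
            if (blocked != null)
                Console.WriteLine((int)blocked.StatusCode);
        }
    }
}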
I want to check if a user has cookies enabled.
Most solutions involve:
1. Creating a cookie.
2. Redirecting the user to a custom page or back to the same page.
3. Reading the cookie back.
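For concreteness, that set-redirect-read pattern might look roughly like this in ASP.NET (a hypothetical sketch):
if (Request.QueryString["cookieCheck"] == null)
{
    // First trip: set a test cookie, then bounce back with a flag.
    Response.Cookies["TestCookie"].Value = "1";
    Response.Redirect(Request.Path + "?cookieCheck=1");
}
else
{
    // Second trip: the flag tells us the cookie was already set on the first trip.
    bool cookiesEnabled = Request.Cookies["TestCookie"] != null;
    Session["CookiesEnabled"] = cookiesEnabled; // remember for the rest of the session
}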
The issue I have is with the 2nd step. Should I use a query string when doing a Response.Redirect, so that on the next trip I know the cookie has already been set and that I should try to read it? What if the user hard-codes the URL (along with that query string) in the browser while accessing the website?
Also, if I find that cookies are enabled, can I set a session variable saying that cookies are enabled on this browser, so I don't check again during that session? Is that OK?
If session is available, is that a good enough indicator that cookies are enabled?
I want to minimize these double trips to each page for checking cookies.
I would use JavaScript to make an asynchronous request and check whether the cookies that were set come back with that request.
Never pass a query string. You already hinted at it above, but what if some trickster figures out the URL and decides they want to pass their own query string?
If the user has cookies enabled, you can set a session variable and check that. Always check the session.
Instead of using this technique, which involves multiple steps and pages and extra waiting time for the end user, can't you just use the HttpBrowserCapabilities class? This particular class has a Cookies property:
HttpBrowserCapabilities.Cookies Property
Grz, Kris.
As far as I know, there are two ways to check whether the browser accepts cookies:
By using Request.Browser.Cookies (note that this reports whether the browser is capable of supporting cookies, not whether the user has actually enabled them)
By using JavaScript/jQuery
Example:
if (Request.Browser.Cookies)
{
    Response.Write("Welcome to Hello World. Cookies are accepted by the browser.");
}
else
{
    Response.Write("Goodbye to Hello World. Cookies are disabled in your browser. Enable cookies and try again.");
}
Personally, I try and write secure ASP.NET code. However, I have become quite paranoid about the code I write, as I used to work for a Registrar (high fraud targets). Are there any ASP.NET functions I should look at with extreme scrutiny (other than SQL access - I know enough not to do dynamic SQL).
This is an excellent MSDN article: Security Practices: ASP.NET 2.0 Security Practices at a Glance.
Excerpt:
How to prevent cross site scripting
Validate input and encode output.
Constrain input by validating it for type, length, format, and range. Use the HttpUtility.HtmlEncode method to encode output if it contains input from the user, such as input from form fields, query strings, and cookies, or from other sources, such as databases. Never just echo input back to the user without validating and/or encoding the data. The following example shows how to encode a form field.
Response.Write(HttpUtility.HtmlEncode(Request.Form["name"]));
If you return URL strings that contain input to the client, use the HttpUtility.UrlEncode method to encode these URL strings, as shown here.
Response.Write(HttpUtility.UrlEncode(urlString));
If you have pages that need to accept a range of HTML elements, such as through some kind of rich text input field, you must disable ASP.NET request validation for the page.
Turn On Custom Errors To Keep Errors Private
<customErrors mode="On" defaultRedirect="YourErrorPage.htm" />
Never trust user input. Never assume client-side validation will prevent bad input data. Always ensure that validateRequest="true" and enableEventValidation="true" are set in your web.config:
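A sketch of the relevant settings (both are also the defaults):
<system.web>
  <!-- Request validation and event validation stay on for every page -->
  <pages validateRequest="true" enableEventValidation="true" />
</system.web>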
See Request Validation and ASP.NET Security Tutorials.
While I realise that this is usually related to cross-site scripting attacks, what I'm wondering is how a session can remain valid across multiple subdomains belonging to a single domain (for example: a user logging in only once and being able to access both subdomain1.domain.com and subdomain2.domain.com with the same session). I guess I first need to understand how it works, but so far I haven't been able to find much of any relevance.
But then again, maybe I wasn't asking the right question.
Thanks in advance :)
In-proc sessions cannot remain valid across hosts by themselves; however, you can code your web application to allow cookies across multiple subdomains. You will need to set the cookie's domain:
Response.Cookies("CookieName").Domain = ".mydomain.com"
Remember the period.
There are quite a few ways to share session data or cookie data across domains. The simplest is to share it on the server side through a shared data store. But you would not be asking this question if it were that easy.
The other way to do this is equally simple. The domain one.com contains some session data, say name=aleem and id=123, and wishes to pass this along to two.com. It will follow these steps:
Make a call to two.com/api/?name=aleem&id=123
When two.com gets the data via query parameters, it creates a cookie with the data. This cookie will be stored under the two.com domain.
two.com will then redirect back to the REFERER which in this case happens to be one.com
This is a simplified scenario. The domain two.com needs to be able to trust one.com, and not only that, it needs to know that the request is authentic and not just crafted by the user, so you need signatures (e.g. public/private keys) to mitigate this.
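As a hedged sketch of that idea (using an HMAC with a shared secret, a common simpler stand-in for public/private keys; all names here are illustrative), one.com signs the payload and two.com verifies it before trusting the query parameters:
using System;
using System.Security.Cryptography;
using System.Text;

static class CrossDomainSignature
{
    // Both domains must hold this secret; load it from protected config in practice.
    const string SharedSecret = "replace-with-a-real-secret";

    // one.com: sign "name=aleem&id=123" before building the redirect URL.
    public static string Sign(string payload)
    {
        using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(SharedSecret)))
        {
            return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(payload)));
        }
    }

    // two.com: recompute and compare before creating the cookie.
    public static bool Verify(string payload, string signature)
    {
        return Sign(payload) == signature;
    }
}
one.com would then call two.com/api/?name=aleem&id=123&sig=..., and two.com rejects the request when Verify fails. Including a timestamp or nonce in the signed payload keeps an old signed URL from being replayed.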
By default, all cookies for a site are stored together on the client, and all cookies are sent to the server with any request to that site. In other words, every page in a site gets all of the cookies for that site. However, you can set the scope of cookies in two ways:
Limit the scope of cookies to a folder on the server, which allows you to limit cookies to an application on the site.
Set scope to a domain, which allows you to specify which subdomains in a domain can access a cookie.
You can learn more here.
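In ASP.NET terms, those two scopes map onto the HttpCookie Path and Domain properties (a small sketch; the cookie names are illustrative):
// Limit the cookie to one application folder on the site.
Response.Cookies["AppCookie"].Path = "/Application1";

// Or make the cookie visible to every subdomain of the parent domain.
Response.Cookies["SiteCookie"].Domain = ".domain.com";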
The comments about setting the cookie at the parent domain so subdomains receive it give you that side, but what's missing is the consistency of the session itself.
I think this is very much like the problem of maintaining state across servers in a farm, and the solution is probably to ensure that your session store is consistent across both sites (if they are not served from the same 'web site' in IIS). You can move the session store into SQL Server (HOW TO: Configure SQL Server to Store ASP.NET Session State), which would serve the purpose, as each site would query the same store when looking for the session data related to the cookie it has been presented with.
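The switch itself is a web.config change made on both sites (a sketch; the connection string is a placeholder, and the session database must be installed first, e.g. with the aspnet_regsql.exe tool):
<system.web>
  <!-- Both sites point at the same SQL Server session store -->
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=YourSqlServer;Integrated Security=SSPI;"
                timeout="20" />
</system.web>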
I hope that gets you on the right track.
If you have the ability to set up a common subdomain, you can do this:
In your subdomain HTML files, include a JavaScript file at the top like this:
<script src="http://common.domain.com/check.asp"></script>
In check.asp, look for your logged_in cookie and, if it is not present, show a page such as http://common.domain.com/login.asp using something like
<%
// cookie_not_found is pseudocode for "the logged_in cookie is absent";
// when it is, emit JavaScript that sends the browser to the shared login page.
if (cookie_not_found) {
%>
location.href = "http://common.domain.com/login.asp";
<%
}
%>
Once a person submits their username and password, post it back to the same login.asp and set the session cookie (which will be set for the common.domain.com domain), and then redirect to http://subdomain1.domain.com.
What will happen now is that a call is made to the embedded common.domain.com/check.asp, and cookies for common.domain.com are sent by the browser along with the request. So you will know whether your session is valid or not, even while you are on subdomain1.domain.com.
You can set a cookie for a specific domain.
In PHP, the setcookie() function takes a parameter in which you can specify the parent domain, so the cookie is valid for all subdomains. Based on your tags, I see you are working in ASP.NET; something similar probably exists for ASP...
After a little searching for ASP, try this:
Response.Cookies("CookieName").Domain = ".mydomain.com"
or read this
Here is a solution which works:
http://anantgarg.com/2010/02/18/cross-domain-cookies-in-safari/
I implemented OpenID support for an ASP.Net 2.0 web application and everything seems to be working fine on my local machine.
I am using the DotNetOpenId library. Before I redirect to the third-party website, I store the original OpenID in the session to use when the user is authenticated (standard practice, I believe).
However, I have a habit of not typing www when entering a URL into the address bar. When I was testing the login on the live server, I was getting problems where the session was cleared. My return URL was hard-coded as www.mysite.com.
Is it possible that switching from mysite.com to www.mysite.com caused the session to switch?
Another issue is that www.mysite.com is not under the realm of mysite.com.
What is the standard solution to these problems? Should the website automatically redirect to www.mysite.com? I could just make my link to the login page an absolute URL containing www. Or are these just hiding another problem?
Solving the realm problem you mentioned is easy: just set the realm to *.mysite.com instead of mysite.com. If you're using one of the ASP.NET controls included in the library, you just set a property on the control to set the realm. If you're doing it programmatically, you set the property on the IAuthenticationRequest object before calling RedirectToProvider().
As far as the session/cookie problem goes with hopping between the www and non-www host name, you have two options:
Rather than storing the original identifier in the session (which is a bad idea anyway, for a few reasons), use the IAuthenticationRequest.AddCallbackArguments(name, value) method to store the user's entered data, and then use IAuthenticationResponse.GetCallbackArgument(name) to recall the data when the user has authenticated (a rough sketch follows this list).
Forget it. There's a reason the dotnetopenid library doesn't automatically store this information for you. Directed identity is just one scenario: if the user types 'yahoo.com', you probably don't want to say 'Welcome, yahoo.com!' but rather 'Welcome, id.yahoo.com/andrewarnott!'. The only way you're going to get the right behavior consistently is to use the IAuthenticationResponse.FriendlyIdentifierForDisplay property to decide what to display to the user as his logged-in identifier. It gives more accurate information, and it's easier than storing a value in the callback and getting it back. :)
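A rough sketch of option 1, using only the members named above (exact construction and signatures may differ between DotNetOpenId versions; relyingParty and userSuppliedIdentifier are assumed to already exist):
// Outbound: stash what the user typed in the callback instead of the session.
IAuthenticationRequest request = relyingParty.CreateRequest(userSuppliedIdentifier);
request.AddCallbackArguments("entered_id", userSuppliedIdentifier);
request.RedirectToProvider();

// Inbound, on the return page: read the value back, but prefer the
// display-friendly identifier for the welcome message.
IAuthenticationResponse response = relyingParty.Response;
string enteredId = response.GetCallbackArgument("entered_id");
string welcome = "Welcome, " + response.FriendlyIdentifierForDisplay + "!";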
I don't know how OpenID works, but LiveID gives you a token based on the combination of user and domain. I would just have forwarded www to mysite.com.
The cookies and sessions and everything else get lost between www.site.com and site.com. I don't have patience enough to thoroughly read all the specs, but http://www.w3.org/Protocols/rfc2109/rfc2109 states that
A is a FQDN string and has the form NB, where N is a non-empty name string, B has the form .B', and B' is a FQDN string. (So, x.y.com domain-matches .y.com but not y.com.) Note that domain-match is not a commutative operation: a.b.c.com domain-matches .c.com, but not the reverse.
I think that means yes, you do need to forward to www. I have always added domain correction code to my sites when cookies and sessions are being used.
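The domain correction itself can be a few lines in Global.asax (a minimal sketch; mysite.com stands in for your canonical host):
// Redirect bare-domain requests to the www host so cookies and
// sessions always live under a single host name.
void Application_BeginRequest(object sender, EventArgs e)
{
    Uri url = Request.Url;
    if (url.Host == "mysite.com")
    {
        Response.Redirect("http://www.mysite.com" + url.PathAndQuery, true);
    }
}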