I have a website which I integrate with Facebook (via FBML and the JavaScript API).
I have set up the application on Facebook as per normal, specifying the "Connect URL" to be the domain of my website.
However, my application has multiple bindings in IIS for the same website.
Such as:
www.bar.com.au
foo.com.au
The domains are completely different, with no relationship in the names whatsoever, so a regex-style rule (i.e. a base domain) is not possible. The domains were made different due to a combination of localization and marketing. Keep in mind these domains are baked into an already live website; in other words, I cannot change this architecture.
Is there a way I can specify BOTH of these domains in the ONE Facebook application's settings for the "Connect URL"? Or will I have to create multiple applications?
Of course I cannot use the "Base Domain" setting, as the bindings are not on the same sub-domain.
I actually have around 7 bindings in my website, so I'd rather not have to create 7 separate Facebook applications, because this means maintaining 7 sets of API key/secret pairs in my application.
What's happening, of course, is that when I'm on foobar.com.au, the Facebook cookies are not available to that domain.
For the meantime, I will try creating multiple API keys, but I think I might run into issues. I'm going to have to go: "If the domain is this, use this ApiKey", with the same logic in every single call to the Graph API. Messy stuff.
So I guess my problem/question isn't really caused by Facebook Connect; it's the nature of HTTP cookies by design.
How can I easily access these cookies cross-domain? Will I need to set up a third website and direct all cookie logic there?
If you want *.foobar.com.au to be allowed, then set your Base Domain to foobar.com.au.
Could you set one of them up as your "Facebook Authentication" site, and direct all FB Auth-related traffic there, and then use one of a large number of cross-site communication tricks to send the token over to the original site?
In other words, regardless of the site they come in from, you'll use foobar.com.au (for example) as the redirect URI. Then, when they come in to that site, having noted which domain they originally came from, you'll redirect them back where they came from, passing along the access token in some cross-domain fashion (querystring, POST vars, etc.).
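A minimal sketch of that handoff, assuming an ASP.NET HTTP handler on the central auth site; the handler name, the returnUrl/access_token parameters, and the fb_token querystring key are all hypothetical:
using System.Web;

// On the central "Facebook Authentication" site (foobar.com.au in this example).
// Facebook sends the user back here after login; we relay them to the site they
// originally came from, carrying the access token on the querystring.
public class AuthRelayHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // returnUrl was noted (e.g. in session or a signed parameter) before the FB redirect.
        string returnUrl = context.Request["returnUrl"];
        string accessToken = context.Request["access_token"]; // produced by the FB auth flow

        // In production you would sign or encrypt the token rather than pass it bare,
        // and validate returnUrl against a whitelist of your own bindings.
        context.Response.Redirect(returnUrl + "?fb_token=" + HttpUtility.UrlEncode(accessToken));
    }

    public bool IsReusable
    {
        get { return true; }
    }
}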
In my current situation, timeframes have stopped me from implementing the proper solution, which is what @Yuliy has highlighted.
For now, I have created multiple Facebook applications. But to keep it DRY, I have abstracted all that away behind exposed properties:
private static string _ApiToken_Site1, _ApiToken_Site2;
public static string ApiToken
{
    get
    {
        // Site1/Site2 inspect the current request's host (see the sketch below).
        if (Site1) return _ApiToken_Site1;
        if (Site2) return _ApiToken_Site2;
        // Every code path must return (or throw) for this to compile.
        throw new InvalidOperationException("Request host does not match a known binding.");
    }
}
Not exactly clean, but the main thing is I did not have to touch my existing code at all; the smarts to work out which API key to use live in that property.
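For completeness, a hypothetical version of those host checks; comparing the request host against the question's example domains with EndsWith is an assumption about how the bindings would be told apart:
using System;
using System.Web;

// Hypothetical host checks backing the property above. EndsWith also matches
// the www.bar.com.au binding, not just bar.com.au.
private static bool Site1
{
    get { return HttpContext.Current.Request.Url.Host.EndsWith("bar.com.au", StringComparison.OrdinalIgnoreCase); }
}

private static bool Site2
{
    get { return HttpContext.Current.Request.Url.Host.EndsWith("foo.com.au", StringComparison.OrdinalIgnoreCase); }
}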
For our next project release, I'll be scrapping this and most likely implementing a WCF/ASMX web service which handles authentication from the one place (i.e. a separate web service on a separate domain).
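Something along these lines, perhaps; the contract name and operation are purely hypothetical at this stage:
using System.ServiceModel;

// Hypothetical contract for the centralised auth service on its own domain.
[ServiceContract]
public interface IFacebookAuthService
{
    // Returns the API key registered for the given host binding.
    [OperationContract]
    string GetApiKey(string host);
}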
Related
I created two different ASP.NET MVC applications using the default template and launched the two simultaneously. When I log in to site A and refresh site B, site B tries to use the login details of site A. How do I stop this?
I suspect you are having an issue with the AntiForgery tokens. Something I add by default to my new MVC projects is this (add to Global.asax):
AntiForgeryConfig.UniqueClaimTypeIdentifier = ClaimTypes.NameIdentifier;
I think this SO answer provides a fairly complete overview:
Authentication is persisted via cookies, and cookies are domain-bound. All cookies tied to a domain will be sent with any request to that domain, regardless of how many actual websites there are in the mix. Although you haven't specified, you're most likely in development and loading the sites under different localhost ports. It's important to know that a different port is not enough to prevent cookies from being shared: when developing locally, the domain is always localhost, and cookies will be shared between all sites running on it.
You have a couple of options. The simplest is to customize the auth cookie name for each site. If you're using ASP.NET Identity, just add the following property to your cookie auth config:
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    ...
    // A distinct name per site keeps the two auth cookies from colliding.
    CookieName = "foo",
});
If the two sites use different auth cookies (based on the name), it won't matter if they both receive both; they'll only look at the one that belongs to them.
The second option is to use something other than localhost. For example, you can make use of something like localtest.me. It's a domain that has been helpfully set up to resolve all of its subdomains to localhost. That way you can test your sites via something like site1.localtest.me:12345 and site2.localtest.me:54321 (the ports stay the same as they were with plain localhost). Since these are now different domains, the cookies will no longer be shared. However, doing this requires making changes to IIS Express's ApplicationHost.config file, and you could potentially mess something up if you're not careful. It will also be confined to your specific machine, so any other developers would need to make the same change on their machines. Changing the cookie name, by contrast, is universally applied.
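For reference, the kind of ApplicationHost.config change involved is a host-specific binding per site; the port and site name below are the hypothetical ones from above:
<!-- Inside the matching <site> element of IIS Express's ApplicationHost.config -->
<bindings>
  <binding protocol="http" bindingInformation="*:12345:site1.localtest.me" />
</bindings>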
I have written a custom HTTP module which I successfully deployed to SharePoint. The purpose behind this module is to track whether users of the SharePoint site have accepted a EULA (represented as a cookie in the request context) and, if not, redirect them to another website (running independently) to accept it.
The problem I am facing at the moment is that while users browsing the site with a web browser are fine, things like the SharePoint farm's search fail when trying to index the site. My question is basically: how/what should I be filtering to ensure my module only executes its logic for requests coming from a web browser, and how do I detect SharePoint's crawlers, such as the search service? I realize I could hard-code a check for the username the service runs under, or check file path extensions and filter on that, but that seems like horrible design. Please advise if you know of a better way to do this.
Try filtering based on the User-Agent string, in Request.UserAgent -- just don't rely on User-Agent for security purposes, since it can be faked.
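A rough sketch of what that could look like inside the module. The crawler substrings are assumptions (SharePoint's search crawler typically sends a User-Agent containing "MS Search", but verify against your farm's IIS logs), and the cookie name and redirect target are hypothetical:
using System;
using System.Web;

public class EulaHttpModule : IHttpModule
{
    // Substrings assumed to identify crawlers; confirm against your farm's logs.
    private static readonly string[] CrawlerAgents = { "MS Search", "Robot", "crawler" };

    public void Init(HttpApplication application)
    {
        application.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        string userAgent = app.Request.UserAgent ?? string.Empty;

        // Let crawlers pass straight through without the EULA check.
        foreach (string agent in CrawlerAgents)
        {
            if (userAgent.IndexOf(agent, StringComparison.OrdinalIgnoreCase) >= 0)
                return;
        }

        if (app.Request.Cookies["EulaAccepted"] == null)
        {
            // Hypothetical external EULA site, as described in the question.
            app.Response.Redirect("https://eula.example.com/");
        }
    }

    public void Dispose() { }
}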
I have a SharePoint web part with links to different websites that require a login. I think I need to log the users in before redirecting them to deep pages in those sites, so I need to set a cookie for each website when the web part is loaded (using the user's Active Directory credentials).
How can I achieve this without opening a new browser window? (Though I have used a client-side script, it pops up a new browser window.)
Any help is highly appreciated...
Thanks
If by "different web sites" you mean sites with completely different URLs, then it's probably not possible without an SSO system.
The reason is that it's impossible to read/write cookies for another domain in a web environment, i.e. to pre-login the users like you are saying.
If all the sites are inside the same domain, like mycompany.com for example, with the different sites at abc.mycompany.com or mycompany.com/subsite, then yes, you can set the cookie. See the top section here: http://www.15seconds.com/issue/971108.htm
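A minimal sketch of setting such a cookie from server-side code, assuming ASP.NET; the cookie name and value are hypothetical:
using System.Web;

// A cookie scoped to the parent domain is sent to all of its subdomains.
string ticketValue = "opaque-ticket-issued-by-your-sso"; // hypothetical
var ssoCookie = new HttpCookie("SsoTicket", ticketValue)
{
    Domain = ".mycompany.com", // the leading dot covers abc.mycompany.com, etc.
    Secure = true,
    HttpOnly = true
};
HttpContext.Current.Response.Cookies.Add(ssoCookie);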
A simple way to implement SSO is the method described later in the same article, in the "Requesting Cookie from Another Domain" section. It is not a very secure method, but it can be done if you restrict it properly to specific slave domains. And obviously all the slave sites have to be modified, as with any SSO implementation.
We have a scenario whereby we are hosting an ASP.NET MVC web site on behalf of someone else.
The customer in this case wants us to restrict access to the web site to those users who have logged in to their main portal. They should only be able to get to our web site via a link from that portal.
At this point I'm not yet sure what technology or authentication mechanism the 3rd party is using, but I just wanted to clarify what the possible options might be.
If we call our hosted site B and their portal web site A, as I see it we could:
Check the referrer for all requests to B, unless they've come from A they can't get in
Check for a specific cookie (assuming A uses cookies)
I'm sure there are other options, anyone any ideas?
Check the referrer for all requests to B, unless they've come from A they can't get in
Can be faked, but most normal users won't do it.
Check for a specific cookie (assuming A uses cookies)
Ask them to embed in their portal some code from your site. That way, visiting their portal will result in you setting a cookie for your domain. Then you can easily read it later.
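One way to do that, sketched below: host a tiny beacon endpoint on your site (B) and have the portal (A) embed it, for example as a hidden image. The handler and cookie names are hypothetical, and browsers that block third-party cookies would defeat this:
using System;
using System.Web;

// Hypothetical beacon on site B; the portal embeds it as
// <img src="https://site-b.example.com/beacon.ashx" style="display:none" />
public class BeaconHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.Cookies.Add(new HttpCookie("CameFromPortal", "1")
        {
            HttpOnly = true,
            Expires = DateTime.UtcNow.AddHours(1)
        });

        // Serve a 1x1 transparent GIF so the <img> tag has something to render.
        context.Response.ContentType = "image/gif";
        context.Response.BinaryWrite(Convert.FromBase64String(
            "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"));
    }

    public bool IsReusable
    {
        get { return true; }
    }
}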
One more thing to mention: if you're talking about public sites, it will suffice for a search engine to somehow discover these hidden URLs once, after which the game is over. It will index the pages and keep a cache of them. You may want to consider including some noindex/nocache meta tags in these pages.
But seriously, if you wish to have it done properly and securely, you're going to need some form of shared user authentication that the portal and your site both support.
The solutions you have posted are not secure.
In case this is an enterprise application with real requirements for security, you may want to look at some single sign-on solutions.
List of single sign-on implementations
I'd like to be able to use these "best of breed" open-source solutions, with the only requirement being some sort of single sign-on between the different sites. I don't want my users having to log in in 3 different places, so I thought it could be possible with OpenID.
Has anyone tried something similar?
OpenID will not avoid the problem of having to sign in 3 separate times. It will allow the user to share the same login credentials between the sites, but they will still have to actually log in to each of the three systems. If that is not a problem, go with OpenID. If it is, you have two options:
Use an LDAP server to authenticate on all three sites. I think all three software packages have modules/plugins for LDAP (Drupal, Moodle, MediaWiki). Once you have the LDAP server running, the rest should be easy.
Write custom modules/plugins for each platform that authenticate against a single database. Maybe you could use the Drupal database as the primary one, and have MediaWiki and Moodle authenticate with that. So, effectively, the user will only have an account on the Drupal site, but will get access to all three. This is basically the same idea as an LDAP server, but might save you some overhead and complication.
There is also the Moodle Integration module for Drupal that attempts to accomplish the same thing, only without MediaWiki in the mix. I would check that out.
Good luck!
Here are three possible solutions: (1) a single sign-in site, (2) injecting login/register forms into all sites using server-side includes (SSI), and (3) AJAX.
Single sign-in site.
Suppose you have site1.domain.com and site2.domain.com and you want to log in/register at both simultaneously. Probably the easiest way to do it will be to create another domain, e.g. login.domain.com, that will do the job. Your login/register application will need access to the databases for site1 and site2 and/or their APIs. Since login status usually resides in cookies, your login application will need to set those login cookies for both sites simultaneously (on successful login/registration) and delete them on logout.
To set cookies for all sites from login.domain.com, all of them must sit on .domain.com and the cookie's domain parameter must be .domain.com.
If your solution needs both API access (to the other applications) and access to the same database by several applications, you may need to deal with database transactions. This is because new registrations won't be visible to the other sites until the transaction is committed; so, for example, you cannot call an API from within the login code to retrieve cookies before committing the transaction containing the new registration.
One important detail: if you already have users separately registered at site1 and/or site2 but not on both, your sign-on site will either have to handle those cases or you'll need to sync registrations manually upon deployment of your new registration system. A manual fix won't be possible when extra user input is required to complete the cross-site registration. This point also becomes important when you add new sites that require new user input for registration.
Finally, carefully choose the domain name handling OpenID. To the best of my knowledge it is impossible to transfer OpenID endorsements across subdomains without the user's consent (please correct me if I am wrong). You don't want to ask users to re-register just because you decided to rename the subdomain.
Server-side include (SSI) method.
Another solution is to inject those forms into all sites via server-side includes. This may be considerably harder, will depend on the type of web server in use, and will work more slowly.
A prerequisite here is that all your applications run on the same subdomain, so that OpenID works for all of them.
I once built common user registration for MW (PHP) and cnprog (Python/Django).
My solution was to display the exact same registration form on the wiki and the forum site, while generating and processing this form with Django. I did it this way because the wiki and forum "skins" are so different that I did not want to surprise visitors with a dramatic change of site appearance when they went to the registration page. This is complicated and I will not do it again :) and would instead go with the single sign-in method.
In order to display Django output through MediaWiki, I created a wiki extension that prints an Apache "include virtual" call to glue the Django-generated content into the wiki output. This comes with problems.
Apache's include virtual on my installation cannot POST to subrequests, cannot pass cookies from subrequests, and cannot pass redirect responses (all HTTP headers are thrown out) to the upstream user request.
So I added "was_posted=true" to mark the posts for Django, plus a secret code to prevent cross-site forgery. To get the cookies out, I had them printed with cookie_morsel.output_js() in Python, so JavaScript must run on the client for this to work. Any redirects have to be done with JavaScript too. Extra work is still needed to upload files (like an avatar picture).
So single sign-on may be the best solution.
AJAX method.
AJAX may be a neat way around: just build the forms on all of your sites with JavaScript and submit them via AJAX. This will work fast and will not break the appearance of your various sites, but it won't please the folks allergic to JavaScript.
Actually, the only method that does not require any JavaScript is the single sign-in site.
Posted this because I've spent enough time building this thing for MW and Django; an hour of typing did not make a difference :).