We have two websites with different domain names. One of them is a WordPress site. Both websites have their own authentication system.
For the sake of convenience, it was decided to have a single authentication for both websites, making use of session cookies. I searched around and learned about Single Sign-On. Can anybody tell me how to implement SSO when one of the websites is a WordPress site (if that makes any difference)? Any help would be highly appreciated.
Pretty simple, actually. From the non-WordPress site, you just need to make the login form connect to the WordPress database and check the user-provided credentials. Older WordPress versions stored plain MD5 password hashes, in which case you would hash the user's password with md5() before comparing it against the database; newer versions (2.5 and up) use salted phpass hashes, so you need a phpass-compatible check instead. Handle the return results as normal.
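A minimal sketch of that check, written in Python purely for illustration (the original answer assumes whatever language the non-WordPress site happens to use); the connection details and the default wp_ table prefix are assumptions, and the phpass verification relies on the passlib package:

```python
# Sketch: verify a login against the WordPress users table.
# Assumptions: default "wp_" table prefix, placeholder connection details,
# and the passlib package for the phpass ("$P$...") hashes WordPress 2.5+ uses.
import hashlib
import pymysql
from passlib.hash import phpass

def check_wordpress_login(username, password):
    conn = pymysql.connect(host="localhost", user="wpuser",
                           password="secret", database="wordpress")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT user_pass FROM wp_users WHERE user_login = %s",
                        (username,))
            row = cur.fetchone()
    finally:
        conn.close()

    if row is None:
        return False
    stored = row[0]
    if stored.startswith("$P$") or stored.startswith("$H$"):
        return phpass.verify(password, stored)  # modern salted phpass hash
    # Legacy installs stored a plain MD5 hex digest.
    return hashlib.md5(password.encode()).hexdigest() == stored
```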
One note: make sure your database user has permission to connect from the connecting host. If both sites are hosted on the same server (and IP), it won't be an issue. If not, you need to make sure the database user is granted access either from the second site's IP or from everywhere ('user'@'%').
I often have a need to secure a single page (e.g. Reports) on a public-facing app so that only authorized users may access it. In the past, this meant setting up a custom login form, using the ASP.NET membership provider, or something else far too complex to serve the purpose. Is there an easier (safe) way to secure a single page in this fashion?
Some things I've considered:
Client certificates (initial setup is a pain)
A single master password (works in a pinch, but feels dirty)
Restrict access by host address (cumbersome when the need arises to allow external users access to the page; also, supporting access via proxy means trusting X-Forwarded-For, which can be faked by technical users)
Are there other options? Recommendations?
You can do it in your web.config file, something like what is suggested here. As far as authentication is concerned, the easiest way is to use Windows authentication.
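As a rough sketch of the web.config approach, with Windows authentication enabled as suggested above; the page name and role below are placeholders:

```xml
<!-- Sketch: lock down a single page via web.config.
     "Reports.aspx" and the "ReportViewers" role are placeholder names. -->
<configuration>
  <system.web>
    <authentication mode="Windows" />
  </system.web>
  <location path="Reports.aspx">
    <system.web>
      <authorization>
        <allow roles="ReportViewers" />
        <deny users="*" />
      </authorization>
    </system.web>
  </location>
</configuration>
```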
A login system is your best option. If you don't want to go through the trouble of setting up and managing a login system yourself, consider using OpenAuth.
You can achieve this functionality pretty easily using DotNetOpenAuth. Users can then log in with their Google, Yahoo, StackOverflow, etc. accounts, and you get a token that you can store and use to limit access.
I'm working on an asp.net website that needs to store user passwords for another website so that I can retrieve data periodically without requiring the user to keep logging in. I can't imagine how one-way hashes would work in this case since the user isn't going to be re-keying the password every time. I'm assuming I need to encrypt the passwords to store in a SQL Server database and decrypt them when needed. But that's where things get tough for me. The basic infrastructure is a C#/Asp.Net MVC3 website running on load-balanced Azure compute instances and storing data in SQL Azure. I'm not a crypto guy, and I don't want to make a rookie mistake. There seems to be a lot of information out there, but nothing seems clear to me. Even though the data I'm connecting to isn't sensitive, I want to treat my users' information with the same care I want my personal data treated. Any advice on how to proceed would be appreciated.
EDIT:
I certainly understand that storing passwords is not a best practice, but in some cases it is simply unavoidable. I have come across this project, but have not tried it yet: http://securentity.codeplex.com/. It uses a digital certificate on the web server.
Users of the 3rd party site are able to set their data as "public", in which case I wouldn't need to store their password. So I may give users the option of doing that instead.
You should have a look at OAuth and Single Sign-On.
In simple terms: Only an authentication token is sent between the different systems.
You (the site) should never have knowledge of the users' passwords. At most you should know the hashes of the passwords for your own site.
Read the two topics above and you will know how to properly secure your site and the "neighbour" site.
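For your own site, a minimal sketch of keeping only salted hashes, using Python's standard library purely for illustration; the iteration count and salt size are illustrative choices, not recommendations:

```python
# Sketch: store only a salted, iterated hash of the password for your own site.
# The algorithm, iteration count and salt size here are illustrative only.
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest  # store both; the plaintext password is never kept

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)
```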
Edit
In Short:
The other site (site B) implements the OAuth server side. Your site (site A) implements the OAuth client side. When requesting user information from site B, you redirect the user to site B's authorization page, where the user allows site A to read their information from site B. Site B then issues a token that site A can use to access that information. The token can be time-limited (or not).
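A rough sketch of the client (site A) side of that flow, written in Python with the requests-oauthlib package purely for illustration; every URL, client ID and secret below is a placeholder for whatever site B actually exposes:

```python
# Sketch of the OAuth2 client (site A) flow; every URL, client id/secret and
# endpoint here is a placeholder - site B defines the real ones.
from requests_oauthlib import OAuth2Session

CLIENT_ID = "site-a-client-id"
CLIENT_SECRET = "site-a-client-secret"
AUTHORIZE_URL = "https://site-b.example.com/oauth/authorize"
TOKEN_URL = "https://site-b.example.com/oauth/token"

# Step 1: send the user to site B to approve access for site A.
oauth = OAuth2Session(CLIENT_ID,
                      redirect_uri="https://site-a.example.com/callback")
authorization_url, state = oauth.authorization_url(AUTHORIZE_URL)
print("Redirect the user to:", authorization_url)

# Step 2: after site B redirects back, exchange the callback URL for a token.
callback_url = input("Paste the full callback URL here: ")
token = oauth.fetch_token(TOKEN_URL, client_secret=CLIENT_SECRET,
                          authorization_response=callback_url)

# Step 3: use the (possibly time-limited) token to read the user's data.
resp = oauth.get("https://site-b.example.com/api/me")
print(resp.json())
```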
Big picture: I have been asked to create a search engine for our company's intranet. Such a search engine will crawl pages supplied to it by XML files for each independent application on the intranet. Problem is, the entire intranet is using Forms Authentication, and so the crawler will have to have access to each application without actually having user credentials (e.g. username and password).
Each application within the intranet has its access controlled by a permission manager, which is essentially a wrapper on the default Role Manager ASP.NET comes with. Each application can define its own roles and assign people who have those roles.
Please note that there are potentially hundreds of applications.
The crawler has access to the permission manager's database, so it knows what all the roles are. Therefore my idea was to have the crawler create a cookie that identifies it as having all roles for each application.
The problem I'm running into is this: how do I create a Forms Authentication cookie which already has the roles assigned in it, without creating a corresponding user (IPrincipal)?
It is entirely possible that I've failed to completely understand how Forms Authentication works, and if so, please tell me what I can do differently.
This is probably not what you want to hear, but...
I would just have the crawler authenticate like anyone else.
Given that this is a crawler you control, why fight Forms Authentication? It seems logical to create a user with all required roles in each application. Hopefully you have a central administration point for the hundreds of apps, else I would not want to be an administrator there ;-)
If you do anything that allows "just the crawler" special access (bypass user-based authentication based on... what? The crawler's user agent? A specific origin IP?), you create a security hole that a hacker can leverage to gain access to all of the intranet applications that have otherwise been diligently secured with user IDs, passwords and roles (in fact, the security hole is particularly wide because you propose granting access to EVERY role in the system).
It sounds like what you want is an appropriately encrypted System.Web.Security.FormsAuthenticationTicket (which then gets attached to HTTP requests as a cookie).
The encryption logic is located in System.Web.Security.FormsAuthentication.Encrypt(), which I think uses the MachineKey as the encryption key. Also have a look at the GetAuthCookie() logic (using Reflector).
You might have to write your own version of the encryption method, but what you want to do should be possible, provided you have a copy of the remote site's encryption keys. You don't need the user's passwords -- only the user name is encoded into the Ticket.
It seems to me that the problem is not yet well defined, (at least to me!).
Why do you need to crawl the pages and index them if there are fine-grained permissions on them? How do you show search results without violating the permissions? Why not index the back end, bypassing the pages altogether (I mean index the database records, not the pages)?
I'd like to be able to use these "best of breed" open-source solutions, with the only requirement being some sort of single sign-on between the different sites. I don't want my users having to log in in 3 different places, so I thought it could be possible with OpenID.
Has anyone tried something similar?
OpenID will not avoid the problem of having to sign in 3 separate times. It will allow the user to share the same login credentials between the sites, but they will still have to actually log in to each of the three systems. If that is not a problem, go with OpenID. If it is, you have two options:
Use an LDAP server to authenticate on all three sites. I think all three software packages have modules/plugins for LDAP (Drupal, Moodle, MediaWiki). Once you have the LDAP server running, the rest should be easy.
Write custom modules/plugins for each platform that authenticate against a single database. Maybe you could use the Drupal database as the primary one, and have MediaWiki and Moodle authenticate with that. So, effectively, the user will only have an account on the Drupal site, but will get access to all three. This is basically the same idea as an LDAP server, but might save you some overhead and complication.
There is also the Moodle Integration module for Drupal that attempts to accomplish the same thing, only without MediaWiki in the mix. I would check that out.
Good luck!
Here are three possible solutions: (1) a single sign-in site, (2) injecting login/register forms into all sites using server-side includes (SSI), and (3) AJAX.
Single sign-in site.
Suppose you have site1.domain.com and site2.domain.com and you want to log in/register at both simultaneously. Probably the easiest way to do it will be to create another domain, e.g. login.domain.com, that will do the job. Your login/register application will need access to the databases for site1 and site2 and/or their APIs. Since login status usually resides in cookies, your login application will need to set those login cookies for both sites simultaneously (on successful login/registration) and delete them on logout.
To set cookies for all sites from login.domain.com, all of them must sit on .domain.com and the cookie's domain parameter must be .domain.com.
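A minimal sketch of that cookie-setting step, using Flask purely for illustration; the cookie name and the way the token is generated are placeholders:

```python
# Sketch: the login site issues one cookie scoped to .domain.com so that
# site1.domain.com and site2.domain.com both receive it. Flask is used purely
# for illustration; the cookie name and token generation are placeholders.
import secrets
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/login", methods=["POST"])
def login():
    # ...verify credentials against the site1/site2 databases here...
    session_token = secrets.token_urlsafe(32)
    resp = make_response("Logged in")
    resp.set_cookie("sso_session", session_token,
                    domain=".domain.com",   # visible to all *.domain.com sites
                    secure=True, httponly=True)
    return resp
```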
If your solution needs both API access (to the other applications) and access to the same database by several applications, you may need to deal with database transactions. This is because new registrations won't be visible on other sites until the transaction is committed - so, for example, you cannot call an API from within the login code to retrieve cookies before committing the transaction containing the new registration.
One important detail: if you already have users registered separately at site1 and/or site2, but not on both, your sign-on site will either have to handle those cases or you'll need to sync the registrations manually yourself upon deployment of your new registration system. A manual fix won't be possible when extra user input is required to complete the cross-site registration. This point also becomes important when you add new sites that require additional user input for registration.
Finally, carefully choose the domain name handling OpenID. To the best of my knowledge it is impossible to transfer OpenID endorsements across subdomains without the users' consent - please correct me if I am wrong. You don't want to ask users to re-register just because you decided to rename the sub-domain.
Server-side include (SSI) method.
Another solution is to inject those forms via server-side includes into all sites. This may be considerably harder, will depend on the type of webserver in use, and will work more slowly.
A prerequisite here is that all your applications run on the same subdomain, so that OpenID works for all of them.
I once built a common user registration for MW (PHP) and cnprog (Python/Django).
My solution was to display the exact same registration form on the wiki and the forum site, while generating and processing this form with Django. I did it this way because the wiki and forum "skins" are so different that I did not want to surprise visitors with a dramatic change of site appearance when they go to the registration page. This is complicated and I will not do it again :) and instead would go with the single sign-in method.
In order to display Django output through MediaWiki, I created a wiki extension printing an Apache "include virtual" call to glue the Django-generated content with the wiki output. This comes with problems.
Apache include virtual on my installation cannot POST to subrequests, cannot pass cookies from subrequests, and cannot pass redirect responses (all HTTP headers will be thrown out) to the upstream user requests.
So I added "was_posted=true" to mark the posts for Django, and a secret code to prevent cross-site forgery. To get the cookies out, I had them printed with cookie_morsel.output_js() in Python, so JavaScript must run on the client for this to work. Any redirects will have to be done with JavaScript too. Extra work will still be needed to upload files (like an avatar picture).
So single sign-on may be the best solution.
AJAX may be a neat way around this - just build forms in all of your sites with JavaScript and submit them via AJAX. It will work fast and will not break the appearance of your various sites,
but it won't please the folks allergic to JavaScript.
Actually, the only method that does not require any JavaScript is the single sign-in site.
Posted this because I've spent enough time building this thing for MW and django - an hour of typing did not make a difference :).
Here is the situation. I have a site that only allows one user to be logged in at a time. However, I need a server to scrape this site and put data into the database. The admin also needs to be able to log into this site from time to time.
So what I would like is for the server to proxy the admin's login so that the server won't attempt to log in while the admin is logged in.
How would I go about doing this?
Thanks
EDIT: Sorry, it totally slipped my mind: the reason I need to come up with such a complicated setup is that I do not have the source for this site, nor does the site allow any sort of extensibility. Basically, I plan to add features by proxying the site through a more feature-filled version of the page that will allow the user to access features not available through the site's normal interface.
If this is a web application you've built yourself, you should be able to track this pretty easily. You just need code that stores the login state of your users in a global way. Possible ways of doing this would be to use the database (if the app is database-driven) or to store it in an Application variable (if this is done in ASP.NET).
Then whatever process you've put in place where the server is "scraping" the site can check to see if anyone is logged in before logging in itself.
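A rough sketch of that check from the scraping process's side, assuming the login state is stored in a shared database table; the table and column names here are hypothetical:

```python
# Sketch: the scraper consults a shared login-state table before logging in.
# The "active_logins" table and its columns are hypothetical - the web app
# would insert/delete rows there when the admin logs in and out.
import sqlite3

def admin_is_logged_in(db_path="shared_state.db"):
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT COUNT(*) FROM active_logins WHERE role = 'admin'"
        ).fetchone()
        return row[0] > 0
    finally:
        conn.close()

def run_scraper():
    if admin_is_logged_in():
        print("Admin is logged in - skipping this scrape run.")
        return
    # ...log in and scrape the site here...
    print("Scraping...")

if __name__ == "__main__":
    run_scraper()
```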
However, I must ask -- is this something you've built yourself or are you trying to add functionality to an existing product? The reason I ask is that I can't figure out why you tagged this with "proxy" and not the language it was written in. If you don't have access to the source code, for example, that would change things.