I've set up forms authentication in my Google Search Appliance. Is there a way to have the title and a summary come back for protected pages? Currently, since they are all redirected to the login page, all search results are titled "Login." I'm using ASP.NET with the .NET Framework 3.5.
You need to either:
1. Configure the Search Appliance to authenticate against your server, or
2. Allow the search engine through to your protected pages.
On some of our client sites we've gone with option 2, partly because the dynamic nature of the protection (e.g. articles published in the last 30 days are open, but you need a subscription to see the archive) didn't lend itself to using web.config settings.
We have a "Base Page" class that inherits from System.Web.UI.Page, and that all our pages inherit from.
In that class we check a number of things, including the IP address and user agent of the calling client. If these match our search engine, we display a custom page layout that removes things like the navigation, header, and footer (using a master page) and adds some additional metadata that we use for filtering - this way the search engine sees and indexes the entire content.
If these checks fail, we then check whether the user is authenticated and has a valid subscription.
If they don't have a valid subscription or aren't authenticated, we display a summary of the page, in place, along with a call to log in or register (using standard ASP.NET controls).
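A rough sketch of what such a base page might look like, assuming ASP.NET Web Forms; the crawler IP/user-agent values, the master page names, and the HasValidSubscription helper are illustrative placeholders, not the actual implementation:

    using System;
    using System.Web.UI;

    public class BasePage : Page
    {
        // Placeholder values - use whatever identifies your search appliance.
        private const string CrawlerIp = "10.0.0.50";
        private const string CrawlerUserAgent = "gsa-crawler";

        protected bool IsSearchCrawler { get; private set; }

        // MasterPageFile must be set before Init, hence OnPreInit.
        protected override void OnPreInit(EventArgs e)
        {
            base.OnPreInit(e);

            string ip = Request.UserHostAddress;
            string ua = Request.UserAgent ?? string.Empty;

            IsSearchCrawler = ip == CrawlerIp &&
                              ua.IndexOf(CrawlerUserAgent, StringComparison.OrdinalIgnoreCase) >= 0;

            if (IsSearchCrawler)
            {
                // Strip navigation/header/footer and expose the full content
                // plus the extra metadata used for filtering.
                MasterPageFile = "~/Crawler.master";
            }
            else if (!User.Identity.IsAuthenticated || !HasValidSubscription())
            {
                // Show only a summary of the page plus a log in / register prompt.
                MasterPageFile = "~/Summary.master";
            }
        }

        private bool HasValidSubscription()
        {
            // Look the current user up in your subscription store.
            return false; // placeholder
        }
    }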
If the actual titles of your pages are something other than "Login", then you probably haven't set up the crawl correctly: the title shown in results is whatever the GSA indexed during the crawl, so it is seeing your login page rather than the content. I previously posted some tips on completing the SSO wizard here: http://www.mcplusa.com/blog/2009/02/completing-the-sso-wizard-on-the-google-search-appliance/
Related
I have a website that is mostly for anonymous users to access public information on listings pages. A small subset of our user base will have password-protected accounts that let them customize the filtering and sorting of information on these pages. The idea is that once a user logs in, the site can remember their viewing preferences, so when they go to a particular listing it shows up the way they want it to.
Currently we are using Next.js's Incremental Static Regeneration to serve pre-rendered pages. This is working great for anonymous users.
But I worry that if we add authentication and custom sorting, we would run into one of two issues:
If we keep getStaticProps, logged-in users would get a flash of unstyled content as the hydrated page detects that the user is logged in and re-sorts the page content. Or they would get a loading state before actually seeing the content.
If we switch to SSR, we'd get slow authentication checks on the backend for everyone on every page load including for anonymous users, who are the vast majority of our users.
Is there a better way to deal with this? I wonder if, for example, there is a tweak I can add to server.js or something that switches from static to SSR if it sees a session cookie in the request headers.
Here is the scenario...
I have a site:
http://internet.com
and I set a token (a cookie, or something like that) from http://internet.com when a user has SUCCESSFULLY logged in.
I also have http://web.internet.com.
On http://web.internet.com I want to display data to users that have that token/cookie/etc available to them.
Here is the use case:
user logs into http://internet.com (ASP.NET application hosted on a different server - this is our primary product, which requires a subscription / username & password)
user then has access to a section that is hidden from public view on http://web.internet.com (WordPress site hosted on GoDaddy - this site contains a knowledge base that we do not want to make public unless they have done [XXXXX])
both sites are hosted independently of each other and do not share a common username and password
======
Another scenario would be to set up WordPress to expose a specific section as a JSONP response, but only if the user is logged in at http://internet.com, so that the user has access to the JSONP response located at http://web.internet.com.
Any ideas from you beautiful people?
It really depends on the level of security you require. You can log a user in to a WordPress site without a password by using wp_set_auth_cookie(). If you simply validate that a user is logged into the ASP.NET site and then use JSONP to load a page that sets the auth cookie, it will work, but you definitely have some security gaps.
A better solution would be to set a domain-level cookie for .internet.com with a token that can be read by any server in your domain. The WordPress site could then check is_user_logged_in() and, if the user isn't, take that cookie value, make a back-end call to the ASP.NET site to verify its authenticity, and then call wp_set_auth_cookie(). A simple web service would likely be the best option. You would still need some mapping between usernames on the ASP.NET and WordPress sites, however, to know which user_ID to pass.
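As a rough sketch, the ASP.NET side of that handshake might look something like the following; the cookie name, the TokenStore class, and the VerifyTokenHandler endpoint are illustrative placeholders, not an existing API:

    using System;
    using System.Web;

    public static class SsoCookieHelper
    {
        // On successful login at internet.com, issue a domain-level cookie
        // that any *.internet.com host (including web.internet.com) can read.
        public static void IssueSsoCookie(HttpResponse response, string userName)
        {
            string token = Guid.NewGuid().ToString("N");
            TokenStore.Save(token, userName, DateTime.UtcNow.AddMinutes(30)); // your own persistence

            var cookie = new HttpCookie("sso_token", token)
            {
                Domain = ".internet.com",
                HttpOnly = true,
                Secure = true,
                Expires = DateTime.UtcNow.AddMinutes(30)
            };
            response.Cookies.Add(cookie);
        }
    }

    // A simple handler the WordPress site can call server-to-server to check
    // whether a token is genuine and which user it belongs to. In production,
    // restrict this endpoint (HTTPS, a shared secret, or an IP allow-list).
    public class VerifyTokenHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string token = context.Request["token"];
            string userName = TokenStore.Lookup(token); // null if unknown or expired

            context.Response.ContentType = "text/plain";
            context.Response.Write(userName ?? "INVALID");
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }

On the WordPress side, the token from the cookie is posted to this handler; if a username comes back, the site maps it to a local user and calls wp_set_auth_cookie() for that user_ID.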
I have several site collections in the same web application, and I need to handle the event when a user goes from one site collection to another. I need this for specific actions, like setting the "lcid" cookie to change the site's default language and mapping claims values to user properties.
Currently I'm using a custom HTTP module, which handles all PostAuthorize requests for the web application, checks the current user and site collection, stores the last visited site for each user in a collection, and fires a custom event for subscribers when it detects a transition between site collections.
But I think this approach slows down the performance of the web application. From the logs I can see that there are too many PostAuthorize requests, even when the user simply clicks a link to a page in another site collection. In similar cases there is sometimes a series of requests to the "next" site collection, then to the "previous" one, and then again to the "next" one. There are also some issues with SharePoint Designer (pages can't be edited) when this module is active.
Could you give me some advice on better approaches for this task? Thanks in advance.
One way is to use a hidden control and a cookie.
Keep a hidden control in the master page of each of the targeted site collections.
This control checks the current site collection URL and saves it in a cookie - possibly the same cookie where you are storing the lcid.
From the next load onwards it compares the URL in the cookie with the current site collection URL. If they differ, it calls the code you want to execute and updates the URL in the cookie.
This will be much lighter on performance than an HTTP module.
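A minimal sketch of that control's logic, assuming a server-side (farm) solution; the control name, cookie name, and the OnSiteCollectionChanged hook are illustrative placeholders:

    using System;
    using System.Web;
    using System.Web.UI;
    using Microsoft.SharePoint;

    public class SiteCollectionTracker : Control
    {
        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);

            string currentUrl = SPContext.Current.Site.Url;
            HttpCookie cookie = Page.Request.Cookies["lastSiteCollection"];

            if (cookie == null ||
                !string.Equals(cookie.Value, currentUrl, StringComparison.OrdinalIgnoreCase))
            {
                // The user has moved to a different site collection: run the
                // per-transition logic (set "lcid", map claims values, etc.).
                OnSiteCollectionChanged(currentUrl);

                var updated = new HttpCookie("lastSiteCollection", currentUrl) { Path = "/" };
                Page.Response.Cookies.Add(updated);
            }
        }

        private void OnSiteCollectionChanged(string newSiteUrl)
        {
            // Placeholder for the actions described above.
        }
    }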
I'm building an ASP website with user login. Does anyone know the best and most secure way to make a login page and restrict access to pages? I know some ways and have used them for a few websites, but sometimes they were not that secure. There are a couple of access levels for this website: Admin, User, Sales Team, and a few more. Thanks.
You can use session variables to store the user's level and then, in your ASP code, define what the user can or cannot see.
In the database, I assume, you also have a field where the level of access is defined.
Basically, make the security level part of your SQL query and show only the data the user should be able to see.
So: keep the level of access in the database, have the login page verify the credentials, and then store the user's level in a session variable.
On any given page, while the header loads, ASP retrieves the session variable and compares it against the database.
If the user has clearance to see that data, he will; if not, display a message that he is not authorized or redirect him somewhere he is allowed to be.
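A rough sketch of that pattern, assuming ASP.NET Web Forms rather than classic ASP (the session key and level names are placeholders): the login page verifies the credentials against the database and stores the level, and every protected page inherits a base class that checks it.

    using System;
    using System.Web.UI;

    // After verifying credentials, the login page would do something like:
    //     Session["AccessLevel"] = levelFromDatabase;   // e.g. "Admin", "SalesTeam"

    public class SecurePage : Page
    {
        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);

            string level = Session["AccessLevel"] as string;

            if (string.IsNullOrEmpty(level))
            {
                Response.Redirect("~/Login.aspx");   // not logged in
                return;
            }

            if (level != "Admin" && level != "SalesTeam")   // adjust per page
            {
                Response.Write("You are not authorized to view this page.");
                Response.End();
            }
        }
    }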
Add an include file at the top of your ASP pages which is executed before any of the page's code. This way you can write your security code once, and apply it to all of your pages.
Assuming you are using IIS as your web server, you can let it handle your website security by using the different available authentication methods.
http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/9b619620-4f88-488b-8243-e6bc7caf61ad.mspx?mfr=true
http://www.authenticationtutorial.com/tutorial/
Perhaps the best authentication method for you would be Windows Integrated Authentication since it allows you to create groups (or maybe use the existing ones) to give access to certain directories or pages.
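If the site is ASP.NET, the same idea can be expressed in web.config; a minimal sketch, where the group names are placeholders for your own Windows groups:

    <configuration>
      <system.web>
        <authentication mode="Windows" />
        <authorization>
          <allow roles="DOMAIN\Admins, DOMAIN\SalesTeam" />
          <deny users="*" />
        </authorization>
      </system.web>
    </configuration>

The authentication element belongs in the root web.config; to protect just one directory, put an authorization section in that directory's own web.config or use a location element in the root.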
Let's say I have an ASP.NET web application. I create an .aspx page that shows a table containing users and email addresses. The user data is stored in a database, and when the page is requested by a logged-in user, HTML is generated to display the data. If the users requesting the page are not logged in, they are redirected to a sign-in page.
All of this is very standard.
My question is, is there any way the personal data could end up being indexed by a search engine (besides someone hacking into the site or an evil user publishing the data somewhere public)?
What if there was no requirement that users log in? Would the data then be indexed?
In general, search engines should index exactly what is visible to public visitors; Google will be angry with you if you expose something different to their spiders than to your users.
If you want to control which pages on your server get indexed, check out: http://www.robotstxt.org
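For example, a robots.txt file at the site root like the one below asks well-behaved crawlers to skip a hypothetical /members/ directory (it is advisory only and does not protect the data from direct requests):

    User-agent: *
    Disallow: /members/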
If the users don't have to log in to access the data, then I see no reason why a search engine could not get access to it. Your data will be indexed if it's not protected by a login.
If there's a login mechanism, it will not be indexed.
IMO you should remove the login requirement from the profile page and also create a sitemap to give the search engines a list of users. You should only prevent guests from viewing users' extra information.