WSO2 Identity Server - User disabling does not work until the server is restarted

I'm using WSO2 Identity Server 5.3.0 together with WSO2 API Manager 2.1.0.
I created two APIs: one that disables a user by calling the admin service, and another that enables the user again. I also created a third API that, given a username, checks whether that user is enabled or disabled.
The whole process works fine for a couple of rounds.
disable a user -> check the status (user is shown as disabled) -> enable the same user -> check the status (user is shown as enabled)
However, if the same user is disabled from a remote computer, the status is wrongly shown as enabled and the user is not actually disabled. But after I restart the Identity Server, the status is correctly shown as disabled and the user is disabled as expected from the earlier API call.
Has this issue got something to do with the cache in Identity Server?
Any suggestive approach to solve this issue is much appreciated. Thanks

From your description, it seems you have more than one Identity Server node in your deployment. If so, you have to enable clustering so that the caches are synced across nodes. Otherwise a cache update on node 1 won't be reflected on node 2 until the cache entry expires (15 minutes by default) or the server is restarted.
Enabling clustering for IS 5.3.0
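For reference, clustering is configured in <IS_HOME>/repository/conf/axis2/axis2.xml. A minimal well-known-address (WKA) sketch looks roughly like this (the hostnames, ports, and domain name below are placeholders, not values from your deployment; follow the official clustering guide for the full set of parameters):

```xml
<clustering class="org.apache.axis2.clustering.tribes.TribesClusteringAgent" enable="true">
    <parameter name="membershipScheme">wka</parameter>
    <parameter name="domain">wso2.is.domain</parameter>
    <parameter name="localMemberHost">10.0.0.1</parameter>
    <parameter name="localMemberPort">4000</parameter>
    <members>
        <member>
            <hostName>10.0.0.2</hostName>
            <port>4000</port>
        </member>
    </members>
</clustering>
```

With clustering enabled, a cache invalidation on one node is broadcast to the other members, so the account-disable change becomes visible without waiting for expiry or a restart.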

Related

HCL Domino: how to drop users logged in with LtpaToken

Before I start: this is a duplicate of another post at ServerFault (https://serverfault.com/questions/1009946/hcl-domino-how-to-drop-users-logged-in-with-ltpatoken); I first felt that would be the place to ask, but since I never got an answer I decided to try once more here.
We are running a Domino server (V 10.0.1 FP3) hosting a number of XPages applications. Apart from the admins, all users access those applications through HTTP only (i.e., no NRPC / Notes client access).
The users' person data are synced from AD to Domino via TDI. Authentication is done against the company's AD, configured through a Web SSO configuration document. However, there is no complete SSO set up, so users have to authenticate when first accessing the applications through their browsers.
For application-related reasons we set the LTPA token's expiration time to a rather high value (if necessary I am willing to discuss the reasons for that in a separate thread, but that is not really related to this question).
Here's a screenshot of the config page:
Domino's HTTP task is restarted every morning at 2:30 through a program document issuing
restart task http
Some observations from that server:
an HTTP restart apparently doesn't invalidate the tokens, i.e. once HTTP is back up, a user who kept the browser open can simply continue accessing the applications without having to re-authenticate (within the expiration time frame). (EDIT): This appears to be true even if the entire Domino server is restarted
if users simply close their browsers instead of properly logging off, the tokens aren't removed on the server side (again, as long as they don't expire). If the user then logs on again, a 2nd/3rd/4th token for this user appears server-side
there's apparently no direct way to drop a user session, neither through a simple drop console command nor through Admin client actions
Question:
are there ways to drop those user sessions from the server side and/or truly invalidate the tokens?
Basically, what I'm looking for is a way to make sure that users have to re-authenticate every morning. Since HTTP is restarted at 2:30 every morning (see above), it would be ideal to do this at or around that time.
For completeness:
for this server we use an Internet Site document which is set up like this:
Any hint is very welcome.

How to catch user closed window without logging out of application

My requirement is that a user should not be able to log in from multiple browsers or systems simultaneously. For this I maintain a flag in the database: whenever a user logs in I update the flag to "yes", and when he logs out I update it to "no".
My issue is that if the user closes the browser window without logging out of the application, I am unable to update the flag. So the next time the user tries to log in, the application says the user is already logged in.
I tried using the onbeforeunload event in the master page, but it also fires whenever I change the menu in my application. For updating the flag I used page methods, but this is not working properly.
I would send a frequent AJAX "ping" call from the page to your API to confirm the user is still online. If there is no ping or page change for 3 minutes, consider the user logged off and mark him as "logged off" in the database.
At least, I do this using JavaScript, but I'm sure you can also do it in the client-side part of the ASP.NET app you are making.
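A minimal sketch of such a heartbeat, assuming a hypothetical /api/ping endpoint (the endpoint name and interval are illustrative; the ping function is injected so it can be any transport — fetch, XMLHttpRequest, or jQuery's $.ajax):

```javascript
// Heartbeat sketch: ping once immediately, then on a fixed interval.
// sendPing is injected so the transport can be anything (fetch, $.ajax, ...).
function startHeartbeat(sendPing, intervalMs = 60000) {
  sendPing(); // tell the server we are online right away
  const timer = setInterval(sendPing, intervalMs);
  // Return a handle so an explicit logout can stop the pings.
  return { stop: () => clearInterval(timer) };
}

// In the browser this would typically be wired up as:
// const hb = startHeartbeat(() => fetch('/api/ping', { method: 'POST' }));
// document.getElementById('logout').onclick = () => hb.stop();
```

On the server, each ping refreshes a last-seen timestamp; the 3-minute timeout decision then happens server-side, since the client cannot be relied on to report its own disappearance.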
You can't handle this with client-side code alone (e.g. beforeunload), because the page/browser may be closed for many reasons (e.g. a power outage).
What you can do is:
Run a scheduler on your backend which checks whether a user has performed some action within, e.g., the last minute. In that case you have to update the user's last-action timestamp in your DB on every AJAX request (Hugo Regibo suggested ping requests).
The disadvantage of this solution is the delay: when a user closes the page, he will not be able to log in again for up to a minute.
Instead of a scheduler, you can verify logged-in users (I assume you keep them in a DB table) on each request.
Use web sockets: you will have a continuous connection and you will be notified about a closed connection immediately. The disadvantage is that web sockets don't scale as well as stateless HTTP.
Besides that, I don't know whether you use IIS with a session provider or not. And when a user closes the page and opens it again, should he be able to log in with his saved credentials? You should write more about how your project looks.
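The scheduler option above can be sketched like this, with sessions held in memory purely for illustration (in the real app this would be a DB query; all names are hypothetical):

```javascript
// Mark users as logged off when their last activity is older than timeoutMs.
// Each AJAX request from a live page refreshes that user's lastSeen value.
function sweepStaleSessions(sessions, nowMs, timeoutMs = 60000) {
  for (const s of sessions) {
    if (s.loggedIn && nowMs - s.lastSeen > timeoutMs) {
      s.loggedIn = false; // user vanished without logging out
    }
  }
  return sessions;
}

// A real backend would run this sweep on a timer, e.g.:
// setInterval(() => sweepStaleSessions(loadSessions(), Date.now()), 30000);
```

The sweep interval plus the timeout bound how long a "ghost" login can block the user, which is exactly the one-minute lockout trade-off mentioned above.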
I would do it by saving a Session object for each login call. A session ID would then be stored in the user's cookie or authentication token. Each call to the system would validate the user via their session ID. If that session has been invalidated, they just get sent to the login prompt. Whenever the user logs in, it invalidates all of their other sessions.
This way the user could be in their browser on their machine, navigate away, close the browser, and come back to find their session still alive without having to log back in. But, if they log into another machine, then their old session would be invalidated.

Asp.net windows authentication against domain - use local (cached) credentials when offline

I have an ASP.NET MVC application that uses Windows authentication against a remote Active Directory server. The computer where the app runs is connected to the AD server via VPN. The problem is that after the user logs into the PC with a domain account and logs into the application, the application needs to keep working while offline as well, but it throws this error:
The trust relationship between workstation and domain failed.
From what I understood, there is no cookie and authorization works on a per-request basis. Is there any way to authorize the username/password against the locally cached credentials? The connection often drops and the application needs to keep running.
Also, I can't turn on Anonymous Authentication, as we want to sign in users without them providing credentials.
Any suggestions appreciated.
Thank you
It was due to calling (while off the network)
User.IsInRole(role)
We have custom role management, so removing the base.IsInRole call in our custom WindowsPrincipal solved the issue.
After doing research I had thought the machine actually had to be on the network, but to keep using cached credentials it doesn't; you just must not try to fetch any user-related information.

Trouble using Azure Cache Service (Preview) with Azure Website

We currently have 2 separate ASP.NET websites hosted on one of our servers. These sites use StateServer to maintain the same session across both sites. We are looking to move them to Azure. I was able to upload both sites to Azure without any trouble, but I was not able to share the session between them. I tried using Azure Cache Service (Preview) to maintain the session, but for some reason it doesn't work: it always uses a different session when I redirect to the second site. I have already spent quite a bit of time googling around, to no avail.
To get to the bottom of the issue, I have now created 2 test sites and uploaded them to Azure. I changed the config to use Cache Service (Preview) for session state, but it still doesn't work.
The link to get to the test site is: http://sessiontestsite1.azurewebsites.net/
On the home page, please enter a value in the textbox and click "Update session variable"; this stores the text in a session variable. Then, in the top right corner of the page, there is a link called "Site2" which redirects to the second site. I am hoping the second site will be able to access the session variable set on "Site1", and vice versa. Please note that on my local machine I was able to share the session using StateServer.
I followed this link to configure cache service.
I read somewhere that Cache Service cannot be used with Azure Websites, but I think those are old posts; as per Scott Guthrie's blog (sorry, StackOverflow doesn't allow me to post more than 2 links), Cache Service can be used with Azure Websites.
Also our requirement is to use these sites as Azure Websites, we do NOT want to use WebRoles or VMs or CloudServices etc.
This is not how session works.
When you put a value in the session, the server assigns the client an ID and sends it to the client in a cookie; in ASP.NET it's ASP.NET_SessionId. The client (the browser) will only send the cookie back to the host associated with it. If you use Fiddler or the browser dev tools, you will see the browser sending the ASP.NET_SessionId cookie back to sessiontestsite1 and not to sessiontestsite2, because it has a different hostname.
You can add the cookie manually (using browser dev tools, for example) and test it again.
If you want to use the cache for user session state and have users access the site through one URL that gets routed to different Azure Website instances, the cache will work fine for you. Again, you can verify that in multiple ways. Adding the cookie manually, as above, should show you that it's working. If you want to do it in an end-to-end flow, do this:
Create an Azure Website
Scale the site to run on multiple instances
Configure your site to display the Process Id or Instance Id Environment Variable
Configure your site to use caching as you already did before
Store something in session state
Log in to Kudu (https://<yoursitename>.scm.azurewebsites.net)
Go to the Process Explorer view (https://<yoursitename>.scm.azurewebsites.net/ProcessExplorer), right-click on w3wp.exe and kill it (you can also verify the PID there)
Send a request again from the browser; it will go to a different instance, as you can tell from the Process Id, but the session value will still be there.
Because session sharing in our current setup doesn't work, we have decided to host the two sites as virtual applications.
We have now created a new website with two subfolders (set as virtual applications). Both sites now sit in these subfolders. That way, session state using the cache service works.

Does SDL Tridion CMS authenticate user’s credentials every time a user does an activity on the CME?

It’s a plain vanilla SDL Tridion install where users are added in the CMS CME.
Users log in to the SDL Tridion CME successfully (no LDAP-AD sync is used, i.e. the MMC console has no LDAP-AD configuration).
Now the user goes about his/her business: editing/creating content pages or navigating around the CME.
Does Tridion authenticate the user every time such an activity happens? Is this authentication done against LDAP-AD or against some sort of cache (if there is one)?
A long time back someone advised me that there is a config setting in Tridion where you can turn on or off authentication for each activity the user does after a successful login... (I am not able to recall the details of the conversation).
I believe IIS will indeed re-authenticate every request.
If you use something like Fiddler (which I strongly recommend) you will see that every request to a URL comes back first as an HTTP 401, then is sent back to the server with the correct credentials.
So, yes, every request must be authenticated. If you disable a user account while that user is working in Tridion, they will start getting "access denied" errors halfway through their session.
Yes - Tridion will authenticate you every time. In other words, every time you make a web request, a new TDSE or Session will be created. This can be far more often than you might think. I don't know the details for SDL Tridion 2011, but back in the R5 days, I recall that it took 6 authentications to load your starting view of the GUI.
It's quite likely that authentications against an LDAP server are cached; the old ISAPI filter did this. However, I don't recall the authorization lookups being cached. There was never a solid requirement to support exotic or "heavy" LDAP integrations.
There used to be a setting that would make the GUI cache the TDSE, but this was only ever an internal experimental feature. I wouldn't recommend trying to use it, and it's very clear that this configuration isn't supported.