SonarQube HTTP header authentication

Has anyone successfully implemented SonarQube based on the documentation given here?
I have a project dashboard which authenticates against my company's LDAP. The requirement is to reuse this authentication and invoke SonarQube with the X-Forwarded-Login and X-Forwarded-Name headers, as described in the documentation. Is there a way to achieve this? Based on my analysis it is not possible, since we cannot set headers on external redirects.
Is there any way to achieve the stated objective?
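For context, the pattern the SonarQube documentation assumes is a reverse proxy sitting in front of SonarQube that authenticates the user itself and then injects the headers on every proxied request; the headers are never set on a browser redirect. A minimal nginx sketch of that pattern (hostname, auth mechanism, and file paths are placeholders, not a tested configuration):

    # Authenticating reverse proxy in front of SonarQube.
    # Assumes sonar.web.sso.enable=true in sonar.properties; the header
    # names below are the documented defaults.
    server {
        listen 443 ssl;
        server_name sonar.example.com;              # placeholder hostname

        location / {
            # Stand-in for your LDAP auth; anything that populates
            # $remote_user (auth_basic, auth_request, ...) works here.
            auth_basic "LDAP";
            auth_basic_user_file /etc/nginx/htpasswd;

            proxy_set_header X-Forwarded-Login $remote_user;
            proxy_set_header X-Forwarded-Name  $remote_user;
            proxy_pass http://127.0.0.1:9000;       # default SonarQube port
        }
    }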

How to (can you) configure Azure API Management for Auth0 single page application

Scenario: Auth0 Single Page application client. .NET Web API and Angular SPA both configured to use this client. Works great.
I'd like to add Azure API Management as a layer in front of the API. I've set up the API in the Management Portal, updated the SPA to call the API, and tested calls from the SPA; works great.
Now, I'd like to configure API Management Portal with the right security settings such that people can invoke API calls from the Developer Portal. I've used this [https://auth0.com/docs/integrations/azure-api-management/configure-azure] as a guide.
Where I'm at:
From the Developer portal, I can choose Authorization Code as an Auth type, go through a successful sign-in process with Auth0 and get back a Bearer token. However, calls made to the API always return 401. I think this is because I'm confused about how to set it up right. As I understand it:
Either I follow the instructions and set up a new API client in Auth0; but in that case surely it won't work, because tokens generated for one client aren't going to work against my SPA client? (Or is there something I need to change to make it work?)
Or: how should I configure Azure API Management to work with an SPA application? (This would be my preferred method; having two clients in Auth0 seems 'messy'.) But don't I need an 'audience' value in my authorization endpoint URL? How do I get that?
If anyone has done this, would very much appreciate some guidance here.
Well, I didn't think I'd be back to answer my own question quite so soon. The reason is mostly rooted in my general ignorance of this stuff, combined with trying to take examples and fuse them together for my needs. Posting this to help out anyone else who finds themselves here.
Rather than take the Single Page Application client in Auth0 and make it work with Azure API Management, I decided to go the other way and make the non-interactive client work with my SPA. This eventually 'felt' more right: the API is what I'm securing, so I should get the API Management portal working first, then change my SPA to work with it.
Once I remembered/realised that I needed to update the audience in my API to match the audience set in the client in Auth0, the Management Portal started working. Getting the SPA to work with the API then became a challenge: I was trying to find out how to change the auth0 Angular code to pass an audience matching the one the API expected, but it kept sending the ClientID instead. (By the way, finding all that out was made much easier by using https://jwt.io/ to decode the Bearer tokens and work out what was happening; look at the 'aud' value for the audience.)
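For illustration, a decoded access-token payload looks roughly like this (all values made up); the 'aud' claim is the audience being discussed:

    {
      "iss": "https://YOUR_DOMAIN.auth0.com/",
      "sub": "auth0|1234567890",
      "aud": "https://my-api-identifier",
      "iat": 1484210000,
      "exp": 1484296400,
      "scope": "openid profile"
    }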
In the end, I changed my API: in the new JwtBearerAuthenticationOptions object, the TokenValidationParameters object (of type TokenValidationParameters) has a ValidAudiences property (yes, there is also a ValidAudience property, confusingly) which can take multiple audiences. So I added my ClientID to that.
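A minimal sketch of what that looks like in an OWIN Startup class, assuming the Microsoft.Owin.Security.Jwt middleware the post describes (domain, audience, ClientID, and certificate file are placeholders, not the poster's real values):

    // Startup.Auth.cs -- sketch only, values are placeholders
    using System.IdentityModel.Tokens;
    using System.Security.Cryptography.X509Certificates;
    using Microsoft.Owin.Security;
    using Microsoft.Owin.Security.Jwt;
    using Owin;

    public partial class Startup
    {
        public void ConfigureAuth(IAppBuilder app)
        {
            // RS256 signing certificate downloaded from Auth0
            var signingCert = new X509Certificate2("auth0-signing-cert.cer");

            app.UseJwtBearerAuthentication(new JwtBearerAuthenticationOptions
            {
                AuthenticationMode = AuthenticationMode.Active,
                TokenValidationParameters = new TokenValidationParameters
                {
                    ValidIssuer = "https://YOUR_DOMAIN.auth0.com/",
                    // ValidAudiences (plural) accepts several audiences, so
                    // the API identifier and the SPA ClientID can both be
                    // listed, as the answer describes.
                    ValidAudiences = new[]
                    {
                        "https://my-api-identifier",
                        "MY_SPA_CLIENT_ID"
                    },
                    IssuerSigningToken = new X509SecurityToken(signingCert)
                }
            });
        }
    }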
The only other thing I then changed (which might be specific to me, not sure) is the JsonWebToken Signature Algorithm value in Auth0 for my non-interactive client (Advanced Settings, OAuth tab): from HS256 to RS256.
With all that done, now requests from both the API Management Portal, and my SPA work.
Curious to know if this is the "right" way of doing it, or if I've done anything considered dangerous here.
Since you're able to make JWT validation work with the .NET API, only a few changes are actually necessary to get this working with Azure API Management.
In API Management:
Create a validate-jwt inbound policy on an operation (or all operations); a sketch follows below.
Set the audiences and issuers to the same values you've used with your .NET Web API (you can check the values in the Auth0 portal if you don't know them yet).
The important field that is missing at this point is the OpenID URL, since Auth0 uses RS256 by default. The URL can be found in your Auth0 portal at Applications -> your single page application -> Settings -> scroll down to Show Advanced Settings -> Endpoints, then copy the OpenID Configuration URL.
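A sketch of such a policy (the URL, audience, and issuer are placeholders for your tenant's values):

    <inbound>
        <validate-jwt header-name="Authorization"
                      failed-validation-httpcode="401"
                      failed-validation-error-message="Unauthorized">
            <!-- RS256: APIM fetches the signing keys from this document -->
            <openid-config url="https://YOUR_DOMAIN.auth0.com/.well-known/openid-configuration" />
            <audiences>
                <audience>https://my-api-identifier</audience>
            </audiences>
            <issuers>
                <issuer>https://YOUR_DOMAIN.auth0.com/</issuer>
            </issuers>
        </validate-jwt>
        <base />
    </inbound>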
Here's the reference for API Management's requirements for JWT tokens.
Intermixing Spring Security OAuth2 and Jose4j

I was wondering (before I try implementing something along this path) about an approach.
Let's say I have a complete, working OAuth2 system (using Spring Boot and Spring Cloud, but not Spring Cloud Security). So far this has worked quite well and supports several different grant types.
What I am interested in is the possibility of hand-crafting the JWT in certain special cases and then using this token with Spring Security.
I looked at jose4j, and it seems like I should be able to use it in place of the authorization-server portion of the system. Note that the goal here is to create a token that would normally have been generated by the authorization server.
You might want to look at TokenEnhancer or, in particular, JwtAccessTokenConverter. The latter also provides encode/decode methods you can override to use custom libraries for encoding/decoding tokens, or to add custom properties.
Note: Make sure to check the signature of the tokens!
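As a hedged sketch of that idea, here is a JwtAccessTokenConverter subclass that delegates encode/decode to jose4j; the class name and key handling are illustrative, not a drop-in implementation:

    import java.security.KeyPair;
    import java.util.Map;
    import org.jose4j.jws.AlgorithmIdentifiers;
    import org.jose4j.jws.JsonWebSignature;
    import org.jose4j.jwt.JwtClaims;
    import org.jose4j.jwt.consumer.JwtConsumer;
    import org.jose4j.jwt.consumer.JwtConsumerBuilder;
    import org.springframework.security.oauth2.common.OAuth2AccessToken;
    import org.springframework.security.oauth2.provider.OAuth2Authentication;
    import org.springframework.security.oauth2.provider.token.store.JwtAccessTokenConverter;

    public class Jose4jAccessTokenConverter extends JwtAccessTokenConverter {

        private final KeyPair keyPair; // RSA pair, for RS256 signing

        public Jose4jAccessTokenConverter(KeyPair keyPair) {
            this.keyPair = keyPair;
        }

        @Override
        protected String encode(OAuth2AccessToken accessToken,
                                OAuth2Authentication authentication) {
            try {
                // Reuse Spring's claim mapping, then sign with jose4j.
                JwtClaims claims = new JwtClaims();
                Map<String, ?> tokenMap = getAccessTokenConverter()
                        .convertAccessToken(accessToken, authentication);
                for (Map.Entry<String, ?> e : tokenMap.entrySet()) {
                    claims.setClaim(e.getKey(), e.getValue());
                }
                JsonWebSignature jws = new JsonWebSignature();
                jws.setPayload(claims.toJson());
                jws.setKey(keyPair.getPrivate());
                jws.setAlgorithmHeaderValue(AlgorithmIdentifiers.RSA_USING_SHA256);
                return jws.getCompactSerialization();
            } catch (Exception ex) {
                throw new IllegalStateException("Cannot encode access token", ex);
            }
        }

        @Override
        protected Map<String, Object> decode(String token) {
            try {
                // Verify the signature before trusting any claims.
                JwtConsumer consumer = new JwtConsumerBuilder()
                        .setVerificationKey(keyPair.getPublic())
                        .setSkipDefaultAudienceValidation()
                        .build();
                return consumer.processToClaims(token).getClaimsMap();
            } catch (Exception ex) {
                throw new IllegalArgumentException("Cannot decode token", ex);
            }
        }
    }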

REST APIs and Pega

I can see that you have some expertise with REST APIs and Pega. I would like to know: if we expose a web call to Pega using REST APIs, will we get all the custom rules and everything, or do we need to replicate the rules?
Regards,
Sudhanshu
If you are consuming the REST API and will be using Pega as the REST client, you do not have to create the rules manually. There is an accelerator ("wizard") that will create the rules for you, based on an example request/response for the REST API.
REST means "Representational State Transfer". To create a REST integration in Pega, a wizard ("widget") is provided; using the wizard we can create it. Please visit the link for more details:
https://myknowpega.blogspot.com/2019/04/pega-81-application-development.html
Since your question is not entirely clear, let me cover both options (i.e., connector and service). Also, I am assuming you are using the latest version of Pega.
Integrating Pega with an external service (via Connect-REST)
This can be achieved using the integration wizard, navigating to Configure -> Integration -> Connectors -> Create REST Integration.
Step-by-step documentation is available here:
https://docs.pega.com/data-management-and-integration/84/creating-rest-integration
Pega exposed as a service to external systems (via Service-REST)
It is a bit of a manual process, involving the creation of service packages and Service-REST rules, and configuring the GET/POST/PUT/PATCH/DELETE methods and their corresponding responses.
Documentation on the same is available here:
https://docs.pega.com/data-management-and-integration/84/service-rest-rules

How to perform cross-domain SAML-based authentication/trust

I have a product which consists of internal ASP.NET/MVC web sites, all using WIF to enable SSO through a custom STS/IdP service. We now have a new partner site hosted outside our network on another domain and would like to enable SSO for users as they navigate between the sites. The new site uses different technologies (e.g. Python), but we assume we can create a trust relationship using SAML standards as the protocol.
With SAML as the underlying protocol, we assume this can be achieved, but we cannot find any guidance on implementation patterns, best practices, etc. Can someone recommend some resources on how to establish this type of cross-domain trust?
Note: While other options like OAuth could address this, we would prefer to stick with a SAML-based solution
Does your custom STS/IdP service support SAML?
On the Python side, they will need a SAML stack. There are a number around; see, e.g., the Introduction to OneLogin's SAML Toolkits. There's a good diagram there as well that shows the login flow.
Then you need to get the Python side's SAML metadata and give them your custom STS/IdP's SAML metadata.
Import on both sides, configure whatever assertions you need and you should be good to go.
You'll also need to sort out the signing certificates that go into the metadata.
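For reference, SP metadata is just an XML document like the sketch below (entityID, endpoint, and certificate are placeholders); the IdP metadata is analogous, with SSO endpoints instead of an AssertionConsumerService:

    <md:EntityDescriptor xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata"
                         entityID="https://partner.example.com/sp">
      <md:SPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol"
                          AuthnRequestsSigned="true" WantAssertionsSigned="true">
        <md:KeyDescriptor use="signing">
          <ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
            <ds:X509Data>
              <ds:X509Certificate>BASE64_ENCODED_CERTIFICATE</ds:X509Certificate>
            </ds:X509Data>
          </ds:KeyInfo>
        </md:KeyDescriptor>
        <md:AssertionConsumerService index="0"
            Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST"
            Location="https://partner.example.com/saml/acs" />
      </md:SPSSODescriptor>
    </md:EntityDescriptor>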
An example of the process using SimpleSAMLphp: Configuring the SP.
The following provides an excellent example of a SAML implementation:
http://www.codeproject.com/Articles/56640/Performing-a-SAML-Post-with-C
We used it in our project and it worked fine.

How to authenticate in order to access Kibana 2.0?

Q: Kibana is great, but I want to make it so users have to authenticate in order to access it. How do I do that? A: This can be handled a number of ways. The best way is to run Kibana with Passenger and Apache or Nginx. There are sample configurations in the sample directory. You can then handle your preferred authentication mechanism with Apache or Nginx.
How do I do this? I do not use any of these programs. Could someone give me a basic overview of what I have to do? Any help would be nice; I am a student and still learning, but I need help to keep going. I don't know everything.
I am running Ubuntu.
Well, actually, even if you set up some kind of authentication in front of Kibana, it won't be enough. As you probably know, Kibana runs on top of Elasticsearch, so even if you "limit" permissions to Kibana, everyone can still access Elasticsearch directly and see existing indices or even create new ones. So the main question is whether you can manage AuthN and AuthZ against Elasticsearch.
For authentication, you can integrate Kibana/Elasticsearch with whichever framework you are using (e.g. Play, Spring MVC). Create a login page (authentication) using the framework, point Kibana at the web/app server embedded in the framework, and pass Kibana's requests to Elasticsearch, and Elasticsearch's responses back to Kibana, through this framework. Basically, the framework will be a mediator between Kibana and Elasticsearch. You also need to block the Elasticsearch server port, so that nobody can directly access ES.
Kibana<-->Intermediate Framework<-->Elastic Search
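A concrete minimal version of that mediator, using nginx as the FAQ suggests, with basic auth in front of Kibana (hostname, port, and htpasswd path are assumptions):

    server {
        listen 80;
        server_name kibana.example.com;                  # placeholder hostname

        # Any auth nginx supports works here; basic auth is the simplest.
        auth_basic "Kibana";
        auth_basic_user_file /etc/nginx/htpasswd.users;  # create with htpasswd

        location / {
            proxy_pass http://127.0.0.1:5601;            # assumed local Kibana port
        }
    }

    # Also bind Elasticsearch to localhost (network.host: 127.0.0.1 in
    # elasticsearch.yml) and/or firewall port 9200, so clients cannot
    # bypass the proxy and talk to ES directly.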
Hope this helps!
