Is there any authentication method on client id and the operation - ibm-datapower

Does anyone have an idea for the requirement below?
I have a SOAP web service with three operations: add, delete and update. It needs to authenticate two things:
First, the client id that is sent in the request body, checked with an AAA action; if that check passes, a second check verifies the privileges on the allowed operations for that client.

You can write your own custom AAA stylesheet and do multiple checks in it. Another option would be to run multiple AAA actions one after another, but I'd go for the custom stylesheet.
See this quick sample here: https://www.ibm.com/support/pages/authentication-customization-websphere-datapower-aaa-policy
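For illustration, here is a minimal sketch of what such a custom stylesheet could look like. Everything in it is an assumption for the example: the XPaths for the client id and the operation depend on how your AAA policy extracts identity and resource, and the expected output of a custom authorization step (approved/declined below) should be verified against the linked IBM sample.
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical custom AAA sketch: approve only if the client id is known
     AND that client is allowed to call the requested operation. -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/">
    <!-- ASSUMPTION: adjust these XPaths to the input your AAA step actually receives. -->
    <xsl:variable name="client" select="string(//client-id)"/>
    <xsl:variable name="operation" select="string(//operation)"/>
    <xsl:choose>
      <!-- Example privilege table; in a real policy this could come from a
           configuration document or an external lookup. -->
      <xsl:when test="$client = 'client-a' and ($operation = 'add' or $operation = 'update')">
        <approved/>
      </xsl:when>
      <xsl:when test="$client = 'client-b' and $operation = 'delete'">
        <approved/>
      </xsl:when>
      <xsl:otherwise>
        <declined/>
      </xsl:otherwise>
    </xsl:choose>
  </xsl:template>
</xsl:stylesheet>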

Related

Logic App using HTTP Action to access and GET Jira ticket

So I followed this answer and it works fine:
https://stackoverflow.com/a/74981947/20829088
Provided URL:
https://<YOUR_DOMAIN>.atlassian.net/rest/api/2/search?jql=project=<PROJECTID>&fields=issue,status,name&startAt=0&maxResults=8000
However, it takes a lot of time. So I want the URL to check for specific tickets depending on the created time and the type of ticket. For example,
I want tickets that were created within the last 15 days and that are NOT sub-tasks.
So I tried something like this:
.....&fields=issue,summary,issuetype&created>=-15d&hierarchylevel=0
I'm not sure how it should be written; I just tried this and it doesn't work.
Here is the request result in JSON:
It should be either [subtask=false] OR [hierarchylevel=0] OR [name=Task]
After reproducing from my end, I was able to achieve this using the Condition connector of Logic Apps. I initialized an array variable first and then appended each item that satisfies the condition. Below is the flow of my logic app.
I then used Parse JSON to retrieve the required values for the condition comparison.
You can use the Code view below to reproduce the same in your environment.
{"definition":{"$schema":"https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#","actions":{"For_each":{"actions":{"Compose":{"inputs":"#items('For_each')?['fields']?['created']","runAfter":{},"type":"Compose"},"Condition":{"actions":{"Append_to_array_variable":{"inputs":{"name":"Array","value":"#items('For_each')"},"runAfter":{},"type":"AppendToArrayVariable"}},"expression":{"or":[{"greaterOrEquals":["#formatDateTime(outputs('Compose'),'yyyy-MM-dd')","#formatDateTime(addDays(utcNow(),-15),'yyyy-MM-dd')"]},{"equals":["#items('For_each')?['fields']?['issuetype']?['subtask']",false]},{"equals":["#items('For_each')?['fields']?['issuetype']?['hierarchyLevel']",0]},{"equals":["#items('For_each')?['fields']?['status']?['statusCategory']?['name']","Task"]}]},"runAfter":{"Compose":["Succeeded"]},"type":"If"}},"foreach":"#body('Parse_JSON')?['issues']","runAfter":{"Parse_JSON":["Succeeded"]},"type":"Foreach"},"HTTP":{"inputs":{"authentication":{"password":"<API_KEY>","type":"Basic","username":"<USERNAME>"},"method":"GET","uri":"https://jira#<ProjectName>.atlassian.net/rest/api/2/search?jql=project=<ProjectID>"},"runAfter":{},"type":"Http"},"Initialize_variable":{"inputs":{"variables":[{"name":"Array","type":"array"}]},"runAfter":{"HTTP":["Succeeded"]},"type":"InitializeVariable"},"Parse_JSON":{"inputs":{"content":"#body('HTTP')","schema":{"properties":{"expand":{"type":"string"},"issues":{"items":{"properties":{"expand":{"type":"string"},"fields":{"properties":{"aggregateprogress":{"properties":{"progress":{"type":"integer"},"total":{"type":"integer"}},"type":"object"},"aggregatetimeestimate":{},"aggregatetimeoriginalestimate":{},"aggregatetimespent":{},"assignee":{},"components":{"type":"array"},"created":{"type":"string"},"creator":{"properties":{"accountId":{"type":"string"},"accountType":{"type":"string"},"active":{"type":"boolean"},"avatarUrls":{"properties":{"16x16":{"type":"string"},"24x24":{"type":"string"},"32x32":{"type":"string"},"48x48":{"type":"string"}},"type":"object"},"displayName":{"type":"string"},"emailAddress":{"type":"string"},"self":{"type":"string"},"timeZone":{"type":"string"}},"type":"object"},"customfield_10001":{},"customfield_10002":{},"customfield_10003":{},"customfield_10004":{},"customfield_10005":{},"customfield_10006":{},"customfield_10007":{},"customfield_10008":{},"customfield_10009":{},"customfield_10010":{},"customfield_10014":{},"customfield_10015":{},"customfield_10016":{},"customfield_10017":{"type":"string"},"customfield_10018":{"properties":{"hasEpicLinkFieldDependency":{"type":"boolean"},"nonEditableReason":{"properties":{"message":{"type":"string"},"reason":{"type":"string"}},"type":"object"},"showField":{"type":"boolean"}},"type":"object"},"customfield_10019":{"type":"string"},"customfield_10020":{},"customfield_10021":{},"customfield_10022":{},"customfield_10023":{},"customfield_10024":{},"customfield_10025":{},"customfield_10026":{},"customfield_10027":{},"customfield_10028":{},"customfield_10029":{},"customfield_10030":{},"customfield_10033":{},"description":{},"duedate":{},"environment":{},"fixVersions":{"type":"array"},"issuelinks":{"type":"array"},"issuetype":{"properties":{"avatarId":{"type":"integer"},"description":{"type":"string"},"entityId":{"type":"string"},"hierarchyLevel":{"type":"integer"},"iconUrl":{"type":"string"},"id":{"type":"string"},"name":{"type":"string"},"self":{"type":"string"},"subtask":{"type":"boolean"}},"type":"object"},"labels":{"type":"array"},"lastViewed":{},"priority":{"
properties":{"iconUrl":{"type":"string"},"id":{"type":"string"},"name":{"type":"string"},"self":{"type":"string"}},"type":"object"},"progress":{"properties":{"progress":{"type":"integer"},"total":{"type":"integer"}},"type":"object"},"project":{"properties":{"avatarUrls":{"properties":{"16x16":{"type":"string"},"24x24":{"type":"string"},"32x32":{"type":"string"},"48x48":{"type":"string"}},"type":"object"},"id":{"type":"string"},"key":{"type":"string"},"name":{"type":"string"},"projectTypeKey":{"type":"string"},"self":{"type":"string"},"simplified":{"type":"boolean"}},"type":"object"},"reporter":{"properties":{"accountId":{"type":"string"},"accountType":{"type":"string"},"active":{"type":"boolean"},"avatarUrls":{"properties":{"16x16":{"type":"string"},"24x24":{"type":"string"},"32x32":{"type":"string"},"48x48":{"type":"string"}},"type":"object"},"displayName":{"type":"string"},"emailAddress":{"type":"string"},"self":{"type":"string"},"timeZone":{"type":"string"}},"type":"object"},"resolution":{},"resolutiondate":{},"security":{},"status":{"properties":{"description":{"type":"string"},"iconUrl":{"type":"string"},"id":{"type":"string"},"name":{"type":"string"},"self":{"type":"string"},"statusCategory":{"properties":{"colorName":{"type":"string"},"id":{"type":"integer"},"key":{"type":"string"},"name":{"type":"string"},"self":{"type":"string"}},"type":"object"}},"type":"object"},"statuscategorychangedate":{"type":"string"},"subtasks":{"type":"array"},"summary":{"type":"string"},"timeestimate":{},"timeoriginalestimate":{},"timespent":{},"updated":{"type":"string"},"versions":{"type":"array"},"votes":{"properties":{"hasVoted":{"type":"boolean"},"self":{"type":"string"},"votes":{"type":"integer"}},"type":"object"},"watches":{"properties":{"isWatching":{"type":"boolean"},"self":{"type":"string"},"watchCount":{"type":"integer"}},"type":"object"},"workratio":{"type":"integer"}},"type":"object"},"id":{"type":"string"},"key":{"type":"string"},"self":{"type":"string"}},"required":["expand","id","self","key","fields"],"type":"object"},"type":"array"},"maxResults":{"type":"integer"},"startAt":{"type":"integer"},"total":{"type":"integer"}},"type":"object"}},"runAfter":{"Initialize_variable":["Succeeded"]},"type":"ParseJson"}},"contentVersion":"1.0.0.0","outputs":{},"parameters":{},"triggers":{"manual":{"inputs":{"schema":{}},"kind":"Http","type":"Request"}}},"parameters":{}}

HERE API demo vs actual

Our query using the Freemium plan returns different results than the exact same query against the HERE API demo endpoint. As you can see, we do not have Address or Contacts. I'm not sure why the results vary on the same exact query and endpoint. Any ideas?
We have expanded the initial search/explore response for the Demo App ID in order to let users do some testing, but it is not turned on for the Freemium App ID. If you need specific details (like address and contact), then you can use the places/lookup API as shown below (you search with source: sharing and id: the id for the place, which you can get from your query above).
We do this because we expect the end user to select an item to get additional information. When an item is selected, we receive that request, and it is a signal to us that the result is relevant and important to the query.
https://places.demo.api.here.com/places/v1/places/124aabd1-0aef738f80350f8bebb5ed7539bd19a8;context=Zmxvdy1pZD1lNjIyNjczZS0xNDRmLTViMzctYjY3Mi1hNWQ5MmRkNWU4NzRfMTU0MTc4NDk3MzYzOV8wXzU1NDcmc2l6ZT01JlgtRldELUFQUC1JRD1LTnZIaDlhZ0E2WGxKbElDRWhOZiZYLU5MUC1UZXN0aW5nPTE?app_id=xxx&app_code=xxx

Two WordPress databases with the same users

I want to have the same WordPress users in two different databases.
For example, if a user registers on SiteA, then he can log in to SiteB, and the reverse.
I also want to create the same cookie for both sites after login.
mywebsite.com/ (SiteA_DB)
mywebsite.com/blog/ (SiteB_DB)
I've never done this before, and maybe WordPress has hooks to achieve this, but I prefer using MySQL for such a trick.
You could try one of the following:
1. Using 'federated storage' ( https://stackoverflow.com/a/24532395/10362812 ). This is my favorite, because you don't even have to share a database or even the MySQL server. The downside is that it doesn't work with the DB cache and uses an additional connection. A hedged sketch follows directly below.
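For option 1, this is roughly what the federated table could look like. It is only a sketch: the column list is abbreviated and must mirror the real wp_users definition in first_database, the connection credentials and host are placeholders, and the FEDERATED engine has to be enabled on the server.
-- Sketch only: the column definitions must match first_database.wp_users exactly;
-- repeat the same pattern for wp_usermeta.
CREATE TABLE `second_database`.`wp_users` (
  `ID` BIGINT(20) UNSIGNED NOT NULL AUTO_INCREMENT,
  `user_login` VARCHAR(60) NOT NULL DEFAULT '',
  -- ... remaining wp_users columns ...
  PRIMARY KEY (`ID`)
)
ENGINE=FEDERATED
CONNECTION='mysql://db_user:db_password@first_db_host:3306/first_database/wp_users';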
2. Creating a 'view' ( https://stackoverflow.com/a/1890165/10362812 ). This should be possible when using the database name in the query itself, and it would be the simplest solution if it works. Downside: the two tables have to share the same MySQL server and have to be assigned to the same user, as far as I know.
-- **Backup your database before trying!** --
DROP TABLE `second_database`.`wp_users`;
DROP TABLE `second_database`.`wp_usermeta`;
CREATE VIEW `second_database`.`wp_users` AS SELECT * FROM `first_database`.`wp_users`;
CREATE VIEW `second_database`.`wp_usermeta` AS SELECT * FROM `first_database`.`wp_usermeta`;
This should work, according to: Creating view across different databases
3. Creating a 'shadow copy' ( https://stackoverflow.com/a/1890166/10362812 ). This works with caching and is a standalone table. Downsides: the same as solution 2, plus a bit of setup, and I think it might be the worst option in terms of performance.
These were answers to this question: How do I create a table alias in MySQL. I merged them together for you and made them fit your use case.
Please also note that solutions 1 and 2 will replace your current user tables in "second_database", because you write directly into "first_database" when querying the federated storage or the view. This can lead to problems with user-role plugins. You should also take care of syncing the plugin options, in case you use such a plugin and it uses different tables or 'wp_options' values.
Let me know if this works; I have to do a similar task next week. While researching, I found the linked answers.
EDIT: I was missing the point of cookie sharing in my answer. Your example shows a blog on the same domain, so you should be able to change the way WordPress sets its cookies so that they are valid domain-wide. What I did once for two different domains was to hook into the backend (is_admin) and add a JavaScript snippet that made a POST request to SiteB, receiving a token which was stored but marked as 'invalid' on SiteB. This token was then passed back to my plugin on SiteA, which checked whether the user is logged in and (in my case) has admin rights (current_user_can()), and if so, it sent the token back to SiteB, which marked it as valid for login. (Make sure only SiteA can tell SiteB to make this token valid!) Once a user is seen with this token in a cookie on SiteB, the user is logged in automatically in the background. I also made this bidirectional. I'm sorry that I can't share the code with you; I don't have access to it anymore.
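For the same-domain setup in your example (SiteA at / and SiteB at /blog/), a minimal sketch of the wp-config.php side of cookie sharing might look like the following. Treat it as an assumption to verify: both installs also need identical authentication keys and salts, and possibly an identical COOKIEHASH, so that a cookie issued by one site validates on the other.
// In wp-config.php of BOTH installs (values here are placeholders).
define('COOKIE_DOMAIN', 'mywebsite.com'); // auth cookie valid for the whole domain
define('COOKIEPATH', '/');                // cookie path covers both / and /blog/
define('SITECOOKIEPATH', '/');
// AUTH_KEY, SECURE_AUTH_KEY, LOGGED_IN_KEY, NONCE_KEY and the corresponding
// *_SALT values must be identical in both wp-config.php files, otherwise
// each site will reject the other's login cookie.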
Greetings, Eric!

Bosun: Save information using post URL and then get the same information and use it in a template

We have a notification which will post data to an application using the application endpoint.
notification ABC {
    post = savedetailsurl
    body = {{.|json}}
    useBody = true
}
So the endpoint will save all the details in a MySQL DB.
Now, in our template, we call another endpoint to get the details which we saved using the webhook in the notification.
template ABC {
    use the "getDetailsUrl" and use the details in forming the email
}
Now the problem is a race condition: sometimes the details are not yet saved in the backend (MySQL) when getDetailsUrl is called, so we get an empty result.
Is there a way to solve the race condition?
Bosun's notification system is designed to be very basic. If you want something more advanced, you will need to use a separate system to generate the notification details and/or handle the alert workflow. Some people have used PagerDuty or other monitoring systems like Shinken to do more advanced notifications or alert management.
Your best bet is to skip the built-in notifications and do everything in an external system. You can still use the Bosun API ( http://bosun.org/api ) to integrate with the various alert states (crit/warn/ack/close/etc.), or you can change your alerts to use log = true to bypass all the built-in states and create your own workflow.
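For reference, switching an alert to log = true just means setting that key on the alert definition; Bosun then treats the alert as stateless and sends the notification every time the condition is true. This is a hedged sketch with made-up metric, notification, and template names:
alert details.saved.check {
    # Hypothetical expression; replace with a query against your own data.
    crit = avg(q("sum:app.save.errors{host=*}", "5m", "")) > 0
    log = true
    critNotification = ABC
    template = ABC
}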

Can multiple requests update a single environment variable in Paw?

I have a variable named primary_address_id which can be set or updated via several API requests. For example, I may call AddAddress and specify that the new address should be the primary, or I can call MakePrimaryAddress to set an existing address as the primary.
I'm coming from Postman, where I have tests defined for each of these API endpoints to update primary_address_id -- simple. But I can't find a way to do this in Paw; it seems I have to set the value from the response of just a single request. Am I missing something obvious? Or is this feature planned for a future release?
A workaround is to set the value of primary_address_id from the response of GetPrimaryAddress, but that means if I add or update an address I have to make a second call just to update my environment (which I may forget to do). If I could trigger GetPrimaryAddress to run after the Add/Update/List/etc. endpoints, that would be an acceptable workaround, but I shouldn't need to manually make two separate requests to accomplish this.
It sounds like you will need to make two subsequent requests, but you can create groups of requests that will execute in sequence from one command.
Right-click the request list and click "New Group"; then, within that group, you can build a sequence of requests that will update your desired environment variable each time.
Create a new group of requests
To run a group of requests, click on the group name (in this case "Address") and then click "Send Requests".
Execute group of requests in sequence
Hope this helps.
