DataPower monitoring or validation techniques - ibm-datapower

How can I improve our DataPower monitoring? For example, I want to check that all objects (FSHs/MQFSHs, SSL proxies, crypto profiles, etc.) are up and be notified by email (or some other channel) if any of them goes down, and also check the number of files in the File Management on-disk folders. Basically, I want to validate a service after deployment (we use SoapUI to test adapter functionality, but I am looking for something additional to improve the validation). Please suggest any ideas that could be implemented as a process improvement on DataPower.

For example, you can get the status of all your domains using the SOMA call below. You can test this with SoapUI. You can get the list of the various SOMA calls from the DataPower management WSDL (available in the DataPower store: directory).
<!-- get all the domains -->
<xsl:variable name="domainsList">
  <dp:url-open target="{$XML-MGMT-URL}" response="responsecode">
    <env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
      <env:Body>
        <dp:request xmlns:dp="http://www.datapower.com/schemas/management">
          <dp:get-status class="DomainStatus"/>
        </dp:request>
      </env:Body>
    </env:Envelope>
  </dp:url-open>
</xsl:variable>

Try using the SOMA commands of the XML Management Interface to check object status.
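As a rough illustration, a SOMA request for the ObjectStatus class returns the operational state of every object in a domain (front side handlers, SSL proxies, crypto profiles, and so on). This is a sketch only; the domain name is a placeholder and the request is POSTed to the XML Management Interface endpoint (typically https://<device>:5550/service/mgmt/current).
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Body>
    <!-- ObjectStatus lists Name, Class, OpState and ErrorCode for every object
         in the requested domain; filter for anything whose OpState is not "up" -->
    <dp:request xmlns:dp="http://www.datapower.com/schemas/management" domain="MY_DOMAIN">
      <dp:get-status class="ObjectStatus"/>
    </dp:request>
  </env:Body>
</env:Envelope>
You can fire this from SoapUI or a scheduled script with the device's admin credentials and send an email whenever anything comes back in a state other than up.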

I am not sure if this is the best approach, but this is how I have implemented it. You can create a testing service in DataPower, with or without an interactive Java application, to perform all the SOAP tests you currently run through SoapUI. You can make SOMA/AMP calls to check the status of objects, ping external services, and so on, and schedule these tests to run at a regular interval or on demand.
Depending on how you set it up, you can either generate an email with the status of each object/service you are testing or build an HTML dashboard that records the current status of everything.
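For the file-count part of the original question, a similar SOMA call can list a filestore. A minimal sketch, assuming the local: directory and a placeholder domain name; the response lists the files and subdirectories under that location, which a monitoring script can simply count:
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Body>
    <!-- get-filestore returns the contents of the given location for the domain -->
    <dp:request xmlns:dp="http://www.datapower.com/schemas/management" domain="MY_DOMAIN">
      <dp:get-filestore location="local:"/>
    </dp:request>
  </env:Body>
</env:Envelope>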

Related

Connecting application to a single notebook in Evernote, supportLinkedSandbox parameter

I am trying to limit access to one notebook using the supportLinkedSandbox=true parameter, as described here:
http://dev.evernote.com/doc/articles/app_notebook.php
It seems this parameter has no effect in the sandbox environment.
https://www.sandbox.evernote.com/OAuth.action?oauth_token=...]&preferRegistration=true&supportLinkedSandbox=true
What could be wrong?
In order to use supportLinkedSandbox=true, the notebook sandboxing feature needs to be enabled for a particular API key. You can request enabling this feature by writing to devsupport at evernote dot com.

Simplest possible BAM Scenario

I’m trying to set up a very simple BAM scenario within BizTalk Server 2013R2 upon which to build, involving tracking just the time of all incoming messages processed by a port.
To this end I have:
- Within Excel, created an Activity Definition (called SimpleReceiveTest) containing a single Item called ReceiveTime of type milestone (date/time), and a View Definition (also called SimpleReceiveTest) containing just this Activity Definition and Item.
- Imported this BAM definition spreadsheet using bm.exe.
- Added view rights to SimpleReceiveTest, again using bm.exe.
- Launched the Tracking Profile Editor, imported the BAM Activity Definition, and mapped ActivityID = MessageID and ReceiveTime = PortStartTime by drag and drop from the Messaging Property Schema (screenshot omitted).
- Set the Port Mappings for MessageID and PortStartTime to relate to a test Receive Port ReceivePort1 that I am using for testing. This is using a pass-through pipeline.
- Saved and applied the above Tracking Profile.
It is my understanding that for any messages received on port ReceivePort1 I should now get a tracking activity created. However this is not happening – there are no records in any of the BAM tables/views and no data is available within the BAM Portal.
I have tried restarting the hosts, and have verified that the TDDS_FailedTrackingData table is empty, there is nothing relevant in the event log, a Tracking host is running and the SQL Agent Jobs are running. I have also tried running these jobs manually.
Have I missed something, and am I correct in my expectation that this simple scenario should create tracked activities for any messages passing through the Receive Port? If so what can I try to further diagnose this?
Now fixed - it's actually a bug in vanilla BizTalk 2013R2 when using a standard pipeline that has been fixed in CU2.
FIX: BAM tracking doesn’t work when you use the XMLReceive or a custom pipeline in BizTalk Server

Is there any way to input the result of a curl command via fluentd?

We are looking for the simplest way to send Alfresco's audit log to Elasticsearch.
I think using the audit query that Alfresco supplies to retrieve the audit log would be the simplest approach (since the audit log data is hard to inspect directly in the database).
That query returns its results as JSON, so I would like fluentd to call the query directly and send the result to Elasticsearch.
I roughly understand that fluentd can output to Elasticsearch, but I wonder whether fluentd can run the curl command (i.e. call the query) directly as an input.
Otherwise, if you have another simple idea for getting Alfresco's audit log, please let me know.
I am not sure whether I understood it fully or not, but based on your last statement I am giving this answer.
To retrieve audit entries from the Alfresco repository you can directly use the REST APIs of Alfresco, which allow you to access them.

Biztalk WCF-webhttp (WCF web publishing wizard)

I am new to BizTalk and am trying to publish and consume a service using WCF-WebHttp (using BizTalk 2013, VS 2012).
I am getting the following message and don't know what to do next to solve this issue:
"You have created a service.
To test this service, you will need to create a client and use it to call the service. You can do this using the svcutil.exe tool from the command line with the following syntax:
svcutil.exe http://[host]/expwebhttpsampledesktop/service1.svc?singlewsdl"
The svcutil.exe command generates .cs, .wsdl, and metadata.xml files for me.
I am not sure what I am doing wrong here, but when I try to consume the service I made, at the end I get the following error:
"Error consuming WCF service metadata. Message part missing element. Correct service description "http://tempuri.org/" message type "service1_operation1_inputmessage" part "Part" and return the wizard."
Thank you in advance.
You need to create a client that will now consume the service. A client can be anything from a simple Console app, a BizTalk Send Port, another Web-Service or a Winforms/WPF app. The client will invoke your service (possibly passing parameters), you service will do its stuff and return a response back to the client.
There are a number of ways to create a client, however you might want to start with this tutorial from MSDN: http://msdn.microsoft.com/en-us/library/ms733133.aspx.
Alternatively, you might want to search for 'Add Service Reference Visual Studio 2012'. Adding a service reference creates the necessary libraries for your client to consume the service.
UPDATE: I found some relevant screenshots, so I thought I would add them (screenshots not reproduced here).
To add a Service Reference, right-click on your Project and select 'Add Service Reference'.
Within the 'Add Service Reference' dialog, enter the address of the service (in your case http://[host]/expwebhttpsampledesktop/service1.svc) and click 'Go' for the wizard to auto-discover the service methods. Finally, update the service Namespace.
You will now be able to reference your service just like any other type within C# in order to invoke it.
HTH, Nick.

Poll Multiple Messages with BizTalk 2006 SQL Adapter

I have a StoredProcedure that returns a simple table containing several records:
DECLARE @STEPS_TABLE AS TABLE (OrchestrationID uniqueidentifier, [Message] nvarchar(1000));
-- LOADING THE VALUES HERE
SELECT * FROM @STEPS_TABLE AS Step FOR XML AUTO, XMLDATA, ELEMENTS
I used the SQL Transport Schema Generation Wizard to create my schema and could configure the port correctly. If I use this schema on my orchestration, it works perfectly. BizTalk starts one instance of the orchestration every time the @STEPS_TABLE has more than one record.
Reading the Microsoft technical documentation, they recommend getting several messages in one call and then using the XML pipeline to disassemble the multi-row BizTalk message into single-row BizTalk messages.
I haven't used the XML pipeline before, so I tried the provided steps but couldn't get it to work.
Could somebody provide me a link to a "how to" (I haven't found anything so far, after several hours of searching) or give me some hints on how to succeed?
Thanks in advance.
... some hours later I figured it out myself. So if anybody comes across the same issue, here are some guidelines to make it work in your environment.
In the end I followed a different walkthrough from Microsoft and avoided the pipeline recommendation altogether. The documentation I found is called "Disassembling Result Sets Using the SQL Adapter" and does exactly what I was looking for. You can follow the whole walkthrough from Microsoft, but skip the creation of the send port and make a small adjustment to the receive port.
After following the technical document you will end up with two schemas; I will call them message and envelope (which contains several messages) for the sake of this exercise. In your orchestration you can create a receive port that maps to the message. Then, when you configure it as a SQL port and link it to your stored procedure (or select statement), you only have to change the Document Root Element Name to the envelope root name; the XML Receive pipeline (provided by default in BizTalk 2006) will do the magic of disassembling the messages contained in the envelope and instantiating an orchestration for each message.
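As an illustration only (the element names, namespace and GUIDs below are made up; yours come from the wizard-generated schemas and the Document Root Element Name you configure), the envelope delivered by one poll looks roughly like this, and the XML Receive pipeline debatches it into one Step document per row, each starting its own orchestration instance:
<!-- illustrative envelope containing two Step rows returned by the stored procedure -->
<ns0:StepsEnvelope xmlns:ns0="http://MyProject.StepsEnvelope">
  <Step>
    <OrchestrationID>00000000-0000-0000-0000-000000000001</OrchestrationID>
    <Message>First step</Message>
  </Step>
  <Step>
    <OrchestrationID>00000000-0000-0000-0000-000000000002</OrchestrationID>
    <Message>Second step</Message>
  </Step>
</ns0:StepsEnvelope>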
The Microsoft "Disassembling Result Sets Using the SQL Adapter" walkthrough can be found under:
http://msdn.microsoft.com/en-us/library/aa562098(v=bts.20).aspx
Mission accomplished :)
