How to Call BizTalk Orchestration Dynamically

How can I call a BizTalk Orchestration dynamically knowing the Orchestration name?
The Call Orchestration shape needs to know the name and parameters of the orchestration at design time. I've tried the 'call' XLANG keyword, but it also requires the orchestration name at design time; for example, in an Expression shape we can write
call BizTalkApplication1.Orchestration1(param1,param2);
I'm looking for some way to specify the name of the orchestration to call at runtime, taken from the incoming message or from the SSO config store.
EDIT: I'm using BizTalk 2006 R1 (ESB Guidance is for R2, and I don't see how it would solve my problem).

The way I've accomplished something similar in the past is by using direct binding ports in the orchestrations and letting the MsgBox do the dirty work for me. Basically, it goes something like this:
1. Make the callable orchestrations use a direct-bound port attached to your activating receive shape.
2. Set up a filter expression on your activating receive shape with a custom context-based property, and set it equal to a value that uniquely identifies the orchestration (such as the orchestration name).
3. In the calling orchestration, create the message you'll use to fire the new orchestration. In that message, set your custom context property to the value that matches the filter used in the specific orchestration you want to fire.
4. Send the message through a direct-bound send port so that it goes straight to the MsgBox, and the pub/sub mechanisms in BizTalk will take care of the rest.
One thing to watch out for in step 4: for this to work correctly, you will need to create a new correlation type that includes your custom context property, and then make sure that the send through the direct-bound port initializes ("follows") a correlation set of that type. Otherwise, the custom property will only be written (and not promoted) to the message context and the routing will fail (see the sketch below).
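As a rough sketch of steps 3 and 4 (the property schema and values below are made up for illustration), the Message Assignment shape in the calling orchestration would look something like this:

// Message Assignment shape (XLANG) - MyCompany.Schemas.TargetOrchestration is a
// hypothetical custom context property defined in a deployed property schema
msgToSend = msgSource;
msgToSend(MyCompany.Schemas.TargetOrchestration) = "BizTalkApplication1.Orchestration1";

The activating receive in the callable orchestration would then carry a filter such as MyCompany.Schemas.TargetOrchestration == "BizTalkApplication1.Orchestration1", and initializing a correlation set built on that property when sending through the direct-bound port is what gets it promoted rather than merely written.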
Hope this helps!

Have a look at the ESB Guidance (www.codeplex.com/esb). This package provides the functionality you are looking for.

Related

Adding correlation id to automatically generated telemetry with App Insights

I'm very new to Application Insights, and I'm thinking of using it for a set of services I plan on implementing with asp.net webapi. I was able to get the basic telemetry up and running very easily (right-clicking on a project on VS, Add Application Insights), but then I hit a block. I plan to have a correlation id set in the request headers for calls to downstream services, and I would like to tag all the telemetry related to one outside call with the same correlation id.
So far I've found that there is a way to configure a TelemetryInitializer, but if I understood correctly, this is run before I get to access the request, meaning I can't check if there is a correlation id that I should attach.
So I guess there might be two ways to solve this: 1) if I can somehow actually get access to the request headers before the initializer runs, that would obviously solve the problem, or 2) somehow get hold of the TelemetryClient instance that is used to report the automatically generated telemetry.
Perhaps the last resort would be to turn off all of the automatic telemetry and do all of it manually, in which case I could of course control what properties are set on the TelemetryClient. But this would be quite a lot more work, so I'd prefer to find some other solution.
You were right in saying that you should use a TelemetryInitializer. All TelemetryInitializers are called when the Track method is called on any telemetry item. The auto-generated request telemetry is "tracked" at request OnEnd, so all your custom headers should be available to you at that time.
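As a rough sketch of that (the header name, property key, and class name are placeholders, and this assumes classic ASP.NET where HttpContext.Current is available), an initializer could read the incoming header and stamp it on every telemetry item:

using System.Web;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.Extensibility;

// Sketch only: copies a correlation id header from the current HTTP request
// onto each telemetry item as a custom property.
public class CorrelationIdTelemetryInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        var httpContext = HttpContext.Current;
        if (httpContext == null)
        {
            return; // no request context (e.g. background telemetry)
        }

        var correlationId = httpContext.Request.Headers["X-Correlation-Id"]; // assumed header name
        if (!string.IsNullOrEmpty(correlationId))
        {
            telemetry.Context.Properties["CorrelationId"] = correlationId;
        }
    }
}

You would then register it in ApplicationInsights.config or in code, e.g. TelemetryConfiguration.Active.TelemetryInitializers.Add(new CorrelationIdTelemetryInitializer());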
Please also have a look at OperationId - this is part of the standard context managed by App Insights and is used exactly for the purpose of correlating requests with downstream execution. It is created and passed along automatically, including for traces (if you use TrackTrace).
Moreover, we have built-in support in the UX for easily seeing all telemetry for a particular operation - it can be found under Search -> Details -> Related Items -> All telemetry for this operation.

Direct Binding Partner Orchestrations in Biztalk Server

I have two orchestrations (Parent & Child). There is a variable in my Parent orchestration, and my question is that I want to access that variable in my Child orchestration. For your information, I'm using Direct Binding Partner orchestrations.
NOTE: I don't want to use the Call or Start Orchestration shapes; somehow I have to implement this with Direct Binding Partner orchestrations.
How would you do this?
Add a new field to the XML message that you are publishing via the direct-bound port, and assign the value of the variable to that field in the message.
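For instance (the element, field, and variable names here are invented), if that field is promoted as a distinguished field on the published schema, a Message Assignment shape in the parent orchestration could do:

// Message Assignment shape (XLANG) in the parent orchestration
msgOut = msgIn;
msgOut.ParentValue = varParentValue;   // distinguished field on the outgoing message
// or, without a distinguished field, via xpath:
// xpath(msgOut, "/*[local-name()='Order']/*[local-name()='ParentValue']") = varParentValue;

The child orchestration then simply reads the same field from the message it receives on its direct-bound port.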

Is Biztalk 2010 Receive Shape Filter configurable

I am currently writing an orchestration that is directly bound to the message box, picks up messages, and filters them according to the filter expression in the receive shape inside said orchestration. The problem I'm having is this: I want to be able to change the filter in the BizTalk bindings, just like send port filters are changed in the bindings. Really, I just don't want to have to recompile and redeploy every time my filter changes. Is there a way to do this? I'm thinking maybe modify the binding.xml file somehow, or possibly try a custom pipeline with configurable properties (as my last resort).
If it matters I typically use the BizTalk Deployment Framework for deployments.
No, it is not possible to modify a Receive Shape Filter at Runtime.
If the filter needs to be dynamic, then you will have to apply that logic upstream. The idea of using a custom Pipeline Component is a common solution.
One other approach to consider is leaving your Receive Shape Filter broad and testing each incoming message with the BRE. If it 'passes', continue processing; otherwise exit. BRE Policies/Rules can be updated at runtime.
For this sort of thing, you will probably want to execute Business Rules in the receive pipeline and set a context property on the message that then determines the routing.
That way the filter in the Orchestration is lightweight and doesn't need to be changed.
See http://brepipelineframework.codeplex.com/ (Disclosure: This is written by a colleague of mine)
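As a rough illustration of that idea (all of the names below are made up, and this is only the core of a pipeline component's Execute method, not a complete component), the pipeline would promote a routing property that the orchestration's broad receive filter can match on:

using Microsoft.BizTalk.Message.Interop;
using Microsoft.BizTalk.Component.Interop;

// Fragment of a custom pipeline component: promotes a routing property so the
// orchestration filter can stay stable. The value could come from a BRE policy
// evaluation rather than being hard-coded.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    string route = "OrderProcessing"; // e.g. the outcome of a rules evaluation
    pInMsg.Context.Promote(
        "RouteTo",                                   // property name
        "https://MyCompany.Schemas.PropertySchema",  // property schema namespace
        route);
    return pInMsg;
}

The receive shape filter in the orchestration then only ever needs something like MyCompany.Schemas.PropertySchema.RouteTo == "OrderProcessing", and the actual routing logic lives in the rules, which can change without a redeploy.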

Validating messages in an orchestration, or receive port

I've been working under the assumption that a message entering an orchestration was validated against the message's schema, only to realize recently that this is not the case. There doesn't appear to be a Validate shape, so I'm wondering if there is a clean, reusable pattern out there to implement this?
You can validate the messages in an XMLReceive pipeline, but unfortunately this requires specifying the DocumentSpecNames, which can detract from the flexibility of the receive.
A workaround is to use a custom "ValidatingXmlPipeline" and add the XMLValidator pipeline component to it.
As for your original question, there is a config setting in BTSNTSvc.exe.config, under Debugging, called ValidateSchemas, which validates messages against their schemas when message variables are assigned. I can't say I've used it, as it will probably impact performance.
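From memory, the relevant section of BTSNTSvc.exe.config looks roughly like this (treat it as a sketch; the exact attribute set varies between BizTalk versions):

<configuration>
  <xlangs>
    <Configuration>
      <!-- validates messages against their schemas when message variables are assigned -->
      <Debugging ValidateSchemas="true" />
    </Configuration>
  </xlangs>
</configuration>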

Accessing workflowArguments in a hosted workflow

We are mixing workflows: a workflow that uses Receive activities, mostly later on. But at the start we want to pass in some arguments (not using a Receive activity!).
Our workflows are already being created and resumed using a dynamic endpoint with IWorkflowCreation and a class derived from WorkflowHostingEndpoint. In OnGetCreationContext the creationContext is filled with WorkflowArguments and the workflow runs. At a later point the Receive activities create a bookmark which can be resumed with a message. All seems nice.
But in a xamlx there are no WorkflowArguments. I understand why, except that I want them anyway. I thought about an activity in which I can write some code to get the arguments myself, but I do need some help here.
Or is there another way to pass the WorkflowArguments into a xamlx without using messaging?
You can't pass arguments into a starting workflow service except through the SOAP message that starts it. But there is nothing preventing you from reading any properties in your workflow service, so it is perfectly fine to read settings or something similar instead of passing them in at startup.
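For example (the class and key names are purely illustrative), a tiny custom activity that pulls a value from appSettings could stand in for the workflow argument you can't declare on the xamlx:

using System.Activities;
using System.Configuration;

// Sketch: returns an appSettings value so the workflow service can look up
// what it would otherwise have received as a workflow argument.
public sealed class ReadAppSetting : CodeActivity<string>
{
    public InArgument<string> Key { get; set; }

    protected override string Execute(CodeActivityContext context)
    {
        return ConfigurationManager.AppSettings[Key.Get(context)];
    }
}

Drop it into the xamlx wherever the value is needed and bind Key to the setting name.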
We have solved this exact situation by creating another WCF service which sits alongside our xamlx service on a slightly different URL (e.g. /WorkflowMetadata), and this is where we implement a service method that returns a dictionary of string, type.
In the implementation of this service we simply read the xamlx and determine the arguments.
This is what we use to interrogate a target workflow in an activity designer when creating something like a launch-workflow activity.
Creating an activity will not work as that activity will need an instance in order to run. All you want is some metadata about the xamlx service. And if you are using a WorkflowCreationEndpoint to construct a creation context then you are probably only allowing a dictionary of string, object as the start parameters. Therefore standard metadata will not work. This left us with the only option being to provide another service beside the workflow which serves metadata.
Background here: http://blog.petegoo.com/index.php/2011/09/02/building-an-enterprise-workflow-system-with-wf4/
