I have a receive port. It calls a stored procedure for the transport, and the receive pipeline is just PassThruReceive (I tried XMLReceive, but that caused many more issues). I have a send port with a filter set so it picks up messages from the receive port. The send port has a map and a send pipeline. The map maps the XML from the receive port to a flat file schema, which then gets sent to its location. The send pipeline has a Flat File Assembler.
So the map never runs. I just get errors from the send pipeline saying it can't match the document. Of course it can't, because the message was never mapped. I read that you need an XML Disassembler on the receive pipeline of the receive location. I added that, and it just started destroying my messages: they get turned into a blank message, or just a " in the message. So the XML Disassembler is clearly not working right. I'm not sure what to do at this point.
OK, I figured it out. I have to use the XML Disassembler to make the map run. The reason it was erasing my messages is, well, I don't know the reason, but when I set the schema to elementFormDefault = "Qualified", it worked. I'm not really sure what that did, as I don't have a good understanding of the whole qualified/unqualified thing, but that was the problem for me.
There is a wizard in Visual Studio to help you create the schemas and bindings that you use to communicate with stored procedures. I guess you didn't use that? (Right click on the project, hit "Add", then "Add Generated Items", then "Consume Adapter Service")
I'm calling my stored procedure from an orchestration hooked to a two-way send/receive port. I'm using a custom WCF port type with XMLTransmit and XMLReceive for the pipelines. This seems to work fine, the caveat being that it's always a bit fiddly getting WCF to work since there are so many options.
In order to map from one format to another on ports, you need to have MessageType promoted. In your case it can be accomplished by using XMLReceive on the receive pipeline.
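For what it's worth, MessageType is simply the promoted combination of the root element's target namespace and local name ("namespace#root"), and the XML Disassembler inside XMLReceive promotes it for you. If you wanted to stay close to a pass-through pipeline, a custom pipeline component could do the same promotion. The sketch below is only an illustration of how that might look (component boilerplate omitted, and it assumes the inbound stream is seekable); you don't need any of it if you just use XMLReceive:

```csharp
using System.Xml;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Sketch only: promote BTS.MessageType ("namespace#root") from the inbound body.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    const string btsSystemProperties =
        "http://schemas.microsoft.com/BizTalk/2003/system-properties";

    var stream = pInMsg.BodyPart.GetOriginalDataStream();

    // Read just far enough to find the root element, then rewind the stream.
    using (var reader = XmlReader.Create(stream, new XmlReaderSettings { CloseInput = false }))
    {
        reader.MoveToContent();
        string messageType = reader.NamespaceURI + "#" + reader.LocalName;
        pInMsg.Context.Promote("MessageType", btsSystemProperties, messageType);
    }

    stream.Seek(0, System.IO.SeekOrigin.Begin);
    pInMsg.BodyPart.Data = stream;
    return pInMsg;
}
```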
I have developed a BizTalk application. It receives an XML file and, after applying the business logic, sends the file to another location using the FILE adapter. I need to track the start and end times for both the receive port and the send port. I have created BAM activities and a view, and created a tracking profile using the Tracking Profile Editor. I used the interchange ID as the continuation ID token.
The problem is that in the BAM tracking I am getting two rows, one for the receive port and a second for the send port. The continuation between the receive and send ports is not working.
The continuation is most likely not working because InterchangeID is not naturally Promoted.
The small issue you have is that there is no naturally Promoted Property that can be used out of the box for this.
The simplest solution would be to create a custom Pipeline Component that Promotes InterchangeID (same property, just Promoted). Then your Tracking Profile should start working.
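A minimal sketch of such a component's Execute method, assuming the standard BizTalk system-properties namespace (error handling and the rest of the component omitted):

```csharp
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Sketch only: re-promote InterchangeID so a Tracking Profile continuation can use it.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    const string btsSystemProperties =
        "http://schemas.microsoft.com/BizTalk/2003/system-properties";

    // InterchangeID is already written to the context by the Messaging Engine;
    // Read() fetches it and Promote() makes it visible to routing and tracking.
    object interchangeId = pInMsg.Context.Read("InterchangeID", btsSystemProperties);
    if (interchangeId != null)
    {
        pInMsg.Context.Promote("InterchangeID", btsSystemProperties, interchangeId);
    }

    return pInMsg;
}
```

You would likely need the component on both the receive and send pipelines so the property is Promoted on both ends of the continuation, and then map the Promoted property in the Tracking Profile Editor.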
FYI, at this point you don't really need BAM, as it's pretty easy to query tracking directly using the same Promoted Property (which is essentially what BAM is doing, via a slightly different path).
The interchange ID will be present in the message context. Can you please confirm you mapped the receive and send ports to the continuations in the Tracking Profile Editor? Refer to the article https://www.biztalk-server-tutorial.com/2013/02/08/how-to-enable-bam-continuation-between-receive-send-ports-using-tracking-profile-editor/ which shows the steps to add the continuation correctly.
I need to have two orchestrations that take the same input schema message from an HTTP Receive Port.
The orchestrations do different things.
I do not understand how I can call either one orchestration or the other.
I do have a solution in mind, but I don't think it's right:
I create two different receive locations, one receive location per orchestration.
That looks like the correct solution, but creating a receive location means creating a virtual directory in my HTTP site on IIS that contains BTSHTTPReceive.dll.
So my doubt is: if I have 20 orchestrations with the same input, should I create 20 virtual directories that contain the DLL?
That seems like a horrible solution.
What is the correct way to solve my problem?
Is this a one-way or a two-way receive port/location?
In the case of a one-way receive location, just promote properties and use basic content-based routing (CBR) with publish/subscribe on your properties.
In the case of a two-way receive location: which response will you be giving back to your application?
Think of your orchestration as your web service. You need to take in the request and generate one response. How you deal with that request by forwarding it to any number of other orchestrations/applications is up to you, but publish/subscribe is built for exactly this behaviour.
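To make the one-way option concrete: typically you direct-bind each orchestration's logical receive port to the MessageBox and put a filter on each activating Receive shape. BTS.ReceivePortName is a standard property; the custom property schema and values below are hypothetical and would be whatever you choose to promote:

```
Orchestration A, activating Receive shape filter:
    BTS.ReceivePortName == "HttpReceivePort"
    And MyCompany.PropertySchema.RequestType == "TypeA"

Orchestration B, activating Receive shape filter:
    BTS.ReceivePortName == "HttpReceivePort"
    And MyCompany.PropertySchema.RequestType == "TypeB"
```

Both orchestrations then activate off the same single HTTP receive location, so there is no need for 20 virtual directories.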
I have a receive location in my BizTalk 2010 project, and sometimes that receive location will receive an empty file. The receive pipeline is PassThruReceive. We then have a Send Port that has a filter for that Receive Port Name. So all we are doing is moving the file from the receive location to the send location.
The issue that I'm running into is that in the event that we get an empty file in the receive location, my client wants the file to still be moved to the send port. I know that out of the box, the FILE adapter discards empty files and writes an event to the Event Log stating that it was deleted.
I have followed articles that show a custom FILE adapter accomplishing this task, and I have had some success with it: the file is picked up, received by BizTalk, and the Send Port successfully sends it. However, even with this solution, I'm running into an issue on the receive side where the file is locked and cannot be deleted. I have followed various articles on this subject, and I get the same issue every time: the file is locked and cannot be deleted.
My question is: even though batchMessage.Message.BodyPart.Data.Close(); is being called, the stream is still locked. Is there any way for me to find out where else BizTalk may be locking the file? Is there any other way of handling this?
One of the articles that I followed is located here: http://biztalkwithshashikant.blogspot.com/2011/04/processing-empty-files-in-biztalk.html
It seems to me that you are running into issues when running your custom FILE adapter across multiple servers. I bet you are running more than one server in the BizTalk Group?
I haven't done this myself, but I have heard that getting an adapter to run smoothly across multiple servers is one of the hardest things to do in BizTalk. The trick is to find a way to share the load between multiple instances of the same BizTalk host.
Do you still have the same problem when running the instance on only one server instead of two?
In the custom pipeline component method:
IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
you should return pInMsg, not set it to null, and leave the .BodyPart stream positioned at the end. If pInMsg is null, BizTalk will silently discard the message. You don't need to close the stream, but you do need to move it to the end to let BizTalk know you read and processed it all.
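A rough sketch of that pattern (not the blog post's exact code; the usual interop namespaces are assumed and the rest of the component is omitted):

```csharp
using System.IO;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Sketch only: return the original message with its body stream read to the end.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    Stream body = pInMsg.BodyPart.GetOriginalDataStream();

    if (body != null)
    {
        if (body.CanSeek)
        {
            // Position at the end so BizTalk sees the body as fully consumed.
            body.Seek(0, SeekOrigin.End);
        }
        else
        {
            // Non-seekable stream: drain it instead.
            var buffer = new byte[4096];
            while (body.Read(buffer, 0, buffer.Length) > 0) { }
        }
    }

    // Never return null here, or BizTalk silently discards the message.
    return pInMsg;
}
```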
A way around this is to use the FTP adapter instead to pick up the files; the FTP adapter does not discard empty files.
It is quite possible that, to the system creating the files, it is still a file location, just one that is also accessible via FTP.
I am new to BizTalk. I have been given the requirement below:
Source: Oracle (table). I created a generated schema in BizTalk.
Target: a web service that receives an "object array" (a table of source records from BizTalk) as input.
The source and target systems have the same structure, so no mapping should be implemented; the logic should be in pipelines or an orchestration.
I need info on the two topics below:
How to incorporate logic in a pipeline or orchestration to map data from the source schema to the target web service schema.
This question was posed (now deleted) on the other big BizTalk forum. So I'll share my answer here.
What you're asking is simply not possible. It doesn't matter that the source and destination are logically the same. They are represented by two different schemas in BizTalk. There is no way around this except by developing the Web Service to accept the WCF Oracle message directly.
Because of that, you must transform from the source to the destination, and Maps are how that is done. While there are technically other ways, they are harder to write, more bug-prone, and would likely offer a less desirable performance profile.
A ban on Maps is just counter-productive, and as a long-time BizTalk developer I could not accept a project with such a requirement.
It's not very clear what you are asking for, to be honest. Your requirement states that no mapping is required, but then you go on to ask how to incorporate mapping in a pipeline or orchestration.
A standard approach to delivering this would be:

1. Set up your input process from Oracle by using "Consume Adapter Service" from Visual Studio's "Add Generated Items". Use the Oracle binding, set up connection properties for typed polling along with your query (see here for an example on MS SQL), change to a service contract type (for inbound operations), and you'll get a set of schemas representing your dataset and a binding for your typed polling receive port.
2. Use "Consume WCF Service" to point to your "sending" web service, and you'll get the schemas, binding, and a helpful orchestration with port types added to your project.
3. Create a simple map from your inbound Oracle recordset schema to your web service schema - this should be pretty straightforward if they are identical, although I suspect you'll have to deal with multiple sets of data; it depends on your data.
4. Complete by wiring together your orchestration.
I appreciate this is a high-level view of what you need to do, but there are plenty of examples you can Google to get you started. Hope that helps.
Can a message select between an 'older' or the 'latest' version of the orchestration it would like to be processed by?
Thanks
If you're talking about the version of the DLL in the GAC, I don't think this is possible. But if you maintain two separate orchestrations, you can use a promoted property to route the message to the appropriate one. If it's more complicated than that, you can have a single receiving orchestration for the message type and have it call the appropriate orchestration based on whatever criteria you can code into it. This is still the subscriptions deciding which messages they want to pick up. Another approach would be Dynamic Send Ports. This really gives you the freedom to move the direction of the message into the orchestration/app itself.
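For the Dynamic Send Port route, the destination is set from an Expression shape at runtime; something along these lines, where the port and message names are hypothetical:

```
// XLANG/s Expression shape (sketch only): pick the destination at runtime.
DynamicSendPort(Microsoft.XLANGs.BaseTypes.Address) = "FILE://C:\\Out\\%MessageID%.xml";
OutboundMessage(BTS.OutboundTransportType) = "FILE";
```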
The Microsoft ESB Guidance 2.0 has some extensive thoughts on Itineraries, which, as I understand it, are the concept of the message carrying its specific processing steps on board. I am still digesting this, but it may be something to look at.