Processing multiple different files into one output message - BizTalk

I have five different Excel files (different structure and different data)
that will be processed into one output message (an XML file).
The files arrive in an arbitrary order, and all of them are required to create the output XML file.
How can I do this in BizTalk?
More specific questions:
1. Is it possible to aggregate different types of messages in BizTalk and have a message with multiple bodies?
2. Can I aggregate the 5 Excel files into one message and then execute an output pipeline to process all of them?

I would approach this problem as follows:
Create a new schema that represents the destination format; we will be mapping the incoming messages into this format.
Create schemas (probably XML, not flat-file) that represent the incoming Excel spreadsheets. Disassemble the Excel files into their corresponding schemas either through a custom pipeline component (which isn't too difficult using the Excel SDK) or via a third-party tool (such as Farpoint Spread, http://www.fpoint.com/biztalk/default.aspx); there is also an open-source component on CodePlex at http://excel2007pipeline.codeplex.com/
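As a rough sketch of the custom pipeline component option, the disassembler half might look like the following. This is only an illustration: the ExcelToXmlConverter helper is hypothetical (it would be built on the Excel SDK or OpenXML), and the usual component boilerplate (IBaseComponent, IComponentUI, IPersistPropertyBag) is omitted.

```csharp
using System.Collections.Generic;
using System.IO;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Sketch only: ExcelToXmlConverter is a hypothetical helper, and the standard
// pipeline-component plumbing is left out for brevity.
public class ExcelDisassembler : IDisassemblerComponent
{
    private readonly Queue<IBaseMessage> output = new Queue<IBaseMessage>();

    public void Disassemble(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Read the incoming binary Excel stream.
        Stream excelStream = pInMsg.BodyPart.GetOriginalDataStream();

        // Convert the workbook into one or more XML streams that match the
        // schemas created for each spreadsheet (hypothetical helper call).
        foreach (Stream xml in ExcelToXmlConverter.Convert(excelStream))
        {
            IBaseMessageFactory factory = pContext.GetMessageFactory();
            IBaseMessage outMsg = factory.CreateMessage();
            IBaseMessagePart body = factory.CreateMessagePart();
            body.Data = xml;
            outMsg.AddPart("Body", body, true);

            // A production component would also clone the inbound message
            // context onto outMsg so adapter properties are preserved.
            output.Enqueue(outMsg);
        }
    }

    public IBaseMessage GetNext(IPipelineContext pContext)
    {
        // Hand one converted message at a time back to the pipeline;
        // returning null tells BizTalk there is nothing left.
        return output.Count > 0 ? output.Dequeue() : null;
    }
}
```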
Map the incoming XML message (the disassembled Excel file) on a Receive Port into the destination format created in the first step. Multiple Receive Locations can be used on a Receive Port, one for each incoming message format; likewise, multiple Maps can be specified on a Receive Port, and the correct map will be selected automatically by BizTalk based on the incoming message type (schema namespace + root node name).
With regard to aggregating messages, take a look at parallel and sequential convoys; with regard to messages with multiple bodies, take a look at multi-part messages. Both are out of scope for this question unless you add further detail about what you are trying to achieve with these concepts.

I agree broadly with Nick's answer above, especially the mapping of messages in inbound pipelines.
However, I would not implement aggregation via sequential convoy pattern in BizTalk because doing so requires the use of singleton orchestrations, which are a BizTalk anti-pattern (and a support nightmare).
Basic parallel convoys can work because each "set" of 5 inputs will be routed to one instance of an orchestration which will terminate after finishing.

Related

How can I filter data from a file when sending to two different locations using BizTalk?

I want to send a file (e.g. XML) to two different locations, meaning two different folders, using BizTalk. But at the same time I want to filter the data from these files. How can I do that?
If you want certain XML files to be sent only to a certain location when they match certain criteria, then what you need is a promoted property in your message context that you can use as part of your send port filter. This promoted property can be either a field of the message that you have promoted, or a context-only promoted property that you have set in a pipeline or an orchestration.
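As a rough illustration of the pipeline option (all names here are made up and would need a matching deployed property schema), the component's Execute method could promote a routing value like this:

```csharp
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Fragment of IComponent.Execute from a custom pipeline component.
// The property name/namespace and the "Region" value are examples only.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    const string propName = "Region";
    const string propNamespace = "https://MyCompany.Schemas.PropertySchema";

    // In a real component the value would be read from the payload
    // (e.g. via XPath) or from a lookup; it is hard-coded here.
    string region = "EU";

    // Promote (not merely Write) so the value takes part in routing
    // and can be used in a send port filter expression.
    pInMsg.Context.Promote(propName, propNamespace, region);

    return pInMsg;
}
```

A send port filter such as MyCompany.Schemas.PropertySchema.Region == EU (again, example names) would then match only the messages carrying that value.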
If, however, you want to filter the data in an XML file on a send port, then you would need a map on the send port that selects the data from the source XML payload and maps it to the target XML payload.
If you mean filtering which data you want to process before sending, you can use content-based routing. See the links below, whichever you prefer.
Using the ESB Toolkit
Using an orchestration
But if by filtering you mean modifying the contents of the source XML, you need to:
Create a map using your payload XML as both source and target.
Add your map to the send port's outbound maps.

Deliver messages in order based on XML content in BizTalk

I have a problem where I receive files from a third party via a website. These files come in order from the third party and sit in a folder. Because of security constraints, I am not able to poll the directory directly via SMB; instead I have to fetch the files every minute using SFTP. This creates a problem because the files that were delivered to me in order are now all together in my receive location's folder. I need to deliver these files to my send port in the order they arrived.
I have thought of creating a separate program that would open the files and then copy them in order (based on a segment called SequenceId in the XML) to a folder on the BizTalk server that is monitored by the receive location, which would ensure the files are delivered in order. I would prefer not to introduce another failure point (the program), but I am not sure how I can do this with pure BizTalk.
You can do this with pure BizTalk (search for BizTalk resequencing); however, you end up with a complex solution involving a singleton orchestration, and it is usually easier to use a database table.
The first interface picks the file up and simply inserts the data into a table: either as a flat table if your message structure is flat, or as one field holding the XML data and another field containing the sequence number extracted from the payload.
Your second interface polls a stored procedure that checks whether the next message in the sequence is available to be processed.
You will also have to consider what to do if a particular message in a sequence never arrives: do you send out an alert, or process the messages you do have after a preset delay?
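For the first interface, a hedged sketch of the staging step might look like this. The table, column, and connection details are hypothetical, and the XPath assumes the SequenceId element mentioned in the question; adjust both to the real schema.

```csharp
using System.Data.SqlClient;
using System.Xml;

// Hypothetical helper: extract the sequence number from the payload and
// stage the raw XML in a table for the polling stored procedure to release
// in order. Table and column names (MessageStaging, SequenceId, Payload)
// are examples only.
public static class SequenceStaging
{
    public static void StageMessage(string connectionString, XmlDocument payload)
    {
        // "SequenceId" comes from the question; adapt the XPath to your schema.
        int sequenceId = int.Parse(
            payload.SelectSingleNode("//*[local-name()='SequenceId']").InnerText);

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO MessageStaging (SequenceId, Payload) VALUES (@seq, @xml)", conn))
        {
            cmd.Parameters.AddWithValue("@seq", sequenceId);
            cmd.Parameters.AddWithValue("@xml", payload.OuterXml);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```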

Append invoice list EDI message on outgoing batch

One of our partners requires that a "summary" EDI message be appended to any EDI invoice interchange (apparently known as an "invoice list"). This message contains a reference that every individual invoice should carry in an RFF segment, as well as the accumulated MOA values.
My question is: does BizTalk Server (in particular BizTalk 2009) provide a convenient way to append another EDI message to the outgoing EDI batch upon release? I cannot find anything on MSDN.
My current idea is to append it in the send pipeline of the port that will transmit the batch, but I really would like a more convenient way.
I'll put it this way: your trading partner has a rather unusual, perhaps unique, requirement that is itself inconvenient. So, sorry, there is no 'convenient' way to accomplish this in BizTalk, and probably not on any other platform either.
Anyway, here's what I would do, or at least some things I would try.
Batch the Invoices normally, such as with the Batching Service.
Subscribe to the Batch with an Orchestration.
Map the Batch to the Summary, whichever transaction that is, but it has to be X12/EDIFACT.
Use a dual-input Map to create the batch with the summary appended.
Send to Send Port with the EDI Assembler.
I'm thinking it would be easier to use custom XSLT for the appending Map. That would save you from having to create a Schema for the Batch message, which doesn't actually ship with BizTalk.
The XSLT itself would be pretty simple since you are just copying the two Messages. FYI, the Batch Message Orchestration Type would be Microsoft.XLANGs.BaseTypes.Any. You can't use XmlDocument as a Map source.
I had the same requirement once, but instead of one specific summary message, I needed to have full control over the order of the EDI messages in the batch.
This is how I solved this at the time: http://blog.codit.eu/post/2009/11/10/Outbound-EDI-batching-in-BizTalk-Server-2009.aspx
It does have some downsides, but it might be handy to know.

Can a receive port be triggered for 2 different reasons?

I have a normal receive port using the WCF adapter for Oracle that uses a polling query. Now the problem is that the receive port not only needs to run once the polling query has a hit, but also once per day, regardless of the polling statement.
Is there a way to make this possible without recreating the entire process?
The cleanest way will be to use an additional receive location. So you will end up with one receive port that contains two receive locations, one for each query.
In the past I have done this with the WCF adapter when polling SQL Server. The use of two locations did require duplicating the schema, unfortunately, to account for the different namespaces. You will probably need two different (and essentially identical) schemas as well.
WCF-SQL polling locations require distinct InboundId values, while WCF Oracle polling (as you have noted in the comments) requires a different PollingId for each receive location.
The ESB Toolkit includes pipeline components to remove and add namespaces, in case you need downstream applications to work with only a single schema on the messages coming from both locations and/or do not want to also duplicate a BizTalk map.
Change your polling statement so that it has an OR CURRENT_TIME() BETWEEN ....
That way it will trigger at the time you want.

BizTalk custom adapter

I am not sure if I am asking the right question, but this is the scenario I am trying to implement:
Multiple files (an XML file and a few related files, the "attachments") have to get into BizTalk as a single message. I have looked into the existing adapters and don't see this done with any of them. To be more accurate, the files are taken from the file system. The files are not found at the same time but arrive one at a time, and the order is not guaranteed. The XML (content) file is the one that knows which attachments (which other files) it has to have.
We are looking into BizTalk 2009, and I was wondering whether this would be the responsibility of a custom adapter or something else, and where I could look for samples.
Thanks.
It is probably possible to do what you want using a custom adapter, though I'd recommend against it. You can achieve what you require using orchestration.
What you are looking for is likely a convoy, or at least some use of correlation.
In BizTalk a convoy is a messaging pattern (as opposed to a BizTalk feature) that allows groups of messages to be processed by a single orchestration.
You essentially use correlation on a receive port to group messages together in either a parallel (what you probably want) or sequential fashion.
There is an article [here](http://msdn.microsoft.com/en-us/library/ms942189(BTS.10\).aspx) by Stephen W. Thomas about convoys (it is for BizTalk 2004 but the concepts still hold), and there is a lot of additional information on the web and in books (Professional BizTalk Server 2006 has a subsection on them).
Without more details on your scenario it is hard to know exactly how the convoy would be built but below are two approaches to look at (also, I've not had a chance to properly use BT2009 so there may be extended support for correlation scenarios that help you out).
Flexible Correlation
If you don't know anything about the files listed in the content XML, you will probably need a pattern like the one described by Charles Young in this post.
Non-uniform sequential convoy
If you do have a little bit of information beforehand, one way might be as follows (basically a non-uniform sequential convoy):
This makes the assumption that there is some way of linking all the files together so you can correlate them.
Create a single orchestration that subscribes to your inbound receive port (which contains the file receive location).
This orchestration will have a single activation receive shape that is set up for your content file.
Once the orchestration is started by a content file, a second correlated receive shape starts picking up the messages that match that content file. (This second receive could possibly be in a loop to allow for a variable number of files.)
You then pack them all together into a single outbound file of your design and send it out once the full number of files has been received.
It seems to me a better approach would be to implement the above requirements with a combination of a custom pipeline component and/or a custom adapter. I assume you do not really need to manipulate the incoming files - except for the content XML file - or that you couldn't, since they are in binary format. This calls for a custom pipeline component.
What you can do is develop a custom BizTalk adapter to interact with the file system and implement the listening and looping logic. Next you can develop a custom pipeline component to create a single BizTalk message, perhaps with base64-encoded data in it for the binary content. Additionally, you could also promote properties right in this component to enable orchestration subscriptions.
Orchestrations are better suited to implementing business workflow scenarios where the messages are already in XML format. That does not appear to be the case here. In any case, I think at the very least a custom pipeline component will be needed.
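To illustrate the base64 idea (purely a sketch; the <Attachment> element and the helper itself are made up, not part of any BizTalk API), such a component could wrap each binary attachment in an XML fragment so that everything can travel as one message:

```csharp
using System;
using System.IO;

// Hypothetical helper: wrap a binary attachment as base64 inside an XML
// fragment that can be merged into the single outbound BizTalk message.
public static class AttachmentWrapper
{
    public static string Wrap(string fileName, Stream binary)
    {
        using (var buffer = new MemoryStream())
        {
            // Copy the binary stream manually (works on older .NET versions).
            var chunk = new byte[4096];
            int read;
            while ((read = binary.Read(chunk, 0, chunk.Length)) > 0)
            {
                buffer.Write(chunk, 0, read);
            }

            string base64 = Convert.ToBase64String(buffer.ToArray());

            // Element and attribute names are examples only.
            return string.Format(
                "<Attachment fileName=\"{0}\">{1}</Attachment>",
                System.Security.SecurityElement.Escape(fileName), base64);
        }
    }
}
```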
David's answer is the correct answer.
Even in cases where you don't know anything at all about the contents of the expected attachments, surely you know their names and locations. Therefore you can use the flexible correlation linked to in David's answer, like this:
The key to the solution is to correlate on the built-in BTS.ReceivedFileName property.
First, create a custom receive pipeline with a custom pipeline component that promotes the BTS.ReceivedFileName context property of the received messages. This simple custom component is fairly easy to write, but you can make it even more straightforward by using third-party frameworks such as (shameless plug here) my PipelineComponentBase class or the excellent BizTalk Server Pipeline Component Wizard.
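A minimal sketch of that promotion, inside the component's Execute method, could look like the following. The property namespace shown is the File adapter property schema; treat it as an assumption and verify it against your environment.

```csharp
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Fragment of IComponent.Execute: re-promote ReceivedFileName so it can be
// used in a correlation set. Namespace below is assumed to be the File
// adapter property schema; component boilerplate is omitted.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    const string fileProps = "http://schemas.microsoft.com/BizTalk/2003/file-properties";

    // The File adapter writes the received file name into the context;
    // depending on adapter and direction it may not be promoted, so it
    // cannot be correlated on until we promote it here (as described above).
    object fileName = pInMsg.Context.Read("ReceivedFileName", fileProps);

    if (fileName != null)
    {
        pInMsg.Context.Promote("ReceivedFileName", fileProps, fileName);
    }

    return pInMsg;
}
```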
Now for the easy part:
Attachments are received in a specific location, designated by its path on the filesystem.
Create a receive location that listens to an alternate location, used only to control when files are actually swallowed by BizTalk.
In your orchestration, create a correlation type with the BTS.ReceivedFileName property and a correlation set based on this correlation type.
When you want to receive binary attachments, send a dummy message with the BTS.ReceivedFileName context property set to the filename of the binary attachment, but with the path matching the alternate location, i.e. the one used by the receive location. Initialize the correlation on the send shape.
Use an expression shape to copy the binary file from its original location to the one used by the receive location.
Finally, use a receive shape bound to the receive port that contains the receive location whose custom receive pipeline will promote the BTS.ReceivedFileName property.
Notice that you actually need to send a message in order to initialize the correlation. It does not actually matter what message you send. What I'd do is send the message through a send pipeline that contains an 'empty' pipeline component, that is, a pipeline component that reads the message but returns null (so that the message vanishes into thin air before it reaches the adapter). A more elaborate solution would be to use a null adapter, that is, an adapter that reads the message but does not do anything with it.
These two solutions avoid having many files accumulate in a temporary location somewhere, just for the sake of initializing a correlation!
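Purely as a sketch of the 'empty' pipeline component idea described above (not a definitive implementation), its Execute method would consume the message and return null:

```csharp
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Fragment of IComponent.Execute for the "message sink" send pipeline
// component: read the message, then return null so nothing reaches the
// adapter. Component boilerplate is omitted.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    // Drain the body stream so the message is fully consumed.
    var stream = pInMsg.BodyPart.GetOriginalDataStream();
    var buffer = new byte[4096];
    while (stream.Read(buffer, 0, buffer.Length) > 0)
    {
        // Intentionally discard the data.
    }

    // Returning null makes the message "vanish" before transmission,
    // which is the behaviour the answer above relies on.
    return null;
}
```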
