BizTalk generated wrong 999 file?

I am working on a BizTalk inbound process to receive 837-P files from different trading partners. After a file is processed, I forward the BizTalk auto-generated 999 file to the trading partner as an acknowledgement.
For one particular trading partner, BizTalk processed the inbound 837 file and generated a 999 whose AK9 segment claimed that all records in the file were "Accepted".
But continued processing of the records from that file showed that some of them actually failed.
I saved one of the failed messages as XML and validated it against the 837-P schema shipped with BizTalk; it failed validation with the following error:
error BEC2004: The element 'PRV_BillingProviderSpecialtyInformation'
in namespace 'http://schemas.microsoft.com/BizTalk/EDI/X12/2006' has
incomplete content. List of possible elements expected:
'PRV03_ProviderTaxonomyCode'.
The question is: if the record actually fails schema validation, why was the 999 generated with all records marked "Accepted"?
Some other info:
The EDI Validation is turned on in that trading partner's agreement.
I've double-checked that all the settings in the agreement match the incoming file.
This validation is actually a HIPAA level 2 validation, but per the BizTalk documentation, BizTalk should support level 2 validation.
The BizTalk version is BizTalk 2013 with CU3 update.

Finally figured it out. The element BizTalk complains about is not actually missing; it holds a single space character. So the message passes the inbound validation, but the space character is trimmed later on in an orchestration, and only then is the error raised.
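For illustration, a hypothetical PRV segment of that shape (the values here are invented, using 5010-style qualifiers, not taken from the original file) would produce exactly this behaviour:

PRV*BI*PXC* ~

Here PRV03 holds a single space character, so the element is present and non-empty when the inbound EDI validation runs, but it becomes empty once the orchestration trims it, at which point the schema's requirement for PRV03_ProviderTaxonomyCode is violated.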

Can I use a single pipeline for both a file receive location and an email receive location?

I have a pipeline that is working for a file receive location. This pipeline accepts a csv file and maps it to an XML.
I'm now trying to setup a new email receive location using the same port, pipeline and pipeline settings.
Shouldn't BizTalk ignore everything but the attachment if I set the body part index to 2? Then it should place the attachment in the pipeline just as with the file location, and the pipeline would output an XML file.
Error:
Microsoft.XLANGs.Core.PersistenceException: Exception occurred when persisting state to the database. ---> Microsoft.BizTalk.XLANGs.BTXEngine.PersistenceItemException: A batch item failed persistence Item-ID 72fbeba9-6bfe-48e0-a0e6-ca5bbd191aa1 OperationType MAIO_CommitBatch Status -1061151998 ErrorInfo The published message could not be routed because no subscribers were found. . ---> Microsoft.BizTalk.XLANGs.BTXEngine.PublishMessageException: Failed to publish (send) a message in the batch. This is usually because there is no one expecting to receive this message. The error was The published message could not be routed because no subscribers were found. with status -1061151998
POP3 adapter properties and pipeline configuration: (screenshots not reproduced here)
I tried stopping the orchestration and the send port and testing both locations.
Testing the email location, I got the same error with an email with a .csv attachment.
Testing the file location, the data didn't reach the database, but the CSV was processed, because I could see it in the information logs.
This leads me to the conclusion that the problem is related to the MIME decoding and whatever my pipeline is outputting from the email body parts.
Also, after researching for a while, all solutions seem to point to the necessity of having a pipeline exclusively for email, since I need to say which part of the multi-part body to decode. I was hoping there was a solution that would allow me to reuse the pipeline I use for the file location.
As @Dijkgraaf mentioned:
It is not failing in the pipeline. It is failing due to there being no Orchestration or Send Port that is expecting the message the Receive Port has published to the Message Box.
That means that the receive worked, the message has passed the pipeline and is published in the Message Box, but there is no matching subscription.
Check for routing failures in the BizTalk management console to find out why. It may be that the message type is not what you expected, or that one or more published properties are not set correctly.
See the suspended message and check which body part is the CSV file. Your config says it should be the 3rd part (BodyPart = 2).

BizTalk 2006 Event Log Warnings - Cannot insert duplicate key row in object 'dta_MessageFieldValues' with unique index 'IX_MessageFieldValues'

We have been seeing the following 'warnings' in the event log of our BizTalk
machine since upgrading to BTS 2006. They seem to occur
randomly 6 or 8 times per day.
Does anyone know what this means and what needs to be done to clear it up?
We have only one BizTalk server, which is running on only one machine.
I am new to BizTalk, so I am unable to find how many tracking host instances are running for the BizTalk server. Also, can you please let me know whether we can configure only one instance per server/machine?
Source: BAM EventBus Service
Event: 5
Warning Details:
Execute batch error. Exception information: TDDS failed to batch execution
of streams. SQLServer: bizprod, Database: BizTalkDTADb.Cannot insert
duplicate key row in object 'dta_MessageFieldValues' with unique index
'IX_MessageFieldValues'.
The statement has been terminated..
I see you got a partial answer in your MSDN post.
Go to the BizTalk Admin Console, check Platform Settings -> Hosts, and in the list of hosts on the right confirm that only a single Host has the Tracking column marked as Yes.
As to your other question: yes, you can run a single Host Instance on a single server, although when your server starts to come under a bit of load you may want to consider setting up some more instances so you can balance the workload better.
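If you would rather check programmatically than through the console, here is a minimal C# sketch against the BizTalk WMI provider (assuming the standard root\MicrosoftBizTalkServer namespace and its MSBTS_Host class; add a reference to System.Management):

using System;
using System.Management;

class ListTrackingHosts
{
    static void Main()
    {
        // Enumerate all BizTalk hosts and report which ones have tracking enabled.
        var searcher = new ManagementObjectSearcher(
            @"root\MicrosoftBizTalkServer",
            "SELECT Name, HostTracking FROM MSBTS_Host");
        foreach (ManagementObject host in searcher.Get())
        {
            Console.WriteLine("{0}: Tracking = {1}", host["Name"], host["HostTracking"]);
        }
    }
}

Exactly one host should report Tracking = True.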

BizTalk 2009 - How to run a process after all messages have processed from a large disassembled file

We receive many large data files daily in a variety of formats (i.e. CSV, Excel, XML, etc.). In order to process these large files we transform the incoming data into one of our standard 'collection' message classes (using XSLT and a pipeline component - either built-in or custom), disassemble the large transformed message into individual 'object' messages and then call a series of SOAP web service methods to handle business logic and database operations.
Unlike the other files we receive, this latest file contains all data rows every day; therefore, we have to detect the differences to prevent identical records from being re-processed each day.
I have a suitable mechanism for handling inserts and updates but am currently struggling with the deletes (where the record exists in the database but not in the latest file).
My current thought process is to flag the deleted records in the database using a 'cleanup' task at the end of the entire process but this would require a method to be called once all 'object' messages from the disassembled file have completed.
Is it possible to monitor individual messages from a multi-record file and call a method on completion of the whole file? Currently, all research is pointing to an orchestration with some sort of 'wait' but is this the only option?
Example: File contains 100 vehicle records. This is disassembled into 100 individual XML messages which are processed using 100 calls to a web service method. Wish to call cleanup operation when all 100 messages are complete.
The best way I've found to handle the 'all rows every day' scenario is to pre-stage the data in SQL Server where it's easier to compare the 'current' set to the 'previous' set. The INTERSECT and EXCEPT operators make it pretty easy in most cases.
Then drain the records with a Polling statement.
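As an illustration of that set logic, here is a small C# sketch with hypothetical record keys (in the actual approach the comparison runs in SQL Server using INTERSECT/EXCEPT over the staging tables):

using System;
using System.Collections.Generic;
using System.Linq;

class DeltaSketch
{
    static void Main()
    {
        // Keys currently in the database vs. keys present in today's file.
        var previous = new HashSet<string> { "VEH001", "VEH002", "VEH003" };
        var current = new HashSet<string> { "VEH001", "VEH003" };

        var deleted = previous.Except(current);      // in DB but not in file -> flag for cleanup
        var unchanged = previous.Intersect(current); // present in both -> update or no-op

        Console.WriteLine(string.Join(", ", deleted));   // VEH002
        Console.WriteLine(string.Join(", ", unchanged)); // VEH001, VEH003
    }
}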
The component that does the de-batching would need to publish a start-of-batch message with the number of individual records and a correlation key.
The components that do the insert and update would need to publish a completion message with the same correlation key when they have completed processing.
The start-of-batch message would spin up an Orchestration that listens for the completion messages with that correlation key and counts them; either after it has received the correct number, or after a timeout period, it would call the cleanup or raise an exception.
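A minimal sketch of that counting logic as XLANG expression shapes (the message, property, and correlation names here are hypothetical):

// Expression shape, after the activating Receive of the start-of-batch message:
expectedCount = msgBatchStart(AcmeProps.RecordCount);  // hypothetical promoted property
receivedCount = 0;

// Loop shape condition:
receivedCount < expectedCount

// Inside the loop, a Listen shape pairs a Receive that follows the BatchId
// correlation set against a Delay shape (the timeout branch).
// Expression shape on the Receive branch:
receivedCount = receivedCount + 1;

// After the loop: call the cleanup web service; on the Delay (timeout) branch,
// throw an exception or send a notification instead.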

How to loop in orchestration BizTalk 2010

I'm looking to loop over data received from SQL Server via the WCF-SQL adapter.
I use a for loop and the following expressions:
itostring = i.ToString();
MessageOne = xpath(MessagePolling, "/*[local-name()='MainData' and namespace-uri()='http..'][" + itostring + "]");
where the XPath selects the i-th record of the received message (path[i]).
Is this the correct way?
There are two ways^ to loop over multiple records contained within an XML message received by BizTalk:
Envelope Schemas
When you define the schema that represents the message, mark it as an Envelope Schema. This tells the Receive Pipeline Disassembler to create (and publish) one message to the BizTalk Message Box for each record in the incoming message (in your case from the WCF-SQL Adapter). This will cause a single Orchestration instance to be started for each record in your incoming message.
Richard Seroter has a great blog post on doing this from the WCF-SQL Adapter - http://seroter.wordpress.com/2010/04/08/debatching-inbound-messages-from-biztalk-wcf-sql-adapter/
Be aware that with this approach, you don't want to be de-batching tens of thousands of records from the incoming message as BizTalk will grind to a halt :-)
XPath Inside an Orchestration
If you do not use an Envelope Schema, a single Orchestration instance will be started for the incoming message (containing multiple records). Within an Expression Shape in your Orchestration, you can use XPath (and some other magic) to loop around each record and extract each one into an Orchestration variable (which you can then map on, etc.).
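A minimal sketch of that loop, as XLANG expression shapes (assuming a repeating Record element under MainData; adjust the XPath to your actual structure):

// Expression shape: count the repeating records in the received message.
recordCount = System.Convert.ToInt32(xpath(MessagePolling,
    "count(/*[local-name()='MainData']/*[local-name()='Record'])"));
i = 1;

// Loop shape condition:
i <= recordCount

// Inside the loop, within a Construct Message shape: extract the i-th record.
MessageOne = xpath(MessagePolling,
    "/*[local-name()='MainData']/*[local-name()='Record'][" + i.ToString() + "]");
i = i + 1;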
Take a look at the following links that will help you with extracting via XPath:
http://social.technet.microsoft.com/wiki/contents/articles/6944.biztalk-orchestrations-xpath-survival-guide.aspx
http://www.biztalkgurus.com/biztalk_server/biztalk_blogs/b/biztalk/archive/2004/10/25/using-xpath-inside-biztalk-orchestrations.aspx
http://blog.eliasen.dk/2006/11/05/LoopingAroundElementsOfAMessage.aspx
http://www.codeproject.com/Articles/534627/BizTalk-Looping-through-repeating-message-nodes-in
^There is also a third way to achieve this as of BizTalk Server 2009 (I think - it seems like so long ago) whereby you can execute a Receive Pipeline within an Orchestration, so you could perform your Envelope de-batching in an Orch, instead of a Receive Location's Receive Pipeline.
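A sketch of that third approach (the pipeline and message names are hypothetical; the ReceivePipelineOutputMessages variable is not serializable, so this belongs inside an Atomic Scope):

// Expression shape: execute the receive pipeline over the batched message.
pipeOutput = Microsoft.XLANGs.Pipeline.XLANGPipelineManager.ExecuteReceivePipeline(
    typeof(MyProject.Pipelines.EnvelopeReceivePipeline), msgBatch);

// Loop shape condition:
pipeOutput.MoveNext()

// Inside the loop, within a Construct Message shape: fetch the next de-batched message.
msgSingle = null;
pipeOutput.GetCurrent(msgSingle);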

Send notification if expected message did not arrive in BizTalk

I have a BizTalk receive port monitoring an FTP location. I expect a file to arrive at least once per day in that location and for BizTalk to pick it up and kick off an orchestration. This part is working fine.
However, sometimes the sender fails to send a message during a day, in which case I want an email to be sent to notify the users that something is amiss.
I could solve this outside of BizTalk, by creating a daily job that looks in our database for processed files and makes sure there is at least one in any given day. However, I'd prefer to solve this "in line" with the BizTalk solution that is already in place, and not deploy a separate, unrelated job which will increase maintenance headaches.
Is there any functionality in BizTalk that would allow me to send a notification if a receive port doesn't receive something in a given timeframe?
Short answer: Not really.
The logic you want to implement would require a customised version of the FTP Adapter; it depends on how comfortable you are rolling up your sleeves and getting into the Adapter SDK.
If you wanted to keep your solution "Purely BizTalk", you could set up a secondary Orchestration using a SQL Receive Location tied to a stored procedure. This stored procedure executes regularly and looks for records in your "Processed File" table received in the past (business) day. If none are found, it fabricates a record and returns it via the SQL Receive Location. This would be your trigger to send the email notification.
One solution, though not elegant, is to have a secondary FILE receive location with a schedule window outside your cutoff time.
Failure scenario:
In this FILE receive location, you have an intelligent/dummy message conforming to the same schema as the FTP receive. The intelligent part is that one of the fields in the message tells us the last time we received the file from FTP. The rest of the content is dummy.
Within your orchestration, you check where you received your file from. If it is the secondary receive location (using the context property BTS.ReceiveLocationName), you check the date field of this dummy/intelligent message; if it is not within the past 24 hours (or similar logic), you send an email notifying that you did not receive the file from the upstream FTP process, and you also save a copy of the dummy message (as received) back to the secondary FILE receive location unchanged.
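A sketch of that branching logic (the receive location name and the distinguished field are hypothetical):

// Decide shape expression: did this message arrive via the secondary location?
msgIn(BTS.ReceiveLocationName) == "Secondary_Dummy_File_RL"

// Expression shape on that branch: is the last successful FTP receipt stale?
lastReceived = msgIn.LastFtpReceivedOn;  // hypothetical distinguished field
isStale = lastReceived < System.DateTime.Now.AddHours(-24);
// If isStale, send the notification email and write the dummy message back
// to the secondary FILE receive location unchanged.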
Success scenario:
Apart from normal processing, you save a copy of the dummy/intelligent message to the secondary FILE receive location, with the datetime field reflecting when you processed the file received from the FTP receive location.
Initialising:
You start with a dummy/intelligent message in the secondary FILE receive location, with the datetime field value well in the past (assuming we never received the file from FTP) or with yesterday's date (assuming we received a file successfully from FTP the day before).
Overview:
Your orchestration has two trigger points.
When you receive a file via FTP
A scheduled FILE receive location, triggered after the cut-off time.
