Transaction with FTP adapter - BizTalk

I want to pull something from the server (no delete), parse the file in a pipeline component, process the file, and, if everything goes successfully, have the adapter delete the file.
I am thinking of enlisting the parsing into the pipeline context. This way, I am picturing that if the file cannot be parsed, it will not get to the message box and will therefore be deemed a failed transaction. Question: will the adapter participate in this transaction? In other words, my goal is to instruct the adapter to delete the file from the server ONLY when the pipeline has processed successfully (the transaction is committed); the file is left untouched on the server if the pipeline fails (the transaction is rolled back and no message is committed to the message box).
Is this achievable? Thanks in advance.

I think a little experiment is in order. BizTalk, as part of its nature, will not delete anything until it has been persisted to the message box. That being said, persistence might happen before pipeline execution. So, the receive adapter receives the file, persists it to the message box and deletes the file. The message might subsequently fail in the pipeline. If this is the case, then the message is in a bad format and it will have to be resubmitted by the sender. If you want to keep this message, you'll have to pick it up with Failed Message Routing. You can then write it to a directory and implement a resubmit pattern. Or, you can pick up the file via Failed Message Routing and put it back on the FTP server (this is sort of a compensation step).
On the other hand, if the pipeline fails and the message isn't deleted from the server... you're fine.

Related

Can I use a single pipeline for both a file receive location and an email receive location?

I have a pipeline that is working for a file receive location. This pipeline accepts a csv file and maps it to an XML.
I'm now trying to setup a new email receive location using the same port, pipeline and pipeline settings.
Shouldn't BizTalk ignore everything but the attachment if I set the body part index equal to 2? And then it should place the attachment in the pipeline just like with the file location, and the pipeline would output an XML file.
Error:
Microsoft.XLANGs.Core.PersistenceException: Exception occurred when persisting state to the database. ---> Microsoft.BizTalk.XLANGs.BTXEngine.PersistenceItemException: A batch item failed persistence Item-ID 72fbeba9-6bfe-48e0-a0e6-ca5bbd191aa1 OperationType MAIO_CommitBatch Status -1061151998 ErrorInfo The published message could not be routed because no subscribers were found. . ---> Microsoft.BizTalk.XLANGs.BTXEngine.PublishMessageException: Failed to publish (send) a message in the batch. This is usually because there is no one expecting to receive this message. The error was The published message could not be routed because no subscribers were found. with status -1061151998
POP3 properties:
PipelineConfigurations:
I tried stopping the orchestration and the send port and testing both locations.
Testing the email location, I got the same error with an email that has a .csv attachment.
Testing the file location, the data didn't reach the database, but the CSV was processed, because I could see it in the information logs.
This leads me to the conclusion that the problem is related to the MIME decoding and whatever my pipeline is outputting from the email body parts.
Also, after researching for a while, all solutions seem to point to the necessity of having a pipeline exclusively for email, since I need to specify which part of the multi-part body to decode. I was hoping there was a solution that would allow me to reuse the pipeline I use for the file location.
As Dijkgraaf mentioned:
It is not failing in the pipeline. It is failing due to there being no Orchestration or Send Port that is expecting the message the Receive Port has published to message box.
That means that the receive worked, that the message has passed the pipeline, and is published in the Messagebox, but there is no matching subscription.
Check for routing failures in the BizTalk management console to find out why. It may be that the message type is not what you expected, or that one or more promoted properties are not set correctly.
Look at the suspended message and check which body part is the CSV file. Your configuration says it should be the 3rd body part (BodyPart = 2).

No output (send part) on BizTalk pipeline

I have built a flatfile schema with the flatfile schema wizard.
The schema is valid and I could successfully validate my test instance against the schema. (So the XML file was created correctly.)
But when I put my test flat file into a receive location that uses a flat-file disassembler pipeline, nothing happens after the receive location has picked up the message (the logical receive port is bound to an orchestration).
In the BizTalk Admin Console, the tracked message events from the pipeline only show that the message was received, but not sent.
Maybe some of you already had a similar issue and could help me here.
Tracked message events in the pipeline
So, the issue was a wrongly declared header schema. I rebuilt it, and after that it worked fine.
I recognized the issue by removing the header schema from the pipeline properties and triggering a new process. This time the message was created correctly (but with the header line still present, due to the missing header schema that would otherwise strip that line).
Thanks to all who helped here!

Spring Integration SFTP move remote file issue

I'm using the inbound-channel-adapter from Spring Integration to retrieve files over SFTP from a remote server. Everything works fine.
But I have an additional requirement: after a file is received on the local side, that file needs to be moved to a "send" directory on the remote server.
The "SFTP Outbound Gateway" has the appropriate method for that move action, but my problem is when to call it.
Situation: 10 files on remote server, 0 on local server
When I start my application it will receive all 10 files from the remote server and write them to my local file system. Perfect.
Situation: 1 file on remote server, 10 on local server
In this situation the remote file is received, but for every file on the local file system the receive method of the QueueChannel is also called.
Example log from one file: (file1.zip)
18:12:52.118 [task-scheduler-1] INFO o.s.i.file.FileReadingMessageSource - Created message: [[Payload File content=C:\Downloads\sftpTest\file1.zip][Headers=...]
18:12:52.119 [task-scheduler-1] DEBUG o.s.i.e.SourcePollingChannelAdapter - Poll resulted in Message: [Payload File content=C:\Downloads\sftpTest\file1.zip][Headers=...]
18:12:52.119 [task-scheduler-1] DEBUG o.s.integration.channel.QueueChannel - preSend on channel 'fromChannel', message: [Payload File content=C:\Downloads\sftpTest\file1.zip][...]
18:12:52.119 [task-scheduler-1] DEBUG o.s.integration.channel.QueueChannel - postSend (sent=true) on channel 'fromChannel', message: [Payload File content=C:\Downloads\sftpTest\file1.zip][Headers=...]
18:12:52.119 [main] DEBUG o.s.integration.channel.QueueChannel - postReceive on channel 'fromChannel', message: [Payload File content=C:\Downloads\sftpTest\file1.zip][Headers=......]
So even when the file is not physically retrieved from the remote server, the channel.receive() method will still receive a message with that file as payload.
This confuses me, because I can't determine from the message if the file was already on the local file system or was just retrieved from the remote server.
I experimented using a custom org.springframework.messaging.support.ChannelInterceptorAdapter, FileFilter, ServiceActivator, but the problem still remains.
My application will process high volumes, so sending the received file to the required directory on the remote server is not an option. Simply trying to move the remote file for every message that is received locally is not an option either, since it would clutter the log files with exceptions about not being able to move the file. That way, a real error situation would go undetected.
One solution might be a hook in the copyFileToLocalDirectory method of org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.
There, a check is performed to decide whether the remote file should be deleted, and that code path is only reached for files that were actually transferred from the remote server. My attempts to override this method and add my move behaviour did not succeed, since Spring has already instantiated the classes that handle this.
So what is the best way to achieve this? I know the problem will probably be located between my keyboard and my chair, but I've run out of options and any help is highly appreciated.
Thanks a lot,
Frank
You would probably be better off using MGET and an outbound gateway to retrieve the files instead of using the inbound adapter which, as you say, is two-stage - synchronize, and emit message(s) for file(s) in the local dir (unless you use a persistent file list filter, in which case you'll only see "new" files).
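Below is a minimal sketch of that gateway-based approach using the Spring Integration Java DSL (assuming Spring Integration 5.x). The session factory bean, remote paths, local directory, poll interval and channel names are placeholders, not taken from the question, and the MV step is added here only to illustrate the original "move to the send directory" requirement: the MGET gateway replies with exactly the files it transferred, so the remote move is only attempted for those files.

```java
import java.io.File;

import com.jcraft.jsch.ChannelSftp;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.Command;
import org.springframework.integration.file.remote.session.SessionFactory;
import org.springframework.integration.sftp.dsl.Sftp;
import org.springframework.messaging.support.GenericMessage;

@Configuration
public class SftpFetchAndMoveConfig {

    @Bean
    public IntegrationFlow sftpFetchAndMoveFlow(SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory) {
        return IntegrationFlows
                // each poll sends the remote pattern as the MGET argument (path is a placeholder)
                .from((MessageSource<String>) () -> new GenericMessage<>("/remote/in/*"),
                        e -> e.poller(Pollers.fixedDelay(30_000)))
                // MGET downloads the matching remote files and replies with a List<File>
                // containing only the files that were actually transferred
                .handle(Sftp.outboundGateway(sftpSessionFactory, Command.MGET, "payload")
                        .localDirectory(new File("C:/Downloads/sftpTest")))
                .split()
                .publishSubscribeChannel(pubSub -> pubSub
                        // branch 1: move the remote original to the "send" directory;
                        // this only runs for files that really came down in this poll
                        .subscribe(move -> move
                                .handle(Sftp.outboundGateway(sftpSessionFactory, Command.MV,
                                        "'/remote/in/' + payload.name")
                                        .renameExpression("'/remote/send/' + payload.name"))
                                .channel("nullChannel"))
                        // branch 2: hand the local File to the existing processing channel
                        .subscribe(process -> process.channel("fromChannel")))
                .get();
    }
}
```

Because the remote MV is driven only by the MGET reply, files that were already present locally never trigger a move attempt, which avoids the flood of "cannot move" exceptions described in the question. If you would rather keep the inbound channel adapter, the persistent file list filter mentioned above (for example an SftpPersistentAcceptOnceFileListFilter backed by a MetadataStore) is the way to make the adapter emit messages only for genuinely new files.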

Send notification if expected message did not arrive in BizTalk

I have a BizTalk receive port monitoring an FTP location. I expect a file to arrive at least once per day in that location and for BizTalk to pick it up and kick off an orchestration. This part is working fine.
However, sometimes the sender fails to send a message during a day, in which case I want an email to be sent to notify the users that something is amiss.
I could solve this outside of BizTalk, by creating a daily job that looks in our database for processed files and makes sure there is at least one in any given day. However, I'd prefer to solve this "in line" with the BizTalk solution that is already in place, and not deploy a separate, unrelated job which will increase maintenance headaches.
Is there any functionality in BizTalk that would allow me to send a notification if a receive port doesn't receive something in a given timeframe?
Short answer: Not really.
The logic you want to implement would require a customised version of the FTP Adapter. Depends on how comfortable you are rolling up your sleeves and getting into the Adapter SDK.
If you wanted to keep your solution "Purely BizTalk", you could set up a secondary Orchestration using a SQL Receive Location tied to a stored procedure. This stored procedure executes regularly and looks for records in your "Processed File" table received in the past (business) day. If none are found, it fabricates a record and returns it via the SQL Receive Location. This would be your trigger to send the email notification.
One solution, though not elegant, is to have a secondary FILE receive location with a schedule window outside your cutoff time.
Failure scenario:
In this FILE receive location, you have an intelligent/dummy message conforming to the same schema as the FTP receive. The intelligent part is that one of the fields in the message tells us when we last received a file from FTP. The rest of the content is dummy.
Within your orchestration, you check where you received your file from. If it is the secondary receive location (using the context property BTS.ReceiveLocationName), you check the date field of this dummy/intelligent message; if it is more than 24 hours old (or similar logic), you send an email notifying that you did not receive the file from the upstream FTP process, and you also save a copy of the dummy message back to the secondary FILE receive location unchanged.
Success Scenario:
Apart from normal processing, you save a copy of the dummy/intelligent message to the secondary FILE receive location, with the datetime field reflecting when you processed the file you received from the FTP receive location.
Initialising:
You start with a dummy/intelligent message in the secondary FILE receive location with the datetime field value well in the past (assuming we never received the file from FTP) or with yesterday's date (assuming we received a file successfully from FTP the day before).
Overview:
Your orchestration has two trigger points.
When you receive a file via FTP
A scheduled FILE receive location, triggered after the cut-off time.

What could cause a message (from a polling receive location) to be ignored by subscribing orchestration?

I'll try provide as much information as possible:
No error message.
The instance stays in the "ready service instances".
The receive location has the same parameters (except URI, the three polling queries, user account/pw and receive pipeline) as another receive location that points to another database/table which works.
The pipeline is waiting for the correct schema.
The port surface and receive location are both waiting for the correct schema.
In my test example, there are only 10 lines being returned.
The message, which contains those 10 lines, validates against the schema.
I tried leaving the instance alone, to no avail: 30+ minutes and no change in its condition.
I had also tried suspending and then resuming it which then places the instance in the "dehydrated orchestrations" list. Again, with no error message.
I'm able to get the message by looking at the body of the message that's in the "ready to run" service. (This is the message that validates against the schema I use in Visual Studio.)
How might something like this arise?
Stupid question, but I have to ask... Is the corresponding host instance running?
