Need to make a tool to search XML data from the BizTalk MessageBox.
How do I search all XML data related to, let's say, a common node called Employee ID across all data stored in the BizTalk MessageBox?
The BizTalk Message Box (BizTalkMsgBoxDb database) is a transient store for messages as they pass through BizTalk. Once a message has finished processing, it will be removed from the Message Box.
You probably want to research Business Activity Monitoring (BAM), which will allow you to capture message data as messages flow through BizTalk; the captured data can be exposed through its generic web-based portal. BAM is a big product in its own right and I would suggest that you invest time in researching all of the available features to find the one that suits your particular scenario. There are many, many resources available, but you might start by taking a look at Business Activity Monitoring. There is also a very good book specifically on BAM: Pro BAM in BizTalk Server 2009.
Alternatively, take a look at using the built-in BizTalk Administration Console tools for querying the Tracking database (BizTalkDTADb) which will hold messages for later reference based on your pre-defined configuration options. See Using BizTalk Document Tracking.
Finally, you could consider rolling your own message tracking solution, writing message contents to a SQL Database table, as messages are received in a pipeline for example.
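If you go down that route, a custom pipeline component is the usual hook. Here is a rough sketch of the idea; the table, connection string, and UTF-8 body assumption are mine, and a real component also needs IBaseComponent, IComponentUI, IPersistPropertyBag and the component category attributes, which are omitted here for brevity.

```
using System.Data.SqlClient;
using System.IO;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Sketch only: archives the inbound message body to a SQL table as it passes
// through a receive pipeline. Table and connection string are made-up examples.
public class MessageArchiverComponent : IComponent
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        if (pInMsg.BodyPart == null)
            return pInMsg;

        // Read the original body stream (assumes a text/UTF-8 payload; a
        // production component would use ReadOnlySeekableStream and preserve
        // the original encoding).
        string body;
        using (var reader = new StreamReader(pInMsg.BodyPart.GetOriginalDataStream()))
        {
            body = reader.ReadToEnd();
        }

        // Write the body to an archive table.
        using (var conn = new SqlConnection("server=.;database=MessageArchive;Integrated Security=SSPI"))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.ArchivedMessages (ReceivedUtc, Body) VALUES (GETUTCDATE(), @body)", conn))
        {
            cmd.Parameters.AddWithValue("@body", body);
            conn.Open();
            cmd.ExecuteNonQuery();
        }

        // Hand the pipeline a fresh stream, since the original has been consumed.
        var replacement = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(body));
        pInMsg.BodyPart.Data = replacement;
        pContext.ResourceTracker.AddResource(replacement);
        return pInMsg;
    }
}
```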
Check out the BizTalk Message Decompressor on CodePlex! I've been using this tool for a couple of years with excellent results. Since you're hitting the MessageBox directly, you should be very careful and be thoroughly familiar with the queries you choose to execute.
As noted in a previous answer, BAM and the integrated HAT queries in the admin console are the official, safest, Microsoft-prescribed approaches.
Can I access persisted data of running orchestration instances from the BizTalk database?
My BizTalk application deals with long-running processes and hundreds of orchestration instances can be running at a time.
I would like to access the data persisted by these orchestration instances and display it on my application's UI. The data would give an insight into how many instances are running and what state each of them is in.
EDIT:
Let me try to be a little more specific.
My BizTalk application gets ticket requests (messages) from a source, and after checking some business rules they are to be assigned to different departments of the company. The tickets can hop between the inboxes of different departments as each department completes its processing.
Now, the BizTalk orchestration instances maintain all the information about which department owns a particular ticket at any given time. I want to read this orchestration information and generate an inbox for each department at runtime. I could definitely do this by pushing the information to a separate database and populating the UI from there, but since all this useful information is already available in the form of orchestration instances, I would like to use it directly and avoid any syncing issues.
Does it make any sense?
The answer to your specific question is NO.
BAM exists for this purpose exactly.
Yes, it is doable, although your question is a little confusing. You can't get at the data that is persisted for your orchestration instances, but you can get the number of running or dehydrated instances using options like WMI or the ExplorerOM library. As a starting point you can look at the samples provided as part of the BizTalk installation under the SDK\Samples\Admin folder. You should also look at the MSBTS_ServiceInstance WMI class to get the service instances; there is a sample here: http://msdn.microsoft.com/en-us/library/aa561219.aspx. You can also use PowerShell to perform the same operation.
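For example, here is a minimal sketch of counting running and dehydrated instances via the MSBTS_ServiceInstance WMI class from C#. The ServiceStatus values (2 = running, 8 = dehydrated) are my assumption from the class documentation, so double-check them for your BizTalk version.

```
using System;
using System.Management;   // add a reference to System.Management.dll

class InstanceCounter
{
    static void Main()
    {
        // MSBTS_ServiceInstance lives in the BizTalk WMI namespace.
        var scope = new ManagementScope(@"root\MicrosoftBizTalkServer");
        var query = new SelectQuery(
            "SELECT ServiceStatus FROM MSBTS_ServiceInstance " +
            "WHERE ServiceStatus = 2 OR ServiceStatus = 8");

        using (var searcher = new ManagementObjectSearcher(scope, query))
        {
            int running = 0, dehydrated = 0;
            foreach (ManagementObject instance in searcher.Get())
            {
                // 2 = Active (running), 8 = Dehydrated -- assumed values, verify
                // against the MSBTS_ServiceInstance documentation.
                uint status = Convert.ToUInt32(instance["ServiceStatus"]);
                if (status == 2) running++;
                else if (status == 8) dehydrated++;
            }
            Console.WriteLine("Running: {0}, Dehydrated: {1}", running, dehydrated);
        }
    }
}
```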
I want to find information about a terminated instance in BizTalk 2010 - where can I look for that information? Specifically, I'm looking for information about an instance for one application.
You might start up the BizTalk Administration Console and go to the BizTalk Group node.
Then, on the page that pops up, select 'Tracked Service Instances'.
Assuming you are tracking the specific instance you want to find, you should be able to locate it by extending the query with "State" equals "Terminated".
Depending on whether you tracked it, how long ago it ran, and how long your tracking data is kept in the DTA database (see the SQL job parameters for that), you will find what you are looking for.
Hope this helps.
Specifics notwithstanding, all available information about any instance can be viewed in BizTalk Administrator through the context (right click) options.
The only other place you may find relevant information is in the Event Viewer if there was a previous error.
There is no special database-only log or other source of information.
I am new to BizTalk. I got a requirement as below:
Source: Oracle (table). I created a generated schema in BizTalk.
Target: Webservice which receives "object array" (Table of source records from BizTalk) as an input.
Source and Target systems have the same structure, hence no mapping should be implemented. The logic should be in pipelines or an orchestration.
Need info on the following:
How to incorporate logic in a pipeline or orchestration to map data from the source schema to the target WS schema.
This question was posed (now deleted) on the other big BizTalk forum. So I'll share my answer here.
What you're asking is simply not possible. It doesn't matter that the source and destination are logically the same. They are represented by two different schemas in BizTalk. There is no way around this except by developing the Web Service to accept the WCF Oracle message directly.
Because of that, you must transform from the source to the destination. Maps are how that is done. While there are technically other ways, they are harder to write, bug-prone, and would likely offer a less desirable performance profile.
A ban on Maps is just counter-productive, and as a long-time BizTalk developer I could not accept a project with such a requirement.
It's not very clear what you are asking for, to be honest. Your requirement states that no mapping is required, but then you go on to ask how to incorporate mapping in a pipeline or orchestration.
A standard approach to delivering this would be:

1. Set up your input process from Oracle by using "Consume Adapter Service" from Visual Studio's "Add Generated Items". Use the Oracle binding, set up connection properties for typed polling along with your query (see here for an example on MS SQL), change to a service contract type (for inbound operations), and you'll get a set of schemas representing your dataset and a binding for your typed polling receive port.
2. Use "Consume WCF Service" to point to your "sending" web service and you'll get the schemas, binding, and a helpful orchestration with port types added to your project.
3. Create a simple map mapping your inbound Oracle recordset schema to your web service schema - this should be pretty straightforward if they are identical, although I suspect you'll have to deal with multiple sets of data, depending on your data.
4. Complete by wiring together your orchestration.
I appreciate this is a high-level view of what you need to do, but there are plenty of examples you can Google to get you started. Hope that helps.
I need to invoke a long-running task from an ASP.NET page and allow the user to view the task's progress as it executes.
In my current case I want to import data from a series of data files into a database, but this involves a fair amount of processing. I would like the user to see how far through the files the task is, and any problems encountered along the way.
Due to limited processing resources I would like to queue the requests for this service.
I have recently looked at Windows Workflow and wondered if it might offer a solution?
I am thinking of a solution that might look like:
ASP.NET AJAX page -> WCF Service -> MSMQ -> Workflow Service *or* Windows Service
Does anyone have any ideas, experience or have done this sort of thing before?
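For the WCF -> MSMQ leg I'm imagining something along these lines - purely a sketch, and the contract, queue path, and names are placeholders:

```
using System;
using System.ServiceModel;

// One-way contract the ASP.NET page calls; hosting it over netMsmqBinding
// means requests simply queue up for the back-end service to process.
[ServiceContract]
public interface IImportService
{
    [OperationContract(IsOneWay = true)]   // MSMQ endpoints require one-way operations
    void SubmitImport(Guid jobId, string dataFilePath);
}

public class ImportService : IImportService
{
    public void SubmitImport(Guid jobId, string dataFilePath)
    {
        // The Windows service (or workflow) dequeues here, runs the import,
        // and writes progress records the page can poll.
    }
}

class HostProgram
{
    static void Main()
    {
        using (var host = new ServiceHost(typeof(ImportService)))
        {
            // The private queue must already exist (transactional, since
            // ExactlyOnce defaults to true on this binding).
            var binding = new NetMsmqBinding(NetMsmqSecurityMode.None);
            host.AddServiceEndpoint(typeof(IImportService), binding,
                "net.msmq://localhost/private/ImportRequests");
            host.Open();
            Console.WriteLine("Listening; press Enter to stop.");
            Console.ReadLine();
            host.Close();
        }
    }
}
```

The page would submit work through a ChannelFactory<IImportService> over the same binding and poll separately for progress.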
I've got a book that covers explicitly how to integrate WF (Workflow Foundation) and WCF. It's too much to post here, obviously; your question deserves a longer answer than can readily be given in full on this forum, but Microsoft offers some guidance.
And a Google search for "WCF and WF" turns up plenty of results.
I did have an app under development where we used a similar process with MSMQ. The idea was to deliver emergency messages to all of our stores in case of product recalls, or known issues that affect a large number of stores. It was developed and tested OK.
We ended up not using MSMQ because of a business requirement - we needed to know if a message was not received immediately so that we could call the store, rather than just letting the store get it when their PC was able to pick up the message from the queue. However, it did work very well.
The article I linked to above is a good place to start.
Our current design, the one we went live with, does exactly what you asked about, using a Windows service.
We have a web page to enter messages and pick distribution lists; these are saved in a database.
We have a separate Windows service (we call it the AlertSender) that polls the database and checks for new messages.
The store-level PCs have a Windows service that hosts a WCF client that listens for messages (the AlertListener).
When the AlertSender finds messages that need to go out, it sends them to the AlertListener, which is responsible for displaying the message to the stores and playing an alert sound.
As the messages are sent, the AlertSender updates the status of the message in the database.
As stores receive the message, a co-worker enters their employee # and clicks a button to acknowledge that they've received the message. (Critical business requirement for us because if all stores don't get the message we may need to physically call them to have them remove tainted product from shelves, etc.)
Finally, our administrative piece has a report (ASP.NET) tied to an AlertId that shows all of the pending messages, and their status.
You could have the back-end import process write status records to the database as it completes sections of the task, and the web-app could simply poll the database at arbitrary intervals, and update a progress-bar or otherwise tick off tasks as they're completed, whatever is appropriate in the UI.
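As a rough sketch of that pattern - the ImportJobs table, columns, and connection string are just made up for illustration - the import process updates a progress row as it works, and the page's AJAX call reads it back:

```
using System;
using System.Data.SqlClient;

// Sketch of the two halves of the database-polling progress pattern.
public static class ImportProgress
{
    private const string ConnectionString =
        "server=.;database=ImportStatus;Integrated Security=SSPI";

    // Called by the back-end import process after each file/section completes.
    public static void Report(Guid jobId, int filesDone, int filesTotal, string lastError)
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "UPDATE dbo.ImportJobs SET FilesDone = @done, FilesTotal = @total, " +
            "LastError = @err, UpdatedUtc = GETUTCDATE() WHERE JobId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@done", filesDone);
            cmd.Parameters.AddWithValue("@total", filesTotal);
            cmd.Parameters.AddWithValue("@err", (object)lastError ?? DBNull.Value);
            cmd.Parameters.AddWithValue("@id", jobId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    // Called from the ASP.NET page (e.g. a timer/AJAX callback) to drive the progress bar.
    public static int GetPercentComplete(Guid jobId)
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT FilesDone, FilesTotal FROM dbo.ImportJobs WHERE JobId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", jobId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (!reader.Read() || reader.GetInt32(1) == 0) return 0;
                return (int)(100.0 * reader.GetInt32(0) / reader.GetInt32(1));
            }
        }
    }
}
```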
We are building an AbleCommerce 7 web store and trying to integrate it with an existing point-of-sale system. The product inventory will be shared between a physical store and a web store, so we will need to periodically update the quantity on hand for each product to keep the POS and the web store as close to in sync as possible and avoid overselling product in either location. The POS system does have a scheduled export that will run every hour.
My question is, has anyone had any experience with synchronizing data with an Able Commerce 7 web store and would you have any advice on an approach?
Here are the approaches that we are currently considering:
1. Grab exported product data from the POS system and determine which products need to be updated. Make calls to a custom-built web service residing on the server with AbleCommerce to call AbleCommerce APIs and update the web store appropriately.
2. AbleCommerce does have a Data Port utility that can import/export web store data via the AbleCommerce XML format. This would provide all of the merging logic, but there doesn't appear to be a way to programmatically kick off the merge process. Their utility is a compiled Windows application. There is no command-line interface that we are aware of. The Data Port utility calls an ASHX handler on the server.
3. Take an approach similar to #1 above but attempt to use the Data Port ASHX handler to update the products instead of using our own custom web service. Currently there is no documentation for interfacing with the ASHX handler that we are aware of.
Thanks,
Brian
We've set this up between AbleCommerce and an MAS system. We entered the products into the AbleCommerce system and then created a process to push the inventory, price, and cost information from the MAS system into the ProductVariants table.
The one issue we ran into is that no records exist in the ProductVariants table until you make a change to the variants data. So, we had to write a Stored Procedure to automatically populate the ProductVariants table so that we could do the sync.
I've done this with POS software. It wasn't AbleCommerce, but retail sales and POS software is generic enough (no vendor wants to tell prospects that "you need to operate differently") that it might work.
Sales -> Inventory
Figure out how to tap into the Data Port for near-real-time sales info. I fed this to a Message-Queue-By-DBMS-Table mechanism that was polled and flushed every 30 seconds to update inventory. There are several threads here that discuss MQ via dbms tables.
Inventory -> Sales
Usually there is a little more slack here - otherwise you get into interesting issues about QC inspection failures, in-transit, quantity validation at receiving, etc. But however it's done, you will have a mechanism for events occurring as new on-hand inventory becomes available. Just do the reverse of the first process. A QOH change event causes a message to be queued for a near-real-time polling app to update the POS.
I actually used a single queue table in MSSQL with a column for messagetype and XML for the message payload.
It ends up being simpler than the description might sound. Let me know if you want info offline.
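If it helps, a bare-bones version of that queue-table poller could look something like the following. Table name, columns, and connection string are placeholders, and a real implementation needs locking or row-claiming (so multiple pollers don't grab the same row) plus error handling.

```
using System;
using System.Data.SqlClient;
using System.Threading;

// Sketch of a poller for a single queue table with a MessageType column and
// an XML payload, flushed on a fixed interval.
class QueueTablePoller
{
    private const string ConnectionString =
        "server=.;database=Integration;Integrated Security=SSPI";

    static void Main()
    {
        while (true)
        {
            DrainQueue();
            Thread.Sleep(TimeSpan.FromSeconds(30));   // polled/flushed every 30 seconds
        }
    }

    static void DrainQueue()
    {
        using (var conn = new SqlConnection(ConnectionString))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "SELECT TOP 50 MessageId, MessageType, Payload FROM dbo.MessageQueue " +
                "WHERE ProcessedUtc IS NULL ORDER BY MessageId", conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    long id = reader.GetInt64(0);
                    string type = reader.GetString(1);
                    string xml = reader.GetString(2);
                    Dispatch(type, xml);     // e.g. update web-store QOH or push to the POS
                    MarkProcessed(id);
                }
            }
        }
    }

    static void Dispatch(string messageType, string payloadXml)
    {
        // Route by message type: a sale event decrements web-store inventory,
        // an inventory-received event updates the POS, etc. (application-specific)
    }

    static void MarkProcessed(long messageId)
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "UPDATE dbo.MessageQueue SET ProcessedUtc = GETUTCDATE() WHERE MessageId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", messageId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```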