I need to create a BizTalk 2010 application to poll information from an Oracle database, and I have found this useful blog post: Polling Oracle Database Using Stored Procedures, Functions, or Packaged Procedures and Functions.
I have two questions:
The blog hard-codes the parameter for the packaged procedure in both the PolledDataAvailableStatement and the PollingStatement. How do I pass actual parameter values that will change? For example, I want the ability to poll all orders for a given customer, not just the hard-coded customer 'ABC'. The customer ID will be determined at run time.
Without using extra receive ports, but just based on the BizTalk monitor (referring back to the blog), how do I examine the results (i.e. view the records being polled)?
The parameter value may look hard-coded in the call to the function in the query statement SELECT CCM_ADT_PKG.CCM_COUNT('A02') FROM DUAL, but the real parameter values are passed via the generated instance of the input schema, as described in the section "Modify XML content setting up the right parameters."
I don't know how your result message will be used, but even if you already use a send port to send the result somewhere, you can create a simple FILE send port to store a copy of your message instance so you can examine the polled records.
Short version:
Can a property with multiple values somehow be promoted so that send ports can subscribe to one of the values in the list?
Long version:
In a database, I have mapping information where we map people to locations. A person can work at multiple locations, and a location can have many people working at it. The relationship between locations and people (thousands of people) is maintained by an operations team using a web application that updates the database.
A message comes into BizTalk containing multiple people.
Currently, BizTalk receives the message, pulls out the list of people from the message, and dumps the message into a SQL database with an associated list of people. SQL resolves the person/location relationships and writes a distinct list of locations to an associated table. We have a receive port that runs a query and publishes the message from the database to the MessageBox with a promoted property that holds the location. From there we have multiple send ports, each of which subscribes to a particular location.
The issue is that it is not an efficient process. The message gets published multiple times to the BizTalk MessageBox (once for inbound, and at least once for outbound).
Would it be possible, using a pipeline component, to promote the locations that the message should go to, and then have send ports that subscribe to a particular location? The challenge is that some send ports need to be REST and some are SOAP, so the integration can differ between locations. I can't seem to find a way to publish a property with multiple values in a way that lets send ports subscribe to one of those values. Looking for ideas...
Funny, this same situation came up just last week....anyway...
Yes, by using the Bitwise And predicate in the filter. It's the & option. You'd have to map each location to a value (a power of 2), but each property can support up to 32 options (64 if uint64 is supported, which... umm... sorry, I just don't recall :)
If you need more than that, just add a second grouping property and filter: East, West, or whatever.
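To make the flag encoding concrete, here's a minimal sketch of how the SQL side (which already resolves the person/location relationships) could compute a single bit-flag value per message. The table and column names are assumptions for illustration only:

    -- Sketch only: table/column names are assumptions.
    -- Each location gets a power-of-2 flag (A=1, B=2, C=4, ...).
    -- The promoted property is the sum of the distinct flags for the
    -- message's locations, e.g. locations A and C => 1 + 4 = 5.
    SELECT m.MessageId,
           SUM(DISTINCT l.LocationFlag) AS LocationFlags
    FROM   MessageLocations m
           JOIN Locations l ON l.LocationId = m.LocationId
    GROUP BY m.MessageId;

A send port for location C (flag 4) would then filter on the promoted property using the & option with the value 4, since 5 & 4 is non-zero.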
We have two states stored in the Corda vault (policy and event). A policy can have many events associated with it.
We are attempting to get a joined result (as if we ran SQL with a JOIN statement) via the RPC client, and we can't find a graceful way: either we make several vault queries, or we use a direct JDBC connection to the underlying database and extract the required data. Neither way looks appealing, and we wonder if there is a good way to extract the data.
As we cannot use JPA/Hibernate annotations to link objects inside the CorDapp, we just have the policy_id stored in the event state.
For more complex queries, it is fine and even expected that the user will query the node's database directly using the JDBC connection.
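As a rough sketch of what that JDBC query could look like, assuming your states are QueryableState implementations and that the table and column names below are placeholders for whatever your MappedSchema actually produces:

    -- Sketch only: table/column names depend on your MappedSchema definitions.
    -- You may also want to join to VAULT_STATES to restrict to unconsumed states.
    SELECT p.policy_id,
           p.policy_number,
           e.event_id,
           e.event_type
    FROM   policy_states p
           JOIN event_states e ON e.policy_id = p.policy_id
    WHERE  p.policy_id = ?;  -- bind the policy of interest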
I have a normal receive port using the WCF adapter for Oracle with a polling query. The problem is that the receive port not only needs to run once the polling query has a hit, but also once per day, regardless of the polling statement.
Is there a way to make this possible without recreating the entire process?
The cleanest way will be to use an additional receive location. So you will end up with one receive port that contains two receive locations, one for each query.
In the past I have done this with the WCF adapter when polling SQL Server. The use of two locations did require duplicating the schema, unfortunately, to account for the different namespaces. You will probably need two different (and essentially identical) schemas as well.
WCF-SQL polling locations require distinct InboundId values, while WCF Oracle polling (as you have noted in the comments) requires a different PollingId for each receive location.
The ESB Toolkit includes pipeline components to remove and add namespaces, in case you need downstream applications to work with only a single schema for the messages coming from both locations and/or do not want to also duplicate a BizTalk map.
Change your polling statement so that it has an OR SYSDATE BETWEEN ... condition (Oracle does not have a CURRENT_TIME() function; use SYSDATE or CURRENT_TIMESTAMP).
That way it will trigger at the time you want.
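As a sketch of that idea (the MY_ORDERS table, the STATUS condition, and the 06:00 window are assumptions; the window should be at least as wide as your polling interval so the daily run isn't missed):

    -- Sketch only: table, condition, and time window are assumptions.
    -- Rows come back either when the normal polling condition matches,
    -- OR once a day when the current time falls inside the window,
    -- so the receive location fires in both cases.
    SELECT *
    FROM   MY_ORDERS o
    WHERE  o.STATUS = 'NEW'
       OR  TO_CHAR(SYSDATE, 'HH24:MI') BETWEEN '06:00' AND '06:05'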
By using the Orchestration Debugger, one can get useful timing information on the left regarding entering and leaving shapes. Unfortunately, one cannot copy the information from that window. I would like to do some benchmarking and save the statistics in Excel.
Does anyone know the SQL query to get the same data from the database? I have tried to find it with SQL Profiler, but did not hit anything that looks like the correct query or stored procedure.
I know I could use BAM, but I just need a quick one for temporary use.
If you are trying to watch with a SQL trace, be sure you have stopped BizTalk and that you are looking at the BizTalkDTADb database; otherwise it is guaranteed to be an exercise in futility, as BizTalk constantly interacts with SQL Server.
The exact stored procedure it calls to display the orchestration info is dtasp_LocalCallGetActions. You will likely have to do some fancy joins to get meaningful data out of it. A good place to start is the views in the BizTalkDTADb database, which show the same data you see in the HAT views and will allow you to run the same queries over in Query Analyzer.
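As a starting point, a sketch of where to look (run it against a quiet environment as noted above; the dtav_* tracking views and the dta_DebugTrace table live in BizTalkDTADb, but the exact columns vary by BizTalk version, so inspect them before building joins):

    -- Sketch only: inspect the actual columns before relying on them.
    USE BizTalkDTADb;

    -- List the HAT-style tracking views available to join against.
    SELECT name FROM sys.views WHERE name LIKE 'dtav%';

    -- Raw Orchestration Debugger shape-event data.
    SELECT TOP (100) * FROM dbo.dta_DebugTrace WITH (NOLOCK);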
We are building an AbleCommerce 7 web store and trying to integrate it with an existing point-of-sale (POS) system. The product inventory will be shared between a physical store and the web store, so we will need to periodically update the quantity on hand for each product to keep the POS and the web store as close to in sync as possible and avoid overselling product in either location. The POS system does have a scheduled export that runs every hour.
My question is: has anyone had any experience synchronizing data with an AbleCommerce 7 web store, and would you have any advice on an approach?
Here are the approaches that we are currently considering:
1. Grab exported product data from the POS system and determine which products need to be updated. Make calls to a custom-built web service residing on the AbleCommerce server that calls the AbleCommerce APIs and updates the web store appropriately.
2. AbleCommerce does have a Data Port utility that can import/export web store data via the AbleCommerce XML format. This would provide all of the merging logic, but there doesn't appear to be a way to programmatically kick off the merge process. Their utility is a compiled Windows application, and there is no command-line interface that we are aware of. The Data Port utility calls an ASHX handler on the server.
3. Take an approach similar to #1 above, but attempt to use the Data Port ASHX handler to update the products instead of using our own custom web service. Currently there is no documentation for interfacing with the ASHX handler that we are aware of.
Thanks,
Brian
We've set this up between AbleCommerce and an MAS system. We entered the products into the AbleCommerce system and then created a process to push the inventory, price, and cost information from the MAS system into the ProductVariants table.
The one issue we ran into is that no records exist in the ProductVariants table until you make a change to the variant data. So we had to write a stored procedure to automatically populate the ProductVariants table so that we could do the sync.
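For what it's worth, a minimal sketch of that kind of push (the staging table and the ProductVariants column names below are assumptions for illustration; verify them against the actual AbleCommerce schema):

    -- Sketch only: POS_InventoryStaging and the column names are assumptions.
    UPDATE pv
    SET    pv.InStock     = s.QtyOnHand,
           pv.Price       = s.Price,
           pv.CostOfGoods = s.Cost
    FROM   dbo.ProductVariants pv
           JOIN dbo.POS_InventoryStaging s ON s.Sku = pv.Sku;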
I've done this with POS software. It wasn't AbleCommerce, but retail sales and POS software is generic enough (no vendor wants to tell prospects that "you need to operate differently") that it might work.
Sales -> Inventory
Figure out how to tap into the Data Port for near-real-time sales info. I fed this to a message-queue-by-DBMS-table mechanism that was polled and flushed every 30 seconds to update inventory. There are several threads here that discuss MQ via DBMS tables.
Inventory -> Sales
Usually there is a little more slack here - otherwise you get into interesting issues about QC inspection failures, in-transit, quantity validation at receiving, etc. But however it's done, you will have a mechanism for events occurring as new on-hand inventory becomes available. Just do the reverse of the first process. A QOH change event causes a message to be queued for a near-real-time polling app to update the POS.
I actually used a single queue table in MSSQL with a column for messagetype and XML for the message payload.
It ends up being simpler than the description might sound. Let me know if you want info offline.
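In case it's useful, here is a minimal sketch of that kind of queue table and its polling read; the names, types, and batch size are assumptions, not the exact schema I used:

    -- Sketch only: names and batch size are illustrative.
    CREATE TABLE dbo.IntegrationQueue
    (
        QueueId     BIGINT IDENTITY(1,1) PRIMARY KEY,
        MessageType VARCHAR(50) NOT NULL,  -- e.g. 'SaleLine' or 'QohChange'
        Payload     XML         NOT NULL,  -- the message body
        EnqueuedAt  DATETIME    NOT NULL DEFAULT GETUTCDATE(),
        ProcessedAt DATETIME    NULL
    );

    -- Polled roughly every 30 seconds: read a batch of unprocessed messages,
    -- skipping rows another poller has already locked.
    SELECT TOP (100) QueueId, MessageType, Payload
    FROM   dbo.IntegrationQueue WITH (UPDLOCK, READPAST)
    WHERE  ProcessedAt IS NULL
    ORDER BY QueueId;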