I have a store-manager dashboard that uses the Microsoft Dynamics AX database. To avoid writing a lot of code, I plan to use the Commerce Runtime (CRT), which would give me some abstraction and save me the effort of writing a lot of code against other integration methods like AIF and the .NET Business Connector.
My doubt is this: the description of the CRT says it uses the CRT channel database.
Will that database have the amount of data the AX database has, and is CRT the right way forward when you need the central Dynamics AX database that holds all the data?
See this overview of the Commerce Runtime Architecture.
If the dashboard can use the services of the CRT, then use that.
The CRT database is not the AX database; it contains a subset of the AX data and is updated asynchronously using a one-way or two-way sync, depending on the data.
You will have to decide whether this is okay for your application.
What are the different ways to get data out of Progress procedures as JSON, other than creating PASOE instances, ODBC, or JDBC? If I want to build an API that communicates with a Progress 4GL DB, what options are available besides the ones I mentioned above? To give an example: I have a front-end application built on JavaScript/Angular/ASP.NET Core, and I want to make calls to the Progress DB — how do I achieve that? It would also be helpful to know of any newer technologies that can be integrated to communicate with a Progress 4GL DB.
If you run the old-school AppServer or WebSpeed, you can set up a web service that way.
For .Net (or Java) you can check the Open Client: https://docs.progress.com/bundle/openedge-open-clients/page/Introduction-to-Open-Clients.html
You could also develop some kind of server operating on a socket, but I think sticking to tested techniques such as the ones you mention not wanting to use is the winning bet. Whatever you save in money by not licensing PAS (if money is the issue) will be lost in time developing that server.
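If ODBC turns out to be acceptable after all, the plumbing on the API side is small. Here is a minimal sketch in Python; the DSN name, credentials, and the `PUB.Customer` table are placeholders for your own OpenEdge setup, and `pyodbc` is just one of several ODBC bridges you could use:

```python
import json

def fetch_customers_as_json(cursor):
    """Run a query over any ODBC-style cursor and return the rows as JSON."""
    cursor.execute("SELECT CustNum, Name FROM PUB.Customer")
    columns = [col[0] for col in cursor.description]
    return json.dumps([dict(zip(columns, row)) for row in cursor.fetchall()])

# With the OpenEdge ODBC driver installed, the cursor would come from
# pyodbc, for example (DSN and credentials are made up):
#   conn = pyodbc.connect("DSN=progressdb;UID=sysprogress;PWD=secret")
#   print(fetch_customers_as_json(conn.cursor()))
```

An ASP.NET Core front end could then call this endpoint like any other JSON API, which keeps the Progress-specific details behind one small service.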
I am new to BizTalk and have the following requirement:
Source: Oracle (table). I created a generated schema in BizTalk.
Target: Webservice which receives "object array" (Table of source records from BizTalk) as an input.
The source and target systems have the same structure, so no mapping should be implemented; the logic should live in pipelines or an orchestration.
I need information on the following:
How to incorporate logic in a pipeline or orchestration to map data from the source schema to the target web-service schema.
This question was posed (now deleted) on the other big BizTalk forum. So I'll share my answer here.
What you're asking is simply not possible. It doesn't matter that the source and destination are logically the same. They are represented by two different schemas in BizTalk. There is no way around this except by developing the Web Service to accept the WCF Oracle message directly.
Because of that, you must transform from the source to the destination. Maps are how that is done. While there are technically other ways, they are harder to write, bug prone and would likely offer a less desirable performance profile.
A ban on Maps is just counter-productive, and as a long-time BizTalk developer I could not accept a project with such a requirement.
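To see why "logically the same" is not enough: the generated WCF Oracle schema and the web service schema live in different XML namespaces, so even an element-for-element copy is still a transform that BizTalk must perform. A rough illustration in Python (both namespace URIs here are made up):

```python
import xml.etree.ElementTree as ET

SRC_NS = "http://schemas.example.com/OracleTypedPolling"  # hypothetical source namespace
DST_NS = "http://example.com/TargetService"               # hypothetical target namespace

def renamespace(element, old_ns, new_ns):
    """Swap the namespace on every element in place. The output is
    structurally identical to the input, yet it is still a transformation -
    which is exactly what a BizTalk map does for you."""
    for node in element.iter():
        if node.tag.startswith("{" + old_ns + "}"):
            node.tag = "{" + new_ns + "}" + node.tag.split("}", 1)[1]
    return element

src = ET.fromstring('<Row xmlns="%s"><Id>1</Id></Row>' % SRC_NS)
dst = renamespace(src, SRC_NS, DST_NS)
print(ET.tostring(dst, encoding="unicode"))
```

In BizTalk a simple map between identical structures generates essentially this kind of identity transform, which is why refusing maps buys you nothing.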
It's not very clear what you are asking for, to be honest. Your requirement states that no mapping is required, but then you ask how to incorporate mapping in pipelines or orchestrations.
A standard approach to delivering this would be:
1. Set up your input process from Oracle by using "Consume Adapter Service" from Visual Studio's "Add Generated Items". Use the Oracle binding, set up connection properties for typed polling along with your query (see here for an example on MS SQL), change to a service contract type (for inbound operations), and you'll get a set of schemas representing your dataset plus a binding for your typed receive port poller.
2. Use "Consume WCF Service" to point to your "sending" web service, and you'll get the schemas, a binding, and a helpful orchestration with port types added to your project.
3. Create a simple map from your inbound Oracle recordset schema to your web service schema. This should be pretty straightforward if they are identical, although I suspect you'll have to deal with multiple sets of data, depending on your data.
4. Complete by wiring your orchestration together.
I appreciate this is a high-level view of what you need to do, but there are plenty of examples you can Google to get you started. Hope that helps.
I need to make a tool to search XML data in the BizTalk MessageBox.
How do I search all XML data related to, let's say, a common node called Employee ID across all data stored in the BizTalk MessageBox?
The BizTalk Message Box (BizTalkMsgBoxDb database) is a transient store for messages as they pass through BizTalk. Once a message has finished processing, it will be removed from the Message Box.
You probably want to research Business Activity Monitoring (BAM) which will allow you to capture message data as messages flow through BizTalk; message data can be exposed through its generic web-based portal. BAM is a big product in its own right and I would suggest that you invest time in researching all of the available features to find the one that suits your particular scenario. There are many, many resources available, however you might start by taking look at Business Activity Monitoring. There is also a very good book specifically on BAM: Pro BAM in BizTalk Server 2009
Alternatively, take a look at using the built-in BizTalk Administration Console tools for querying the Tracking database (BizTalkDTADb) which will hold messages for later reference based on your pre-defined configuration options. See Using BizTalk Document Tracking.
Finally, you could consider rolling your own message tracking solution, writing message contents to a SQL Database table, as messages are received in a pipeline for example.
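A sketch of that roll-your-own approach, with SQLite and XPath standing in for SQL Server and the custom pipeline component (the table name, message shape, and `EmployeeID` node are assumptions based on the question):

```python
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TrackedMessages (MessageId INTEGER PRIMARY KEY, Body TEXT)")

def track(xml_body):
    # In BizTalk this insert would live in a custom receive pipeline component,
    # so every message is captured before it disappears from the MessageBox.
    conn.execute("INSERT INTO TrackedMessages (Body) VALUES (?)", (xml_body,))

def search_by_employee_id(employee_id):
    """Pull every tracked body and filter on the common EmployeeID node."""
    hits = []
    for (body,) in conn.execute("SELECT Body FROM TrackedMessages"):
        node = ET.fromstring(body).find(".//EmployeeID")
        if node is not None and node.text == employee_id:
            hits.append(body)
    return hits

track("<Timesheet><EmployeeID>42</EmployeeID></Timesheet>")
track("<Timesheet><EmployeeID>7</EmployeeID></Timesheet>")
print(search_by_employee_id("42"))
```

On SQL Server you could push the filtering into the database itself with the XML data type and an XQuery predicate instead of scanning rows in application code.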
Check out the BizTalk Message Decompressor on CodePlex! I've been using this tool for a couple of years with excellent results. Since you're hitting the messagebox directly, you should be very careful and be very familiar with the queries that you choose to execute.
As noted by a previous poster's answer, BAM and the integrated HAT queries in the admin console are the official, safest, and Microsoft prescribed answers.
We are building an AbleCommerce 7 web store and trying to integrate it with an existing point-of-sale (POS) system. The product inventory will be shared between a physical store and the web store, so we will need to periodically update the quantity on hand for each product to keep the POS and the web store as close to in sync as possible and avoid overselling product in either location. The POS system does have a scheduled export that runs every hour.
My question is: has anyone had any experience synchronizing data with an AbleCommerce 7 web store, and would you have any advice on an approach?
Here are the approaches that we are currently considering:
Grab exported product data from the POS system and determine which products need to be updated. Make calls to a custom-built web service residing on the server with AbleCommerce to call AbleCommerce APIs and update the web store appropriately.
AbleCommerce does have a Data Port utility that can import/export web store data via the AbleCommerce XML format. This would provide all of the merging logic, but there doesn't appear to be a way to programmatically kick off the merge process. Their utility is a compiled Windows application; there is no command-line interface that we are aware of. The Data Port utility calls an ASHX handler on the server.
Take an approach similar to #1 above but attempt to use the Data Port ASHX handler to update the products instead of using our own custom web service. Currently there is no documentation for interfacing with the ASHX handler that we are aware of.
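Whichever option we pick, the first step of approach #1 is the same: compare the hourly POS export against the web store's current quantities and touch only the SKUs that changed. A small sketch of that diffing step (the data shapes are assumptions; in practice the store side would come from an AbleCommerce query):

```python
def quantity_updates(pos_export, store_quantities):
    """Return {sku: new_qty} for SKUs whose POS quantity differs from the
    web store's, so only those trigger an update call against the store."""
    return {
        sku: qty
        for sku, qty in pos_export.items()
        if store_quantities.get(sku) != qty
    }

pos = {"SKU-1": 10, "SKU-2": 0, "SKU-3": 5}
store = {"SKU-1": 10, "SKU-2": 3, "SKU-3": 4}
print(quantity_updates(pos, store))  # only SKU-2 and SKU-3 need updates
```

Minimizing the number of update calls matters with an hourly export, since each call to the web store is comparatively expensive.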
Thanks,
Brian
We've set this up between AbleCommerce and an MAS system. We entered the products into the AbleCommerce system and then created a process to push the inventory, price, and cost information from the MAS system into the ProductVariants table.
The one issue we ran into is that no records exist in the ProductVariants table until you make a change to the variants data. So, we had to write a Stored Procedure to automatically populate the ProductVariants table so that we could do the sync.
I've done this with POS software. It wasn't AbleCommerce, but retail sales and POS software is generic enough (no vendor wants to tell prospects that "you need to operate differently") that it might work.
Sales -> Inventory
Figure out how to tap into the Data Port for near-real-time sales info. I fed this to a Message-Queue-By-DBMS-Table mechanism that was polled and flushed every 30 seconds to update inventory. There are several threads here that discuss MQ via dbms tables.
Inventory -> Sales
Usually there is a little more slack here - otherwise you get into interesting issues about QC inspection failures, in-transit, quantity validation at receiving, etc. But however it's done, you will have a mechanism for events occurring as new on-hand inventory becomes available. Just do the reverse of the first process. A QOH change event causes a message to be queued for a near-real-time polling app to update the POS.
I actually used a single queue table in MSSQL with a column for messagetype and XML for the message payload.
It ends up being simpler than the description might sound. Let me know if you want info offline.
I've been working for a while on a project aiming to integrate AX with the web.
The company that delivered AX has chosen to use the Business Connector (BC.NET) directly on my side of the backend.
I've searched a bit, and it looks to me like we must use the Application Integration Framework (AIF) or Enterprise Portal (EP). As I understand it, the BC is not made for multi-user scenarios like the web and must be implemented behind a session wrapper like EP; it must also run on a LAN and is not capable of connecting over a WAN.
Any comments about this?
--
-EDIT-
More info:
Oh, sorry - new to Stack Overflow - didn't see that you had commented on my question.
I'm doing this from scratch.
The initial plan was to create a model and send objects directly from AX via the BC to my data layer, but since the BC is not able to pass anything other than Axapta objects, we decided to serialize to XML, send it as a string over the BC, and then deserialize it in my data layer.
Now everything works, but the stability and performance are really sucky, and I fear that the company delivering the backend (BC -> AX) is doing something really wrong here...
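In outline, the round trip we ended up with looks like this (sketched in Python for brevity; the Customer shape is a placeholder, and on the .NET side the deserialization would typically use XmlSerializer):

```python
import xml.etree.ElementTree as ET

# AX side: business data flattened to an XML string, since the BC can
# only pass Axapta types and primitives such as strings.
def serialize_customer(cust_id, name):
    root = ET.Element("Customer")
    ET.SubElement(root, "Id").text = str(cust_id)
    ET.SubElement(root, "Name").text = name
    return ET.tostring(root, encoding="unicode")

# Web side: the data layer rebuilds a typed object from the string.
def deserialize_customer(xml_string):
    root = ET.fromstring(xml_string)
    return {"Id": int(root.findtext("Id")), "Name": root.findtext("Name")}

wire = serialize_customer(1001, "Contoso")
print(deserialize_customer(wire))
```

The serialization itself is cheap; my performance worry is about how the BC sessions behind it are being managed.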
EP connects to AX data and logic through the BC. So if your application using the BC performs worse than EP, that can't be directly the fault of the BC itself.
(EP: AX Enterprise Portal; BC: AX .NET Business Connector)