Microsoft Dynamics AX Process Integration through Outbound Ports - axapta

I would like to know the process integration steps through outbound ports.
Whenever an event occurs in Dynamics AX, we just want to be notified of that event in the form of XML (process integration).
Example: Sales Order Creation, Customer Creation, Purchase Order Creation.

Outbound ports are only useful for asynchronous communication.
See AX 2012 Export Data with Outbound ports for an example (using the file system).
The steps to initiate sending data are shown in AIF_SendCustomer.
As this is no lightweight operation, you may consider logging the records which need integration in a custom integration table, then doing the processing in batch.
The logging is done in the insert and/or update (and maybe delete) methods.
Deletes require you to store the RecId field value in the external system, to be used for delete requests. The following does not cover this.
For each logged table, create the following method:
void syncRecord()
{
    XXXRecordLog log;

    // Record a reference to the changed record for later transfer
    log.RefTableId = this.TableId;
    log.RefRecId   = this.RecId;
    log.insert();
}
Then call this.syncRecord() in the insert and update methods.
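A minimal sketch of the insert override (the update override is analogous; XXXRecordLog is the placeholder log table from above):
void insert()
{
    super();            // perform the standard insert; RecId is assigned here
    this.syncRecord();  // log the new record for integration
}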
In the query of the outbound service, be sure to use an exists join between your table and the log table. This way only changed records are exported.
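For illustration, an X++ sketch of that exists join, with CustTable standing in for your table (in an AIF document service the filter would normally live in the service's query):
CustTable    custTable;
XXXRecordLog log;

// Select only customers that have a pending log record
while select custTable
    exists join log
        where log.RefTableId == custTable.TableId
           && log.RefRecId   == custTable.RecId
{
    // export custTable here
}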
Make a batch job to do the transfer, using AIF_SendCustomer as a template.
After a synchronous (AifSendMode::Sync) transfer of the records, delete the log records (or mark them transferred).
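A sketch of the cleanup step, again with CustTable as the example table:
XXXRecordLog log;

// Set-based delete of the processed log entries after a successful send
delete_from log
    where log.RefTableId == tableNum(CustTable);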
Finally, call AifOutboundProcessingService to flush the file:
new AifOutboundProcessingService().run();
Try to keep things simple. It might be simpler to do a comma-separated file export of the changed records!

Related

How is an event triggered for a service in Oracle BRM

Hello everyone. I have been going through the Oracle Communications Billing and Revenue Management (BRM) docs for some time now, but one thing I can't figure out is how the events associated with a service are actually triggered. Is it done by BRM or by the client application interacting with BRM? Somehow the docs don't mention it, or I may have missed it. It would be nice if someone could explain it to me. Note: the term service is used in the same context as in the BRM docs, i.e. something that customers subscribe to and pay for when used, e.g. a telephony service. The link to the docs: https://docs.oracle.com/cd/E51000_01/index.htm
Events are created by BRM:
• the logic is defined in the CM (Connection Manager) tier, inside the Facilities Modules (FMs)
• the actual SQL instructions are generated by the Oracle DM (dm_oracle)
For the sake of completeness, there is also another way in which events are created:
• Call Detail Records (CDRs) are rated with the Batch Rating Engine (Pipeline Manager) and generate Rated Event records
• these are then loaded, via the Rated Event Loader (REL), directly to the database. Under the hood, REL calls SQL*Loader and a stored procedure to update account balances and bill items

How to programmatically get events from Sterling File Gateway?

We have Sterling File Gateway with the UI and everything, and we also have Control Center, where we see the file transfers from SFG. I am trying to find out how I can subscribe to the events from File Gateway (SFG) programmatically. The documentation is not clear on whether there is a way to do this.
The database tables FG_EVENT and FG_EVENTATTR contain the details about File Gateway events.
Example of an SQL query:
select *
from fg_event t1
join fg_eventattr t2 on t1.event_key = t2.event_key
where t1.event_code = 'FG_0422'
You can add different criteria to the SQL query to filter on filename, type of delivery, date, etc.
Then you can use SQL queries from any client to query the database.
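For example, a minimal C# sketch using a generic ODBC connection (the DSN name "SterlingDb" is a placeholder pointing at your Sterling B2B Integrator database):
using System;
using System.Data.Odbc;

class FgEventQuery
{
    static void Main()
    {
        using (var conn = new OdbcConnection("DSN=SterlingDb"))
        {
            conn.Open();
            // Parameterized variant of the query above
            var cmd = new OdbcCommand(
                "select t1.event_key, t1.event_code from fg_event t1 where t1.event_code = ?",
                conn);
            cmd.Parameters.AddWithValue("@code", "FG_0422");
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"{reader[0]} {reader[1]}");
            }
        }
    }
}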
Sterling Control Center can monitor the following events:
• Arrived File events - every Sterling File Gateway Arrived File status code is recorded as a successful (FG_0411 - Arrived File Routed) or a failed (FG_0455 - Arrived File Failed) file transfer
• Route events
• Delivery events
More information about IBM Control Center.
There is also another way to invoke business processes by certain events:
Edit the listenerStartup.properties and listenerStartup.properties.in files to include the line:
Listener.Class.xx=com.sterlingcommerce.server1.dmi.visibility.event.XpathBPLauncherEventListener
Where xx is the next available number according to how many listeners are already enabled in the file.
Edit the visibility.properties and visibility.properties.in files to add the necessary information to configure the listener to launch the proper business processes based on the correct events. The pattern for registering the events with the listener is:
bp_event_trigger.X=eventPreFilter,xPathExpression,bpname,userId
There is an example on this page:
https://www.ibm.com/support/knowledgecenter/SS3JSW_5.2.0/com.ibm.help.aft.doc/SI_AFT_InternalEvent.html

Hideous performance using Azure Mobile Services MobileServiceSyncTable

I have a mobile service sync table that is giving me absolutely HORRENDOUS performance.
The table is declared as:
IMobileServiceSyncTable<Myclass> myclassTable;
this.client = new MobileServiceClient("my url here");
var store = new MobileServiceSQLiteStore("localdb.db");
store.DefineTable<Myclass>();
await this.client.SyncContext.InitializeAsync(store);
this.myclassTable = client.GetSyncTable<Myclass>();
Then later, in a button handler, I'm calling:
this.myclassTable.ToCollectionAsync();
The problem is, the performance is horrific. It takes at best minutes and most times just sits there indefinitely.
Is there anything in the above that I’ve done that would explain why performance is so absolutely terrible?
this.myclassTable.ToCollectionAsync();
For an IMobileServiceSyncTable, the above method executes the SELECT * FROM [Myclass] SQL statement against your local SQLite database.
The problem is, the performance is horrific. It takes at best minutes and most times just sits there indefinitely.
AFAIK, when working with offline sync, you invoke the pull operation to retrieve a subset of the server data, then insert the retrieved data into the local store table. A call such as await this.myclassTable.PullAsync(...) sends a request that retrieves the server data with a MaxPageSize of 50, and the client SDK sends additional requests to check whether there is more data and pulls it automatically.
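For example, a minimal sketch of a pull followed by the local query (the query ID "allMyclassItems" is an arbitrary placeholder; supplying one enables incremental sync):
// Pull only this table's records from the server into the local store
await this.myclassTable.PullAsync(
    "allMyclassItems",                 // query ID, enables incremental sync
    this.myclassTable.CreateQuery());

// This then runs against the already-synced local SQLite data
var items = await this.myclassTable.ToCollectionAsync();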
In summary, I would recommend reviewing your code to locate the specific call that causes this poor performance. You could also add diagnostic logging, or capture network traces via Fiddler, to troubleshoot the issue.

Best practice for bulk update in DocumentDB

We have a scenario where we need to populate the collection every hour with the latest data, whenever we receive the data file in a blob from external sources, and at the same time we do not want to impact live users while updating the collection.
So we have done the following:
Created 2 databases, with collection 1 in both databases
Created another collection in a different database (a configuration database) with Active and Passive properties, which have Database1 and Database2 as their values
Now, our web job runs every time it sees the file in the blob; it checks this configuration database to identify which database is active and which is passive, processes the XML file, and updates the collection in the passive database, as that one is not used by the live feed; once it is done, it swaps the Active and Passive values
Our service always checks which database is active and which is passive, and fetches the data accordingly to show to the user
As we have to delete the old data and insert the new data in the web job, we wanted to know: is this the best design we could have come up with? Does deleting and inserting the data cost anything? Is there a better way to do bulk deletes and inserts, as we are doing them sequentially now?
Is this the best design we could have come up with?
As David Makogon said, with your solution you need to manage and pay for multiple databases. If possible, you could create the new documents in the same collection and control which documents are active in your program logic.
Does deleting and inserting the data cost anything?
Each operation/request consumes request units, which will be charged. For Request Unit and DocumentDB pricing details, please refer to:
What is a Request Unit
DocumentDB pricing details
Is there a better way to do bulk deletes and inserts, as we are doing them sequentially now?
A stored procedure provides a way to group operations like inserts and submit them in bulk. You could create the stored procedure and then execute it from your WebJobs function.
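A minimal C# sketch, assuming the DocumentDB .NET SDK and a stored procedure named bulkImport already created on the collection (the account URI, key, database, and collection names are placeholders):
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents.Client;

static async Task BulkImportAsync(object[] newDocuments)
{
    // Placeholder endpoint and key
    var client = new DocumentClient(
        new Uri("https://myaccount.documents.azure.com:443/"), "authKey");

    Uri sprocUri = UriFactory.CreateStoredProcedureUri(
        "MyDatabase", "MyCollection", "bulkImport");

    // Pass the whole batch as the stored procedure's first argument;
    // the sproc groups the inserts server-side and returns a count
    int imported = await client.ExecuteStoredProcedureAsync<int>(
        sprocUri, new dynamic[] { newDocuments });

    Console.WriteLine($"Imported {imported} documents");
}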

Simplest possible BAM Scenario

I’m trying to set up a very simple BAM scenario within BizTalk Server 2013R2 upon which to build, involving tracking just the time of all incoming messages processed by a port.
To this end I have:
• Within Excel, created an Activity Definition (called SimpleReceiveTest) containing a single Item called ReceiveTime of type milestone (date/time), and a View Definition (also called SimpleReceiveTest) containing just this Activity Definition and Item
• Imported this BAM definition spreadsheet using bm.exe (see the commands after this list)
• Added view rights to SimpleReceiveTest, again using bm.exe
• Launched the Tracking Profile Editor, imported the BAM Activity Definition, and mapped ActivityID = MessageID and ReceiveTime = PortStartTime by drag and drop from the Messaging Property Schema
• Set the Port Mappings for MessageID and PortStartTime to relate to a test Receive Port ReceivePort1 that I am using for testing. This is using a pass-through pipeline
• Saved and applied the above Tracking Profile
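For reference, the two bm.exe steps above can be run from a BizTalk command prompt roughly as follows (the spreadsheet file name and account name are placeholders for this scenario):

REM Deploy the BAM definition exported from Excel
bm.exe deploy-all -DefinitionFile:SimpleReceiveTest.xls

REM Grant view permissions on the SimpleReceiveTest view
bm.exe add-account -AccountName:MYDOMAIN\BamUser -View:SimpleReceiveTest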
It is my understanding that for any message received on port ReceivePort1 I should now get a tracking activity created. However, this is not happening: there are no records in any of the BAM tables/views and no data is available within the BAM Portal.
I have tried restarting the hosts, and have verified that the TDDS_FailedTrackingData table is empty, there is nothing relevant in the event log, a Tracking host is running and the SQL Agent Jobs are running. I have also tried running these jobs manually.
Have I missed something, and am I correct in my expectation that this simple scenario should create tracked activities for any messages passing through the Receive Port? If so, what can I try to further diagnose this?
Now fixed: it's actually a bug in vanilla BizTalk Server 2013 R2 when using a standard pipeline, which has been fixed in CU2.
FIX: BAM tracking doesn’t work when you use the XMLReceive or a custom pipeline in BizTalk Server
