AIF or Data Migration Framework [AX 2012]

I was importing some entities into AX 2012 using AIF, consuming the web services from a C# ASP.NET application.
I have already done this for Customers, Vendors, Workers, and the Chart of Accounts, and I am now starting on General Journals.
For some customizations I could find a workaround using the AIF Document Service Wizard:
creating the DUNS number using a service for the DirDunsNumber table, then associating the customer with the newly created DUNS number.
The Products data migration will need a lot of customizations like this.
This month I heard the announcement of a new framework (the Data Migration Framework), which is still in beta.
I would like to know whether the Data Migration Framework would cover all of these customizations.
What are the advantages of this new framework over AIF?
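For context, consuming an AIF document service from C# looks roughly like the sketch below. The proxy types (CustomerServiceClient, CallContext, AxdCustomer, and so on) are generated from the service's WSDL when you add the service reference, so the exact names here are illustrative:

```csharp
// Hedged sketch of calling the standard AIF customer document service.
// All proxy type and member names are generated from the WSDL and may
// differ in your project - treat them as illustrative.
var context = new CallContext { Company = "CEU" }; // target AX company

var custTable = new AxdEntity_CustTable
{
    AccountNum = "C-000042",
    CustGroup  = "10",
    Currency   = "USD"
};

var document = new AxdCustomer { CustTable = new[] { custTable } };

var client = new CustomerServiceClient();
EntityKey[] keys = client.create(context, document); // keys of the created records
client.Close();
```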

The advantage over AIF? Then I assume you are referring to the document services. Well, the first big difference is the purpose of the two frameworks:
DMF is for migrating large amounts of data and should be built to handle data imports efficiently (I stress 'should' because, as of today, the DMF code is not particularly optimized for performance as-is). It uses SQL Server to import the data into staging tables, which should be faster than AIF.
AIF is for building services to integrate with AX and is intended to handle 'messages' rather than migrations. The big disadvantage of AIF is the format: it uses SOAP XML validated by a schema, which forces the messages into a certain structure over which you as a developer have little control. AIF also, in my opinion, adds overhead and is not very transparent as a framework; for example, you have to apply performance optimizations such as caching the number of decimals of EDTs, because looking that up takes too long.
In your case I would go with custom services. They can be built and deployed relatively easily compared to AIF services, and you have more control over the data contracts you provide.
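To illustrate, consuming an X++ custom service from C# is plain WCF. The service and operation names below are hypothetical stand-ins for whatever you expose yourself (in X++ you would mark the method with SysEntryPointAttribute):

```csharp
// Hedged sketch: calling a hypothetical X++ custom service that creates a
// DUNS number. "DunsServiceClient" and "createDunsNumber" are illustrative
// names for a service you would write and deploy yourself.
var context = new CallContext { Company = "CEU" };
var client = new DunsServiceClient();
try
{
    string duns = client.createDunsNumber(context, "123456789");
    System.Console.WriteLine("Created DUNS record: " + duns);
}
finally
{
    client.Close();
}
```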
So in short:
Use DMF for migration and import of large data sets.
Use custom services to provide services for communicating with Dynamics AX 2012; these are WCF services and thus familiar to non-AX developers.
Use AIF if you are forced to do so ;-)
If you want to read about AIF (and Dynamics AX 2012 services in general), you can have a look at a book that I wrote together with a colleague of mine: Dynamics AX 2012 Services

The Data Migration Framework is found here.
As the name implies, its primary focus is "data migration": a one-time move of data from another system into AX.
AIF is more for the day-to-day transfer of business data (for example, invoices).

Related

Calculating helper columns in an R script on Azure Machine Learning so they can later be added to a tabular model

I am creating a flow for importing data so that some aggregate columns can be uploaded to an Azure SQL database and later to a tabular model. I will describe the flow below so that someone can comment on its pros and cons.
At this stage of development the flow is:
1. The user imports a CSV file through my web service (ASP.NET Core 2.1) into an Azure SQL database; for the import I am using the SQL bulk copy library in .NET Core (a sketch of this step follows the list). The web service and the database are hosted in Azure. Some of the data imports take about 20 minutes.
2. When the data import is finished, I call an Azure Machine Learning web service that calculates helper columns, so that later I can retrieve data from the tabular model more easily and efficiently with MDX queries. These helper columns tell, for example, whether users were active in the previous month.
3. When the R script finishes its calculation, it updates the Azure SQL database table with the new column(s).
4. When the columns are updated in the database, I tell Azure Analysis Services to refresh the model. This cannot be done from the .NET Core service because .NET Core does not support ADOMD.NET, so I created another web service (.NET Framework 4.7) that performs the refresh automatically (see the second sketch below).
5. Finally, the new data appears in the tabular model, and I can retrieve it using MDX queries through the ADOMD.NET library.
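The bulk load in step 1 is essentially the following minimal sketch, using SqlBulkCopy from System.Data.SqlClient (the table name is illustrative):

```csharp
using System.Data;
using System.Data.SqlClient;

// Minimal sketch of step 1: bulk-loading a DataTable built from the CSV
// into Azure SQL. "dbo.UserActivity" is an illustrative table name.
static void BulkLoad(string connectionString, DataTable rows)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulk = new SqlBulkCopy(connection))
        {
            bulk.DestinationTableName = "dbo.UserActivity";
            bulk.BatchSize = 10000;   // commit in chunks
            bulk.BulkCopyTimeout = 0; // no timeout; these imports can run ~20 minutes
            bulk.WriteToServer(rows);
        }
    }
}
```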
Please tell me if there is a better solution for this flow.
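For reference, the refresh in step 4 boils down to a call like this sketch, assuming the Microsoft.AnalysisServices.Tabular (TOM) client library on the full .NET Framework; the connection string and model name are illustrative:

```csharp
using Microsoft.AnalysisServices.Tabular;

// Sketch of step 4: asking Azure Analysis Services for a full model refresh.
static void RefreshModel()
{
    using (var server = new Server())
    {
        // Illustrative connection string and database name.
        server.Connect("Data Source=asazure://region.asazure.windows.net/myserver;" +
                       "User ID=app@tenant.onmicrosoft.com;Password=...");

        Database database = server.Databases.FindByName("MyTabularModel");
        database.Model.RequestRefresh(RefreshType.Full); // queue the refresh
        database.Model.SaveChanges();                    // execute it
    }
}
```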
Azure SQL Database supports in-memory R execution for feature engineering, training models, and inference. It is currently in preview but will be generally available soon: https://learn.microsoft.com/en-us/azure/sql-database/sql-database-machine-learning-services-overview
Also, at //BUILD, Microsoft announced a serverless performance profile for Azure SQL Database, which is perfect for low-frequency jobs like this.
Hopefully this can simplify your workflow dramatically.
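For a flavor of what that looks like: in-database R runs through the sp_execute_external_script system procedure, which you can call from the data access code you already have. A sketch (the R snippet and table are illustrative):

```csharp
using System.Data.SqlClient;

// Sketch: running an illustrative R snippet inside Azure SQL via
// sp_execute_external_script (requires Machine Learning Services).
static void RunInDatabaseR(string connectionString)
{
    const string sql = @"
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(active = InputDataSet$logins > 0)',
    @input_data_1 = N'SELECT logins FROM dbo.UserActivity'";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
            while (reader.Read())
                System.Console.WriteLine(reader[0]);
    }
}
```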

What is the best way to migrate data from Oracle to Dynamics AX?

In order to migrate data from Oracle to Dynamics AX, master data and open transactional data should be migrated to the system: general ledger, vendors, customers, open payments, open invoices, POs...
However, there are several methodologies for migrating data to the target database.
For example:
using DMF
SQL Server Integration Services
creating an AX job for each module and class
What is the best methodology to achieve this goal?
Please advise me; I am also looking for the most reliable approach to avoid any failure, since financial data is involved.
Thanks.
There is no guaranteed way, and the project will be of a much bigger scale than you think. A year ago we migrated from a heterogeneous system based mostly on Oracle (but with different databases and systems) to AX 2012 R2. The migration project alone took more than a year of very iterative development; in the end we were doing test migrations every 3 weeks for logistics data (coupled with live testing of the migrated data), and we did financial migrations about every other month.
Generate export data from the original system in the desired structure. This includes all the transformations. We used a staging Oracle database with views and/or staging tables. The SQL scripts and procedures used to generate the data are a project in themselves; don't forget to use versioning. Partial data validation was part of these scripts as well (especially important for financials; data in our original financial system differed a lot from how AX needs it).
Create a tool that can semi-automatically export groups of data from the staging database to CSV files (a sketch of such a tool follows below).
Use a combination of jobs, DIXF, and XLS imports. The XLS imports were used for ledgers (you can quite easily import a CSV file into a journal and check everything before you post it). DIXF was used mainly for customers, vendors, and addresses. The rest was done using jobs.
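To illustrate point 2, the export tool can be as simple as the following sketch, here using the ODP.NET managed driver (view and file names are illustrative, and proper CSV quoting is deliberately left out):

```csharp
using System.IO;
using Oracle.ManagedDataAccess.Client; // ODP.NET managed driver (NuGet package)

// Sketch of the semi-automatic export: dump one staging view to a CSV file.
// No quoting or escaping is done - acceptable only for controlled staging data.
static void ExportViewToCsv(string connectionString, string viewName, string path)
{
    using (var connection = new OracleConnection(connectionString))
    using (var command = new OracleCommand("SELECT * FROM " + viewName, connection))
    using (var writer = new StreamWriter(path))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            var fields = new string[reader.FieldCount];

            for (int i = 0; i < reader.FieldCount; i++)
                fields[i] = reader.GetName(i);
            writer.WriteLine(string.Join(";", fields)); // header row

            while (reader.Read())
            {
                for (int i = 0; i < reader.FieldCount; i++)
                    fields[i] = reader.IsDBNull(i) ? "" : reader.GetValue(i).ToString();
                writer.WriteLine(string.Join(";", fields));
            }
        }
    }
}
```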

DB advice and best practices for ASP.NET based web site?

I have a web site I developed for displaying the results of some data analysis work I did. It relied on ASP.NET for the front end and connected to a MySQL back end utilising Entity Framework and LINQ extensively.
I chose MySQL because I personally have used it in the past and like the database, but this resulted in some serious issues when I had to deploy it to a hosting provider (incompatible connectors, access rights, etc.)
I am now getting ready to redevelop and expand the site and I am looking for some advice to avoid the issues I had last time.
The new DB has to serve two roles. The first is to be a data provider for the charts that are the output of the analysis work. These tables are straightforward, almost flat files; there are 10 of them. One table has roughly 200k rows of data, and the rest have approximately 1,200 rows each. There are few references or queries between the tables, though there are some. This data is updated periodically by a back-end process and does not need to be added to or edited by the user.
The second role of the DB would be as a basic persistent store for a standard user management system. It would need to manage data for adding/removing clients, user names, passwords, access rights, etc. No financial data or super-secure data is involved.
What database approach would you recommend that would give me easy deployment and management at a web host and still allow me to use both Entity Framework and LINQ effectively?
Second, what tools and frameworks should I consider as I rewrite this system? It is very graphical and data-focused. Presentation of charts and information is the key factor in this site. Are there any new technologies or frameworks that would add specific value to what I am doing?
A few notes. I am a one-man shop and I maintain the entire system myself, so I am less worried about enterprise-level frameworks than other people. My focus is on the easy development and deployment of the site. Maintainability is also a key factor.
I am also an experienced C# developer, but new to ASP.NET and the web side of things. The first version of this site was a big learning experience. It was good, but I wasted an enormous amount of time on just understanding new technologies and approaches. I am very open to learning, but I can't afford the time to get my head around a complete paradigm shift.
I am looking forward to your thoughts, thanks.
Doug
The natural choice would be SQL Server. I'd guess from your description that you are well under the maximum database size of the SQL Server Express edition. It of course supports Entity Framework, and the drivers are part of the .NET Framework, so there is no problem with third-party assemblies here.
This will also open up the possibility of hosting your app in the cloud (Azure) later on, because SQL Azure is in fact a Microsoft SQL Server, so there is no overhead in supporting it.
Regarding user management: ASP.NET has this all built in (Membership, Role, and Profile providers), including a SQL provider for which default tables are available. So you don't have to design the tables yourself, and it runs very naturally on SQL Server.
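As a sketch of how little is needed: an Entity Framework context against SQL Server Express is mostly a connection string (the entity and names below are illustrative):

```csharp
using System;
using System.Data.Entity; // Entity Framework 6

// Illustrative entity for one of the chart data tables.
public class ChartPoint
{
    public int Id { get; set; }
    public DateTime Timestamp { get; set; }
    public double Value { get; set; }
}

public class SiteContext : DbContext
{
    // "SiteDb" refers to a connection string in web.config, e.g.
    // Data Source=.\SQLEXPRESS;Initial Catalog=SiteDb;Integrated Security=True
    public SiteContext() : base("name=SiteDb") { }

    public DbSet<ChartPoint> ChartPoints { get; set; }
}

// LINQ queries then work unchanged, for example:
// var recent = context.ChartPoints.Where(p => p.Timestamp > cutoff).ToList();
```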

WPF - a new way to develop my ASP.NET app?

I am a student, and last semester I developed an ASP.NET application, "Payment Helper for School". This semester I will take the subject "Graduation Project", and I am thinking about developing my application further.
In my old app I used:
ASP.NET Web Forms
an MSSQL 2008 database
LINQ to query the DB
I reached a good level with the technologies above, and now I want to learn others (WPF and XML), because I see that employers demand these, and to create a "cheaper" data layer ;)
In the new version I am thinking of using:
WPF forms instead of ASP.NET
XML (XSLT etc.) instead of MSSQL
LINQ as before, but LINQ to XML
Do you think this is a good direction for my app? I am now learning WPF from tutorials, but I want to learn it in practice. I am also thinking about NHibernate, which interests me, but that would be too much :/
What advice can you give me on developing an app that relies on WPF and an XML database?
Doing this would move your application from web-based to Windows-based. Think of WPF as the successor to WinForms.
If you want to keep this application web-based, you could look at Silverlight; it's a subset of WPF.
I wouldn't change your storage layer from SQL to XML if the app is data-driven; otherwise you are going to have to handle concurrency, file locking, etc., which SQL Server does for you by default (row locking and so on). That is, if there is a lot of writing and updating of data, SQL Server is a much cleaner option than XML. In this context, think of XML more as a language-independent way of passing data around and of storing simple data structures that are primarily read-only.
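For the read-mostly case, LINQ to XML is pleasant to work with. A small sketch with made-up file and element names:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

// Sketch: reading read-only payment data with LINQ to XML.
// The file layout and element names are made up for illustration.
var doc = XDocument.Load("payments.xml");

var overdue =
    from p in doc.Descendants("payment")
    where (decimal)p.Element("amountDue") > 0m
    select new
    {
        Student = (string)p.Attribute("student"),
        Due     = (decimal)p.Element("amountDue")
    };

foreach (var item in overdue)
    Console.WriteLine("{0}: {1}", item.Student, item.Due);
```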
If this is a graduation project, perhaps focus instead on design methodologies: MVC, or MVVM if you do choose Silverlight. That is a plus for potential employers and offers plenty of depth for a dissertation.
Edit: I'd argue XML would be the more expensive data layer! For the reasons above it is going to be more complex and therefore need more development time. Additionally, SQL Server Express is free, and the advanced edition with full-text search and SSRS is also free; we have loads of clients that use it.
Edit 2: another option would be to use Flex. It would be a completely independent technology on the presentation layer (and from a university point of view this could be good, as you may lose marks or not even be allowed to reuse material from a previous project). Then you could use a combination of XML and JSON to pass data between the Flex layer and .NET on the server. Just another thought for you!

Getting data out of PeopleSoft

We have a PeopleSoft installation and I am building a separate web application that needs to pull data from the PeopleSoft database. The web application will be on a different server than PeopleSoft, but the same internal network.
What are my options?
This one's an oldie, but it may still be of interest.
PeopleSoft has its own schema within the host database (Oracle, SQL Server, DB2, etc.): the PSxxx tables, e.g. PSRECDEFN is the equivalent of Oracle's DBA_TABLES. These tables should not be touched by any external code. The application data is stored in PS_xxx tables, e.g. PS_JOB; those tables can be read and updated by any SQL code.
Many batch programs in PeopleSoft (e.g. Application Engines, COBOL, or SQRs) access the tables directly, and this is the fastest way to get data into or out of the database. However, PeopleSoft has quite a rich application layer which is bypassed when doing direct SQL, and that layer must be replicated in direct SQL code, especially for inserts or updates: there may be updates to other tables, calculations, or increments of database-stored counters.
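For a read-only pull from your web application, the direct-SQL route is ordinary ADO.NET. A sketch (the DSN, credentials, and the PS_JOB column list are illustrative; verify them against your own schema):

```csharp
using System;
using System.Data.Odbc; // ODBC keeps this agnostic to the host DB (Oracle, SQL Server, DB2)

// Sketch: read-only query against a PeopleSoft application (PS_) table.
using (var connection = new OdbcConnection("DSN=PSPRD;Uid=ro_user;Pwd=...;"))
using (var command = new OdbcCommand(
    "SELECT EMPLID, EFFDT, DEPTID FROM PS_JOB WHERE EMPLID = ?", connection))
{
    command.Parameters.AddWithValue("@emplid", "12345"); // positional for ODBC
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0} {1} {2}", reader[0], reader[1], reader[2]);
    }
}
```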
To determine how to do this, one must look through the PeopleCode (a VB6-like interpreted language) and the page design (via Application Designer), and use the PeopleCode and SQL trace tools. These days the application layer is huge, so this can be a lengthy task for non-trivial pages. PeopleSoft groups related pages into "components", and all pages in a component are saved at the same time.
Component Interfaces were introduced with PeopleTools 8 as a means of avoiding all of this. Using a generator within Application Designer, a Component Interface is generated from the component. For many components these can be used to access the pages as a user would, and they can be called from PeopleCode programs, and therefore from App Engine programs and via the Integration Broker. They can also be wrapped in Java code and accessed directly by code able to execute against the app server, with a web service wrapper. This method is best for low-volume transactions; heavy extracts work better with native SQL.
The online development and tracing tools in PeopleSoft are pretty good, and the documentation is excellent (although quite extensive) and available at: http://download.oracle.com/docs/cd/E17566_01/epm91pbr0/eng/psbooks/psft_homepage.htm
If you are just looking to bring data out of a given component, the easiest way is to turn on the SQL trace (under the Utilities menu in PeopleSoft) and bring up some records for the component. Wading through the trace file will give you a good idea of what to do, and much of the SQL can be cut and pasted. Another approach is to find an existing report that is similar to what you are trying to do and lift its SQL.
Having a PeopleSoft business analyst on hand to help you develop the requirements wouldn't hurt either.
Yes: Integration Broker is PeopleSoft's proprietary implementation of a publish/subscribe mechanism, speaking XML. You could of course just write code that goes against the database using JDBC or OLE/ODBC; nothing keeps you from doing this. However, you must then understand the PeopleSoft database schema so that you pull from, or insert/update/delete, all of the proper data; PeopleSoft normally takes care of this for you.
Also, check out Component Interfaces: they are exposed as an API to Java or C/C++.
I guess it depends on your requirements and which version of PeopleSoft you're on.
Do you want real-time lookups? In that case you'll want to look at web services/Integration Broker.
If you want a batch/bulk export, then a scheduled App Engine would do the trick.
The best way is to use Integration Broker (IB) services to expose PeopleSoft data to external applications. The external application can then access the PeopleSoft IB services as XML over HTTP, which lets you use any widely available XML parser.
The problem with component interfaces, as opposed to Integration Broker, is that component interfaces tend to be much slower than direct DB access from within IB service PeopleCode. Also, future additions to the component behind a component interface sometimes tend to 'break' the interface.
For more details on PeopleSoft Integration Broker, you can access the online documentation at http://docs.oracle.com/cd/E26239_01/pt851h3/eng/psbooks/tibr/book.htm
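Consuming such a service from the external application is then just HTTP plus XML parsing. A sketch (the gateway URL and operation name are placeholders that your PeopleSoft administrator would provide when publishing the service operation):

```csharp
using System.Net.Http;
using System.Xml.Linq;

// Sketch: fetching XML from a published Integration Broker service operation.
// The URL below is a placeholder, not a real endpoint.
static XDocument FetchEmployee(string emplId)
{
    using (var http = new HttpClient())
    {
        string url = "https://psoft.example.com/PSIGW/HttpListeningConnector"
                   + "?OperationName=EMPLOYEE_GET&EMPLID=" + emplId;
        string xml = http.GetStringAsync(url).GetAwaiter().GetResult();
        return XDocument.Parse(xml);
    }
}
```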
Going directly to the database means you have to re-create the application logic; see my longer answer above. You can do this for simple pages, but otherwise using a component interface is the way to go.
You can also write an SQR process for bulk data extraction. SQR creates an output file which the other application can pick up, and it is faster than Application Engine programs because it performs most of its operations in memory.
