During the design and development phase, you will need to work actively on data migration. The key steps during this phase include data mapping and transformation for the identified migration elements, creating tests and a go-live plan for migration, and developing the scripts, templates, and test datasets for migration.
The following are the key steps for managing the data mapping of the source and target systems:
During the design and development phase of the project, you should develop the overall plan for migrating the identified data elements. The following is a list of items to consider when developing your plan:
In this section, we will talk about the most commonly used tools, including the Data Import/Export Framework (DIXF) shipped with Dynamics AX. Microsoft has made huge investments in the Data Import/Export Framework, and it has been a big step toward solving the challenges of data migration. Although the stable and rich features introduced in DIXF have reduced the need for other migration tools, we will discuss several options.
There are several factors that you should consider:
It is often easier to simply enter the data than to migrate it, provided that the dataset to be migrated contains only a few hundred records that can be entered in a timely manner and with relatively few errors. Part of the consideration of whether to manually load the data is the required timing—if you can do it ahead of time or post-release, manually entering the data is a good method. If the data load needs to take place during the downtime and on a critical path, then it could be tricky.
The Data Import/Export Framework (DIXF) is an extension module that helps export and import data in Dynamics AX. DIXF is now an integral part of Dynamics AX R3. It can also be installed separately and used with Dynamics AX R2 and the RTM release.
The common pattern for the use of DIXF is shown in the following diagram:
The framework is commonly used in the following areas:
The framework is not recommended in the following areas:
Let's first understand the common terms used with DIXF. They are explained as follows:
The following diagram shows the architecture of the Data Import/Export Framework. It can basically be understood as a Source | Staging | Target process:
The Data Import/Export Framework creates a staging table for each entity in the Microsoft Dynamics AX database where the target table resides. Data that is being migrated is first moved to the staging table. There you can verify the data, and perform any cleanup or conversion that is required. You can then move the data to the target table or export it.
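The source-to-staging-to-target flow described above can be sketched in a few lines. This is an illustrative sketch only, not DIXF code; the entity fields (such as `AccountNum`) and function names are hypothetical examples of the pattern.

```python
# Illustrative sketch of the Source -> Staging -> Target pattern used by DIXF.
# Field and function names are hypothetical, not actual DIXF objects.

def load_to_staging(source_rows):
    """Copy raw source rows into staging, tagging each row with a status."""
    return [{"data": row, "status": "staged", "errors": []} for row in source_rows]

def validate_staging(staging):
    """Verify staged rows; flag problems so they can be cleaned up before the target load."""
    for rec in staging:
        if not rec["data"].get("AccountNum"):
            rec["status"] = "error"
            rec["errors"].append("Missing AccountNum")
    return staging

def move_to_target(staging):
    """Move only clean rows to the target; errored rows stay in staging for cleanup."""
    return [rec["data"] for rec in staging if rec["status"] == "staged"]

rows = [{"AccountNum": "C0001", "Name": "Contoso"}, {"Name": "No account"}]
staging = validate_staging(load_to_staging(rows))
target = move_to_target(staging)  # only the complete row reaches the target
```

The point of the intermediate step is visible here: the bad row is caught and held in staging, where it can be corrected and re-processed, instead of failing the target load.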
The following diagram shows the steps that are required to import or export data in Microsoft Dynamics AX:
This section describes the usage of AIF in the context of data migration:
The Application Integration Framework is generally used in the following areas:
The framework is not recommended in the following situations:
This section describes the out-of-the-box features of Dynamics AX:
Custom X++ code is commonly used for customized migration actions; for example, applying cash discount schedules to specific accounts after migrating vendor invoices, or applying custom payment schedules to split the due dates of open AR records. It is also used for large datasets migrated from SQL, such as a need to release 10 million product variants as part of migration.
It is not needed for entities that are already supported by DIXF, where DIXF can handle the required volume.
As the name suggests, you need to write custom code to import the data from a file and apply business rules during the import. In some cases, writing custom code/jobs to import SQL tables and CSV/flat files for custom processes may be an easier and faster way to approach data migration.
As this is custom code, you have more control (and responsibility) over the validations you want to add or bypass, depending on the pre-validation of the source data; standard Dynamics AX business logic is not applied to your data.
In projects that need millions of records to be migrated, you may not be able to afford running every record through the Dynamics AX business logic, so you need to take a custom route. You can use the power of SQL to speed up performance, load data directly into Dynamics AX, and use custom code in Dynamics AX to further process the data. For example, you can import data into a staging table using direct SQL statements; you can work around RecId assignment by reserving RecIds for the custom staging table through X++, and write an X++ class (which can be multithreaded) for further processing of the data.
To use such an approach, you need a deep understanding of the Dynamics AX processes, its data structure, and how data flows across tables.
It is commonly used for the following purposes:
An Excel add-in is not useful for larger datasets.
This section includes tips and tricks based on data migration experience.
Define template spreadsheets for common data migration scenarios and collect sample data from the business early on. Use this data for CRP. This will help to identify problems sooner and eliminate surprises later.
If possible, extract the source data into SQL tables. It is easier to clean and transform data using SQL queries than by manipulating flat files. Data validation can be done directly on the source staging table, which avoids wasting time on repeated load failures caused by bad data.
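A minimal sketch of this idea, using Python's built-in `sqlite3` as a stand-in for a SQL Server staging database; the table and column names are invented for illustration:

```python
# Sketch: pull flat-file rows into a SQL staging table, then clean and validate
# them with SQL queries instead of manipulating the file directly.
# Table/column names (stg_customers, account, name) are illustrative examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customers (account TEXT, name TEXT, phone TEXT)")
conn.executemany(
    "INSERT INTO stg_customers VALUES (?, ?, ?)",
    [("C0001", "  Contoso  ", "555-0100"),
     ("C0002", "Fabrikam", None),
     (None, "Bad row", "555-0101")],
)

# Transform: trim stray whitespace with one SQL update, not a file edit.
conn.execute("UPDATE stg_customers SET name = TRIM(name)")

# Validate: find bad rows in the staging table before attempting the load.
bad = conn.execute(
    "SELECT COUNT(*) FROM stg_customers WHERE account IS NULL").fetchone()[0]
good_rows = conn.execute(
    "SELECT account, name FROM stg_customers WHERE account IS NOT NULL").fetchall()
```

Because the bad-row query runs before the load, the one broken record never reaches the import, and the whole run does not have to be repeated.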
Even if you may never need a particular field, future versions of AX may remove it or add extra functionality to it. Creating new fields in AX is easy.
Come up with an approach to convert the bulky data pieces ahead of time. For example, in a project where we had to migrate 10 million products, we migrated most of them a week early and maintained products in both systems concurrently during that week. This reduced our downtime over the release weekend.
You may have data elements that are not required at go-live (for example, fixed assets); try to defer their migration to the following weekend.
Set the SQL Server database to the Simple recovery model for faster performance during migration runs. Make sure to set it back when the migration is completed.
Multithreading can help you speed up the process by executing it in multiple threads. However, it can degrade performance if you add more threads than your SQL Server or AOS can handle. Often, the MAXDOP (Maximum Degree of Parallelism) setting on SQL Server needs to be adjusted to arrive at an optimal number of threads that can execute in parallel.
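The batching-plus-workers pattern behind this advice can be sketched as follows. This is a generic illustration, not X++; the batch size and worker count are the tunables the text refers to, and `process_batch` is a hypothetical placeholder for the real per-batch work.

```python
# Sketch: split records into batches and process them with a configurable
# number of worker threads. Throughput typically rises with workers only up
# to what the database/AOS can absorb, then degrades; tune `workers`.
from concurrent.futures import ThreadPoolExecutor

def process_batch(batch):
    # Placeholder for the real per-batch work (e.g., running business
    # logic on a slice of staging rows). Here it just counts rows.
    return len(batch)

def run_parallel(records, workers=4, batch_size=1000):
    """Process `records` in `batch_size` chunks across `workers` threads."""
    batches = [records[i:i + batch_size]
               for i in range(0, len(records), batch_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(process_batch, batches))

total = run_parallel(list(range(10_000)), workers=4)
```

In practice you would benchmark a few values of `workers` against your own SQL/AOS capacity rather than assuming more threads are always better.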
Consider index and statistics maintenance after a bulk load of master data to speed up the dependent processes. If you have indexes that are slowing down your inserts, disable them during the bulk load and rebuild them afterwards.
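The drop-load-rebuild sequence looks like this in miniature, again using Python's built-in `sqlite3` as a stand-in for the real database; the table and index names are illustrative:

```python
# Sketch: drop an index before a bulk insert so each row does not pay the
# index-maintenance cost, then rebuild it and refresh statistics afterwards.
# Table/index names (items, idx_items_id) are illustrative examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (item_id TEXT, name TEXT)")
conn.execute("CREATE INDEX idx_items_id ON items (item_id)")

conn.execute("DROP INDEX idx_items_id")  # disable before the bulk load
conn.executemany(
    "INSERT INTO items VALUES (?, ?)",
    [(f"I{i:05d}", f"Item {i}") for i in range(5000)],
)
conn.execute("CREATE INDEX idx_items_id ON items (item_id)")  # rebuild
conn.execute("ANALYZE")  # refresh statistics for the dependent processes

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
```

On SQL Server the equivalent steps would use `ALTER INDEX ... DISABLE`/`REBUILD` and statistics updates, but the ordering of the steps is the point here.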
Turn off all database logging in AX; leaving it on will significantly slow down the process.
Sometimes, it is easier to perform a basic data migration of standard data in AX and then use SQL to update the values. Ensure that your scripts and steps are well documented.
Sometimes, it is faster to import high-volume data directly into SQL. You can save time by bypassing the Dynamics AX business logic, so you only need to run validations on data quality. Generation of RecIds in Dynamics AX can be tricky. However, it can be done by resources who are well versed in the Dynamics AX architecture (a number sequence is maintained in the system sequences table for each table ID) and who can draw the boundary between what is imported through SQL and what is run through the X++ business logic.
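The RecId reservation idea can be illustrated with a small sketch. The dictionary below stands in for the per-table next-value kept in the system sequences table; the table name, starting value, and function are illustrative assumptions, not actual AX internals.

```python
# Sketch: reserve a contiguous block of RecIds before a direct-SQL load.
# Dynamics AX tracks the next RecId per table ID in a system sequences
# table; the dict below stands in for it. Names/values are illustrative.
import threading

_lock = threading.Lock()
system_sequences = {"CUSTTABLE": 5_637_144_576}  # next free RecId (example)

def reserve_recids(table, count):
    """Atomically advance the sequence by `count`; return the reserved range."""
    with _lock:
        start = system_sequences[table]
        system_sequences[table] = start + count
    return range(start, start + count)

# Assign the reserved ids to the rows that will be inserted via direct SQL.
recids = reserve_recids("CUSTTABLE", 3)
rows = [{"RecId": rid} for rid in recids]
```

Reserving the block up front (in X++, against the real sequence) is what lets the bulk insert itself bypass AX while still producing RecIds that will not collide with ones AX hands out later.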
This is another important area that can easily get out of control if not managed well. You need a good change-tracking mechanism for configuration changes, with designated resources responsible for specific areas making the changes in your golden (stage) environment.
Once you have the golden environment set up, take a backup of your database with all the configuration (premigration) data; this backup can simply be restored every time you start a new migration run. If you have the liberty to set up the production environment ahead of time, all the configurations can be made in production; bring the production data into the migration environment and run migration tests on it. If you cannot set up and use the production environment for configurations, create a golden environment and use it instead.
You can create a template legal entity and use it for setting up new legal entities. You can also copy configurations from one environment to another:
There are certain configurations on the project that you need to review carefully, and you must make decisions for them considering the cross-functional requirements. These are configurations where there is no going back, such as the product dimension group, storage dimension group, tracking dimensions, inventory model group, and so on. Each checkbox has a great impact on the outcome, and once transactions have been posted, you can't change these selections easily. These configurations affect both the financial and supply chain aspects, which is why they need cross-functional reviews.