D365FO Data Migration Strategy


Introduction

Data migration is the process of extracting data from the existing (legacy) systems and mapping it to the requirements of the new system. It is generally the result of introducing a new system or a new home for the data. The business driver is usually an application migration or consolidation in which legacy systems are replaced or augmented by new applications that will share the same dataset. Today, data migrations are often started as firms move from on-premises infrastructure and applications to cloud-based storage and applications in order to optimise or transform their business. The data mapping (or the lack thereof) highlights the transformations required, as well as the cleansing activities needed to ensure the new system(s) are populated with current (and not redundant) data.

These data migration activities are driven by the Key Users: it is the business that owns the existing data, knows best how the data is currently used, and will ultimately own the data in the new system.

Development and testing of the migration is an iterative approach and will be performed a number of times during the test cycle.

The migration strategy aims to do the following:

  • Reduce the amount of effort required close to, and during, the go-live process (cutover).
  • Ensure that the migration processes at go-live can be done quickly and effectively.
  • Ensure that the data is current at the time of go live.
  • Ensure that all parties are confident with the integrity of the data that is being migrated.
  • Provide users with the opportunity to gain confidence in, and understanding of, the data prior to go-live (during CRP).

In order to achieve the above objectives we aim to:

  • Have all master data (see definitions below) entered/uploaded as early as possible.
  • Have templates and processes in place for the migration of any planned transactional data at go-live. These migrations should be fully tested and reconciled with the legacy system; this will be an iterative process.
  • Ensure the deployment site team has confidence that the take-on process is accurate and is able to sign off the data take-on before the system is put into production.

Migration Planning

A detailed data migration plan is the essential first step in a successful data migration project to select, prepare, extract, transform and transfer data of the correct form and quality. At a high-level, data migration consists of the following activities:



Analyse

The data is first analysed, and the relevant information is retrieved from the data sources in a defined pattern.

Pre-process

In this step the data is transformed into a format that can be processed more easily and effectively for the purpose of the user. A number of tools and methods are used for pre-processing, including (a small sketch of these steps follows the list):

  • sampling, which selects a representative subset from a large population of data;
  • transformation, which manipulates raw data to produce a single input;
  • normalisation of the data against the data templates;
  • organising the data for more efficient access.
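
As an illustration of these steps, here is a minimal pre-processing sketch in Python/pandas, assuming the legacy extract is a CSV file; the file names and column names are hypothetical.

    import pandas as pd

    # Load a hypothetical legacy extract (file and column names are illustrative).
    raw = pd.read_csv("legacy_customers.csv", dtype=str)

    # Sampling: keep a representative subset for early test cycles.
    sample = raw.sample(frac=0.10, random_state=42)
    sample.to_csv("customers_sample.csv", index=False)

    # Transformation: combine raw columns into the single input the template expects.
    raw["CustomerName"] = raw["FirstName"].str.strip() + " " + raw["LastName"].str.strip()

    # Normalisation against the data template: trim, upper-case codes, fix the date format.
    raw["CurrencyCode"] = raw["CurrencyCode"].str.strip().str.upper()
    raw["CreatedDate"] = pd.to_datetime(raw["CreatedDate"], dayfirst=True).dt.strftime("%m/%d/%Y")

    # Organise for more efficient access: de-duplicate and sort on the natural key.
    clean = raw.drop_duplicates(subset="CustomerAccount").sort_values("CustomerAccount")
    clean.to_csv("customers_preprocessed.csv", index=False)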

Extract

This process involves retrieving the data from the data packages. The data is extracted so that it can be processed further and migrated to a data repository (such as a data warehouse or a data lake). It is common to transform the data as part of this process.

Load stage

Once the data has been extracted and transformed into the specified format, the packages load it into the staging tables.

Validate

Once the data has been uploaded to staging, the data validation checklist process is executed to validate it. This validation is performed at the table level as well as at the data entity level.
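
A minimal count-and-compare sketch of such a check, assuming the source extract and an export of the staging table are both available as CSV files; file and column names are hypothetical.

    import pandas as pd

    # Illustrative file names: the source extract and an export of the staging table
    # (e.g. downloaded from the D365FO data management workspace after import).
    source = pd.read_csv("customers_source.csv", dtype=str)
    staging = pd.read_csv("customers_staging_export.csv", dtype=str)

    print(f"Source rows : {len(source)}")
    print(f"Staging rows: {len(staging)}")

    # Record-level check on the natural key: anything in the source that never
    # reached staging needs to be investigated and re-imported.
    missing = set(source["CUSTOMERACCOUNT"]) - set(staging["CUSTOMERACCOUNT"])
    if missing:
        print(f"{len(missing)} records missing from staging, e.g. {sorted(missing)[:5]}")
    else:
        print("All source records are present in staging.")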

Target

Once validation is complete, the data is moved from the staging tables to the target tables.


Data Types

The legacy system data needs to be categorised as below; the migration activity is organised around these categories.

  • Configuration Data

Configuration data contains the company and system setup information. In D365FO this is called the golden configuration, which usually lives in the Gold environment. Starting from a vanilla system, the base company setup needs to be completed before the DM activity begins.

e.g. system and module parameters

  • Setup Data

Setup data is an extension of the setup and configuration and can have multiple values. Depending on the volumes, this information can be set up manually, or extracted from the old systems or from site "companies" that are already live and then loaded into the new site company account.

e.g. payment terms, shipping terms

  • Transactional Data

Transactional data is the data currently in the legacy system that will be extracted, transformed to adhere to the new validation rules, and then loaded into the new system.

As part of the data migration strategy [Company] must decide whether it will be only current or open transactions that will be migrated or if historical data will be migrated as well. This is defined below in Table 5.

e.g. postings, orders, order lines, journals

  • Master Data?

Master data is data that changes infrequently. The master data scope will need to be re-evaluated and adjusted for each site deployment.

e.g. customers, vendors, products

Process and best practices

The data migration process begins with the configuration and setup data, which is copied from the base data environment into the Data Migration (and all other) environments periodically. The master and transactional data are then migrated into the Data Migration environment.

Once the data migration has been completed successfully, the Key Users test, validate and verify the data so that they are able to sign off the data take-on at go-live. The Key Users liaise with the Data Migration lead to correct extracts, templates or scripts if needed. Migration is iterative, and key users will need to validate the data after each iteration and initiate any necessary changes.

The sections below detail the overall process and best practices.

Environments

For any standard data migration, two environments are mainly required, to keep configuration and transactional data isolated. The environment called 'Gold' holds the configuration data; as per standard best practice, this environment should not be used for any transactional data upload.

The second environment, called 'DM', is used to hold master and transactional data on top of the configuration, which means the base data should be copied from 'Gold' before any other data is imported.

Below is the high-level data flow

  1. Gold: this environment holds the configuration and setup data.
  2. Data Migration/Test/Staging environment: this environment holds the master and transactional data.

Data Flow

During implementation, data flows into the FO system from the various environments: configuration is seeded from Gold, master and transactional data are loaded and validated in the DM environment, and verified data packages are promoted towards the test and production environments.


Migration steps

Following are the steps for the data migration:

  1. Template – Dynamics 365 Finance and Operations has ready-made data templates that help you export data. These templates are updated with each new release. To get them, go to LCS (Microsoft Dynamics Lifecycle Services) > Shared asset library > Data package, download the template to your PC, and import it into D365FO.

1. Default data templates: Dynamics 365 Finance and Operations comes with a set of standard data templates that include commonly used entities, sequenced to provide a starting point for users to export data from the system. Standard templates are updated with each release of D365FO to reflect changes in the application.

2. Custom templates: Custom data templates can be created either manually or by modifying default templates to meet specific business needs. For instance, you can load template ID '190 – My Template' from LCS and update it with the necessary entities based on your requirements. If you need to change the template ID, you have the option to either rename the existing template or create a new one and import the updated data.


  2. Mapping – To match the data structure of the legacy system to FO, we need to map the source data to the FO data entities. During import, the mapping describes which columns in the source file become which columns in the staging table, so the system can determine which column data in the source file must be copied into which column of the staging table.

In this step we may also have to consider data transformation, where data lengths and types must match FO.

Note: if entities that are in scope for data migration are being customised while the migration is in progress, the mapping needs to be revisited after each change. A small mapping and length-check sketch is shown below.
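
A minimal mapping and length-check sketch in pandas; the column map and maximum lengths are hypothetical and would come from the actual entity definition.

    import pandas as pd

    # Illustrative mapping: legacy column name -> D365FO entity/staging column name.
    COLUMN_MAP = {
        "CUST_NO":   "CUSTOMERACCOUNT",
        "CUST_NAME": "ORGANIZATIONNAME",
        "TERMS":     "PAYMENTTERMS",
    }

    # Illustrative maximum field lengths in the target entity.
    MAX_LENGTHS = {"CUSTOMERACCOUNT": 20, "ORGANIZATIONNAME": 100, "PAYMENTTERMS": 10}

    legacy = pd.read_csv("legacy_customers.csv", dtype=str)
    mapped = legacy.rename(columns=COLUMN_MAP)[list(COLUMN_MAP.values())]

    # Transformation: flag values that would truncate, then enforce target field lengths.
    for col, max_len in MAX_LENGTHS.items():
        too_long = mapped[col].str.len() > max_len
        if too_long.any():
            print(f"{col}: {too_long.sum()} value(s) exceed {max_len} characters")
        mapped[col] = mapped[col].str.slice(0, max_len)

    mapped.to_csv("customers_mapped.csv", index=False)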


  3. Use DevOps for DM tracking – Azure DevOps is the recommended tool for tracking each data migration activity, such as module-wise configuration, data validation, import and configuration changes.

The following DevOps structure can be followed for the DM activity:

  • Feature: there must be an individual feature for each data entity.
  • Task: for any missing configuration or master data, there must be a task through which the Gold environment gets updated.
  • Bug: for each data-related issue, there should be a bug work item, assigned to the data provider.


  4. Import/update configuration – As per the environment plan, the configuration data needs to be uploaded to the 'Gold' environment. The basic company setup, number sequences and other initial parameters need to be defined; after that, other configuration can be updated manually or through configuration data imported from the templates available in LCS.
  5. Data refresh: Once the configuration is ready in Gold, the DM environment's database needs to be refreshed from Gold. This is a recurring activity that must be repeated until the Gold configuration is finalised; it may continue up to pre-prod.
  6. Data sequence: The base company containing the approved setup and configuration sets and the master data is copied to the migration instance to form the basis of the data load. Each subsequent activity must be performed in an agreed sequence, i.e. ensuring that referenced data is loaded first. At a high level the order is:

  • Configuration data
  • Setup data
  • Master data
  • Transactional data

Within each high-level step there will be sub-steps, e.g. customer records will need to be uploaded before customer addresses can be uploaded, and vendor accounts will need to be created before vendor bank account details can be uploaded.

The sequence of these activities should be stated in the Cutover Register, along with the duration, the responsible person, and the dependencies each activity has.

  7. Import master & transaction – By following the sequence of data templates, the import of master and transactional data can start. Key considerations before starting the import:
      • Data must be imported in the required sequence.
      • Mandatory field values need to be checked.
      • Date formats and field lengths need to be verified.
  8. Data validation and tracking – Once the records are imported into the system, the data needs to be cross-verified by comparing the source and target record counts. Basic validations to follow:
      • All failed records should be re-imported after fixing the issues.
      • A data tracker should be maintained in DevOps to track the status of the records.
  9. Data sign-off: Once data validation is complete, the data needs to be verified by the user. The user checks all the setup configuration and master data and then provides sign-off.
  10. Create data packages – It is recommended to use data packages to move the imported and verified data from the DM/Test environment to the next environment (SIT, UAT).
  11. Production readiness: Before data is moved to the production environment, it must first be moved to the pre-prod environment. The Gold configuration should be moved first, and then the data packages should be imported. After final data verification and sign-off from the client, the production database can be refreshed from pre-prod.

Recommendations

  • Always have a configuration management and data migration plan.
  • Always baseline the configuration whenever it is to be deployed in the production.
  • Collaboration tools should be leveraged as a repository with track changes enabled for traceability.
  • Align your configuration and data migration plans with implementation methodology.
  • Ensure sign offs on business requirements towards data migration.
  • Always ensure that you have a golden environment in your plan and use it to seed the other environments.
  • Keep multiple data migration strategies for the following:
      • Initial system load
      • Key configuration masters
      • Business-specific master data
      • Open transactions
      • Regular and cut-over time
  • The migrated data must always be verified, tested and accepted as a part of system acceptance.
  • Ensure that the mapping and transformation logic is tested for sample data before running full-fledged for all data.
  • The 'to be imported' data is always reviewed, validated and cleansed by the business before being imported.
  • Do not forget the sequencing of data load; this is one activity that can bring in a lot of rework if not managed carefully.
  • Always ensure that the naming conventions are defined for the configuration and data migration elements and are used consistently throughout the project.
  • Before importing, make sure that all the necessary system setups and related master data are present in the environment, and that all number sequences are set up.
  • The template is shared with the concerned functional consultant or user first; this template is then used to import the records into the system. These templates are generally the standard templates available in F&O, or templates produced by exporting the relevant data entities.
  • Make sure that mandatory fields and data types are marked in the template (a small template-check sketch follows this list).
  • There are certain fields that D365 F&O generates automatically while importing records; those fields should not be added to the template, e.g. RecId.
  • Truncating: for import projects, you can choose to truncate records in the entities prior to import. This is useful if your records must be imported into a clean set of tables. This setting is off by default.
  • Validate that the source data and target data are mapped correctly
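
A minimal pre-import template check, assuming the file to be imported is a CSV; the mandatory and system-generated column names are illustrative and should be taken from the actual entity.

    import pandas as pd

    # Illustrative template definition: mandatory columns for one entity.
    MANDATORY = ["CUSTOMERACCOUNT", "ORGANIZATIONNAME", "CUSTOMERGROUPID", "SALESCURRENCYCODE"]
    # Columns generated by D365FO (e.g. RecId) must NOT appear in the import file.
    FORBIDDEN = ["RECID"]

    data = pd.read_csv("customers_to_import.csv", dtype=str)
    columns = {c.upper() for c in data.columns}

    missing_cols = [c for c in MANDATORY if c not in columns]
    forbidden_cols = [c for c in FORBIDDEN if c in columns]
    blank_mandatory = {
        c: int(data[c].isna().sum())
        for c in data.columns
        if c.upper() in MANDATORY and data[c].isna().any()
    }

    print("Missing mandatory columns :", missing_cols or "none")
    print("System-generated columns  :", forbidden_cols or "none")
    print("Blank mandatory values    :", blank_mandatory or "none")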



High level Strategic approach

The following steps will be used to assess the specific needs of each site, and the decisions will be recorded in the site-specific Data Management plan (see Appendix A).

  1. Identify and allocate resources, responsibilities and ownership

See Table 2, Table 3 and Table 4 above

  2. Identify which data is to be migrated

  • General category e.g. Customers, Sales orders etc.
  • Selection criteria – how the specific records are determined for each data set e.g. Customer accounts that have placed an order in the last two years or have been created in the last two months but not yet used or have pending quotations.

  3. Identify data source

  • Legacy system
  • Discrete source – excel sheets etc.
  • Newly created/generated

  4. Identify target data system

  • AX2012
  • D365FO

  5. Identify extraction mechanism

  • Existing tools/reports
  • Custom report/script

  6. Identify transformation process

  • Rules based – automated
  • Human intervention
  • Timing of transformation

  7. Identify upload/entry process

  • Manual
  • Atlas
  • DIEF/DIXF
  • Script
  • Other (define)

  8. Define the data validation process

  • Inspection (should be scripted) – manual
  • D365FO reports/extracts – expected results defined
  • Custom reports
  • Trial processes – orders/journals/record creation. Migration test scripts.



Migration cycle

Precursors

Configuration and setup data are entered or seeded during the design phase/process.

Setup Data – supporting tables entered/uploaded as part of design

Cycle 1

Upload sample master data:

  • Customers
  • Products
  • Vendors
  • Employees
  • etc.

Test/verify the sample data. Run migration test scripts including:

  • Data inspection
  • Report runs
  • Transactions

Review cycle 1

Validate the approach, strategy, data quality and tools used. If necessary:

  • Further data cleansing
  • Further education/training
  • Change of migration approach (tool)
  • Template modifications
  • Base data adjustments

At this point an assessment will be made to determine whether the results are sufficiently acceptable for the migration to move on to cycle 2, or whether cycle 1 should be repeated. If the results are deemed acceptable, this migration will be replicated into the CRP1 environment.

Cycle 2

Upload volume master data:

  • Customers
  • Products
  • Vendors
  • Employees
  • etc.

Test/verify the volume data. Run migration test scripts including:

  • Data inspection
  • Report runs
  • Transactions

Review cycle 2

Validate the approach, strategy, data quality and tools used. If necessary:

  • Further data cleansing
  • Further education/training
  • Change of migration approach (tool)
  • Template modifications
  • Master data adjustments


Cycle 3

Upload volume master data and representative transactional data:

  • Customers
  • Products
  • Vendors
  • Employees
  • Sales orders
  • Production orders
  • Purchase orders
  • etc.

Test/verify the volume data. Run migration test scripts including:

  • Data inspection
  • Report runs
  • Transactions

Review cycle 3

Validate the approach, strategy, data quality and tools used. If necessary:

  • Further data cleansing
  • Further education/training
  • Change of migration approach (tool)
  • Template modifications
  • Base data adjustments

It is intended that at this stage the process is sufficiently refined that it can be signed off as fit for purpose for go-live. If the results are not acceptable, further migration test cycles will be performed. At this point the process should be acceptable for creating data in the CRP2 environment.

Deployment Phase

The following data migration activities are performed during deployment (prior to go-live):

  1. Actual migration of data to the production (live) environment from the legacy systems.
  2. Validation of the migrated data using reports. This acts as the basis of the sign-off for completion of the data migration activities.
  3. Put process in place to manage changes between migration and go-live

Migration of data from the source (legacy) systems into the new system's production environment is the final critical data migration activity, and is key to project success as well as user adoption. Data migration into production has to be timed accurately to ensure that users see the most recent data when the system is made available to them. This activity is usually split into:

Initial migration of data into Production

The initial migration of data into production is performed prior to the actual go-live, to avoid overloading the systems during the go-live weekend. This data typically includes the master data.

Final data migration into Production

The final data migration is done over the weekend prior to go-live. The final data load consists of all transactional data.

Sign-off

The last step in the data migration process is sign-off. The system cannot be transferred into operation until the data take-on reports balance and have been signed off by the key users.


Risks and mitigation

Effective data migration strikes the optimal balance between data accuracy, migration speed, low or no downtime, and minimum cost. The following are data migration success factors and possible risks:


  1. Data Loss Risk

During the data migration process, data loss can occur: when the data is migrated to the target system, some of the data may not come across from the source system.

This risk can be avoided by counting records and cross-verifying the data between the source and target systems.

  2. Semantics Risk

Even when the data migration process is done efficiently, semantics errors can occur. For example, if there is a field called "daily average" in the source data, the information from that field could be migrated into a different column or field in the target system. This inconsistency in data could result in many issues for organizations and IT businesses that need to migrate their data to new environments. For this reason, data migration testing is highly recommended when migrating large amounts of business data.

  3. Extended Downtime Risk

The risk of extended downtime comes when the data migration process takes longer than expected. During the migration process, the source system is not active, so this poses potential risks for organizations and stakeholders.

This risk can be mitigated by using non-production environments; it is highly recommended not to perform any data migration activity directly on production.

  4. Data Corruption Risk

When organizations apply rules and validations on the target system, data corruption can occur. Unwanted data can be migrated into the new system, leading to potential crashes and data corruption, which can result in errors for the end user that uses the application.

To deal with this risk, the D365FO data consistency check can be run, which provides details of all data-related issues.

  5. Application Stability Risk

The target application, or target system, can be unstable for a number of reasons, including improper development, improper coding of the new application, or improper coding of business requirements into the new system. These issues increase the risk of the new system being unable to fulfill business needs for end users.

Data migration testing in the DM environment can combat these risks: any data entity-related change needs to be tested and verified before the actual data import.

  6. Orchestration Risk

With orchestration risk, the issue occurs when the processes of data migration are not performed in order. The order of data migration is extremely important, especially since there are varied dependencies between the various business objects.

To mitigate this risk, the sequencing of all dependent data entities needs to be defined in the data import project.

  7. Interference Risk

Interference risk is particularly problematic when multiple stakeholders make use of the application during the migration process simultaneously. An example of this risk affecting business operations can occur when a stakeholder locks a particular table, making it impossible for other stakeholders to make use of the same table.

To mitigate this risk, data migration operations should be performed during non-business hours.

  8. Configuration Mismatch Risk

With configuration mismatch risks, the issue always lies with the target application: the target system may not be in sync with the base configuration setup, which causes configuration mismatch issues.

To mitigate this risk, the target application should be synced with the base configuration as soon as there is any configuration change, and a DevOps task should be maintained to track such changes.

  9. Data Error Risk

During data import, many records may fail because of data validation or missing associated master data.

In such cases, the DM team usually coordinates with the data provider/client to correct the data by highlighting the error records.

If there are many data issues, the whole DM may be delayed. To avoid this, the process below can be followed (a small tracking sketch follows the list).

  1. Take a database backup before processing such records.
  2. Start the data import and update the Excel file by marking the error records.
  3. Create the missing master data records and maintain their references in a separate tracker.
  4. Fix the other data issues in the given Excel file and keep re-importing the remaining records.
  5. At the end, all the error records and their respective remarks are available.
  6. This Excel file can be shared with the data provider to fix the issues.
  7. Restore the database from the recent backup.
  8. Import all missing master data by referring to the tracker from step 3.
  9. Import the data using the updated Excel file from the data provider.
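
A minimal sketch of the error-tracking part of this process, assuming the source workbook is an Excel file and that the error messages have been collected from the staging error export; all names and messages are hypothetical.

    import pandas as pd

    data = pd.read_excel("customers_import.xlsx", dtype=str)

    # 'errors' would normally come from the staging error export or the import log;
    # here it is a hypothetical mapping of natural key -> error message.
    errors = {
        "C00042": "Customer group GRP9 does not exist",
        "C00108": "Missing mandatory field: SALESCURRENCYCODE",
    }

    data["ImportStatus"] = data["CUSTOMERACCOUNT"].map(lambda k: "Error" if k in errors else "Imported")
    data["Remark"] = data["CUSTOMERACCOUNT"].map(errors).fillna("")

    # Share only the failed rows with the data provider for correction.
    data[data["ImportStatus"] == "Error"].to_excel("customers_errors_for_provider.xlsx", index=False)

    # Keep the remaining clean rows for re-import after the database restore.
    data[data["ImportStatus"] == "Imported"].to_excel("customers_clean_reimport.xlsx", index=False)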

Data Automation and testing

Overall, data migration is largely a manual process, as it requires a lot of human interaction, but a certain level of automation can be achieved by using the D365FO data task automation feature.

Data task automation lets you easily repeat many types of data tasks and validate the outcome of each task. Data task automation is very useful for projects that are in the implementation phase. For example, you can automate the creation and configuration of data projects. You can also configure and trigger the execution of import/export operations, such as the setup of demo data and golden configuration data, and other tasks that are related to data migration. You can also create automated testing of data entities by using task outcome validation.

Data packages from the Shared Asset Library or Project Asset Library can be downloaded and imported automatically into D365FO using Data Task Automation (DTA), which is available in the Data management workspace.


Detailed information about data task automation and testing is available at the links below:

https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/data-task-automation

https://azureintegrations.com/2019/10/07/d365fo-data-task-automation-and-recurring-integration/

Use cases / Data management scenarios

Standard data migration

The standard data migration process can be followed for a fresh D365FO implementation where data has to be migrated from a non-AX legacy system. The key work in this scenario is to extract the required data from the legacy system and map it to the D365FO data entities.



Migration from legacy AX system

Microsoft publishes a recommended process for data migration from a legacy AX system. If you are upgrading from AX 2012 to D365FO, you can use the LCS data upgrade tool; however, Microsoft still recommends following that process.


There are general guidelines that need to be followed in any project while importing the data.

Challenges and solutions

Joint activity

Some clients require a joint DM activity, where the client owns the migration of sensitive data (such as customer data, credit cards and email addresses) and Hitachi owns the rest of the data migration.

Below are the recommendations for that scenario.

  • Define scope and roles: the DM scope should be defined at the very beginning, and the roles should be clear for both parties.
  • Collaboration: both parties should collaborate with each other for a smooth data migration.

DM activity owned by the client

In many projects the client wants to take ownership of the data migration, with Hitachi providing support on data mapping and templates. The following points can be considered in this scenario.

  1. All standard data entity templates can be shared with the client, with additional information such as data types and lengths and mandatory fields.
  2. It is better to arrange a training session for the client giving an overview of the DM activity.
  3. The whole data migration process and data flow should be described to the client to avoid any deviation from best practice.
  4. To find the dependent data entities, one option we can suggest to the client is to create 4-5 actual records in the system and then export the data.

With this method, the entered records can be exported through their associated entities, and the same entities can be referred to for further data imports.

e.g. for customers or vendors, we have dependent data or entities related to currency, address, customer or vendor group, etc.

By using this approach, the dependent data gets identified.

High volume data migration

If you have a high volume of data to import, the data import process should take the following points into account.

Set-based operation: a normal data import first imports the data into staging, validates it, and then copies the records one by one into the main table. The 'Set based' option available on a data entity allows the system to move the data from staging to the main table in one shot, which helps with performance tuning.

Note: since this option skips validation and defaulting of values, complete validation is required before importing any data, and the user also has to provide all default values in the template itself.


Bundling: in bundling, we split the data into multiple files, which allows parallel execution. For example, if you have to import 10,000 customers, bundling lets you run multiple import jobs for the same data entity by splitting the file across the jobs.
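
A minimal bundling sketch that splits a large source file into fixed-size bundles; the file name and bundle size are illustrative.

    import pandas as pd

    # Split one large customer file into smaller bundles so that several import
    # jobs for the same entity can run in parallel.
    BUNDLE_SIZE = 2500  # records per bundle; tune to the environment

    customers = pd.read_csv("customers_10k.csv", dtype=str)

    for i, start in enumerate(range(0, len(customers), BUNDLE_SIZE), start=1):
        bundle = customers.iloc[start:start + BUNDLE_SIZE]
        bundle.to_csv(f"customers_bundle_{i:02d}.csv", index=False)
        print(f"customers_bundle_{i:02d}.csv: {len(bundle)} records")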

Entity import execution parameters: in D365FO > Data management > Data import/export framework parameters > Entity settings > Entity import execution parameters, you can define a threshold size and task size, which allows the system to create multiple tasks for sets of records.

Sequencing: in D365, the definition group provides the option to define the data entity sequence and grouping, which helps run operations both in sequence and in parallel. To import huge volumes of data we can leverage this functionality by defining the sequence of dependent entities.

For example, if entities A, B, X and Y belong to the same level they execute in parallel, while C and E execute only once level 1 has completed.
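
A minimal sketch of this level-based sequencing, with entity names and the import call itself as placeholders; it only illustrates the ordering rule (parallel within a level, sequential across levels).

    from concurrent.futures import ThreadPoolExecutor

    # Entities on the same level can be imported in parallel; a level only starts
    # once the previous level has finished. Entity names are illustrative.
    IMPORT_LEVELS = [
        ["CustomerGroups", "VendorGroups", "TermsOfPayment"],   # level 1: shared reference data
        ["CustomersV3", "VendorsV2"],                           # level 2: master data
        ["CustomerPostalAddresses", "VendorBankAccounts"],      # level 3: dependent records
    ]

    def import_entity(entity_name: str) -> str:
        # Placeholder for the real import call (data package import, data management API, etc.).
        return f"{entity_name} imported"

    for level, entities in enumerate(IMPORT_LEVELS, start=1):
        with ThreadPoolExecutor(max_workers=len(entities)) as pool:
            for result in pool.map(import_entity, entities):
                print(f"Level {level}: {result}")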

Re-write a custom entity for customer-specific scenarios: we always prefer to use standard data entities to import all types of data, but for customer-specific data a custom data entity with only the required fields can be designed; this avoids a lot of standard processing and can be used for fast data import.

Azure data factory

Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation.

However, it is more useful for integration than for data migration, as data migration is not a recurring job.

Azure Data Factory does not store any data itself. It allows you to create data-driven workflows to orchestrate the movement of data between supported data stores, and to process data using compute services in other regions or in an on-premises environment. It also allows you to monitor and manage workflows using both programmatic and UI mechanisms.

Data migration activities with Data Factory

Using Data Factory, data migration can occur between two cloud data stores, or between an on-premises data store and a cloud data store.

The Copy Activity in Data Factory copies data from a source data store to a sink data store. Azure supports various source and sink data stores, such as Azure Blob storage, Azure Cosmos DB (DocumentDB API), Azure Data Lake Store, Oracle, Cassandra, etc.


Import and export data in D365 using Azure Data Factory

Azure Data Factory has a connector called the "Dynamics AX connector", which is used to connect Azure Data Factory to D365 F&O.

This Dynamics AX connector supports copy activities; specifically, it supports copying data from Dynamics AX using the OData protocol with service principal authentication.

OData

OData is a standard protocol for creating and consuming data. The purpose of OData is to provide a protocol that is based on Representational State Transfer (REST) for create, read, update, and delete (CRUD) operations. OData applies web technologies such as HTTP and JavaScript Object Notation (JSON) to provide access to information from various programs. OData provides the following benefits:

  • It lets developers interact with data by using RESTful web services.
  • It provides a simple and uniform way to share data in a discoverable manner.
  • It enables broad integration across products.
  • It enables integration by using the HTTP protocol stack.
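
A minimal sketch of that pattern, assuming an Azure AD application (service principal) has already been registered and granted access to the environment; the tenant, client, environment URL and entity name are placeholders.

    import requests

    # Hypothetical values: tenant, application (client) registration and environment URL.
    TENANT_ID = "00000000-0000-0000-0000-000000000000"
    CLIENT_ID = "11111111-1111-1111-1111-111111111111"
    CLIENT_SECRET = "<app secret>"
    ENVIRONMENT = "https://contoso-test.sandbox.operations.dynamics.com"

    # 1. Acquire an Azure AD token for the D365FO environment (client credentials flow).
    token_response = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": ENVIRONMENT,
        },
    )
    access_token = token_response.json()["access_token"]

    # 2. Read a data entity through the OData endpoint (entity and field names are illustrative).
    response = requests.get(
        f"{ENVIRONMENT}/data/CustomersV3?$top=5&$select=CustomerAccount,OrganizationName",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    for customer in response.json().get("value", []):
        print(customer["CustomerAccount"], customer["OrganizationName"])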



The following links are helpful for understanding data import and export using Azure Data Factory.

https://ajitpatra.com/2018/10/30/azure-copy-data-from-csv-file-to-d365-instance-using-azure-data-factory/

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/data-factory/v1/data-factory-odata-connector.md


https://docs.microsoft.com/en-us/azure/data-factory/connector-dynamics-ax
