Interclypse Data Migration Framework
Interclypse
On a continuous mission to have a positive transformational impact on society, community, industry, and individuals.
Building on the lessons learned in our previous article, here we cover how to put a solid plan into action and achieve a successful data migration. For highly complex migrations, we use the following framework:
The Data Migration application should be treated like a production service
We suggest the following layers within the Data Migration application:
Connectors (Input/Output): These subsystems connect to external systems to read or write information, whether via a web service call, a file read, or another mechanism. They also encapsulate any complex logic needed to authenticate with those external systems.
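As a minimal sketch of what an input connector might look like, the class below reads records from a legacy CSV export and hides the source details from the rest of the pipeline. The class name and field names are illustrative assumptions, not part of the framework itself:

```python
import csv
import io


class LegacyCsvConnector:
    """Input connector: reads records from a legacy CSV export.

    Encapsulates how the source system is reached, so downstream
    layers only ever see plain dictionaries.
    """

    def __init__(self, source):
        # Accept either a file path or an already-open file-like object.
        self._source = source

    def read(self):
        handle = self._source if hasattr(self._source, "read") else open(self._source)
        with handle:
            yield from csv.DictReader(handle)


# Usage: in production this would be a real file or service response.
data = io.StringIO("id,name\n1,Ada\n2,Bob\n")
records = list(LegacyCsvConnector(data).read())
```

A real connector for a web service would follow the same shape: authentication and transport details stay inside the class, and `read()` yields plain records.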
Filters (Validation/Security): Filters remove old, unused, or invalid data so it never enters the new system.
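A filter can often be expressed as a single predicate applied to each record. The sketch below, with an assumed cutoff date and illustrative field names, drops records that are invalid (missing a key field), unused (flagged by the source system), or old (untouched since the cutoff):

```python
from datetime import date

# Assumption for illustration: records untouched since this date are stale.
CUTOFF = date(2020, 1, 1)


def is_valid(record):
    """Filter layer: drop old, unused, or invalid records."""
    if not record.get("id"):              # invalid: missing key field
        return False
    if record.get("status") == "unused":  # unused: flagged by the source system
        return False
    # old: last modified before the migration cutoff
    return record.get("last_modified", date.min) >= CUTOFF


records = [
    {"id": "1", "status": "active", "last_modified": date(2021, 5, 1)},
    {"id": "",  "status": "active", "last_modified": date(2021, 5, 1)},
    {"id": "3", "status": "unused", "last_modified": date(2021, 5, 1)},
    {"id": "4", "status": "active", "last_modified": date(2019, 1, 1)},
]
kept = [r for r in records if is_valid(r)]
```

Keeping the predicate pure (no I/O) makes each filtering rule easy to unit-test in isolation.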
Transformer: This layer converts old data into the new data format. It should version the output of each execution to ensure consistency and measurable progress toward the end goal. Additional quality checks belong at this layer to confirm that old data converts properly, such as verifying that required fields are populated and standardizing null versus empty values.
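One way to sketch a transformer is a function that maps a legacy record to the new schema, stamping a version on the output and enforcing the quality checks described above. All field names and the version number here are assumptions for illustration:

```python
def transform(old):
    """Transformer layer: convert a legacy record to the new format.

    Standardizes null vs. empty values, stamps a schema version on
    each output record, and enforces required-field checks.
    """
    new = {
        "customer_id": int(old["id"]),
        # Standardize: treat None and "" the same, store as a trimmed string.
        "display_name": (old.get("name") or "").strip(),
        # Version the output so separate executions stay consistent.
        "schema_version": 2,
    }
    # Quality check: required fields must be populated after conversion.
    if not new["display_name"]:
        raise ValueError(f"record {new['customer_id']}: display_name is required")
    return new


result = transform({"id": "1", "name": " Ada "})
```

Failing fast on a bad record, rather than writing it through, keeps conversion errors visible instead of silently corrupting the target system.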
Auditor: This layer acts as the final authority before data is transferred to the new system, and performs logging for metrics and status.
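The layers above can be closed out with an auditor that makes one last pass over the transformed records, logging counts for metrics and rejecting anything that slipped through the earlier layers. The required-field names are illustrative assumptions:

```python
import logging


def audit(records, required=("customer_id", "display_name")):
    """Auditor layer: final gate before writing to the new system.

    Splits records into passed/failed and logs counts for metrics,
    so each execution's status is observable.
    """
    passed, failed = [], []
    for record in records:
        ok = all(record.get(field) for field in required)
        (passed if ok else failed).append(record)
    logging.info("audit: %d passed, %d failed", len(passed), len(failed))
    return passed, failed


candidates = [
    {"customer_id": 1, "display_name": "Ada"},
    {"customer_id": 2, "display_name": ""},  # should have been caught earlier
]
passed, failed = audit(candidates)
```

Because the auditor is the last gate, it should only ever confirm what the filter and transformer already guaranteed; any failures it logs point to a bug upstream.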
With this framework in place, we can ensure the most accurate transfer of data to the new system with the fewest errors. In our next article, we will dig into the Data Migration Process, including micro-services, API Keys, and what to do when your system fails.