On-Prem SQL to Azure Migration
Amesh PARIT
Data Engineer | SQL Server 2019 | ADF | Power BI | ETL | Developer. Available to work in Germany, Budapest, the Netherlands & the United Kingdom
Cloud computing is one of the most transformative technologies of our time. By moving data and applications to the cloud, businesses can benefit from increased agility, flexibility and cost efficiency. However, the biggest challenge in moving your existing SQL Server databases to Azure is that you cannot restore a SQL Server backup to Azure SQL Database, due to security and infrastructure differences. As a result, many developers are puzzled about which method they should use to move their data to Azure. In this post, I will focus on the methods available to migrate SQL Server databases from on-premises to Azure and share my experiences with the methods I chose while working on such projects. However, I suggest that you do your homework before choosing a method for moving your data to Azure.
Here are some ways to migrate your on-premises SQL database to Azure.
Data Migration Service/Wizard
This is the service that Azure has developed specifically for this task. The tool can also perform assessments to identify feature parity and database compatibility issues when you move from on-premises SQL Server to Azure SQL Database or any of the other available deployment options for SQL Server. It presents a report that identifies all of these issues, along with possible solutions, for the various database compatibility levels available for your chosen migration destination.
It has offline and online options. Offline, you can migrate either the schema only, or the schema and data in one step. This method is most suitable for tests/POCs, small databases, or applications that can tolerate longer downtime during data migration. With the online option, you can continuously synchronise the on-premises database and the Azure SQL database after you have moved the original schema/data.
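The assessment step can also be scripted via the Data Migration Assistant's command-line interface (DmaCmd.exe) rather than the GUI. A hedged sketch, assuming hypothetical server, database and output names; the command is assembled and echoed for review rather than executed, since DmaCmd.exe runs on the Windows host where DMA is installed:

```shell
# Sketch only: server, database and output path are hypothetical placeholders.
# DmaCmd.exe ships with the Data Migration Assistant; verify flag names
# against your installed DMA version before running.
DMA_CMD='DmaCmd.exe /AssessmentName="TreasuryAssessment"'
DMA_CMD="$DMA_CMD /AssessmentDatabases=\"Server=onprem01;Initial Catalog=Treasury;Integrated Security=true\""
DMA_CMD="$DMA_CMD /AssessmentEvaluateCompatibilityIssues /AssessmentEvaluateFeatureParity"
DMA_CMD="$DMA_CMD /AssessmentResultJson=\"treasury-assessment.json\""

# Print the full command so it can be inspected before running on the DMA host.
echo "$DMA_CMD"
```

The JSON result file contains the same compatibility and feature-parity findings the wizard shows, which makes it easy to attach to migration documentation or re-run per database.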
Replication
You can add an Azure SQL database as a subscriber to a transactional replication. This allows you to migrate the data and objects from your on-premises SQL Server to Azure SQL Database. Since you cannot restore backups, you can only initialise this replication from a snapshot.
Similar to the online migration for DMS, you can start synchronising long before the scheduled migration. Once everything is synchronised, you can decide when to switch and realign the applications to the Azure SQL database. One thing to note is that you can only replicate tables with a primary key. So if you have a number of tables without a primary key, you will need to get creative or use other techniques to move these tables.
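Because only tables with a primary key can be replicated, it helps to list the exceptions before you start. A minimal sketch, assuming a hypothetical server onprem01 and database Treasury; the sqlcmd call is echoed for review rather than executed:

```shell
# Lists every user table without a primary key (these cannot be replicated
# transactionally and will need another technique, e.g. SSIS or BCP).
QUERY="SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE OBJECTPROPERTY(t.object_id, 'TableHasPrimaryKey') = 0;"

# Echoed for review; drop the echo to run it against your publisher.
echo sqlcmd -S onprem01 -d Treasury -E -Q "$QUERY"
```

Running this once per candidate database gives you the exact list of tables to handle outside replication before you commit to a cutover date.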
DACPAC
A DACPAC is a single file that contains the definitions for your database schema: tables, views, stored procedures, functions and other database objects. Think of it as a backup of the schema alone. You can create this file using SSDT, SSMS or sqlpackage.exe and restore it to Azure SQL Database. Since this only migrates your schema, you need to combine it with other techniques to bring your data to Azure SQL DB.
I used this technique at bp, where we just wanted to move the data to Azure. The company had an old treasury database that wasn't actively accessed but was occasionally needed for old historical transactions. bp wasn't interested in spending money on maintenance and was looking for a quick solution to move the data to the Azure cloud without making any changes. So I simply created a DACPAC file, migrated the schema to Azure SQL DB, and then wrote an SSIS package to move the data from table to table. I ran this data transfer over a week. It was a large amount of data, but SSIS takes care of it as long as there are no errors.
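The extract-and-publish step above can be scripted with sqlpackage. A hedged sketch, assuming hypothetical server names and credentials; the commands are echoed so you can review them before running:

```shell
# Hypothetical connection strings; replace with your own.
SRC="Server=onprem01;Database=Treasury;Trusted_Connection=True"
DST="Server=myserver.database.windows.net;Database=Treasury;User Id=sqladmin;Password=<password>"

# 1. Extract the schema from the on-premises database into a DACPAC file.
echo sqlpackage /Action:Extract /SourceConnectionString:"$SRC" /TargetFile:Treasury.dacpac

# 2. Publish the DACPAC to Azure SQL Database.
#    Schema only: the data still moves separately, e.g. via an SSIS package.
echo sqlpackage /Action:Publish /SourceFile:Treasury.dacpac /TargetConnectionString:"$DST"
```

Splitting extract and publish into two steps also lets you keep the .dacpac in source control as a point-in-time record of the schema you deployed.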
BACPAC
A BACPAC is similar to a DACPAC but encapsulates both the schema and the data. This allows you to bring your entire database to Azure SQL Database. This single file stores the data in JSON format. The strategy is suitable for small databases. You can create a BACPAC file and import it using SSDT, sqlpackage.exe or SSMS.
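The same sqlpackage tool handles BACPACs, using the Export and Import actions instead of Extract and Publish. A minimal sketch with hypothetical connection strings, echoed for review:

```shell
# Hypothetical connection strings; replace with your own.
SRC="Server=onprem01;Database=SmallDb;Trusted_Connection=True"
DST="Server=myserver.database.windows.net;Database=SmallDb;User Id=sqladmin;Password=<password>"

# Export schema + data into a single BACPAC file...
echo sqlpackage /Action:Export /SourceConnectionString:"$SRC" /TargetFile:SmallDb.bacpac

# ...and import it into Azure SQL Database in one step.
echo sqlpackage /Action:Import /SourceFile:SmallDb.bacpac /TargetConnectionString:"$DST"
```

Because the data rides along in the file, export time and file size grow with the database, which is why this route suits small databases best.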
Generate SQL Scripts
SQL Server Management Studio has a user interface for generating SQL scripts: right-click a database -> Tasks -> Generate Scripts. If you have worked with SQL Server before, you have probably used this. What you may not have noticed is that if you click Advanced, you can control what the script contains. In particular, you can change the engine type from Stand-Alone to Azure SQL Database, which makes the generated script compatible with Azure. You can also specify the types of data to script: schema only, or schema and data. So it is the T-SQL version of a DACPAC or BACPAC. The advantage is that you can add to or modify the resulting T-SQL script as needed. Similar to BACPAC, the script is suitable for small databases, but not for larger databases if you script the data: if you check the resulting script, it actually writes your data as a single INSERT command per row.
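Once SSMS has generated the script, you can deploy it to Azure SQL Database with sqlcmd. A minimal sketch with hypothetical server, database and credentials, echoed for review rather than executed:

```shell
# Hypothetical names; script.sql is the file produced by Generate Scripts.
AZ_SERVER="myserver.database.windows.net"
AZ_DB="Treasury"

# Echoed for review; drop the echo to execute the script against Azure.
echo sqlcmd -S "$AZ_SERVER" -d "$AZ_DB" -U sqladmin -P '<password>' -i script.sql
```

Running the script through sqlcmd rather than an SSMS query window makes the deployment repeatable and easy to slot into a batch or pipeline later.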
Deploy Database to Azure
Management Studio provides a wizard to deploy your on-premises database to a Microsoft Azure SQL Database server. It exports your on-premises database to a .bacpac file and uploads it to your Azure SQL Database server. One advantage is that the wizard performs a pre-validation check to detect any objects in the database that are not supported on Azure SQL Database.
DevOps Source Tree
Create a new repository in DevOps, clone it to a local folder, and create a new database project for the database you plan to move. Then connect to the database, work through the object folders one by one (tables, stored procedures and so on), create matching folders in the project and copy the objects there. Once you have copied all the objects, build the project, fix any errors, and deploy it to Azure SQL DB. As this only migrates your schema, you will need to combine it with other techniques to bring your data to Azure SQL DB.
I tried this method in my current project at Orbit Group, and it is the best way to start working in a new environment. I created a new repository, copied all the database objects, deployed them to the development environment, and did the complete setup: SQL Agent jobs, SSISDB, folder setup and so on. So it was a completely new setup of Dev, Test and a new Production. There was also no source tree for this database before, so I faced many challenges, such as lots of duplicates and temporary objects, linked servers, synonyms, hard-coded database names and so on. I had to thoroughly check every object and build the project to success. It was time-consuming, but it was worth it: we now have clean code in the repository and can deploy to the dev and test environments with an associated CI/CD pipeline. I then used an SSIS package for the data transfer so we could run tests against actual live data. After we got approval, we switched to production over the weekend.
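In a CI/CD pipeline, the database project can be built into a DACPAC and published automatically. A hedged sketch of the two pipeline steps, assuming a hypothetical Treasury.sqlproj project and connection string; the commands are echoed rather than executed:

```shell
# Hypothetical project name and connection string; replace with your own.
DST="Server=myserver.database.windows.net;Database=Treasury;User Id=sqladmin;Password=<password>"

# 1. Build the SSDT database project; the build output is a DACPAC under bin/Release.
echo msbuild Treasury.sqlproj /p:Configuration=Release

# 2. Publish the built DACPAC to the target environment (dev, test or prod,
#    depending on which connection string the pipeline stage supplies).
echo sqlpackage /Action:Publish /SourceFile:bin/Release/Treasury.dacpac /TargetConnectionString:"$DST"
```

Parameterising the connection string per pipeline stage is what makes the same build artefact deployable to Dev, Test and Production in turn.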
Migrating large amounts of data to Azure
The Azure Import/Export service is used to securely import large amounts of data into Azure Blob Storage and Azure Files by shipping disk drives to an Azure data centre. The service can also be used to transfer data from Azure Blob Storage to hard disk drives and ship them to your local sites. Data from one or more drives can be imported into either Azure Blob Storage or Azure Files. You can provision your own drives, or use Microsoft-supplied disk drives.
If you want to transfer data using Microsoft-provided drives, you can use Azure Data Box Disk to import data into Azure. Microsoft ships up to five encrypted solid-state drives (SSDs) with a total capacity of 40 TB per order to your data centre through a regional vendor. You can quickly configure the drives, copy data to them via a USB 3.0 connection and ship them back to Azure. For more information, see the Azure Data Box Disk overview.
Conclusion
Azure provides very helpful tooling for planning your migration, and Microsoft also publishes an Azure Database Migration Guide. You could use a simple technique like DACPAC or Generate Scripts and then move the data using any ETL tool, which is simple and cost-effective. If you decide to use Azure services, there will be a cost, but this will depend on the size of the data and the timeframe of the project.