Automating VM Deployments with Azure DevOps Pipelines and Terraform for ELK Stack


Imagine a world where your infrastructure deployments and software installations happen automatically whenever you commit code changes. With Azure DevOps Pipelines and Terraform, this dream becomes a reality. Azure DevOps Pipelines provide a continuous integration and continuous delivery (CI/CD) platform, while Terraform is an Infrastructure as Code (IaC) tool. Together, they automate the provisioning and management of your infrastructure, including launching virtual machine (VM) instances in Azure. In this article, we'll explore how to create an Azure DevOps pipeline that triggers Terraform scripts upon code commit in an Azure Repos Git repository, resulting in automatic VM deployments pre-configured for an ELK Stack.

Target architecture:

(Architecture diagram)

Project Scenario:

Imagine you're working on a data science project that requires analyzing massive datasets stored within the ELK Stack. Client data arrives every 1-2 months via a secure SFTP server. Traditionally, this scenario would involve manually configuring a new ELK server instance on each data arrival, a process that can be quite time-consuming.

Challenges of Manual Configuration:

  • Time-Consuming: Setting up a new VM, configuring the ELK Stack (Elasticsearch, Logstash, Kibana), installing Docker, running Docker containers, downloading data from the SFTP server, and indexing it into Elasticsearch can take 4-5 days of manual effort.
  • Error-Prone: Repetitive manual tasks increase the risk of human errors that can disrupt the data analysis process.
  • Inefficiency: Manual deployments limit scalability and agility, hindering your ability to respond quickly to new data arrivals.
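For context, the ELK containers that such a manual setup runs can be captured once in a Compose file and reused on every deployment. A minimal single-node sketch (image tags, ports, and the disabled security setting are assumptions for a dev/analysis box, not the exact project configuration; Logstash can be added analogously):

```yaml
# docker-compose.yml — minimal single-node Elasticsearch + Kibana sketch
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.4   # assumed version
    environment:
      - discovery.type=single-node     # single-node analysis setup, no cluster
      - xpack.security.enabled=false   # simplifies local access; enable in production
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.4                 # assumed version
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```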

To address these challenges, we can leverage the power of Azure DevOps Pipelines and Terraform:

  • Azure DevOps Pipelines: This continuous integration and continuous delivery (CI/CD) platform automates the entire deployment pipeline, triggering infrastructure provisioning and configuration upon code commits or other events.
  • Terraform: This Infrastructure as Code (IaC) tool allows you to define your infrastructure configuration (VM creation, software installation) as code, ensuring consistency and repeatability.
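To make the Terraform side concrete, here is a heavily trimmed sketch of what the VM definition might look like. All names, the VM size, the image, and the `custom_data` bootstrap script are assumptions for illustration, not the exact code used in this project; the network interface and supporting resources are assumed to be defined elsewhere in the configuration:

```hcl
resource "azurerm_resource_group" "elk" {
  name     = "elk-rg"
  location = "eastus"
}

resource "azurerm_linux_virtual_machine" "elk" {
  name                  = "elk-vm"
  resource_group_name   = azurerm_resource_group.elk.name
  location              = azurerm_resource_group.elk.location
  size                  = "Standard_D4s_v3"                      # sized for the Elasticsearch heap
  admin_username        = "azureuser"
  network_interface_ids = [azurerm_network_interface.elk.id]     # NIC defined elsewhere

  admin_ssh_key {
    username   = "azureuser"
    public_key = file("~/.ssh/id_rsa.pub")
  }

  os_disk {
    caching              = "ReadWrite"
    storage_account_type = "Standard_LRS"
  }

  source_image_reference {
    publisher = "Canonical"
    offer     = "0001-com-ubuntu-server-jammy"
    sku       = "22_04-lts-gen2"
    version   = "latest"
  }

  # cloud-init script (hypothetical name) that installs Docker and starts the ELK containers
  custom_data = filebase64("install-elk.sh")
}
```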

Prerequisites:

Before we dive in, you'll need a few things in place:

  1. An Azure DevOps organization and project.
  2. The Azure DevOps Terraform extension installed for pipeline tasks (search for "Terraform" in the marketplace).
  3. A service principal for authenticating to your Azure subscription.
  4. An Azure subscription with the desired resources (VM type, storage, etc.).
  5. Terraform files defining your infrastructure configuration for the VM.
  6. An Azure Repos Git repository containing your Terraform files, shell script, and any additional code.

I am assuming you are proficient in Terraform and that the Terraform code for the ELK Stack is readily available. In this document, I will demonstrate how to integrate the Terraform code and how to create the Azure DevOps pipeline.

Azure DevOps Pipeline Creation

  • Head over to your Azure DevOps project and navigate to the "Pipelines" section.
  • Click "Create pipeline" and choose "Azure Repos Git" as the source code provider.
  • Select your Azure Repos Git repository and configure the pipeline to trigger on code commits to a specific branch (e.g., main).
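For a pipeline that should run automatically on commits to main, the trigger section of the YAML would look like the sketch below. (Note that the Apply and Destroy pipelines shown later in this article use `trigger: none` and are started manually instead.)

```yaml
trigger:
  branches:
    include:
      - main   # run the pipeline on every commit to the main branch
```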


Terraform Extension installed using the Azure DevOps Marketplace

  • Navigate to your Azure DevOps organization settings.
  • Click on "Organization settings" from the left navigation menu.
  • In the top navigation bar, click on "Extensions" (or search for "Extensions").


  • Click on "Browse marketplace"; a new page will open. In the search bar, type "Terraform."

  • The "HashiCorp Terraform" extension should appear as the first result. Click on it.
  • On the extension details page, review the information and ratings.
  • Click the "Install" button.


Create a service connection in Azure DevOps Pipelines

  • Select your project.
  • Navigate to "Project settings" (usually under the gear icon in the top right corner).
  • Within "Project settings," locate the "Service connections" section.
  • Click on "New service connection"
  • A list of service connection types will appear. Choose "Azure Resource Manager," which connects to Azure resources using a service principal or managed identity.
  • Enter a connection name.
  • Choose the authentication method: Service principal (automatic).


Note: Grant the required permissions to the service connection; otherwise, the pipeline will fail.

Creating the VM: Apply Pipeline

The Apply pipeline stage within your Azure-pipeline-apply.yaml file is responsible for provisioning the virtual machine (VM) in Azure using Terraform.

This stage typically involves the following steps:

  1. Initialization: Terraform initializes its workspace, ensuring it has the necessary configuration files and plugins to interact with Azure resources.
  2. Validation: Terraform validates the Terraform code, checking for syntax errors and potential issues before applying the configuration. This step helps identify any errors early on in the pipeline, preventing unexpected behavior during deployment.
  3. Planning: Terraform generates a preview of the infrastructure changes that will be made based on your Terraform configuration. This allows you to review the planned modifications before applying them.
  4. Applying the Configuration: Once you're satisfied with the planned changes, the Apply step instructs Terraform to create the VM and configure the infrastructure according to your code definitions. This involves provisioning resources in Azure, such as creating a virtual network, storage accounts, and finally deploying the VM itself.

trigger: none # No automatic triggers

pool:
  vmImage: 'ubuntu-latest'  # Adjust VM image as needed

variables:
  bkstrgrg: 'elk-terraform'  # Resource group name for Terraform state
  bkstrg: 'elkterraformstorage'  # Storage account name for Terraform state
  bkcontainer: 'tfstate'  # Container name for Terraform state
  bkstrgkey: 'devpipeline.terraform.tfstate'  # Key for Terraform state

stages:
  - stage: tfvalidate
    jobs:
      - job: validate
        continueOnError: false  # Fail pipeline on validation errors
        steps:
          - task: TerraformInstaller@1  # Install Terraform
            displayName: tfinstall
            inputs:
              terraformVersion: 'latest'
          - task: TerraformTaskV4@4  # Initialize Terraform
            displayName: init
            inputs:
              provider: 'azurerm'
              command: 'init'
              backendServiceArm: 'elk-terraform-service-connection'  # Replace with your service connection name
              backendAzureRmResourceGroupName: '$(bkstrgrg)'
              backendAzureRmStorageAccountName: '$(bkstrg)'
              backendAzureRmContainerName: '$(bkcontainer)'
              backendAzureRmKey: '$(bkstrgkey)'
          - task: TerraformTaskV4@4  # Validate Terraform configuration
            displayName: validate
            inputs:
              provider: 'azurerm'
              command: 'validate'

  - stage: tfdeploy  # Stage for deployment (requires successful validation)
    condition: succeeded('tfvalidate')  # Only run if validation succeeds
    dependsOn: tfvalidate  # Wait for validation stage to complete
    jobs:
      - job: apply
        steps:
          - task: TerraformTaskV4@4  # Initialize Terraform again (each job runs on a fresh agent)
            displayName: init
            inputs:
              provider: 'azurerm'
              command: 'init'
              backendServiceArm: 'elk-terraform-service-connection'  # Replace with your service connection name
              backendAzureRmResourceGroupName: '$(bkstrgrg)'
              backendAzureRmStorageAccountName: '$(bkstrg)'
              backendAzureRmContainerName: '$(bkcontainer)'
              backendAzureRmKey: '$(bkstrgkey)'
          - task: TerraformTaskV4@4
            displayName: plan
            inputs:
              provider: 'azurerm'
              command: 'plan'
              environmentServiceNameAzureRM: 'elk-terraform-service-connection'
          - task: TerraformTaskV4@4
            displayName: apply
            inputs:
              provider: 'azurerm'
              command: 'apply'
              environmentServiceNameAzureRM: 'elk-terraform-service-connection'

Here is a snapshot of my Apply pipeline:
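The `init` steps in the pipeline assume your Terraform code declares a matching azurerm backend. A minimal sketch is below; the actual values are injected by the pipeline task's backend inputs, so the block itself can stay empty:

```hcl
terraform {
  backend "azurerm" {
    # resource_group_name, storage_account_name, container_name, and key
    # are supplied by the TerraformTaskV4 init step in the pipeline
  }
}
```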

Destroying the VM: Destroy Pipeline (Optional)

While the Apply pipeline focuses on creating the VM, you might also want to consider incorporating a Destroy pipeline for cleanup purposes. This pipeline stage allows you to tear down the entire infrastructure or specific resources when they're no longer needed. Here's a breakdown of the typical steps involved:

  1. Initialization (Optional): Similar to the Apply pipeline, you can initialize Terraform again to ensure it has the latest configuration.
  2. Destroying Infrastructure: The Destroy command instructs Terraform to remove the resources it previously created based on your code. This can involve deleting the VM, associated storage accounts, and other Azure resources.

trigger: none

pool:
  vmImage: 'ubuntu-latest'

variables:
  bkstrgrg: 'elk-terraform'
  bkstrg: 'elkterraformstorage'
  bkcontainer: 'tfstate'
  bkstrgkey: 'devpipeline.terraform.tfstate'

stages:
  - stage: tfdestroy
    jobs:
      - job: destroy
        steps:
          - task: TerraformTaskV4@4
            displayName: init
            inputs:
              provider: 'azurerm'
              command: 'init'
              backendServiceArm: 'elk-terraform-service-connection'
              backendAzureRmResourceGroupName: '$(bkstrgrg)'
              backendAzureRmStorageAccountName: '$(bkstrg)'
              backendAzureRmContainerName: '$(bkcontainer)'
              backendAzureRmKey: '$(bkstrgkey)'
          - task: TerraformTaskV4@4
            displayName: plan
            inputs:
              provider: 'azurerm'
              command: 'plan'
              environmentServiceNameAzureRM: 'elk-terraform-service-connection'
          - task: TerraformTaskV4@4
            displayName: destroy
            inputs:
              provider: 'azurerm'
              command: 'destroy'
              environmentServiceNameAzureRM: 'elk-terraform-service-connection'        

Here is a snapshot of my Destroy pipeline:

My code structure: (repository layout screenshot)

Conclusion:

This article has explored the power of Azure DevOps Pipelines and Terraform in automating ELK Stack deployments, a valuable toolset for DevOps teams supporting data analytics environments. By leveraging Infrastructure as Code (IaC) with Terraform and the CI/CD capabilities of Azure DevOps Pipelines, we've presented a solution that delivers significant benefits:

  • Reduced deployment time: Traditionally, manually configuring a new ELK server for each data arrival could take up to a week. By automating the process with Azure DevOps Pipelines and Terraform, this time can be drastically reduced to potentially just 1-2 days, including starting all the Docker containers for the ELK Stack. This frees up critical DevOps resources for other tasks.
  • Streamlined infrastructure provisioning: Automating VM creation and configuration through Terraform simplifies infrastructure management for DevOps teams.
  • Enhanced consistency and repeatability: Code-driven infrastructure configuration ensures consistent deployments and simplifies re-deployments whenever needed.
  • Minimized errors: IaC reduces the risk of human error during manual deployments, leading to a more reliable infrastructure foundation.

This approach empowers DevOps teams to efficiently manage and automate deployments for ELK Stack environments. As your data analysis needs evolve, the automation capabilities can be extended to include additional infrastructure components or integrate with other data pipelines. By embracing automation with Azure DevOps Pipelines and Terraform, DevOps teams can play a crucial role in supporting scalable and reliable data analytics platforms.

Manmeet Kaur

Azure Cloud, DevOps & Site Reliability Engineer
