Automating VM Deployments with Azure DevOps Pipelines and Terraform for ELK Stack
Krishna Wattamwar
Imagine a world where your infrastructure deployments and software installations happen automatically whenever you commit code changes. With Azure DevOps Pipelines and Terraform, this dream becomes a reality. Azure DevOps Pipelines provide a continuous integration and continuous delivery (CI/CD) platform, while Terraform is an Infrastructure as Code (IaC) tool. Together, they automate the provisioning and management of your infrastructure, including launching virtual machine (VM) instances in Azure. In this article, we'll explore how to create an Azure DevOps pipeline that triggers Terraform scripts upon code commit in an Azure Repos Git repository, resulting in automatic VM deployments pre-configured for an ELK Stack.
Target architecture:
Project Scenario:
Imagine you're working on a data science project that requires analyzing massive datasets stored within the ELK Stack. Client data arrives every 1-2 months via a secure SFTP server. Traditionally, this scenario would involve manually configuring a new ELK server instance on each data arrival, a process that can be quite time-consuming.
Challenges of Manual Configuration:
To address these challenges, we can leverage the power of Azure DevOps Pipelines and Terraform:
Prerequisites:
Before we dive in, you'll need a few things in place:
I am assuming you are proficient in Terraform and that the Terraform code for the ELK Stack is readily available. In this article, I will demonstrate how to integrate that Terraform code and how to create the Azure DevOps pipeline.
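For readers who want a concrete starting point, here is a minimal sketch of what the ELK VM resource might look like in Terraform. This is not the author's actual code: the resource group, network interface, admin username, SSH key path, VM size, and the elk-cloud-init.yaml file are all hypothetical placeholders you would replace with your own.

```
# Hypothetical sketch: a Linux VM intended to host the ELK Stack.
# All referenced resources and file names are illustrative placeholders.
resource "azurerm_linux_virtual_machine" "elk" {
  name                = "elk-vm"
  resource_group_name = azurerm_resource_group.elk.name
  location            = azurerm_resource_group.elk.location
  size                = "Standard_D4s_v3"   # ELK is memory-hungry; size accordingly
  admin_username      = "azureuser"

  network_interface_ids = [azurerm_network_interface.elk.id]

  admin_ssh_key {
    username   = "azureuser"
    public_key = file("~/.ssh/id_rsa.pub")
  }

  os_disk {
    caching              = "ReadWrite"
    storage_account_type = "StandardSSD_LRS"
  }

  source_image_reference {
    publisher = "Canonical"
    offer     = "0001-com-ubuntu-server-jammy"
    sku       = "22_04-lts-gen2"
    version   = "latest"
  }

  # cloud-init script that installs Elasticsearch, Logstash, and Kibana on first boot
  custom_data = base64encode(file("${path.module}/elk-cloud-init.yaml"))
}
```

Driving the ELK installation through cloud-init keeps the pipeline itself simple: Terraform only provisions the VM, and the software configuration travels with it.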
Azure DevOps Pipeline Creation
Terraform extension installed from the Azure DevOps Marketplace
Create a service connection in Azure DevOps Pipelines
Note: Grant the required permissions to the service connection; otherwise the pipeline will fail.
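If you prefer the CLI to the portal, the identity behind the service connection can be created with the Azure CLI. This is a hedged sketch, not the author's exact setup: the connection name, subscription ID, and role-assignment scope are placeholders you would substitute with your own values.

```
# Hypothetical example: create a service principal with Contributor rights
# on the subscription, then paste its output into the Azure DevOps
# service connection form. Replace <your-subscription-id> with your own.
az ad sp create-for-rbac \
  --name "elk-terraform-service-connection" \
  --role "Contributor" \
  --scopes "/subscriptions/<your-subscription-id>"

# The Terraform state storage account also needs access; one option is a
# role assignment scoped to the resource group that holds it:
az role assignment create \
  --assignee "<appId-from-previous-command>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<your-subscription-id>/resourceGroups/elk-terraform"
```

Contributor on the whole subscription is the broad, simple option; in practice you may want to narrow the scope to the resource groups the pipeline actually manages.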
Creating the VM: Apply Pipeline
The Apply pipeline, defined in your Azure-pipeline-apply.yaml file, is responsible for provisioning the virtual machine (VM) in Azure using Terraform.
This stage typically involves the following steps:
trigger: none  # No automatic triggers

pool:
  vmImage: 'ubuntu-latest'  # Adjust the VM image as needed

variables:
  bkstrgrg: 'elk-terraform'                   # Resource group for the Terraform state
  bkstrg: 'elkterraformstorage'               # Storage account for the Terraform state
  bkcontainer: 'tfstate'                      # Container for the Terraform state
  bkstrgkey: 'devpipeline.terraform.tfstate'  # Key (blob name) for the Terraform state

stages:
- stage: tfvalidate
  jobs:
  - job: validate
    continueOnError: false  # Fail the pipeline on validation errors
    steps:
    - task: TerraformInstaller@1  # Install Terraform
      displayName: tfinstall
      inputs:
        terraformVersion: 'latest'
    - task: TerraformTaskV4@4  # Initialize Terraform
      displayName: init
      inputs:
        provider: 'azurerm'
        command: 'init'
        backendServiceArm: 'elk-terraform-service-connection'  # Replace with your service connection name
        backendAzureRmResourceGroupName: '$(bkstrgrg)'
        backendAzureRmStorageAccountName: '$(bkstrg)'
        backendAzureRmContainerName: '$(bkcontainer)'
        backendAzureRmKey: '$(bkstrgkey)'
    - task: TerraformTaskV4@4  # Validate the Terraform configuration
      displayName: validate
      inputs:
        provider: 'azurerm'
        command: 'validate'

- stage: tfdeploy  # Deployment stage (requires successful validation)
  condition: succeeded('tfvalidate')  # Only run if validation succeeds
  dependsOn: tfvalidate  # Wait for the validation stage to complete
  jobs:
  - job: apply
    steps:
    - task: TerraformTaskV4@4  # Initialize Terraform again for this job
      displayName: init
      inputs:
        provider: 'azurerm'
        command: 'init'
        backendServiceArm: 'elk-terraform-service-connection'  # Replace with your service connection name
        backendAzureRmResourceGroupName: '$(bkstrgrg)'
        backendAzureRmStorageAccountName: '$(bkstrg)'
        backendAzureRmContainerName: '$(bkcontainer)'
        backendAzureRmKey: '$(bkstrgkey)'
    - task: TerraformTaskV4@4
      displayName: plan
      inputs:
        provider: 'azurerm'
        command: 'plan'
        environmentServiceNameAzureRM: 'elk-terraform-service-connection'
    - task: TerraformTaskV4@4
      displayName: apply
      inputs:
        provider: 'azurerm'
        command: 'apply'
        environmentServiceNameAzureRM: 'elk-terraform-service-connection'
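For the init tasks above to succeed, the Terraform code must declare a matching azurerm backend. A minimal block, assuming the same names as the pipeline variables (yours may differ), would be:

```
terraform {
  backend "azurerm" {
    resource_group_name  = "elk-terraform"
    storage_account_name = "elkterraformstorage"
    container_name       = "tfstate"
    key                  = "devpipeline.terraform.tfstate"
  }
}
```

Since the pipeline already supplies these values through the task inputs at init time, an empty `backend "azurerm" {}` block also works; keeping the values out of the code makes it easier to reuse the same configuration across environments.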
Here is a snapshot of my pipeline:
Destroying the VM: Destroy Pipeline (Optional)
While the Apply pipeline focuses on creating the VM, you might also want to consider incorporating a Destroy pipeline for cleanup purposes. This pipeline stage allows you to tear down the entire infrastructure or specific resources when they're no longer needed. Here's a breakdown of the typical steps involved:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

variables:
  bkstrgrg: 'elk-terraform'
  bkstrg: 'elkterraformstorage'
  bkcontainer: 'tfstate'
  bkstrgkey: 'devpipeline.terraform.tfstate'

stages:
- stage: tfdestroy
  jobs:
  - job: destroy
    steps:
    - task: TerraformTaskV4@4
      displayName: init
      inputs:
        provider: 'azurerm'
        command: 'init'
        backendServiceArm: 'elk-terraform-service-connection'
        backendAzureRmResourceGroupName: '$(bkstrgrg)'
        backendAzureRmStorageAccountName: '$(bkstrg)'
        backendAzureRmContainerName: '$(bkcontainer)'
        backendAzureRmKey: '$(bkstrgkey)'
    - task: TerraformTaskV4@4
      displayName: plan
      inputs:
        provider: 'azurerm'
        command: 'plan'
        environmentServiceNameAzureRM: 'elk-terraform-service-connection'
    - task: TerraformTaskV4@4
      displayName: destroy
      inputs:
        provider: 'azurerm'
        command: 'destroy'
        environmentServiceNameAzureRM: 'elk-terraform-service-connection'
Here is a snapshot of my destroy pipeline:
My code structure:
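The original screenshot of the repository is not reproduced here; a typical layout for this kind of setup (file names are illustrative, not the author's exact tree) might look like:

```
elk-terraform/
├── main.tf                      # VM, networking, and ELK resources
├── variables.tf                 # Input variables
├── outputs.tf                   # Useful outputs (e.g. VM public IP)
├── elk-cloud-init.yaml          # Software installation on first boot
├── Azure-pipeline-apply.yaml    # Validate + apply pipeline
└── Azure-pipeline-destroy.yaml  # Destroy pipeline
```

Keeping both pipeline YAML files alongside the Terraform code means a single repository commit captures the infrastructure and the automation that deploys it.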
Conclusion:
This article has explored the power of Azure DevOps Pipelines and Terraform in automating ELK Stack deployments, a valuable toolset for DevOps teams supporting data analytics environments. By leveraging Infrastructure as Code (IaC) with Terraform and the CI/CD capabilities of Azure DevOps Pipelines, we've presented a solution that delivers significant benefits:
This approach empowers DevOps teams to efficiently manage and automate deployments for ELK Stack environments. As your data analysis needs evolve, the automation capabilities can be extended to include additional infrastructure components or integrate with other data pipelines. By embracing automation with Azure DevOps Pipelines and Terraform, DevOps teams can play a crucial role in supporting scalable and reliable data analytics platforms.