Azure Functions with Go and Terraform IaC

In my previous article, Azure Functions with Go and SharePoint integration, I described how to configure the project and environment and publish a functions app to Azure manually. In reality, no one does this by hand, as it's time-consuming and error-prone. In this article, I will show how to automate the process with Terraform.

Terraform

Terraform is an open-source infrastructure as code software tool that provides a consistent CLI workflow to manage hundreds of cloud services. Terraform codifies cloud APIs into declarative configuration files. Terraform generates an execution plan describing what it will do to reach the desired state, and then executes it to build the described infrastructure. As the configuration changes, Terraform is able to determine what changed and create incremental execution plans which can be applied.

Azure Functions IaC

You can find the complete configuration in az-fun-go-sp/infra code sources. The sample not only binds together the base artifacts for Azure Functions with Go, but also provides linting and language server support for Terraform.

The most handy editor for Terraform is Visual Studio Code with the Terraform extension installed. Also, don't forget to install the recommended extensions for the project.

Prerequisites

Sample structure highlights

  • `.env` - environment variables file used by the Makefile commands; these are not the variables that end up in the app environment, and it's not the same as `terraform.tfvars`
  • `.env.sample` - sample environment variables file with dummy values
  • `.terraform.lock.hcl` - Terraform lock file, same concept as `package-lock.json` for NPM
  • `.tflint.hcl` - Terraform linter configuration file
  • `01_main.tf` - main Terraform configuration file, contains modules and core artifacts
  • `02_storage.tf` - storage account configuration file
  • `03_functions.tf` - Azure Functions configuration file
  • `Makefile` - Makefile with handy commands for provisioning and deployment
  • `terraform.sample.tfvars` - sample Terraform variables file with dummy values
  • `terraform.tfvars` - Terraform variables file, used during provisioning and deployment; this file can contain sensitive data, so don't commit it to a repository
  • `variables.tf` - Terraform variables file, contains variables definitions

Resource definition

Terraform uses HCL (HashiCorp Configuration Language) to define resources. It's a declarative language, which means that you describe the desired state of the infrastructure, and Terraform figures out how to get there.

HCL might look a bit unusual at first, but it's not hard to pick up, especially once you configure the editor and linters.

Resource definitions are stored in `.tf` files. They can be organized in whatever way is logical to you. I prefer grouping resources by type, but it's not a must.
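
For illustration, a minimal resource group definition might look like the sketch below. The variable names `resource_group` and `location` are assumptions for this sketch, not necessarily the ones from the sample; the resource name `rg` is the one referenced later from `03_functions.tf`.

# Resource group: the container for everything else provisioned by the sample
resource "azurerm_resource_group" "rg" {
  name     = var.resource_group # e.g. "az-fun-go-sp"
  location = var.location       # e.g. "westeurope"
  tags     = var.tags
}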

For an Azure Functions app to work, these resources are required:

  • Resource group
  • Virtual network and subnet
  • Storage account
  • App service plan
  • Functions app
  • Application insights (optional)

Sounds like a lot? That's exactly why samples like this are useful to get started. Now imagine managing all these assets manually!
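
To give a feel for how these artifacts wire together, here is a hedged sketch of a storage account referencing the resource group; the attribute values are illustrative and not necessarily what the sample's `02_storage.tf` uses (the resource name `storage` is the one referenced by the deployment settings shown further below).

# Storage account: hosts the Functions runtime storage and the deployment package
resource "azurerm_storage_account" "storage" {
  name                     = var.storage_account # globally unique, lowercase letters and digits
  resource_group_name      = azurerm_resource_group.rg.name # implicit dependency on the resource group
  location                 = azurerm_resource_group.rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
  tags                     = var.tags
}

Terraform derives the dependency graph from such references, so the resource group is created before the storage account without any explicit ordering.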

Variables

Terraform resolves dynamic variables based on `variables.tf`, specifically the `variable` definition stanzas. You can copy `terraform.sample.tfvars` to `terraform.tfvars` and provide the values there. Alternatively, export TF vars via environment variables: if an env var is prefixed with `TF_VAR_`, Terraform will automatically load it as a variable. For example, `TF_VAR_subscription_id` will be available as the `subscription_id` variable.

Terraform uses variables to inject dynamic values into the provisioning definition. It's agnostic to where a variable is used within the templates: some vars drive resource definitions, others are mapped to the application's environment variables.
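
As a small illustration of the pattern (the `location` variable here is hypothetical, but the mechanics match how `variables.tf` and `terraform.tfvars` work together in the sample):

## variables.tf

variable "location" {
  type        = string
  description = "Azure region for all resources"
  default     = "westeurope"
}

## terraform.tfvars

# location = "northeurope"

## ...or exported in the shell / .env instead of tfvars:
# TF_VAR_location=northeurope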

Providers

Terraform uses providers to interact with the target infrastructure. Providers are responsible for understanding API interactions and exposing resources. Each provider has a different set of resources and data sources that can be used. For Azure, `azurerm` is used.
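
Declaring and configuring the provider looks roughly like this; the version constraint is illustrative, and the exact pin lives in the sample's `01_main.tf` and `.terraform.lock.hcl`.

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0" # illustrative constraint
    }
  }
}

# The azurerm provider requires a features block, even if it's empty
provider "azurerm" {
  features {}
}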

Initialization

Terraform uses a plugin-based architecture. Providers are plugins that are executed on the local machine. Terraform downloads and installs the plugins automatically when it is initialized. The initialization process is the first step that should be performed when using Terraform with a new configuration or with a configuration whose plugin requirements have changed.

In the sample I'm using `azurerm` as the Terraform backend. The backend is how Terraform stores its configuration state. It's a good practice to use a remote backend, as it allows sharing the state between team members and pipelines and avoids conflicts.
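
A minimal sketch of an `azurerm` backend block follows; the concrete values are usually passed at `terraform init` time (e.g. via `-backend-config` arguments composed from `.env`), so the names below are placeholders rather than the sample's actual settings.

terraform {
  backend "azurerm" {
    # Usually provided via -backend-config so the values can come from .env:
    # resource_group_name  = "terraform-state-rg"
    # storage_account_name = "tfstateaccount"
    # container_name       = "tfstate"
    # key                  = "az-fun-go-sp.tfstate"
  }
}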

Before running Terraform init, make sure you have copied `.env.sample` to `.env` and provided the `TERRAFORM_BACKEND_STORAGE_ACCOUNT` variable. It should be a unique name for the storage account, and the account should be created ahead of time.

Then run:

make init        

You might see an authentication error message; most of the time the errors are informative enough to understand what's going on. In this case, you need to log in to the Azure CLI first:

az login        

During initialization Terraform will download the provider plugin and will create a state file in the storage account.

Linting

Wherever possible, using linters is a good practice. I'd recommend tflint. It's decent and has nice support for Azure. It's also common practice to run linters in CI/CD pipelines.
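
A minimal `.tflint.hcl` sketch enabling the Azure ruleset might look like this; the plugin version and rule are illustrative, so check the sample's `.tflint.hcl` for the real configuration.

# Azure-specific rules for tflint
plugin "azurerm" {
  enabled = true
  version = "0.25.1" # illustrative version
  source  = "github.com/terraform-linters/tflint-ruleset-azurerm"
}

# Example of a built-in rule: require descriptions on variables
rule "terraform_documented_variables" {
  enabled = true
}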

Functions app code packaging

Before applying Terraform templates, the Azure Functions package should be created with:

make pack        

This will create `./package/functions.zip`, which is used as the app code source: it is deployed to Blob storage and bound to the Azure Functions app via the `WEBSITE_RUN_FROM_PACKAGE` environment variable and the corresponding automation.

Provisioning

Terraform uses `plan` and `apply` commands to provision resources. The `plan` command is used to create an execution plan. The `apply` command executes the actions proposed in a Terraform plan.

In the sample I'm using `make` to run Terraform commands, which also serves as a self-documenting task listing. The Makefile loads the `.env` file, so you don't need to export variables manually.

Plan before apply

Planning runs the provisioning in dry mode: it shows what an apply would do without changing anything. This is useful for troubleshooting and for verifying what a deployment will do. E.g., in some cases a change can't be applied without destroying and recreating a resource, which might be unwanted behavior; with planning, you'd see which property causes that effect.

make plan        

Applying infrastructure as code

make apply        

The command is an alias for `terraform apply -auto-approve`, so it won't prompt for confirmation.

Azure Functions and Custom Handlers in Go are now published and ready to use.

Destroying when done testing

make destroy        
Don't run this against production, as all the resources provisioned with the Terraform templates will be removed.

Functions App choices

Dynamic Linux plan

I tend to prefer a serverless-first approach. Also, I'd choose Linux on the server side in all cases where it works.

## 03_functions.tf

# Service Plan
# https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/service_plan
resource "azurerm_service_plan" "service_plan" {
  name                = var.function_app
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  os_type             = "Linux"
  sku_name            = "Y1" # Y1 = Consumption (serverless) plan
  tags                = var.tags
}

# Functions App
# https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/linux_function_app
resource "azurerm_linux_function_app" "functions" {
  name = var.function_app
  # ...

  app_settings = {
    FUNCTIONS_WORKER_RUNTIME = "custom" # Custom Handler runtime
    # ...
  }
}        

SharePoint bindings (bonus)

## 03_functions.tf

resource "azurerm_linux_function_app" "functions" {
  name = var.function_app
  # ...

  app_settings = {
    # SharePoint auth binding configuration, see more in a root project
    SPAUTH_SITEURL           = var.sharepoint_siteurl
    SPAUTH_CLIENTID          = var.sharepoint_clientid
    SPAUTH_CLIENTSECRET      = var.sharepoint_clientsecret

    # ...
  }
}

## variables.tf

variable "sharepoint_siteurl" {
  type        = string
  description = "SharePoint SiteURL"
}

variable "sharepoint_clientid" {
  type        = string
  description = "SharePoint ClientID"
}

variable "sharepoint_clientsecret" {
  type        = string
  description = "SharePoint Client Secret"
}

## terraform.tfvars

sharepoint_siteurl      = "https://contoso.sharepoint.com/sites/site"
sharepoint_clientid     = "428b492b-575d-4d4b-991e-16195a3c496e"
sharepoint_clientsecret = "CgnihMbRphqR....7XLlZ/0QCgw="        

Package deployment

## 03_functions.tf

resource "azurerm_linux_function_app" "functions" {
  name = var.function_app
  # ...

  app_settings = {
    # Deployment from storage container blob settings
    FUNCTION_APP_EDIT_MODE   = "readonly"
    HASH                     = base64encode(filesha256(var.package))
    WEBSITE_RUN_FROM_PACKAGE = "https://${azurerm_storage_account.storage.name}.blob.core.windows.net/${azurerm_storage_container.deployments.name}/${azurerm_storage_blob.appcode.name}${data.azurerm_storage_account_sas.sas.sas}"

    # ...
  }
}

## 02_storage.tf

resource "azurerm_storage_container" "deployments" {
  # ...
}

resource "azurerm_storage_blob" "appcode" {
  name = "functions.zip"
  type = "Block"
  source = var.package # Package archive upload

  # ...
}

# Deployment blob access connection string
data "azurerm_storage_account_sas" "sas" {
   # ...
}

## variables.tf

variable "package" {
  type    = string
  default = "./package/functions.zip" # Package default location
}        
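
For completeness, the elided `azurerm_storage_account_sas` data source generates a read-only SAS token that is appended to the package blob URL in `WEBSITE_RUN_FROM_PACKAGE`. Below is a hedged sketch of what it might look like; the dates are placeholders, and whether the `tag`/`filter` permissions are required depends on the azurerm provider version, so treat this as an approximation rather than the sample's exact `02_storage.tf`.

data "azurerm_storage_account_sas" "sas" {
  connection_string = azurerm_storage_account.storage.primary_connection_string
  https_only        = true

  resource_types {
    service   = false
    container = false
    object    = true # token is scoped to blobs (objects) only
  }

  services {
    blob  = true
    queue = false
    table = false
    file  = false
  }

  start  = "2023-01-01T00:00:00Z" # placeholder validity window
  expiry = "2033-01-01T00:00:00Z"

  permissions {
    read    = true # read-only access to the package blob
    write   = false
    delete  = false
    list    = false
    add     = false
    create  = false
    update  = false
    process = false
    tag     = false # required only by newer azurerm provider versions
    filter  = false
  }
}

Note also the `HASH` app setting in `03_functions.tf`: because it is derived from the package checksum, rebuilding `functions.zip` changes the app settings and makes the next `make apply` roll out the new package.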

Conclusion

With infrastructure as code, you can provision Azure Functions with Custom Handlers in Go. IaC helps automate the deployment process and keeps it repeatable and reusable. Terraform is a great choice for that.
