Terraform Primer

Introduction

This is a small article written from the notes I took while learning Terraform so that I can apply it to penetration testing.

I'm by no means an expert on Terraform or Linux. These are just my notes that I thought I'd share with you. Hopefully they prove helpful.

Cheers

Pre-requisites

For this article, I’ll be using:

  • Ubuntu 20.04 on VMware
  • Terraform installed on Ubuntu
  • VS Code with the Terraform extension
  • An API token generated for Digital Ocean
  • An SSH public/private key pair already added to Digital Ocean

Add the API token as an environment variable in Ubuntu. From the terminal, enter:

export TF_VAR_do_token=<token>
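Terraform automatically maps environment variables that begin with TF_VAR_ onto input variables of the same name, so the token above will populate a variable declared in our code. A minimal sketch of that declaration (the type and sensitive arguments are optional additions of mine for illustration; the bare variable "do_token" {} used later in this article works just as well):

# main.tf -- populated automatically from the TF_VAR_do_token environment variable
variable "do_token" {
  type      = string
  sensitive = true   # optional: keeps the token out of plan output
}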




To install Terraform, follow these steps as listed on the Terraform website:

curl -fsSL https://apt.releases.hashicorp.com/gpg | sudo apt-key add -
sudo apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main"
sudo apt-get update && sudo apt-get install terraform

Remember, the curl application may not be installed by default on Ubuntu. Simply do “sudo apt install curl” and you’ll be good to run the above commands.

Terraform

Terraform automates the provisioning of virtual infrastructure; the approach is known as Infrastructure as Code. It lets a user create virtualized systems that consistently meet the same requirements every time. Because the infrastructure is defined in code, it also scales seamlessly; high-availability systems are easily and quickly spun up to meet operational demands.

Through code, a user can define a virtual machine that meets the exact requirements to support services and applications of an organization.

The Terraform Registry

The Terraform Registry is a centralized repository of Providers and Modules supported by Terraform. It contains the documentation for each provider and its resources, along with additional modules and their resources, including usage examples.

Providers

A Provider is a vendor or service that Terraform can connect to and manage. Some examples of providers include AWS, Azure, Google Cloud, Oracle, Digital Ocean, and many others.

Modules

Modules are self-contained packages of Terraform configurations that are managed as a group. They provide or extend features beyond the base features supported by a Provider. For example, there is a module that specifically supports the creation of AWS VPC resources, separate and apart from the default resources of the AWS Provider.
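To give a feel for it, here is a hedged sketch of calling that community AWS VPC module from the Registry; the name and cidr values are purely illustrative, and the module's registry page documents the full set of inputs:

# Hypothetical usage of a registry module (values are illustrative;
# using it for real also requires the AWS provider and credentials)
module "vpc" {
  source = "terraform-aws-modules/vpc/aws"

  name = "example-vpc"
  cidr = "10.0.0.0/16"
}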

Terraform Workflow

The Terraform Workflow is the general progression a user follows to go from design to functional infrastructure.

The core Terraform workflow has three steps:

  1. Write – Design and author infrastructure as code
  2. Plan – Preview changes before applying
  3. Apply – Provision said infrastructure

Key Terms and Commands

terraform workspace

A terraform workspace is a defined space used to execute a Terraform workflow. Think of a workspace as just that: a dedicated location to perform a task. Workspaces are compartmentalized, so one workspace’s workflow and state are isolated from another’s.
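Inside a configuration, the name of the current workspace is available as terraform.workspace, which is handy for keeping per-workspace resources apart. A hypothetical sketch:

# Hypothetical: embed the current workspace name in a Droplet's name
resource "digitalocean_droplet" "example" {
  name   = "web-${terraform.workspace}"   # e.g. "web-default" or "web-staging"
  image  = "ubuntu-20-04-x64"
  region = "nyc3"
  size   = "s-1vcpu-1gb"
}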

terraform init

The init command is used to initialize the working directory containing Terraform configuration files. It reads the Terraform files in the current directory and downloads the defined requirements (provider plugins, modules, etc.).

terraform plan

terraform plan reads your Terraform files, evaluates them and produces an execution plan for your review. It reports what will be done and surfaces any errors in your code.

terraform apply

After writing your code and developing the execution plan (terraform plan), the next step is to execute and apply the code. This is what terraform apply does. It works through the plan and makes the changes on the target Provider.

Data Source

A Terraform Data Source reads information from a Provider about something that already exists. To get a full understanding of which data sources are available, read the documentation for the Provider in question.

Resource

Resources are things that can be created by the Provider. Similar to Data Sources, the full listing of Resources that can be created, and how to create them, can be found in the Provider documentation. A generic sketch of the shared block syntax follows.
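Data sources and resources share the same block syntax, and both are referenced elsewhere in the code by type and name. A generic sketch (the names here are placeholders, not the ones used later in this article):

# Read something that already exists on the Provider
data "digitalocean_ssh_key" "existing" {
  name = "some-key-name"   # placeholder
}

# Create something new, referencing the data source above
resource "digitalocean_droplet" "example" {
  name     = "demo"
  image    = "ubuntu-20-04-x64"
  region   = "nyc3"
  size     = "s-1vcpu-1gb"
  ssh_keys = [data.digitalocean_ssh_key.existing.id]   # data.<TYPE>.<NAME>.<attribute>
}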

Work with Terraform

I like to keep my work organized, as it makes management easier. In my terminal, I create a folder, “projects”, and within it another called “terraform”. This will be my primary work location. Ideally, a single directory will be the source for your Terraform workflow.

Selecting the Provider

For this, I’ll be using Digital Ocean; however, what we’ll cover here is applicable to any supported Provider.

To use Terraform to communicate with a Provider, you’ll need an API key or token. Terraform has a complete list of official, verified and community-supported providers on its Registry here: https://registry.terraform.io/browse/providers

In the search bar to the top, search for your Provider. In my case, that’ll be Digital Ocean.


And I select the verified option.

On the page that loads, there will be a button that says “Use provider”. Clicking it opens a dialog box with instructions on how to use the provider:


Creating your first Terraform file

In VS Code, I set the default project folder to “projects”. The Explorer Panel to the left of VS Code will show the sub-folder “terraform”.


Click on the terraform folder in the panel and a row of icons will appear in line with the “projects” heading. The first icon creates a new file in the terraform folder. Click it and name the file main.tf. This is where we’ll put the core of our code.


Structure of a Terraform file

The most common practice is to structure a Terraform file following this pattern (a bare skeleton is sketched right after the list):

  1. Provider section
  2. Variables
  3. Data Sources
  4. Resources
  5. Output
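A bare skeleton of that layout, with each section reduced to a placeholder comment (the real Digital Ocean blocks are filled in throughout the rest of this article):

# 1. Provider section
terraform {
  required_providers {
    # provider requirements go here
  }
}

provider "digitalocean" {
  # provider configuration goes here
}

# 2. Variables
#    variable blocks go here

# 3. Data Sources
#    data blocks go here

# 4. Resources
#    resource blocks go here

# 5. Output
#    output blocks go here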

Provider section

At the top of “main.tf”, I simply copy the code from the Provider documentation:

terraform {
  required_providers {
    digitalocean = {
      source  = "digitalocean/digitalocean"
      version = "2.18.0"
    }
  }
}

Then I define a provider block to access my account on Digital Ocean with my API token. Because I’ve set the token as an environment variable, Terraform can pick it up without my hard-coding it into a file (i.e. more secure).

provider "digitalocean" {

??? token = var.do_token?

}        

Variables section

Next I’ll define some variables to use when creating a Droplet on Digital Ocean.

Things I considered for my Droplet:

  • Region of deployment
  • Number to be created on a single execution
  • Droplet size (prebuilt Droplets from Digital Ocean)

# Variables
variable "do_token" {}

variable "region" {
  type    = string
  default = "nyc3"
}

variable "droplet_count" {
  type    = number
  default = 1
}

variable "droplet_size" {
  type    = string
  default = "s-1vcpu-1gb"
}

One feature I like about Terraform is the ability to set variable values in a separate file with the “tfvars” extension. You declare the variables in the main file and assign their values individually in the “.tfvars” file.

I created a variable file called “nyc3.tfvars” within the current terraform directory and predefined the variable values there. The names in the variable file must match the declarations in the main file.

region = "nyc3"
droplet_count = 1
droplet_size = "s-1vcpu-1gb"
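This makes it easy to keep one value file per region or environment. As a purely hypothetical example, a second file such as sfo3.tfvars could hold a different set of values. Note that only terraform.tfvars and *.auto.tfvars files are loaded automatically; a named file like nyc3.tfvars is passed in explicitly, e.g. terraform plan -var-file="nyc3.tfvars".

# sfo3.tfvars -- hypothetical second region file
region        = "sfo3"
droplet_count = 2
droplet_size  = "s-1vcpu-1gb"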

Data sources

Next, consider the things you need for the Droplet: how are you going to access it afterwards, which project should it live in on Digital Ocean, any tags?

Things I knew I wanted to include for my Droplet:

  • Add the SSH key
  • Add the Droplet to a specific Project in Digital Ocean

Things that were optional for my Droplet:

  • Setting a tag

These data sources already exist on your Digital Ocean account, so you can simply retrieve them, assign them a name within the code and refer to them as needed. The data source details, including available properties, can be found in the Data Sources section of the Provider page on the Terraform Registry website.

data "digitalocean_ssh_key" "myKey" {

??? name = "KaliVM"

}

?

data "digitalocean_project" "testing" {

??? name = "IoT Network"

}

?

data "digitalocean_tag" "myTag" {

??? name = "pentest-nginx"

}        

Resources

Resources are what terraform will create. In this case, a Droplet.

Things I considered for my Droplet:

  • Operating System image
  • The name of the Droplet
  • Its size
  • Region
  • SSH key

Like variables and data sources, resources are structured and defined the same way:

# Create a new proxy Droplet in the nyc3 region
resource "digitalocean_droplet" "proxy" {
  count    = var.droplet_count
  image    = "ubuntu-20-04-x64"
  name     = "nginx-${var.region}-${count.index + 1}"
  tags     = [data.digitalocean_tag.myTag.name]
  region   = var.region
  size     = var.droplet_size
  ssh_keys = [data.digitalocean_ssh_key.myKey.id]

  lifecycle {
    create_before_destroy = true
  }
}

To explain this code block:

  • count = var.droplet_count (pulled from the variable we declared previously)
  • image = "ubuntu-20-04-x64" (selects the image to use from the Digital Ocean repository)
  • name = "nginx-${var.region}-${count.index + 1}" (names the Droplet "nginx-", followed by our region variable, followed by the instance number; count.index starts at 0 for the first Droplet created by count, so adding 1 makes the names start at 1)
  • tags = [data.digitalocean_tag.myTag.name] (assigns our defined tag)
  • region = var.region (assigns the Droplet to our predefined region)
  • size = var.droplet_size (sets its size from our definition)
  • ssh_keys = [data.digitalocean_ssh_key.myKey.id] (adds the SSH key we’ve defined)

Noticed the lifecycle declaration at the end? create_before_destroy changes how Terraform replaces a Droplet when a change requires destroying and recreating it: the new Droplet is created first and the old one is destroyed only afterwards, which reduces downtime. It does not stop the replacement from happening, so any data stored only on the old Droplet is still lost when it is destroyed.
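If the goal were instead to stop Terraform from ever destroying the Droplet, the lifecycle block also supports prevent_destroy. A hedged sketch of that alternative, dropped into the same resource block in place of create_before_destroy:

  # Alternative (hypothetical): placed inside the resource block above
  lifecycle {
    prevent_destroy = true   # any plan that would destroy this Droplet errors out instead
  }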

The next section adds the Droplet(s) to our Project in Digital Ocean.

# After creation move Droplet to Project
resource "digitalocean_project_resources" "myProj" {
  project   = data.digitalocean_project.testing.id
  resources = digitalocean_droplet.proxy[*].urn
}

The "[*]" in the line "resources = digitalocean_droplet.proxy[*].urn" is a splat expression: it collects the URN of every Droplet created by count into a list, so all of the newly created Droplets are assigned to our defined Digital Ocean Project ("project = data.digitalocean_project.testing.id").

Output

Lastly, we define what output we’d like upon successful execution. In this case, we’d like the IP of our newly created Droplet(s).

output "server_ip" {

??? value = digitalocean_droplet.proxy.*.ipv4_address

}        

The above will return the IPv4 address of each Droplet. We can then use SSH to connect to it with the root account.
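As an optional convenience (my own addition, not part of the original file), a second output can turn those addresses into ready-to-paste SSH commands:

# Optional: print an SSH command for each Droplet
output "ssh_commands" {
  value = [for ip in digitalocean_droplet.proxy[*].ipv4_address : "ssh root@${ip}"]
}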

This is what the final code (main.tf) should look like:

terraform {
  required_providers {
    digitalocean = {
      source  = "digitalocean/digitalocean"
      version = "2.18.0"
    }
  }
}

provider "digitalocean" {
  token = var.do_token
}

# Variables
variable "do_token" {}

variable "region" {
  type    = string
  default = "nyc3"
}

variable "droplet_count" {
  type    = number
  default = 1
}

variable "droplet_size" {
  type    = string
  default = "s-1vcpu-1gb"
}

data "digitalocean_ssh_key" "myKey" {
  name = "KaliVM"
}

data "digitalocean_project" "testing" {
  name = "IoT Network"
}

data "digitalocean_tag" "myTag" {
  name = "pentest-nginx"
}

# Create a new proxy Droplet in the nyc3 region
resource "digitalocean_droplet" "proxy" {
  count    = var.droplet_count
  image    = "ubuntu-20-04-x64"
  name     = "nginx-${var.region}-${count.index + 1}"
  tags     = [data.digitalocean_tag.myTag.name]
  region   = var.region
  size     = var.droplet_size
  ssh_keys = [data.digitalocean_ssh_key.myKey.id]

  lifecycle {
    create_before_destroy = true
  }
}

# After creation move Droplet to Project
resource "digitalocean_project_resources" "myProj" {
  project   = data.digitalocean_project.testing.id
  resources = digitalocean_droplet.proxy[*].urn
}

output "server_ip" {
  value = digitalocean_droplet.proxy.*.ipv4_address
}

Planning and Applying

Now that the file has been created, we need to plan it (terraform plan) and apply/execute the plan (terraform apply).

In the terminal, type terraform plan


Your output, provided there are no errors in your code, should be similar to this:


All green means we’re good to execute.

To execute and apply the plan, at the terminal we enter “terraform apply” and hit enter.


It prints the plan to screen and asks for confirmation. Enter “yes” and hit enter. The application process will begin.


It is generally a quick process.


Conclusion

I hope you found this helpful. My next write-up will be on using Ansible with Terraform to configure a Droplet, and finally on using them to deploy infrastructure for a social engineering campaign.
