Exploring Azure Pipelines, Terraform, and PowerShell

As of this writing [Feb 2021], if you have used the Terraform extension from Microsoft DevLabs, you will have noticed that it supports only a small set of Terraform commands out of the box: “init”, “validate”, “plan”, “validate and apply”, and “destroy”.

But Terraform has many more commands: fmt, import, output, show, taint, workspace, etc., which the Terraform task (from Microsoft DevLabs) does not expose. What are the options if I want to use those? Yes, there are a few more extensions available, like the one by Charles Zipp, but as of now, those provision resources in Azure Cloud only. I, however, was interested in provisioning and managing resources in AWS. And so I explored an option, my favorite automation and scripting ally: POWERSHELL!!!
Working with PowerShell, I achieved my integration goal, namely using Azure Pipelines and Terraform with PowerShell to provision resources in AWS. In this post, I list the steps to deploy that.

Before we begin with the automation process, there are a few prerequisites we need to have. These are:
- an AWS IAM user with permission to create (administer) a VPC and manage (administer) an S3 bucket (for remote state)
- details of an existing AWS S3 bucket to store the remote state
- a GitHub repo containing the Terraform configuration files
- an Azure DevOps project
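
The repo mentioned above is assumed to hold the usual Terraform files, including the variable declarations that the pipeline later supplies via -var and -backend-config flags. A minimal sketch (the variable names match those used in the pipeline steps; the file layout and the sensitive flags are my assumptions, not necessarily what the linked repo uses):

```hcl
# variables.tf -- declarations for values injected by the pipeline
variable "region" {
  type = string
}

variable "access_key" {
  type      = string
  sensitive = true # "sensitive" is available from Terraform 0.14
}

variable "secret_key" {
  type      = string
  sensitive = true
}

# provider.tf -- AWS provider wired to those variables
provider "aws" {
  region     = var.region
  access_key = var.access_key
  secret_key = var.secret_key
}
```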

The first step is to install Terraform. There are two paths a project can take when deciding on the build infrastructure: self-hosted agents or Azure Pipelines (Microsoft-hosted) agents.

If you build on an Azure-supplied VM, then the installer task makes sense since we do not manage the build machine.
If you have a self-hosted box, you may decide to dedicate a VM to building all Terraform configuration pipelines. In that case, you may install Terraform on that VM. The installation steps are easy to follow:
- select the Terraform version from the Terraform downloads page
- download the installer and unzip it
- add the path to terraform.exe to the PATH environment variable
- open a new PowerShell console and type “terraform version”; you should see the same version that you selected to download
Here is a link for detailed steps: Install Terraform

The other option, even when a self-hosted build VM is used, is to run the Terraform tool installer task (from Microsoft DevLabs) as the first step in the pipeline.

Either way, I think the installer task provides more flexibility since we are not tied to a single version for compiling all the Terraform configuration files.
Once Terraform is installed, configuring an Azure pipeline follows the same sequence of steps we would use when working on our local machine.
I had worked on provisioning an AWS S3 bucket in my previous two notes, so here I provision a VPC and add two subnets. As is good practice, the configuration uses a remote backend.
I stored the Terraform configuration files alongside the azure-pipelines.yaml file. The flow in the YAML file is described below.
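
For reference, a VPC with two subnets can be sketched roughly as follows (the CIDR ranges, availability zones, and resource names here are illustrative assumptions, not necessarily what the linked repo uses):

```hcl
# A VPC with two subnets in separate availability zones
resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"

  tags = {
    Name = "demo-vpc"
  }
}

resource "aws_subnet" "subnet_a" {
  vpc_id            = aws_vpc.main.id
  cidr_block        = "10.0.1.0/24"
  availability_zone = "us-east-1a"
}

resource "aws_subnet" "subnet_b" {
  vpc_id            = aws_vpc.main.id
  cidr_block        = "10.0.2.0/24"
  availability_zone = "us-east-1b"
}
```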

Step 1: Install Terraform
I installed Terraform with the Terraform tool installer task because I have the build machine hosted in Azure pipelines. You need to have the Terraform extension installed for the Azure DevOps project to use this step.

# Install Terraform extension to use this task from:
# https://marketplace.visualstudio.com/items?itemName=ms-devlabs.custom-terraform-tasks&ssr=false#overview
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
  displayName: 'Install Terraform 0.14.0'
  inputs:
    terraformVersion: 0.14.0
    # Terraform versions: https://releases.hashicorp.com/terraform/


Step 2: Terraform init

- powershell: |
    terraform init -backend-config="bucket=skundu-terraform-remote-state" -backend-config="key=tf/ADO-TF-VPC-Int/terraform.tfstate" -backend-config="region=$(region)" -backend-config="access_key=$(access_key)" -backend-config="secret_key=$(secret_key)" -no-color
  workingDirectory: $(build.sourcesdirectory)
  displayName: 'terraform init'


Using Azure DevOps as an orchestrator, I provided the remote backend bucket that stores the tfstate file, the key (the path of the tfstate file), and the user’s region and credentials. Here the bucket (skundu-terraform-remote-state) and key (tf/ADO-TF-VPC-Int/terraform.tfstate) values are specific to the current Terraform configuration. The credentials (access key and secret key) are specific to the IAM user that provisions the resources in AWS.
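
Because the bucket, key, region, and credentials are all injected at init time, the backend block in the configuration itself can stay empty (Terraform calls this a partial backend configuration). A minimal sketch:

```hcl
# backend.tf -- all values are supplied via -backend-config at "terraform init"
terraform {
  backend "s3" {}
}
```

This keeps the bucket name and credentials out of version control, which is one reason to pass them from pipeline variables instead of hard-coding them.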

Step 3: Terraform validate

- powershell: |
    terraform validate -json -no-color
  workingDirectory: $(build.sourcesdirectory)
  displayName: 'terraform validate'


The validate command with the “-json” flag displays the result of a syntactic and consistency check on the Terraform configuration files in machine-readable JSON.
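
On a clean configuration, the JSON result looks roughly like this (shape as documented for Terraform 0.14; treat the exact fields as illustrative):

```json
{
  "format_version": "0.1",
  "valid": true,
  "error_count": 0,
  "warning_count": 0,
  "diagnostics": []
}
```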

Step 4: Terraform plan

- powershell: |
    terraform plan -var region=$(region) -var access_key=$(access_key) -var secret_key=$(secret_key) -out application.tfplan -no-color
  workingDirectory: $(build.sourcesdirectory)
  displayName: 'terraform plan'


Terraform plan creates an execution plan. The “-out application.tfplan” flag saves the plan to a file of that name, which is then used in the next step to provision the resources.

Step 5: Terraform apply

- powershell: |
    terraform apply "application.tfplan" -no-color
  workingDirectory: $(build.sourcesdirectory)
  displayName: 'terraform apply'


Terraform apply is the last step in provisioning a resource; the “application.tfplan” file generated in the previous step is referenced here.

And that brings us to the end of how to use Terraform, Azure DevOps, and PowerShell to provision a resource in AWS. It is exciting that the Terraform commands inside the PowerShell block in azure-pipelines.yaml run just as they would from the command line on a local laptop. Using PowerShell to execute the Terraform steps opened up many possibilities as I automated this with Azure DevOps. I hope you found this post useful.

Here is a link to my GitHub repo: AzureDevOps-Terraform-AWS-VPC-Integration. Fork it and give it a try. Let me know if you have any suggestions or questions.
