I required a few files and folders on an EC2 instance as part of the provisioning process. The objective, therefore, was to upload these files and folders to an AWS S3 bucket and download them onto the EC2 instance with the help of a user data script and Terraform.
Note: As a reader of this note, I assume you are familiar with AWS S3 and EC2 instances and their usage. If you want to know more, please refer to the AWS-Docs on S3 and EC2.
I broke down this use case into five minor use cases. The end objective was to have an EC2 instance with specific files and folders. AWS S3 is a storage solution from AWS, and hence I stored the files and folders there. As part of provisioning the AWS EC2 instance, my approach was to download the files and folders from AWS S3 using the AWS CLI. Provisioning an AWS EC2 instance is done using a user data script, so I required that. I also needed the AWS CLI installed on the EC2 instance, and the instance needed permissions to access the AWS S3 bucket. Putting all these together, the sequence of steps was as follows:
Step 1: Provision an AWS S3 bucket and store files and folders required by the AWS EC2 instance
The AWS S3 bucket was already created for this specific use case and so I uploaded the files stored in the local repository (
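The upload step can be sketched in Terraform with the `aws_s3_object` resource. This is illustrative only: the bucket and file names below are placeholders, not the ones from my setup, and in my case the bucket itself already existed.

```hcl
# Hypothetical bucket and file names, for illustration only.
resource "aws_s3_bucket" "provisioning" {
  bucket = "my-ec2-provisioning-bucket"
}

# Upload a local file that the EC2 instance will later download.
resource "aws_s3_object" "config_file" {
  bucket = aws_s3_bucket.provisioning.id
  key    = "config/app-config.json"
  source = "${path.module}/files/app-config.json"

  # Re-upload the object whenever the local file content changes.
  etag = filemd5("${path.module}/files/app-config.json")
}
```

One `aws_s3_object` resource per file keeps the uploads declarative, so `terraform apply` replaces an object only when its local source changes.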
Step 2: Provision an AWS EC2 instance and have a user data script ready
When taking this approach, there are multiple factors to consider: should the script run multiple times or only once? What is the sequence of steps to be executed via user data? What is the size of the user data script? I covered these separately in Working with AWS EC2 user data and Terraform.
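A minimal Terraform sketch of wiring a user data script to an instance might look like the following. The AMI ID, instance type, and script path are assumptions for illustration, not values from my setup.

```hcl
resource "aws_instance" "app" {
  ami           = "ami-0123456789abcdef0" # placeholder Windows AMI ID
  instance_type = "t3.medium"

  # Load the PowerShell user data script from a local file,
  # so the script logic stays out of the Terraform code itself.
  user_data = file("${path.module}/scripts/userdata.ps1")

  tags = {
    Name = "s3-provisioned-instance"
  }
}
```

Keeping the script in a separate file also makes it easier to review and test the PowerShell logic independently of the Terraform configuration.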
Step 3: Attach an IAM role to the AWS EC2 instance
The EC2 instance required a set of permissions to communicate with the AWS S3 bucket. I described that in detail in Attach IAM role to AWS EC2 instance using Terraform.
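Sketched in Terraform, the role attachment could look like this. The resource names are illustrative, and the linked note has the full version; here the AWS-managed `AmazonS3ReadOnlyAccess` policy stands in for whatever scoped-down policy you actually need.

```hcl
# Allow the EC2 service to assume this role.
resource "aws_iam_role" "ec2_s3_read" {
  name = "ec2-s3-read-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Action    = "sts:AssumeRole"
      Principal = { Service = "ec2.amazonaws.com" }
    }]
  })
}

# Read-only S3 access is enough for downloading the files.
resource "aws_iam_role_policy_attachment" "s3_read" {
  role       = aws_iam_role.ec2_s3_read.name
  policy_arn = "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess"
}

# The instance profile is what actually gets attached to the
# EC2 instance via its iam_instance_profile argument.
resource "aws_iam_instance_profile" "ec2_s3_read" {
  name = "ec2-s3-read-profile"
  role = aws_iam_role.ec2_s3_read.name
}
```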
Step 4: Install AWS CLI on the AWS EC2
Since the underlying instance was running Windows, I could achieve this objective in two ways: install the AWS.Tools module for PowerShell, or install the AWS CLI. I have two separate notes on these that you may find useful: Install AWS.Tools module and Install AWS CLI.
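For the AWS CLI route, a sketch of the install step inside the PowerShell user data script could look like this (the install path shown is the AWS CLI v2 default; treat the details as an assumption and see the linked note for the full version):

```powershell
# Download the official AWS CLI v2 MSI installer to a temp location.
$installer = "$env:TEMP\AWSCLIV2.msi"
Invoke-WebRequest -Uri "https://awscli.amazonaws.com/AWSCLIV2.msi" -OutFile $installer

# Install silently and wait for the installer to finish.
Start-Process msiexec.exe -ArgumentList "/i `"$installer`" /qn" -Wait

# Make aws.exe available in the current session without a restart,
# since user data runs before a fresh PATH would be picked up.
$env:Path += ";C:\Program Files\Amazon\AWSCLIV2"
```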
Step 5: Call the S3-CopyObject AWS command
This is the final step, and I completed it via the command shown in the image below, placed in the user data script.
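As a sketch of what such a command could look like with the AWS CLI (the bucket name and target folder here are placeholders; the actual command is in the image and the repository):

```powershell
# Recursively copy all provisioning files from the S3 bucket
# to a local folder on the instance. Paths are illustrative.
aws s3 cp s3://my-ec2-provisioning-bucket/ C:\provisioning\ --recursive
```

The IAM role attached in Step 3 means no credentials need to be embedded in the script; the CLI picks them up from the instance profile automatically.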
I have a working copy of the Terraform code in my repository at add-s3-access.
Another critical aspect of working with a user data script is clear logging. That comes in handy when dealing with error conditions while building the logic. Also, identify where the user data script (PowerShell) and the logging information are stored. I made these considerations and covered them in the code version stored in the GitHub repository. An image of the log file from the user data execution is below.
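One simple way to get that kind of logging in a PowerShell user data script is a transcript. A minimal sketch, with an assumed log path:

```powershell
# Capture everything the user data script writes into a log file.
Start-Transcript -Path "C:\logs\userdata.log" -Append
try {
    # ... provisioning steps go here: install the AWS CLI,
    # copy the files and folders down from S3, and so on ...
    Write-Output "Provisioning steps completed."
}
finally {
    # Always close the transcript, even if a step above fails,
    # so the log file is complete and readable afterwards.
    Stop-Transcript
}
```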
Comparing the user data script and the log file allows you to understand the logic and process flow. A few minutes after running the terraform apply command, I logged into the EC2 instance and verified that the two files from the S3 bucket were available on the instance.
I hope this note was useful. Let me know if you have any questions or suggestions.