If you have worked with Amazon EC2 user data, you may have noticed a shortcoming in the approach: the inability to pass command-line arguments to the user data script at run time.
Let me explain why I believe that to be a problem. User data is a capability associated with an Amazon EC2 instance as part of the provisioning process. Because it runs when the instance launches, user data is an ideal place to carry out server provisioning steps such as:
1. Installing third-party software.
2. Configuring permissions and access.
3. Installing and configuring Windows features.
4. Installing endpoint agents, etc.
A few of these steps involve sharing secure credentials as part of authentication. Because user data cannot accept command-line arguments, these credentials have to be provided inside the script itself. That means anyone who has access to the user data script on the Amazon EC2 instance also has access to the credentials.
That is not desirable, and there is a way around the problem: AWS Systems Manager Parameter Store.
Per the AWS docs, Parameter Store is a capability of AWS Systems Manager that provides secure, hierarchical storage for configuration data management and secrets management. More information is available at AWS Systems Manager Parameter Store.
Hence, my thought process was to (i) store the sensitive credential in the SSM Parameter Store, (ii) associate an IAM role with the Amazon EC2 instance that has permission to read from the parameter store, (iii) pass the parameter store variable name to the Amazon EC2 user data script to decrypt, and (iv) add the capability in the user data script to read from the parameter store. That way, the sensitive credential could be passed to the user data script in the form of a variable without storing the variable's value in plain text for anyone to access. Using Terraform, this approach involved the four steps listed below. If you are interested, I have the code in my GitHub repo: add-ssm-parameter
Step 1: Create an SSM Parameter Store object for the sensitive value
Using Terraform, I created an AWS Systems Manager Parameter Store object named /dev/SecureVariableOne and stored in it the sensitive value specified in var.SecureVariableOne. I then provided the value of this variable to Terraform using the -var SecureVariableOne=ThisIsASecureValue flag in the terraform plan step.
Note: You should not store the value of SecureVariableOne in source control; the value must be passed at run time or made available via some other secure process.
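The full configuration is in the linked repo; below is a minimal sketch of what this step could look like, assuming the Terraform variable is declared as SecureVariableOne and the resource is labelled parameter_one (the label referenced later in Step 3).

```hcl
# Sensitive value supplied at run time, e.g. via -var "SecureVariableOne=..."
variable "SecureVariableOne" {
  type      = string
  sensitive = true
}

# Store the value as an encrypted SecureString in the SSM Parameter Store
resource "aws_ssm_parameter" "parameter_one" {
  name  = "/dev/SecureVariableOne"
  type  = "SecureString"
  value = var.SecureVariableOne
}
```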
On the AWS console, I confirmed that a resource was created with the above specifications.
Step 2: Create an IAM role with the appropriate permissions and associate it with the Amazon EC2 instance
This step is quite involved, and hence I created a separate note to describe it in detail. You may read about that at attach IAM role to an Amazon EC2 instance using Terraform. As an overview, the concept was to assign a set of permissions to an Amazon EC2 instance so that it can carry out a specific set of activities. To do that, I created an IAM policy file with a set of permissions/rules and assigned it to an IAM role, which was then associated with an IAM instance profile that was attached to the Amazon EC2 instance. The Amazon EC2 instance was then able to perform the set of actions listed in the IAM policy file.
The IAM policy file had the rule shown below: it allows any attached entity (user or role) to call ssm:GetParameter on the parameter specified in Resource, which is the same SSM parameter created in Step 1.
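The repo has the actual policy file; the sketch below expresses the same rule inline with jsonencode, assuming a hypothetical role resource named aws_iam_role.ec2_role (the role described in the linked IAM note).

```hcl
# Allow the instance role to read (and decrypt) only the parameter created in Step 1
resource "aws_iam_role_policy" "read_parameter_one" {
  name = "read-parameter-one"
  role = aws_iam_role.ec2_role.id # hypothetical role label; see the linked IAM note

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["ssm:GetParameter"]
        Resource = [aws_ssm_parameter.parameter_one.arn]
      }
    ]
  })
}
```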
Note: If you are wondering about the difference between ssm:GetParameter and ssm:GetParameters, please check this StackOverflow question: getparameter-vs-getparameters.
Step 3: Pass the parameter store variable name to the Amazon EC2 user data script to decrypt
I provided the name of the parameter, aws_ssm_parameter.parameter_one.name, in the user_data argument of the aws_instance resource. That value is referred to as $SecureVariable in the user data script, as highlighted in the below code block.
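My exact resource block is in the repo; here is a minimal sketch of the wiring, assuming the user data script lives in a file named user_data.ps1 and using hypothetical labels app_server, ami_id, and ec2_profile.

```hcl
resource "aws_instance" "app_server" {
  ami                  = var.ami_id # hypothetical variable and labels for illustration
  instance_type        = "t3.micro"
  iam_instance_profile = aws_iam_instance_profile.ec2_profile.name

  # Render the user data script, handing it the *name* of the SSM parameter
  # (not its value); the script reads and decrypts the value itself at boot.
  user_data = templatefile("${path.module}/user_data.ps1", {
    SecureVariable = aws_ssm_parameter.parameter_one.name
  })
}
```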
Step 4: Update user data script to read the value from the parameter store
Finally, I added a call to read the SSM parameter via the get-ssmparameter function in the user data script, as shown in the below block.
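The repo has the real script; the sketch below shows the pattern for the hypothetical user_data.ps1 template on a Windows instance, assuming the AWS Tools for PowerShell are available on the AMI and the templatefile() wiring from Step 3 supplies the parameter name as SecureVariable.

```powershell
<powershell>
# ${SecureVariable} is filled in by Terraform's templatefile() with the SSM
# parameter *name* (e.g. /dev/SecureVariableOne), not its value.
$SecureVariable = "${SecureVariable}"

# Read and decrypt the SecureString at boot time; this requires the IAM role
# from Step 2 and the AWS Tools for PowerShell on the instance.
$secretValue = (Get-SSMParameter -Name $SecureVariable -WithDecryption $true).Value

# Use $secretValue in the provisioning steps that need it; avoid logging it
# in real setups.
</powershell>
```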
After applying these four steps, I ran the terraform plan and terraform apply steps. As noted earlier (under Step 1), I passed the value of the secure variable at run time via the command line.
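For reference, using the example value from Step 1, the run-time invocation would look something like this:

```sh
terraform plan  -var "SecureVariableOne=ThisIsASecureValue"
terraform apply -var "SecureVariableOne=ThisIsASecureValue"
```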
Once terraform apply completed, I logged into the Amazon EC2 instance and verified that the user data script ran successfully. Here is an image of the log file that the user data script created. As you can see, the logic correctly decrypted the value of the SSM parameter via the user data script. Moreover, the value of the SSM parameter is not stored anywhere in plain text on the Amazon EC2 instance.
Note: I logged the value of the secure variable to the log file only as an example, to demonstrate that the use case works. Ideally, secure variables should not be logged or stored in plain text.
Also, note that although the value of the secure variable is not stored anywhere on the Amazon EC2 instance, anybody who can log into the instance can run the get-ssmparameter function and decrypt the secure string. There are a few approaches to addressing that gap.
And that brings us to the end of this note. If you have any questions or suggestions, please do not hesitate to ask. If you are new to Amazon EC2 user data, I have another note that you might find useful: working with Amazon EC2 user data and Terraform.