Recently, I needed to run Terraform deployments but couldn’t find a clean way to pass in the Terraform configuration I wanted dynamically without first creating anything on the filesystem. Luckily, I didn’t need to persist the state of my deployment (although remote state would probably work); I just needed the infrastructure to get set up, and set up well. I settled on a solution that uses Docker and passes the configuration files to the container as environment variables at run time.
Reviewing the Dockerfile first:
FROM golang:alpine
LABEL Author="HashiCorp Terraform Team <[email protected]>"
ENV TERRAFORM_VERSION=0.14.5
RUN apk add --update git bash openssh
ENV TF_DEV=true
ENV TF_RELEASE=true
WORKDIR $GOPATH/src/github.com/hashicorp/terraform
RUN git clone https://github.com/hashicorp/terraform.git ./ && \
    git checkout v${TERRAFORM_VERSION} && \
    /bin/bash scripts/build.sh
WORKDIR $GOPATH
COPY entrypoint.sh /
CMD ["/entrypoint.sh"]
The major change from many Terraform Dockerfiles is using a script as the entrypoint instead of the terraform command. The entrypoint.sh has the following contents:
#!/bin/bash
set -e
# Decode the base64-encoded Terraform configuration passed in as environment variables
echo "$MAIN_CONFIG" | base64 -d > main.tf
echo "$VARS_CONFIG" | base64 -d > variables.tf
terraform init && terraform apply -auto-approve
The entrypoint looks for environment variables holding the base64-encoded Terraform main and variables configuration files, decodes them, and writes each out to a file. Once those files exist, terraform init can be run, followed by terraform apply.
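To tie it together, here’s a rough sketch of how the container might be invoked; the terraform-envvars image tag and local file paths are just examples, and tr -d '\n' strips the line wrapping some base64 implementations add so each value survives as a single environment variable:
# Encode the local configuration files and hand them to the container as env vars;
# the credential pass-through is only needed if the configuration uses the AWS provider
docker run --rm \
  -e MAIN_CONFIG="$(base64 < main.tf | tr -d '\n')" \
  -e VARS_CONFIG="$(base64 < variables.tf | tr -d '\n')" \
  -e AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY \
  terraform-envvars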
I’ve taken this a bit further and adapted it to AWS ECS, and now I have my own way to create the infrastructure I need ad hoc from anywhere!
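As a loose sketch of what that ECS adaptation can look like (the cluster, task definition, and container names below are placeholders, and this assumes the image above is already wired into a task definition), the same two environment variables can be supplied per run through container overrides:
# Per-run container overrides carry the base64-encoded configuration;
# a Fargate task would also need --launch-type FARGATE and --network-configuration,
# and very large configuration files may exceed the size limit on the overrides payload
aws ecs run-task \
  --cluster my-cluster \
  --task-definition terraform-envvars \
  --overrides "{
    \"containerOverrides\": [{
      \"name\": \"terraform-envvars\",
      \"environment\": [
        {\"name\": \"MAIN_CONFIG\", \"value\": \"$(base64 < main.tf | tr -d '\n')\"},
        {\"name\": \"VARS_CONFIG\", \"value\": \"$(base64 < variables.tf | tr -d '\n')\"}
      ]
    }]
  }"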