Tim Burns

AWS and Terraform: Setting up your first AWS Remote File

AWS is the hidden backbone of many modern companies. It really changes your life as an architect or systems engineer, because all the boring stuff like hardware gets done in anonymous data centers, and all the cool stuff like application development is at your fingertips.

Terraform fills a gap in the creation of these services by providing a simple declarative configuration language to specify your cloud infrastructure.

In this chapter, I will go step by step through a working example that may seem trivial but, by the end of this article series, will be the backbone of a very powerful and flexible data architecture.

Step 0: Download the appropriate tools

Because we are computer scientists, we always start with 0. It's mostly about math, people. Maybe also about having lots of cool stuff installed on your computer. Download these applications and install them.

Download and install the AWS CLI, Terraform, and Python 3.
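If you want a quick sanity check that everything landed on your PATH, a small shell loop like this will do (the three command names are the only assumption):

```shell
# Report whether each prerequisite is on the PATH.
# Exact version numbers will vary with your install.
for tool in aws terraform python3; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: installed"
  else
    echo "$tool: MISSING"
  fi
done
```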

Step 1: Create your account and local credentials

Go to the AWS website and create an AWS account. Once you have done that, create an IAM user and get the AWS Access Key ID and AWS Secret Access Key. With those keys in hand, run

aws configure --profile awsdk-book

This will create (or append to) a profile in a folder called ".aws" in your home directory, which contains the credentials for your AWS account.
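For reference, the profile ends up in ~/.aws/credentials and looks roughly like this (the key values below are placeholders, not real credentials):

```ini
[awsdk-book]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

You can confirm the profile works by running `aws sts get-caller-identity --profile awsdk-book`, which returns the account and user the credentials belong to.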

Step 2: Create your S3 service with Terraform

Software isn't much fun without data, so let's create some data for our software to play with.

Terraform lets us declare our AWS infrastructure as code, and we will start with an S3 bucket. S3 is an object store that is much like a file system, but with important differences that I will ignore for now. The bucket can be thought of as the mount point, and the key can be thought of as the full path. Create an S3 bucket that references the profile you just created. Put the configuration in a Terraform file (main.tf is the conventional name):

provider "aws" {
  version = "~> 2.7"
  region  = "us-east-1"
  profile = "awsdk-book"
}

resource "aws_s3_bucket" "s3_bucket" {
  bucket = "awsdk-book"
  acl    = "private"
}

Now create a JSON file in your local directory and call it json-files/table_of_contents.json

{
  "title": "An AWS DataKit",
  "chapters": [
    {
      "title": "Chapter 1: Setting up S3 with Terraform"
    }
  ]
}
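A stray comma will make this file invalid JSON, so it is worth validating it before uploading. Since Python 3 is already one of our prerequisites, its built-in json.tool module works as a quick linter:

```shell
# Validate the table-of-contents file; json.tool exits non-zero on bad JSON.
python3 -m json.tool json-files/table_of_contents.json > /dev/null \
  && echo "valid JSON"
```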

And finish the Terraform file by adding a reference to the JSON file

resource "aws_s3_bucket_object" "jsonpath_config" {
  bucket = "${aws_s3_bucket.s3_bucket.bucket}"
  key    = "json-files/table_of_contents.json"
  source = "json-files/table_of_contents.json"
}

Apply the change to your AWS account by running

terraform init
terraform apply -auto-approve

This will create your bucket and key.
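To make the mount-point/full-path analogy concrete: the object we just uploaded is addressable as s3://awsdk-book/json-files/table_of_contents.json, and that URI splits into bucket and key exactly the way a path splits off a mount point. A pure-shell sketch (no AWS calls, using this article's bucket name):

```shell
# Split an S3 URI into its bucket ("mount point") and key ("full path").
uri="s3://awsdk-book/json-files/table_of_contents.json"
path="${uri#s3://}"    # drop the scheme
bucket="${path%%/*}"   # everything before the first slash
key="${path#*/}"       # everything after it
echo "bucket=$bucket"  # → bucket=awsdk-book
echo "key=$key"        # → key=json-files/table_of_contents.json
```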

Now, because Jeff Bezos has WAY too much money and you don't want to pay him for your AWS account, destroy the resources you just created. Then you won't be charged any money.

terraform destroy -auto-approve

Stay tuned! My next post will be on how to write a Python script test-first that will parse this JSON object directly from S3.
