Most people are using JSON for one thing or another, and whilst I had an understanding of how to reference JSON objects in Python for use with Lambda functions, I came across a need to do something similar with Terraform today, with files hosted on S3. It took me a little while to work out what to do and it wasn't entirely clear from my searching, so I thought I'd write it up and share it here.
S3-hosted JSON files
Before even writing any Terraform, it's important to ensure the files in S3 have the correct Content-Type metadata, as without this the content of the files cannot be read by Terraform. I used application/json for this demo; the Terraform documentation suggests text/* content types are supported too.
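If the object is managed by Terraform itself, one way to set that metadata is the content_type argument on the aws_s3_bucket_object resource. This is just a sketch; the bucket, key and source file names are placeholders: -

resource "aws_s3_bucket_object" "users" {
  bucket       = "<enter bucket name here>"
  key          = "<enter objects key here>"
  source       = "users.json"        # placeholder path to the local JSON file
  content_type = "application/json"  # ensures the data source below can read the body
}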
The content of the file stored in S3 is a simple user mapping: -
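Something like the following; the user names here are purely illustrative, the important part is a top-level users list where each entry has a name attribute: -

{
  "users": [
    {
      "name": "alice"
    },
    {
      "name": "bob"
    }
  ]
}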
You need permissions to access the S3 bucket to download the file; assuming you have that in place, a simple Terraform data object lets you reference it in Terraform.
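As a rough sketch of the permission involved (the bucket and key are placeholders, and this would be attached to whatever identity runs Terraform), s3:GetObject on the object is what's needed: -

data "aws_iam_policy_document" "read_users_object" {
  statement {
    actions   = ["s3:GetObject"]
    resources = ["arn:aws:s3:::<enter bucket name here>/<enter objects key here>"]
  }
}

The data object itself is then: -

data "aws_s3_bucket_object" "users" {
  bucket = "<enter bucket name here>"
  key    = "<enter objects key here>"
}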
At this point I used locals within Terraform to extract relevant information from the file, using the jsondecode function and referencing the above data object's body attribute (which is only populated for the content types mentioned above): -
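locals {
  # decode the raw JSON body from S3 into a Terraform value
  json_data = jsondecode(data.aws_s3_bucket_object.users.body)
}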
all_users provides a decoded representation of the users list in the JSON file stored on S3: -
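# with the illustrative file above: [{ name = "alice" }, { name = "bob" }]
all_users = local.json_data.users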
all_users_names iterates through the list and returns only the name attribute of each user: -
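# with the illustrative file above: ["alice", "bob"]
all_users_names = [for user in local.json_data.users : user.name]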
first_user only returns the first user’s details: -
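# with the illustrative file above: { name = "alice" }
first_user = local.json_data.users[0]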
first_user_name returns only the name of the first user: -
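# with the illustrative file above: "alice"
first_user_name = local.json_data.users[0].name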
The full code, with outputs to print each of these values, is below: -
data "aws_s3_bucket_object" "users" {
  bucket = "<enter bucket name here>"
  key    = "<enter objects key here>"
}

locals {
  json_data       = jsondecode(data.aws_s3_bucket_object.users.body)
  all_users       = local.json_data.users
  all_users_names = [for user in local.json_data.users : user.name]
  first_user      = local.json_data.users[0]
  first_user_name = local.json_data.users[0].name
}

output "all_users" {
  value = local.all_users
}

output "all_users_names" {
  value = local.all_users_names
}

output "first_user" {
  value = local.first_user
}

output "first_user_name" {
  value = local.first_user_name
}
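After a terraform apply, running terraform output prints the decoded values. With the illustrative users file from earlier, the result would look something like this: -

all_users = [
  {
    "name" = "alice"
  },
  {
    "name" = "bob"
  },
]
all_users_names = [
  "alice",
  "bob",
]
first_user = {
  "name" = "alice"
}
first_user_name = "alice"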