Introduction
HCL (HashiCorp Configuration Language) is a high-level configuration language used in tools from HashiCorp (such as Terraform). HCL and Terraform are widely used for provisioning cloud infrastructure and configuring platforms and services through their APIs. This document focuses on HCL 0.13 syntax.
HCL is a declarative language, and Terraform consumes all *.tf files in the current folder, so code placement and ordering have no significance. Sub-folders can be consumed through modules.
This guide focuses on HCL specifics; you should already be familiar with what Terraform is.
// A top-level HCL file will interactively ask the user for values of any
// variables which do not have a default value
variable "ready" {
  description = "Ready to learn?"
  type        = bool
  // default = true
}

// A module block consults a specified folder for *.tf files and effectively
// prefixes all of its resource IDs with "module.learn-basics."
module "learn-basics" {
  source         = "./learn-basics"
  ready_to_learn = var.ready
}

output "knowledge" {
  value = module.learn-basics.knowledge
}
learn-basics
// Variables are not automatically passed into modules
// and can be typeless.
variable "ready" {
}

// It is good practice to define a type though. There are 3 primitive types,
// 3 collection types and 2 structural types. Structural types define
// types recursively.
variable "structural-types" {
  type = object({
    object = object({
      can-be-nested = bool
    })
    tuple = tuple([number, string])
  })

  default = {
    object = { can-be-nested = true }
    tuple  = [3, "cm"]
  }
}
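
// For reference, the three primitive types are string, number and bool.
// (This variable is an added illustration only; it is not used elsewhere.)
variable "primitive-types" {
  type = object({
    name    = string
    count   = number
    enabled = bool
  })
  default = {
    name    = "example"
    count   = 3
    enabled = true
  }
}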

// Collection types may specify an element type, but it can also be "any".
variable "list" {
  type    = list(string)
  default = ["red", "green", "blue"]
}

variable "map" {
  type    = map(any)
  default = {
    red     = "#FF0000"
    "green" = "#00FF00"
  }
}

variable "favourites" {
  type    = set(string)
  default = ["red", "blue"]
}

// When the type is not specified, or the value is a mix of scalars,
// they will be converted to strings.
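
// For example, with a list(string) constraint a mix of scalars is converted
// to strings: the default below becomes ["1", "true", "yes"].
// (This variable is an added illustration, not part of the original module.)
variable "mixed-scalars" {
  type    = list(string)
  default = [1, true, "yes"]
}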

// Use a modern IDE for type completion features. It does not matter
// in which file or in which order you define a variable; it becomes
// accessible from anywhere in the module.

// Default values for variables may not use expressions, but you can
// use locals for that. You don't specify types for locals. With locals
// you can create intermediate products from other variables, modules
// and functions.

locals {
  ready = var.ready ? "yes" : "no"

  yaml = yamldecode(file("${path.module}/file-in-current-folder.yaml"))
}

// 'locals' blocks can be defined multiple times, but all variable,
// resource and local names must be unique

locals {
  // toset() expects a sequence, so convert the map's values first
  set = toset(values(var.map))
}

module "more-resources" {
  source    = "../more-learning"
  yaml-data = local.yaml
}

// Modules can declare outputs that can optionally be referenced
// (see above). Typically outputs appear at the bottom of the file or
// in "outputs.tf".
output "knowledge" {
  value = "types so far, more to come"
}
Terraform exists for managing cloud «resources». A resource can be anything that can be created and destroyed through an API call (a compute instance, distribution, DNS record, S3 bucket, SSL certificate, permission grant and so on). Terraform relies on «providers» to implement specific vendor APIs. For example, the «aws» provider implements the resource types needed to manage AWS infrastructure.
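For illustration, a provider can be configured explicitly and its resources declared alongside it. This is only a minimal sketch; the region and bucket name below are placeholder assumptions.
// configure the aws provider; credentials usually come from the environment
provider "aws" {
  region = "us-east-1"
}

// a resource type implemented by that provider
resource "aws_s3_bucket" "example" {
  bucket = "my-example-bucket"
}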
When terraform is invoked (terraform apply) it will validate the code, create all resources in memory, load their existing state from a file (the state file), refresh that state against the current cloud APIs and then calculate the differences. Based on the differences, Terraform proposes a «plan» - a series of create, modify or delete actions to bring your infrastructure into alignment with the HCL definition.
Terraform will also automatically calculate dependencies between resources and will maintain the correct create / destroy order. A failure during execution allows you to retry the entire process, which will usually pick up where it left off.
more-learning
Time to introduce resources.
1variable "yaml-data" {
2
3 // config is sourced from a .yaml file, so technically it is a
4 // map(any), but we can narrow down type like this:
5 type = map(string)
6}
7
8// You do not need to explicitly define providers, they all have reasonable
9// defaults with environment variables. Using a resource that relies on a
10// provider will also transparently initialize it (when you invoke terraform init)
11resource "aws_s3_bucket" "bucket" {
12 bucket = "abc"
13}
14
15// You can also create provider aliases
16provider "aws" {
17 alias = "as-role"
18 assume_role {
19 role_arn = ".."
20 }
21}
22
23// then use them to create resources
24resource "aws_s3_bucket_object" "test-file" {
25
26 // all resources have attributes that can be referenced. Some of those
27 // will be available right away (like bucket) and others may only
28 // become available after the plan begins executing. The test-file resource
29 // will be created only after aws_s3_bucket.bucket finishes being created
30
31 // depends_on = aws_s3_bucket.bucket
32 bucket = aws_s3_bucket.bucket.bucket
33 key = "index.html"
34 content = file("${path.module}/index.html")
35
36 // you can also manually specify provider alias
37 provider = aws.as-role
38}
39
40// Each resource will receive an ID in state, like "aws_s3_bucket.bucket".
41// When resources are created inside a module, their state ID is prepended
42// with module.<module-name>
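// for example, the coloured buckets created by the learn-each module below are
// addressed as module.learn-each.aws_s3_bucket.coloured-bucket["red"], etc.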

module "learn-each" {
  source = "../learn-each"
}

// Nesting modules like this may not be the best practice, and it's only
// used here for illustration purposes
learn-each
Terraform offers some great features for creating a series of similar objects:
locals {
  list = ["red", "green", "blue"]
}

resource "aws_s3_bucket" "badly-coloured-bucket" {
  count         = length(local.list)
  bucket_prefix = "${local.list[count.index]}-"
}
// will create 3 buckets, prefixed with "red-", etc. and followed by
// a unique identifier. Some resources will automatically generate
// a random name if one is not specified. The actual name of the resource
// (or bucket in this example) can be referenced as an attribute

output "red-bucket-name" {
  value = aws_s3_bucket.badly-coloured-bucket[0].bucket
}

// note that the bucket resource IDs will be "aws_s3_bucket.badly-coloured-bucket[0]"
// through [2], because they are list index elements. If you remove "red" from
// the list, however, it will re-create all the buckets as they would now
// have new IDs. A better way is to use for_each

resource "aws_s3_bucket" "coloured-bucket" {
  // for_each only supports maps and sets
  for_each      = toset(local.list)
  bucket_prefix = "${each.value}-"
}

// the ID of this resource would be aws_s3_bucket.coloured-bucket["red"]

output "red-bucket-name2" {
  value = aws_s3_bucket.coloured-bucket["red"].bucket
}

output "all-bucket-names" {

  // returns a list containing bucket names - a "splat expression" over
  // the map's values (a for_each resource is a map, not a list)
  value = values(aws_s3_bucket.coloured-bucket)[*].bucket
}

// the same can be written as a for expression:
output "all-bucket-names2" {
  value = [for b in aws_s3_bucket.coloured-bucket : b.bucket]
}
// which can also include a filter
output "filtered-bucket-names" {
  value = [for b in aws_s3_bucket.coloured-bucket :
           b.bucket if length(b.bucket) < 10]
}

// here is one way to generate a map like {red = "red-123123.."}
output "bucket-map" {
  value = {
    for b in aws_s3_bucket.coloured-bucket :
    trimsuffix(b.bucket_prefix, "-") => b.bucket
  }
}

// as of Terraform 0.13 it is also possible to use count and for_each
// for modules

variable "learn-functions" {
  type    = bool
  default = true
}

module "learn-functions" {
  count  = var.learn-functions ? 1 : 0
  source = "../learn-functions"
}
This is a popular pattern, available since Terraform 0.13, for including modules conditionally.
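for_each works on modules in the same way. Here is a minimal sketch; the module path and colour set are hypothetical, purely for illustration.
module "per-colour" {
  // one module instance per element, addressed as module.per-colour["red"], etc.
  for_each = toset(["red", "green", "blue"])
  source   = "./per-colour"
}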
learn-functions
Terraform does not allow you to define your own functions, but there is an extensive list of built-in functions.
locals {
  list = ["one", "two", "three"]

  upper_list = [for x in local.list : upper(x)] // "ONE", "TWO", "THREE"

  map = {for x in local.list : x => upper(x)} // "one":"ONE", "two":"TWO", "three":"THREE"

  filtered_list = [for k, v in local.map : substr(v, 0, 2) if k != "two"] // "ON", "TH"

  prefixed_list = [for v in local.filtered_list : "pre-${v}"] // "pre-ON", "pre-TH"

  // join() produces a single string, so use concat() to merge two lists
  joined_list = concat(local.upper_list, local.prefixed_list) // "ONE", "TWO", "THREE", "pre-ON", "pre-TH"

  // Set is very similar to List, but element order is irrelevant
  joined_set = toset(local.joined_list) // "ONE", "TWO", "THREE", "pre-ON", "pre-TH"

  // the legacy map() function takes key/value arguments rather than a list and
  // is deprecated; build a map from consecutive pairs with chunklist() instead
  map_again = {for pair in chunklist(slice(local.joined_list, 0, 4), 2) :
               pair[0] => pair[1]} // "ONE":"TWO", "THREE":"pre-ON"
}

// List manipulation is usually useful either for a resource with for_each or
// for specifying a dynamic block on a resource. This creates a bucket with some tags:

resource "aws_s3_bucket" "bucket" {
  bucket = "test-bucket"
  tags   = local.map_again
}

// this is identical to:
// resource "aws_s3_bucket" "bucket" {
//   bucket = "test-bucket"
//   tags = {
//     ONE   = "TWO"
//     THREE = "pre-ON"
//   }
// }

// Some resources also support dynamic blocks. The next example uses a "data" block
// to look up 3 buckets (red, green and blue), then creates a policy that grants
// read-only access to the red and green buckets and full access to the blue bucket.

locals {
  buckets = {
    red   = "read-only"
    green = "read-only"
    blue  = "full"
  }
  // we could also load the buckets map from a file:
  // buckets = jsondecode(file("buckets.json"))

  actions = {
    "read-only" = ["s3:GetObject", "s3:GetObjectVersion"]
    "full"      = ["s3:GetObject", "s3:GetObjectVersion", "s3:PutObject", "s3:PutObjectVersion"]
  }
  // we will look up actions by key, so that we don't have to repeat them
}

// use a function to convert the map keys into a set
data "aws_s3_bucket" "bucket" {
  for_each = toset(keys(local.buckets))
  bucket   = each.value
}

// create the JSON for our policy
data "aws_iam_policy_document" "role_policy" {
  statement {
    effect = "Allow"
    actions = [
      "ec2:*",
    ]
    resources = ["*"]
  }

  dynamic "statement" {
    for_each = local.buckets
    content {
      effect    = "Allow"
      actions   = lookup(local.actions, statement.value, null)
      // reference the looked-up bucket's ARN, not the whole data object
      resources = [data.aws_s3_bucket.bucket[statement.key].arn]
    }
  }
}

// and this actually creates the AWS policy with permissions to all the buckets
resource "aws_iam_policy" "policy" {
  policy = data.aws_iam_policy_document.role_policy.json
}