How can I reduce repetitive HCL code in Terraform?

Asked: 2019-01-24 04:55:58

Tags: amazon-web-services amazon-s3 terraform

I have some Terraform code like this:

resource "aws_s3_bucket_object" "file1" {
  key    = "someobject1"
  bucket = "${aws_s3_bucket.examplebucket.id}"
  source = "./src/index.php"
}

resource "aws_s3_bucket_object" "file2" {
  key    = "someobject2"
  bucket = "${aws_s3_bucket.examplebucket.id}"
  source = "./src/main.php"
}

# same code here, 10 files more
# ...

Is there a simpler way to do this?

1 Answer:

Answer 0 (score: 2)

Terraform supports looping over resources and data sources with the count meta-parameter.

So, for a slightly simpler example, if you wanted to loop over a well-known list of files you could do something like the following:

locals {
  files = [
    "index.php",
    "main.php",
  ]
}

resource "aws_s3_bucket_object" "files" {
  count  = "${length(local.files)}"
  key    = "${local.files[count.index]}"
  bucket = "${aws_s3_bucket.examplebucket.id}"
  source = "./src/${local.files[count.index]}"
}
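
Resources created with count can then be referenced as a list via splat syntax. As a minimal illustrative sketch (the output name here is made up, not part of the original answer), you could expose the uploaded keys like this:

output "uploaded_keys" {
  # Splat syntax collects the "key" attribute from every instance
  # created by the count loop above (Terraform 0.11 style).
  value = ["${aws_s3_bucket_object.files.*.key}"]
}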

Unfortunately, Terraform's AWS provider does not support an equivalent of aws s3 sync or aws s3 cp --recursive, although there is an issue tracking the feature request.
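
If you really do need to upload a whole directory tree, one common workaround (not part of the original answer, only a sketch, and it assumes the AWS CLI is installed and credentialed on the machine running Terraform) is to shell out to aws s3 sync from a null_resource:

# Requires the "null" provider.
resource "null_resource" "sync_src" {
  # Re-run the sync whenever the bucket changes; in practice you would
  # also want a trigger derived from the directory contents so that
  # file changes cause a re-sync.
  triggers = {
    bucket = "${aws_s3_bucket.examplebucket.id}"
  }

  provisioner "local-exec" {
    command = "aws s3 sync ./src s3://${aws_s3_bucket.examplebucket.id}/"
  }
}

The trade-off is that Terraform no longer tracks the individual objects in its state, so it cannot detect drift or clean them up on destroy.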