Apache Beam Dataflow: GoogleCloudOptions parameters not found when running a Terraform google_dataflow_job resource

Asked: 2019-06-27 19:42:59

Tags: google-cloud-platform terraform apache-beam dataflow

Error

google_dataflow_job.validate-job: googleapi: Error 400: (a9596c1f713a1bd6): Unable to create workflow. Causes: (a9596c1f713a1919): Found unexpected parameters: ['subnetwork' (perhaps 'runner'), 'use_public_ips' (perhaps 'beam_plugins')], badRequest

Terraform code

resource "google_dataflow_job" "validate-job" {
  name              = "validate-job"
  project           = "${var.gcp_project_us["${terraform.workspace}"]}"
  template_gcs_path = "gs://my_bucket/templates/validate"
  temp_gcs_location = "gs://my_bucket/temp"
  zone              = "us-central1-a"
  parameters = {
    runner         = "DataflowRunner"
    streaming      = true
    use_public_ips = true
    subnetwork     = "https://www.googleapis.com/compute/v1/projects/some_project/regions/us-central1/subnetworks/some_sub_network"
  }
}
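A likely cause of the 400 error: the `parameters` map of `google_dataflow_job` is forwarded to the Dataflow template as *runtime parameters*, and a template only accepts keys its pipeline explicitly declared. Execution settings such as `runner`, `subnetwork`, and `use_public_ips` are not template parameters, so Dataflow rejects them as "unexpected". A hedged sketch of the resource with those keys moved out of `parameters`, assuming a version of the Google provider that exposes `subnetwork` and `ip_configuration` as top-level arguments:

```hcl
resource "google_dataflow_job" "validate-job" {
  name              = "validate-job"
  project           = "${var.gcp_project_us["${terraform.workspace}"]}"
  template_gcs_path = "gs://my_bucket/templates/validate"
  temp_gcs_location = "gs://my_bucket/temp"
  zone              = "us-central1-a"

  # Execution-level settings, assuming provider support for these arguments:
  subnetwork       = "https://www.googleapis.com/compute/v1/projects/some_project/regions/us-central1/subnetworks/some_sub_network"
  ip_configuration = "WORKER_IP_PUBLIC"

  # parameters = { ... }  # only keys the template's pipeline declares belong here
}
```

The runner and streaming mode are baked into the template when it is built, so they do not need to be supplied at launch time.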

Apache Beam job code

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()
with beam.Pipeline(options=options) as gcp:
    ...  # pipeline steps elided in the question

What is the proper way to get `use_public_ips` and the subnetwork variable handled correctly in `parameters`?

After a lot of research, I believe this is related to the `PipelineOptions` in the Apache Beam job code section above.

0 Answers

No answers yet.