Dataflow unable to parse template file when using a custom template

Date: 2019-09-03 12:08:43

Tags: python google-cloud-dataflow

I am trying to run a simple pipeline in Dataflow:

import apache_beam as beam


options = beam.options.pipeline_options.PipelineOptions()

gcloud_options = options.view_as(beam.options.pipeline_options.GoogleCloudOptions)
gcloud_options.job_name = 'dataflow-tutorial1'
gcloud_options.project = 'xxxx'
gcloud_options.staging_location = 'gs://xxxx/staging'
gcloud_options.temp_location = 'gs://xxxx/temp'
gcloud_options.service_account_email = 'dataflow@xxxx.iam.gserviceaccount.com'


worker_options = options.view_as(beam.options.pipeline_options.WorkerOptions)
worker_options.disk_size_gb = 20
worker_options.max_num_workers = 2


options.view_as(beam.options.pipeline_options.StandardOptions).runner = 'DataflowRunner'


p1 = beam.Pipeline(options=options)

(p1 | 'Hello World' >> beam.Create(['Hello World']))

p1.run()

When I create the job from the Dataflow UI and try to run it, I keep getting:

Unable to parse template file 'gs://dataflow-sm/pipeline-files/read-write-to-gsc-file.py'.

If I run it from the terminal:

ERROR: (gcloud.dataflow.jobs.run) FAILED_PRECONDITION: Unable to parse template file 'gs://dataflow-sm/pipeline-files/read-write-to-gsc-file.py'.
- '@type': type.googleapis.com/google.rpc.PreconditionFailure
  violations:
  - description: "Unexpected end of stream : expected '{'"
    subject: 0:0
    type: JSON

Any idea what the problem might be here?

1 answer:

Answer 0 (score: 1)

You are missing a step: converting the Python code into a JSON template. The instructions can be found here. For Python in particular:

python read-write-to-gsc-file.py \
  --runner DataflowRunner \
  ...
  --template_location gs://dataflow-sm/pipeline-files/read-write-to-gsc-file

The template will be staged at the GCS path specified by --template_location. See the Google-provided word count template for an example.
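As a rough sketch (reusing the options from the question and assuming the same output path), the staging step can also be done from the Python script itself by setting template_location on GoogleCloudOptions; with that option set, p.run() writes the JSON template to GCS instead of launching a job:

import apache_beam as beam

options = beam.options.pipeline_options.PipelineOptions()

gcloud_options = options.view_as(beam.options.pipeline_options.GoogleCloudOptions)
gcloud_options.project = 'xxxx'
gcloud_options.staging_location = 'gs://xxxx/staging'
gcloud_options.temp_location = 'gs://xxxx/temp'
# With template_location set, running the pipeline stages the JSON template
# at this GCS path rather than starting a Dataflow job.
gcloud_options.template_location = 'gs://dataflow-sm/pipeline-files/read-write-to-gsc-file'

options.view_as(beam.options.pipeline_options.StandardOptions).runner = 'DataflowRunner'

p = beam.Pipeline(options=options)
(p | 'Hello World' >> beam.Create(['Hello World']))
p.run()  # stages the template; the job itself is launched later via gcloud or the UI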

Then you can execute the template by providing its --gcs-location:

gcloud dataflow jobs run [JOB_NAME] \
        --gcs-location gs://dataflow-sm/pipeline-files/read-write-to-gsc-file
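
If the job should not run in the default region, gcloud dataflow jobs run also accepts a --region flag; for example (the job name comes from the question and the region value is only a placeholder):

gcloud dataflow jobs run dataflow-tutorial1 \
        --gcs-location gs://dataflow-sm/pipeline-files/read-write-to-gsc-file \
        --region us-central1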