Building a pipeline with ValueProvider.RuntimeProvider

Asked: 2017-10-02 23:44:55

Tags: google-cloud-dataflow

I have a Google Dataflow job built with library version 1.9.1 that takes runtime parameters. We used TextIO.read().from().withoutValidation(). Since we migrated to Google Dataflow 2.0.0, withoutValidation has been removed in 2.0.0. The release notes page does not mention this: https://cloud.google.com/dataflow/release-notes/release-notes-java-2

We tried passing the input as a ValueProvider.RuntimeProvider, but we get the following error during pipeline construction. If the input is passed as a ValueProvider, pipeline creation tries to validate the value provider. How can I supply a runtime value provider for the TextIO input in Google Cloud Dataflow 2.0.0?

java.lang.RuntimeException: Method getInputFile should not have return type RuntimeValueProvider, use ValueProvider instead.
        at org.apache.beam.sdk.options.ProxyInvocationHandler.getDefault(ProxyInvocationHandler.java:505)
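
A minimal sketch of the kind of options declaration that produces this error (the interface name is illustrative; getInputFile matches the method named in the error):

import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.ValueProvider;

public interface MyOptions extends PipelineOptions {
    @Description("GCS path of the input file")
    // Declaring the getter with the concrete RuntimeValueProvider class is what
    // pipeline construction rejects; it must use the ValueProvider<T> interface
    // instead (as shown in the answer below).
    ValueProvider.RuntimeValueProvider<String> getInputFile();

    void setInputFile(ValueProvider.RuntimeValueProvider<String> value);
}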

1 answer:

Answer 0 (score: 1)

I'll assume you are using a templated pipeline and that your pipeline takes runtime parameters. Below is a working example using Cloud Dataflow SDK version 2.1.0. It reads a file from GCS (the path is passed to the template at runtime), converts each line into a TableRow, and writes the rows to BigQuery. It's a simple example, but it works with 2.1.0.

The program args used to stage the template are as follows (a sample command for running the staged template follows the list):

 --project=<your_project_id>
 --runner=DataflowRunner
 --templateLocation=gs://<your_bucket>/dataflow_pipeline
 --stagingLocation=gs://<your_bucket>/jars
 --tempLocation=gs://<your_bucket>/tmp
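
Once the template has been staged at the templateLocation above, it can be executed with a concrete value for the runtime inputFile parameter, for example via the gcloud CLI (job name, bucket, and input path below are placeholders):

 gcloud dataflow jobs run wiki-import \
     --gcs-location=gs://<your_bucket>/dataflow_pipeline \
     --parameters=inputFile=gs://<your_bucket>/input/pageviews.csv.gz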

The program code is as follows:

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.DoFn.ProcessElement;
import org.apache.beam.sdk.transforms.ParDo;

import java.util.ArrayList;
import java.util.List;

import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE;

public class TemplatePipeline {
    public static void main(String[] args) {
        PipelineOptionsFactory.register(TemplateOptions.class);
        TemplateOptions options = PipelineOptionsFactory
                .fromArgs(args)
                .withValidation()
                .as(TemplateOptions.class);
        Pipeline pipeline = Pipeline.create(options);
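        // Read the GZIP-compressed input file from GCS; the path comes from the
        // inputFile ValueProvider and is only resolved when the template is executed.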
        pipeline.apply("READ", TextIO.read().from(options.getInputFile()).withCompressionType(TextIO.CompressionType.GZIP))
                .apply("TRANSFORM", ParDo.of(new WikiParDo()))
                .apply("WRITE", BigQueryIO.writeTableRows()
                        .to(String.format("%s:dataset_name.wiki_demo", options.getProject()))
                        .withCreateDisposition(CREATE_IF_NEEDED)
                        .withWriteDisposition(WRITE_TRUNCATE)
                        .withSchema(getTableSchema()));
        pipeline.run();
    }

    private static TableSchema getTableSchema() {
        List<TableFieldSchema> fields = new ArrayList<>();
        fields.add(new TableFieldSchema().setName("year").setType("INTEGER"));
        fields.add(new TableFieldSchema().setName("month").setType("INTEGER"));
        fields.add(new TableFieldSchema().setName("day").setType("INTEGER"));
        fields.add(new TableFieldSchema().setName("wikimedia_project").setType("STRING"));
        fields.add(new TableFieldSchema().setName("language").setType("STRING"));
        fields.add(new TableFieldSchema().setName("title").setType("STRING"));
        fields.add(new TableFieldSchema().setName("views").setType("INTEGER"));
        return new TableSchema().setFields(fields);
    }

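    // inputFile is declared as ValueProvider<String> (not RuntimeValueProvider),
    // so the template can defer its value until job execution time.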
    public interface TemplateOptions extends DataflowPipelineOptions {
        @Description("GCS path of the file to read from")
        ValueProvider<String> getInputFile();

        void setInputFile(ValueProvider<String> value);
    }

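    // Splits each comma-separated line and maps the values, in order, onto the
    // columns of the schema defined above.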
    private static class WikiParDo extends DoFn<String, TableRow> {
        @ProcessElement
        public void processElement(ProcessContext c) throws Exception {
            String[] split = c.element().split(",");
            TableRow row = new TableRow();
            for (int i = 0; i < split.length; i++) {
                TableFieldSchema col = getTableSchema().getFields().get(i);
                row.set(col.getName(), split[i]);
            }
            c.output(row);
        }
    }
}
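
Note how this addresses the error in the question: getInputFile() is declared with the ValueProvider<String> interface (not the concrete RuntimeValueProvider class), and TextIO.read().from() accepts that provider directly. The path is only resolved when the template is executed, so pipeline construction does not try to validate a value that is not yet available and no withoutValidation() call is needed.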