Reading from PubsubIO and writing to DatastoreIO

Asked: 2016-01-14 14:27:13

Tags: google-cloud-datastore google-cloud-dataflow google-cloud-pubsub

Is it possible to create a pipeline that reads data from Pub/Sub and writes to Datastore? In my code I specify PubsubIO as the input and apply windowing to get a bounded PCollection, but it seems DatastoreIO.writeTo cannot be used while options.setStreaming is true, and streaming mode is required in order to use PubsubIO as an input. Is there a way around this, or is it simply not possible to read from Pub/Sub and write to Datastore?

Here is my code:

DataflowPipelineOptions options = PipelineOptionsFactory.create()
        .as(DataflowPipelineOptions.class);

options.setRunner(DataflowPipelineRunner.class);
options.setProject(projectName);
options.setStagingLocation("gs://my-staging-bucket/staging");
options.setStreaming(true);

Pipeline p = Pipeline.create(options);

PCollection<String> input = p.apply(
        PubsubIO.Read.topic("projects/" + projectName + "/topics/event-streaming"));
PCollection<String> inputWindow = input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardSeconds(5)))
                .triggering(AfterPane.elementCountAtLeast(1))
                .discardingFiredPanes()
                .withAllowedLateness(Duration.standardHours(1)));
PCollection<String> inputDecode = inputWindow.apply(ParDo.of(new DoFn<String, String>() {
    private static final long serialVersionUID = 1L;

    @Override
    public void processElement(ProcessContext c) {
        // Messages arrive Base64-encoded; decode them back to plain strings
        // (Base64 here is org.apache.commons.codec.binary.Base64).
        String msg = c.element();
        byte[] decoded = Base64.decodeBase64(msg.getBytes());
        String outmsg = new String(decoded);
        c.output(outmsg);
    }
}));
PCollection<DatastoreV1.Entity> inputEntity =
        inputDecode.apply(ParDo.of(new CreateEntityFn("stream", "events")));

inputEntity.apply(DatastoreIO.writeTo(datasetid));

p.run();

Here is the exception I get:

Exception in thread "main" java.lang.UnsupportedOperationException: The Write transform is not supported by the Dataflow streaming runner.
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$StreamingWrite.apply(DataflowPipelineRunner.java:488)
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$StreamingWrite.apply(DataflowPipelineRunner.java:480)
    at com.google.cloud.dataflow.sdk.runners.PipelineRunner.apply(PipelineRunner.java:74)
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:314)
    at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:358)
    at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:267)
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:312)
    at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:358)
    at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:267)
    at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:159)
    at my.own.project.google.dataflow.EventStreamingDataflow.main(EventStreamingDataflow.java:104)

2 Answers:

Answer 0 (score: 5):

The DatastoreIO sink is not currently supported by the streaming runner. To write to Datastore from a streaming pipeline, you can make direct calls to the Datastore API from a DoFn.
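
In outline, the workaround looks something like this (a minimal sketch only; the client setup and entity construction are elided here, and a complete working version is in the next answer):

input.apply(ParDo.of(new DoFn<String, Void>() {
    private static final long serialVersionUID = 1L;

    @Override
    public void processElement(ProcessContext c) throws Exception {
        // Build a com.google.api.services.datastore client, turn c.element()
        // into an Entity, and commit it directly from here instead of relying
        // on the (unsupported) DatastoreIO sink.
    }
}));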

Answer 1 (score: 5):

OK, after a lot of banging my head against the wall, I finally got this working. As danielm suggested, I call the Datastore API from a ParDo DoFn. One problem was that I didn't realize there is a separate API for using Cloud Datastore outside of AppEngine (com.google.api.services.datastore... vs. com.google.appengine.api.datastore...). Another problem was that the latest version of the Cloud Datastore API apparently has some kind of bug (with google-api-services-datastore-protobuf v1beta2-rev1-4.0.0 I got an IllegalAccessError); I worked around it by using an older version (v1beta2-rev1-2.1.2).

So, here is my working code:

import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.PubsubIO;
import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import com.google.cloud.dataflow.sdk.values.PCollection;
import com.google.api.services.datastore.DatastoreV1.*;
import com.google.api.services.datastore.client.Datastore;
import com.google.api.services.datastore.client.DatastoreException;
import com.google.api.services.datastore.client.DatastoreFactory;
import static com.google.api.services.datastore.client.DatastoreHelper.*;
import java.security.GeneralSecurityException;
import java.io.IOException;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

//--------------------

public static void main(String[] args) {
    DataflowPipelineOptions options = PipelineOptionsFactory.create()
            .as(DataflowPipelineOptions.class);

    options.setRunner(DataflowPipelineRunner.class);
    options.setProject(projectName);
    options.setStagingLocation("gs://my-staging-bucket/staging");
    options.setStreaming(true);

    Pipeline p = Pipeline.create(options);
    PCollection<String> input = p.apply(
            PubsubIO.Read.topic("projects/" + projectName + "/topics/my-topic-name"));

    input.apply(ParDo.of(new DoFn<String, String>() {
        private static final long serialVersionUID = 1L;

        @Override
        public void processElement(ProcessContext c) throws ParseException, DatastoreException {

            JSONObject json = (JSONObject) new JSONParser().parse(c.element());

            // Connect to Cloud Datastore with credentials taken from the environment.
            Datastore datastore = null;
            try {
                datastore = DatastoreFactory.get().create(getOptionsFromEnv()
                        .dataset(datasetid).build());
            } catch (GeneralSecurityException exception) {
                System.err.println("Security error connecting to the datastore: " + exception.getMessage());
            } catch (IOException exception) {
                System.err.println("I/O error connecting to the datastore: " + exception.getMessage());
            }

            // Build the entity under the desired kind and namespace.
            Key.Builder keyBuilder = makeKey("my-kind");
            keyBuilder.getPartitionIdBuilder().setNamespace("my-namespace");
            Entity.Builder event = Entity.newBuilder()
                    .setKey(keyBuilder);

            event.addProperty(makeProperty("my-prop", makeValue((String) json.get("my-prop"))));

            // Insert with an auto-allocated id, outside of a transaction.
            CommitRequest commitRequest = CommitRequest.newBuilder()
                    .setMode(CommitRequest.Mode.NON_TRANSACTIONAL)
                    .setMutation(Mutation.newBuilder().addInsertAutoId(event))
                    .build();
            if (datastore != null) {
                datastore.commit(commitRequest);
            }

        }
    }));


    p.run();
}
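
One note on the design: the DoFn above opens a new Datastore connection for every element. A lighter variant (a sketch only, assuming the same v1beta2 client and datasetid are in scope; startBundle is the Dataflow 1.x DoFn hook that runs once per bundle) creates the client once per bundle and reuses it:

input.apply(ParDo.of(new DoFn<String, String>() {
    private static final long serialVersionUID = 1L;
    // The client is not serializable, so mark it transient and create it in startBundle.
    private transient Datastore datastore;

    @Override
    public void startBundle(Context c) throws GeneralSecurityException, IOException {
        // Connect once per bundle rather than once per element.
        datastore = DatastoreFactory.get().create(getOptionsFromEnv()
                .dataset(datasetid).build());
    }

    @Override
    public void processElement(ProcessContext c) throws ParseException, DatastoreException {
        JSONObject json = (JSONObject) new JSONParser().parse(c.element());

        Key.Builder keyBuilder = makeKey("my-kind");
        keyBuilder.getPartitionIdBuilder().setNamespace("my-namespace");
        Entity.Builder event = Entity.newBuilder().setKey(keyBuilder);
        event.addProperty(makeProperty("my-prop", makeValue((String) json.get("my-prop"))));

        // Reuse the bundle-scoped client for the commit.
        datastore.commit(CommitRequest.newBuilder()
                .setMode(CommitRequest.Mode.NON_TRANSACTIONAL)
                .setMutation(Mutation.newBuilder().addInsertAutoId(event))
                .build());
    }
}));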

Dependencies in pom.xml:

<dependency>
  <groupId>com.google.cloud.dataflow</groupId>
  <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
  <version>[1.0.0,2.0.0)</version>
</dependency>
<dependency>
  <groupId>com.google.apis</groupId>
  <artifactId>google-api-services-datastore-protobuf</artifactId>
  <version>v1beta2-rev1-2.1.2</version>
</dependency>
<dependency>
  <groupId>com.google.http-client</groupId>
  <artifactId>google-http-client</artifactId>
  <version>1.17.0-rc</version>
</dependency>
<!-- Some more, like JUnit etc. -->